In-Development TradeDangerous: power-user trade optimizer

ANNOUNCEMENT: Maddavo prices data with EDDN updates

After a crash course in Python, I have now set up LIVE price updates into the Maddavo shared prices data via EDDN. This means that all broadcast price updates are merged into the prices database. There is a delay of about 3 minutes between EDDN price update broadcasts and merging into the database.

Hopefully the data cleaning in place will withstand any bad data that is received. BUT if you find data that doesn't make sense then please let me know so I can try to make something that ignores/removes it.

This also means that you can use EDDN methods to send price updates to the database rather than prices file uploads if you wish.

Cheers,
Maddavo
 
What the HELL! :)

I didn't do anything to findpads.py, just updated the database daily/hourly as usual, and now it works!
Code:
python3 findpads.py
Commands are: 'quit', 'go', 'jumps N', 'ly N.NN' or 'origin placename'.

origin: SOL, jumps:1, ly/jump:30
command: go
PROCYON/Manakov Dock: ls from star: 0, max pad size: ?
Updated ls from star: 100
Updated max pad size: L
NOTE: PROCYON/Manakov Dock (#3364) updated in /home/<stuff>/data/TradeDangerous.db: ls=100, bm=?, pad=L


origin: PROCYON, jumps:1, ly/jump:30
command:
I have no clue, but who cares if it works, it works :D

BTW, when it says "updated in /...", does it only update the SQL .db file like it says, or Station.csv as well?

Only in the local cache, you'll need to do a "trade.py export --table Station" when you're done.

Or ... add a blank line at the end of the file and then the following code (not indented: no spaces or tabs in front of it)

Code:
# Append to the end of findpads.py (unindented); assumes `tdb` is already in scope.
import csvexport
csvexport.exportTableToFile(tdb, tdb.tdenv, "Station")
print("Rebuilt Station.csv")

- - - - - Additional Content Posted / Auto Merge - - - - -

After a crash course in Python, I have now set up LIVE price updates into the Maddavo shared prices data via EDDN. This means that all broadcast price updates are merged into the prices database. There is a delay of about 3 minutes between EDDN price update broadcasts and merging into the database.

Hopefully the data cleaning in place will withstand any bad data that is received. BUT if you find data that doesn't make sense then please let me know so I can try to make something that ignores/removes it.

This also means that you can use EDDN methods to send price updates to the database rather than prices file uploads if you wish.

Cheers,
Maddavo

That's really awesome. What's your thinking on updates that people upload to you?

-Oliver

- - - - - Additional Content Posted / Auto Merge - - - - -

UPDATE:

I've checked in several small updates today, but I've also confirmed 140+ new Systems from the EDStarCoordinator (and checked in the little script I use for doing this).

I say "confirmed" because, well, Systems are kind of important. So I made myself a crap little tool that does some sanity checking on all the new EDSC Systems and then helps me to paste them into the game's Galaxy Map SEARCH box so I can check the star is where it's supposed to be.

If you are visiting uncharted systems, you can contribute to the crowd source project here: http://edstarcoordinator.com/

If anyone happens to know how to get in touch with the guy, I have a list of bad star names, positions, etc...
 
That's really awesome. What's your thinking on updates that people upload to you?

Updates can still be uploaded as before via the upload form and the uploaddata.asp page. It may be better to migrate user price updates to EDDN so that non-TD users also get the benefit of price updates. It looks like most people use EliteOCR, which I believe has an EDDN upload/sending function, so if you use EliteOCR then send via that method. If you update prices via the GUI or a text editor then you can still upload as usual, but this is where an EDDN sender would probably be useful.
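
For anyone hand-editing prices who wants to experiment with such a sender, here is a minimal sketch; the endpoint URL and message fields below are invented placeholders, not the real EDDN schema or relay:

```python
import json
import urllib.request

# Hypothetical sketch of a minimal "EDDN sender" for hand-edited prices.
# The message fields and the endpoint URL are placeholders, not the real
# EDDN schema or relay address.
def build_update(system, station, item, buy, sell):
    return {
        "message": {
            "systemName": system, "stationName": station,
            "itemName": item, "buyPrice": buy, "sellPrice": sell,
        },
    }

def send_update(endpoint, update):
    body = json.dumps(update).encode("utf-8")
    req = urllib.request.Request(
        endpoint, data=body, headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:  # POSTs the JSON body
        return resp.status
```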

I've also asked in the EDDN group whether it would be beneficial to rebroadcast old data so all the prices (in the last 24 or 48hrs say) can be captured by a listener over a period (say 3 hrs).

I have not come across any bad data that I couldn't handle yet so hopefully this won't contaminate the data we have. But as with the Slaves/Silver issue we had a while ago, everyone should just be aware and on the lookout for strange prices.
 
I have now set up LIVE price updates into the Maddavo shared prices data via EDDN.
Congratulations! This makes me very very happy :D

Hopefully the data cleaning in place will withstand any bad data that is received.
You should really contact themroc.

This also means that you can use EDDN methods to send price updates to the database rather than prices file uploads if you wish.
Excellent, this speeds up my time spent in market sharing considerably. This is so awesome, Hooyah!

BTW, since EDDN is pretty much "live" stuff, do you have any recommendations on how often we can/should hammer your site for price updates? Could you tighten the 2-day prices file to maybe even 12 hrs or something?

Bandwidth is no issue for me, but I'm a sucker for optimization :)

kfsone said:
Only in the local cache, you'll need to do a "trade.py export --table Station" when you're done.
Okay, that is no problem, I'll look at the code you gave. Thanks man.
 

wolverine2710

Tutorial & Guide Writer
After a crash course in Python, I have now set up LIVE price updates into the Maddavo shared prices data via EDDN. This means that all broadcast price updates are merged into the prices database. There is a delay of about 3 minutes between EDDN price update broadcasts and merging into the database.

Hopefully the data cleaning in place will withstand any bad data that is received. BUT if you find data that doesn't make sense then please let me know so I can try to make something that ignores/removes it.

This also means that you can use EDDN methods to send price updates to the database rather than prices file uploads if you wish.

Cheers,
Maddavo

PERFECT. Now TD users have a lot more info, perhaps close to the info in Slopey's BPC, as I think most data in the BPC comes from importing a .bpc file produced by EliteOCR. EOCR automatically sends data to EDDN. Another OCR tool, RegulatedNoise, also has an option to send data to EDDN. Not sure if RN also has a .bpc output; if not, Slopey is missing that data while EDDN and your merge tool receive it!! With your tool it also means TD users don't have to be connected to EDDN all the time - in case an EDDN-tap.py surfaces.

Question: do you currently also send data to EDDN (using an HTTP request) when users upload their .prices file to your merge tool? That way non-TD users would also get the TD .prices info!! Which, as commanders have said, might be relatively little data BUT of high quality!!!

TL;DR: PERFECT news, nice piece of work.

Edit: After reading the brilliant news, I didn't read further and created a post. It seems uploads to EDDN are currently not yet supported. Luckily, uploading to EDDN is much easier than getting ZMQ running ;-)
 
Oh btw wolverine2710, all my TD price uploads are done with EliteOCR, so while technically TD users won't send anything to EDDN... EDDN still doesn't lose anything, as these same guys do it just with EliteOCR :)

Sure, some people might tweak one or two commodities by hand... but that will be a drop in the bucket, EDDN-wise.
 
Updates can still be uploaded as before via the upload form and the uploaddata.asp page. It may be better to migrate user price updates to EDDN so that non-TD users also get the benefit of price updates. It looks like most people use EliteOCR, which I believe has an EDDN upload/sending function, so if you use EliteOCR then send via that method. If you update prices via the GUI or a text editor then you can still upload as usual, but this is where an EDDN sender would probably be useful.

I've also asked in the EDDN group whether it would be beneficial to rebroadcast old data so all the prices (in the last 24 or 48hrs say) can be captured by a listener over a period (say 3 hrs).

Well, I was actually thinking it might be useful for you to act as a hub in both directions - when you process a .prices file, EDDN-transmit the entries you received via uploads out to EDDN. I think that jibes well with your transmitting old updates periodically.

The problem is that the data, especially using their one-item-per-message format, gets ridiculously large.

https://www.dropbox.com/personal/Public/ed/eddntransform (the scripts are included in the directory)

Code:
osmith@WOTSIT ~/Dropbox/public/ed/eddntransform
$ ls -1sh
711k mad.2d.prices
4.8M mad.all.prices
2.9M eddn.2d.json
 20M eddn.all.json
551k batched.2d.json
3.7M batched.all.json
...

I'm guessing you culled the data recently.

The gist is: ~700kB of .prices data becomes 2.9MB of EDDN data; 5MB of .prices data becomes 20MB of EDDN data.

Alternatively, using a terse, batched form, 700kB of price data becomes 500kB, and the savings scale up to the full dump.

I realize this doesn't matter much to your service, but it's going to matter a lot to the EDDN network as the data grows. If you send 20MB of historical updates to EDDN and EDDN has 500 subscribers, it is going to have to buffer and transmit - assuming no retransmission - 26MB (megabytes) of data per user, for a grand total of 12GB of data.

If you have a 10Mb/s (note the lower-case b) internet connection and you can get full download speeds, it'll take you 20 seconds to download that. But most users are going to be getting under a Mb/s on the download, due to the way ZeroMQ's fan-out works, which means they can anticipate your update clogging up the EDDN network for them for three and a half minutes... This, in itself, will cause queueing issues that will draw it out to 5 minutes.

This means a really good chance, at internet scale, of lost data: individual price updates going missing.

And this is with a fairly tiny portion of the full ED dataset and a small number of listeners. Heck, the data I just tested this with was small relative to the data from your site yesterday.

Of course, if EDDN mostly becomes a mechanism for developers to exchange prices and most users actually get their data from sites like your own, then that dials back the scale some, but it's still kind of insane, IMSE, going forward. We have 110k prices in the database; in a few months we'll probably have passed a million. The sheer overhead of the current EDDN JSON format is an order of magnitude more than the actual data.

It isn't going to scale.
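
To make the overhead concrete, here is a rough sketch comparing a per-item envelope with a batched one; the field names and schema URLs are invented for illustration, not the actual EDDN format:

```python
import json

# Hypothetical per-item EDDN-style envelope: one commodity per message.
# Field names are illustrative only, not the actual EDDN schema.
def per_item_messages(system, station, items):
    return [
        json.dumps({
            "$schemaRef": "http://schemas.example/commodity/1",
            "header": {"uploaderID": "anon", "softwareName": "demo"},
            "message": {
                "systemName": system, "stationName": station,
                "itemName": name, "buyPrice": buy, "sellPrice": sell,
            },
        })
        for name, buy, sell in items
    ]

# Batched form: one envelope, the station named once, items as compact rows.
def batched_message(system, station, items):
    return json.dumps({
        "$schemaRef": "http://schemas.example/commodity-batch/1",
        "header": {"uploaderID": "anon", "softwareName": "demo"},
        "message": {
            "systemName": system, "stationName": station,
            "items": [[name, buy, sell] for name, buy, sell in items],
        },
    })

items = [("Gold", 9100, 9400), ("Silver", 4700, 4900), ("Hydrogen Fuel", 100, 110)]
per_item = sum(len(m) for m in per_item_messages("SOL", "Abraham Lincoln", items))
batched = len(batched_message("SOL", "Abraham Lincoln", items))
```

The batched envelope pays the header and station-name overhead once per station instead of once per commodity, which is where most of the savings come from.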

-- EDIT --

Also, be aware that ZeroMQ's "pub/sub" model of broadcasting isn't "free": the sender actually has to send individual copies of the data to each subscriber, and ZeroMQ's pub/sub further requires receiver-side filtering... so clients can opt out of seeing your update, but they still have to download it.
 
Oh, I'm not envying you now, Mad :)

Code:
import.prices:617 WARNING Unrecognized STAR/Station: "ADEO/DOBROVOLSKI CITY"
import.prices:671 WARNING Unrecognized STAR/Station: "ADEO/OOBROVOLSKI CITY"
...
import.prices:2261 WARNING Unrecognized STAR/Station: "BALDUK/DUTTON STATION"
import.prices:2330 WARNING Unrecognized STAR/Station: "BALDUR/DUTTON STATION"

It's going to be a lot of effort fishing out these badly-OCR'd D<->Os, B<->8s, K<->Rs, etc. I sort of wish they would consult the "dictionary" available to them.
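
One crude way to fish for those twins automatically: normalize the characters OCR most often confuses and compare the results. This is just a sketch; D<->O and 0<->O collapse cleanly with a character map, while K<->R (both valid letters in both names) would need an edit-distance check instead:

```python
# Crude sketch: collapse common OCR confusions so that likely "twins"
# produce the same key. Mapping D->O and 0->O merges D/O misreads;
# K<->R would need an edit-distance comparison rather than a char map.
CONFUSABLE = str.maketrans({"0": "O", "D": "O", "8": "B"})

def ocr_key(name):
    return name.upper().translate(CONFUSABLE)

# "ADEO/DOBROVOLSKI CITY" and "ADEO/OOBROVOLSKI CITY" collapse to one key,
# so they can be flagged as probable OCR twins for manual review.
```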

I would suggest that you might want an attribute locally to track whether a submission came from yourself or EDDN, and you could perhaps put that in a comment in the .prices file for stations:

Code:
  @ SOL / Abrahame Linclog  # EDDN

  @ SOL / Abraham Lincoln
 
Yeah, I've hit that as well and sent him a PM about it.

Not sure how common similarly named stations are, but considering it probably happens because people are doing scans in EliteOCR, I'm guessing the issue will usually be local to the area, say a 20 Ly radius, so an idea could be to flag all duplicate station names within that radius.
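
That flagging idea could be sketched roughly like this; the data shapes and coordinates are illustrative, not TD's actual tables:

```python
import math
from collections import defaultdict

# Sketch: flag same-named stations whose host systems lie within `radius`
# light-years of each other. Inputs are illustrative shapes:
# systems as (name, x, y, z) tuples, stations as (system, station) pairs.
def near_duplicates(systems, stations, radius=20.0):
    pos = {name: (x, y, z) for name, x, y, z in systems}
    by_name = defaultdict(list)
    for system, station in stations:
        by_name[station.upper()].append(system)
    flagged = []
    for station, hosts in by_name.items():
        for i, a in enumerate(hosts):
            for b in hosts[i + 1:]:
                if math.dist(pos[a], pos[b]) <= radius:
                    flagged.append((station, a, b))
    return flagged
```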
 
Hi,
Hopefully someone might be able to help. I've just tried to import some new market data for "GUI Wande/Aikin Horizons" & "GUI Wande/Clement Station", as there was none from the last upload using the maddavo plugin, and I get an error:

import.prices:16 ERROR Unrecognized STAR/Station: "GUI WANDE/AIKIN HORIZONS"
and
import.prices:16 ERROR Unrecognized STAR/Station: "GUI WANDE/CLEMENT STATION"

The System name was already present but the station names were not.

I've been following the same process for the system OCHOENG and haven't had any problems. I've checked for any formatting mistakes in the import.prices file but can't see any.

Anyone got any ideas? Thanks in advance
 
Hi,
Hopefully someone might be able to help. I've just tried to import some new market data for "GUI Wande/Aikin Horizons" & "GUI Wande/Clement Station", as there was none from the last upload using the maddavo plugin, and I get an error:

import.prices:16 ERROR Unrecognized STAR/Station: "GUI WANDE/AIKIN HORIZONS"
and
import.prices:16 ERROR Unrecognized STAR/Station: "GUI WANDE/CLEMENT STATION"

The System name was already present but the station names were not.

I've been following the same process for the system OCHOENG and haven't had any problems. I've checked for any formatting mistakes in the import.prices file but can't see any.

Anyone got any ideas? Thanks in advance

Your "Station.csv", and therefore your database, doesn't know about that station.

Code:
[color=gold]$ trade.py local -vv --ly 0 guiwande[/color]
System              Dist
  /  Station                               StnLs Age/days BMkt Pad Itms
-----------------------------------------------------------------------
GUI WANDE           0.00

If the DB had information about stations in the system:

Code:
[color=gold]$ trade.py local -vv --ly 0 volkhab[/color]
System              Dist
  /  Station                               StnLs Age/days BMkt Pad Itms
-----------------------------------------------------------------------
VOLKHAB             0.00
  /  Gutierrez Terminal                    2,191        -    ? Med    0
  /  Schweickart Hub                       14.0K        -    ? Med    0
  /  Vernadsky Dock                          401     0.10  Yes Lrg   63


You can use the "station" sub-command to add the stations:

Code:
[color=gold]$ trade.py station --add "guiwande/aikin horizons"[/color]
or if you know the ls-from star, the pad size and whether the station has a black market:
[color=gold]$ trade.py station --add "guiwande/aikin horizons" --ls=3948 --pad=l --bm=y[/color]

The most recent version of the code (6.7.0) will automatically add temporary stations for you.

- - - - - Additional Content Posted / Auto Merge - - - - -

GOOD NEWS, EVERYONE!

Code:
v6.7.0 Jan 17 2015
[color=gold]. (kfsone) ".prices" import automatically creates local placeholders
        for unknown stations when using "-i", e.g:
            trade.py buildcache -f -i
            trade.py import --plug=maddavo  (this plugin sets -i for you)
. (kfsone) Added a "--ls-max" option to "run" for filtering stations
. (kfsone) +140 Systems
. (tKE)  "buy" sub-command now supports ship names (find a ship vendor)[/color]
. (kfsone) Partial code/documentation cleanup
. (kfsone) Added a "getRoute" function to TradeDB()
        import tradedb
        tdb = tradedb.TradeDB()
        lave = tdb.lookupPlace("Lave")
        barn = tdb.lookupPlace("Barnard's Star")
        tdb.getRoute(lave, barn, maxJumpLy=12)

v6.6.1 Jan 10 2015
[color=gold]. (kfsone) Added "--blackmarket" option to "run" command, restricts
    to stations which have a black market.
. (kfsone) Added "--end-jumps" ('-e') to "run" command, includes stations
    from systems within this many jumps of the specified --to.
      e.g.
        trade.py run --from sol --to lave -e 3
        will find runs from sol to somewhere within 3 jumps of lave.[/color]
. (kfsone) "rares" now has a friendly error when no rares are found
+ Data: kfsone, maddavo, Dave Ryalls, NeoTron, Tyler Lund, Jared Buntain,
+ Stars: EDStarQuery/kfsone
  [I'm adding several hundred new systems from EDStarQuery but I'm also
   manually vetting all of them!]
 
More new code today!

commit b33b3f186b3a49d8beebca5bf4c90962a29f702a
Author: Oliver Smith <oliver@kfs.org>
Date: Sat Jan 17 16:41:32 2015 -0800

Added --prune-score and --prune-hops to RUN

These allow you to eliminate candidate routes early on in the
"run" command based on their performance compared to the current
leader.

For example

Code:
      --prune-score 22.5 --prune-hops 3

says that from the 3rd hop, begin to eliminate routes which have
scored under 22.5% of the current best candidate. This can
significantly improve the time to calculate long runs.

But if the early hops are all poor performers it can keep you from
seeing a gold mine a few hops away that requires you to take a few
low-profit hits first.

E.g.

Code:
      -1-> 60cr/ton -2-> 90cr/ton -3-> 60cr/ton -4-> 50cr/ton
      ...
      -1-> 10cr/ton -2-> 50cr/ton -3-> 10cr/ton -4-> 900cr/ton

    "--prune-score 50 --prune-hops 4"

would cause you to miss the second route.

Note: pruning only takes place when there are more than 10 candidates.
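
For the curious, the pruning rule as described could be sketched like this; the names and the (score, route) shape are illustrative, not TD's actual internals:

```python
# Sketch of the pruning rule described above: from hop `prune_hops` onward,
# drop candidate routes scoring under `prune_score` percent of the current
# leader. As in the note above, nothing is pruned until there are more than
# `min_candidates` candidates in play.
def prune(candidates, hop, prune_score=22.5, prune_hops=3, min_candidates=10):
    if hop < prune_hops or len(candidates) <= min_candidates:
        return candidates
    best = max(score for score, _ in candidates)
    cutoff = best * prune_score / 100.0
    return [(score, route) for score, route in candidates if score >= cutoff]
```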
 
If (as per Maddavo's site) uploads direct to EDDN are now the preferred method then can someone clarify:

1 - Does EliteOCR upload data automatically? There is a very obvious button for TD export, but I'm not clear how to make it send to EDDN.

2 - Do we still upload our stations.csv for new/updated station information?

Thanks :)
 
If (as per Maddavo's site) uploads direct to EDDN are now
It's not, it downloads from EDDN.

Does EliteOCR upload data automatically?
No...


There is a very obvious button for TD export, but I'm not clear how to make it send to EDDN.
There is a very obvious button for Export to EDDN, next to the Trade Dangerous Export button.

Do we still upload our stations.csv for new/updated station information?
Yes please, especially updating landing pad size is important for larger ships.
 
If (as per Maddavo's site) uploads direct to EDDN are now the preferred method then can someone clarify:

1 - Does EliteOCR upload data automatically? There is a very obvious button for TD export, but I'm not clear how to make it send to EDDN.

2 - Do we still upload our stations.csv for new/updated station information?

Thanks :)

I think I'm going to extend the .prices format so that the station line reads as

Code:
@ SYSTEM / STATION #{ls=N,bm=N,pad=N}

where ls, bm and pad are each optional and order is not compulsory, and it can be ignored, but if anything wants to pick it up, it's there as a hint.
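
For anyone writing a tool against that format, parsing the hint might look roughly like this; the regex and return shape are my own guesses, not TD's parser:

```python
import re

# Sketch of a parser for the proposed station line, e.g.
#   @ SOL / Abraham Lincoln #{ls=496,bm=Y,pad=L}
# The ls/bm/pad keys are optional and may appear in any order.
STATION_RE = re.compile(
    r"^@\s*(?P<system>[^/]+?)\s*/\s*(?P<station>[^#]+?)"
    r"(?:\s*#\{(?P<hints>[^}]*)\})?\s*$"
)

def parse_station_line(line):
    m = STATION_RE.match(line)
    if not m:
        return None
    hints = {}
    if m.group("hints"):
        for pair in m.group("hints").split(","):
            key, _, value = pair.partition("=")
            hints[key.strip()] = value.strip()
    return m.group("system").strip(), m.group("station").strip(), hints
```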

-Oliver

- - - - - Additional Content Posted / Auto Merge - - - - -

Also, after you update your prices with TD (which produces an 'updated.prices' file in your td directory) you can upload by running

Code:
$ misc/madupload.py
or
C:\trade\> misc\madupload.py

This optionally takes a file to upload. If you want to send your station.csv to mad:

Code:
$ misc/madupload.py data/Station.csv
or
C:\trade\> misc\madupload.py data\Station.csv
 
It's not, it downloads from EDDN.

No - I was noting that Maddavo prefers us to now upload direct to EDDN, per his website

"if you have the facility to send updates via EDDN then that is the preferred method now"

There is a very obvious button for Export to EDDN, next to the Trade Dangerous Export button.

Well apparently I was using a very old version. The front page of the EliteOCR thread hasn't shown an update for a long time (until a day or so ago).

Yes please, especially updating landing pad size is important for larger ships.

Of course, the question was about method not requirement - given the switch to EDDN I wanted to be sure nothing had changed.

I think I'm going to extend the .prices format

Might be worth letting Seebek know, I guess. A couple of drop downs to easily add (at least) BM & Pad size should be easy enough and would act as reminders when players are gathering new station info.

Also, after you update your prices with TD (which produces an 'updated.prices' file in your td directory) you can upload by running

I generally just upload the import.prices from EliteOCR anyway. If there's a new station, I edit stations.csv in something useful like vi or notepad++ and upload that too.
I had a look at the different scripts, but actually find it more intuitive to do it by hand - possibly because I'm old-fashioned like that.

So pretty much I won't change my routines I guess :)
 
Tried to search the thread, but "x52" and "mfd" return nothing. Is there any way to have the display also show the station name, or any way to alter the output text? For instance, instead of reading "#1 Buy 56 x Animal Meat @ 950cr", have line 3 display the station name. Also, is there any way to advance the display without having to alt-tab out of the game?
 
Tried to search the thread, but "x52" and "mfd" return nothing. Is there any way to have the display also show the station name, or any way to alter the output text? For instance, instead of reading "#1 Buy 56 x Animal Meat @ 950cr", have line 3 display the station name. Also, is there any way to advance the display without having to alt-tab out of the game?

Heh, "Saitek" is the only keyword long enough to match.

Can you give me a concrete example of what it is you want on the display and I'll look at adjusting it.

I designed the checklist code to work with Saitek's "soft button" support (the buttons just below the MFD), but I couldn't get their driver to work. Then I found that the buttons also don't work in Saitek's own SDK demo (it's in your Program Files\Saitek\DirectOutput\ somewhere; test.exe, I think). I wrote to their support, got intermittent signs of discomfort followed by a request for some unrelated technical details, before being told it was very complicated to ask the programmers questions, and never heard back.
 
Your "Station.csv", and therefore your database doesn't know about that station. ... The most recent version of the code (6.7.0) will automatically add temporary stations for you.

Thanks for the prompt reply.
 