In-Development TradeDangerous: power-user trade optimizer

Yeah, now that I've worked out the EDDN thing, the data is exploding. I'll move it within a few days, but I'm also looking at compression.

As a side note, since I kinda prodded the topic in the first place... I started adjusting transfers.py to handle gzip encoding and ran into a bit of fun I'd forgotten about, at least with my test server (nginx): with "Content-Encoding: gzip" you get no Content-Length header. Fun fun.
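For the curious, the workaround amounts to something like this minimal sketch (stdlib only; the function names are mine, not what transfers.py actually does): read chunks until EOF instead of trusting Content-Length, then decompress if the server declared gzip.

```python
import gzip

def read_body(resp):
    """Read an HTTP response to EOF in chunks. With Content-Encoding: gzip,
    nginx omits Content-Length, so a fixed-size read isn't an option."""
    chunks = []
    while True:
        chunk = resp.read(64 * 1024)
        if not chunk:
            break
        chunks.append(chunk)
    return b"".join(chunks)

def decode_body(headers, raw):
    """Decompress the raw body if the server declared gzip encoding."""
    if headers.get("Content-Encoding", "").lower() == "gzip":
        return gzip.decompress(raw)
    return raw
```

The downside is you lose the nice progress percentage, since you don't know the total size up front.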

Also an update on this:

Well, I'm nowhere near 'usable' with it (mostly because this is the first time I've dived into Python), but I'm poking at a bottle.py-based front end for TD, to be used with something like Overwolf or the Steam overlay browser from a user's local system. I ran across a PHP-based tool someone threw together with Overwolf in mind, but it just gives a blank entry for command-line arguments and then runs trade.py with that. I started in on my little toy to, hopefully, a) keep the database loaded, b) use your internals directly from Python, and c) avoid having to run a separate webserver.

I traded sleep for a deeper understanding of TD's internals. My little toy needs a proper frontend attached, the backend and frontend refactored out to be a bit more separate (and modular), and probably a LOT of API cleanup (it's a bit random about naming things, mostly pulling wholesale from trade.py command-line arguments), but it's working: it takes input and spits out JSON for most things. The run command is going to take a bit of work for output handling, but it is moving along. I'll get it put up in a git repo once I've separated things slightly. I'm really, incredibly impressed with how clean TD's code is, by the way. Once I'd wrapped my head around it, I realized how trivial it is to use :)
 
Small bug report:

Using TD 390e560e4a9c:
Code:
trade import --plug=maddavo
NOTE: Last download was ~3.07 days ago, downloading full file
Connecting to server: http://www.davek.com.au/td/prices.asp
import.prices: 13,257,259/13,257,259 bytes |  95.09KB/s | 100.00%
PLUGIN ERROR: Station ERAVATE/Ackerman Market has suspicious date: 2015-01-21 16:29:00 (newer than the import?)

It was 16:59 local time (CET), but I don't know if the new prices were really imported, so I used the opportunity to upgrade TD to 0b6c8d25cd09 and run again:
Code:
trade import --plug=maddavo
NOTE: Stale/missing local copy, downloading full .prices file.
Connecting to server: http://www.davek.com.au/td/prices.asp
PLUGIN ERROR: Data is not Maddavo's prices list:

http://www.davek.com.au/td/prices.asp shows:
Code:
Error: Cannot find current TradeDangerous.prices
It may be temporarily unavailable, please check back later.
(but it seems this is OK again now - 17:09, maybe rebuild started at 17:00?)

But even with this new file the same error:
Code:
trade import --plug=maddavo
NOTE: Stale/missing local copy, downloading full .prices file.
Connecting to server: http://www.davek.com.au/td/prices.asp
import.prices: 13,260,483/13,260,483 bytes |  75.73KB/s | 100.00%
PLUGIN ERROR: Station ERAVATE/Ackerman Market has suspicious date: 2015-01-21 16:29:00 (newer than the import?)

And my suspicion above was justified: the data was not imported, as seen in a trade run:
Code:
trade.py: Error: There is no trading data for ANY station in the local database. Please enter or import price data.

I just commented out the check in plugins/maddavo_plug.py so I can import again - not critical for me.

But the most important report from me:
Thank you for this tool! I could not imagine going back to paper+pencil (or a text editor) to write down prices.
 
With the removed check the import works (see previous post) and the OCR derp check already found something:
Code:
NOTE: Ignoring 'ZARYA MANAS/ANTONIO DE ANORAOE HUB' because it looks like OCR derp.

The correct name (also in import.prices) is "ZARYA MANAS/Antonio de Andrade Hub".
What's the desired procedure here?
Should TD learn to auto-correct such things?
Should it be reported to Maddavo?
Or will it vanish automatically at some point?
 
Looks like the derp-detector still needs a little work.

NOTE: Added local station placeholder for SAFFRON/CRESSVVELL (#4977)
NOTE: Added local station placeholder for SAFFRON/CRESSWELL (#4978)

NOTE: Added local station placeholder for SRS 2543/0IRAC TERMINAL (#4984)
NOTE: Added local station placeholder for SRS 2543/DIRAC TERMINAL (#4985)

NOTE: Added local station placeholder for WADJALI/KIDMAN TERMINAL (#4995)
NOTE: Added local station placeholder for WADJALI/KIOMAN TERMINAL (#4996)

The import did detect the derp for ZARYA MANAS/ANTONIO DE ANORAOE HUB so I know I'm running the right code drop.

In other news, last night TD suggested slave trade several times, and I rejected it... except once. And when I got to the destination, I discovered the station didn't actually buy slaves. Grrrr. I OCR'd a new set of trade data (EliteOCR is a real timesaver), exported to EDDN and into TD, then got a real destination to ditch the cargo. I'd like to think this was a case of a station changing its policy on slavery; does that happen in-game?

(Let me be the most recent in a long line of people who say Bravo Zulu - well done. TD is a totally awesome tool.)
 
Looks like the derp-detector still needs a little work.

NOTE: Added local station placeholder for SAFFRON/CRESSVVELL (#4977)
NOTE: Added local station placeholder for SAFFRON/CRESSWELL (#4978)

NOTE: Added local station placeholder for SRS 2543/0IRAC TERMINAL (#4984)
NOTE: Added local station placeholder for SRS 2543/DIRAC TERMINAL (#4985)

NOTE: Added local station placeholder for WADJALI/KIDMAN TERMINAL (#4995)
NOTE: Added local station placeholder for WADJALI/KIOMAN TERMINAL (#4996)

The import did detect the derp for ZARYA MANAS/ANTONIO DE ANORAOE HUB so I know I'm running the right code drop.

It just has a list of blacklisted names. I've added filters for the above.
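For the curious, the filter is no smarter than something like this (a guess at the shape, not the actual plugin code):

```python
# A hand-maintained set of known OCR misreadings, keyed SYSTEM/STATION.
OCR_DERP_NAMES = {
    "SAFFRON/CRESSVVELL",
    "SRS 2543/0IRAC TERMINAL",
    "WADJALI/KIOMAN TERMINAL",
    "ZARYA MANAS/ANTONIO DE ANORAOE HUB",
}

def looks_like_derp(system, station):
    """True if this system/station pair is on the blacklist."""
    return "{}/{}".format(system, station).upper() in OCR_DERP_NAMES
```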

Minyeni said:
In other news, last night TD suggested slave trade several times, and I rejected it... except once. And when I got to the destination, I discovered the station didn't actually buy slaves. Grrrr. I OCR'd a new set of trade data (EliteOCR is a real timesaver), exported to EDDN and into TD, then got a real destination to ditch the cargo. I'd like to think this was a case of a station changing its policy on slavery; does that happen in-game?

Yes, it does. Sometimes they show up as having "0L" demand and other times it stops showing up. I don't know whether the "0L" is a bug in ED or whether that is their internal way of saying "resume purchasing this sooner".
 
With the removed check the import works (see previous post) and the OCR derp check already found something:
Code:
NOTE: Ignoring 'ZARYA MANAS/ANTONIO DE ANORAOE HUB' because it looks like OCR derp.

The correct name (also in import.prices) is "ZARYA MANAS/Antonio de Andrade Hub".
What's the desired procedure here?
Should TD learn to auto-correct such things?
Should it be reported to Maddavo?
Or will it vanish automatically at some point?

Report it to Mad, or you can send him a delete - which you do by sending the station with a -1 ls-from-star distance.

Code:
# in bash ...
$ head -1 data/Station.csv >badstation.csv
$ echo "'ZARYA MANAS','ANTONIO DE ANORAOE HUB',-1,'?','?'" >>badstation.csv
$ cat badstation.csv
unq:name@System.system_id,unq:name,ls_from_star,blackmarket,max_pad_size
'ZARYA MANAS','ANTONIO DE ANORAOE HUB',-1,'?','?'
$ misc/madupload.py badstation.csv

(PS: I'm actually going thru and pruning these now)
 
Code:
# in bash ...
$ head -1 data/Station.csv >badstation.csv
$ echo "'ZARYA MANAS','ANTONIO DE ANORAOE HUB',-1,'?','?'" >>badstation.csv
$ cat badstation.csv
unq:name@System.system_id,unq:name,ls_from_star,blackmarket,max_pad_size
'ZARYA MANAS','ANTONIO DE ANORAOE HUB',-1,'?','?'
$ misc/madupload.py badstation.csv

Very timely; just caught some more OCR derpage (ALIT/RITTENHOUSE OOCK) and fired off the upload. Time for me to dust off my wretched Python skills and build a version of madupload.py that just prompts for the derp'd system name and station, constructs the right csv, and sends it along. It's 30 seconds of Perl for me, but since I don't know Python all that well... sigh. Heck, I could code it up in C++ in about 3 minutes including taking the time to construct the Makefile. :)
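Something like this minimal sketch is what I have in mind - prompt for the names, write the one-line CSV from kfsone's recipe above, hand it off. Entirely hypothetical and untested; the header row and the -1 convention are lifted straight from his example:

```python
#!/usr/bin/env python3
import subprocess

# Header row copied from data/Station.csv, per kfsone's recipe above.
HEADER = "unq:name@System.system_id,unq:name,ls_from_star,blackmarket,max_pad_size"

def deletion_row(system, station):
    """A -1 ls-from-star distance tells Maddavo to delete the station."""
    return "'{}','{}',-1,'?','?'".format(system.upper(), station.upper())

def main():
    system = input("Derp'd system name: ").strip()
    station = input("Derp'd station name: ").strip()
    with open("badstation.csv", "w") as fh:
        print(HEADER, file=fh)
        print(deletion_row(system, station), file=fh)
    subprocess.call(["python3", "misc/madupload.py", "badstation.csv"])

if __name__ == "__main__":
    main()
```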
 
Code:
# in bash ...
$ head -1 data/Station.csv >badstation.csv
$ echo "'ZARYA MANAS','ANTONIO DE ANORAOE HUB',-1,'?','?'" >>badstation.csv
$ cat badstation.csv
unq:name@System.system_id,unq:name,ls_from_star,blackmarket,max_pad_size
'ZARYA MANAS','ANTONIO DE ANORAOE HUB',-1,'?','?'
$ misc/madupload.py badstation.csv

Very timely; just caught some more OCR derpage (ALIT/RITTENHOUSE OOCK) and fired off the upload. Time for me to dust off my wretched Python skills and build a version of madupload.py that just prompts for the derp'd system name and station, constructs the right csv, and sends it along. It's 30 seconds of Perl for me, but since I don't know Python all that well... sigh. Heck, I could code it up in C++ in about 3 minutes including taking the time to construct the Makefile. :)

As a long time perl hack myself, I have this advice for learning Python:

1. Write your code this way for a while, and remove the comments before showing it to other people:

Code:
if (condition):
#{
    im_indented(yahoo);
    while (True):
    #{
        nesting(nesting(nesting()))
    #}
#}

Wean yourself off the ()s on conditionals later on.

2. Remember that Python has a "repl", and even better it comes with a GUI repl.

- just type "python" for the repl and don't be afraid to leave one open to mess with,
- "idle" (as in Eric, not as in slacking) is the name of the GUI repl ... Start -> idle3 on Windows

3. s/cpan/pip/

As of Python 3.4 the package manager, "pip", comes bundled. "pip search curl", "pip install requests", etc.

4. "ipython" is an improved repl for Python, and it recovers a LOT of the advantages Perl hackers are used to.

The notebook stuff is a repl on steroids that you access through your browser...

Code:
$ pip install ipython ipython[notebook]

5. There's "help".

Python codifies this notion that if you put a string as the first line of a class or function body, that serves as a docstring.

Code:
def foo():
  """ Wibbles like crazy """

def bar():
  " Wibbles like mad "

You can access this via the help() command in the repl.

Code:
>>> import sys
>>> help(sys)
... copious output ...
>>> import tradedb
>>> help(tradedb.TradeDB)
... not nearly enough output ...
>>> curse_that_kfsone()

Python triple-quote strings are multi-line strings, like Perl's qq{} and q{}.

Because there's a repl, and because of the way I've written tdb, you can do this:

Code:
$ ipython
>>> import tradedb ; tdb = tradedb.TradeDB()
>>> print(len(tdb.systemByID))   # yeah, I should expose these nicely
23482
>>> sol = tdb.lookupSystem("sol")
>>> asc = tdb.lookupPlace("ascendingphoenix") # looks up a station
>>> import math
>>> print(math.sqrt(sol.distToSq(asc.system)))
>>> starsNearSol = [ sys.name() for sys, dist in tdb.genSystemsInRange(sol, 8) ]
>>> print("Sol's 8.0 ly neighbors are:", starsNearSol)
Sol's 8.0 ly neighbors are: ['ALPHA CENTAURI', "BARNARD'S STAR", 'LUHMAN 16', 'WISE 0855-0714', 'WOLF 359']

and various other shenanigans.
 
Attention Explorers ... those of you who are exploring systems not yet in the database, I have added a small tool for submitting distances to EDSC.

It's nothing special - it just prompts you to enter distances to particular stars or enter your own star names.

But it DOES copy the names into the "clipboard" (paste buffer) so that you can alt-tab to the game, paste the name into your search box and when the map is done zooming, it'll tell you the distance. BOOM.

The script is "submit-distances.py" in the trade directory. It's a little messy, if anyone feels like cleaning it up... :)

I may try this out this weekend - assuming my ED verbose logging was on and it logged the names of all the systems I visited over the last few weeks - I'm heading out past the Witch Head Nebula now. If I get really ambitious I'll take a peek at the script (and get going on the Jython UI again). I have some more station info I will try to get in via git as well, and will figure out how to send you a pull request as soon as I work out the merge tool - which does not seem to be installed by default, so I need to find one first. ;-)

Edit: just saw your note on Atlassian Source Tree so have that installed now, yay!
 
Looks like the derp-detector still needs a little work.

NOTE: Added local station placeholder for SAFFRON/CRESSVVELL (#4977)
NOTE: Added local station placeholder for SAFFRON/CRESSWELL (#4978)

NOTE: Added local station placeholder for SRS 2543/0IRAC TERMINAL (#4984)
NOTE: Added local station placeholder for SRS 2543/DIRAC TERMINAL (#4985)

NOTE: Added local station placeholder for WADJALI/KIDMAN TERMINAL (#4995)
NOTE: Added local station placeholder for WADJALI/KIOMAN TERMINAL (#4996)

I've tracked down the following circumstances where this can happen:

- Maddavo Share data is imported into TD
- Someone else submits station derps to Maddavo Share
- Maddavo Share deletes those stations and prices and regenerates Station.csv / .prices files.
- Maddavo Share data is imported into TD with Station option
We now have the situation where the TD database (local TD.prices file) still has the prices for stations that were deleted on Maddavo Share, but those stations are NOT in the Station.csv that was just downloaded. So when the cache is rebuilt, those prices are orphaned and need a placeholder station.

The only way I have fixed this in TD is to either delete the stations/prices from the local TD.prices file (tedious) or redownload a fresh TD.prices (also tedious on current server). With the number of derps that we're all picking up, this is happening quite a bit.
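Until that's handled more gracefully, something like this rough sketch could prune the orphans from a local .prices file (it assumes the simple "@ SYSTEM/Station" block layout and the Station.csv header row; untested against real data, so back up first):

```python
import csv

def station_names(csv_lines):
    """Collect SYSTEM/STATION keys from Station.csv-style rows."""
    reader = csv.reader(csv_lines, quotechar="'")
    next(reader)  # skip the unq:name@... header row
    return {"{}/{}".format(row[0], row[1]).upper() for row in reader}

def prune_prices(price_lines, known):
    """Drop every '@ SYSTEM/Station' block whose station isn't in 'known'."""
    out, keep = [], True
    for line in price_lines:
        if line.startswith("@"):
            keep = line[1:].strip().upper() in known
        if keep:
            out.append(line)
    return out
```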
 
Small bug report:

Using TD 390e560e4a9c:
Code:
trade import --plug=maddavo
NOTE: Last download was ~3.07 days ago, downloading full file
Connecting to server: http://www.davek.com.au/td/prices.asp
import.prices: 13,257,259/13,257,259 bytes |  95.09KB/s | 100.00%
PLUGIN ERROR: Station ERAVATE/Ackerman Market has suspicious date: 2015-01-21 16:29:00 (newer than the import?)

This should not happen now. I've updated the EDDN filter so prices that are later than now (UTC-wise) are timestamped as now.
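The fix amounts to a clamp like this (my own sketch of the idea, not the actual filter code; the real filter would pass the current UTC time as "now"):

```python
from datetime import datetime

def clamp_timestamp(ts_str, now):
    """Pull a future-dated price timestamp back to 'now' (UTC).
    Timestamps use the 'YYYY-MM-DD HH:MM:SS' form of the .prices files."""
    ts = datetime.strptime(ts_str, "%Y-%m-%d %H:%M:%S")
    return min(ts, now).strftime("%Y-%m-%d %H:%M:%S")
```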

It was 16:59 local time (CET), but I don't know if the new prices were really imported, so I used the opportunity to upgrade TD to 0b6c8d25cd09 and run again:
Code:
trade import --plug=maddavo
NOTE: Stale/missing local copy, downloading full .prices file.
Connecting to server: http://www.davek.com.au/td/prices.asp
PLUGIN ERROR: Data is not Maddavo's prices list:

http://www.davek.com.au/td/prices.asp shows:
Code:
Error: Cannot find current TradeDangerous.prices
It may be temporarily unavailable, please check back later.

Yeah, this can happen - I'm trying to figure out how to avoid this. Basically someone was downloading the file when the server wanted to replace it with a newer one. I'm not happy with how this works at the moment. I know how to avoid it, but the prices filename that you get won't be consistent (I think). I'll do some testing.

Anyhoo, when this happens, the prices file will be available again within 3 minutes.
 
Hi All,

I just finished moving to another server. Let me know if it's faster for you to upload/download data. It should be, the old server was in my house at the end of an ADSL2+ connection far from the exchange. The uplink was being hammered in the last few days.

You do not need to change any urls or anything - all should just continue to work. If however some function is broken on the new server I've probably got some permission wrong or some web config - let me know and I'll fix it.

Amazing the data that we've got to use in TD now, I hope we can keep it clean. I added some more price sanity checks so we shouldn't get the silly 1M+ gains per hop anymore.

Cheers,
Maddavo
 
Edit: just saw your note on Atlassian Source Tree so have that installed now, yay!

So, there is git which, well, frak git. And then there is SourceTree, which is git without the "frak git" but with sugar and lightness and joy instead.

- - - - - Additional Content Posted / Auto Merge - - - - -

Hi All,

I just finished moving to another server. Let me know if it's faster for you to upload/download data. It should be, the old server was in my house at the end of an ADSL2+ connection far from the exchange. The uplink was being hammered in the last few days.

You do not need to change any urls or anything - all should just continue to work. If however some function is broken on the new server I've probably got some permission wrong or some web config - let me know and I'll fix it.

Amazing the data that we've got to use in TD now, I hope we can keep it clean. I added some more price sanity checks so we shouldn't get the silly 1M+ gains per hop anymore.

Cheers,
Maddavo

WOW!

Code:
$ /c/dev/trade/trade.py import --plug=maddavo
Connecting to server: http://www.davek.com.au/td/prices-2d.asp
import.prices: 2,818,039/2,818,039 bytes |   0.70MB/s | 100.00%

Before, I'd be lucky to get 45KB/s (note the capital B: bytes, not bits) - now it's 0.70MB/s, which is 5.6Mb/s.

So I deleted data/maddavo.stamp and pulled again, very nice:

Code:
$ /c/dev/trade/trade.py import --plug=maddavo
NOTE: Stale/missing local copy, downloading full .prices file.
Connecting to server: http://www.davek.com.au/td/prices.asp
import.prices: 14,837,898/14,837,898 bytes |   2.33MB/s | 100.00%
 
The only way I have fixed this in TD is to either delete the stations/prices from the local TD.prices file (tedious) or redownload a fresh TD.prices (also tedious on current server). With the number of derps that we're all picking up, this is happening quite a bit.
The derpage was strong with this one:
NOTE: Added local station placeholder for TARACH TOR/TRANQUEUTY (#5251)
NOTE: Added local station placeholder for TARACH TOR/TRANQUNLRY (#5252)
NOTE: Added local station placeholder for TARACH TOR/TRANQUWUTY (#5253)
I wonder - would it make sense to compute Hamming Distance between the Soundex encoding of strings to detect duplicates?
  1. Sort by Soundex of the station name
  2. Compute Hamming Distance amongst neighbors in the sort
  3. Very low Hamming Distance in the same system suggests duplication
  4. Rotate all the station name soundex strings one character (i.e. shift the first character to the tail end) and repeat steps 1-3. (Should catch cases where the first character was borked.)


Then you're left with the simple (ha!) exercise of determining the correct name. (In this case, Station.csv says it's TARACH TOR/Tranquillity, but I have to note that Tranquility is spelled with but a single letter ell.)
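Steps 1-3 could be sketched like this - an untested illustration using plain American Soundex, with a threshold that's purely a guess:

```python
def soundex(name):
    """American Soundex: first letter plus three digits."""
    codes = {**dict.fromkeys("BFPV", "1"), **dict.fromkeys("CGJKQSXZ", "2"),
             **dict.fromkeys("DT", "3"), "L": "4",
             **dict.fromkeys("MN", "5"), "R": "6"}
    name = "".join(c for c in name.upper() if c.isalpha())
    if not name:
        return "0000"
    out, prev = name[0], codes.get(name[0], "")
    for c in name[1:]:
        d = codes.get(c, "")
        if d and d != prev:
            out += d
        if c not in "HW":        # H and W don't break runs of the same code
            prev = d
    return (out + "000")[:4]

def hamming(a, b):
    """Count of positions where two equal-length codes differ."""
    return sum(x != y for x, y in zip(a, b))

def suspect_pairs(stations, max_dist=1):
    """Within one system's station list, flag name pairs whose Soundex
    codes are sort-neighbors within max_dist of each other (steps 1-3)."""
    coded = sorted((soundex(s), s) for s in stations)
    return [(a, b) for (ca, a), (cb, b) in zip(coded, coded[1:])
            if hamming(ca, cb) <= max_dist]
```

Step 4 would just wrap this in a loop that rotates each code one character and re-runs the sort-and-compare.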
 
The derpage was strong with this one:

I wonder - would it make sense to compute Hamming Distance between the Soundex encoding of strings to detect duplicates?
  1. Sort by Soundex of the station name
  2. Compute Hamming Distance amongst neighbors in the sort
  3. Very low Hamming Distance in the same system suggests duplication
  4. Rotate all the station name soundex strings one character (i.e. shift the first character to the tail end) and repeat steps 1-3. (Should catch cases where the first character was borked.)


Then you're left with the simple (ha!) exercise of determining the correct name. (In this case, Station.csv says it's TARACH TOR/Tranquillity, but I have to note that Tranquility is spelled with but a single letter ell.)

That wasn't OCR derp, and developing a system to do that while processing the .prices file - thousands or potentially millions of entries - is not a project I'd want to embark on; I had enough "fun" trying to do that when I was running a web-to-mail gateway in the '90s and some Chinese universities recognized me as a way to get their pr0n :)
 
Then you're left with the simple (ha!) exercise of determining the correct name. (In this case, Station.csv says it's TARACH TOR/Tranquillity, but I have to note that Tranquility is spelled with but a single letter ell.)

HA - you'd think so. I entered this station. Sadly in-game it really is "TRANQUILLITY".
Screenshot_0080.jpg
Serenity now!

- - - - - Additional Content Posted / Auto Merge - - - - -

There was an upload limit on the new server that was stopping uploads >200k. The Station.csv is a tick over 200k so quite a few of you found this. Fixed now.
 
Subject of adding stations -

I see you have a thread on bitbucket for submitting stations. What is your preference? Submitting via bitbucket, or uploading to Maddavo and letting nature take its course?
 
Small Problem with stations again:

I edit Station.csv to add some stations (I find this quicker/easier than using trade.py station), then upload the Station.csv to Maddavo, along with pricing to TD and Madd.

After a while I do a generic import from Maddavo (trade.py import --plug maddavo --opt syscsv --opt stncsv)

I get the following

Code:
NOTE: Added local station placeholder for CD-46 14401/KLEIN ENTERPRISE (#5492)
NOTE: Added local station placeholder for CD-46 14401/STITH RING (#5493)
NOTE: Added local station placeholder for LTT 9156/KOROLEV HUB (#5494)
NOTE: Added local station placeholder for NAHUA/FULLER VISION (#5495)

Those are the stations I most recently added - but they aren't entered into the local Station.csv.

So when I add the next stations and try to import data from those

Code:
D:\Gadgets\TradeDangerous\trade.py: D:\Gadgets\TradeDangerous\data\TradeDangerous.prices:210496 ERROR Unrecognized STAR/Station: "CD-46 14401/KLEIN ENTERPRISE"

I'm in an area of the Empire which doesn't seem to have ANY stations entered anywhere near me and this is going to be potentially painful. How can I work around this?

Adding stations using the trade.py station command seems not to cause this problem, so I can do that - but that's a hell of a command line when I could instead manually enter 4 short comma-separated fields.
 
Seems to be getting confused about Non-Lethal Weapons.

Code:
D:\Gadgets\TradeDangerous>trade.py buy --near contiku nonlethal
D:\Gadgets\TradeDangerous\trade.py: Item "nonlethal" could match Non-Lethal Weapons or nonlethalweapons

D:\Gadgets\TradeDangerous>trade.py buy --near contiku nonlethalw
D:\Gadgets\TradeDangerous\trade.py: Item "nonlethalw" could match nonlethalweapons or Non-Lethal Weapons

Same thing either way.
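Presumably the database has picked up both a clean and an OCR-mangled copy of the item. One way a lookup could collapse them - a sketch written with no knowledge of TD's actual matching code:

```python
def normalize(name):
    """Fold an item name to a comparison key: lowercase, alphanumerics only."""
    return "".join(c for c in name.lower() if c.isalnum())

def resolve(query, names):
    """Prefix-match query against names, collapsing entries that normalize
    to the same key, so 'Non-Lethal Weapons' and 'nonlethalweapons' count
    as one candidate instead of an ambiguity."""
    q = normalize(query)
    hits = {}
    for n in names:
        k = normalize(n)
        if k.startswith(q):
            hits.setdefault(k, n)   # keep the first spelling seen per key
    if len(hits) == 1:
        return next(iter(hits.values()))
    raise ValueError("ambiguous or unknown: " + query)
```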
 
Small Problem with stations again:

I edit Station.csv to add some stations (I find this quicker/easier than using trade.py station), then upload the Station.csv to Maddavo, along with pricing to TD and Madd.

After a while I do a generic import from Maddavo (trade.py import --plug maddavo --opt syscsv --opt stncsv)

I get the following

Code:
NOTE: Added local station placeholder for CD-46 14401/KLEIN ENTERPRISE (#5492)
NOTE: Added local station placeholder for CD-46 14401/STITH RING (#5493)
NOTE: Added local station placeholder for LTT 9156/KOROLEV HUB (#5494)
NOTE: Added local station placeholder for NAHUA/FULLER VISION (#5495)

Those are the stations I most recently added - but they aren't entered into the local Station.csv.

So when I add the next stations and try to import data from those

Code:
D:\Gadgets\TradeDangerous\trade.py: D:\Gadgets\TradeDangerous\data\TradeDangerous.prices:210496 ERROR Unrecognized STAR/Station: "CD-46 14401/KLEIN ENTERPRISE"

I'm in an area of the Empire which doesn't seem to have ANY stations entered anywhere near me and this is going to be potentially painful. How can I work around this?

Adding stations using the trade.py station command seems not to cause this problem, so I can do that - but that's a hell of a command line when I could instead manually enter 4 short comma-separated fields.

I don't know about all those problems you're having with Maddavo, but I have a little station.bat file in the same folder as trade.py to help with all those long switch names:

Code:
trade.py station --ls-from-star %1 --bm %2 --pad-size %3 %4 -%5

It's then called as (for example, to add a new station that's 123 ls from the star, has no black market, and a large pad size):

Code:
station.bat 123 N L "<system>/<stationname>" a

If you were updating a station you would use 'u' at the end rather than 'a'.

It's not perfect but it does help a little.
 