In-Development TradeDangerous: power-user trade optimizer

Oliver, I opened issue #238 to cover tradedangerous falling over when importing price data written by the latest version of the Regulated Noise OCR tool. Seems that tool decided to start writing timestamps in full ISO 8601 format, which includes an explicit offset from UTC. I think the .prices regex in cache.py is failing to match lines with those offsets, and I know the maddavo plugin, while successfully ignoring the timezone offset, is unhappy about seeing updates from users located in east-of-Zulu timezones, thinking they happened in the future.

Details in the issue, of course.

If I can get my Python regex knowledge up to snuff, I'll try submitting a PR with fixes. Not sure how well that'll go.
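
For reference, the shape of the fix I have in mind is roughly this (a sketch only, not the actual cache.py pattern):

Code:
import re

# Sketch: tolerate an optional ISO 8601 offset ("Z", "+02:00", "-0500")
# after the seconds field, instead of assuming a bare timestamp.
TIMESTAMP_RE = re.compile(
    r"(?P<date>\d{4}-\d{2}-\d{2})"
    r"[ T]"
    r"(?P<time>\d{2}:\d{2}:\d{2})"
    r"(?P<offset>Z|[+-]\d{2}:?\d{2})?"
)

for stamp in ("2015-05-31 12:34:56", "2015-05-31T12:34:56+02:00"):
    print(stamp, "->", TIMESTAMP_RE.search(stamp).groupdict())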
 
Oliver, I opened issue #238 to cover tradedangerous falling over when importing price data written by the latest version of the Regulated Noise OCR tool. Seems that tool decided to start writing timestamps in full ISO 8601 format, which includes an explicit offset from UTC. I think the .prices regex in cache.py is failing to match lines with those offsets, and I know the maddavo plugin, while successfully ignoring the timezone offset, is unhappy about seeing updates from users located in east-of-Zulu timezones, thinking they happened in the future.

Details in the issue, of course.

If I can get my Python regex knowledge up to snuff, I'll try submitting a PR with fixes. Not sure how well that'll go.

Looks like Mad already fixed? :)
 
Hi,

Possible bug or just user error on my part here:

Code:
trade.py local -vv -ly 0 'Persephone'

Gives me:

Code:
trade.py: System/Station "Persephone" could match PERSEPHONE, AVALON/Persephone or ERLIK/Persephone

Edit: Figured it out. Persephone/ will give you the right star system :)
 
Looks like Mad already fixed? :)
Yep, Mad fixed it. If the "contract" for his output is "all times are always UTC, without decoration" (which appears to be the case), then you don't need to do anything. The "defensive programmer" in me wants to add ISO 8601 time-format support to TD just in case, but I'm ruthlessly stamping out that urge.
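
If I ever cave in to that urge, the normalization would look something like this (a rough sketch, not committed code):

Code:
from datetime import datetime, timezone

def to_utc(stamp):
    # Sketch: accept a bare "YYYY-MM-DD HH:MM:SS" (assumed UTC) or a
    # full ISO 8601 stamp with an offset, normalizing everything to UTC.
    try:
        dt = datetime.strptime(stamp, "%Y-%m-%d %H:%M:%S")
        return dt.replace(tzinfo=timezone.utc)
    except ValueError:
        dt = datetime.strptime(stamp, "%Y-%m-%dT%H:%M:%S%z")
        return dt.astimezone(timezone.utc)

print(to_utc("2015-05-31T12:34:56+0200"))  # -> 2015-05-31 10:34:56+00:00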
 
Looks like Mad already fixed? :)

Yep - my parser broke over the weekend with the RN v1.84_0.23 non-UTC timezones coming via EDDN. I updated the parser to accept those but didn't regenerate the affected lines... this resulted in some price lines being out of whack for a few hours last night until I regenerated them. The price lines are all UTC now.
 
Yep - my parser broke over the weekend with the RN v1.84_0.23 non-UTC timezones coming via EDDN. I updated the parser to accept those but didn't regenerate the affected lines... this resulted in some price lines being out of whack for a few hours last night until I regenerated them. The price lines are all UTC now.

In fact, they never sent UTC timestamps ;) They just didn't say so ^^
I received some updates from RN with timestamps in the future, which is why I told them either to work in UTC or to add an ISO-compliant timezone offset.
 
Hi,

Possible bug or just user error on my part here:

Code:
trade.py local -vv -ly 0 'Persephone'

Gives me:

Code:
trade.py: System/Station "Persephone" could match PERSEPHONE, AVALON/Persephone or ERLIK/Persephone

Edit: Figured it out. Persephone/ will give you the right star system :)

"@Perseph" would also work ('@' means 'star name', '/' means 'station name').

- - - Updated - - -

7.1.1

Just landed 7.1.1: a minor change to edscupdate, but I also verified and added over 500 new systems in the central region (sol/lave/achenar/shinrarta/new yembo); these are the most pertinent to trading. There are stars in the queue 25k light years from anywhere, with no stations out there, so there's no use having them in the db yet -- they're in the queue, but for now...
 
Just came across this tool, and I absolutely LOVE it...

I've been looking through your source, and hope I can eventually write code half that clean/organized/functional.

Do you think there would be much room for performance improvement by leveraging pandas to store the database? It could complicate the bundling, but one thing about pandas is that it is really really fast when working with large dataframes.
 
Just came across this tool, and I absolutely LOVE it...

I've been looking through your source, and hope I can eventually write code half that clean/organized/functional.

Do you think there would be much room for performance improvement by leveraging pandas to store the database? It could complicate the bundling, but one thing about pandas is that it is really really fast when working with large dataframes.

After reading the introduction, I don't see anything specific that would help.

You can often improve performance by tweaking command-line parameters depending on what your ship can or can't do. If you're a typical trader with a cargo scoop who doesn't mind a few jumps between stations, you can replace "--ly 10.5 --jumps 3" with "--ly 31.5 --jumps 1". Now you're going to get some runs that are more than 3 jumps, but TD will be much more efficient. The problem is that when you specify multiple jumps, for every system TD reaches it has to ripple outwards again. That could perhaps be optimized by remembering the 1-hop reach of every system encountered so that it can be referenced on subsequent hops; I'll look at implementing that sometime this week.
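
Rough shape of what I mean, with toy data standing in for the real distance query (a sketch, not TD code):

Code:
from functools import lru_cache

# Toy neighbor table for illustration; TD would run a real
# "systems within N ly" query here.
NEIGHBORS = {1: (2, 3), 2: (1, 4), 3: (1,), 4: (2, 5), 5: (4,)}

@lru_cache(maxsize=None)
def one_jump_reach(system_id):
    # Computed once per system, then reused on every subsequent hop.
    return frozenset(NEIGHBORS.get(system_id, ()))

def reach(origin, jumps):
    frontier, seen = {origin}, {origin}
    for _ in range(jumps):
        frontier = {n for s in frontier for n in one_jump_reach(s)} - seen
        seen |= frontier
    return seen

print(reach(1, 2))  # {1, 2, 3, 4}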

But most of the CPU is spent comparing the buy/sell lists at stations; reducing the number of stations that get compared, or the number of items, improves performance. There are several ways that could be optimized. Unfortunately, TD is solving an NP-hard problem, which eliminates many of the most obvious approaches, but one I've contemplated toying with is making the buy/sell lists contiguous NumPy arrays ordered by itemID, so that I can use vector operations to ask "how many of each of these can I buy?" and "what will the gain/ton be?"
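
In vector form the idea is roughly this (made-up numbers, not TD's actual data layout):

Code:
import numpy as np

# Buy list at the source, sell list at the destination, both
# contiguous arrays ordered by itemID (values made up).
buy_ids  = np.array([1, 4, 7])
buy_cr   = np.array([100, 250, 900])
sell_ids = np.array([1, 4, 7, 9])
sell_cr  = np.array([130, 240, 1100, 50])

# Align on the common itemIDs; gain/ton is then one vector subtraction.
common, bi, si = np.intersect1d(buy_ids, sell_ids, return_indices=True)
gain = sell_cr[si] - buy_cr[bi]
print(common[gain > 0], gain[gain > 0])  # items 1 and 7 are profitable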

But, again, we have a form of backtracking, so it's very difficult to do most of the easy optimizations: we have to get all destinations for each origin in a hop at once. Python threading wouldn't help because we're not I/O bound, and Python's multiprocessing wouldn't help because the memory footprint would be fairly large. We could work around that with messaging (ZeroMQ or similar) and push the data out, but the overhead of moving the data would eat into the gains - although all we'd have to send is [credits, capacity, limit, srcStnID, [buyItems], dstStnID, [sellItems]]. The item lists either have to be [itemID, cr] pairs, or we have to cache the lists so that we only send the costs.
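
For scale, that work unit serializes to almost nothing (illustrative values only):

Code:
import pickle

# The payload described above, with made-up values; item lists are
# (itemID, cr) pairs.
payload = (
    100000,                  # credits
    200,                     # capacity
    10,                      # limit
    1234,                    # srcStnID
    [(1, 100), (4, 250)],    # buyItems
    5678,                    # dstStnID
    [(1, 130), (4, 240)],    # sellItems
)
print(len(pickle.dumps(payload)), "bytes per origin/destination pair")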
 
After reading the introduction, I don't see anything specific that would help.

You can often improve performance by tweaking command-line parameters depending on what your ship can or can't do. If you're a typical trader with a cargo scoop who doesn't mind a few jumps between stations, you can replace "--ly 10.5 --jumps 3" with "--ly 31.5 --jumps 1". Now you're going to get some runs that are more than 3 jumps, but TD will be much more efficient. The problem is that when you specify multiple jumps, for every system TD reaches it has to ripple outwards again. That could perhaps be optimized by remembering the 1-hop reach of every system encountered so that it can be referenced on subsequent hops; I'll look at implementing that sometime this week.

But most of the CPU is spent comparing the buy/sell lists at stations; reducing the number of stations that get compared, or the number of items, improves performance. There are several ways that could be optimized. Unfortunately, TD is solving an NP-hard problem, which eliminates many of the most obvious approaches, but one I've contemplated toying with is making the buy/sell lists contiguous NumPy arrays ordered by itemID, so that I can use vector operations to ask "how many of each of these can I buy?" and "what will the gain/ton be?"

But, again, we have a form of backtracking, so it's very difficult to do most of the easy optimizations: we have to get all destinations for each origin in a hop at once. Python threading wouldn't help because we're not I/O bound, and Python's multiprocessing wouldn't help because the memory footprint would be fairly large. We could work around that with messaging (ZeroMQ or similar) and push the data out, but the overhead of moving the data would eat into the gains - although all we'd have to send is [credits, capacity, limit, srcStnID, [buyItems], dstStnID, [sellItems]]. The item lists either have to be [itemID, cr] pairs, or we have to cache the lists so that we only send the costs.

Thanks for the detailed reply. I'm just starting to get into the world of scientific computing and data analysis, and pandas is the default tool to use, but for the reasons you specified, it may not add much, and it would further complicate things by requiring an extra package to be installed.

On another subject, I do have a feature request. When using the 'buy' command, it lists stations and the light-years to the respective systems, but it does not list how far the stations are from the navigation point in light-seconds; that information would be useful in that view.

If you would like me to try and make that change and submit a pull request, let me know.
 
On another subject, I do have a feature request. When using the 'buy' command, it lists stations and the light-years to the respective systems, but it does not list how far the stations are from the navigation point in light-seconds; that information would be useful in that view.
Add the "-vv" option (verbose, doubled) and you'll get station distance from the primary. Not as clean a tabular form of output as the non-verbose format, but it's there.
 
Powerplay is going to make the data more volatile methinks.

Black Markets opening/closing.
Price hikes/discounts on commodities.
 
Hey... Trade Dangerous said it wanted me to post a screencap because Gold was out of the expected ranges. So... here it is! [Attachment: Screen Shot 2015-06-01 at 23.34.44.jpg]

- - - Updated - - -

Hey... Trade Dangerous said it wanted me to post a screencap because Gold was out of the expected ranges. So... here it is!
Okay, this is the weirdest thing... when I went back into the screen, the gold sell price went up to 9270. I did not leave the station... in fact, I didn't even leave the commodities screen! It just changed on me! Annoyingly, I'd sold a bunch of gold to this station at the cheap price! Very weird.
 
WOW. So AnthorNet just uploaded a HUUUUGGGEEE Station.csv file with an additional 20k+ stations - doubling our stations.

Be prepared for a slight delay the next time you do an update!

Sooo much to explore....
 
WOW. So AnthorNet just uploaded a HUUUUGGGEEE Station.csv file with an additional 20k+ stations - doubling our stations.

Be prepared for a slight delay the next time you do an update!

Sooo much to explore....

This must be from the data hacked from the FD servers a while ago. We have been avoiding using that as it seems less than ethical.
 
This must be from the data hacked from the FD servers a while ago. We have been avoiding using that as it seems less than ethical.

Hi smacker,
I parsed a lot of sources to build my database, but not that one, even though I saw it mentioned on the EDDB IRC.
And I wouldn't go that way!
 