In-Development TradeDangerous: power-user trade optimizer

I have recently been asked about "old prices" that persist at a station for items the station no longer buys or sells. The following is an explanation of the scenario and some advice on how to resolve it.

Firstly, the situation isn't specific to crowdsourced data - it can happen with purely local data that is updated using the import command. Using the import command, a user adds data for a station - this might be from EliteOCR, ED Market Connector or Maddavo Market Share data. Then at some later time the data for that station is updated via the import command, but a commodity that was listed the first time is now missing. This means prices get updated for all the commodities except the missing one: its price remains in the local data with the original timestamp. This situation can lead to some undesirable trade routes being generated, where you are told to deliver a load of something that you can no longer sell for a profit at the station - or worse, that is now illegal there. Then you have to work out what to do with your cargo and generate a whole new trade run. But that run may be tainted in the same way, causing you to try to buy or sell non-existent or unwanted items.

The TD import command used to delete these old prices - essentially overwriting all data for a station. But as of a few versions ago, the import command merges data and there is no option to clobber old data.

So, the price database needs to be 'cleaned' of old prices. On Maddavo Market Share, a routine runs every 3 hours that deletes any price that is 2 days older than the latest price at that station. This has the effect that prices for a station are all timestamped within a 2-day window. Ages ago this timeframe was 14 days, then it was reduced to 7 days, and now it is 2 days. Initially we were updating data completely manually and people didn't update all the prices for a station, so a 14-day window was good. But now the vast majority of price updates are automated, so even a 2-day window is probably stretching it. But I digress...

The price 'cleaning' on MMS just makes sure that anyone using that data isn't downloading old prices. BUT it doesn't do anything to the old prices in your local database - you still need to 'clean' your local data, and there is no command that does this automatically. You can manually UPDATE prices for a station - I use the command (trade update "SYSTEM/Station" --sub -T) to open Sublime and look at the timestamp column to see if any prices are old compared to the others. This is extremely tedious and has to be done station by station.
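As a stopgap for eyeballing each station by hand, here is a rough, read-only Python sketch that flags stations still carrying prices older than a given window. The stale_stations helper name is my own invention, not a TD command, and it assumes the StationItem table's station_id and modified columns (the same ones the cleanup query below relies on):

```python
import sqlite3

def stale_stations(conn, window="-2 day"):
    """Return (station_id, stale_count) pairs for stations that still
    carry prices more than `window` older than their newest price.
    Read-only: nothing is deleted."""
    return conn.execute("""
        SELECT si.station_id, COUNT(*) AS stale_count
          FROM StationItem AS si
         WHERE datetime(si.modified) < datetime(
                 (SELECT MAX(modified)
                    FROM StationItem
                   WHERE station_id = si.station_id), ?)
         GROUP BY si.station_id
    """, (window,)).fetchall()
```

Run it against a copy of your TradeDangerous.db (e.g. conn = sqlite3.connect("TradeDangerous.db")) and then review the flagged stations with the update command as above.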

Sick of this, I created the following SQL query to 'clean' the local database. It deletes all prices that are more than 2 days older than the latest price at their respective stations. I use DB Browser for SQLite to open the TradeDangerous.db file and run the query. The query is:

Code:
DELETE FROM StationItem
WHERE StationItem.station_id || '/' || StationItem.item_id IN
  (SELECT si.station_id || '/' || si.item_id
     FROM StationItem AS si
     INNER JOIN (SELECT MAX(modified) AS StationTimestamp, station_id
                   FROM StationItem
                  GROUP BY station_id) AS MyQ
        ON MyQ.station_id = si.station_id
    WHERE datetime(si.modified) < datetime(MyQ.StationTimestamp, '-2 day'))

The "-2 day" at the end might be substituted with some other timeframe like "-6 hour".

I am hopeless at Python. Could someone please make a Python command that will do that, OR better, modify the import command to have an option of running this after an import (or an option to clobber previous data for a station would achieve the same thing - actually better).
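For reference, the query above can be driven from Python's built-in sqlite3 module without opening DB Browser. This is only a sketch: the clean_old_prices helper name is hypothetical (not part of TD), and it assumes the StationItem columns (station_id, item_id, modified) used by the query above:

```python
import sqlite3

# Same cleanup as the SQL query above, with the time window parameterized
# so "-2 day" can be swapped for e.g. "-6 hour".
CLEAN_SQL = """
DELETE FROM StationItem
 WHERE station_id || '/' || item_id IN (
    SELECT si.station_id || '/' || si.item_id
      FROM StationItem AS si
      JOIN (SELECT station_id, MAX(modified) AS latest
              FROM StationItem
             GROUP BY station_id) AS newest
        ON newest.station_id = si.station_id
     WHERE datetime(si.modified) < datetime(newest.latest, ?)
 )
"""

def clean_old_prices(conn, window="-2 day"):
    """Delete prices older than `window` relative to each station's
    newest price. Returns the number of rows removed."""
    cur = conn.execute(CLEAN_SQL, (window,))
    conn.commit()
    return cur.rowcount
```

Point it at your data file, e.g. conn = sqlite3.connect("TradeDangerous.db"); print(clean_old_prices(conn)) - and keep a backup of the .db first, since the DELETE is not undoable.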

Cheers,
Maddavo
 
That's because the maddavo plugin sets "tdenv.mergeImport" to "True".

Normally that would only happen when you specify "--merge-import" (or "-M") on the command line, but the plugin always sets it to True.

If you don't want this, just delete the line "tdenv.mergeImport = True" (file: plugins/maddavo_plug.py, Line: 695).
 
Aha! Superb.
 

wolverine2710

Tutorial & Guide Writer
In the past, your excellent tool and thread have been added to EDCodex. Before EDCodex was released on the 17th of August, you received (in the period of the 5th - 10th of August) a PM with an invitation and a special link. After registering and logging in you would automatically become the owner of your entry. According to the admin tool(s) you haven't used the special link (yet). Perhaps you missed the PM or have been (temporarily) away from ED. It's also possible you chose not to claim your entry. Note: it's also possible to assign another commander editing rights for your entry. In either case, please send us a PM. You can find your EDCodex entry here.

Alternative way to get ownership
The special link will cease to function in the (near) future, for security reasons. Should you want to become the owner of your entry after that point, you can use the "Claim ownership" button. In that case, please send biobob or myself a PM with the email address you used for registration - for verification. You can also use this procedure if you no longer have the PM.

What is EDCodex:
It's a website with a database of currently approx. 215+ tools, threads, websites and videos for ED. Anyone can add entries there, and is encouraged to. EDCodex is and should be community-driven. There is also an EDCodex companion thread. It's equally suited to PCs, tablets and smartphones, and has RSS feeds.

With kind regards,
Biobob
Wolverine2710
 
With the v1.4 CQC update, you will need a new Ship.csv file with the Imperial Eagle, Federal Gunship and Federal Assault Ship.

You can get a new Ship.csv from HERE. It goes in your data folder.

Cheers,
Maddavo
 
I don't know whether FDev have turned off the API to relieve server load, but it doesn't seem to be working for me at the moment. Deleting the cookie file and reauthenticating tells me that authentication is working but access is being denied. I still have my database, but I will quickly be lost without updating it, and I really don't like crowdsourced data. I hope it isn't over, as trading is nonsensical using only in-game information.
 
Yeah, new logins appear to be broken in the API. I can see some updates flowing through EDDN, so it looks like some existing tokens are working, but if your token expires you cannot re-authenticate right now.

Give it a few days. The same thing happened with the 1.3 launch. Eventually FD fixed it.
 
Hi, I've been trying to understand how tradedangerous works, but I'm unable to grasp it. How is the goodness of a trade loop determined, and what does the -s parameter do with trade.py run? To my understanding it looks for the best possible trade within -s hops of the given --from point? If that is the case, then I don't understand these results.

If I run trade.py run --credits 21m --capacity 104 --ly-per 17.0 --avoid FACECE -vv --ls-max 1300 --hops 2 --from "cemiess" --loop -s 10
I get the following results:
LTT 8517/Howe Ring -> LTT 8517/Howe Ring (score: 195295.146800)
Load from LTT 8517/Howe Ring (921ls, BMk:N, Pad:M, Shp:Y, Out:Y, Ref:Y):
104 x Imperial Slaves 13 766cr vs 16 274cr, 34 hrs vs 26 hrs
Unload at TEAKA/Maccurdy City (132ls, BMk:N, Pad:L, Shp:Y, Out:Y, Ref:Y) => Gain 260 832cr (2 508cr/ton) => 21 260 832cr
Load from TEAKA/Maccurdy City (132ls, BMk:N, Pad:L, Shp:Y, Out:Y, Ref:Y):
104 x Superconductors 6 224cr vs 7 471cr, 26 hrs vs 34 hrs
Unload at LTT 8517/Howe Ring (921ls, BMk:N, Pad:M, Shp:Y, Out:Y, Ref:Y) => Gain 129 688cr (1 247cr/ton) => 21 390 520cr
----------------------------------------------------------------------------
Finish at LTT 8517/Howe Ring (921ls, BMk:N, Pad:M, Shp:Y, Out:Y, Ref:Y) gaining 390 520cr (1 877cr/ton) => est 21 390 520cr total

All fine and good, to my understanding above is the best possible trade loop within 170 ly from CEMIESS.

But if I run the exact same command like this: trade.py run --credits 21m --capacity 104 --ly-per 17.0 --avoid FACECE -vv --ls-max 1300 --hops 2 --from "cemiess" --loop -s 3

I get the following results:
CEMIESS/Jones Hub -> CEMIESS/Jones Hub (score: 206703.758560)
Load from CEMIESS/Jones Hub (1.21Kls, BMk:N, Pad:M, Shp:Y, Out:Y, Ref:Y):
104 x Beryllium 7 600cr vs 9 086cr, 25 hrs vs 26 hrs
Unload at LTT 9810/Whitford Vision (107ls, BMk:N, Pad:M, Shp:Y, Out:Y, Ref:Y) => Gain 154 544cr (1 486cr/ton) => 21 154 544cr
Load from LTT 9810/Whitford Vision (107ls, BMk:N, Pad:M, Shp:Y, Out:Y, Ref:Y):
104 x Imperial Slaves 13 785cr vs 16 275cr, 26 hrs vs 25 hrs
Unload at CEMIESS/Jones Hub (1.21Kls, BMk:N, Pad:M, Shp:Y, Out:Y, Ref:Y) => Gain 258 960cr (2 490cr/ton) => 21 413 504cr
----------------------------------------------------------------------------
Finish at CEMIESS/Jones Hub (1.21Kls, BMk:N, Pad:M, Shp:Y, Out:Y, Ref:Y) gaining 413 504cr (1 988cr/ton) => est 21 413 504cr total

This latter result is way better than the first one: it gives better profit, and even tradedangerous' internal score calculation seems to give the latter a higher score. So what am I to do with this tool, since it does not seem very reliable - or am I using it wrong?
 
Hi, I've been trying to understand how tradedangerous works, but I'm unable to grasp it. How is the goodness of a trade loop determined, and what does the -s parameter do with trade.py run? To my understanding it looks for the best possible trade within -s hops of the given --from point? If that is the case, then I don't understand these results.

trade.py run --credits 21m --capacity 104 --ly-per 17.0 --avoid FACECE -vv --ls-max 1300 --hops 2 --from "cemiess" --loop -s 10
gaining 390 520cr (1 877cr/ton) => est 21 390 520cr total

trade.py run --credits 21m --capacity 104 --ly-per 17.0 --avoid FACECE -vv --ls-max 1300 --hops 2 --from "cemiess" --loop -s 3
gaining 413 504cr (1 988cr/ton) => est 21 413 504cr total

This latter result is way better than the first one: it gives better profit, and even tradedangerous' internal score calculation seems to give the latter a higher score. So what am I to do with this tool, since it does not seem very reliable - or am I using it wrong?

It does seem a bit odd - but there may be something going on with the jump range numbers: you have specified the starting jumps but no unladen jump range. You may be better off getting the GUI to generate the commands for the runs - check out TradeDangerous Helper at https://bitbucket.org/WombatFromHell/trade-dangerous-helper
 
I'm thinking it's because that first hop in example 1 is the most profitable hop, so it starts with that one and the choices are limited on the second hop. Could be a bug. I've noticed this sort of behavior myself. Usually giving it more hops to play with gives better results.

Anyone know what's up with KFSOne recently? He's been terribly quiet.

EDIT: Also -- The API appears to be working again!
 
After fiddling with the tool some more, it definitely looks like it is suffering from some kind of bug when using loops. The optimal route seems to differ based on the -s and --hops parameters. -emp only affects the range from which optimal trades are sought (in effect, if you set -emp to e.g. 25, then even a lower number for -s will cause worse calculation results). I've started to wade through the code, but it's a tad slow since it's totally new to me and I'm a C++ coder myself with only a little knowledge of Python. I wish I had more time to invest in this. Maybe the way the nodes and links are calculated is the problem, but I need to dig deeper. I wonder if there is a way to unit test the methods; that way bugs could be found more easily.

Increasing the number of hops does seem to help, in the sense that cemiess <-> ltt 9810 seems to be the best route detected; only the source and destination stations seem to fluctuate a bit. Here is another example:
trade.py run --credits 21m --capacity 104 --ly-per 17.0 --avoid FACECE -vv --loop --ls-max 1300 -s 3 --hops 8 --from "cemiess"
NOTE: Pruned 2538 origins too far from any end stations
NOTE: Pruned 4718 origins too far from any end stations
NOTE: Pruned 2655 origins too far from any end stations
CEMIESS/Jones Hub -> CEMIESS/Jones Hub (score: 220162.359560)
Traffic happens between Cemiess and Ltt 9810

but with trade.py run --credits 21m --capacity 104 --ly-per 17.0 --avoid FACECE -vv --loop --ls-max 1300 -s 20 --hops 8 --from "cemiess"
NOTE: Pruned 275 origins too far from any end stations
NOTE: Pruned 4205 origins too far from any end stations
NOTE: Pruned 8214 origins too far from any end stations
LTT 9810/Gutierrez Terminal -> LTT 9810/Gutierrez Terminal (score: 211305.200080)
...
Load from LTT 9810/Trimble Vision (187ls, BMk:N, Pad:M, Shp:N, Out:N, Ref:Y):
104 x Imperial Slaves 13 794cr vs 16 468cr, 2 days vs 68 days
Unload at SELKANDE/Lewis Gateway (540ls, BMk:N, Pad:L, Shp:Y, Out:Y, Ref:Y) => Gain 278 096cr (2 674cr/ton) => 22 612 208cr
Load from SELKANDE/Lewis Gateway (540ls, BMk:N, Pad:L, Shp:Y, Out:Y, Ref:Y):
104 x Crop Harvesters 1 866cr vs 2 620cr, 68 days vs 2 days
Unload at LTT 9810/Gutierrez Terminal (469ls, BMk:N, Pad:L, Shp:Y, Out:Y, Ref:
...
Note that this latter route has a lower score, and for some reason we wind up visiting SELKANDE for worse results. Unless there is some kind of hidden mechanism that makes repeating a->b->a->b->a worse than sometimes visiting system c in between? That would kind of make sense - wasn't it Oliver who stated that running only this a<->b route repeatedly will just make you shoot yourself in the foot, since the value of the goods will drop quite fast, or something like that?

Btw, I'm using git version 81090cb63890 for testing, I think - I assume the latter part of /kfsone-tradedangerous-81090cb63890 refers to the git hash. (I didn't manage to download the source using git because:
jagge@censored:~$ git clone git@bitbucket.org:kfsone/tradedangerous.git
Cloning into 'tradedangerous'...
Permission denied (publickey).
fatal: Could not read from remote repository.

Please make sure you have the correct access rights
and the repository exists.)
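On the unit-testing wish above: the database-facing pieces can already be exercised with Python's built-in unittest module against an in-memory SQLite database. As a hedged illustration (this is not TD code, just an example of the approach), here is how the stale-price cleanup query posted earlier in the thread could be tested, with the time window parameterized:

```python
import sqlite3
import unittest

# Parameterized version of the stale-price cleanup query posted earlier.
CLEAN_SQL = """
DELETE FROM StationItem
 WHERE station_id || '/' || item_id IN (
    SELECT si.station_id || '/' || si.item_id
      FROM StationItem AS si
      JOIN (SELECT station_id, MAX(modified) AS latest
              FROM StationItem
             GROUP BY station_id) AS newest
        ON newest.station_id = si.station_id
     WHERE datetime(si.modified) < datetime(newest.latest, ?)
 )
"""

class CleanOldPricesTest(unittest.TestCase):
    def setUp(self):
        # Fresh in-memory database per test; no real .db file is touched.
        self.conn = sqlite3.connect(":memory:")
        self.conn.execute(
            "CREATE TABLE StationItem"
            " (station_id INTEGER, item_id INTEGER, modified TEXT)")

    def test_deletes_only_stale_rows(self):
        self.conn.executemany(
            "INSERT INTO StationItem VALUES (?, ?, ?)",
            [(1, 10, "2015-10-01 12:00:00"),
             (1, 11, "2015-09-25 12:00:00"),   # >2 days older: stale
             (2, 10, "2015-10-01 12:00:00")])
        deleted = self.conn.execute(CLEAN_SQL, ("-2 day",)).rowcount
        self.assertEqual(deleted, 1)
        left = self.conn.execute(
            "SELECT COUNT(*) FROM StationItem").fetchone()[0]
        self.assertEqual(left, 2)
```

Save it as a file and run it with python -m unittest <file>. The route planner itself would need its methods factored to take plain inputs before it could be tested this way, but the SQL side can be covered today.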
 
I was thinking of maintaining a fork on github while kfsone is out of action (hope all is well). I don't really have the time to support it, but if someone else feels up to it I'm sure a few of us could contribute. I could certainly help with setting up some unit tests; the Python tools for Visual Studio seem to have come a long way since I last played.
 
FYI: We just passed 20,000 stations with markets that we have prices for (albeit some are very old now...)
Amazingly, there are still about 16,000 stations out there with markets that we don't have prices for yet - haha.
 
I've been slowly updating market price data for stations with large landing pads that are within 1000 Ls and missing data. I wish more commanders would update data on these stations that don't yet have price data.
 
With the EDAPI I am updating every station and outpost I land at, regardless of whether I am trading or not. Good data is important to everyone. Hate making a 90 ly trip for a ship that isn't there ;)
 
I'm specifically going out to those stations that are missing price data. I'm using Maddavo's great edit stations page to filter them out; it's very easy and enjoyable to find the next nearest one.

If everyone just updates the existing trade-route stations, no new markets are ever opened. It's very important (kind of like filling in the missing distance-from-star values) to go and find those stations.

I roughly remember the number of <1000 Ls large-pad stations being 4700 when Maddavo told me about the edit stations page's new features; today it's already 5370 - a nice increase in new markets.

Quick tutorial for anyone who wants to help:
1) go to the edit stations page and type in the system you're at
2) choose MaxPadSize: L, MaxLsFromStar: 1000 and tick the 0Ls box
3) choose the first system in the "Nearest stations missing prices" box (it doesn't have to be the first, but it's the closest one)
3a) if the station(s) are missing distance-from-star data, please use the system map to fill it in
4) travel there, dock at the station and use (for example) market connector to update the market prices
5) jump back to step 3
All done, everyone benefits :)
 
For all of you using the EDAPI plugin for Trade Dangerous, you may want to grab version 3.5.1 of the plugin from my repo. It adds the 1.4 ships, and EDDN support for shipyards and outfitting.

There's a pull request open against TD, but I'm not sure when kfsone will return. :(
 