In-Development TradeDangerous: power-user trade optimizer

Neither do I, but I'd be interested in helping get a mirror/backup off the ground if possible. What would be entailed in mirroring the site, or in getting similar functionality using JSONs or the like? My programming skills are limited to statistical programming; I have no experience with websites or internet communication :(
 
It seems that Oliver (kfsone), bernd (gazelle), and Dave (maddavo) have all moved on to bigger and better things, and G-d bless them for it! Is there anyone who uses Trade Dangerous who would be able to figure out how the imports work (perhaps based on the edapi plugin) and who could engineer a way for us to use the master EDDB database (which is constantly being updated by those of us running EDMC or EDDiscovery and the like) instead of maddavo's? As EDDB is updated in real time, I reckon it's more accurate than maddavo's, but they may want us (or whoever) to create a semi-static database which only pulls maybe once an hour, so as to minimize traffic. I don't know :) Perhaps there is a charitable soul at EDDB or EDSM etc. who would be willing to help us out? Is there a developers' forum for this or a related database where we can ask?
 
edapi plugin

hi all,

is it not possible to use this plugin in an easy way?
(Avi0013 also mentioned it many times)
the data is already there!

i still love TD (or would love it even more again :p)
if only one gifted person could do a little programming
to make it a little more usable again (speed especially),
i would love this person too, forever [yesnod]

greetings and thank you
(beyond all things)
nepo
 
I have not abandoned TD. I'm just too lazy to change/add things if I don't need them. I'm only using TD for my private dataset, and for that the EDAPI plugin is enough.

Let's see what's needed to make an EDDB importer:
  • Download listings.csv (~141,439 kb)
  • Where is that commodity_id?
    • Integrate commodity_id from EDDB into TD
    • Sync commodities.json (~99 kb)
  • Ok, but station_id is still missing.
    • Integrate station_id from EDDB into TD
    • Sync stations.jsonl (~112,158 kb)
    • But the stations only have a system_id?
      • Integrate system_id from EDDB into TD
      • Sync systems_populated.jsonl (~21,040 kb)
  • Now we are ready to import the prices from EDDB.

As you can see, that's some work to do. And in the worst case you'll have to download ~274,736 kb every day from EDDB. Don't know if themroc (EDDB) would like that.
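Sketched in Python, the id-resolution chain above looks roughly like this. The field names (`id`, `name`, `system_id`, `station_id`, `commodity_id`) are assumptions based on the dump layout described above, not checked against the real files:

```python
import json

def build_maps(commodities_json, stations_lines, systems_lines):
    """Build id -> name lookups from the EDDB dumps.

    commodities_json: parsed commodities.json (a list of dicts)
    stations_lines / systems_lines: iterables of JSON-lines strings
    """
    commodities = {c["id"]: c["name"] for c in commodities_json}
    systems = {}
    for line in systems_lines:
        sys_rec = json.loads(line)
        systems[sys_rec["id"]] = sys_rec["name"]
    stations = {}
    for line in stations_lines:
        st = json.loads(line)
        # stations carry only a system_id, hence the systems map above
        stations[st["id"]] = (st["name"], systems.get(st["system_id"]))
    return commodities, stations

def resolve_listings(listings_rows, commodities, stations):
    """Resolve listings.csv rows (e.g. from csv.DictReader) into names."""
    for row in listings_rows:
        station_name, system_name = stations[int(row["station_id"])]
        yield (system_name, station_name,
               commodities[int(row["commodity_id"])],
               int(row["sell_price"]), int(row["buy_price"]))
```

This is only a sketch of the joins involved; a real importer would also have to carry the ids into TD's own tables rather than just resolving names.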
 

Since Maddavo's website is now defunct, what would be involved in taking over that functionality from him and providing a replacement site?
 

Maddavo built the market share site by himself and never shared the code (with me), so I can't tell you what would be needed. Abstractly speaking, you would need:
  • a 24/7 server
  • an EDDN listener
  • some database
  • a processing program
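As a sketch of the listener piece: EDDN delivers zlib-compressed JSON over a ZeroMQ relay, so the core of a listener is just a SUB socket plus this decode step. The relay address below is the public EDDN endpoint; everything else is an illustrative assumption:

```python
import json
import zlib

def decode_eddn(raw: bytes) -> dict:
    """EDDN relay messages are zlib-compressed UTF-8 JSON."""
    return json.loads(zlib.decompress(raw).decode("utf-8"))

# A listener would subscribe to the relay with pyzmq (not run here):
#
#   import zmq
#   ctx = zmq.Context()
#   sub = ctx.socket(zmq.SUB)
#   sub.setsockopt_string(zmq.SUBSCRIBE, "")   # no topic filter
#   sub.connect("tcp://eddn.edcd.io:9500")
#   while True:
#       msg = decode_eddn(sub.recv())
#       ...  # filter on msg["$schemaRef"], feed the processing program
```

The database and processing program would then consume these decoded messages.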
 
Hi, gazelle. Still no thought about Oliver's approved changes to the penalty? :D

I'm still not happy about it. Try it out in Excel/Calc with different values. I was thinking about a sigmoid function like:
Code:
import math

# maybe make A and B parameters
A = 4000  # 50% of the penalty applies at ls == A (when lsPenalty == 1)
B = 500   # smaller = more rectangular, higher = more S-shaped, > 1

# computed for each ls value; 0 <= lsPenalty <= 1
ls = dstStation.lsFromStar
val = math.exp((A - ls) / B)
multiplier = 1 - lsPenalty / (1 + val)
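For reference, plugging the formula into Python shows how the multiplier behaves at a few distances (assuming lsPenalty = 1, and A and B as above):

```python
import math

def ls_multiplier(ls, ls_penalty=1.0, A=4000, B=500):
    """The proposed sigmoid: 1 - lsPenalty / (1 + exp((A - ls) / B))."""
    return 1 - ls_penalty / (1 + math.exp((A - ls) / B))

# At ls == A the sigmoid is at its midpoint, so half the penalty applies:
print(round(ls_multiplier(4000), 3))   # 0.5
# Close to the star almost no penalty applies; far away, nearly all of it:
print(round(ls_multiplier(0), 3))      # 1.0
print(round(ls_multiplier(20000), 3))  # 0.0
```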
 
Okay, so, this is about 95% finished, in that I haven't gotten the RareItem import coded yet, but I've spent the last few days working on a plugin that does half the job Maddavo's site used to do:

When the plugin is run (depending on the chosen options), it will download the needed daily dumps from eddb, process them to create all the tables in the TradeDangerous database (including market data), and generate a .prices file from listings.csv.

Since the eddb dumps happen only once per day, this won't get the most up-to-the-minute market data: it gets everything except data submitted since the latest dump, so the market data is never more than about 24 hours behind.
It does check whether the data already entered is newer than the data from eddb, so it won't overwrite more recent data gathered by the user (e.g. with "trade.py import -P edapi -O eddn").
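That newer-data check can be sketched like so; the table and column names here (StationItem, modified) are only loose stand-ins for TD's actual schema:

```python
import sqlite3

def upsert_price(db, station_id, item_id, price, modified):
    """Write a price row only if the incoming data is newer.

    'modified' is an ISO-formatted timestamp string, which compares
    correctly as text. Returns True if the row was written.
    """
    cur = db.execute(
        "SELECT modified FROM StationItem WHERE station_id=? AND item_id=?",
        (station_id, item_id))
    row = cur.fetchone()
    if row and row[0] >= modified:
        return False  # local data is newer (or the same age); keep it
    db.execute(
        "INSERT OR REPLACE INTO StationItem"
        " (station_id, item_id, price, modified) VALUES (?, ?, ?, ?)",
        (station_id, item_id, price, modified))
    return True
```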

I'm working on a user-side eddn listen server next, so that this plugin and the listen server combined fully replace Maddavo's site.

It makes a few changes to the database: it removes AUTOINCREMENT from most of the tables and adds a cost column to the Upgrade table, which means on first run it has to completely erase the database and rebuild it from scratch. I did not make that decision lightly: AUTOINCREMENT was removed so that the TD database entries match EDDB's ids rather than getting a random id based on when the entry was added. That simplifies adding entries that reference entries in other tables (Stations reference the id of the System they're in, for example).
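The id-matching idea is easy to see in a toy schema (the ids and names below are made up for illustration):

```python
import sqlite3

# With AUTOINCREMENT removed, rows keep EDDB's own ids, so a Station
# can reference its System by the id that appears in the dump files.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE System (system_id INTEGER PRIMARY KEY, name TEXT)")
db.execute("CREATE TABLE Station (station_id INTEGER PRIMARY KEY, name TEXT,"
           " system_id INTEGER REFERENCES System(system_id))")

# ids come straight from the dumps instead of being assigned locally:
db.execute("INSERT INTO System VALUES (17072, 'Sol')")  # hypothetical ids
db.execute("INSERT INTO Station VALUES (74, 'Abraham Lincoln', 17072)")

row = db.execute("SELECT s.name, y.name FROM Station s"
                 " JOIN System y ON s.system_id = y.system_id").fetchone()
print(row)  # ('Abraham Lincoln', 'Sol')
```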

For some reason, trade.py throws an error trying to compile the help for the plugin, so these are the options:

'item': "Regenerate Categories and Items using latest commodities.json dump.",
'system': "Regenerate Systems using latest systems_populated.jsonl dump.",
'station': "Regenerate Stations using latest stations.jsonl dump. (implies '-O system')",
'ship': "Regenerate Ships using latest coriolis.io json dump.",
'shipvend': "Regenerate ShipVendors using latest stations.jsonl dump. (implies '-O system,station,ship')",
'upgrade': "Regenerate Upgrades using latest modules.json dump.",
'upvend': "Regenerate UpgradeVendors using latest stations.jsonl dump. (implies '-O system,station,upgrade')",
'listings': "Update market data using latest listings.csv dump. (implies '-O item,system,station')",
'all': "Update everything with latest dumpfiles. (Regenerates all tables)",
'clean': "Erase entire database and rebuild from empty. (Regenerates all tables.)",

If you want to try it out, place 'eddblink_plug.py' in the plugins folder of your TD install and run it.
On first run it will run with '-O clean' regardless of the options passed to it, because of the changes it needs to make to the database.
Thereafter, it is recommended to run with '-O listings', as the Ship, ShipVendor, Upgrade, and UpgradeVendor tables don't need to be updated under normal circumstances.

If anyone has any improvements to make the thing run faster, I'd appreciate it.

If anyone is willing to mirror the eddb files so as not to stress the eddb server, that'd be awesome, let me know the URL and I'll update the plugin to use that instead.

I've included the .prices file that it generated as well as the updated Category, Item, Station, and System .csv files, so anyone that wants can have a market update that's current as of approximately 2018-04-21 03:00:00 UTC (The approximate time of the latest dump as of this writing.)

The plugin and the market update can both be found in my Google Drive:
eddblink_plug.py
Market Update 2018-04-21.7z
 
Wow eyeonus, this was unexpected. I was beginning to think that TD was on its last legs. Many thanks for doing this. I'll give it a try tomorrow when I'm back at my pc.
 
TD is too awesome to let die. It is literally THE BEST trade route planner.

Also, consider this a beta release of my plugin. I'm sure there are bugs I haven't found in my testing, and TD itself needs some changes to its CSV-export code to reflect the database changes, so this is still a work in progress.

Thanks for trying it out, everyone that does. I hope you guys like it.

Thanks especially to anyone who finds a bug, and to anyone who provides an improvement to the code.
 
I've updated the plugin.

It will now make the needed changes to csvexport.py so that it no longer ignores the item ID numbers in the tables; since those are no longer autoincremented, they need to be exported.

I think there were a couple of bugs I found and fixed, but I'm exhausted and my memory isn't working very well.

In any case, the update is in the same location on my Drive, so just re-download.
 
Thanks Eyeonus. I downloaded the plugin and ran it. I got an error while it was running because it did not create the EDDI folder in my C:\trade\data folder. I created the folder myself and re-ran it and everything continued nicely. I only had time to do one quick trade run but it all worked ok. I’ll look to doing some longer ones over the next couple of days.

I hope word spreads that you have done your bit to keep TD alive because, like you, I agree that it’s one of the best trade programmes going!
 
Thanks for the heads up. I've added code to create that folder if it doesn't exist; it'll be in the next update. I'm currently working on trying to speed up station updates. I'd forgotten to make sure it checks whether the station has a more recent updated date than the one in the database before updating it, so it updates every station, every time, which is really, really slow. With the checks, creating the UpgradeVendor table on first run is really slow (with almost 68,000 stations and up to 900 upgrades at each station, the checks take a while), but updating afterwards is much faster. I'm still trying to work out how to make it fast-ish in both scenarios; once I figure that out, I'll upload the update.
 
Just so you know, the plugin doesn't in any way affect trade route planning; that's entirely handled by TD itself. This plugin only deals with creating the database TD uses to do said planning.

Plugin updated.
Changelog:
Makes sure ./data/eddb exists and creates it if not.
Correctly converts UNIX-epoch time to UTC, not local time.
Checks timestamps in the source files against the database and only updates Systems, Stations, ShipVendors, and UpgradeVendors that have a newer timestamp. Significant speedup.
Skips the timestamp checks when the database is empty (i.e. on first run or when running with '-O clean'). Even more significant speedup.
Shows a progress report for processes that have a lot of data to work through (Systems, Stations, Listings).
Added two new options: skipvend, force.

'item': "Regenerate Categories and Items using latest commodities.json dump.",
'system': "Regenerate Systems using latest systems_populated.jsonl dump.",
'station': "Regenerate Stations using latest stations.jsonl dump. (implies '-O system')",
'ship': "Regenerate Ships using latest coriolis.io json dump.",
'shipvend': "Regenerate ShipVendors using latest stations.jsonl dump. (implies '-O system,station,ship')",
'upgrade': "Regenerate Upgrades using latest modules.json dump.",
'upvend': "Regenerate UpgradeVendors using latest stations.jsonl dump. (implies '-O system,station,upgrade')",
'listings': "Update market data using latest listings.csv dump. (implies '-O item,system,station')",
'all': "Update everything with latest dumpfiles. (Regenerates all tables)",
'clean': "Erase entire database and rebuild from empty. (Regenerates all tables.)",
'skipvend': "Don't regenerate ShipVendors or UpgradeVendors. Supersedes '-O all', '-O clean'.",
'force': "Force regeneration of selected items even if source file not updated since previous run. "
"(Useful for updating Vendor tables if they were skipped during a '-O clean' run.)"
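On the epoch-to-UTC changelog item above: the usual Python pitfall is that datetime.fromtimestamp() converts to the local timezone unless told otherwise. A minimal illustration, using the dump time mentioned earlier in the thread:

```python
from datetime import datetime, timezone

ts = 1524279600  # 2018-04-21 03:00:00 UTC

# datetime.fromtimestamp(ts) would use the machine's local timezone,
# which is the bug; passing tz=timezone.utc converts the UNIX epoch
# to UTC explicitly, regardless of where the user's PC is.
utc = datetime.fromtimestamp(ts, tz=timezone.utc)
print(utc.strftime("%Y-%m-%d %H:%M:%S"))  # 2018-04-21 03:00:00
```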
 
Hi, gazelle.
Sorry, offline for a while.

The problem with a sigmoid function is that it trails off at the extremes and is only "active" near the midpoint (technically it maps (-Inf, Inf) to a bounded interval such as (-1, 1)). Oliver wanted the opposite: a function which penalizes distance at an increasing rate above a certain threshold, which he picked as 4000 ls, IIRC. So beyond that point both the first and second derivatives should be positive, whereas the sigmoid's second derivative is negative beyond its inflection point, towards the tails.
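A quick numerical check of that convexity argument. The quadratic here is just an illustrative stand-in for a "penalize harder past the threshold" curve, not Oliver's actual formula:

```python
import math

def sigmoid_penalty(ls, A=4000, B=500):
    """The sigmoid from the earlier post, as a penalty (larger = worse)."""
    return 1 / (1 + math.exp((A - ls) / B))

def convex_penalty(ls, A=4000):
    """Illustrative convex alternative: quadratic growth past A."""
    return max(0.0, (ls - A) / 1000.0) ** 2

def second_derivative(f, x, h=1.0):
    """Central-difference estimate of f''(x)."""
    return (f(x + h) - 2 * f(x) + f(x - h)) / (h * h)

# Beyond the midpoint (ls > A) the sigmoid flattens: its second
# derivative is negative, so the penalty grows ever more slowly...
assert second_derivative(sigmoid_penalty, 6000) < 0
# ...whereas a convex curve keeps a positive second derivative,
# penalizing at an increasing rate, as requested:
assert second_derivative(convex_penalty, 6000) > 0
```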
 
Thank you, CMDR eyeonus, that is fantastic news!!

Unfortunately, I don't have much time to test right now as there is too much going on in RL, but thank you very much!!!!
 