In-Development TradeDangerous: power-user trade optimizer

Ok - I think we have this squared away. I'm not sure why the "UTF-8" error sprang up when it did, but I have a fix for it in the code now, along with handling for the fact that Station.csv seems to have moved to a new URL.

I'm not sure why the station.csv changed to station.asp, but I made the plugin try the old URL first and then the new one. The .asp version seems to be missing lots of stations, so I'm guessing Dave put the TD .csv there temporarily to help restore order.
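
In pseudocode terms, the fallback is just try-then-retry on a 404; a minimal sketch (not the plugin's actual code, and the function name is mine):

Code:
# Minimal sketch of the fallback idea, not the plugin's actual code.
import urllib.error
import urllib.request

def fetch_with_fallback(primary, alternate):
    """Try the primary URL; on HTTP 404, retry with the alternate."""
    try:
        with urllib.request.urlopen(primary) as resp:
            return resp.read()
    except urllib.error.HTTPError as e:
        if e.code != 404:
            raise
        print("Got HTTP 404 Error: Trying alternate URL...")
        with urllib.request.urlopen(alternate) as resp:
            return resp.read()

data = fetch_with_fallback(
    "http://www.davek.com.au/td/Station.csv",
    "http://www.davek.com.au/td/station.asp",
)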
 
error on import from maddavo

downloaded latest TD and executed import

got:

Code:
C:\Trade>trade.py import --maddavo --option=stncsv --option=syscsv
Connecting to server: http://www.davek.com.au/td/System.csv
data/System.csv: 1,393,426/1,393,426 bytes | 147.52MB/s | 100.00%
Connecting to server: http://www.davek.com.au/td/Station.csv
Got HTTP 404 Error: Trying alternate URL...
Connecting to server: http://www.davek.com.au/td/station.asp
data/Station.csv: 44,287/44,287 bytes |  42.19MB/s | 100.00%
Connecting to server: http://www.davek.com.au/td/prices.asp
import.prices: 3,939,666/3,939,666 bytes |  88.84KB/s | 100.00%
NOTE: Rebuilding cache file: this may take a moment
*** INTERNAL ERROR: NOT NULL constraint failed: ShipVendor.station_id
CSV File: C:\Trade\data\ShipVendor.csv:10
SQL Query: INSERT INTO ShipVendor (station_id,ship_id,cost) VALUES((SELECT Station.station_id FROM Station INNER JOIN System USING(system_id) WHERE System.name = ? AND Station.name = ?),(SELECT Ship.ship_id FROM Ship WHERE Ship.name = ?),?)

Params: ['Aptet', 'Torricelli Port', 'Dropship', '37814205']

C:\Trade>trade.py buildcache -f
NOTE: Rebuilding cache file: this may take a moment
*** INTERNAL ERROR: NOT NULL constraint failed: ShipVendor.station_id
CSV File: C:\Trade\data\ShipVendor.csv:10
SQL Query: INSERT INTO ShipVendor (station_id,ship_id,cost) VALUES((SELECT Station.station_id FROM Station INNER JOIN System USING(system_id) WHERE System.name = ? AND Station.name = ?),(SELECT Ship.ship_id FROM Ship WHERE Ship.name = ?),?)

Params: ['Aptet', 'Torricelli Port', 'Dropship', '37814205']

C:\Trade>

any suggestions?

cheers
bubba
 
It appears that the System and Station files from maddavo don't contain some of the stations found in the ShipVendor.csv file.

The solution is to either:
A) Visit those stations and upload their prices to maddavo so the stations get added, or
B) Remove those stations from your ShipVendor.csv (a sketch for finding them follows), or
C) Use the files from the bitbucket repository instead.
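
If you go with option B, a quick way to find the offending rows is to cross-check ShipVendor.csv against Station.csv. A rough sketch; it assumes the system and station names are the first two single-quoted columns in both files, as in TD's data files:

Code:
# Rough sketch: list ShipVendor.csv rows whose station doesn't
# appear in Station.csv.  Assumes both files put the system name
# in column 1 and the station name in column 2, single-quoted as
# in TD's data files; the header row is skipped in each.
import csv

def name_pairs(path):
    with open(path, newline="", encoding="utf-8") as f:
        reader = csv.reader(f, quotechar="'")
        next(reader)  # skip the header row
        for row in reader:
            if len(row) >= 2:
                yield row[0].upper(), row[1].upper(), row

stations = {(sys, stn) for sys, stn, _ in name_pairs("data/Station.csv")}

for sys, stn, row in name_pairs("data/ShipVendor.csv"):
    if (sys, stn) not in stations:
        print("no such station:", row)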
 
maddavo, how frequently do people upload updates to your site?

I think on average every 10 minutes or so? Mostly now people are uploading one station at a time rather than the whole thing.

I see the prices file has blown out to 3.5MB. How long does that take most people to download? Although data transfer is unlimited on the server, I think it's not a particularly fast connection. I am thinking of making two more files - a 24hr file and a 2hr file (or something like that). That way you could just get the latest updates and not have to wait for stuff you already have. I haven't looked to see how big those would be, but I imagine they would be much smaller.
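
A delta file like that could be generated by keeping only the station blocks that contain a recent timestamp. A back-of-the-envelope sketch, assuming item lines end in a 'YYYY-MM-DD HH:MM:SS' timestamp; the real .prices format may have wrinkles this ignores:

Code:
# Sketch: copy only the '@ SYSTEM/Station' blocks of a .prices
# file that contain a timestamp from the last 24 hours.  Assumes
# item lines end in 'YYYY-MM-DD HH:MM:SS'; real files may have
# details (comments, '?' fields) this doesn't handle.
import re
from datetime import datetime, timedelta

STAMP = re.compile(r"(\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2})\s*$")
CUTOFF = datetime.utcnow() - timedelta(hours=24)

def recent_blocks(path):
    block, keep = [], False
    with open(path, encoding="utf-8") as f:
        for line in f:
            if line.startswith("@"):      # start of a new station block
                if keep:
                    yield "".join(block)
                block, keep = [], False
            block.append(line)
            m = STAMP.search(line)
            if m and datetime.strptime(m.group(1), "%Y-%m-%d %H:%M:%S") > CUTOFF:
                keep = True
    if keep:
        yield "".join(block)

with open("prices.24hr", "w", encoding="utf-8") as out:
    for blk in recent_blocks("TradeDangerous.prices"):
        out.write(blk)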
 
Some people have recently reported using EliteOCR for updating prices. I just installed it and spent the evening updating prices from about 30 stations. It works very well.

For those not familiar with this tool, see this thread. You feed it screenshots of the market screen and it converts them to a TD-compatible .prices file, which can then be imported into TD or uploaded to the sharing site.

YMMV on this. I tried it and it did a bang-up job on about 90% of the text, but it would stumble on some station names, identify silver as slaves most of the time, and miss entire lines, removing those items from my station price data entirely. I think the idea is neat and I suspect the quality of the tool will improve greatly in short order, but it needs a little more time in the oven.
 
Download speed for me was under 100kB/s, I think; not too bad, as we don't have to do that every few minutes. Does the download already do some compression, or is it just plain text?

I've only done one update, about two days ago. The area I was in had no station data, so I had to move to the Aulin area to get some price info. Then most of the price info was off, BUT that's the galaxy server's fault, as I believe the markets are really a mess right now. I also ran into a few "22 days" entries, which is quite old, heh.

I updated a bunch of prices, so now I just upload the complete TradeDangerous.prices to your (maddavo) site and the site merges my edits into the existing database?

The developers have promised market trading fixes next week, so I'm going to chill until then; I believe most of the prices will change again, so there's no need to collect that info over the weekend.
 
I updated a bunch of prices, so now I just upload the complete TradeDangerous.prices to your (maddavo) site and the site merges my edits into the existing database?

Yep - that's what it does. I think I just saw a 3.9MB incoming file; maybe that was you. It takes about 3 minutes to process a file that size and then 13 seconds to regenerate the prices file. It checks for new files every 2 minutes.

RE: Stations
I have just merged and imported all stations from the latest TD, so that should solve the ShipVendor issues. I've also put a pull request back to the TD repository for all the stations in the prices database, so they're synced.
 
Just retried; different error...

C:\td>trade.py import --maddavo --option=stncsv --option=syscsv
Connecting to server: http://www.davek.com.au/td/System.csv
data/System.csv: 1,393,426/1,393,426 bytes | 165.96MB/s | 100.00%
Connecting to server: http://www.davek.com.au/td/Station.csv
Got HTTP 404 Error: Trying alternate URL...
Connecting to server: http://www.davek.com.au/td/station.asp
data/Station.csv: 46,139/46,139 bytes | 43.95MB/s | 100.00%
Connecting to server: http://www.davek.com.au/td/prices.asp
import.prices: 3,962,884/3,962,884 bytes | 82.19KB/s | 100.00%
NOTE: Rebuilding cache file: this may take a moment
*** INTERNAL ERROR: NOT NULL constraint failed: Station.system_id
CSV File: C:\td\data\Station.csv:138
SQL Query: INSERT INTO Station (system_id,name,ls_from_star) VALUES((SELECT System.system_id FROM System WHERE System.name = ?),?,?)
Params: ['Apalai', 'Gubarev Base', '0']

Deleted all the missing stations (20 or so) from /data/Station.csv...
Now getting:
C:\td> trade.py run -v --fr Mok/Beth --cr 5000 --cap 8 --ly 8.56
NOTE: Rebuilding cache file: this may take a moment
NOTE: Missing "C:\td\data\TradeDangerous.prices" file - no price data
C:\td\trade.py: Error: Start station MOKOSH/Bethe Station doesn't have any price data.
This can happen when there are no profitable trades matching your criteria, or if you have not yet entered any price data for the station(s) involved.

See 'C:\td\trade.py update -h' for help entering/updating prices, or obtain a '.prices' file from the web, such as maddavo's: http://www.davek.com.au/td/


See https://bitbucket.org/kfsone/tradedangerous/wiki/Price Data for more help.

Here are the systems I deleted. Each run failed the same way, stopping on the next missing entry:

"C:\td> trade.py run -v --fr Mok/Beth --cr 5000 --cap 8 --ly 8.56
NOTE: Rebuilding cache file: this may take a moment
*** INTERNAL ERROR: NOT NULL constraint failed: Station.system_id
CSV File: C:\td\data\Station.csv:138
SQL Query: INSERT INTO Station (system_id,name,ls_from_star) VALUES((SELECT Syst
em.system_id FROM System WHERE System.name = ?),?,?)
Params: ['Apalai', 'Gubarev Base', '0']

C:\td> trade.py run -v --fr Mok/Beth --cr 5000 --cap 8 --ly 8.56
NOTE: Rebuilding cache file: this may take a moment
*** INTERNAL ERROR: NOT NULL constraint failed: Station.system_id
CSV File: C:\td\data\Station.csv:146
SQL Query: INSERT INTO Station (system_id,name,ls_from_star) VALUES((SELECT Syst
em.system_id FROM System WHERE System.name = ?),?,?)
Params: ['Arin', 'Kettle Landing', '0']

C:\td> trade.py run -v --fr Mok/Beth --cr 5000 --cap 8 --ly 8.56
NOTE: Rebuilding cache file: this may take a moment
*** INTERNAL ERROR: NOT NULL constraint failed: Station.system_id
CSV File: C:\td\data\Station.csv:390
SQL Query: INSERT INTO Station (system_id,name,ls_from_star) VALUES((SELECT Syst
em.system_id FROM System WHERE System.name = ?),?,?)
Params: ['DX Cancri', 'Horowitz Landing', '0']

C:\td> trade.py run -v --fr Mok/Beth --cr 5000 --cap 8 --ly 8.56
NOTE: Rebuilding cache file: this may take a moment
*** INTERNAL ERROR: NOT NULL constraint failed: Station.system_id
CSV File: C:\td\data\Station.csv:390
SQL Query: INSERT INTO Station (system_id,name,ls_from_star) VALUES((SELECT Syst
em.system_id FROM System WHERE System.name = ?),?,?)
Params: ['Dyaushis', 'Helmholtz Station', '0']

C:\td> trade.py run -v --fr Mok/Beth --cr 5000 --cap 8 --ly 8.56
NOTE: Rebuilding cache file: this may take a moment
*** INTERNAL ERROR: NOT NULL constraint failed: Station.system_id
CSV File: C:\td\data\Station.csv:421
SQL Query: INSERT INTO Station (system_id,name,ls_from_star) VALUES((SELECT Syst
em.system_id FROM System WHERE System.name = ?),?,?)
Params: ['Feng Huang', 'Murdoch Plant', '0']

C:\td> trade.py run -v --fr Mok/Beth --cr 5000 --cap 8 --ly 8.56
NOTE: Rebuilding cache file: this may take a moment
*** INTERNAL ERROR: NOT NULL constraint failed: Station.system_id
CSV File: C:\td\data\Station.csv:649
SQL Query: INSERT INTO Station (system_id,name,ls_from_star) VALUES((SELECT Syst
em.system_id FROM System WHERE System.name = ?),?,?)
Params: ['Kassi Hua', 'Clauss Platform', '0']

C:\td> trade.py run -v --fr Mok/Beth --cr 5000 --cap 8 --ly 8.56
NOTE: Rebuilding cache file: this may take a moment
*** INTERNAL ERROR: NOT NULL constraint failed: Station.system_id
CSV File: C:\td\data\Station.csv:649
SQL Query: INSERT INTO Station (system_id,name,ls_from_star) VALUES((SELECT Syst
em.system_id FROM System WHERE System.name = ?),?,?)
Params: ['Katae', 'Coulomb Survey', '0']

C:\td> trade.py run -v --fr Mok/Beth --cr 5000 --cap 8 --ly 8.56
NOTE: Rebuilding cache file: this may take a moment
*** INTERNAL ERROR: NOT NULL constraint failed: Station.system_id
CSV File: C:\td\data\Station.csv:654
SQL Query: INSERT INTO Station (system_id,name,ls_from_star) VALUES((SELECT Syst
em.system_id FROM System WHERE System.name = ?),?,?)
Params: ['Keling', 'Lopez De Villalobos', '0']

C:\td> trade.py run -v --fr Mok/Beth --cr 5000 --cap 8 --ly 8.56
NOTE: Rebuilding cache file: this may take a moment
*** INTERNAL ERROR: NOT NULL constraint failed: Station.system_id
CSV File: C:\td\data\Station.csv:673
SQL Query: INSERT INTO Station (system_id,name,ls_from_star) VALUES((SELECT Syst
em.system_id FROM System WHERE System.name = ?),?,?)
Params: ['Kuikian Batji', 'ALI INSTALLATION', '0']

C:\td> trade.py run -v --fr Mok/Beth --cr 5000 --cap 8 --ly 8.56
NOTE: Rebuilding cache file: this may take a moment
*** INTERNAL ERROR: NOT NULL constraint failed: Station.system_id
CSV File: C:\td\data\Station.csv:1012
SQL Query: INSERT INTO Station (system_id,name,ls_from_star) VALUES((SELECT Syst
em.system_id FROM System WHERE System.name = ?),?,?)
Params: ['Maitaokona', 'Gresley Gateway', '0']

C:\td> trade.py run -v --fr Mok/Beth --cr 5000 --cap 8 --ly 8.56
NOTE: Rebuilding cache file: this may take a moment
*** INTERNAL ERROR: NOT NULL constraint failed: Station.system_id
CSV File: C:\td\data\Station.csv:1020
SQL Query: INSERT INTO Station (system_id,name,ls_from_star) VALUES((SELECT Syst
em.system_id FROM System WHERE System.name = ?),?,?)
Params: ['Markanomovoy', 'Compton Dock', '0']

C:\td> trade.py run -v --fr Mok/Beth --cr 5000 --cap 8 --ly 8.56
NOTE: Rebuilding cache file: this may take a moment
*** INTERNAL ERROR: NOT NULL constraint failed: Station.system_id
CSV File: C:\td\data\Station.csv:1077
SQL Query: INSERT INTO Station (system_id,name,ls_from_star) VALUES((SELECT Syst
em.system_id FROM System WHERE System.name = ?),?,?)
Params: ['Nener', 'Popov Dock', '0']

C:\td> trade.py run -v --fr Mok/Beth --cr 5000 --cap 8 --ly 8.56
NOTE: Rebuilding cache file: this may take a moment
*** INTERNAL ERROR: NOT NULL constraint failed: Station.system_id
CSV File: C:\td\data\Station.csv:1111
SQL Query: INSERT INTO Station (system_id,name,ls_from_star) VALUES((SELECT Syst
em.system_id FROM System WHERE System.name = ?),?,?)
Params: ['Nuwang', 'Schade Outpost', '0']

C:\td> trade.py run -v --fr Mok/Beth --cr 5000 --cap 8 --ly 8.56
NOTE: Rebuilding cache file: this may take a moment
*** INTERNAL ERROR: NOT NULL constraint failed: Station.system_id
CSV File: C:\td\data\Station.csv:1221
SQL Query: INSERT INTO Station (system_id,name,ls_from_star) VALUES((SELECT Syst
em.system_id FROM System WHERE System.name = ?),?,?)
Params: ['Shama', 'Clairaut Port', '0']

C:\td> trade.py run -v --fr Mok/Beth --cr 5000 --cap 8 --ly 8.56
NOTE: Rebuilding cache file: this may take a moment
*** INTERNAL ERROR: NOT NULL constraint failed: Station.system_id
CSV File: C:\td\data\Station.csv:1440
SQL Query: INSERT INTO Station (system_id,name,ls_from_star) VALUES((SELECT Syst
em.system_id FROM System WHERE System.name = ?),?,?)
Params: ['WX Ursa Majoris', 'Teng-Hui Terminal', '0']

C:\td> trade.py run -v --fr Mok/Beth --cr 5000 --cap 8 --ly 8.56
NOTE: Rebuilding cache file: this may take a moment
MOKOSH/Bethe Station -> BD+23 3151/Houssay Port:
Load from MOKOSH/Bethe Station: 4 x Agri-Medicines (@1039cr), 4 x Pesticides (
@190cr),
Jump MOKOSH -> BD+23 3151
Load from BD+23 3151/Bamford Ring: 6 x Animal Meat (@1078cr), 1 x Fruit And Ve
getables (@202cr),
Finish BD+23 3151/Houssay Port + 5,329cr => 10,329cr
 
The problem is not that the stations are missing and have to be deleted; it's that the System.csv from maddavo's site doesn't contain the systems they reference.
I used the System.csv included with TD and it imported fine.
 
Thinking about it, it seems likely that these problem stations refer to beta systems that no longer exist, which would explain why they're missing from newer System.csv files.
If they were completely removed from the Station.csv and .prices files available on both sites, the problem should be gone for good, right?
 
Yep - that's what it does. I think I just saw a 3.9Mb incoming file, maybe that was you.
No, but I just uploaded mine now at 0702hrs UTC; the file was 3.15MB in size.

I hope I didn't mess it up: one of my earliest trade.py updates was without the timestamp, and I think it then treats all prices as updated, dunno. Now I'm doing them all with -T and erasing the timestamp of prices I edit, so it should be all good.
 
Passin' on what I've run into; hopefully it will help.

After having multiple problems with what had been a working copy, I checked out a clean git copy. The first attempt to do anything reports duplicate stations on the following lines of Station.csv:

465, 903, and 918. Easily fixable.
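
For anyone else who hits this, duplicates like those can be spotted with a few lines of Python (assuming single-quoted system/station names in the first two columns, as in TD's data files):

Code:
# Sketch: report duplicated (system, station) pairs in Station.csv
# along with their line numbers.
import csv

seen = {}
with open("data/Station.csv", newline="", encoding="utf-8") as f:
    for lineno, row in enumerate(csv.reader(f, quotechar="'"), start=1):
        if lineno == 1 or len(row) < 2:
            continue  # header or blank line
        key = (row[0].upper(), row[1].upper())
        if key in seen:
            print(f"line {lineno} duplicates line {seen[key]}: {key}")
        else:
            seen[key] = lineno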

Then I try to import an updated prices file and get a warning on line 22063 of the prices file that HSIEN BAJI/STIRLING PLATFORM is an unrecognized Star/Station.

So, there's still some funkiness with the "vanilla" data on the git clone and it doesn't like the prices file for obvious reasons.

If I try ANY kind of update to the station or system files from maddavo, it breaks. I've tried a plain update, and saving the station and system CSVs directly into data (this worked last night).
 
Well... Poking around the new git clone, I noticed a couple of new options for update... So I ran

Code:
"trade.py import --plug=maddavo"
(There's a typo in the usage help)...

This downloaded the updates and complained about some unrecognized star/stations, but trade.py WORKED!!!

By the way, I should have said this in my first post: what a great tool, thank you VERY much.
 
I think on average every 10 minutes or so? Mostly now people are uploading one station at a time rather than the whole thing.

I see the prices file has blown out to 3.5MB. How long does that take most people to download? Although data transfer is unlimited on the server, I think it's not a particularly fast connection. I am thinking of making two more files - a 24hr file and a 2hr file (or something like that). That way you could just get the latest updates and not have to wait for stuff you already have. I haven't looked to see how big those would be, but I imagine they would be much smaller.


Another solution might be to zip up the file... the file size is then just 375kB.
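
The client side of that would be trivial, too. A sketch, assuming a hypothetical prices.gz alongside the existing URL:

Code:
# Sketch: fetch and unpack a gzipped prices file.  'prices.gz' is
# a hypothetical URL; the site doesn't serve one today, this just
# shows how little client code it would take.
import gzip
import urllib.request

with urllib.request.urlopen("http://www.davek.com.au/td/prices.gz") as resp:
    data = gzip.decompress(resp.read())

with open("import.prices", "wb") as f:
    f.write(data)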
 
I think on average every 10 minutes or so? Mostly now people are uploading one station at a time rather than the whole thing.

I see the prices file has blown out to 3.5MB. How long does that take most people to download? Although data transfer is unlimited on the server, I think it's not a particularly fast connection. I am thinking of making two more files - a 24hr file and a 2hr file (or something like that). That way you could just get the latest updates and not have to wait for stuff you already have. I haven't looked to see how big those would be, but I imagine they would be much smaller.

If more people are using EliteOCR to grab price data, that captures the demand as well as the supply; perhaps that's one of the reasons? I wonder how big the file would be without the demand data; there aren't any tools I know of that are using it.
 
I'm still a bit confused on a few things:

When I create an import.prices from EliteOCR, sometimes the station and system are not recognized when I try to run "trade.py import import.prices".
How do I correctly add the system and station?

Then how do I update maddavo's site so everyone else gets the benefit?
 
Hello again.

I have been trying to use the command "trade.py import --maddavo --option=stncsv --option=syscsv". It starts off fine and begins downloading System.csv, but then it stops and I get the following error:
UnicodeEncodeError: 'charmap' codec can't encode character '\xe1' in position 7261: character maps to <undefined>

Is this something I can somehow fix, or is it a known issue?
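
FWIW, if the error is coming from console output, forcing Python's I/O encoding to UTF-8 for the session sometimes works around 'charmap' errors; if it happens while writing the file itself, it needs the code-side fix kfsone mentioned above. The workaround looks like:

Code:
C:\Trade>set PYTHONIOENCODING=utf-8
C:\Trade>trade.py import --maddavo --option=stncsv --option=syscsv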
 
I'm still a bit confused on a few things:

When I create an import.prices from EliteOCR, sometimes the station and system are not recognized when I try to run "trade.py import import.prices".
How do I correctly add the system and station?

Then how do I update maddavo's site so everyone else gets the benefit?

Add the following to your AppConfig.xml file, which is in: C:\Program Files (x86)\Frontier\EDLaunch\Products\FORC-FDEV-D-1002

VerboseLogging="1"

I added it right underneath the

LogFile="netLog"

line... now EliteOCR can pull the system data from the logs for any new screenshots, which fixes this problem completely.
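
For reference, the edited element ends up looking something like this (the attribute set varies by install; VerboseLogging="1" is the only addition):

Code:
<!-- AppConfig.xml sketch: only the relevant attributes shown;
     your real <Network> element will have more -- just add
     VerboseLogging="1" next to LogFile="netLog" -->
<Network
  LogFile="netLog"
  VerboseLogging="1"
/>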
 
Add the following to your AppConfig.xml file, which is in: C:\Program Files (x86)\Frontier\EDLaunch\Products\FORC-FDEV-D-1002

VerboseLogging="1"

I added it right underneath the

LogFile="netLog"

line... now EliteOCR can pull the system data from the logs for any new screenshots, which fixes this problem completely.

That... wasn't what he asked.

You can add the station to Station.csv in the data directory. If you're also missing the system, that's a lot more painful: you'll need to work out the system's coordinates first, then enter the info in System.csv. You can find instructions on Maddavo's page.
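
For example, if the missing station were HSIEN BAJI/STIRLING PLATFORM from earlier in the thread, the new Station.csv row would look something like this (layout assumed from the existing rows; use 0 for ls_from_star if you don't know the distance):

Code:
'Hsien Baji','Stirling Platform',0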
 