In-Development TradeDangerous: power-user trade optimizer

Just to confirm, in case I missed something: all outpost stations are S + M pads only, and all 'slot' stations have L pads? Since a lot of the '?' entries can be filled in via a patient look at the system view on the map, I'd like to know if there are any odd cases before committing bad data.

There are no stations with only 'S' pads anymore. @Gazelle pointed this out last month when max_pad_size was implemented in the data files - there was a game update in Gamma 3.9 that added a medium pad to every outpost that previously only had small pads.

If a docking request is denied at an outpost for a ship that needs a medium pad, it is because the pad is in use. I recall a game update (pre-Gamma, I think) that forces NPCs to leave the medium pad when a player requests docking - but it takes time for them to clear it. And if you re-request quickly afterwards there can be a pad orientation bug that stops you from docking - but that's another issue.

All stations are either 'M' or 'L'. This CAN be determined from the Galaxy Map (GM), which is extremely useful for large ships because you can check out the destination (if it's close enough for the GM to give you system map data access). So if a TD hop says "(no details)" for a station, you can check the GM before embarking on that hop, update the destination station and re-run if necessary.

I see a few 'S' pads have crept into the data. I'll double-check those stations in-game to confirm they have M pads, and make it so all S's get turned into M's automagically.
 
Lovely stuff :) Sadly I don't get the gaming time I'd like, so I've probably missed stuff in the change logs since I started in July.

@maddavo - I assume changes from ? to M or L, or adding BM data without adding new stations, will be picked up OK when uploading the station file to your site?
 

Yep - absolutely. Data for existing stations is updated.

I've just visited half the stations that were listed as 'S' and managed to land at all of them with an Asp (which needs a medium pad).

I think that's enough. I've updated the data and import so S's are treated as M's for the pad size. If anyone comes across a station that really only has small pads, please take screenshots of all the pads on the station so we can take a look.

BTW: You can recognise the size of pads very easily by looking at how many of those arrow panels there are at the 'in' end of the pad (the panels that flash/raise/lower and you have to wait for when you launch). Small pads have 1 panel, medium pads have 2 panels and large pads have 3 panels.
 
I believe they are called blast shields

[attached screenshot: 152030.jpg - landing pad blast shields]


:)

And I agree - I haven't seen a location since the betas where you cannot land an Asp or any other ship that requires a medium pad (meaning small-pad-only outposts do not exist anymore).
 
Great to hear :)

The progression of technology is rather fast - in ten years we've gone from the 1940s to 3301.

I could make a doctor and time reference here and it'd all be mutton geoff :)

- - - - - Additional Content Posted / Auto Merge - - - - -

Update: I've gotten a slew of additional work done on the refactor of how we load data (it fixes the out-of-memory error and improves performance) and I think it's about ready for rollout, but it's 1am and it was a rather zany rush, so I'd like a chance to test it a little before I bring it in - if nothing else, I'd like to check some before-and-after routes to make sure that it's just faster and not, well, broken :)

Any brave souls who are willing to test the "trade-loading-refactor" branch and take it for a spin would be much appreciated.
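If you already have a git clone of TD, switching over and forcing a cache rebuild should look roughly like this (a sketch, assuming a git checkout with the branch name above):

Code:
git fetch
git checkout trade-loading-refactor
trade.py buildcache -f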

- - - - - Additional Content Posted / Auto Merge - - - - -

The wiki page CookBook#markdown-header-how-can-i-add-a-missing-station is wrong about the "--blackmarket y" parameter - it doesn't work, but "--bm y" seems to.

If someone else doesn't (it's a wiki, after all), I'll fix it :) it's probably --black-market :)

-Oliver
 
Can someone please remind me how the "--ls-penalty" option works? Say I wanted to filter out stations more than 3,000 ls from the main star?
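(For anyone finding this later: as I understand it, --ls-penalty on the run command is a percentage cost penalty per 1,000 ls of station distance from the star, not a hard filter - check trade.py run --help to be sure. A hedged example, borrowing a station name from elsewhere in the thread and with made-up numbers:)

Code:
trade.py run --from "HILL PA HSI/Reilly Terminal" --credits 500000 --capacity 100 --ly-per 12 --ls-penalty 2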
 
I might have overlooked this, but yesterday I got one set of data, then I updated the data and today I still get:
Code:
  Load from HILL PA HSI/Reilly Terminal (33ls/star, No/bm, Lrg/pad):
      532 x Personal Weapons      1,884cr each,  1,002,288cr total,  data from 20days and 4days
I looked at my own notes and there are no Personal Weapons sold at this station.

So if I submit a full and correct set of commodities in an import.prices file to maddavo's site... it doesn't actually delete old bad data which shouldn't exist in the first place. I need to add a "<commodity name> 0 0 - -" entry for any deletions, which is a pain. I wish there were some simple method to just take the EliteOCR import.prices file and be done with it, with no need to go manually hunting for leftovers from the beta/gamma days.

It's just me mumbling out loud - no question or anything here. Just a mumble. Mumblemumble ;)
 
I thought so too, but they are identical files content-wise - it makes no difference whether it's the TD-created updated.prices, TradeDangerous.prices or the EliteOCR-created import.prices - all the same.

Although updated.prices is always just one station, whereas TradeDangerous.prices is everything, and import.prices from EliteOCR can cover as many or as few stations as you have data for when you export. That is how I understand the system.
 
@Snake_Man and @aguettinger: re prices files

It sounds like you have a handle on the prices files - I'll just add some clarifying comments.

The TradeDangerous.prices file is the prices file used by TD. Usually it contains a complete list of prices for all stations and is either created by TradeDangerous or downloaded from my site.

The import.prices file contains prices to be imported into TD. It is a partial list of stations (or maybe just one). It may come from various sources - TD makes one when you use the import command to download the latest prices from my site, and EliteOCR also makes an import.prices. The import.prices files are primarily for importing into TD.

The updated.prices file contains a FULL LIST of prices for a station and is generated by TD when a station is updated using the trade.py update command. This file is slightly different: it contains a full list of all commodities whether they are sold or not. If there is a commodity that isn't bought/sold at a station then it has 0 sell and 0 buy. THIS is the file that is best to upload to my site. As @Snake_Man has identified, the 0 - 0 price will delete that commodity from the station.
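For illustration, the kind of line that deletes a stale commodity looks roughly like this in updated.prices (the station, category and timestamp here are just examples):

Code:
@ HILL PA HSI/Reilly Terminal
   + Weapons
      Personal Weapons              0       0          -         -  2015-01-10 12:00:00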

All the prices files can be uploaded to my site. But a 7MB TradeDangerous.prices file will take time to upload and a minute or two to process, so it's probably better to upload incrementally. The 'madupload' command automatically uploads the updated.prices file.

There is one problem where EliteOCR is used - it doesn't generate a file that has all commodities in it, just the ones that were in the market, so if you upload that import.prices it won't delete any commodities that aren't listed. You could do this: import import.prices into TD, use TD to update the station, make a minor change to 'touch' the data and generate an updated.prices file, and then upload the updated.prices file. But that is a hassle.
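Roughly, that hassle looks like this (the station name is just an example):

Code:
trade.py import import.prices
trade.py update -GF "HILL PA HSI/Reilly Terminal"
misc/madupload.py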

We need EliteOCR to make the import.prices file with all the commodities in it, OR maybe some function in TD to process an EliteOCR file, create an updated.prices file and upload it - all in one command.
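Purely as an illustration of that idea (this is NOT part of TD - the item list, file names and formatting below are all made up for the example), something along these lines would pad an EliteOCR file with explicit zero entries:

Code:
#! /usr/bin/env python
# Hypothetical helper, NOT part of TD: pad an EliteOCR "import.prices"
# with explicit "0 0 - -" rows for commodities the OCR pass didn't see,
# so that uploading it behaves more like a full updated.prices file.

import sys
from collections import OrderedDict

# Master category -> commodity list. A real tool would read this from
# TD's own data rather than hard-coding a small sample.
ALL_ITEMS = {
    "Chemicals": ["Explosives", "Hydrogen Fuel", "Mineral Oil", "Pesticides"],
    "Weapons": ["Battle Weapons", "Personal Weapons", "Reactive Armour"],
}

def pad_prices(in_path, out_path, stamp="2015-01-10 12:00:00"):
    seen = OrderedDict()    # station header -> set of item names present
    lines = OrderedDict()   # station header -> original lines (verbatim)
    station = None
    with open(in_path) as fh:
        for raw in fh:
            text = raw.rstrip("\n")
            body = text.strip()
            if body.startswith("@"):
                station = body
                seen[station] = set()
                lines[station] = []
            elif station and body and not body.startswith(("#", "+")):
                # crude: the item name is everything before the first number
                name_words = []
                for word in body.split():
                    if word[0].isdigit():
                        break
                    name_words.append(word)
                seen[station].add(" ".join(name_words))
                lines[station].append(text)
            elif station is not None:
                lines[station].append(text)   # category headers, blanks

    with open(out_path, "w") as out:
        for header, items_seen in seen.items():
            out.write(header + "\n")
            for text in lines[header]:
                out.write(text + "\n")
            # explicit zero entries for everything the OCR didn't list
            # (a real tool would merge these into the existing categories)
            for category, items in ALL_ITEMS.items():
                missing = [name for name in items if name not in items_seen]
                if missing:
                    out.write("   + {}\n".format(category))
                    for name in missing:
                        out.write("      {:<28}{:>5}{:>8}{:>11}{:>10}  {}\n"
                                  .format(name, 0, 0, "-", "-", stamp))

if __name__ == "__main__":
    src = sys.argv[1] if len(sys.argv) > 1 else "import.prices"
    pad_prices(src, "updated.prices")
    print("wrote updated.prices from", src)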
 
Hi kfsone,

Just trying trade-loading-refactor, and getting two errors, one of which I have been getting with the main branch as well. I am sure it is likely something I am doing wrong, but here they are.

1. I get this when I try to rebuild the cache in both the main branch and trade-loading-refactor:

Code:
Owner@ARYA /c/apps/trade-loading-refactor (master)
$ trade.py buildcache -f
NOTE: Rebuilding cache file: this may take a moment.
*** INTERNAL ERROR: NOT NULL constraint failed: ShipVendor.station_id
CSV File: C:\apps\trade-loading-refactor\data\ShipVendor.csv:209
SQL Query: INSERT INTO ShipVendor (station_id,ship_id) VALUES((SELECT Station.station_id FROM Station INNER JOIN System USING(system_id) WHERE System.name = ? AND Station.name = ?),(SELECT Ship.ship_id FROM Ship WHERE Ship.name = ?))
Params: ['KALIKI', 'Oren City', 'Adder']

I am pretty sure this may be because my system/station files are out of sync and do not contain KALIKI/Oren City. The workaround is to delete the contents of ShipVendor.csv - if I delete just that station, I get the same error on another one, rinse and repeat. I tried updating System.csv and Station.csv but that does not seem to fix the problem, so I'm not sure what is going on. I am running my own local database. This error started happening for me about a week ago (I meant to mention it then, but as you were busy and sick, and I had a workaround and figured it was something in my setup, I did not want to bug you. Bug you, get it?)

2. This error I only get with trade-loading-refactor when I try to rebuild the cache using my tradedangerous.prices file and the -f option:

Code:
Owner@ARYA /c/apps/trade-loading-refactor (master)
$ trade.py buildcache -f
NOTE: Rebuilding cache file: this may take a moment.
Traceback (most recent call last):
  File "./trade.py", line 76, in <module>
    main(sys.argv)
  File "./trade.py", line 50, in main
    results = cmdenv.run(tdb)
  File "c:\apps\trade-loading-refactor\commands\commandenv.py", line 79, in run
    return self._cmd.run(results, self, tdb)
  File "c:\apps\trade-loading-refactor\commands\buildcache_cmd.py", line 70, in
run
    buildCache(tdb, cmdenv)
  File "c:\apps\trade-loading-refactor\cache.py", line 866, in buildCache
    processImportFile(tdenv, tempDB, Path(importName), importTable)
  File "c:\apps\trade-loading-refactor\cache.py", line 688, in processImportFile


    columnDefs = next(csvin)
StopIteration

As I am not sure what the issue is here exactly (I bet it is something in my .prices or Station.csv file which is not in the branch files, or perhaps I should have started by importing my prices file rather than rebuilding the cache), and I have not yet checked out my ED 1.04 upgrade to ensure it works, I'm going to call it quits on testing this branch for tonight. Tomorrow I'll do a clean clone of it, try from scratch with an empty database and prices file, and let you know how it goes.
 
The updated.prices file contains a FULL LIST of prices for a station and is generated by TD when a station is updated using the trade.py update command. This file is slightly different: it contains a full list of all commodities whether they are sold or not. If there is a commodity that isn't bought/sold at a station then it has 0 sell and 0 buy. THIS is the file that is best to upload to my site. As @Snake_Man has identified, the 0 - 0 price will delete that commodity from the station.
Ah-ha!

But here is my personal problem as an EliteOCR user. When I bring import.prices from EliteOCR into TD, no updated.prices file is created which I could upload to maddavo's site. TD is not wrong for not creating it - why would it, as I'm just importing the stuff heh.

The 'madupload' command automatically uploads the updated.prices file.
Can the madupload command upload an import.prices file as well, or is it hardcoded like you said?

I'm coming back to my problem: I'm bringing in prices from EliteOCR, and currently, painstakingly, I import the import.prices into TD, then I upload that same file (leaving old, outdated, unused commodities still in place) to maddavo's site.

It would be so cool if I could just import import.prices into TD and then use the madupload command to upload any changes I made to maddavo's site.

There is one problem where EliteOCR is used - it doesn't generate a file that has all commodities in it, just the ones that were in the market, so if you upload that import.prices it won't delete any commodities that aren't listed. You could do this: import import.prices into TD, use TD to update the station, make a minor change to 'touch' the data and generate an updated.prices file, and then upload the updated.prices file. But that is a hassle.
Yes, today I had to do this to delete that one example commodity from a few posts above. A hassle.
 
(Also to Maddavo who I know reads this thread)

There will always be mistakes in the data - we are all only human

BUT -

Is there anything that can be done about persistently wrong info? For example, LTT 5212 does not sell slaves, and never has. Same with FongWang*. But unless I've recently been there myself and uploaded, I will get slave trading routes between the two - even though slaves are illegal all through both systems.

So someone is somehow repeatedly uploading the same wrong information over and over.

Another example was Zeta Aquilae / Julian Gateway which KFSOne had to code a special case for.

In these cases where we know there are numerous repeats of particular bad data, is it feasible to implement a mechanism to mitigate this?

It seems unlikely, but I feel better for having a moan :)

*There is a space in this system name, but the swear filter seems to think I'm touching myself - or something.
 

Maddavo's rundown of the prices files above is pretty spot on. It's worth noting that there is a script in the "misc" directory that will upload your current "updated.prices" file to mad's site. You need to install the "requests" package for it to work:

Code:
C:\Windows\CatPix\Trade\> pip install requests
C:\Windows\CatPix\Trade\> trade.py update -GF ...
C:\Windows\CatPix\Trade\> misc/madupload.py
Upload complete.
 
Can the madupload command upload an import.prices file as well, or is it hardcoded like you said?

Code:
#! /usr/bin/env python

import pathlib
import platform
import re
import sys

import requests   # the actual script has a helper here

############################################################################

upload_url = 'http://www.davek.com.au/td/uploaddata.asp'
upfile = "updated.prices"
if len(sys.argv) > 1:
    upfile = sys.argv[1]

############################################################################

if not pathlib.Path(upfile).is_file():
    raise SystemExit("ERROR: File not found: {}".format(upfile))

files = {
}
r = requests.post(
        upload_url,
        files={
            'Filename': (
                upfile,
                open(upfile, 'rb'),
                'text/plain',
                {
                    "Expires": '300',
                }
            ),
        }
)

response = r.text
m = re.search(r'UPLOAD RESULT:\s*(.*?)<br', response, re.IGNORECASE)
if not m:
    raise Exception("Unexpected result:\n" + r.text)

resultCode = m.group(1)
if resultCode.startswith("SUCCESS"):
    raise SystemExit("Upload complete")

print("Upload failed: {}".format(resultCode))

The problem is that TradeDangerous uses "destructive imports" -- if you import the following file:

Code:
@ 1 G. CAELI/Smoot Gateway
   + Chemicals
      Explosives                  399       0     75124M         -  2014-12-26 01:45:00

TradeDangerous sees this as meaning Smoot Gateway sells Explosives and nothing else. The absence of data is considered a deletion.

Maddavo's site, for good reasons, considers this to be an update of Explosives.

What we found is that lots of stale data winds up lying around telling you that Station So-And-So is buying Algae at 3,000,000,000cr, and because the people frantically updating their prices for it do *not* send an explicit "this is no longer available" to maddavo's site, the data never goes away.

- - - - - Additional Content Posted / Auto Merge - - - - -

In these cases where we know there are numerous repeats of particular bad data, is it feasible to implement a mechanism to mitigate this?

Open the text file "corrections.py" in the main TD directory.
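Roughly speaking - and this is just an illustration from memory, so check the real file for the actual dictionary names and the DELETED sentinel value - it maps known-bad names onto corrections, with a special marker to drop an entry outright:

Code:
# illustrative only - not a verbatim excerpt of corrections.py
DELETED = "DELETED"

stations = {
    # "SYSTEM/Bad Station Name": "Corrected Name" (or DELETED)
    "SOME SYSTEM/Mispelt Statoin": "Misspelt Station",
    "SOME SYSTEM/Station That Never Existed": DELETED,
}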

-Oliver
 
In my Ubuntu there is no "pip", and when I do sudo apt-get install python-requests it says it's already the latest version. When I run madupload.py it says ImportError: No module named 'requests'.
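(The quoted module name in that ImportError suggests the script is running under Python 3, so the Python 2 "python-requests" package won't help. One likely fix - untested here - is:)

Code:
sudo apt-get install python3-requests
# or, if you'd rather use pip for Python 3:
sudo apt-get install python3-pip
sudo pip3 install requests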
 
Hi kfsone,
Code:
Owner@ARYA /c/apps/trade-loading-refactor (master)
$ trade.py buildcache -f
NOTE: Rebuilding cache file: this may take a moment.
*** INTERNAL ERROR: NOT NULL constraint failed: ShipVendor.station_id
CSV File: C:\apps\trade-loading-refactor\data\ShipVendor.csv:209
SQL Query: INSERT INTO ShipVendor (station_id,ship_id) VALUES((SELECT Station.station_id FROM Station INNER JOIN System USING(system_id) WHERE System.name = ? AND Station.name = ?),(SELECT Ship.ship_id FROM Ship WHERE Ship.name = ?))
Params: ['KALIKI', 'Oren City', 'Adder']

I am pretty sure this may be because my system/station files are out of sync and do not contain KALIKI/Oren City. The workaround is to delete the contents of ShipVendor.csv - if I delete just that station, I get the same error on another one, rinse and repeat. I tried updating System.csv and Station.csv but that does not seem to fix the problem, so I'm not sure what is going on. I am running my own local database. This error started happening for me about a week ago (I meant to mention it then, but as you were busy and sick, and I had a workaround and figured it was something in my setup, I did not want to bug you. Bug you, get it?)

If you're going to be selective in what you take from the TD:Station.csv, you probably want to apply the same filtration to the TD:ShipVendor.csv, and/or remove it after updating.

The error is not very friendly, but there's a multi-fold reason for this: a) the SQLite api doesn't provide good feedback on these errors (you're seeing what it gives us, and it's as unhelpful to us as it is to you), b) it would be very costly in perf and code complexity to pre-empt them, c) to discourage people from playing fast-and-loose with the approaches they take to managing their files (for the time being), d) because we're moving towards a system based on the .db file rather than the .csv files.
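If it helps as a stop-gap, here is a rough one-off filter along those lines (not part of TD; it assumes the first two columns of both files are the system and station names, which is what the INSERT in the error output suggests, and it writes to a new file so you can eyeball the result before replacing anything):

Code:
#! /usr/bin/env python
# Hypothetical one-off filter, not part of TD: keep only the
# ShipVendor.csv rows whose system/station pair also appears in your
# trimmed Station.csv. Column positions are assumptions - check the
# header lines of both files first.

import csv

with open("data/Station.csv", newline="") as fh:
    rows = csv.reader(fh)
    next(rows)                                   # skip the header line
    known = {(r[0].upper(), r[1].upper()) for r in rows if len(r) > 1}

with open("data/ShipVendor.csv", newline="") as fh:
    rows = csv.reader(fh)
    header = next(rows)
    kept = [r for r in rows
            if len(r) > 1 and (r[0].upper(), r[1].upper()) in known]

with open("data/ShipVendor.filtered.csv", "w", newline="") as out:
    writer = csv.writer(out)
    writer.writerow(header)
    writer.writerows(kept)

print("kept", len(kept), "of the ShipVendor rows")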

2. This error I only get with trade-loading-refactor when I try to rebuild the cache using my tradedangerous.prices file and the -f option:

Code:
Owner@ARYA /c/apps/trade-loading-refactor (master)
$ trade.py buildcache -f
NOTE: Rebuilding cache file: this may take a moment.
Traceback (most recent call last):
  File "./trade.py", line 76, in <module>
    main(sys.argv)
  File "./trade.py", line 50, in main
    results = cmdenv.run(tdb)
  File "c:\apps\trade-loading-refactor\commands\commandenv.py", line 79, in run
    return self._cmd.run(results, self, tdb)
  File "c:\apps\trade-loading-refactor\commands\buildcache_cmd.py", line 70, in
run
    buildCache(tdb, cmdenv)
  File "c:\apps\trade-loading-refactor\cache.py", line 866, in buildCache
    processImportFile(tdenv, tempDB, Path(importName), importTable)
  File "c:\apps\trade-loading-refactor\cache.py", line 688, in processImportFile


    columnDefs = next(csvin)
StopIteration

As I am not sure what the issue is here exactly (I bet it is something in my .prices or Station.csv file which is not in the branch files, or perhaps I should have started by importing my prices file rather than rebuilding the cache), and I have not yet checked out my ED 1.04 upgrade to ensure it works, I'm going to call it quits on testing this branch for tonight. Tomorrow I'll do a clean clone of it, try from scratch with an empty database and prices file, and let you know how it goes.

Ok, that's a case where I can add an exception handler with a warning. It'll basically tell you this:

"unexpected end of file". I've only seen this happen when you have made a .csv file blank, including removing the first line, rather than removing the file entirely.
 