In-Development TradeDangerous: power-user trade optimizer

This is a great tool. Really useful and far easier to get my head around for an old command-line chap like me... :)

Couple of questions:

1. When fetching the source from bitbucket, does this include the latest prices data, or is it better to do the --plug=maddavo sync?

2. What's best practice for keeping your own data intact, both prices and additional stations, when you check out the source for an update? I've noticed the prices and stations are still intact in my data folder, so should doing a 'buildcache -f' bring them back in? Is it worth doing a diff?
 
Combat Stabilisers in Bhil Mina/Mohmand Gateway need to be purged. At least, the station wasn't buying them when I visited (it had no entry for them at all), so I'm assuming it was an auto-OCR error by someone.

Same for an auto-OCR mistake of Animal Monitors against Auto-Fabricators at TABIT/Haber City (remove Animal Monitors).

Maybe a simple approach would be to store the Galactic Average Price for each commodity, then let the user set a percentage threshold (up and down) that prices in the db must fall within to be accepted.

This seems like a straightforward way to get around bad data until someone comes up with a better solution.
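As a rough illustration of that check (not TD code; the averages and the 10% threshold here are made up for the example):

Code:
# Flag a submitted price as suspect if it strays too far from the
# Galactic Average for that commodity.
GALACTIC_AVG = {          # illustrative numbers, not real averages
    "Palladium": 13298,
    "Gold": 9401,
}

def is_suspect(item, price, threshold=0.10):
    """True if price deviates from the galactic average by more than
    the threshold fraction (0.10 = 10% up or down)."""
    avg = GALACTIC_AVG.get(item)
    if avg is None:
        return False      # no average on record, can't judge
    return abs(price - avg) / avg > threshold

print(is_suspect("Palladium", 20000))   # True  - way off the average
print(is_suspect("Palladium", 13149))   # False - within 10%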
 
ALPHA CENTAURI/Hutton Orbital

Palladium 13148 13149 ? 579M 2014-12-22 00:39:00
Platinum 13107 13108 ? 598M 2014-12-19 03:24:46

Palladium is the correct one; Platinum does not exist there.
 
ALPHA CENTAURI/Hutton Orbital

Palladium 13148 13149 ? 579M 2014-12-22 00:39:00
Platinum 13107 13108 ? 598M 2014-12-19 03:24:46

Palladium is the correct one; Platinum does not exist there.

These two seem to get mixed up quite often!
 
Hey guys, I love the tool. Many options and good calculations. My problem is the out-of-date data. I use the plugin to retrieve maddavo's prices. Is there a better source for prices? And can I tell TD to filter out all data older than a given age?
 
As EliteOCR works fine for me now, I'll try to update prices for TradeDangerous too. Basically, though, anything older than Dec 22nd is data I wouldn't trust because of the galaxy server update.
 
Add the ability to restrict results by ship docking pad size.

This is in the works.

Hey guys, I love the tool. Many options and good calculations. My problem is the out-of-date data. I use the plugin to retrieve maddavo's prices. Is there a better source for prices? And can I tell TD to filter out all data older than a given age?

When you do a run you can use the -MD (or --max-days-old) option to restrict the maximum age (in days) of trade data to use. See "trade.py run -h" for all the options.
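For example, to ignore anything more than a couple of days old (the other numbers are just illustrative):

Code:
trade.py run --fr aulinenterprise --cr 2000000 --cap 532 --ly 12.02 --max-days-old 2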
 
How does --max-days-old handle different timezones?

Is the database in maddavo's AU time zone or the game's UTC?

I haven't experimented with this much yet, but I think it works OK when I update locally. When I pull updates from the server, though, I get all kinds of "33 hours" ages even on new data - like just now, when I updated Sol/Columbus, uploaded the prices to maddavo's site, and a bit later ran the import command.
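To illustrate what I suspect is happening - a minimal sketch (not TD's actual code) of how comparing a UTC stamp against local time inflates the age by the UTC offset:

Code:
from datetime import datetime, timezone

# A price timestamp as it appears in the data (game time, i.e. UTC).
stamp = datetime.strptime("2014-12-24 02:27:00", "%Y-%m-%d %H:%M:%S")

# Wrong: comparing a UTC stamp against local "now" skews the age by
# your UTC offset (e.g. roughly +11h for Australian east-coast summer time).
age_wrong = datetime.now() - stamp

# Right: treat the stamp as UTC and compare against UTC "now".
age_right = datetime.now(timezone.utc) - stamp.replace(tzinfo=timezone.utc)

print(age_wrong, age_right)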

BTW "trade.py import --help" command says:

--maddavo [Deprecated] Import prices from Maddavo's site. Use "--plug=madadvo" instead
 
The way the file is formatted, there would be spaces instead of numbers, so the size would be the same. What makes it big is the number of stations and prices, which just grows. It's up to 4MB now, and I see updates are coming in much quicker. I suspect that with new players and everyone using EliteOCR it's easier to update prices, so it's all snowballing. I'm considering purging older data from before 30th Nov. Surely that's not relevant now.


Purge all data from before the December 16th release instead.
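Something like this quick-and-dirty filter would do it (a sketch only; it assumes each price line carries a "YYYY-MM-DD HH:MM:SS" stamp like the entries quoted above, and that TradeDangerous.prices is the file in question):

Code:
import re
from datetime import datetime

CUTOFF = datetime(2014, 12, 16)   # the December 16th release
STAMP = re.compile(r"\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}")

def keep(line):
    """Keep non-price lines (headers, comments) and any price line
    whose timestamp is on or after the cutoff."""
    match = STAMP.search(line)
    if not match:
        return True
    return datetime.strptime(match.group(0), "%Y-%m-%d %H:%M:%S") >= CUTOFF

with open("TradeDangerous.prices") as src, open("purged.prices", "w") as dst:
    dst.writelines(line for line in src if keep(line))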
 
Dave: Part of the problem is people submitting similar but different names - you have quite a few duplicates in there. For instance, check the stations you have in Tyr...

This is one possible problem with trusting user-submitted data: we make mistakes (cf. I screwed up "Ron Hubbard" because Rob Hubbard wrote awesome tunes and he and I hung out on C|Net). There are other cases caused by Frontier's typos. Sometimes people correct them: search for 'HANGAR' in your .csv and notice that each one has a corresponding 'HANGER' entry, because that's actually what Frontier used.
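For anyone curious, here's the sort of throwaway script that can hunt for those near-duplicates in a Station.csv (purely illustrative, not part of TD; it assumes system and station names are the first two columns):

Code:
import csv
from difflib import SequenceMatcher

def normalize(name):
    """Crude normalization: uppercase, letters and digits only."""
    return "".join(ch for ch in name.upper() if ch.isalnum())

# Load (system, station) pairs - the column layout here is an assumption.
with open("Station.csv", newline="") as fh:
    rows = [(row[0], row[1]) for row in csv.reader(fh) if len(row) >= 2]

# Group stations by system, then flag pairs of names that look alike
# (e.g. "HANGAR" vs "HANGER", or the same name typed two ways).
by_system = {}
for system, station in rows:
    by_system.setdefault(system, []).append(station)

for system, stations in sorted(by_system.items()):
    for i, a in enumerate(stations):
        for b in stations[i + 1:]:
            ratio = SequenceMatcher(None, normalize(a), normalize(b)).ratio()
            if a != b and ratio > 0.8:
                print("{}: possible duplicate '{}' / '{}'".format(system, a, b))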

Hey, I fell for that one. Couldn't find MCC467/RonHubbard. :)
I see it's fixed now.
 
Navigation is broken:

Code:
kfsone-tradedangerous-ce7ca724008c>trade.py nav --ly-per 14.14 Orrere "39 Tauri"
System                   JumpLy
-------------------------------
ORRERE                     0.00
USZAA                      2.00
ARQUE                     17.47
VUCUB HUAN                 9.45
CRUCIS SECTOR FM-V B2-1   20.05
CRUCIS SECTOR FM-V B2-4   11.25
ER 8                      12.66
LHS 336                   17.28
LHS 2447                   6.21
MILDEPTU                  17.64
G 99-49                   14.90
ROSS 33                   16.04
ERNE                      13.11
39 TAURI                  16.04

Even with --ly-per "10" I still get jumps > 14 LY.

This is now fixed (c/o Gazelle).
 
This is a great tool. Really useful and far easier to get my head around for an old command-line chap like me... :)

Couple of questions:

1. When fetching the source from bitbucket, does this include the latest prices data, or is it better to do the --plug=maddavo sync?

The bitbucket repository does not include the latest prices (and I will probably remove systems and stations soon; I think the code and data need to be separated so people can keep up with the rate at which TD changes without the PITA of keeping their data from getting screwed up).

2. What's best practice for keeping your own data intact, both prices and additional stations, when you check out the source for an update? I've noticed the prices and stations are still intact in my data folder, so should doing a 'buildcache -f' bring them back in? Is it worth doing a diff?

At the moment, the ".db" file is a "cache"; TD uses it as a crutch to save itself from having to parse the .csv files every time, treating the ".db" as an actual database file instead.

With this will come support for a proper "import" along the lines people are expecting: it will add what's in the imported file without deleting what isn't, update only when the imported data is newer rather than always overwriting blindly, and so on.
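Roughly the rule being described, as a sketch (not the actual implementation; keying records by station+item with a "modified" timestamp is just my assumption here):

Code:
from datetime import datetime

def merge(existing, incoming):
    """Merge incoming price records into the existing ones: add anything
    new, overwrite only when the incoming record is newer, and never
    delete records the import doesn't mention."""
    merged = dict(existing)
    for key, record in incoming.items():
        current = merged.get(key)
        if current is None or record["modified"] > current["modified"]:
            merged[key] = record
    return merged

ts = lambda s: datetime.strptime(s, "%Y-%m-%d %H:%M:%S")
existing = {("Aulin/Enterprise", "Gold"): {"sell": 9400, "modified": ts("2014-12-24 02:27:00")}}
incoming = {("Aulin/Enterprise", "Gold"): {"sell": 9100, "modified": ts("2014-12-06 10:00:00")}}
print(merge(existing, incoming))   # keeps the newer local record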

So at the moment, the manual process is a bit of a mess.
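For now the gist of the manual routine is something like this (a sketch only; the exact paths depend on how you unpack the source, and the forced cache rebuild is the bit that matters):

Code:
# keep a copy of your own edits before unpacking the new source
cp -r data data-backup

# ... unpack/clone the new TradeDangerous source here ...

# restore your edited .csv/.prices files, then force a cache rebuild
cp data-backup/*.csv data-backup/*.prices data/
trade.py buildcache -f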
 
These two seem to get mixed up quite often!

The new version of the update UI yesterday should make it a little harder to make these mistakes, but it won't be perfect until we get the current errors out. I'd love to replace the UI with something... usable. I went with the UI system that comes with Python to save people from installing packages, but it's bloody awful to work with and I'm a bloody awful UI programmer. I may take a look at using GTK+, which means an additional install step, but if it's easy enough to work with I might be able to make it pay off with additional UI elements, e.g. for updating stations.

The other option might be for me to take a crack at a Mono/.NET UI for it; I've done that before, I just don't want to introduce a lot of install steps :)
 
Change log as of tonight. The "pad size" support is there but there are no options to filter by it yet.

If you have station distance, black market data or pad size data to contribute, please submit it here:
https://bitbucket.org/kfsone/tradedangerous/issue/105/collect-distance-black-market-and-pad-size


Code:
v6.3.0 Dec 23 2014
. (OpenSS) Script for Windows users (see scripts/README.txt)!
. (kfsone) Added support for pad sizes at stations,
. (kfsone) Revamped output from "local", "buy", "sell" and "olddata".
    - "--ages" is the default now (so the option is not needed),
    - Added pad size display,
    - Improved black market display,
    - Changed '+' to '/' for consistency on station lines,
. (bgol) Fixed weirdness with rangeCache
. (bgol/kfsone) Fixed jump distances and performance of "nav" command
+ Stations, Data: kfsone

v6.2.4 Dec 21 2014
. (kfsone) Experimental "add-station" command in misc,
. (kfsone) Added "--near" to olddata command,
. (kfsone) Route calculation performance,
. (kfsone) Added "Black Market" flag to station data,
. (kfsone) Added Black Market indicators to "local" command,
. (kfsone) Reorganized Ship and ShipVendor data (prices are ship based now),
. (kfsone) Draft version of "jsonprices",
. (gazelle) Auto-completion for bash users (see scripts/README.txt)
. (gazelle) Nice overhaul of the csv export command
. (kfsone) Fix for UTF-8 decoding error,
. (kfsone) Rebuild cache before .prices file after downloading .csvs
. (maddavo) Combat Stabilisers do exist
+ Systems, Stations, Data: Maddavo, Gazelle, Kfsone, many others

v6.2.3 Dec 17 2014
. (kfsone) "maddavo" import plugin:
            . --opt=skipdl will use previous downloads
            . added timestamp tracking/checking (data/maddavo.stamp)
            . don't rebuild the cache if nothing new was downloaded,
            . use "-v" to see stats on what updates you've downloaded,
            . use "-q" to see less output when using the plugin
. (kfsone) Changed "nav" to show station count with "-v" instead of "--stations"
. (kfsone) Removed the StationLink table - cache builds should be MUCH faster

v6.2.2 Dec 17 2014
. (kfsone) Removed Alloys, Combat Stabilisers, Cotton and Plastics
  [there was no entry for them anywhere in the price database]
. (kfsone) Plugins can now have their own options
    maddavo's plugin will take --option=syscsv and
      --option=stncsv to download those .csv files
. (kfsone) Added "--download" option to "import" to stop after downloading
. (kfsone) Added "--url" option to "import", e.g.
    trade.py import --download --url http://kfs.org/td/prices kfs.prices
. (kfsone) "--check --mfd" should now work if you have 64-bit drivers
. (kfsone) Added "--max-days" to run command
+ Stations, ships: kfsone

v6.2.1 Dec 12 2014
. (kfsone) Added "olddata" command
. (kfsone) "run" (with -vv) will now show data age
. (kfsone) Gamma 2.0.5 renamed "Slaves" category to "Slavery"
. (kfsone) "sell" now has a --ages option too
. (kfsone) "buy" and "sell" --near now checks the station/system too
. (kfsone) "buy" now shows average cost if you specify --detail (-v)
. (kfsone) "sell" now shows average value if you specify --detail (-v)
. (kfsone) Fixed item name matching (--avoid)
. (kfsone) Fixed use of via in "run"
. (kfsone) Exposed cache.regeneratePricesFile()
. (kfsone) Call regeneratePricesFile() after calling plugin.finish()
. (kfsone) General code cleanup (removed buildLinks and loadTrades)
. (kfsone) Added VisualStudio pyproj (great for performance analysis)
+ Stations, distances, ship data: gazelle, gulasch, kfsone, mseven
 
Thanks for the xmas gift update, much appreciated :)

Now that I'm running the new version, I first imported new data with:
Code:
trade.py import --plug=maddavo --opt=syscsv --opt=stncsv -v
then I ran this query:
Code:
trade.py run --cr 2000000 --cap 532 --ly 12.02 -vvv --avoid edwardsring,chiherculis --fr aulinenterprise
and the results say the data is 18 days old. Then I ran:
Code:
trade.py update -T --editor=nano aulin/enterprise
and the prices (with a few exceptions, though not the ones the query results showed) are dated 2014-12-24 02:27:00.

Any ideas where I went wrong? :eek:
 