In-Development TradeDangerous: power-user trade optimizer

I'm not quite sure how TD works. Would it be possible to just fetch your tradedangerous.sql file and run sqlite3 on it, or would that destroy data in the database - like the prices info?

You can just copy the new tradedangerous.sql into your data directory (or edit your existing one) and the next call to trade.py will recreate the database and reimport your price file. The price file is kept up to date with every "update" command you have run.

If a system or station you had prices for is renamed, you also have to manually correct the names in the price file first, otherwise the import will result in an error.
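
If you just want to poke around in the generated database yourself, you can open it read-only so there is no risk of clobbering anything. Here's a minimal sketch, assuming the default data/TradeDangerous.db location that trade.py builds (the table names match the ones in the .sql file):

Code:
# Hedged sketch: inspect the database trade.py generates, without
# being able to modify it. Assumes the default data/TradeDangerous.db.
import sqlite3

# Open read-only via a URI so nothing can be written by accident.
conn = sqlite3.connect("file:data/TradeDangerous.db?mode=ro", uri=True)

for table in ("System", "Station"):
    count = conn.execute("SELECT COUNT(*) FROM {}".format(table)).fetchone()[0]
    print("{}: {} rows".format(table, count))

conn.close()

Since the .db file is regenerated from tradedangerous.sql plus your .prices file anyway, the .prices file is the part worth backing up; nothing you do to the database copy can lose your price data.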
 

wolverine2710

Tutorial & Guide Writer
Commanders, there are new checked/confirmed coords, done by RedWizard. You might want to check them out in the crowd sourcing thread here.

Also, please DO check out his tool; it's looking sweet and very easy to use. It's on GitHub here. For those unfamiliar with git, just select 'download as zip'.
 

wolverine2710

Tutorial & Guide Writer
Guess that's a no :(

You seemed to have missed the following from the OP: "TradeDangerous is Open Source - I've written it so that other developers can make use of the extremely fast and efficient calculation engine to build tools."

Commanders here do want to help, but there are timezones to consider - so a bit of patience is a good thing. If you had searched this thread a bit you would have found post #116 and post #227. The first one has built a GUI, the second is trying to build one in. The last post is from today. Hope the info helps a bit.
 
Last edited:
Hello guys.

Code:
C:\td>trade.py run --sh hauler --fr chan --cr 20000
Traceback (most recent call last):
  File "C:\td\trade.py", line 936, in <module>
    main()
  File "C:\td\trade.py", line 925, in main
    tdb = TradeDB(debug=args.debug, dbFilename=args.db)
  File "C:\td\tradedb.py", line 344, in __init__
    self.reloadCache()
  File "C:\td\tradedb.py", line 411, in reloadCache
    data.buildcache.buildCache(dbPath=self.dbPath, sqlPath=self.sqlPath, pricesPath=self.pricesPath)
  File "C:\td\data\buildcache.py", line 156, in buildCache
    processPricesFile(tempDB, pricesPath)
  File "C:\td\data\buildcache.py", line 118, in processPricesFile
    for price in priceLineNegotiator(pricesFile, db, debug):
  File "C:\td\data\buildcache.py", line 99, in priceLineNegotiator
    itemID = itemsByName["{}:{}".format(categoryID, itemName)] if qualityItemWithCategory else itemsByName[itemName]
KeyError: 'Battle Weapons'

Anyone able to give me a hand?

Anyone? I have the correct version of python installed!

Did I miss a step somewhere? :(
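
For anyone hitting the same thing: the KeyError means buildcache found an item name in your .prices file ('Battle Weapons' here) that it could not match against the items defined in the .sql file. A crude way to confirm which side is out of date is a plain text search over both files - a hedged sketch, assuming the usual data/ file names (adjust the paths if yours differ):

Code:
# Hedged sketch: check whether an item name from the .prices file also
# appears in the item definitions of the .sql file. File names assume
# the usual data/ layout; adjust if yours differ.
name = "Battle Weapons"

with open("data/TradeDangerous.prices", encoding="utf-8") as f:
    in_prices = any(name in line for line in f)

with open("data/TradeDangerous.sql", encoding="utf-8") as f:
    in_sql = any(name in line for line in f)

print("in .prices: {}  in .sql: {}".format(in_prices, in_sql))
# Present in .prices but not in .sql usually means a renamed or
# misspelled item in the .prices file, or an out-of-date .sql.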
 
As a player with very little programming experience, I am finding it difficult to work out how to update prices.

I tried running something like

trade.py update "chango" --editor "C:\Program Files (x86)\Notepad++\notepad++.exe" --all

from the cmd prompt, and I get a prices.tmp file in Notepad++.

The problem is what to do next. Not sure how to edit the prices and have it update the main price list. Any help would be much appreciated.

I don't know of a command-line option for Notepad++ that makes it wait until you're done editing to return control to the calling process.

I would encourage you to use either

Code:
trade.py update --note chango

or install Sublime Text and use

Code:
trade.py update --subl chango

You only need the "--all" if you intend to supply the timestamp, demand and stock details. I would start out just using the price columns.

I'll try npp at home tonight and see if I can get it to run in a "wait until done editing" fashion, and if so I'll add "--n++" for launching it.
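
In the meantime, if you want to experiment with another editor, the general idea is just "launch the editor and block until it exits". A minimal sketch of that pattern (not the actual trade.py code; plain Notepad is used here because it reliably blocks until you close it):

Code:
# Hedged sketch of the "wait until done editing" pattern, not the
# actual trade.py implementation. Plain Notepad blocks until closed.
import subprocess

editor = r"C:\Windows\System32\notepad.exe"
prices_tmp = "prices.tmp"

# call() only returns once the editor process exits, so the edited
# file can safely be read back afterwards.
subprocess.call([editor, prices_tmp])

with open(prices_tmp, encoding="utf-8") as f:
    print("edited file is {} lines long".format(sum(1 for _ in f)))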

-Oliver
 
I'm tinkering with a web interface that passes args to trade.py in the background. I have managed to get trade.py to output the -h help and the run -h help on a web page, so I know that Python runs it OK.

But when I try to get it to execute the run command with actual args then I get the following output:
Code:
Traceback (most recent call last):
  File "C:\[REDACTED]\TradeDangerous\trade.py", line 1008, in <module>
    main()
  File "C:\[REDACTED]\TradeDangerous\trade.py", line 997, in main
    tdb = TradeDB(debug=args.debug, dbFilename=args.db)
  File "C:\[REDACTED]\TradeDangerous\tradedb.py", line 344, in __init__
    self.reloadCache()
  File "C:\[REDACTED]\TradeDangerous\tradedb.py", line 411, in reloadCache
    data.buildcache.buildCache(dbPath=self.dbPath, sqlPath=self.sqlPath, pricesPath=self.pricesPath, debug=self.debug)
  File "C:\[REDACTED]\TradeDangerous\data\buildcache.py", line 152, in buildCache
    with sqlPath.open() as sqlFile:
  File "C:\python34\lib\pathlib.py", line 1070, in open
    opener=self._opener)
  File "C:\python34\lib\pathlib.py", line 944, in _opener
    return self._accessor.open(self, flags, mode)
  File "C:\python34\lib\pathlib.py", line 323, in wrapped
    return strfunc(str(pathobj), *args)
FileNotFoundError: [Errno 2] No such file or directory: 'data\\TradeDangerous.sql'

I can run the command on the web server console OK, so I think it must be some kind of permissions thing. But I have set permissions for the TradeDangerous directory to "Full Control" for the Everyone user in IIS. Anyone know why this would be?

Could you grab the change I just pushed to buildcache.py and run it again with "-w"? That should give you some debug output to help (specifically, it tells us where the script is running so we can see whether it's using the correct path).
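
One thing worth ruling out before permissions: trade.py opens data\TradeDangerous.sql relative to its working directory, and IIS typically starts the process with a different working directory than your console session, so the relative path simply doesn't resolve. A hedged sketch of calling trade.py from a web backend with the working directory pinned (all paths and the Python launcher are placeholders):

Code:
# Hedged sketch: invoke trade.py with the working directory pinned to
# the TradeDangerous folder so relative paths like data\TradeDangerous.sql
# resolve the same way they do on the console. Paths are placeholders.
import subprocess

TD_DIR = r"C:\path\to\TradeDangerous"

output = subprocess.check_output(
    ["python", "trade.py", "run", "--sh", "hauler", "--fr", "chan", "--cr", "20000"],
    cwd=TD_DIR,                   # resolve relative paths from here
    stderr=subprocess.STDOUT,     # merge stderr into the captured output
)
print(output.decode("utf-8", errors="replace"))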
 
Late to the party... sorry.

Is TD reliant solely on "crowd-sourced" input for the pricing and availability data, and is it working in B2.x today?

I know that Slopey's tools got neutered by FD with the removal of the data extract function from the client.

How does TD keep up to date with real-time?

Many TIA.
 

wolverine2710

Tutorial & Guide Writer
Copy paste from the crowd sourcing thread.

Smacker, 397 star systems, GREAT work.
I've been slacking but here is one more.
INSERT INTO "System" VALUES(,'Wredguia XH-Q B46-3',-81.96875,46.71875,-40.375,'2014-10-16 16:44:44');

Raw data from RW's tool.
Code:
INSERT INTO "System" VALUES(,'Wredguia XH-Q B46-3',-81.96875,46.71875,-40.375,'2014-10-16 16:44:44');

{
  "name": "Wredguia XH-Q B46-3",
  "x": -81.96875,
  "y": 46.71875,
  "z": -40.375,
  "calculated": true,
  "distances": [
    {
      "system": "Sol",
      "distance": 102.624
    },
    {
      "system": "Wolf 497",
      "distance": 97.503
    },
    {
      "system": "Huokang",
      "distance": 72.276
    },
    {
      "system": "Demeter",
      "distance": 55.152
    },
    {
      "system": "Clotti",
      "distance": 34.755
    },
    {
      "system": "Haras",
      "distance": 52.505
    }
  ]
}

RW's tool has reduced my copy/paste work - and the chances for error - quite a bit ;-)
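
A handy sanity check for calculated entries like the one above: Sol sits at the origin of the coordinate system these tools use, so the quoted Sol distance should equal the plain Euclidean length of the (x, y, z) vector. A small sketch of that check:

Code:
# Hedged sketch: sanity-check a calculated system entry against its
# quoted distance to Sol, which sits at (0, 0, 0).
import math

x, y, z = -81.96875, 46.71875, -40.375   # Wredguia XH-Q B46-3, from the raw data
quoted_sol_distance = 102.624

computed = math.sqrt(x * x + y * y + z * z)
print("computed: {:.3f} ly, quoted: {} ly".format(computed, quoted_sol_distance))
# Prints roughly 102.624 ly, matching the tool's value.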

Hopefully "the One" can add it to TOTR soon ;-)
Gonna grab your tradedangerous.sql from your fork in the meantime.
 
Last edited:

wolverine2710

Tutorial & Guide Writer
Is TD reliant solely on "crowd-sourced" input for the pricing and availability data, and is it working in B2.x today?

I know that Slopey's tools got neutered by FD with the removal of the data extract function from the client.

How does TD keep up to date with real-time?

Many TIA.

Yes. I forget the URL, but someone posted one where you can upload your .prices file and then get an updated one back. If more cmdrs use it, it could be somewhat up to date - as in a wee bit.

Unless OCR-ing the commodities screen succeeds, there will be no 'near' realtime updates for TD. Using TD as a great navigation tool atm.
 
There are some systems in Brookes' file that do not have a corresponding entry in the SQLite database yet (I used the one from Smacker's fork).

For my own use I added a fake station to these systems, named "Unknown station XXX", with no other meaningful data. It's useful to know if a station is there for route planning; my starchart uses this information.

I don't use Trade Dangerous, but if you want me to add this data, I can create a fork to add it to TradeDangerous.sql.
 

wolverine2710

Tutorial & Guide Writer
There are some systems in Brookes' file that do not have a corresponding entry in the SQLite database yet (I used the one from Smacker's fork).

For my own use I added a fake station to these systems, named "Unknown station XXX", with no other meaningful data. It's useful to know if a station is there for route planning; my starchart uses this information.

I don't use Trade Dangerous, but if you want me to add this data, I can create a fork to add it to TradeDangerous.sql.

Not sure what you mean exactly. MB's file should contain a list of star systems that have a station in them. According to the kfsone/smacker repository this is not always the case. Is that what you mean? Not sure why you want to add an "Unknown station XXX"? Maybe I've interpreted your post incorrectly!

Maybe it's best to post your inserts here so that smacker can add them to his .sql file. I can imagine otherwise there will be too many forks and it will clutter things. Just my 2 eurocents. Thanks for the input btw.
 
Not sure what you mean exactly. MB's file should contain a list of star systems that have a station in them. According to the kfsone/smacker repository this is not always the case. Is that what you mean? Not sure why you want to add an "Unknown station XXX"? Maybe I've interpreted your post incorrectly!

Maybe it's best to post your inserts here so that smacker can add them to his .sql file. I can imagine otherwise there will be too many forks and it will clutter things. Just my 2 eurocents. Thanks for the input btw.

Sorry for the confusion. Perhaps it is because English is not my primary language. I use the coordinates outside TradeDangerous, for my own map. For this map I just want to know whether there is a station at all.

My idea was that once a real station is added to one of these systems, then the "Unknown" entry would be removed. Until then we would at least know if there was a station at all.

Or perhaps I am wrong; I assumed that if a system is inhabited, there must be at least one station or outpost suitable for refueling, if nothing else. Is my assumption wrong?

As for forks, they are quite normal with distributed version control systems. If I made one and added my changes, then Smacker or kfsone could merge them with a couple of clicks. At least that is how GitHub works, and Bitbucket can't be that much different.
 
Here are the files and the SQL I used anyway. They are quite trivial, and like I said, likely useless except for routing.

https://dl.dropboxusercontent.com/u/276965/missing-stations.zip

Code:
BEGIN TRANSACTION;
INSERT INTO Station(name, system_id, ls_from_star) VALUES('Unknown Station 0', (SELECT system_id FROM System WHERE name = '21 Eta Ursae Minoris'), 0);
INSERT INTO Station(name, system_id, ls_from_star) VALUES('Unknown Station 1', (SELECT system_id FROM System WHERE name = '26 Draconis'), 0);
INSERT INTO Station(name, system_id, ls_from_star) VALUES('Unknown Station 2', (SELECT system_id FROM System WHERE name = 'Aeolus'), 0);
INSERT INTO Station(name, system_id, ls_from_star) VALUES('Unknown Station 3', (SELECT system_id FROM System WHERE name = 'Altais'), 0);
INSERT INTO Station(name, system_id, ls_from_star) VALUES('Unknown Station 4', (SELECT system_id FROM System WHERE name = 'BD+55 1519'), 0);
INSERT INTO Station(name, system_id, ls_from_star) VALUES('Unknown Station 5', (SELECT system_id FROM System WHERE name = 'BD+65 1846'), 0);
INSERT INTO Station(name, system_id, ls_from_star) VALUES('Unknown Station 6', (SELECT system_id FROM System WHERE name = 'Balmus'), 0);
INSERT INTO Station(name, system_id, ls_from_star) VALUES('Unknown Station 7', (SELECT system_id FROM System WHERE name = 'CR Draconis'), 0);
INSERT INTO Station(name, system_id, ls_from_star) VALUES('Unknown Station 8', (SELECT system_id FROM System WHERE name = 'Demeter'), 0);
INSERT INTO Station(name, system_id, ls_from_star) VALUES('Unknown Station 9', (SELECT system_id FROM System WHERE name = 'Eos'), 0);
INSERT INTO Station(name, system_id, ls_from_star) VALUES('Unknown Station 10', (SELECT system_id FROM System WHERE name = 'Eta Coronae Borealis'), 0);
INSERT INTO Station(name, system_id, ls_from_star) VALUES('Unknown Station 11', (SELECT system_id FROM System WHERE name = 'GCRV 13292'), 0);
INSERT INTO Station(name, system_id, ls_from_star) VALUES('Unknown Station 12', (SELECT system_id FROM System WHERE name = 'GD 319'), 0);
INSERT INTO Station(name, system_id, ls_from_star) VALUES('Unknown Station 13', (SELECT system_id FROM System WHERE name = 'HIP 108110'), 0);
INSERT INTO Station(name, system_id, ls_from_star) VALUES('Unknown Station 14', (SELECT system_id FROM System WHERE name = 'HIP 7338'), 0);
INSERT INTO Station(name, system_id, ls_from_star) VALUES('Unknown Station 15', (SELECT system_id FROM System WHERE name = 'HIP 93119'), 0);
INSERT INTO Station(name, system_id, ls_from_star) VALUES('Unknown Station 16', (SELECT system_id FROM System WHERE name = 'HR 7925'), 0);
INSERT INTO Station(name, system_id, ls_from_star) VALUES('Unknown Station 17', (SELECT system_id FROM System WHERE name = 'Jaitu'), 0);
INSERT INTO Station(name, system_id, ls_from_star) VALUES('Unknown Station 18', (SELECT system_id FROM System WHERE name = 'LHS 140'), 0);
INSERT INTO Station(name, system_id, ls_from_star) VALUES('Unknown Station 19', (SELECT system_id FROM System WHERE name = 'LHS 215'), 0);
INSERT INTO Station(name, system_id, ls_from_star) VALUES('Unknown Station 20', (SELECT system_id FROM System WHERE name = 'LHS 2948'), 0);
INSERT INTO Station(name, system_id, ls_from_star) VALUES('Unknown Station 21', (SELECT system_id FROM System WHERE name = 'LHS 6309'), 0);
INSERT INTO Station(name, system_id, ls_from_star) VALUES('Unknown Station 22', (SELECT system_id FROM System WHERE name = 'LP 37-75'), 0);
INSERT INTO Station(name, system_id, ls_from_star) VALUES('Unknown Station 23', (SELECT system_id FROM System WHERE name = 'LTT 16016'), 0);
INSERT INTO Station(name, system_id, ls_from_star) VALUES('Unknown Station 24', (SELECT system_id FROM System WHERE name = 'LTT 16523'), 0);
INSERT INTO Station(name, system_id, ls_from_star) VALUES('Unknown Station 25', (SELECT system_id FROM System WHERE name = 'Lakluit'), 0);
INSERT INTO Station(name, system_id, ls_from_star) VALUES('Unknown Station 26', (SELECT system_id FROM System WHERE name = 'Lalande 29917'), 0);
INSERT INTO Station(name, system_id, ls_from_star) VALUES('Unknown Station 27', (SELECT system_id FROM System WHERE name = 'Loga'), 0);
INSERT INTO Station(name, system_id, ls_from_star) VALUES('Unknown Station 28', (SELECT system_id FROM System WHERE name = 'Manamaya'), 0);
INSERT INTO Station(name, system_id, ls_from_star) VALUES('Unknown Station 29', (SELECT system_id FROM System WHERE name = 'Medusa'), 0);
INSERT INTO Station(name, system_id, ls_from_star) VALUES('Unknown Station 30', (SELECT system_id FROM System WHERE name = 'Meliae'), 0);
INSERT INTO Station(name, system_id, ls_from_star) VALUES('Unknown Station 31', (SELECT system_id FROM System WHERE name = 'Men Samit'), 0);
INSERT INTO Station(name, system_id, ls_from_star) VALUES('Unknown Station 32', (SELECT system_id FROM System WHERE name = 'Merki'), 0);
INSERT INTO Station(name, system_id, ls_from_star) VALUES('Unknown Station 33', (SELECT system_id FROM System WHERE name = 'Mufrid'), 0);
INSERT INTO Station(name, system_id, ls_from_star) VALUES('Unknown Station 34', (SELECT system_id FROM System WHERE name = 'Njirika'), 0);
INSERT INTO Station(name, system_id, ls_from_star) VALUES('Unknown Station 35', (SELECT system_id FROM System WHERE name = 'Ross 210'), 0);
INSERT INTO Station(name, system_id, ls_from_star) VALUES('Unknown Station 36', (SELECT system_id FROM System WHERE name = 'Tollan'), 0);
INSERT INTO Station(name, system_id, ls_from_star) VALUES('Unknown Station 37', (SELECT system_id FROM System WHERE name = 'Zosia'), 0);
COMMIT;
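
If anyone wants to regenerate a list like this against a newer database instead of maintaining it by hand, the same placeholder INSERTs can be produced automatically from the systems that have no station row yet. A hedged sketch, assuming the default data/TradeDangerous.db and the System/Station columns used above:

Code:
# Hedged sketch: emit "Unknown Station" placeholder INSERTs for every
# system that has no station yet. Assumes data/TradeDangerous.db and the
# System(system_id, name) / Station(name, system_id, ls_from_star)
# columns used in the statements above.
import sqlite3

conn = sqlite3.connect("file:data/TradeDangerous.db?mode=ro", uri=True)

rows = conn.execute(
    "SELECT name FROM System"
    " WHERE system_id NOT IN (SELECT system_id FROM Station)"
    " ORDER BY name"
).fetchall()

print("BEGIN TRANSACTION;")
for i, (name,) in enumerate(rows):
    print(
        "INSERT INTO Station(name, system_id, ls_from_star)"
        " VALUES('Unknown Station {}',"
        " (SELECT system_id FROM System WHERE name = '{}'), 0);".format(
            i, name.replace("'", "''")  # escape single quotes for SQL
        )
    )
print("COMMIT;")
conn.close()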
 