In-Development TradeDangerous: power-user trade optimizer

Those stations are not in TD. Someone is using a modified version of TD that submitted those prices. We will add them so that everyone doesn't have to remove them from the prices file.

This is why we ask people to use the main version and submit fixes or additions to main, so we don't end up with multiple copies carrying different data.

Could you tell me where you get your .prices from?
Because we 'new users' don't have any source other than this one.
 

ShadowGar

Could you tell me where you get your prices from?
Because we 'new users' don't have any source other than this one.

At the moment, I would suggest that website. Using the price files is not the issue. The issue is that someone was using their own modified Trade Dangerous build with custom stations. They added the extra stations before we got them into the newer version, which caused the conflict. The current pull request has those systems in it; I just checked. So once kfsone approves the code updates, everything should work as normal.

I would suggest that anyone who has new stations and stars to add submit a ticket here: https://bitbucket.org/kfsone/tradedangerous/issues

That way we can keep track of them and add them. If people go ahead and add them in themselves and then submit their prices file to that price site, everyone without that station will get errors.
 
There must be great pressure on kfsone to merge all the changes in.

But adding new stations should not cause a conflict in the first place.

If the system/station data were in JSON (or another structured format) without numeric system/station IDs, conflicts could not happen. I did not think of this issue earlier. Perhaps using an SQL dump for the "base" star/system data is not a good idea, at least not suitable for distributed development.
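To illustrate the idea (this is a sketch only, not TD's actual schema; the `ls_from_star` field and the station entries are invented): if records are keyed by name instead of by a numeric ID, two forks can each add new stations and merge without ID collisions or renumbering.

```python
# Illustrative sketch only (not TD's actual schema): keying records by
# "System/Station" name instead of a numeric ID means two forks can each
# add new entries and merge without ID collisions.
import json

fork_a = {"Aulin/Aulin Enterprise": {"ls_from_star": 1273}}  # hypothetical field
fork_b = {"Dahan/Dahan Gateway": {"ls_from_star": 0}}

merged = {**fork_a, **fork_b}  # names are the keys, so no renumbering is needed
print(json.dumps(merged, indent=2))
```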
 

ShadowGar

I submitted them to the current pull request. I don't know if they will get in, but once the pull goes through I will add them myself if they don't make it in on this pull.

At the moment I'm waiting on the pull request from another fork to be approved before I add anything, since there are changes that will conflict with anything I add.

But keep them coming!
 
Been reading the discussion about the prices file. Note that the website just adds any new station/system combinations it finds in an uploaded prices file; I didn't want to have to fiddle with it on an ongoing basis. It doesn't check the station/system combinations against any table. It assumes good data from a TD-generated file.

Yeah, there were two stations identified in the prices data that weren't in the SQL. As ShadowGar identified, this is going to happen a lot in the future, so it would be good if someone made a handler for that in TD.
 
There must be great pressure on kfsone to merge all the changes in.

But adding new stations should not cause a conflict in the first place.

If the system/station data were in JSON (or another structured format) without numeric system/station IDs, conflicts could not happen. I did not think of this issue earlier. Perhaps using an SQL dump for the "base" star/system data is not a good idea, at least not suitable for distributed development.

It's not the station data; it's more that there's a LOT of work being done by the folks who are contributing right now. For instance, I'm about to merge 4.2.2, in which Smacker65/ShadowGar have incorporated a new way of indicating where data came from. Some of that spilled over into previous pull requests (Bitbucket is apparently a bit crap at this, which is weird because it's kind of the core of what they're supposed to do).

Anyway, star/station updates incoming.
 
Been reading the discussion about the prices file. Note that the website just adds any new station/system combinations it finds in an uploaded prices file; I didn't want to have to fiddle with it on an ongoing basis. It doesn't check the station/system combinations against any table. It assumes good data from a TD-generated file.

Yeah, there were two stations identified in the prices data that weren't in the SQL. As ShadowGar identified, this is going to happen a lot in the future, so it would be good if someone made a handler for that in TD.

We have a task to address this, and my current thinking is that we add an "import" command to trade.py so that you don't have to overwrite your .prices file.

After all, that's basically what the "update" command is doing, it just creates the file first and lets you edit it.
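A minimal sketch of how such an "import" might merge prices into the database rather than overwriting the .prices file. Everything here is an assumption for illustration: the simplified grammar ('@ System/Station' headers, '+' category lines, then 'item sell buy' rows) and the `Price` table are hypothetical, not TD's actual code or schema.

```python
# Hypothetical "import" sketch: parse a simplified .prices text and
# upsert rows into SQLite instead of regenerating the whole database.
import sqlite3

def import_prices(db, text):
    """Merge simplified .prices text into an open SQLite connection."""
    station = None
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith('#'):
            continue                            # blank line or comment
        if line.startswith('@'):
            station = line[1:].strip()          # "SYSTEM/Station" header
        elif line.startswith('+'):
            continue                            # category header, ignored here
        elif station:
            item, sell, buy = line.rsplit(None, 2)   # item name may contain spaces
            db.execute("INSERT OR REPLACE INTO Price VALUES (?, ?, ?, ?)",
                       (station, item, int(sell), int(buy)))
    db.commit()

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE Price (station TEXT, item TEXT, sell INT, buy INT, "
           "PRIMARY KEY (station, item))")
import_prices(db, "@ AULIN/Aulin Enterprise\n+ Foods\nFish 280 296\n")
```

Because the insert is an upsert keyed on (station, item), re-importing the same file is harmless, which is the property an "import" command would want.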
 
Thanks to (in alphabetical order ;-P) ShadowGar, Smacker65 and Wolverine. 4.2.3 has a whole glut of new data, and the ability to tell where a system/station was sourced from (which may prove handy internally or for developers consuming the TD data).

v4.2.3 Oct 17/2014
. (ShadowGar, Smacker65) Imported Harbinger and RedWizzard system data,
also added tracking of where data has come from. Thanks also to Wolverine
for his work identifying data sources.
 
The value of a trade route depends greatly on the length of the route and the stops in between; those are necessary, therefore routing is also necessary. Otherwise the user has to specify which systems he thinks can be reached, and even then it would still be good to know whether a hop can be done without refueling. My planner used the number of jumps plus the number of refuels needed, the latter multiplied by a factor (default 10), to calculate the cost of a trade hop. Did you come up with a better method?
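The cost function described above is simple enough to state directly; this sketch uses the default factor of 10 mentioned in the post:

```python
# Hop-cost heuristic as described: each jump counts 1, each refueling
# stop counts REFUEL_PENALTY extra (default factor 10 from the post).
REFUEL_PENALTY = 10

def hop_cost(jumps, refuels, penalty=REFUEL_PENALTY):
    """Cost of a trade hop in abstract 'effort' units."""
    return jumps + penalty * refuels

# e.g. a 3-jump hop needing one refueling stop costs 3 + 10*1 = 13
```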

Fundamentally, TD is trying to solve the NP variant of the profit problem: it's not just trying to solve travelling salesman (best route) but also knapsack (best cargo). Combine the two, add the A* algorithm, and you'll find that TD is kinda NP+ :). Even so, I think factoring in refueling is a good idea, but we probably need to also introduce the concept of travel time - because we don't already have enough axes to search along while figuring out the best route - it would, however, solve the problem of "Sell 100 x Fish, Buy 100 x Fish and fly on" :)
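The knapsack half can be made concrete with a toy example. This is an illustration only, not TD's actual solver: choose how many 1-ton units of each commodity to buy so that both the hold and the wallet are respected, maximizing resale profit. The commodities and prices are invented.

```python
# Toy "best cargo" (knapsack) sketch: memoized recursion over
# (hold slots remaining, credits remaining, commodity index).
def best_cargo(items, capacity, credits):
    """items: list of (name, buy_price, sell_price); each unit fills 1 ton."""
    memo = {}

    def solve(slots, money, idx):
        if idx == len(items) or slots == 0:
            return 0
        key = (slots, money, idx)
        if key not in memo:
            name, buy, sell = items[idx]
            profit = solve(slots, money, idx + 1)          # skip this commodity
            if buy <= money:                               # ...or load one more unit
                profit = max(profit,
                             (sell - buy) + solve(slots - 1, money - buy, idx))
            memo[key] = profit
        return memo[key]

    return solve(capacity, credits, 0)

# With a 4-ton hold and 10,000 cr: 1 Gold (profit 500) plus 3 Fish
# (profit 16 each) beats filling the hold with Fish.
cargo = best_cargo([("Fish", 280, 296), ("Gold", 9000, 9500)], 4, 10000)
```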
 
Fundamentally, TD is trying to solve the NP variant of the profit problem: it's not just trying to solve travelling salesman (best route) but also knapsack (best cargo). Combine the two, add the A* algorithm, and you'll find that TD is kinda NP+ :). Even so, I think factoring in refueling is a good idea, but we probably need to also introduce the concept of travel time - because we don't already have enough axes to search along while figuring out the best route - it would, however, solve the problem of "Sell 100 x Fish, Buy 100 x Fish and fly on" :)

I don't know what variant my project used. Seems like I have a lot of theory to learn. I just created a list of hops, ordered by profit/effort, and connected them until there were some cycles or the time ran out.
 
Meanwhile I put together a script that exports and imports the sqlite database in JSON format, dropping the IDs themselves.

It works with the current schema defined by TradeDangerous.sql, but is pretty generic and does not use any hard-coded table or column names. It would have to be improved if duplicate station names were allowed (perhaps by using the *_id fields of tables that have duplicate name entries).
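The core of such a script can be sketched in a few lines. This is a hedged reconstruction of the approach described (not the actual script): table and column names are discovered at runtime via `sqlite_master` and `PRAGMA table_info`, and `id`/`*_id` columns are dropped so the output merges cleanly across forks.

```python
# Generic "dump sqlite to JSON without IDs" sketch: no hard-coded table
# or column names; schema is discovered from sqlite_master/table_info.
import json
import sqlite3

def dump_without_ids(db):
    out = {}
    tables = [r[0] for r in db.execute(
        "SELECT name FROM sqlite_master WHERE type = 'table'")]
    for table in tables:
        cols = [r[1] for r in db.execute(f'PRAGMA table_info("{table}")')]
        keep = [c for c in cols if c != "id" and not c.endswith("_id")]
        if not keep:
            continue                       # table holds nothing but IDs
        col_list = ", ".join(f'"{c}"' for c in keep)
        rows = db.execute(f'SELECT {col_list} FROM "{table}"').fetchall()
        out[table] = [dict(zip(keep, row)) for row in rows]
    return json.dumps(out, indent=2)

# Tiny demo schema (hypothetical, for illustration only)
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE System (system_id INTEGER PRIMARY KEY, name TEXT, x REAL)")
db.execute("INSERT INTO System VALUES (1, 'Aktzin', -51.78125)")
data = json.loads(dump_without_ids(db))
```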
 

wolverine2710

Thanks to (in alphabetical order ;-P) ShadowGar, Smacker65 and Wolverine. 4.2.3 has a whole glut of new data, and the ability to tell where a system/station was sourced from (which may prove handy internally or for developers consuming the TD data).

v4.2.3 Oct 17/2014
. (ShadowGar, Smacker65) Imported Harbinger and RedWizzard system data,
also added tracking of where data has come from. Thanks also to Wolverine
for his work identifying data sources.

Don't shoot the messenger, but Harbinger - the man who never sleeps - posted ANOTHER 25 3D coordinates in the crowd-sourcing thread at 10:38 PM. Here is Harbinger's post:

25 newly mapped systems:

  • PI-1 Ursae Minoris (-52.1875, 39.75, -24.96875).
  • Wregoe WV-E D11-104 (-45.4375, 31.3125, -25.125).
  • Aktzin (-51.78125, 23.9375, -29.40625).
  • LP 4-258 (-46.875, 32.1875, -37.21875).
  • LP 5-110 (-51.5625, 37.21875, -41.375).
  • Wregoe MK-Q B46-2 (-56, 44.46875, -43.5625).
  • BD+81 297 (-47.03125, 41.0625, -41.8125).
  • NSV 4864 (-46.09375, 38.625, -36.0625).
  • Enlil (-37.5625, 39.09375, -39.40625).
  • KUI 47 (-38.9375, 43.5625, -38.96875).
  • Apep (-42.8125, 46.9375, -38).
  • LHS 2610 (-41.3125, 40.4375, -27.375).
  • 23 H. Camelopardalis (-37, 26.25, -36.625).
  • K Camelopardalis (-45.21875, 26.625, -43.46875).
  • LHS 207 (-47.84375, 28.65625, -45.8125).
  • Wredguia BT-0 B47-0 (-64.5, 38.75, -33.6875).
  • Wredguia FZ-M B48-3 (-65.8125, 39.1875, -24.25).
  • NLTT 46629 (-67.34375, 38.21875, -25.9375).
  • Wredguia DK-G C24-11 (-70.34375, 46.03125, -24.6875).
  • NLTT 42620 (-61.78125, 46.28125, -21.53125).
  • LTT 18432 (-59.21875, 44.84375, -22.6875).
  • Wredguia BT-0 B47-3 (-75.90625, 50.84375, -23.5625).
  • Wredguia DK-G C24-17 (-75.375, 54.5625, -13.78125).
  • LTT 14761 (-72.3125, 60.96875, -19.28125).
  • Jata (-67.78125, 58.3125, -28.8125).

Verification

SQL for Smacker's fork of Trade Dangerous:
Code:
INSERT INTO "System" VALUES(X,'PI-1 Ursae Minoris',-52.1875,39.75,-24.96875,102,'2014-10-17 18:50:49');
INSERT INTO "System" VALUES(X,'Wregoe WV-E D11-104',-45.4375,31.3125,-25.125,102,'2014-10-17 18:57:43');
INSERT INTO "System" VALUES(X,'Aktzin',-51.78125,23.9375,-29.40625,102,'2014-10-17 19:03:13');
INSERT INTO "System" VALUES(X,'LP 4-258',-46.875,32.1875,-37.21875,102,'2014-10-17 19:10:26');
INSERT INTO "System" VALUES(X,'LP 5-110',-51.5625,37.21875,-41.375,102,'2014-10-17 19:14:58');
INSERT INTO "System" VALUES(X,'Wregoe MK-Q B46-2',-56,44.46875,-43.5625,102,'2014-10-17 19:20:21');
INSERT INTO "System" VALUES(X,'BD+81 297',-47.03125,41.0625,-41.8125,102,'2014-10-17 19:24:46');
INSERT INTO "System" VALUES(X,'NSV 4864',-46.09375,38.625,-36.0625,102,'2014-10-17 19:30:16');
INSERT INTO "System" VALUES(X,'Enlil',-37.5625,39.09375,-39.40625,102,'2014-10-17 19:45:43');
INSERT INTO "System" VALUES(X,'KUI 47',-38.9375,43.5625,-38.96875,102,'2014-10-17 19:50:53');
INSERT INTO "System" VALUES(X,'Apep',-42.8125,46.9375,-38,102,'2014-10-17 19:55:34');
INSERT INTO "System" VALUES(X,'LHS 2610',-41.3125,40.4375,-27.375,102,'2014-10-17 20:00:39');
INSERT INTO "System" VALUES(X,'23 H. Camelopardalis',-37,26.25,-36.625,102,'2014-10-17 20:14:49');
INSERT INTO "System" VALUES(X,'K Camelopardalis',-45.21875,26.625,-43.46875,102,'2014-10-17 20:19:11');
INSERT INTO "System" VALUES(X,'LHS 207',-47.84375,28.65625,-45.8125,102,'2014-10-17 20:23:21');
INSERT INTO "System" VALUES(X,'Wredguia BT-0 B47-0',-64.5,38.75,-33.6875,102,'2014-10-17 20:44:13');
INSERT INTO "System" VALUES(X,'Wredguia FZ-M B48-3',-65.8125,39.1875,-24.25,102,'2014-10-17 20:48:28');
INSERT INTO "System" VALUES(X,'NLTT 46629',-67.34375,38.21875,-25.9375,102,'2014-10-17 20:52:35');
INSERT INTO "System" VALUES(X,'Wredguia DK-G C24-11',-70.34375,46.03125,-24.6875,102,'2014-10-17 20:59:30');
INSERT INTO "System" VALUES(X,'NLTT 42620',-61.78125,46.28125,-21.53125,102,'2014-10-17 21:05:23');
INSERT INTO "System" VALUES(X,'LTT 18432',-59.21875,44.84375,-22.6875,102,'2014-10-17 21:09:33');
INSERT INTO "System" VALUES(X,'Wredguia BT-0 B47-3',-75.90625,50.84375,-23.5625,102,'2014-10-17 21:17:32');
INSERT INTO "System" VALUES(X,'Wredguia DK-G C24-17',-75.375,54.5625,-13.78125,102,'2014-10-17 21:23:26');
INSERT INTO "System" VALUES(X,'LTT 14761',-72.3125,60.96875,-19.28125,102,'2014-10-17 21:27:19');
INSERT INTO "System" VALUES(X,'Jata',-67.78125,58.3125,-28.8125,102,'2014-10-17 21:31:54');
(The regular version of Trade Dangerous may need the "102," removed from each line; I'm not sure whether Smacker's changes to the database structure have been merged.)
 
Props to Gazelle

I've just merged Gazelle's feature change to move all of the star/station/etc data out of the ".sql" file and into .csv files. He came up with a really nice implementation, and he had a lot of additional systems.

Just be aware that there's probably going to be a bit of a settling period given that so many people are changing things. Please use the issue tracker if you run into issues: https://bitbucket.org/kfsone/tradedangerous/issues?status=new&status=open
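For illustration, a name-keyed CSV of system data is trivial to consume with the stdlib. The column names below are assumptions (the real layout comes from Gazelle's merge), and the coordinates are taken from Harbinger's list above:

```python
# Illustrative only: the actual column layout is defined by Gazelle's
# merge, so the header fields here are assumptions.
import csv
import io

SAMPLE = "name,pos_x,pos_y,pos_z\nAktzin,-51.78125,23.9375,-29.40625\n"

systems = {row["name"]: (float(row["pos_x"]), float(row["pos_y"]), float(row["pos_z"]))
           for row in csv.DictReader(io.StringIO(SAMPLE))}
```

One advantage of per-table CSV files over a single SQL dump is exactly this: an end-user can open, diff, and edit them without knowing SQL.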
 
I don't know what variant my project used. Seems like I have a lot of theory to learn. I just created a list of hops, ordered by profit/effort, and connected them until there were some cycles or the time ran out.

I'm mostly self-taught, and I solved the "knapsack-salesman" problem primarily by rubber-ducking it with my wife: sitting down with her and a bunch of monopoly money and cards and theory-crafting solutions.

Please. PLEASE. Don't ever tell her the rubber-duck part.

Mostly I chose to solve the hard version because I needed to spool up my Python for this job I've just started.
 
I've just merged Gazelle's feature change to move all of the star/station/etc data out of the ".sql" file and into .csv files. He came up with a really nice implementation, and he had a lot of additional systems.

Just be aware that there's probably going to be a bit of a settling period given that so many people are changing things. Please use the issue tracker if you run into issues: https://bitbucket.org/kfsone/tradedangerous/issues?status=new&status=open

Looks nice, very similar to my JSON solution. Except that in the JSON everything was in a single file, including the initialisation SQL. The benefit of a single file is that it is known which tables are available: those that exist in the JSON.

My json-sqlite solution had, in addition, some features (which could also be added to Gazelle's solution):
  • dynamic table/format lookup (querying sqlite_master and using table_info)
  • dynamic dependency resolution
  • table names don't need to be hard-coded in the project
 
Looks nice, very similar to my JSON solution. Except that in the JSON everything was in a single file, including the initialisation SQL. The benefit of a single file is that it is known which tables are available: those that exist in the JSON.

Thanks (I think). My decision to use the CSV format and multiple files is based on the thought that it's easier for an "end-user" to understand (and I'm not that familiar with the JSON format).

My json-sqlite solution had, in addition, some features (which could also be added to Gazelle's solution)

That will come in handy. Thanks for sharing the source. I'm in the process of automating the CSV file generation; right now it's a mixture of selects and manual editing.
 