Discussion: What is the most efficient way to crowdsource the 3D system coordinates?

Harbinger

Volunteer Moderator
Does that mean the coordinates you posted earlier need recalculating?

The example I've been working with today (LP 378-541) varies by this amount now that I've taken the 1/32 grid into account:

0.000379, 0.000842, -0.000337

For all intents and purposes, the coordinates I previously provided should be close enough to those you would get even when taking the 1/32 grid into account.

Although in saying that, I'll go through the places I've navigated previously once I have sufficient time to do so.

EDIT: Actually, shouldn't they be close enough that a 1/32 correction can easily be applied to them anyway?

My original averaged result for LP 378-541:

(1.187879, 20.719592, 2.343413)


((1.187879+(1/64))*32)=38.512128
truncated to 38
38/32 = 1.1875

((20.719592+(1/64))*32)=663.526944
truncated to 663
663/32 = 20.71875

((2.343413+(1/64))*32)=75.489216
truncated to 75
75/32 = 2.34375

Which gives (1.1875, 20.71875, 2.34375), which is identical to the result from my 1/32 tests and was verified by wolverine2710 earlier.

I'll run my original coordinates through the same routine and post an updated list shortly.
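The routine above boils down to a one-line snap function, sketched here in Python (truncating after adding half a grid step, 1/64, rounds to the nearest 1/32 grid line):

```python
import math

def snap_to_grid(coord, grid=32):
    """Snap a coordinate to the nearest 1/32 Ly grid point.

    Adding half a grid step (1/64) before truncating turns the
    floor into round-to-nearest, matching the worked example above.
    """
    return math.floor((coord + 1 / (2 * grid)) * grid) / grid

# LP 378-541: averaged result -> grid-snapped coordinates
print(tuple(snap_to_grid(c) for c in (1.187879, 20.719592, 2.343413)))
# -> (1.1875, 20.71875, 2.34375)
```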
 
[*]Additional systems: I have another field where I can put in a new reference point. I can't add multiple new reference points.
You can, but the way it works might be a little too subtle ;) What happens is that once you leave the "other" field, if it recognises the name as a potential reference system from Michael's list, it adds a row to the table with that name and any distance you've already entered, and it resets the "other" row. I think I'll change this behavior because there are problems with it: you can't change the system once it's been recognised; the test is case-sensitive, which is not ideal if you're typing; and I think it might be good to allow distances to systems that aren't on Michael's list (they wouldn't be used to trilaterate but could be used for verification later).

[*]JSON output. Had to RTFM before it worked. Had to press DONE first.
[*]DONE. Clears the form and enables JSON output. When experimenting (with JSON output) I personally don't like the form being cleared, and I'd like to get JSON output without pressing DONE, once the name and 4 distances are entered. But that's just me; I can work/live with it.
I'm going to change that too. The way it works was basically due to laziness: I didn't want to have to deal with the existing JSON output changing. But I think it would be better to have it update dynamically. So I'll have two JSON boxes: first one will be the live current data, second one will be all the completed data. Hitting "done" will append the current data to the completed data and reset the form.

[*]Added LP 378-541 coords to systems.json (multiplied calculated coords by 32). In the real and calculated columns my entered data was shown with an error of 0. After that I changed the X coordinate so it was totally wrong. Again the real and calc columns showed my input and an error of zero. CURIOUS about this. Could you elaborate?
Checkall doesn't quite work the way you're expecting. It's really for testing the algorithm and reference systems. What it does is, for each system in the table (except the reference systems), it applies trilateration using the coordinates from the file. If the result matches the file's coordinates then we know the algorithm and reference systems are good. So when you change coordinates in the file, it uses the modified coordinates both as input and to check the output, and so you still get an error of zero (provided the algorithm is right and the reference systems are good).

I am going to put together some sort of verification page today. That'll use distance data to confirm the coordinates are right. For cases where we don't have the original distance data I think it might be worth having a page where you can say "I'm at system X" and the page then gives you a couple of distances to check.

About system.json. Multiplying the coordinates by 32 seemed like a good idea at the time (it made everything integers) but I'm now leaning towards changing it to floating-point coordinates, so Conquenchis would be (-62.78125, 23.09375, -36.4375) instead of (-2009, 739, -1166). What do you think?
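Converting between the two representations is just a divide or multiply by 32; a trivial sketch, using the Conquenchis figures from the post above:

```python
def grid_to_ly(ix, iy, iz, grid=32):
    """Convert integer 1/32-grid coordinates back to light-year floats."""
    return (ix / grid, iy / grid, iz / grid)

# Integer grid form -> floating-point light years
print(grid_to_ly(-2009, 739, -1166))
# -> (-62.78125, 23.09375, -36.4375)
```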

I like the JSON output. Suggestion: atm the coords in TD have to be inserted manually with an insert statement like: INSERT INTO "System" VALUES(61,'Nyon T''ao Wujin',-81.71875,58.8125,-35.28125,'2014-10-02 14:21:15');. It's Nyon T'ao Wujin (one ' character) but doubled '' here for escaping.
It's easy to generate the inserts. I'll add that.
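Generating the inserts mostly comes down to doubling any single quotes in the system name. A minimal sketch, with the column layout taken from the INSERT example above:

```python
def sql_insert(system_id, name, x, y, z, timestamp):
    """Build a TradeDangerous-style System INSERT, escaping ' as ''."""
    escaped = name.replace("'", "''")  # SQL string-literal escaping
    return (f'INSERT INTO "System" VALUES({system_id},\'{escaped}\','
            f'{x},{y},{z},\'{timestamp}\');')

print(sql_insert(61, "Nyon T'ao Wujin", -81.71875, 58.8125, -35.28125,
                 '2014-10-02 14:21:15'))
```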

My plan today is to get a "bulk locate" page working so I can verify the data we've already got and then make those changes to entry.html. Should have a new version uploaded in about 12 hours or so.
 

Harbinger

Volunteer Moderator
Here's an update to my previous list to make them 1/32 compliant:
  • Perkwunos (-40.5, 46.75, 6.75)
  • LP 274-8 (-29.75, 44.9375, 20.3125)
  • Rho Coronae Borealis (-29.875, 42.09375, 22.21875)
  • LP 386-49 (-25.53125, 33.375, 26)
  • LP 329-18 (-26.625, 39.875, 23.28125)
  • LHS 3124 (-26.28125, 43.4375, 24.1875)
  • LHS 399 (-13.5625, 36.15625, 25.78125)
  • LHS 396 (-9.875, 30.84375, 20.46875)
  • OT Serpentis (-11.125, 30.34375, 18.40625)
  • DE Bootis (-7.4375, 32.625, 17)
  • Veren's Stop (-12.96875, 21.96875, 20.28125)
  • G 181-6 (-20.3125, 20.3125, 14.0625)
  • Marcov's Point (-22.34375, 14.625, 17.65625)
  • LP 229-17 (-23.03125, 9.03125, 8.96875)
  • G 203-51 (-15.875, 12.03125, 5.34375)
  • Bidmere (-7.4375, 13.21875, 1.90625)
  • Wise 1405+5534 (-8.03125, 13.4375, -1.84375)
  • LHS 455 (-16.90625, 10.21875, -3.4375)
  • LP 71-165 (-20.90625, 11.09375, -2.21875)
  • LHS 465 (-23.9375, 12.21875, -0.625)
  • Austern (-25.15625, 15.34375, 9.375)
  • G 202-48 (-15.53125, 14.375, 1.875)
  • 2MASS 1503+2525 (-6.1875, 18, 8.125)
  • DG Canum Venaticorum (-3.125, 25.53125, 2.6875)
  • LP 378-541 (1.1875, 20.71875, 2.34375)
 
Looks like I have a lot to catch up on. :eek:

Also looks like a lot of raw data has turned up while I was away. That's good, though I'll need to gather it all together before I can use it to update my maps.

The major feature of my "resolve" script is about a metric tonne of data validation. It's set up so that you can dump whatever data you like into it, and it'll try to pull out the best data and then tell you how reliable it turned out to be. It'll even try to cope with obvious errors such as capitalisation of system names, while rejected data will be analysed for possible typos of system names, so that you can go back and correct them.

Anyway, you've been itching to actually *see* my scripts, so... linkie.

In that archive, stars.py is an early attempt at algebraic trilateration - which didn't work. You can safely ignore it, unless you want to figure out where I went wrong. I settled on using least-squares instead.

resolve.py is the script which turns primary-coordinates.csv (containing a uniquified combination of official beta1 and beta2 coordinates) and distances.csv into coordinates.csv, which contains both the official and newly-calculated coordinates. This is also where all the heavy validation logic is.

Feel free to run resolve.py on the .csv files already present. You'll see just what sort of validation you need to do on crowdsourced data, and why.
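For anyone curious what the least-squares approach looks like, here's a generic linearised-trilateration sketch (pure Python, not resolve.py itself): subtracting the first sphere equation from the rest turns the problem into a linear system, solved via the normal equations.

```python
def trilaterate(refs, dists):
    """Estimate a 3D position from >= 4 reference points and distances.

    Subtracting the first sphere equation from the others linearises it:
    2*(r0 - ri) . x = (|r0|^2 - |ri|^2) - (d0^2 - di^2).
    Solved via the 3x3 normal equations with Gaussian elimination.
    References must not be coplanar or the system is degenerate.
    """
    r0, d0 = refs[0], dists[0]
    A, b = [], []
    for ri, di in zip(refs[1:], dists[1:]):
        A.append([2 * (r0[k] - ri[k]) for k in range(3)])
        b.append(sum(r0[k] ** 2 - ri[k] ** 2 for k in range(3)) - d0 ** 2 + di ** 2)
    # Normal equations: (A^T A) x = A^T b
    M = [[sum(row[r] * row[c] for row in A) for c in range(3)] for r in range(3)]
    v = [sum(A[i][r] * b[i] for i in range(len(A))) for r in range(3)]
    # Gaussian elimination with partial pivoting, then back substitution
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        v[col], v[piv] = v[piv], v[col]
        for r in range(col + 1, 3):
            f = M[r][col] / M[col][col]
            for c in range(col, 3):
                M[r][c] -= f * M[col][c]
            v[r] -= f * v[col]
    x = [0.0, 0.0, 0.0]
    for r in (2, 1, 0):
        x[r] = (v[r] - sum(M[r][c] * x[c] for c in range(r + 1, 3))) / M[r][r]
    return x

# Toy check: recover the point (1, 2, 3) from four exact distances
refs = [(0, 0, 0), (10, 0, 0), (0, 10, 0), (0, 0, 10)]
dists = [14 ** 0.5, 94 ** 0.5, 74 ** 0.5, 54 ** 0.5]
print(trilaterate(refs, dists))
```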

starmap.py takes coordinates.csv and generates a crapload of maps from it. It'll take a while.

starroute.py is possibly the most interesting script for end-users. Example usage:

Code:
./starroute.py Aulin Rahu
This produces the most fuel-efficient route between the two named systems, assuming unlimited jump range.

Code:
./starroute.py "i bootis" "ngoloki anaten"
It can cope with lazy capitalisation. If you misspell it, it'll also suggest corrections.

Code:
./starroute.py Aulin Rahu 15
Adding one numeric parameter limits the maximum jump range. It will find the most fuel-efficient route within that constraint.

Code:
./starroute.py Aulin Rahu 8
If no route exists within the constraint, it'll tell you so.

Code:
./starroute.py Aulin Rahu 15 1.25
If you want to take fewer jumps in preference to minimising fuel consumption, use a lower value for the fuel consumption power factor. The default is 2.15.

Code:
./starroute.py Aulin Rahu 15 1
This will find the shortest-distance (fewest jumps) route. If used with a large enough jump range, it will always be a direct jump.
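The routing behaviour described above can be sketched as Dijkstra over a jump graph where each edge costs distance raised to the fuel power factor. This is a toy illustration of the idea, not the actual starroute.py:

```python
import heapq
import math

def route(systems, start, goal, max_jump=math.inf, power=2.15):
    """Cheapest path where each jump of length d costs d ** power.

    power > 1 penalises long jumps (fuel-efficient, many short hops);
    power = 1 minimises total distance travelled.
    systems: {name: (x, y, z)} -- a toy stand-in for coordinates.csv.
    Returns (cost, path), or None if no route fits within max_jump.
    """
    def dist(a, b):
        return math.dist(systems[a], systems[b])

    frontier = [(0.0, start, [start])]
    best = {start: 0.0}
    while frontier:
        cost, here, path = heapq.heappop(frontier)
        if here == goal:
            return cost, path
        if cost > best.get(here, math.inf):
            continue  # stale queue entry
        for nxt in systems:
            d = dist(here, nxt)
            if nxt == here or d > max_jump:
                continue
            c = cost + d ** power
            if c < best.get(nxt, math.inf):
                best[nxt] = c
                heapq.heappush(frontier, (c, nxt, path + [nxt]))
    return None

# Three systems on a line, 10 Ly apart:
toy = {"A": (0, 0, 0), "B": (10, 0, 0), "C": (20, 0, 0)}
print(route(toy, "A", "C", max_jump=15)[1])  # -> ['A', 'B', 'C']
print(route(toy, "A", "C", max_jump=8))      # -> None (range too short)
```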
 
Near I Bootis the released spreadsheet has a system called "Destination". This system is not reachable in game. I noticed it when I plotted a fuel-efficient route past I Bootis.

For routing purposes this system should probably be set to coords way out of the galaxy until it becomes selectable in game.
 
So it seems we are on our own.

On your own is always the best and most immersive way to play a game and get the most out of it yourself rather than play someone else's data.
I never use walkthroughs, or cheats because I don't want to cheat myself out of finding what someone else did by just playing and enjoying the game.
For all intents and purposes games are more fun played than replayed.
 
On your own is always the best and most immersive way to play a game and get the most out of it yourself rather than play someone else's data.
I never use walkthroughs, or cheats because I don't want to cheat myself out of finding what someone else did by just playing and enjoying the game.
For all intents and purposes games are more fun played than replayed.

And with that mindset I would also expect you never to tell others how they should play the game?
 

wolverine2710

Tutorial & Guide Writer
On your own is always the best and most immersive way to play a game and get the most out of it yourself rather than play someone else's data.
I never use walkthroughs, or cheats because I don't want to cheat myself out of finding what someone else did by just playing and enjoying the game.
For all intents and purposes games are more fun played than replayed.

You've quoted me from my OP. You seem to have missed (which can of course happen) the part where I asked to concentrate only on the technical aspects. Hence I can't comment on your post.

But in general: it's all about opinions and respect. I respect your opinion, which you are entitled to. I hope you respect my opinion and wishes. In the OP I've mentioned a great thread to discuss it further, and I'm monitoring that thread. If you want, we (and perhaps others) can discuss it further there, keeping this thread clean.
 

wolverine2710

Tutorial & Guide Writer
Looks like I have a lot to catch up on. :eek:

Also looks like a lot of raw data has turned up while I was away. That's good, though I'll need to gather it all together before I can use it to update my maps.

Welcome back. I saw your "Complete Maps of the Bubble" thread and it looks smashing. I will have a look at your scripts later on, but first I'm trying to bring this thread to a successful end. Quote from the movie "The Fly": Be afraid, be very afraid. I have some suggestions for you later on ;-)

Most important question: do you round your coords to the 1/32 Ly grid?

Your resolve script looks very useful for double-checking data, especially wrt data from JesusFreke. He's one of the brave volunteers who created a program to calculate coords. Double-checking by a commander revealed that it needs a bit of fine-tuning and some of the results are a bit off, BUT he has created a spreadsheet which contains ALL the distances he has taken. You can find the data here. Perhaps you could use your resolve script to generate new coords from it. Would you be able to do that?

Wrt catching up with data: if you read the posts of the last 3-4 days (do take a few cups of tea for that), we are trying to set things up so that whenever data is entered with, for example, Harbinger's webpage or RedWizard's web pages (which can run on your local computer), a system-coords.json and/or .csv file on Harbinger's website is constructed/updated, containing the latest list of (checked) coords.

You are more than welcome to use that data. Perhaps you are even interested in uploading coords created by your program to it as well. If so, just let us know. Again, welcome back and happy reading ;-)
 
To go one step further on the data collection, I would suggest also storing stations and the distance to them from the main star for each system. This would allow tools like Chromatix's to give an estimated travel time for each route.

That is important when you start weighing the fewest jumps (which might force a refuel) against more jumps that would let you make the route without refuelling. The fuel usage formula is known, and it is possible to estimate it from empirical data gathered for your ship.

This could allow a tool to answer "Can I make this mission in time?" You are here; you want to go there; answer: you will spend an estimated xx minutes to get there. Of course anyone trying to nail that to the nearest millisecond will get into problems, since there are a lot of unknowns along the route: interdictions, station/planet alignment, that Type 9 blocking the entrance for half a minute, etc. But the estimate would make for a very helpful 3rd party tool.
 

wolverine2710

Tutorial & Guide Writer
To go one step further on the data collection, I would suggest also storing stations and the distance to them from the main star for each system. This would allow tools like Chromatix's to give an estimated travel time for each route.

That is important when you start weighing the fewest jumps (which might force a refuel) against more jumps that would let you make the route without refuelling. The fuel usage formula is known, and it is possible to estimate it from empirical data gathered for your ship.

This could allow a tool to answer "Can I make this mission in time?" You are here; you want to go there; answer: you will spend an estimated xx minutes to get there. Of course anyone trying to nail that to the nearest millisecond will get into problems, since there are a lot of unknowns along the route: interdictions, station/planet alignment, that Type 9 blocking the entrance for half a minute, etc. But the estimate would make for a very helpful 3rd party tool.

I do agree with the usefulness of that data but personally I think that is something for phase 2. We are already struggling to get phase 1 rounded up - ideally automated. I've been looking at the tradedangerous.sql file and atm there are 285 entries for stations. A snippet of that file:
CREATE TABLE Station
(
station_id INTEGER PRIMARY KEY AUTOINCREMENT,
name VARCHAR(40) COLLATE nocase,
system_id INTEGER NOT NULL,
ls_from_star DOUBLE NOT NULL,

UNIQUE (name),

FOREIGN KEY (system_id) REFERENCES System(system_id)
ON UPDATE CASCADE
ON DELETE CASCADE
);
INSERT INTO "Station" VALUES(1,'Beagle 2 Landing',4,0.0);
INSERT INTO "Station" VALUES(2,'Dahan Gateway',12,0.0);
INSERT INTO "Station" VALUES(3,'Freeport',38,0.0);
INSERT INTO "Station" VALUES(4,'Chango Dock',20,0.0);
INSERT INTO "Station" VALUES(5,'Azeban City',15,0.0);
INSERT INTO "Station" VALUES(6,'Aulin Enterprise',5,0.0);
INSERT INTO "Station" VALUES(7,'WCM Transfer Orbital',30,0.0);
INSERT INTO "Station" VALUES(8,'Romanenko Estate',44,0.0);
INSERT INTO "Station" VALUES(9,'Romanek''s Folly',41,0.0);
INSERT INTO "Station" VALUES(10,'Bradfield Orbital',45,0.0);
INSERT INTO "Station" VALUES(11,'Gorbatko Reserve',10,0.0);
INSERT INTO "Station" VALUES(12,'Cuffey Plant',2,0.0);
INSERT INTO "Station" VALUES(13,'Hay Point',42,0.0);
INSERT INTO "Station" VALUES(14,'Bresnik Mine',16,0.0);
INSERT INTO "Station" VALUES(15,'Vonarburg Co-operative',55,0.0);
INSERT INTO "Station" VALUES(16,'Olivas Settlement',7,0.0);
INSERT INTO "Station" VALUES(17,'Moxon''s Mojo',9,0.0);
INSERT INTO "Station" VALUES(18,'Bowersox Mines',48,0.0);
It can easily be extended with more stations/platforms. Scripts can be written to extract data from it in JSON/CSV/XML format, and scripts can be written to extend it with more data coming from some source - crowd-sourced, for example. It is relatively simple: once a commander has bought data for a system, the cmdr can just read the info from the nav menu when visiting that star. Perhaps it's even possible to extract that info from the network logs. I still have to send IxForres a PM to ask if he is willing to resurrect his web API. An ideal place for that data.
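As a sketch of the "extract data from it" idea: with the .sql file loaded into SQLite, a few lines dump the stations as JSON. The database filename and the System table's columns are assumptions based on the snippets in this thread:

```python
import json
import sqlite3

def stations_as_json(db_path):
    """Dump Station rows, joined to their System names, as JSON.

    Assumes the schema from the tradedangerous.sql snippet above,
    and that System also carries a system_id primary key and name.
    """
    con = sqlite3.connect(db_path)
    rows = con.execute(
        "SELECT Station.name, System.name, ls_from_star "
        "FROM Station JOIN System USING (system_id)"
    ).fetchall()
    con.close()
    return json.dumps(
        [{"station": st, "system": sy, "ls_from_star": ls}
         for st, sy, ls in rows], indent=2)
```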

Atm we are missing coords data for systems which do not have a station in them - at least not in SB2. My suggestion: let's do this first, and then in phase 2 we can get the other, extremely useful info. This is of course my personal view. We are in this together. What do others think?
 
I do agree with the usefulness of that data but personally I think that is something for phase 2.

I agree with that. Station distances would be very useful but let's get the system data working well first. I also think stations are more likely to change before release (e.g. I expect multiple markets in single systems will be added) so any data we capture now will have to be rechecked anyway.
 

wolverine2710

Tutorial & Guide Writer
@RedWizard.


  1. "Other field". It was indeed too subtle for me. I entered a "wolverine" system which of course is not in MB's list.
  2. "Done and JSON output". New version sounds good.
  3. "Verification page today". I don't totally understand; will test it.
  4. "System.json. Coords multiplied by 32." Perhaps better to have it as floating point. More readable, but it has a drawback, see below.
  5. "TD inserts". Very useful, thanks.
  6. "Bulk locate". I have to admit I don't understand; will test it though.

Concerning JSON output: can't get it to work atm. Iirc it contains, aside from the calculated coords, also the distances taken. Can't remember if the error for each distance is shown as well. Anyway, ideal for bookkeeping, and less copy/paste.

Post #293, automated solution. I totally forgot to ask if this is something you would be willing to implement. I apologize for that. Harbinger said he could implement it at his side. I think it would require a few changes in your tool. The harder work would be at his end ;-> Has he perhaps contacted you?

An update button would be required - or some check in the background to see if an updated system-coords.json file exists. The publish button could just send your existing JSON structure for a system to HB's site, perhaps extended with the name of the volunteer who uploaded it. To make sure no inaccurate data is sent, the publish button would only be selectable once your tool has determined that enough good distances have been entered, as in RMS 0.000. Before I forget: entry.html should check if the system name entered is already in the system-coords.json on HB's site, to prevent double entries.
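The duplicate check could be as small as this. A sketch only: the filename and the flat-list structure are assumptions drawn from the discussion, not a fixed format:

```python
import json

def is_already_recorded(name, path="system-coords.json"):
    """Case-insensitive check of a system name against the shared coords file.

    Assumes the file is a JSON list of objects each carrying a "name" key,
    as in the entry.html output discussed in this thread.
    """
    with open(path) as f:
        systems = json.load(f)
    return any(s["name"].lower() == name.strip().lower() for s in systems)
```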

I said it before: your tool and HB's tool are ready to be used for crowdsourcing - just like TunaMage's spreadsheet and others. The challenge/headache is to coordinate everything: making sure a commander is NOT wasting time on distances for a system which has already been done, manually updating a list of done systems, etc. This does NOT have to be automated for SB2 but it would make life easier ;-)

So looking forward to bringing your tool down to my local dungeon and putting it on the rack and have some 'fun' with it ;->
 
  1. "Other field". It was indeed too subtle for me. I entered a "wolverine" system which of course is not in MB's list.
  2. "Done and JSON output". New version sounds good.
  3. "Verification page today". I don't totally understand; will test it.
  4. "System.json. Coords multiplied by 32." Perhaps better to have it as floating point. More readable, but it has a drawback, see below.
  5. "TD inserts". Very useful, thanks.
  6. "Bulk locate". I have to admit I don't understand; will test it though.
I've just checked in an update. The change to system.json (item #4), the SQL output (#5), and the JSON output changes (#2) are in it. Note though that I can't generate the ids for the first column of the SQL statements (because I don't know what's already in the database) so these will have to be added. If it's an autoincrement column then I could leave it out, but I'll need the column names for the rest of the columns.

The "other field" changes (#1) are not in. Not going to get time to do those until tomorrow.

The "bulk locate" page (#6) is in. You probably don't need to worry too much about this yet though you can take a look if you like. Essentially bulklocate.html takes distance data (in distances.json) for multiple systems and calculates coordinates for as many as possible using my version of trilateration and Tunamage's algorithm. I use it to compare with coordinates other people have generated to try to figure out what systems I'm confident about and what systems need more data or need checking. More on this in the next post.

The "verification page" (#3) - I think I was talking about a page to ask volunteers for specific information I need. Nothing implemented yet.

Post #293, automated solution. I totally forgot to ask if this is something you would be willing to implement. I apologize for that. Harbinger said he could implement it at his side. I think it would require a few changes in your tool. The harder work would be at his end ;-> Has he perhaps contacted you?

I'm happy to do the frontend changes necessary in my code. Is Harbinger setting up the server side? He hasn't contacted me yet. Obviously I'd prefer to use the JSON format I've already developed, but I can adapt if he wants to do something different. I've changed the format slightly in the latest version (mostly so that the JSON generated when adding a system is consistent with the reference system data), it now looks like this:

For a reference system, just name and coordinates:
Code:
{"name":"Eranin", "x":-22.84375, "y":36.53125, "z":-1.1875},

For a crowd-sourced system I also store the distance data and flag the system as having calculated coordinates:
Code:
{
  "name": "Achenar",
  "contributor": "Ann, Bob, Charlie",      // not currently implemented
  "contributed": "2014-10-16T13:38:28.660Z",    // not currently implemented
  "calculated": true,
  "x": 67.5,
  "y": -119.46875,
  "z": 24.84375,
  "distances": [
    {
      "system": "Aulin",
      "distance": "176.513"
    },
    {
      "system": "Ross 905",
      "distance": "167.107"
    },
    {
      "system": "41 Gamma Serpentis",
      "distance": "165.847"
    },
    {
      "system": "Theta Draconis",
      "distance": "205.549"
    },
    {
      "system": "Tring",
      "distance": "243.178"
    }
  ]
},

The things on my list of improvements to entry.html are (I think we've both thought up some of these independently):
  1. Change the "other" behavior
  2. Capture volunteer name and time
  3. Check if system is already recorded
  4. Webservice integration
 
I've started trying to verify the data we've got, starting with Codec's spreadsheet. Here's what I've found so far.

14 systems get identical coordinates (when rounded to the 1/32 Ly grid) in Codec's spreadsheet, using my version of trilateration, and using TunaMage's algorithm. I'm pretty happy to call these confirmed. They are:
16 Cephei (-108, 30.03125, -42.25)
4 Cephei (-132.09375, 35.5, -27.875)
Achenar (67.5, -119.46875, 24.84375)
Austern (-25.15625, 15.34375, 9.375)
Culan (-36.0625, 14.625, 13.8125)
Etamin (-132.46875, 74.875, 25.53125)
GD 356 (-50.9375, 44.15625, 7.3125)
HIP 103014 (-132.46875, 33.25, -28.5625)
HIP 109479 (-127.5, 29.875, -48.5)
HIP 111494 (-106.9375, 30.0625, -48.4375)
HIP 94802 (-114.59375, 51.40625, -25.90625)
Ross 52 (-8.4375, 29.15625, 13.3125)
Taran (-42.5, 45.46875, -3)
WREDGUIA XX-O B47-2 (-116.4375, 55.125, -38.3125)

There are 6 systems where my coordinates match Codec's coordinates but coordinates from TunaMage's algorithm are slightly different. I think this is because TunaMage's algorithm is more sensitive to bad reference system choices since it only uses 4 references. I'll look into these some more. They are:
Alpha Cygni (-1405, 46.09375, 132.5)
Dziban (-62.15625, 38.46875, -14.3125)
Enif (-536.59375, -363.03125, 236.1875)
LHS 6354 (-65.28125, 27.34375, -16.125)
NLTT 44050 (-57.46875, 38.8125, -21.09375)
Rigel (385.78125, -360.40625, -682.53125)

There are four systems where my coordinates and the coordinates from TunaMage's algorithm match but Codec's spreadsheet has different values. I'm pretty confident I've got these right. They are:
Alphard (138.84375, 88.46875, -73.5)
Hagalaz (-50.59375, 45.15625, 11.09375)
LP 180-17 (-56.84375, 37.71875, 12.71875)
Mirphak (-274.96875, -47.625, -422.65625)

Finally there are three systems where Codec's spreadsheet, my coordinates, and coordinates from TunaMage's algorithm are all different: Polaris, Sagittarius A*, and SDSS J1416+1348. I'm pretty confident about my coordinates for Polaris (-322.6875, 194.59375, -212.4375) as there are 5 reference systems and no error. The other two both show error with my algorithm which implies that the distance data is inconsistent for these two. I need to look into it further.
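The "shows error" test amounts to checking residuals: recompute each reference distance from the candidate coordinates and compare against the reported distance. A generic sketch of that check, with toy data:

```python
import math

def rms_error(pos, refs, dists):
    """RMS of (computed - reported) distance over all reference systems.

    Near-zero (within measurement precision) means the distances are
    mutually consistent with the candidate coordinates; a persistent
    error flags inconsistent or mistyped distance data.
    """
    errs = [math.dist(pos, r) - d for r, d in zip(refs, dists)]
    return math.sqrt(sum(e * e for e in errs) / len(errs))

refs = [(0, 0, 0), (10, 0, 0), (0, 10, 0)]
good = [math.dist((1, 2, 3), r) for r in refs]
print(rms_error((1, 2, 3), refs, good))          # -> 0.0: consistent
bad = good[:2] + [good[2] + 0.5]                 # one mistyped distance
print(rms_error((1, 2, 3), refs, bad) > 0.1)     # -> True: flagged
```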

There are some other systems that don't have enough distances to systems on Michael's list to generate coordinates. I should be able to get coordinates for some of these using the above "confirmed" systems as references. This is less ideal as a mistake in one of the "confirmed" systems will propagate to any system using it as a reference but it's better than nothing.

TD DB inserts for the "confirmed" systems above (can also supply JSON if anyone wants it):
INSERT INTO "System" VALUES(,'Achenar',67.5,-119.46875,24.84375,'2014-10-16 13:51:23');
INSERT INTO "System" VALUES(,'Etamin',-132.46875,74.875,25.53125,'2014-10-16 13:51:23');
INSERT INTO "System" VALUES(,'Enif',-536.59375,-363.03125,236.1875,'2014-10-16 13:51:23');
INSERT INTO "System" VALUES(,'Alpha Cygni',-1405,46.09375,132.5,'2014-10-16 13:51:23');
INSERT INTO "System" VALUES(,'Mirphak',-274.96875,-47.625,-422.65625,'2014-10-16 13:51:23');
INSERT INTO "System" VALUES(,'Alphard',138.84375,88.46875,-73.5,'2014-10-16 13:51:23');
INSERT INTO "System" VALUES(,'Rigel',385.78125,-360.40625,-682.53125,'2014-10-16 13:51:23');
INSERT INTO "System" VALUES(,'Ross 52',-8.4375,29.15625,13.3125,'2014-10-16 13:51:23');
INSERT INTO "System" VALUES(,'GD 356',-50.9375,44.15625,7.3125,'2014-10-16 13:51:23');
INSERT INTO "System" VALUES(,'Hagalaz',-50.59375,45.15625,11.09375,'2014-10-16 13:51:23');
INSERT INTO "System" VALUES(,'Taran',-42.5,45.46875,-3,'2014-10-16 13:51:23');
INSERT INTO "System" VALUES(,'LP 180-17',-56.84375,37.71875,12.71875,'2014-10-16 13:51:23');
INSERT INTO "System" VALUES(,'HIP 94802',-114.59375,51.40625,-25.90625,'2014-10-16 13:51:23');
INSERT INTO "System" VALUES(,'4 Cephei',-132.09375,35.5,-27.875,'2014-10-16 13:51:23');
INSERT INTO "System" VALUES(,'HIP 103014',-132.46875,33.25,-28.5625,'2014-10-16 13:51:23');
INSERT INTO "System" VALUES(,'HIP 109479',-127.5,29.875,-48.5,'2014-10-16 13:51:23');
INSERT INTO "System" VALUES(,'16 Cephei',-108,30.03125,-42.25,'2014-10-16 13:51:23');
INSERT INTO "System" VALUES(,'HIP 111494',-106.9375,30.0625,-48.4375,'2014-10-16 13:51:23');
INSERT INTO "System" VALUES(,'Austern',-25.15625,15.34375,9.375,'2014-10-16 13:51:23');
INSERT INTO "System" VALUES(,'Culan',-36.0625,14.625,13.8125,'2014-10-16 13:51:23');
INSERT INTO "System" VALUES(,'Dziban',-62.15625,38.46875,-14.3125,'2014-10-16 13:51:23');
INSERT INTO "System" VALUES(,'NLTT 44050',-57.46875,38.8125,-21.09375,'2014-10-16 13:51:23');
INSERT INTO "System" VALUES(,'LHS 6354',-65.28125,27.34375,-16.125,'2014-10-16 13:51:23');
INSERT INTO "System" VALUES(,'WREDGUIA XX-O B47-2',-116.4375,55.125,-38.3125,'2014-10-16 13:51:23');

Anyway I'm off to bed so that's it for now ;)
 

wolverine2710

Tutorial & Guide Writer
Sweet dreams ;-)
It's 16:46 here, so a bit too early to go to bed.
Have dragged your tool to the dungeon, no squeals yet. Must research 'other' ways...

Will test further; I've already seen nice things like a drop-down menu for the 'other' field which reacts to my input. Sweet, darn sweet... Probably will be doing a bit of flying to collect distances, perhaps for the ones you mentioned.
 
I've started trying to verify the data we've got, starting with Codec's spreadsheet. Here's what I've found so far.

14 systems get identical coordinates (when rounded to the 1/32 Ly grid) in Codec's spreadsheet, using my version of trilateration, and using TunaMage's algorithm. I'm pretty happy to call these confirmed. They are:

Culan (-36.0625, 14.625, 13.8125)
Thanks, I have added these to my TradeDangerous fork. Small correction: that should be Culann.

System Count = 397
Plot of systems HERE
 
I do agree with the usefulness of that data but personally I think that is something for phase 2. We are already struggling to get phase 1 rounded up - ideally automated. I've been looking at the tradedangerous.sql file and atm there are 285 entries for stations. A snippet of that file:
CREATE TABLE Station
(
station_id INTEGER PRIMARY KEY AUTOINCREMENT,
name VARCHAR(40) COLLATE nocase,
system_id INTEGER NOT NULL,
ls_from_star DOUBLE NOT NULL,

UNIQUE (name),

FOREIGN KEY (system_id) REFERENCES System(system_id)
ON UPDATE CASCADE
ON DELETE CASCADE
);
INSERT INTO "Station" VALUES(1,'Beagle 2 Landing',4,0.0);
INSERT INTO "Station" VALUES(2,'Dahan Gateway',12,0.0);
INSERT INTO "Station" VALUES(3,'Freeport',38,0.0);
INSERT INTO "Station" VALUES(4,'Chango Dock',20,0.0);
INSERT INTO "Station" VALUES(5,'Azeban City',15,0.0);
INSERT INTO "Station" VALUES(6,'Aulin Enterprise',5,0.0);
INSERT INTO "Station" VALUES(7,'WCM Transfer Orbital',30,0.0);
INSERT INTO "Station" VALUES(8,'Romanenko Estate',44,0.0);
INSERT INTO "Station" VALUES(9,'Romanek''s Folly',41,0.0);
INSERT INTO "Station" VALUES(10,'Bradfield Orbital',45,0.0);
INSERT INTO "Station" VALUES(11,'Gorbatko Reserve',10,0.0);
INSERT INTO "Station" VALUES(12,'Cuffey Plant',2,0.0);
INSERT INTO "Station" VALUES(13,'Hay Point',42,0.0);
INSERT INTO "Station" VALUES(14,'Bresnik Mine',16,0.0);
INSERT INTO "Station" VALUES(15,'Vonarburg Co-operative',55,0.0);
INSERT INTO "Station" VALUES(16,'Olivas Settlement',7,0.0);
INSERT INTO "Station" VALUES(17,'Moxon''s Mojo',9,0.0);
INSERT INTO "Station" VALUES(18,'Bowersox Mines',48,0.0);
It can easily be extended with more stations/platforms. Scripts can be written to extract data from it in JSON/CSV/XML format, and scripts can be written to extend it with more data coming from some source - crowd-sourced, for example. It is relatively simple: once a commander has bought data for a system, the cmdr can just read the info from the nav menu when visiting that star. Perhaps it's even possible to extract that info from the network logs. I still have to send IxForres a PM to ask if he is willing to resurrect his web API. An ideal place for that data.

Atm we are missing coords data for systems which do not have a station in them - at least not in SB2. My suggestion: let's do this first, and then in phase 2 we can get the other, extremely useful info. This is of course my personal view. We are in this together. What do others think?
This is where I am collating data. We have 397 systems so far. Also a few Station distances, I am working on adding more.
 

wolverine2710

Tutorial & Guide Writer
Smacker, 397 star systems - GREAT work.
I've been slacking, but here is one more.
INSERT INTO "System" VALUES(,'Wredguia XH-Q B46-3',-81.96875,46.71875,-40.375,'2014-10-16 16:44:44');

Raw data from RW's tool.
Code:
INSERT INTO "System" VALUES(,'Wredguia XH-Q B46-3',-81.96875,46.71875,-40.375,'2014-10-16 16:44:44');

{
  "name": "Wredguia XH-Q B46-3",
  "x": -81.96875,
  "y": 46.71875,
  "z": -40.375,
  "calculated": true,
  "distances": [
    {
      "system": "Sol",
      "distance": 102.624
    },
    {
      "system": "Wolf 497",
      "distance": 97.503
    },
    {
      "system": "Huokang",
      "distance": 72.276
    },
    {
      "system": "Demeter",
      "distance": 55.152
    },
    {
      "system": "Clotti",
      "distance": 34.755
    },
    {
      "system": "Haras",
      "distance": 52.505
    }
  ]
}

RW's tool has reduced my copy/paste stuff - with its chance for error - quite a bit ;-)
 