Discussion: What is the most efficient way to crowdsource the 3D system coordinates?

EDSC/TGC is using trilateration with local grid testing as JesusFreke described. His code is C#, I believe, and not publicly available. I use a very similar algorithm in my JavaScript version, which is available here: https://github.com/SteveHodge/ed-systems
The CPU requirements stem from testing the grid locations around possible candidates generated by trilateration. This is necessary because of the imprecise input data, to maximise the chance of getting an accurate location from poorly chosen references, and out of a desire to have confidence in the outcome. Try testing your algorithm with Sagittarius A* or Alpha Cygni to see why this approach is helpful.

Why approximate, when there's an analytic solution?

If your three points happen to lie nicely on the axes, such as:

Code:
p1 = ( 0     , 0     , 0      )
p2 = ( coef_d, 0     , 0      )
p3 = ( coef_i, coef_j, 0      )
Then the equations linking the distances (r1, r2 and r3)
to your unknown point, p_unknown = (x, y, z), are:

Code:
r1*r1 = x*x + y*y + z*z
r2*r2 = (x-coef_d)*(x-coef_d) + y*y + z*z
r3*r3 = (x-coef_i)*(x-coef_i) + (y-coef_j)*(y-coef_j) + z*z

and can be easily re-arranged to find two possible solutions:

Code:
candidateplus = (x, y, zplus);
candidateminus = (x, y, zminus);

where:

Code:
x = ( r1*r1 - r2*r2 + coef_d*coef_d ) / ( 2*coef_d );

y = ( ( r1*r1 - r3*r3 + coef_i*coef_i + coef_j*coef_j ) /
            (2*coef_j) ) - ( ( coef_i*x ) / coef_j );

zplus = sqrt(r1*r1 - x*x - y*y);
zminus = -zplus;

You can then use any additional point you have, p4 = (x4, y4, z4) at distance r4, to work out which
candidate is the one you want:

Code:
accuracyplus = abs( (x - x4)*(x - x4) + (y - y4)*(y - y4) + (zplus - z4)*(zplus - z4) - r4*r4 );
accuracyminus = abs( (x - x4)*(x - x4) + (y - y4)*(y - y4) + (zminus - z4)*(zminus - z4) - r4*r4 );

if(accuracyplus < accuracyminus)
{
    solution = candidateplus;
} else {
    solution = candidateminus;
}

(Note the abs() - the raw difference can be negative, and comparing signed values can pick the wrong candidate.)

If your points don't lie nicely like that, you can transform them so they do (unless they're in a straight line, in which case you're out of luck - use different points).
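
To make the transform concrete, here's a rough Python sketch (helper names are my own, not from any tool in this thread): build a local frame from the three reference points, apply the formulas above in that frame, then map the two candidates back. Treat it as a sketch, not a drop-in implementation.

```python
import math

def v_sub(a, b): return [a[k] - b[k] for k in range(3)]
def v_add(a, b): return [a[k] + b[k] for k in range(3)]
def v_scale(a, s): return [a[k] * s for k in range(3)]
def v_dot(a, b): return sum(a[k] * b[k] for k in range(3))
def v_cross(a, b):
    return [a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0]]
def v_norm(a): return math.sqrt(v_dot(a, a))

def trilaterate(p1, p2, p3, r1, r2, r3):
    """Return the two candidate positions at distances r1, r2, r3
    from reference points p1, p2, p3 (which must not be collinear)."""
    # Local frame: ex along p1->p2, ey in the p1-p2-p3 plane, ez normal to it.
    ex = v_scale(v_sub(p2, p1), 1.0 / v_norm(v_sub(p2, p1)))
    i = v_dot(ex, v_sub(p3, p1))                  # coef_i
    t = v_sub(v_sub(p3, p1), v_scale(ex, i))
    ey = v_scale(t, 1.0 / v_norm(t))
    ez = v_cross(ex, ey)
    d = v_norm(v_sub(p2, p1))                     # coef_d
    j = v_dot(ey, v_sub(p3, p1))                  # coef_j
    # The axis-aligned formulas from the post above.
    x = (r1*r1 - r2*r2 + d*d) / (2*d)
    y = (r1*r1 - r3*r3 + i*i + j*j) / (2*j) - (i*x) / j
    z = math.sqrt(max(r1*r1 - x*x - y*y, 0.0))    # clamp tiny negatives from rounding
    base = v_add(p1, v_add(v_scale(ex, x), v_scale(ey, y)))
    return v_add(base, v_scale(ez, z)), v_add(base, v_scale(ez, -z))
```

With exact distances one of the two returned candidates matches the true position; with the in-game rounded distances it only lands near it, which is where the grid searching discussed elsewhere in the thread comes in.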

I've got a Java class that does the appropriate transform, if anyone's interested.
 
Short answer, because I'm on my way back from holidays.
I have in mind that EDSC uses a quadratic approximation to find the coordinates. This introduces some (small) systematic errors, and it's quite CPU-intensive, as far as I read. Besides that it is in Python, and I like PHP ;)
The preferred way is to have an algorithm which calculates the location based on some distances, with no estimation involved.
I know the values are rounded to 2 decimal places, that there are CMDRs mistyping distance values by mistake, and surely there are trolls giving false distances on purpose.
I don't know which methods are used by others, except EDSC. And maybe my information about the EDSC way is outdated, so please let me know, also about other methods already in use. Maybe I overlooked something here.
Anyway, yesterday I implemented backwards checking of the coordinates found and did about 100-200 manual tests with EDSC coordinates, using 4-6 reference systems. It showed either false coordinates which were easily detected, or the right coordinates. No false positives.
Over the next days I'll make it work with any number of distances >= 4 and run some 10,000 tests with random EDSC coordinates.

The test I used to check my trilateration code was:

1. Generate 100 stars with random coordinates.
2. Calculate the distances between each pair of stars that are within 15 light years of each other,
to end up with a realistic web.
3. Pick 4 of the stars to use as a basis, and retain the coordinates of those 4.
4. Using just the distance data, plus the basis, work outwards using the trilateration code
to re-create coordinates for the remaining 96 stars.

It would be interesting to add in some noise to the distance data, and see how much it affects the resulting coordinates.

If you have lots of distance data, but it is of varying reliability, my suggested approach would be to use the analytic approach on several different sets of 4, and take the median. You could then fine-tune it by moving the solution around by small amounts, with a 'hotter / colder' approach until you've found a maximum number of distances to that point which are exact.
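
A minimal sketch of that median step in Python (the candidate coordinates below are made up purely for illustration):

```python
from statistics import median

# Candidate positions from running the analytic solution on several
# different sets of 4 references (made-up numbers, for illustration only).
candidates = [
    (32.66, 26.72, 8.81),
    (32.65, 26.72, 8.82),
    (32.66, 26.71, 8.81),
    (99.99, 99.99, 99.99),   # one bad reference set, e.g. a typo'd distance
    (32.66, 26.72, 8.81),
]

# The component-wise median is robust against a minority of bad candidates.
estimate = tuple(median(c[k] for c in candidates) for k in range(3))
print(estimate)  # -> (32.66, 26.72, 8.81); the outlier has no effect
```

The 'hotter / colder' fine-tuning would then start its walk from `estimate`.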
 
Another small correction I got from feedback on my route planner:

'Core Sys Sector CQ-Y C22' is actually 'Puppis Sector FW-W C1-22'

That's at 32.65625 | 26.71875 | 8.8125
 
Why approximate, when there's an analytic solution?
... code ...
If your points don't lie nicely like that, you can transform them so they do (unless they're in a straight line, in which case you're out of luck - use different points).

You just reinvented the wheel. That's exactly how trilateration works. But because of the rounding in the displayed distances it's only an approximation.
 
Does EDStarCoordinator have its own thread yet? I ask because I'm trying to use the API from PowerShell and not having much luck so far.

This is the PowerShell (v3) command I'm using to try to get a list of system coordinates:

Code:
Invoke-RestMethod -ContentType 'application/json; charset=utf-8' -Method Post -Uri 'http://edstarcoordinator.com/api.asmx/GetSystems' -Body (ConvertTo-Json @{ ver = 2; test='true'; outputmode=2; filter = @{ knownstatus = 1; date = "2014-09-18 12:34:56"} })

I've tried various parameters, but I've only received a "(500) Internal Server Error" response back.
 

You need to wrap your query with
Code:
@{ data = ... }
i.e. there must be a single key named "data" in the posted query.
 
Why approximate, when there's an analytic solution?

Due to the inaccuracy of the inputs (which are rounded to 2 decimal places and affected by the single precision implementation of the distance calculation in ED) and the potentially poor choice of reference systems the analytic solution alone won't work in all cases.

You're the second person this week to turn up and basically tell us we're doing it wrong. With all due respect, you are actually at the point we were at a couple of months ago. I don't expect you to read all of this mammoth thread to find out why we use the algorithms we use, but you could at least take a look at my code and do some testing on it before you try to teach us what we already know (not that I'm claiming my code is in any way elegant or well written - it's really just a hack, but it works and it's publicly available).

As I said, run your algorithm against Sag A* and Alpha Cygni using the data I provided and you'll find it's not so simple as just using unmodified trilateration.
 

It was a genuine question. I was wondering why it appeared that some weren't using the analytic approach, at least to calculate the likely start point for a search.

Someone above mentioned "from all "near" 1/32ly grid points near the calculated point", and doing an exhaustive search like that struck me as inefficient (though it will give the right answer). You've got errors from rounding, errors from movement of stars over time, and errors from mis-typed distances. But if you've got several entries for the same distance by different people, at around the same time period, the star movement isn't likely to be a big factor, and the typos are unlikely to all be the same typo - the correctly typed distance is likely to outnumber them. So mainly what you're facing is the rounding error, plus dealing with a minority of inconsistent data.

And in that situation, there's a more efficient way to search the 3D grid to find the 'sweet' spot than searching all the intersections on the grid. If you know the same rounding algorithm as is being used by Elite, then it isn't the case that the distance from a 'known' star to a candidate position simply either matches or it doesn't - you can check how closely it matches, see which of the 26 surrounding points makes the highest number of distances improve their match, then move your search centre over in that direction, and repeat.
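
A rough Python sketch of that neighbour walk, assuming (as discussed in this thread) distances displayed rounded to 2 decimal places and a 1/32 Ly coordinate grid (function names are mine, purely illustrative):

```python
import math

GRID = 1.0 / 32.0

def error(p, refs):
    """Sum of squared mismatches between the rounded distance from p to each
    reference and the distance the player reported. refs: [(coords, dist)]."""
    return sum((round(math.dist(p, q), 2) - d) ** 2 for q, d in refs)

def snap(p):
    """Snap a point to the 1/32 Ly grid."""
    return tuple(round(c * 32) / 32 for c in p)

def neighbour_walk(start, refs, max_steps=100):
    """Greedily move to whichever of the 26 surrounding grid points (plus the
    current one) best improves the match, until no neighbour is better."""
    current = snap(start)
    for _ in range(max_steps):
        best = min(
            ((current[0] + dx * GRID, current[1] + dy * GRID, current[2] + dz * GRID)
             for dx in (-1, 0, 1) for dy in (-1, 0, 1) for dz in (-1, 0, 1)),
            key=lambda p: error(p, refs))
        if error(best, refs) >= error(current, refs):
            return current        # local minimum - hopefully the real position
        current = best
    return current
```

A greedy walk like this can stall on pathological candidate regions, so it's a starting point rather than a complete solution.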

As you say, this is a long thread, and someone else has probably suggested this before, but it is quicker to have an idea re-posted and for someone who has already read the thread to reply "Nope, we've tried that."

For reference, the code I saw linked was https://github.com/SteveHodge/ed-systems/blob/master/trilateration.js and I had a look at that. I'm not saying it produces wrong answers. I was just wondering if there was a way to get faster answers.
 
You're the second person this week to turn up and basically tell us we're doing it wrong.
Oh boy, you're talking about me? Why don't people read my posts? Can you please show me the part where I said "You're wrong"?
If it's not me you're talking about, please ignore... ;)
Edit: Maybe you meant this:
... as I don't really like the way they are "calculated" right now.
So once more: please tell me where I said "You are wrong"!
 
It was a genuine question. I was wondering why it appeared that some weren't using the analytic approach, at least to calculate the likely start point for a search.
The majority of us do use the analytic solution as the starting point for the search. Although I think there were 1 or 2 that were using alternate approaches.

Someone above mentioned "from all "near" 1/32ly grid points near the calculated point", and doing an exhaustive search like that struck me as inefficient (though it will give the right answer).

The "from all near 1/32ly grid points" comment was me (link for reference). As I described in that post, what you actually end up with is the intersection of multiple thin spherical shells, which gives you a small 3D volume. When you perform trilateration, it will give you a point within that small intersection volume. But the real coordinate could be anywhere in that volume. So you end up needing to look around the area near the point you calculate with trilateration - or get more distances, which will (usually) decrease the size of the intersection volume.

What I do, instead of using a sphere/cube around the calculated point, is to use a fairly sophisticated heuristic that tries to follow the contour of the error function around the candidate region - where the error function is the sum of squares of the distance errors to a given point, the error for a single distance being 0 if the calculated rounded distance exactly matches the rounded distance that the user input from the game. The candidate region is the intersection volume, i.e. the region where the error function is 0.

And then I evaluate every grid point within that contour, to ensure that I've evaluated every grid point that might be in the candidate region. If I've only found 1 point that actually has an error of 0, then I know I've found the right coordinate, assuming no typos. Or if I've found multiple points with an error of 0, then it's impossible to determine which is the real coordinate, without more data.
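
A naive Python version of that check (without the contour-following heuristic - it just brute-forces a box around the trilateration result) might look like this; the names and box radius are my own choices, not from any implementation in this thread:

```python
import math
from itertools import product

GRID = 1.0 / 32.0

def zero_error_candidates(center, refs, radius_steps=3):
    """Return every 1/32 Ly grid point in a box around `center` whose rounded
    distance to every reference matches the reported distance exactly.
    refs: [(reference_coords, reported_distance_rounded_to_2dp)]."""
    cx, cy, cz = (round(c * 32) for c in center)   # centre in grid units
    out = []
    for dx, dy, dz in product(range(-radius_steps, radius_steps + 1), repeat=3):
        p = ((cx + dx) * GRID, (cy + dy) * GRID, (cz + dz) * GRID)
        if all(round(math.dist(p, q), 2) == d for q, d in refs):
            out.append(p)
    return out
```

If exactly one point comes back, that's the coordinate (barring typos); if several come back, another distance is needed to discriminate, exactly as described above.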

You've got errors from rounding, errors from movement of stars over time, and errors from mis-typed distances. But if you've got several entries for the same distance by different people, at around the same time period, the star movement isn't likely to be a big factor, and the typos are unlikely to be all the same typo - the correctly typed distance is likely to outnumber them.

To the best of my knowledge, the system coordinates do not change over time. The coordinates of a star within the system can and will change, but the coordinates of the system within the galaxy won't change.

As for typos, in the majority of cases you will have only 1 user that has input the distances for a given star pair - especially for people (like me) who don't use a standard set of reference stars. So we can't rely on multiple entries of the same distance. Instead, we ensure that there are enough distances to guarantee enough redundancy to detect, e.g., at least 2 mistyped distances.

So mainly what you're facing is the rounding error, plus dealing with a minority of inconsistent data. And in that situation, there's a more efficient way to search the 3D grid to find the 'sweet' spot, than searching all the intersections on the grid. Because if you know the same rounding algorithm as is being used by Elite, then it isn't a case that either the distance from a 'known' star to a candidate position either matches or it doesn't - you can check how closely it matches, and see which of the 26 surrounding points makes the highest number of distances improve their match, then move your search centre over in that direction, and repeat.

That doesn't always work though. The problem is that you're sampling the error function in 1/32 LY increments, but the function can have much smaller features. The clearest example of this is when you're dealing with a star far away, using references that are bunched together in relation to that star, due to the long distance. In that case, you can end up with a candidate region that is a very long, thin and narrow "needle" shape, which can be many 1/32 LY units in length, but is narrow enough that it passes between almost all grid points except possibly 1 or 2 somewhere along its length. In that case, trilateration will produce a point within the needle, but almost certainly not one that is grid-aligned. Searching the 27 nearby grid points won't give you any indication which direction you should look, because the error function in that case is related to how close the needle is to a given grid point, which is mostly unrelated to where the needle will actually encompass one of the grid points. And just because you find 1 grid point within the needle, you still have to search along the rest of the length of the needle, because it's possible it encompasses multiple grid points - in which case you need more data, i.e. the distance to another system which will let you discriminate between the 2 or more candidate points (another thin spherical shell which only encompasses one of the candidate points).


Mathematically, that "needle" could be described as a system of inequalities, and then you could add an additional constraint to enforce the 1/32 LY grid thing. You could then theoretically try to solve that system of inequalities, which would give you 0 or more points that satisfy it. However, that would be a very complex system, and my math skills aren't up to solving an arbitrary system of inequalities of that form algorithmically. Although I'm a bit curious whether something like Mathematica or MATLAB could.
 
It was a genuine question.

Oh boy, you're talkin bout me?

Ok, I was unreasonably grumpy and that bit wasn't fair. Sorry about that. I guess I felt a little insulted by the "there's a really easy way to do this" suggestion as it implies we weren't smart enough to figure that out when we've actually put a lot of thought and work into this. Most of the 4 or 5 implementations by people here that I've looked at start with trilateration (or an equivalent algebraic method); the changes and extensions to that basic algorithm are all in response to actual problems we've encountered in real data.


Someone above mentioned "from all "near" 1/32ly grid points near the calculated point", and doing an exhaustive search like that struck me as inefficient (though it will give the right answer). You've got errors from rounding, errors from movement of stars over time, and errors from mis-typed distances. But if you've got several entries for the same distance by different people, at around the same time period, the star movement isn't likely to be a big factor, and the typos are unlikely to be all the same typo - the correctly typed distance is likely to outnumber them. So mainly what you're facing is the rounding error, plus dealing with a minority of inconsistent data. And in that situation, there's a more efficient way to search the 3D grid to find the 'sweet' spot, than searching all the intersections on the grid. Because if you know the same rounding algorithm as is being used by Elite, then it isn't a case that either the distance from a 'known' star to a candidate position either matches or it doesn't - you can check how closely it matches, and see which of the 26 surrounding points makes the highest number of distances improve their match, then move your search centre over in that direction, and repeat.

JesusFreke's post answers this pretty comprehensively.

For me personally, I just do an exhaustive search of a small area around the candidate points (in practice I've found that searching just the 8 corners of the cube containing the candidate is typically sufficient), but I check all candidates (each subset of three distances generates two candidate points via trilateration). Error detection in the common case where distances have only been entered by one person is really the main motivation for my choice of algorithm. I think the more sophisticated search algorithm JesusFreke uses is rarely warranted in practice, though it is a fascinating problem to solve.
 
The majority of us do use the analytic solution as the starting point for the search. Although I think there were 1 or 2 that were using alternate approaches.
Yes, exactly. And I don't think it's a mistake to try another way of finding the coordinates, although the ways are quite similar. Both ways should find the exact same vector at the end.
That doesn't always work though. The problem is that you're sampling the error function in 1/32 LY increments, but the function can have much smaller features. The clearest example of this is when you're dealing with a star far away, using references that are bunched together in relation to that star, due to the long distance. In that case, you can end up with a a candidate region that is a very long, thin and narrow "needle" shape, which can be many 1/32LY units in length, but is narrow enough that it passes between almost all grid points except possibly 1 or 2 somewhere along it's length.
Fully agree, except on one point: just from a feeling, I'd say this is probably not a needle - I believe it looks more like a plate or lens. On one hand I would need to visualize it to be sure; on the other hand it doesn't change the result at all ;)

However, that would be a very complex system of equations.. and my math skills aren't up to be able to solve an arbitrary system of inequalities of that form algorithmically. Although I'm a bit curious if something like mathematica or matlab could.
I didn't try these; I don't have access to them. I used to during my studies, but that was decades ago. I tried Maxima instead, which should play in the same league. I tried different approaches: each one using 4 reference systems ended with either no solution "[]", evaluation stopped automatically with "out of memory", or I stopped the calculation by hand after several hours without result. With 2 reference systems it worked quite well, and with 3 systems it worked and spat out 2 solutions (as expected), but the result was 3 or 4 screen pages long. I didn't dare to use 5 systems... ;)


Ok, I was unreasonably grumpy and that bit wasn't fair. Sorry about that. I guess I felt a little insulted by the "there's a really easy way to do this" suggestion as it implies we weren't smart enough to figure that out when we've actually put a lot of thought and work into this. Most of the 4 or 5 implementations by people here that I've looked at start with trilateration (or an equivalent algebraic method); the changes and extensions to that basic algorithm are all in response to actual problems we've encountered in real data.
Peace, man! We are all pursuing the same aim! I believe it was some kind of misunderstanding. I didn't want to tell anyone he's doing it wrong or isn't smart enough, and I know my way is not really easy - it took me several days of running into dead ends. I wonder how CPU-intensive this trilateration thing is. Anyone got some PHP code for it?
 
Fully agree, except on one point: just from a feeling, I'd say this is probably not a needle - I believe it looks more like a plate or lens. On one hand I would need to visualize it to be sure; on the other hand it doesn't change the result at all ;)


There are many different possibilities. The "needle" is one of them, and I find it makes a good mental corner case when thinking about how a particular algorithm will behave, because it's easy to visualize and it tends to be a problematic case. In fact, such a needle provided an "ah-ha!" moment for me, when I was looking at why one of my earlier algorithms didn't work as I expected.

Peace, man! We are all pursuing the same aim! I believe it was some kind of misunderstanding. I didn't want to tell anyone he's doing it wrong or isn't smart enough, and I know my way is not really easy - it took me several days of running into dead ends. I wonder how CPU-intensive this trilateration thing is. Anyone got some PHP code for it?

Trilateration is quite cheap, computationally. I haven't specifically profiled it, but it's just a series of normal math operations with no loops, i.e. it's O(1). But as I describe in my couple of long posts, the point you get from trilateration is only the starting point... :)
 
The majority of us do use the analytic solution as the starting point for the search. Although I think there were 1 or 2 that were using alternate approaches.

My approach generated all possible coordinate triplets using 1/32Ly integer values that could potentially yield a displayed distance.

For each pair of references p(sys, dist) and q(sys, dist) from the set of all reference systems, where p != q, it generated a 3D torus (which would have been a circle if the distances were not rounded) of points that could be valid from both p and q, in a 3D bitmap (voxel map?).

If there are enough valid references, and there are no invalid references, there will be exactly one point where all the generated voxel maps overlap.

This approach was able to detect some problems with positions during the Beta 3 effort, but I haven't used it since then.
 
I think someone needs to start working on an EDSC alternative in case it goes offline. It would be good if Tornsoul would give a little ping to this thread that things are working as they should.
 

wolverine2710

Tutorial & Guide Writer
I think someone needs to start working on an EDSC alternative in case it goes offline. It would be good if Tornsoul would give a little ping to this thread that things are working as they should.

I believe Biteketkergetek is working on this. From post #1994: "I'm going to set up my own EDSC mirror and would like to have two versions of it, the raw version (sorted by EDSC id) and a version processed with:". Not sure if he's going to mirror the API as well. There has been some discussion about the POST method used by TornSoul (which for his C# approach is the easiest) instead of GET.

Edit: Checked TornSoul's profile. His latest post was on the 16th of December, in this post. Not sure what his current situation is. Would be nice to have an update from him as TGC is used by quite a few tools atm - which is GOOD.
 
I believe Biteketkergetek is working on this. From post #1994: "I'm going to set up my own EDSC mirror and would like to have two versions of it, the raw version (sorted by EDSC id) and a version processed with:". Not sure if he's going to mirror the API as well. There has been some discussion about the POST method used by TornSoul (which for his C# approach is the easiest) instead of GET.

It would probably be best if a TGC replacement provided a compatible API but I think it would involve quite a lot of work to keep the error/status reporting the same.
 
I wonder how CPU-intensive this trilateration thing is. Anyone got some PHP code for it?

Trilateration itself is not CPU-intensive at all. But you can do it for each combination of 3 distances from the total set. If you do that, you're running it n(n-1)(n-2)/6 times, so that's O(n^3), which would be a big problem if we were dealing with a lot of distances. In my implementation most of the time is spent checking distances, but it's still fast enough up to about 15 distances, which takes around a second on my machine. I have thought of various optimisations, mostly taking advantage of the way distances are added incrementally (e.g. only testing the n/(n-3) new candidate regions along with the leading current candidate), but I haven't bothered implementing many of them yet.
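
A quick sanity check on that count, and the shape of the triple-enumeration loop (pure illustration, in Python):

```python
from itertools import combinations

n = 15                                   # number of distances on hand
triples = list(combinations(range(n), 3))

# One trilateration call (two candidate points) per 3-subset of distances.
assert len(triples) == n * (n - 1) * (n - 2) // 6   # 455 for n = 15
```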

I don't think anyone here has implemented trilateration in PHP. It's pretty straightforward though - it should be easy to implement from the Wikipedia article. Alternatively you could port it from my JavaScript (trilateration.js/getCandidates()).
 