You could halfway fix that for LHS/Lalande/Ross type names, where there is a template for what is right or wrong.
I would suggest a function where we could "vote" a name up. Or maybe look at what is submitted most often. The two could even be the same: we get a wrong name from EDSN, and just send corrections every time we either send distances for that system or that system gets into the distance list for another system. Maybe add a weight to the different sources. For example, getting a system name from ReadWizzard would most likely be a correct spelling, so you could say his spelling takes precedence over the first submission to the DB.
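For the LHS/Lalande/Ross-type names, the "template" check could be as simple as a canonical spelling per catalogue prefix. A minimal sketch of the idea, assuming a hand-maintained prefix table (this isn't an existing feature, just an illustration):

```python
import re

# Hypothetical canonical spellings - a real list would be maintained by hand.
CANONICAL_PREFIXES = {"lhs": "LHS", "lalande": "Lalande", "ross": "Ross"}

def template_fix(name):
    """Return the templated spelling for '<prefix> <number>' names, or None if no template applies."""
    m = re.fullmatch(r"([A-Za-z]+)\s+(\d+)", name.strip())
    if not m:
        return None  # not a simple catalogue-style name - needs human eyes (or voting)
    prefix, number = m.groups()
    canonical = CANONICAL_PREFIXES.get(prefix.lower())
    return f"{canonical} {number}" if canonical else None

print(template_fix("lhs 3447"))  # -> LHS 3447
print(template_fix("ROSS 154"))  # -> Ross 154
```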
I'm willing to bet the list of systems with different capitalization will be relatively short, and a decent admin form would mean a couple of minutes a week, not a full-time job. So a manual approach might not be that hard to do.
Fixed a bug where I tried to do trilateration on distances already in the DB where the ref system coords were not known.
Vector math on null vectors doesn't work too well.
This would occur if the submitted p0 already had distances in the DB besides those submitted (and some of those were to ref systems without known coords).
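In practice the fix boils down to filtering out reference systems without known coordinates before doing the vector math. A rough sketch of that kind of guard (the data layout is made up for illustration, not the actual EDSC code):

```python
def usable_distances(distances, systems_by_id):
    """Keep only the distances whose reference system has known (non-null) coordinates."""
    usable = []
    for d in distances:
        ref = systems_by_id.get(d["ref_id"])
        if ref is None or ref.get("coords") is None:
            continue  # no position known - trilateration would be doing math on a null vector
        usable.append(d)
    return usable
```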
Just an FYI if you've had strange, irregular errors when trying to submit.
Huge thanks to JesusFreke for supplying a really good bug report with a working (well, not working) jsfiddle and all.
Or maybe look at what is submitted most often. The two could even be the same: we get a wrong name from EDSN, and just send corrections every time we either send distances for that system or that system gets into the distance list for another system. Maybe add a weight to the different sources. For example, getting a system name from ReadWizzard would most likely be a correct spelling, so you could say his spelling takes precedence over the first submission to the DB.
I don't store subsequent duplicate (sans case) names, so there's no way to "count most submitted".
The cr value is simply incremented in case of a duplicate (sans case).
And the distance table uses "id" for the systems (from the system table), not the actual text string - So nothing to be gotten from there either.
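Roughly, the shape of it is something like this (a sqlite sketch purely for illustration - not the actual tables or column names):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    -- one row per distinct name (case-insensitive); cr counts repeat submissions
    CREATE TABLE system (
        id   INTEGER PRIMARY KEY,
        name TEXT COLLATE NOCASE UNIQUE,
        cr   INTEGER NOT NULL DEFAULT 1
    );
    -- distances reference system ids, never the text strings
    CREATE TABLE distance (
        system_id INTEGER REFERENCES system(id),
        ref_id    INTEGER REFERENCES system(id),
        dist      REAL
    );
""")

def submit_name(name):
    """First spelling wins; later duplicates (sans case) only bump the cr counter."""
    row = conn.execute("SELECT id FROM system WHERE name = ?", (name,)).fetchone()
    if row:
        conn.execute("UPDATE system SET cr = cr + 1 WHERE id = ?", (row[0],))
        return row[0]
    return conn.execute("INSERT INTO system (name) VALUES (?)", (name,)).lastrowid
```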
I'm willing to bet the list of systems with different capitalization will be relatively short, and a decent admin form would mean a couple of minutes a week, not a full-time job. So a manual approach might not be that hard to do.
Fixed a bug on the EDSC submission form, where if the form had remembered your name from last time, it would not actually submit it, unless you'd first clicked in the commandername input box.
Assuming you're asking about the 150 stars I added - no, not yet. I had tried to submit the distances to the test instance of EDSC, but I kept getting errors for specific systems, even though other systems worked fine. I've PM'd TornSoul, but he's probably out for the holidays or something.
I'll definitely get them submitted to EDSC when possible.
I've been talking with TornSoul today and we managed to iron out all the kinks I was hitting when trying to submit. I've submitted around 200 stars to the TGC production database, although it looks like some of them are missing coordinates for some reason, even though I've been able to calculate coordinates for all of them locally.
Yeah making something foolproof is most likely impossible - Hence I'm not even going to try tbh.
I'd be open to manually editing the DB (well, I'd script it) if someone provides a list of names that need fixing along with what they should be corrected to.
Yeah, I don't think it will be much of an issue. I'll compare what's in TGC and systems.json and give you a list, though I haven't been monitoring the names in systems.json particularly closely.
And the input "date" filter is put straight through to the DB as a datetimeoffset as well (obviously without the TZ adjustment bit though...)
I'm outputting them as UTC values however.
So yeah perhaps not optimal...
What would you prefer?
EDIT:
As an undocumented feature (that ought to work - haven't tested) add a "Z" behind the date and it should work as you'd expect: ie. "date":"2014-11-30 22:29:04Z",
Ah, I was wondering what you were doing about timezones. The specific thing I'm doing is looking up the latest updatedate in my local DB and using that for the 'date' query filter, so I'll always get some overlap, but be sure not to miss anything. So, yeah, you need to be using the same timezone (personally I'd store it all in UTC) for both output and the date filter interpretation.
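The sync step is essentially just this kind of query (a simplified sketch - the endpoint URL, request shape and local table names are placeholders, not the documented API):

```python
import json
import sqlite3
import urllib.request

def fetch_updates(conn):
    """Query TGC for everything changed since our newest local record (with overlap)."""
    # Newest timestamp already stored locally; re-requesting from here overlaps but never misses.
    (latest,) = conn.execute("SELECT MAX(updatedate) FROM distances").fetchone()
    query = {"filter": {"date": latest or "2014-01-01 00:00:00"}}  # whatever timezone TGC expects - hence the point above
    req = urllib.request.Request(
        "http://example.invalid/api/GetDistances",                 # placeholder URL
        data=json.dumps(query).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```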
Ah, I was wondering what you were doing about timezones. <snip> So, yeah, you need to be using the same timezone (personally I'd store it all in UTC) for both output and the date filter interpretation.
Datetimeoffset is UTC - in a sense (and more - that's the beauty of it.)
Btw I've tested the "Z" postfix - works like a charm.
The Z postfix tells .NET "I am a UTC datetime" (as opposed to local server time).
Which then means that when I convert that to a datetimeoffset, it'll get converted correctly, without me having to worry about daylight saving and TZ and what not. Datetimeoffset handles all that automatically.
However, if there is no Z postfix - it's treated as local (server) time. And when it gets converted to datetimeoffset it gets "my server's" local offset added.
Date/time is a mess in general
Not sure how to handle this one tbh...
I think the best way would simply be to update the API docs - telling people to add the Z postfix.
As that is in fact "the correct" (ISO-compliant and all) way in this case (I admit to having overlooked that at first).
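The same naive-vs-explicit distinction exists in most date libraries; as a loose analogue of the behaviour described (Python here, not the actual server code):

```python
from datetime import datetime

raw = "2014-11-30 22:29:04"

# Without the Z the string is "naive" - somebody has to guess an offset,
# and the server guesses its own local time in this case.
naive = datetime.strptime(raw, "%Y-%m-%d %H:%M:%S")

# With the trailing Z the offset is explicit: this instant is UTC, no guessing needed.
aware = datetime.strptime(raw + "Z", "%Y-%m-%d %H:%M:%S%z")

print(naive.tzinfo)  # None - interpretation is left to whoever parses it
print(aware.tzinfo)  # UTC  - unambiguous
```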
I can live with the Z postfix (indeed I changed my script to use it already), but as I said, I'd have just stored and processed it all in UTC from the start anyway; it's the only sane thing to do. Then you get the choice to display in another timezone as and when you wish. Anything else is asking for headaches.
Yes, I have scripts that force TZ=UTC and set postgresql explicitly to UTC as well. The database for my RSS feed is all-UTC.
From a data structure point of view what I've outlined above makes sense.
Some things to note:
System
Except for "Population" and "Security" (and name and coords) all other properties of a system are derived from something else in the system.
Ie. it's not a property of the system per se, but of other "sub properties" of "entities" of the system (pardon the terminology...)
Notably: economy:
- A system's economy listing is the sum total of all station economies.
I've yet to see anything contradict this.
government/allegiance/state
- A system's government/allegiance/state is derived from those of the controlling faction.
Faction
A faction has straightforward properties - Nothing special there (except I've removed "relation" as I assume that will end up being "per commander")
However - with this structure it will, as a minimum, be necessary to enter those factions that are "in control" of stations.
The rest can be ignored if people don't feel like entering the data for those.
But the data for those factions that control stations is needed - and doesn't require extra typing as such (as it's listed for each station as well).
Station
Stations "inherit" all the properties of whatever faction controls it.
So it makes sense to simply list the owning faction - The data will be in the faction object.
This obviously will require a lookup for that data... (maybe controversial - as the values are not "directly" properties of the station now)
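Put as code, the structure outlined above looks roughly like this (field names are my own shorthand, not a final schema):

```python
from dataclasses import dataclass, field

@dataclass
class Faction:
    name: str
    government: str
    allegiance: str
    state: str

@dataclass
class Station:
    name: str
    faction: str                 # name of the owning faction; its details live on the Faction object
    economies: list = field(default_factory=list)
    facilities: list = field(default_factory=list)

@dataclass
class System:
    name: str
    coords: tuple
    population: int
    security: str
    controlling_faction: str
    stations: list = field(default_factory=list)
    factions: dict = field(default_factory=dict)   # name -> Faction; station owners at a minimum

    @property
    def economy(self):
        # Derived: the sum total of all station economies in the system.
        return sorted({e for st in self.stations for e in st.economies})

    @property
    def government(self):
        # Derived from the controlling faction (allegiance/state work the same way).
        return self.factions[self.controlling_faction].government
```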
Now for what I think might be the most controversial...
facilities/economies
- facilities: Instead of listing each possible facility as a boolean, I suggest using a bitmask.
So instead of ["Commodities", "Refuel", "Repair", "Re-Arm", "OutFitting"] you'd have perhaps "58" (ie. 111010)
- economy: Likewise for economy. Instead of having "economy1"/"economy2", a bitmask for all possible economies is used.
So instead of ["Refinery", "Industry"] you'd have perhaps "20" (ie. 10100)
Now - coding-wise, bitmasks make a lot of sense in these two cases.
It does however raise the bar a little bit for both consumers of the data and likewise contributors (those coding the submission forms - not the actual people entering the data), as a bitmask is not very "human readable".
So I'm not sure what you all think of this...
It's not like bitmasks as a concept are *that* hard - it's just that the output won't be as easy to "eyeball parse" (via a fiddle or whatever) as listing out the full names of, say, the economies.
But then again - Output from the API isn't really supposed to be looked at by human eyes anyhow... There should be some UI presenting that data - And likewise for data submission, which can be done in many ways in fact, and still result in the same data being submitted.
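To make it concrete, the encode/decode step is only a few lines. A minimal sketch (the bit assignments below are made up for illustration - the real ones would have to be fixed in the API docs):

```python
# Made-up bit positions; the real assignments would need to be documented once and never change.
FACILITIES = ["Commodities", "Refuel", "Repair", "Re-Arm", "OutFitting"]
ECONOMIES  = ["Agriculture", "Extraction", "Refinery", "Industry", "High Tech"]

def to_mask(names, table):
    """Fold a list of names into a single integer bitmask."""
    return sum(1 << table.index(name) for name in names)

def from_mask(mask, table):
    """Expand a bitmask back into the human-readable list of names."""
    return [name for i, name in enumerate(table) if mask & (1 << i)]

mask = to_mask(["Refinery", "Industry"], ECONOMIES)
print(mask)                               # 12 with these made-up positions
print(from_mask(mask, ECONOMIES))         # ['Refinery', 'Industry']
print(to_mask(FACILITIES, FACILITIES))    # 31 - all five facilities set
```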
And the input "date" filter is put straight through to the DB as a datetimeoffset as well (obviously without the TZ adjustment bit though...)
I'm outputting them as UTC values however.
So yeah perhaps not optimal...
What would you prefer?
EDIT:
As an undocumented feature (that ought to work - haven't tested) add a "Z" behind the date and it should work as you'd expect: ie. "date":"2014-11-30 22:29:04Z",
I am working on getting new updates with the time filter too. And it's good to know about this problem now...
If you output them as UTC it would be best to have the time filter as UTC too, I think.
Fun that the example was from my added stars. And I have a question about that. A few of the stars I added had no solution even though I added lots of stars. Look at the star Maia. I might have done something wrong, but it worked fine with the form with redwizzard's code.
I've finally figured something out that *I* am happy with - everyone else might not be, however...
<snip>
Now - coding-wise, bitmasks make a lot of sense in these two cases.
It does however raise the bar a little bit for both consumers of the data and likewise contributors (those coding the submission forms - not the actual people entering the data), as a bitmask is not very "human readable".
So I'm not sure what you all think of this...
It's not like bitmasks as a concept are *that* hard - it's just that the output won't be as easy to "eyeball parse" (via a fiddle or whatever) as listing out the full names of, say, the economies.
But then again - Output from the API isn't really supposed to be looked at by human eyes anyhow... There should be some UI presenting that data - And likewise for data submission, which can be done in many ways in fact, and still result in the same data being submitted.
I don't mind the bitmask if we have to use it. I can get my head around it if necessary. If there were an easier, more human way of dealing with this, though, it'd make things a bit easier for me.
That being said, do you have a guide to what value is related to what mask?
I don't mind the bitmask if we have to use it. I can get my head around it if necessary. If there were an easier, more human way of dealing with this, though, it'd make things a bit easier for me.
That's the one downside to bitmasks - They are not very readable.
But as I mentioned, no human is actually supposed to see the raw output of the API anyways...
A few of the stars I added had no solution even though I added lots of stars. Look at the star Maia. I might have done something wrong, but it worked fine with the form with redwizzard's code.
Could you tell me the coordinates reported by rw's tool, please?
I did a quick test, and the reason TGC doesn't find any coordinates for Maia is that it can't find a candidate where all the distances in the DB match perfectly (there are 20 distances in total).
If I recall correctly, rw's tool is not quite as stringent.
It's possible one of those distances is wrong, who knows... (But if I had the coords from rw I'd be able to check.)
It's one of the issues I raised as a concern earlier - as of yet I have no way of dealing with that.
And haven't really had the time/inclination to pursue it further atm.
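The check that fails for Maia is conceptually just this: a candidate position only gets accepted if it reproduces every stored distance. Roughly (the tolerance value here is a guess, not TGC's actual rule):

```python
import math

def candidate_fits(candidate, references, tolerance=0.005):
    """True only if the candidate position reproduces *every* stored distance.

    candidate  - (x, y, z) being tested
    references - list of ((x, y, z), reported_distance) pairs
    tolerance  - how much rounding error to accept (a guess, not TGC's actual rule)
    """
    for ref_coords, reported in references:
        if abs(math.dist(candidate, ref_coords) - reported) > tolerance:
            return False   # one bad or imprecise distance sinks the whole candidate
    return True
```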
Do you have quota caps on your API? My HeatMap has "recovery" runs once a day, where it tries to remap all unknown systems. It would then send a bunch of queries in a short time.
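The recovery pass is essentially just a loop like this (simplified; query_tgc is a stand-in for whatever call the HeatMap actually makes), hence the question about caps:

```python
import time

def recovery_run(unknown_systems, query_tgc, pause=0.0):
    """Try to remap every still-unknown system in one daily pass."""
    results = {}
    for name in unknown_systems:
        results[name] = query_tgc(name)   # one API query per unknown system
        if pause:
            time.sleep(pause)             # could space the queries out if there are caps
    return results
```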