One (more) suggestion... being able to import old log files would be nice.
Define "old logs"
I have a suggestion, maybe a manual checkbox for "First Discovery" on systems or bodies? Obviously there's no way to know whether you'll actually get credit for a First Discovery scan until you get back, sell the data and see if someone got there before you, but it would be cool for keeping track of the places you (possibly) have your name on.
I've thought about that one (great minds think alike)
Either that, or the default is First Discovery, with a "previously discovered" marker for the ones you know have already been found. The majority of exploration data is (hopefully) going to be discovered by you first - quite often I see only a couple of planets in a system scanned previously and the rest untouched. Same for stars.
Regds
Cool. I actually hit a lot of previously discovered stars, but that's mostly because I'm still in "tourist" mode, visiting popular nebulae and other well-known stars.
Define "old logs"
Not a bad idea, and I tried a few, but no luck so far. The name of the process doesn't change when I change the file name. Needs more research though.

Just a stupid idea, but maybe you can try it...
If you copy/paste a dummy Windows program (Calculator, Minesweeper or whatever) and rename it to "EliteDangerous32.exe"...
Maybe it will start as a sort of fake ED...
(It probably won't work, but it takes very little effort to test.)
I've realised it's because I didn't have verbose logging on, so there's no source data.
I've been to Sag. A* and beyond, and believe me, you do not want to have to tick "First Discovered" for every object you find!
Download SQLite Admin and open your faulty DB. Under "Tables", go to 'systems' and then the 'name' field.
In the right-hand pane, select the 'Edit' tab and scroll to the very last entry at the bottom. You should see the duplicate entry there. Delete it, save, and then it should work.
Make a backup first.
This error seems to happen after accidentally closing CL and then launching it again while the process is still active in the Task Manager. You then have a 'shadow' CL running and both CLs then make DB entries, which results in said error.
At least that was what I could reproduce on my PC.
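If you're comfortable running SQL directly instead of editing the grid, a rough equivalent would be something like the following (just a sketch - it assumes the table really is 'systems' with a 'name' column, and that the duplicate rows are identical apart from their rowid):

delete from systems where rowid not in (select min(rowid) from systems group by name)

That keeps the first copy of each system name and removes the extras. Again, make a backup of the DB file before trying it.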
Thanks for this, Andrew, but I need more detailed guidance. I tried to fix my database but it didn't work - maybe because I have several dupe entries? Can anyone provide a more detailed guide on how to use SQLite Admin to fix the CL database? (I have no programming experience in this area.) I've tried a few times to no avail.
Is the following screen where I should be editing? Tables > Systems > Fields > name? Just this one table?
(screenshot attached)
CMDR Genar-Hofoen,
I installed this software yesterday, and so far it looks great.
I have an idea for a very small improvement... It could be interesting to add some sort of rating to a system, so the user can give 0 or 1 "stars" to a useless system and 5 stars to an amazing one (for whatever reason that system is amazing - it could just be down to a wonderful view).
Just started using it and loving it so far.
It could really do with a note somewhere in the install instructions about needing to play in Windowed or Borderless mode for the overlay to work. Took me a while to find that tidbit in the changelogs.
Awesome app though.
Yep. Basically, if you haven't had verbose logs on since the first time you started playing ED, all that data is locked up in FDEV's servers.
The only reason we have EDDiscovery and all the other 3rd party applications operating the way they do right now (scraping verbose logs) is because there is no official ED API. Such an API could have ways of getting a list of all the systems you've been in - but I very much doubt FDEV even store that kind of data - the storage requirements would be pretty hefty.
They must store that data somewhere, as otherwise how would they stop you rescanning a body you've already scanned? They're also storing all the tagging data. However, that's not too bad a problem to solve, as they only need to store data for bodies people have actually scanned - not for every body - and they don't need to store data about the "well known" bodies in the bubble either. So for many CMDRs that's a few hundred or a few thousand entries. I'm nearly Elite in exploration and I've got less than 11,000 sold scans. If everyone travelled as much as I have, they'd have about 6.6bn data points to worry about (11,000 scans apiece across roughly 600,000 CMDRs), but I suspect the real figure is just in the tens, maybe hundreds, of millions. So perfectly manageable!
Try this to get a list of the systems where you've got duplicates. If the first column is 2 or more, then it's a duplicate entry.
select count(1), name from systems group by name order by 1 desc
That will give you a list of all the systems which have duplicates.
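If you just want the problem rows rather than the full list, you can probably narrow it down with a having clause, e.g.

select count(1), name from systems group by name having count(1) > 1

which should only return the names that appear more than once.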