EDTS: A collection of scripts for route planning and other tools

wolverine2710

Rep +1. Very nice set of very useful tools. Thanks for sharing!!

Due to the way the forums work, this excellent piece of info will be snowed under and lost rather soon, I'm afraid. Though now that it has been moved to this sub-forum, the odds are a bit better ;-) I certainly missed it initially, and I've searched the forums quite a bit for new tools ;-(

I've taken the liberty of adding it to EDCodex. It's a VERY large collection of third-party tools, threads, videos, live streams etc. EDCodex entry: "EDTS: A collection of scripts for route planning and other tools". Anyone can and is encouraged to add entries. If you would like to edit yours, please register and send me or biobob a PM with the email address used for registering. We will then give you full edit access to your entry directly. I'm sure you can do a much better job than I did.
 
I don't really know Python, nor am I a programmer, but due to the...inconsistent results of online tools I've spent a while trying to get this working. I've got Python 2.7 installed and the scripts placed in the scripts directory. However, I consistently get stuck here:

[Screenshot: update.py console output, stuck partway through the data import]


I've left it running overnight, but it never seems to get past this step. Any ideas?
 
I'll add my shrug of the shoulders. It is certainly worth persisting with, as it's an excellent tool.

I'm sure Alot will be along with some sage advice to set you on your way soon.
 
The update process is getting quite... intensive these days due to the size of the data.
With that said... not going to lie, that's a weird place for it to be getting stuck. That's probably the least intensive of the data that it imports.

You could still try to apply some magic mostly done by furrycat, by passing the -b flag to update.py:

python update.py -b

This should try to import the EDSM/EDDB data in smaller chunks, which might work better. It's not currently the default because it's a bit hacky, but it may well help. :)
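For the curious, the chunked import is conceptually just committing the data in batches instead of in one giant transaction. A rough sketch of the pattern (not EDTS's actual code - the table layout, column count and batch size here are made up):
Code:
import sqlite3

BATCH_SIZE = 10000  # hypothetical chunk size: commit every N rows

def import_systems(rows, db_file="data/edts.db"):
    # rows: an iterable of (id, name, x, y, z) tuples streamed from the dump
    conn = sqlite3.connect(db_file)
    cursor = conn.cursor()
    batch = []
    for row in rows:
        batch.append(row)
        if len(batch) >= BATCH_SIZE:
            cursor.executemany(
                "INSERT OR REPLACE INTO systems VALUES (?, ?, ?, ?, ?)", batch)
            conn.commit()  # flush each chunk so memory use stays bounded
            batch = []
    if batch:  # don't forget the final partial chunk
        cursor.executemany(
            "INSERT OR REPLACE INTO systems VALUES (?, ?, ?, ?, ?)", batch)
        conn.commit()
    conn.close()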

Edit: Just realised that the -b flag was only added a week or two ago, so you'll need quite a recent version for it to work.
 

Thanks for responding! I'll...try to figure out how to try that out. So far I've been launching the scripts by double clicking on them. /noob
 
Update:

The EDSM systems database appears to copy over correctly, but the populated systems import jams up the CPU for about half an hour, then proceeds to download for a short time, then jams up the CPU again. The process also appears to be using only a single core (surprise).

[Screenshot: the import process pegging a single CPU core]


And then after 2ish hours of running, errors and a crash (from remote host termination?)
 
Yowch, that's pretty crazy... The EDDB systems data should be taking all of about 5-10 seconds, not hours!

So far I've only managed to reproduce this on Python 2.x, just looking into what's going on now... It looks like it's taking ages to do updates to the relevant tables.
I might also make the "-b" flag the default, because the size of the EDSM data is really starting to get a bit silly. :p
 
OK, I've pushed a change which appears to fix the problem on my machine at least (with Python 2.7 on Windows).
It looks like certain database indexes (which speed things up rather a lot, important on tables with 2.2M entries!) were being ignored, resulting in the process being really sloooow.
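(For anyone wondering why that hurts so much: without an index, every UPDATE has to scan the whole table, so a bulk update of a 2.2M-row table is effectively quadratic. A toy sqlite3 demonstration - the table and column names here are made up, not EDTS's actual schema:)
Code:
import sqlite3

conn = sqlite3.connect(":memory:")
cursor = conn.cursor()
cursor.execute("CREATE TABLE systems (edsm_id INTEGER, name TEXT)")

# No index: the planner has to scan every row to find the match.
print(cursor.execute(
    "EXPLAIN QUERY PLAN UPDATE systems SET name = ? WHERE edsm_id = ?",
    ("Sol", 27)).fetchall())  # detail reads something like "SCAN systems"

cursor.execute("CREATE INDEX idx_systems_edsm_id ON systems (edsm_id)")

# With the index: a B-tree lookup instead of a scan.
print(cursor.execute(
    "EXPLAIN QUERY PLAN UPDATE systems SET name = ? WHERE edsm_id = ?",
    ("Sol", 27)).fetchall())  # "SEARCH systems USING INDEX idx_systems_edsm_id"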

Let me know if that solves your problem Bruski... If you want slightly more realtime help, you're welcome to join the EDCD Discord server which has a channel on it just for EDTS. :)

Edit: Should mention, I haven't made "-b" the default just yet, so you'll still want to specify it for now.
 
Sweet, it seems to have loaded everything up fine 'n fast! Now I just have to figure out how to make it do things. :D
 
EDTS is now able to work directly with the visited stars cache. You'll need the develop branch.
Code:
$ git fetch origin develop
$ git checkout -b develop origin/develop
You'll also need to grab the latest EDSM data dump.
Code:
$ python update.py
Now we can read the star cache and extract the system names known to EDSM.

Suppose you took a copy of the star cache and placed it in the same directory as your EDTS checkout:
Code:
$ python vsc.py read VisitedStarsCache.dat
Bento
Lave
NLTT 48288
Acokwech
HIP 42196
Ormt
Pontus
Morten-Marte
LHS 3479
Chukchitan
Warwal
LTT 2581
Zaonce
Brani
LHS 3447
Eravate
LTT 16242
Anouphis
Chimba
Lazdones
Aryak
Lie Zhi
Yembo
8055177614042
11666607777217
9466242147777
58156541430568
3932143227618
3721329117555
40550396742488
58147951496008
Note how systems which don't have their internal ID mapped appear in the list as the raw ID.
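(The fallback is easy to picture - roughly the following, though this is illustration only, not EDTS's actual lookup code, and the mapping data is a made-up stand-in:)
Code:
# Tiny stand-in for EDTS's ID-to-name mapping - illustration only.
id64_to_name = {
    10477373803: "Sol",  # hand-authored systems need an explicit entry
}

def display_name(id64):
    # Known IDs resolve to a name; unknown ones fall back to the raw ID.
    return id64_to_name.get(id64, str(id64))

print(display_name(10477373803))    # Sol
print(display_name(8055177614042))  # 8055177614042 - no mapping known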

We can also construct a star cache file. The most obvious use case is to generate a star cache with all the stars known to EDSM. It's very easy to do:
Code:
$ python vsc.py write VisitedStarsCache.dat
The operation is quite slow - it takes about an hour on my machine - but is significantly faster than feeding a giant text file of system names to the game in an ImportStars.txt file.

The results are quite spectacular:
[Screenshot: the galaxy map with every EDSM-known system showing as visited]


There is an important caveat, however. Although procedurally-generated systems' internal IDs can be computed from their names, hand-authored systems' IDs can't be programmatically determined. EDTS has a fairly large database of known systems, but it is incomplete. You will see output such as the following when writing the cache, and affected systems will show up as unvisited.
Code:
[12:27:32.251] [starcache ] [  ERROR] HIP 39665 has no id64!
Currently there is no fix other than updating id64data.py with the output of a star import - or waiting, likely for quite a while, for Alot or myself to do it - then rerunning the update and export.

EDTS can help you map known systems to IDs, although it is still a slow and tedious process. So long and tedious, in fact, that I'd be surprised if anyone actually wanted to do it, so I'll just dump it under this spoiler tag.
Suppose you have a text file unknown.txt containing a list of systems whose IDs are not known. Suppose further that your visited stars cache file lives in %LOCALAPPDATA%\Frontier Developments\Elite Dangerous\1234. You can then run this command:
Code:
$ python vsc.py batch -d "%LOCALAPPDATA%\Frontier Developments\Elite Dangerous\1234" unknown.txt output.txt
EDTS will create one or more files called ImportStarsFull.txt, ImportStars0.txt, ImportStars1.txt etc. It will then in turn copy each one as ImportStars.txt to the directory specified by -d and wait for them to be converted into a visited stars cache.

You need to exit to main menu then load back into any game mode to trigger the conversion!

After all the conversions are done you will be left with files called VisitedStarsCacheFull.dat, VisitedStarsCache0.dat etc. EDTS will process all of them together and write output.txt which will contain Python code you can merge into id64data.py - and preferably submit back to Alot!

Or, if there are duplicate system names in the list, it will blow up with an error message.
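If you're curious what the batch mode actually does, it's roughly the following loop - a simplified sketch of the workflow described above, not the real vsc.py code, and the polling details are invented:
Code:
import os
import shutil
import time

def convert_in_batches(game_dir, import_files):
    # import_files: e.g. ["ImportStarsFull.txt", "ImportStars0.txt", ...]
    cache = os.path.join(game_dir, "VisitedStarsCache.dat")
    for import_file in import_files:
        shutil.copy(import_file, os.path.join(game_dir, "ImportStars.txt"))
        before = os.path.getmtime(cache) if os.path.exists(cache) else 0
        # The game only performs the conversion when you load into a game
        # mode, so poll until a freshly-written cache file appears.
        while not os.path.exists(cache) or os.path.getmtime(cache) <= before:
            time.sleep(5)
        # Stash the result, e.g. ImportStars0.txt -> VisitedStarsCache0.dat
        saved = import_file.replace("ImportStars", "VisitedStarsCache")
        shutil.copy(cache, saved.replace(".txt", ".dat"))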
You can also write a cache file which contains only the systems matching an EDTS filter. Just include the filter as an extra argument to vsc.py write. For instance, to include only systems within 15 Ly of Sol, do:
Code:
$ python vsc.py write CloseSystems.dat close_to=Sol,distance<15
Other than testing that the functionality works without having to sit for an hour waiting to dump All The Things, though, I'm not convinced there's much point in doing that.
 