Or a more transparent copy of the source publicly available somewhere. The project wasn't short of people willing to pick up the hosting, just short on the code itself.
I like the idea of the EDDN listener (which I too missed in your OP), though perhaps less so the idea of running my PC 24/7... Soo...
Feature request - have the EDDN listener (optionally) dump to a file, with entries older than 24 hours automatically trimmed (to save space and not duplicate EDDB).
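Something along these lines is what I'm picturing - a very rough Python sketch only, assuming the pyzmq package and the public EDDN relay at tcp://eddn.edcd.io:9500 (EDDN messages are zlib-compressed JSON); the output file name, trim interval, and the "received" field are just placeholders:

```python
# Rough sketch: subscribe to EDDN, append commodity messages to a file,
# and periodically trim anything older than 24 hours.
import json
import time
import zlib

import zmq

RELAY = "tcp://eddn.edcd.io:9500"
DUMP_FILE = "eddn_last24h.jsonl"   # placeholder output file
MAX_AGE = 24 * 60 * 60             # keep 24 hours of entries

def trim(path, max_age=MAX_AGE):
    """Drop any dumped entries older than max_age seconds."""
    cutoff = time.time() - max_age
    try:
        with open(path) as fh:
            lines = fh.readlines()
    except FileNotFoundError:
        return
    with open(path, "w") as fh:
        fh.writelines(l for l in lines if json.loads(l)["received"] >= cutoff)

def main():
    ctx = zmq.Context()
    sub = ctx.socket(zmq.SUB)
    sub.setsockopt_string(zmq.SUBSCRIBE, "")
    sub.connect(RELAY)
    last_trim = time.time()
    while True:
        raw = sub.recv()
        msg = json.loads(zlib.decompress(raw))
        # Only the commodity schema matters for TD's price data.
        if "commodity" in msg.get("$schemaRef", ""):
            entry = {"received": time.time(), "message": msg}
            with open(DUMP_FILE, "a") as fh:
                fh.write(json.dumps(entry) + "\n")
        if time.time() - last_trim > 600:   # trim every 10 minutes
            trim(DUMP_FILE)
            last_trim = time.time()

if __name__ == "__main__":
    main()
```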
That could then either run on a PC the way you're thinking, or sit on a server as a downloadable if someone can be bothered (potentially me, unless my former boss gets stressed about bandwidth usage).
This gives us the best of both worlds: live data if you want it, a fallback when any given website is broken, plus something approaching maddavo's old 2d-prices.
Presumably this is all platform-independent Python, so it runs fine on Linux/FreeBSD/*nix as well as M$.
Again, maybe I could, depending on bandwidth usage. Not sure how many people use TD - probably not many, so that doesn't scare me, but the thought of being used as a generic EDDB mirror by the community at large might get me in trouble. Keeping a "last 24 hours" file is less worrying, as that would likely only be useful to the TD people.
First off, keep in mind that you wouldn't NEED to keep your PC running 24/7. Not running it 24/7 means some data might not get updated until the next EDDB dump, but even without running the listener at all, you're still guaranteed to have the data within 24 hours of it being uploaded. The only difference running the listener makes is that without it, you'll have to grab the latest dump from EDDB to have all the data. (That is, with the listener running, the first EDDB dump that occurs AFTER the listener starts is the last dump you'll need to get, for as long as the listener keeps running.)
So if you go on vacation for a few weeks and decide to shut down the PC, it just means you'll have to pull a dump with the EDDBlink plugin to bring your local data up to date, then restart the listener.
As far as a server listener, that's not a bad idea. My plan is to make the client-side listener write directly to the TD database, so once I have that coded, all that would need to happen is to have an instance of TD on the server, with the EDDBlink plugin, and an instance of the listener. Then it would be:
1) Run EDDBlink with '-O clean'.
2) Start the listener.
3) Run EDDBlink with '-O all' once the dumps are updated, to get the data between the last dump and the listener start.
4) Trigger TD to re-export all the DB files on a timed basis, like maybe every 2 hours or so, and provide them for download.
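In script form the server side would look something like this - purely a sketch, and the exact trade.py import/export invocations and the listener script name are assumptions on my part:

```python
# Rough outline of the server-side sequence using subprocess calls to trade.py.
import subprocess
import time

TRADE = ["python3", "trade.py"]
EXPORT_INTERVAL = 2 * 60 * 60   # re-export roughly every 2 hours

def td(*args):
    subprocess.run(TRADE + list(args), check=True)

# 1) Wipe and rebuild the local database from the EDDB dumps.
td("import", "--plug=eddblink", "-O", "clean")

# 2) Start the EDDN listener (hypothetical script name).
listener = subprocess.Popen(["python3", "eddn_listener.py"])

# 3) Once the next EDDB dump is out, pull it to cover the gap between
#    the last dump and the moment the listener started.
td("import", "--plug=eddblink", "-O", "all")

# 4) Periodically re-export the .csv/.prices files for download.
while listener.poll() is None:
    td("export")   # assumed TD export subcommand
    # (copy the exported files into the web server's download directory here)
    time.sleep(EXPORT_INTERVAL)
```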
From that point, it'd be a simple matter to write some code that downloads the files from the server and triggers a re-cache in TD. Providing all the files (all the .csv's and the .prices file) comes out to about 993MB. Not all the files need to be updated all the time, however. The .prices file (319MB on my PC right now) would need to be updated pretty constantly, but the other ones wouldn't need updating very often.
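Something like this would probably do for the client side - again just a sketch, with a placeholder server URL and file list, and the buildcache call at the end is an assumption for whatever the actual re-cache step ends up being:

```python
# Fetch each exported file from the server only if the server copy is newer
# than the local one, then kick TD to rebuild its cache.
import os
import subprocess
import urllib.error
import urllib.request
from email.utils import formatdate

SERVER = "http://example.invalid/td/"   # placeholder mirror URL
FILES = ["TradeDangerous.prices", "Station.csv", "System.csv"]  # partial list

def fetch_if_newer(name):
    """Download name from the server unless the local copy is already current."""
    req = urllib.request.Request(SERVER + name)
    if os.path.exists(name):
        # Ask the server to answer 304 if its copy isn't newer than ours.
        req.add_header("If-Modified-Since",
                       formatdate(os.path.getmtime(name), usegmt=True))
    try:
        with urllib.request.urlopen(req) as resp, open(name, "wb") as out:
            out.write(resp.read())
        return True
    except urllib.error.HTTPError as err:
        if err.code == 304:        # not modified, keep the local copy
            return False
        raise

if any([fetch_if_newer(f) for f in FILES]):
    # Assumed re-cache step; swap in whatever TD command rebuilds the cache.
    subprocess.run(["python3", "trade.py", "buildcache", "-f"], check=True)
```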
And of course, users would still have the option of dumping from EDDB (or an EDDB mirror if/when that exists) using the plugin and running the listener locally if they don't want to depend on a server to get the data - and/or if they want to make sure they have the latest data, as the server's data would still be a bit behind (up to 2 hours, or whatever interval the export trigger is set to).
As far as bandwidth goes, with regard to mirroring the EDDB dumps: if only TD users with my plugin are downloading them from the mirror, that works out to at most one download per user per file per dump, as the plugin checks the timestamp and only downloads if the server copy is newer than the local copy. I don't think anyone outside of TD would really be using it, for two reasons: the other trading programs have their own mirror if they aren't downloading straight from EDDB, and there isn't much use for any of the files outside of a program that processes the data from them, so random CMDRs won't be downloading it all willy-nilly.
Going off of the latest dump as of this writing, the total size of all the files needed ('commodities.json', 'listings.csv', 'modules.json', 'stations.jsonl', and 'systems_populated.jsonl') is about 281MB. So it'd be about 0.3GB per user per day, and you can safely ignore users who decide to run their own listen server.
Based on my math, it looks like it'd actually be less bandwidth to host a mirror - the .prices file on its own is larger than all the dump files together... Hmmm. Maybe export it back out into a listings.csv instead? I don't know.
I feel like we're getting ahead of ourselves. Let's test out this plugin and make sure it works correctly, get it added to TD itself so you don't have to find the link to my Google Drive folder, then see about getting the EDDN listener up and running correctly, and once all that is taken care of, we can revisit all this.