Updates can still be uploaded as before via the upload form and the uploaddata.asp page. It may be better to migrate user price updates to EDDN so that non-TD users also get the benefit of them. Most people seem to use EliteOCR, which I believe has an EDDN upload/sending function, so if you use EliteOCR you can send via that. If you update prices via the GUI or a text editor, you can still upload as usual, but this is where an EDDN sender would probably be useful.
I've also asked in the EDDN group whether it would be beneficial to rebroadcast old data, so that all the prices (from, say, the last 24 or 48 hours) can be captured by a listener over a period (say, 3 hours).
Well, I was actually thinking it might be useful for you to act as a hub in both directions: when you process a .prices file, transmit the entries you received via uploads out to EDDN. I think that jibes well with your transmitting old updates periodically.
The problem is that the data, especially in their one-item-per-message format, gets ridiculously large.
https://www.dropbox.com/personal/Public/ed/eddntransform (the scripts are included in the directory)
Code:
osmith@WOTSIT ~/Dropbox/public/ed/eddntransform
$ ls -1sh
711k mad.2d.prices
4.8M mad.all.prices
2.9M eddn.2d.json
20M eddn.all.json
551k batched.2d.json
3.7M batched.all.json
...
I'm guessing you culled the data recently.
The gist is: ~700kB of .prices data becomes 2.9MB of EDDN data, and ~5MB of .prices data becomes 20MB of EDDN data.
Alternatively, using a terse, batched form, 700kB of price data shrinks to roughly 550kB, and the savings scale up to the full dump.
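To illustrate where the bloat comes from, here's a quick Python sketch (the field names and values are hypothetical, not the actual EDDN schema): the per-item form repeats the full envelope for every row, while a batched form sends the envelope and station once and the items as terse arrays.

```python
import json

# Hypothetical price rows: (system, station, item, buy, sell)
rows = [
    ("Eranin", "Azeban City", "Gold", 9012, 9401),
    ("Eranin", "Azeban City", "Silver", 4110, 4350),
    ("Eranin", "Azeban City", "Palladium", 12800, 13105),
]

# One-item-per-message style: every row carries its own full envelope.
per_item = [
    json.dumps({
        "header": {"uploaderID": "abc123", "softwareName": "demo"},
        "message": {"systemName": s, "stationName": st,
                    "itemName": item, "buyPrice": buy, "sellPrice": sell},
    })
    for (s, st, item, buy, sell) in rows
]

# Batched, terse style: envelope and station names sent once.
batched = json.dumps({
    "header": {"uploaderID": "abc123", "softwareName": "demo"},
    "system": "Eranin",
    "station": "Azeban City",
    "items": [[item, buy, sell] for (_, _, item, buy, sell) in rows],
})

print(sum(len(m) for m in per_item), "bytes as per-item messages")
print(len(batched), "bytes batched")
```

Even at three rows the batched form is smaller, and the gap widens as rows per station grow, since the fixed envelope cost is amortized.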
I realize this doesn't matter much to your service, but it's going to matter a lot to the EDDN network as the data grows. If you send 20MB of historical updates to EDDN and EDDN has 500 subscribers, it is going to have to buffer and transmit, assuming no retransmissions, roughly 26MB (megabytes, including framing overhead) of data per subscriber, for a grand total of over 12GB of data.
If you have a 10Mb/s (note the lower-case b: megabits) internet connection and can get full download speed, it'll take you about 20 seconds to download that. But most users are going to be getting under a Mb/s on the download, due to the way ZeroMQ's fan-out works, which means they can anticipate your update clogging up the EDDN feed for them for three and a half minutes... This, in itself, will cause queueing issues that will draw it out to five minutes.
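The arithmetic behind those figures, spelled out (assuming ~26MB per subscriber once overhead is included):

```python
# Back-of-envelope fan-out cost, using the figures from the post.
payload_mb = 26          # MB sent to each subscriber (~20MB data + overhead)
subscribers = 500

total_gb = payload_mb * subscribers / 1024
print(f"total transmitted: ~{total_gb:.1f}GB")      # ~12.7GB

# Download time per subscriber at a given link speed (note bytes -> bits).
link_mbit = 10                                      # 10Mb/s, megabits
seconds = payload_mb * 8 / link_mbit
print(f"at {link_mbit}Mb/s: ~{seconds:.0f}s")       # ~21s at full speed

link_mbit = 1                                       # "under a Mb/s"
minutes = payload_mb * 8 / link_mbit / 60
print(f"at {link_mbit}Mb/s: ~{minutes:.1f} min")    # ~3.5 min
```

The key factor of 8 (bytes to bits) is what turns a seemingly small 26MB payload into multiple minutes on a sub-megabit effective link.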
This means a really good chance, at internet scale, of lost data: individual price updates simply going missing.
And this is with a fairly tiny portion of the full ED dataset and a small number of listeners. Heck, the data I just tested this with was small relative to the data from your site yesterday.
Of course, if EDDN mostly becomes a mechanism for developers to exchange prices, and most users actually get their data from sites like your own, then that dials back the scale some, but it's still kind of insane going forward, IMSE. We have 110k prices in the database; in a few months we'll probably have passed a million. The sheer overhead of the current EDDN JSON format is an order of magnitude more than the actual data.
It isn't going to scale.
-- EDIT --
Also, be aware that ZeroMQ's "pub/sub" model of broadcasting isn't "free": the sender is actually having to send an individual copy of the data to each subscriber, and furthermore ZeroMQ's pub/sub uses receiver-side filtering... so clients can opt out of seeing your update, but they still have to download it.
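To make the receiver-side-filtering point concrete, here's a toy plain-Python model (no actual ZeroMQ, just the behaviour it implies): the publisher pushes a full copy to every subscriber regardless of topic, and each subscriber discards non-matching messages only *after* receiving them.

```python
class TinyPubSub:
    """Toy model of ZeroMQ-style pub/sub with receiver-side filtering."""

    def __init__(self):
        self.queues = {}    # subscriber name -> delivered (topic, payload)s
        self.filters = {}   # subscriber name -> topic prefix filter

    def subscribe(self, name, prefix):
        self.queues[name] = []
        self.filters[name] = prefix

    def publish(self, topic, payload):
        # Every subscriber gets a full copy, whether it matches or not.
        for queue in self.queues.values():
            queue.append((topic, payload))

    def recv(self, name):
        # Filtering happens HERE, after the bytes were already delivered.
        delivered = self.queues[name]
        kept = [p for (t, p) in delivered
                if t.startswith(self.filters[name])]
        return len(delivered), kept

bus = TinyPubSub()
bus.subscribe("alice", "prices.")   # only wants price messages
bus.subscribe("bob", "")            # wants everything

bus.publish("prices.Eranin", "gold 9012")
bus.publish("chat.general", "hello")

received, kept = bus.recv("alice")
print(received, len(kept))   # 2 1 -- alice downloaded both, kept one
```

So a filter saves the subscriber processing time, not bandwidth, which is exactly why a 20MB historical rebroadcast hits every listener's pipe.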