Release EDDN - Elite Dangerous Data Network. Trading tools sharing info in a unified way.

Two options come to mind:
1. If it really hangs on recv(), it's a ZMQ-related issue and you can try another/newer version. With a fixed timeout set, recv() should return FALSE even when there are network issues, etc.
2. Your script hangs/breaks somewhere else and it slipped your attention. I can only recommend tracking down the problematic place using log file(s). So dump to a file exactly what is returned from recv(), exactly what is returned from uncompress, etc. You can also set more "sensitive" error logging in PHP and check your error logs (you should also notice script timeouts there).
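For illustration, a minimal sketch of that kind of logging, assuming the usual subscriber setup from this thread ($subscriber already connected); the log path is just an example:

Code:
// Minimal logging sketch - $subscriber is the connected SUB socket from the earlier examples,
// and the log path is just an example.
$log = fopen('/tmp/eddn-listener.log', 'a');

$message = $subscriber->recv();
fwrite($log, date('c') . ' recv: ' . var_export($message === false ? false : strlen($message), true) . "\n");

if ($message !== false)
{
	$market_json = gzuncompress($message);
	fwrite($log, date('c') . ' gzuncompress: ' . var_export($market_json === false ? false : strlen($market_json), true) . "\n");
}
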
Thanks for your input, I think I may have solved the issue. It's now been running for nearly 3 hours without stopping. I thought I'd try replacing the ZMQ PHP lib files, in case they were old or the wrong ones. So hopefully that was all it was... although knowing my luck it will probably stop in a minute (because I've jinxed it now).

If this is the case, I've just got the problem of keeping a listener running. The only way I can do it is by leaving my own PC on (which isn't ideal). What I need to do is have a listener forward the JSON on to a particular address on my site (as the site should be running 24/7). However, even just a few hours of EDDN data is a big improvement. Thank you guys for setting it up, it's a lot better than what I had before.

[update 1] Well, unfortunately it stopped after 3 hours. Looking at the logs, it stopped working after 5 minutes of inactivity, so it can't be to do with the script run time. I'm thinking it's something to do with reconnection, so I added these two:

Code:
$subscriber->setSockOpt(ZMQ::SOCKOPT_RECONNECT_IVL, 300);
$subscriber->setSockOpt(ZMQ::SOCKOPT_RECONNECT_IVL_MAX, 300);

Although I couldn't understand what's meant by 'initial reconnection' and 'max reconnection'... I'm guessing that's in seconds? Hopefully it's not ms. The documentation is near useless.

[update 2] Still disconnected after 5 minutes of inactivity.
 

Hi,

You can have a look at my eddb implementation: https://gist.github.com/themroc5/99ac8973db4732692a88. It's based on the Yii2 framework, but the idea should be clear.
I had "silent inactivity" too sometimes in the beginning, so I added a 10-minute silent timeout to it. This script has been working perfectly for many months now. I'm using libzmq v4.0.5.
Generally all I do is listen and dump the JSON into a MySQL DB. A second script does the parsing job later (the process call). That's good for decoupling the code. The latter can break and I still don't lose any data. The only time this went down was the OOM killer killing MySQL :/
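
For anyone not using Yii2, a rough sketch of the same listen-and-dump idea in plain PHP/PDO (database credentials, table and column names here are made up, and $subscriber is assumed to be a connected SUB socket as in the earlier posts):

Code:
// Rough sketch of the listen-and-dump idea, not the Yii2 gist itself.
// Database credentials, table and column names are made up for illustration.
$db     = new PDO('mysql:host=localhost;dbname=eddn', 'user', 'pass');
$insert = $db->prepare('INSERT INTO eddn_raw (received_at, payload) VALUES (NOW(), ?)');

while (true)
{
	$message = $subscriber->recv();   // returns false after SOCKOPT_RCVTIMEO expires

	if ($message === false)
	{
		// silent timeout - disconnect/reconnect here
		continue;
	}

	// store the raw JSON only; a separate script parses it later,
	// so a parser bug never loses incoming data
	$insert->execute(array(gzuncompress($message)));
}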

I hope it helps. Hit me up if you have more questions.
 
Hi there, thanks for the information.

It looked like you used a disconnect function in your version, although I didn't see that command in the PHP documentation (plus you gave it a value, which I gather is required). I thought I would have to re-create the connection. Hopefully it's okay. Thanks for your help guys.

Btw, when looking at the logs, it was around 5 minutes of inactivity before disconnection, not 10. So you could be missing a few posts.
 

It only reconnects if no message was received within 10 minutes. You can configure it that way:

Code:
$socket->setSockOpt(ZMQ::SOCKOPT_RCVTIMEO, 600000);

You should not reconnect if the stream is actually working.
 

A big thanks for the help. With your suggestion I have changed my code to the following (I also removed the code posted above, as it's old and may mislead):

Code:
$eddn_address	= "tcp://eddn-gateway.elite-markets.net:9500";
$edd_timeout	= 600000;

// ===================================================
// Send an EDDN, JSON to site for processing
// ===================================================

$context 	= new ZMQContext();

$subscriber = $context->getSocket(ZMQ::SOCKET_SUB);

// Disable filtering.
$subscriber->setSockOpt(ZMQ::SOCKOPT_SUBSCRIBE, "");

// set timeout
$subscriber->setSockOpt(ZMQ::SOCKOPT_RCVTIMEO, $edd_timeout);

// connect
$subscriber->connect($eddn_address);
		
// --------------------------------------
// ZMQ connection
// --------------------------------------

print "\n";

while (true)
{
	$message	= $subscriber->recv();
	
	// check for timeout
	if ($message === false)
	{
		print ".. reconnecting\n";
		
		// disconnect, then reconnect after 3 seconds
		$subscriber->disconnect($eddn_address);
		sleep(3);
		$subscriber->connect($eddn_address);
	}
	else
	{
		// Receive raw market JSON strings.
		$market_json 		= gzuncompress($message);

		// Un-serialize the JSON data to a named array.
		$json_data			= json_decode($market_json, true);
		
		if ($json_data != null)
		{
			// ... forward the JSON on to the site here (the rest of this block was cut off in the original post)
		}
	}
}
Thanks guys, this issue has been causing me a lot of problems - it's been hard to test (waiting for inactivity on EDDN). I noticed you're using the relay, and I'm using the gateway. What's the difference?
 
Roguey, I have a similar problem running the python EDDN listener (written by AnthorNet) when I run my network in a NAT setup with TCP port 9500 forwarded. However, if I run my network in normal mode the listener works fine.

The common theme with your problem is that my listener also received just fine for a bit over an hour, then it stopped, which was fixed by restarting it.

Are you running your receiving computer behind NAT by any chance?
 

Hi there, well it sounds like a similar problem. Over the weekend the listener worked for 3 hours before stopping. After much tracing/testing, I found out it was to do with automatic disconnection (after a set amount of time of inactivity), i.e. if nothing is received on port 9500 for a while. If that happens, then disconnect and reconnect. After restarting the program it would work again.
 


Hello,
That's a very odd bug!
I never could reproduce it when Snake Man told me about it.
All my tests reconnect fine, even when I shut down my network.

You can find some working examples in our GitHub:
Here: https://github.com/jamesremuscat/EDDN/tree/master/examples/PHP

The re-connection helps with slow clients by restarting the connection when you receive no messages for a certain amount of time.
The only place I can think this could come from is one of the libraries used by ZMQ.
 
I noticed you're using the relay, and I'm using the gateway. What's the difference?

At the moment, what you're doing only works because the gateway and the relay happen to be the same machine on the same IP address ;-)

Uploaders should connect via the gateway (eddn-gateway.elite-markets.net:8080)
Subscribers should connect via a relay (eddn-relay.elite-markets.net:9500)

In the future, there might be different relays depending on message type (so you can subscribe to only the messages your application needs), or we might separate the relay and gateway servers... so it's best to use the hostnames above.
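
In other words, a subscriber setup would look something like this minimal sketch (same options as the code earlier in the thread, just pointed at the relay hostname):

Code:
// Subscribers connect to the relay; uploads go to the gateway instead.
$context    = new ZMQContext();
$subscriber = $context->getSocket(ZMQ::SOCKET_SUB);
$subscriber->setSockOpt(ZMQ::SOCKOPT_SUBSCRIBE, "");
$subscriber->connect("tcp://eddn-relay.elite-markets.net:9500");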
 
The EDDN is currently unavailable: our hosting provider appears to have dropped off the Internet. I'm poking their support team...

EDIT: No sooner do I post that than they reappear... all is good again.
 
A while back I posted details of a tool that stores the incoming commodities data from EDDN in DynamoDB (which, in a characteristically unimaginative turn, I am now officially naming EDDN on DynamoDB ;) ). Previously, commanders wanting access to this had to ask me for an AWS access key & secret, which had the plus side of using ready-made AWS SDKs and the downside of me handling user access & that barrier to entry.

This has been expanded to a public API, now in beta (giving me an excuse to play with AWS API Gateway), as well as proper load-balanced listeners to the EDDN firehose.

Beta public API access can be found at: https://43h3di62h7.execute-api.eu-w...modities/YYYY-MM-DD?from=1234&to=4567&limit=1 (and since it's in beta, the URL is messy, there are no nice API docs etc. as yet)

  • Day should be in YYYY-MM-DD (all times & days reflect original EDDN gatewayTimestamp)
  • from (optional) should be epoch time in microseconds (inclusive)
  • to (optional) should be epoch time in microseconds (inclusive)
  • limit (optional) should be an integer
Assuming a 200 OK is returned, returned data is a JSON object with three properties:
  • Count - number of items returned
  • Items - array of raw EDDN JSON strings
  • LastEvaluatedTimestamp - optional timestamp indicating the last key returned if more data is available
Thus a query for 2015-01-01 will return 165 items with no LastEvaluatedTimestamp, as there were only 165 entries on that day. A similar query for 2015-08-01 will return 155 items with LastEvaluatedTimestamp set to 1438397278805941, indicating that there are more entries available for that day if one sets 'from' to 1438397278805942.
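
As a rough sketch of how a consumer might page through a day using 'from' and LastEvaluatedTimestamp (the base URL below is a placeholder, since the full endpoint is truncated above):

Code:
// Paging sketch - the base URL is a placeholder; substitute the actual API Gateway endpoint.
$base = 'https://<api-id>.execute-api.<region>.amazonaws.com/beta/commodities';
$day  = '2015-08-01';
$from = null;

do
{
	// 'from' is inclusive, so continue from the last returned timestamp + 1
	$url  = $base . '/' . $day . ($from !== null ? '?from=' . ($from + 1) : '');
	$page = json_decode(file_get_contents($url), true);

	foreach ($page['Items'] as $item)
	{
		$entry = json_decode($item, true);   // each item is a raw EDDN JSON string
		// ... process $entry ...
	}

	// LastEvaluatedTimestamp is only present when more data is available for the day
	$from = isset($page['LastEvaluatedTimestamp']) ? $page['LastEvaluatedTimestamp'] : null;
}
while ($from !== null);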
 
I only see commodity schemas in there. Are you storing shipyard messages?

Not at present; I've not had time to review the new additions, but it should, so I'll see what I can do!

Edit: A first draft of shipyards should be up now (same API structure, swap 'commodities' for 'shipyards' e.g. what has been captured today)
 

I really like this effort, and I am adding it to my soon-to-be-announced desktop trading tool.

However, can I suggest that the resulting JSON also contain the YYYY-MM-DD string?

This makes it possible to request more data based solely on the response received, making it easier to handle the reply in frameworks using callbacks (such as Qt).
It can of course be deduced from LastEvaluatedTimestamp, but that seems a bit like a kludge, and it will not significantly impact the size of the response.
 

wolverine2710

Tutorial & Guide Writer
@Askarr. Very nice to see it's now a publicly available API. Congrats.
Note: I like your changes to your entry in EDCodex.
 
I really like this effort, and I am adding it to my soon-to-be-announced desktop trading tool.
Glad you like it! Just to confirm expectations, EDDN on DynamoDB is a pure archive of EDDN - there's nothing in the way of filtering, cleanup, spelling correction of incorrectly OCRed data etc. It's primarily intended for catching up missing data where EDDN subscribers have failed, or bulk one-off queries for data analysis.

If you intend to connect a desktop tool to it, can I please request that your tool caches its query results - there's not a huge amount of processing power currently allocated behind the API (mainly due to evaluating costs; sadly I'm not Twitter :) ), and thousands of commanders all hitting it could easily cause a degraded experience (I'm interested to know your intended query rate & likely number of users). A lazy fetch of data slowly over time is going to be much more palatable.
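
Something as simple as a day-keyed cache would do the trick; a hypothetical sketch (function name and file layout are made up), relying on the fact that completed days never change:

Code:
// Hypothetical day-keyed cache - function name and file layout are made up.
// Completed (past) days never change, so they can be cached indefinitely;
// the current day still gets re-fetched.
function fetch_day_cached($base, $day)
{
	$cache = __DIR__ . '/cache/' . $day . '.json';

	if (file_exists($cache) && $day < date('Y-m-d'))
	{
		return json_decode(file_get_contents($cache), true);
	}

	$data = file_get_contents($base . '/' . $day);
	file_put_contents($cache, $data);

	return json_decode($data, true);
}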

Of course, if you really want absolutely up-to-the-second data, your tool could subscribe to the EDDN firehose directly. There is nothing in principle stopping a desktop tool connecting straight to the 0MQ publisher. That said, jamesremuscat or others might weigh in and point out that this could also head towards performance problems with that many potential subscribers.
However, can I suggest that the resulting JSON also contains the YYYY-MM-DD string?
Sure, added that in.

Edit: Actually, thinking about this more, I guess I'd also ask what primary use-case you had in mind for your tool's use of EDDN. Data purely by time may or may not be the most useful or efficient representation. Do you need all the data or is it a subset you're after?

If it's complex, send me a PM so that we don't clutter the thread up too much.
 

Thanks for the quick YYYY-MM-DD.

To answer your other questions: my application currently has a user count of exactly one (me), but I plan to release it as an open-source application real soon now.
I am currently doing the initial data import from the EDDB.io archives, and I am listening to the EDDN firehose as well, so the queries to EDDNoDDB will be to fill the "holes" in the history from when the application is not running.
I am of course saving all the data to disk between sessions. I am currently working on heuristics to keep the local database fairly up to date without thrashing the upstream providers too much.

My current idea is to make the application do a full reimport from EDDB.io every week (to get systems and faction changes), and do intermediate updates as needed from EDDNoDDB. Does this sound okay?

I guess this is as good a time as any to drop a link to my Imgur album with some screenshots.
 

Have you guys added equipment data to EDDN yet? It would be very useful to add, seeing as my site can deal with this information. Many thanks.
 

wolverine2710

Tutorial & Guide Writer

If you mean 'ship outfitting' information, as in what equipment you can buy at a station/outpost, then the answer is currently NO.

Otis B. has created a pull request for that: "Deploy outfitting schema #32", based on (enhancement) issue #20: Proposal: Schema for Outfitting. It is unknown to me why it's not in EDDN yet. I've been up to my eyeballs in work for EDCodex and haven't been able to follow EDDN carefully enough. I've put it on my TODO list NOW and am sending an email to James.

This might be a good time to have a look at the proposal by Otis B.
 