Release EDDN - Elite Dangerous Data Network. Trading tools sharing info in a unified way.

Still working on a relational database structure that would work efficiently here. Personally, I'd go for SQL Server and ASP.NET/MVC - just because that's what I'm familiar with. I'd initially expose an API using a 'classic' SOAP-based web service, all nice and bog-standard, and relatively simple and quick to implement. I'd then look to a more optimised API using JSON, but I'd need to research it a bit first.

Agree on your platform choice... Check out ServiceStack rather than bothering with SOAP :)
 
Agree on your platform choice... Check out ServiceStack rather than bothering with SOAP :)

Cheers for that, it looks interesting - I will probably have a little play with it on the weekend.

- - - - - Additional Content Posted / Auto Merge - - - - -

Please don't use SOAP. Use REST! See this: http://rest.elkstein.org/

Q. Why do I use SOAP?
A. All my client applications are developed using .NET; I just create a reference to the SOAP web service in my project and, hey presto, it's all there to use in my code - no fuss, no hassle, nice and easy.

Any API I do end up creating will have such a SOAP interface, but that's not to say I won't create an interface using REST as well :D
 

wolverine2710

Tutorial & Guide Writer
Hello commanders,

Just arrived in Germany, still full of adrenaline. A lot has been discussed this morning and there is a lot left to discuss. I have to go away for several hours, so I can't go into each aspect at the moment. My personal view: EDDN should be used for market prices - at least initially. For the static data we already have TGC - now in beta, but things move fast.

ZeroMQ is indeed just a fast and versatile transport layer, supported by 40+ languages. EMDN has proven itself. At first it was an ELK stack and data was stored in ElasticSearch; it was used by Andreas to showcase Kibana. Later it was just a firehose (pours out data) without any data stored. Marketdump sent a zlib-compressed JSON structure to EMDN using ZeroMQ, and every client could subscribe to this firehose. The BPC was a subscriber; emdn-tap of TD was a subscriber. Marketdump, which was bundled with the BPC, sent the data to EMDN; it came with a firehose.exe program (Python) so you could see what was going on. About every 2-3 seconds someone somewhere opened the commodities market, which was intercepted by Marketdump and sent to EMDN.

I DO see a valid argument for REST for uploading data to EDDN - the authentication aspect. It's also my belief that a firehose (ZeroMQ) is the way to go. NO data is stored in a database. That can be done later, in parallel with setting up EDDN; it's just a matter of storing the received data. A REST-based web API can then be created to retrieve data from it - perhaps data of the last day or so. We can look into that later. In the thread here I mentioned the github where Andreas's EMDN repo is. Perhaps it's best to have a POC with just ZeroMQ for uploading and also the firehose. I also think that JSON is the easiest way to upload data. Output in JSON seems the logical best, but XML/CSV could be done as well. I really think it's best to start database-less at first and expand upon that later. Concerning databases: suppose the implementor of EDDN stops for some reason. In that case an open source database would be best. Same for the tools used - let's try to do it with tools (they can be MS based) which do not need a license. For example, Slopey uses DevExpress (very expensive), which makes it impossible for him to release the code etc.

In NO way do I want to discourage commanders, but I think it's best to start simple. Let's DO discuss all things further - REST, SOAP, databases, stacks; it's all very useful. I'm just trying to give my personal view and to coordinate things. Please DO comment on what I just said. Again, thanks for all the input - keep it coming.

Have to go in 10 minutes so have to stop.
 
Last edited:
People seem to want to go with whatever they are comfortable coding for. Personally, I've seen/coded a lot of SQL and NoSQL implementations, and for this use case, a dead simple Node.js + MongoDB service is all you need. JSON/BSON wins. SOAP/ASP/SQL/postgres would be way too heavy for this.
 
I work at a hosting company and I would be willing to donate resources. If we can get specific information on the requirements, so I can get an idea of what's needed, that would be helpful - assuming a hosting location would be useful.


edited for grammar/clarity.
 
Last edited:

wolverine2710

Tutorial & Guide Writer
Okay, I actually just talked to my boss, who I recently got to pick up the beta himself, and he said that depending upon what is needed we (the company - Vivio Technologies) would be willing to donate it.

My (I mean ofc our) Xmas present comes early it seems ;-) BRILLIANT.
Just back, seeing black btw, but gonna bite the bullet; it seems I have some 3+ PMs waiting which I HAVE to read. Perhaps they are from Santa Claus. Ho Ho Ho ;-)
 
Last edited:
I'm with Wolverine on this. Don't think data history or databases. It should be an instant delivery mechanism. The tool that you put data into should send it to the intake of the firehose, and it should then be delivered to whoever listens to it.

Now hopefully some of the tools will save the data and make it available for everyone. Hopefully the most popular tools will also have enough logic behind them to check if the firehose is down, and to put the backed-up data back in once it's back up (see the sketch below).
And this is where TGC should get into market data as well, as a master copy of the latest snapshot.
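A minimal sketch of that buffer-and-replay idea in Python; the uploader function and spool-file name are hypothetical stand-ins, not part of any existing tool:

```python
import json
import os

BUFFER_FILE = "eddn_buffer.jsonl"  # hypothetical local spool file

def send_to_eddn(record):
    """Placeholder uploader; raises when the firehose is unreachable."""
    raise NotImplementedError

def submit(record):
    """Try to send a record; spool it locally if sending fails."""
    try:
        send_to_eddn(record)
    except Exception:
        with open(BUFFER_FILE, "a") as f:
            f.write(json.dumps(record) + "\n")

def replay_buffer():
    """Re-send spooled records once the firehose is reachable again."""
    if not os.path.exists(BUFFER_FILE):
        return
    with open(BUFFER_FILE) as f:
        pending = [json.loads(line) for line in f]
    os.remove(BUFFER_FILE)
    for record in pending:
        submit(record)  # failures get re-spooled by submit()
```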
 

wolverine2710

Tutorial & Guide Writer
I think I have to make things a bit clearer. My dream would be EDDN implemented as an ELK stack, with output on the firehose. An ELK stack means ElasticSearch and Kibana; the last one, as I described, is a very powerful visualisation tool. If possible, everybody should be able to get read-only access to the ElasticSearch database with Kibana on their own local machine (Kibana uses ElasticSearch). If that's not possible, then everyone could have an ELK stack on their own machine which somehow gets fed by data stored in A database. The database sitting on the EDDN machine makes the most sense.

For now, and to get things started, a simple EDDN based on a ZeroMQ publisher/subscriber pattern is the best way to begin. Once that is set up, clients can be made in multiple languages to send data to EDDN, and clients to receive data from the firehose. Then stuff like checks for the validity of data has to be implemented in EDDN, etc. A big concern is poisoning the well. For that, a client which uploads data to EDDN via REST would be great: it would enable things like OAuth etc., which could/would be a deterrent.

I've so much more to tell/ask you, but things went much faster than I thought/imagined - much faster than other projects. If I had known, I would have created my thread after Tuesday next week, because I won't be able to have much internet access time. Going away. I will try my best to login...

The POC which will hopefully be set up is just a POC. After that, REST, databases etc. have to be discussed. I value each and every input I have received so far, and the offer received for hosting is absolutely, totally great.
 
Last edited:
All of the analysis stuff can be done at a higher layer. I thought the point was to pool data for redistribution?

Issue API key tokens with email confirmation for the devs; accept only data from them (and their clients) with minimal validation; implement a ban list; add a reliability variable to reflect poisoning attempts; keep 7 days of data (not that much, and it helps newcomer devs kickstart their apps); use a push architecture (socket.io?) with throttled pull (full DB pull max once every hour, or full DB as static download with 24h pull every hour); replicate to a backup MongoDB; etc.
 

wolverine2710

Tutorial & Guide Writer
Status update

Errantthought, who works for the hosting company Viviotech, has talked to his boss - who just bought the beta. Result: EDDN can be hosted for free there. AWESOME and SO much appreciated. The specs of the VPS:
1 - 2.4GHz or higher vCPU core
2GB ECC registered RAM
80 GB RAID 10 HDD space
20 Mbps unmetered bandwidth
If all goes well the server will be ready today. On my request it's going to be a Linux server - to be more specific, CentOS. This should be more than enough, especially the bandwidth. jamesremuscat (who seems to be a Linux guy as well) is going to set up a POC for EDDN, written in Python; EMDN was also written in Python. What is needed are three things: a client sending data to EDDN; EDDN itself (pub/sub pattern) with a firehose as output; and a client which gets data from the firehose. Once that is working, clients can be written in whatever language to send and receive data to/from EDDN. ZeroMQ has support for more than 40 languages.
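To make the middle piece concrete, here is a minimal sketch in Python using pyzmq. The socket types and ports are assumptions for illustration (a PULL intake and a PUB firehose), not necessarily what the POC will use:

```python
# pip install pyzmq
import zmq

def main():
    ctx = zmq.Context()
    intake = ctx.socket(zmq.PULL)     # upload clients PUSH to this port
    intake.bind("tcp://*:4000")
    firehose = ctx.socket(zmq.PUB)    # subscriber clients SUB to this port
    firehose.bind("tcp://*:5050")
    while True:
        firehose.send(intake.recv())  # forward every message untouched

if __name__ == "__main__":
    main()
```

An upload client would connect a PUSH socket to port 4000 and send its messages; a receiving client would connect a SUB socket to port 5050, as in the subscriber example below.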

To give you a better understanding of what a client looks like, here is an example by Andreas of a subscriber client written in Python. It's from his EMDN github repo. It receives data from the EMDN firehose in zlib-compressed JSON format. Of course, after receiving, things have to be done with the received data, like updating Thrudd's database, or the tradedangerous.prices file and/or the TD database, or whatever one wants to do with the data.
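In the same spirit, a minimal subscriber might look like the sketch below; the endpoint address is a placeholder and handle() is a stand-in for whatever a tool does with each record:

```python
import json
import zlib
import zmq

def handle(record):
    print(record)  # stand-in for updating a database, a .prices file, etc.

def main():
    ctx = zmq.Context()
    sub = ctx.socket(zmq.SUB)
    sub.connect("tcp://eddn.example.com:5050")  # placeholder address
    sub.setsockopt(zmq.SUBSCRIBE, b"")          # empty filter = everything
    while True:
        # each message is a zlib-compressed JSON document
        handle(json.loads(zlib.decompress(sub.recv())))

if __name__ == "__main__":
    main()
```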

Second is what appears to be the code for data received from the marketdump tool (CSV format), which is parsed and then sent to the firehose in zlib-compressed JSON format. Code.
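That path could look roughly like this sketch; the column names are assumptions, not the actual marketdump layout:

```python
import csv
import io
import json
import zlib
import zmq

# assumed column order; the real marketdump CSV may differ
FIELDS = ["system", "station", "item", "buy", "sell", "demand", "supply"]

def publish_csv(csv_text, pub):
    """Parse CSV rows and publish each one as zlib-compressed JSON."""
    for row in csv.DictReader(io.StringIO(csv_text), fieldnames=FIELDS):
        pub.send(zlib.compress(json.dumps(row).encode("utf-8")))

ctx = zmq.Context()
pub = ctx.socket(zmq.PUB)
pub.bind("tcp://*:5050")  # illustrative firehose port
publish_csv("Eranin,Azeban City,Gold,9000,9400,1,3\n", pub)
```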

The last example is a subscriber written in C#. It's from the ZeroMQ website; you can find it here. The ZeroMQ guide can be found here. I have to say that I'm impressed with ZeroMQ.

Edit: Added the following
Note: EDDN could output its data on multiple ports at once, e.g. 5050 for JSON, 5051 for XML, 5052 for CSV. It could even have multiple listener ports: 4000 for JSON, 4041 for XML, 4042 for CSV. When data on any of these ports is received and parsed, it can then be published on ports 5050-5052. But I like the KISS principle, hence one input format. If it turns out this would complicate things for the authors of upload clients, multiple input formats could be considered.
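The output side of that idea could be sketched like this (JSON and CSV ports only; XML omitted for brevity, and the serialisation is illustrative):

```python
import json
import zlib
import zmq

ctx = zmq.Context()
json_out = ctx.socket(zmq.PUB)
json_out.bind("tcp://*:5050")   # JSON firehose port from the note
csv_out = ctx.socket(zmq.PUB)
csv_out.bind("tcp://*:5052")    # CSV firehose port from the note

def publish(record):
    """Publish one parsed record on every output port, one format each."""
    json_out.send(zlib.compress(json.dumps(record).encode("utf-8")))
    csv_line = ",".join(str(record[k]) for k in sorted(record))
    csv_out.send(zlib.compress(csv_line.encode("utf-8")))
```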
 
Last edited:
Great!

Last version (EMDN) used the client item names. We do not have access to these (legally, at least) now. I would suggest the following, with OCR in mind.

System names: upper case. Station names: upper case. Item names: upper case. Use in-game spelling. The fields SELL, BUY, DEMAND, DEMANDLEVEL, SUPPLY and SUPPLYLEVEL should be included. I would also suggest that EDDN use NULL in place of "-", not 0. DEMANDLEVEL and SUPPLYLEVEL as 1-3 (like the scraped data; no need to waste bandwidth on LOW/MED/HIGH).
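Under those suggestions, a single record might look something like this; the values are illustrative and this is not an agreed schema:

```python
import json

record = {
    "SYSTEM": "ERANIN",       # upper case, in-game spelling
    "STATION": "AZEBAN CITY",
    "ITEM": "GOLD",
    "SELL": 9401,
    "BUY": None,              # NULL where the market shows "-", never 0
    "DEMAND": 1297,
    "DEMANDLEVEL": 2,         # 1-3 rather than LOW/MED/HIGH
    "SUPPLY": None,
    "SUPPLYLEVEL": None,
}
print(json.dumps(record))     # Python None serialises as JSON null
```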

ZeroMQ will only be for "getting data", right? I would think it is easier to just "POST" a JSON or whatever to some web backend.
 
I used Slopey's tool during early beta days, and thought it brilliant. However, I've been playing w/o trading tools in B3 and have learned a heckuva lot more about trading. Moreover, I'm enjoying trading much more - actual skill/thoughtful planning is required now. So currently I do not plan on using any trading tools come launch.

That does not negate the OP's question, however. My concern is that all these tools rely on manual user input with no verification. Doesn't that open the back end/data up for tons of manipulation by those who find great routes? Say someone finds they can buy a product for 1,000 at location A, and sell for 2,000 at B. What's to prevent them from overriding prices of A to 2,000 and keeping the profit all to themselves? Sure, there's some safety in numbers (i.e. maybe another commander will catch the impropriety and correct it), but isn't the system open to manipulation, gamesmanship - and of course untimely data?
 

wolverine2710

Tutorial & Guide Writer
Great!

Last version (EMDN) used the client item names. We do not have access to these (legally, at least) now. I would suggest the following, with OCR in mind.

System names: upper case. Station names: upper case. Item names: upper case. Use in-game spelling. The fields SELL, BUY, DEMAND, DEMANDLEVEL, SUPPLY and SUPPLYLEVEL should be included. I would also suggest that EDDN use NULL in place of "-", not 0. DEMANDLEVEL and SUPPLYLEVEL as 1-3 (like the scraped data; no need to waste bandwidth on LOW/MED/HIGH).

ZeroMQ will only be for "getting data", right? I would think it is easier to just "POST" a JSON or whatever to some web backend.

Thanks for your input. I will try to dig up the JSON format Andreas used; it could be somewhere in his marketdump thread. We can discuss whether it needs adjusting, and stuff like uppercase, null, verbosity etc. The JSON should indeed have all the columns which are in the commodities market. Iirc Tornsouls, who used .NET C# and lambda expressions, found out that he had to use null for a field instead of leaving the field out - it borked his code, something to keep in mind. Wrt 1-3 instead of LOW/MED/HIGH: I can imagine that when compressing the output stream it does not matter wrt bandwidth, but this can be tested and discussed.
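That bandwidth question is easy to test; a quick sketch that compresses the same batch of records with numeric levels versus LOW/MED/HIGH strings and compares the sizes:

```python
import json
import zlib

def payload(level):
    """Compress 100 identical records using the given DEMANDLEVEL value."""
    records = [{"ITEM": "GOLD", "SELL": 9401, "DEMANDLEVEL": level}] * 100
    return zlib.compress(json.dumps(records).encode("utf-8"))

print(len(payload(2)), len(payload("MED")))  # compare compressed sizes
```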

Not entirely sure what you mean by "ZeroMQ will only be for getting data, right? I would think it is easier to just POST a JSON or whatever to some web backend." EDDN receives data on port A (clients send/upload data to that port) and then sends it out on port X. That is the port the subscriber clients which receive data listen to. The consumer of data can be a web backend, but it can also be a command line tool which runs locally on the machine of a commander - like kfsone's TD emdn-tap.py client, which was in older versions before it was removed due to the changed data policy (it's ofc still in the repo). In the case of emdn-tap.py a POST would not work, as there is no web backend. But perhaps I'm missing something.

Note: EDDN could output its data on multiple ports at once, e.g. 5050 for JSON, 5051 for XML, 5052 for CSV. It could even have multiple listener ports: 4000 for JSON, 4041 for XML, 4042 for CSV. When data on any of these ports is received and parsed, it can then be published on ports 5050-5052. But I like the KISS principle, hence one input format. If it turns out this would complicate things for the authors of upload clients, multiple input formats could be considered.
 
I used Slopey's tool during early beta days, and thought it brilliant. However, I've been playing w/o trading tools in B3 and have learned a heckuva lot more about trading. Moreover, I'm enjoying trading much more - actual skill/thoughtful planning is required now. So currently I do not plan on using any trading tools come launch.

That does not negate the OP's question, however. My concern is that all these tools rely on manual user input with no verification. Doesn't that open the back end/data up for tons of manipulation by those who find great routes? Say someone finds they can buy a product for 1,000 at location A, and sell for 2,000 at B. What's to prevent them from overriding prices of A to 2,000 and keeping the profit all to themselves? Sure, there's some safety in numbers (i.e. maybe another commander will catch the impropriety and correct it), but isn't the system open to manipulation, gamesmanship - and of course untimely data?

Yes, it will be open to manipulation, but also to a ban for whoever exploits the tool. To anyone that plans to sabotage this tool: use your time in game instead. It will be more productive for you and more productive for the community. However, if you think you have found a good trade route, no one forces you to submit the data. It's up to your own conscience if you have relied heavily on community-shared data to get where you are now and choose NOT to share back when you find something.
 
Thanks for your input. I will try to dig up the JSON format Andreas used; it could be somewhere in his marketdump thread. We can discuss whether it needs adjusting, and stuff like uppercase, null, verbosity etc. The JSON should indeed have all the columns which are in the commodities market. Iirc Tornsouls, who used .NET C# and lambda expressions, found out that he had to use null for a field instead of leaving the field out - it borked his code, something to keep in mind. Wrt 1-3 instead of LOW/MED/HIGH: I can imagine that when compressing the output stream it does not matter wrt bandwidth, but this can be tested and discussed.

Not entirely sure what you mean by "ZeroMQ will only be for getting data, right? I would think it is easier to just POST a JSON or whatever to some web backend." EDDN receives data on port A (clients send/upload data to that port) and then sends it out on port X. That is the port the subscriber clients which receive data listen to. The consumer of data can be a web backend, but it can also be a command line tool which runs locally on the machine of a commander - like kfsone's TD emdn-tap.py client, which was in older versions before it was removed due to the changed data policy (it's ofc still in the repo). In the case of emdn-tap.py a POST would not work, as there is no web backend. But perhaps I'm missing something.

Note: EDDN could output its data on multiple ports at once, e.g. 5050 for JSON, 5051 for XML, 5052 for CSV. It could even have multiple listener ports: 4000 for JSON, 4041 for XML, 4042 for CSV. When data on any of these ports is received and parsed, it can then be published on ports 5050-5052. But I like the KISS principle, hence one input format. If it turns out this would complicate things for the authors of upload clients, multiple input formats could be considered.

The reason I ask is that for getting data I like the ZeroMQ method of having data pushed to me; however, I don't want to have a process constantly running to put data into EDDN, and I don't want to set up that system every time I submit data. It would be much more convenient for me to just POST the data every time I update my system. Also, I think EMDN posted line by line; I would very much want to post all data for a station when I visit, because I will bulk-submit to my own system (this is where a constant connection could help), but also have the chance to update line by line (for example, Thrudd's tool updates every field when focus changes; there he would have to choose whether to actually post data for every update - every line would get posted multiple times as the user changes data).
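The bulk idea could look like the sketch below; the endpoint URL and payload shape are made up here, since no HTTP intake exists yet:

```python
import json
import urllib.request

# one snapshot per station visit (hypothetical payload shape)
snapshot = {
    "system": "ERANIN",
    "station": "AZEBAN CITY",
    "items": [
        {"item": "GOLD", "sell": 9401, "buy": None},
        {"item": "FISH", "sell": 310, "buy": 406},
    ],
}

req = urllib.request.Request(
    "http://eddn.example.com/upload",          # placeholder URL
    data=json.dumps(snapshot).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
urllib.request.urlopen(req)                    # plain POST, no daemon needed
```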
 