And here I thought removing the accented chars would cause less trouble; sorry for the inconvenience.
Oliver, I opened issue #238 to cover tradedangerous falling over when importing price data written by the latest version of the Regulated Noise OCR tool. Seems that tool decided to start writing timestamps in full ISO 8601 format, which includes an explicit offset from UTC. I think the .prices regex in cache.py is failing to match lines with those offsets, and I know the maddavo plugin, while it successfully ignores the timezone offset, is unhappy about seeing updates from users located in east-of-Zulu timezones, thinking they happened in the future.
Details in the issue, of course.
If I can get my Python regex knowledge up to snuff, I'll try submitting a PR with fixes. Not sure how well that'll go.
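For anyone else poking at this, here's a rough sketch of the kind of regex fix I have in mind. This is not TD's actual cache.py pattern, just an illustration of accepting an optional ISO 8601 offset and normalizing everything back to naive UTC:

```python
# Hypothetical sketch, not the real cache.py regex: match a timestamp
# with an optional ISO 8601 offset ("Z", "+10:00", or "+1000") and
# convert it to a naive UTC datetime.
import re
from datetime import datetime, timezone

timeRe = re.compile(
    r"(\d{4}-\d{2}-\d{2})[ T]"      # date, space- or T-separated
    r"(\d{2}:\d{2}:\d{2})"          # time
    r"(Z|[+-]\d{2}:?\d{2})?"        # optional offset from UTC
)

def to_utc(stamp):
    m = timeRe.match(stamp)
    if not m:
        raise ValueError("unrecognized timestamp: " + stamp)
    date, time, offset = m.groups()
    if offset is None or offset == "Z":
        offset = "+00:00"           # bare timestamps are assumed UTC
    elif ":" not in offset:
        offset = offset[:3] + ":" + offset[3:]
    dt = datetime.fromisoformat(date + "T" + time + offset)
    return dt.astimezone(timezone.utc).replace(tzinfo=None)

print(to_utc("2014-11-20 13:45:00"))        # bare: treated as UTC
print(to_utc("2014-11-20T23:45:00+10:00"))  # east-of-Zulu, shifted back
```

With something like this, a +10:00 timestamp from an east-of-Zulu user lands on the same UTC instant instead of looking like it came from the future.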
Yep, Mad fixed it. If the "contract" for his output is "all times are always UTC without decoration", then you don't need to do anything, as appears to be the case. The "defensive programmer" in me wants to add ISO 8601 time format support to TD just in case, but I'm ruthlessly stamping out that urge.
Looks like Mad already fixed it?
Yep - my parser broke over the weekend with the RN v1.84_0.23 non-UTC timezones coming via EDDN. I updated the parser to accept those but didn't regenerate the affected lines... this resulted in some price lines being out of whack for a few hours last night until I regenerated them. The price lines are all UTC now.
Hi,
Possible bug, or just user error on my part:
Code: trade.py local -vv -ly 0 'Persephone'
Gives me:
Code: trade.py: System/Station "Persephone" could match PERSEPHONE, AVALON/Persephone or ERLIK/Persephone
Edit: Figured it out. "Persephone/" (with the trailing slash) will give you the right star system!
Just came across this tool, and I absolutely LOVE it...
I've been looking through your source, and hope I can eventually write code half that clean/organized/functional.
Do you think there would be much room for performance improvement by leveraging pandas to store the database? It could complicate the bundling, but one thing about pandas is that it is really really fast when working with large dataframes.
After reading the introduction, I don't see anything specific that would help.
You can often improve performance by tweaking command-line parameters to match what your ship can and can't do. If you're a typical trader with a cargo scoop who doesn't mind a few jumps between stations, you can replace "--ly 10.5 --jumps 3" with "--ly 31.5 --jumps 1". You'll get some runs that are more than 3 jumps, but TD will be much more efficient. The problem is that when you specify multiple jumps, for every system TD reaches it has to ripple outwards again. That could perhaps be optimized by remembering the 1-hop reach of every system encountered so that it can be referenced on subsequent hops. I'll look at implementing that sometime this week.
But most of the CPU is spent comparing the buy/sell lists at stations; reducing the stations that get compared or the number of items improves performance. There are several ways that could be optimized; unfortunately TD is solving an NP problem, which eliminates many of the most obvious ones. Two I've contemplated toying with involve making the buy/sell lists contiguous numpy arrays ordered by itemID, so that I can use vector operations to ask "how many of each of these can I buy?" and "what will the gain/ton be?"
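To make the vector-operation idea concrete, here's a minimal sketch, assuming a hypothetical layout where each station keeps its items as parallel arrays sorted by itemID (the IDs and prices below are invented):

```python
# Hypothetical station data, not TD's real structures: parallel numpy
# arrays sorted by itemID let the buy/sell comparison become vector ops.
import numpy as np

buy_ids   = np.array([1, 4, 7, 9])        # items the origin sells us
buy_cost  = np.array([100, 250, 30, 900]) # cr/ton to buy
sell_ids  = np.array([1, 7, 9, 12])       # items the destination buys
sell_paid = np.array([140, 25, 1100, 60]) # cr/ton they pay

# Items tradable on this leg: present on both sides, with the indices
# needed to line the price arrays up against each other.
common, bi, si = np.intersect1d(buy_ids, sell_ids, return_indices=True)
gain_per_ton = sell_paid[si] - buy_cost[bi]

profitable = gain_per_ton > 0
print(common[profitable], gain_per_ton[profitable])
```

The whole "what's the gain/ton for every shared item?" question collapses into one `intersect1d` plus one subtraction, with no Python-level loop over items.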
But, again, we have a form of backtracking, so it's very difficult to do most of the easy optimizations: we have to get all destinations for each origin in a hop at once. Python threading wouldn't help because we're not IO bound, and Python's multiprocessing wouldn't help because the memory footprint would be fairly large. We could work around that by using messaging (zeromq or similar) and pushing data, but the overhead of moving the data would eat into the gains - although all we have to send is [credits, capacity, limit, srcStnID, [buyItems], dstStnID, [sellItems]]. The item lists either have to be [itemID, cr] or we have to cache the lists and then we can just send the costs.
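For a feel of what that per-leg work unit would weigh on the wire, here's a quick sketch; the field names mirror the list above but are otherwise my own, and I'm using pickle purely to illustrate the serialization cost, not proposing it as the transport:

```python
# Hypothetical per-leg payload matching the shape described above:
# [credits, capacity, limit, srcStnID, [buyItems], dstStnID, [sellItems]]
import pickle
from typing import List, NamedTuple, Tuple

class Leg(NamedTuple):
    credits: int
    capacity: int
    limit: int
    srcStnID: int
    buyItems: List[Tuple[int, int]]   # (itemID, cost cr/ton)
    dstStnID: int
    sellItems: List[Tuple[int, int]]  # (itemID, paid cr/ton)

leg = Leg(100000, 16, 4, 128,
          [(1, 100), (7, 30)], 342, [(1, 140), (9, 1100)])
wire = pickle.dumps(leg)
print(len(wire))  # bytes per message: a feel for the transport overhead
```

If the item lists were cached on both ends, only the cost columns would need to travel, which shrinks the payload further.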
On another subject, I do have a feature request. When using the 'buy' command, it lists stations and the light-years to the respective systems, but it does not list how far the stations are from the navigation point in light-seconds; that information would be useful in that view.
Add the "-vv" option (verbose, doubled) and you'll get station distance from the primary. Not as clean a tabular form of output as the non-verbose format, but it's there.
Hey... Trade Dangerous said it wanted me to post a screencap because Gold was out of the expected ranges. So... here it is!

Okay, this is the weirdest thing... when I went back into the screen, the gold sell price went up to 9270. I did not leave the station... in fact, I didn't even leave the commodities screen! It just changed on me! Annoyingly, I'd sold a bunch of gold to this station at the cheap price! Very weird.
WOW. So AnthorNet just uploaded a HUUUUGGGEEE Station.csv file with an additional 20k+ stations - doubling our stations.
Be prepared for a slight delay the next time you do an update!
Sooo much to explore....
This must be from the data hacked from the FD servers a while ago. We have been avoiding using that as it seems less than ethical.