This wiki CookBook#markdown-header-how-can-i-add-a-missing-station page is wrong on the "--blackmarket y" parameter: it doesn't work, but "--bm y" seems to.
Just to confirm, in case I missed something: all outpost stations are S + M pads only, and all 'slot' stations have L pads? Since a lot of '?' entries can be filled in via a patient system view on the map, I'd like to know if there are any odd cases before committing bad data.
FISHU!
Not yet. Soon(TM). Like, literally the next day or two.
Lovely stuff! Sadly I don't get the gaming time I'd like, so I've probably missed stuff in the change logs since I started in July.
@maddavo - I assume changes from ? to M or L, or adding BM data without adding new stations, will be picked up OK when uploading station data to your site?
Great to hear
The progression of technology is rather fast, in ten years we've gone from 1940's to 3301.
Load from HILL PA HSI/Reilly Terminal (33ls/star, No/bm, Lrg/pad):
532 x Personal Weapons 1,884cr each, 1,002,288cr total, data from 20days and 4days
So if I submit a full and correct set of commodities in an import.prices file to maddavo's site...
Ah-ha!
Yes, today I had to do this to delete that one example commodity a few posts above. A hassle.
@Snake_Man and @aguettinger: re prices files
It sounds like you have a handle on the prices files - I'll just add some clarifying comments.
The TradeDangerous.prices file is the prices file used by TD. Usually it contains a complete list of prices for all stations and is either created by TradeDangerous or downloaded from my site.
The import.prices file contains prices to be imported into TD. It is a partial list of stations (or maybe just one). It may come from various sources - TD makes one using the import command to download the latest prices from my site, and EliteOCR also makes an import.prices. The import.prices files are primarily for importing into TD.
The updated.prices file contains a FULL LIST of prices for a station and is generated by TD when a station is updated using the trade.py update command. This file is slightly different: it contains a full list of all commodities whether they are sold or not. If a commodity isn't bought/sold at a station then it has 0 sell and 0 buy. THIS is the file that is best to upload to my site. As @Snake_Man has identified, the 0 - 0 price will delete that commodity from the station.
All the prices files can be uploaded to my site. But a 7MB TradeDangerous.prices file will take time to upload and a minute or two to process. It's probably better to upload incrementally. The 'madupload' command automatically uploads the updated.prices file.
There is one problem where EliteOCR is used - it doesn't generate a file that has all commodities in it, just the ones that were in the market. If you upload the import.prices then it won't delete any that aren't there. You could do this: import import.prices into TD, use TD to update the station, make a minor change to 'touch' the data and generate an updated.prices file and then upload the updated.prices file. But that is a hassle.
We need EliteOCR to make the import.prices file with all the commodities in it, OR maybe some function in TD to process an EliteOCR file, create an updated.prices file, and upload it - all in one command.
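The "zero-fill" step described above can be sketched in a few lines of Python. This is a hypothetical illustration, not TD's actual code or file format: the commodity list and the `(sell, buy)` tuples are stand-ins, and the real .prices format carries more fields (demand, stock, timestamps).

```python
# Hypothetical sketch: pad an EliteOCR-style partial commodity list out to
# a full list, zero-filling anything absent so stale entries get deleted
# on upload. ALL_COMMODITIES and the (sell, buy) tuples are assumptions.
ALL_COMMODITIES = ["Explosives", "Hydrogen Fuel", "Personal Weapons"]

def pad_station_prices(seen):
    """Return a full {item: (sell, buy)} map, zero-filling missing items."""
    return {
        item: seen.get(item, (0, 0))
        for item in ALL_COMMODITIES
    }

ocr_output = {"Explosives": (399, 0)}   # partial market, as EliteOCR emits
full = pad_station_prices(ocr_output)
print(full["Hydrogen Fuel"])  # (0, 0) -> would delete a stale entry
```

The point is only that the padding is mechanical once a master commodity list is available, which is why it seems feasible to fold into one command.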
[color=silver]C:\Windows\CatPix\Trade\>[/color] pip install requests
[color=silver]C:\Windows\CatPix\Trade\>[/color] trade.py update -GF ...
[color=silver]C:\Windows\CatPix\Trade\>[/color] misc/madupload.py
Upload complete.
Can the madupload command upload an import.prices file as well, or is it hardcoded like you said?
#! /usr/bin/env python
import pathlib
import platform
import re
import sys

import requests  # the actual script has a helper here

############################################################################

upload_url = 'http://www.davek.com.au/td/uploaddata.asp'
upfile = "updated.prices"
if len(sys.argv) > 1:
    upfile = sys.argv[1]

############################################################################

if not pathlib.Path(upfile).is_file():
    raise SystemExit("ERROR: File not found: {}".format(upfile))

r = requests.post(
    upload_url,
    files={
        'Filename': (
            upfile,
            open(upfile, 'rb'),
            'text/plain',
            {
                "Expires": '300',
            }
        ),
    }
)

response = r.text
m = re.search(r'UPLOAD RESULT:\s*(.*?)<br', response, re.IGNORECASE)
if not m:
    raise Exception("Unexpected result:\n" + r.text)

resultCode = m.group(1)
if resultCode.startswith("SUCCESS"):
    raise SystemExit("Upload complete")
print("Upload failed: {}".format(resultCode))
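Judging from the argv handling at the top of the script, the default filename can be overridden by a command-line argument, so uploading an import.prices looks possible (e.g. `misc/madupload.py import.prices`). The same pattern in isolation:

```python
# The script's filename handling in isolation: default to updated.prices
# unless an explicit argument is given. This mirrors the script above;
# whether maddavo's site accepts import.prices content is a separate question.
def pick_upload_file(argv):
    return argv[1] if len(argv) > 1 else "updated.prices"

print(pick_upload_file(["madupload.py"]))                   # updated.prices
print(pick_upload_file(["madupload.py", "import.prices"]))  # import.prices
```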
@ 1 G. CAELI/Smoot Gateway
+ Chemicals
Explosives 399 0 75124M - 2014-12-26 01:45:00
In these cases where we know there are numerous repeats of particular bad data, is it feasible to implement a mechanism to mitigate this?
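One plausible shape for such a mechanism is a blocklist of known-bad entries consulted before import. This is purely a sketch: the blocklist contents, the tuple key, and the function are hypothetical, not anything TradeDangerous currently provides; the example entry just reuses the Smoot Gateway data quoted above.

```python
# Hypothetical mitigation: filter known-bad price lines before import.
# The (station, item, sell_price) keys below are assumptions for
# illustration, not part of TradeDangerous.
BAD_PRICES = {
    ("1 G. CAELI/Smoot Gateway", "Explosives", 399),
}

def is_bad(station, item, sell_cr):
    """True if this (station, item, price) combination is a known repeat offender."""
    return (station, item, sell_cr) in BAD_PRICES

print(is_bad("1 G. CAELI/Smoot Gateway", "Explosives", 399))  # True
print(is_bad("1 G. CAELI/Smoot Gateway", "Explosives", 400))  # False
```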
Hi kfsone,
Code:
Owner@ARYA /c/apps/trade-loading-refactor (master)
$ trade.py buildcache -f
NOTE: Rebuilding cache file: this may take a moment.
*** INTERNAL ERROR: NOT NULL constraint failed: ShipVendor.station_id
CSV File: C:\apps\trade-loading-refactor\data\ShipVendor.csv:209
SQL Query: INSERT INTO ShipVendor (station_id,ship_id)
           VALUES((SELECT Station.station_id FROM Station
                   INNER JOIN System USING(system_id)
                   WHERE System.name = ? AND Station.name = ?),
                  (SELECT Ship.ship_id FROM Ship WHERE Ship.name = ?))
Params: ['KALIKI', 'Oren City', 'Adder']
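Your diagnosis fits the error mechanics: when the subselect finds no matching station, SQLite yields NULL for the scalar subquery, and inserting that NULL trips the NOT NULL constraint. A minimal reproduction, with simplified stand-in tables rather than TD's real schema:

```python
# Minimal reproduction of the failure mode: an INSERT whose scalar
# subselect matches no row yields NULL, tripping a NOT NULL constraint.
# Table/column names are simplified stand-ins for TD's schema.
import sqlite3

db = sqlite3.connect(":memory:")
db.executescript("""
    CREATE TABLE Station (station_id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE ShipVendor (station_id INTEGER NOT NULL, ship TEXT);
""")
try:
    db.execute(
        "INSERT INTO ShipVendor (station_id, ship) "
        "VALUES ((SELECT station_id FROM Station WHERE name = ?), ?)",
        ("Oren City", "Adder"),  # no such station -> subselect is NULL
    )
except sqlite3.IntegrityError as e:
    print(e)  # NOT NULL constraint failed: ShipVendor.station_id
```

So any ShipVendor.csv row naming a system/station missing from your Station data will fail this way, which matches your "rinse and repeat" experience.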
1. I am pretty sure this may be because my system/station files are not in sync and do not contain KALIKI/Oren City. The workaround is to delete the contents of ShipVendor.csv; if I delete just that station, I get the same error on another, rinse and repeat. I tried updating System.csv and Station.csv but it does not seem to fix the problem, so I'm not sure what is going on. I am running my own local database. This error started happening for me about a week ago (I meant to mention it then, but as you were busy and sick, and I had a workaround and figured it was something in my setup, I did not want to bug you. Bug you, get it?)
2. This error I only get with trade-loading-refactor when I try to rebuild the cache using my tradedangerous.prices file and the -f option:
Code:
Owner@ARYA /c/apps/trade-loading-refactor (master)
$ trade.py buildcache -f
NOTE: Rebuilding cache file: this may take a moment.
Traceback (most recent call last):
  File "./trade.py", line 76, in <module>
    main(sys.argv)
  File "./trade.py", line 50, in main
    results = cmdenv.run(tdb)
  File "c:\apps\trade-loading-refactor\commands\commandenv.py", line 79, in run
    return self._cmd.run(results, self, tdb)
  File "c:\apps\trade-loading-refactor\commands\buildcache_cmd.py", line 70, in run
    buildCache(tdb, cmdenv)
  File "c:\apps\trade-loading-refactor\cache.py", line 866, in buildCache
    processImportFile(tdenv, tempDB, Path(importName), importTable)
  File "c:\apps\trade-loading-refactor\cache.py", line 688, in processImportFile
    columnDefs = next(csvin)
StopIteration
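One guess at the cause, based only on the last two traceback frames: `columnDefs = next(csvin)` reads the header row of an import file, and `next()` on a completely empty file raises StopIteration. If the ShipVendor.csv you emptied as a workaround is one of the files processed here, that would explain it. A minimal demonstration of the pattern:

```python
# Demonstrates the failure seen in processImportFile: reading the header
# row of an empty file with next() raises StopIteration.
import csv
import io

empty_file = io.StringIO("")   # stand-in for an emptied .csv data file
csvin = csv.reader(empty_file)
try:
    column_defs = next(csvin)  # same pattern as cache.py line 688
except StopIteration:
    print("empty file: no header row to read")
```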
As I am not sure what the issue is here exactly (I bet it is something in my .prices or Station.csv file which is not in the branch files, or perhaps I should have started by importing my prices file rather than rebuilding the cache), and as I have not yet checked that my ED 1.04 upgrade works, I am going to call it quits for tonight on testing this branch. Tomorrow I'll do a clean clone of it and try from scratch with an empty database and prices file, and let you know how it goes.
On my Ubuntu there is no "pip", and when I do sudo apt-get install python-requests it says it's already the latest version. When I run madupload.py it says ImportError: No module named 'requests'.
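This looks like a Python 2 vs 3 mismatch: on Ubuntu, the python-requests package installs for Python 2, while the quoted error format (with quotes around 'requests') is Python 3's, suggesting the script runs under python3. If so, python3-requests (or pip under python3) would be needed instead. A quick check of which interpreter is in use and whether it can see the module:

```python
# Quick diagnostic: report the interpreter running this script and
# whether 'requests' is importable by it. The suggested install commands
# in the message are the usual remedies, not something from this thread.
import importlib.util
import sys

print(sys.executable)
if importlib.util.find_spec("requests") is None:
    print("requests is NOT importable; try: sudo apt-get install "
          "python3-requests, or: python3 -m pip install --user requests")
else:
    print("requests is importable")
```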