Release: Trade Dangerous (Est. 2015) - Power user's highly configurable trade optimizer

Version v2.1.0.4 of TD Helper 2 has been released and may be found at https://github.com/MarkAusten/TDHelper/releases. TDH2 should alert you to the update if you have update checking turned on.

v2.1.0.4
--------
* Added support for "--ls-max" and "--planetary" to the OldData command.
* Changed the caption and tool tip for the route option on the old data panel.
* Reduced default limit from 42 to 10 when route is checked, since sorting for best distance overall is not an O(n) solution.
* Added missing "--pad" option to old data panel.
 
Obviously, it's great that development continues. Now - as I am presently out with the DWE2 fleet and not due to return until later in the year, I'm not actively using TD.

So, a lot of TD updates can be skipped server-side; additional sorting options and so on, for example, don't matter to the server. It's only when a change to the data means that the server-side files are wrong that TD needs a server-side update.

So, although the forum is now correctly emailing me when this thread is written to, please ping me when there is an update which affects the server. I may well update the server's TD from time to time anyway, but it won't be every single update, so a ping makes sure it happens when it really needs to.
 
Obviously, it's great that development continues. ... please can you ping me when there is an update which affects the server. ...
Sure thing boss. :)
 
Obviously, it's great that development continues. ... as I am presently out with the DWE2 fleet and not due to return until later in the year, I'm not actively using TD. ...
Stay safe out in the black, Commander.
 
Any idea what's causing this? Running Python 3.7. I can't seem to import from EDDB...

Code:
G:\Games\TradeDangerous>trade.py import --plug=eddblink --opt=clean,skipvend
Traceback (most recent call last):
  File "G:\Games\TradeDangerous\trade.py", line 107, in <module>
    main(sys.argv)
  File "G:\Games\TradeDangerous\trade.py", line 80, in main
    results = cmdenv.run(tdb)
  File "G:\Games\TradeDangerous\commands\commandenv.py", line 81, in run
    return self._cmd.run(results, self, tdb)
  File "G:\Games\TradeDangerous\commands\import_cmd.py", line 124, in run
    if not plugin.run():
  File "G:\Games\TradeDangerous\plugins\eddblink_plug.py", line 846, in run
    os.remove(str(tdb.dataPath) + "/TradeDangerous.db")
PermissionError: [WinError 32] The process cannot access the file because it is being used by another process: 'G:\\Games\\TradeDangerous\\data/TradeDangerous.db'
 
Is it at all possible you have two instances running? Or are you running TDH at the same time?

The error indicates that a process on your machine has locked the database, so your import cannot replace it with the data it has downloaded.

The likely reason is that you are already running TD, either directly or from inside TD Helper.

Double-check that your import is the only instance of TD running, then try again.
 
It's not necessarily another instance of TD, or an instance of TD from within TDH, either, although one or the other is the most likely scenario. Anything that's accessing that file on a Windows machine will lock it. Since the error occurs while trying to delete the file, you can try manually deleting it yourself; Windows should tell you which process is accessing it, so you can close that program and try again.

Worst case scenario, if you can't figure out what is locking it, and therefore can't close that program, restarting Windows will definitely close it, so you'll be able to do the thing after the reboot.
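For anyone scripting around this, a hedged sketch of a retry loop for that delete. `remove_with_retry` is a hypothetical helper, not part of TD; judging by the traceback, the plugin just calls os.remove() directly and lets the PermissionError propagate.

```python
import os
import time

def remove_with_retry(path, attempts=5, delay=2.0):
    """Try to delete a file, retrying if Windows reports it locked.

    Hypothetical helper for illustration only; not TD's actual code.
    """
    for _ in range(attempts):
        try:
            os.remove(path)
            return True
        except FileNotFoundError:
            return True          # already gone, nothing to do
        except PermissionError:
            # WinError 32: another process (TD, TDH, a DB browser...)
            # still has the file open. Wait and try again.
            time.sleep(delay)
    return False                 # still locked after all attempts
```

This only papers over the symptom, of course; the real fix is still closing whatever holds the lock.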

(Sorry I'm late to the party, never got notice that there were new replies.)
 
Okay. Good news, bad news.

I've been working on converting over the slower import processes, and the new method is faster than the old one.

Under the old method, importing Stations, ShipVendors, and UpgradeVendors took ~12 minutes, 20 seconds.

Under the new method, the same process took ~7 minutes, 49 seconds.

In both cases, the old data and the new data were the same, so the only difference is the updating method.

It took about 63% as much time to complete with the new method.

That's the good news.

The bad news is, that's still pretty slow.

It's the UpgradeVendor update that's the really slow bit. In the new method, it took 5.5 minutes just to insert the new UpgradeVendor data. Granted, the UpgradeVendor table is huge, but I was hoping it'd be faster than that.

I'm going to do the same thing to the listings import and see how that goes. Any improvement is good, after all. Hopefully, we'll see a better time improvement on the listings.
 
Alright, just finished the coding changes; they're testing now. I'm running both the current published version and the new code at approximately the same time; there might have been about a second's delay between starting one and starting the other.

In any case, both versions are running "trade.py import -P eddblink -O all,progbar", and both are updating a database not updated since 12/30/2018 with the current 01/30/2019 data.

I'll be back with an edit once the test runs are complete.
 
Okay. Good news, bad news. ... It's the UpgradeVendor update that's the really slow bit. ...
Updating tables is always slower than inserting new records. How many records are you trying to process in the UpgradeVendor process?
 
Alright, just finished the coding changes, they're testing now. ...
I presume that you have trawled through the various on-line articles and questions about speeding up SQLite? Like these:

https://stackoverflow.com/a/21590912
https://www.sqlite.org/wal.html
https://www.whoishostingthis.com/compare/sqlite/optimize/

Using both 'PRAGMA synchronous=OFF' and 'PRAGMA journal_mode=MEMORY' speeds things up a lot, but there is a cost, of course.

It might just be that you are pushing the boundaries of SQLite and it just will not go any faster. It is, after all, not a blazingly fast DB.
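For reference, a minimal sketch of applying those two pragmas with Python's sqlite3 module. This is not TD's actual connection code, and on a file-backed database these settings are exactly the cost mentioned: a crash mid-write can corrupt the database, so they only make sense for a database you can rebuild from the source files.

```python
import sqlite3

def tuned_connection(db_path):
    """Open a connection tuned for bulk imports (sketch only)."""
    conn = sqlite3.connect(db_path)
    conn.execute("PRAGMA synchronous = OFF")      # skip fsync after writes
    conn.execute("PRAGMA journal_mode = MEMORY")  # keep rollback journal in RAM
    return conn
```

Once the import finishes, the pragmas can be put back to their defaults (synchronous=FULL, journal_mode=DELETE) for normal use.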
 
Updating tables is always slower than inserting new records. How many records are you trying to process in the UpgradeVendor process?
No idea, but it's a LOT, and it's all inserts. No updates. BTW, the UpgradeVendor table accounts for over half the size of the database. It's HUGE. I'm not at all surprised that it's slow to update.
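For what it's worth, the standard fast path for bulk inserts in Python's sqlite3 is executemany() inside a single transaction, rather than one autocommitted INSERT per row. A minimal sketch, with a made-up stand-in for the UpgradeVendor schema (not TD's actual one):

```python
import sqlite3

# Illustrative two-column schema; TD's real UpgradeVendor table differs.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE UpgradeVendor ("
    " station_id INTEGER, upgrade_id INTEGER,"
    " PRIMARY KEY (station_id, upgrade_id))"
)

rows = [(s, u) for s in range(1000) for u in range(50)]  # 50,000 rows
with conn:  # one transaction wraps the whole batch
    conn.executemany("INSERT INTO UpgradeVendor VALUES (?, ?)", rows)

count = conn.execute("SELECT COUNT(*) FROM UpgradeVendor").fetchone()[0]
```

Even with this pattern, tens of millions of rows plus index maintenance will still take minutes, which fits the timings above.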

In other news, I'm doing bug fixing on the new code. Totally forgot that EDDB's listings.csv includes listings for rare items, and since I never got around to dealing with rares, it kind of throws a monkey wrench into things. Let's see if I got all the kinks ironed out.

As an update, the currently published version finished in ~35:17. I'm not at all surprised, since it was updating practically everything, and both the UpgradeVendor import and the EDDB listings import involve pretty big files. Before the bug that made it crash, the new code was finishing up processing the EDDB listings before the current code had even finished the UpgradeVendors, so that's something.

Old method:
>trade.py import -P eddblink -O all,progbar
NOTE: Downloading file 'modules.json'.
NOTE: Requesting http://elite.ripz.org/files/modules.json
NOTE: Downloaded 366.1KB of gziped data 210.9KB/s
NOTE: Processing Upgrades: Start time = 2019-01-30 23:18:25.113853
NOTE: Finished processing Upgrades. End time = 2019-01-30 23:18:25.248552
NOTE: Downloading file 'index.json'.
NOTE: Requesting https://raw.githubusercontent.com/EDCD/coriolis-data/master/dist/index.json
NOTE: Downloaded 0.7MB of gziped data 14.9MB/s
NOTE: Processing Ships: Start time = 2019-01-30 23:18:25.549793
NOTE: Finished processing Ships. End time = 2019-01-30 23:18:25.568810
NOTE: Downloading file 'systems_populated.jsonl'.
NOTE: Requesting http://elite.ripz.org/files/systems_populated.jsonl
NOTE: Downloaded 29.6MB of gziped data 1.6MB/s
NOTE: Processing Systems: Start time = 2019-01-30 23:18:45.276445
NOTE: Finished processing Systems. End time = 2019-01-30 23:18:47.598726
NOTE: Downloading file 'stations.jsonl'.
NOTE: Requesting http://elite.ripz.org/files/stations.jsonl
NOTE: Downloaded 117.9MB of gziped data 2.1MB/s
NOTE: Processing Stations, this may take a bit: Start time = 2019-01-30 23:19:45.704442
NOTE: Simultaneously processing ShipVendors.
NOTE: Simultaneously processing UpgradeVendors, this will take quite a while.
NOTE: Finished processing Stations. End time = 2019-01-30 23:32:50.524717
NOTE: Downloading file 'commodities.json'.
NOTE: Requesting http://elite.ripz.org/files/commodities.json
NOTE: Downloaded 107.4KB of gziped data 112.7KB/s
NOTE: Processing Categories and Items: Start time = 2019-01-30 23:32:52.272600
NOTE: Checking for missing items....
NOTE: Missing item check complete.
NOTE: Finished processing Categories and Items. End time = 2019-01-30 23:33:07.604508
NOTE: G:\Elite Dangerous Programs\Trade Dangerous\data\Category.csv exported.
NOTE: G:\Elite Dangerous Programs\Trade Dangerous\data\Item.csv exported.
NOTE: G:\Elite Dangerous Programs\Trade Dangerous\data\Ship.csv exported.
NOTE: G:\Elite Dangerous Programs\Trade Dangerous\data\ShipVendor.csv exported.
NOTE: G:\Elite Dangerous Programs\Trade Dangerous\data\Station.csv exported.
NOTE: G:\Elite Dangerous Programs\Trade Dangerous\data\System.csv exported.
NOTE: G:\Elite Dangerous Programs\Trade Dangerous\data\Upgrade.csv exported.
NOTE: G:\Elite Dangerous Programs\Trade Dangerous\data\UpgradeVendor.csv exported.
NOTE: Downloading file 'listings.csv'.
NOTE: Requesting http://elite.ripz.org/files/listings.csv
NOTE: Downloaded 246.1MB of gziped data 3.3MB/s
NOTE: Processing market data from listings.csv: Start time = 2019-01-30 23:38:26.200312
NOTE: Finished processing market data. End time = 2019-01-30 23:49:57.560805
NOTE: Downloading file 'listings-live.csv'.
NOTE: Requesting http://elite.ripz.org/files/listings-live.csv
NOTE: Downloaded 63.0MB of gziped data 1.5MB/s
NOTE: Processing market data from listings-live.csv: Start time = 2019-01-30 23:50:40.496152
NOTE: Finished processing market data. End time = 2019-01-30 23:53:42.355424
NOTE: Regenerating .prices file.
NOTE: Import completed.
23:18:25.113853
23:53:42.355424
total time = 0:35:17.241571

New method:
>trade.py import -P eddblink -O all,progbar
NOTE: Downloading file 'modules.json'.
NOTE: Requesting http://elite.ripz.org/files/modules.json
NOTE: Downloaded 366.1KB of gziped data 206.4KB/s
NOTE: Processing Upgrades: Start time = 2019-01-31 01:43:58.549276
NOTE: Finished processing Upgrades. End time = 2019-01-31 01:43:58.588338
NOTE: Downloading file 'index.json'.
NOTE: Requesting https://raw.githubusercontent.com/EDCD/coriolis-data/master/dist/index.json
NOTE: Downloaded 0.7MB of gziped data 15.6MB/s
NOTE: Processing Ships: Start time = 2019-01-31 01:43:58.879449
NOTE: Finished processing Ships. End time = 2019-01-31 01:43:59.110458
NOTE: Downloading file 'systems_populated.jsonl'.
NOTE: Requesting http://elite.ripz.org/files/systems_populated.jsonl
NOTE: Downloaded 29.6MB of gziped data 1.8MB/s
NOTE: Processing Systems: Start time = 2019-01-31 01:44:16.478160
NOTE: Finished processing Systems. End time = 2019-01-31 01:44:18.786049
NOTE: Downloading file 'stations.jsonl'.
NOTE: Requesting http://elite.ripz.org/files/stations.jsonl
NOTE: Downloaded 117.9MB of gziped data 2.4MB/s
NOTE: Processing Stations, this may take a bit: Start time = 2019-01-31 01:45:09.722386
NOTE: Simultaneously processing ShipVendors.
NOTE: Simultaneously processing UpgradeVendors, this will take quite a while.
NOTE: Import file processing complete, updating database. 2019-01-31 01:45:20.728646
NOTE: Deleting old Station entries. 2019-01-31 01:45:20.728646
NOTE: Inserting new Station entries. 2019-01-31 01:47:16.541740
NOTE: Deleting old ShipVendor entries. 2019-01-31 01:47:20.179688
NOTE: Inserting new ShipVendor entries. 2019-01-31 01:47:20.239874
NOTE: Deleting old UpgradeVendor entries. 2019-01-31 01:47:22.316752
NOTE: Inserting new UpgradeVendor entries. 2019-01-31 01:47:22.489029
NOTE: Finished processing Stations. End time = 2019-01-31 01:52:52.113807
NOTE: Downloading file 'commodities.json'.
NOTE: Requesting http://elite.ripz.org/files/commodities.json
NOTE: Downloaded 107.4KB of gziped data 111.9KB/s
NOTE: Processing Categories and Items: Start time = 2019-01-31 01:52:54.494764
NOTE: Checking for missing items....
NOTE: Missing item check complete.
NOTE: Finished processing Categories and Items. End time = 2019-01-31 01:52:58.269068
NOTE: G:\Elite Dangerous Programs\Trade-Dangerous\data\Category.csv exported.
NOTE: G:\Elite Dangerous Programs\Trade-Dangerous\data\Item.csv exported.
NOTE: G:\Elite Dangerous Programs\Trade-Dangerous\data\Ship.csv exported.
NOTE: G:\Elite Dangerous Programs\Trade-Dangerous\data\ShipVendor.csv exported.
NOTE: G:\Elite Dangerous Programs\Trade-Dangerous\data\Station.csv exported.
NOTE: G:\Elite Dangerous Programs\Trade-Dangerous\data\System.csv exported.
NOTE: G:\Elite Dangerous Programs\Trade-Dangerous\data\Upgrade.csv exported.
NOTE: G:\Elite Dangerous Programs\Trade-Dangerous\data\UpgradeVendor.csv exported.
NOTE: Downloading file 'listings.csv'.
NOTE: Requesting http://elite.ripz.org/files/listings.csv
NOTE: Downloaded 246.1MB of gziped data 4.2MB/s
NOTE: Processing market data from listings.csv: Start time = 2019-01-31 01:57:36.009786
NOTE: NOTE: Import file processing complete, updating database. 2019-01-31 01:59:17.543606
NOTE: Marking data now in the EDDB listings.csv as no longer 'live'. 2019-01-31 01:59:17.543606
NOTE: Deleting old listing data. 2019-01-31 01:59:45.547688
NOTE: Inserting new listing data. 2019-01-31 01:59:46.771594
NOTE: Finished processing market data. End time = 2019-01-31 02:02:04.198607
NOTE: Downloading file 'listings-live.csv'.
NOTE: Requesting http://elite.ripz.org/files/listings-live.csv
NOTE: Downloaded 63.1MB of gziped data 1.7MB/s
NOTE: Processing market data from listings-live.csv: Start time = 2019-01-31 02:02:42.572602
NOTE: NOTE: Import file processing complete, updating database. 2019-01-31 02:03:12.491596
NOTE: Deleting old listing data. 2019-01-31 02:03:12.491596
NOTE: Inserting new listing data. 2019-01-31 02:03:46.522030
NOTE: Finished processing market data. End time = 2019-01-31 02:04:32.563209
NOTE: Regenerating .prices file.
NOTE: Import completed.

01:43:58.549276
02:04:32.563209

total time = 0:20:34.013933

So, doing a full run, the new method is about 15 minutes faster, taking ~58% as long to complete. Definitely a good savings.
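(The "total time" figures above are just the last end timestamp minus the first start timestamp; a small sketch of that arithmetic, assuming the HH:MM:SS.ffffff format the logs print and that both timestamps fall on the same day:)

```python
from datetime import datetime

def elapsed(start: str, end: str):
    """Difference between two timestamps in the HH:MM:SS.ffffff form
    the import logs print (assumes both are on the same day)."""
    fmt = "%H:%M:%S.%f"
    return datetime.strptime(end, fmt) - datetime.strptime(start, fmt)

delta = elapsed("01:43:58.549276", "02:04:32.563209")
print(delta)  # 0:20:34.013933, matching the new-method total above
```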
 
Same original data (from 30th December, 2018), this time only doing a listings update, not a full update:

Old method:
>trade.py import -P eddblink -O listings,progbar
NOTE: Downloading file 'systems_populated.jsonl'.
NOTE: Requesting http://elite.ripz.org/files/systems_populated.jsonl
NOTE: Downloaded 29.6MB of gziped data 1.7MB/s
NOTE: Processing Systems: Start time = 2019-01-31 02:10:17.340654
NOTE: Finished processing Systems. End time = 2019-01-31 02:10:19.646454
NOTE: Downloading file 'stations.jsonl'.
NOTE: Requesting http://elite.ripz.org/files/stations.jsonl
NOTE: Downloaded 117.9MB of gziped data 3.1MB/s
NOTE: Processing Stations, this may take a bit: Start time = 2019-01-31 02:10:58.430347
NOTE: Finished processing Stations. End time = 2019-01-31 02:11:11.075562
NOTE: Downloading file 'commodities.json'.
NOTE: Requesting http://elite.ripz.org/files/commodities.json
NOTE: Downloaded 107.4KB of gziped data 112.3KB/s
NOTE: Processing Categories and Items: Start time = 2019-01-31 02:11:12.828745
NOTE: Checking for missing items....
NOTE: Missing item check complete.
NOTE: Finished processing Categories and Items. End time = 2019-01-31 02:11:28.580599
NOTE: G:\Elite Dangerous Programs\Trade Dangerous\data\Category.csv exported.
NOTE: G:\Elite Dangerous Programs\Trade Dangerous\data\Item.csv exported.
NOTE: G:\Elite Dangerous Programs\Trade Dangerous\data\Station.csv exported.
NOTE: G:\Elite Dangerous Programs\Trade Dangerous\data\System.csv exported.
NOTE: Downloading file 'listings.csv'.
NOTE: Requesting http://elite.ripz.org/files/listings.csv
NOTE: Downloaded 246.1MB of gziped data 4.0MB/s
NOTE: Processing market data from listings.csv: Start time = 2019-01-31 02:12:32.727918
NOTE: Finished processing market data. End time = 2019-01-31 02:24:03.399109
NOTE: Downloading file 'listings-live.csv'.
NOTE: Requesting http://elite.ripz.org/files/listings-live.csv
NOTE: Downloaded 63.2MB of gziped data 1.3MB/s
NOTE: Processing market data from listings-live.csv: Start time = 2019-01-31 02:24:51.971622
NOTE: Finished processing market data. End time = 2019-01-31 02:27:48.563368
NOTE: Regenerating .prices file.
NOTE: Import completed.

02:10:17.340654
02:27:48.563368
total time = 0:17:31.222714

New method:
>trade.py import -P eddblink -O listings,progbar
NOTE: Downloading file 'systems_populated.jsonl'.
NOTE: Requesting http://elite.ripz.org/files/systems_populated.jsonl
NOTE: Downloaded 29.6MB of gziped data 1.7MB/s
NOTE: Processing Systems: Start time = 2019-01-31 02:10:17.644907
NOTE: Finished processing Systems. End time = 2019-01-31 02:10:20.026641
NOTE: Downloading file 'stations.jsonl'.
NOTE: Requesting http://elite.ripz.org/files/stations.jsonl
NOTE: Downloaded 117.9MB of gziped data 3.5MB/s
NOTE: Processing Stations, this may take a bit: Start time = 2019-01-31 02:10:54.232115
NOTE: Import file processing complete, updating database. 2019-01-31 02:11:01.256443
NOTE: Deleting old Station entries. 2019-01-31 02:11:01.256443
NOTE: Inserting new Station entries. 2019-01-31 02:13:01.014925
NOTE: Inserting new UpgradeVendor entries. 2019-01-31 02:13:05.081606
NOTE: Finished processing Stations. End time = 2019-01-31 02:13:05.081606
NOTE: Downloading file 'commodities.json'.
NOTE: Requesting http://elite.ripz.org/files/commodities.json
NOTE: Downloaded 107.4KB of gziped data 107.7KB/s
NOTE: Processing Categories and Items: Start time = 2019-01-31 02:13:06.953386
NOTE: Checking for missing items....
NOTE: Missing item check complete.
NOTE: Finished processing Categories and Items. End time = 2019-01-31 02:13:10.842649
NOTE: G:\Elite Dangerous Programs\Trade-Dangerous\data\Category.csv exported.
NOTE: G:\Elite Dangerous Programs\Trade-Dangerous\data\Item.csv exported.
NOTE: G:\Elite Dangerous Programs\Trade-Dangerous\data\Station.csv exported.
NOTE: G:\Elite Dangerous Programs\Trade-Dangerous\data\System.csv exported.
NOTE: Downloading file 'listings.csv'.
NOTE: Requesting http://elite.ripz.org/files/listings.csv
NOTE: Downloaded 246.1MB of gziped data 4.2MB/s
NOTE: Processing market data from listings.csv: Start time = 2019-01-31 02:14:12.542508
NOTE: Import file processing complete, updating database. 2019-01-31 02:15:53.795045
NOTE: Marking data now in the EDDB listings.csv as no longer 'live'. 2019-01-31 02:15:53.795045
NOTE: Deleting old listing data. 2019-01-31 02:16:23.693394
NOTE: Inserting new listing data. 2019-01-31 02:16:24.990141
NOTE: Finished processing market data. End time = 2019-01-31 02:18:53.174872
NOTE: Downloading file 'listings-live.csv'.
NOTE: Requesting http://elite.ripz.org/files/listings-live.csv
NOTE: Downloaded 63.2MB of gziped data 1.5MB/s
NOTE: Processing market data from listings-live.csv: Start time = 2019-01-31 02:19:37.838158
NOTE: Import file processing complete, updating database. 2019-01-31 02:20:07.814318
NOTE: Deleting old listing data. 2019-01-31 02:20:07.814318
NOTE: Inserting new listing data. 2019-01-31 02:20:46.266297
NOTE: Finished processing market data. End time = 2019-01-31 02:21:35.759222
NOTE: Regenerating .prices file.
NOTE: Import completed.

02:10:17.644907
02:21:35.759222

total time = 0:11:18.114315

New method took ~64.5% as long to complete. It actually fell behind a bit during the Station import, but caught up with and surpassed the old method soon after. That might mean I want to keep the old method for the Station import: while the new method is faster when doing a full import of Station, ShipVendor, and UpgradeVendor, updating the last two doesn't need to happen very often (only when a new ship or upgrade is added to the game), and the typical scenario is just updating the Stations.

=====

And, last but not least, a "typical" run, with fairly recent data. (I did the best I could to make sure that both start with the same data and update with the same data, considering how often the live file gets updated.)

Old method:
>trade.py import -P eddblink -O listings,progbar
NOTE: Downloading file 'listings-live.csv'.
NOTE: Requesting http://elite.ripz.org/files/listings-live.csv
NOTE: Downloaded 63.2MB of gziped data 1.7MB/s
NOTE: Processing market data from listings-live.csv: Start time = 2019-01-31 02:35:14.637120
NOTE: Finished processing market data. End time = 2019-01-31 02:36:42.334489
NOTE: Regenerating .prices file.
NOTE: Import completed.

02:35:14.637120
02:36:42.334489
total time = 87.70 seconds

New method:
>trade.py import -P eddblink -O listings,progbar
NOTE: Downloading file 'listings-live.csv'.
NOTE: Requesting http://elite.ripz.org/files/listings-live.csv
NOTE: Downloaded 63.2MB of gziped data 1.8MB/s
NOTE: Processing market data from listings-live.csv: Start time = 2019-01-31 02:35:12.166744
NOTE: Import file processing complete, updating database. 2019-01-31 02:35:24.545899
NOTE: Deleting old listing data. 2019-01-31 02:35:24.545899
NOTE: Inserting new listing data. 2019-01-31 02:35:24.651977
NOTE: Finished processing market data. End time = 2019-01-31 02:35:24.732889
NOTE: Regenerating .prices file.
NOTE: Import completed.

02:35:12.166744
02:35:24.732889

total time = 12.57 seconds


Only 14.3% as long to complete as the old method!

New method pushed.

Latest commit (TD): 48686db
 
Hey Mark, there's a weird, annoying problem with the Source system resetting. No idea what is causing it, but you can see it multiple times here:

(This is also a shameless plug, btw.)
https://www.youtube.com/watch?v=a5hVpc2lC1Y
 
Hey Mark, there's a weird, annoying problem with the Source system resetting. ...
That might be something to do with reading new information from the net logs, which it does at intervals. I'll have a look and see if I can figure it out.
 
Watched a few minutes at work; seriously cool. I'll probably watch most of the rest at home. One day I may splurge and get VR. But before that, I'll have to transition from joystick + keyboard to a full HOTAS, I guess.

Also, I see you're a twist-to-yaw pilot. Don't know why, but I cannot handle that; I'm a twist-to-roll, X-axis-to-yaw pilot.

FWIW, I'm spending most of my time at the CNB in LT 4487 trying to grind combat rank to at least Dangerous to unlock Lori Jameson. Currently at 60% Expert :(
 
... One day I may splurge and get VR....
I recommend waiting for Valve to release their headset and Knuckles controllers. It'll probably be fairly cheap (more like the Rift than the Vive in terms of cost), the controllers are hands down the best that currently exist for VR, and the headset will be one of the better ones as well.

I'm obviously guessing as to the cost, but we'll find out eventually. Based on the current status of the Knuckles, I would be surprised if they didn't release by mid-summer.

(When it does release, I'm buying a set for myself and selling the Vive I currently have. :) )
 