Processor

Wow. That's probably the most informative and helpful response ever. I'd keep throwing rep at you, but the forum doesn't let me.

I'd be interested to see how much the clock speeds matter in this case. I'm not much of an overclocker, as I've never had the time to bother with it, but if it has a big impact it may save me some cash.
The only trouble with VR tests, in my opinion, is that it's rather hard to describe the result. Since the image seen in VR is so vastly different from the image seen on a 2D screen, it's very difficult to see the real difference in image quality.

Well, I'm struggling to find anywhere that my system is being impacted enough to do a clear-cut test, but I did one sat on the pad in a high-tech station in a Type-9.

Overclocked to 4.3GHz:

FPS - solid 90
GPU usage ~ 95%
CPU usage ~ 84%

Standard clock 3.8GHz:

FPS - 87 - 90
GPU usage ~ 88%
CPU usage ~ 84%

Underclocked to 3.5GHz:

FPS ~ 80
GPU usage ~ 80%
CPU usage ~ 73%

I'm a bit puzzled by the reported CPU use, but RealTemp was saying something completely different to Task Manager, so I'm not sure if changing the clock speed without rebooting was causing the CPU use to be reported incorrectly. However, you can see the effect I was talking about in the GPU usage and framerate, especially when I underclocked it. Also, with the CPU overclocked the GPU usage goes up, as does the framerate, compared to standard and underclocked.

I've spent a long time getting what I can out of my system and settings, so as you can see it will be OK with a 1080 even if you don't overclock. If I find somewhere with a big impact while I'm playing tonight I'll try testing again (it does happen, but it isn't game-breaking by any stretch). Places with loads of players are more affected, and I might be around a few of those later.

Anyways, time to get the whisky and a straw for my Friday night VR in Elite... :D

By the way, the reason my GPU usage is so high at 90 FPS is that I've tweaked all the settings to be as high as I can get out of the 1080 without hitting 100%. It's basically the VR Ultra preset with a few things turned down and HMD quality at 1.25x :)
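
For what it's worth, if anyone wants to capture this sort of run without watching Task Manager, here's a rough sketch of how you could log CPU and GPU usage automatically. It assumes Python with the psutil package installed and an NVIDIA card with nvidia-smi on the PATH; the FPS figure would still come from the in-game or SteamVR counter.

[code]
# Rough sketch: sample overall CPU and GPU utilisation once a second while
# sitting on a pad running a test. Assumes the psutil package is installed
# and nvidia-smi is available on the PATH (NVIDIA cards only).
import subprocess

import psutil

def gpu_utilisation() -> int:
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=utilization.gpu",
         "--format=csv,noheader,nounits"],
        text=True,
    )
    return int(out.strip().splitlines()[0])

samples = []
for _ in range(60):                        # roughly one minute of samples
    cpu = psutil.cpu_percent(interval=1)   # averaged over the 1 s interval
    gpu = gpu_utilisation()
    samples.append((cpu, gpu))

print("avg CPU %:", sum(c for c, _ in samples) / len(samples))
print("avg GPU %:", sum(g for _, g in samples) / len(samples))
[/code]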
 

That's interesting. I may not have to do anything at all with my CPU.

One thing I did try was to simply max out all the settings. I tested it in the training mode, as in this case I wasn't interested in the performance so much as the end result. To be honest, I can't say I noticed any visual difference at all between the Medium/High/Ultra settings or HMD quality 1/1.25/1.5.

Could it be that I did something wrong, or is the actual change in image quality small enough for me not to notice?
 

The changes are very subtle.

Also, I noticed that HMD quality, and probably supersampling, doesn't kick in until after a loading screen,
so you would need to drop to the menu and reload.

But yes, I don't really see enough difference between HMD quality 1.25 and 1.5 to sacrifice FPS stability for it.
With the 1080 Ti I happen to get improved FPS on planet surfaces by pushing the terrain slider all the way to the right,
and I prefer the impact of shadows at Ultra over bloom or the aforementioned HMD quality 1.5.
 
I'm running an i5 4690K at 4.5GHz with 16GB of RAM and a GTX 970 clocked to 1.33GHz core / 3.5GHz memory, and Elite runs well.

I'm not blasting max settings, since I prefer smooth flow over eye candy, but I have the in-game SS at 0.75 and HMD quality at 1.75, so the total supersampling works out to about 1.31 in the end and it looks alright.
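
In case it helps anyone sanity-check their own numbers, here's a tiny sketch of how that ~1.31 figure falls out, assuming the in-game SS and HMD quality factors simply multiply, and taking the Vive's 1080x1200 per-eye panel as the base:

[code]
# Combined supersampling when in-game SS and HMD quality are both applied
# (assumption: the two per-axis scale factors simply multiply).
in_game_ss = 0.75
hmd_quality = 1.75

effective_scale = in_game_ss * hmd_quality
print(f"Combined per-axis scale: {effective_scale:.4f}")   # 1.3125, i.e. ~1.31

# Rough per-eye render target on a Vive-class panel (1080x1200 per eye).
base_w, base_h = 1080, 1200
print(f"~{round(base_w * effective_scale)} x {round(base_h * effective_scale)} per eye")
[/code]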
 
Are there any good tools that allow me to see my FPS while using the Vive? Having the counter visible on the mirror image would be acceptable too.
 

Judging by my experience the other night, FDEV have clearly done some optimising since I last looked into how well my i5 was holding up. While it isn't perfect, I'd be hard pushed to recommend you do anything about your CPU; I just don't think the gains would be worth the expense. I was really surprised how infrequently my CPU seemed to be holding things back. More often than not, low FPS seemed to be caused by assets loading. The worst example of what appeared to be the CPU holding things back was at a low-intensity conflict zone where my GPU was at 41% but I was only getting 70 FPS... the CPU wasn't pegged at 100% though, so it was either still reporting usage incorrectly (I'd still not rebooted) or it was some other factor(s).

Anyway, for the vast majority of the time your current CPU should be plenty. I'm scrapping my plans for an upgrade now too!! :)
 
My Wallet thanks you for this information!

I did some more testing, this time with the FPS counter. With all settings on Low or off I get 30-50 FPS. GPU load is at 100% and CPU load at 50-70%. It seems it's time for a GPU upgrade. I'm also considering swapping my 8GB of 1300MHz DDR3 RAM for something better. How much of an impact would that have?
 

I'm not really too sure. If you have two 4GB DDR3 sticks then probably not much of a difference.
 

It could be all kinds of random coincidence, but I forgot to enable the XMP profile on my two 8GB sticks of DDR3 after a BIOS reset, so my sticks were running at 1333MHz.

And Elite became a jumpy, stuttering mess.
After activating the intended clock of 2400MHz it immediately went back to a much smoother experience, with hardly any stutter and certainly no jumps or leaps.

What I mean by leaps is that, mostly in stations, in a split second I would somehow find myself several metres closer to my landing pad than I thought I should be.
Kind of like rubberbanding, only I was in the front seat of it.

This was with an i7 4790K and a 1080 Ti.

I don't, however, recommend anyone buy DDR3 RAM; better to take that money and save up some more to upgrade the CPU and mobo as well.

Spending $150 on new DDR3 sticks is not going to leave you satisfied, and buying DDR3 RAM isn't going to help down the line.
And that's from someone who just bought new DDR3 sticks ;)

Of course, a new GPU will be transferable to a new rig down the line and will help greatly immediately.
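
If anyone wants to double-check whether their XMP profile actually took effect without rebooting into the BIOS, something along these lines should do it; a minimal sketch, assuming Python on Windows, that just asks WMI what the modules are running at:

[code]
# Quick check of what speed the memory modules are actually configured to run
# at (Windows only; ConfiguredClockSpeed needs a reasonably recent Windows).
# If it reports 1333 when the kit is rated for 2400, XMP probably isn't active.
import subprocess

out = subprocess.check_output(
    ["wmic", "memorychip", "get", "ConfiguredClockSpeed,Speed"],
    text=True,
)
print(out)
[/code]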
 

I haven't experienced anything like what you describe so far during my testing. If I do I'll at least know what the possible cause might be.

I did some more tinkering yesterday and found the reason why my in-game FPS wasn't changing as much as I thought it should when I changed the settings. When using ED Profiler I saw a change of maybe ~10 FPS when switching from VR Ultra to VR Low, which seemed odd to me. As it turns out, ED Profiler really wasn't doing anything at all, so no wonder I wasn't seeing any improvement. I discovered this by accident when ED refused to launch in VR mode. I had to go into the settings to manually set it to HMD mode, and I noticed that the in-game settings were set to VR Ultra regardless of the ED Profiler changes. After changing to VR Low in the settings menu I saw an immediate jump to 90 FPS in the main menu. I now have ~90 FPS in space and ~50-60 FPS in stations, which is far better than what I had previously. I guess it's time to uninstall the Profiler and just make changes directly in the settings menu.

Another oddity I encountered in Quince yesterday was "ghost" ships lingering on the docking pads. I spent five minutes waiting for an Orca to clear the docking pad, and after getting tired of waiting I decided he would simply have to move over, so I put full power to shields, extended the landing gear and prepared to ram him off my pad. Much to my surprise, my ship simply clipped through the Orca and landed as normal.
This happened a few other times as well. The strangest one was when, as I was returning to the surface of the docking area, another ship spawned around me. When I reached the landing pad and my ship was released from dock, it jumped into the air as the other ship on the pad suddenly took solid form. There were a few other players in the area as well, so I assumed this to be the cause, but it was strange. Seeing all this in VR, especially being inside another ship, wasn't very pleasant.
 
These videos might help. They don't relate directly to Elite and VR but I suspect it is the sort of info you are looking for...
https://youtu.be/dWgzA2C61z4

https://youtu.be/D_Yt4vSZKVk
 

This is all well and good.
But the deeper I go into the rabbit hole that is VR, the one thing that comes out with more and more clarity is the fact that you CANNOT compare it to 2D rendering.

VR completely throws out the book on all the old dogmas, like "games don't make use of multithreaded performance".
That's not even true for 2D anymore, but it's still touted as fact.

Also, VR isn't generating one 2160x1200 image at 90 FPS.
I think it's closer to the case to look at current-gen VR as a 1080x1200 image at 180 FPS, because it has to render both viewpoints.

Now, luckily a lot of the work can be reused for both eyes, so it isn't all done from scratch, but we aren't really looking at a target rendering time of 11ms; it's more like 5.5ms for each eye.
And that is not a lot of time.

What is perfectly fine and true for 2D gaming at, what, 60Hz is just not good enough in VR.
Now, I'm probably misunderstanding a lot and grossly simplifying, but if VR were anything like 2D rendering, wouldn't we have better performance by now?
I'm sure someone has been running a surround-screen setup at 140+Hz with a fair amount of success.
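
For what it's worth, the frame-budget numbers above are just arithmetic; a quick sketch, ignoring the work that is shared between the two eye views as already mentioned:

[code]
# Frame budget at 90 Hz, split naively across the two eye views.
# This is a simplification: a lot of work is shared between the eyes.
refresh_hz = 90
frame_budget_ms = 1000 / refresh_hz       # ~11.1 ms to deliver both eye views
per_eye_budget_ms = frame_budget_ms / 2   # ~5.6 ms each if rendered strictly in sequence
print(f"{frame_budget_ms:.1f} ms per frame, ~{per_eye_budget_ms:.1f} ms per eye")
[/code]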
 

You will get no argument from me on this, mate. VR is definitely different, and the way my PC behaves playing VR baffles me.

I wish FDEV would add a VR benchmark tool. It is the game for it, really, but nobody tests for VR in any benchmarks. If they did add one it could considerably raise their profile!
 
My sys specs are in my sig. I only did some basic testing at Obsidian Orbital and on a planet, with VR Ultra and the usual offenders off. At PD 1.5 my CPU is at 75-80%. At PD 1.75 (about where I stop seeing an appreciable difference) I get 90% CPU usage. I would think I could run into a bottleneck if I pushed everything to max in a busy online RES area.

As it stands, with ASW off and PD 1.75, the FPS holds at 90 on a planet, with only a couple of frame drops here and there when I move my head around quickly. In the station it drops into the 70s and occasional 60s. PD 1.5 runs in the 80s in the station. Still all very acceptable to play, even without ASW on. I suspect an i7 would produce lower CPU usage; however, since I don't see any real bottlenecking at my chosen ideal settings, I am satisfied to spend time enjoying the game rather than doing a lot of testing.

Given the diverse range of results one can get in ED in any one area on any given system config, the only way to know for sure whether, say, a 1080 Ti would perform better than a 1080 in the OP's Haswell system might be through actually being able to test both. Not likely practical, and even less likely to find someone who has moved from a 1080 to a 1080 Ti in the same config. The choice might be to get the 1080 Ti and, if the rest of the system is too weak, build a newer rig as budget permits, or back off to the 1080, which is an acceptable card for ED and an excellent card for VR in general. I can say that my i5 Skylake and GTX 1080 Ti can run pretty much any "native" VR title at 90 FPS, all maxed out at PD 2. That I do love. Since native VR games are really so much more GPU than CPU intensive, my choice to save on the i5 vs i7 question and put the savings into moving up to the 1080 Ti was a worthwhile decision.

As for RAM, the only extensive comparison I have seen between DDR3 and DDR4 and RAM speeds was on the DCS forum (not done in VR), and DCS did show as much as an 8 FPS gain with DDR4. Here again we are dealing with one sim, and a testbed of parts that was also tuned for optimal performance in that sim to get that result.

Sorry I rambled on. Back to the game.
 

Very interesting to me, as we have almost the same specs: i5 6600K at 4.7GHz, MSI 1080 Ti (+100 on the core and +280 on the memory), 16GB DDR4 2400.

Unigine Superposition benchmark, 4K setting: 9900 points.

I have set everything to High except: no AA, no blur, no bloom, shadows on Ultra, no AO...
SS 0.75 / HMD 1.75

With those settings I get 90 FPS in stations, with occasional dips, and the same on planets.
 
I wish FDEV would add a VR benchmark tool. It is the game for it really, but nobody tests for it (VR) on any benchmarks. If they did add one it could considerably raise their profile!
See my sig for the Steam VR utility link. That will give you an idea of how well your system can run VR.
 

There do seem to be a couple of differences worth mentioning. Your CPU is clocked 400MHz higher and your SS is lower (I run it at 1, the default). I don't know how the game handles these two things in tandem, but SS 0.75 and PD 1.75 looks the same as SS 1 and PD 1.5 to these old eyes, and downsampling SS to 0.75 would seem to add a second rendering step to the workload. Using SS 1 and PD 1.5, my performance falls in line with about what you are getting. A lot can depend on how and where you are playing and testing. It seems fair to assume that things like varying planet surfaces, the number of AI ships, the number of players in an online instance and such would all vary the results to different degrees.

Given the OP's concern as to whether his two-year-older Haswell CPU with slower RAM would bottleneck a 1080 Ti to any unacceptable degree, our similar setups (yours and mine) probably aren't going to be much help on that one. As a PC gamer since dirt was formed, with the love/hate relationship with the continually moving target that is "what hardware do I need now", I would buy the 1080 Ti, since it will not go bad and will likely not be an issue in any native VR game. If it does bottleneck some, I would build a platform under it as budget allows, since I know I am not going to escape, or want to escape, this glorious hobby. It isn't like it won't work at all, and one can always tweak things down some until a balance is found. That's just me, though.
 