980 Ti GPU - SS 2x on VR High - Vive - Observations and Some Benchmark Data

After overclocking my GPU (MSI 980 Ti 6GB) I was actually able to run the game on VR High with SS 2x and AA off, with the exception of three points in the game. I've spent quite some time testing to isolate the spots where SS kills the frame rate (rather poor optimisation on FD's side), because at SS 2x the game is actually playable in terms of graphics with the VR High preset: excluding the scaling issue, it doesn't blind you with a flickering mess and the text is, of course, readable.

Scale remains an issue regardless of SS - you still play as a 12-year-old body in ED and climb through tiny doors when you look at the back of your ship...



My OC settings in Afterburner: core voltage +87 mV, power limit 102, core clock +100, memory +400, and fan at 80%, keeping my temperature at 72-74 degrees.

The points where the frame rate drops:

First, arriving at the station - specifically the letter box. I'm not sure why the letter box is an issue: once I'm inside the station the problem is gone until I land, and my SteamVR report shows a good frame rate. So I wonder what is so demanding about the letter box that it kills the frame rate and makes reprojection kick in - only when you are in or very close to the letter box...

Secondly, without fail, once I land: any part of the bulletin board will kill the frame rate completely if you play at SS 2x - almost as if it were already SS 2x and was now being supersampled again on top...

Thirdly, on planets, but decreasing some of the detail in the VR High preset has resolved that for me.

During these tests I observed zero temperature throttling, and the GPU was on average at 80-87% load, meaning I wasn't even maxing it out.

It did max out the memory, though. Once I'd flown through the letter box and the GUI board loaded, my card was out of memory until I left the station - meaning that once I'd landed, the memory was swamped. The game doesn't appear to clear GPU memory very well.
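If anyone wants to check whether they see the same VRAM behaviour, here's a rough sketch of the kind of logger I'd use - it just shells out to nvidia-smi once a second, so it assumes the NVIDIA driver tools are on the PATH, and the file name, interval and duration are only placeholders:

```python
# Quick-and-dirty GPU logger: run it while flying through the letter box and docking,
# then eyeball the CSV to see whether memory.used pins at the card's limit.
import csv
import subprocess
import time

QUERY = "utilization.gpu,memory.used,memory.total,temperature.gpu"

def sample():
    """Return [gpu_util_%, mem_used_MiB, mem_total_MiB, temp_C] for the first GPU."""
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=" + QUERY, "--format=csv,noheader,nounits"],
        universal_newlines=True,
    )
    return [float(x) for x in out.strip().splitlines()[0].split(",")]

def log(path="gpu_log.csv", interval_s=1.0, duration_s=300):
    """Poll once per interval and append one CSV row per sample."""
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["t_s", "gpu_util_pct", "mem_used_mib", "mem_total_mib", "temp_c"])
        start = time.time()
        while time.time() - start < duration_s:
            writer.writerow([round(time.time() - start, 1)] + sample())
            time.sleep(interval_s)

if __name__ == "__main__":
    log()
```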

Anyway, I thought this info might be helpful to some of the devs in terms of optimisation, and I was wondering what the data would look like for other users.

Let me know what you think, guys, and whether you find it helpful in any way. I'd certainly be interested in others' results...


My system: i7-5820K, 6 cores at 3.3 GHz (12 threads), 64 GB RAM, SSDs, and the MSI 980 Ti overclocked as per the settings above.
 
That's good information, I'm sure it'll be useful to the developers, thanks.
Did you notice some stuttering at the beginning of a jump to another system? It happens to me every single time, without exception. It's like it's loading those clouds or something like that.
I'm running it from an SSD too.
 
Yeah, the menu thing is weird - I get FPS drops while docked at anything above 1.3x SS, and the menu glitches.

Been running at 1.5x SS and it's pretty much the sweet spot overall, if you can tolerate the menu glitches while docked.

I do get bad judder in a RES when there are more than a few ships in combat, even at 1.3x - it looks like a CPU issue because it's flat out.

1080 @2GHz + i5 6600k @4.5GHz + Rift CV1

EDIT: Fixed my judder!!!

GPU Boost was doing weird stuff when the temps got up to 75, then throttling hard at 77+.

The CPU was spiking like crazy as the card throttled, causing massive judder.

Jacked the core clock to +200 and ramped the fan to 85% at 70 degrees, and it's fine now. Previously I had a 1:1 fan profile and +120 on the core.
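For anyone curious, here's a toy sketch of roughly how the old 1:1 profile compares to the steeper curve - the breakpoints are just my approximation of what I dialled into Afterburner, not exact values:

```python
# Rough comparison of the old 1:1 fan profile vs the steeper curve that fixed the judder.
# Breakpoints are approximate and purely illustrative - tweak to taste in Afterburner.

def fan_1to1(temp_c):
    """Old profile: fan % tracks temperature 1:1 (so only ~75% fan at 75 C)."""
    return max(0, min(100, temp_c))

def fan_aggressive(temp_c):
    """New profile: ramp hard so the card is at 85% fan by 70 C, before boost throttling kicks in."""
    points = [(40, 30), (60, 60), (70, 85), (80, 100)]  # (temp_C, fan_%)
    if temp_c <= points[0][0]:
        return points[0][1]
    for (t0, f0), (t1, f1) in zip(points, points[1:]):
        if temp_c <= t1:
            # linear interpolation between breakpoints
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)
    return points[-1][1]

if __name__ == "__main__":
    for t in (50, 60, 70, 75, 77):
        print(f"{t} C -> 1:1: {fan_1to1(t):.0f}%  aggressive: {fan_aggressive(t):.0f}%")
```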
 
I do get a quick judder when going into supercruise, but it doesn't seem to be related to graphics settings or where the game is installed, as the small jitter only happens at the beginning and not always... Maybe something to do with instancing? Not sure.

I'll have to monitor my CPU closely tomorrow if I get a chance to play, but I haven't seen any throttling or peaking today. I do have a 6-core i7 which can handle a lot, so if this game maxes it out then something is definitely not right. I'll investigate in a RES...
 
Even with an i7-4790K @ 5 GHz, 16 GB RAM and a GTX 980 with a slight overclock, I can't run SS 1.5, even on VR Low. It's a juddery mess whenever you move your head.

Will try a higher overclock; I've just never needed to before.

I wish VR SLI was an option, though - it's kinda sad to have that second 980 sitting there not doing anything. I may go back to playing Elite on the 60-inch 4K TV; I could run that at max detail and the frame rate never dropped below 60.
 
I think all this points to the fact that the Cobra engine is not very well optimized and struggling.
 

You could try dropping that overclock on the CPU by 500 MHz, or running it at stock, to see if it's a CPU-bound problem that's causing your issues?
 
Mike - PvtTUX might be correct: if the OC on the CPU isn't working optimally you might be getting drops... You might also want to make sure that your SLI is disabled - if it isn't, it will judder regardless of your settings, which would make sense given your description.

About the earlier point - I went to a HazRES and, funnily enough, my CPU was at 50% (which is a lot for my CPU; even when working on my own company's projects I rarely go that high, excluding 3D rendering of course), my GPU was at 80%, my GPU memory at 4.2 GB, and I still had the occasional stutter. Changing settings had no effect, as the hardware wasn't maxed out anyway... Not sure this helps.
 
I'd check that your video card is in an x16 slot on your motherboard. What mobo brand are you running?
 
I think all this points to the fact that the Cobra engine is not very well optimized and struggling.

I remember the early struggle they had with optimising procedural texture generation on planet surfaces, long before Horizons. There was tons of supercruise judder when approaching planets for ages, even on powerful rigs in 2D. Then there was the whole Direct3D crashing-its-nuts-off phase. I had to disable hyperthreading on my old Ivy Bridge i7 to prevent random hitching, and that was before I went VR.

Frontier have come a long way and continue to do pioneering work, but the new consumer headsets are really demanding at 90 Hz and require the brute force of no less than a 980 Ti to deliver a high-fidelity VR experience. My 970 was a great card and got me going when I first got my Rift, but supersampling was not an option. It was VR Low and that was it.

My 1080 is a beast in comparison, but I still have to be sensible about cranking it up, at least until developers start to implement the VR-specific APIs in their games - and that's a way off, I reckon.
 
Yeah, I wasn't very clear there - I meant I'd overclock my graphics cards, not my CPU any more. Dear god no, lol, poor thing. I put the CPU down to 4.5 GHz and did a test: exactly the same performance. Honestly, I'm getting frustrated and annoyed at Elite; nothing else performs this badly. With a tiny overclock on the GPU I can run SS at 1.5 OK, but I miss all the detail. Taking screenshots at maximum detail at 4K was just glorious; now all I get is a fuzzy blur.
 

The OP has a 980 Ti, which is significantly faster than a vanilla 980 (and SLI currently does nothing for VR in Elite). Elite is a real beast on the GPU; even the 980 Ti has trouble with SS 1.5 on planets - it will not maintain 90 fps without dropping multiple settings.

I think only the 1080 Ti is going to manage a true 1.5 SS with all settings up and a constant 90 fps in almost all scenarios.
 
Yeah, I am in the process of saving for another card. I will most likely replace both 980s with a single card. I am hoping NVIDIA release something that uses HBM2 but is still efficient with software that doesn't support HBM, so it performs well across the board.
 
I'm having some issues with my setup: 2600K @ 4.6 GHz, 980 Ti overclocked, HTC Vive, newest Windows 10 update + NVIDIA drivers (8-24-2016).

With 0.65 scaling in game and VR High settings,
with ChaperoneSwitcher at 1.5 scaling and reprojection disabled:

I get 70-80 fps, which causes horrible judder. My CPU usage is around 50%, and GPU usage is a measly 50% as well - it's not even boosting to my full OC!
What is bottlenecking so badly here? I doubt a 4.6 GHz 2600K is bottlenecking this badly. On my 1440p monitor at maxed settings I can normally hold my 90 Hz refresh rate.

--------
With 0.65 scaling in game and VR High settings,
with ChaperoneSwitcher at 2.0 scaling and reprojection disabled:

I get 85-90 fps fighting at an asteroid belt. LESS judder, but at stations I get 45 fps when docking.
GPU usage increases to 75%.
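For reference, here are back-of-envelope numbers for what those two configurations mean in pixels, assuming a nominal 1080x1200 per-eye panel and that the in-game scale and the ChaperoneSwitcher multiplier both apply per axis and simply multiply (my assumption - I haven't confirmed how the two interact, and this ignores the headroom SteamVR already adds for lens distortion):

```python
# Back-of-envelope render-target sizes for the two setups above.
# Assumes the in-game supersampling and the ChaperoneSwitcher multiplier each
# scale both axes and multiply together - an assumption, not a confirmed detail.

PANEL_W, PANEL_H = 1080, 1200  # Vive per-eye panel resolution

def per_eye_pixels(ingame_ss, chaperone_scale):
    scale = ingame_ss * chaperone_scale
    w, h = round(PANEL_W * scale), round(PANEL_H * scale)
    return w, h, w * h

for ingame, chap in [(0.65, 1.5), (0.65, 2.0)]:
    w, h, px = per_eye_pixels(ingame, chap)
    total = 2 * px  # two eyes per frame
    print(f"in-game {ingame} x chaperone {chap}: {w}x{h} per eye, "
          f"~{total / 1e6:.1f} Mpixels per frame (both eyes)")
```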


Everything I describe looks like a CPU bottleneck (low GPU usage, can't even come close to maxing the GPU), yet I fail to see how a heavily overclocked 2600K is holding this back.

Can anyone help?
 
I have the same CPU running more or less at stock with a GTX 980 and have no issues apart from the usual.

If you are sure your settings are correct and the in-game graphics settings are fine (ambient occlusion off, anti-aliasing off, etc.), I'd take your hardware back to its default speeds and see where you are with that.

Check your temps just in case ;)

BTW, I'm pretty sure I have reprojection on.
 

You're probably seeing judder because you've turned reprojection off. When you suffer a frame-rate drop you'll see 45 fps straight away, as reprojection won't be there to 'fill in' any missed frames. I'd advise keeping it on, because it has a big impact on your VR experience even if it doesn't improve your actual frame rates.

VR rendering is all about getting data from one stage to the next in time to meet the 'deadlines' imposed by a fixed 90 Hz refresh rate. You can't change the refresh rate of a Vive or Rift CV1, nor can you turn it off. This is for good reason: users feel disoriented or sick and get eye strain if there's tearing or a low refresh rate. It's judder on steroids - yuck.

Tracking data > CPU >>> GPU >>>>>> timewarp/reprojection if needed > eye buffer > display (the number of > symbols indicates the probable latency at each step).

The Rift and the Vive both have to supply tracking data to the CPU before the CPU can start rendering the next frame. So while the GPU is rendering one frame, the CPU is reading the tracking state of the headset and starting the next frame's geometry. But there is some latency in this loop, and if your CPU can't keep up (too much detail = too much geometry to process), you'll miss the rendering deadline (estimated by the application and OculusVR/SteamVR). Bam! As soon as you've missed the deadline, you either show a reprojected frame or the same frame as last time. Oops - you have reprojection turned off, so you get an instant drop to 45 fps (and that's the best-case scenario!).
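To put rough numbers on that deadline - a minimal sketch, not how SteamVR or Oculus actually schedule frames, just the arithmetic: at 90 Hz you have about 1000/90 ≈ 11.1 ms per frame.

```python
# Simplified model of the 90 Hz frame deadline described above.
# Not the real compositor logic - just the arithmetic the post is describing.

REFRESH_HZ = 90
FRAME_BUDGET_MS = 1000.0 / REFRESH_HZ  # ~11.1 ms

def displayed_fps(frame_time_ms, reprojection=True):
    """What the user effectively sees for a given render time per frame."""
    if frame_time_ms <= FRAME_BUDGET_MS:
        return REFRESH_HZ          # made the deadline: native 90 fps
    if reprojection:
        return REFRESH_HZ          # compositor warps the previous frame instead
    # Missed the deadline with reprojection off: the frame waits for the next
    # vsync, so you drop straight to half rate (best case).
    return REFRESH_HZ // 2

for ft in (9.0, 11.0, 12.5):
    print(f"{ft:>5.1f} ms/frame -> repro ON: {displayed_fps(ft, True)} fps, "
          f"repro OFF: {displayed_fps(ft, False)} fps")
```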

There are some tricks (anticipating the user's motion, etc.) but it's not bulletproof (and my explanation above doesn't cover all the tricks for avoiding missed deadlines).

In VR your CPU has to process nearly DOUBLE the geometry (one scene for each eye, and the geometry cost is the same irrespective of resolution) - hence the big hit the CPU timing takes when you go from a single scene on a monitor to stereo in VR.
OK, it's not quite double, as some geometry in the scene might be rendered in mono (distant objects, mountains, etc. - just like in real life, you can't actually tell the difference), but it's a big jump.

Overclocking the CPU will reduce your latency (the GPU waiting for the finished geometry), but it doesn't improve in line with clock speed (a 4.6 GHz CPU won't process double the geometry of a 2.3 GHz CPU). So overclocking your CPU helps, but if you're showing too much detail and the CPU misses the deadline to pass geometry to the GPU, you'll see a drop in frame rate. Judder.
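To illustrate why the clock bump doesn't scale linearly, here's a toy model (the numbers and the split between clock-bound and memory/driver-bound work are made up purely for illustration): assume only part of the per-frame CPU work actually scales with core clock and the rest doesn't.

```python
# Toy model of why CPU overclocking helps less than the raw clock ratio suggests.
# All numbers below are hypothetical and only for illustration.

FRAME_BUDGET_MS = 1000.0 / 90  # ~11.1 ms at 90 Hz

def cpu_frame_time(base_ms, clock_bound_fraction, base_ghz, new_ghz):
    """Estimate CPU frame time after a clock change, if only part of the work scales with clock."""
    clock_part = base_ms * clock_bound_fraction * (base_ghz / new_ghz)
    fixed_part = base_ms * (1.0 - clock_bound_fraction)
    return clock_part + fixed_part

base = 13.0  # hypothetical CPU ms/frame at stock, already over budget
for new_clock in (3.4, 4.0, 4.6):
    t = cpu_frame_time(base, clock_bound_fraction=0.6, base_ghz=3.4, new_ghz=new_clock)
    verdict = "makes the 11.1 ms deadline" if t <= FRAME_BUDGET_MS else "still misses it"
    print(f"{new_clock:.1f} GHz -> ~{t:.1f} ms CPU frame time ({verdict})")
```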

This is the main reason why VR applications don't have the same level of geometry as monitor-bound applications. All the hardware has to get faster (and the software pipelines streamlined) before we can reach the level of detail we see in monitor-only games.
 

Have you tested the mail slot in training mode to see if you get the same issue? I've only had time to test one training mode; this weekend I plan to try the rest, but I was able to keep a solid 90 fps in the asteroid field while fighting. I'm starting to think the net code might be adding a couple of milliseconds of latency, which is pushing us over the required 11 ms budget.
 
Redraven, thank you for your long post and your help.

I still don't see how anything is bottlenecking an overclocked 980 Ti when I can only push 50-60% GPU usage and still can't hold 90 fps.

In every benchmark I've seen (excluding AVX2, which the game doesn't even use), a 4.6 GHz 2600K is roughly equivalent to a 6700K at ~4.0 GHz. How is my CPU the bottleneck?

I'd love to get to the bottom of this.
 