Oculus Rift - Any obvious settings for performance improvements, or do people live with 45fps?

Currently I just enable my Rift on top of my Ultra settings, and I'm maxing out GPU usage in busy HazRes sites and in stations.

Are there any good adjustments I can make to my settings to improve the Rift's performance while maintaining good visuals?



That said, when I do max out the GPU, my frame rate throttles down to 45fps (instead of 90), and TBH I'm not sure I can even tell the difference. :) Does anyone else just live with the headset dishing out 45fps (instead of 90fps) quite happily?
 
Completely depends on your hardware.
I'm running an X99 six-core water-cooled build with a 980 Ti.
Like you I choose Ultra, but I then turn bloom and some other features on, set super sampling to x2, and turn anti-aliasing off.
I have no issues at all.
 
I'm sure my two GTX 780s in SLI aren't reaching 90fps consistently (although given they were pushing my triple-screen setup to 230fps, I'm not sure why).

In any case, I refuse to chase on-screen FPS counters and go more by the overall experience; the FPS would have to be very low before I'd take off the HMD and lose the additional situational awareness, IMHO.
 
Completely depends on your hardware.
I'm running an X99 six-core water-cooled build with a 980 Ti.
Like you I choose Ultra, but I then turn bloom and some other features on, set super sampling to x2, and turn anti-aliasing off.
I have no issues at all.

"No issues" being you're hitting 45FPS at times and that's no issue to you? ie: I think I'm in that camp TBH (if I'm at 45fps I think it plays fine).

Do you find x2 super sampling gives a better visual experience than anti-aliasing in VR, then?
 
Do you find x2 super sampling gives a better visual experience than anti-aliasing in VR, then?
Super sampling gives better visuals, period. ED uses deferred rendering, which is inherently incompatible with "real" anti-aliasing, so the in-game AA options are simply a blur filter applied to some edges. SS x2 is basically 2x AA on a non-rotated grid; in other words, the most primitive "real" AA available. If I had a video card that could do it (and if ED supported it), I'd use something like 4x SS, with 16 subpixels (4x4) being down-sampled to each output pixel. That's what it would really take to get rid of the crawling edges in ED. Even then it's not ideal, because the subpixels would all be on-grid rather than optimally placed.

Currently, anti-aliasing in ED is pitiful. See this thread for more details: https://forums.frontier.co.uk/showt...-ALIASING-like-this-would-be-awesome-in-Elite

See this, for info on sub-pixel placement: http://techreport.com/review/6572/nvidia-geforce-6800-ultra-gpu/20
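
For the curious, this is all ordered-grid super sampling really is; a toy Python sketch of the idea (an illustration only, not ED's or the driver's actual pipeline):

[code]
# Toy sketch of ordered-grid supersampling: render at a multiple of the
# display resolution, then box-filter each block of subpixels down to
# one display pixel. Illustration only, not ED's actual pipeline.
import numpy as np

def downsample(hires, factor):
    """Box filter: average each (factor x factor) block of subpixels
    down to one display pixel."""
    h, w, c = hires.shape
    blocks = hires.reshape(h // factor, factor, w // factor, factor, c)
    return blocks.mean(axis=(1, 3))

# SS x2 = 4 subpixels (2x2) per display pixel; "4x SS" = 16 (4x4).
# Pretend we rendered one eye of a Rift (1200x1080) at 2x resolution:
frame_2x = np.random.rand(2400, 2160, 3)
displayed = downsample(frame_2x, 2)   # back to 1200x1080, edges averaged
[/code]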
 
Super sampling gives better visuals, period. [...]



When you say SS, do you mean HMD image quality or SS in the settings?
 
Super sampling gives better visuals, period. [...]

Is this why we need TAA added to ED, because that would be better?
 
I run HMD 2x, blur off, DOF off, AA off, SS 1x, and all else maxed. No doubt ASW is kicking in, but I don't really notice any graphical anomalies. I see some slight artifacting in DCS World if I run ASW on auto, but forcing 45fps actually runs really well.
 
When you say SS, do you mean HMD image quality or SS in the settings?
I was speaking of SS in general.

I'm not sure about the HMD setting, since I don't yet have a VR headset (I'm waiting for my Pimax 8K). I can enable SS in-game or via DSR in the Nvidia control panel; both are similar, although DSR adds a smoothing pass.

Is this why we need TAA added to ED, because that would be better?
Yes. Temporal AA is a low-cost (framerate-wise) method. Essentially, it does "real" AA in time, by shifting the viewport slightly each frame to generate the subpixel offsets. It's quite complicated to implement (because moving objects must be tracked), but the technique yields high-quality AA at negligible framerate cost.
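
For anyone curious what that accumulation looks like, here's a hand-wavy Python sketch; render_scene is a hypothetical stand-in for a real renderer, and the motion-vector reprojection that makes real TAA hard to implement is omitted:

[code]
# Hand-wavy sketch of the temporal AA accumulation idea.
# render_scene is a hypothetical stand-in for a real renderer.
import itertools
import numpy as np

H, W = 1200, 1080  # per-eye render resolution

def render_scene(jitter):
    # In a real engine this would nudge the projection matrix by
    # `jitter` (a sub-pixel x/y offset) before drawing the frame.
    return np.random.rand(H, W, 3)

# Sub-pixel offsets cycled frame to frame; together they cover the pixel.
JITTERS = [(0.25, 0.25), (0.75, 0.25), (0.25, 0.75), (0.75, 0.75)]
ALPHA = 0.1    # blend weight for the newest frame
history = None

for jitter in itertools.islice(itertools.cycle(JITTERS), 120):
    frame = render_scene(jitter)
    # Exponential accumulation: over time the jittered samples average
    # out, acting like real sub-pixel AA samples at negligible cost.
    # Real TAA also warps `history` with per-pixel motion vectors so
    # moving objects don't smear; that part is omitted here.
    history = frame if history is None else (1 - ALPHA) * history + ALPHA * frame
[/code]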
 
I'm pretty new to VR, and may need to upgrade my CPU and RAM (from an i5-4460 / 8GB), but no matter the settings I see the Rift drop to ASW pretty consistently. GPU (980 Ti) usage is around 50% at VR-low with SS 0.5 and HMD 2.0... but it still hits ASW.

The crazy thing is that even if I dial up the settings I don't think the visuals are better. I really can't see the difference between low and high textures, as an example. I just want to run as close to 90fps as possible for smooth play, and in that regard I may need new CPU and RAM. I am considering a full generation upgrade, but may stick to socket 1150 to save on mobo costs.
 
I have been tinkering with settings for a couple of days and found what's best for my setup:

Pimax 4K
i7-2700K @ 4.6GHz
GTX 970 (I will receive a 1080 in a week or so)

Settings for ED:
Rendering quality on my Pimax 4K set to max
In-game SS 1
HMD 0.5
SteamVR SS 1.2
Anti-aliasing: FXAA
Textures high, materials ultra, shadows off, ambient occlusion off... there are more, but those have the highest impact on quality and FPS
Anisotropic filtering x16 in the Nvidia control panel

The biggest improvement in quality for me was anisotropic filtering x16 plus FXAA anti-aliasing. Before that I tried anti-aliasing x8 in the Nvidia control panel, but I could not tell the difference in game. The in-game FXAA eats a little bit of FPS, but the difference is easily noticeable (distant lines are not as jagged as before).
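
For what it's worth, FXAA and friends are essentially the "blur filter applied to some edges" described earlier in the thread. A toy Python sketch of the idea (an illustration only, nothing like the real shader):

[code]
# Toy sketch of what a post-process AA pass like FXAA boils down to:
# find high-contrast edges in the finished frame and blur across them.
import numpy as np

def toy_fxaa(frame, threshold=0.1):
    luma = frame @ np.array([0.299, 0.587, 0.114])   # perceived brightness
    gx = np.abs(np.diff(luma, axis=1, prepend=luma[:, :1]))
    gy = np.abs(np.diff(luma, axis=0, prepend=luma[:1, :]))
    edges = (gx + gy) > threshold                    # where the jaggies live
    # 3x3 box blur, applied only on the detected edge pixels
    pad = np.pad(frame, ((1, 1), (1, 1), (0, 0)), mode="edge")
    h, w, _ = frame.shape
    neigh = sum(pad[i:i + h, j:j + w] for i in range(3) for j in range(3)) / 9.0
    out = frame.copy()
    out[edges] = neigh[edges]
    return out

smoothed = toy_fxaa(np.random.rand(1200, 1080, 3))
[/code]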

With these settings I get around 40 FPS in stations and the SRV, and 60 FPS (locked refresh rate) pretty much everywhere else.
Now I'm just curious what difference the 1080 will make.
 
I run HMD 2x, blur off, DOF off, AA off, SS 1x, and all else maxed. No doubt ASW is kicking in, but I don't really notice any graphical anomalies. I see some slight artifacting in DCS World if I run ASW on auto, but forcing 45fps actually runs really well.

Sorry "HMD 2x"?

Matemaster mentioned "HMD 0.5"?
 
Sorry "HMD 2x"?

Matemaster mentioned "HMD 0.5"?

I also tried in-game SS 0.5 and HMD 1, but I get lower FPS than with SS 1 and HMD 0.5, and I don't see a quality difference between them.
I found that the biggest FPS costs (from highest to lowest) for me are:

1. In-game HMD quality (highest FPS cost)
2. In-game SS
3. SteamVR SS
4. Pimax render quality (lowest FPS cost)
 
I also tried in-game SS 0.5 and HMD 1, but I get lower FPS than with SS 1 and HMD 0.5, and I don't see a quality difference between them. [...]

I believe I understand super sampling. I.e., an SS of 2 renders at double the necessary resolution and then downsamples, which ideally gives a better outcome in the VR headset.

Surely an SS of less than 1 gives a poorer visual result, as it then has to upscale?


And what is this HMD value of 0.5 or 2 being mentioned?
 
I believe I understand super sampling. [...] And what is this HMD value of 0.5 or 2 being mentioned?

I must confess I've never satisfactorily understood the interaction between the SS setting and the HMD Quality setting.

As I understand it, the SS setting is basically getting the Elite: Dangerous game code to pretend you have a different resolution display to the one you actually have. For example, if you're running on a 1920x1080 monitor and have SS=1.5, the game generates frames that are 2880x1620, with all the additional finer detail that this entails. Your graphics driver then downscales these frames back to your screen resolution. This has the advantage over simple anti-aliasing techniques in that the "in-between" pixels which fill the gaps between your jaggies are based on real extra data from the game, rather than guesswork from the AA code.

Meanwhile, the HMD Quality setting is a value that's passed through to the Oculus runtime software (exactly the same value that the Oculus Debug Tool exposes as "Pixels per Pixel"), which causes a similar thing to happen between the frames the Oculus software is producing and the hardware resolution of the HMD.

Vast quantities of (subjective) experimentation (documented all over these forums) appear to suggest that the Oculus software somehow does a more effective job with the HMD Quality setting than you get from increasing SS (so you get a clearer image with SS=1.0 and HMDQ=1.5 than you would with SS=1.5 and HMDQ=1.0). However, this seems to come at a performance cost, so having HMDQ set much above 1.5 requires a pretty darn powerful GPU.

Now, where it gets confusing is that a lot of people, in order to get decent framerates with a high HMDQ value, deliberately downscale their SS to something like 0.75 (i.e. getting ED to only render frames that are 1440x810 in the above example). This gives the game code less work to do which reduces the load on the GPU and thereby allows the Oculus software more headroom to handle a higher HMDQ value (1.75 or even 2.0).

You can do the sums (resolution x SS x HMDQ) and see that the resulting frames might well be slightly higher-res than, say, the nearest SS=1.0 equivalent, but I personally still feel like there's a garbage-in/garbage-out thing going on here, which intuitively rankles my tech sensibilities. Consequently I leave SS at 1.0 and set HMDQ as high as it will go such that I still generally get close to 90fps (with ASW turned off to check).
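
To put numbers on those sums, here's a quick Python sketch, assuming both multipliers scale the linear render dimensions and simply compound (with the Rift CV1's 1080x1200 per-eye panel as the base):

[code]
# The sums from the paragraph above, assuming SS and HMDQ both scale
# the linear render dimensions and simply compound. Base resolution is
# the Rift CV1's 1080x1200 per-eye panel.
BASE_W, BASE_H = 1080, 1200

for ss, hmdq in [(1.0, 1.0), (1.5, 1.0), (1.0, 1.5), (0.75, 2.0)]:
    w, h = BASE_W * ss * hmdq, BASE_H * ss * hmdq
    print(f"SS={ss} HMDQ={hmdq}: {w:.0f}x{h:.0f} = {w * h / 1e6:.1f} MPix per eye")

# SS=1.5/HMDQ=1.0, SS=1.0/HMDQ=1.5 and SS=0.75/HMDQ=2.0 all come out at
# the same 1620x1800 (~2.9 MPix), so any subjective quality difference
# must come from where in the pipeline the scaling/resampling happens.
[/code]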

P.S. I might re-use this post in the future, so if there's anything substantially wrong in my understanding I'd love to know, so I can rephrase it appropriately.
 
I must confess I've never satisfactorily understood the interaction between the SS setting and the HMD Quality setting. [...]

Eeeeek! Where do you set the HMD value then?

And why aren't SS=1.0/HMDQ=1.5 and SS=1.5/HMDQ=1.0 identical in rendering and outcome? Surely in both cases the GPU is ultimately being told to render at 1.5 times the resolution and then downsample for the VR headset?

And surely any time you tell the GPU to render at a resolution lower than the VR headset's (e.g. SS 0.75 or HMD 0.75), you're losing image quality along the way?
 
Eeeeek! Where do you set the HMD value then?

I forget exactly what it's called in-game, but in EDProfiler (which is how I set all my graphics options these days) it's called "HMD Image Quality".

You can also still set it out of game with the Oculus Debug Tool via the "Pixels per Pixel" setting, and I believe it's also possible to set it using the Oculus Tray Tool.

Don't try to override it in more than one place though, as I believe these settings can multiply or otherwise combine, leading to very confusing results as to what's actually happening.
 
I forget exactly what it's called in-game, but in EDProfiler it's called "HMD Image Quality". [...]

In ED you set it by clicking (with a mouse) on the little plus (+) symbol next to "Quality"; it's then called "HMD Quality" or something. It has a max value of 2, but in the Debug Tool it's an input field, so you can enter any number.
 
My advice would be to skip the Oculus Debug Tool and just make your adjustments in ED. Try SS 1 and HMD 1.25 and go from there. I think many of us play at sub-90fps regularly, and speaking for myself, I don't mind!
 
My advice would be to skip the Oculus Debug Tool and just make your adjustments in ED. Try SS 1 and HMD 1.25 and go from there. I think many of us play at sub-90fps regularly, and speaking for myself, I don't mind!

Sub-90fps? When my GPU maxes out at 90fps, it immediately drops to 45fps, until such time as it can manage 90fps again.


When I've jumped between 90fps and 45fps, TBH I haven't noticed the difference...
 