Oculus Rifts Running Maxed Settings, please add your hardware here

What I'm trying to say is: it is very hard to know what comes from what.
I believe judder (from tracking) and fps are only connected in the sense that a system (CPU & GPU) that doesn't reach high framerates is very likely to have tracking problems too, and vice versa. If you restrict Elite to one core you will see a massive drop in both framerate and tracking, which I guess is because Elite doesn't run solely on the GPU. Most quality settings, however, are render related, and rendering is done almost completely on the GPU.
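
If anyone wants to reproduce that one-core test themselves, here's a minimal sketch using Python's psutil. The process name is an assumption on my part, so check Task Manager for the real one:

import psutil

# Pin Elite's process to a single core (core 0) to see the framerate
# and tracking impact. "EliteDangerous32.exe" is a guess at the name.
for proc in psutil.process_iter(['name']):
    if proc.info['name'] == 'EliteDangerous32.exe':
        proc.cpu_affinity([0])  # restrict to core 0 only
        print('Pinned PID', proc.pid, 'to core 0')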
There is also a problem with the terms stutter, judder, microjudder, dropouts and so on.
For example: I can have a very constant low framerate, which of course makes single frames visible when I move my head fast. But without tracking dropouts (which are usually irregular, or happen just every 3rd frame or so; just an example) the whole experience can feel much smoother to some people than high framerates with constant tracking stutter (meaning the camera rotation is not updated fast enough).
It all depends on individual perception, or to put it like this: what's bugging you most.
I am of course no expert. I'm just trying to pin down where the "flaws" come from and what you can do about them. Still, collecting as much info as possible is a good idea, although it may be outdated very soon ;)

My advice to anyone with Judder

Check your FPS. If it's less than 75, that is the cause of the judder, and you should work to fix that first and foremost.
If it's stuck at 60fps and never goes any higher, you have the Vsync issue: turn Vsync off, unplug your 60Hz monitor, or lower the monitor's resolution until you can set the refresh rate to 75Hz (e.g. I run my 1440p 60Hz monitor at 1280x1024, which lets me run it at 75Hz rather than 60Hz).

If the FPS can reach 75 (or is capped at 75 by Vsync) but drops under 75 a fair bit, the issue is performance related. You can lower your settings, or monitor your GPU and CPU, work out which is the bottleneck, and either overclock or upgrade.
Core parking can come into play here; turn it off to be sure (see the sketch below). If you're on Win 7, make sure you disable Aero.
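
One way to disable core parking from a script; this just drives Windows' own powercfg tool (CPMINCORES is the documented alias for the "core parking min cores" setting; run it elevated, and at your own risk):

import subprocess

# Keep 100% of cores unparked on the AC power plan, then re-apply the plan.
for args in (
    ["powercfg", "-setacvalueindex", "SCHEME_CURRENT", "SUB_PROCESSOR", "CPMINCORES", "100"],
    ["powercfg", "-setactive", "SCHEME_CURRENT"],
):
    subprocess.run(args, check=True)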

If it's stuck at 37.5fps (exactly half of 75), the issue is that you are running in Direct mode. Change to Extended.

If the FPS is always north of 75 and you still get judder, it could be camera related. If you block the camera, does it work better? If so, try plugging the camera into a USB 2.0 (black) socket rather than 3.0 (blue). Plug the power cable into the camera. Sit further back from (or closer to) the camera and see if that fixes the issue. Try the desk demo: does that work OK? If so, go back and monitor your FPS; it's probably that, but you didn't check it well enough.

If all else fails, it's probably some kind of software issue, and those are hard to debug. I'd recommend a backup and a format/reinstall of Windows (harsh, but it's the only way to be sure).

You want 75fps in VR games; anything else is subpar. You might be used to less than 75fps and have come to terms with it, but it's still subpar. If you are getting 72fps it's not too bad, as you still get low persistence. Anything south of 72fps is subpar: it causes judder, and you get smearing if low persistence is turned off. Anyone interested in VR should be attempting to hit 75fps in their VR games. This is heavily documented. The numbers below show why.
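
To put rough numbers on it (simple arithmetic, nothing game specific):

# Back-of-envelope frame timing for a 75Hz DK2.
REFRESH_HZ = 75
budget_ms = 1000.0 / REFRESH_HZ  # ~13.33 ms to render each frame
print(f"frame budget: {budget_ms:.2f} ms")

# With Vsync on, a frame that misses the budget waits for the next refresh,
# so the old frame is shown twice -- that's the judder you feel. Miss every
# frame and you land on exactly half the rate: the 37.5fps of Direct mode.
print(f"every frame missed: {REFRESH_HZ / 2:.1f} fps effective")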
 
I completely agree. I didn't want to say one shouldn't aim for 75 fps. But if your low framerate is caused by Elite, you may as well wait for it to get optimized instead of reinstalling Windows ;)
 
Edit @ granite: if you can help it at all, IMO it is better to lower any setting you need to in VR but NOT the Oculus Quality slider.

Any reason not to lower the Oculus Quality slider? This is how Frontier got the game running so well on the DK2 at EGX; they had lowered the slider by a few notches...
 
I have an overclocked (4.5GHz) 5820 and an overclocked GTX 980 (1.5GHz core boost) and hit 75fps everywhere with everything at max (except blur off) at 1080 settings.

However, I still get micro-stutters fairly frequently, and looking at the log there is no strain on the GPU at those times... it'll be pootling along quite happily at 30-40% (90% in stations) and then drop a few frames or whatever. It doesn't do this in any other games/demos, so I'm putting it down to ED. :)
 
everything at max (except blur off)

When the 'Blur' option first appeared in the settings there was some speculation that it was some kind of motion blur setting. I also turned it off since it seemed to be affecting the frame rate.

Since Gamma I've been messing with the settings some more and discovered that turning 'Blur' on has no noticeable impact on frame rate on my setup. The setting seems to do only one thing: it lets things like the station menus show a hint of the background through them. With it turned off, the station menus have a plain background. I found I quite like the effect, so I'm running with 'Blur' on now. It doesn't seem to affect the ship UI menus.

If you haven't tried turning it back on, give it a go; you might like the effect =)
 
Any reason not to lower the Oculus Quality slider? This is how Frontier got the game running so well on the DK2 at EGX; they had lowered the slider by a few notches...

Is that right? For me personally, I found it reduced my image quality, way more so than lowering detail levels. Maybe I am wrong and was deluding myself, but I felt it looked like it was rendering at a lower resolution and then upscaling (kind of the opposite of supersampling).

If you know better, however, and I am wrong, then maybe I should go back and reinvestigate this setting. I am always happy to be proven wrong, especially if it means better performance for little downside!
 
Is that right? For me personally, I found it reduced my image quality, way more so than lowering detail levels. Maybe I am wrong and was deluding myself, but I felt it looked like it was rendering at a lower resolution and then upscaling (kind of the opposite of supersampling).

If you know better, however, and I am wrong, then maybe I should go back and reinvestigate this setting. I am always happy to be proven wrong, especially if it means better performance for little downside!

No, I think you are right; it does certainly seem to degrade the image resolution. But I can tolerate 2 or 3 notches down from the top, and this allows me to keep everything else on max... and retain 75fps. I'm quite happy with that... :)
 
I have an overclocked (4.5GHz) 5820 and an overclocked GTX 980 (1.5GHz core boost) and hit 75fps everywhere with everything at max (except blur off) at 1080 settings.

How readable is the text? I have tried both 1080 and supersampling to 1440 with a GTX 970. Both feel like I'm underwater wearing a dirty diving mask.
Still way more fun than a monitor, mind :)
 
I can't run it maxed out and achieve 75fps all the time. The best I can achieve is medium-high (I'll edit this later with specifics, once I'm on my gaming rig), and even then I get some FPS drops in the high-detail (luxury) stations. I run at 2560 x 1440 downsampled, extended mode.

i7 4790K (4GHz, 4.4GHz boosted)
ROG Maximus VII Ranger motherboard
32GB Kingston HyperX 1600MHz RAM
120GB Samsung Evo 840 SSD OS drive (optimised and in "Rapid" mode)
1TB Samsung Evo 840 SSD media drive
2 x GTX 970 (1050MHz) SLI

If I run at standard 1920 x 1080 I don't see any FPS drops (stuttering is a different issue and game-based as far as I'm concerned). I can run on High settings (everywhere I've been, anyway) if I don't downsample.
 
Maybe the thread should be renamed to "running maxed out in a station". I have a laptop, and even that can run maxed out in open space.
 
I really meant running maxed out without any frame drops anywhere. I was under the impression that people were achieving this and wanted to know on what hardware, because the rig I posted in the original post is unable to do that. My rig at home, which has a slower processor but a faster GPU, also cannot do it. We also have a high-end machine here with two Titans, and it cannot run at full with supersampling either. There is always some stuttering in the bases that I can't seem to get around.
I'm not sure what causes the issue, but I wanted to make sure there wasn't something on my end, like everyone running maxed out being on Win8 or something like that. I think ED needs some in-game optimizations. I have a feeling they are drawing too much to screen (i.e. they are drawing the entire station even though you are inside the docking area and can't see it all). In our engine (Frostbite), we use occluder volumes to help the engine determine what to draw depending on where the player is looking. Hopefully they have a similar system. We also have driver issues for sure. I'm really looking forward to seeing some more work done by Nvidia and ATI to get things up to speed.
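
For anyone unfamiliar with the idea, here's a deliberately tiny 2-D sketch of what occluder volumes do (purely illustrative, and nothing like Frostbite's real interface): an object can be skipped when it is farther away than an occluder and sits entirely inside the angular range the occluder covers as seen from the camera.

import math

def angles(cam, seg):
    # Angle interval a 2-D wall segment subtends from the camera
    # (naive: ignores wrap-around at +/- pi).
    a = [math.atan2(p[1] - cam[1], p[0] - cam[0]) for p in seg]
    return min(a), max(a)

def dist(cam, p):
    return math.hypot(p[0] - cam[0], p[1] - cam[1])

def occluded(cam, occluder, target):
    o_lo, o_hi = angles(cam, occluder)
    t_lo, t_hi = angles(cam, target)
    farther = min(dist(cam, p) for p in target) > max(dist(cam, p) for p in occluder)
    return farther and o_lo <= t_lo and t_hi <= o_hi

cam = (0.0, 0.0)
wall = ((4.0, -3.0), (4.0, 3.0))   # occluder in front of the camera
crate = ((8.0, -1.0), (8.0, 1.0))  # sits behind the wall
print(occluded(cam, wall, crate))  # True -> the engine can skip drawing it

Real engines do this in 3-D with rasterized depth or portal tests, but the payoff is the same: geometry you provably can't see never reaches the GPU.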
 
I really meant running maxed out without any frame drops anywhere. I was under the impression that people were achieving this and wanted to know on what hardware, because the rig I posted in the original post is unable to do that. My rig at home, which has a slower processor but a faster GPU, also cannot do it. We also have a high-end machine here with two Titans, and it cannot run at full with supersampling either. There is always some stuttering in the bases that I can't seem to get around.
I'm not sure what causes the issue, but I wanted to make sure there wasn't something on my end, like everyone running maxed out being on Win8 or something like that. I think ED needs some in-game optimizations. I have a feeling they are drawing too much to screen (i.e. they are drawing the entire station even though you are inside the docking area and can't see it all). In our engine (Frostbite), we use occluder volumes to help the engine determine what to draw depending on where the player is looking. Hopefully they have a similar system. We also have driver issues for sure. I'm really looking forward to seeing some more work done by Nvidia and ATI to get things up to speed.

You're looking at a high-end SLI/Crossfire machine to be able to do 75fps at maxed-out settings everywhere at 1080p. Not sure it could handle 1440p in certain situations like stations, rings or heavy battles. Maybe three-way SLI/Crossfire might do the job.
Even then you may still see the odd drop below 75fps.
The reports of people getting 75fps at maxed-out settings everywhere (even stations) at 1440p at all times are mainly exaggerations.
I think many people say max settings but turn some stuff down, don't check their fps at all and gauge it by feel, or if they do check, it's in deep space only.
I suspect the issue really is performance related in the game, as you get massive frame drops in stations on a normal monitor. They were worse, performance-wise, in premium beta, but they did improve a little bit. I had hoped they'd sort it by release; not holding my breath.

I believe you might be right about them drawing the whole station; this was the explanation the devs gave previously for why performance in stations used to be really bad. Not sure if they fixed it properly, though.
 
I have everything on the highest settings except for AA, which I have on SMAA because I can't see a difference, and Blur, which is off because it's pointless in VR. I am running in 1080p at 75fps at all times, with no frame drops at all even in busy stations.

I'm using an i7 3770K OC'd to 4.4GHz, 16GB RAM, GTX 980, installed on an SSD.
 
I have everything on the highest settings except for AA, which I have on SMAA because I can't see a difference, and Blur, which is off because it's pointless in VR. I am running in 1080p at 75fps at all times, with no frame drops at all even in busy stations.

I'm using an i7 3770K OC'd to 4.4GHz, 16GB RAM, GTX 980, installed on an SSD.

Thanks for the info; I'll start shopping for a rig similar to yours. Part of me thinks I should wait till the CV1 is released; by then the whole GPU market will have changed. I'm tempted to get a 980, but I'm not sure how big the leap will actually be compared to an overclocked 780 Ti.
 
I have everything on the highest settings except for AA, which I have on SMAA because I can't see a difference, and Blur, which is off because it's pointless in VR. I am running in 1080p at 75fps at all times, with no frame drops at all even in busy stations.

I'm using an i7 3770K OC'd to 4.4GHz, 16GB RAM, GTX 980, installed on an SSD.

Astounding that your single 980 can do what people's SLI 970s and 980s can't... Hold onto it, it's probably gonna be future-proofed until 2020, that beast :rolleyes:
 
Astounding that your single 980 can do what people's SLI 970s and 980s can't... Hold onto it, it's probably gonna be future-proofed until 2020, that beast :rolleyes:

Eh? My 980 is doing pretty much the same as freehotdawgs' too. The "no frame drops" part is different, but it's not down to horsepower, as the GPU is never struggling for breath; it's something to do with the game itself. I can be sitting in a scene where the fps is 75 (of course) and the GPU is at 30%, and I still drop a few frames every now and again for some reason. That doesn't happen in other games and demos. Also, I get 75fps in stations and the GPU tops out at about 90%.
 
Supposedly SLI setups just plain perform worse in VR than a single card. If someone is running an SLI setup, I'd definitely recommend running in single card mode.

BTW, I do get some stuttering for a second occasionally during SC, but that has to do with bugs, netcode, or whatever because that happens no matter what, even if I play on a monitor. I'm sure that will be ironed out in a patch at some point.
 
My rig is currently under repair due to a leaking water cooler, so I haven't been able to use the latest builds of ED or the new SDK. I'm running a Devil's Canyon processor at 4.6GHz, 16GB RAM, and 2x 970s in extended mode. I get 75fps on medium settings running at 1440p, but I still get judder in supercruise near planets and in busy stations.


Supposedly SLI setups just plain perform worse in VR than a single card. If someone is running an SLI setup, I'd definitely recommend running in single card mode.

BTW, I do get some stuttering for a second occasionally during SC, but that has to do with bugs, netcode, or whatever because that happens no matter what, even if I play on a monitor. I'm sure that will be ironed out in a patch at some point.

I get much better performance running in SLI than when forcing it to use a single card; both cards run at about 80% load. That is in extended mode, though. Direct mode flashes horribly in SLI.
 
My setup: Windows 8.1, unlocked i5 4690K watercooled (as are both graphics cards) and running at stock speed, 3.5GHz with 3.9GHz in turbo mode (haven't needed to overclock this CPU at all for any game so far), 2x EVGA GTX 970 in SLI with 4GB graphics memory on each card, 16GB RAM, Samsung EVO 500GB SSD, Rift DK2.

On my DK2 I run 1920x1080 (I was running at 4K using DSR before Gamma; now if I enable DSR at 4K with SLI I get bad frame stuttering), everything maxed out and SMAA on (Blur off, Bloom off). Silky smooth everywhere.
 
On my DK2 I run 1920x1080 (I was running at 4K using DSR before Gamma; now if I enable DSR at 4K with SLI I get bad frame stuttering), everything maxed out and SMAA on (Blur off, Bloom off). Silky smooth everywhere.

This is probably VRAM related, given you are running 970s with only 4GB of memory per card (and SLI doesn't add capacity, since each card keeps a full copy of everything).
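
Rough numbers, for a sense of scale (my assumptions, not measured from the game):

# Back-of-envelope VRAM use at 4K DSR, assuming 32-bit colour targets.
w, h = 3840, 2160
one_target_mb = w * h * 4 / 2**20
print(f"one 4K render target: ~{one_target_mb:.0f} MB")  # ~32 MB
print(f"8 targets (G-buffer, depth, HDR, post): ~{8 * one_target_mb:.0f} MB")

The render targets alone come to a few hundred MB; add high-res textures and everything else, and a 4GB card can start swapping, which shows up as exactly that kind of stuttering.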

I'm running at 4K (3840 x 2160) using downsampling with Ultra settings; the framerate on average is 67fps.

RIG:
i7 4960X Extreme Edition
MSI Big Bang XPower II
2x EVGA Titan Z
Silverstone Evo 1200W
 