Rift Judder and FPS discussion thread.

Heavy judder in RES sites - yes or no


  • Total voters: 35
  • Poll closed.

Viajero

Volunteer Moderator
Just saw this dedicated VR subforum!

I posted this elsewhere but probably better here:

After these last couple (or three) of patches yesterday I am still getting huge FPS drops in stations - down to 10-15 from my vsynced 75 in normal space. Is this a known issue, or is it just me?

These FPS drops only seem to happen in OPEN PLAY, and in systems where there are several other CMDRs.

In SOLO mode, or in remote systems (i.e. with no CMDRs around) while in OPEN PLAY, these FPS drops in stations are gone.

From this I suspect the FPS drops are netcode related more than hardware related but I am a bit lost.

Any ideas?

i7 @ 3.1 GHz, GTX 680, DK2 4.3 runtime in extended mode, with all graphics settings on low or off.
 
Yes, I have exactly the same. It's netcode, and as you say it's very obvious, since the drops are only present if you play in multiplayer. I think it has something to do with landing requests in crowded stations, since I see almost all the landing pads blinking on and off.
 
This problem almost lost me a Hauler this evening. It's horrendous - I could barely control my ship and was crashing into the sides due to laggy controls and graphics.

I normally play in Solo mode - but I thought I'd just give it another chance and it failed miserably.

Back to Solo.
 
Just updated and dusted off my X-55 to play the latest content. Sadly, I too am having this problem. I will try playing in solo, I suppose.
 

Javert

Volunteer Moderator
75 FPS - Why?

Hi, as a newcomer to VR and prospective OR owner, and also someone who used to be an avid gamer many years back, but took a long break until recently, I'm a bit confused by the discussion on minimum frame rates.

I've been told that you need at least 75FPS to run OR. However, I remember in the old days we were pretty happy if we got 30FPS or more, and movies are shot at 24FPS I believe. I seem to recall reading somewhere that the human eye can't discern the difference once the frame rate gets above about 50 or so, but obviously that cannot be true given what's being said here.

Therefore I'm wondering:
- Why does this VR technology need much higher frame rates than just looking on a monitor?
- What happens if the frame rate is not that high, e.g. what if it's 65, or 45, or whatever - does this just reduce the immersion or destroy it completely?
- This may have been discussed before in the massive DK2 threads but I didn't find it so far, or there may be a good article out there on this - if someone has a link that would be really helpful.
- Is this the reason why everyone keeps saying that OR requires a very high-end graphics card? I know this sounds flippant, but to me it sounds like it's just an HD monitor stuck in your face with head tracking built in, so I'm not really clear why it would require so much more grunt than a classic HD monitor with TrackIR, for example.

TIA
Patrick
 

I will try my best, as a fresh OR user (it's amazing!):

- Why does this VR technology need much higher frame rates than just looking on a monitor?
You need much smoother motion to avoid motion sickness and other visual problems - I think the CV has been told to go as high as 90. The thing with the Oculus is that it's nothing like a normal monitor; the usual rules don't apply. It takes up your entire field of vision and needs to be butter smooth, or your brain will get confused. The usual 24/30fps is old stuff - there is a huge difference between 30 and 60, noticeable to almost everybody - higher frame rates can still be perceived, and that thing about the human eye not being able to see more fps is just an urban legend.

- What happens if the frame rate is not that high, e.g. what if it's 65, or 45, or whatever - does this just reduce the immersion or destroy it completely?
You get judder, which causes motion sickness and badly breaks the immersion. When you move your head, the image needs to follow smoothly without any noticeable delay; it's very important.

- This may have been discussed before in the massive DK2 threads but I didn't find it so far, or there may be a good article out there on this - if someone has a link that would be really helpful.
Don't have anything to hand, sorry!

- Is this the reason why everyone keeps saying that OR requires a very high-end graphics card? I know this sounds flippant, but to me it sounds like it's just an HD monitor stuck in your face with head tracking built in, so I'm not really clear why it would require so much more grunt than a classic HD monitor with TrackIR, for example.
It's not just 1 screen - it's the equivalent of 2 screens showing slightly different images. That means rendering the same scene twice - bear in mind, not like having two monitors, but like running the game twice (sort of) - and this taxes the graphics card very heavily even at low resolutions, and the minimum FPS requirement is a killer. Trust me, it's nothing like an HD monitor; once you try it you will realise that all these years we have been playing fake 3D games. The Oculus is true 3D where you finally have a perception of depth - old-school 2D monitors feel like toys in comparison.
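The "running the game twice (sort of)" point can be sketched as a toy stereo render loop: one full scene pass per eye, with the camera offset by half the interpupillary distance. Everything here is illustrative - the function names and the IPD figure are assumptions, not a real engine or SDK API:

```python
# Toy sketch of stereo rendering: the whole scene is processed once per eye,
# with the camera shifted sideways by half the interpupillary distance (IPD).
IPD_M = 0.064  # a typical IPD in metres (assumed figure)

def render_stereo_frame(render_view, head_position):
    """Run one full render pass per eye and return both images."""
    frames = {}
    for eye, offset in (("left", -IPD_M / 2), ("right", +IPD_M / 2)):
        eye_position = (head_position[0] + offset,
                        head_position[1],
                        head_position[2])
        frames[eye] = render_view(eye_position)  # full scene pass per eye
    return frames
```

The slight horizontal offset between the two passes is what produces the depth perception; the cost is that (in the worst case) every frame does the scene work twice.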
 
The 24fps limit in movies is from a time when the material that movies were shot on was still rather expensive. And then they stuck with it for compatibility reasons.

The reason movies don't seem to stutter is motion blur. Every camera, digital or not, uses a shutter that stays open for a moment to record each picture. Objects in motion keep moving during this time, which results in the aforementioned motion blur. We notice stuttering when there are enough details in an image for our eyes to "lock on", when the contrast between the moving object and its background is high enough, and when the frame rate is just too low to make up for all of these deficiencies. Motion blur masks the details and evens out the contrast a bit.

Games on the other hand aren't recorded by a camera. Every frame of a game is generated, put together artificially. Every frame of a game is a perfect still. There is no motion blur that would help to mask details, unless you activate some motion blur or movie-like filter in a game's settings. So for a game to appear as fluid as movies, you'd need a higher frame rate.

Games also let you control what's going on in them. At 30FPS, the inputs you make are sampled every 33.333ms; at 60FPS, every 16.667ms; and at 120FPS, every 8.333ms. Shorter delays between samples allow your actions to be executed closer to when you physically push the buttons.
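The frame-interval arithmetic above is easy to check with a few lines of Python:

```python
# Frame interval (ms) at common frame rates. With input sampled once per
# frame, this is also the worst-case delay between input samples.
for fps in (30, 60, 75, 90, 120):
    interval_ms = 1000.0 / fps
    print(f"{fps:>3} FPS -> one frame (and one input sample) every {interval_ms:.3f} ms")
```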
 

Javert

Volunteer Moderator
I think the CV has been told to go as high as 90

When you say they have been "told" to go as high as 90, are you saying there is some rule that they have to follow for the product to be safe?

All - thanks for your replies. I guess what I'm wondering also is whether 75 (or 90) is the magic number for everyone, or whether it can vary person by person, and some people might think it's perfect at 60 whilst others might still have problems at 90?
 
No, there is no magic number; it's just playing the percentages.

Some people can "cope" with 60fps without feeling sick... I think I remember reading that 75fps is the number where something like 85% of people are OK with the tracking, and 90fps is OK for well over 90% of people.

IIRC, Palmer Luckey said once you hit 120fps you are essentially "there", at the point of diminishing returns, making it moot to go any higher...

It is just the point at which the tracking feels "natural" for people in VR (and it is related to nausea as well, though not 100% related - there are many reasons why you may feel sick in VR).

Under 60fps, however, is a killer for VR and is likely to put people off it for life. Bad VR is worse than no VR at all, IMO.
 
When you say they have been "told" to go as high as 90, are you saying there is some rule that they have to follow for the product to be safe?

All - thanks for your replies. I guess what I'm wondering also is whether 75 (or 90) is the magic number for everyone, or whether it can vary person by person, and some people might think it's perfect at 60 whilst others might still have problems at 90?

Kinda, not quite.
The higher the FPS/refresh rate, the more likely you are to get presence in VR. This is basically when VR tricks your brain into thinking that what you are seeing is real: you feel like you are actually there. This varies per person, so someone might feel presence at 75fps but is much more likely to feel it at 90 (or 120, or 144, etc.). Once you experience presence you'll want to keep experiencing it, as it's mind-blowing. Most people only experience it fleetingly at the moment at 75fps/Hz.

Here's a good video on the subject of presence -
https://www.youtube.com/watch?v=G-2dQoeqVVo

Currently, the reason not to go below 75fps isn't to do with presence but with low persistence. This is basically strobing the screen so that you only ever see fresh data, rather than a mix of fresh and stale data like you do at lower fps/refresh rates (smearing, basically).

Here's an article from Abrash again, about judder and low persistence: http://blogs.valvesoftware.com/abrash/why-virtual-isnt-real-to-your-brain-judder/
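Some illustrative duty-cycle arithmetic shows why low persistence and refresh rate interact; the ~2 ms persistence figure below is an assumed ballpark for the DK2's low-persistence mode, not an official spec:

```python
# Rough duty-cycle arithmetic for a low-persistence (strobed) display.
PERSISTENCE_MS = 2.0  # assumed ballpark figure, not an official number

for hz in (60, 75, 90):
    period_ms = 1000.0 / hz            # time between refreshes
    duty = PERSISTENCE_MS / period_ms  # fraction of each period the panel is lit
    print(f"{hz} Hz: {period_ms:.1f} ms per refresh, panel lit {duty:.0%} of the time")
```

Under these assumptions the panel is dark the vast majority of each refresh period; the longer the dark gap between strobes (i.e. the lower the refresh rate), the more likely the strobing becomes visible flicker.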
 
The real answer to this is simple: a display has a refresh rate in Hz, and to get the smoothest possible experience you need to match the rendered fps to the exact Hz of that screen. The DK2 has a 75 Hz screen, and if you push anything less, or don't cap at that frame rate, you get tearing, judder and other undesirable effects.

This is something people with regular monitors should know, and it is not something limited to the DK2 or VR.
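The "cap at the screen's refresh rate" advice amounts to a frame limiter. A toy sketch of the idea (illustrative only; a real game would rely on vsync or the headset runtime's frame timing rather than sleeping):

```python
import time

def run_frame_limited(render_frame, hz=75.0, frames=3):
    """Toy frame limiter: after rendering, sleep out the rest of the 1/hz slot
    so frames are never submitted faster than the display can show them."""
    period = 1.0 / hz
    for _ in range(frames):
        start = time.perf_counter()
        render_frame()
        elapsed = time.perf_counter() - start
        if elapsed < period:
            time.sleep(period - elapsed)
```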
 
When you say they have been "told" to go as high as 90, are you saying there is some rule that they have to follow for the product to be safe?

All - thanks for your replies. I guess what I'm wondering also is whether 75 (or 90) is the magic number for everyone, or whether it can vary person by person, and some people might think it's perfect at 60 whilst others might still have problems at 90?

As an ED player with about 60 hours in the game with the DK2, I can say that when I have a solid 75fps, it's like heaven. You really feel the presence, like you are there. I get that out in space and outside stations. However, it drops when cruising by planets and inside stations. I think Oculus demanding 90fps in the future is to further avoid motion sickness for as many people as possible. But obviously it's going to be extremely taxing on computers with modern games. Hopefully they can solve it with a perfect CrossFire or SLI solution, but we're not there yet. Still, the DK2 in ED is bliss when it's all working at 75fps. It still has low-res flaws and strains your eyes somewhat, but man, it really is a great experience.

A few days ago I spent about 6 hours straight in the DK2 hauling cargo all around. And I didn't feel ill for one second. Just came out with a smile, and wanted more. That's how good it can be :)

The biggest problem I have so far is the galaxy map, and having to type in the name of a star I want to go to. I wish they had some virtual keyboard inside the game so I wouldn't have to lift up the DK2 to type.
 
The real answer to this is simple: a display has a refresh rate in Hz, and to get the smoothest possible experience you need to match the rendered fps to the exact Hz of that screen. The DK2 has a 75 Hz screen, and if you push anything less, or don't cap at that frame rate, you get tearing, judder and other undesirable effects.

This is something people with regular monitors should know, and it is not something limited to the DK2 or VR.

The DK2 is actually capable of doing a bunch of variable refresh rates, including 60Hz. That's why when you lose low persistence (when you go below 75 FPS), you usually just drop down to 60 FPS, which is still relatively smooth, but results in a lot of smearing (low persistence off) and judder (render timing issues).

There's a bunch of reasons for higher frame rates in VR. Right now, the biggest one is low persistence: you need at least 75 FPS for low persistence to look good for most users. Aside from that, higher frame rates also simply make things look smoother and reduce latency. Having a little headroom for the occasional dropped frame is also a nice thing.
 
- Is this the reason why everyone keeps saying that OR requires a very high-end graphics card? I know this sounds flippant, but to me it sounds like it's just an HD monitor stuck in your face with head tracking built in, so I'm not really clear why it would require so much more grunt than a classic HD monitor with TrackIR, for example.
It really comes down to simple math. Let's say you've been gaming at FullHD until now, playing your favourite game on Ultra settings at ~30 fps. This is currently possible with a decent mid-range GPU, so why is the Rift so much different? I'll try to explain:
  1. The Rift screen "only" has FullHD resolution, but the internal render target of the Rift is actually 2364 x 1461 pixels, due to the lens warping. So basically every Rift title needs to render roughly two-thirds more pixels (~67%) to get the most out of that FullHD screen. Those 30 fps are now already down to ~18 fps (naively assuming linear performance scaling).
  2. You need two different perspectives, one for each eye. Some parts of the scene might not need to be computed twice, but most do, which almost doubles the GPU workload per frame (this is not a fixed number; some games might see a less dramatic drop, and game engines will likely optimize for this over time, but right now they're pretty inefficient for SBS 3D, so let's go with the worst case). That turns our ~18 fps into ~9 fps, barely 12% of our desired framerate of 75.
  3. For reasons explained by others above, you actually need to at least match the screen refresh rate to have a good VR experience. The DK2 runs at 75 Hz, and the consumer version will most likely be 90 Hz AND higher resolution. Taking the above figures, we can conclude that in order to run that same game at identical detail settings, you'd need a GPU roughly 8-10x more powerful to have a perfect DK2 experience, and even more for the consumer version. Ouch!
For now all we can do is drop detail levels and find a good compromise between visual quality and performance. I know it's hard; I've always been an eye-candy-over-framerate kind of gamer myself. But in the Rift this is an entirely different scenario, and framerate is king.
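Redoing the arithmetic in Python as a sanity check (2364 x 1461 versus 1920 x 1080 works out to about 67% more pixels; the stereo doubling is the worst case described above):

```python
SCREEN_PX = 1920 * 1080   # DK2 panel resolution, both eyes combined
TARGET_PX = 2364 * 1461   # internal render target, both eyes combined

extra = TARGET_PX / SCREEN_PX - 1
print(f"extra pixels for the render target: {extra:.0%}")       # ~67%

base_fps = 30.0                                            # mid-range GPU, FullHD Ultra
fps_after_resolution = base_fps * SCREEN_PX / TARGET_PX    # ~18 fps (naive linear scaling)
fps_after_stereo = fps_after_resolution / 2                # ~9 fps (worst-case stereo doubling)
gpu_needed = 75.0 / fps_after_stereo                       # multiplier to reach a locked 75 fps
print(f"{fps_after_stereo:.0f} fps -> need a GPU ~{gpu_needed:.1f}x more powerful")
```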

There is some help on the horizon that might make this a bit less dramatic. Asynchronous timewarp (not to be confused with normal timewarp, which is already in the SDK) is a feature Oculus is working on, but it isn't ready on PC yet (only Gear VR). What it does is basically create intermediate frames based on the last rendered frame, a depth buffer, and the new tracking information. It is supposed to be quite effective in optimal circumstances, and it makes a couple of dropped frames much less destructive to the experience than they currently are. Hopefully we will get to see this feature added to the runtime in the coming months.
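The core idea behind timewarp can be sketched in one dimension: re-present the last frame, shifted to match how the head has rotated since it was rendered. This is a toy illustration only - real timewarp is a per-pixel GPU reprojection, and the field-of-view figure here is an assumption:

```python
FOV_DEG = 100.0      # assumed horizontal field of view
EYE_WIDTH_PX = 1182  # per-eye render target width on the DK2

def timewarp_shift_px(yaw_at_render_deg, yaw_now_deg):
    """How far (in pixels) to shift the old frame so it lines up
    with where the head is pointing now."""
    px_per_deg = EYE_WIDTH_PX / FOV_DEG
    return (yaw_now_deg - yaw_at_render_deg) * px_per_deg

# A 1-degree head turn since the last rendered frame ~ a 12 px shift:
print(timewarp_shift_px(0.0, 1.0))
```

The shifted frame is only an approximation (it has no new information about the scene), but for small head movements it keeps tracking feeling responsive across a dropped frame.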

The real answer to this is simple: a display has a refresh rate in Hz, and to get the smoothest possible experience you need to match the rendered fps to the exact Hz of that screen. The DK2 has a 75 Hz screen, and if you push anything less, or don't cap at that frame rate, you get tearing, judder and other undesirable effects.

This is something people with regular monitors should know, and it is not something limited to the DK2 or VR.
Thing is, those problems are easy to ignore on regular monitors. I used to have no problem playing a slow-paced 3D game (like a third-person RPG) at 20-25 fps if it meant getting all the eye candy. Even in single-player FPS games I'd often opt for looks over framerate and was happy to run at ~30 fps. In the DK2, anything below 75 FPS is just not tolerable; this is hard to explain until you try it out yourself.
 
If you were to use a frame rate lower than 75fps with low persistence (basically strobing the screen at 75 Hz), it would result in visible flicker.
The idea of low persistence is to reduce motion blur by showing each frame for only a few milliseconds, right after it is rendered. After that, the image would be incorrect until the next frame is rendered.
This technique needs a fast OLED display and high frame rates. The rumoured 90 Hz refresh rate of the CV1 mostly serves the purpose of lowering latency further.
 
IIRC, Palmer Luckey said once you hit 120fps you are essentially "there", at the point of diminishing returns, making it moot to go any higher...
Under 60fps, however, is a killer for VR and is likely to put people off it for life. Bad VR is worse than no VR at all, IMO.

The other major benefit of 120fps is that it's a multiple of the frame-rate standards we use the most (ignoring UK TV standards, as everyone seems to): 24, 30 and 60, so there's no inherent stutter from frame-rate conversion.
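The divisibility point is easy to verify: 120 is the least common multiple of 24, 30 and 60, so each source frame occupies a whole number of display refreshes (no 3:2-pulldown-style judder). A quick check:

```python
from math import lcm  # multi-argument lcm requires Python 3.9+

rates = (24, 30, 60)
print("LCM of common content rates:", lcm(*rates))  # 120
for src in rates:
    print(f"{src} fps content on a 120 Hz display: {120 // src} refreshes per frame")
```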
 
So I have a question regarding the fps.

What effect would there be if the FPS drops quickly from 120 to 75?
One assumes if the improvements are perceptible from 75 to 120 then the degradation that comes with a drop in FPS must also be perceptible even if the FPS does not drop below 75?

So is the OR FPS locked or does it go up and down as with a monitor?
Is there a way to lock the FPS also?
 
Having lower than 60 fps makes the game seem like it's swimming... sea-sickness / motion sickness. Not pleasant at all. I came close to puking a few times, and usually feel bad the day after. As for GPU horsepower, the game has two cameras rendering offset images to give you depth perception, which means the game is rendering everything twice.

Taking what the_wretched was saying about a GPU that is 10x more powerful - well, that is true with today's (Nov 05, 2014) software/hardware; however, there are a number of optimizations coming within the year.

- Windows 10 performance: with 5 different desktops running, each with 5 Firefox tabs, Excel documents and Word documents open, if I minimize everything the OS uses 0-1% CPU. With Windows 7, I am at... lemme check... 30% CPU usage.
- DirectX 12: supposed to be super-charged backend optimization. From current reports, it gives 60% better performance/fps at the same power consumption.
- Oculus SDK: the Oculus team is still in the process of optimizing the SDK.
- NVIDIA drivers: there are a ton of VR optimizations in the pipeline. SLI is a big one.

- Elite Dangerous: still in beta. I am 99.976385% sure that there are still a ton of optimizations to be done in the game before actual release.

So, to be honest, I don't think hitting 90fps in games will be that big of a step as it is today.


edit: To answer your question about FPS dropping from 120 to 75: it would be noticeable, but you wouldn't get physically sick from it like you would going from 60fps to 30fps. 60FPS is the absolute bare minimum. Even then, there will be a lot of users getting sick even at 60 fps.

At around 150 fps we can no longer consciously perceive higher frame rates; however, the visual cortex reportedly feeds sensory input to the brain at roughly 1,000 fps. Basically, in my mind (because I obviously cannot test this), going to 120 fps would be a good stopping point until AI and real-time photorealism catch up, due to the uncanny valley. From 120 to 150 you will see diminishing returns. From 150fps to 1,000fps, you're running up the steep cliff known as the uncanny valley.
 
... but the internal render target of the Rift is actually 2364 x 1461 pixels, due to the lens warping. So basically every Rift title needs to render ~67% more pixels to get the most out of that FullHD screen.

Does this imply that to get the best image with the Rift we should be telling the game to render at 2364x1461 (at 75Hz), or is 1920x1080 enough?
 