Why is using VR decreasing performance

Hopefully someone can answer this.
When I play Odyssey on my 1440p monitor I get between 100 and 155 FPS depending on the activity.

Now in VR I use a Rift S. It has one panel, resolution 2560 x 1440.
I use the same settings, so in theory I should be getting the same FPS, meaning I should always be able to hold 80 FPS on the headset.
But that's not the case.
For example, in stations I get 72 in VR, while in normal play it's 115 to 120.
Again, I use the same settings, no supersampling or increasing anything.

Does someone have an answer?
Cheers
 
In flatscreen the game is rendering the scene once and outputting that to your monitor.

In VR the game is rendering the scene twice and outputting both images to your Rift S - the panel in your headset may be the same resolution as your monitor but your computer is having to do a lot more work to put two images on it.
 
Plus there is a distortion factor added to the VR image to account for the headset lenses.
To elaborate: the rendered resolution depends on your settings for the headset. In SteamVR there is a resolution slider; I don't remember how it is set for the Rift S in the Oculus software, it's been too long since I had one. Anyway, the default render resolution of the Rift S is 1648x1776 pixels per eye, I think, so quite a bit more than 1440p. In general, all VR headsets do quite a bit of oversampling, because some distortion is applied to the image to, as DeckerSolo said, account for the lenses and counter the otherwise much lower pixel density at the edges of the image.
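A quick back-of-envelope comparison makes the gap concrete. Note the 1648x1776 per-eye figure is the one quoted in this post, taken as an assumption rather than an official spec:

```python
# Pixels shaded per frame: one 1440p flatscreen frame vs. two eye buffers.
# The 1648x1776 per-eye render target is the figure quoted above.
monitor_pixels = 2560 * 1440          # one 1440p frame
per_eye_pixels = 1648 * 1776          # assumed Rift S render target, per eye
vr_pixels = 2 * per_eye_pixels        # the scene is rendered once per eye

print(monitor_pixels)                 # 3686400
print(vr_pixels)                      # 5853696
print(round(vr_pixels / monitor_pixels, 2))  # 1.59
```

So even at identical graphics settings, VR is pushing roughly 1.6x the pixels of the flatscreen frame, on top of the per-eye CPU work of submitting the scene twice.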
 
It's unfortunate that all that extra resolution, rendered to make the most of the higher effective resolution in the center of the lenses (where they squish pixels together), is utterly wasted on the lower effective resolution in the periphery (where pixels are stretched out), but that's what we get with how frames are traditionally rendered.

There are several ways to save performance by matching rendered resolution to the observed pixel density for different parts of the field of view, but it is sadly rarely done, even in titles made specifically for VR. :(
 
I think you have confused the centre of the lens with the centre of your eye. Yes, the point your fovea is aimed at is the only part that needs the highest resolution, but you can look anywhere within the lens. Unless you have a VR headset with eye tracking, so the headset can tell your software where to render at maximum resolution, you have to render at max everywhere.

I've not been keeping track; I believe some of the latest Pimax headsets might do this now, but not the Rift S.
 

No, both are true.

The lenses in VR headsets cause what goes by the moniker "pillow distortion", which leads to the periphery being significantly more oversampled than the center. Sometimes this goes to the point of causing aliasing, because samples from the source frame get skipped when the software applies the "barrel"-type counter distortion: there are so many source pixels in the periphery to shoehorn into a single output pixel that at some point the texture filter takes a shortcut in order to not become too slow.
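A common way to picture that counter distortion is a radial polynomial mapping; the coefficients below are invented for illustration, since real headsets ship their own calibrated lens profiles:

```python
def barrel(r, k1=0.22, k2=0.24):
    """Map a normalized output-image radius r (0 = lens center, 1 = edge) to
    the radius in the rendered source frame to sample from. The coefficients
    k1 and k2 are made-up illustrative values, not any headset's real profile.
    Because the mapping grows faster than linearly, each output pixel near the
    edge pulls from a wider stretch of source pixels, which is the peripheral
    oversampling (and the skipped-sample aliasing) described above."""
    return r * (1 + k1 * r**2 + k2 * r**4)

for r in (0.25, 0.5, 0.75, 1.0):
    print(f"r={r:.2f} -> source r={barrel(r):.3f}")
```

The stretch between successive output radii keeps widening toward the edge, so the texture filter has ever more source samples per output pixel out there.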

On top of this, more pixels per degree are inherently rendered in the periphery (and wasted on it) because of the viewplane being a flat rectangle -- between 30 and 40 degrees out, you get 1.5 times more pixels per degree than from 0-10; the following ten degrees after that gets twice as many, and the next ten 3.2 times as many, up to infinity at 90 degrees out. Trigonometry does not work for us here... :p
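The flat-viewplane arithmetic above can be checked in a few lines; this just evaluates the geometry described in the post, and the exact ratios depend on where you cut the ten-degree bands:

```python
import math

def plane_length_per_degree(a_deg, b_deg):
    """On a flat viewplane at unit distance, a ray at angle t lands at tan(t),
    so the plane length (and hence pixel count) covered by the band [a, b]
    degrees off-axis is tan(b) - tan(a); divide by the band width to get a
    per-degree density."""
    a, b = math.radians(a_deg), math.radians(b_deg)
    return (math.tan(b) - math.tan(a)) / (b_deg - a_deg)

base = plane_length_per_degree(0, 10)  # density in the central 10 degrees
for start in (30, 40, 50):
    ratio = plane_length_per_degree(start, start + 10) / base
    print(f"{start}-{start + 10} degrees: {ratio:.2f}x the central density")
```

This prints roughly 1.48x, 2.00x, and 3.06x, close to the figures quoted above, and the density blows up as the band approaches 90 degrees because tan diverges there.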

So this, AND what you say: The fovea does indeed have a lot greater cone cell density than the rest of the retina, and it is tiny. Hopefully we'll see foveated rendering as standard one day, but we've been saying that for quite a while now, haven't we? :7
 
Because you have two eyes. The game renders two 1648x1776 images from two slightly different perspectives, because your eyes are a certain distance apart. The fact that the Rift S has a single panel is irrelevant; it's merely a cheap way to avoid using two panels and a mechanical eye-distance adjustment.
 
So what you're saying is that if I were a Cyclops I wouldn't be having performance issues. Got it. Time to make some adjustments.
 
Slightly relevant to the above discussion ...

Virtual Desktop recently added FOV Tangent adjustment to their app, which you can also find in the Oculus Debug Tool for Link.
In both cases, you can reduce the size of the rendered image at the periphery and therefore reduce the number of rendered pixels to only those you can actually see.
For my Quest 2 on USB Link, via ODT I was able to run at around 90h and 87v, allowing me to jump from 1.05 supersampling to 1.10 with no performance loss.
For my Quest 3 via Virtual Desktop I can do 87h and 83v before I see black borders. For Elite Dangerous, that means SteamVR's render resolution can be increased from 104% to 110% without losing any performance.
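As a rough sketch of where that headroom comes from, using the Quest 3 multipliers quoted above: the tangent multiplier scales each render-target dimension roughly linearly, so the pixel saving is just the product of the two:

```python
# FOV tangent multipliers from the post above (87% horizontal, 83% vertical).
h_mult, v_mult = 0.87, 0.83

rendered_fraction = h_mult * v_mult   # fraction of the full frame still shaded
print(f"{rendered_fraction:.1%}")     # 72.2%
```

That's roughly 28% fewer pixels to shade per eye, which is the budget being reinvested into a higher supersampling setting.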

I know very little about the Rift S, but I assume it uses the Meta software and can be adjusted by the Oculus Debug Tool? Seems like a better adjustment than an eye patch or knitting needle or whatever ... :D
 

Apologies to the OP for all the straying-from-topic stuff, but...

So this adjustment really happens on the render side? Not "just" as a trick to optimise, after rendering, the amount of data that needs to be transferred over Wi-Fi/USB?

If so: how does it make the game render differently? There are things, such as VRPerfKit and OpenXR Toolkit, that "hack" their way into the rendering pipeline of a game and activate VRS for select shaders -- is that what is going on?
 
As far as I am aware, adjusting the FOV tangent reduces the rendered area for the headset, so the area you can't see beyond the edges is not rendered at all. I don't believe it is a "trick" that just renders black pixels at the edges or something like that. The video below does a pretty good job of showing how to set it up and the end results, though how he can go as low as 65% vertical I have no idea. I can just see the black border at the apex of the "compass points" at 87h 83v.

 
Aha, so it's just reducing the FOV; that would indeed be very useful for any user who cannot get the lenses close enough to their eyes to see the whole FOV of their headset.
Pimax has also long had a number of fixed FOV options for their wide FOV headsets, and SteamVR added a slider for it some time back, too.

Personally I couldn't make myself go below 100 degrees, though -- I'm frustrated as all hell with the claustrophobic "ski-mask feeling" of even the widest-FOV HMD I have, at 160 degrees. :p

...but damn! The term "Tangent Adjustment" sounded exactly like something that would address one of the things I mentioned earlier: adjusting sampling density for higher tan(), like. :p
 