HMD Image Quality

Hi,

I've been using an Oculus CV1 for a few weeks now, and am mighty impressed.

I'm now trying to optimize the image quality as much as possible.

I'm now using supersampling through the Oculus Debug Tool, and that gives a rather good result without too much of a performance hit. I can get an almost constant 90 fps with most settings maxed out.

Now to the question:

In the Elite graphics quality settings, there is a setting called "HMD Image Quality". The values look like some kind of supersampling. Setting it to 2.0 does seem to give a better image, but at a big performance hit. Down on planets the fps starts varying between 45 and 90.

What exactly does this setting do? I have googled a lot but can't find any real info about it.

/MrStarhouse
 
It's the debug tool wrapped up in the game! No need for the debug tool now.

This - unless you want to get to the performance HUDs etc. The App Render Timing HUD is still really useful, as it shows how many lost frames you're (not) seeing etc.

Depends on which GPU you're using... 1080 owners will probably use 1.25 or 1.5. 2.0 is still too high, unless you want to run at 45fps all the time and rely on ATW/ASW to fix the other 45 frames for you, with the attendant image wriggles.
 

OK, that's interesting. What is this 45 fps malarkey? I found ASW to be absolutely amazing. It didn't feel like anything short of a constant 90 fps. I got some weird warping only on station menus, and that was tolerable.
 
This is what ASW does.
It realises you can't reach 90 fps, so it throttles to 45 fps and engages interpolation to get back to 90.

The old interpolation was only rotational, and got really blurry.

ASW is miles beyond the old method, yes, but it's not perfect and introduces rippling and artefacts in certain areas.

It's very similar to the regular interpolation TVs do on video to match 24, 50 or 60 Hz feeds to whatever the panel displays.
It's what normally causes the "soap opera" effect in movies.
It's really only in the last few years that we've gotten TVs with an interpolation mechanism that avoids this almost nauseating effect and still produces a smoother, clearer picture.
And that is with pre-rendered linear video; this is interpolation in a 3D space, live.
Needless to say, very impressive.
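The 45/90 split falls straight out of frame-time arithmetic. A quick Python sketch (the 15 ms GPU time is just an illustrative number, not a measurement):

```python
# Rough frame-budget arithmetic behind ASW's 45 fps fallback (illustrative only).

def frame_budget_ms(fps: float) -> float:
    """Time available to render one frame at the given refresh rate."""
    return 1000.0 / fps

native = frame_budget_ms(90)  # ~11.1 ms per frame at the Rift's 90 Hz
asw = frame_budget_ms(45)     # ~22.2 ms once ASW halves the render rate

# If the GPU needs, say, 15 ms per frame, it misses the 11.1 ms budget,
# so the runtime drops to 45 real frames and synthesizes the other 45.
gpu_frame_ms = 15.0
needs_asw = gpu_frame_ms > native
print(f"90 Hz budget: {native:.1f} ms, ASW budget: {asw:.1f} ms, ASW engaged: {needs_asw}")
```

So a GPU that can't quite hit 11.1 ms per frame gets a much more comfortable 22.2 ms budget, at the cost of every other frame being synthesized.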
 
It's the debug tool wrapped up in the game! No need for the debug tool now.

OK, that's interesting. Then maybe the extremely bad performance I got when turning this above 1.0 in-game was because I also had it set to 2.0 in the Oculus Debug Tool.

I will try skipping the debug tool setting and only use HMD Image Quality, and see what happens.

It is a bit tedious to always start the debug tool first, so it would be nice if I could do it only in-game.
 
So the SS and HMD image quality settings in game now directly use the Oculus API (like the debug tool) instead?

I'll give that a try.
 
So what is the difference between in-game SS and HMD Image Quality? Thanks!
OK, I'll speculate a bit here. I don't really know, but I would guess, based on what people have written above, that:

Supersampling = 2.0
means that the game engine renders a frame twice the size needed for display, and then scales it down to fit the Oculus screen.

HMD Image Quality = 2.0
means that the game tells the Oculus runtime to render a frame twice the size needed, and then scale it down to fit.

So the only difference is where it is done.

As far as I've seen for Oculus, it feels like "HMD Image Quality" makes a better VR image than Supersampling.

Please fill in or correct me if I'm "far out and beyond" in this statement :)
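To make the scaling concrete, here's a rough Python sketch. The CV1 panel is 1080x1200 per eye; note the default eye render target the Oculus runtime actually requests is somewhat larger than the panel, so treat these as illustrative numbers rather than exact render-target sizes:

```python
# Illustrative arithmetic: a linear render-scale multiplier scales each axis,
# so the pixel count (and shading cost) grows with the square of the multiplier.

def render_target(width: int, height: int, multiplier: float):
    """Scale each axis by the multiplier and return (w, h, total pixels)."""
    w, h = int(width * multiplier), int(height * multiplier)
    return w, h, w * h

base_w, base_h = 1080, 1200  # CV1 per-eye panel resolution
for m in (1.0, 1.25, 1.5, 2.0):
    w, h, px = render_target(base_w, base_h, m)
    print(f"x{m}: {w}x{h} = {px:,} pixels per eye")
```

This is why 2.0 hurts so much: it is 4x the pixels of 1.0, whereas 1.25 is only about 1.56x.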
 

The two options use different rendering paths. The Debug Tool's Pixel Density and the HMD Image Quality setting are the same. It's thought to be a lower-level process, closer to the hardware, and quite fast.

FD's in-game Supersampling option looks like it includes some post-processing as well, so it is a bit slower. It's hard to determine how different its rendering is.

It's best to just try different combinations until you find one that suits you performance- and image-quality-wise.
 
As far as I've seen for Oculus, it feels like "HMD Image Quality" makes a better VR image than Supersampling.

I honestly don't notice a difference in quality between the two, but HMD Image Quality carries a larger performance hit from what I've seen. At the moment, with a GTX 1080, I can get a 90 fps lock with VR Ultra and 1.5 SS. I tried 1.75 and it was close; it would dip to about 75 minimum, but I prefer a 90 fps lock.
 
I tried a bit more last night, and I still feel that the best performance comes from using the pixel density in the Oculus Debug Tool.

As said above, this should be the same as HMD Image Quality, but I do feel that the latter has a bigger cost in performance.

When running Debug Tool pixel density = 2, I get an almost constant 90 fps.
When running HMD Image Quality, I need to go down to between 1.25 and 1.5 to get an almost constant 90 fps.

The downside with Debug Tool pixel density, however, is that I find it hard to get activated. I do as instructed:

- Turn on Oculus Home
- Start the debug tool
- Set the debug tool pixel density to 2.0
- Start Elite
- Once Elite has started, turn off the debug tool

But this only sometimes works. It's very easy to see whether it is on or not, especially when on a planet. When it is not on, the horizontal indicator on the HUD gets very jagged without any anti-aliasing, but when activation succeeded, there is nice anti-aliasing on the HUD.

I don't know what I might be doing wrong to get it going only sometimes.
 
As far as I've seen for Oculus, it feels like "HMD Image Quality" makes a better VR image than Supersampling.

Please fill in or correct me if I'm "far out and beyond" in this statement :)

Hi, I'm new here.

Yesterday I did some tests, and it looks like it's quite the opposite:
HMD Image Quality generates far more artefacts than Supersampling.
I stayed with SS.
 
The downside with Debug Tool pixel density, however, is that I find it hard to get activated. [...] But this only sometimes works. [...] I don't know what I might be doing wrong to get it going only sometimes.
Don't turn off the debug tool? I never do, and it always works. And there really is no reason to turn it off; it's no resource hog, is it? :)
 
Don't turn off the Debug tool

^^^ this! I've read in numerous places that you need to leave the debug tool running in order to keep the Pixel Density setting applied, otherwise apparently it tails off and reverts back to normal.

This said, I'm hoping to use the in-game setting now instead although it's interesting to read that some people think this might not work as well.
 
It's safe, Alec - the new in-game HMD Image Quality is the exact same debug tool pixel density setting; they're just calling the same function from within the game now.

I'm not seeing any difference between setting HMD Image Quality to 1.25 vs setting the debug tool to 1.25. Identical.

And yeah, if you do use the debug tool, don't turn it off. The game re-initialises the Rift hardware at various points (it seems to happen between frames; you don't see it). As soon as it does, you can lose the pixel density setting if you turned off the debug tool. It just reverts to 1.0x.
 
Hi, stupid question - how do I display the fps? I use the CV1 and would like to know how many fps my PC gets.

I use the debug tool at 1.5 on my GTX 1080 machine; I think 2.0 is too much. I set the graphics quality to 'VR Ultra'.

UPDATE: I set SS to 2.0 in the debug tool with the VR Ultra settings in-game, and I get roughly 40 fps. Is that OK?
Within a station it's only 28 fps. I have a powerful i7 machine and a GTX 1080. Hmmm... maybe VR Ultra is too much.
 

I may be way off base here, but if memory serves, VR Ultra sets the in-game supersampling to 1.25.
Adding a multiplier of 2 on top of that gives a combined linear scale of 2.5, so roughly every grid of 100 pixels gets rendered as a grid of 625 pixels.
So yes, I think that's a bit much.

I would turn off all supersampling for all profiles, then monitor the FPS via the debug tool.
Once you get quality settings that give you a stable 90 fps in stations and RES sites, I would consider upping the HMD Image Quality setting (this is the same as the multiplier in the debug tool) and leave the regular in-game multiplier at 1x.

I honestly don't think a card that can run ED in VR at Ultra with a fair amount of supersampling exists yet, and I would be surprised if one does for the next few years; maybe a theoretical next-series top-end Ti GPU could manage it.
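The multiplier arithmetic, sketched in Python (assumption: both the in-game SS value and the debug-tool pixel density scale each axis linearly, so they multiply):

```python
# How two linear render-scale multipliers stack: pixel cost grows with
# the square of their product.

ss = 1.25            # in-game supersampling (VR Ultra default, reportedly)
pixel_density = 2.0  # Oculus Debug Tool / HMD Image Quality multiplier

linear = ss * pixel_density  # 2.5x per axis
area = linear ** 2           # 6.25x the pixels to shade

print(f"Combined linear scale: {linear}x, pixel cost: {area}x")
print(f"A 10x10 block (100 px) becomes {int(100 * area)} px")
```

A 6.25x shading cost on top of VR Ultra explains the 40 fps result nicely.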
 
Don't turn off the debug tool? I never do, and it always works. And there really is no reason to turn it off; it's no resource hog, is it? :)
Agreed. The order I use is simply:

- Start the debug tool
- Set the pixel density to 2.0 (or whatever)
- Start Oculus Home
- Start Elite

Simple as that, really. In ED 2.2 I no longer need the debug tool to get the same result.

I use 2 x GTX 980 Ti cards in SLI and get great results from the following settings...

SS - 1.5
HMD Image Quality - 2.0
Shadow Quality - Medium
Ambient Occlusion - Medium

Everything else on either High or Ultra (top setting).
 