Debug Tool at 2.0 and SS=0.65 looks terrible - am I missing something?

Well if we're talking about 4k (per eye?) + wireless then yes you may be right. But I think image quality in gen 2 (18-24 months away??) is going to be much better than what we have now. For sure I don't think VR HMDs can be considered a mainstream consumer device until they go wireless. When that will happen is anyone's guess.
When did monitors go wireless?

Wireless VR does exist (Samsung/Oculus Gear VR, Google Cardboard, etc.), but you will always have better bandwidth, lower weight, and more power on a wired system. Given that people still use wired mice, keyboards, joysticks, monitors, etc., I don't see that wireless is in any way necessary (cool, perhaps, but not necessary), at least not for these applications. When I look at AR and what they'd like to do with HoloLens, or did with Google Glass, that's a different matter.

4k/eye will be nice, but right now the problems are mostly to do with downsampling and the lens/screen interaction. There's a lot that can still be improved in software.

The big issue with VR is going to be killer applications. Once you've moved completely past the "this is cooler in VR" factor: you aren't left with a lot that's actually either VR specific or far better in VR, and that's also enough long-term fun to drive sales. Yes, flight sims (including ED) and racing games; but at the moment, that's about it. The rest, right now, are novelties.
 
When did monitors go wireless?
...
Given that people still use wired mice, keyboards, joysticks, monitors, etc;

Not sure I follow what those devices have to do with it, given that you typically don't attach a monitor, mouse, keyboard or joystick to your head, and typically don't use them 3 metres away from the PC whilst moving around. Once VR is past the early-adopter phase and gains more interest from the public, I can see a lot of people being put off by the cable.

The big issue with VR is going to be killer applications. Once you've moved completely past the "this is cooler in VR" factor: you aren't left with a lot that's actually either VR specific or far better in VR, and that's also enough long-term fun to drive sales. Yes, flight sims (including ED) and racing games; but at the moment, that's about it. The rest, right now, are novelties.

100% agree.
 
VR isn't there yet. Maybe in another 5-10 yrs. In a wrap around pair of sunglasses that also correct for vision.

I guess that depends on where "there" is. When I fly down to a planet, land my Huey on a building (which I couldn't do before VR), or see the apex properly in iRacing, for me "there" is here. You need a first generation to survive to get a second gen. Those who wait (nothing wrong with that) can thank the early adopters later.
 
With the Oculus Debug Tool set to 2.0 pixel density and in-game SS at 0.65, the image looks terrible to me. I can 'feel' the faster frame rate/smoothness somehow, but it reduces the image to an even grainier quality, losing some of the AA on sharp edges etc., and distant textures become formless, as if the AF isn't working. I thought it would just soften the image a little, but a lot is being lost.

So forgive me - I'm having trouble seeing why Debug 2.0/SS=0.65 is seen as better, when the final image quality degrades a fair bit. Cobra is rendering at a much higher quality but then down-sampling and tossing it out (which would account for the texture quality loss).

I'm now running the highest possible settings apart from Bloom (off) and Blur (also off), with an i7 3770K @ 3.8GHz and a Gigabyte G1 GTX 1080 in the Rift CV1. Admittedly this is a fairly high-end system, so YMMV.

Is there something I'm missing? An AA mode? Or is it just about keeping the frame rate up on slower hardware? In that case I'd use the Debug default of 1.0 with SS at 1.0; the render seems better at defaults (as was the case when I was running a GTX 780 with the Rift CV1).

I prefer to leave in-game SS at 1.0 and the Debug tool at 1.8-2.0 - you can definitely see the results of the Debug over-rendering, but it's difficult to see the benefit of dropping SS to 0.65 given the quality that's lost.

Both Debug tool and ED appear to be functioning fine; I'll try to take some screengrabs for comparison (at work atm).

It seems that negative reports from Rift users are common with a 0.65/2.0 setting.

However, as a Vive user, image quality and frame rate are drastically improved for me with a 0.65/2.0 setting.

Maybe this has something to do with differences between SteamVR SS and Oculus Debug Tool SS?
 
GTX 1080 overclocked, with in-game SS at 0.85 and the Oculus Debug Tool at 1.75. Graphics on Ultra, with AA off, Shadows Medium, Bloom off. I get 90 fps, with 86 fps in stations. It's incredibly crisp with these settings, and so much fun to cruise around in canyons on planet surfaces.

Are you using a Rift or Vive?
 
I don't get it either... in-game at 0.65 and debug at 2 looks far worse than what I'm using now... all Ultra in-game + 1.5 debug tool looks amazing, and text is crisp.
 
I don't get it either... in-game at 0.65 and debug at 2 looks far worse than what I'm using now... all Ultra in-game + 1.5 debug tool looks amazing, and text is crisp.

It seems that, depending on the hardware owned, people are getting different mileage out of debug vs in-game SS. Some claim better fps if they lower to 0.65, and may be more affected by the overall perception of things. Given that everyone has different eyes and detects light at slightly different frequencies, even sharpness is somewhat a personal thing. All these factors together make this somewhat arbitrary, but nonetheless interesting.
 
With the Oculus Debug Tool set to 2.0 pixel density and in-game SS at 0.65, the image looks terrible to me. I can 'feel' the faster frame rate/smoothness somehow, but it reduces the image to an even grainier quality, losing some of the AA on sharp edges etc., and distant textures become formless, as if the AF isn't working. I thought it would just soften the image a little, but a lot is being lost.

This is correct. When folks say 'quality is much better with in-game at 0.65 and the tool at [anything higher than 1]', they are referring to text. In this case the text is easier to read, so people call it 'higher quality'. It is easier to read because the text takes up more pixels on the screen, since it is not being rendered exactly at 1:1.

Texture and polygon quality are directly affected by the in-game supersampling -- 0.65 will render the game graphics at 65% of native resolution, while 2.0 in-game SS will render at 2.0x native resolution. Upscaling a 0.65x in-game resolution via the tool will look far worse than downscaling game-engine source material rendered at 2.0x.
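To make the arithmetic concrete, here's a quick sketch. The base buffer size (~1344x1600 per eye for the CV1 at pixel density 1.0) and the assumption that both multipliers scale each axis linearly are mine, not from this thread, so treat the exact numbers as illustrative:

```python
# Assumption: the Debug Tool's pixel density sizes the compositor's
# eye buffer, and the in-game SS then scales what the engine actually
# renders relative to that buffer. Both factors are per-axis (linear),
# so pixel counts grow with the square of the factor.

BASE_EYE = (1344, 1600)  # assumed CV1 default per-eye buffer at density 1.0

def resolutions(in_game_ss, pixel_density, base=BASE_EYE):
    """Return (engine render size, compositor buffer size) per eye."""
    w, h = base
    # Debug Tool pixel density scales the compositor's eye buffer.
    buffer_res = (round(w * pixel_density), round(h * pixel_density))
    # In-game SS scales what the engine renders relative to that
    # buffer; the result is then rescaled to fit the buffer.
    engine_res = (round(buffer_res[0] * in_game_ss),
                  round(buffer_res[1] * in_game_ss))
    return engine_res, buffer_res

print(resolutions(0.65, 2.0))  # engine ~1747x2080, buffer 2688x3200
print(resolutions(1.0, 1.0))   # engine 1344x1600, buffer 1344x1600
```

Under these assumptions, 0.65/2.0 has the engine sourcing textures and geometry from a ~1.3x image that is then stretched into a 2x buffer, which would line up with the texture loss described above.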
 