With the Oculus Debug Tool set to 2.0 pixel density and in-game SS at 0.65, the image looks terrible to me. I can 'feel' the faster frame rate/smoothness somehow, but it reduces the image to an even grainier quality, losing some of the AA off sharp edges etc, and distant textures become formless, as if AF isn't working. I thought it would just 'soften' the image a little, but a lot is being lost.
So forgive me - I'm having trouble seeing why Debug 2.0/SS=0.65 is somehow seen as better, as the final image quality degrades a fair bit. Cobra is rendering at much higher quality but then down-sampling and tossing that detail away (which would account for the texture quality loss).
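As far as I understand it, the two settings simply multiply per axis, so Debug 2.0 with SS 0.65 is a net ~1.3x render, not 2.0x. A minimal sketch of that arithmetic (assuming the factors multiply per axis and a CV1 per-eye render target of roughly 1344x1600 at pixel density 1.0 - the exact base numbers may differ):

```python
# Assumption: Debug pixel density and in-game SS stack multiplicatively
# per axis. Base target of 1344x1600 per eye is an assumed CV1 default.

def effective_target(base_w, base_h, debug_pd, ingame_ss):
    """Approximate per-eye render target after both scalers are applied."""
    scale = debug_pd * ingame_ss
    return round(base_w * scale), round(base_h * scale), scale

BASE_W, BASE_H = 1344, 1600  # assumed CV1 per-eye target at PD 1.0

for pd, ss in [(1.0, 1.0), (2.0, 0.65), (1.8, 1.0)]:
    w, h, scale = effective_target(BASE_W, BASE_H, pd, ss)
    print(f"PD {pd} x SS {ss} -> {w}x{h} (net {scale:.2f}x per axis)")
```

If that's right, 2.0/0.65 and 1.3/1.0 would be roughly the same pixel count, which makes the quality difference even harder to explain.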
I'm now running with the highest possible settings apart from Bloom (Off) and Blur (also Off), with an i7 3770K @ 3.8GHz and a Gigabyte G1 GTX 1080 in the Rift CV1. Admittedly, this is a fairly high-end system and YMMV.
Is there something I'm missing? AA mode? Or is it just about keeping the frame rate up on slower hardware? In that case I'd use the Debug default at 1.0 and SS at 1.0; the render seems better at default (as was the case when I was running a GTX 780 with the Rift CV1).
I prefer to leave in-game SS at 1.0 and the Debug tool at 1.8-2.0 - you can definitely see the results of the Debug over-rendering, but it's difficult to see the benefit of dropping SS to 0.65 and losing that quality.
Both Debug tool and ED appear to be functioning fine; I'll try to take some screengrabs for comparison (at work atm).