I believe I understand super sampling, i.e. an SS of 2 will render at double the necessary resolution and then downsample that, ideally giving a better outcome in the VR headset.
Surely an SS of less than 1 will give a poorer visual result, as it then has to upscale?
And what is this HMD Quality value of 0.5 or 2 that keeps being mentioned?
I must confess I've never satisfactorily understood the interaction between the SS setting and the HMD Quality setting.
As I understand it, the SS setting is basically getting the Elite: Dangerous game code to pretend you have a different resolution display from the one you actually have. For example, if you're running on a 1920x1080 monitor and have SS=1.5, then the game generates frames that are 2880x1620, with all the additional finer detail that this entails. Your graphics driver then downscales these frames back to your screen resolution. This has an advantage over simple anti-aliasing techniques in that the "in-between" pixels which fill in the gaps between your jaggies are based on real extra data from the game rather than guesswork from the AA code.
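For anyone who likes to see the arithmetic laid out, here's a rough Python sketch of that scaling (the function name is mine, purely for illustration, not anything from ED or the driver):

```python
# Rough sketch of what the SS multiplier does to the game's render target.
# Purely illustrative - supersampled_resolution is a made-up name.
def supersampled_resolution(width, height, ss):
    """Resolution the game actually renders at for a given SS multiplier."""
    return int(width * ss), int(height * ss)

print(supersampled_resolution(1920, 1080, 1.5))   # (2880, 1620) - rendered big, downscaled by the driver
print(supersampled_resolution(1920, 1080, 0.75))  # (1440, 810)  - rendered small, then upscaled
```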
Meanwhile, the HMD Quality setting is a value that's passed through to the Oculus runtime software (the same value the Oculus Debug Tool exposes as "Pixels Per Display Pixel Override"), which causes a similar thing to happen between the frames the Oculus software is producing and the hardware resolution of the HMD.
Vast quantities of (subjective) experimentation (documented all over these forums) appear to suggest that the Oculus software somehow does a more effective job with the HMD Quality setting than you get from increasing SS (so you get a clearer image with SS=1.0 and HMDQ=1.5 than you would with SS=1.5 and HMDQ=1.0). However, this seems to come at a performance cost, so having HMDQ set much above 1.5 requires a pretty darn powerful GPU.
Now, where it gets confusing is that a lot of people, in order to get decent framerates with a high HMDQ value, deliberately drop their SS to something like 0.75 (i.e. getting ED to render frames that are only 1440x810 in the above example). This gives the game code less work to do, which reduces the load on the GPU and thereby gives the Oculus software more headroom to handle a higher HMDQ value (1.75 or even 2.0).
You can do the sums (resolution x SS x HMDQ on each axis) and see that the resulting frames might well be slightly higher-res than, say, the nearest SS=1.0 equivalent, but I personally still feel like there's a garbage-in/garbage-out thing going on here which intuitively rankles my tech' sensibilities. Consequently I leave SS at 1.0 and set HMDQ as high as it will go such that I still generally get close to 90fps (with ASW turned off to check).
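For what it's worth, here's that back-of-the-envelope sum in Python, comparing a few of the combinations discussed above (still using the 1920x1080 example rather than the Rift's actual per-eye render target, so treat the numbers as illustrative only):

```python
# Rough pixel-count comparison: SS is applied by the game, HMDQ by the Oculus
# runtime, so each axis ends up scaled by roughly SS * HMDQ. Illustrative only.
def effective_pixels(width, height, ss, hmdq):
    return int(width * ss * hmdq) * int(height * ss * hmdq)

base = (1920, 1080)
print(effective_pixels(*base, ss=1.0,  hmdq=1.0))   # 2,073,600
print(effective_pixels(*base, ss=1.5,  hmdq=1.0))   # 4,665,600
print(effective_pixels(*base, ss=1.0,  hmdq=1.5))   # 4,665,600 (same pixel count, reportedly clearer)
print(effective_pixels(*base, ss=0.75, hmdq=2.0))   # 4,665,600 (less work for ED, more for the runtime)
```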
P.S. I might re-use this post in the future so if there's anything substantially wrong in my understanding then I'd love to know so I can rephrase this post appropriately.