Interesting. According to the MSDN docs:
D3D11_FILTER_MIN_MAG_LINEAR_MIP_POINT: Use linear interpolation for minification and magnification; use point sampling for mip-level sampling.
If I'm interpreting it right, this means it always samples from a single mipmap level instead of interpolating between two levels. In this case, since there is only one level, it will always use that as the only available mipmap. From that mipmap it then picks the four texels closest to the sample position and takes a weighted average.
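For reference, a sampler using that filter would look roughly like this on the D3D11 side (just a sketch; I obviously don't know how E:D actually configures its samplers, so everything besides the filter enum is an assumption):

```cpp
// Hypothetical sampler setup matching the MSDN description above.
// NOT Elite: Dangerous's actual code -- illustration only.
D3D11_SAMPLER_DESC desc = {};
desc.Filter         = D3D11_FILTER_MIN_MAG_LINEAR_MIP_POINT; // bilinear within a level, no blend between levels
desc.AddressU       = D3D11_TEXTURE_ADDRESS_CLAMP;           // assumed
desc.AddressV       = D3D11_TEXTURE_ADDRESS_CLAMP;           // assumed
desc.AddressW       = D3D11_TEXTURE_ADDRESS_CLAMP;           // assumed
desc.MipLODBias     = 0.0f;
desc.MinLOD         = 0.0f;
desc.MaxLOD         = D3D11_FLOAT32_MAX; // moot here: only mip 0 exists anyway
desc.ComparisonFunc = D3D11_COMPARISON_NEVER;

ID3D11SamplerState* sampler = nullptr;
device->CreateSamplerState(&desc, &sampler);
```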
This doesn't work well when the onscreen rendered size is significantly smaller than the original texture size, since some parts of the source image may end up being skipped completely. I think that matches the effect visible in http://i.imgur.com/vKIxRUZ.png. Look at the horizontal part of the "L" in "LANDING GEAR" or the first "S" in "MASS LOCKED": some parts of the image seem to be disappearing instead of being averaged with nearby pixels.
Crude ASCII graphics:
++++++++++++++++++
++12++12++12++12++
++34++34++34++34++
++++++++++++++++++
++12++12++12++12++
++34++34++34++34++
++++++++++++++++++
Each "+" is a pixel in the source image, and the "1234" points are the ones being sampled and linear filtered for output pixels. Note that there are many "+" pixels that don't contribute at all to the output image, so that any information there gets lost.
The best way to fix this would be either to add mipmaps so the texture can be downsampled properly without losing information, or to dynamically adjust the size of the texture region used for drawing text so it roughly matches the pixel size after rendering.
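For the first option, the D3D11 side would look roughly like this (again only a sketch with placeholder sizes, not how E:D does it; the one hard requirement I know of is that GenerateMips() needs the texture created with D3D11_RESOURCE_MISC_GENERATE_MIPS and render-target binding):

```cpp
// Sketch: give the text/HUD texture a full mip chain, regenerate the levels
// after rendering into mip 0, and sample with a filter that uses them.
D3D11_TEXTURE2D_DESC td = {};
td.Width            = 1024;  // placeholder size
td.Height           = 1024;  // placeholder size
td.MipLevels        = 0;     // 0 = allocate the full mip chain
td.ArraySize        = 1;
td.Format           = DXGI_FORMAT_R8G8B8A8_UNORM;
td.SampleDesc.Count = 1;
td.Usage            = D3D11_USAGE_DEFAULT;
td.BindFlags        = D3D11_BIND_SHADER_RESOURCE | D3D11_BIND_RENDER_TARGET;
td.MiscFlags        = D3D11_RESOURCE_MISC_GENERATE_MIPS;

ID3D11Texture2D* tex = nullptr;
device->CreateTexture2D(&td, nullptr, &tex);

ID3D11ShaderResourceView* srv = nullptr;
device->CreateShaderResourceView(tex, nullptr, &srv);

// ... render the text into mip 0, then:
context->GenerateMips(srv);

// Then sample with e.g. D3D11_FILTER_MIN_MAG_MIP_LINEAR (trilinear) or
// anisotropic filtering so the smaller mip levels actually get used.
```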
Does that sound about right?
As far as I can tell this is all part of Elite: Dangerous's pipeline, so it ends when it hands off a rendered texture to the compositor. I'm still a bit suspicious about how the compositor maps the rendered texture in its distortion filter to get onscreen subpixels, but that would be a separate issue.
This reminds me - I saw an undocumented --nodistort flag for vrcompositor.exe. Could someone check whether it's possible to pass that flag when SteamVR launches the compositor, and whether it makes a difference in sharpness? (I can't access my Vive just now.) I'm not sure whether it works at all, or whether it would keep a 1:1 mapping in the image center or get the scale completely wrong.
Many thanks for doing the investigation!