Pimax 5K Super - massive performance issues

I don't remember if you posted it, but what's your GPU? If you have an NVIDIA (50xx or 40xx), it's been suggested that you downgrade back to v566.36. There are reports of driver issues with the latest and greatest driver version ATM. I've noticed some stutters, but it's not a deal breaker just yet. I may downgrade anyway.
 
3090 Ti, and I'm on 565.90. I had read somewhere else recently that anyone not on 5000-series cards should go to the highest version of 565.xx, for exactly the reason you mentioned. I was on a 566.x release after recently updating from a 535.x driver; then, when I tried changing the headset to 120Hz mode, it refused to turn the headset on. Moving back to a 565 driver solved that problem, so indeed it seems the newest drivers are problematic for past-generation cards.
 
Is supersampling determining the resolution the game tries to render in, and then the HMD image quality setting simply upscales or downscales the rendered image in the headset?
It looks like you've found your comfortable tradeoff spot, but anyway...

"HMD Quality" multiplies the render target size requested by the VR runtime, and the game delivers rendered frames at the resolution produced by this multiplication. There is limited sense in messing with combinations of multipliers across the Pimax software, SteamVR, and HMD Quality -- they all just contribute to what the final frame size will be, and aside from how fine-grained each can be adjusted, a change to any single one of them can produce the exact same result as giving them different values that simply cancel one another out.
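To illustrate the "they all just multiply" point, here is a minimal sketch (with assumed per-axis scale factors and an assumed base size -- the real sliders and their granularity differ between Pimax software and SteamVR):

```python
# Sketch: per-eye render target scaling in a Pimax/SteamVR-style stack.
# Each setting is just another resolution multiplier, so they compose
# multiplicatively, and different combinations can cancel out exactly.

def render_target(base_w, base_h, *scales):
    """Apply a chain of per-axis resolution multipliers to a base size."""
    w, h = base_w, base_h
    for s in scales:
        w, h = round(w * s), round(h * s)
    return w, h

# Hypothetical base request of 3584x2016 per eye:
a = render_target(3584, 2016, 1.25, 0.8)  # Pimax x1.25, SteamVR x0.8
b = render_target(3584, 2016, 1.0)        # everything left at 1.0
# 1.25 * 0.8 == 1.0, so both combinations yield the same frame size.
```

Only the product matters to the final frame, which is why juggling several multipliers at once buys you nothing over setting a single one.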

In-game "Supersampling" also multiplies the render target size (after HMDQ has made its contribution, so both apply), but takes it upon itself to resample the rendered frame back down to the unmultiplied size before handing it over.

This is why you want to use HMDQ (or the Pimax/SteamVR multipliers) as your first choice of the two: it gives the VR runtime compositor as much detail as possible to pick from when it compensates for lens distortion and maps the result to the display panels. In-game SS can be added on top of HMDQ et al. in the hypothetical situation where you want to go higher than a combined x2.0 (same as 400%), because realtime filtering often begins to skip pixels from the source image at that point, instead of including every pixel comprising the sample area for an output pixel -- so if you want the benefit of any such extra-extra render resolution, you may be best off getting it "baked in", so to speak, beforehand.
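The two-stage behaviour described above can be sketched with assumed numbers (a hypothetical 3584-pixel-wide base target, HMDQ x1.5, in-game SS x1.2):

```python
# Sketch: how HMD Quality and in-game supersampling combine. HMDQ
# raises the frame size handed to the compositor; in-game SS renders
# even larger internally, then resamples back down before handing
# the frame over.

def pipeline(base, hmdq, in_game_ss):
    compositor_size = round(base * hmdq)                   # what the runtime receives
    internal_render = round(compositor_size * in_game_ss)  # what the GPU draws
    return internal_render, compositor_size

internal, delivered = pipeline(3584, 1.5, 1.2)
# The GPU renders 6451 px wide internally, but the compositor only
# ever sees the 5376 px frame produced by HMDQ alone.
```

This is why HMDQ detail survives all the way to the compositor, while in-game SS detail is partly spent before the compositor gets a look at it.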

It is also used when rendering for display on a monitor, which would be the reason it helps when on foot -- it (presumably) supersamples the developer-preset-resolution 2D view, which is rendered and then mapped onto the subsequently rendered cinema-screen scene in VR (and that scene does "respect" the frame size requested by the VR runtime).

Other than for cases like the above, I personally strongly disagree with any alchemical messing with combinations of super- and subsampling, but that's down to my personal preference of hating so much as the slightest bit of blur even more than I do aliasing. Others obviously have different sensibilities.


On other notes: When I have used Pimax headsets with my (slightly older) hardware setup (EDIT: ...and older Pimax software), I have had to disable Hardware-accelerated GPU Scheduling in Windows graphics settings, or things would get extremely stuttery and jittery, but that is probably neither here nor there for your situation...


If you think you could live with foveation, it is possible you could eke out an extra frame per second, or maybe two, by force-injecting Variable Rate Shading, using either OpenXR Toolkit, or VRPerfkit (for OpenVR (i.e. SteamVR)), especially given how much the combination of more than 100° of field of view, and Parallel Projections, expands the size of the render target. I know the latter allows you to set the size of the foveation radii, and believe the former does too, and the former maaaaay also be able to utilise the eye tracking in the Super, to move the foveation around, but I do not know about that...
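To give a feel for why foveation pays off over a large render target, here is a back-of-the-envelope sketch (all numbers assumed; real VRS works per shading-rate tile and per shader, not as a clean circle, and the actual gain depends on which shaders get intercepted):

```python
# Sketch: relative shading cost under circular foveation. Full shading
# rate inside a centred circle (radius given as a fraction of frame
# height); the periphery is shaded at one sample per coarse x coarse
# block, as with 4x4 variable rate shading.
import math

def relative_shading_cost(w, h, inner_radius_frac, coarse=4):
    r = inner_radius_frac * h
    inner = min(math.pi * r * r, w * h)  # naive clip to the frame
    outer = w * h - inner
    return (inner + outer / (coarse * coarse)) / (w * h)

cost = relative_shading_cost(5376, 3024, 0.35)
# With a modest inner radius, roughly a quarter of the full-rate
# shading work remains for the shaders VRS actually applies to.
```

The larger the render target grows (wide FOV plus Parallel Projections), the bigger the peripheral area this discount applies to, which is why the technique is most attractive on exactly these headsets.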

(EDIT: Are you really getting as low as the 1440p resolutions you suggest, by the way? Base render target resolutions are usually ca. 1.4 times display panel resolution, to account for how the lenses compress pixels in the centre, and anisotropy adds more on top of that, as FOV gets larger -- especially with parallel projections. I do notice larger numbers by your SteamVR resolution sliders.)
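The arithmetic behind that ca. 1.4x figure, using the 5K-series 2560x1440-per-eye panels as the example (the exact factor varies per headset, FOV, and projection mode):

```python
# Sketch: why base render targets exceed panel resolution. The runtime
# requests extra pixels so that, after lens-distortion compensation
# compresses the centre of the image, centre detail still maps roughly
# 1:1 onto the panel.
panel_w, panel_h = 2560, 1440
scale = 1.4  # typical distortion-compensation headroom (assumed here)

base_w, base_h = round(panel_w * scale), round(panel_h * scale)
# -> 3584 x 2016 per eye, before any FOV / parallel-projection growth,
# which is why the SteamVR sliders show numbers well above 1440p.
```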

(EDIT2: Ah, err, never mind -- somehow I got into my head you had managed to bagsy an early Crystal Super, rather than one of the older ones; Disabused of this particular bit of lack of attention, things square up better in my head... :p)
 

That all makes sense to me, and yeh, since I was just going for native resolution I have all those different scaling settings set at 1.0. I was actually confused why the resolution didn't seem to line up with the 2x 2560x1440 resolution of the panels, but your explanation of it multiplying by 1.4 to account for pixel compression makes sense, so I guess that's why it shows that way.

I did try disabling HAGS to see if it had any effect but there was very little if any. It seemed like if anything it made performance slightly worse. I'm still on Windows 10 though, so maybe it affects things differently for those on Win11. I know there are several settings in Win11 around security that can absolutely tank game performance if they're enabled (which they are by default).

So, it's funny you mention OpenXR Toolkit, because I found something interesting tonight. I had installed it months ago along with PimaxXR, to let me use OpenXR with my racing simulator (iRacing), as that sim has native support for OpenXR and it's noticeably more performant than OpenVR. I was doing some testing with iRacing last night, and then tonight, when I went to test ED again, as I fired up SteamVR it popped a message about SteamVR not being the default runtime. Since I was starting ED, I clicked the option to revert it back, as my understanding is ED only supports OpenVR.

Once ED launched, I saw fpsVR running in the headset as expected, but as I turned my head I noticed the periphery of my vision seemed a bit muddled, like you would get when running foveated rendering (which PimaxXR implements, but I'm running in OpenVR mode, not OpenXR). So I hit the hot-key combo that opens the PimaxXR in-game settings menu, and to my shock it showed up! At first I thought it might just be a byproduct and not actually doing anything, but I tried shrinking the radius of the center circle down to 5%, and lo and behold most of the view became low resolution and garbled -- so it was indeed doing foveated rendering! I tried enabling and disabling it back and forth, and fpsVR showed a GPU load drop every time I enabled it, so it's definitely having an effect.

But now I'm confused, because my understanding is that that feature is implemented within the PimaxXR runtime, and so shouldn't be available when running in OpenVR mode??? I attached a picture to prove it's for real. Sorry for the blurriness, but I had to take the screenshot through the lens of the headset, as the 2D mirror window on the desktop doesn't show either the fpsVR display or the PimaxXR config menu.

So I'm confused as to how this is working, but the fact that I can use it with the SteamVR runtime means I now have access to fpsVR in the headset, and that makes tweaking 10x easier again. It definitely seems to be using the PimaxXR runtime though, because even without foveated rendering it's still more performant than it was using OpenVR. In fact I was able to expand the FOV larger than what I could with OpenVR and still keep the frametimes mostly in the green!

I'm not sure how this is working but I'm glad it is!
 

Attachments

  • 20250407_212940-abreviated.jpg (171.7 KB)
Elite Dangerous does support OpenXR

 
Wait so you run ED in 2D as a flat screen app within Virtual Desktop?
No full 3D.
I launch ED from my desktop with the /vr binary and the game launches in to VR

See my comments in this thread

 

Ahh ok I see now. Good stuff
 
It definitely seems to be using the PimaxXR runtime though, because even without foveated rendering it's still more performant than it was using OpenVR.

I don't believe it is so much a matter of one being more performant than the other, as it is about having one middleman fewer in the stack of discrete pieces of software everything has to pass through.

Elite: Dangerous does not (so far) have OpenXR support, to the best of my knowledge -- only Oculus and OpenVR (the former known as OVR, just to confuse things further ;P).

What you do have is that the various APIs, and the functionality they mediate access to, are similar enough that you can shoehorn in a wrapper (such as OpenComposite) that poses as the OpenVR API, converts the OpenVR calls it receives from applications in this capacity into the equivalent calls in the OpenXR API, and redirects them to your active OpenXR-compliant VR runtime. This way SteamVR is bypassed for headsets that do not run natively on SteamVR but have their own software, and will not act as an extra runtime on top of your headset's native runtime -- a more direct route.
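The wrapper idea reduces to a plain adapter: expose one API's surface while forwarding every call to another. A toy sketch (all class and method names here are made up for illustration; the real OpenComposite is a C++ library that stands in for openvr_api.dll):

```python
# Sketch of the shim pattern: the game links against what it thinks is
# OpenVR, but every call is translated and forwarded to the native
# runtime, so SteamVR never enters the pipeline.

class NativeXRRuntime:
    """Stand-in for the headset's own OpenXR-style runtime."""
    def submit_frame(self, eye, texture):
        return f"xr:submitted {eye} {texture}"

class OpenVRShim:
    """Poses as the OpenVR-flavoured interface the game was built
    against, translating each call into the native equivalent."""
    def __init__(self, runtime):
        self.runtime = runtime

    def Submit(self, eye, texture):  # OpenVR-style capitalised entry point
        return self.runtime.submit_frame(eye, texture)  # one hop, no SteamVR

game_view_of_vr = OpenVRShim(NativeXRRuntime())
result = game_view_of_vr.Submit("left", "frame0")
```

The game believes it called OpenVR; the shim is the only thing that knows the call actually landed in the other runtime.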

Pimax has assimilated a lot of stuff from various sources into their software over the years, starting with its foundation which was the original older Oculus runtime, back when it was opened up, after Oculus moved over to their release one.

Pimax's OpenXR implementation was written by MBuccia -- an independent developer, on his own initiative and time -- and Pimax subsequently included it in their distro; I believe he is also the brains behind the OpenXR runtime that GGodin added to VirtualDesktop.

Back when Pimax announced they had baked in foveated rendering capability, I first assumed this must have been a pointless ploy, where only their compositor ran foveated, which would add little benefit, because the game would still render the frame in full, surely? Enough people reported good performance gains, however, that I had to make a new assumption: maybe they had vacuumed up FHolger's VRPerfkit from GitHub, or something like it, and bolted that in there as well... Software like it and OpenXR Toolkit "hack" their way into the render pipeline, and activate VRS for select intercepted shaders of a game, for which it is appropriate (figuring out which ones are appropriate is, to a degree, guesswork). MBuccia has a little writeup somewhere, where he outlines this process and its complications.

I don't think your being able to call up that window is necessarily an indication that you are running over OpenXR -- the various runtimes are simply multilingual, so to speak: SteamVR "speaks" its own OpenVR API and the OpenXR one; Oculus understands its own OVR and OpenXR; and so on... (EDIT2: The idea is that OpenXR is to supplant all the older, proprietary APIs as a common industry standard, so that an application developer can "write once, run everywhere".)
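Part of what makes runtimes swappable under OpenXR is that the loader discovers the active runtime through a small JSON manifest rather than anything hard-wired. A sketch of parsing such a manifest (the file path in the example is illustrative, not a real installation path):

```python
import json

# Sketch: an OpenXR active-runtime manifest. The "runtime" object names
# the library that will answer all OpenXR calls system-wide, which is
# what lets SteamVR, PimaxXR, VDXR, etc. be swapped under one API.

manifest_text = """
{
  "file_format_version": "1.0.0",
  "runtime": { "library_path": "C:/Example/pimax-openxr/pimax-openxr.dll" }
}
"""

manifest = json.loads(manifest_text)
runtime_dll = manifest["runtime"]["library_path"]
# Switching the default runtime (as the SteamVR pop-up offers to do)
# amounts to pointing this manifest at a different library.
```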

(EDIT: There is a bit of a controversy right now about some of Meta's plugins for game engines favouring their own non-OpenXR APIs, instead of making all their functionality go through OpenXR and its extensions framework -- and, more disturbingly, disabling functionality for other runtimes. One suggested workaround, for any developer who still wants to use those plugins instead of more standards-compliant alternatives, is to make other runtimes "spoof" as Meta's... :p)
 
When I'm using VR for ED with my Quest 3 via Virtual Desktop's VDXR implementation of OpenXR I can open and use in game the OpenXR Toolkit


 

I do not doubt it, but does this happen without something or other (not necessarily with your knowledge) launching OpenComposite or an equivalent in order to make it happen, either globally, so that it hijacks all attempts to access OpenVR, or per-app, by replacing openvr_api.dll in your Elite installation with an imposter?

Nobody would be happier than I, should it transpire FDev has added OpenXR support in Elite, but I don't think I need to be particularly cynical, to have little faith in their will to do so.
 
I don't believe it is so much a matter of one being more performant than the other, as it is about having one middleman fewer in the stack of discrete pieces of software everything has to pass through.

Yeh, that's basically what I was alluding to by "more performant", but admittedly I glossed over the reason why. I'll blame typing things at 2AM, sleep-deprived, for that one :ROFLMAO:

Elite: Dangerous does not (so far) have OpenXR support, to the best of my knowledge -- only Oculus and OpenVR (the former known as OVR, just to confuse things further ;P).

What you do have, is that the various APIs and the functionality they mediate access to are similar enough that you can shoehorn-in a wrapper (such as OpenComposite) that poses as the OpenVR API, converts OpenVR calls it receives from applications, in this capacity, to the same type of call in the OpenXR API, and redirects them to your active OpenXR-compliant VR runtime. This way SteamVR is bypassed with headsets that do not run native on SteamVR, but have their own software, and will not act as an extra runtime on top of your headset's native runtime -- a more direct route.

Yep, all that makes sense for sure. There are multiple abstraction layers to a VR pipeline, and we're swapping in proxies for the originals in places, either to remove other layers or to add new features into the mix. I'm a programmer, though not a graphics programmer, so up to a certain point everything is clear as day, and then once it gets to the lower levels, where I don't have the domain-specific knowledge to follow it further... it's maddening, to say the least.

So for my setup I'm not (to my knowledge) using OpenComposite in any way. I'm aware of it but have never downloaded or installed it, so it's definitely not in deliberate use by me. Now, if something else in the chain, like OpenXR Toolkit or PimaxXR, is using it to implement its functionality, then it may be in use and I just wouldn't know about it. The main part that's confusing is that fpsVR works, because my understanding of that app is that it polls its data from SteamVR/OpenVR, and so isn't capable of collecting data if OpenXR is in use. That's the point where I was unsure what layer of this pipeline was allowing it to run even though the PimaxXR runtime is clearly in use. As far as my limited understanding goes it shouldn't be possible, but I was very happy to be wrong about it once I discovered it :)

Elite: Dangerous does not (so far) have OpenXR support, to the best of my knowledge
That's my understanding as well. I did some searching around to see if Frontier ever announced implementing OpenXR, but didn't find anything to that effect, and considering it's not presented as a configuration option anywhere, my bet is that they haven't. When iRacing implemented it, they announced it in the release notes, and now it's a drop-down selection every time you start a race session, where you choose either regular monitor display, OpenVR, or OpenXR -- so with that game you always know exactly which API is being used.

Pimax has assimilated a lot of stuff from various sources into their software over the years, starting with its foundation which was the original older Oculus runtime, back when it was opened up, after Oculus moved over to their release one.

Pimax's OpenXR implementation was written by MBuccia -- an independent developer, on his own inititative and time, and Pimax subsequently included it in their distro; I believe he is also the brains behind the OpenXR runtime that GGodin added to VirtualDesktop.
Yeh, I still remember when he first published PimaxXR (it was right around the time I bought the Pimax), and a lot of us VR racers on iRacing were super thankful for his efforts, as it made a huge difference in performance for the Pimax folks, especially with the support for foveated rendering. I saw a while back that Pimax appeared to be incorporating some of his work into their official software, but didn't see anything after that. I'm glad they're taking the time to support it officially so more people can benefit from it. If it wasn't for the iRacing forums I never would have known about PimaxXR.

EDIT: There is a bit of a controversy right now, about some of Meta's plugins for game engines favouring their own non-OpenXR APIs, instead of making all their functionality go through OpenXR and its extensions framework; Further more disturbingly: Disabling functionality for others. One workaround suggested, for any developer who still want to use those plugins, instead of alternatives which are more open standard compliant, is to make other runtimes "spoof" as Meta's... :p)
Ohhhhhhh yeh, I saw that "stuff" when I looked at the OpenXR Toolkit site again the other day. Wish I could say I'm surprised, but unfortunately I'm not at all, as that's exactly the kind of thing I EXPECT the likes of Meta to do. I got into the VR space around the Oculus DK2 days, and we were all disappointed when Facebook bought it, because we knew this kind of anti-consumer behavior would inevitably happen at some point. I hope someone manages to "spoof" it exactly like you said. (Insert the Linus Torvalds "middle finger" meme here.)
 
The main part that's confusing is that fpsVR works, because my understanding of that app is that it polls its data from SteamVR/OpenVR, and so isn't capable of collecting data if OpenXR is in use. That's the point where I was unsure what layer of this pipeline was allowing it to run even though the PimaxXR runtime is clearly in use. As far as my limited understanding goes it shouldn't be possible, but I was very happy to be wrong about it once I discovered it :)

Yes, it really sounds like you're still on SteamVR, whose compositor would also be what fpsVR uses to draw its graphical overlay into the view (...and presumably the main source of overhead). :7

What I was wondering was whether that window you could open is necessarily exclusively a PimaxXR thing, or more a Pimax thing -- I believe Pimax has foveation for OpenVR, but not for OpenXR, which requires the addition of OpenXR Toolkit. I do not know this for certain, though -- I haven't dabbled with Pimax since PiTool. :7

I hope someone manages to "spoof" it

Hopefully developers will instead opt for the more standards-compliant alternatives, even if it is a little more work, so that things remain open and no spoofing is needed, I think...

I recall having to make the web browsers I used spoof as Netscape Navigator to get service from a lot of sites; it always amused me how Andreessen et al. were state witnesses against Microsoft and their Embrace-Extend-Extinguish shenanigans, when that is exactly the way they themselves had previously gained dominance. :p
 
Yes, it really sounds like you're still on SteamVR, whose compositor would also be what fpsVR uses to draw its graphical overlay into the view (...and presumably the main source of overhead). :7

What I was wondering, was whether that window you could open is necessarily exclusively a PimaxXR thing, or more a Pimax thing -- I believe Pimax has foveation for OpenVR, but not for OpenXR, which requires the addition of OpenXR Toolkit. -I do not even remotely know this, though -- I haven't dabbled with Pimax since PiTool. :7
The window? You mean the settings window from my previous screenshot? If so, that's OpenXR Toolkit -- that much I can say for sure. I'm gonna do some more testing of it tonight. With foveated rendering enabled, it was actually running pretty smoothly in "Small" FOV at 120Hz, and could also do "Large" FOV at 90Hz pretty well -- neither of which was a good idea when using pure SteamVR. Now I just need to mess around with it and decide if I want wider FOV or a smoother display... decisions, decisions.

I recall having to make web browsers I was using spoof as Netscape Navigator to get service from a lot of sites; It always amused me how Andreessen et al were state witnesses against Microsoft and their Embrace-Extend-Extinguish shenanigans, when that is the exact way they themselves had previously gained dominance. :p
Oh man, now THAT is my area of programming. Good GOD I don't miss having to build sites around Internet Exploder and all its non-standard "JScript". Or my favorite: when AJAX first took off, basically every other browser had a standard way of making such a call... then there was IE, which required an ActiveX component to do it -- because OF COURSE it needs to use ActiveX, cause EVERYTHING should be ActiveX.

<steps off soapbox> Well that was a fun trip down memory lane :ROFLMAO:
 

Curiouser and curiouser. :7

My experience with 8k/5k series headsets was that I couldn't really tell all that much difference between the three (not counting "potato") FOV options.

Much of this is probably down to how precious little of the view through the 8k/5k lenses is clear and not too distorted, optically, but it also led me to a bit of a hypothesis: that I, for one, might value the 170-to-210 degree range of field of view more highly than the 130-to-170 one, even if everything within the 140 and 160 degrees covered respectively by the "medium" and "large" options had been readable in the HMD. :p

It's that sense of "room", and optical flow when moving, provided by what's to the side and slightly behind you, that seems to me to contribute that much more to a sense of presence than the intermediate range...

If you come to any insights about your software pipeline, please do consider keeping the rest of us updated. :)

...
<steps off soapbox> Well that was a fun trip down memory lane :ROFLMAO:

The distance of time works wonders to turn strife into catharsis. :p
 