VR Framerate Upgrades and Software

Certainly not a "this is better than that" thread. I am looking to ask the community for a helping hand for those of us who are thinking of upgrading. I use the Oculus Rift.

There are two "facts" that I know of with regard to VR that are rarely explicitly called out.

1) There are no computers capable of running a low resolution HMD (Oculus or standard Vive) at 90 fps with settings at Ultra
2) To compensate for this, each headset has its own software. On the Oculus this is called Asynchronous Space Warp (ASW), and on the Vive it is something like "Positional Reprojection".

These software solutions are just like normal software, i.e. v1.0 is often buggy and not quite right; then, as the product matures, the solution gets better, as with Oculus ASW 2.0.

So the help I am looking for is this: if the low res HMDs can't run at 90 fps, what on earth happens with the high end systems like Samsung's Odyssey+, the Vive Pro and the Pimax 5K (let alone the 8K)? What's the software like (equivalent and maturity) compared to ASW?

I could not live with a slide show regardless of how many pixels I got, but I would truly love better resolution, 100% readable text, a wider field of view (FOV) and less screen door effect (SDE). I am sure (I hope) I am not alone in wondering how these systems manage.

Thanks Cmdrs
 
1) There are no computers capable of running a low resolution HMD (Oculus or standard Vive) at 90 fps with settings at Ultra
Not sure that's entirely true. I have an Oculus and a 1080 (not even a Ti) and, until I start ramping the HMD quality setting above 1.0, I can pretty much run at 90 fps on Ultra (and yes, I do mean 90 with ASW turned off). Perhaps you might need to tweak shadow quality down to High, but it's certainly really close, so I think with a Ti and/or maybe a 20xx series card it's definitely possible. Of course, what then happens is people start increasing the HMDQ to 1.5 or above and the framerate comes down again.
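To put rough numbers on why nudging HMDQ up costs so much: pixel cost grows with the square of the multiplier. A quick sketch (an illustration only, assuming the commonly quoted ~1344x1600 per-eye default render target for the Rift CV1, and that the slider is a simple linear scale on that target):

```python
# Rough pixel-cost sketch for an HMD quality / render scale slider.
# Assumption: the slider linearly scales the per-eye render target,
# so total pixel cost grows with its square.

RIFT_EYE_TARGET = (1344, 1600)  # commonly quoted Rift CV1 default per-eye target

def pixels_per_frame(hmd_quality, eye_target=RIFT_EYE_TARGET):
    """Total pixels rendered per frame (both eyes) at a given HMD quality."""
    w, h = eye_target
    return int(w * hmd_quality) * int(h * hmd_quality) * 2

base = pixels_per_frame(1.0)
for q in (1.0, 1.25, 1.5):
    print(f"HMDQ {q}: {pixels_per_frame(q) / base:.2f}x the pixels of 1.0")
```

So a card that is just about holding 90 fps at 1.0 has to push 2.25x the pixels at 1.5, which is why the framerate comes down again.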

Nevertheless, your question about the higher resolution headsets like the Pimax is a good one, and one I've always been curious about myself.
 
I have a Rift, Modded Vive Pro and a 5k Plus. Rig is a 2080Ti + 9900k.

I used to run the Rift at 1.5 render scale and maintain 90fps everywhere.

With the Vive Pro I don't push it above the default 100% in SteamVR and maintain 90 fps everywhere.

The Pimax - even with the Small FOV, which at 130 degrees is actually considerably larger than the modded Pro and Rift at 100/110 respectively - reprojects a lot. Add to that the poor black levels and it's not an immersive experience.

I use the Vive Pro exclusively for Elite because it looks amazing and runs as smooth as silk - no reprojection at all [cool]

The 5K+ (for me) is not worth bothering with in Elite. It is, however, absolutely awesome for Assetto Corsa, which has the fastest render pipeline I know of and lets you use the HMD in Normal FOV with MSAA and a reasonably large pre-distortion buffer and still get 90 fps :eek: It's a glimpse of the future for sure.

The current software solutions are Valve's Motion Smoothing, Oculus Asynchronous Spacewarp and Pimax Brainwarp, but I really don't like any of them - I used to use ASW a lot with the Rift but got tired of the artifacts.

Foveated rendering is the only way you are going to get ultra high resolution HMDs running at their native refresh rates at max FOV without reprojection/extrapolation. This of course requires extensive work throughout the render pipeline, and I doubt we will see support for it in Elite any time soon (if ever), to be honest.
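The arithmetic behind that claim is simple (toy numbers of my own, not any vendor's actual implementation): render the central fovea region at full resolution and the periphery at a reduced linear scale, and the total pixel cost falls fast, because cost scales with the square of resolution.

```python
def foveated_cost(fovea_area_frac=0.25, periphery_scale=0.5):
    """Relative pixel cost of fixed foveated rendering vs rendering
    everything at full resolution: the fovea region stays at full res,
    the periphery renders at a reduced linear scale (cost falls with
    the square of that scale). Numbers here are illustrative only."""
    f = fovea_area_frac
    return f + (1 - f) * periphery_scale ** 2

# With a quarter of the image at full res and the rest at half scale,
# you pay about 44% of the full-res pixel cost.
print(f"{foveated_cost():.0%} of full-res pixel cost")
```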

For me, what is more important is the progress in lens development and distortion correction. It's no good having monster resolution and tons of rendering horsepower if you are looking through crap optics. The current Pimax fresnels are a quantum leap over the Rift/OG Vive Pro ones but are nowhere near as good as the aspheric GearVR lenses I use in the Pro. The transparency difference is colossal.

TL;DR: listen to Michael Abrash at the Oculus keynote for a really good deep dive into how things are going to evolve (1:14:30).

[video=youtube;o7OpS7pZ5ok]https://www.youtube.com/watch?v=o7OpS7pZ5ok&feature=youtu.be&t=4470[/video]
 

Hi Hessitant,

It is possible to run at 90 fps with current PCs, but just plonking in a really powerful GPU won't work; it's the entire system that contributes to hitting 90 fps. What people are doing is increasing settings to try and overcome the poor resolution on headsets like the Rift, to the point where the GPU can't sustain 90 fps and it flips to 45 fps. There isn't as much difference as you'd think between pushing a Rift too far via settings and the native settings on an Odyssey+ (for example).
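The flip to 45 fps is just frame-time arithmetic: at 90 Hz the whole system gets about 11.1 ms per frame, and once the GPU misses that budget the runtime locks to half rate and synthesizes the in-between frames. A toy model of that behaviour (my own sketch, not Oculus code):

```python
def effective_fps(gpu_frame_ms, refresh_hz=90.0):
    """Toy model of ASW-style behaviour: make the refresh budget and you
    run at native rate; miss it and the runtime drops to half rate,
    extrapolating every other frame."""
    budget_ms = 1000.0 / refresh_hz      # ~11.1 ms at 90 Hz
    if gpu_frame_ms <= budget_ms:
        return refresh_hz                # true 90 fps
    return refresh_hz / 2                # 45 fps plus synthesized frames

print(effective_fps(10.0))   # inside budget -> 90.0
print(effective_fps(14.0))   # missed budget -> 45.0
```

Which is why a headset with more pixels per frame makes the budget harder to hit, not easier.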

[video=youtube_share;CsmhijQzvbs]https://youtu.be/CsmhijQzvbs[/video]
 
Thank you all for the posts.

Alec, you reminded me of a classic point that I think speaks to HeavyGroover's point about having (or needing) to make adjustments to the low res HMDs in order to achieve a result that is standard - or indeed better - in a high res HMD. I do use HMD quality; in fact, at a minimum I set it at 1.25 so that I can read all the text in the standard cockpit view without moving, but my preferred setting is 1.5.

I said that Ultra settings across the board were unachievable, and I think that's confirmed, right? However, I don't run anywhere near Ultra - in fact it's a few Lows, mostly Mediums and some Highs - yet when approaching planet bases, such as Felicity Farseer's, ASW kicks in and FPS drops to 45; the same goes for other mid to large planet bases. Also, when exiting hyperspace at a station, FPS drops for 5-10 seconds, and I see intermittent blips as I manoeuvre around. As I enter the station - crossing the slot - it invariably tanks, then recovers back to 90, and finally, when on the landing pad and entering the docking bay, it tanks for a short while, especially if there are panels open. All in all, not what I would call a solid 90 fps.

I also realise the poor timing of my post, as currently I am just outside Sag A on a passenger mission, so no chance of popping over to Felicity's for a quick video :)

I am very much drawn towards the Samsung Odyssey+.
 
Pimax just released their beta "Brainwarp" algorithm, worth checking out (I haven't had time to try it yet). Along with that there's an experimental "fixed foveated rendering" feature, alas locked to RTX cards for now.

As for black levels in Elite on the Pimax, they're no different to any LCD, tbh, as the Pimax has an LCD screen. This becomes a "problem" if you do a side-by-side comparison, but it is not that bothersome in game. Granted, you might be better off with a Vive Pro for Elite, but not many of us are multi-headset overlords. I'm a single-headset peasant with a borrowed Pimax 5K+. I'm loving the Pimax; it is, however, not Gen 2 yet, and I will hold off on a purchase for now.

And one more thing: VR in general is an extremely subjective experience. Some people despise SDE, some despise jaggies, some despise reprojection, and some don't give a flying duck about any of these VR shortcomings and just enjoy the ride. Your Mileage May Vary.
 
I would be very surprised if most people could tell the difference between mura and SDE.
SDE is just the gap between pixels, while mura is an OLED problem that varies from panel to panel and causes the display to show a completely off tint for some colours.

It's a complete lottery, and going for LCDs mitigates that lottery.
Most likely, the next Rift will use LCDs as well, for this and many other reasons.

OLED is neat and all, but it's a dead-end tech, even more so than plasma was.

Guess what: anyone that has performance problems in VR on a 1080 or better is running supersampling at 1.5 or higher. No shift, Sherlock.

Supersampling is a red herring. It gives no actual new information - that would require higher res panels. Either you accept that or you don't. Supersampling at best does moderate sharpening and antialiasing; legibility doesn't actually improve. If you do have problems with reading, see an optician.
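The "no new information" point can be shown in a few lines: supersampling renders more pixels than the panel has, then averages them back down, so hard edges come out softened (antialiasing) but no detail beyond the panel resolution survives. A minimal sketch (my own illustration, a plain 2x box filter):

```python
def downsample_2x(image):
    """Average each 2x2 block of a supersampled image down to one output
    pixel. The result is back at panel resolution; only the edges come
    out smoother (antialiasing), no new detail is added."""
    out = []
    for y in range(0, len(image), 2):
        row = []
        for x in range(0, len(image[0]), 2):
            block = (image[y][x] + image[y][x + 1] +
                     image[y + 1][x] + image[y + 1][x + 1])
            row.append(block / 4)
        out.append(row)
    return out

# A hard black/white edge that falls mid-pixel at panel resolution...
hi_res = [[0, 1, 1, 1]] * 4
# ...averages down to a softened (grey) edge: [[0.5, 1.0], [0.5, 1.0]]
print(downsample_2x(hi_res))
```

That grey transition is exactly the "fooling you into seeing a curve" effect mentioned below: smoother, arguably easier on the eye, but not new information.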
 
Supersampling at best does moderate sharpening and antialiasing; legibility doesn't actually improve. If you do have problems with reading, see an optician.

See, I agree with almost everything you wrote besides this last part. I know for a fact that my vision is perfect (yearly check-ups), and there is a definite difference in the legibility of text in ED (and elsewhere) with and without supersampling on the Vive. It might be connected to the fact that our brain is good at recognising shapes, and supersampled shapes by design fool you into seeing a curve where you in fact have a jagged line. Also, people laugh that 4K is "meh" for gaming, but it does wonders for... text legibility. I tend to agree with them :)

Anyhow, supersampling makes a difference. You're right that it doesn't add new information, but its primary job is masking resolution deficiencies and artifacts. This in turn makes the resulting image better, via the same smoke-and-mirrors tricks that are used in 2D gaming. Remember the first days of the Vive and Rift? People were saying the Rift looked WAY better than the Vive, despite the two having identical resolution. The trick was behind-the-scenes supersampling, which the Oculus runtime was doing when possible. That was before any supersampling sliders; you had to edit some obscure files in SteamVR to get the Rift picture quality. Nowadays it's automatic :)
 
I said that Ultra settings across the board were unachievable, and I think that's confirmed, right?

From my own experiments, the biggest FPS killers in ED VR are HMD quality, shadows, and terrain-related planetary surface settings, in that order. The rest of the settings (aside from those most VR players turn off anyway, like AO) don't have much of an effect. Anything past 1.5 HMD is diminishing returns, so that's the holy grail. I don't think there's any card that can do High/Ultra shadows, maxed terrain, and 1.5 HMD and maintain 90 fps inside stations/on planetary surfaces/in res sites. Anybody feel free to correct me if I'm wrong.

I would be very surprised if most people could see the difference between Mura and SDE.

I'd be surprised if they couldn't. They look completely different.

edit: This discussion in general reminded me of something.

I've been trying out DCS World lately. It supports VR. The graphics aren't particularly impressive - certainly less so than ED's - and it has similar struggles maintaining a good frame rate. Yet I'm able to play it on my 1070 at 1.5 HMD with ASW forced, and everything is VERY smooth. Yes, I'm in permanent reprojection, but the experience is better than 1.0 HMD at 90 fps. There are the usual artifacts, but it's an enjoyable experience. Whereas ASW in ED for me means extreme visual hitching and stuttering when, for example, pitching up in a res site. There seems to be something odd about the way ED's engine interacts with ASW.
 
DCS is severely thread limited, so if you are under 5,000 feet you don't have a chance of hitting 90 regardless of settings.

Basically, the game is still single core, and that doesn't mix well with high frame rates.

And unlike Elite, the HUD and dials actually carry important information.
Most of the HUD in Elite is honestly superfluous, at least if you compare it to the torque meter in the Huey or exhaust temps in most aircraft.

And considering you can't reach high fps even with an i9 9900K and a 2080 Ti, you might as well force the game into ASW and use the GPU how you can.

But still, any supersampling above 1.3 doesn't really do much at all.

As for mura and SDE: yeah, they look completely different, but most people don't know what either actually looks like anyway.
 

My point is that forcing ASW in ED results in a stuttery, jittery mess, whereas forcing ASW in DCS results in smooth gameplay. Why? In neither case is the game hitting anywhere near 90 (though both stay above 45), yet one performs smoothly with reprojection and the other doesn't. Every time I go into a station, ASW kicks in and things stutter. Yet DCS doesn't stutter. Why?
 
It could be a lot of things.

And as users, we can only guess as to what exactly does what.

I did personally however notice a big reduction in ASW artefacting and frame stutters when I upgraded from an i7 4790k to the i7 8700k.

So a chief suspicion of mine is that while ASW reduces the demand on the GPU, it increases the load on the CPU.

If you are throttling, or close to throttling, on the CPU, ASW might not help much at all, or might even make things worse.
DCS, on the other hand, limits itself on the CPU by barely making use of one or two threads at most, whereas Elite loads all 12 of my threads.
So in DCS, Windows or Oculus has a lot of headroom to work with compared to Elite.

Although there is the question of whether that is Elite itself or just Windows cycling threads; I suspect Elite uses at least eight threads.

Anyhoo, that's just me making random guesses.
 
I don't think that's the case for me, since I have a 6700K @ 4.6 GHz. I don't think two extra cores and a few hundred MHz make a difference, since there's not much difference architecturally between our CPUs. Do you see the same with ASW on your system in ED?
 
Yeah, that CPU should honestly be fine.

Apart from some squiggles in the HUD on occasion, it's pretty much on par with how Spacewarp behaves in DCS.

A couple of patches ago there was a bug with the transition in and out of ASW, but that was two feature patches ago.
 
I don't think there's any card that can do High/Ultra shadows, maxed terrain, and 1.5 HMD and maintain 90 fps inside stations/on planetary surfaces/in res sites. Anybody feel free to correct me if I'm wrong.

I am of the same opinion, based on my own experience and also on what others have posted throughout these forums.

And one more thing: VR in general is an extremely subjective experience. Some people despise SDE, some despise jaggies, some despise reprojection, and some don't give a flying duck about any of these VR shortcomings and just enjoy the ride. Your Mileage May Vary.

Absolutely agree 100% here. For me, the stutters and ASW moments, SDE and FOV just fade into "meh, who cares" compared with the fact that I am flying a spaceship :)
 
Mr. Grooves, I have the same system specs as yours, Vive Pro included.
So my question is: can you play in a res site for more than 30 minutes and still maintain 90 fps?
I can maintain it for about 15 to 20 minutes, then I start getting stutter, especially when I'm looking at a lot of NPC ships in combat with each other in front of me.
I'm just wondering if you're doing something magical I might need to try. This is only happening in res and conflict zones; everywhere else has been fine. Also, if I log out and log back in, it resets the whole process. Is there a known memory leak?
Thanks
 
There is definitely some instance degradation going on for certain - there always has been - but it's significantly worse since NPCs started dropping materials.

I also haven't checked since the last small patches, but after 3.3, VRAM use would start at around 6 GB, then slowly rise over about 30-40 minutes of play to pretty much max out my 11 GB 1080 Ti.

That smacks of a memory leak to me.
I don't think it fully released it either until I closed the game completely.
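If anyone wants to check the VRAM climb on their own rig, something like the sketch below works on NVIDIA cards (it assumes `nvidia-smi` is on the PATH; the leak heuristic and the 1 GiB threshold are my own arbitrary choices, not anything from the game):

```python
import subprocess

def vram_used_mib():
    """Current GPU memory use in MiB, as reported by nvidia-smi."""
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=memory.used",
         "--format=csv,noheader,nounits"], text=True)
    return int(out.strip().splitlines()[0])

def looks_like_leak(samples_mib, threshold_mib=1024):
    """Crude heuristic: usage never drops and grows past the threshold."""
    non_decreasing = all(b >= a for a, b in zip(samples_mib, samples_mib[1:]))
    return non_decreasing and samples_mib[-1] - samples_mib[0] >= threshold_mib

# Poll vram_used_mib() once a minute while playing, then feed the samples in:
print(looks_like_leak([6100, 7300, 8900, 10800]))  # steady climb -> True
print(looks_like_leak([6100, 6300, 6000, 6200]))   # plateau -> False
```

A genuine leak would show the steady-climb pattern across a whole session, whereas normal streaming should plateau and occasionally drop.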
 
Thanks, tor torden, for your knowledge of the game; it helps with the frustration.
I just see so many posts from people who say they get 90 fps everywhere - smooth as butter, they say - so I just want to know what I can do or buy to get to their level of play.
But thank you for your insight on res/conflict zone degradation. I'm glad to know there's nothing I can really do about it; it just is what it is.
 
This drove me nuts when I was tuning my graphics settings. Just realise that a significant portion of people can't seem to tell the difference between ASW and true 90 fps.

And herein lies the reason for this post. Can the higher res HMDs actually run at 90 fps in space stations, planet ports and res sites with details set above "Low cartoon"? ASW is OK as long as you don't move too much, but when you're glancing around it judders and stutters. My experience is definitely at the good end, but not in beautiful places like Felicity Farseer's base at Deciat. I am torn between the added benefits of wider FOV, clearer text/images and less SDE versus more time at 45 fps.
 