3090 is the new VR KING

Overall, the 3080 and 3090 are faster at 4k than the new Radeons, and significantly faster where there is heavy use of hardware raytracing.

However, I would not be surprised if AMD's parts actually turn out to be faster in Elite: Dangerous at 4k and in VR, than their direct NVIDIA competitors.

Elite is primarily fill rate limited and the 6800XT and 6900XT have a fill rate advantage. Elite also doesn't use any of the NVIDIA-centric features that AMD is lacking or that AMD underperforms in; there is no ray tracing and no DLSS support in Elite.
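
For a rough sense of that fill rate gap, here is a back-of-the-envelope sketch in Python. The ROP counts and reference boost clocks are my own assumptions from public spec sheets; board-partner cards clock differently and sustained clocks will vary.

# Back-of-the-envelope pixel fill rate comparison (ROPs x boost clock).
# Spec numbers below are assumptions from reference spec sheets, not
# measurements; theoretical peak only.
cards = {
    "RTX 3080":   {"rops": 96,  "boost_ghz": 1.71},
    "RTX 3090":   {"rops": 112, "boost_ghz": 1.70},
    "RX 6800 XT": {"rops": 128, "boost_ghz": 2.25},
    "RX 6900 XT": {"rops": 128, "boost_ghz": 2.25},
}

for name, c in cards.items():
    gpixels = c["rops"] * c["boost_ghz"]  # Gpixels/s, theoretical peak
    print(f"{name}: {gpixels:.0f} Gpixels/s")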

Will have to see some actual benchmarks to be certain (and if the 6800XT shows up in stock before any of the 3080 models I like, I may be able to compare them myself), but from my past and current experience with running ED on a variety of hardware, if I had to guess where the parts would stack up, I would expect RDNA2 to have an edge in this game.
 
Review of the 6800XT and RTX 3080 in VR games; it includes Elite.

TLDR: the 3080 is currently the better experience in general across various VR games; however, in Elite both cards offer excellent VR performance.

https://babeltechreviews.com/vr-war...e-rtx-3080-15-vr-games-performance-benchmarke

Thank god for that! It is the impending inclusion of VR in MSFS2020 that made me decide to push the boat out and get a 3090. If the 6800XT was trouncing the 3080 in VR I'd be suffering severe buyer's remorse. I'm no fanboy though, and AMD doing so well is great news for VR. Nvidia can't dictate GPU prices and performance due to a lack of competition anymore. I wouldn't be surprised if GPU performance doubles again in the next two years.

It really annoys me that there is such a lack of attention to VR from the usual YouTube lot. VR is the cutting edge of PC gaming yet almost completely ignored. How hard can it be to at least include OpenVR or VRMark benchmarks?
 
I think, in the case of reviewers, they try to appeal to the masses. It's easier for them to connect to someone who is a fanboi of, say, a graphics card or a CPU. VR is new technology; sides have not been drawn yet.
 
Thanks for linking that, it’s just the sort of VR-specific info I’ve been after.

I’ve been considering jumping to AMD seeing as the new consoles are based on them, but it looks like a 3080 has the VR power that I’m after. Now then, what will happen first: 3080s come back in stock long enough for me to buy one, or the 3080 Ti is announced and I resume the hmmm wait cycle once more? 😁
 
The consoles are based on AMD because they wanted less money than Intel/Nvidia for the same job.
There is no special reason beyond that.
I was thinking more of future games - stuff optimised for the new consoles might work a touch better on similar hardware on the PC side, or so my thinking went.

Whether it will or not is another matter, of course.
 
Windows and the drivers abstract what is in our computers anyway, so that shouldn't have any real meaning.
Otherwise we would already see some games be nearly unplayable on either AMD or Intel.
 
nah, 55" Q700T Samsung. I sit back 1-1.5m from the screen as I have a mixing desk sat between me and the screen, so I need the large size.



When I set Windows to 100% scaling the text is so small that it's a smaller pixel density than my phone. I can't imagine trying to use a 32" up close @ 8k on 100% scaling.
Just tested it and it's about 6" distance just to see a pixel clearly.
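
For context, here is a rough PPI comparison; the panel sizes and the phone resolution are my own illustrative assumptions, not measurements of the actual devices.

import math

# Pixel density = diagonal pixel count / diagonal size in inches.
def ppi(width_px, height_px, diagonal_in):
    return math.hypot(width_px, height_px) / diagonal_in

print(f'55in 8K TV:      {ppi(7680, 4320, 55):.0f} ppi')  # ~160 ppi
print(f'32in 8K monitor: {ppi(7680, 4320, 32):.0f} ppi')  # ~275 ppi
print(f'6in 1440p phone: {ppi(2560, 1440, 6):.0f} ppi')   # ~490 ppi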
 

In my experience there is a distinct threshold in Elite where the sheer grunt of the dedicated texture mapping on the raster operations pipeline is simply throttled by scheduling latency on the core engine thread.

No matter how much dedicated oomph you have shader-side on the card, you still get annoying breaches of the frametime budget that are consistent between very high and very low frame buffer targets.

And of course this is exacerbated by the latency from the WebSocket requests to the cloud infrastructure. Even when fully utilising the parallel multi-threaded capabilities of the rendering API, you are still subject to fundamental latency as the core engine thread awaits cloud resource responses.
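
To make the budget idea concrete, here is a minimal sketch: at a given refresh rate each frame must be presented within 1000/Hz milliseconds, or the compositor drops (or reprojects) it. The frame times below are made-up numbers, not a real capture.

# Flag frames that breach the frametime budget at a few refresh rates.
def budget_ms(refresh_hz):
    return 1000.0 / refresh_hz

# Hypothetical captured frame times in milliseconds (not real data).
frame_times_ms = [10.8, 11.0, 10.9, 16.4, 11.1, 10.7, 23.9, 11.0]

for hz in (72, 90, 120):
    misses = [t for t in frame_times_ms if t > budget_ms(hz)]
    print(f"{hz} Hz (budget {budget_ms(hz):.1f} ms): {len(misses)} missed frames")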

Super interesting reading and watching material if you want to dig deeper:



Source: https://youtu.be/EvJPyjmfdz0
 

Elite: Dangerous is one of the least CPU dependent titles I can recall; even my aging 4.2GHz 5820k is good for almost 250fps before the game becomes appreciably CPU limited (by the primary render thread). On my newer CPUs this is even higher and since I'm never targeting such high frame rates, I'm essentially always going to be GPU limited, as I will throw custom detail settings and supersampling at whatever I've got until I hit my worst case frame rate floor. That is with NVIDIA parts though.

Still, even with AMD's higher DX11 overhead, my 3900X should not be the limiting factor in any rational scenario (I'm willing to accept a worst-case frame rate of 60fps on my VRR monitors and 90 on my WMR headset), no matter what GPU I pair it with, in this game.
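
That reasoning can be sketched in a few lines: a ~250fps render-thread ceiling means roughly 4ms of CPU work per frame, so the GPU is the limiter whenever its frame time exceeds that. The GPU frame times below are illustrative, not measurements.

# Classify the bottleneck from a CPU ceiling and a GPU frame time.
cpu_ceiling_fps = 250
cpu_frame_ms = 1000.0 / cpu_ceiling_fps  # ~4 ms on the render thread

for gpu_frame_ms in (3.0, 8.0, 16.7):  # hypothetical GPU costs
    limiter = "GPU" if gpu_frame_ms > cpu_frame_ms else "CPU"
    fps = 1000.0 / max(gpu_frame_ms, cpu_frame_ms)
    print(f"GPU at {gpu_frame_ms:.1f} ms -> {limiter} limited, ~{fps:.0f} fps")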


Not much I can do about the occasional dips in frame rate during loading transitions or for network reasons that I haven't already done, and these aren't really problematic in most scenarios. Outside of that, in my experience, performance scales almost linearly with pixel/texel fill rate, and isn't particularly limited by shader performance (as I originally noted when comparing my 290X with my Fury years ago).
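
If that near-linear fill rate scaling holds, you can estimate how much supersampling a card has headroom for before it hits a frame rate floor. All numbers below are hypothetical.

# Under fill-rate-limited scaling, frame rate falls in proportion to
# pixels drawn, so a per-axis supersampling factor costs its square.
base_fps = 140.0      # assumed measurement at 1.0x render scale
target_floor = 60.0   # worst-case frame rate being targeted

for ss in (1.0, 1.25, 1.5, 2.0):
    pixels = ss * ss               # pixel count scales with the square
    est_fps = base_fps / pixels
    verdict = "ok" if est_fps >= target_floor else "too slow"
    print(f"{ss:.2f}x SS -> ~{est_fps:.0f} fps ({verdict})")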

I made a video recording + frame time plot of a session of ED not terribly long ago:
Source: https://www.youtube.com/watch?v=eZ4Q-Qjvuuc

[Image: frame time plot from the recorded session]


That's 12 minutes of gameplay at 4k with beyond ultra settings (forced 16x AF, SMAA, significantly increased texture resolutions, plus my custom shadow tables with ten frustums and 16k slice sizes), while recording 80Mbps video with OBS+NVENC on a 4.2GHz 5820k and a 2.05GHz 1080 Ti.

There are certainly some spikes in the present intervals--that can mostly be pinned down to asset loading or waiting on Frontier/AWS--and some generally much smaller spikes in actual displayed frametimes, but overall it's a very smooth experience and one that is overwhelmingly GPU limited.
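
For anyone wanting to reproduce that kind of analysis, here is a hedged sketch of pulling present-to-present intervals out of a PresentMon-style CSV and flagging spikes. The file name, and treating anything over 2x the median as a spike, are my own assumptions about the capture setup.

import csv

SPIKE_FACTOR = 2.0  # assumed threshold: 2x the median interval

# "presentmon_log.csv" is a placeholder path; PresentMon logs expose a
# MsBetweenPresents column with the present-to-present interval.
with open("presentmon_log.csv", newline="") as f:
    intervals = [float(row["MsBetweenPresents"]) for row in csv.DictReader(f)]

median = sorted(intervals)[len(intervals) // 2]
spikes = [(i, t) for i, t in enumerate(intervals) if t > SPIKE_FACTOR * median]
print(f"median {median:.2f} ms, {len(spikes)} spikes over {SPIKE_FACTOR}x median")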

Anyway, if I can get my hands on an RX6000 series part (which is what I'm trying to do now), I'll be able to compare it directly with my brother's 3080 to see if my hypothesis is correct.


Been a while since I've seen that one!
 

Very interesting frametime data!

I've found that CPU frametimes frequently blow the budget in the virtual reality pipeline, and this cannot be wholly attributed to the additional geometry pass required for the stereo eye poses. It happens with SteamVR, SteamVR for WMR, Pimax and Oculus (I have HMDs from all 4 manufacturers and have frametime metrics for all of them).

The big problem in virtual reality is that a frame drop caused by a behind-schedule CPU present (ignoring mitigation scenarios such as reprojected frames synthesised by the virtual reality API) is much more detrimental to the experience than in normal monitor-based gameplay.

One small hitch is very jarring - although I do confess to being very sensitive to my immersion being disrupted by such intrusions.
 
I try every day. I cannot find a 3080 or 3090 in stock anywhere. The money is burning a hole in my pocket too. If AMD gets their stock out first, I'll be so tempted. But I'd rather stick with Nvidia.
 

Quite a few Founders Edition 3090 cards on eBay at reasonable prices.
 