HP Reverb G2 impressions in Elite Dangerous

I'm soon getting a Reverb G2 and I was hesitating between keeping my current order for an RTX 3080 or cancelling it and getting an RTX 3060 Ti. I'm coming from a 1060.
But from what I'm reading, I'm thinking the 3060 Ti won't be nearly powerful enough to play smoothly with this monster of a headset...
Thoughts?

Go as high as your budget allows is what I'd advise. My 2080 Ti requires some considerable compromises to run without reprojection, though reprojection also benefits from a more powerful GPU: the closer you are to 90 fps, the fewer frames have to be generated through reprojection, and the smoother the experience feels. For those who experience nausea in VR, this can be critical.
 

3080
 
My 1080 Ti is OK in ED in VR at medium settings on the G2, though not perfect. General yardstick for high-res VR: throw whatever you can afford at it.
I am planning on either a 3080, 3080 Ti or 6800 XT for my next upgrade, to go with a 5900X CPU, once supply settles down.
 

I am also eyeing the 3080 Ti, but if I can get hold of a 3080 FE in January, I'll do it. It is actually the sensible choice; considering the 3090's performance, the difference will be marginal anyway.
But having the Ti would be great, if nothing else for the brand snobbery. :)
 
It is the RAM which concerns me a bit on the 3080: going back 1 GB from my current GPU, which came out... 3? years ago, seems wrong to me. I would go AMD this time, but ray tracing is important for my next upgrade, and with (currently) no DLSS equivalent and weaker ray tracing (a feature which is, IMO, only just about fit for purpose on the 3080 WITH DLSS if you want to game at the equivalent of 4K), I am not convinced about AMD for my GPU this gen, despite it possibly having an edge in normal rasterisation (and RAM amount).

The leaked benchmarks for the 3080 Ti are disappointing, however, IMO. I was hoping for essentially 3090 performance as far as gaming is concerned, given the same core count and clock speed and just a little less memory bandwidth... After all, the 3090 is no great shakes better than the 3080 in performance... But the 3080 Ti, if the leaked benches are believable, sits much closer to the 3080 than to the 3090. Possibly not worth a £300 price increase.
 

Elite isn't very VRAM demanding; I don't think I've seen it go over 10 GB, or not far over if it has. X-Plane 11 munches through it big time in VR though, and I've seen over 21 GB in use. MSFS 2020 isn't too bad at 4K (I think I've seen around 11 or 12 GB in use), but I'm guessing that would increase a bit in VR when it's out.

I've not seen it use over 10 GB playing RDR2 or Shadow of the Tomb Raider in 4K ultra, with RTX on the highest settings in Tomb Raider. If VR is your main thing and you play flight sims, I'd certainly lean towards more than 10 GB of VRAM; otherwise it's probably enough.
 
I am a pretty general gamer; I like a bit of most things. I have FS2020 ready for VR, and a bunch of games like Metro Exodus waiting to go (lots of games, little time to play). I will likely get Cyberpunk at some point...
But here is the thing: I am due a full upgrade of CPU, GPU, RAM and SSD (my CPU is over 5 years old, an i7-5820K). It's a good CPU, but I think it would bottleneck any new GPU I buy now. I buy a CPU and expect it to last 5 years minimum... During that time I would probably expect to upgrade my GPU once more within a CPU cycle, so realistically I am hoping for 2.5-3 years out of my next GPU... I am confident 10 GB is OK for now... but in 18 months?
 

I don't think 10 GB will age well, but I'd guess it will be OK for a couple of years.

I thought my 8700K would last longer than it has, but I think I'm probably going to upgrade next year. The 3090 is such a monster that I think the rest of the system is holding it back now in MSFS and X-Plane. I'm probably going to go for a 5800X or 5900X and hopefully get 5 years out of the CPU, RAM and MB.
 
Yeah. Whilst that is my thinking too, the only thing nagging at the back of my mind is that, as good as Zen 3 (the 5000 series) is, it is the end of the line for DDR4. Part of me is saying I should make do with my existing rig and wait for a next-gen AMD DDR5 system. But then there is always something new on the horizon... so I will probably upgrade as planned.

I have this G2 and it is practically begging me for more power!
 
Same for me. I expect DDR5 will cost a fortune when it comes out and probably won't be available until towards the end of next year. If it ends up slipping into 2022, I'll end up with another unbalanced system!

What would be good is if Zen 4 and AM5 supported DDR5 and came out mid next year. Then I'd probably hold off, as AM5 should be around for ages and I could shove in my DDR4 for now and upgrade to DDR5 later.
 
Pale, did your G2 come with a USB-C to USB-A adapter?

I got my G2 today from SystemActive and got the kit assembled; I might have the G2 up and running by tomorrow. Before that, I'll assemble the plinth on casters, so the PC is easy to roll along the ground for easy access.

The Man Cave is coming together 😍

Anyhoo if yours did not come with that adapter, I would email HP to send one out.

(I had a look on the HP Web site for you, but they don't list it.)

Yea, it came with an adapter, but I didn't use it. I tried it last night and didn't really see a difference. (Is it only used for tracking?)

I switched from the Oculus Rift to the G2 and had issues with Steam etc. I finally got everything worked out, but when I try to change settings within Options/Graphics, I get jittering as soon as I select the new setting, so bad that I almost can't click on "Yes" to save the settings before my time runs out. I have also noticed that when I am at the Cliff House, or have Edge open, the screen seems to move back and forth ever so slightly, like it's trying to adjust the video.

Does anyone have a good settings guide for ED/SteamVR/WMR, now that the G2 has been out for a little bit? I had the Oculus running so smoothly that switching to the G2 has been an adventure, to say the least.
 

I'd start by following HeavyGroovz's instructions to turn reprojection on: set the line of code from "disabled" to "auto".


In the SteamVR settings, switch the render resolution to manual (auto selects ridiculous settings). Wind it down to 50% and work your way up from there until performance starts to suffer.
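
If you'd rather set it outside the SteamVR UI, the same thing lives in steamvr.vrsettings (usually under Steam\config\). A minimal sketch only; the key names below are from memory rather than from this thread, so confirm them by moving the slider once in the UI and seeing what SteamVR writes to the file:

Code:
"steamvr" : {
    // take manual control instead of letting SteamVR auto-pick a resolution
    "supersampleManualOverride" : true,
    // 0.5 = 50% of the recommended render resolution; raise it gradually
    // until performance starts to suffer
    "supersampleScale" : 0.5
}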

Also, the settings only apply the next time you launch whatever game in VR, so you'll need to quit Elite and then launch it again for the changes in Steam to take effect.

This might help:

Source: https://youtu.be/YpKFTqOUY-s
 
Also, the settings only apply the next time you launch whatever game in VR, so you'll need to quit Elite and then launch it again for the changes in Steam to take effect.
Elite Dangerous, specifically, seems to re-acquire the SteamVR recommended render resolution every time one changes one of the game's own SS / HMD Quality settings; so as an alternative to relaunching the game, one can hop into settings, change HMD Quality to something else, and then back to 1.0 again, to make one's new SteamVR render resolution setting "take".

It is, of course, also possible to make Elite gobble up all one's VRAM. It really looks quite nice, and "tangible", with a ton of supersampling, a 16k skybox (especially if one alters one's galaxy map parameters to make things a bit more eventful), and 8k planet textures; of course, the frame rate when doing a 400% render target on a 20+ ppd VR headset becomes rather slideshow-y. :7

I find a 6k skybox strikes a, to me, tolerable balance between looks and hyperspace jump time (during which the skybox is rendered), as long as I'm not going to do too many consecutive jumps; and 3k planets for rendering to a Pimax 8KX at x1.5 quality (225%). Any less, and the night lights on an inhabited Earth-like look "blobby"; more, and they begin to exhibit quite a bit of aliasing (...which can be balanced with more supersampling :p).

(Don't know if this has changed, but the upper limit SteamVR puts on the bitmap size it recommends to a game at least used to default to 4096, and any SteamVR render resolution past that would be reduced to fit inside that frame, even though a larger number would be shown under the supersampling slider. This can be increased (to e.g. 8192) with the key "maxRecommendedResolution" in the "steamvr" : {} object of one's steamvr.vrsettings file.)
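
For reference, a minimal sketch of that tweak, using only the key named above and assuming the usual file location (Steam\config\steamvr.vrsettings); back the file up before editing:

Code:
"steamvr" : {
    // lift the default 4096-pixel cap on the render-target size
    // that SteamVR will recommend to games
    "maxRecommendedResolution" : 8192
}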
 
Whew, I'm getting mine today and I found this thread.

I'm pretty ignorant of all these settings and was hoping for a plug-and-play experience.

Guess I have some homework.
 
It is the RAM which concerns me a bit on the 3080: going back 1 GB from my current GPU, which came out... 3? years ago, seems wrong to me. I would go AMD this time, but ray tracing is important for my next upgrade, and with (currently) no DLSS equivalent and weaker ray tracing (a feature which is, IMO, only just about fit for purpose on the 3080 WITH DLSS if you want to game at the equivalent of 4K), I am not convinced about AMD for my GPU this gen, despite it possibly having an edge in normal rasterisation (and RAM amount).

The leaked benchmarks for the 3080 Ti are disappointing, however, IMO. I was hoping for essentially 3090 performance as far as gaming is concerned, given the same core count and clock speed and just a little less memory bandwidth... After all, the 3090 is no great shakes better than the 3080 in performance... But the 3080 Ti, if the leaked benches are believable, sits much closer to the 3080 than to the 3090. Possibly not worth a £300 price increase.

I don't think RAM will be an issue, not for the next 3-5 years during which this card is expected to run the top games anyway. We'll see soon enough what Flight Sim demands in VR; that will set the benchmark for me anyway.

For me AMD is not attractive at all. DLSS will be big in VR too, and with this image quality, that plus ray tracing will be a truly generational leap.
 
Elite Dangerous, specifically, seems to re-acquire the SteamVR recommended render resolution every time one changes one of the game's own SS / HMD Quality settings; so as an alternative to relaunching the game, one can hop into settings, change HMD Quality to something else, and then back to 1.0 again, to make one's new SteamVR render resolution setting "take".

It is, of course, also possible to make Elite gobble up all one's VRAM. It really looks quite nice, and "tangible", with a ton of supersampling, a 16k skybox (especially if one alters one's galaxy map parameters to make things a bit more eventful), and 8k planet textures; of course, the frame rate when doing a 400% render target on a 20+ ppd VR headset becomes rather slideshow-y. :7

I find a 6k skybox strikes a, to me, tolerable balance between looks and hyperspace jump time (during which the skybox is rendered), as long as I'm not going to do too many consecutive jumps; and 3k planets for rendering to a Pimax 8KX at x1.5 quality (225%). Any less, and the night lights on an inhabited Earth-like look "blobby"; more, and they begin to exhibit quite a bit of aliasing (...which can be balanced with more supersampling :p).

(Don't know if this has changed, but the upper limit SteamVR puts on the bitmap size it recommends to a game at least used to default to 4096, and any SteamVR render resolution past that would be reduced to fit inside that frame, even though a larger number would be shown under the supersampling slider. This can be increased (to e.g. 8192) with the key "maxRecommendedResolution" in the "steamvr" : {} object of one's steamvr.vrsettings file.)

Interesting. Might have a fiddle with some of those settings.
 
I wanna thank you for the code. I upgraded to a 3070 from a 2070 and, for whatever reason, everything went to poo; the code seems to have helped.



Motion smoothing will not do anything; it is native to the SteamVR runtime, not the SteamVR for Windows Mixed Reality runtime.

You need to enable WMR's own motion vector reprojection:

\Steam\steamapps\common\MixedRealityVRDriver\resources\settings\default.vrsettings

Code:
// Motion reprojection doubles framerate through motion vector extrapolation
//     motionvector = force application to always run at half framerate with motion vector reprojection
//     auto         = automatically use motion reprojection when the application can not maintain native framerate
//     disabled     = turn off motion reprojection
//
// Comment out or remove this line to use the SteamVR settings for controlling motion reprojection
        "motionReprojectionMode" : "disabled",


 
Update. After installing the code it got better, but it wasn't the real issue. The real issue was that I had the WMR window displayed on a 144 Hz monitor, making for irregularities in rendering frames in the G2 headset. I switched my monitor to 60 Hz, along with the headset, and everything cleared up: no more stuttering, and head movement became smooth. I can't create a custom 90 Hz 1440p mode in the NV control panel; there is an 87 Hz option. I have not tried running the window on my other monitor, which runs at 60 Hz, and I'm not sure whether all my monitors need to run at the same Hz for best performance or not.

What happened is I bought a 3070, upgraded from a 2070, which didn't have the issue. With the 2070 I had my 144 Hz monitor on an HDMI port, so it wasn't getting more than 60 Hz and it wasn't an issue; when I got the 3070 it had all the correct ports to hook up my triple monitors and run my 1440p monitor at 144 Hz, hence why I didn't have the issue with the 2070.
 

I don't think RAM will be an issue, not for the next 3-5 years during which this card is expected to run the top games anyway. We'll see soon enough what Flight Sim demands in VR; that will set the benchmark for me anyway.

For me AMD is not attractive at all. DLSS will be big in VR too, and with this image quality, that plus ray tracing will be a truly generational leap.
Ray tracing won't be an issue for the next 3-5 years either, no matter what Nvidia say.
 
Hmmmm, I hope my G2 does not have a fault.
I was gaming for a few hours, all fine, then my display started cutting out, then coming back, then cutting out again.
This possibly coincided with my TV turning off due to power saving, so I am hoping it is a software issue and the monitor turning off confused it.

But if not, there is something wrong. Time will tell. Has anyone else had issues with it cutting in and out?
 