Rift CV1 stats confirmed: 2160x1200@90Hz. Oculus recommended card: "GTX 970". Frontier, it's time to optimize!

And this raises a question for me: the extra resolution WILL have an impact on performance. Regardless of how much perceptual difference it makes to my eyes, my R9 290 is still going to have some extra work to do with the CV1. So...
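To put a rough number on that extra work (back-of-envelope only; the real eye-buffer render targets are larger than the panel because of distortion correction, so treat this as a lower bound):

```python
# Rough pixel-throughput comparison, DK2 panel vs the announced CV1 panel specs.
# Assumes one rendered pixel per panel pixel per refresh.
def pixels_per_second(width, height, refresh_hz):
    return width * height * refresh_hz

dk2 = pixels_per_second(1920, 1080, 75)   # DK2 panel
cv1 = pixels_per_second(2160, 1200, 90)   # CV1 panel, per the announcement

print(f"DK2: {dk2 / 1e6:.0f} Mpix/s")      # ~156 Mpix/s
print(f"CV1: {cv1 / 1e6:.0f} Mpix/s")      # ~233 Mpix/s
print(f"CV1 is {cv1 / dk2:.2f}x the DK2")  # 1.50x
```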

I am left wondering if I should just grab a DK2 when I can afford it and just forget about the CV1 until I can also afford a beefier GPU?

And if the CV1 is released at a price point that is about the same as the DK2, I would think it would force a lower price on used DK2s.

Or am I missing something?

My gut feeling (i.e. I am probably wrong ;) ) is that CV1 will have different setup options for those with weaker systems.

Just as with the DK2 we have 60 Hz, 72 Hz or 75 Hz...

My feeling is that CV1 will have at least two refresh rate options (75 Hz and 90 Hz)

and two resolution options: 1080p (with scaling) and native res.

With this in mind, if you choose to run at 1080p @ 75 Hz then IMO there is no reason why performance won't be on a par with the DK2, and you will get the benefit of the improved optics, hopefully a full RGB screen, and the reduced SDE.
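If those fallback modes do exist (pure speculation on my part), the raw panel pixel throughput works out roughly like this:

```python
# Hypothetical CV1 fallback modes (speculation) vs DK2, raw panel pixel throughput.
modes = {
    "DK2 native, 1920x1080 @ 75 Hz": 1920 * 1080 * 75,
    "CV1 scaled, 1920x1080 @ 75 Hz": 1920 * 1080 * 75,
    "CV1 native, 2160x1200 @ 75 Hz": 2160 * 1200 * 75,
    "CV1 native, 2160x1200 @ 90 Hz": 2160 * 1200 * 90,
}
dk2 = modes["DK2 native, 1920x1080 @ 75 Hz"]
for name, pps in modes.items():
    print(f"{name}: {pps / 1e6:.0f} Mpix/s ({pps / dk2:.2f}x DK2)")
```

So a scaled 1080p @ 75 Hz mode would indeed cost exactly what the DK2 costs today, while native @ 90 Hz is about 1.5x that.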
 

Please tell me if I'm being a bit too optimistic, but at the moment you really need to be supersampling to get the DK2 looking moderately okay. If the resolution of the commercial headset is adequate, then will it not run slightly faster, seeing as it will be working at its native resolution? That's of course assuming that people won't be supersampling the new headset to get an even better experience <grin>
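For what it's worth, here's a rough sketch of why native CV1 could actually be cheaper than a supersampled DK2 (the 1.5x per-axis factor below is just an illustrative value, not anything official):

```python
# Supersampling scales the render target per axis, so the pixel cost grows
# with the square of the factor. 1.5x per axis is just an example value.
def pixels_per_second(width, height, refresh_hz, ss_factor=1.0):
    return int(width * ss_factor) * int(height * ss_factor) * refresh_hz

dk2_ss = pixels_per_second(1920, 1080, 75, ss_factor=1.5)  # DK2, supersampled
cv1    = pixels_per_second(2160, 1200, 90)                 # CV1 at native res

print(f"DK2 @ 1.5x SS: {dk2_ss / 1e6:.0f} Mpix/s")  # ~350 Mpix/s
print(f"CV1 native:    {cv1 / 1e6:.0f} Mpix/s")     # ~233 Mpix/s
```

So if the CV1 panel turns out sharp enough not to need supersampling, the native workload is lower than what a heavily supersampled DK2 costs today.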
 
I run 2x 6 GB 780s. I wing up at times with three other people, and I have not noticed a stutter in recent builds. Most settings are maxed out, but I did turn ambient occlusion off, which seems to help with frame rate in stations.

Although a 970 is the recommended minimum for CV1, I'm hoping SLI 780s can cut it. I don't want to upgrade until we get cards with sub-28 nm tech.
 


The problem is that DK2 support will most likely be obsolete when the CV1 hits the market. The DK2 is just a prototype, after all. So I don't know why people think they'll be able to use their DK2 after CV1... no one is going to support it any more.

- - - Updated - - -



With the recent Nvidia announcement, I'm confident that SLI will be the way to go with VR. I'm pretty sure 2x 780s will be more than enough.
 
DirectX 12 will do away with most of the problems developers have with constantly optimizing their games for different cards. FDev simply needs to fully adopt DX12 going forward, as it will increase frame rates a lot in complex scenes. And FDev can spend more time coding the game and less time fumbling about with optimization issues every time a new card or driver comes out. DX12 doesn't just change everything for the gamer; it changes everything for the developer as well, in how they can prioritize their time.

SLI for VR will also be solved by DX12, thanks to split-frame rendering (which also sidesteps the latency problems that today's alternate-frame SLI brings).

Also, for people with AMD cards, it will even out the competition between Nvidia and AMD. After DX12, Nvidia will no longer be top dog just for having fewer cores with higher clock rates; AMD's cards have more cores but lower clock rates. The new API will let the programmer hand out tasks to all the CPU and GPU cores appropriately, with very little overhead.
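Just to illustrate the split-frame idea (a conceptual sketch in Python only, not actual DirectX 12 code; render_half() is a made-up stand-in for real GPU work):

```python
# Conceptual illustration of split-frame rendering (SFR): both GPUs work on the
# *same* frame, each taking one half of the screen, so per-GPU pixel load is
# roughly halved and no extra frame of latency is queued up (unlike AFR).
from concurrent.futures import ThreadPoolExecutor

WIDTH, HEIGHT = 2160, 1200  # announced CV1 panel resolution

def render_half(gpu_id, x_start, x_end):
    # Stand-in for a GPU shading its slice of the frame.
    return gpu_id, (x_end - x_start) * HEIGHT

with ThreadPoolExecutor(max_workers=2) as pool:
    halves = [pool.submit(render_half, 0, 0, WIDTH // 2),
              pool.submit(render_half, 1, WIDTH // 2, WIDTH)]
    for h in halves:
        gpu_id, pixels = h.result()
        print(f"GPU {gpu_id}: {pixels} pixels of this frame")
```

The contrast with AFR-style SLI is that AFR alternates whole frames between GPUs, which buys throughput at the cost of an extra frame of latency, exactly the thing VR can't afford.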
 
Whatever the FOV, it will not be enough. Ideally, the FOV would be 170-180ish; that would really feel like we were "there", but I wouldn't look for that for years yet....
 

I respectfully disagree... whilst it may not be perfect, I think it will be enough to get VR to take off and actually become a "thing".


Rome was not built in a day and all that. Back in the 1940s, I bet people did not look at their 7-inch-screen tellys and say "bah humbug, this is rubbish, I want a full HD, full colour flatscreen ;)"
 