First off I'm sorry to hear a few people have received dead DK2s on arrival, that must be so frustrating.
I guess part of the problem is people are using different benchmarks with lots of variables? Some are testing in hangars, others in stations, open space, or planet rings. I've still not decided whether running my monitor at 75 Hz is making any difference to my DK2 (I've read reports it helps some but not all users). Others have mentioned that even different ships affect fps?
Personally I suspect robdood was right when he said Beta 2.x is never 100% judder free (on initial startup / loading new areas etc) due to software rather than hardware.
Yeah, I think at present there are too many cases where judder can occur, some of which go beyond simply what graphics card you have, and this is likely to remain true in the future as well. Things like internet lag, large data uploads to the card over a few frames, what type of ship you're in, CPU load from other services/apps, hardware, etc.
In fact I suspect you'll never get 100% judder-free on the Rift, now or ever, as PCs are such complex beasts; there will always be a time when a frame takes longer than the refresh interval. All we can hope for is that it is minimised, or only shows as micro-stutters, which perhaps can be coded around or which specific display hardware might be able to mask.
Something I ended up taking into consideration in my new system, though I'm no expert on these things, is PCI-e lanes. I almost went for an i7 5820K, which was the best deal for the price, but its 28 lanes really put me off, as it means you can't run SLI at x16/x16; instead it's going to be x16/x8. However, since in SLI both cards need the same data in order to render the correct output, surely the x16 card is going to have to wait on the x8 one?
Now from what I can tell that means a maximum throughput of almost 16GB/s (PCI-e 3.0) to one card but only 8GB/s to the other. Sounds like a lot, doesn't it? However, at 75 fps that is only roughly 107MB per frame on the x8 link (at 90 fps it's roughly 89MB), and when a single uncompressed 4k x 4k texture with mipmaps is almost 90MB, it's entirely possible that coming into a new area, or streaming in a lot of new data in a single frame, could saturate that bandwidth.
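As a sanity check on those numbers, here's a back-of-the-envelope sketch, assuming roughly 16 GB/s for a PCI-e 3.0 x16 link and 8 GB/s for x8, and an uncompressed RGBA8 texture with a full mipmap chain (the ~1/3 mipmap overhead and the link speeds are approximations, not measured figures):

```python
# Rough per-frame data budget over PCI-e, assuming ~8 GB/s usable on a
# PCI-e 3.0 x8 link and ~16 GB/s on x16 (approximate figures).
def mb_per_frame(link_gb_per_s, fps):
    """Data budget per frame in MB at a given link speed and framerate."""
    return link_gb_per_s * 1000.0 / fps

for fps in (75, 90):
    print(f"{fps} fps: x8 ~{mb_per_frame(8, fps):.0f} MB/frame, "
          f"x16 ~{mb_per_frame(16, fps):.0f} MB/frame")

# One uncompressed 4096x4096 RGBA8 texture, plus ~1/3 extra for mipmaps:
texture_mb = 4096 * 4096 * 4 * (4.0 / 3.0) / 1e6
print(f"4k x 4k texture with mips: ~{texture_mb:.0f} MB")
```

So a single big texture upload really could eat most of one frame's x8 budget, which is the worry.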
There are tests which show that, currently, 28 vs 32 or more lanes makes no noticeable difference to average framerate in games, but they didn't test for micro-stutters, nor with v-sync enabled.
In the end I didn't feel safe going for the 5820K and went with the 5930K instead, as I won't be buying a new system for at least 4 years, maybe longer, and I feel not much has been done in terms of analysing how x8/x8 on PCI-e 3.0 will affect games that load large amounts of data. I suspect I've over-compensated, but I know I'd just be constantly worrying about it otherwise.
A lot depends on where you are, what ship you are in, what kind of station you are in.
Some ships hit performance a bit. The biggest ship I've had is a Lakon 6, and it was the worst performer, or at least shared worst place with the Cobra. Sidies are OK, as are Eagles and Vipers; Haulers are somewhere in the middle.
Some stations hit performance more than others do. The more complex the worse the performance.
Good point about the different ships. I would propose it's down either to the open cockpit variety, or to a lack of optimisation in beta, or a bit of both. As I've said, I suspect ED is currently, and always will be, fillrate limited, meaning that the larger your resolution the worse performance will get, and I think in a non-linear fashion (but I haven't bothered to work out if that's true). So my guess is enclosed cockpits mean ED can use the depth buffer to reject many more pixel operations in a view before doing any shading calculations, and thus some ships will be less demanding, to a degree, than others.
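To put a rough shape on that guess, here's a toy estimate. The overdraw factor and the cockpit-occlusion fractions below are entirely made up for illustration, not measured from ED; the point is just how much shading work early depth rejection could save:

```python
# Toy fillrate estimate. Overdraw and occlusion fractions are invented
# for illustration; they are not measured from Elite: Dangerous.
def shaded_gpixels_per_s(width, height, fps, overdraw=2.0, occluded=0.0):
    """Pixels shaded per second (billions), after depth rejection of
    the fraction of the view hidden behind opaque cockpit geometry."""
    visible = width * height * (1.0 - occluded)
    return visible * overdraw * fps / 1e9

# DK2 panel is 1920x1080 at 75 Hz.
open_view = shaded_gpixels_per_s(1920, 1080, 75)             # nothing rejected
closed    = shaded_gpixels_per_s(1920, 1080, 75, occluded=0.4)  # 40% hidden
print(f"open cockpit:   ~{open_view:.2f} Gpixels/s")
print(f"closed cockpit: ~{closed:.2f} Gpixels/s")
```

If a mostly-enclosed cockpit really does hide a big fraction of the scene, the shading saving is proportional, which would fit the per-ship differences people are seeing.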
As for stations, yep, some big differences. Back in beta 2.02 or 2.03 I was getting 40 fps on my GTX465 in the less demanding dodecahedron stations, but in others (the bright white one) it would drop to 25 fps or less. I'm hopeful Frontier can/will do a decent optimisation pass on these before release; it doesn't make any sense to me that station interior complexity should be allowed to vary so much as to effectively halve one's framerate.
I wouldn't build for the CV1, I have a feeling it will be a long time coming. Judging from Brendan Iribe's speech at Oculus Connect, they want it to have no visible pixels, so that probably means a 4k screen. Will two 980s in SLI be enough for 90 Hz at 4k resolution? I don't know.
Actually, I was watching an interview/play test of Crescent Bay by one of the guys from Tested, and Oculus were very cagey about the new screen. They refused to give resolution information, but Norm from Tested commented on how good it looked, even compared to the Samsung Gear VR's 2560x1440 screen, speculating that they may have found other technological solutions to the screen-door and overall resolution-perception issues. Alternatively, perhaps they are going to switch screen manufacturers, and mentioning a resolution would be too much of a hint? Either way, I got the distinct feeling that a 4k screen may not be necessary for Crescent Bay's visual quality, which can only be a good thing, as I hate to think what GPU would be needed to drive a game like ED on such a high-dpi screen.
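For a sense of scale on that worry, here's a quick raw pixel-throughput comparison between the DK2's 1920x1080 @ 75 Hz panel and a hypothetical 3840x2160 @ 90 Hz one (pixel counts only; this ignores overdraw, shading cost, and distortion rendering):

```python
# Raw pixels-per-second comparison between display modes. The 4k @ 90 Hz
# panel is hypothetical; this is scale only, not a GPU-load prediction.
def pixels_per_s(width, height, hz):
    return width * height * hz

dk2 = pixels_per_s(1920, 1080, 75)   # DK2: 1920x1080 @ 75 Hz
uhd = pixels_per_s(3840, 2160, 90)   # hypothetical 4k @ 90 Hz
print(f"DK2:     {dk2 / 1e6:.0f} Mpixels/s")
print(f"4k@90Hz: {uhd / 1e6:.0f} Mpixels/s ({uhd / dk2:.1f}x the DK2)")
```

Nearly five times the raw pixel rate of a DK2 before you even count rendering overheads, which is why any trick that avoids a full 4k panel would be welcome.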
Oh, and if you've not watched that interview yet, I recommend it. One big thing was that the testing room for journalists had no seating, and it was clear that Norm had absolutely no problems going through the many experiences standing up. Granted, none of them seemed to involve travelling, but I still believe that's another step forward.