SLI looks to be the black sheep of the GPU family. NVIDIA and AMD support it (probably to sell more GPUs), but at the end of the day they don't support it all that well.
Given the level of GPU performance today, SLI really isn't needed for top end VR.
Here's a good article on the different types of anti-aliasing.
http://www.hardocp.com/article/2013...mage_quality_video_card_review/4#.V07iMlaDFBd
Well, I both agree and disagree here.

The problem with SLI/Crossfire is not entirely Nvidia/AMD. Game developers also need to implement support in their games. No matter what drivers Nvidia/AMD push out, if the game itself doesn't support it, it just won't work well, if at all.
While GPU power is indeed pretty good right now, it's still not reliable enough for VR, as we can see from Elite's performance issues in VR. True, the technology is new and all that, but Elite is probably one of the only AAA games that has been pushing for good VR support from the start. So if the GPU doesn't perform well in Elite, how would it perform in less optimized games?
Thing is, we have to look at the resolution on the Vive. While it's not as high as a regular monitor, we have to understand that we are basically running two monitors (three, if you include your mirror window on your normal screen).
The Vive renders at a resolution of 1080x1200 per eye. Not very impressive in itself. But since you are rendering two views at that resolution, the combined render target is effectively 2160x1200. Now that's a lot different.
If you apply supersampling to that, let's say 1.5x, each dimension scales up: 1620x1800 per eye, or 3240x1800 combined. Increase that to supersampling 2.0 and you're rendering 2160x2400 per eye (4320x2400 combined), which is more pixels than a 4K monitor, a very heavy load on the GPU. (Read the article shadragon linked to if you are curious about how the different antialiasing methods work.)
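A quick sanity check of the pixel counts, assuming the supersampling factor applies to each dimension of each eye's render target:

```python
# Illustrative math only: Vive per-eye panel resolution and the effect
# of supersampling on total pixels rendered per frame.
EYE_W, EYE_H = 1080, 1200  # Vive resolution per eye

def pixels_per_frame(ss: float = 1.0) -> int:
    """Total pixels for both eyes at a given supersampling factor
    (the factor scales each dimension)."""
    w = int(EYE_W * ss)
    h = int(EYE_H * ss)
    return 2 * w * h  # two eyes rendered every frame

print(pixels_per_frame())     # native: 2,592,000 px (~2160x1200 combined)
print(pixels_per_frame(1.5))  # 1.5x SS: 5,832,000 px
print(pixels_per_frame(2.0))  # 2.0x SS: 10,368,000 px, more than 4K (8,294,400)
```

So 2.0x supersampling on a Vive already pushes past the pixel count of a 4K monitor, and that's before any stereo overhead.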
Not to mention, GPUs have to deal with extra overhead since they are rendering stereoscopic images instead of flat 2D images (remember Nvidia 3D Vision? You needed a beefy computer for that).
We all know that there is still not a single GPU that can handle 4K reliably. Even the new GTX 1080, despite all Nvidia's promises, falls short when it comes to 4K.
So what's the possible solution? Properly implemented SLI or Crossfire could deal with the VR performance issues once and for all. In an ideal scenario, each card in an SLI configuration would render one eye. So, instead of one card rendering both eyes, or both cards trying to render both eyes, split the workload evenly: card #1 for eye #1, card #2 for eye #2.
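A hypothetical sketch (not any real SLI/Crossfire API) of the ideal split described above, where each GPU renders one eye in parallel; `render_eye` is a stand-in for submitting one eye's draw calls to a specific card:

```python
# Toy model of per-eye multi-GPU rendering: two workers stand in for
# two cards, each handling one eye of the same frame concurrently.
from concurrent.futures import ThreadPoolExecutor

def render_eye(gpu_id: int, eye: str) -> str:
    # Placeholder for dispatching one eye's workload to one GPU.
    return f"GPU {gpu_id} rendered {eye} eye"

def render_frame_split() -> list[str]:
    # Card #1 for eye #1, card #2 for eye #2.
    with ThreadPoolExecutor(max_workers=2) as pool:
        left = pool.submit(render_eye, 0, "left")
        right = pool.submit(render_eye, 1, "right")
        return [left.result(), right.result()]

print(render_frame_split())
# ['GPU 0 rendered left eye', 'GPU 1 rendered right eye']
```

In the ideal case, each card does half the per-frame work, so frame time roughly halves, minus whatever it costs to sync and composite the two eyes.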
Can it be done? Definitely yes. HTC/Valve proved that SLI or Crossfire is perfectly viable for VR, with their "VR Performance Test". Before switching to my GTX 980Ti last month (should've waited for the 1080, lol), I had two R9 290X in Crossfire.
Using one card, the performance test showed my VR readiness at 6.7 out of 11. With Crossfire enabled, VR readiness jumped to 10.8-11 out of 11. So Valve showed that with proper in-game implementation, Crossfire/SLI could be used to increase performance dramatically.
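Back-of-the-envelope math on those scores, assuming the test's rating scales roughly linearly with performance (and keeping in mind the score caps at 11, so the real uplift may be understated):

```python
# Uplift from the second card, using the scores quoted above
# (6.7 single card, 10.8 with Crossfire enabled).
single, dual = 6.7, 10.8
gain = dual / single - 1.0
print(f"{gain:.0%} uplift from the second card")  # 61% uplift
```

Even with the score ceiling in play, that's a dramatic gain from simply enabling the second card.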
Fun fact: did you know that AMD's Crossfire usually scales better than SLI? I don't know the reason, but with two cards I sometimes got as much as 90% performance scaling on the second card. I have yet to achieve something like that in SLI. (I've had a bunch of Nvidia and AMD cards; I don't pick sides. I usually go with the best bang for the buck, or raw performance, depending on budget XD)
Lastly... Nvidia hyped the 1080 up to high heaven. True, it's a great card, and even faster than the 980 Ti or Titan X (not enough to warrant an upgrade, though). But 4K at a consistent FPS is still out of reach, and DX12 performance seems to be poor; Nvidia really, really needs to get on top of their DX12 issues. The 980 Ti and all previous Nvidia cards were unable to do asynchronous shaders at the hardware level, and as some of you may know, this is very important for VR.
It remains to be seen whether the 1080 fixes this issue. If it does not, Nvidia might have a loser on their hands.
AMD, on the other hand, has supported asynchronous shaders at the hardware level for quite a while now. The fact that the 290X, which is an extremely old card, could score almost 7 in the VR test is a good demonstration of how important those shaders are. And yes, at lower settings, but the 290X could handle Elite in VR just fine with some tweaking. It's no secret that AMD performs much better than Nvidia in DX12. And as of late, AMD has also been very timely (and good) with their drivers. This time around, Nvidia really needs to step up their game.
My advice to you: don't bother with the 1080. If you have a Titan X or a 980 Ti, it's not worth it. If you have anything lower, then yes, go for it.
Personally, I will probably wait for AMD's Polaris, or even Vega next year.
