"Made for VR", but still no SLI support?

ED now requires DX11, and Nvidia's VR SLI support has been out a while, but it seems FDEV is still not implementing it. It's looking more and more to me like this technology is going to be required to run the next generation of HMDs, because I don't think a single card will have the power for several years. Does anybody know what's going on with this? Do they have plans to implement it?
 
It also doesn't work with omnidirectional treadmills (for faster than light sprinting) or output telemetry data for motion platforms / bass shakers. Maybe we should uninstall?
 
Pretty much....

But the Nvidia "VR SLI" was supposed to be a new implementation that worked with VR. It was up to FDEV to enable it. And my twin GTX 970s gave me about 70% higher FPS in 2D at 2560x1440 and 2.0 SS, so I'd think dual GTX 1080s would blow away a single RTX 2080 Ti, but I could be wrong (famous last words). And I think I'd rather have dual cards, spreading the power and heat, than one card screaming at high temps and power. One per eye, synced, makes sense to me.

So are you saying Nvidia has already abandoned this?
 
SLI will remain a wonky and expensive solution until you can take any two or more cards and merge them into one virtual card presented to the OS.

At which point you would see benefits in all aspects of GPU computing, not just in the few titles that happen to support the current implementation.
 

Most VR games don't have SLI implemented. I can only think of one: Serious Sam.


Very much this. Until the OS can see multiple cards as one entity, SLI and Crossfire are not that good.
 

ED doesn't support VR SLI/SLI/Crossfire, so there's no point in buying the cards until FD changes that. But it's been years now, so I wouldn't hold your breath.


As for your last point: will you buy the latest Nvidia card for its raytracing abilities in ED?
 
I thought SLI was a dead medium?

Not really dead. Well, it is for VR, since Oculus doesn't support it, but SLI is healthier now and has more support than ever before for standard 2D games.

Mostly down to the high price of the 2080 Ti: you can pick up two 1080 Tis for around £200 less than a single 2080 Ti, and in games that support SLI they'll wipe the floor with the new card... just not in VR, and not in Elite :(
 
Is ED supporting raytracing? I can't imagine what good it would be for flying spaceships. Maybe when they introduce atmospheric landings and space legs, but not now.

When a new HMD comes out with 4K per eye, or near that, with a wide FOV (150 deg or so), in a year or two (hopefully), I'll be ready to build a new computer. At that time, I'll buy whatever is required to support that HMD in ED. The problem is, I doubt a single card will be able to handle that load. I'm playing on 4K now and have SS set to 1 at Ultra everything. This maintained 60FPS everywhere with Win 7 (dropped to 40-50FPS in stations and RES with Win 10). Now imagine tripling that load at 90FPS on two screens, even at low settings. I just hope it won't require SS to get a good image. If I can't offload that onto 2 or more cards, I just don't see how it can work.
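A quick back-of-envelope check of that "tripling" estimate, using only the resolutions and frame rates mentioned above (this counts shaded pixels per second and ignores CPU, geometry and lens-distortion overhead):

```python
# Pixel throughput comparison, using the figures from the post above.
current = 3840 * 2160 * 60   # single 4K monitor at 60 FPS
hmd = 2 * 3840 * 2160 * 90   # hypothetical 4K-per-eye HMD at 90 FPS
print(hmd / current)         # → 3.0, i.e. triple the pixel throughput
```

So "tripling the load" is roughly right even before supersampling is factored in.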
 
You don't need 4K per eye; it would be useless and a waste of money. I think that for the human eye, 1920 x 1080 on each VR display is enough. Watch this:

[video=youtube;VxNBiAV4UnM]https://www.youtube.com/watch?v=VxNBiAV4UnM[/video]
 

You can't compare sitting 5 metres from a 50" 4K TV to a huge-FOV HMD image.
Try sitting 15 cm from a 1080p TV and tell me you don't see the pixels.
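The difference can be put in numbers as pixels per degree of visual angle. A rough sketch, with the HMD figures assumed (roughly first-generation headset territory):

```python
import math

def pixels_per_degree(h_pixels, width_m, distance_m):
    """Horizontal pixels divided by the horizontal angle the screen subtends."""
    fov = 2 * math.degrees(math.atan((width_m / 2) / distance_m))
    return h_pixels / fov

# 65" 16:9 UHD TV (panel roughly 1.44 m wide) viewed from 2.5 m
tv_ppd = pixels_per_degree(3840, 1.44, 2.5)

# early HMD: ~1100 horizontal pixels per eye spread over ~100 degrees (assumed)
hmd_ppd = 1100 / 100

print(round(tv_ppd), round(hmd_ppd))  # prints roughly 119 vs 11
```

An order of magnitude fewer pixels per degree is why the HMD looks pixelated while the TV does not, regardless of the panel's nominal resolution.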
 
How many NEW titles support SLI/Crossfire?

Most of the AAA titles released this year, with the exception of Forza Horizon 4 and the new Assassin's Creed.

Just a few off the top of my head that support SLI, scale well (60-90%) and were released within the last year or so: Shadow of the Tomb Raider, Call of Duty: Black Ops 4, Battlefield V, Far Cry 5, Vampyr, Star Wars Battlefront II, Destiny 2, Wreckfest, Warhammer: Vermintide 2 and Sea of Thieves. There are probably a bunch more, but those are the only ones I've personally tried.

*Actually, I should take Sea of Thieves off that list, as it doesn't natively support SLI and requires messing with SLI profiles to work.
 
That 1080p-per-eye claim is absolute bullcrap.
For true VR you actually need something akin to 16K per eye; that's how much resolution you perceive in the real world.
Not see, but perceive: there is an acute distinction between the two. The brain itself does a lot of world-building, and exploiting how it does this is, for instance, how we get optical illusions, by tricking the brain into building an erroneous world.

That video and many other regurgitated opinions keep cropping up around 4K.

One thing they have in common is that they mostly come out of complete ignorance of 4K media, the display technology, the content being viewed, and the tight tolerances a 4K screen starts to operate under.

The video you linked covers pretty much the ground between utter misconception and gross simplification, followed by just being plain wrong.

It honestly holds as much water as an anti-vaxxer's argument, only it's harmless in that believers simply don't buy 4K screens rather than killing children.

I am -5 myopic, sitting 2.5 m away from a 65" UHD screen, and can easily tell the difference between 1080p and 4K, because I have taken the effort to get equipment capable of bringing out the difference and to have it set up well enough to boot.
 
Yeah, that guy's talking about a 50-55" TV viewed from 6-8 feet away. I'd agree with him, more or less, for general picture quality, but when the credits start scrolling, the 4K is much easier to read, and on a 65" or larger TV the 4K is clearly superior. But as said above, when you put that screen a few cm from your eye, magnified through a lens, the pixels are very visible. It's like holding a magnifying glass up to a TV. So 4K per eye is only the starting point for good picture quality in VR, though I won't be holding my breath for the power to drive 8K and higher in an HMD (and I agree that would be ridiculous for a TV).
 
Ahhh, but I keep forgetting about eye tracking and foveated rendering, which probably means my current 4-year-old computer and 1080 will be more than enough to handle 4K per eye; no need for SLI or super cards (and a new computer) until you get to 8K/16K. Actually, with the sweet spot so low on current lenses, I'd think they could already render the periphery of each screen at a lower resolution and it wouldn't be noticed, other than in a higher FPS (I assume they aren't doing this already?).
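A toy estimate of what foveated rendering could save, with all the fractions assumed purely for illustration:

```python
# Shade the central 20% x 20% of the frame at full resolution and the
# periphery at quarter resolution (half resolution in each axis).
foveal_area = 0.2 * 0.2                          # 4% of the frame area
shaded = foveal_area + (1 - foveal_area) * 0.25  # fraction of pixels shaded
print(f"{shaded:.0%} of the naive pixel count")  # prints: 28% of the naive pixel count
```

Roughly a 3-4x saving under these assumptions, which is why eye tracking plus foveation gets floated as the alternative to simply throwing more GPUs at the problem.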
 