VR SLI?

Hey

Does ED support SLI by any chance? I have a good ol' Titan X (Maxwell) and was thinking of getting another one for a few bucks to double up on frame rate for my VR rig, as I recently got a Pimax 5K+ (freaking amazing) but ED drops FPS :-(((( Buying an extra Titan for pennies would be great, since a single Titan can handle one eye easily.
 
I don't think VR supports SLI regardless of the game.
A lot of us would be on the SLI bandwagon if it did!
 
Years ago, in the Elite beta, I had SLI working with a pair of GTX 670s. Basically, make sure you plug the headset into the same GPU as the monitor, otherwise you get really bad stuttering, which is the main complaint I read on forums from people who tried it and say it didn't work for them. I did get some planet textures messing up in SLI.
It didn't give me a frame rate boost as such, but what I did get was a more level frame rate that didn't dip, so I could increase all the settings in the game.
 
Nvidia actually released an SDK for VR SLI some time ago: https://developer.nvidia.com/vrworks/graphics/vrsli
This would let us speed up the frame rate quite a bit for very little money nowadays, say dual Titan X or 980 Tis, with each card handling one eye. It's the cheapest VR horsepower you can get, I think.
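For anyone curious what "each card handling one eye" would look like structurally, here's a minimal conceptual sketch. To be clear, gpu_select(), render_eye(), copy_over_bridge() and composite_and_present() are made-up stand-ins for vendor-specific calls, not the real VRWorks API; the point is just the shape of the technique:

```cpp
#include <cstdio>

// Hypothetical stand-ins for vendor multi-GPU calls (NOT the VRWorks API).
void gpu_select(int gpu) { std::printf("route commands to GPU %d\n", gpu); }
void render_eye(int eye) { std::printf("  render eye %d\n", eye); }
void copy_over_bridge(int src_gpu, int dst_gpu) {
    std::printf("DMA right-eye buffer GPU %d -> GPU %d\n", src_gpu, dst_gpu);
}
void composite_and_present() { std::printf("composite + hand off to HMD\n"); }

// The VR SLI idea: both eyes see the same scene from slightly offset
// cameras, so the two GPUs can work on the SAME frame in parallel,
// unlike AFR, which alternates whole frames and adds a frame of latency.
void render_stereo_frame() {
    gpu_select(0); render_eye(0);   // left eye on GPU 0
    gpu_select(1); render_eye(1);   // right eye on GPU 1, concurrently

    // The right eye's image has to come back to the GPU that owns the
    // headset before presentation; this copy is the latency cost that
    // comes up later in the thread.
    copy_over_bridge(1, 0);
    composite_and_present();
}

int main() {
    render_stereo_frame();
    return 0;
}
```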


Did you monitor GPU performance with those 670s? It could just be placebo mhmhmmm
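For anyone who wants to check this on their own rig, here's a minimal sketch of a per-GPU utilization logger using NVIDIA's NVML library (the same counters nvidia-smi reads). It assumes the NVML headers and library from the CUDA toolkit or driver package are installed:

```cpp
#include <nvml.h>
#include <cstdio>
#include <chrono>
#include <thread>

// Build (Linux):   g++ gpumon.cpp -lnvidia-ml
// Build (Windows): link against nvml.lib from the CUDA toolkit.
int main() {
    if (nvmlInit() != NVML_SUCCESS) { std::puts("NVML init failed"); return 1; }

    unsigned int count = 0;
    nvmlDeviceGetCount(&count);

    for (int sample = 0; sample < 30; ++sample) {   // ~30 s of 1 Hz samples
        for (unsigned int i = 0; i < count; ++i) {
            nvmlDevice_t dev;
            nvmlUtilization_t util;                 // .gpu / .memory, in percent
            if (nvmlDeviceGetHandleByIndex(i, &dev) == NVML_SUCCESS &&
                nvmlDeviceGetUtilizationRates(dev, &util) == NVML_SUCCESS) {
                std::printf("GPU %u: core %u%%  mem %u%%\n", i, util.gpu, util.memory);
            }
        }
        std::this_thread::sleep_for(std::chrono::seconds(1));
    }
    nvmlShutdown();
    return 0;
}
```

If the second card sits near 0% while in-game, SLI isn't doing anything for you.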
 

...and I got a load of replies just like that one every time I mentioned it.

I knew it was working because when the drivers auto-updated they never reverted back to SLI mode; that was something I had to do manually in the GeForce settings page. I could tell when I had new drivers because the game would stutter badly on my high settings, which a single GPU couldn't cope with, until I switched SLI back on again.
 
Uuuuu, that sounds "promising" :D Well, gotta steal a Titan X Maxwell from someone now :D

I kinda wonder, though... maybe it used "classic" SLI rendering, where it just alternated GPUs on each frame, or maybe it used true per-eye GPU rendering hmhmhmmh
 
No, I'm not saying that it's currently working. I'm talking about around the time of the Oculus DK2 with GTX 670s, and the HTC Vive release with SLI 970s. I don't think Nvidia had added the VR rendering at that stage, and I believe the lighting and graphics engine in Elite has had an overhaul since then.

I'd like to try it again, but I'd need a more powerful PSU and another 2080 Ti, which is a fair bit more than adding another 970.
 
Ahhh, hmmm, it must have been the old SLI then. Which makes me wonder if it still works or not o_O What I would give for an official Frontier dev response now >.>
 
I believe the official line has always been that SLI doesn't work... and if you read back through my very old post history from 2014, you'll see that I argued until I was blue in the face that it did actually improve things a fair bit.
 
So just to clarify, based on your personal experience, you're certain that it isn't working anymore? (Finger hovering over the 'buy' button for a second 1080 Ti with puppy-dog hopefulness)
 
aXeL: SLI doesn't work, and when it did, it added latency to the head tracking. Unless that money is seriously burning a hole in your pocket, it's not worth it.
 
Thanks. Yeah, I've been itching to upgrade, but the 2080 Ti isn't enough of a jump from my 1080 Ti @ 2.1 GHz (by my own personal logic rather than any formal analysis), and since Intel can't make Sunny Cove reliably, goodness knows when we'll see Ice Lake-X. 2023? Meanwhile, nothing else ticks my personal boxes: 10 HT cores at 5.0 GHz+ all-core boost, 24/7 stable.
 
There ARE titles where SLI works in VR. It is a bit more complicated to get working, for reasons that are way over my head as a non-game developer, though
(https://steamcommunity.com/app/250820/discussions/0/2561864094350508326/, https://developer.nvidia.com/vrworks/graphics/vrsli)
Here is a list of VR SLI capable titles: https://www.nvidia.com/en-us/geforc...11/244932/vr-sli-vr-sli-supported-games-list/

I really, really have to say (intentionally beating this dead horse) that Elite VERY MUCH needs VR SLI support. I get over 150 FPS with all settings at 100% in 4K with 1.25x supersampling in non-VR with my two 2080 Tis (both are utilized efficiently according to every monitoring tool I use), so I know the game could benefit greatly from it, and VR Elite Dangerous is one of the most graphically demanding pieces of software I own that doesn't use any sort of ray-traced global illumination. It would make this title truly the perfect VR game if VR SLI support were added.

Right now I have an overclock profile specifically for Elite: Dangerous VR, just so that the one 2080 Ti it will use is running at its peak and I can get all the performance possible out of it.
 
For SLI to work, you would need the distortion passes running in parallel on a separate GPU, and this is not supported in the current Elite render pipeline, as it has no native support for single-pass stereo rendering.
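To unpack "single-pass stereo" for anyone following along: without it, the engine submits the whole scene twice, once per eye, which makes parallel per-eye GPUs awkward to bolt on. A rough sketch of the difference; set_viewport(), draw_scene() and draw_scene_instanced() are hypothetical helpers, not Elite's or any real API:

```cpp
#include <cstdio>

// Hypothetical helpers standing in for real graphics-API calls.
void set_viewport(const char* half) { std::printf("viewport: %s\n", half); }
void draw_scene(int eye) { std::printf("  submit all draw calls, eye %d\n", eye); }
void draw_scene_instanced(int instances_per_draw) {
    std::printf("submit all draw calls once, %d instances each\n",
                instances_per_draw);
}

// Classic two-pass stereo (a pipeline without single-pass support):
// every cull pass, state change and draw call is issued twice per frame.
void render_two_pass() {
    set_viewport("left half");  draw_scene(0);
    set_viewport("right half"); draw_scene(1);
}

// Single-pass stereo: each draw is issued once with doubled instancing;
// the vertex shader picks the per-eye view matrix from the instance index
// and routes output to the matching half of a shared render target. With
// the eyes separated like this, farming each eye out to its own GPU
// becomes far more tractable.
void render_single_pass() {
    draw_scene_instanced(2);
}

int main() {
    render_two_pass();
    render_single_pass();
    return 0;
}
```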
 

You can achieve SLI support through StarVR, but it costs a lot. They have developed software to assign each card to each eye, but I don't know if this method works with ED or what the performance is like.
 

Even dedicated per-eye distortion has significant latency, as you still have to ship the secondary frame buffer over the SLI/NVLink bridge to the rendering thread and composite it.
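Some rough numbers on that transfer cost. A minimal sketch, assuming a Pimax-class 2560x1440 RGBA8 eye buffer and ballpark one-way bandwidths (classic SLI bridge roughly 1 GB/s, SLI HB roughly 2 GB/s, 2080 Ti NVLink roughly 25 GB/s); these are estimates, not measurements:

```cpp
#include <cstdio>

int main() {
    // One assumed Pimax-class eye buffer: 2560x1440, RGBA8 = 4 bytes/pixel.
    const double bytes = 2560.0 * 1440.0 * 4.0;     // ~14.7 MB
    const double frame_budget_ms = 1000.0 / 90.0;   // 90 Hz HMD -> ~11.1 ms

    // Ballpark one-way bandwidths; estimates rather than measurements.
    const struct { const char* name; double gb_per_s; } links[] = {
        {"classic SLI bridge", 1.0},
        {"SLI HB bridge",      2.0},
        {"NVLink (2080 Ti)",  25.0},
    };
    for (const auto& l : links) {
        const double ms = bytes / (l.gb_per_s * 1e9) * 1000.0;
        std::printf("%-18s copy takes %6.2f ms of the %.1f ms frame budget\n",
                    l.name, ms, frame_budget_ms);
    }
    return 0;
}
```

Under those assumptions, the copy alone eats the whole frame budget on the old bridges, which lines up with the latency complaint; NVLink changes that picture, but the game still has to support it.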
 
You're assuming that AFR is the only way, and AFR was always prone to micro-stutter. SFR makes a more sensible use case: intelligent analysis of geometry and distribution of the workload per GPU. You don't get the benefit of AFR's brute-force approach to utilising each GPU, but it's not so susceptible to micro-stutter.

Update: OK, scratch that. How many years ago was SFR dropped? Not sure how NV think the 'broadcast' approach will improve latency, except for the CPU.
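For anyone who hasn't met the acronyms, a toy sketch of the two classic SLI modes; schedule_on_gpu() is a hypothetical stand-in for the driver's work submission, not a real call:

```cpp
#include <cstdio>

// Hypothetical stand-in for the driver handing work to a GPU.
void schedule_on_gpu(int gpu, int frame, const char* work) {
    std::printf("frame %d: GPU %d <- %s\n", frame, gpu, work);
}

// AFR (Alternate Frame Rendering): whole frames alternate between GPUs.
// Throughput roughly doubles, but frame N+1 starts before frame N is done,
// so any pacing hiccup shows up as micro-stutter, and input-to-photon
// latency grows by a frame, which is poison for VR.
void afr_frame(int frame) {
    schedule_on_gpu(frame % 2, frame, "entire frame");
}

// SFR (Split Frame Rendering): both GPUs work on the SAME frame, split
// spatially, so there's no extra frame of latency; the catch is that the
// split must be load-balanced, which is why drivers analysed scene geometry.
void sfr_frame(int frame) {
    schedule_on_gpu(0, frame, "top half");
    schedule_on_gpu(1, frame, "bottom half");
}

int main() {
    for (int f = 0; f < 4; ++f) afr_frame(f);
    for (int f = 0; f < 2; ++f) sfr_frame(f);
    return 0;
}
```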
 
I do wonder if something akin to the VorpX injector approach could be used to aid SLI. My first thought is 'no'; however, driver issues notwithstanding, I'm thinking of how VorpX implements both 3D and stereoscopic display for non-VR games. In that case I think it's AFR, but I'm guessing. Assuming so, take Alien: Isolation, for example, a non-3D, non-VR game which runs really nicely in VR through VorpX, except for the letterbox, which I never bothered to fix. What would it take to take that AFR and do it per GPU, performance notwithstanding?

TBH I know nothing about the order of execution in the rendering pipeline, and therefore nothing about the workload allocation for GPUs and whether it can be interrupted by an injector to re-allocate. The last time I rendered anything in 3D was before Windows, so I'm pretty clueless.
 
Possibly the dumbest question of the morning/week/year/forever...

Could NV's VR SLI SDK hypothetically be used to develop an injector? Does anyone know anyone at NV who could be asked?

Moreover, wasn't DX12 supposed to be able to aggregate disparate compute resources and pool them? This is where someone tells me ED is still on DX9, lol. Anyway, in this context, could a wrapper be used, or would it be more plausible to go to Amazon and order unicorn laughter... or ARX?
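On the DX12 point: yes, D3D12 exposes explicit multi-adapter, where the application enumerates every GPU, creates a device on each, and shares surfaces through cross-adapter heaps. The catch is that the game itself has to implement all of it, which is why an injector or wrapper would struggle (and for the record, ED is on D3D11, not DX9). A minimal enumeration sketch, Windows only:

```cpp
#include <dxgi1_4.h>
#include <d3d12.h>
#include <wrl/client.h>
#include <cwchar>
#pragma comment(lib, "dxgi.lib")
#pragma comment(lib, "d3d12.lib")

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<IDXGIFactory4> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory)))) return 1;

    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0;
         factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i) {
        DXGI_ADAPTER_DESC1 desc;
        adapter->GetDesc1(&desc);

        // Try to create a D3D12 device on this physical adapter.
        ComPtr<ID3D12Device> device;
        if (SUCCEEDED(D3D12CreateDevice(adapter.Get(), D3D_FEATURE_LEVEL_11_0,
                                        IID_PPV_ARGS(&device)))) {
            // An engine doing explicit multi-adapter would keep one device
            // per GPU and share surfaces via heaps created with
            // D3D12_HEAP_FLAG_SHARED_CROSS_ADAPTER, on resources flagged
            // D3D12_RESOURCE_FLAG_ALLOW_CROSS_ADAPTER.
            std::wprintf(L"D3D12-capable adapter %u: %ls\n", i, desc.Description);
        }
    }
    return 0;
}
```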
 