Does Elite Dangerous support SLI for two graphics cards?

Had a play with SLI on my dual GTX 1070 cards. Using them in alternate frame rendering mode runs into the bottleneck of my somewhat aging i7-6700K CPU, which can't quite keep up. Using the second card for PhysX works better, with an improvement in frame rate (but not always smoothness) over SLI off. That being said, GPU0 still often hits 100% while GPU1 usually strolls along at 10%, with the CPU doing a bit of light work as well.

The big killer in Odyssey is the inability of the game engine to stop rendering hidden detail. In the image below I'm on a somewhat average (for Odyssey; spectacular otherwise) planet. When I take skybox pictures like below, the frame rate is unsurprisingly steady at the prescribed G-SYNC value of 60 fps. If I point the camera at the ground, which here is a rather nicely drawn assembly of sandy dunes with some crinkly-looking outcrops of rocky stuff we really should be able to sample, and zoom in on the dull grey nothing, the frame rate drops to a measly 30-ish.

Screenshot_0066.jpg

So Odyssey is still busy drawing everything in the direction we are facing in whatever instance it has built around us, regardless of whether it is actually visible.
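To illustrate what I mean, here's a deliberately oversimplified toy in Python (my own sketch of the general idea; it has nothing to do with how the game's actual engine works). Frustum-only culling draws everything in front of the camera, while occlusion culling also skips whatever is hidden behind opaque terrain:

```python
# Toy 1D comparison of frustum-only culling vs. frustum + occlusion culling.
# Purely illustrative; not based on the actual engine.
from dataclasses import dataclass

@dataclass
class Obj:
    name: str
    distance: float    # distance from the camera along the view direction
    opaque: bool       # True for terrain that hides everything behind it

scene = [
    Obj("sand dune (foreground)", 50, True),
    Obj("rocky outcrop", 120, True),
    Obj("settlement", 900, False),
    Obj("distant mountain", 4000, True),
]

def frustum_only(objs, max_draw_distance=10_000):
    """Draw everything in front of the camera, visible or not."""
    return [o for o in objs if 0 < o.distance <= max_draw_distance]

def frustum_plus_occlusion(objs, max_draw_distance=10_000):
    """Additionally skip anything behind the nearest opaque blocker."""
    drawn, blocked_beyond = [], None
    for o in sorted(frustum_only(objs, max_draw_distance), key=lambda o: o.distance):
        if blocked_beyond is not None and o.distance > blocked_beyond:
            continue  # hidden behind closer opaque geometry
        drawn.append(o)
        if o.opaque and blocked_beyond is None:
            blocked_beyond = o.distance
    return drawn

print("frustum only  :", [o.name for o in frustum_only(scene)])
print("with occlusion:", [o.name for o in frustum_plus_occlusion(scene)])
```

Pointing the camera at the dune means everything behind it could be skipped; the frame rate drop suggests it isn't.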

:D S
 
Yeah, I haven't tinkered with it, just leaving it on whatever the stock Elite Dangerous SLI profile is. I haven't rechecked GPU loads and all that yet. Glad it seems to be working again.

But yeah, general optimization could still use some TLC.
 
Yeah, I haven't tinkered with it, just leaving it on whatever the stock Elite Dangerous SLI profile is. I haven't rechecked GPU loads and all that yet. Glad it seems to be working again.

But yeah, general optimization could still use some TLC.
Ty, I will try it out again this weekend. If I see a performance increase, fine. If I see a drop again, I at least know for sure that my CPU isn't up to the task. It's an i7 5690X clocking in at 4 GHz. Should be enough, but it is 6 years old now. Still, it would mean ED is very CPU-heavy and very bad at utilizing multiple cores. I've been suspecting that to be the problem all along. The performance differs too wildly across a wide range of graphics cards for it to be an actual GPU problem.
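If anyone wants to sanity-check the multi-core suspicion on their own machine, something like this rough sketch (assuming the psutil package is installed), run in a second window while sitting in a busy scene, will show whether one core is pegged while the rest idle:

```python
# Rough per-core CPU monitor; assumes the psutil package is installed.
# One core near 100% with the rest mostly idle points at a main-thread bottleneck.
import psutil

try:
    while True:
        per_core = psutil.cpu_percent(interval=1.0, percpu=True)
        print(" ".join(f"core{i}:{load:5.1f}%" for i, load in enumerate(per_core)))
except KeyboardInterrupt:
    pass
```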

Edit: did a quick test, no improvement, putting Odyssey back on the shelf.
 
Yeah, it's quite strange. I'm using two Titan Black cards with 16x PCIe lanes each and an i7-3930K at 4.2 GHz with hyperthreading enabled, so 6 cores/12 threads. A quite capable processor, but rather old now as well. The thing for me, though, is that the GPU load has so far seemed to be the main bottleneck (I haven't really checked it yet with this newest update). I have been force-enabling PCIe Gen 3 after driver updates with an Nvidia patch, since the CPU doesn't officially support it, apparently because not all motherboards at the time had traces specced for it. My Asus P9X79-WS seems to handle it well enough, though. I also have 64 GB of 1600 MHz DDR3 RAM, as eight 8 GB DIMMs in quad channel, so still decently fast for its age and type.

I've mostly been testing in the hangar, though (trying to keep things easy and relatively consistent for comparison), so terrain rendering needs might be significantly different.
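In case it's useful to anyone else forcing Gen 3, here's a rough sketch (assuming the nvidia-ml-py / pynvml bindings are installed) that reports the current vs. maximum PCIe link and the load on each GPU. Note the link tends to drop back to Gen 1 at idle for power saving, so check it while the game is actually running:

```python
# Report PCIe link generation/width and utilization for each GPU.
# Assumes the nvidia-ml-py (pynvml) bindings are installed.
import pynvml

pynvml.nvmlInit()
try:
    for i in range(pynvml.nvmlDeviceGetCount()):
        h = pynvml.nvmlDeviceGetHandleByIndex(i)
        name = pynvml.nvmlDeviceGetName(h)
        if isinstance(name, bytes):  # older bindings return bytes
            name = name.decode()
        util = pynvml.nvmlDeviceGetUtilizationRates(h)
        cur_gen = pynvml.nvmlDeviceGetCurrPcieLinkGeneration(h)
        max_gen = pynvml.nvmlDeviceGetMaxPcieLinkGeneration(h)
        cur_w = pynvml.nvmlDeviceGetCurrPcieLinkWidth(h)
        max_w = pynvml.nvmlDeviceGetMaxPcieLinkWidth(h)
        print(f"GPU{i} {name}: {util.gpu}% load, "
              f"PCIe gen {cur_gen}/{max_gen}, x{cur_w}/x{max_w}")
finally:
    pynvml.nvmlShutdown()
```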
 
Frontier once said that SLI was not supported. I had two cards in SLI mode for years and it never worked.
Looking at the numbers, one card was taking the load while the other was pretty much idle, doing nothing.
 
Frontier once said that SLI was not supported. I had two cards in SLI mode for years and it never worked.
Looking at the numbers, one card was taking the load while the other was pretty much idle, doing nothing.
Up until Odyssey I had been using SLI quite happily, with significant gains from the stock driver profiles. In Odyssey itself, I've only seen noticeable and significant gains from SLI since the latest update.

It's my understanding that SLI is implemented differently for different generations of cards, either in the driver profiles or under the hood, so support may well vary between card generations. Not sure.
 
I ran SLI for a while at release in 2014; it worked really well for the first few updates.

At... hmm... probably some time shortly after Horizons it started not working, then working, then not; each patch was different, so I sold one of my cards and gave up.

It's not being actively supported; if it works, it's not because Frontier are doing anything to make it work.
 
When I take skybox pictures like below, the frame rate is unsurprisingly steady at the prescribed G-SYNC value of 60 fps.
What do you mean by that? G-Sync (or any other adaptive sync technology) does not limit the FPS.
 
I ran SLI for a while at release in 2014; it worked really well for the first few updates.

At... hmm... probably some time shortly after Horizons it started not working, then working, then not; each patch was different, so I sold one of my cards and gave up.

It's not being actively supported; if it works, it's not because Frontier are doing anything to make it work.
I had a few hiccups like this during betas or after new version releases, but they were the exception and all sorted themselves out one way or another.
This led me to believe SLI was still being supported to some extent, at least.

With the latest Odyssey update I'm getting roughly 35 FPS in the hangar with SLI off and 50 FPS with it on. Still about half of what I'm getting in Horizons with SLI, but better than the third I was getting in Odyssey before.
 
Using the second card for PhysX works better

Should be functionally the same as disabling SLI, as Elite: Dangerous almost certainly does not support GPU-accelerated PhysX.

I have been force-enabling PCIe Gen 3 after driver updates with an Nvidia patch, since the CPU doesn't officially support it, apparently because not all motherboards at the time had traces specced for it.

SB-E doesn't officially support PCI-E gen 3. It almost always works, but actual certification had to wait for IB-E. NVIDIA is just playing it safe.

However, to get the lowest input lag, it remains advisable to apply an FPS limit. More details here: https://blurbusters.com/gsync/gsync101-input-lag-tests-and-settings/5/

Overwatch is a bit of an outlier... a competitive FPS that doesn't mandate a low queue depth, which is part of what makes the FPS cap useful there. You can see the opposite trend in some other games (CS:GO in the same article, for example), where more FPS always results in lower latency, even if the game becomes GPU-limited.

In general, I recommend enabling the driver's low latency mode (both NVIDIA and AMD drivers have these options, and they attempt to enforce a low queue depth, among other things) rather than capping the frame rate, unless you're sure the title reacts better to the latter, or you want to cap the frame rate for another reason.
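To make the queue-depth point concrete, here's a back-of-the-envelope toy model (my own simplification, not anything measured from the game or the drivers). It assumes input is sampled when the CPU submits a frame and shown when the GPU finishes it, and ignores everything else. When GPU-bound with a deep queue, latency stacks up to several frame times; force the queue down, or cap the frame rate just below what the GPU can manage, and it collapses back to roughly one frame time:

```python
# Toy model of render-queue depth vs. input-to-display latency (GPU-bound case).
# Assumption: input is sampled at CPU submit and displayed at GPU finish.

def simulate(gpu_ms, cpu_ms, max_queued, cap_fps=None, frames=2000):
    """Average submit-to-finish latency (ms) over the second half of the run."""
    cap_interval = 1000.0 / cap_fps if cap_fps else 0.0
    submit, finish = [], []
    for i in range(frames):
        t = 0.0 if i == 0 else submit[-1] + max(cpu_ms, cap_interval)
        if i >= max_queued:                  # CPU blocks while the queue is full
            t = max(t, finish[i - max_queued])
        submit.append(t)
        gpu_start = t if i == 0 else max(t, finish[-1])
        finish.append(gpu_start + gpu_ms)
    lat = [f - s for s, f in zip(submit, finish)][frames // 2:]
    return sum(lat) / len(lat)

if __name__ == "__main__":
    # GPU needs 20 ms per frame (50 fps max), CPU only needs 5 ms.
    print("deep queue, uncapped  :", round(simulate(20, 5, max_queued=3)), "ms")              # ~60
    print("queue of 1, uncapped  :", round(simulate(20, 5, max_queued=1)), "ms")              # ~20
    print("deep queue, 48 fps cap:", round(simulate(20, 5, max_queued=3, cap_fps=48)), "ms")  # ~21
```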
 
Overwatch is a bit of an outlier... a competitive FPS that doesn't mandate a low queue depth, which is part of what makes the FPS cap useful there. You can see the opposite trend in some other games (CS:GO in the same article, for example), where more FPS always results in lower latency, even if the game becomes GPU-limited.

In general, I recommend enabling the driver's low latency mode (both NVIDIA and AMD drivers have these options, and they attempt to enforce a low queue depth, among other things) rather than capping the frame rate, unless you're sure the title reacts better to the latter, or you want to cap the frame rate for another reason.
While the low latency mode will gain a one-frame advantage at best, it might come at the cost of a slightly less consistent frame rate. In the end it might come down to preference, depending on the game.
 
Just wanted to report that with Update 6 to Odyssey, SLI is back to not working for me. I'll be sticking with Horizons for the time being, outside of testing.
Sorry for any confusion, but as you can tell, it's a bit of a mess.
 