So I ended up landing on the latest NVIDIA drivers, v430.39 (I had v417.71 before), and the difference in max and average FPS is massive. Either it's a genuine driver improvement (I'm always skeptical, although I do enjoy the placebo effect of clean-uninstalling via DDU and reinstalling new drivers), or I had something messed up somewhere and this update cleaned it up.
I noticed that after the NVIDIA driver update my custom 2560x1440 @ 75Hz video mode was removed (as expected), so ED ended up settling at the usual 2560x1440 @ 60Hz (VSync still on). Maybe it was that, maybe not... but either way it looked like I had a 2x more powerful graphics card at the same XML settings.
As far as NVIDIA 3D settings go, here is what I have:
In Program Settings (elitedangerous.exe)
Anisotropic Filtering: 16x
Texture filtering - Quality: High Quality
Threaded Optimisation: ON
Maximum pre-rendered frames: 1 or "Use the 3D Application Settings" (*)
(*) Quoting Morbad: "Maximum pre-rendered frames = 1 does not help performance and will actually increase CPU utilization slightly. However, it does limit the render queue to one frame, which minimizes overall input latency, and so I almost always force this to 1."
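To put a rough number on why the queue depth matters (my own back-of-the-envelope, not Morbad's math): every frame sitting in the render queue adds roughly one frame-time of delay between your input and what you see on screen, so at 60 FPS each extra pre-rendered frame is another ~17 ms of lag.

```python
# Rough estimate of input latency added by the render queue.
# Simplification (mine): treat each queued frame as one full frame-time
# of extra delay; real pipelines overlap work, so read this as an upper
# bound rather than an exact figure.

def queue_latency_ms(fps: float, pre_rendered_frames: int) -> float:
    frame_time_ms = 1000.0 / fps
    return frame_time_ms * pre_rendered_frames

for frames in (1, 2, 3):
    print(f"{frames} pre-rendered frame(s) at 60 FPS ~ "
          f"{queue_latency_ms(60, frames):.1f} ms of extra latency")
```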
In Global Settings
DSR - Factors: 4.00x (Native Resolution)
DSR - Smoothness: 0% (alternative combo: 33%)
Power management mode: Prefer maximum performance
I get the best performance at a final render resolution of 2560x1440, naturally. That can be either 4x DSR with 0.50 in-game SuperSampling, or 1x DSR with 1.0 in-game SS (obviously). The latter is crisper, more defined and more readable, but I can't say I like it better. The former makes things a bit blurrier, but it looks less like videogame polygons and more like watching actual video footage. I'm sure some people will call me crazy for this, but it's especially noticeable on things like the huge station pistons/pylons and other rounded structures.
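If anyone wants to sanity-check the resolution math, this is how I understand the two knobs to combine (assuming the DSR factor scales total pixel count while the in-game SS slider scales each axis; correct me if I have that backwards):

```python
# Effective render resolution when stacking NVIDIA DSR with
# Elite Dangerous' in-game supersampling slider.
# Assumption (mine): DSR 4.00x means 4x the pixels, i.e. 2x per axis,
# and the in-game SS value multiplies each axis directly.

NATIVE_W, NATIVE_H = 2560, 1440

def render_resolution(dsr_factor: float, in_game_ss: float):
    axis_scale = dsr_factor ** 0.5   # 4.00x DSR -> 2x per axis
    w = round(NATIVE_W * axis_scale * in_game_ss)
    h = round(NATIVE_H * axis_scale * in_game_ss)
    return w, h

print(render_resolution(4.00, 0.50))  # (2560, 1440) -- same pixel count as native
print(render_resolution(1.00, 1.00))  # (2560, 1440) -- the plain native path
```

Both combos end up rendering the same number of pixels, which is why the performance is comparable and the difference is purely in how the image is filtered on the way down.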
For quality settings I have everything maxed out, plus Morbad's settings for shadows as found here: Graphics Settings Beyond Ultra -- Shadows (forums.frontier.co.uk)
I am a bit scared to find out whether it was the 75Hz setting that was making things weird (forcing a higher average FPS than my card can sustain), and anyway I don't want to risk it.
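The reason it worries me is simple frame-budget arithmetic: with VSync on, the card has to finish every frame inside one refresh interval, and that window is noticeably tighter at 75 Hz than at 60 Hz:

```python
# Frame-time budget per refresh with VSync on: miss the window and the
# frame gets held for the next refresh, which shows up as stutter.

for refresh_hz in (60, 75):
    budget_ms = 1000.0 / refresh_hz
    print(f"{refresh_hz} Hz -> {budget_ms:.1f} ms per frame")
# 60 Hz -> 16.7 ms per frame
# 75 Hz -> 13.3 ms per frame
```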