FPS and my Nvidia GPU - Struggle with FPS? Here's how I solved mine!

Hello folks,

As we all know, the graphics in this game absolutely killed frame rates on old and new GPUs alike when the expansion was released. For years now, I've lived with it, as have many others.

However, even though I'd gone through my Nvidia settings before, I had never tried turning OFF most features. Like most, I tried everything from changing the game settings via Windows and the Xbox Game Bar to tweaking the graphics in the Graphics Config file - but none of it made much of a dent. Regardless, I now have the Windows and Xbox gaming features switched back on.

I'm now running Ultra settings with a steady 90 FPS in the bar area of my Fleet Carrier. Even in a ground base or space station I hold a steady 60+ FPS (50-60 on ground bases). Before I made the changes in the Nvidia settings, I was getting 60-70 in the bar area of my Fleet Carrier (and not even on Ultra settings - most were set to High or Low), and even less in a regular space station or ground base.
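Side note: if you want harder numbers than eyeballing the FPS counter, PresentMon can capture frame times to a CSV, and a rough Python sketch like the one below will turn a capture into average FPS and 1% lows. It assumes an MsBetweenPresents column - the column name can differ between PresentMon versions, so adjust it if yours looks different.

import csv
import statistics
import sys

FRAMETIME_COL = "MsBetweenPresents"  # frame time in milliseconds; adjust for your PresentMon version

def summarise(csv_path):
    """Print average FPS and 1% low FPS for one PresentMon capture."""
    with open(csv_path, newline="") as f:
        frametimes = [float(row[FRAMETIME_COL])
                      for row in csv.DictReader(f)
                      if row.get(FRAMETIME_COL)]
    if not frametimes:
        print(f"{csv_path}: no frame data found")
        return
    avg_fps = 1000.0 / statistics.fmean(frametimes)
    # 1% low = FPS at the 99th percentile of frame times (the slowest 1% of frames)
    slowest = sorted(frametimes)[int(len(frametimes) * 0.99)]
    print(f"{csv_path}: avg {avg_fps:.1f} FPS, 1% low {1000.0 / slowest:.1f} FPS")

if __name__ == "__main__":
    # e.g. python fps_summary.py before.csv after.csv
    for path in sys.argv[1:]:
        summarise(path)

Take one capture before the driver changes and one after, standing in the same spot (the Fleet Carrier bar is a decent worst case), and compare the two.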

Here's my current rig:

GTX 1080 8 GB
i7-7700K (4.4 GHz)
24 GB RAM @ 3200 MHz
Samsung SSD

Here are my NVIDIA settings:

Image Scaling: Off
Ambient Occlusion: Off
Anisotropic: Off
Antialiasing - FXAA: Off
Antialiasing - Gamma Correction: Off
Background Application Max Frame Rate: 20 or Off
CUDA - GPUs: All
DSR - Factors: Off
DSR - Smoothness: Off
Low Latency Mode: Off
Max Frame Rate: Off
OpenGL GDI Compatibility: Prefer Compatible
OpenGL Rendering GPU: select your graphics card
Power Management Mode: Prefer Maximum Performance
Preferred Refresh Rate: Application-Controlled
Shader Cache Size: Unlimited
Texture Filtering - Anisotropic: OFF (greyed out)
Texture Filtering - Negative LOD Bias: Allow
Texture Filtering - Quality: High Quality
Texture Filtering - Trilinear: ON (greyed out)
Threaded Optimisation: On
Triple Buffering: Off
Vertical Sync: Use the 3D Application Setting
VR Pre-rendered frames: 1
Vulkan/OpenGL present method: Prefer Native
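One more thing worth doing before and after testing (just a small helper sketch, nothing official): note exactly which GPU and driver version you tested on, since driver updates alone can move the numbers. The standard nvidia-smi query flags make that easy:

import subprocess

def gpu_info():
    """Return the GPU name and driver version reported by nvidia-smi."""
    result = subprocess.run(
        ["nvidia-smi", "--query-gpu=name,driver_version", "--format=csv,noheader"],
        capture_output=True, text=True, check=True,
    )
    return result.stdout.strip()  # e.g. "NVIDIA GeForce GTX 1080, <driver version>"

if __name__ == "__main__":
    print("Tested on:", gpu_info())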

For whatever reason, I think the Nvidia drivers are badly optimised for Elite - so setting everything as above effectively tells the driver not to bother TRYING to optimise Elite (my impression is that it otherwise creates twice the work for itself, trying to optimise a game it doesn't run well in the first place). I usually run just the default Nvidia settings, but leaving the settings as above hasn't affected my other games (early access games I don't count).

But yes, I'm happy with the tinkering - no doubt a lot of optimisations by Frontier have helped too - but I think half of the problem was GPU driver related.

The in-game settings I've used are the default Ultra preset, without touching anything.

If you have a better card than a GTX 1080, a better CPU and faster memory, try these settings and let us know your results. What works for my rig may not work for yours!

EDIT: Bonus - the Galaxy Map no longer lags horrendously for me, and I've increased the supersampling from 1.0x to 1.25x! And I still hit 60-70 FPS without any dips!
 

These settings won't do anything because they apply to features the game does not have (MSAA) or APIs the game does not use (OGL/Vulkan):
Antialiasing - Gamma Correction: Off
CUDA - GPUs: All
OpenGL GDI Compatibility: Prefer Compatible
OpenGL Rendering GPU: select your graphics card
Triple Buffering: Off
Vulkan/OpenGL present method: Prefer Native

Most of the gains you're seeing from the driver settings you've altered are probably down to...
Anisotropic: Off
...because the ultra preset sets 16x for most textures (presumably shadows and nebulae samples in the galaxy map) and 8x for the terrain sampler.

AF is usually not that demanding, but there are few other driver options that would alter performance vs. the defaults at all.
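If anyone wants to check what their own install actually requests, a quick way is to search GraphicsConfiguration.xml for anisotropy entries - just a sketch, and the Steam path below is an assumption, so point it at your own install:

import re
from pathlib import Path

# Assumed install location - adjust for your own setup (Horizons uses a different product folder).
CONFIG = Path(r"C:\Program Files (x86)\Steam\steamapps\common\Elite Dangerous"
              r"\Products\elite-dangerous-odyssey-64\GraphicsConfiguration.xml")

# Print every line mentioning anisotropy, with its line number, so you can see
# what each quality preset asks for per sampler.
for num, line in enumerate(CONFIG.read_text(encoding="utf-8").splitlines(), start=1):
    if re.search(r"aniso", line, re.IGNORECASE):
        print(f"{num:5}: {line.strip()}")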

The only other real possibility I see is:
Threaded Optimisation: On
Threaded Optimization should be Auto = On, but it's possible they changed that for CPUs below a certain core count. I do usually force it on anyway, just in case.
 