New to "Proper" gaming PCs - old hand at Elite - help with settings?

OK so I've got a new PC and I've NEVER had a proper gaming PC before. I'm blown away with the performance... but I want to optimise the settings where I can. Build is:

ASUS TUF Gaming Z-790; Core i7-14700K; ASUS DUAL 4070 Super OC; 64GB 5600 RAM (XMP-I); MSI 321Q 1440P/165Hz Monitor

So the game is set to Ultra all the way; Full Screen, 1440p, 165Hz, etc etc etc.

1. But what do I set for the best results in terms of quality in the NVidia 3D settings? (Can I improve on the factory defaults which appear to either set things to "Off" or "Application Controlled"?)
2. Is there anything I can do to affect the awful jaggies due to crap anti-aliasing? (e.g. can I set the game to DLSS3 (Quality), 1080p-1440p?) (If so, HOW?)

Can anyone please give this old n00b some advice? (p.s. I tried a few subs on reddit but they're all really up their own arses about post topic and removed my posts).
 
All that fancy hardware, and then you skimp on the display? Well, you do you...

As for your question: since you have power to spare, I'd switch off all the anti-aliasing and upscaling and run the game at your screen's native resolution. Any jaggies that are left (and there will be a few - check the thread(s) on the orbit lines...) are down to the game engine. Upscaling is always a crutch for systems that can't run a game at the chosen resolution and desired frame rate.

There's also a worthwhile video by CMDR Exigeous, demonstrating all(?) the graphics settings in ED. Yes, the video is now 4 years old - but AFAIK the game/graphics engine hasn't changed:
 
Graphics in game I'm happy with - it's the "3D Settings" in the NVidia control panel I'm looking for. Everything is cranked to "Ultra" in game. Surely if I turn off anti-aliasing, the jaggies will be worse?! I notice them mostly on my fleet carrier when I'm manoeuvring around it.

p.s. The rig cost me the 2 grand I had left... I didn't "skimp" on anything - more than happy with 31.5" 165Hz 1440p. To run Elite in 4K I'd still need to tweak graphics settings in game.
 
All that fancy hardware, and then you skimp on the display? Well, you do you...
It doesn't strike me as a particularly cheap monitor relative to the price point of the rest of the components. It ticks a lot of the boxes (IPS, large, high-refresh); the only thing I'd really nitpick is that the display could be a bit brighter - 300 nits isn't great for HDR, but Elite doesn't support HDR anyway.
There's also a worthwhile video by CMDR Exigeous, demonstrating all(?) the graphics settings in ED. Yes, the video is now 4 years old - but AFAIK the game/graphics engine hasn't changed:
The graphics were overhauled for Odyssey. There are significant differences between the 3.8 and 4.0 clients in how they look, at least. There were major problems with the performance of the original 4.0 engine as well: it was dire, especially on low- to mid-range hardware. This is now much improved.
Graphics in game I'm happy with - it's the "3D Settings" in the NVidia control panel I'm looking for. Everything is cranked to "Ultra" in game. Surely if I turn off anti-aliasing, the jaggies will be worse?! I notice them mostly on my fleet carrier when I'm manoeuvring around it.
The display is FreeSync Premium, which means it is very likely G-Sync compatible, even if Nvidia don't explicitly list it as such (they only list displays they've tested, iirc). G-Sync/FreeSync won't make much difference with the framerate you're getting in space, but if you do any Odyssey content you may wish to investigate turning it on - performance in Odyssey settlements is much more variable, especially in ground conflict zones. It won't make the game look 'better', however; it will just eliminate screen tearing and improve perceived smoothness.

I'm a bit of a weirdo, but I generally prefer FXAA over MSAA in the game. You might try tweaking a few settings down and trying out the in-game supersampling options. Whether that looks better to you will be your call.
 
Cheers - I will try the various in-game options as well. The PC grabs "Auto HDR" whenever I go into a game, and it always recognises the G-Sync compatibility too, which is cool.
 
1. But what do I set for the best results in terms of quality in the NVidia 3D settings? (Can I improve on the factory defaults which appear to either set things to "Off" or "Application Controlled"?)

The only quality settings of relevance here are texture filtering quality (set to high); anisotropic filtering (set to 16x as this appears to affect some aspects of terrain that max out at 8x in the game settings); and DSR (see below).

I would also recommend setting low latency mode to On (not Ultra), power management mode to 'prefer maximum performance', and threaded optimization to enabled.

Most of the rest of visual quality tuning will need to be done from the game's configuration files, and there is a lot of room for improvement beyond what the in-game options allow to be set.
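
If it helps, here's a minimal sketch of the sort of override I mean - this is the widely circulated planet-texture tweak as I remember it, so treat the node names and values as assumptions and verify them against your own GraphicsConfiguration.xml, since they can change between game versions:

XML:
<?xml version="1.0" encoding="UTF-8" ?>
<GraphicsConfig>
    <Planets>
        <Ultra>
            <!-- Planet surface texture resolution; Ultra reportedly defaults to 4096 -->
            <TextureSize>8192</TextureSize>
            <!-- How much texture generation work is done per frame; raise it alongside TextureSize -->
            <WorkPerFrame>512</WorkPerFrame>
        </Ultra>
    </Planets>
</GraphicsConfig>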

2. Is there anything I can do to affect the awful jaggies due to crap anti-aliasing? (e.g. can I set the game to DLSS3 (Quality), 1080p-1440p?) (If so, HOW?)

Not really. Increasing render resolution and filtering the image are about the only things that can be done without extreme trade-offs.

If you have GPU-limited frame rate to spare, use a DLDSR resolution in game and then the in-game super sampling option to add another layer of scaling on top of that. Jaggies will still be obtrusive, but they'll be more tolerable.
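
To put rough numbers on that (my arithmetic, assuming the in-game slider is a straight linear multiplier, which is how it appears to behave): on a 2560x1440 display, DLDSR 2.25x renders at 3840x2160, and stacking 1.25x in-game supersampling on top takes the internal render to 4800x2700 before everything is scaled back down to 1440p.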

Elite doesn't support HDR anyway.

Tuning the game's tone mapping and injecting HDR is a worthwhile improvement, if one does have a solid HDR display.

I'm a bit of a weirdo, but I generally prefer FXAA over MSAA in the game. You might try tweaking a few settings down and trying out the in-game supersampling options. Whether that looks better to you will be your call.

The game doesn't have MSAA and there is no good way to make MSAA work with it. If you mean SMAA, I generally prefer this at lower resolutions as it's sharper, but FXAA does a slightly better job of softening jaggies and looks subjectively better to me at higher resolutions or with supersampling.
 
The graphics were overhauled for Odyssey. There are significant differences between the 3.8 and 4.0 clients in how they look, at least. There were major problems with the performance of the original 4.0 engine as well: it was dire, especially on low- to mid-range hardware. This is now much improved.
You're right, of course. I somehow didn't want to remember that part of history.
 
The only quality settings of relevance here are texture filtering quality (set to high); anisotropic filtering (set to 16x as this appears to affect some aspects of terrain that max out at 8x in the game settings); and DSR (see below).

.....

For completeness:

I did all this and WOW what a difference... my carrier now doesn't look like it's being viewed through a bathroom window LOLOLOL
 
The 4070 Super has 12GB of VRAM. I find the game can use up to 10GB at 3440x1440, so I'm not sure 4K will work without tweaking some settings down.

I've been testing some budget 8GiB cards (an RX 580 2048SP and an Arc A750) I purchased for a few older builds, and I can confirm they start to have VRAM capacity issues in Odyssey at higher settings.

However, my 10GiB RTX 3080 still runs the game without such problems, despite the game/drivers being more than willing to allocate 14-16GiB of VRAM at the same settings on higher-end cards.

12GiB should be enough for any settings that a 4070 Super can handle.
 
The only quality settings of relevance here are texture filtering quality (set to high); anisotropic filtering (set to 16x as this appears to affect some aspects of terrain that max out at 8x in the game settings)
Just trying this myself. Am I right in thinking that when you set Texture Filtering Quality to High, it then automatically sets Anisotropic sample optimisation to Off, Negative LOD bias to Clamp, and Trilinear optimisation to On?
 
Am I right in thinking that when you set Texture Filtering Quality to High, it then automatically sets Anisotropic sample optimisation to Off, Negative LOD bias to Clamp, and Trilinear optimisation to On?

Yes, though the trilinear optimization setting is misleading, as it's ignored (on is the same as off) with high quality texture filtering.

Edit: Correction, it's not the texture filter quality setting that sets the LOD bias to clamp, it's manually specifying an anisotropic filtering level.
 
Yes, though the trilinear optimization setting is misleading, as it's ignored (on is the same as off) with high quality texture filtering.

Edit: Correction, it's not the texture filter quality setting that sets the LOD bias to clamp, it's manually specifying an anisotropic filtering level.
About HDR

I never got it to look nice. Especially in the waiting window the glow around the rotating ship is horrible. Any tips on how to make it decent? It might be worth the extra wait times on alt-tabbing...
 
About HDR

I never got it to look nice. Especially in the waiting window the glow around the rotating ship is horrible. Any tips on how to make it decent? It might be worth the extra wait times on alt-tabbing...

I found that tuning the game's tone mapping settings in GraphicsConfiguration.xml (or the override) helped a lot. I basically pasted over some settings from an old backup I had from 2015 when the game was much darker, with a few minor adjustments. This allowed me to take advantage of the higher dynamic range of the HDR swapchain from SpecialK to get the default brightness of most things close to vanilla, but allow for a bit more saturation and much brighter highlights. Never did get the menus or loading screens quite 100%, but the rest of the game looks pretty good, subjectively speaking.

These are the relevant GraphicsConfigurationOverride.xml settings:
XML:
    <HDRNode>
        <HistogramSampleWidth>164.000000</HistogramSampleWidth>
        <ExposureType>2</ExposureType>
        <ManualExposure>-0.250000</ManualExposure>
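        <!-- ShoulderStrength through LinearWhite look like the parameters of the Hable ("Uncharted 2") filmic tone-map curve - my reading, not anything Frontier documents -->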
        <ShoulderStrength>0.491000</ShoulderStrength>
        <LinearStrength>0.707000</LinearStrength>
        <LinearAngle>0.704000</LinearAngle>
        <ToeStrength>1.185000</ToeStrength>
        <ToeNumerator>1.377000</ToeNumerator>
        <ToeDenominator>1.976000</ToeDenominator>
        <LinearWhite>5.242000</LinearWhite>
    </HDRNode>
    <HDRNode_Reference>
        <HistogramSampleWidth>164.000000</HistogramSampleWidth>
        <ExposureThreshold>1.000000</ExposureThreshold>
        <Percentiles>0.010000,0.540000,0.999000</Percentiles>
        <ManualExposure>-0.250000</ManualExposure>
        <GlareCompensation>1.125</GlareCompensation>
        <ShoulderStrength>0.491000</ShoulderStrength>
        <LinearStrength>0.707000</LinearStrength>
        <LinearAngle>0.704000</LinearAngle>
        <ToeStrength>1.185000</ToeStrength>
        <ToeNumerator>1.377000</ToeNumerator>
        <ToeDenominator>1.976000</ToeDenominator>
        <LinearWhite>5.242000</LinearWhite>
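        <!-- ACES_A to ACES_E match the constants of the Narkowicz ACES fit: x*(A*x+B) / (x*(C*x+D)+E) -->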
        <ACES_A>2.51</ACES_A>
        <ACES_B>0.03</ACES_B>
        <ACES_C>2.43</ACES_C>
        <ACES_D>0.59</ACES_D>
        <ACES_E>0.14</ACES_E>
    </HDRNode_Reference>

The HDR section of my SpecialK.ini for EliteDangerous64.exe:
INI:
[SpecialK.HDR]
Use10BitSwapChain=false
Use16BitSwapChain=true
Promote8BitRTsTo16=true
Promote10BitRTsTo16=true
Promote11BitRTsTo16=true
Promote8BitUAVsTo16=false
Promote10BitUAVsTo16=false
Promote11BitUAVsTo16=false
AllowFullLuminance=false
ContentEOTF=-2.2
AdaptiveToneMap=true
Preset=2
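; Note: the _[0] to _[3] suffixes below appear to be SpecialK's four switchable
; HDR presets, with Preset selecting the active one - my reading, not official docs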
scRGBLuminance_[0]=3.6411
scRGBPaperWhite_[0]=0.3
scRGBGamma_[0]=0.955
ToneMapper_[0]=0
Saturation_[0]=1.0
GamutExpansion_[0]=0.1
MiddleGray_[0]=1.25
PerceptualBoost0_[0]=1.0
PerceptualBoost1_[0]=0.1
PerceptualBoost2_[0]=1.273
PerceptualBoost3_[0]=0.5
scRGBLuminance_[1]=1.7056
scRGBPaperWhite_[1]=0.14
scRGBGamma_[1]=0.92
ToneMapper_[1]=0
Saturation_[1]=1.0
GamutExpansion_[1]=0.1
MiddleGray_[1]=1.25
PerceptualBoost0_[1]=30.0
PerceptualBoost1_[1]=11.5
PerceptualBoost2_[1]=1.5
PerceptualBoost3_[1]=1.0
scRGBLuminance_[2]=12.152475
scRGBPaperWhite_[2]=0.0125
scRGBGamma_[2]=1.0
ToneMapper_[2]=0
Saturation_[2]=1.0
GamutExpansion_[2]=0.1
MiddleGray_[2]=1.26875
PerceptualBoost0_[2]=-1.0
PerceptualBoost1_[2]=0.1
PerceptualBoost2_[2]=1.273
PerceptualBoost3_[2]=0.5
scRGBLuminance_[3]=1.0
scRGBPaperWhite_[3]=1.0
scRGBGamma_[3]=1.0
ToneMapper_[3]=255
Saturation_[3]=1.0
GamutExpansion_[3]=0.0
MiddleGray_[3]=1.25
PerceptualBoost0_[3]=-1.0
PerceptualBoost1_[3]=0.1
PerceptualBoost2_[3]=1.273
PerceptualBoost3_[3]=0.5
Keep8BpcRemastersUNORM=false
KeepSubnativeRemastersUNORM=false
ColorBoost_[0]=0.333
TonemapOverbright_[0]=true
ColorBoost_[1]=0.333
TonemapOverbright_[1]=true
ColorBoost_[2]=0.333
TonemapOverbright_[2]=true
ColorBoost_[3]=0.5
TonemapOverbright_[3]=true
LastUsedColorSpace=1

The first three presets are all configured, but if you use them, you'll definitely need to tune them to your display.
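
(If you're using the global injector and can't find the file, my copy lives under Documents\My Mods\SpecialK\Profiles\ in a folder named after the game - the exact folder name may differ on your install, so go by whatever SpecialK actually creates.)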
 
Hello @Morbad
Wasn't the ability to adjust the HDR node reference disabled by Frontier some years ago? Particularly the ACES values? Or is it still valid if used in the override file?

Flimley
 
Hello @Morbad
Wasn't the ability to adjust the HDR node reference disabled by Frontier some years ago? Particularly the ACES values? Or is it still valid if used in the override file?

Flimley

I know I encountered difficulty getting some of the HDRNode variables to work in the past and have been duplicating the changes between the two sections since. Since the changes still work, I've assumed HDRNode_Reference is still being parsed, but it's been quite a while since I've tested them independently.

Regardless, there are some settings that can't be changed via the override but can be changed by editing the main file; I'm pretty sure there are none the other way around.
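
For anyone following along at home, the locations as I remember them (double-check on your install): the main GraphicsConfiguration.xml sits in the game's install folder and gets overwritten by updates, while GraphicsConfigurationOverride.xml lives under %LOCALAPPDATA%\Frontier Developments\Elite Dangerous\Options\Graphics\ and survives patches - which is why the override is the better place for anything it can actually change.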
 
The only quality settings of relevance here are texture filtering quality (set to high); anisotropic filtering (set to 16x as this appears to affect some aspects of terrain that max out at 8x in the game settings); and DSR (see below).

I would also recommend setting low latency mode to On (not Ultra), power management mode to 'prefer maximum performance', and threaded optimization to enabled.

Most of the rest of visual quality tuning will need to be done from the game's configuration files, and there is a lot of room for improvement beyond what the in-game options allow to be set.



Not really. Increasing render resolution and filtering the image are about the only things that can be done without extreme trade-offs.

If you have GPU-limited frame rate to spare, use a DLDSR resolution in game and then the in-game super sampling option to add another layer of scaling on top of that. Jaggies will still be obtrusive, but they'll be more tolerable.



Tuning the game's tone mapping and injecting HDR is a worthwhile improvement, if one does have a solid HDR display.



The game doesn't have MSAA and there is no good way to make MSAA work with it. If you mean SMAA, I generally prefer this at lower resolutions as it's sharper, but FXAA does a slightly better job of softening jaggies and looks subjectively better to me at higher resolutions or with supersampling.

Did a DDU Nvidia driver rollback today and forgot to note down my Nvidia control panel settings. Luckily I found this post again - anything to add? Somebody was recommending triple buffering, but I seem to recall you saying elsewhere that it had no effect on Elite?
 