A week on Horizon 4.0 and...

I would still say you're precariously on the edge performance-wise with that CPU, tbh, but Morbad will put you right. Just remember: a wizard he is, but a god he isn't... well, as far as I am aware.
 
Hi !

CPU :

INTEL Core i3-10100 3.6GHz LGA1200 6M Cache

GPU :
https://www.gigabyte.com/fr/Graphics-Card/GV-N1060IXOC-6GD#kf

RAM :

Crucial Ballistix DDR4 3200MHz DIMM

MOTHERBOARD :

MSI B560M Pro-VDH Micro ATX


Thank you :)

First step is to open the NVIDIA control panel, go to 'Manage 3D settings', and find Elite Dangerous under the 'Program Settings' tab. In that menu, scroll down until you find the four 'Texture filtering -' options. Change them, in order, to 'On', 'Clamp', 'Performance', and 'On'. Immediately below those settings should be a 'Threaded Optimization' setting; set this to 'Off'. If you have any other global driver AA, AF, or low latency options that are set to something other than default, revert them to default for this game.

Next, there are some config files I want you to back up and replace, namely these ones:
%LocalAppData%\Frontier Developments\Elite Dangerous\Options\Graphics\Custom.4.0.fxcfg
%LocalAppData%\Frontier Developments\Elite Dangerous\Options\Graphics\GraphicsConfigurationOverride.xml
(wherever your game exe is installed)Frontier\EDLaunch\Products\elite-dangerous-odyssey-64\AppConfig.xml

They could well be in different places than mine, depending on which distribution platform you got the game from. Double check the locations of those files (search, when in doubt). AppConfig.xml in particular exists separately in both client versions; make sure you are modifying/replacing the one associated with the Odyssey/4.0 executable. Note their locations on your system, then rename them (by adding .bak or something to them) to back them up. Also, in the same directory as AppConfig.xml, delete the GPUWorkTable.xml file...no need to back that one up.
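
If you'd rather script the backup step than rename things by hand, here's a rough Python sketch of the same idea. The install path below is a placeholder you'd need to fill in with your actual location; the rest just mirrors the paths listed above.

import os
import shutil

# Paths from the post above; adjust to match your system.
options_dir = os.path.expandvars(r"%LocalAppData%\Frontier Developments\Elite Dangerous\Options\Graphics")
install_dir = r"C:\Path\To\Frontier\EDLaunch\Products\elite-dangerous-odyssey-64"  # placeholder: your install location

to_back_up = [
    os.path.join(options_dir, "Custom.4.0.fxcfg"),
    os.path.join(options_dir, "GraphicsConfigurationOverride.xml"),
    os.path.join(install_dir, "AppConfig.xml"),
]

for path in to_back_up:
    if os.path.exists(path):
        shutil.move(path, path + ".bak")   # rename the original to *.bak
        print("backed up", path)

# GPUWorkTable.xml doesn't need a backup (per the instructions above), so just delete it.
work_table = os.path.join(install_dir, "GPUWorkTable.xml")
if os.path.exists(work_table):
    os.remove(work_table)
    print("deleted", work_table)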

I've attached replacements to this post. They have a .txt extension appended to them because the forum doesn't allow .fxcfg or .xml files. Remove that .txt, leaving the rest of the file name intact, before moving them to where the old files were. They are all in plain text, so feel free to take a look at their contents in notepad or whatever.

Essentially, these will:
  • reduce the number of threads spawned, but increase their stack and queue sizes.
  • set most graphics options to medium, with a few exceptions, keeping in mind that spot shadows need to be 'low' and FX quality needs to be at least 'medium' to avoid certain costly shader bugs.
  • knock out fog in the medium FX quality, slightly reduce volumetrics costs, and make bloom less obnoxious.

Use whatever resolution you normally use (though anything above 1080p will probably struggle on your GPU) at your maximum refresh rate. I recommend disabling any frame rate cap and vsync, at least until you can analyze the performance you're getting. Run something like GPU-Z, MSI Afterburner, CapFrameX, or even Task Manager, to get an idea of per-core CPU load and, more importantly, GPU load. If you notice drops in GPU load (outside loading screens/transitions), but have logical CPU cores that aren't fully loaded, set that 'Threaded Optimization' driver option to 'On' and see if that improves anything.
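
If you'd rather log those numbers than eyeball an overlay, something like this rough Python sketch would do it. It assumes the third-party psutil and nvidia-ml-py packages, which aren't part of the game or driver; the monitoring tools above give you the same information.

import time
import psutil                                          # pip install psutil
from pynvml import (nvmlInit, nvmlDeviceGetHandleByIndex,
                    nvmlDeviceGetUtilizationRates)     # pip install nvidia-ml-py

nvmlInit()
gpu = nvmlDeviceGetHandleByIndex(0)                    # first NVIDIA GPU

for _ in range(60):                                    # sample for about a minute
    per_core = psutil.cpu_percent(interval=1.0, percpu=True)   # % load per logical core
    gpu_load = nvmlDeviceGetUtilizationRates(gpu).gpu           # % GPU utilization
    print("CPU per core:", per_core)
    print("GPU load:    ", gpu_load, "%")
    # GPU load well below ~95% while some CPU cores sit near 100% points to a CPU bottleneck.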

Do note that if you've changed any network settings in game, you'll need to change them again.

If you still don't see any meaningful improvement, there are a few other things you can try, but this should be enough to start with.
 

Attachments

  • Custom.4.0.fxcfg.txt (1.5 KB)
  • GraphicsConfigurationOverride.xml.txt (690 bytes)
  • AppConfig.xml.txt (3.2 KB)
Hi !
Thank you for looking into my case.
I quickly tested the recommended settings last night. I didn't have time to go deeper (I had a big mission to complete), but at first glance there is progress, even if I still have the impression of a little flickering (which stings the eyes a bit) compared to 3.8.
So tonight I'll finish testing and let you know.
Thanks again :)
 

@Morbad , you mentioned something about AA. Have you got some way of getting rid of the jaggies? Worst thing about the graphics for me.
 

Not any good way, and the less-than-good ways are generally too expensive in terms of VRAM and performance for slower GPUs.

VSR (AMD) or (DL)DSR (NVIDIA) can help. If you enable these, then set the in-game resolution higher than native, then use the in-game supersampling setting to subsample (a multiplier below 1.0x) the game back toward native resolution, you get a pair of extra scaling passes that will soften jaggies with only a small performance hit. DSR has the edge here because of its smoothness slider, and DLDSR (limited to RTX cards) is best.
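
To put some illustrative numbers on that (the DSR factor and supersampling multiplier below are just example picks, not recommendations), here's the arithmetic for a 1080p screen:

# DSR + in-game subsampling at a 1920x1080 native screen.
native_w, native_h = 1920, 1080

dsr_factor = 2.25          # DSR "2.25x" multiplies total pixel count; each axis scales by sqrt(2.25) = 1.5
dsr_w = int(native_w * dsr_factor ** 0.5)   # 2880
dsr_h = int(native_h * dsr_factor ** 0.5)   # 1620

in_game_ss = 0.67          # example in-game supersampling multiplier below 1.0x
render_w = int(dsr_w * in_game_ss)          # ~1929
render_h = int(dsr_h * in_game_ss)          # ~1085

print(f"DSR output resolution:    {dsr_w} x {dsr_h}")
print(f"Actual render resolution: {render_w} x {render_h}  (close to native, but filtered twice)")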

That said, the only truly effective way to significantly reduce jaggies in EDO, without blurring the crap out of everything, is to pour on the supersampling, which will quickly bring any card to its knees.
 
On-foot has been released exactly as that, an optional DLC to the base game. So you sound like: "Since I'm not interested in something, nobody else should be either".
No…your words not mine!
Odyssey wasn't released as a walkabout-only DLC, but you know that….don't be a fibber!
 
Just my penny's worth! The i3-10100 is only slightly better than my old i7-4790K. With Odyssey it was often maxing out, which is one of the reasons I did a slight upgrade to a 3600 not long after Odyssey came out (with the expectation of grabbing a 5600X3D in a year or two).
This is paired with an RTX 3070, which was bought for Odyssey just before its release.

The main issue with 4.0 and Odyssey is still the optimization of the base code. It still needs a lot of TLC in places, and frame drops are far too common on many systems.
 
Hello !

I'm back with some news.
I noticed a rather interesting thing.
First, I set the graphics to "high" and put the antialiasing back to 16x.
When I'm in space, or at least not near a planet or in a station, the game runs at 60 fps. When I approach one, it drops to 45 or even 30.
In that case, I change the "CONVERSION" (upscaling) setting to AMD FSR 1.0, and the game goes back to 60 fps. But when I get back in the ship, the thin parts are very jagged, so I switch the "CONVERSION" mode back to normal (x1).
It's not really practical, but if it helps someone...
 
Hi !

I'm coming back to this thread because I've understood a few additional things since then.
The 1060 6GB works great everywhere; it all depends on the settings. As said above, in the settlements I switch to AMD FSR 1.0. I could stay with that option all the time, except it creates eye-stinging aliasing (on the HUD).
I have read several times that AMD FSR 1.0 is mainly intended for higher resolutions like 1440p. The aliasing problem would then come from my screen, which only displays 1080p.
I've seen players run Elite with 3060s, and in the settlements it's 30-45 fps for them too! I've even seen some with bigger cards, ditto.
I wanted to buy myself a more recent card, because it's true that the 1060 is quite old and can't handle certain things. But from what I've seen, I tell myself it would be more reasonable (for my wallet) to change the screen instead.
What do you think? Do any of you play with a 1060 and a 1440p screen? What kind of performance do you get?

Thanks :)
 
Let's just be honest here: If you continue to play the 3.8 version (I prefer it), it will eventually be retired. Your advancements in this version will be lost when that happens. It is going to be retired. Not a question of 'if', but 'when'.
 

Why do you say that? I play Odyssey, not 3.8. ;)
 
Changing the screen won't make your graphics card any faster.
 
Let's just be honest here: If you continue to play the 3.8 version (I prefer it), it will eventually be retired. Your advancements in this version will be lost when that happens. It is going to be retired. Not a question of 'if', but 'when'.

Same goes for the live version; it will just be around a bit longer.
 
Changing the screen won't make your graphics card any faster.
No, of course not! But as I said, the 1060 (6GB) is up to the task. When I activate AMD FSR 1.0, I get 60 fps everywhere! The only problem is the jagged interface. This is surely due to my screen, which is 1080p, while AMD FSR 1.0 is designed for 1440p.
 
FSR 1.0 has four different modes: Ultra Quality, Quality, Balanced, and Performance. These correspond to per-axis upscaling factors of 1.3, 1.5, 1.7, and 2.0 respectively. So let's assume you're now using Quality mode. That means the game will render internally at 720p and then upscale the result to 1080p, the native resolution of your screen.

So what's going to happen if you change your screen to 1440p? Quality mode will now render internally at 960p, so your frame rate in that mode will be lower than before. You can compensate for this by switching to Performance mode, which will render internally at 720p. The frame rate there will be about the same as before, but so will the aliasing in the upscaled result. You will simply be scaling the same tiny 720p frame to a larger screen. It will look just as bad.
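
If you want to check those numbers yourself, the arithmetic is just the native resolution divided by the mode's scale factor; a quick sketch:

# Internal render height for each FSR 1.0 mode: native height divided by the per-axis scale factor.
fsr_modes = {"Ultra Quality": 1.3, "Quality": 1.5, "Balanced": 1.7, "Performance": 2.0}

for native_h in (1080, 1440):
    for mode, factor in fsr_modes.items():
        internal_h = round(native_h / factor)
        print(f"{native_h}p native, {mode:>13}: renders at ~{internal_h}p")

# 1080p Quality -> 720p internal; 1440p Quality -> 960p; 1440p Performance -> 720p,
# which is why the jaggies would look about the same on a 1440p screen in Performance mode.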
 
It's quite interesting what you say there. The FSR modes are a little clearer with this explanation. I deduce that when I set it to NORMAL, the scale is 1, therefore 1080p; hence the sharpness.
But then why do I have to switch modes to get good fps in the settlements? Is the graphics resolution higher in the settlements?
 
I would assume that it's because the settlements take more horsepower from the GPU to render, and if the card cannot render enough in the time allotted, then the output looks like crap.

I've been playing Ody since release at 1440p on a 27" 165 Hz gaming monitor with an RTX 2060 (6GB) w/ slight overclock. I stopped using FSR around Update 9 or so, as the AA was terrible.

Since that time I have used "Normal" @ "x1.0"; the AA is still bad, but overall the picture looks better.

Along the way I upgraded to an RTX 3060 (12 GB) w/ slight overclock, and with U14 I'm getting the best framerates all around.

Some players with much better GPUs than mine have complained that their frame rates have always been bad.

My best advice is to adjust the settings until you're happy with the look and performance, and then leave it there.
 
Depending on what kind of system you have, you might need to upgrade RAM and CPU too. ED needs quite a lot from the rest of the system, not only from the GPU.
 