How's performance after patch 11?

I'm seeing a repeatable and fairly massive performance regression in the suit tutorial after powering up the settlement.

In U10, powering up the tutorial settlement did almost nothing to performance. In U11, it cuts my framerate almost in half in some places.

I'm not sure if this is any data for you, but on lower-spec PCs this has always been a problem, e.g. 40-60 fps in the power building while waiting for it to turn on, then 30 fps the moment the holo barriers come on. There's a bunch of smoke and FX that happens there as well.

I've done one test on U11 and it seems just slightly worse than before, but to get space looking better I've upped what I can in terrain to Ultra, so I wrote it off as that.
 
My main gripe with performance changes in this update is that they took the implementation of the checkerboard rendering quite literally.


checkered.jpg
 
Yeah, it seems to happen right after you get the message to recharge (not when you actually recharge).
I initially thought it was AI related, but apparently not.

I just did some more testing and it's the settlement power, probably all the lights, electrical effects, and resulting shadows caused by powering it up.

Minimum fps inside the burning building was 76 with the power off and 52 with it on, entirely repeatable, with a bigger difference in the averages. It's also readily apparent that there is very little culling going on, anywhere.
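Those fps numbers are easier to reason about as frame times. A quick back-of-the-envelope sketch (the 76/52 figures are the ones measured above; everything else is just arithmetic):

```python
# Convert the measured minimum fps into per-frame render times
# to see how much work powering the settlement adds to each frame.
def frame_time_ms(fps: float) -> float:
    """Milliseconds spent rendering a single frame at the given fps."""
    return 1000.0 / fps

power_off = frame_time_ms(76)   # ~13.2 ms per frame
power_on = frame_time_ms(52)    # ~19.2 ms per frame
added_cost = power_on - power_off

print(f"power off: {power_off:.1f} ms/frame")
print(f"power on:  {power_on:.1f} ms/frame")
print(f"settlement power adds ~{added_cost:.1f} ms to every frame")
```

Framed that way, the lights, effects, and shadows from powering up cost roughly 6 ms of extra work per frame, which is a large chunk of a 16.7 ms (60 fps) budget.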

I'm looking at some of my past tests to confirm there wasn't as significant a hit.

In my case the only thing U11 enabled was checkerboard rendering, even though my terrain settings were set to Ultra or Ultra+; I promptly disabled it again. FSR and the rest of the settings were left untouched.

In U10, setting one of the general presets would not reset the FSR setting to its default (which is supposed to be enabled). U11 seems to have fixed this, so if one wants to play without FSR after using a preset, they need to manually disable it again.

Setting the global preset to ultra should also disable checkerboard rendering, but individually adjusting any other setting will not do so automatically.
 
Well, I'm mostly at 31 fps now where I was at 20-25 before, so there are slight improvements, but nothing significant. I'd bet my dinner that the game's code is the problem and that optimising won't help all that much.
 

It's possible that I was CPU limited in the past to the extent that the differential between powered down and powered up was smaller, but I also don't recall seeing below 60 fps in the tutorial with FSR enabled, which is what's happening here.

Looking for some of my older tests and testing my usual custom configs to make sure I'm not imagining things.
 
Just noticed that the render-to-texture(?) countdown in carriers absolutely destroys fps; for me it's ~60 down to ~24.

Frontier seem very confident in the patch notes, so I'm guessing they have it under control.
 
During my first jump yesterday, I had 60 FPS before the jump sequence initiated, but as soon as the (quite ugly) effects started appearing, my FPS tanked to 30 and remained there even after the jump was finished. I had to relog to get it back to 60 again.
 
Why is it that the people who report "significant improvements" never have any numbers to provide?

Anyhow, Yamiks did a test of Update 10 versus Update 11 over on the ED subreddit, using the Odyssey tutorial mission as the control environment. He reported zero change in performance, while also noting a visual regression in terrain texture quality. Take that for what you will, I guess.
Yamiks hardly has a fit PC: a 2nd-gen Threadripper (literally worse for gaming than a Ryzen 5 2600X) with a 2080 Ti and quad-channel memory. His PC is a literal bottleneck of latency, and his benchmark numbers are lower than many. I have a Ryzen 9 3900X and an RX 6800 installed alongside a Samsung 980 Pro (all on an X570's PCIe 4.0) and 32 GB of G.Skill Trident Z RGB overclocked to 3777 MHz. I noticed about a 5 FPS improvement in all locations (I get almost 90 on foot and about 150-160 in space), running at 1440p on the Ultra preset, no FSR.
 

Yeah, going into the carrier services interface seems to do it too.

Probably better not to get too involved :)
 
I waited an hour for a jump, was a bit whelmed, installed the extra services, was a bit whelmed, uninstalled the extra services. The one positive thing I've found so far in U11 is that they seem to have brought back the fixes from U8 (with regard to the freezes and hitches when a settlement loads in), which they somehow managed to undo in U9.
 

Turns out it's mostly my custom settings, which are at least 10% faster than the default ultra (while looking better).

This itself is rather strange, because only a small handful of my rather extensive list of adjustments should be improving performance and in past testing I always saw a small hit.
 
The Cobra engine is 6-7 year old tech and it's really showing its age. Bolting on MBR without reworking and optimising the core rendering has obviously crippled performance, and I don't believe FDev have any of the programmers who wrote the original code still working on it, so it's not surprising. Any programmer who's new to a large code base struggles to find their way around the mindset of the original authors, and it takes a long time to adjust and really get up to speed, or else they grow frustrated with it and move on to other things.

FDev should have had its graphics engine programmers start again from the ground up and build a decent Vulkan renderer.

My old PC died, so I was forced to spend just over £1000 on a new one. I went from:
GTX1060 with AMD FX 8350 4.2Ghz(8 Cores, 8 Threads) & 24GB DDR3
to
RTX3060TI with Intel® Core™ i5-10400F(6 Cores, 12 Threads) & 16GB DDR4

In U10, old PC: 25-30 FPS at ground bases just walking around @1080p (Horizons on the ground in the SRV was getting 80-100 FPS at Guardian Ruins vs 35-40 in Odyssey).
In U10, new PC: 50-80 FPS at ground bases just walking around @1080p. This at least means the update I paid for over 8 months ago is now playable, no thanks to FDev.

I'll have to fire up Odyssey tonight and see if U11 makes any difference, but honestly I doubt it, and with the massive difference in FPS depending on exactly where you're looking, I'm not sure I would notice anything different (unless it looks a lot worse)!

I get about the same FPS in Cyberpunk; the difference is that game looks absolutely amazing with all the ray tracing turned on, while Odyssey looks meh.
In CP2077, looking at a single boring building in the desert, I'm getting 80+ FPS; in Odyssey, looking at an even less detailed boring building, I'm only getting 60 FPS. Again, CP2077 is with everything on high and RTX on!
 
Do you think your custom settings could be an ultimate solution for a wide range of different hardware combinations and scenery? Or could it be that they mainly improve your particular system, found primarily by trial and error? Honest question, not meant to be provocative.

The most likely culprit for the performance discrepancy that I've observed are my volumetric settings.

This is default "Ultra" volumetrics:
XML:
    <Ultra>
        <LocalisationName>$QUALITY_ULTRA;</LocalisationName>
        <StepsPerPixel>24</StepsPerPixel>
        <DownscalingFactor>1</DownscalingFactor>
        <BlurSamples>4</BlurSamples>
        <TwoPassBlur>true</TwoPassBlur>
        <InScatterSteps>2</InScatterSteps>
        <StepMultiplier>6.0</StepMultiplier>
        <RingQuality>3</RingQuality>
        <FogMotesEnabled>true</FogMotesEnabled>
    </Ultra>

My override replaces the following:
XML:
    <Volumetrics>
        <Ultra>
            <DownscalingFactor>1.5</DownscalingFactor>
            <BlurSamples>3</BlurSamples>
        </Ultra>
    </Volumetrics>

To me, this difference is completely imperceptible, image quality wise, from the default "ultra". It preserves the critical fog quality and is still significantly higher resolution than "high". However, it's significantly lower resolution than "ultra" and cuts an evidently redundant blur sample.
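To put a rough number on why the DownscalingFactor change helps, here is a sketch using only the two values from the configs above. The assumption (mine, not stated in the config) is that the factor divides each screen axis, so shaded pixel count scales with the inverse square:

```python
# If the volumetric buffer is rendered at (screen resolution / DownscalingFactor)
# per axis, the number of shaded pixels scales with 1 / factor^2.
def relative_pixels(downscale: float) -> float:
    """Fraction of full-resolution pixels shaded at a given downscaling factor."""
    return 1.0 / (downscale ** 2)

default_ultra = relative_pixels(1.0)   # DownscalingFactor 1 -> 100% of pixels
override = relative_pixels(1.5)        # DownscalingFactor 1.5 -> ~44% of pixels

print(f"default ultra shades {default_ultra:.0%} of screen pixels")
print(f"override shades {override:.1%} of screen pixels")
```

Under that assumption the override shades well under half the volumetric pixels of default ultra, on top of dropping one of the four blur samples, which would explain a meaningful saving in shader-bound scenes.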

It's no ultimate solution to anything, but the improvement is meaningful for those who would otherwise be using ultra and are limited by shader performance in settlements.

I might be able to tune the lower quality presets similarly, but it may not be possible to preserve as much quality while still seeing appreciable improvements, and it can't help if the bottleneck is elsewhere.
 
I struggle with the system map and station menus taking an age to load. Does anyone have suggestions for in-game graphics settings I could change to improve this, please?
 