How's performance after patch 11?

Baseline with my current (5800X) setup that I'll use for future comparisons:
Source: https://www.youtube.com/watch?v=C553i_cQE1o


Predominantly, but not quite entirely, CPU limited at 1080p Ultra (no custom graphics settings this time, other than turning off FSR). Should still reveal any changes from a CPU swap without radically deviating from typical performance patterns.

I considered using a CZ (Combat Zone) to get a more NPC/AI-heavy area, but they aren't as repeatable as the suit tutorial, and the same CZs may not be active by the time I get the 5800X3D moved to my main system. My sample seems to be a pretty good one, but I want to test it thoroughly before I void the warranty. There is also new firmware that could be released any day now that I want to test, and AMD is rumored to be loosening the overclocking restrictions on the part. I want to give it the same best-effort tuning that I've been doing with the 5800X. Right now, I can use Curve Optimizer via a number of unofficial routes (and this sample seems stable at -30 on all cores), but I'm hoping for something more official soon.

Anyway, EDO's performance is extremely erratic in general. The most GPU intensive area I could find (the corridor with the fire and smoke after the settlement is powered up) gets down to 89 fps at these settings on a 2.6GHz 6800XT (similar to a stock 6900XT or RTX 3090), which is quite low for 1080p. Most other areas are still CPU limited at fairly modest frame rates, even with no active AI...it's all apparently rendering overhead. Even with the modest polling rate I'm using in the CapFrameX overlay, you can see where the main game thread (on physical core 6) is frequently maxing out.
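
For anyone without an overlay who wants to check the same thing, here's a minimal sketch of the idea in Python (assumes psutil is installed; the 95%/60% thresholds are my own arbitrary picks, not anything CapFrameX uses):

```python
import psutil

SAMPLES = 60      # one sample per second for a minute
PEGGED = 95.0     # percent load we'll call "maxed out" (arbitrary threshold)

pegged = 0
for _ in range(SAMPLES):
    per_core = psutil.cpu_percent(interval=1.0, percpu=True)
    # A single saturated core while the average stays moderate is the
    # classic signature of a main-render-thread limit.
    if max(per_core) >= PEGGED and sum(per_core) / len(per_core) < 60.0:
        pegged += 1

print(f"busiest core pegged in {pegged}/{SAMPLES} samples")
```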
 
@Morbad someday when you're bored, maybe you can try a vsync-locked 60 fps and see if that cures the stuttering players complain about. I remember Digital Foundry doing a test like this on one of the Batman games, and they found performance (not FPS, but rather smoothness and lack of stutter) greatly improved with vsync on, which they attributed to the processor having more time to "process" between frames. The game actually looked a lot better at a locked 60 FPS than it did at an unlocked 100+ FPS.

Personally, I always prefer a locked 60 or 72 FPS that's liquid smooth rather than maxing out frames that aren't even being drawn because they outpace my monitor's refresh rate. Of course, that also means anything below 60 FPS is bad, bad, bad! (None of my displays offer variable refresh rate.)
 

It's true that a frame rate cap can allow more consistent present and display intervals, which can make perceived smoothness better (though there are limits to this, especially if the interval between the rendered images is significantly less consistent than the intervals at which they are actually displayed), but that's generally not the issue with Elite: Dangerous, in my experience. The actual frame pacing is quite good, for the most part; jitter in frame times is low. I've tested this all sorts of ways, and the graph you can see at the lower right in the above video is a good example.
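
If anyone wants to quantify "jitter is low" for themselves, here's a minimal sketch, assuming a PresentMon/CapFrameX-style CSV of frame times (the file name and column header are assumptions; check your own export):

```python
import csv
import statistics

# "capture.csv" and the "MsBetweenPresents" column are assumptions based on
# PresentMon-style logs; verify the header names in your own export.
with open("capture.csv", newline="") as f:
    frame_times = [float(row["MsBetweenPresents"]) for row in csv.DictReader(f)]

mean = statistics.mean(frame_times)
jitter = statistics.stdev(frame_times)
print(f"mean frame time: {mean:.2f} ms (~{1000 / mean:.0f} fps)")
print(f"jitter (stdev):  {jitter:.2f} ms ({100 * jitter / mean:.0f}% of mean)")
```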

The stutters that are perceptible as actual stutters, and not simply a reduction in smoothness, are far too long for any sane frame rate cap to mask. These can be seen during the initial loading of a settlement, when accessing menus, or in CZs when dropships spawn in. Most of my performance complaints come from scenarios where the average frame interval is just too high (frame rate too low), or such CZ stutters.

In general, I don't like vsync, and even VRR has more downsides than upsides on most displays, at least once refresh rate is sufficient. At sufficiently high refresh rates (somewhere north of 120Hz, for me), tearing is no longer perceptible (or at least not obtrusive), and LCD overdrive modes usually work best within a narrow range of refresh rates. On my Samsung G7, the optimal overdrive mode for peak refresh (240Hz) produces overshoot ghosting near the bottom of the VRR range, even with it automatically doubling up on refreshes to mitigate this...and most displays are quite a bit worse. There are a handful of LCDs out there with true dynamic overdrive (fewer where it's tuned correctly), and sufficiently fast displays (like most OLED or microLED panels, or CRTs) that don't have any ghosting issues, but they all have other trade-offs, and I'd be hard pressed to find any display I like more than the one I have for my use case (and I've looked...if it existed for less than the price of a kidney, I'd have found a way to own one already).

My main issues with EDO performance are those situations where I can still fall below 60 fps. CZs at the largest settlements are, on this system, heavily CPU limited, often right around 60 fps, and are riddled with stutters (mostly from the aforementioned dropships, made worse by U11) where the frame interval can jump to 30ms+ for a few frames. I have an example of this back in post 238.

All I can do, at this point, is alleviate as much of any memory subsystem bottleneck as practical to get my worst-case average frame rates up, and hope the length of the problematic stutters is also reduced. I ran into diminishing returns with CPU clock speed increases a while back, but noticed a large jump in game performance from improving system memory performance...and now that that is tapped out on this system, the last remaining potential lever is the extra cache on the 5800X3D.
 
I may not be as technically literate as you, but the big takeaway from this is that a 6800XT and 5800X are apparently still not enough to stay over 60 fps at even just 1080p in EDO, and that itself is pretty damning imho. With hardware like that you SHOULD be blistering along at 140+ fps at such a resolution, and yet, as you said: plenty of dips below 60 with a max of 87 fps in these situations. I still have no idea how Braben and FDev claimed that EDO "ran fine" in-studio. There's simply no way.

CZs are one of the biggest gameplay additions they advertised for EDO, and they're the ONE place everyone tells you NOT to go if you want to keep playable framerates.

But nah, apparently the only problem with EDO is the conspiracy review bombs.
 
I still have no idea how Braben and FDev claimed that EDO "ran fine" in-studio. There's simply no way.

They probably had a lighter build in a more optimistic scenario to demo for management, when they should have been presenting worst-case performance. If the higher-ups had realized just how badly optimized things were 15-18 months ago, maybe there would have been enough pre-launch pressure/motivation to straighten things out. But that ship has sailed, and now they're Flex Taping the Titanic back together while the band plays on.

If I've understood what I learned here correctly, it makes no sense to aim for frame rates higher than 60 fps if your monitor is limited to 60 Hz - even though Hz and fps are not quite the same thing. So vsync should be the way to go in this situation, right?

There are some advantages to having an unconstrained frame rate, but unless either frame rate or refresh rate are quite high, tearing can be a major issue.

Additionally, the single biggest disadvantage of fixed refresh rate vsync is, ironically enough, the potential for perceptible stutters. With vsync enabled, any time the next frame cannot be ready before the next refresh, you just see the same frame again. This isn't a big deal if the refresh interval is short enough, but 60 fps means 16.7ms frame intervals...miss one refresh and you're already down to an effective 30 fps (33.3ms) for that frame. Triple buffering or flip queues can help ensure a frame is ready more frequently, but nothing can get around the need to sync the frame with the refresh when vsync is on.
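
To put numbers on that, here's a toy model (my own simplification; it ignores flip queues and triple buffering) of how vsync rounds every frame up to the next refresh boundary:

```python
import math

REFRESH_HZ = 60
refresh_ms = 1000 / REFRESH_HZ  # 16.7 ms per refresh at 60Hz

for render_ms in (15.0, 16.0, 17.0, 20.0, 33.0):
    # With vsync, a frame is held until the next refresh boundary, so the
    # displayed interval is the render time rounded up to a whole refresh.
    shown_ms = math.ceil(render_ms / refresh_ms) * refresh_ms
    print(f"rendered in {render_ms:4.1f} ms -> shown for {shown_ms:.1f} ms "
          f"({1000 / shown_ms:.0f} fps effective)")
```

Note the cliff: a 17ms frame is barely slower than a 16ms one, but under 60Hz vsync it displays for 33.3ms, exactly the "miss one refresh and you're at 30 fps" case.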

With a 60Hz display, if you can always, or nearly always, keep frame rate above 60, but can't maintain frame rates so high that tearing becomes imperceptible (probably in the ballpark of 120-180 fps), then you'll probably have a better experience with vsync enabled. However, if you're sensitive to stutter or abrupt changes in smoothness and you cannot ensure that your minimum fps is above 60, then the stutter or slow down from vsync may be worse than tearing.
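
Spelled out as a decision rule (a sketch of my own rule of thumb; the ~150 fps "tear-blind" threshold is just the middle of the 120-180 ballpark above, and the last branch is really a per-person judgment call):

```python
def prefer_vsync(min_fps: float, refresh_hz: float,
                 tear_blind_fps: float = 150.0) -> bool:
    """Rough heuristic; thresholds are guesses, not universal constants."""
    if min_fps >= tear_blind_fps:
        return False  # frame rate high enough that tearing is imperceptible anyway
    if min_fps >= refresh_hz:
        return True   # every refresh deadline can be met: vsync is clean
    return False      # missed refreshes: vsync stutter may be worse than tearing

print(prefer_vsync(min_fps=75, refresh_hz=60))  # True
print(prefer_vsync(min_fps=50, refresh_hz=60))  # False (a judgment call, per above)
```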

This is why VRR is so nice to have, especially on displays that don't have extreme refresh rates.
 
So vsync should be the way to go in this situation, right?
For me, yes, assuming I can guarantee 60 FPS minimum. I'm very sensitive to tearing, and I also play on a gaming laptop, so if a game can run at 100 fps and I limit my framerate to my monitor's 60 Hz refresh rate, then my computer runs cooler and quieter. It just makes sense for me 99% of the time.

BTW, my laptop screen also supports 144Hz, but I have very few games that run that fast on this hardware. I can run Horizons at a locked 72 fps, which vsyncs nicely on a 144Hz display, but I just don't really notice the 12 extra fps, so I keep my screen at 60 Hz. For me personally, 60 fps is my sweet spot. Anything less is bad (because fps < Hz = stutter or tearing), and anything more isn't noticeable to me unless I'm playing a super twitchy game like Overwatch.
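
The "vsyncs nicely" part is just divisibility; a throwaway snippet of my own to list the judder-free locked rates for a given refresh rate:

```python
REFRESH_HZ = 144
# Locked rates that divide the refresh evenly hold every frame for the same
# number of refreshes, which is what makes a 72 fps lock on 144Hz judder-free.
friendly = [REFRESH_HZ // n for n in range(1, 7) if REFRESH_HZ % n == 0]
print(friendly)  # [144, 72, 48, 36, 24]
```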
 
Well, I killed my test board, so my 5800X3D is now in my main system. Doing a few quick tests and some setup, then I'll bench Odyssey.
 
I run an admittedly old rig. I see hitching in the FSS and Sysmap, but I can't imagine anything those sections of the game are doing that is so amazing that it causes that. If they could figure that out, the rest of the game would probably run buttery smooth.
 
I tried the latest Odyssey over the weekend on my little 5500M...

Not a chance. It's such a joke.

I did also try a different approach... instead of insisting the game not look like crap, I started to have a go with AMD FSR, even on the Performance and Balanced settings for kicks. There was potential again, but when that megashader is in the scene it's not playable (doing a 180 with the camera yields a rock-solid 50-60 fps without 100% GPU usage).

While this approach probably has some leads for more investigation, of course the moment you head into space the skybox is beyond unacceptable. It's extreme rubbish. I don't think I could be bothered changing the settings depending on where I'm at anyway, so...

Hey, I'm beyond feeling at this point. Odyssey is not for me, completely passed me by, and I'm fine with that.
 
Well, I figured there was going to be a meaningful difference, but holy crap, this game is possibly the most memory subsystem limited title I've ever heard of.

Uploading the comparison video now, but this 5800X3D at 4.45GHz (it should be hitting 4.55GHz, but I broke boost somehow...trying to figure that out now) is about 30% faster than my old 5800X at 4.7-4.8GHz. The difference in purely CPU/memory limited areas might be even larger as the uplift was enough to make my system predominantly GPU limited at 1080p.
 
Well, I fixed the boost behavior and tweaked a few other things, which should be good for another percent or two of performance in favor of the 5800X3D. Will redo the comparison soon, but I'll probably put that in a separate thread as it's more of a hardware topic than a U11 performance topic at this point.

Anyway, the impression that this experience has further reinforced is that many of the dips/gaps in CPU utilization that have been causing so much confusion, as they are often simultaneous with low GPU utilization, are due to memory accesses. The GPU is waiting on the CPU, but the CPU frequently has to sit there with its thumb in its butt because the game wants stuff that isn't in the CPU cache. The same general lack of optimization that plagues other areas is probably the cause of this...too much is being done that shouldn't even need doing.
 
I blame the Bitcoin miners that companies are slipping into their software these days. It's really apparent in Microsoft Flight Simulator's update manager, which pegs the GPU at 100% while downloading updates at a tenth the rate of Steam.
 
After refunding during the open alpha, I bought the expansion yesterday, and I am pleasantly surprised with the performance.
I'm still rocking a GTX 780 on a 4th-gen Intel, on an ultrawide at 2880x1080, and it is alright. Sure, I get dips to the mid-20s fps on busy installations, but the average is usually mid-40s.
And I'm having a blast restoring power, clearing any "opposition" using the SRV :D and then emptying all containers for good measure.
 
Source: https://www.youtube.com/watch?v=Ltdktvkfxr8


Some deviations from the first video, gameplay-wise, as I was deliriously tired when I recorded it, but it's still quite representative. Will run the test again after I straighten out firmware.

That was a fun watch. Wow, you lose 100-150 fps when the fun starts, and you've got 100% GPU usage there. Imagine what it's like for me; I only start with high 40s to begin with! :p

I guess it doesn't matter, but it's also good to see that you have variable GPU usage up to that point as well, just like my machine.
 
I guess it doesn't matter, but it's also good to see that you have variable GPU usage up to that point as well, just like my machine.

Until the power is reactivated, 1080p Ultra is still CPU limited on this setup, but culling still doesn't work well, and powering up a damaged settlement adds a huge GPU load, even from areas that should be occluded. After that, the faster CPU still helps, but without the sort of loads present at CZs, the frame rate floor is dictated by the GPU.
 
I blame the Bitcoin miners that companies are slipping into their software these days. It's really apparent in Microsoft Flight Simulator's update manager, which pegs the GPU at 100% while downloading updates at a tenth the rate of Steam.

I've been thinking about that for quite some time. You seem certain about it. Got any proof? I'm not into this crypto stuff; is there some kind of clue one should look for? Maybe some kind of specific network traffic?
 
I've been thinking about that for quite some time. You seem certain about it. Got any proof? I'm not into this crypto stuff; is there some kind of clue one should look for? Maybe some kind of specific network traffic?
It appears the MSFS update application doesn't have a frame rate cap; applying one through third-party applications will restore GPU utilisation to acceptable values (an easier workaround is to reduce the size of the window).
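
For what it's worth, the mechanics behind that workaround are just the standard frame-limiter pattern; a schematic sketch (render() is a placeholder, not anything from MSFS):

```python
import time

CAP_FPS = 30
frame_budget = 1.0 / CAP_FPS

def render():
    pass  # stand-in for the updater's actual draw call, not a real API

for _ in range(300):  # ~10 seconds at the capped rate
    start = time.perf_counter()
    render()
    elapsed = time.perf_counter() - start
    if elapsed < frame_budget:
        # Sleep out the rest of the frame budget instead of immediately
        # rendering again -- this idling is what an uncapped loop never does,
        # which is why it pegs the GPU at 100%.
        time.sleep(frame_budget - elapsed)
```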
 
I've been thinking about that for quite some time. You seem certain about it. Got any proof? I'm not into this crypto stuff; is there some kind of clue one should look for? Maybe some kind of specific network traffic?
Just for the record, I am not stating as fact that Odyssey is Bitcoin mining in the background. No, no, no, no! No libel suit for me, thank you. However, there was a news story some time ago about a company that was doing just that: using spare CPU cycles to mine Bitcoin for themselves while users ran their software. I can't remember the outcome of that story (I don't know if there were any legal ramifications), but I do remember it causing an uproar at the time.

The MSFS installer, on the other hand, is definitely doing something fishy. Why does it need to peg my GPU while slowly downloading the latest update from the Microsoft Store? Some people say that Microsoft is using "clever" (as in, terribly inefficient) compression, and the GPU is just decompressing the data. Hogwash, I say! Steam downloads 10 times faster and doesn't touch my GPU. Also, I throttled down my processors while running the installer (to keep my laptop from melting), and download speeds remained the same, so it just doesn't add up. But as with Odyssey, I am not stating as fact that the MSFS installer is Bitcoin mining in the background.

Back to Odyssey: I was just joking. It is much more likely the result of bad, inefficient programming. A lack of talent and time (as in, a rushed product pushed out under an insane crunch) is the simplest and most likely explanation.
 