Odyssey - CPU limited or GPU limited?

It seems to be a GPU limitation.

I decided to risk toasting (burning up) an AMD RX 5700 XT by making some GPU firmware (BIOS) tweaks and finally writing a BIOS of my own (which I'm sure AMD would frown on), and managed to get some good results. I will be sharing my results through the AMD bug reporting system, because some of what I found may help with both newer and older cards, at least on the AMD side. Unfortunately, I am unable to get the system to use the PCIe Resizable BAR with this card, because the 5700 XT does not have the hardware support for it.

Good results. Just changing memory timings with a memory mod (link below).

199FPS.jpg


Insane results. Just before the card overheated and the system shut down.

2404FPS.jpg


Hybrid Mod.

Source: https://www.youtube.com/watch?v=JbwE6p41OGo

Determine whether your card supports Resizable BAR. Your motherboard BIOS may support it, but it seems you need a 6000-series card to use it.

GPU-Z Resize.jpg

It may say "Enabled" here, but that only reflects support in your system's BIOS, not your card's ability to use it.

GPU-Z Resize Advanced.jpg

Use the "Advanced" tab to see whether your card can use Resizable BAR. If "GPU Hardware Support" and "Graphics Driver Support" say "No" and "Unsupported GPU", then it does not matter whether you have the proper settings in your BIOS; your card is not hardware compatible.
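For a cross-check outside GPU-Z: on Linux, recent versions of `lspci -vv` decode the same PCIe capability ("Physical Resizable BAR") from config space. Below is a minimal Python sketch that scans a captured `lspci` dump for it; the device string and BAR sizes in the sample are invented for illustration and will differ on real hardware.

```python
import re

# Abridged, invented sample of `lspci -vv` output for a GPU that
# advertises the Resizable BAR capability (real output varies by
# card, driver and pciutils version).
LSPCI_SAMPLE = """\
0b:00.0 VGA compatible controller: Example GPU Vendor Device 1234
\tCapabilities: [200 v1] Physical Resizable BAR
\t\tBAR 0: current size: 256MB, supported: 256MB 512MB 1GB 2GB 4GB 8GB
"""

def rebar_support(lspci_text):
    """Return (has_capability, supported_sizes) parsed from lspci -vv text."""
    has_cap = "Resizable BAR" in lspci_text
    match = re.search(r"supported:\s*(.+)", lspci_text)
    sizes = match.group(1).split() if match else []
    return has_cap, sizes

print(rebar_support(LSPCI_SAMPLE))
```

If the capability line is absent from a privileged `lspci -vv` dump, the card itself does not expose BAR resizing, which matches what GPU-Z's Advanced tab reports.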

I have two more RX 5700 XT cards, one of which I will do some more testing on, since my system no longer supports CrossFire.

WARNING and DISCLAIMER: I know what I am doing. I'm willing to burn up (or brick) a couple of cards if things go wrong.

DO NOT ATTEMPT these things if you do not understand them, or if you value the card you have. You may brick your card with firmware (BIOS) changes.

Thanks. Be well.....
 
I run an 8600K at 4.8 GHz all-core and a 1080 Ti.
Horizons is mostly 144 fps on ultra settings at 1440p ultrawide.

Odyssey is more like an inconsistent 60 fps, and I change settings to medium or low on a bad day.

I managed to get a 6800 XT and now I can run Odyssey at 60 to 80 fps on high settings, so I would say that as long as you are running the recommended CPU (which the OP is not), it's GPU limited.

I noticed one core is running at about 85% minimum all the time, and the rest at 50 to 65%.
My upgrade path is a 9900K, and I found a brand-new sealed one for 200 quid, so it should be coming this week.
I will let you know how that goes. I think it may make gameplay smoother without necessarily increasing fps.
 
It seems to be a GPU limitation. [...]
Those FPS values make no sense when compared to the 95th percentile. I'm not really sure what you are demonstrating beyond frying a GPU.
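To make the percentile point concrete: a run where a fraction of frames hitch badly can still post a flattering average FPS, which is exactly why benchmarks quote 95th-percentile (or 1% low) figures. A toy Python calculation with invented frame times:

```python
import math

# Why average FPS can hide stutter: 10% of frames hitching to 50 ms
# barely dents the mean but dominates the 95th percentile.
# Frame times below are invented for illustration.
frame_times_ms = [5.0] * 90 + [50.0] * 10

avg_fps = 1000.0 * len(frame_times_ms) / sum(frame_times_ms)

def percentile(values, p):
    """Nearest-rank percentile: value at rank ceil(p * n)."""
    ordered = sorted(values)
    return ordered[max(0, math.ceil(p * len(ordered)) - 1)]

p95_ms = percentile(frame_times_ms, 0.95)

print(f"mean: {avg_fps:.0f} FPS, 95th-percentile frame time: {p95_ms} ms "
      f"({1000.0 / p95_ms:.0f} FPS)")
```

Here the mean works out to roughly 105 FPS while the 95th-percentile frame time is 50 ms, i.e. 20 FPS: the two numbers describe the same capture yet tell very different stories.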
 
Just to add my bit.... Ryzen 5 3600 here, paired with an Asus Strix 3060 OC. Both CPU and GPU usually sit around 50-60% usage, with occasional spikes in temp on the CPU (due to the standard cooler being a bit crap), and 75 fps everywhere (pegged at 75, never been below that)...
 
Usually get to 99% GPU usage before it starts throttling.

CPU sits anywhere between 15-30%

1080p / 60fps everywhere on ultra settings.
 

Just to add my bit.... Ryzen 5 3600 here, paired with an Asus Strix 3060 OC [...]
Could you also supply the relevant information of what resolution that is at, and what in-game graphics settings? For all we know, that's on an old 1024x768 monitor on 'potato' settings ;) .
 
Tbh, I too will probably get it within the next month or so, mainly because when I do upgrade to a bigger desktop (which is also on the cards) it should be ready to go 🙂

As BottomHat says, performance can't be adequately determined from hardware specs without actually trying it 🤔
 
Could you also supply the relevant information of what resolution that is at, and what graphics settings? [...]
1080p (my max resolution on my monitor), Ultra/Ultra+ on everything, SuperSampling set at x1.25, Terrain Gen Slider all the way full. AA off.

I also found a few bases which seem to dip to single FPS numbers occasionally - with nothing hitting 100% usage... (something to do with waiting for network stuff, maybe?)
 
1080p (my max resolution on my monitor), Ultra/Ultra+ on everything, SuperSampling set at x1.25, Terrain Gen Slider all the way full. AA off. [...]
Same. I have an 8600K OCed (which is only one tier above the recommended 8600 non-K) and a 1070 Ti (which is two tiers above the recommended 1060 6GB). When I see the lowest framerates, neither GPU nor CPU is anywhere CLOSE to being maxed out (and yes, I measure CPU usage per core, so I know it isn't a single-thread issue). In these low-framerate situations (as in 30-40 fps), GPU usage is usually around 40% and the highest CPU thread usage is around 70%.

Whatever the issue is, it's poorly organized code, and NOTHING to do with there not being enough hardware horsepower. And even then, when GPU usage is at maximum for me, there is usually not nearly enough going on on-screen to justify that kind of GPU usage.
 
When I see the lowest framerates, neither GPU nor CPU are anywhere CLOSE to being maxed out [...]

Are you controlling for polling rate? Without it being fast enough, you'll see spikes in CPU load--either those caused by the game engine itself, or by the Windows scheduler's propensity to cycle threads between different cores, or both--averaged out and rounded off. Even overtly CPU-limited scenarios will report well under 100% per-core utilization if the load is very transient and/or the polling interval too long.
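To illustrate the averaging effect: a core that is fully busy for only part of each 16 ms frame is a genuine bottleneck during those bursts, yet a one-second poll reports it as lightly loaded. A toy Python simulation (all timings invented for illustration):

```python
# Toy simulation of a bursty render thread: fully busy for the first
# 6 ms of every 16 ms frame (a real bottleneck during the burst),
# idle for the rest. All timings are invented for illustration.
FRAME_MS = 16
BUSY_MS = 6

def busy_at(t_ms):
    """True if the thread is busy at millisecond t_ms."""
    return (t_ms % FRAME_MS) < BUSY_MS

def utilization(start_ms, window_ms):
    """Percent of the window during which the thread was busy."""
    busy = sum(busy_at(t) for t in range(start_ms, start_ms + window_ms))
    return 100.0 * busy / window_ms

# A 1000 ms poll (Task Manager-style) sees a lightly loaded core...
print(utilization(0, 1000))   # ~38%
# ...while a 4 ms window inside the burst sees a saturated one.
print(utilization(0, 4))      # 100%
```

The same thread thus reads as "~38% used" or "pegged at 100%" depending purely on the measurement window, which is the whole point about polling intervals.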
 
I’m at least contemplating buying Odyssey in a couple of months, if FD improves upon it in the meantime (in terms of FPS, planet appearance and bugs), because I have an interest in making another ED video series IF it is practical to do so with my current hardware, but there would be very little point in it if I can’t show the newest EDO stuff (on-foot movement, base interiors, etc.)

But my system is a little lopsided in terms of CPU/GPU balance, as I have a five-year-old i7-6700K CPU combined with an RTX 3070. I’ve always run EDH at 1440p resolution and would want to continue doing so, since that’s my monitor’s native resolution. The obvious question I wish to put to the forum is how EDO (in its current state) might run on such a system. But to ask it in a more specific and useful way: when players run EDO on a relatively balanced middle-tier system (e.g., something like a 6700K + 1070 or 1080), are they finding their FPS to be limited by the GPU maxing out, or by the CPU?

I’m hoping to hear that the bulk of the load on such a system might be on the GPU side, since that’s where my gaming system’s extra muscle is, and in such a case I’d hope to at least have a chance to achieve somewhat reasonable performance with EDO (e.g., 40 FPS or better, enough to allow mostly smooth 30 FPS video capture for use in videos).

Your thoughts and musings would be much appreciated.
I've got a 3900X, 32 GB RAM and a 3060 Ti, and I still get under 30 fps at some settlements. It's the game. Sometimes the GPU chills at 40-50%, sometimes it's at 100%, but it still can't get over 30 fps. The CPU is bored.
 
Same. I have an 8600K OCed (which is only one tier above the recommended 8600 non-K) and a 1070 Ti (which is two tiers above the recommended 1060 6GB). [...]
I'm actually seeing occasional dips to single figures! No warning, no indications anything is struggling - just a second or two of 8 or 9 fps, then it'll jump back up to a smooth 75.... (it's nothing to do with graphics settings, as it does it in full potato mode, too)
 
Edit: I may have misread the OP's CPU, my bad. Still, in general the reminder on requirements may help someone.

I may have a unicorn experience, with all the fancy PC setups reporting issues… but my rig closely matches (it is a tad better than) the system requirements recommended by Frontier, and since Update 6 it runs reasonably well. Still some FPS drops on foot; measurable, but not necessarily disturbing while playing.

So I believe the OP may have a good reason to update the CPU, as it seems to be far below spec while the GPU is already overkill. I mean, there was a design goal for FDEV and it materialized into the recommended requirements. Anything below that can be expected not to run so well.

Reminder:

Elite Dangerous: Odyssey Recommended Requirements

  • CPU: Intel Core i5-8600 / AMD Ryzen 5 2600
  • RAM: 12 GB
  • OS: Windows 10 64bit
  • VIDEO CARD: NVIDIA Geforce GTX 1060 (6 GB VRAM) / AMD RX 580 (8 GB VRAM)
  • PIXEL SHADER: 5.1
  • VERTEX SHADER: 5.1
  • FREE DISK SPACE: 75 GB
  • DEDICATED VIDEO RAM: 6 GB
 
Are you controlling for polling rate? [...]
Considering that these peculiar fps drops often last for many minutes on end, polling rate isn't really a factor needing controlling. I've dropped into settlements (sometimes uninhabited ones) that will drop my framerate down to high 40's or low 50's for the entire duration that I'm there, despite neither the GPU nor any one CPU thread being close to max usage. And the framerate doesn't recover until I either leave the planet or I run away and put enough distance between me and that settlement.

The framerate drops aren't happening quickly and sporadically, so polling rate is not going to confound the results.
 
Considering that these peculiar fps drops often last for many minutes on end, polling rate isn't really a factor needing controlling. [...]

Duration of the low performance is not at all relevant to my points.

You aren't increasing the polling rate to reveal a frame rate drop, but to reveal a CPU limitation caused by patterns of CPU load that get averaged away at longer polling intervals. If the game engine loads a couple of render/worker threads to 100% for half the time it takes to issue its draw calls, then falls to near idle for the rest of the time the frame is being presented, that's still a CPU limitation and is still long enough to significantly impact frame rates, even if the average logical CPU load over the span of one full second is far below 100%.

If I play in settlements at 1440p or lower, I am completely and persistently CPU limited, even with a ~5GHz 5800X, and this CPU bottleneck is essentially invisible at typical polling rates.

This is from a few months back, but the game still behaves fundamentally the same way:

Several of my logical cores are hitting 100% utilization with Task Manager never reporting more than 70% on any core, because its graphs are essentially showing the mean over a 1000 ms period.

If I could poll even faster, I bet I'd see spikes to near 100% more frequently, probably at least once per rendered frame, and it would last as long as I stayed in an area I was CPU limited in.
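For anyone who wants to roll their own per-core measurement rather than trust Task Manager's graphs: on Linux, /proc/stat exposes cumulative per-core jiffy counters, and utilization over any window is simply the busy-time delta divided by the total-time delta between two reads. The sketch below parses two hand-written snapshots (all numbers invented); on a real system you would read /proc/stat twice with a short sleep in between.

```python
# Each /proc/stat line: cpuN user nice system idle iowait irq softirq steal guest guest_nice
# Busy time = total - (idle + iowait). Snapshot numbers are invented.
SNAP_A = """\
cpu0 100 0 50 800 50 0 0 0 0 0
cpu1 200 0 100 650 50 0 0 0 0 0
"""
SNAP_B = """\
cpu0 150 0 70 860 60 0 0 0 0 0
cpu1 380 0 160 650 50 0 0 0 0 0
"""

def per_core(snapshot):
    """Map core name -> (total_jiffies, busy_jiffies)."""
    cores = {}
    for line in snapshot.splitlines():
        name, *fields = line.split()
        values = [int(v) for v in fields]
        total = sum(values)
        idle = values[3] + values[4]          # idle + iowait
        cores[name] = (total, total - idle)
    return cores

def utilization(snap_a, snap_b):
    """Per-core busy percentage between two snapshots."""
    a, b = per_core(snap_a), per_core(snap_b)
    return {name: 100.0 * (b[name][1] - a[name][1]) / (b[name][0] - a[name][0])
            for name in a}

print(utilization(SNAP_A, SNAP_B))  # cpu1 is pegged while cpu0 idles half the time
```

Shrinking the interval between the two reads is exactly the "faster polling" described above: short enough windows will catch a render thread at 100% even when the one-second mean sits around 40%.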
 
I'm actually seeing occasional dips to single figures! [...]
My guess is that those dips are related to loading assets off disk, perhaps during LOD changes, or possibly to shuffling data between onboard and main system memory. If the GPU is simply waiting for data it needs to render the scene, the only thing that can happen is a stall. I can say, from some fiddling with storage last month, that moving the game install between rotating and SSD storage makes a big difference to (but does not eliminate) the frequency and severity of those stalls at settlements.
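If you can log per-frame times (frame-time capture tools such as CapFrameX can produce these), stalls of this kind stand out as isolated frames far above the median rather than a generally elevated baseline. A small sketch, with invented numbers, that flags frames exceeding four times the median frame time:

```python
import statistics

# Invented frame-time log: steady ~75 FPS (13.3 ms) with two stalls.
frame_times_ms = [13.3] * 20 + [120.0] + [13.3] * 20 + [110.0]

def stalls(times_ms, factor=4.0):
    """Return (index, frame_time_ms) for frames far above the median."""
    median = statistics.median(times_ms)
    return [(i, t) for i, t in enumerate(times_ms) if t > factor * median]

print(stalls(frame_times_ms))  # only the two hitch frames, not the steady ones
```

Because the median stays anchored at the steady-state frame time, this picks out streaming-style hitches even when average FPS looks healthy; the 4x factor is an arbitrary illustrative threshold.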
 
My guess is that those dips are related to loading assets off disk, perhaps during LOD changes. [...]
Even running off an M.2 NVMe SSD with a 2.4GB/s read speed, alongside 32GB RAM?
 