GPU Usage at 99-100%

A card that is properly cooled and running at factory clock rates usually reaches obsolescence long before it reaches end of life.
Sorry, but nope. 99% of the video cards I've owned were overclocked and ran perfectly fine, as long as you know how to overclock them to their sweet spot. As I said before, not a single card burned out because of overclocking; more than that, I sold each one after upgrading to a newer card, and not a single buyer ever told me anything was wrong with it, even years later.
 
A card that is properly cooled and running at factory clock rates usually reaches obsolescence long before it reaches end of life.
"Usually" is the key word. Building a PC is complicated and full of pitfalls. I guess we shouldn't be surprised everyone prioritizes console releases.
 
All the other games I have use at most 40-60% of the GPU. This is not normal!
This is actually really good! You WANT your GPU to be giving all its power, so you get the most performance and frames it can offer. You should actually be glad it's hitting that level, because it means the game is using all of your hardware as it's meant to be used.
 
"Usually" is the key word. Building a PC is complicated and full of pitfalls. I guess we shouldn't be surprised everyone prioritizes console releases.
The point here is that 100% load isn't the problem. If it were, you could extend any GPU's lifespan by overclocking it to oblivion, so that it would never hit 100% load in any game.

Obviously the opposite is true: an underclocked card is more likely to hit 100% load, but also more likely to last longer.
 
But it is the problem. Why does Odyssey need 4K textures, speculars and normal maps when the new lighting model just washes out any extra detail you'd get from them? All you need are some side-by-side comparisons to see that the Horizons build offers better visual fidelity for less VRAM usage. It's like they were just asked to jack up the requirements by any means necessary rather than make a reasonable expansion to the game.
 
Your graphics card should keep trying to do more if you are under the framerate limit; less than 100% would point to a bottleneck somewhere else. I have noticed that the 8 GB on my 1080 Ti seems to be constantly nearly or completely full. I do wonder if a lot of data is being passed between the GPU, CPU and main memory.
 
It's like I said. They greatly expanded texture memory with no measurable net gain in visual fidelity. No game in existence can fill 8 GB of VRAM right now except Elite Odyssey, for reasons you or I can't possibly fathom. If they were actually interested in making a good game they would be here right now explaining why. But they aren't. They're only telling you "we're looking into it," as if they don't already know what all the problems are.
 
It is true, the more GPU usage the better; high GPU usage doesn't degrade anything. The GPU was designed to run at 100% load, but only on the condition that the game is well optimized, and in our case it's not. The GPU does a lot of extra work that it shouldn't: 100% load and 100% VRAM usage, even for some people with 8+ GB of VRAM. It's a waste of power. There are hardly any AAA games that use 4-6 GB of VRAM, let alone the 8-12 GB Odyssey does. This is not OK, and the devs should look into it and optimize it.

I have an RX 580 double-OC 4 GB GPU, and when I read about people having difficulties with high-end GPUs it actually scares me, because my mid-range GPU runs at 30-50 FPS. My only problem is the planet ground textures remaining low quality on high/ultra settings; everything else is sharp and kicking. I mean, how in hell do you get much lower FPS with a superior GPU than my old trusty RX? I read a post (on Reddit) from someone struggling with an RTX 2070 at 24 FPS on medium-high textures. Do these people lie? Why would they lie? What do they gain by lying?

I believe FDev shot themselves in the foot. ED will lose a lot of players who have mid-range builds. I don't believe everyone here has the latest GPUs on the market, especially now with Covid, crypto mining and the new import tariffs.

 
Although I'm not in game development, I do work extensively with GPUs for my job in high-performance computing. First of all, yes, GPUs are indeed designed to work at 100% without issues. It's just normally very difficult to achieve this, as the GPU is controlled from the CPU side. The GPU is designed as a coprocessor, and as such it requires the CPU to instruct it what code to execute at what time. Now, the architectures allow for so-called asynchronous execution, which means that you can send execution instructions to the GPU in a queue and control will return to the CPU immediately, before the GPU is done executing them. This allows both to execute code simultaneously. But most of the time the CPU side of things must synchronize with the GPU side of things, so there is usually some inefficiency involved in this model.
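
As a minimal sketch of that queuing model, here is what it looks like in CUDA. This is illustrative only: a game talks to the GPU through a graphics API rather than CUDA, and "busyKernel" is a made-up stand-in for real GPU work, but the enqueue-then-synchronize pattern is the same.

```cpp
// Minimal CUDA sketch of asynchronous execution: the CPU enqueues a kernel
// on a stream and keeps working; it only blocks at the explicit synchronize.
// Illustrative only -- "busyKernel" is a made-up stand-in for real GPU work.
#include <cstdio>
#include <cuda_runtime.h>

__global__ void busyKernel(float* data, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) {
        float x = data[i];
        for (int k = 0; k < 1000; ++k) x = x * 1.0000001f + 0.5f;
        data[i] = x;
    }
}

int main() {
    const int n = 1 << 20;
    float* d_data = nullptr;
    cudaMalloc((void**)&d_data, n * sizeof(float));

    cudaStream_t stream;
    cudaStreamCreate(&stream);

    // Enqueue work; this call returns immediately, before the GPU is done.
    busyKernel<<<(n + 255) / 256, 256, 0, stream>>>(d_data, n);

    // The CPU is free to do its own work here while the GPU executes.
    printf("CPU side keeps running while the kernel is in flight\n");

    // Only here does the CPU wait for the GPU to catch up.
    cudaStreamSynchronize(stream);

    cudaStreamDestroy(stream);
    cudaFree(d_data);
    return 0;
}
```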

If the GPU code that is running is efficiently implemented and your GPU is relatively more powerful than your CPU, you will underutilize the GPU, because it will execute its side of the program more quickly than the CPU executes its side, and it will be idle until the CPU has caught up. In those cases you will see that the GPU load does not reach 100%. This is called a CPU bottleneck: your GPU could do much more work, but the poor CPU can't keep up.
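
A toy way to see this, again assuming CUDA as a stand-in for the renderer: time the CPU part of a frame with std::chrono and the GPU part with CUDA events. The simulateGameLogic and renderStub functions below are invented placeholders; the point is only that when the CPU part takes longer than the GPU part, the GPU sits idle.

```cpp
// Toy frame loop illustrating a CPU bottleneck (CUDA, illustrative only).
// simulateGameLogic() and renderStub() are invented placeholders. When the
// CPU part of a frame takes longer than the GPU part, the GPU idles between
// frames and its reported utilization falls below 100%.
#include <chrono>
#include <cstdio>
#include <cuda_runtime.h>

__global__ void renderStub(float* fb, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) fb[i] = sinf(i * 0.001f);   // stand-in for real rendering work
}

void simulateGameLogic() {                 // stand-in for CPU-side frame work
    volatile double acc = 0.0;
    for (int i = 0; i < 5000000; ++i) acc = acc + i * 0.5;
}

int main() {
    const int n = 1 << 22;
    float* d_fb = nullptr;
    cudaMalloc((void**)&d_fb, n * sizeof(float));

    cudaEvent_t start, stop;
    cudaEventCreate(&start);
    cudaEventCreate(&stop);

    for (int frame = 0; frame < 10; ++frame) {
        auto c0 = std::chrono::steady_clock::now();
        simulateGameLogic();                              // CPU part of the frame
        auto c1 = std::chrono::steady_clock::now();

        cudaEventRecord(start);
        renderStub<<<(n + 255) / 256, 256>>>(d_fb, n);    // GPU part of the frame
        cudaEventRecord(stop);
        cudaEventSynchronize(stop);

        float gpu_ms = 0.0f;
        cudaEventElapsedTime(&gpu_ms, start, stop);
        double cpu_ms = std::chrono::duration<double, std::milli>(c1 - c0).count();
        printf("frame %d: cpu %.2f ms, gpu %.2f ms%s\n", frame, cpu_ms, gpu_ms,
               cpu_ms > gpu_ms ? "  <- CPU-bound, GPU will idle" : "");
    }

    cudaEventDestroy(start);
    cudaEventDestroy(stop);
    cudaFree(d_fb);
    return 0;
}
```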

This is not the only reason for underutilization of the GPU, of course. Poorly designed programs can also cause this. For instance, if the program must constantly move data across the PCI-E bus, that will slow down execution on the GPU whenever it needs that data.
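
A sketch of that PCI-E point under the same assumptions: re-uploading a buffer every frame versus uploading it once and keeping it resident in VRAM. The buffer size and kernel are arbitrary placeholders.

```cpp
// Sketch of why per-frame PCI-E traffic hurts (CUDA, illustrative only;
// buffer size and kernel are arbitrary). Re-uploading the same data every
// frame makes the GPU wait on the bus; uploading once and keeping it
// resident in VRAM avoids that.
#include <cuda_runtime.h>

__global__ void useBuffer(const float* buf, float* out, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) out[i] = buf[i] * 2.0f;
}

int main() {
    const int n = 1 << 24;                       // 64 MB of floats
    const size_t bytes = n * sizeof(float);

    float* h_buf = nullptr;
    cudaMallocHost((void**)&h_buf, bytes);       // pinned host memory: faster DMA
    float *d_buf = nullptr, *d_out = nullptr;
    cudaMalloc((void**)&d_buf, bytes);
    cudaMalloc((void**)&d_out, bytes);

    cudaStream_t stream;
    cudaStreamCreate(&stream);

    // Bad pattern: re-send the same data across the bus every frame.
    for (int frame = 0; frame < 3; ++frame) {
        cudaMemcpyAsync(d_buf, h_buf, bytes, cudaMemcpyHostToDevice, stream);
        useBuffer<<<(n + 255) / 256, 256, 0, stream>>>(d_buf, d_out, n);
    }

    // Better pattern: upload once, then let every frame read it from VRAM.
    cudaMemcpyAsync(d_buf, h_buf, bytes, cudaMemcpyHostToDevice, stream);
    for (int frame = 0; frame < 3; ++frame) {
        useBuffer<<<(n + 255) / 256, 256, 0, stream>>>(d_buf, d_out, n);
    }

    cudaStreamSynchronize(stream);
    cudaStreamDestroy(stream);
    cudaFreeHost(h_buf);
    cudaFree(d_buf);
    cudaFree(d_out);
    return 0;
}
```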

Now, if the GPU is 100% utilized by a program, it can mean that the software engineer has tuned it so well that both sides are used very efficiently, and for games that would mean high framerates at the resolution the GPU was designed for. But if the framerates are low and the utilization is high, that points to poorly optimized code.

So in short: high utilization isn't bad in itself and under normal circumstances won't do harm. If you are experiencing low to medium utilization in most games, it might mean that your GPU outclasses your CPU and that you might benefit from an upgrade there.
Low framerates combined with high GPU utilization can mean that you are exceeding the capabilities of your system, or can indicate poor optimization.
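
If you want to check which of those cases you are in on an NVIDIA card, the utilization and VRAM counters that nvidia-smi shows can also be read through NVML. A minimal sketch, assuming the NVML headers are installed and you link with -lnvml; run it while the game is running.

```cpp
// Minimal NVML sketch (NVIDIA only) for reading GPU utilization and VRAM --
// the same counters nvidia-smi reports. Build with -lnvml. Illustrative;
// error handling is reduced to a single check.
#include <cstdio>
#include <nvml.h>

int main() {
    if (nvmlInit() != NVML_SUCCESS) {
        fprintf(stderr, "NVML init failed\n");
        return 1;
    }

    nvmlDevice_t dev;
    nvmlDeviceGetHandleByIndex(0, &dev);        // first GPU in the system

    nvmlUtilization_t util;                     // GPU core / memory controller load, %
    nvmlMemory_t mem;                           // VRAM totals, in bytes
    nvmlDeviceGetUtilizationRates(dev, &util);
    nvmlDeviceGetMemoryInfo(dev, &mem);

    printf("GPU core: %u%%   VRAM: %.1f / %.1f GB\n",
           util.gpu,
           mem.used  / 1073741824.0,
           mem.total / 1073741824.0);

    nvmlShutdown();
    return 0;
}
```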
 
You aren't the real Crazy Joe that donated all of his virtual income to charity. You're a goddamn fraud and so is your education. 100% utilization is never good; it never has been and never will be. If you're going to try to argue with people against common sense, maybe start by drugging them first, then hitting them in the head repeatedly with a hammer. That might actually work, as opposed to whatever this rambling is supposed to be.
 
And I see the crazies have arrived.....
 
Your GPU should be at 100%; that's how it's designed to work. That's not the issue. The issue seems to be that performance is misallocated in the case of Odyssey: your 100% GPU usage doesn't get you much because of optimization issues.

Like, if you're used to running at only 40% GPU, you are bottlenecking at your monitor (most likely, Vsync) or your CPU (less likely, and this would be bad). If you have a good, high-refresh (ideally adaptive sync) monitor, your GPU will always be working at 100% or near there.
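
A sketch of why a frame cap keeps utilization low, again using CUDA as a stand-in for the renderer (the kernel and 60 Hz interval are arbitrary): each "frame" of GPU work finishes quickly, then the loop sleeps until the next ~16.7 ms vsync slot, and during that sleep the GPU has nothing queued.

```cpp
// Sketch of a frame cap keeping GPU utilization low (CUDA as a stand-in for
// the renderer; the kernel and 60 Hz interval are arbitrary). The GPU work
// per "frame" finishes quickly, then the loop sleeps until the next vsync
// slot; during that sleep nothing is queued, so utilization stays low.
#include <chrono>
#include <thread>
#include <cuda_runtime.h>

__global__ void lightFrame(float* fb, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) fb[i] = i * 0.5f;                 // cheap "frame" for the GPU
}

int main() {
    const auto frameInterval = std::chrono::microseconds(16667);   // ~60 Hz cap

    const int n = 1 << 20;
    float* d_fb = nullptr;
    cudaMalloc((void**)&d_fb, n * sizeof(float));

    for (int frame = 0; frame < 300; ++frame) {
        auto frameStart = std::chrono::steady_clock::now();

        lightFrame<<<(n + 255) / 256, 256>>>(d_fb, n);
        cudaDeviceSynchronize();                 // GPU part of the frame done here

        // Emulate vsync / a frame limiter: sleep until the next frame slot.
        // During this sleep the GPU has nothing queued, so utilization drops.
        std::this_thread::sleep_until(frameStart + frameInterval);
    }

    cudaFree(d_fb);
    return 0;
}
```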
 
And I see the crazies have arrived.....
Oh, of course, because life expectancy under full load vs. normal load on hardware is just a rubbish statistic Newegg pulled out of its butt just to screw over the poor innocent GPU manufacturers, who are barely scraping by thanks to bullies like me, right?
Your GPU should be at 100%; that's how it's designed to work. That's not the issue. The issue seems to be that performance is misallocated in the case of Odyssey: your 100% GPU usage doesn't get you much because of optimization issues.

Like, if you're used to running at only 40% GPU, you are bottlenecking at your monitor (most likely, Vsync) or your CPU (less likely, and this would be bad). If you have a good, high-refresh (ideally adaptive sync) monitor, your GPU will always be working at 100% or near there.
See above. Your GPU usage is relative to VRAM. If a game is using 8 GB of VRAM, it is a broken game. See: every single other game that currently exists.
 
I'm running a 2070 Super with 8 GB VRAM, an Intel i7-10700K and 16 GB RAM.

My CPU and RAM don't break a sweat running Odyssey, but the GPU is frequently hitting 90%+ utilisation and my water cooling system is having to work hard. None of this worries me much, as it's within the hardware's capabilities, but what I find strange is that other games I run with similar or greater requirements don't seem to tax the GPU anywhere near the same level (although I'll have to run some tests to confirm what I'm saying is accurate).

EDIT: Forgot to point out that my current assumption, from what I'm seeing, is that Odyssey isn't properly optimised, as I seriously can't see anything in-game that requires anything like 8 GB of VRAM to render.
 