Taking heat damage : Thermal Throttle :(

My GTX 980 goes up to 70°C when playing Odyssey, which is more than with any other game. It uses 90-100% of the GPU and its memory, and all my CPU cores sit at around 90% utilization. Granted, I am pushing the settings a bit, but it's still notable compared with my experience of the computer.
 
That's generally what's meant by overheating today: the part has to throttle to avoid damage. It's a symptom of a problem with your build, not a problem with the software - something is wrong with the cooling, since software shouldn't be able to do this to you even if it's deliberately trying. If you can identify which component has faulty cooling you can fix it, e.g. via the benchmark tools I posted above. If it's the GPU throttling, it probably just needs dusting, since GPU manufacturers select appropriate heatsinks/fans for their hardware. If it's the CPU throttling and cleaning the heatsink doesn't fix it, you might need a different heatsink or to reinstall it.
This is mostly wrong. There is always a performance limit to a PC, and it's almost always heat - that's what CPU and GPU boost is all about. Unless you have an extremely overkill cooling setup, if you are not hitting the limits of your cooling you are leaving performance unused and wasting the money you paid for it.

Manufacturers set sensible heat limits balanced between performance and longevity, so the CPU and GPU (and sometimes the RAM) will almost always hit their heat limit before they hit their power limit or a clock speed that makes them unstable. A bone-stock system with bone-stock cooling will have heat as its performance limit in 99% of situations, because parts boost up until they get too hot and then slow down again. I have never seen a stock-cooled PC able to boost to its maximum clock speed 100% of the time.
 
Perhaps I'm just not playing heavy duty games (I typically muck about with NMS, KSP, Star Trek bridge crew, Homeworld, Star Wars Squadrons and World of Warships), but I do have MS Flight Simulator (which, I think ought to give my PC a good kicking).... but all is good.

Well, all is not good.
You clearly observed a 10 K increase in temperature just by switching from Horizons to Odyssey. For me it is a 20 K increase (45-55°C to 65-70°C), recorded in the very same situation in both Horizons and Odyssey (idling in a station/hangar or on your carrier).
Additionally, the frequency displayed by the in-game counter may well be the frequency at which your screen is refreshed, but that's not the content update frequency - that one can be far lower. Say you are in the main menu: the game might decide that updating the content at 30 fps is perfectly fine, but your screen will still refresh at 60 fps, essentially showing you each frame twice before updating.
Something similar happens in Odyssey proper. If you fly through the mail slot of a station, land at a surface outpost or just walk around the hangar, you will notice that the content update frequency is clearly not the same as your screen refresh frequency. While the former can be in the single-digit range, you might still have a "solid" 30 fps displayed.
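The refresh-vs-content-rate distinction above can be sketched in a few lines. This is a toy model (the rates are made up, and real engines are obviously more complex): a 60 Hz display keeps refreshing, but if the game only finishes a new frame 20 times a second, most refreshes just repeat the previous frame.

```python
REFRESH_HZ = 60   # what an on-screen FPS counter may report
CONTENT_HZ = 20   # how often the game actually produces a new frame

def frames_shown(duration_s=1):
    """Return the content-frame index shown at each display refresh."""
    shown = []
    for tick in range(REFRESH_HZ * duration_s):
        # the display picks up whichever content frame was finished last
        latest_content_frame = (tick * CONTENT_HZ) // REFRESH_HZ
        shown.append(latest_content_frame)
    return shown

frames = frames_shown()
print(len(frames))        # 60 refreshes in one second...
print(len(set(frames)))   # ...but only 20 distinct frames were shown
```

So a counter tied to the display can read "60" while the world only updates 20 times per second, which matches the choppiness described above.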

There is something wrong with Odyssey on a very fundamental level and it remains to be seen if Fdev is willing to fix this. We know it can be done because we do have Horizons.
 
It isn't necessarily bad for a game to use all the GPU, the GPU should be able to handle it. Being throttled due to heat can only really be solved by hardware (e.g. water cooling). This isn't to suggest EDO doesn't have performance issues. :)
 
Elite: Dangerous in general has always been able to load GPUs like almost no other actual game. That said Horizons is still a bit warmer than Odyssey for me, at least if frame rate is uncapped, and Horizons has a considerably higher frame rate.

On my current primary system, most games keep my GPU (6800 XT @ 2.5GHz, undervolted, uncapped power limit) in the 275w range, but Horizons will push it above 300w, with Odyssey usually being somewhere between. Of course, I did all of my testing/validation for ~375w in 30C ambients.

Part of the issue with Odyssey is that there are some areas where I'm CPU limited, but Horizons has never had such a limitation.

What's really strange is that even my CPU (i7-7700) runs quite hot (about 70 degrees). And that's a 65 W CPU with a 250 W cooling block on top.

70C isn't particularly hot for a CPU, and Intel hasn't been using a definition of TDP that limits a part's thermal output to that figure, at least in the short term, for quite some time.

Cooling capacity ratings of heatsinks are also misleading, as no one states temperature deltas, and cooler manufacturers can't take the broad range of thermal densities and thermal resistances of various CPUs into account. The general trend has been for thermal density to increase, even as peak power has leveled off or even declined.

For example, my first 200+ watt CPU was a Core 2 Quad (well, a Xeon X3350 Yorkfield that I overclocked from 2.66GHz to 4GHz), while my current primary CPU peaks at less than 170w (a Ryzen 7 5800X w/custom PBO and loosened power limits). Yet the latter part is vastly harder to cool: despite the lower total dissipation, thermal density has at least tripled. Getting the heat to the heatsink is now the limiting factor.

That's generally what's meant by overheating today: the part has to throttle to avoid damage. It's a symptom of a problem with your build, not a problem with the software - something is wrong with the cooling, since software shouldn't be able to do this to you even if it's deliberately trying. If you can identify which component has faulty cooling you can fix it, e.g. via the benchmark tools I posted above. If it's the GPU throttling, it probably just needs dusting, since GPU manufacturers select appropriate heatsinks/fans for their hardware. If it's the CPU throttling and cleaning the heatsink doesn't fix it, you might need a different heatsink or to reinstall it.

Most modern CPUs and GPUs are power and/or temperature limited. Clocks are, by design, highly dynamic and without meddling can swing wildly from load to load, generally boosting as far as practical within a given power limit that the cooling solution and power delivery components were designed around.
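The "boost until a limiter is hit" behaviour can be illustrated with a toy control loop. This is not any vendor's actual boost algorithm, and every constant below is invented: the clock ramps up while there is thermal headroom and backs off when the fixed temperature target is exceeded, so the steady-state clock is whatever the cooling supports.

```python
TEMP_TARGET = 80.0    # fixed temperature limit (deg C) - the design point
AMBIENT = 25.0        # ambient temperature (deg C)
HEAT_PER_MHZ = 0.03   # crude steady-state degrees above ambient per MHz
BASE, MAX_BOOST = 1500, 2500  # advertised base and max boost clocks (MHz)

def settle_clock(steps=1000):
    """Simulate temperature-target boosting; return the settled clock."""
    clock = BASE
    for _ in range(steps):
        temp = AMBIENT + HEAT_PER_MHZ * clock
        if temp < TEMP_TARGET and clock < MAX_BOOST:
            clock += 5    # boost while thermal headroom remains
        elif temp > TEMP_TARGET:
            clock -= 5    # throttle back to hold the temperature target
    return clock

print(settle_clock())  # settles well below MAX_BOOST, right at the target
```

With these made-up numbers the part hovers around 1830 MHz, above base but far below maximum boost - which is exactly the "in spec but thermally limited" behaviour described above. Better cooling (a smaller HEAT_PER_MHZ) would raise the settled clock, not the temperature.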

Also, the OP posted that it doesn't happen in any of his other demanding games, so it's not his rig that's the problem.

Most other games aren't as demanding as Elite: Dangerous, but Elite: Dangerous is hardly the most demanding software that can be executed.

I’m running it on a laptop, so I’m used to thermal throttling, especially this time of year. But Odyssey is straining my system far more than games with higher required specs - it’s not our rigs.

Recommended specs or apparent visual quality aren't good indicators of hardware utilization.

Being throttled due to heat can only really be solved by hardware (e.g. water cooling).

I'm not reaching thermal limits on any of my GPUs, but they are all significantly undervolted.

No, it isn't. It's even a security hazard to have a system where untrusted software (JavaScript, etc.) can push the temperature up to the point where throttling kicks in; that's a huge opportunity for a side-channel attack.

It's not a security hazard to run at less than peak clock speeds.

You will be hard pressed to find a modern, stock part that cannot easily be brought to its power and/or thermal limit using the cooling that was designed for its TDP or that came with it. Assuming it's not outright defective, it will also handle running at such temperatures indefinitely, provided nothing is out of spec.
 
It's not a security hazard to run at less than peak clock speeds.
There's been a substantial amount of work getting encryption algorithms to run in the same amount of time regardless of the input, but not always producing the same amount of heat. If you can raise the temperature to the point where the part may or may not throttle depending on the input, you've got your timing attacks back. Faulty cooling is unsafe, just like faulty hardware.
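For context on the constant-time work mentioned above, here is a minimal sketch of the classic problem it addresses. The naive comparison exits at the first mismatching byte, so its running time leaks how much of a secret an attacker has guessed; Python's standard-library `hmac.compare_digest` is the constant-time alternative. (The token values are made up for illustration - and, per the post above, even constant-time code does not guarantee constant *heat*.)

```python
import hmac

def naive_equal(a: bytes, b: bytes) -> bool:
    """Byte comparison with an early exit - timing depends on the secret."""
    if len(a) != len(b):
        return False
    for x, y in zip(a, b):
        if x != y:
            return False  # bails out at the first mismatch
    return True

secret = b"s3cret-token"
print(naive_equal(b"s3cret-token", secret))          # True, but timing-unsafe
print(hmac.compare_digest(b"guess-token!", secret))  # False, constant-time
```

`compare_digest` examines every byte regardless of where the inputs differ, which removes the timing signal - though, as the post argues, not necessarily the thermal one.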
 
My GTX 980 goes up to 70°C when playing Odyssey, which is more than with any other game. It uses 90-100% of the GPU and its memory, and all my CPU cores sit at around 90% utilization. Granted, I am pushing the settings a bit, but it's still notable compared with my experience of the computer.
My current case is not optimal for cooling (new one on the way), and I'm having to limit my 980, otherwise it's hitting 90°C. It's a factory OC card, so its stock speed is the same as most stock cards' boost. Hopefully my new case will bring it down to a better level.

Running at 1920x1080 and about the same on High or Ultra. Running Horizons on Ultra, temps are more stable in the low to mid 80s, rather than mid 80s to 90s.
 
My GPU starts to melt when I start EDO: the temperature rises from 40°C to 75°C in about 5 minutes, and when the room temperature is above 35°C, like yesterday, I'm really scared for my card - and a replacement is out of reach during this GPU shortage. That's why I prefer not to play Odyssey. But in spirit I am with all the players who sacrifice their cards for the greater good™

75°C is nothing to worry about.
 
My current case is not optimal for cooling (new one on the way), and I'm having to limit my 980, otherwise it's hitting 90°C. It's a factory OC card, so its stock speed is the same as most stock cards' boost. Hopefully my new case will bring it down to a better level.

Running at 1920x1080 and about the same on High or Ultra. Running Horizons on Ultra, temps are more stable in the low to mid 80s, rather than mid 80s to 90s.
I have to run mine in debug mode to turn off the OC or the driver will randomly crash resulting in a blank screen. No driver update ever fixed that for me. Just happy there's this workaround for it.
 
I was streaming EDO last night and noticed that my PC was hitting its thermal throttle (set at 80c) while playing EDO Patch 4, on both CPU and GPU (stats from Afterburner). No other game I have pushes my computer this hard (including MS Flight Simulator). This also affects my streaming: due to the throttling, Streamlabs starts to stutter during those parts of the game, meaning I can't stream it smoothly regardless of what EDO is doing.

My PC spec

GTX 1070, i7-8700, 16 GB Ram, SSD (all standard air-cool case)

I wonder, given the number of systems out there, whether someone is going to fry their computer playing EDO, particularly as we're now in summer, ambient temps are high and not everyone may have cleaned out dust or optimised their cooling. Not really a complaint, just an observation. My PC is getting too hot, and it's only EDO that does this. The fan speeds are really audible when playing and I can feel hot air blowing on my feet (the PC is under the desk) - I don't get this with any other game. FPS in settlements was dreadful: no real improvement for me in Patch 4, though concourses seemed a bit better (I hit 40 for the first time). My FPS is synced to 60 for the monitor.

On the plus side - last night I noticed "space" looked noticeably better, the game didn't crash once and I completed two missions. Instancing was rock solid (far better than horizons has ever been).

However, jumping back into Horizons to get the stats below ... the smoothness and general "togetherness" of that version of the game is striking when compared to EDO right now.

Space (Odyssey)

GTX 1070 70c 55% 4840MB
Fan Speed 1970
i7-8700 69c 29% 8774MB
FPS 60

Settlement (Odyssey)

GTX 1070 79c 85% 7222MB
Fan Speed 2965
i7-8700 79c 38% 8774MB
FPS 19

For comparison.... Horizons

Space (Horizons)

GTX 1070 58c 35% 3492MB
Fan Speed 622
i7-8700 41c 16% 6824MB
FPS 60

Surface base (Horizons)
GTX 1070 68c 39% 3492MB
Fan Speed 1120
i7-8700 54c 19% 6824MB
FPS 60

Cheers,

Drew.
Do not complain to Frontier about your hardware problems. If your CPU is at 100% usage, that's a good sign the software knows how to use your hardware; the problem is in your cooling system.
 
But that’s all we, the consumer, have to go on. It’s not unreasonable to expect games with the same recommended specs to have similar performance.

That's a different issue entirely. How the game looks or runs is poorly correlated with how demanding it is.

If the games require a water cooled rig because the standard cooling won’t cut it, that should be specified.

It doesn't.

Don’t blame our rigs because the game isn’t properly optimised.

If you can run anything on your rig that causes problems with heat, then something is wrong with your rig. It may be cooling, or it may be overly generous power limits.

You can blame the game for squandering the performance it extracts from hardware, for running like crap, or looking like crap, but a rationally configured system cannot be damaged or produce other significantly undesirable effects simply because software is asking more from it.

If all of your games want x watts, and you are comfortable with that level of heat output, then cap the card at x.
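On NVIDIA cards, capping the board at a chosen wattage can be done with `nvidia-smi`. This is a hedged example: the 220 W figure is made up, the supported min/max range is per-card (query it first), and the command needs administrator rights. AMD cards have analogous driver controls.

```shell
# Show the current, default, min and max power limits for the card
nvidia-smi -q -d POWER

# Cap the board power at 220 W (example value; must be within the
# card's supported range, and does not persist across reboots by default)
sudo nvidia-smi -pl 220
```

The card will then throttle clocks to stay under the cap, trading some performance for a fixed, predictable heat output.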

There's been a substantial amount of work getting encryption algorithms to run in the same amount of time regardless of the input, but not always producing the same amount of heat. If you can raise the temperature to the point where the part may or may not throttle depending on the input, you've got your timing attacks back. Faulty cooling is unsafe, just like faulty hardware.

You don't need faulty cooling for clocks to become thermally limited on most modern CPUs and GPUs. Virtually no modern consumer part is going to sit at its maximum clocks in anything but the lightest of loads, because even moderate loads will hit a current or temperature limiter and be throttled back, or simply not boost as far. As long as the parts are reaching minimum advertised speeds, they are still in spec.

Mitigating potential side-channel attacks that could be impacted by this is not even vaguely a priority for consumer hardware. The days of fixed clocks are over because it leaves too much performance on the table to be competitive. So, we have target temperatures and power limits, and the default is to use every bit of it, because that's how you get the most performance.

For many practical purposes, power and temperatures are what is fixed. Clocks and fan speeds are the dynamic components.

vsync'd to 60 fps

This is why Odyssey is hotter in space for you.
 