Is Odyssey bad for your GPU?

I have an RTX 2070 Super and have had it reach over 70 °C just sitting in a station hangar, although being anywhere on foot is the worst. I don't know much about graphics settings, but I went through the Odyssey settings and changed anything that said "Ultra" to "High", and anything that said "High" to "Medium". I can now sit in a station. I'm awaiting a Frontier fix before going out on foot.
 
Odyssey hasn't really bothered my 1080 GPU too much; it tends to float between 55 °C in space and rare peaks of 70 °C in really busy or bright concourses. But the i7-4770 CPU gets the absolute snot kicked out of it by this game.

It's a closed-loop water-cooled CPU, but the radiator fans aren't ideally positioned for airflow in warm weather and the PC is sitting in the warmest room in the house, so it tends to idle around 70 °C on the hottest summer days anyway. But Odyssey ramps that to a minimum of 85 °C, with regular excursions into the low-to-mid 90s and brief peaks into the high 90s in busy concourses. A couple of times at the height of the UK mini-heatwave it actually tripped the BIOS's over-temperature protection.

The only other program ever to have done that was the well-known resource-gobbler HandBrake, and in that case I had to set its affinity to a single core and leave it rendering overnight just to be safe. I can't imagine Odyssey would cope well with having 7 cores removed from its arsenal, although to be fair I haven't tried.
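For anyone who'd rather script that affinity trick than dig through Task Manager each time, here's a minimal sketch using Python's psutil library (the process name is an assumption on my part; check what Task Manager actually shows on your system):

```python
# Minimal sketch: pin a running process to a single core with psutil
# (pip install psutil). "HandBrake.exe" is an assumed process name;
# substitute whatever your Task Manager actually lists.
import psutil

for proc in psutil.process_iter(["name"]):
    if proc.info["name"] == "HandBrake.exe":  # assumed name
        proc.cpu_affinity([0])  # restrict the process to core 0 only
        print(f"Pinned PID {proc.pid} to core 0")
```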

I've no idea why Odyssey is so CPU-bound on this system when Horizons wasn't, but perhaps, as others have said, the GPU is silently throttling when it hits 70 °C and forcing the CPU to pick up the slack.

It's difficult to be certain with the changing ambient temperature being a factor, but anecdotally it feels as though each of the Odyssey patches has improved things marginally when it comes to both GPU and CPU load, so I have some confidence that further optimisations may improve things a little more. But the degree to which they can be improved is the real unknown. There may simply be an efficiency limit in the Cobra engine.

At this stage, with most of the hardware in this PC being 2013 vintage, I'd normally be looking at a full replacement. But that idea ended when I started looking at the prices.
 
Cyberpunk 2077 @ ultra with raytracing doesn't cane my graphics card as badly as EDO, which is why I've stopped playing for now.
Will see what it's like after patch 6 <fingers crossed>, tho' probably not coming back until performance, planet gen and lighting are fixed...
 
So, based on recent stories about the beta of the new Amazon MMO New World bricking some 3090 cards, particularly, it seems, EVGA ones (https://www.gamesradar.com/amazon-m...ly-bricking-some-geforce-3090-graphics-cards/), I'd say it is at least a hypothetical possibility that unoptimized games may in fact be harmful to certain cards.

I would bet, though, that it's more likely the card manufacturer's gear being faulty than some grand issue with all Nvidia cards or something dumb like that.

FWIW, my experience in the past few years is that usually when I have a problem with a game it is CPU-bound and not GPU-bound. I don't really know why that is, but I suspect the complexity of modern APIs makes it hard to know when some really hot code path is going to end up hitting the CPU or the GPU. Usually, I'm sure DX, Vulkan, etc. are going to ensure the GPU is doing absolutely the most work it possibly can for all workloads, but perhaps there's something slipping through the cracks?
 
Depends quite a bit on case cooling and ambient air temperature - but as I said elsewhere: the Odyssey main menu especially is almost as good as running FurMark™.

I can be a bit more specific on that: it can be a tad worse (better?) than FurMark, actually. Just today I plugged in a customer's card I brought home from work, reported as potentially faulty/unstable during gaming. Since I'll have no builds ready for the next few days and it's the weekend, what's better than carrying it home and putting it under real gaming stress?

The card is an RTX 2060 from Inno3D, so nothing exceptionally fancy cooling-wise. The first stress test with FurMark stabilized at 85-86 °C after a few minutes (closed case, a hot day and a Ryzen 5600X above 80 °C on its pitiful stock cooler, so far from ideal conditions)... then comes the Odyssey main menu at uncapped framerate, 1080p on high-ultra: 87 °C a few seconds after launch. Take that, FurMark!

(By the way, card seems to be working mighty fine for hours on end, the usual case of PEBCAK)

Capping the framerate at 60 fps helped a good bit; I found the card chugging along at around 70-75 °C after a while of playing (nav beacon hunting, though; I suspect any concourse would have been hotter).
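In case it helps anyone to see why a cap cools things down: uncapped, the GPU starts the next frame the instant the last one finishes, so it sits at 100% load even in a menu. A limiter just sleeps away whatever is left of each frame's time budget. A toy sketch of the principle (real caps live in the driver or engine; this is just the idea):

```python
# Toy sketch of a frame limiter: sleep off whatever remains of each
# 1/60 s budget instead of immediately rendering the next frame.
import time

TARGET_FPS = 60
FRAME_BUDGET = 1.0 / TARGET_FPS

def render_frame():
    """Stand-in for the actual draw work."""
    time.sleep(0.005)  # pretend a frame takes 5 ms to render

while True:
    start = time.perf_counter()
    render_frame()
    leftover = FRAME_BUDGET - (time.perf_counter() - start)
    if leftover > 0:
        time.sleep(leftover)  # idle time the GPU spends cool
```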

@Threeofseven - unless you are playing at 4K, the game seems to be doing something very wrong on your end. During those tests (1080p, FXAA, everything on high with terrain, bloom and materials on ultra), I saw the surface concourses going from the usual 25-30 fps of my 1060 to 40-45 at worst, bar minor dips of a few seconds, with most of the places and planet surfaces at a mostly stable capped 60. A 2080 Ti should literally fly by comparison. I don't envy whoever has to sort through this mess.
 
Ryzen 2700X // RTX 2060 // 32 GB DDR4 // MSI B450 Gaming Plus mainboard.

Odyssey in space: 65 °C.
Odyssey in station/planetside: 70-75 °C using an MSI Afterburner fan speed profile; otherwise it's much higher!

This is very bad programming on Frontier's end and should have been addressed in the very first few "Many fixes and improvements!" patches, but it was not.

Since I can heat my house and cook dinner on my GPU, I am giving Elite a wide berth until, hopefully, Frontier can do a Hello Games and bring Elite back to a better, playable state. With update 6 coming "soon™", I am highly doubtful of any improvement, as the past 5 updates have done very little. But I have a little faith left to hold onto.
 
Cyberpunk 2077 @ ultra with raytracing doesn't cane my graphics card as badly as EDO, which is why I've stopped playing for now.
Will see what it's like after patch 6 <fingers crossed>, tho' probably not coming back until performance, planet gen and lighting are fixed...
Yeah, ED:O is the only game right now that makes my PC sweat. I have to have a fan pointed at the tower if I'm playing it.
 
The story I heard about the MMO that was killing cards was that it was generating short surges of extremely high framerates at the menu screen. The surges were so short that the card's safety features did not pick them up, and so they effectively assumed that power consumption would be related to the average framerate - while the GPU melted itself to death.

One could argue that if this is the explanation, both the developer (and/or engine developer) and the hardware people messed up.

(Say what you want about EDO, that doesn’t seem to be its problem.)
 
PC hardware is no different from any other piece of kit you may own. Flogging it is not healthy for it. The absolute best thing you can do for your system as a whole is a good cooling solution. In a very simplistic sense, heat is the by-product of the work your system does. The hotter your system is, the harder it is being worked. EDO spiked my GPU hotspot temps to over 90 °C whilst idling at the bartender. This was repeatable.

I have been building my own systems for 20 years, and I qualified as a system and network engineer over 20 years ago (though I only worked in the field for a short period of time, preferring real-world pew-pew over screens as a young idiot). I'm a cooling nerd who runs a HAF (high airflow) setup. EDO abuses my GPU in a way that it shouldn't, and it's something I have been saying since the beta. The code is borked, and optimising it should have been the top priority. It hasn't been, so we have what we have today, with people whose systems should be eating this game without breaking a sweat cooking themselves to hit 50 fps.

The absolute best piece of freeware to monitor what is going on with your hardware is HWiNFO64; it's insane that it's free, so if you do use it, consider kicking a dono to the creators. Don't be afraid of all the numbers; it's easy to understand with a little reading into what it's telling you.
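If you'd rather log numbers over time than watch them, the same data can be polled from a script. On an Nvidia card, nvidia-smi (which ships with the driver) exposes it; a minimal sketch (the one-second interval is an arbitrary choice of mine):

```python
# Minimal sketch: poll GPU temperature and load once a second via
# nvidia-smi. The query fields are standard nvidia-smi properties.
import subprocess
import time

CMD = ["nvidia-smi",
       "--query-gpu=temperature.gpu,utilization.gpu",
       "--format=csv,noheader,nounits"]

while True:
    out = subprocess.run(CMD, capture_output=True, text=True).stdout.strip()
    temp_c, load_pct = out.split(", ")
    print(f"GPU: {temp_c} °C, {load_pct} % load")
    time.sleep(1)
```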
 
Every joule of energy converted from electricity to heat by your GPU is bad for your GPU. Every thermal cycle is bad for your GPU. Higher rates and amplitudes of any of these things are worse.

Is Odyssey going to promptly kill your GPU? Not unless your GPU already has a fatal defect, waiting to be exposed, or you're running something significantly beyond spec.
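For the curious, the standard textbook rule of thumb for the thermal-cycling part (the Coffin-Manson relation for solder-joint fatigue; a general engineering model, not something measured on these cards) puts expected cycles to failure as a power law in the temperature swing:

$$N_f = C\,(\Delta T)^{-q}$$

where $\Delta T$ is the amplitude of each heat-up/cool-down cycle, $C$ is a material constant, and $q$ is typically around 2 for solder, so doubling the swing roughly quarters the expected lifetime.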
 
One could argue that if this is the explanation, both the developer (and/or engine developer) and the hardware people messed up.
It's absolutely not a developer problem at all, other than their game being garbage for other reasons. It will fail in other games, not just New World. EVGA's terrible engineering is the problem; their cards often catch fire. Their 1080 FTW even caught fire while being reviewed.

[Image attachment: "runs a little hot.jpg"]
 
Bad for your GPU? Not really, no; if it gets too hot, it will just throttle its performance to keep the heat below its max safe temperature.

The problem right now is that people's GPUs are essentially working triple-time just to pump out what few frames they can render in Odyssey, especially in comparison to Horizons. Heck, I can get better framerates in Cyberpunk at max settings than I can in EDO.
 
I found that Odyssey is the coolest game I've run on my i7/RTX 2060 laptop, and I run the laptop on a cooling tray when gaming.

Space Engineers & NMS both run at least 10 degrees higher, both GPU & CPU. Recently I tried a beta of Dysterra and I was actually getting concerned at the high temps I was seeing; I'm not likely to buy that game because of it.
 
This may be a stupid question as I'm a PC hardware noob, but is Odyssey bad for your GPU? I have a 2080 Ti and I'm struggling to get over 35 fps at surface settlements - with frequent drops to 10-15 fps. My GPU fans sound like a jet engine and the temp hovers around 84-85 °C. Is this not terrible for the GPU?

I've pretty much stopped playing Odyssey for the time being due to abysmal performance. I'm worried that it will stress my GPU to the point of failure, which wouldn't be good with the GPU shortage at the moment.
Just as with everything else, if something is pushed to its limits it wears out faster or loses some of its performance. Oddity does not directly damage your GPU; it just accelerates the end for it. Some would say that is the same thing. If you still need your GPU for a few years, do not play Oddity on it.
 
I far prefer the 73 °C of my old 1060 to the 87 °C of the 2060 I just removed. The 25 fps loss is just collateral damage. 😅

Anyway, FS2020 didn't fare much better (as in, it fared exactly the same), so I guess, of all the many issues Odyssey carries with it, excessive heat build-up isn't necessarily one.

An interesting thing I noticed, though, was related to the CPU. Given that the stock Wraith Stealth cooler of the 5600X is absolutely unfit for the job, the difference in max temps between the two cards was notable: 85-86 °C with the 1060, a terrifying 97 °C max peak with the 2060. A good part of it surely was heat feedback inside the closed case (with the hotter 2060 saturating the dissipating capacity of the air circulation inside), but given that the peak was near-instantaneous, I also suspect a faster GPU might mean more draw calls (or whatever the graphics engine jargon is) for the CPU, hence more work and heat there too.
 
With my RTX 3080 and Odyssey limited to 60 FPS, the GPU has around 80% load as its maximum down at settlements, and a temperature of around 63 degrees C, which is very acceptable. If I remove the FPS limiter, it will of course rise to 100% load in order to crank out more FPS, but I'm happy with 60. I still see the occasional drop in frames down to 20, but I trust FDev is investigating this due to the number of people reporting the issues.
 
With my RTX 3080 and Odyssey limited to 60 FPS, the GPU has around 80% load as its maximum down at settlements, and a temperature of around 63 degrees C, which is very acceptable. If I remove the FPS limiter, it will of course rise to 100% load in order to crank out more FPS, but I'm happy with 60. I still see the occasional drop in frames down to 20, but I trust FDev is investigating this due to the number of people reporting the issues.
Your GPU is 2 generations above recommended. The fact that you get FPS drops AT ALL is concerning. Even worse that you need to cap at 60 fps.

They say you need a Lada, and you have a Ferrari. FPS should not be a concern whatsoever. The only concern would be to cap the FPS at 120 so you don't have a thousand or something.
 
I've stopped playing Odyssey because the frame rates are too low to be fun, and because it's summer and Odyssey overheats my machine. As Nvidia cards throttle back when the temperature gets too high, this probably helps to explain some of the bad frame rates - the card is being forced to render too much and overheats, so the driver throttles it and delivers even lower frame rates.
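You can check whether that throttling is actually kicking in rather than guessing: on Nvidia cards the driver reports its throttle reasons through nvidia-smi. A minimal sketch (the query field names are standard nvidia-smi properties, though an old driver may report them as unsupported):

```python
# Minimal sketch: ask the Nvidia driver whether it is currently
# thermal-throttling the GPU, via nvidia-smi's throttle-reason fields.
import subprocess

FIELDS = ("clocks_throttle_reasons.sw_thermal_slowdown,"
          "clocks_throttle_reasons.hw_thermal_slowdown")
out = subprocess.run(
    ["nvidia-smi", f"--query-gpu={FIELDS}", "--format=csv,noheader"],
    capture_output=True, text=True,
).stdout.strip()
print("Thermal throttling (software, hardware):", out)  # "Active" / "Not Active"
```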

I too can play CP2077 without these concerns.

I think FDev are seriously out of their depth when it comes to ground-based first-person rendering. Whilst the lighting and texturing were simple in Horizons, sitting in my SRV and driving around had no issues with heat or frame rates; I'm capped at 60 and it was never a problem. Now in Odyssey they are trying to catch up with PBR techniques (from around 2016) and modern rendering, and they can't even manage 30 FPS.
FDev's engine (Cobra) is not fit for purpose, and I seriously believe it's a non-trivial thing to bring it up to the level of Unreal or CryEngine or REDengine 4 or any other modern rendering engine.
I believe the Cobra engine was designed for space combat first.
 