EDO Performance low priority

Does anyone actually engage with the Odyssey content in stations? With the low frame rates, does anyone actually bother? If not, you might as well stick to Horizons.
 
Hey, is there a significant improvement in performance since the latest update? I last played the game in June; now, around 1-2 months later, have any of you experienced an improvement? Or is there something planned for the upcoming patches?
 
Hey, is there a significant improvement in performance since the latest update? I last played the game in June; now, around 1-2 months later, have any of you experienced an improvement? Or is there something planned for the upcoming patches?
Yes and no. A lot of people are having to tweak settings with the new options.

Some people are running on the same settings as before but doing better. Bit hit and miss!

Playable for me
 

Deleted member 182079

Depends on your system specs. On an RTX 3070 I'm currently seeing significant performance improvements with everything on normal, ultra and ultra+ (and VSync set to Fast). I've just visited a few settlements and most of the time I'm at 60 FPS or above (higher when I'm not recording). But I also see small downgrades in detail. Looks like we're in the middle of the polishing and optimising process right now. Play and see for yourself and don't just believe what other people say (me included, of course) :D
FSR is enabled by default. So if the average player didn't know better they'd think FDev finally optimised the game.

Sneaky, innit.
 
Hey, is there a significant improvement in performance since the latest update? I last played the game in June; now, around 1-2 months later, have any of you experienced an improvement? Or is there something planned for the upcoming patches?
It is hit or miss. Some things have gotten better (the UI doesn't suck as much, and some stations have improved), but you can hit some settlements with 4 mining lasers and fires raging, and the FPS drops to single digits.
 
I have a second gaming PC with an i5-4690, 16GB of RAM and a GTX 970, all running on a 46" NEC MultiSync monitor at 1920 x 1080.

Odyssey is fine in space, but I was getting red FPS numbers in stations even with most details set to low.

So I tried something with update 6: I changed the supersampling to 0.5x, set AA to SMAA, changed the upscaling to Ultra Quality, and started getting 50-60 FPS in stations. Yep, it looks a little grainy, and doubly so on that huge monitor, but it's much more playable and will do me until the optimisations (hopefully) show up.
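
For anyone wondering why that combination is so much faster but grainy, here's a rough back-of-the-envelope sketch. I'm assuming the supersampling slider and the FSR quality mode each scale the resolution per axis and that they stack; the 1.3x factor for FSR Ultra Quality is AMD's published number, but exactly how Odyssey combines the two settings is my guess:

```python
# Rough estimate of the internal render resolution (assumption: both the
# supersampling slider and FSR's quality mode scale each axis, and they stack).
OUTPUT_W, OUTPUT_H = 1920, 1080

SUPERSAMPLING = 0.5          # in-game supersampling slider
FSR_ULTRA_QUALITY = 1 / 1.3  # AMD's published per-axis scale for Ultra Quality

def render_resolution(width, height, *scales):
    """Apply per-axis scale factors to the output resolution."""
    for s in scales:
        width, height = width * s, height * s
    return round(width), round(height)

w, h = render_resolution(OUTPUT_W, OUTPUT_H, SUPERSAMPLING, FSR_ULTRA_QUALITY)
print(f"~{w}x{h} internal, {w * h / (OUTPUT_W * OUTPUT_H):.0%} of the output pixels")
# -> roughly 738x415, about 15% of the pixels shaded per frame before upscaling,
#    which is why it's so much faster, and why it looks grainy on a 46" screen.
```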
 
Here's what I found recently while playing in Ultra (but first, specs, and tl;dr at the bottom):

Intel(R) Core(TM) i7-6850K CPU @ 3.60GHz (6 cores, 12 threads)
64GB RAM (I don't remember the specs)
NVIDIA Corporation GP104 [GeForce GTX 1080] (rev a1) (465.31 drivers)
5.10.0-8-amd64 #1 SMP Debian 5.10.46-1 (2021-06-24) x86_64 GNU/Linux
Steam + Proton
1920x1080 resolution (the best my monitors can do)

Once things stabilize (I'll get to that later) after a fresh start (start Odyssey, enter scene), I get 150-240 fps in space (don't remember exact numbers). On/near the surface (in the black) in ship (and SRV?), usually 60-120 fps. On foot: 50-90fps. Being near large bio colonies will drop my FPS somewhat, but rarely below 40. In a station or settlement, I have seen 40-80fps on foot.

HOWEVER!!!!!

Every now and then (usually when approaching certain planets, sometimes a station), something breaks and my frame-rate plummets, giving me 8-15 on foot and 15-25 in ship. I say that something breaks because the only way to get decent frame-rate again is to restart Odyssey entirely (just relogging is not enough). This works even when landed on the planet that "broke" everything (I think in stations, too, but I got back to the bubble only yesterday, so not enough data). That said, one time this happened in a station, I noticed that culling is very definitely doing something: entering the Pioneer Supplies shop took my fps from ~15 to ~40, and looking out the shop entrance dropped it back to ~15.

As for things stabilizing....

Regardless of where I am when I log in (fresh from program start), Odyssey does a LOT of stuttering. I have noted that it's always a single thread (well, half core) pegging at 100% usage in bursts of several seconds while something is being calculated/decompressed/rendered in software... (I really don't know what, but more on that later), with short intervals between the bursts. Depending on the situation, this can last for a few tens of seconds to a few minutes (yeah, it sucks, I've learned to sort of live with it). This usually happens most times I disembark, too, though it will sometimes clear up if I disembark in a close enough place enough times (but if I travel far enough (another planet is definitely far enough), something gets reset and the disembarkation stutters return).

I've even had the above in Horizons on occasion (approaching a planet (sometimes) or megaship (most times, first time that session)).

I think the heavy work is related to textures and maybe font rendering. When I disembark, the HUD elements come in slowly (I guess as their textures are rendered). When I looked at the left panel, the suits (Maverick and Dominator; I was wearing the Artemis) took time to load in, as did the weapons. This usually cleared up for a while (not sure what resets it, wasn't paying attention).

I think it's something to do with terrain generation, because when whatever it is breaks, I get some stutter and I can see the planetary LOD doing strange things (this is when approaching the planet from a distance where the planet is getting to the size of the large DSS circle). However, this stutter is not always "fatal": sometimes the breakage clears and the frame-rate recovers. I am uncertain, but I think it's on planets with large areas of particularly pointy terrain as seen from 10-20Mm. However, of course, I really don't know.

As for how I know it's a single half-core pegging: I have a dual-monitor setup with Odyssey on one monitor and a bunch of other stuff on the other. One of those other things is an xterm running bpytop, which displays various system stats, including CPU usage/temperature (including the fact that Odyssey has 76 threads, but it still pegs just the one half-core... grrr (though, to be fair, many of those 76 threads could be Proton's Vulkan implementation of DX-whatever doing the right thing)).
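
If anyone wants to check the same thing on their own machine without a second monitor and bpytop, here's a rough sketch in Python using psutil. The process name is my assumption for what the Odyssey binary shows up as; adjust it to whatever ps / Task Manager actually shows you:

```python
# Rough per-thread CPU sampler (needs `pip install psutil`). The process name
# below is an assumption -- adjust it to whatever your Odyssey binary is
# actually called (check ps / Task Manager).
import time
import psutil

PROCESS_NAME = "EliteDangerous64.exe"  # assumption

def find_game():
    for p in psutil.process_iter(["name"]):
        if p.info["name"] == PROCESS_NAME:
            return p
    raise SystemExit(f"{PROCESS_NAME} is not running")

game = find_game()
prev = {t.id: t.user_time + t.system_time for t in game.threads()}
while True:
    time.sleep(2.0)
    now = {t.id: t.user_time + t.system_time for t in game.threads()}
    # CPU time each thread burned in the last 2 seconds, busiest first
    busiest = sorted(((now[i] - prev.get(i, 0.0), i) for i in now), reverse=True)
    print(f"{len(now)} threads; top 3:",
          ", ".join(f"tid {tid}: {dt / 2.0:4.0%}" for dt, tid in busiest[:3]))
    prev = now
```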

For those of you with better machines than mine (and not seeing the relevant performance difference), keep in mind that the differences between CPUs are mostly in core counts, cache sizes, memory access speeds and the like; single-thread speed hasn't moved nearly as much, so an O(N^3) algorithm will bring any CPU to its knees pretty quickly, especially if only a single thread is being used. And a 4K monitor (3840x2160) is pushing exactly 4x as many pixels as 1920x1080.
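
To put some toy numbers on that (nothing to do with what Odyssey actually computes, just an illustration of the scaling):

```python
# Toy illustration (nothing Odyssey-specific): a single-threaded O(N^3) step
# gets 8x more expensive every time the input doubles, which swamps the
# ~1.5-2x single-thread advantage a newer CPU gives you.
import time

def cubic_work(n):
    total = 0
    for i in range(n):
        for j in range(n):
            for k in range(n):
                total += i ^ j ^ k   # stand-in for "some per-element work"
    return total

for n in (50, 100, 200):
    t0 = time.perf_counter()
    cubic_work(n)
    print(f"N={n}: {time.perf_counter() - t0:.2f}s")
# Each doubling of N takes roughly 8x longer, no matter how fancy the CPU is.
```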

tl;dr:
When things are ok, Odyssey actually performs quite well (I consider anything over 40 fps to be plenty, I don't want to get into any arguments over that, though), especially considering how much more work it has to do near the surface (in space, Odyssey performs very well, usually over 180fps, almost always over 150fps (until things break :p ), so on par with Horizons). The poor performance seems to come from two things: poor asset loading, and something breaking and causing resource leaks (thus requiring a restart). On foot frame-rates are a completely different kettle of fish: Odyssey has to do much more work (you just don't see that sort of detail from your ship in Horizons).
 

Deleted member 182079

Maybe, but that's not what I do. I'm currently playing on extreme settings: everything on max, no FSR (aka normal), ultra/ultra+, even the expensive settings like both shadow settings and even Bloom set to ultra. These are the settings that not too long ago could even bring an RTX 3090 to its knees in surface installations. I'm on an RTX 3070, not exactly low end either, but in most places I get at least 60 FPS now and that's while recording which also costs me about 10-15 FPS. I've just uploaded 5 smaller videos which I'll link here once the processing to HD is complete. They are simple APEX transport missions and only on the largest of the installs does the performance go down to about 40 FPS. I'm pretty sure that optimisations had to be made in the last few days. I don't believe in miracles. In addition, after a few hours of play, I haven't had a single CTD today !YAY! whereas I've had about 3 - 5 or even more in the last few days.

Apart from the extreme settings to push my graphics card to its limits, the only other thing I've changed is Fast VSync in the NVIDIA Control Panel. So unless I'm a victim of some weird glitch that prevents the options UI from displaying the true settings, I'll just have to conclude that the improvements actually happened in-game (or more likely server side, as there was no other update since 6.01). I just know they don't always announce everything, not even the positive things. Many improvements have always been done quietly. And that's not something you can just glean from the usual forum drivel. You have to get your own hands dirty for that.

But a word about these settings and also about a possible future port to consoles. Who says that the game should run smoothly on all kinds of machines with ultra settings, including consoles? If you simply compare the purchase price of a high-end PC with the best console currently available, is that really still a realistic expectation?
I didn't suggest they made no optimisations at all - I leave FSR off as it looks terrible, and in the worst affected area of the concourses (planetary port, bar area) it no longer dips into the red FPS zone. It still runs like a dog though with low frames, jaggies and stutter.

So maybe I gained a few frames here and there; wow, well done after 3 months and 6 patches. For me, though, the vast bulk of any performance improvement comes from FSR, if I enable it.

I've locked the game at 45fps now to get a more consistent experience (even on an empty planet it sometimes drops a couple of frames to below 60). The biggest FPS hit seems to be NPCs. Empty settlements run smoothly; as soon as scavs appear it's a stutterfest again.

Anyways... I was thinking of creating an exasperation thread about the broken lighting, shadows and other visual glitches last night but decided against it - the game still looks very inconsistent across the board (unlike Horizons) and that is what is putting me off it first and foremost now. No idea whether those are bugs or intentional; it looks like FDev don't know either (first cockpits are too dark, now they've gone the other direction, implementing things like headless chickens without much thought), so whether they ever get fixed... Probably best not to hold my breath.
 

Deleted member 182079

Yes, the inconsistencies are quite strange. A lot of FPS drops can often be "fixed" at the moment by logging out to the menu and back in (like so many other things, currently). No idea what's behind it. I don't know enough about such things anyway, but I think only FDev has the tools to find the cause(s). As long as I see steady improvements, I'll remain cautiously optimistic.
I just made a post in the "Dying" thread; my personal problem is that Frontier may well fix performance, but by the time that happens I may have lost interest in playing the game for other reasons: firstly the visuals, which I think are still very problematic in many areas with a lot of work left to do (fwiw I thought and still think Horizons is the benchmark, as that still plays and feels great); and secondly gameplay [specific to Odyssey].

Anyways, back on topic, I watched @drew 's recent performance stream (it's a bit long at around 1hr 10mins) and it shows pretty well that the game has a long way to go - he stuck with the quality presets in the graphics menu (I tend to set it to Ultra and then tweak a few settings) and those presets didn't have as large an impact on performance as they should; the only thing that makes a significant difference is still resolution (which also explains why FSR can work so well for some of us). My own experience fiddling with the settings is in line with what his video showed, although I get better performance overall due to somewhat better hardware, so it's not a trainwreck anymore for me (patch 5 was the lowest point for me, as settlements became unplayable jank fests).

A lot done, much much much more left to do unfortunately.
 
Odyssey was released on May 19th. Since then, nothing has fundamentally changed in the overall impression of the DLC.
Console development is on ice, so they could now concentrate 100% on the PC version.
The question is, what does 100% mean? 5 developers and 3 community managers once a week?
Okay, it's vacation time, but time is simply running out. The longer this situation lasts, the lower the chances of making Odyssey a success.
To me it currently looks like a minimal-effort approach.

That's also logical, because too little money came in after the disastrous launch.
Damage control is almost pointless because it's almost a total loss.

I still keep my fingers crossed that it will work out.
 
Depends on your system specs. On an RTX 3070 I'm currently seeing significant performance improvements with everything on normal, ultra and ultra+ (and VSync set to Fast). I've just visited a few settlements and most of the time I'm at 60 FPS or above (higher when I'm not recording). But I also see small downgrades in detail. Looks like we're in the middle of the polishing and optimising process right now. Play and see for yourself and don't just believe what other people say (me included, of course) :D
Now I want to be clear: I'm very glad you are able to have a better experience now. Fun is what matters, especially with gaming.

But I also want to point out that the game lists a 1060 as the recommended spec, and Horizons could run at Ultra 1080p with triple digit fps with that card. So for you to have a 3070 (which is about equal to a 2080 Ti), which is two generations and half a dozen tiers above a 1060, and just barely get 60fps speaks volumes about how poorly the game still runs and how far it needs to go.

I would imagine most of the PC Elite community was not running on top-of-the-line GPUs and still isn't, so it just strikes me as odd when folks who have some of the best hardware you can buy in 2021 say performance is "fine" now when they're only just getting 60fps at 1080p. A 3070 should be absolutely obliterating Odyssey even at 4K, especially when you consider how dated Odyssey still looks compared to most recent games.
 
But I also want to point out that the game lists a 1060 as the recommended spec, and Horizons could run at Ultra 1080p with triple digit fps with that card. So for you to have a 3070 (which is about equal to a 2080 Ti), which is two generations and half a dozen tiers above a 1060, and just barely get 60fps speaks volumes about how poorly the game still runs and how far it needs to go.
I can't comment on a 1060, but Odyssey does still get triple digit FPS at 1920x1080 on my 1080... usually (sometimes something breaks and it drops down to 30-60, then 20. I think I've seen 10, all while in areas where I was getting 150+ minutes before, and the only way to "fix" it is to restart the program).

I think there are some nasty resource management bugs in Odyssey's rendering code. If some of those resources are compute shader tasks, well, that would bring any graphics card to its knees (orphaned tasks spinning their wheels while the rest of the card tries to get pixels to your screen). Another possibility is memory leaks on the card causing the card to go into memory crises (possibly swapping to system memory, which is quite expensive). More cycles drained away from the pixel pushers.

I do not expect Horizons-level performance when on foot in Odyssey, that is just impossible (so much more detail when on foot than even when out in the SRV in Horizons), though I do expect reasonable performance (see below). Expecting Horizons-level performance when in space away from any bodies is quite reasonable and just what I get... until something breaks because I got too close to the wrong body, and then I get "GL Quake on my Matrox G200" level (i.e., abysmal, ~13fps) performance, even in deep space. I'm not certain, but in-space performance might even be better in Odyssey than in Horizons (I hadn't found the FPS display until Odyssey was released, and I ain't going back).

My expectations for reasonable performance when on foot in Odyssey are not all that high, partly because I have a five-year-old 1080 (and an i7 Xeon, same vintage). I do get 40+, even in stations and settlements, sometimes quite a bit higher (I've seen over 70, maybe 90, out in the black) and down to about 25 (before things break; and even when they do break, just turning while in the Pioneer Supplies shop took the fps from 15 to 40, and back again), and having played Quake at 13fps, I'm OK with 25, very comfortable with 30-50, and anything more is icing on the cake. And all of the above is in Ultra. I'll eat the occasional frame-rate tankage just for the FSD effects!
 
I can't comment on a 1060, but Odyssey does still get triple digit FPS at 1920x1080 on my 1080... usually (sometimes something breaks and it drops down to 30-60, then 20. I think I've seen 10, all while in areas where I was getting 150+ minutes before, and the only way to "fix" it is to restart the program).

I think there are some nasty resource management bugs in Odyssey's rendering code. If some of those resources are compute shader tasks, well, that would bring any graphics card to its knees (orphaned tasks spinning their wheels while the rest of the card tries to get pixels to your screen). Another possibility is memory leaks on the card causing the card to go into memory crises (possibly swapping to system memory, which is quite expensive). More cycles drained away from the pixel pushers.

I do not expect Horizons-level performance when on foot in Odyssey, that is just impossible (so much more detail when on foot than even when out in the SRV in Horizons), though I do expect reasonable performance (see below). Expecting Horizons-level performance when in space away from any bodies is quite reasonable and just what I get... until something breaks because I got too close to the wrong body, and then I get "GL Quake on my Matrox G200" level (i.e., abysmal, ~13fps) performance, even in deep space. I'm not certain, but in-space performance might even be better in Odyssey than in Horizons (I hadn't found the FPS display until Odyssey was released, and I ain't going back).

My expectations for reasonable performance when on foot in Odyssey are not all that high, partly because I have a five-year-old 1080 (and an i7 Xeon, same vintage). I do get 40+, even in stations and settlements, sometimes quite a bit higher (I've seen over 70, maybe 90, out in the black) and down to about 25 (before things break; and even when they do break, just turning while in the Pioneer Supplies shop took the fps from 15 to 40, and back again), and having played Quake at 13fps, I'm OK with 25, very comfortable with 30-50, and anything more is icing on the cake. And all of the above is in Ultra. I'll eat the occasional frame-rate tankage just for the FSD effects!
When I talk about what is "acceptable" performance for any given hardware, it's usually within the context of what that hardware generally can get with other current or generally "modern" games. Cyberpunk is a good comparison I like to use, since even though it launched pretty poorly optimized, it was still getting notably better performance per-GPU than Odyssey is getting, and that's with Cyberpunk having FAR more advanced visuals. A 1060 6GB for example gets roughly 30-50fps in Cyberpunk at 1080p High settings, whereas in Odyssey it wouldn't be abnormal to get maybe 15-20 at its lowest point.

Your 1080, on the other hand, can run Cyberpunk at 1080p Ultra settings and keep 60fps minimum, whereas you have experienced yourself how low the fps can get in EDO, all while it looks notably worse overall than Cyberpunk.

Performance expectations are always relative. If someone tries playing a pixel style indie game with a 3070 and can only top out at 50fps for example, even if 50 fps is "acceptable" for that individual, it still means the game is terribly optimized considering what that GPU would get in other games.
 
When I talk about what is "acceptable" performance for any given hardware, it's usually within the context of what that hardware generally can get with other current or generally "modern" games. Cyberpunk is a good comparison I like to use, since even though it launched pretty poorly optimized, it was still getting notably better performance per-GPU than Odyssey is getting, and that's with Cyberpunk having FAR more advanced visuals. A 1060 6GB for example gets roughly 30-50fps in Cyberpunk at 1080p High settings, whereas in Odyssey it wouldn't be abnormal to get maybe 15-20 at its lowest point.
I won't comment on Cyberpunk because I haven't even seen it, let alone played it (and have little interest in doing so).

I was very specifically comparing Odyssey with Horizons, the only comparison that actually matters in this forum. In space, Odyssey's performance is on par with that of Horizons (maybe a little lower, maybe a little higher, not sure, I never looked at FPS numbers in Horizons) in general. Even on the ground in an SRV it seems similar, usually. There's no point trying to compare on-foot performance because there simply is nothing to compare (or if you want to force the issue, Horizons got exactly 0 fps).

Without something "breaking" (really, for all I know, it could be unbreaking and things should be that slow, but it seems to have something to do with terrain type), it's when near icy bodies that in-ship and SRV performance plummets (it's usually good near rocky bodies), and I know this is on the GPU because none of my cores go over 60% (usually around 40%). So why icy bodies? Those ice shaders. They're doing a LOT of work. Fresnel, subsurface scattering, reflectivity, translucency, and I don't know what else. In the "early days" of my explorations in Odyssey, I thought the lighting on icy bodies was really weird: this white glare surrounding a moving patch of colored ice, with some interesting blending between the two. Then I realized what was going on (partly because I noted that the border depended on the topography): the white glare is low-incidence reflection of light off the ice, maybe with some SSS, and the colored ice had translucency effects going, and of course there was a transition region between the two as the angle of incidence decreased. You see these effects to some degree even 1Mm above the surface. Horizons icy bodies never impressed me as being icy. Odyssey icy bodies scream icy. Unfortunately for Horizons-Odyssey performance comparison purposes, icy bodies are the norm. Fortunately for Odyssey-other games comparisons, most other games don't have icy bodies.
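
To illustrate the kind of angle-dependent blend I mean, here's a toy sketch using Schlick's Fresnel approximation (my guess at the general technique; I obviously have no idea what FDev's actual ice shader does):

```python
# Toy angle-dependent blend: head-on you mostly see the coloured, translucent
# ice; at grazing angles Schlick's Fresnel term dominates and the surface
# washes out into white glare. F0 ~ 0.02 is a typical value for ice/water.
import math

def schlick_fresnel(cos_theta, f0=0.02):
    return f0 + (1.0 - f0) * (1.0 - cos_theta) ** 5

def shade_ice(cos_theta, ice_colour=(0.4, 0.7, 0.9), glare=(1.0, 1.0, 1.0)):
    """Blend translucent ice colour with white glare by viewing angle."""
    f = schlick_fresnel(cos_theta)
    return tuple((1.0 - f) * c + f * g for c, g in zip(ice_colour, glare))

for deg in (0, 45, 70, 85):  # angle between the view ray and the surface normal
    r, g, b = shade_ice(math.cos(math.radians(deg)))
    print(f"{deg:2d} deg: rgb = ({r:.2f}, {g:.2f}, {b:.2f})")
# At 0 deg you get essentially the ice colour; by 85 deg the Fresnel term has
# shot up and the patch is mostly white glare -- the moving border between the
# two follows the terrain, which matches what you see flying over icy bodies.
```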

Then, you have to throw in atmospheric effects. In Horizons, you just could not get close enough to a body with atmosphere for there to even be a comparison.

Near settlements (regardless of current mode of transport), Odyssey's slugs really rear their ugly heads. And it really doesn't help that most settlements are on icy bodies. Odyssey's graphics look simplistic, but they're not, really. Well animated NPCs, with shields when things get interesting (and shields aren't free, that's a whole other mesh to render, transparently: expensive). Dozens if not hundreds of lights, all casting shadows, when the alarms go off. NPC flash lights, again all casting shadows, when it's dark. Fire, smoke, water. Ambient occlusion, (illusory?) radiosity, transparency, reflections...

I am not saying that Odyssey is performing anywhere near what it should be able to, it just isn't. There are a lot of slugs (bugs that make things go slow rather than produce errors) in there. How much Odyssey can be sped up without losing anything, I don't know, but I'm sure it can be sped up somewhat. How much Odyssey can be sped up while losing much of what it has, well, that's pretty obvious: a lot. I'd rather not lose anything, though.

As for how much my 1080 should be able to push, well, I actually have a pretty good idea: with 100-200 realtime lights, no shadows, no particles, minimal transparency, and very simplistic lighting calculations, about 400fps at 1920x1080 (Quake in Vulkan, only partially implemented). For reference, a different renderer (static light maps with a handful of real-time lights, horrible lighting calculations: essentially emulating Quake's software renderer in GLSL): 4000fps. So yeah, I have a pretty good (but not extensive) idea of what my card can do. I just need to get those pesky shadows working (it's actually management of stuff on the CPU holding me up), then I'll have an even better idea :p.

Some other interesting stuff I've done with my card includes smooth conic sections (orbit lines, about 40?, using fullscreen SDF) and full umbral/penumbral/antumbral planetary shadows, including rings casting shadows (don't remember if I did the cone stuff for them, or just parallel rays). Never noticed a slowdown for the orbit lines or shadows.

What does that mean for how well my card should be able to run Odyssey? I don't know. Can't know, really, because I don't know all of what Odyssey is trying to do. But I do know that expecting 400fps in Odyssey is out of the question :p (Odyssey has shadows, my renderer does not yet).
 

Deleted member 182079

When I talk about what is "acceptable" performance for any given hardware, it's usually within the context of what that hardware generally can get with other current or generally "modern" games. Cyberpunk is a good comparison I like to use, since even though it launched pretty poorly optimized, it was still getting notably better performance per-GPU than Odyssey is getting, and that's with Cyberpunk having FAR more advanced visuals. A 1060 6GB for example gets roughly 30-50fps in Cyberpunk at 1080p High settings, whereas in Odyssey it wouldn't be abnormal to get maybe 15-20 at its lowest point.

Your 1080, on the other hand, can run Cyberpunk at 1080p Ultra settings and keep 60fps minimum, whereas you have experienced yourself how low the fps can get in EDO, all while it looks notably worse overall than Cyberpunk.

Performance expectations are always relative. If someone tries playing a pixel style indie game with a 3070 and can only top out at 50fps for example, even if 50 fps is "acceptable" for that individual, it still means the game is terribly optimized considering what that GPU would get in other games.
These are also my observations (generally speaking when comparing CP2077 with EDO) - I'm on an RTX2060S.

Ultra settings, 1440p. I locked CP2077 at 48fps (144Hz monitor); it can sometimes drop to about 35ish, but that's rare and only when there are more than a few dozen NPCs in view at the same time (yes, several dozen, all wearing different and complex outfits), and I'm mostly sticking to the preset quality settings so could probably tweak this for better results. This is with ray-tracing enabled (and DLSS set to Quality - my card is considered entry level for RTX so the performance cost is pretty heavy). However, since I own a G-Sync monitor, the frame drops aren't that noticeable as the tech smooths things out a fair bit.

Without ray-tracing and DLSS I could run the game at Ultra locked at 60fps (maybe a few dips). 1080p would greatly improve performance but it doesn't look great, although not as bad as in Elite as anti-aliasing is usually much better implemented elsewhere.

In settlements in EDO I get anything from below 20 to high 40s (depending on whether NPCs are present), with a lot of jank which G-Sync can't seem to compensate for - probably because the fluctuations are too big.

The killer is - while EDO looks... mostly alright if a bit basic (if I ignore the awful jaggies even at 1440p with SMAA enabled), CP2077 can look absolutely glorious, especially the NPCs. Even the non-raytraced lighting is very pretty in that game. So yes, it's a bug fest, but at least I can see (and appreciate) why it's working my card so hard.
 