> Hi apparently I woke up today dumb ._.

That's okay. I like dumb Taravangian way better anyway.
> Does anyone actually engage with the Odyssey content in stations? With the low frame rates, does anyone actually bother? In which case, might as well stick to Horizons.

I disengaged with it so hard that the game isn't even installed on my PC anymore.
> Hey, is there a significant improvement in performance since the latest update? I last played the game in June, and now, around 1-2 months later, have any of you experienced an improvement? Or is there something planned for the upcoming patches?

Yes and no. A lot of people are having to tweak settings with the new options.
> Depends on your system specs. On an RTX 3070 I currently see significant improvements performance-wise, with everything on normal, ultra and ultra+ (with VSync set to fast). I've just visited a few settlements and most of the time I'm at 60 FPS or above (higher when I'm not recording). But I also see small downgrades in detail. Looks like we are in the middle of the polishing and optimising process right now. Play and see for yourself and don't just believe what other people say (me included, of course)!

FSR is enabled by default. So if the average player didn't know better, they'd think FDev finally optimised the game.
> Hey, is there a significant improvement in performance since the latest update? I last played the game in June, and now, around 1-2 months later, have any of you experienced an improvement? Or is there something planned for the upcoming patches?

It is hit or miss. Some things have gotten better (the UI doesn't suck as much, and some stations have improved), but you can hit some settlements with 4 mining lasers and fires raging, and the FPS drops to single digits.
> Maybe, but that's not what I do. I'm currently playing on extreme settings: everything on max, no FSR (aka normal), ultra/ultra+, even the expensive settings like both shadow settings and even Bloom set to ultra. These are the settings that not too long ago could bring even an RTX 3090 to its knees in surface installations. I'm on an RTX 3070, not exactly low-end either, but in most places I get at least 60 FPS now, and that's while recording, which also costs me about 10-15 FPS. I've just uploaded 5 smaller videos which I'll link here once the processing to HD is complete. They are simple APEX transport missions, and only on the largest of the installations does the performance go down to about 40 FPS. I'm pretty sure that optimisations had to be made in the last few days. I don't believe in miracles. In addition, after a few hours of play, I haven't had a single CTD today !YAY! whereas I've had about 3-5 or even more in the last few days.
>
> Apart from the extreme settings to push my graphics card to its limits, the only other thing I've changed is fast VSync in the Nvidia Control Panel. So unless I'm a victim of some weird glitch that prevents the options UI from displaying the true settings, I'll just have to conclude that the improvements actually happened in-game (or more likely server-side, as there has been no other update since 6.01). I just know they don't always announce everything, not even the positive things. Many improvements have always been made quietly. And that's not something you can glean from the usual forum drivel; you have to get your own fingers dirty for that.
>
> But a word about these settings, and also about a possible future port to consoles. Who says the game should run smoothly with ultra settings on all kinds of machines, including consoles? If you simply compare the purchase price of a high-end PC with the best console currently available, is that really still a realistic expectation?

I didn't suggest they made no optimisations at all - I leave FSR off as it looks terrible, and the worst-affected area of the concourses (planetary port, bar area) no longer dips into the red FPS zone. It still runs like a dog, though, with low frames, jaggies and stutter.
> Yes, the inconsistencies are quite strange. A lot of FPS drops can often be "fixed" by logging out to the menu at the moment (like so many other things, currently). No idea what's behind it. I don't know enough about such things anyway, but I think only FDev has the tools to find the cause(s). As long as I see steady improvements, I would remain cautiously optimistic.

I just made a post in the "Dying" thread. My personal problem is that Frontier may well fix performance, but by the time it happens I may have lost interest in playing the game for other reasons: first, visuals, which I think are still very problematic in many areas, with a lot of work left to do (fwiw I thought and still think Horizons is the benchmark, as it plays and feels great still); and second, gameplay issues specific to Odyssey.
> As long as I see steady improvements, I would remain cautiously optimistic.

I'm quite certain I've seen steady improvements (certainly in overall stability), so I'd say your cautious optimism is warranted.
> Depends on your system specs. On an RTX 3070 I currently see significant improvements performance-wise, with everything on normal, ultra and ultra+ (with VSync set to fast). I've just visited a few settlements and most of the time I'm at 60 FPS or above (higher when I'm not recording). But I also see small downgrades in detail. Looks like we are in the middle of the polishing and optimising process right now. Play and see for yourself and don't just believe what other people say (me included, of course)!

Now I want to be clear; I'm very glad you are able to have a better experience now. Fun is what matters, especially with gaming.
> But I also want to point out that the game lists a 1060 as the recommended spec, and Horizons could run at Ultra 1080p with triple-digit fps on that card. So for you to have a 3070 (which is about equal to a 2080 Ti), two generations and half a dozen tiers above a 1060, and just barely get 60 fps speaks volumes about how poorly the game still runs and how far it needs to go.

I can't comment on a 1060, but Odyssey does still get triple-digit FPS at 1920x1080 on my 1080... usually. Sometimes something breaks and it drops to 30-60, then 20 (I think I've seen 10), all in areas where I was getting 150+ fps minutes before, and the only way to "fix" it is to restart the program.
> I can't comment on a 1060, but Odyssey does still get triple-digit FPS at 1920x1080 on my 1080... usually. Sometimes something breaks and it drops to 30-60, then 20 (I think I've seen 10), all in areas where I was getting 150+ fps minutes before, and the only way to "fix" it is to restart the program.

When I talk about what is "acceptable" performance for any given hardware, it's usually within the context of what that hardware can generally get in other current or "modern" games. Cyberpunk is a good comparison I like to use: even though it launched pretty poorly optimized, it was still getting notably better performance per GPU than Odyssey is, and that's with Cyberpunk having FAR more advanced visuals. A 1060 6GB, for example, gets roughly 30-50 fps in Cyberpunk at 1080p High settings, whereas in Odyssey it wouldn't be abnormal to get maybe 15-20 at its lowest point.
I think there are some nasty resource-management bugs in Odyssey's rendering code. If some of those resources are compute-shader tasks, that would bring any graphics card to its knees (orphaned tasks spinning their wheels while the rest of the card tries to get pixels to your screen). Another possibility is memory leaks on the card, driving it into a memory crisis (possibly swapping to system memory, which is quite expensive). Either way, cycles are drained away from the pixel pushers.
I do not expect Horizons-level performance when on foot in Odyssey; that is just impossible (there is so much more detail on foot than even out in the SRV in Horizons), though I do expect reasonable performance (see below). Expecting Horizons-level performance in space away from any bodies is quite reasonable, and that's just what I get... until something breaks because I got too close to the wrong body, and then I get "GL Quake on my Matrox G200" level performance (i.e., abysmal, ~13 fps), even in deep space. I'm not certain, but in-space performance might even be better in Odyssey than in Horizons (I hadn't found the FPS display until Odyssey was released, and I ain't going back).
My expectations for reasonable performance on foot in Odyssey are not all that high, partly because I have a five-year-old 1080 (and an i7-class Xeon, same vintage). I do get 40+, even in stations and settlements, sometimes quite a bit higher (I've seen over 70, maybe 90, out in the black) and down to about 25 before things break (and even when they do break, just turning while in the Pioneer shop took the fps from 15 to 40, and back again). Having played Quake at 13 fps, I'm OK with 25, very comfortable with 30-50, and anything more is icing on the cake. And all the above is at Ultra. I'll eat the occasional frame-rate tankage just for the FSD effects!
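The VRAM-leak theory above can be sketched as a toy model: once a frame's working set no longer fits in video memory, the overflow is served from system RAM over PCIe, which is far slower than on-card memory, so frame times balloon. All bandwidth and memory figures below are illustrative assumptions, not measurements of Elite Dangerous.

```python
# Toy model of VRAM pressure: traffic that overflows card memory is served
# from system RAM over PCIe. Every number here is an assumption for
# illustration, not a benchmark of any real card or game.

VRAM_GB = 8.0            # card memory (assumed 8 GB card)
VRAM_BW_GBPS = 450.0     # on-card memory bandwidth (assumed)
PCIE_BW_GBPS = 14.0      # effective PCIe bandwidth to system RAM (assumed)

def frame_time_ms(working_set_gb: float, traffic_per_frame_gb: float) -> float:
    """Estimate frame time from memory traffic, splitting the traffic
    between fast VRAM and slow system memory in proportion to the overflow."""
    overflow = max(0.0, working_set_gb - VRAM_GB)
    slow_fraction = overflow / working_set_gb if working_set_gb > 0 else 0.0
    fast_traffic = traffic_per_frame_gb * (1.0 - slow_fraction)
    slow_traffic = traffic_per_frame_gb * slow_fraction
    seconds = fast_traffic / VRAM_BW_GBPS + slow_traffic / PCIE_BW_GBPS
    return seconds * 1000.0

# A leak that grows the working set past VRAM tanks the frame rate:
for ws in (6.0, 8.0, 10.0, 12.0):
    t = frame_time_ms(ws, traffic_per_frame_gb=3.0)
    print(f"working set {ws:4.1f} GB -> {t:6.2f} ms/frame ({1000.0/t:5.1f} fps)")
```

Even with these made-up numbers, the shape of the curve matches the symptom described: performance is fine right up to the VRAM limit, then collapses non-linearly as more of each frame's traffic crosses the PCIe bus, and only a restart (which frees the leaked memory) brings it back.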
> When I talk about what is "acceptable" performance for any given hardware, it's usually within the context of what that hardware can generally get in other current or "modern" games. Cyberpunk is a good comparison I like to use: even though it launched pretty poorly optimized, it was still getting notably better performance per GPU than Odyssey is, and that's with Cyberpunk having FAR more advanced visuals. A 1060 6GB, for example, gets roughly 30-50 fps in Cyberpunk at 1080p High settings, whereas in Odyssey it wouldn't be abnormal to get maybe 15-20 at its lowest point.

I won't comment on Cyberpunk because I haven't even seen it, let alone played it (and have little interest in doing so).
> When I talk about what is "acceptable" performance for any given hardware, it's usually within the context of what that hardware can generally get in other current or "modern" games. Cyberpunk is a good comparison I like to use: even though it launched pretty poorly optimized, it was still getting notably better performance per GPU than Odyssey is, and that's with Cyberpunk having FAR more advanced visuals. A 1060 6GB, for example, gets roughly 30-50 fps in Cyberpunk at 1080p High settings, whereas in Odyssey it wouldn't be abnormal to get maybe 15-20 at its lowest point.

These are also my observations (generally speaking, when comparing CP2077 with EDO) - I'm on an RTX 2060S.
Your 1080, on the other hand, can run Cyberpunk at 1080p Ultra settings and keep 60 fps minimum, whereas you have experienced yourself how low the fps can get in EDO, all while it looks notably worse overall than Cyberpunk.
Performance expectations are always relative. If someone tries playing a pixel-art indie game on a 3070 and can only top out at 50 fps, for example, then even if 50 fps is "acceptable" to that individual, it still means the game is terribly optimized considering what that GPU gets in other games.
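That "relative to the hardware" argument boils down to a simple ratio: fps in the game under test divided by fps the same GPU achieves in a comparable reference title. A minimal sketch, using the rough 1060 figures quoted earlier in the thread (not benchmark data) and Cyberpunk as the assumed reference:

```python
# Sketch of the "performance is relative" argument: judge a game by the
# fraction of a GPU's demonstrated capability it delivers, not by raw fps.
# The fps figures below are rough numbers quoted in the thread, not benchmarks.

def optimization_ratio(game_fps: float, reference_fps: float) -> float:
    """fps in the game under test divided by fps the same GPU gets in a
    comparable reference title (here, assumed to be Cyberpunk at 1080p)."""
    return game_fps / reference_fps

# GTX 1060 6GB: ~15-20 fps in Odyssey's worst spots vs ~30-50 in Cyberpunk.
low = optimization_ratio(15, 50)   # pessimistic pairing -> 0.30
high = optimization_ratio(20, 30)  # optimistic pairing  -> ~0.67

print(f"Odyssey delivers roughly {low:.0%}-{high:.0%} of what the same "
      f"card manages in a far more visually advanced game.")
```

The same ratio applied to the pixel-art example works identically: 50 fps from a 3070 that gets hundreds of fps in comparable 2D titles yields a small ratio, flagging poor optimization regardless of whether 50 fps feels playable.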