Odyssey Optimization Needs to be Fixed

You blamed my hardware for the pixelation, so why don't you go out there and see for yourself? Can't land on Achenar 3? Try Achenar 2:

63Efnr4.png

(POI "Irregular Markers" near 31.5383° / 46.1187°)

Still in denial? Here's Achenar 1:

4q0Zk7h.png

(-33.3966° / 87.7814°, inside the large crater)

Terrain in Horizons never looks like that. For most planets in the game, Odyssey is a downgrade. The only improvement was the addition of atmospheric shaders, which is why all the pretty pictures are now taken at altitude, with a focus on hazy sunsets. But don't go too far from the surface, or you may see tiling.
This happens when you don't have enough RAM and the textures load low-res, pixelated ones... You're doing it on purpose too, since you're a well-known troll on Inara... stop opening 50 Google Chrome tabs...

It's way too easy to catch people spreading misinformation lol
 
This happens when you don't have enough RAM and the textures load low-res, pixelated ones...
In the case of the two bodies, they are correct - those bodies do not render properly (even on my 6900XT), but with Terrain Work at maximum they are nowhere near as dreadful as the example; I had to move the slider to a little over 25% for them to look that poor. Then again, one wouldn't expect them to choose bodies that rendered normally, would one?
 
Are you using FSR? That would help frame rates, since it renders at a lower internal resolution and upscales the result. I would hope that Frontier has specified a min-spec card that can run 60 fps at native 1080p.
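For reference, FSR doesn't lower your output resolution; it renders internally at a lower one and then upscales. Here's a quick sketch of what that means at 1080p, assuming AMD's published FSR 1.0 per-axis scale factors (the in-game mode names may differ):

```python
# FSR 1.0 upscaling: the game renders at a lower internal resolution,
# then FSR scales the image up to the output resolution.
# Per-axis scale factors below are AMD's published FSR 1.0 quality modes.
FSR_MODES = {
    "Ultra Quality": 1.3,
    "Quality": 1.5,
    "Balanced": 1.7,
    "Performance": 2.0,
}

def internal_resolution(out_w: int, out_h: int, scale: float) -> tuple[int, int]:
    """Internal render resolution before FSR upscales to (out_w, out_h)."""
    return round(out_w / scale), round(out_h / scale)

for mode, scale in FSR_MODES.items():
    w, h = internal_resolution(1920, 1080, scale)
    print(f"{mode:>13}: renders {w}x{h}, upscaled to 1920x1080")
```

So "Quality" at 1080p is really a 1280x720 render, which is where the frame-rate savings come from.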
I think a solid 30 fps 1080p (or maybe even 720p) is fine for minimum spec. I've been playing 30 fps for years on console, and assuming it's locked (no stutter, jitter, or tearing), it's completely playable, especially when games include motion blur. That said, I'm no longer a minimum spec kinda guy, and I personally insist on 60 fps for most of my games.
 
I think a solid 30 fps 1080p (or maybe even 720p) is fine for minimum spec. I've been playing 30 fps for years on console, and assuming it's locked (no stutter, jitter, or tearing), it's completely playable, especially when games include motion blur. That said, I'm no longer a minimum spec kinda guy, and I personally insist on 60 fps for most of my games.
Since Odyssey came out I've locked my fps at 30. Not only to stop it from changing too rapidly, which was a problem, but also because I've noticed that Elite runs my video card at 100% load everywhere - even in deep space - making it run way hotter than necessary (I have all settings on max, including supersampling=2, which is the only way to live without AA). I'm quite OK with 30 fps, I stopped noticing it very quickly, and my card no longer warms above 60 degrees Celsius, which I think is way healthier for it in the long run.
 
Since Odyssey came out I've locked my fps at 30. Not only to stop it from changing too rapidly, which was a problem, but also because I've noticed that Elite runs my video card at 100% load everywhere - even in deep space - making it run way hotter than necessary (I have all settings on max, including supersampling=2, which is the only way to live without AA). I'm quite OK with 30 fps, I stopped noticing it very quickly, and my card no longer warms above 60 degrees Celsius, which I think is way healthier for it in the long run.
Out of curiosity, are you doing this using in-game settings, or externally forcing it using something like the NVidia control panel?

When it comes to things like fast-paced combat, I much prefer 60 fps, but if I'm just out exploring, or even flying at a decent altitude over a planet in a straight line, I can live with a rock-solid 30 fps. Though on the topic of exploring - is the FSS POI resolve time still tied to framerate?
 
I think a solid 30 fps 1080p (or maybe even 720p) is fine for minimum spec. I've been playing 30 fps for years on console, and assuming it's locked (no stutter, jitter, or tearing), it's completely playable, especially when games include motion blur. That said, I'm no longer a minimum spec kinda guy, and I personally insist on 60 fps for most of my games.
I definitely notice the difference between 30 and 60 fps, although I agree we are talking min specs. From what I’ve heard, 30fps on a decent tv screen works pretty well, but not so well on a high speed monitor: mine is 144hz.
I mean, if it’s playable that’s what counts, but I thought the general consensus for an fps game was 60@1080p
 
Out of curiosity, are you doing this using in-game settings, or externally forcing it using something like the NVidia control panel?
Locked using in-game settings
That's basically good thinking, and I also take heat generation into account a lot in my settings. But I'm pretty sure one of the reasons why your card heats up so fast is the supersampling=2. That's just insane. I've tried SS=1.5 and already see a drastic loss in performance for a slightly blurrier result. I know the AA is terrible, but SS=2 renders four times as many pixels per frame (if I got that right) and that's definitely not worth it. It's like trying to swat flies with a sledgehammer (I mean, maybe you can)...
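To put numbers on it, here's a back-of-envelope sketch, assuming the slider is a per-axis multiplier (so SS=2.0 doubles both width and height):

```python
# Supersampling cost at 1080p, assuming the slider multiplies each axis:
# SS=2.0 doubles width and height, i.e. four times the pixels per frame.
def rendered_pixels(width: int, height: int, ss: float) -> int:
    """Total pixels rendered per frame at a given supersampling factor."""
    return round(width * ss) * round(height * ss)

base = rendered_pixels(1920, 1080, 1.0)  # native 1080p
for ss in (1.0, 1.5, 2.0):
    px = rendered_pixels(1920, 1080, ss)
    print(f"SS={ss}: {px:,} pixels ({px / base:.2f}x the work)")
```

Even SS=1.5 is 2.25x the pixel count of native, which lines up with the "drastic loss in performance" above.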
I know supersampling is mostly responsible and that's why I mentioned it. I've started small at first, but ss=2 makes the game so much better looking that I just can't play without it and prefer to go down with fps than play at lower setting.
...Although I'm not sure I understand what you mean by "drastic loss in performance for a slightly blurrier result". Supersampling makes everything sharper.
 
Locked using in-game settings

I know supersampling is mostly responsible and that's why I mentioned it. I've started small at first, but ss=2 makes the game so much better looking that I just can't play without it and prefer to go down with fps than play at lower setting.
Is that SS=2 with native or FSR enabled?
 
I definitely notice the difference between 30 and 60 fps, although I agree we are talking min specs. From what I’ve heard, 30fps on a decent tv screen works pretty well, but not so well on a high speed monitor: mine is 144hz.
Even 60 fps won't look right on a 144 Hz monitor unless you have adaptive refresh. You'll want either 48 or 72 fps with vsync, unless you don't care about tearing. I have a 144 Hz monitor that I purposely run at 60 Hz instead, so that my refresh matches my attainable 60 fps.
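The 48/72 figures aren't arbitrary; with vsync and no adaptive refresh, each frame has to be displayed for a whole number of refresh cycles, so only caps that divide the refresh rate evenly give smooth pacing. A small sketch of that:

```python
# Frame caps that divide a fixed refresh rate evenly. With vsync and no
# adaptive sync (G-Sync/FreeSync), each frame must persist for a whole
# number of refresh cycles, so only refresh_hz / n caps pace evenly.
def even_fps_caps(refresh_hz: int, min_fps: int = 24) -> list[int]:
    """All fps caps >= min_fps that divide refresh_hz with no remainder."""
    return [refresh_hz // n
            for n in range(1, refresh_hz // min_fps + 1)
            if refresh_hz % n == 0]

print(even_fps_caps(144))  # [144, 72, 48, 36, 24]
print(even_fps_caps(60))   # [60, 30]
```

Note that 60 is not in the 144 Hz list, which is exactly why 60 fps stutters on a 144 Hz panel without adaptive refresh.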

The irony is that the old CRT monitors did a much better job with lower framerates, because they had a built-in "motion blur". You can mimic the effect on newer TVs that have an "enhanced motion" feature (i.e. the soap opera effect), but that feature does introduce a bit of latency, which can be felt in faster-paced games.

I mean, if it’s playable that’s what counts, but I thought the general consensus for an fps game was 60@1080p
I know that 60 is the new 30 on the latest gen consoles, but I don't think PS5 and new XBox are promoted as "minimum spec" hardware.
 
That all depends on the PC graphics settings you have. Most people I know want to slap everything on ultra, but on today's PCs slapping everything on ultra will only give you sad faces. Most developers test on high-end PCs, not the typical stock PC everybody else buys. They also try to future-proof their games for the next generation of hardware.
 
In the case of the two bodies, they are correct - those bodies do not render properly (even on my 6900XT), but with Terrain Work at maximum they are nowhere near as dreadful as the example; I had to move the slider to a little over 25% for them to look that poor. Then again, one wouldn't expect them to choose bodies that rendered normally, would one?
Deciat 1:
XNQ1r6z.png


Deciat 2:
SleUjZ3.png


Deciat 3:
VFOYp2g.png


Deciat 4:
tLIWxIP.png


Deciat 5:
JYASkez.png


You see, I'm not cherry-picking. Odyssey uses low-resolution blend textures literally everywhere. Sometimes it's less noticeable, because the detail textures being blended together don't vary much in colour or brightness. But when they do, the result looks like Minecraft. Once you see it, you can't unsee it.

This happens when you don't have enough RAM and the textures load low-res, pixelated ones... You're doing it on purpose too, since you're a well-known troll on Inara... stop opening 50 Google Chrome tabs...

It's way too easy to catch people spreading misinformation lol
Oh really? How much RAM do I need to make Odyssey look as good as its predecessor? 16+8 GB not enough?

Look, we both know there's a reason why you are not posting any screenshots: on your screen the game looks just as crappy as it does on mine.
 
Odyssey's broken optimization needs to be fixed urgently.

Even with above-average settings, I still suffer from frame drops, freezes and extremely low fps in stations and settlements.

This is something that needs focused correction, because besides alienating new players and even veterans, it creates a "bad reputation" for the game. And it's not just me saying this; the community itself says it when rating the game on Steam, Epic and other platforms.

In terms of content, immersion, fun and the like, I can't criticize ED: Odyssey, as the game and its expansion give me all of that. But with the optimization problems it currently has, I wouldn't recommend buying the game to anyone.

One experience I've had recently really put me off even playing it. After a group of friends (8 players) bought ED: Odyssey, within a few hours they all asked for a refund, precisely because they had played ED: Horizons without any optimization problems, while in ED: Odyssey the problems become annoying and break the whole experience.
Before Update 12 I eventually reached a steady 85-90 fps in stations (locked in settings, all EDO settings on ultra), a little less in ground settlements... Post Update 12 I'm back to 35-45 fps in stations like Jameson's, and well sub-40 fps in ground settlements... not consistent, but back to bloody annoying. (Also the re-emergence of occasional 1-2 fps lock-ups too)... just BAHHH!
 
Out of curiosity, Achenar 1 on my system:
LwEE3MX.jpg

PNZSJDh.jpg

tQq9OqN.jpg

sRIhwRS.jpg

RepAQtw.jpg


Ah, really? Not that I have tried it yet, but I thought we could run FSR at a higher internal resolution.
You can't use both. It's either/or

You can use both, just not without manually editing the game options or applying SS via some other method (DSR/VSR, for example).

16+8 GB not enough?

If you want the game to never throw out LODs at maximum settings, you want 10 GiB+ of VRAM.
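That 10 GiB is a rule of thumb, not an official Frontier figure. For a rough sense of scale, here's a back-of-envelope sketch, assuming BC7-compressed 4K textures (1 byte per texel) with a full mip chain adding roughly a third on top; the actual formats and sizes Odyssey uses are not public:

```python
# Back-of-envelope VRAM budget for textures. Assumptions (mine, not
# Frontier's): BC7 compression at 1 byte per texel, plus ~33% extra
# for the full mip chain.
MIP_OVERHEAD = 4 / 3  # full mip chain adds roughly one third

def texture_mib(size: int, bytes_per_texel: float = 1.0) -> float:
    """Approximate size in MiB of one square texture including mips."""
    return size * size * bytes_per_texel * MIP_OVERHEAD / 2**20

budget_mib = 10 * 1024          # the 10 GiB figure from the post
per_tex = texture_mib(4096)     # ~21.3 MiB per 4K texture with mips
print(f"{per_tex:.1f} MiB each -> ~{budget_mib / per_tex:.0f} 4K textures in 10 GiB")
```

At ~21 MiB per 4K texture, 10 GiB holds only a few hundred of them at full resolution, which is why the engine starts dropping LODs on smaller cards.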
 