Are FDev still optimizing Odyssey? Or is this it?

I was going to suggest the opposite (enable vsync), assuming OP's running on a 60 Hz monitor, but I definitely agree with limiting framerate to refresh rate, as this gives some headroom for processing more complex scenes. Digital Foundry did some tests, and certain games benefit greatly from a lower locked framerate. I do the same thing with MSFS, and it made a huge difference, practically eliminating stuttering. There's also a benefit from running cooler, which means less chance of thermal throttling.

Note - if you have a 144 Hz monitor, you'll want to set the framerate limit to 72 with vsync on. You have to do this in the config file, since 72 isn't an option in the settings menu. Having the framerate set to a divisor (no remainder) of the refresh rate with vsync on has always given me the best results, assuming a fixed refresh rate. 90 fps on a 60 Hz screen is a waste of processing power, and 90 fps on a 144 Hz screen is either going to have tearing (vsync off) or skipped frames (vsync on).
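
If it helps to see why the divisor rule matters, here's a rough Python sketch (my own illustration, not anything from the game). It assumes an idealised renderer that always hits its cap exactly, and a fixed-refresh 144 Hz monitor with vsync on and no G-Sync/FreeSync, then works out how many refreshes each frame ends up being held for:

import math

REFRESH_HZ = 144

def refresh_gaps(cap_fps, frames=8):
    # For each rendered frame, which refresh it gets shown on (rounded up to
    # the next refresh boundary), returned as the gap between consecutive frames.
    slots = [math.ceil((i + 1) * REFRESH_HZ / cap_fps) for i in range(frames)]
    return [b - a for a, b in zip(slots, slots[1:])]

print(72, refresh_gaps(72))    # [2, 2, 2, 2, 2, 2, 2]  every frame held 2 refreshes: even pacing
print(90, refresh_gaps(90))    # [2, 1, 2, 1, 2, 2, 1]  holds alternate between 1 and 2: judder
print(120, refresh_gaps(120))  # [1, 1, 1, 1, 2, 1, 1]  mostly 1 with the odd 2: judder again

With the 72 cap every frame sits on screen for exactly two refreshes, which is why it feels smoother than a nominally higher cap that doesn't divide 144 evenly.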
I use a 144 Hz screen and have Odyssey limited to 90 FPS. Out of the options in the game settings, 120 was too much because there are many places where the game can't keep a steady 120 fps (let alone 144), so I decided on a compromise of 90.
90 was the closest option in the game settings that still felt smooth, without drops/spikes in most scenes.

So far 90 has proved to be a good setting for me, and I can't say I've noticed any skipped frames (I play with vsync on).
I will try setting it to 72 just to see if I notice any difference.
 
I heard that the real issue is not GPU load; it's something to do with the way the engine is coded.
At present, on-foot CZ framerate is effectively hard-capped to ~55 FPS, whatever your hardware.
It seems to be something about the way CPU and/or memory are handled.
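
If anyone wants to sanity-check the "not GPU load" part on their own machine, here's a rough sketch (NVIDIA cards only, and it assumes nvidia-smi is installed and on your PATH) that just logs GPU utilisation once a second while you play. If the framerate is pinned around ~55 fps in a CZ but the GPU sits well below 100%, the bottleneck is somewhere else (CPU, memory, the engine):

import subprocess, time

# Logs GPU utilisation once a second. NVIDIA only; relies on the standard
# nvidia-smi query flags. Stop with Ctrl+C.
while True:
    util = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=utilization.gpu",
         "--format=csv,noheader,nounits"],
        text=True,
    ).strip()
    print(f"GPU utilisation: {util}%")
    time.sleep(1)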
 
I use a 144 Hz screen and have Odyssey limited to 90 FPS. Out of the options in the game settings, 120 was too much because there are many places where the game can't keep a steady 120 fps (let alone 144), so I decided on a compromise of 90.
90 was the closest option in the game settings that still felt smooth, without drops/spikes in most scenes.

So far 90 has proved to be a good setting for me, and I can't say I've noticed any skipped frames (I play with vsync on).
I will try setting it to 72 just to see if I notice any difference.
I have a 144 Hz screen, but I usually just run it at 60 Hz because most of my games won't do 144, and to be honest, I really can't tell the difference between 60 and 72 FPS (but I can tell the difference between 30 and 60 FPS). I'm not saying there isn't a difference, because I actually can discern a difference in smoothness when scrolling a page in my browser between 60 and 144 FPS, but framerate isn't a simple "more is better" equation. For example, the aforementioned 2 FPS is perfectly fine for viewing still images, or even very slowly moving images.

What matters is how far a pixel moves between one frame and the next, the ideal being just one pixel, but on a smaller monitor even skipping a few pixels may not be discernible. I notice this when I go from my smaller laptop screen to my big TV, where the physical distance a pixel moves is greater, and thus lower framerate looks worse than it does on a small screen, even at the same resolution.
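
To put rough numbers on that (made-up speeds, purely for illustration): something panning across the full width of a 2560-pixel-wide screen in two seconds jumps very different distances per frame depending on framerate.

def pixels_per_frame(screens_per_second, horizontal_res, fps):
    # How many pixels an object moving at the given speed shifts
    # between two consecutive frames.
    return screens_per_second * horizontal_res / fps

for fps in (30, 60, 144):
    step = pixels_per_frame(0.5, 2560, fps)   # half the screen width per second
    print(f"{fps:>3} fps: ~{step:.0f} px jump per frame")
# 30 fps -> ~43 px, 60 fps -> ~21 px, 144 fps -> ~9 px per frame.

The same ~21 px jump covers a bigger slice of your field of view on a big TV than on a laptop panel, which is the effect described above.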

For most games, 60 FPS is enough for me, even on the big screen. But I totally understand there are others who feel 60 FPS is too slow, especially competitive eSports players. Just remember your true framerate is limited by your monitor's refresh rate, despite what that little counter is telling you (so nobody out there is getting 300+ FPS, really).
 
What is the rest of your setup?
Either mine is acting very strangely or you are hiding something. I don't mean to be rude, but I have difficulty believing that a 1070 handles Odyssey in 1440p @ 60 fps on ULTRA all the time.
How about high-intensity surface conflict zones? Do you get 60 fps all the time?

Six-year-old self-build:

i7-6700K @ 4 GHz (no overclocking)
32 GB RAM @ 3200
GTX 1070 8GB

OS on 1TB NVMe - game files on 2TB NVMe - 6TB HDD

Ethernet connection to my home network, with full-fibre broadband

I have the game set to ULTRA with nothing altered, borderless display, vsync on

My 32" 4K monitor is connected via DP, but the game is set to 2560x1440 (sharp enough for me, and it saves the GFX card fans going daft at 4K)


I don't "do" on foot combat - I bought Elite Dangerous, not Call of Duty. ;) So I have no idea what would happen in a high intensity surface conflict zone - never been in one but as I said some settlement scenes do drop into the mid-50s.
 
I disagree. Odyssey was unplayable for me on a GTX 1070 8GB when it launched / escaped, but over time it has become perfectly playable frames-wise. Running ULTRA at 2560x1440 with v-sync on, the game is a stable 60 fps everywhere except certain (very few now) parts of concourses and settlements, where it drops into the mid-50s but is still smooth, with no stuttering at all any more.
I concur: my old 1070 runs Odyssey really well, and I only very rarely have anything slowing or stuttering.

O7
 
I don't "do" on foot combat - I bought Elite Dangerous, not Call of Duty. ;) So I have no idea what would happen in a high intensity surface conflict zone - never been in one but as I said some settlement scenes do drop into the mid-50s.
Ah, so that’s the secret of "perfectly playable" :)
Well, to be honest, that’s what I’m considering doing now: avoiding on-foot combat, because it simply performs terribly.
 
Ah, so that’s the secret of "perfectly playable" :)
Well, to be honest, that’s what I’m considering doing now: avoiding on-foot combat, because it simply performs terribly.

Like I said, I want a space game, so on-foot FPS is of no interest to me, and frankly I think the whole concentration of effort towards shootie-footie was a real distraction from the development of Elite. To me, the influx of (or diversion to) FPS coding messed up a lot of the game I enjoy. Just a trite example that really grips my jobbie: spaceships now fly in "teams", not "wings", just because the on-foot stuff uses "teams". If they had spent the money on actually fixing the game, Odyssey could have been spectacular. To be honest, Odyssey made me leave the game (well, discouraged me from playing much) for a long time; I only came back when I found this forum via the Zoo and read that improvements had made great strides.
 
I need to try CZs again … I think I just about hit 60-ish at 1440p and 45-ish at 4K around 6 months ago, with settings mainly on “high”. That’s on an i7-12xxx with an RTX 3070/8GB and a 4K/60Hz monitor.

I’ve since changed to an ultrawide (3440x1440) monitor with 144 Hz support and am currently getting 90-ish on planets and concourses with Ultra settings, apart from Bloom (off) and Shadows (High), so I’m hoping that CZs will cope without having to turn anything down.
 
I believe it was said that work to improve performance hadn't stopped, but that Odyssey was in a decent enough place that they could shift their focus to driving the narrative forwards. While I'm not putting too much hope/expectation into performance being substantially better, I would expect some performance gains from the November update. Moving forward, though, Frontier will have to do something about settlement/NPC performance if on-foot Thargs are part of the narrative pipeline.
 
FWIW, over the last few years when I was playing EDH, I did see a steady improvement in graphics and performance. I suspect that this is an ongoing effort, and nothing worth writing much about. With EDO's botched launch they had to freeze the narrative while they fixed up performance somewhat and axed the consoles. Now they are probably back on track and will go on with the narrative (and maybe new features). Too bad the hardware requirements are much higher than for EDH, but I expect them to keep improving performance, lighting, shadows, etc. What I really wish for is some kind of anti-aliasing that actually works. Sometimes the game looks spectacular, and sometimes absolutely awful.
 
Sorry to jump in here, but is it normal for you to mess around with your settings per game? I'm a console player (plug-in-and-play sort of person), and I've been reading some of the comments on various pages about settings and command files and deletes etc, so it's a genuine question from someone who is clueless. Thanks in advance.
 
I need to try CZs again … I think I just about hit 60-ish at 1440p and 45-ish at 4K around 6 months ago, with settings mainly on “high”. That’s on an i7-12xxx with an RTX 3070/8GB and a 4K/60Hz monitor.

I’ve since changed to an ultrawide (3440x1440) monitor with 144 Hz support and am currently getting 90-ish on planets and concourses with Ultra settings, apart from Bloom (off) and Shadows (High), so I’m hoping that CZs will cope without having to turn anything down.
Just run a couple: I'm getting anywhere between 80 and 110 FPS depending on what’s going on. With AMD FreeSync it all looks very smooth. 👍
 
Sorry to jump in here, but is it normal for you to mess around with your settings per game? I'm a console player (plug-in-and-play sort of person), and I've been reading some of the comments on various pages about settings and command files and deletes etc, so it's a genuine question from someone who is clueless. Thanks in advance.
Most (all?) games apply settings that they feel are appropriate for your system after installation. In most cases that will allow you to just play out of the box.

Also, if you have a fairly new build, the majority of games released around the same time can just be set to high/ultra and get good performance; that can remain the case for a few years, until more demanding games come along.
 
For what it is worth, I am under the vague impression that LOD tweaks between Update 12 and 13 may have made (...many, but not all...) planet surfaces need less supersampling than before to look quite OK (...on my Valve Index VR headset, which is ca. 15 pixels per degree).

If this is not all in my imagination, I hope update 14 will not make things worse again (...which is something that has happened before).
 
It's reasonable on my laptop, but others with uber PCs can't run it at 30 fps. We've gone back to the days of looking for the golden configuration, only this time it's much more expensive than playing around with config.sys and autoexec.bat files 😉
 
Six-year-old self-build:

i7-6700K @ 4 GHz (no overclocking)
32 GB RAM @ 3200
GTX 1070 8GB

OS on 1TB NVMe - game files on 2TB NVMe - 6TB HDD

Ethernet connection to my home network, with full-fibre broadband

I have the game set to ULTRA with nothing altered, borderless display, vsync on

My 32" 4K monitor is connected via DP, but the game is set to 2560x1440 (sharp enough for me, and it saves the GFX card fans going daft at 4K)


I don't "do" on foot combat - I bought Elite Dangerous, not Call of Duty. ;) So I have no idea what would happen in a high intensity surface conflict zone - never been in one but as I said some settlement scenes do drop into the mid-50s.
You do know the on-foot section isn't Call of Duty either. It's Elite Dangerous, but on foot.
 