Are FDev still optimizing Odyssey? Or is this it?

Thanks for the info. I will try to update with a shot of the cores individually. What polling rate would be better suited?

To get a clear and direct picture of CPU utilization, you'd need to use something based on ETW (Event Tracing for Windows), which can achieve sub-millisecond granularity without much overhead.

MSI AB's default polling method cannot be set to an interval that low, and would have prohibitively high CPU utilization itself if it could. That said, 100-200ms should still show many spikes.

Low GPU utilization by itself, provided the frame rate is below any caps involved, is very strong evidence of a CPU/memory bottleneck.
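That rule of thumb can be sketched as a tiny classifier. This is purely illustrative: the 97% saturation threshold and the function/parameter names are my assumptions, not anything from FDev or a monitoring tool.

```python
# Illustrative sketch only (thresholds and names are assumptions):
# if the GPU isn't near saturation and the frame rate isn't sitting at a
# cap (vsync/limiter), the bottleneck is almost certainly CPU/memory side.
def likely_bottleneck(gpu_util_pct, fps, fps_cap=None, gpu_saturated_at=97.0):
    """Rough classification of a single utilization sample."""
    if fps_cap is not None and fps >= fps_cap - 1:
        return "capped"        # a limiter/vsync is the ceiling, not hardware
    if gpu_util_pct >= gpu_saturated_at:
        return "gpu"           # GPU is the limiting factor
    return "cpu/memory"        # GPU is idle-ish and waiting on the game loop

print(likely_bottleneck(65.0, 45))              # cpu/memory
print(likely_bottleneck(99.0, 45))              # gpu
print(likely_bottleneck(70.0, 60, fps_cap=60))  # capped
```

The point of the cap check is the "below any caps involved" caveat above: a frame limiter makes low GPU utilization meaningless as evidence.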
 
I have few problems playing Odyssey.

i5-4690K overclocked to 4.5GHz
Radeon RX 580 8GB
32GB DDR3
ED installed on a Crucial MX500 1TB SSD.

Playing at 2560x1440 @ 144Hz on a 30-inch LG FreeSync LCD.

Yeah, FPS aren't "great"... but it's playable.
 
I pop into ED on occasion and fly around for a bit; unfortunately it doesn't hold my attention the same as its predecessors. It's gone more like the Sims craze, where it's played for a few days after a new patch and then we wait for the next one 🤔
 
Whilst this is possibly an issue, I see the pop-in as well... it does not bother me that much (I have other things to whine about, so it's not a hill I'm gonna die on)... but I do see it, and I have 24GB of GPU memory, so one would think GPU RAM is not my issue.

As for the original trailer... IF that is the case, it was hugely misleading then. Didn't FD state it was in-game footage? WIP, subject to change, etc., and I am not suggesting any legal issues or anything like that... but FD strongly intimated those trailers were a true reflection of the game at that time.

If they were piecing together some sort of photo mode to make a video, that makes FD disingenuous imo... perhaps not as bad as Aliens: Colonial Marines or the PS3 bullshots, but still dishonest imo.
My point was that for a while after Odyssey was released, when you dropped out of SC near a station, it wasn't jerky; it was smooth, like when the rebels come out of hyperspace in RotJ. The station appeared as a small sphere and grew rapidly as you closed those last few hundred kilometers, and the station interior detail LOD scaled smoothly with it, not with the "pop-in" it has now, for me. Can they get back to that? Would be nice if so.

And the pre-release trailer, showing a smooth working game, was them playing the game, but not attached to the rest of the network (I'm guessing!).
As I've said a few times, during Odyssey's patch 6 or 7 they had a server-side issue for a day or so, where Odyssey was somehow separate from Horizons, and it ran smooth as a bowling ball. Then they "fixed" the server-side issue and, bam, back to 22 fps ground CZs. I suspect that's partially why they want to unhinge Horizons.
 
Has anyone tried the Ryzen 5800X3D?

@Morbad (?)

Yes. It's noticeably less of a bottleneck than the 5800X.

Source: https://www.youtube.com/watch?v=C553i_cQE1o


Source: https://www.youtube.com/watch?v=Ltdktvkfxr8


And it's not just AI that's an issue. The entire main game loop/renderer is bottlenecked by the memory subsystem. ~70 more fps at the beginning of the suit tutorial on the 5800X3D.

Won't change GPU limited scenes much, as the damaged and powered on command center interior shows, but if you're not at high-90s GPU utilization, faster CPU/memory will help.
 
What are your settings?

I have a similar setup with a 5600X, and at 30fps the RX 580 struggles and howls even on standard planets.
I used an RX 580 8GB with a 2700X, and at mid settings (with some set to high) it ran fine. It could get 50-60fps on atmospherics, no lower than 30fps in a busy settlement, and around 40fps in stations.
 
Aggregate CPU utilization doesn't say much of anything; the game is limited by the main game loop and render threads, which will max out two, maybe three, logical cores.

To see the clear CPU bottleneck directly, you'd need to look at per-core utilization. You'd also need a much faster polling rate than 2500ms, which is averaging out spikes.

A good rule of thumb is that any time your GPU utilization falls below the upper-90s percent, you are bottlenecked by CPU or memory performance. The main exception is if your GPU doesn't have anywhere near enough VRAM (your RTX 2070 is fine).
New poll: 100ms (the best MSI Afterburner can do).
The yellow circles are where I looked directly down at the ground; the rest is where I looked straight up.
Looking into the sky lowers GPU usage, but does nothing to frametimes.
The flickering at the end is when I quit ED before taking the screenshot.
 
I cut my teeth on Geoff Crammond's F1GP and Stunt Car Racer, which ran at single-digit frame rates. The worst, however, was Wing Commander on the A500; that was just not playable.
My A1200 improved things a bit, but it was still very low double digits on Wing Commander (Frontier: Elite II was a big improvement, however).
An 030 @ 50MHz made a good improvement to Frontier, but the magic was an 040 @ 50MHz that came with my Blizzard PPC card.. ;)
 
An 030 @ 50MHz made a good improvement to Frontier, but the magic was an 040 @ 50MHz that came with my Blizzard PPC card.. ;)
I saved an A1200 from a skip at work. 16MB of RAM, big HDD. Can't remember if it had an 030 or 040 accelerator in it, but in its day it would have been the Amiga I dreamed of. (I had an A500, a stock A1200 with an 80MB HDD, and in a moment of madness bought a CD32.) None held a candle to the skip machine, however. Ultimately I had no room for it, but I found it a good home at least, rather than the skip.
 
I saved an A1200 from a skip at work. 16MB of RAM, big HDD. Can't remember if it had an 030 or 040 accelerator in it, but in its day it would have been the Amiga I dreamed of. (I had an A500, a stock A1200 with an 80MB HDD, and in a moment of madness bought a CD32.) None held a candle to the skip machine, however. Ultimately I had no room for it, but I found it a good home at least, rather than the skip.
What a find that was! I towerized my A1200 (Eyetech), but after an HD crash I took it apart while investigating, and it's been like that ever since; I've never got it back up and running. :( Though when it was running it was great, as I got the BVision card too. I would love to have a stock A1200. I still have my Blizzard 030, and I think that would be a great machine to run my old Amiga games, or even get one of those new boards that are like 200MHz 060 power or something, maybe one day.. I did get a cheap CD32 when they were being liquidated, but that's in the loft at my mum's place in England, along with my A500.
 
Here the resolution is scaled down to illustrate the issue: high frametimes without full load on the GPU.

Any spikes to 100% CPU utilization while GPU utilization is low are evidence of a CPU limitation, but it's never going to be readily apparent on the CPU utilization graphs with thread affinity bouncing around faster than the polling rate.

Regardless, lack of CPU power itself is probably less of an issue than the cache/memory subsystem. Faster RAM would help, a faster CPU with bigger caches would help more.
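The affinity-bouncing effect can be modeled with synthetic numbers (the 8-core count and migration-every-sample pattern are assumptions for illustration): one saturated thread hopping between cores makes every per-core average look healthy, even though some core is pegged at every instant.

```python
# Toy model of "affinity bouncing": one saturated game thread migrates to a
# different logical core each sample. Per-core averages look low, but the
# max across cores in every sample is 100% — the bottleneck hides in averages.
NUM_CORES = 8
samples = []
for t in range(80):
    core_loads = [0.0] * NUM_CORES
    core_loads[t % NUM_CORES] = 100.0   # the busy thread lands on one core
    samples.append(core_loads)

# What a per-core utilization graph averages toward over the whole capture:
per_core_avg = [sum(s[c] for s in samples) / len(samples)
                for c in range(NUM_CORES)]
# What is actually true at every sample: some core is fully saturated.
busiest_each_sample = [max(s) for s in samples]

print(per_core_avg[0])           # 12.5 — looks like plenty of headroom
print(min(busiest_each_sample))  # 100.0 — some core is pegged every sample
```

This is why per-core graphs at a slow polling rate can show eight calm-looking lines while the game is hard CPU-limited.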
 
Any spikes to 100% CPU utilization while GPU utilization is low are evidence of a CPU limitation, but it's never going to be readily apparent on the CPU utilization graphs with thread affinity bouncing around faster than the polling rate.

Regardless, lack of CPU power itself is probably less of an issue than the cache/memory subsystem. Faster RAM would help, a faster CPU with bigger caches would help more.

At first I thought a new GPU could help bring up the FPS, including around stations and such (I play a lot in VR).

Looking at the graphs, though, got me confused, hence the previous post with the graph about a week ago.

Should I decide to upgrade, the best path would then be to FIRST improve the CPU, as it seems ODY eats clock cycles and instructions for breakfast.

Being on the AM4 platform with 3200MHz RAM, I do like the prospect of the 5800X3D as the final upgrade on that socket. There could well be a good number of years left in that chip. The last upgrade, to the 2600X, was in December 2019.
 