Possible common denominator for the performance issues

I have spent most of the evening trawling through the rather angry posts here, on Facebook, some of the actual bug reports, Steam "reviews", etc., and I may have found a common denominator that might interest the Frontier developers.

Those I could find that reported acceptable performance were running one of the following two classes of machine:
1 - Intel processors in the mainstream desktop range, LGA1xxx series sockets.
2 - AMD Ryzen on mainboards with Bxxx chipsets.

Those with problems whose mainboard platform I could discern had the following common denominators:
1 - Intel processors in the enthusiast/workstation range, LGA2xxx series sockets.
2 - AMD Ryzen on mainboards with Xxxx (mostly X570) chipsets.

There is some interesting stuff here: the problematic groups BOTH come with more PCI Express lanes, and the Intel HEDT platforms also have more memory channels.

They report insane GPU load compared to the frames per second they are seeing.

<speculation segment starts here>
Could this be a texture-streaming problem (moving textures from system RAM to GPU RAM) in the drivers or in Windows that misbehaves with PVR on that many memory channels and PCIe lanes? That the GPUs get overloaded while receiving the textures, creating a situation where the engine is waiting for a texture to load, but the GPU chokes on the upload?
</end speculation>
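
If anyone with an affected system wants a rough sanity check of the host-RAM-to-VRAM path, here's a minimal sketch of a pinned-memory upload benchmark. It assumes an NVIDIA GPU and PyTorch with CUDA installed, and it is nothing more than a generic transfer test - not Frontier's code, and not proof of anything on its own:

```python
# Rough host-RAM -> VRAM upload benchmark (a generic transfer test, NOT anything
# from the game). Assumes an NVIDIA GPU and PyTorch built with CUDA support.
import time
import torch

def upload_bandwidth_gb_s(size_mb: int = 256, repeats: int = 20) -> float:
    """Time repeated copies of a pinned host buffer into GPU memory, return GB/s."""
    n_bytes = size_mb * 1024 * 1024
    host = torch.empty(n_bytes, dtype=torch.uint8).pin_memory()   # page-locked staging buffer
    gpu = torch.empty(n_bytes, dtype=torch.uint8, device="cuda")
    torch.cuda.synchronize()
    start = time.perf_counter()
    for _ in range(repeats):
        gpu.copy_(host, non_blocking=True)                        # async DMA over PCIe
    torch.cuda.synchronize()
    elapsed = time.perf_counter() - start
    return (n_bytes * repeats) / elapsed / 1e9

if __name__ == "__main__":
    # PCIe 3.0 x16 manages roughly 12-13 GB/s in practice, PCIe 4.0 x16 roughly twice that.
    # Numbers far below that on an affected system would at least be consistent with a
    # transfer-path problem; normal numbers would point the finger elsewhere.
    print(f"Host -> GPU upload: {upload_bandwidth_gb_s():.1f} GB/s")
```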

Human insight: This can also explain part of the anger. These are users who are used to their systems being "a step above" the average gaming user.
 
I have a Ryzen 3600 on a B550 board and it still runs like garbage. I even installed a fresh copy of the latest Win 10 ISO on a spare hard drive with nothing on it but video drivers, to get as close as possible to a clean test bench.
 
True, but why is the same code buggier on some machines than others?
Try asking CD Projekt Red, because exactly the same thing happened with Cyberpunk: some people with inferior hardware were getting better performance than those with superior hardware.

And for reference, the performance issues in that game still haven't been fixed, despite it being out since last November.
 
I'm suspecting it has to do with moving textures from system RAM to video RAM. Something that has been giving programmers nightmares since the days of the 386 and the EISA bus, not helped by the 486 and its VESA local bus, nor by PCI or AGP, and most definitely not by PCI Express. All of these standards are a nightmarish maze of "keeping the historical design flaws" because some obscure piece of software somewhere in a mission-critical facility had used that flaw as a feature. Just trust me when I say that bus programming is a nightmare.

I sincerely miss the simple days of the 8080/Z80: 256 input/output ports and 65536 bytes of memory (including video memory in many cases, except for MSX, which used "trickery" and Z80 bank switching to get 16384 bytes of video memory on top of the 64K of RAM). Yes, we had to shoehorn code in very carefully, but it wasn't a historical clusterfu.. of keeping every mistake ever made in computer design while adding new features.

So yes, I can understand the frustration of the programmers.

Just as I hope the financial managers can understand the frustration of users watching the promise that Odyssey would run on the same hardware as Horizons fall faster than a brick dropped into Jupiter's gravity well. ;)

Back to topic:

I think the ONLY way the developers are going to get to grips with this is if the users agree to help them.

They actually NEED our help.

They need systems where the user has said "yes" to very detailed telemetry logging while launching Odyssey, loading in solo, disembarking, walking to the concourse and up to the concourse window, walking back to the ship, and boarding before quitting. This WILL give the developers an idea of which textures are causing this, and the only thing it will cost the frustrated users is five minutes of time.
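
Frontier would have to add the real engine-side telemetry themselves, but as a stop-gap, a user-side approximation of that exact walkthrough is easy to script. A minimal sketch, assuming Python with psutil installed; the checkpoint labels are mine, not anything official:

```python
# User-side stand-in for the walkthrough logging described above: run this in a
# second window and press Enter at each checkpoint while it samples CPU and RAM.
# Assumes `pip install psutil`; the checkpoint names are mine, not Frontier's.
import csv
import time
import psutil

CHECKPOINTS = [
    "launched, sitting in main menu",
    "loaded into solo, in cockpit",
    "disembarked",
    "concourse, elevator door opens",
    "concourse, at the window",
    "back in cockpit",
]

with open("odyssey_walkthrough_log.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["checkpoint", "unix_time", "cpu_percent", "ram_used_gb"])
    for name in CHECKPOINTS:
        input(f"Press Enter when you reach: {name}")
        writer.writerow([
            name,
            f"{time.time():.1f}",
            psutil.cpu_percent(interval=1.0),              # average load over 1 s
            round(psutil.virtual_memory().used / 2**30, 2),
        ])

print("Saved odyssey_walkthrough_log.csv - attach it to your report.")
```

It won't tell anyone which texture is at fault, but collected from a few hundred machines it would at least timestamp where the stalls happen on each hardware class.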

Then the collected data can be "AI-compared" - pattern-matched - to find what is causing this. It's the only way for them to get enough data on OUR problems.
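
The "AI" part doesn't need to be fancy - even simple grouping over a pile of reports would surface a common denominator if one exists. A toy sketch, assuming the reports were collected into a CSV whose column names (chipset, gpu, concourse_fps) I've made up for illustration, and that pandas is installed:

```python
# Toy comparison across collected reports. The CSV layout is hypothetical:
# the columns `chipset`, `gpu` and `concourse_fps` are invented for illustration.
# Assumes `pip install pandas`.
import pandas as pd

reports = pd.read_csv("odyssey_reports.csv")

# Median concourse FPS per chipset family: if a platform group really is the
# common denominator, it should stand out here.
by_chipset = (
    reports.groupby("chipset")["concourse_fps"]
    .agg(["count", "median"])
    .sort_values("median")
)
print(by_chipset)

# Same breakdown by GPU, to check the competing explanation.
print(reports.groupby("gpu")["concourse_fps"].median().sort_values())
```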
 
What you are describing is a QA tester; people who explicitly do repetitive actions to find and diagnose problems in a game.

Which is what the Alpha was supposed to be for.

An Alpha whose user feedback was promptly ignored, and the product shipped with all the same problems that the Alpha players tried to tell FDev about.
 
Ryzen 5 3600 with a B350 chipset motherboard, performance unacceptable.

Paired with a Vega 56 8GB and 32GB RAM, just to confirm I'm above recommended spec.
 
I know it's just one data point, but it might be useful as a comparison.

R5 5600X on an X470 platform (in-place upgrade from a 2600X)
16GB 3200MHz CAS 16 RAM in a 4x4GB config (dual channel, dual rank - it was just cheaper than 2x8GB sticks when I first built the system)
6800 XT in the primary PCIe x16 slot. Eyefinity - 3x 1080p FreeSync displays, effective resolution 5760x1080.

With stock Ultra settings and supersampling turned off;
Free flight : 144fps
Free flight near planets : 144fps
Near / In stations and station hangars : ~70 - 90fps, variable with station type.
On foot on planets or in flight at very low altitude : ~85-100fps
On foot or landing at small planetside facilities : ~70-80fps
On foot or landing at large or complex planetside facilities : ~55-75fps

Dropping resolution to straight 1920x1080 doesn't seem to change performance at all, so the bottleneck seems to be CPU or memory. The game was loading the last 4 logical processors very heavily instead of spreading the load around, but I'm not sure if that's just an engine limitation or something being off with the scheduler.
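
For anyone who wants to check that "last few logical processors" observation on their own machine, per-core load is easy to sample while the game runs. A minimal sketch, assuming Python with psutil installed:

```python
# Sample per-logical-processor load while the game is running, to see whether
# the work really is piling onto a few cores. Assumes `pip install psutil`.
import psutil

for _ in range(10):                                   # ten one-second samples
    per_core = psutil.cpu_percent(interval=1.0, percpu=True)
    hot = [f"cpu{i}: {load:.0f}%" for i, load in enumerate(per_core) if load > 80]
    print("busy cores:", ", ".join(hot) if hot else "none above 80%")
```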
 
11/10 marks for doing such research, I admire your diligence.

In the alpha I was using an enthusiast-grade PC with an i9-9940X, which, as you noted, sits on an LGA2xxx series socket, but it's currently down waiting on an RMA for the water cooler. So I'm trying the release on my laptop, which is on the mainstream LGA1xxx side, and both setups have utterly deplorable performance. So my use case bucks that trend...

...but maybe the fact that the i7-6700K is in a laptop explains why that one bucks the trend?

Texture streaming is a bad idea in general; GPUs have oodles of VRAM, so they should be able to park the textures there and be done with it. For example, my main PC uses a 1080 Ti - an admittedly older, but still fairly high-end card - with 11GB of VRAM; Horizons was ~20GB, so you could park half the game in there. Both my computers have 64GB of RAM, so I could put Horizons on a RAM drive and very nearly do the same for Odyssey, and TBH I'd rather do that than have unnecessary reads and writes to the SSDs eating into their serviceable lifespans. So if they are shifting textures back and forth from disk, I really hope they stop, as this will start to kill players' SSDs, which is far more serious than tanking framerates.
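
Whether the textures actually fit in VRAM is at least checkable while playing. A minimal sketch for NVIDIA cards only, since it just shells out to nvidia-smi's query mode:

```python
# Poll VRAM usage every few seconds while the game runs (NVIDIA only - it just
# calls nvidia-smi's query mode; first GPU only).
import subprocess
import time

for _ in range(12):                                   # about a minute of samples
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=memory.used,memory.total",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    ).stdout.splitlines()[0]
    used, total = (int(x) for x in out.split(","))
    print(f"VRAM: {used} / {total} MiB")
    time.sleep(5)
```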
 

Let's stop blaming hardware and call it as it is: bad software.

Let me tell you a story. I have an RTX 3090 and the game is freshly installed on an M.2 SSD. I set everything to High or Ultra, rightfully so, and then toned shadows down a notch - as a graphics engineer I know they are taxing for little quality benefit.

The game struggles to keep 60 FPS. On the fastest GPU available on the planet, with 24 GB of video memory. There are hiccups while assets stream in. On an M.2 drive hooked up directly to the CPU. All of this to produce image quality far below what recent games manage at much higher framerates.

It's not hardware. It's simply bad software.

I'm tempted to run it through PIX to see what the hell they are doing.
 
What you are describing is a QA tester; people who explicitly do repetitive actions to find and diagnose problems in a game.

Which is what the Alpha was supposed to be for.

An Alpha whose user feedback was promptly ignored, and the product shipped with all the same problems that the Alpha players tried to tell FDev about.
Pretty much. It's time for the developers and the users who actually WANT this game to succeed to take back control from the fiscal "we must release on that date" mentality, and make this thing actually WORK as it is supposed to. Given what I see reported, there MUST be some sort of common chokepoint that stops systems that are most definitely "inside spec" from running properly. As I said at the start, I suspect this is a bus issue (read: a Windows issue).
 
i7-7700K (LGA 1151), 16 GB RAM, GTX 1060 6GB: struggles to reach 60 FPS, drops to 20 at the fire in the tutorial. GPU constantly at 100% usage, VRAM filled to the brim.
i9-10900K (LGA 1200), 32 GB RAM, RTX 3090: 77 to 120 FPS, GPU usage 85% at the fire. It could presumably have done more (it's capped at 144 FPS), since neither CPU nor GPU reached 100% usage. 7.8 GB of 24 GB VRAM used.

Both at 1080p. For the GTX 1060 6GB I even had to drop settings from High.

The code definitely lacks optimization, and I don't think it's texture streaming that is the issue, but rather the attempt to render geometry that is obscured.
 
Amen to that! For all that I'm maybe having fun - OK, I am having fun posting pictures of dumpsters on fire - in general I'm saying that it will be good when it is fixed... And I'm actually waiting on targeted requests from Frontier: "we need to see how this style of star looks on this brand of graphics card", or "would Nvidia card users switch to driver version XYZ.AB and let us know how this version affects performance". But so far, this thread is the closest I've seen to such wide-scale targeted probing of the game, and I suspect that is what it will take to nail down the performance loss so many people are experiencing.
 
OK. It's obvious that my first theory is flawed. But. Right now WE are actually doing something constructive here.

Can someone page one of the moderators so we can get a dedicated thread where people post, following a template (a script for gathering points 1 and 2 automatically is sketched below):

1: Complete BASE system spec (in Windows 10: hit Windows+X, select System, then copy and paste)
2: Video card spec AND device driver version (I posted an example earlier in this subforum where I compared the newest nVidia Studio and Game Ready drivers and found less difference in FPS than the run-to-run variance)
3: FPS while sitting in the cockpit, after disembarking (please wait 5 seconds for it to stabilise), when the elevator door opens on the concourse, when standing in front of the concourse window, and when back in the cockpit
4: Which base it was done at

The reason for "back in cockpit" in point 3 is to catch things like we had in hotpatch 001, where on-foot FPS stayed stuck until you hyperspaced away.

If we do something like that, we MAY be able to give the developers a fair chance at fixing this - IF that thread can be kept free of both kinds of trolls (at this point the sycophants are as problematic as those who want this to fail for the pleasure of seeing others fail).

Anybody?
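
To make points 1 and 2 of the template painless, something like this could gather the basics in one go. A minimal sketch, assuming Windows 10 and Python with psutil installed (it uses wmic for the GPU name and driver version):

```python
# One-shot system and GPU spec dump for points 1 and 2 of the template above.
# Windows 10 assumed (wmic for the GPU name and driver version); needs psutil.
import platform
import subprocess
import psutil

print("OS   :", platform.platform())
print("CPU  :", platform.processor())
print("Cores:", psutil.cpu_count(logical=False), "physical /",
      psutil.cpu_count(logical=True), "logical")
print("RAM  :", round(psutil.virtual_memory().total / 2**30), "GB")

gpu = subprocess.run(
    ["wmic", "path", "win32_VideoController", "get", "Name,DriverVersion"],
    capture_output=True, text=True,
).stdout
print("GPU / driver:")
print(gpu.strip())
```

Paste the output straight into the template post.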
 
Yeah, targeted requests are exactly what it will take. When I tried working with one of their affiliated streamers to get such a project started, I was "overly negative" and "people were complaining about my attitude".
 
I'm away for a shower, but I'll jump on that request later tonight. You're doing good work, sir - I hope I can help.
 