Uh, concern. Is this working as intended?

The video linked is from the start of the alpha... The reddit post is just confirmation that little optimization has been done since the alpha, and that there's no culling..

How do we go from there to 'completely broken'? My FPS only tanks when it hits a CPU limit.


EmmyV posts with a problem. I suggest she might be CPU-bottlenecked... like me..
I did check it when you suggested it, but it's not stressing. It's sitting safely at around 70% usage, actually a little less than Titanfall 2 uses on average. I even have Cortex cache optimization running. It's not a bad guess, though. I'll end up trying it on the rig I get tomorrow to see if that's any better.
 
concern

Concern levels are reaching critical mass. Prepare for imminent Quantum Concern Event. Protocol as follows:

Acquire alcoholic beverage.

Acquire bratwurst.

Acquire potato rolls.

Recharge grilling unit fuel supply.

Grill bratwurst.

Place bratwurst on potato rolls.

Consume alcoholic beverage and bratwurst while refreshing forums.
turns around scared
How the hell do you know what I'm doing?
 
I'm fairly skeptical that the game just has no occlusion culling. What method was used to test this? Is it replicable?
There's a post on reddit with screenshots from a graphics engineer who did the profiling on his own, and a video on YouTube with graphics analysis showing how it renders.
Also, common sense: why else would the game lag so terribly even on really good machines? I personally have an i7-10750H, 32 GB RAM, and a 2070 (a laptop), and the game does 25-30 FPS in settlements, and 30-50 on a plain surface on a planet with no settlements present (on the whole planet).
 
There's a post on reddit with screenshots from a graphics engineer who did the profiling on his own, and a video on YouTube with graphics analysis showing how it renders.
I don't have the time to look through it yet, but it's just such a bizarre claim. Even more bizarre if it's true. Releasing a game without occlusion culling is just unthinkable. How could it even run AT ALL without it, let alone to any acceptable degree in any situation? What reason could there possibly be to push a game out without occlusion culling? It raises more questions than it answers.
 
So, I checked... a bunch of stuff. I don't think this is something like the expected data-source issues. It's just. Chunky. Like, I've played a few chunky titles as of late, Mechwarrior 5 comes to mind, but this is chunky on a whole new chunk level. It is, if you will, the Chunkaga of the Chunk that is GTA V and the Chunka that is Mech 5. I am pretty convinced something's actually not working as intended, because I don't think Frontier would earnestly release something this broken and think it's done. This really smacks of some kind of horrid debugger being left on and gumbying up the works.

#RefundOdyssey

 
I don't have the time to look through it yet, but it's just such a bizarre claim. Even more bizarre if it's true. Releasing a game without occlusion culling is just unthinkable. How could it even run AT ALL without it, let alone to any acceptable degree in any situation? What reason could there possibly be to push a game out without occlusion culling? It raises more questions than it answers.
Really? It's easy: "d-effective" management and shareholders roaring to release something to lure in more money.
 
Yes, my CPU (a 4570S) is not powerful enough; it's below minimum spec. I have no problems getting 60 FPS when flying or landing on planets.
 
Really? It's easy: "d-effective" management and shareholders roaring to release something to lure in more money.
That's just it, though. Occlusion culling isn't some luxury feature that can be cut from a game to save time and money. The game HAS to have it or it won't work the way it's supposed to. In other words, it is NOT a viable product at all. It's a car pushed off the lot with no radiator. That's a time bomb waiting to brick people's hardware, because it's like running a FurMark burn-in test for hours at a time on average hardware.
 
That's just it, though. Occlusion culling isn't some luxury feature that can be cut from a game to save time and money. The game HAS to have it or it won't work the way it's supposed to. In other words, it is NOT a viable product at all. It's a car pushed off the lot with no radiator. That's a time bomb waiting to brick people's hardware, because it's like running a FurMark burn-in test for hours at a time on average hardware.
Yeah, well, it's not that bad (regarding the FurMark statement, not the game; the game IS THAT BAD), since it also stalls the CPU heavily with draw calls, so it simply cannot prepare enough frames for the GPU to render and display. The system isn't burning up like crazy: the GPU is chilling at 45% load with no work to do, while the CPU is like "omg I just don't have the time for this, need to address all of those draw calls, omg so many of 'em". So, no hardware threat.
But it is not something easily fixable. I don't think the people who made it like this in the first place have the skill to do it the proper way. And the industry is in serious hunger for genuinely skilled specialists, not yesterday's students or code monkeys. So I believe it is seriously fcked now, and it's going to be a really long wait until they fix it, if they manage at all.
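The draw-call argument above can be sketched as a toy cost model (all numbers and names here are made up for illustration, not measured from Odyssey or its engine): when every mesh is submitted as its own draw call, per-call CPU overhead dominates the frame and the GPU sits partly idle; batching or instancing the same geometry into far fewer calls hands the frame time back to the GPU.

```python
# Hypothetical per-call and per-triangle costs, chosen only to make the
# shape of the problem visible — not real measurements.
CPU_COST_PER_CALL_MS = 0.002   # driver/validation overhead per draw call
GPU_COST_PER_TRI_US = 0.0005   # GPU cost per triangle, in microseconds

def frame_time_ms(num_calls, total_tris):
    """Frame time under a naive model where CPU submission and GPU
    rendering overlap, so the slower side sets the frame time."""
    cpu_ms = num_calls * CPU_COST_PER_CALL_MS
    gpu_ms = total_tris * GPU_COST_PER_TRI_US / 1000
    return max(cpu_ms, gpu_ms)

# "DX9-style": one draw call per mesh -> the CPU is the bottleneck.
naive = frame_time_ms(num_calls=20_000, total_tris=10_000_000)    # 40 ms (~25 FPS)
# Batched/instanced: same triangles, 100x fewer calls -> GPU-bound again.
batched = frame_time_ms(num_calls=200, total_tris=10_000_000)     # 5 ms (200 FPS)
```

Under these invented numbers the naive path lands right in the 25-30 FPS range people report, while the GPU itself would happily do the same work in a fraction of the time — which matches the "GPU chilling at 45% load" observation.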
 
What I find odd about all this is that it's playable on my below-minimum-spec PC.

Intel Core i3-8100 CPU @ 3.60GHz
NVIDIA GeForce GTX 650
8 GB RAM

Not silky-smooth, admittedly, but playable. Settings default to "High" rather than "Ultra". So I guess it's not trying so hard.
 
What I find odd about all this is that it's playable on my below-minimum-spec PC.

Intel Core i3-8100 CPU @ 3.60GHz
NVIDIA GeForce GTX 650
8 GB RAM

Not silky-smooth, admittedly, but playable. Settings default to "High" rather than "Ultra". So I guess it's not trying so hard.
Unfortunately, it's not the FPS that makes Odyssey unplayable.
If only it were just the FPS, dude :(
 
What I find odd about all this is that it's playable on my below-minimum-spec PC.

Intel Core i3-8100 CPU @ 3.60GHz
NVIDIA GeForce GTX 650
8 GB RAM

Not silky-smooth, admittedly, but playable. Settings default to "High" rather than "Ultra". So I guess it's not trying so hard.
I'm running a 3440x1440 monitor so I expect it to be a bit challenged. But the difference between Horizons and Odyssey is enormous.
 
What I find odd about all this is that it's playable on my below-minimum-spec PC.

Intel Core i3-8100 CPU @ 3.60GHz
NVIDIA GeForce GTX 650
8 GB RAM

Not silky-smooth, admittedly, but playable. Settings default to "High" rather than "Ultra". So I guess it's not trying so hard.
Hey, I'm curious: that's a much lower-spec computer than mine and I can't run it on "medium", so do you happen to have any frame-rate data?
 
In the reddit post the dude clearly wrote about draw calls being used stupidly, DX9-style rather than DX11-style, wasting a lot of CPU time.
That's the reason.

CPU time being wasted isn't the main issue. Sure, some people with especially lopsided configurations will encounter CPU limits, but that's not where the lack of culling presents itself on most setups. Chances are all the excess depth-buffer modification documented is causing issues on the GPU side of things, in a manner where GPU cycles are idle not waiting for the CPU, but waiting for rendering-pipeline bottlenecks that are still on the GPU side of the chain.

An older presentation, but much of it is still relevant:

Regardless, it was very easy to rule out a CPU bottleneck on my systems by reducing CPU speed and observing that performance had not changed.

Yes, it's replicable. You can use NVIDIA Nsight to check for yourself.

A simpler test is to compare frame rates when looking at nothing versus looking at a complex scene.

In Odyssey, in the vicinity of settlements, the maximum differential I can produce between minimum and maximum FPS is about twofold. In areas with no settlement interiors to render, or in Horizons, it's closer to four- to five-fold. If you stare into a corner or straight up at an empty sky, the frame rate on an uncapped setup should skyrocket, but in Odyssey it's still drawing a bunch of crap you can't see whenever anything is in range... and the only reason for this to occur is a lack of culling.
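The "stare at a corner" test above boils down to a simple property of any culling renderer: the draw list should shrink when the camera looks away from the scene. A minimal sketch, with made-up 1-D geometry purely to illustrate the idea (no real engine works in one dimension):

```python
from dataclasses import dataclass

@dataclass
class Obj:
    x: float  # position along the camera's view axis (1-D for simplicity)

def draws_with_culling(objects, camera_facing):
    """Trivial 1-D 'frustum' test: only submit objects on the side
    of the camera it is actually facing."""
    return [o for o in objects if (o.x > 0) == (camera_facing > 0)]

def draws_without_culling(objects, camera_facing):
    """No culling: every object is submitted every frame, regardless
    of where the camera points."""
    return list(objects)

scene = [Obj(x) for x in (-5, -2, 1, 3, 8)]
# Facing the settlement (+1) vs. staring at empty sky (-1): the culled
# draw count changes with view direction; the unculled count never does.
```

With culling, uncapped FPS should spike when you face nothing; without it, the submitted workload (and thus the frame rate) barely moves, which is exactly the small min/max differential being described.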
 
Question. I finally booted up Odyssey and was stoked to try to play this shindig. Got in, started some tutorial mission, which, hey, a tutorial! And went through this clunky 12-frames-a-second kind of... thing. Where the combat was just 5 guys kind of swoogling back and forth until shot, like... honestly. Doom. Not 2016 Doom, or Doom Eternal, but the kind of Doom people put on their calculators to see if they can do it. Is... is this working as intended? Is... this the game? Like, f'real? Or is there some kind of settings tweak that gives them... I dunno, more AI bandwidth or something?
Hahaha. I honestly keep wondering why anybody would want to play a 2001-era FPS in ED. It's like they're trying to attract a new fanbase that doesn't actually want to play a spaceship game.
 
The video linked is from the start of the alpha... The reddit post is just confirmation that little optimization has been done since the alpha, and that there's no culling..

How do we go from there to 'completely broken'? My FPS only tanks when it hits a CPU limit.


EmmyV posts with a problem. I suggest she might be CPU-bottlenecked... like me..
Because the entire graphics pipeline is a wreck, aside from the lack of culling. It's not just that nothing is culled; it's rendering entire carriers and stations twice for no reason.
 