Is anyone who runs Odyssey in a position to try this and see if there is a performance change?

There is no contradiction there, since the brain's perception of two different things can of course be different; you can very easily Google that.
Many people have been asking why 24 fps is the standard for movies and TV.
Because it's a different medium.
In a movie, you just watch. There is no input on your part, and you don't expect an answer to that input.

When you play a game, you have inputs, i.e. you move the camera and perform actions. Your brain is simple: it expects an immediate (or what feels like immediate) answer to those inputs. When that answer doesn't come fast enough because the FPS is too low, it's clearly visible, i.e. the game is choppy.
 
You're quite mistaken if you believe that common sense and knowledge that has been validated over decades could be overturned by scientific results that are about another thing entirely. Maybe you're still missing that maximum image-processing speed is an entirely different thing from the minimum image frequency the brain needs to interpret a sequence of images as fluid movement.
Didn't I just say they were two separate things? Are you just pulling my leg here?
 
Didn't I just say they were two separate things? Are you just pulling my leg here?
You said they are different things based on their relevance for different applications, not based on their very definition. On the contrary, you previously implied they would share a definition, since you suggested there was a contradiction. Maybe I have missed a post of yours, but I couldn't find anything else from you on that.
 
Because it's a different medium.
In a movie, you just watch. There is no input on your part, and you don't expect an answer to that input.

When you play a game, you have inputs, i.e. you move the camera and perform actions. Your brain is simple: it expects an immediate (or what feels like immediate) answer to those inputs. When that answer doesn't come fast enough because the FPS is too low, it's clearly visible, i.e. the game is choppy.
It also affects your input.

Edit: That's a daft comment, I'll go and play the game for a while.. ;)

To make it less daft, latency is a real problem and it's not nice to shoot at people if you can't aim properly..
 
You said they are different things based on their relevance for different applications, not based on their very definition. On the contrary, you previously implied they would share a definition, since you suggested there was a contradiction. Maybe I have missed a post of yours, but I couldn't find anything else from you on that.
Ah right, instead of arguing a case with scientific sources I should've gone for the logical fallacies jugular. Thank you, it's been mildly entertaining.
 
Back to the point: performance isn't consistent in game.
Use of the GPU differs.

E.g. the GPU is at 99 or 98% in stations, where I managed to get nearly 60 fps at 4K.

The GPU is being utilised at 65% at settlements and FPS is about 45.

Weirdly, the CPU is no higher than 16% used.

Why is ED not using all of the GPU's potential all the time?
 
Ah right, instead of arguing a case with scientific sources I should've gone for the logical fallacies jugular. Thank you, it's been mildly entertaining.
It serves no purpose to argue with scientific sources when you refuse to understand their results. Also, you committed the fallacies while I didn't. This is why I could point out your mistake with a simple logical argument (the distinction was based on different attributes, and non-equivalence does not imply equivalence), while you couldn't.
 
Back to the point: performance isn't consistent in game.
Use of the GPU differs.

E.g. the GPU is at 99 or 98% in stations, where I managed to get nearly 60 fps at 4K.

The GPU is being utilised at 65% at settlements and FPS is about 45.

Weirdly, the CPU is no higher than 16% used.

Why is ED not using all of the GPU's potential all the time?
May I ask what your CPU and GPU are?
You see the point: GPU load is not a reflection of the entire graphics card being used.
There is a bus, there is video memory, etc.
 
Yeah, just that 60+ fps is totally unnecessary. Your eyes are able to see 30-60 fps. For me it seems fluid at 30 fps.

For me, fluidity keeps improving until at least 90 fps, and even for most people there are benefits, of increasingly diminishing returns, to be had way past that.


That 13 ms mostly comes from the fact that it was the lowest duration they could test with the limited hardware on hand...random people were already doing well better than chance at the lowest test duration (13 ms) in some of the tests. They could have gone way lower than 13 ms before they found the floor where no one was able to extract meaningful information. If they had trained the participants, or featured more simplistic (but still meaningful) tests, the frame duration would have had to be vanishingly short before no one was able to extract useful information.

Anyway, this discussion (and that citation) crop up fairly frequently...

Maybe you're still missing that maximum image-processing speed is an entirely different thing from the minimum image frequency the brain needs to interpret a sequence of images as fluid movement.

The floor for fluid movement isn't the same as the ceiling for useful motion information, nor anywhere near the point where latency ceases to be an issue. It's even further from where motion blur stops being perceptible, or where other undesirable artifacts cease.

True, those results from 2014 merely found an upper bound for this kind of optimization.

They didn't come anywhere near the upper bound of anything other than the 75Hz display they were using.

The thing is that our brain creates an illusion of motion when the images come just fast enough, around 24 fps.

The illusion of motion--where we stop being able to count individual frames--is easily the least relevant of any criterion related to frame rate in interactive media.

I'm playing a game where my performance matters. I'm not looking to be fooled into thinking I'm seeing smooth motion at the cost of a loss of detail information. I'm looking for perceptibly perfect smoothness, while maintaining as crystal clear an image as possible, as a baseline. The more motion information I can extract beyond that, the better.

Nowhere is 24, 30, or 60, or even 77 fps ideal for me. For me 30 fps is not even close to smooth, absent blur (and I strive mightily to eliminate blur). 60-70 fps is perfectly tolerable, frame interval consistency permitting, when it comes to delivering what I feel is smooth motion, but I can still benefit from far more.

It also affects your input.

Edit: That's a daft comment, I'll go and play the game for a while.. ;)

To make it less daft, latency is a real problem and it's not nice to shoot at people if you can't aim properly..

Yes.

There is a reason phones tend to have high refresh rate displays and even higher input sampling rates...almost everyone and their grandmother can detect touch-and-drag latency problems (which are even more apparent than, but not far removed from, mouselook and aiming latency) at very low levels.

Source: https://www.youtube.com/watch?v=vOvQCPLkPt4


Back to the point: performance isn't consistent in game.
Use of the GPU differs.

E.g. the GPU is at 99 or 98% in stations, where I managed to get nearly 60 fps at 4K.

The GPU is being utilised at 65% at settlements and FPS is about 45.

Weirdly, the CPU is no higher than 16% used.

Why is ED not using all of the GPU's potential all the time?

Because you are CPU (or memory) bound virtually any time GPU utilization falls below ~99%.

Looking at total aggregate CPU utilization with a 1000-2000 ms polling rate says almost nothing about how the CPU is being utilized. You need per-core figures and much faster polling rates to directly detect CPU limitations. Even then, memory bottlenecks can still cause dips in CPU utilization while the GPU is waiting on the CPU.
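
To illustrate the difference, here's a minimal sketch of my own (assuming the third-party psutil package; the 90%/50% thresholds are purely illustrative) that polls per-core utilization at 100 ms, where a single pegged core stands out even when the aggregate figure looks idle:

```python
# Minimal sketch: fast per-core CPU polling with the third-party psutil
# package, to spot a single saturated core that an aggregate task-manager
# figure would hide.
import psutil

POLL_S = 0.1  # 100 ms; much faster than the 1-2 s of a typical task manager

try:
    while True:
        # One utilization percentage per logical core over the last interval.
        per_core = psutil.cpu_percent(interval=POLL_S, percpu=True)
        avg = sum(per_core) / len(per_core)
        peak = max(per_core)
        # A pegged core alongside a low average suggests a main/render-thread limit.
        note = "  <- possible single-core bottleneck" if peak > 90 and avg < 50 else ""
        print(f"avg {avg:5.1f}%  peak core {peak:5.1f}%{note}")
except KeyboardInterrupt:
    pass
```

Run something like this alongside the game: a 16% aggregate reading is entirely consistent with one core sitting at 100% while the rest idle.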
 
They didn't come anywhere near the upper bound of anything other than the 75Hz display they were using.
I didn't read it that thoroughly, but if that is true, this is a really poor study, considering 1000/13 ≈ 76.923.

The illusion of motion--where we stop being able to count individual frames--is easily the least relevant of any criterion related to frame rate in interactive media.

I'm playing a game where my performance matters. I'm not looking to be fooled into thinking I'm seeing smooth motion at the cost of a loss of detail information. I'm looking for perceptibly perfect smoothness, while maintaining as crystal clear an image as possible, as a baseline. The more motion information I can extract beyond that, the better.

Nowhere is 24, 30, or 60, or even 77 fps ideal for me. For me 30 fps is not even close to smooth, absent blur (and I strive mightily to eliminate blur). 60-70 fps is perfectly tolerable, frame interval consistency permitting, when it comes to delivering what I feel is smooth motion, but I can still benefit from far more.
Considering human reaction times of 200+ms, you are greatly overestimating the relevance of fps, especially for slow-paced games such as ED.
 
That 13 ms mostly comes from the fact that it was the lowest duration they could test with the limited hardware on hand...random people were already doing well better than chance at the lowest test duration (13 ms) in some of the tests. They could have gone way lower than 13 ms before they found the floor where no one was able to extract meaningful information. If they had trained the participants, or featured more simplistic (but still meaningful) tests, the frame duration would have had to be vanishingly short before no one was able to extract useful information.
Yes, I'm aware it wasn't the most ideal example, but I thought it would suffice to at least convey the general idea that 30 FPS isn't really the upper limit. Unfortunately, everything took a turn for the worse from there. I feel nothing but regret for bringing it up.
 
I didn't read it that thoroughly, but if that is true, this is a really poor study, considering 1000/13 ≈ 76.923.
There (have fun reading) NASA says that 120 fps is required for their pilots; 60-90 fps is not enough.
There was US military testing that showed some pilots had 200+ fps reaction times. But it was some time ago and I can't find a credible source (the original link is dead).

Gamers are trained to see FPS, and esports players more so. Probably not as much as jet fighter pilots, though.
Considering human reaction times of 200+ms, you are greatly overestimating the relevance of fps, especially for slow-paced games such as ED.
EDO is not "slow-paced". It has an FPS component, which is arguably one of the most fast-paced things in video games.
Slow-paced games where you can have low FPS and not notice are things like city-building sims or grand strategy games.
 
Didn't read it so thoroughly, but if that is true, this is some really poor study, considering 1000/13 ≈ 76.923.
Rounding. The actual frame duration on a 75 Hz display is 13⅓ ms, since 1000 / 13⅓ = 75.
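For reference, this is just the refresh-rate-to-frame-duration conversion (restating the arithmetic above):

$$ t_{\text{frame}} = \frac{1000\ \text{ms}}{f_{\text{refresh}}}, \qquad \frac{1000\ \text{ms}}{75\ \text{Hz}} = 13\tfrac{1}{3}\ \text{ms} \approx 13\ \text{ms}. $$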
Considering human reaction times of 200+ms, you are greatly overestimating the relevance of fps, especially for slow-paced games such as ED.
ED has combat, which isn't slow-paced.
Slow-paced games where you can have low FPS and not notice are things like city-building sims or grand strategy games.
Definitely noticeable there too! Especially as it gets worse as the game goes on. An inconsistent framerate is worse than a low framerate.
 
Definitely noticeable there too! Especially as it gets worse as the game goes on. An inconsistent framerate is worse than a low framerate.
Noticeable usually, but less important. Reaction time in EU4 is not exactly as important as in an FPS, and you won't miss a headshot there if you are laggy. Plus you can always pause :D
And games like Dwarf Fortress just "slow down", since they are essentially turn-based in real time, so you don't even notice (besides the fact that the game is slow).


FPSes, on the other hand, are very demanding. A poor frame rate will make you miss more often, turn slowly, and so on, and you will die.
 
EDO is not "slow-paced". It has an FPS component, which is arguably one of the most fast-paced things in video games.
My mercenary rank might indicate I've tried it a bit. And sorry, but I know a lot of FPS games to compare it with, and on-foot ED gameplay is more a bots-and-shields simulator RPG than anything else. (Also, peer-to-peer networking renders any fps debate for PvP absurd.)
I know fast-paced games; those are MOBAs or actual PvP first-person shooters, but they also easily run at 120 fps on my machine.
 
My mercenary rank might indicate I've tried it a bit. And sorry, but I know a lot of FPS games to compare it with, and on-foot ED gameplay is more a bots-and-shields simulator RPG than anything else.
I know fast-paced games; those are MOBAs or actual PvP first-person shooters, but they also easily run at 120 fps on my machine.
Po-tay-toes, po-ta-toes.
First-person view + shooting people = FPS. A poor frame rate means slower reaction times, which is bad when you need some accuracy to hit and can't simply "pause".

Whether it's a good or bad FPS compared to others is another debate entirely.

If you are fine with low FPS, then good on you. Some dude said he can't tell the difference above 24 fps.
But that doesn't mean it's an absolute truth. Console games are capped at 30 fps at their lowest, and PC games usually at 60 fps (for console ports, usually). For a reason.
 
Just wondering if anyone out there who has poor performance wouldn't mind wiping their OS and doing a clean OS and Odyssey install, to see if performance changes drastically.

To get back on topic: I may be reinstalling Win 11 this weekend. While I wouldn't say I fall into the poor-performance category, as I get 100+ fps almost everywhere except on-foot CZs (mid-30s fps), I'll let you know what my performance is after reinstalling.
 
First-person view + shooting people = FPS. A poor frame rate means slower reaction times, which is bad when you need some accuracy to hit and can't simply "pause".
I die zero times in a match while running ideal paths and shooting down bots as efficiently as possible, regardless of whether the game runs at 10 fps or 10,000 fps.
It simply doesn't make a difference for success in EDO; it is that slow-paced. The only difference is when it feels laggy / stuttery / non-fluid, which for me starts below 20 fps. Therefore, everything above 20 fps is fine for me.
 
Again, if reinstalling Windows improves your performance, there was something seriously wrong with that install. It would be prudent to figure out what happened, so that it doesn't occur again. Performance should not just spontaneously degrade.

Yes, I'm aware it wasn't the most ideal example, but I thought it would suffice to at least convey the general idea that 30 FPS isn't really the upper limit. Unfortunately, everything took a turn for the worse from there. I feel nothing but regret for bringing it up.

It's not a bad source, and it illustrated the general point just fine. One just needs to be aware of the limits of the methodology.

Considering human reaction times of 200+ms, you are greatly overestimating the relevance of fps, especially for slow-paced games such as ED.

No, I am not. Human reaction times do not imply much of anything about the utility of higher frame rates.

The post you quoted contains examples of scenarios where sub-10ms latencies are readily apparent, to average people, despite their reaction times being more than an order of magnitude slower than this.

Even once we get below the threshold of perception, it should be obvious that any latency added on top of one's own reaction time is still penalizing. Certainly, there are diminishing returns as end-to-end latency falls, but for someone with a relatively modest 200 ms personal reaction time, even 2 ms of latency could be the deciding factor in a competitive scenario. Against others whose own intrinsic latency is in the same ballpark, it could be the bulk of the difference. Glancing at a reaction time test, like https://humanbenchmark.com/tests/reactiontime, should make it obvious that even very small alterations to end-to-end system latency can significantly skew one's ranking.
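
As a rough illustration of how frame rate feeds into that end-to-end figure (a simplified model of my own, covering only the display term; real pipelines add render and queue time on top): if an input lands at a random point in the refresh cycle, the display alone adds about half a frame interval on average before the result can appear.

```python
# Rough model (an illustrative assumption, not a measured pipeline): an input
# arriving at a random moment waits, on average, half a refresh interval
# before the next scan-out can show its effect.
def avg_scanout_wait_ms(refresh_hz: float) -> float:
    frame_ms = 1000.0 / refresh_hz
    return frame_ms / 2.0

for hz in (60, 75, 120, 240):
    print(f"{hz:3d} Hz -> ~{avg_scanout_wait_ms(hz):5.2f} ms average scan-out wait")
```

By this model alone, going from 60 Hz (~8.3 ms) to 240 Hz (~2.1 ms) trims roughly 6 ms, which is exactly the order of magnitude that can separate otherwise evenly matched players.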

I die zero times in a match while running ideal paths and shooting down bots as efficiently as possible, regardless of whether the game runs at 10 fps or 10,000 fps.
It simply doesn't make a difference for success in EDO; it is that slow-paced. The only difference is when it feels laggy / stuttery / non-fluid, which for me starts below 20 fps. Therefore, everything above 20 fps is fine for me.

Not everyone's benchmark for success is the same. Not everyone will only be fighting bots.

Not everyone finds the same frame rate floor acceptable, and there are absolute mountains of empirical data that demonstrate how even extreme framerates can have utility.

If you're content with 20 fps, that's great. It will certainly allow you to have a more enjoyable Odyssey experience with cheaper hardware. However, a great number of people will never be content with such a level of smoothness, and you are at an objective competitive disadvantage, even if that disadvantage doesn't amount to much in the types of content you favor or how you choose to approach them.
 