Console Update

While I secretly (well, not exactly :) ) hoped they would eventually release a version of Odyssey on PS5/Xbox Series, the chances of that happening are really slim (down to zero, actually).
Simply because everyone would expect the new consoles to run Odyssey at 4k/60fps, and I'm afraid that won't happen.
The fact that Elite didn't sell enough units on consoles probably also matters (the console environment adds costs for the developer/publisher, which means even less profit) - Elite is still a niche/nerdy game, while consoles are a really good platform, but for the masses, not for the nerds.
 
1080p/30fps is good enough for me, I just want to play it!

I've never noticed any difference in resolution above 1080p (hardly even above 720p) and don't worry about frame rates unless they're in the teens and cause actual slowdown!

The film and TV we watch is shown at 24fps 99% of the time, unless you're watching certain Peter Jackson films etc., so why do people demand 5000fps in games when they won't even notice any difference?!

People swear that they can see a difference between 30 and 60 FPS, then often pick the lower one when asked which looks the best! 😯😂😀🤘
 
I think it's related to the specifics of gaming. Movies don't pan 180 degrees in 0.5 seconds, whereas gamers will whip round that fast in FPS games. To aim while moving that quickly, you need enough frames, or the brain/mouse/screen control loop runs into problems.

So it's not about visuals, it's about the mechanics of accurate aiming and positioning. Below 30fps it gets really hard to hit a barn door with a banjo.
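Just to put rough numbers on that: here's a quick back-of-the-envelope Python sketch. The 180-degree turn in 0.5 seconds comes from the point above; the frame rates are just common examples.

```python
# Back-of-the-envelope: how far the view jumps between frames during
# a fast 180-degree flick, at different frame rates.

FLICK_DEGREES = 180.0  # total turn
FLICK_SECONDS = 0.5    # time taken for the turn

turn_rate = FLICK_DEGREES / FLICK_SECONDS  # 360 degrees per second

for fps in (24, 30, 60, 120):
    frame_time_ms = 1000.0 / fps
    degrees_per_frame = turn_rate / fps
    print(f"{fps:>3} fps: {frame_time_ms:5.1f} ms/frame, view jumps "
          f"{degrees_per_frame:5.1f} degrees between frames")
```

At 30fps the view leaps 12 degrees between consecutive frames during that flick, a big gap for the brain/mouse/screen loop to bridge; at 120fps it's down to 3 degrees.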
 
Well even so, I've certainly never noticed it myself unless I get actual slowdown, in which case a 30fps standard or even an unlocked frame rate should be enough, even if they can't guarantee 60 etc.

Anyway, it will probably take a small miracle to bring the game out on next gen systems, but you never know! 🤔😀🤘
 

The pixel response time of your display (not the graphics card) affects what you perceive as acceptable smoothness.

On a TV, pixel response isn't usually an important buying decision. On a PC monitor (particularly one capable of refresh rates over 60Hz), pixel response time (roughly, the time a pixel takes to go from fully lit to fully dark) can be much faster, which means that although the monitor can display a faster framerate, it can look worse at lower framerates.

So if you plug a gaming PC into the same TV as your console, your perception of acceptable smoothness shouldn't change.
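To make that concrete, here's a tiny Python sketch comparing pixel response time to frame time. The response-time figures are illustrative guesses, not measurements of any particular display:

```python
# Illustrative comparison of pixel response time vs frame time: on a
# slow panel the previous frame literally fades into the next one,
# which smooths over a low framerate; a fast panel shows each frame
# as a crisp, discrete step.

DISPLAYS = {
    "typical TV panel": 15.0,     # ms response, assumed for illustration
    "fast gaming monitor": 2.0,   # ms response, assumed for illustration
}

for fps in (30, 60):
    frame_time_ms = 1000.0 / fps
    print(f"\nAt {fps} fps (frame time {frame_time_ms:.1f} ms):")
    for name, response_ms in DISPLAYS.items():
        fraction = response_ms / frame_time_ms
        print(f"  {name}: pixels spend ~{fraction:.0%} of each frame "
              f"mid-transition (built-in blur between frames)")
```

On those assumed numbers, the slow panel spends nearly half of every 30fps frame still fading between images, which hides the steps, while the fast panel shows each frame sharply and makes the low framerate obvious.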
 
Ok, thanks for the explanation, but will any of that help Frontier get Odyssey up and running on next gen consoles?! 😯😂😀🤘
 
A big reason 24fps is acceptable in film and TV is that there is a lot more motion blur, which gives the effect of interpolation. Games, by contrast, render sharp frames like stop motion, which just looks clunky at low fps.
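Rough numbers for that, in a short Python sketch; the 180-degree shutter and the object speed are illustrative assumptions, not from the posts above:

```python
# How much motion blur a film camera bakes into each frame, versus the
# zero-blur instantaneous samples a game renders by default.

FPS = 24
SHUTTER_FRACTION = 0.5     # classic 180-degree shutter: exposed half of each frame
SCREEN_WIDTH_PX = 1920
OBJECT_SPEED_PX_S = SCREEN_WIDTH_PX / 1.0  # crosses the whole frame in one second

exposure_s = SHUTTER_FRACTION / FPS        # ~20.8 ms of exposure per frame
blur_px = OBJECT_SPEED_PX_S * exposure_s   # smear length baked into each frame

print(f"Exposure per frame: {exposure_s * 1000:.1f} ms")
print(f"Film frame: object smeared across ~{blur_px:.0f} px")
print("Game frame: object frozen at a single position (0 px smear)")
```

That ~40px smear per frame is what sells the motion at 24fps; a game rendering the same movement gives you 24 razor-sharp positions a second instead, which reads as judder.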
 
Yep, but a player moving from a 30fps console plugged into a big telly to a gaming PC will base their opinion of what's acceptable on their console games.
 
The difference is very noticeable with games.

Resolution-wise, I'll agree that 1080p is perfectly acceptable for casual gaming, and I actually prefer to lower some games to 1080p in order to get a smooth 120fps. You can see the difference directly in my K/D ratios when I'm stuck on a 60Hz display versus a high refresh one.

I can 100% confirm that on a good quality gaming monitor, low FPS becomes completely unacceptable and slideshow-ey, as the faster pixel response means what's slow and blurry on a TV is fast and flickery.

I helped a friend choose a new laptop recently, and she can't go back to playing games on her old one, which managed 30fps at best, now that she's experienced the same games at 90+ and seen the two machines running the same game side by side.
 
Granted, it does seem to have a big effect on some people's playing ability for some reason, but I've played every sort of game on consoles, including many fast paced ones, and never noticed lower frame rates (or particular frame rates) affecting, let alone improving, my playing experience...

For example, while playing various Battlefield and CoD games on PS2, PS3 and PS4, I might have noticed a difference in graphical details and textures over the years, but I can't say I played any better as their frame rates presumably got better...

Of the Battlefield games in particular, I did best in Bad Company 2 on PS3, probably because it was a smaller, tighter, more controlled experience than the bigger, more chaotic later games...

I'm generally a team player and don't pay that much attention to K/D, but I still did best in BC2, with around a 1.3 K/D and very often a high place on the scoreboard, against about a 0.9 in BF3 and 0.7 in BF4, which were, as said, wilder experiences... It probably helped that I loved the tank combat in BC2 and that there were no jets in that particular Battlefield game, while most players there actually played together as part of their squads; the BF games attracted more lone wolf/headless chicken players as the series became more popular, to the point that teamwork is now largely nonexistent in the series...

Indeed, on consoles I rarely remember seeing any effect of low frame rates. They sometimes showed up in odd places, like during a goalmouth scramble in FIFA, but very rarely, I guess because console games are generally optimised so well for their known hardware before we get to play them...

In fact, the only time I remember seeing significantly low frame rates and slowdown was in the mid-2000s, when I briefly tried playing games on PC, before deciding it largely wasn't for me.

The overall point being, it should be easy to craft high quality, playable experiences on consoles when you know the exact hardware you're dealing with and what it's capable of!

I'd sooner expect the devs to have trouble releasing a playable version of the game on PC than, as is actually the case, struggling to make it work on consoles! 😯😂😀🤘
 
If they stream at a low frame rate, members of the PC master race will cast shade, unless they deflect by casting shade on the software.
 
The thing with those console Battlefield and CoD games is that everyone was on a level playing field. The console hardware outputs a more or less consistent framerate on every individual system; the console was the bottleneck in those generations. On PC, players with higher end rigs and faster refresh rates can pull off faster reaction times and tighter aiming than those without. And that extends to modern console shooters with FPS vs resolution toggles, as well as to multiplatform titles (you definitely notice it when a game has a 30fps cap on Series S and 60 on Series X).
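To put a rough number on the reaction-time side of that, here's a minimal Python sketch of the frame-time gap between caps. It only counts the delay from how often a new frame can appear; real input-to-photon latency depends on the engine and display pipeline too, so treat the figures as illustrative.

```python
# Rough sketch: how stale what you see can be under different fps caps.
# This only models frame timing; engine and display pipelines add more.

for fps in (30, 60, 120):
    frame_time_ms = 1000.0 / fps
    # On average an in-game event lands mid-frame, so you wait roughly
    # half a frame before it can even show up on screen.
    avg_extra_delay_ms = frame_time_ms / 2
    print(f"{fps:>3} fps cap: {frame_time_ms:5.1f} ms per frame, "
          f"~{avg_extra_delay_ms:4.1f} ms average wait before an event "
          f"appears")
```

On those numbers alone, a 30fps cap leaves you reacting to a picture that is on average roughly 8ms staler than at 60fps, before any other hardware differences come into play.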
 
Well, many PS4 games already run at 60fps, but I never noticed, and it never helped me play any better; for me, and I'm sure most others, those extra frames were wasted...

As long as a game runs well, we'll enjoy it; we're not asking the Earth! 😯😀🤘
 
The point being, you didn't have 60fps players playing against 30fps players in those titles. Fun fact: you can actually use it as a handicap system when you have friends over, and stick the most skilled player on a lower fps cap :D
 
Maybe, though I don't think lower frame rates bother or even affect most people; as said, I never notice them, nor do I reckon I play any better or worse because of the frame rate...

Similarly, people often talk about screen tearing etc. in certain games, and I generally don't see anything wrong with them - and there's nothing wrong with my eyesight! 😯😂😀🤘
 

To be honest, I game on a 144Hz 1440p display on PC (usually with G-Sync, so the refresh goes up and down) and on a 60Hz 2160p display on the PS5. I can get on with both. I play Destiny 2 and FIFA 2022 on both, for example, and I honestly can't tell the difference in motion on either, apart from the PS5 looking higher resolution, as it usually is; I also sit further away from the screen, which helps.

I think Odyssey at a fixed 30fps @ 2160p on the PS5 would be perfectly acceptable for me. I still doubt we will see it. :(
 
Hey mate, quick question: what are your specs?
 

I used to use a 24" 1080p 144Hz main monitor; now I use a 43" 4k 60Hz monitor (slower pixel response, but the same pixel density at the same viewing distance). On the 144Hz display a slow framerate was much more noticeable (because the pixel response was faster).

How does 30fps feel to you on the 144Hz monitor, with (I am assuming) its faster pixel response, compared to the 4k display?
 