Update 14 and Beyond: Live and Legacy Modes

Well, as I've said in other places, I and many others don't see any difference above 30 FPS, so "only" that many frames won't bother most people.

Many of the best, most praised console games run at "only" 30 FPS, or MAYBE 40, and 99% of people never noticed or cared!

If an unsupported and unoptimised Odyssey was, as said, running at "only" 37 FPS on the Steam Deck, then surely the game would run around 40 FPS on PS4, and with a tiny bit of effort a "next gen" version would surely hit 60 FPS for those who care or notice!

Even if they still have trouble hitting such specific targets, they could still let people choose between frame rate and resolution, as many PS4 Pro games do; again, give people the choice and let them play the game the way they want.

Even IF 1080p/30 FPS is all a fully supported "live" mode on "next gen" consoles could manage, which I highly doubt, it would still be perfect, as most of us just want to play the game. But for the few who will notice, I'm sure it could hit 4K and 60 FPS too, MAYBE not at the same time, though with luck it could do both! 😯😂😀🤘
I don't think what was mentioned here is entirely true, though perception of such visuals is usually relative. I know when I moved from 30 to 60 FPS, the difference was like night and day for me. Then when I moved from 60 to 120 FPS, it wasn't quite night and day, but the visuals felt "buttery smooth". Moving from 120 to 144 FPS and from 144 to 165 FPS (my monitor's refresh rate), I could still tell the difference, but it no longer mattered. I'm happy at 120 to 144 FPS.

I know many people tend to feel like you do because they are trying to display 120, 144, 165 FPS or beyond on a monitor that only supports up to a 60 Hz refresh rate. Is that the case for you? Or is it just age?

I used to run 300+ FPS uncapped in Horizons, but I usually cap it at 120. Now in Odyssey I range from 45 FPS on the ground to 80 FPS in supercruise.
 
The human eye and brain have been shown to update at a rate of around once every 13 milliseconds, which is in the region of 80 FPS. That's why 90 and up is the "sweet spot" for capturing the most perceptible gains. Of course, that's not taking into account things like stereoscopic rendering for 3D and other techniques that require generating double the FPS. The improvements above that can be perceptible but show diminishing returns. YMMV, of course, and how you perceive things will depend on other factors too, like whether you are capping your FPS at your monitor's maximum refresh rate, or using VRR.
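For anyone who wants the arithmetic behind those figures, it's just the reciprocal of the frame time. A minimal sketch (the function names are mine, not from any library):

```python
# Frame time in milliseconds <-> frames per second: each is 1000 divided by the other.
def ms_to_fps(frame_time_ms: float) -> float:
    return 1000.0 / frame_time_ms

def fps_to_ms(fps: float) -> float:
    return 1000.0 / fps

print(round(ms_to_fps(13), 1))  # 76.9 -- the "region of 80 FPS" above
print(round(fps_to_ms(90), 1))  # 11.1 ms per frame at the 90 FPS "sweet spot"
print(round(fps_to_ms(60), 1))  # 16.7 ms per frame
```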

I would imagine, though, that most people going out and buying a system capable of 120 and up will buy a monitor to match at the same time. That was the first thing I did upon getting my new laptop and my Series X: grabbing a pair of high-refresh monitors for them both. Though it's possible some might make the mistake of hooking it up with an HDMI 1.x cable, or running 4K HDR over HDMI 2.0, and then be capped at 60 regardless.
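To see why HDMI 2.0 caps out there, here's a rough back-of-the-envelope. The 18 Gbps link rate and 8b/10b encoding overhead are the published HDMI 2.0 figures; the rest is just multiplication, and it ignores blanking intervals, which make the real requirement slightly higher:

```python
# HDMI 2.0 carries 18 Gbps on the wire; after 8b/10b encoding,
# roughly 14.4 Gbps is left for actual video data.
HDMI_2_0_USABLE_GBPS = 18.0 * 8 / 10  # ~14.4

def video_gbps(width: int, height: int, hz: int, bits_per_channel: int) -> float:
    # Uncompressed RGB bandwidth: pixels * refresh * bits per pixel (3 channels).
    return width * height * hz * bits_per_channel * 3 / 1e9

print(video_gbps(3840, 2160, 60, 8))    # ~11.9 -> 4K60 SDR fits
print(video_gbps(3840, 2160, 60, 10))   # ~14.9 -> 4K60 10-bit HDR doesn't (hence chroma subsampling)
print(video_gbps(3840, 2160, 120, 10))  # ~29.9 -> 4K120 HDR needs HDMI 2.1
```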
 
I'd imagine that no amount of posting here will bring back ED on consoles. For the foreseeable future it's just Legacy...
I’d imagine no amount of posting here will stop people asking for a console version. It’s what they want; why should that disturb those who already have it?
 
Technically it's still possible, but practically, from a "Frontier has made up their mind and won't change it" perspective, it's not going to happen.

But hey, if you're on Xbox, you've got Space Engineers, so rejoice!
Are the developers of ‘Space Engineers’ safe when it comes to console development?

Sorry about the list of posts, I was just going through the thread and responding as I did.
 
OK, you killed it, Frontier.
I get 4 FPS on my old machine. I'll play the Legacy version for racing purposes, or to kill time, but this is simply the end of the game for me.
If anyone is interested, I therefore have a pair of Thrustmaster 16000M joysticks for sale.
On my ten-year-old cheapo laptop that is vastly below min spec and has a shared onboard GPU, I still get 20 FPS in EDO. No offense, but if you genuinely get 4, you should bring it back to the museum you stole it from. :p

Seriously though, check things like drivers and settings. 4 FPS is almost impossibly poor.
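And if you don't trust whatever overlay is reporting that number, the measurement itself is trivial. A minimal sketch, where render_frame is a hypothetical stand-in for whatever draws one frame:

```python
import time

# Average FPS over a few seconds: count frames, divide by elapsed time.
def measure_fps(render_frame, seconds: float = 5.0) -> float:
    frames = 0
    start = time.perf_counter()
    while (elapsed := time.perf_counter() - start) < seconds:
        render_frame()  # hypothetical: one iteration of the render loop
        frames += 1
    return frames / elapsed
```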
 
Hi, Dr. Skippy here, neuropsych. "The brain updates once every 13ms" means literally nothing; it is nonsense.
So basically, in tests flashing multiple images at subjects at high frame rates, the brain has been shown to process and distinguish a fresh image roughly every 13 milliseconds. Meaning it can distinguish between different frames at up to around 80 FPS, but with a shorter exposure time per frame than that, it misses many more of them.

Not sure if more research has been done since then, as this was back in 2014, and one limit of that study was that they were restricted to, I believe, 85 Hz monitors. That means they couldn't get much data on the exact rate of drop-off once the exposure time per frame got below 13 ms, as that's roughly 77 Hz. It'd be interesting to see the curve as we get beyond that and into 100 FPS and up, and where exactly, at least on average, it fully tapers off.
 
Actually, for me it just means not getting a headache. I absolutely hate, for example, that movies are shot at 24 FPS. So gross.
That's the frame rate of 99.9999% of film outside of one or two Peter Jackson films, so you mustn't be able to watch anything!! 😯😂 ...

As I said, I don't know why some people get so hung up on specific numbers; most of us don't even notice frame rates outside of actual slowdown and just want to play the game on our systems of choice, which the PS5 and Xbox Series X should be able to handle perfectly if Frontier would just lift a finger!! 😯😂🩲😀🤘🤞
 
Well, you might notice it, but as said, I and most people don't, and we just want to play the game on our systems of choice!! 😯😂😀🤘
 
As I and others have said about one million times, we just want Frontier to lift a finger and release full native "next gen" versions of Elite on PS5 and Xbox Series X, which they surely could, if they'd just make SOME effort!! 😯😂

(As I've also said, if they don't have enough staff right now to develop or "port" a next gen version, they should either hire more developers for a dedicated console team or have a third-party studio release it! We don't care too much how long it would take them; we just want to play the game! 😯😂)
 
The thing with film vs gaming is that, generally, the camera work in a film is as much science as art. It's all about establishing angles and maintaining very precise, fluid camera motion and speeds. A bunch of arithmetic goes into things like setting up high-framerate action shots that get slowed down for "bullet time" effects, but generally, outside of genres like "found footage", a solid rule is that the camera isn't darting about; it's tracking the on-screen characters' movement and focusing at a specific depth. That's not at all how we experience the world in first person, but it looks good on a screen because it's so carefully choreographed to work that way.
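The arithmetic behind those slowed-down shots is just the ratio of capture rate to playback rate, as a quick sketch:

```python
# Overcrank arithmetic: shoot high, play back at 24 FPS, and the
# footage runs capture_fps / playback_fps times slower than life.
def slowmo_factor(capture_fps: float, playback_fps: float = 24.0) -> float:
    return capture_fps / playback_fps

print(slowmo_factor(120))  # 5.0x slow motion
print(slowmo_factor(48))   # 2.0x
```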

It's also part of why the Hobbit films look so much worse than LotR. A low frame rate means you pick up less on things like bad CG. The effects stand out more as not fitting the scene the more frames of visual information you take in, and filmmakers really need to account for that when deciding to shoot at a higher frame rate. The smoother frame rate can be excellent if you're doing action scenes with physical on-screen combatants and practical effects, where there's a lot of camera motion and you want viewers to really be able to take in all the action.


In a game, the player is the camera, looking all over the place, and often quite quickly. As a result, while something like Resident Evil on the original PlayStation, with its fixed camera angles, may be perfectly fine at 30, something like Resi VIII is not, due to the sheer amount of movement on the player's part.
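One way to put numbers on that: the faster the player swings the camera, the bigger each frame-to-frame jump in view angle gets, and only a higher frame rate shrinks it. A quick sketch (the 180°/s pan speed is just an illustrative figure):

```python
# How far the view jumps between frames during a pan: larger
# per-frame jumps are what read as judder when whipping the camera.
def degrees_per_frame(pan_speed_deg_per_s: float, fps: float) -> float:
    return pan_speed_deg_per_s / fps

print(degrees_per_frame(180, 24))  # 7.5 degrees between frames
print(degrees_per_frame(180, 30))  # 6.0
print(degrees_per_frame(180, 60))  # 3.0
```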


[EDIT] - I really want to see the version of The Hobbit from the parallel universe where the messes surrounding its production never happened, and Jackson and del Toro were able to go ahead with the originally planned version, far less reliant on CG work and more on prosthetics and the like, where the higher frame rate could really do it justice.
 
"Safe" how?! 😯😂

Will they catch "Consoleiticis"?! 😯😂 .

As I said after that, Company of Heroes 3 is launching on consoles too, and there's absolutely nothing wrong with that!!

Consoles are the biggest part of the games market, with PC still rather niche, so I still can't understand why Frontier are running away from them and the sales they bring; they should be embracing the "next gen" consoles, not shunning them!! 😯😂🩲😀🤘
 
I am not going to go deep into it now, but no to all of that. All you need to know is that none of this is relevant to the discussion of FPS. At all.
 
I actually take Frontier at their word that they did try to get it working on consoles right up to the end.
It fits with their track record of falling for the sunk cost fallacy on things that are fundamentally broken.
My guess is something foundational to Odyssey simply kept breaking the console builds they tried to make work. Just look at the state of it on RX 6000-series graphics cards, with delightful stuff like the pretzel planets, for how bad a mess it can be. But they desperately wanted the money from the console launch, so when the PC version finally launched after its own delays, they kept at it. They had already made the initial mistake in 2020 of taking a lot of time to try to fix Odyssey after they discovered its problems, rather than scrapping a chunk of progress and starting over; that would have cost them a 2021 launch entirely, but it might have salvaged the console versions and gotten us a better-performing 4.0 client.
 