What fps is acceptable?

I am extremely sceptical of anyone who claims that anything less than 60fps ruins it for them.
I don't mind if it visually stutters or drops to low fps; it's a bit annoying, but it is what it is. It's just that under 60fps you can't really react to anything, especially when doing a settlement raid and someone accidentally sees you and you have to zap them before they alert someone. Currently raids are unplayable for me; it's just a slideshow, really, and moving my view is an absolute chore.
 
Nope, average photographic flash is 1ms, 20 times longer.

I'm clearly not talking about the average photographic flash (the 1/20000th of a second should have given that away), I'm talking about common-enough examples (strobe flashes at the lower settings, in this case) of very short-duration visual stimuli that are still easily perceived.

In any case, a super-high contrast in light levels is very different from a similarly-bright moving image. We can detect flashes at much shorter durations than we can perceive movement, because the difference in the scene is far greater.

Yes, that's described in Bloch's law and it does nothing to contradict anything I've said.

In the absence of other information, you can't tell the difference between a sufficiently bright one nanosecond pulse and a sufficiently dim ten millisecond pulse, but you can still perceive that nanosecond pulse.
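For anyone following along, here is a minimal statement of Bloch's law, using the commonly quoted textbook approximation for the critical duration (nothing here is specific to any particular display or game):

```latex
% Bloch's law: for flashes shorter than a critical duration t_c (very roughly 100 ms),
% detection depends only on the product of intensity and duration,
% so a shorter flash is still seen if it is proportionally brighter.
\[
  I \cdot t = k \qquad \text{for } t \lesssim t_c \approx 100\,\text{ms}
\]
```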

The entire point is that the frame rates that remain useful go way beyond the point at which perceptibly smooth motion is achieved (which I've already pointed out). I'm sure you can think of plenty of examples of this yourself, but in case you can't, the two most relevant to gaming are response times/input latency and determining motion vectors.

Even with human response times being in the 100ms range, every ms you shave off any additional latency can potentially provide a competitive advantage. Even being 1% faster is enough to separate top players (and I'm not claiming to be a top player of anything any more than you're claiming to be a fighter pilot...I might be, you might be, but that's not relevant).
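To put rough numbers on that, here's a minimal sketch of how much worst-case latency the render loop alone adds at different frame rates. It assumes, simplistically, that an input can arrive just after a frame begins and then waits roughly one frame time before it can appear on screen; real pipelines add more on top (polling, queued frames, display scanout).

```python
# Rough frame-time arithmetic: the render loop alone can delay an input
# by up to roughly one frame time before it shows up on screen.
# Simplification only; real pipelines add polling, queuing and scanout delays.
for fps in (30, 60, 144, 240):
    frame_time_ms = 1000.0 / fps
    print(f"{fps:>3} fps -> {frame_time_ms:5.1f} ms per frame "
          f"(up to ~{frame_time_ms:.1f} ms before your input can be shown)")
```

Going from 60 to 144 fps trims around 10 ms off that worst case; small next to a reaction time, but not nothing in the "1% faster" sense above.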

Being able to track motion, even of objects on screen for durations too short to identify, can also be very relevant. Take something like a rendered projectile that doesn't have any tracer or blur to highlight it. If it's in your field of view for 10ms and that is only one frame, you have no information other than that it was there...you cannot tell how fast it was moving or what direction it came from. If it's three or four frames, you know a lot more. This is a big deal if that shape is a rocket or grenade in a tactical shooter, for example.
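As a rough illustration of that projectile example (the 10 ms figure and the frame rates are just for the sake of argument, not taken from any particular game), here's how many rendered frames a briefly-visible object spans. One frame gives you a position; two or more give you a motion vector:

```python
import math

def frames_visible(duration_ms: float, fps: float) -> int:
    # Roughly how many rendered frames an object visible for duration_ms can span.
    return max(1, math.ceil(duration_ms / (1000.0 / fps)))

# Illustrative only: an unblurred projectile in view for 10 ms.
for fps in (30, 60, 144, 240):
    n = frames_visible(10, fps)
    info = "position only" if n == 1 else "position plus direction and speed"
    print(f"{fps:>3} fps: ~{n} frame(s) -> {info}")
```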

And of course there are effects on comfort and eyestrain that can be totally unconscious, and still be relevant.

Everything I hear from the fps fetishists reminds me of audiophiles.

That's a rather ambiguous statement. There are audiophiles who are well aware that the difference between a thousand-dollar low-oxygen silver cable and an old wire coat hanger is negligible for a short run, and some magical thinkers who call themselves audiophiles and would swear by the thousand-dollar cable with its imperceptibly different electrical properties.

So which are you accusing me of being?
 
That's a rather ambiguous statement. There are audiophiles who are well aware that the difference between a thousand-dollar low-oxygen silver cable and an old wire coat hanger is negligible for a short run, and some magical thinkers who call themselves audiophiles and would swear by the thousand-dollar cable with its imperceptibly different electrical properties.

So which are you accusing me of being?
The point is that there is absolutely no difference between a thousand-dollar low-oxygen silver cable and a decent-quality mid-price cable. This has been experimentally verified, over and over again. There's an entire industry of con artists that capitalises on audiophiles' inability to accept that, and it keeps generating more and more absurd quality claims to justify whatever purely decorative addition they've made this time.

There's a terminal point for quality, both for audio cables and for fps, beyond which is the realm of pure marketing hype and self-delusion.
 
30fps = Minimum
60fps = Target
60fps+ = Gravy

Now, that's for a single-player, non-competitive experience. In a competitive multi-player game, you want your fps to be as high as possible, even at the cost of visual fidelity. It comes with advantages on the field. Back in the day, Quake II players would set resolution/quality as low as it would go just to get an fps advantage. I think they still do it in games like CS, but QII was where I first remember it being done.
 
There's a terminal point for quality, both for audio cables and for fps, beyond which is the realm of pure marketing hype and self-delusion.

Indeed, and it's about 'one-dollar coat hanger' level for a short audio cable, and about one thousand FPS for sample-and-hold displays. This has been experimentally verified.
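For anyone wondering where that roughly one-thousand FPS figure comes from, the usual back-of-the-envelope argument is persistence blur on sample-and-hold displays: an object your eye is tracking smears across about (speed in pixels per second) × (frame persistence). The numbers below are illustrative, not measurements of any particular panel:

```python
# Persistence blur on a sample-and-hold display (no strobing/black frame insertion):
# an eye-tracked object smears across roughly speed * frame_time pixels.
# Illustrative numbers only, not measurements.
speed_px_per_s = 2000  # e.g. a fast pan across a wide monitor

for fps in (60, 120, 240, 1000):
    blur_px = speed_px_per_s / fps
    print(f"{fps:>4} fps: ~{blur_px:.1f} px of smear")
```

Only somewhere around 1000 fps does the smear shrink to a pixel or two, which is roughly where that terminal point sits for sample-and-hold panels.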
 
IMO 30 is good for 2D. There’s a reason why movies have a frame rate of 24FPS... most people can’t tell the difference visually between that and higher frame rates. Exceptions exist, of course, and higher FPS also allows for a quicker response time in twitch-based gaming. When people say they can tell the difference, that’s usually what they mean.

VR, on the other hand, needs to be as close to 90 FPS as possible. We can’t see the difference, but we can certainly perceive the desync between what our eyes see, and the head movements our inner ear senses.
When feature movies started to be shown at higher frame rates (IIRC some of the movies by Peter Jackson and Christopher Nolan), a lot of people were complaining about the lack of that "movie feel" and the picture looking more like a soap opera. As it turned out, a lot of people can actually tell the difference; they might just lack the terminology to describe it, because outside of gaming few people are obsessive enough about FPS to even know what it stands for.
Funnily enough, the discussion that followed was quite the opposite of what is seen in gaming - because 24 FPS is so ingrained in the very essence of what a movie should look like, higher rates were outright rejected and didn't catch on as one might have expected.


Personally, for games, I draw the line at 30 FPS, but only because I indulged myself with a FreeSync setup. AMD's implementation just so happens to have 30 as a lower bound, so everything above that completely eliminates screen tearing (which I find way more jarring visually). Though I still prefer a higher refresh rate if possible, even when it's just "normal" usage. Having smoother movement for windows, mouse pointers, etc. just makes for a nicer experience.
Most of the gameplay in ED (even Odyssey) is slow enough that I wouldn't mind if the game ran anywhere from 30 upwards. I would still notice a difference compared to a higher refresh rate, but it wouldn't be enough to bother me.

Right now, though, that is simply not the case, even though my system spec is verbatim what FDev recommends: Ryzen 5 2600, RX 580, 16 GB RAM (of which 12 are fully available). Granted, I'm running a slightly higher resolution of 2560x1080, but the game should be capable of delivering. Even on the low preset there are areas where the refresh rate drops below 30, like when leaving the elevator on Coriolis/Orbis stations while parts of the docking bay are visible. Overall I do notice improvements compared to the alpha (including phase 4); planet surfaces especially went from unplayable (below 20) to mostly good, but there's definitely something still amiss that slipped past them.
 
I see people on here complaining that they only get 60 fps....... if you turned your counter off would you even know? What can the human eye discern?

Now, I am only getting 20 fps and I can sure as hell discern that - it runs like a dog.
VR requires 72 to 120 FPS per eye or you get motion sickness. Double what you think is acceptable.
 
TL;DR — I'll take a stable 25fps anytime, over the constant drops from 60-20 that I had in the alpha.
Oh absolutely, and it's why on console most of the games with raytracing options lock the framerate to 30. And I'll happily play like that, tbh, for the visual benefit. I often find higher framerates less immersive, as they highlight the "fakeness" of many textures and animations.


(Though I'd still appreciate more console versions of games giving me the option of 60fps 1080p with RT vs the 4k 30 raytraced and 4k 60 non-raytraced modes they have.)
 
Flying my ship in Horizons I get ~110fps on Ultra settings (3440x1440 resolution), in Odyssey on High it's ~40fps and that definitely feels a bit sluggish.
 
30fps is the minimum acceptable framerate for me, 60fps is the highest I can hope for (and would be completely and perfectly happy with) due to vsync and my setup.

Unfortunately, in the tutorial, especially near windows (someone mentioned before that windows seem to be an issue), it went down as low as 10fps. If I paused the game and ran the mouse over the buttons in the options screen, it would lag so badly that they would only highlight about once a second, in a sort of laggy trail. This is with a setup identical to the "Recommended" specs, albeit with a Ryzen 5 2600X rather than a standard 2600, and with the settings turned down to medium or low.

In space, it's fine. 60fps at 1080p.
 
Flying my ship in Horizons I get ~110fps on Ultra settings (3440x1440 resolution), in Odyssey on High it's ~40fps and that definitely feels a bit sluggish.
It'd be nice to see how the Series X handles Horizons with FPS Boost on to bring it up to 120. Maybe MS will add it in the next hundred or so titles they bring it to.
It's pretty much a certainty at this point that part of the reason the console version of Odyssey is so delayed is because on the last gen ones, it's a full-blown Cyberpunk situation of single digit framerates...
 
Seriously people, what kind of games are you playing?
30 FPS has not been acceptable for a very, very long time.

Even 144 Hz displays are outdated in 2021.
Maybe in 4K 60 Hz is okay, but only because the higher-end 20-series cards won't really be able to give you much more than 60 FPS without dialing down the quality significantly. For example, my 2080 needs "low" graphics settings across the board to be able to provide the 80 to 90 FPS necessary for the Valve Index VR (which is more demanding than 4K).

But in 1440p they should absolutely be able to reach 100 FPS, let alone in 1080p, where the FPS figures shouldn't really drop below 144. Like, ever.
And now that the 30-series cards are out, you shouldn't need to accept as low as 60 FPS even in 4K anymore.

But the real question is not even the "minimum viable FPS". What this whole thread really comes from is the crappy FPS figures people (even high-end GPU users) tend to see in Odyssey.

If your rig is capable of (for example) 144 FPS in Horizons, then it should be able to reach at least 120 in Odyssey, and you bloody damn sure shouldn't encounter 30 FPS, ever.
A 10-20% FPS loss between Horizons and Odyssey would be OK(ish), but there is literally nothing in the graphics of Odyssey that could explain such a massive performance loss. It's not that much better looking or more demanding; it's just terrible optimization (or the lack of it), period.
 
There’s a reason why movies have a frame rate of 24FPS... most people can’t tell the difference visually between that and higher frame rates.
The reason why 24fps is the film standard is that it's the absolute minimum speed at which your eyes still perceive it as motion, but you have to compensate with shutter angle (adding motion blur within each frame). The lowest workable speed was chosen to save on film stock.

As we're watching Elite, which has no motion blur (I would love it if it had that option for more cinematic captures), the practical recommended minimum is 60fps.
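To make the shutter-angle point concrete, per-frame exposure on a film camera is (shutter angle / 360°) divided by the frame rate; the figures below are just that standard arithmetic, nothing specific to Elite:

```python
# Per-frame exposure (and therefore motion blur) for a film camera:
# exposure = (shutter_angle / 360) / fps.
def exposure_ms(fps: float, shutter_angle_deg: float = 180.0) -> float:
    return (shutter_angle_deg / 360.0) / fps * 1000.0

# The classic 180-degree shutter at 24 fps blurs each frame over ~21 ms,
# which is what smooths 24 fps film; a game rendering sharp frames gets none of that.
print(f"24 fps, 180 degree shutter: ~{exposure_ms(24):.1f} ms of blur per frame")
print(f"60 fps, 180 degree shutter: ~{exposure_ms(60):.1f} ms of blur per frame")
```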
 
It's not just fps that's concerning

When I load Oddity my GPU spikes to 100% load until I log out - not what I want for hardware longevity
 

I see people on here complaining that they only get 60 fps....... if you turned your counter off would you even know? What can the human eye discern?

Now, I am only getting 20 fps and I can sure as hell discern that - it runs like a dog.
The problem is that when you only get 60 fps, if anything actually happens on your screen, like fire or lots of movement, suddenly that nice 60 becomes 10.

(I've been on a ground base where I had 80-100 fps while looking in one direction, and 10-15 when turning 180°.)
 
I do not even turn on an FPS meter. If the game runs smooth to my eye, all is fine; I do not want to get focused on a number. I have a 3080 and the game runs poorly at 4K: lots of stutter in stations, and the pop-in on planets is really bad. It runs much better at 1440. So I guess the next question is what resolution is acceptable? Then you factor in quality settings and it gets more complicated.

From what I have seen, there is a reason Frontier left the new graphics out of Horizons on PC at the last minute. They know Odyssey will run poorly on many systems and wanted an out for people to still play the game. They have basically charged the PC community for a beta.

After the Cyberpunk fiasco, I think both Sony and Microsoft are gonna make sure games can actually run before they add them to the library. Too bad there is no such thing for PC players.
 
I'd say 40 and up is OK for me. I have a G-Sync compatible monitor that smooths things out a bit, so 40 and up is acceptable. I would prefer 60 and up.
 
30fps is fine, but there is a difference between that and, say, 60fps. Here's a little demo; I recommend randomly clicking the button and then guessing what it's running at without looking at the toggle.
Wow, I never would have imagined there was such a clear difference. I guess that's what motion blur was invented for.
 