Why ED should ignore the forums and love the silent mob (This is not about the current quality of Odyssey)

17-21 FPS is not playable for me. I run the game on my 7-year-old mid-range computer at 30-50 FPS, which is just enough on low settings. I still want the game to run as advertised though, and I think people with current top-end rigs should be able to play at 60-120 FPS on Ultra at 1440p, at the very least.
I agree with the last part, but I'm fairly certain there isn't an official minimum fps figure that defines something as playable or not. If there is, I'd like to see it. That said, it's not something Fdev should fall back on either. There is clearly something wrong on their end, since better cards and setups don't seem to improve anything. Something is bottlenecking and, since update 5, freezing processes.
 
It's been my understanding that 30fps is generally accepted as the bare minimum acceptable framerate, and anything less is usually considered bad by industry standards. A minimum spec for a game should therefore be expected to net you at least 30fps at all-low settings, and a recommended spec should mean either 30fps at Ultra or 60fps at High/Ultra.

I REAAAALLY doubt that any developer out there is saying 15-20fps is an accepted standard for minimum or recommended spec. Even people playing on the Switch don't like 15fps, and they're playing on what is essentially a weak smartphone.
 
Interestingly - 24fps is accepted as the lowest refresh rate that the eye can turn into any convincing movement. It is the original refresh rate for TVs (23.976 for NTSC) back in the day.
 
24fps was used as an industry standard for movies and TV shows, and it passes as realistic motion there because it's non-interactive media: the viewer is passive and never notices any lag between an action and its result, since they aren't the one providing the input.

Video games are different. The poor frame rate in a game like ED is just the graphical symptom we can see; updating non-graphical data is also slow when your fps is low, affecting the overall responsiveness of the game. Menus, pop-up maps and general loading of stuff all get worse when fps drops. The whole game is struggling, not just the graphics.
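To put rough numbers on that, here's a quick Python sketch. It's purely illustrative arithmetic, not a claim about how ED's engine schedules its work, and the "3-frame menu update" is a made-up example: it just converts a frame rate into a per-frame time budget and shows how anything serviced once per rendered frame stretches out as fps drops.

```python
# Illustrative only: frame rate is the inverse of frame time, so anything
# serviced once per rendered frame (menus, map updates, UI) slows down with it.
def frame_budget_ms(fps: float) -> float:
    """Milliseconds available per frame at a given frame rate."""
    return 1000.0 / fps

for fps in (120, 60, 30, 24, 17, 10):
    budget = frame_budget_ms(fps)
    # Hypothetical UI action that takes 3 frames to fully register and redraw.
    ui_latency = 3 * budget
    print(f"{fps:>4} fps -> {budget:6.1f} ms per frame, "
          f"~{ui_latency:6.1f} ms for a 3-frame menu update")
```

At 10fps that made-up menu interaction takes around 300ms versus about 25ms at 120fps, which is why the whole game feels sluggish rather than just the rendering.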
 
Film has that weird thing where doubling the frame rate makes it feel cheaper, like memories of interlaced telly on CRTs, or like you're watching a rehearsal or something. It's strange and counter-intuitive but hard to deny; I guess it's not reality we're looking for in our movies, although I suspect there's something else going on that we haven't figured out yet.
 

As far as I'm aware, TVs have always been at 25/30fps depending on the region; it's tied to the frequency of the mains electricity (50Hz vs 60Hz).

24fps was settled on by the movie industry, with an eye on cost, as the minimum frame rate audiences would accept. Old black-and-white films were shot at around 18fps, which is why old Charlie Chaplin clips look slightly sped up when we watch them now.

The 180-degree shutter rule creates motion blur that we find pleasing as an audience, though that's partly a learned acceptance of what looks good.

Interestingly, even though the first Peter Jackson Hobbit movie jarred with most of the audience when shown at its native 48fps, most younger people prefer watching higher frame rates, having grown up playing video games at 60fps.

Most people can't consciously perceive a difference above about 60fps, although reaction tests suggest we can still register far higher rates. Others are extremely sensitive to frame rate, especially neurodivergent people. I personally can see the difference between 24 and 25fps and find it jarring, but that's because I edit video.

The major issue with video games is that the motion blur is never convincing and we have to react to things on screen. You notice it whenever you hit a key or mouse button and the game doesn't respond straight away; it's very clear when you're running at 10fps in a settlement and can barely function.

The interesting thing is, we are really quite good at adapting to even low fps, so long as it's consistent. However, the variability of frame rate in Odyssey means you can't learn to predict when your input will work.
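To illustrate that last point, here's a toy Python sketch. It assumes input is only sampled once per rendered frame (which may or may not match what Odyssey actually does), and the frame-rate sequence is invented: at a steady frame rate the worst-case wait for a key press to register is constant, so your hands can learn it, whereas a fluctuating frame rate makes that wait unpredictable.

```python
import statistics

# Toy model: assume a key press is only picked up on the next rendered frame,
# so the worst-case wait for it to register equals that frame's duration (ms).
steady_30fps = [1000 / 30] * 12                      # ~33.3 ms, every frame
variable_fps = [1000 / f for f in (45, 40, 18, 12, 35, 15, 44, 10, 30, 16, 42, 14)]

for name, frame_times in (("steady 30 fps", steady_30fps),
                          ("variable 10-45 fps", variable_fps)):
    mean = statistics.mean(frame_times)
    spread = statistics.pstdev(frame_times)   # how much the wait jumps around
    print(f"{name:>18}: mean wait {mean:5.1f} ms, spread {spread:5.1f} ms")
```

The spread column is the point: a constant wait, even a longish one, can be compensated for, but a wait that swings between roughly 20ms and 100ms can't be learned.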
 