Interestingly, 24fps is accepted as the lowest frame rate that the eye can turn into any convincing movement. It was the original refresh rate for TVs (23.976 for NTSC) back in the day.
As far as I'm aware, TVs have always been at 25/30fps depending on the region; it's to do with the frequency of the mains electricity (50Hz vs 60Hz).
24fps was decided by the movie industry as the minimum frame rate acceptable to audiences, a choice made partly with an eye on cost (more frames means more film stock). Old black-and-white films were shot at around 18fps, which is why old Charlie Chaplin clips look slightly sped up when we watch them now.
The 180-degree shutter rule creates motion blur that we find pleasing as an audience. However, this is partly a learned acceptance of what looks good.
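For anyone curious about the numbers: the 180-degree shutter rule just means the shutter is open for half of each frame interval, so the amount of blur per frame scales with the frame rate. A minimal sketch in Python, with purely illustrative values:

```python
# Sketch of the 180-degree shutter rule: exposure time per frame is
# (shutter_angle / 360) of the frame interval, i.e. half of it at 180 degrees.

def exposure_time(fps: float, shutter_angle: float = 180.0) -> float:
    """Exposure time per frame, in seconds."""
    frame_interval = 1.0 / fps
    return frame_interval * (shutter_angle / 360.0)

print(exposure_time(24))  # ~0.0208 s (1/48) -> the classic "filmic" amount of blur
print(exposure_time(60))  # ~0.0083 s (1/120) -> noticeably less blur per frame
```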
Interestingly, although the first Peter Jackson Hobbit movie jarred with much of its audience when shown at its native 48fps, most younger people prefer watching 60fps, having grown up playing video games at that rate.
Most people can't perceive a difference between frame rates above 60fps, although they can still react to stimuli at up to 1000fps. Others are extremely sensitive to frame rate, especially neurodivergent people. I personally can see the difference between 24 and 25fps and find it jarring, but that's because I edit video.
The major issues with video games are that the motion blur is never convincing and that we have to react to things on the screen. It's noticeable whenever we hit a key or mouse button and the game doesn't respond straight away; it's very clear when running at 10fps in a settlement, where you can barely function.
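To put rough numbers on that (a toy sketch, not a model of any actual game engine; real pipelines add more latency on top):

```python
# Toy illustration: at low fps each frame lasts a long time, so an input that
# lands just after a frame begins may not be visible until the next frame.
# The "~2 frames" worst case below is a deliberate simplification.

def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

for fps in (144, 60, 30, 10):
    ft = frame_time_ms(fps)
    print(f"{fps:>3} fps: {ft:6.1f} ms per frame, "
          f"worst case ~{2 * ft:6.1f} ms before your click shows on screen")
```

At 10fps that's on the order of a couple of hundred milliseconds before anything visibly responds, which lines up with barely being able to function.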
The interesting thing is that we are actually quite good at adapting to even a low fps, so long as it's consistent. However, the variable frame rate in Odyssey means you can't learn to predict when your input will register.
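Just to illustrate what consistency buys you (toy numbers, not measurements from Odyssey): two frame-time sequences with the same average fps, one steady and one spiky. You can learn to anticipate the steady one; the spiky one keeps moving the goalposts.

```python
import statistics

# Toy comparison: both sequences average ~20 fps, but one has constant frame
# times while the other swings wildly, so the input-to-response delay keeps changing.

steady = [50.0] * 8                                       # constant 50 ms frames
spiky = [20.0, 90.0, 25.0, 95.0, 20.0, 90.0, 25.0, 35.0]  # also averages 50 ms

for name, frame_times in (("steady", steady), ("spiky", spiky)):
    avg_fps = 1000.0 / statistics.mean(frame_times)
    jitter = statistics.pstdev(frame_times)
    print(f"{name}: ~{avg_fps:.0f} fps average, frame-time jitter {jitter:.0f} ms")
```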