Well… 20 fps might be fine for us old fogies, with our poor reflexes and lower standards. But what about those young-uns, with their lightning-fast reflexes? They need 120+ fps just to be competitive! 8 ms frame times, not 50 ms!
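For anyone who wants the arithmetic behind those numbers: frame time is just 1000 / fps. A quick Python sketch:

```python
# Frame time in milliseconds for a given frame rate.
def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

for fps in (20, 30, 60, 120, 144):
    print(f"{fps:>3} fps -> {frame_time_ms(fps):5.1f} ms per frame")
# 20 fps is 50 ms per frame; 120 fps is ~8.3 ms.
```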
I'm an old fogie, and 20 fps ain't nowhere near fine for me. That said, not all fps are created equal. For example, I have MSFS locked at 30 fps, but thanks to an amazing motion blur algorithm and temporal antialiasing, it looks just as smooth as most 60 fps games I own. RDR2 uses similar technology on PS4, making it look better than some of my PC games. Now Elite on PS4, with its 20-30 fps, looks quite rubbish (and I'm focusing specifically on frame pacing and smoothness of motion) compared to my PC, which I lock at 60 fps. But in Elite's defense, that's the case with most games that don't provide some sort of motion blur or frame interpolation.
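Frame pacing is really about how consistent the frame times are, not just the average. Here's a little sketch (the timestamps are made-up, purely to illustrate) of why two "30 fps" captures can feel completely different:

```python
import statistics

# Two hypothetical frame-presentation traces, both averaging 30 fps.
# The first delivers a frame every 33.3 ms; the second alternates
# fast and slow frames, which is what bad frame pacing looks like.
even_paced = [i / 30 for i in range(31)]

uneven_paced = [0.0]
for i in range(30):
    step = (1 / 45) if i % 2 == 0 else (1 / 22.5)  # 22.2 ms / 44.4 ms
    uneven_paced.append(uneven_paced[-1] + step)

def pacing(times):
    deltas = [b - a for a, b in zip(times, times[1:])]
    avg_fps = len(deltas) / (times[-1] - times[0])
    jitter_ms = statistics.stdev(d * 1000 for d in deltas)
    return avg_fps, jitter_ms

for name, trace in (("even", even_paced), ("uneven", uneven_paced)):
    avg_fps, jitter_ms = pacing(trace)
    print(f"{name:>6}: {avg_fps:.1f} fps average, {jitter_ms:.1f} ms jitter")
```

Both traces would show 30 in an fps counter, but only the first feels smooth, and that gap is roughly the complaint about Elite's frame pacing above.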
But I guess I am an old fogie on some levels, because 60 fps* is just fine for me. I have a 144 Hz monitor, but I run it at 60 Hz because I just don't see that much of a difference for the games I play, at least not compared to the very noticeable difference between raw 30 and raw 60, and 60 fps runs way cooler and more consistently than 144 does for me.
* NOTE - Most of the time I'm playing on a small 15-inch laptop screen. I've noticed that when I throw a game up on the big TV, I'm much more sensitive to framerate. The physical distance the image skips between frames makes a big difference IMO, and on a big screen you need a much higher fps before that skipping drops below the threshold where you stop noticing it.
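To put rough numbers on that last point: for something panning across the screen, the distance it jumps each frame scales with screen width and shrinks with fps. A back-of-the-envelope sketch (the 2-second pan and the approximate 16:9 panel widths are my assumptions):

```python
# How far a moving image jumps per frame, for an object that pans
# across the full screen width in a fixed time (2 s assumed here).
def jump_per_frame_cm(screen_width_cm: float, fps: float,
                      pan_seconds: float = 2.0) -> float:
    speed = screen_width_cm / pan_seconds  # cm per second
    return speed / fps                     # cm per frame

screens = (("15-inch laptop", 33.0), ("65-inch TV", 144.0))  # approx widths
for label, width_cm in screens:
    for fps in (20, 30, 60):
        print(f"{label} @ {fps} fps: "
              f"{jump_per_frame_cm(width_cm, fps):.2f} cm per frame")
```

At 20 fps the same pan skips about 0.8 cm per frame on the laptop but about 3.6 cm on the TV, which lines up with low framerates being far more noticeable on the big screen.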