I'd say the reason is primarily marketing, because the market can't cope with the gaming community saying "the tech is good enough as it is now, leave it at that".
However, without doubting any claims, I'd be very interested in a study of the perceived difference between different refresh rates.
No need to go full biology and try to compare our "brain pictures" with frames.
Just take a few hundred test subjects, trained or untrained in the graphics department, and leave them with different refresh rates/monitors at the same quality.
Ask them to describe what they saw. Compile the results. Could be interesting. Maybe it's already been done? I don't know. Link appreciated.
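For what it's worth, here's a minimal sketch of how the bookkeeping for such a study could look, as a blinded two-alternative forced-choice test. Everything in it is an assumption on my part: the 60 Hz / 144 Hz conditions, the trial count, and the stand-in `respond` callables are placeholders, not anything reported in this thread.

```python
import random

# Hypothetical 2AFC trial: each trial shows two otherwise identical setups
# (assumed 60 Hz vs 144 Hz) in random order; the subject picks which felt
# smoother. At chance, accuracy should sit around 50%.

REFRESH_RATES = (60, 144)     # Hz; assumed conditions, not from the thread
TRIALS_PER_SUBJECT = 20

def run_subject(respond):
    """Run one subject through randomized trials.

    `respond` is a stand-in callable: given the ordered pair of refresh rates
    actually shown, it returns the index (0 or 1) the subject picks as
    smoother. In a real study this would be a human response, not a function.
    """
    correct = 0
    for _ in range(TRIALS_PER_SUBJECT):
        order = list(REFRESH_RATES)
        random.shuffle(order)                  # blind the presentation order
        pick = respond(order)
        if order[pick] == max(REFRESH_RATES):  # "correct" = picked the higher rate
            correct += 1
    return correct / TRIALS_PER_SUBJECT

def compile_results(accuracies):
    """Summarize per-subject accuracy; values well above 0.5 suggest a real perceived difference."""
    return {
        "mean_accuracy": sum(accuracies) / len(accuracies),
        "subjects_above_chance": sum(a > 0.5 for a in accuracies),
        "n_subjects": len(accuracies),
    }

if __name__ == "__main__":
    # Simulated responders, only to show the pipeline end to end.
    def guessing(order):      # can't tell the difference
        return random.randrange(2)
    def sensitive(order):     # reliably notices the higher refresh rate
        return order.index(max(order))

    accs = [run_subject(guessing if i % 2 else sensitive) for i in range(200)]
    print(compile_results(accs))
```

A real version would obviously need proper hardware control and statistics, but the point is the same: randomize the order, blind the subject, and compare accuracy against chance rather than relying on self-reports.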
No offence, but a man's word is a single man's word until proven otherwise.
Me too.
All I know is that, on a personal level, the difference is very marked. I couldn't go back to a 60Hz screen for gaming, and I'll gladly sacrifice a little graphics quality to get the smoothness that 144Hz gives.
Everyone I know who's made the transition to high refresh rate screens, from one of the dudes in my gaming group who could easily be a pro at just about any FPS he chooses, through to people who only play Minecraft, has said exactly the same thing.
E-sports pros use 144Hz+ screens because it makes them objectively better at faster-paced FPS games.
So yeah, a person's word is but a single person's word, until a whole truck-load of single persons experience and report the same thing, as is the case here.