I don't know whether this is true (there seems to be some debate), but even if it is, it's not an argument for limiting the framerate or refresh rate of a digital computer monitor. Our eyes are analog, so even if they were limited to 60 "frames" per second, that's not at all equivalent to the refresh rate of an LCD display. For example, look at a bright scene and then close your eyes: you'll see a fading imprint of that scene in the blackness as your rods and cones "desaturate" over time. This gives our eyes a built-in motion blur, smoothing out how we perceive high-speed motion. The "pixels" in our eye also overlap, unlike the sharp grid of a computer monitor.
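To make that "fading imprint" idea concrete, here's a toy sketch (purely illustrative, not physiology, with a made-up decay constant) that treats persistence of vision as exponential decay: a bright scene keeps glowing in the response for a while after the light stops, which is the built-in motion blur I'm describing.

```python
import math

# Toy model (not physiology): treat the eye's "fading imprint" as an
# exponential decay that blends each new light sample with what came before.
# TAU is a made-up illustrative time constant, not a measured property of the eye.
TAU = 0.1        # assumed persistence time constant, seconds
DT = 1 / 240     # sampling step: the "scene" sampled at 240 Hz

def perceived(stimulus, tau=TAU, dt=DT):
    """Blend brightness samples the way a slowly 'desaturating' sensor would:
    old light fades away gradually, new light mixes in."""
    decay = math.exp(-dt / tau)
    level = 0.0
    out = []
    for s in stimulus:
        level = level * decay + s * (1 - decay)
        out.append(level)
    return out

# Look at a bright scene for 0.1 s, then "close your eyes" for 0.2 s:
# the response doesn't drop to zero instantly, it fades out gradually,
# and that lingering response is what smears fast motion in our perception.
stimulus = [1.0] * 24 + [0.0] * 48
response = perceived(stimulus)
for i in (0, 23, 24, 36, 48, 71):
    print(f"{i * DT * 1000:6.1f} ms  stimulus={stimulus[i]:.0f}  perceived={response[i]:.3f}")
```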
If a real-life object crosses our vision faster than our eye can process, that object still transitions smoothly across our view, both literally and perceptually. Contrast this with a digital LCD screen. An object crossing the screen faster than the screen can refresh will literally "jump" (the Picard Maneuver) from one location to another, skipping multiple pixels. You can see this clearly and painfully when a TV show pans across a scene at 30 fps - the stuttering is palpable. Even at 60 fps it can be noticeable. This is why many modern TVs run at 120 or 240 Hz and use interpolation to draw the frames in between, to eliminate that pixel-jumping.

It's also why older CRT displays didn't suffer as much from this stuttering at lower framerates: the phosphor itself was analog in nature, slowly fading like our rods and cones. The lower resolution helps too (the more pixels you have, the more of them get skipped between frames at a low fps). And it's why analog film could get away with 24 fps: each frame carries built-in motion blur that bridges the gap to the next.
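To put rough numbers on that pixel-jumping, here's a quick Python sketch. The pan speed is an assumption I made up (half a screen width per second); the point is just the proportions: doubling the refresh rate halves the jump, and doubling the resolution doubles the number of pixels skipped per frame.

```python
# Back-of-the-envelope numbers for the pixel-jumping above.  The pan speed
# (half a screen width per second) is an arbitrary illustrative value;
# only the proportions matter.
PAN_SPEED = 0.5  # assumed pan speed, in screen widths per second

def pixels_jumped_per_frame(width_px, fps, pan_speed=PAN_SPEED):
    """How far (in pixels) the object lands from its previous position each frame."""
    return width_px * pan_speed / fps

for width_px in (640, 1920, 3840):          # roughly CRT-era, 1080p, 4K widths
    for fps in (24, 30, 60, 120, 240):
        jump = pixels_jumped_per_frame(width_px, fps)
        print(f"{width_px:4d} px wide @ {fps:3d} fps -> jumps {jump:6.1f} px per frame")
```

With those assumed numbers, a 1080p pan jumps 32 pixels every frame at 30 fps but only 4 pixels at 240 Hz, which is exactly the gap that interpolation is trying to paper over.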