I've been a console gamer on a very nice 1080p display for years now, but with Black Friday around the corner, I took some time to really observe some nice 4K UHD TVs in person to see what all the hubbub was about.
WOW...
I must say, I am very impressed, though I think it's the HDR that catches my eye even more than the resolution (which is admittedly also nice). However, I noticed something seemed "wrong" while watching the demo video on each of the different 4K displays. After close observation, it hit me: 30 fps actually looks "bad" in 4K!
I never understood the "hate" for 30 fps up to this point, because I usually find 30 fps quite tolerable on my own gaming system. But now that I've seen 30 fps on a higher-resolution display, I'm beginning to understand. It's all about the pixels: for the same on-screen motion, more pixels means the picture "jumps" farther from one frame to the next, so the lower the framerate, the more evident it becomes. The measure is pixels per frame (see the quick sketch below).
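To put rough numbers on that "pixels per frame" idea, here's a back-of-the-envelope sketch. The 2-second pan speed is just a number I picked for illustration, not anything measured:

```python
# Rough illustration: how many pixels an object jumps per frame
# during a pan that crosses the full screen width in 2 seconds.
# (Pan duration is an assumed value, purely for comparison.)

def pixels_per_frame(screen_width_px, pan_duration_s, fps):
    return screen_width_px / (pan_duration_s * fps)

for label, width, fps in [("1080p @ 30 fps", 1920, 30),
                          ("4K    @ 30 fps", 3840, 30),
                          ("4K    @ 60 fps", 3840, 60)]:
    print(f"{label}: {pixels_per_frame(width, 2, fps):.0f} px jump per frame")
```

That prints 32, 64, and 32 px per frame respectively, which matches what I think I was seeing: the same pan at 30 fps jumps twice as many pixels on a 4K panel as on a 1080p one, and going to 60 fps brings it back down.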
I had entertained upgrading my PS4 Slim to a Pro, and ultimately decided to wait for the PS5, and after seeing this "effect", I definitely feel that's the right call for me. If I'm gaming in 4K, I think I'm going to need 60 fps to fully enjoy the experience. Now if the TV has motion smoothing (frame interpolation), that would likely solve the visual "judder" I'm seeing, though my experience with this on 1080p displays is that it adds noticeable delay to the displayed image (push a button, see the result 1/4 second later).
Anyway, I just thought I'd share that and see if anyone else feels the same way. I'm also curious: do people with 4K TVs and UHD Blu-ray movies notice this effect, or does the movie have significant motion blur to compensate? I usually don't like using motion smoothing on movies due to the "soap opera" effect (depends on the movie), but I didn't like the judder I was seeing in the 4K demo videos either.