Stadia: A new paradigm for gaming?

60fps lol.

Does anyone care about 60fps anymore? 144hz is where it's at and has been for a while now. With 240hz screens coming more into the mainstream, 144hz is fast becoming the standard.

And input lag. No thanks.

I personally think 144hz is overrated for the mass market, and the console market seems to think the same. Hell, even 60fps isn't standard yet. MS and Sony both went the higher-resolution route. As long as TVs are mostly 60hz, 144hz and above will remain in the enthusiast segment.

Did you ever try to get a 4k display with 144hz? In case you didn't: they start around 2000 € and up, and let's not even get started on the hardware requirements for 4k@60fps, let alone 4k@144fps. There is no GPU on the market that can deliver that performance.
 
No thank you.
I like to play games on my PC as well as browse, play music and the usual PC stuff. No input lag.
Playing a streamed game means latency which means input lag. No go.

If I want to play a video stream, I'll use Netflix...
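To put some rough numbers on the input-lag concern: a streamed frame has to survive input sampling, a network round trip, encode/decode, and display scanout before you see it. A minimal back-of-the-envelope sketch, with purely illustrative per-stage delays (none of these figures come from Stadia itself):

```python
# Rough cloud-gaming latency budget. Every per-stage number below is an
# illustrative assumption, not a measurement of any real service.

STAGES_MS = {
    "input sampling": 8,          # controller/USB polling
    "uplink to server": 15,       # one-way network delay
    "server render (60fps)": 16.7,
    "video encode": 5,
    "downlink to client": 15,
    "video decode": 5,
    "display scanout (60Hz)": 16.7,
}

def total_latency(stages):
    """Sum the per-stage delays into one end-to-end figure in ms."""
    return sum(stages.values())

if __name__ == "__main__":
    print(f"End-to-end: ~{total_latency(STAGES_MS):.0f} ms")
```

Even with generous assumptions, the stages stack up to well over what a local machine driving the same 60Hz display would add, which is the whole objection in a nutshell.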
 
Does anyone care about 60fps anymore? 144hz is where it's at and has been for a while now.

What a lot of people don't realize is that framerate and pixel count are related. Watching a video in a 320x240 window at 144hz would look no different from watching it at 60hz. Higher framerates matter when moving images are "crossing" more pixels in the same period of time, as on something like a 4K display.

I'm still very happy with my 1080p display, so 60fps is enough for me, at least for a game like Elite Dangerous. TBH, I'd much rather have a game run at 60 FPS with no LOD transition or pop-in than the same game running at 144hz with assets popping in left and right as we fly across a scene.
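The "crossing more pixels" point can be made concrete with a quick calculation: for the same on-screen motion, the per-frame jump between consecutive frames grows with resolution and shrinks with refresh rate. A small sketch (the edge-to-edge crossing time is an arbitrary assumption for illustration):

```python
# How many pixels a moving object jumps between consecutive frames.
# Assumes the object crosses the full screen width in a fixed time;
# that crossing time is an arbitrary illustrative choice.

def pixels_per_frame(width_px, screen_crossing_s, refresh_hz):
    """Pixel step between consecutive frames for an object that
    crosses `width_px` pixels in `screen_crossing_s` seconds."""
    return width_px / (screen_crossing_s * refresh_hz)

for width, label in [(320, "320x240"), (1920, "1080p"), (3840, "4K")]:
    for hz in (60, 144):
        step = pixels_per_frame(width, 1.0, hz)
        print(f"{label} @ {hz}hz: {step:.1f} px/frame")
```

At 320 pixels wide the 60hz-vs-144hz difference is a few pixels per frame; at 4K the same motion jumps tens of pixels per frame at 60hz, which is why the benefit of high refresh rates scales with pixel count.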
 
What a lot of people don't realize is that framerate and pixel count are related. Watching a video in a 320x240 window at 144hz would look no different from watching it at 60hz. Higher framerates matter when moving images are "crossing" more pixels in the same period of time, as on something like a 4K display.

I'm still very happy with my 1080p display, so 60fps is enough for me, at least for a game like Elite Dangerous. TBH, I'd much rather have a game run at 60 FPS with no LOD transition or pop-in than the same game running at 144hz with assets popping in left and right as we fly across a scene.

Lol, those are the words of someone who's never experienced a decent GPU driving a high refresh rate screen.

But anyway, I'm not getting into this. 144hz and above is objectively better than 60hz, provided you have a GPU capable of pushing that frame rate with consistent frametime. Anyone who says otherwise is deluding themselves.
 
Lol, those are the words of someone who's never experienced a decent GPU driving a high refresh rate screen.

But anyway, I'm not getting into this. 144hz and above is objectively better than 60hz, provided you have a GPU capable of pushing that frame rate with consistent frametime. Anyone who says otherwise is deluding themselves.

You are trying to present your personal anecdote as a fact. What kind of games are you playing at 144fps?
 
I wish I could afford a GPU/Screen that could run at 144Hz OR 4K, let alone both... As it is I'll stick to what's possible for me, 60Hz and 2560x1080.
 
Lol, those are the words of someone who's never experienced a decent GPU driving a high refresh rate screen.

But anyway, I'm not getting into this. 144hz and above is objectively better than 60hz, provided you have a GPU capable of pushing that frame rate with consistent frametime. Anyone who says otherwise is deluding themselves.

LOL, those are words of someone who doesn't have a clue what I've experienced IRL. Now that's true delusion. I stand by my original post, and anyone with a shred of understanding of how graphics actually work will likely agree with the basic premise of that post.

But anyway, I'm not getting into this :p
 
You are trying to present your personal anecdote as a fact. What kind of games are you playing at 144fps?

Most games. I have a 2080 driving a 1080p screen with a 5ghz 8700K to back it up, so it's not a problem.

Personally I'd rather trade graphical fidelity for a higher frame rate. I can run Division 2 at Ultra preset and average 100fps, but with a few tweaks to things that have zero discernible difference and it runs about 180fps in most areas, with the occasional drop to 140-150fps.
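The fidelity-for-framerate trade is easiest to see in frametime terms: the per-frame budget the GPU must hit shrinks fast as fps climbs. A quick sketch of the conversion:

```python
# Frame rate vs frametime: going from 100fps to 180fps nearly halves
# the time available to render each frame.

def frametime_ms(fps):
    """Milliseconds available to render one frame at a given fps."""
    return 1000.0 / fps

for fps in (60, 100, 144, 180):
    print(f"{fps} fps -> {frametime_ms(fps):.2f} ms/frame")
```

So dropping a few settings that buy back 4-5 ms per frame is what turns an average of 100fps into 180fps, which is the trade being described above.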
 
LOL, those are words of someone who doesn't have a clue what I've experienced IRL. Now that's true delusion. I stand by my original post, and anyone with a shred of understanding of how graphics actually work will likely agree with the basic premise of that post.

But anyway, I'm not getting into this :p

Timestamped for your convenience. By someone who has far greater technical expertise and experience than you or I.

[video=youtube_share;W3ehmETMOmw]https://youtu.be/W3ehmETMOmw?t=232[/video]
 
Personally I can't stand looking at 1080p anymore after 2 years of higher resolutions. Everything looks too blurry to me. Of course that's just personal experience.
 
Personally I can't stand looking at 1080p anymore after 2 years of higher resolutions. Everything looks too blurry to me. Of course that's just personal experience.

In my experience, it depends on the size of your monitor. I played for many years on a 28" FHD monitor, but when I bought a 32" monitor, FHD was way too "pixelated".
Now I have a 4K 32" and wouldn't go back to FHD. :)
 
DSR is only emulation. Native 2k or 4k is something else entirely. I suggest giving it a try at least before dismissing it.

I have. 144hz at 1440p and above is great. But as soon as the frame rate drops to the point where it's noticeable, ie. below 90-100fps then it's just no bueno for me. I'd much rather take 144fps+ 100% of the time at 1080p over having moments of jankiness at 1440p/4k.
 
Oh yeah, I should have said before that my comments on this are coming from someone who plays a few competitive fps games at a reasonably decent level.

Am I really gonna trade away high refresh rates, take on input lag, and use a cloud gaming service just so I don't have to spend a few thousand quid every few years to continue to compete? Of course not.
 
Personally I can't stand looking at 1080p anymore after 2 years of higher resolutions. Everything looks too blurry to me. Of course that's just personal experience.

As someone else said, screen size and distance from screen play a role. My 1080p display is a high-quality 22" monitor, and it's "good enough" for me. Don't get me wrong, someday I'll upgrade to 4K, mainly because HDR and wide color gamut are what really make 4K compelling.

Now consider this - if I watch a professionally-mastered 1080p BluRay movie, it will look amazingly better* than any video game we have, yet people keep demanding more pixels at higher framerates for our current "gamey" looking games. As I said in my previous post, I'd much rather have a 1080p 60 fps game with no aliasing, pop-in, LOD transitions, limited lighting, or fake FS2004 trees (parks in ED stations, I'm looking at you), etc. over the game we have now running at 4K 144 fps.

Once real-time ray tracing becomes the standard and affordable to average folk like me (perhaps this is where tech like Stadia eventually comes to play), then I might find higher resolutions and framerates more compelling.

* disclaimer - not comparing framerate (panning in 30 fps is rubbish), but rather the image itself.
 
As someone else said, screen size and distance from screen play a role. My 1080p display is a high-quality 22" monitor, and it's "good enough" for me. Don't get me wrong, someday I'll upgrade to 4K, mainly because HDR and wide color gamut are what really make 4K compelling.

Now consider this - if I watch a professionally-mastered 1080p BluRay movie, it will look amazingly better* than any video game we have, yet people keep demanding more pixels at higher framerates for our current "gamey" looking games. As I said in my previous post, I'd much rather have a 1080p 60 fps game with no aliasing, pop-in, LOD transitions, limited lighting, or fake FS2004 trees (parks in ED stations, I'm looking at you), etc. over the game we have now running at 4K 144 fps.

Once real-time ray tracing becomes the standard and affordable to average folk like me (perhaps this is where tech like Stadia eventually comes to play), then I might find higher resolutions and framerates more compelling.

* disclaimer - not comparing framerate (panning in 30 fps is rubbish), but rather the image itself.

That's all well and good, regarding the image quality and HDR and colour gamut etc.

But none of that will win you a clutch last-second peek-a-boo gunfight when it's double OT and you have your rank to play for. A fast display with a GPU driving high frame rates and consistent low frametimes will.*

*player skill notwithstanding, of course. You get my point.
 
Another product/service from the company that loves to cancel products/services?
Also, it's like Google doesn't understand that many places have data caps in place.
Maybe the target market is Hong Kong, Tokyo, other fiber cities?
But whatever, I'm not paying another monthly fee, especially for gaming.
 