Do AMD GPUs have superior image quality?

So after watching this and other side-by-side benchmark vids, I'm suddenly not so confident about which gfx card to buy anymore...
I had decided to get a 3080 as soon as I could get my grubby paws on one. But now? Hmmm :unsure:


In most of the games tested, I can see the AMD card outputting a much clearer image. It is especially noticeable in the very detailed games AC: Valhalla and Horizon Zero Dawn.

It's a 4k video, so enable max resolution and fullscreen if you want to take a look. Then pause at times and judge for yourselves.
Clearly the AMD has a much better image, or am I imagining things? 🧐
Look at the ground, grass, the stone works, building textures, wood textures, leaves, etc.
Nvidia has this gray smudge to it. In some places it's shockingly bad compared to AMD.
If you don't see it, then pause at 1:32 and look at the wooden bridge at the bottom of the screen. You almost can't see the wood grain on the 3080 output.
Also at 7:51, the stone works and rocks at the left of the bridge.

Maybe the 3080 is not running any texture filtering, but that doesn't explain the geometric details also being smudgy.
This was run at stock GPU settings and ultra quality in-game.
I've looked at other YouTubers' side-by-side vids and I see the same trend, though not as clearly as here; this video is done at very good quality.
I did google it, and found some old posts sort of explaining that it's caused by NVIDIA's "more effective" compression technology to save bandwidth. That may explain why it stutters less than the AMD... :rolleyes:


But man, that image quality is just so much better in comparison; no amount of ray tracing or DLSS could compensate.
 
I'm very hesitant to make such comparisons based on YouTube videos, or without knowing exactly what settings were applied at the driver, game, capture, and transcoding stages.

In this particular case, the games that stand out as looking better on AMD are AMD-sponsored titles that I would expect to be using CAS by default, which makes it an inexact comparison.

On hand, I have a pair of 1080 Tis and an R9 Fury, and I also have access to an RTX 3080 and a 5700 XT. Using exactly the same settings, I cannot see an IQ difference between any of them.
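If anyone wants to test their own hardware rather than eyeball YouTube, diff two lossless screenshots taken at the same spot with identical settings. A minimal Python/NumPy sketch (the file names are placeholders for whatever you capture):

```python
import numpy as np
from PIL import Image

# Hypothetical captures: lossless PNGs from each card, taken at the
# same in-game spot with identical settings and resolution.
a = np.asarray(Image.open("capture_3080.png").convert("RGB"), dtype=np.float32)
b = np.asarray(Image.open("capture_6900xt.png").convert("RGB"), dtype=np.float32)
assert a.shape == b.shape, "captures must be the same resolution"

diff = np.abs(a - b)
print(f"mean abs difference: {diff.mean():.3f} / 255")
print(f"max  abs difference: {diff.max():.0f} / 255")

# Save an amplified difference map so any mismatch jumps out visually.
Image.fromarray(np.clip(diff * 8, 0, 255).astype(np.uint8)).save("diff_x8.png")
```

Anything a YouTube re-encode adds or removes never survives that kind of check, which is why local captures are the only comparison I'd trust.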

But man, that image quality is just so much better in comparison; no amount of ray tracing or DLSS could compensate.

A simple sharpen filter could, and that's exactly what I think is running in the examples you pointed out.
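For the curious, "simple sharpen filter" means something like this, a minimal grayscale sketch in Python/NumPy; the 3x3 box blur and the `amount` knob are my own illustration, not anything a driver actually ships:

```python
import numpy as np

def sharpen(img: np.ndarray, amount: float = 0.5) -> np.ndarray:
    """Unsharp-mask style sharpen for a grayscale image in [0, 1]:
    boost the difference between each pixel and a blurred copy."""
    h, w = img.shape
    padded = np.pad(img, 1, mode="edge")
    # 3x3 box blur: average the nine shifted views of the padded image.
    blur = sum(
        padded[1 + dy : h + 1 + dy, 1 + dx : w + 1 + dx]
        for dy in (-1, 0, 1)
        for dx in (-1, 0, 1)
    ) / 9.0
    # Add back the high-frequency detail, scaled by `amount`.
    return np.clip(img + amount * (img - blur), 0.0, 1.0)
```

Run that over a frame and you get exactly the "more detail" look people read into these videos, without a single extra polygon being drawn.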
 
I know from 20 years ago, when AMD cards were still ATI, that ATI offered the better image quality on the desktop. :)
 
I know from 20 years ago, when AMD cards were still ATI, that ATI offered the better image quality on the desktop. :)

This is true, but that was because ATI tended to put better RAMDACs on their parts.

But it's all digital now, so if there is a difference in IQ, it can usually be traced to some software setting or optimization, rather than any intrinsic property of the part.
 
In this particular case, the games that stand out as looking better on AMD are AMD-sponsored titles that I would expect to be using CAS by default, which makes it an inexact comparison.

I suppose you could be right.
Apparently there is something called FidelityFX from AMD (it works on both brands, though) that will sharpen the image considerably without causing aliasing.
I suppose it could be on by default for the 6900 here and not the 3080.
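For anyone curious what "sharpen without causing aliasing" means in practice, here is a loose grayscale recreation of the idea behind CAS in Python/NumPy. It's my own simplification for illustration (the weight mapping in particular is invented); AMD's actual shader is published on GPUOpen:

```python
import numpy as np

def cas_like(img: np.ndarray, sharpness: float = 0.5) -> np.ndarray:
    """Loosely CAS-inspired adaptive sharpen for a grayscale image in [0, 1].
    The per-pixel weight shrinks where local contrast is already high,
    so strong edges are sharpened less and don't ring or alias."""
    p = np.pad(img, 1, mode="edge")
    n, s = p[:-2, 1:-1], p[2:, 1:-1]   # neighbors above / below
    w, e = p[1:-1, :-2], p[1:-1, 2:]   # neighbors left / right
    mn = np.minimum.reduce([img, n, s, w, e])
    mx = np.maximum.reduce([img, n, s, w, e])
    # Headroom toward black or white, relative to the local max: flat
    # areas get a large amp, already-contrasty areas get a small one.
    amp = np.sqrt(np.clip(np.minimum(mn, 1.0 - mx) / np.maximum(mx, 1e-6),
                          0.0, 1.0))
    # Map sharpness in [0, 1] to a negative weight on the cross neighbors
    # (this particular mapping is made up for the sketch).
    wgt = amp * (-1.0 / (8.0 - 3.0 * sharpness))
    return np.clip((img + wgt * (n + s + w + e)) / (1.0 + 4.0 * wgt),
                   0.0, 1.0)
```

The adaptive part is the whole trick: a naive sharpen boosts everything equally and halos the strong edges, while this backs off exactly where the contrast is already there.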

 