AMD FidelityFX CAS vs GFX card overheating

Excuse my ignorance, but isn't that FX CAS thingy only for image sharpening?
It sounds like there is a second, less demanding supersampling solution I'm not aware of
 

The game implements FidelityFX CAS and FSR 1.0. CAS supports scaling, but I'm not sure whether the game uses it in preference to its own supersampling implementation when CAS is enabled. FSR usually implies scaling (spatial in 1.0) and has an optional sharpen pass to clean up some of the blur.

Regardless, it's pretty clear from the results obtained by replacing the TIM (thermal interface material) that the problem was a thermal one, not something intrinsic to any additional load that an extra scaling and/or sharpening pass may have put on the card.
 

rootsrat

Volunteer Moderator
I can see a very clear performance difference in terms of FPS between the engine's own upscaling and upscaling with CAS enabled. The in-game description of that option actually mentions that it offers "optional image scaling".

Based on that, I assume the game actually uses the CAS upscaling rather than the built-in one.
 

Is the performance better or worse with CAS vs. the standard scaling?

The sharpening itself has overhead, so some performance cost for using it is expected. A performance hit when using CAS would make it unclear how much of that came from the sharpening itself versus any scaling that might also be happening. However, if performance is better and you're certain the image is still being scaled, then yes, CAS scaling is the only reasonable explanation.
 

rootsrat

Volunteer Moderator

It's a lot better with CAS in my case. That is the reason I started using it (and hence discovered the overheating issues :D ). I'm not bothered about the sharpening; I just put it to the lowest value.

I need to actually find it in the config file to see whether it's reduced completely to 0, or whether there is some minimal value it still applies at the lowest position of the slider.
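A generic way to do that config-file check is to scan for anything sharpen-related. This is just a sketch; the key name below is made up, since I don't know what the game's config actually calls the setting:

```python
# Hypothetical sketch: scan graphics-config text for any key mentioning
# "sharpen". The real key names used by the game are unknown to me.
import re

def find_sharpen_settings(config_text: str) -> list[str]:
    """Return every line that looks like a sharpen-related setting."""
    return [line.strip()
            for line in config_text.splitlines()
            if re.search(r"sharpen", line, re.IGNORECASE)]

# Example with made-up config content (NOT the game's real keys):
sample = """\
<GraphicsOptions>
  <SSAA>1.5</SSAA>
  <CASSharpenAmount>0.0</CASSharpenAmount>
</GraphicsOptions>
"""
print(find_sharpen_settings(sample))  # -> ['<CASSharpenAmount>0.0</CASSharpenAmount>']
```

If the matching line shows a value above 0 with the slider at minimum, the game still applies some sharpening at the lowest position.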
 
Did a quick test on my main system and I'm unable to produce any difference between the normal and FidelityFX CAS upscalers with the latter's sharpen slider at zero. Performance, power consumption, and image are identical.

Will give it a shot on a few other GPUs later.
 

rootsrat

Volunteer Moderator
I'll test the built-in one later on too; I haven't done that since changing the thermal pads. Not sure what difference it could make, but we'll see.
 
I couldn't gain any frames by enabling CAS, and FSR was not an option as it gives me headaches and looks really blurry.
I did notice that enabling a higher resolution via DSR (in the Nvidia control panel) gave me an almost comically huge performance boost compared to the in-game supersampling.
I don't know how or why, but I'll take it.
 

rootsrat

Volunteer Moderator
FSR is a bit different - for example, if you play at 4K, it will render the image at 1440p and then upscale it back to 4K, according to the quality settings. If you play at low resolutions (like 1080p) it doesn't make sense to use it, as it will render the image at even lower than that, which will make it look blurry.
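The render resolutions behind that are simple arithmetic. A quick sketch using FSR 1.0's published per-axis quality-mode scale factors:

```python
# FSR 1.0 quality-mode scale factors (per axis), as published by AMD.
FSR_SCALE = {
    "Ultra Quality": 1.3,
    "Quality": 1.5,
    "Balanced": 1.7,
    "Performance": 2.0,
}

def fsr_render_resolution(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    """Internal render resolution FSR 1.0 upscales from, for a given output."""
    s = FSR_SCALE[mode]
    return round(out_w / s), round(out_h / s)

# 4K output in Quality mode renders internally at 1440p:
print(fsr_render_resolution(3840, 2160, "Quality"))  # -> (2560, 1440)
# At a 1080p output the internal image drops to 720p, hence the blur:
print(fsr_render_resolution(1920, 1080, "Quality"))  # -> (1280, 720)
```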

DSR is definitely an option and it works at the driver level, which should give the best performance most of the time, I think. But for me it changes my other screen's resolution too, which is annoying, as everything shifts to one side. Hence I use in-game upscaling.

I've not tested it yet unfortunately, but I'll report back once I do.
 
I did notice that enabling a higher resolution via DSR (in the Nvidia control panel) gave me an almost comically huge performance boost compared to the in-game supersampling.

I couldn't reproduce this.

Pretty much any combination of settings that results in the same internal render resolution also results in the same performance in my testing, all other things being equal. 4K DLDSR + no in-game SS, 1440p + 1.5x normal SS, 1440p + 1.5x CAS SS, 5K DSR + 0.75x normal SS... all 3840*2160 render resolution and all the same frame rate, +/- a few fps.
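That equal-load claim is just pixel arithmetic; a quick check using the figures above:

```python
# Verify that each combination lands on the same internal render resolution.
def render_res(base_w: int, base_h: int, ss: float) -> tuple[int, int]:
    """Internal render resolution for a base resolution and supersampling factor."""
    return round(base_w * ss), round(base_h * ss)

combos = {
    "4K DLDSR, no in-game SS": render_res(3840, 2160, 1.0),
    "1440p + 1.5x SS":         render_res(2560, 1440, 1.5),
    "5K DSR + 0.75x SS":       render_res(5120, 2880, 0.75),
}
for name, (w, h) in combos.items():
    print(f"{name}: {w}x{h}")
# All three come out at 3840x2160, so equal GPU load is expected.
```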
 
Sorry for misleading you, I'm a fool.
I compared 1080p + 1.5x SS to 1440p using DSR.
They are not the same resolution 🤪 I guess you can now officially call me stupid.
On the bright side, I now have better performance with almost no noticeable downgrade in visuals, so I'm a happy fool.
 

2880*1620 (1080p w/1.5x SS) vs. 2560*1440 would definitely explain the performance differential.
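For the record, the pixel counts behind that differential:

```python
# Pixel-count comparison of the two settings from the post.
a = 2880 * 1620   # 1080p with 1.5x supersampling
b = 2560 * 1440   # 1440p via DSR
print(a, b, f"{a / b:.2f}x")  # -> 4665600 3686400 1.27x
```

So 1080p + 1.5x SS pushes roughly 27% more pixels than plain 1440p, which lines up with 1440p DSR performing better.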
 