AMD Radeon Settings

Dear fellow AMD Radeon users,

Just wondering if anyone has found that altering the default Radeon Settings profile improves either graphics quality or frame rate (particularly on planets)?

Thinking especially about settings like Texture Filtering Quality, Surface Format Optimization, Tessellation Mode, etc. To be honest, I don't fully understand what some of these terms mean and haven't had the time to mess around with them yet, so I'd be interested to hear if anybody has changed them and noticed improvements.

 
Did you notice much difference from the older CCC drivers to the new ones for AMD in ED? I've not yet upgraded, since my GPU is older than most this is pitched at (7970).
 
For ED

Set anisotropic filtering to 16x and leave the rest as is, except Frame Rate Target Control: move that all the way to the left, as it's just a bit of fluff. In game you can select frame rate or v-sync, which works better ;)

Under the 'More' option you may want to enable scaling depending on what res you want to render in compared to the output.
 
Why use v sync when you can use FRC and get the same without the drawbacks?
 

VSYNC and FRC don't work the same way. FRC is there to prevent the video card from overburdening itself by rendering useless frames. If your monitor is only capable of displaying a max of 60 fps, your video card producing 70 (or 1000, for that matter) won't improve your experience, so why waste the power?
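The frame-cap idea is easy to sketch. Here's a minimal, hypothetical limiter in Python (the function names are made up for illustration, not AMD's API): given a target framerate and how long a frame took to render, it computes how long the card should idle before starting the next frame.

```python
def frame_budget(target_fps):
    """Seconds available per frame at a given cap, e.g. an FRTC target."""
    return 1.0 / target_fps

def idle_time(target_fps, render_seconds):
    """How long a limiter should sleep after a frame, so the card never
    renders frames the monitor can't display."""
    return max(0.0, frame_budget(target_fps) - render_seconds)

# At a 60 fps cap each frame has ~16.7 ms of budget; a frame rendered
# in 10 ms leaves ~6.7 ms for the GPU to idle instead of burning power
# on frames that would never be shown.
```

Unlike v-sync, nothing here waits for the display's refresh, which is why a cap alone doesn't stop tearing.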
 
Did you notice much difference from the older CCC drivers to the new ones for AMD in ED? I've not yet upgraded, since my GPU is older than most this is pitched at (7970).

You should be fine; it is pretty much the same as the R9 280X, which is the "new" (rebranded) 7970.
 
VSYNC and FRC don't work the same way. FRC is there to prevent the video card from overburdening itself by rendering useless frames. If your monitor is only capable of displaying a max of 60 fps, your video card producing 70 (or 1000, for that matter) won't improve your experience, so why waste the power?
I know they don't work the same way, and that's my very point. You can use FRC to limit your FPS to your refresh rate without v-sync's drawbacks. That's why I found the suggestion of moving the bar all the way to the left slightly silly.
 
Did you notice much difference from the older CCC drivers to the new ones for AMD in ED? I've not yet upgraded, since my GPU is older than most this is pitched at (7970).

The release of Crimson coincided with me installing a new card, so I didn't really get enough play time to see what difference there was. Having said that, I did install CCC 15.7.1 for a few days to troubleshoot an issue and didn't notice a single difference. Others have reported slight FPS increases; I suspect this is more noticeable on the less powerful cards.

For ED

Set anisotropic filtering to 16x and leave the rest as is, except Frame Rate Target Control: move that all the way to the left, as it's just a bit of fluff. In game you can select frame rate or v-sync, which works better ;)

Thanks, will try AF to 16. What does it do LOL? They really need to add the help to Radeon Settings (you can get the old CCC Help pages via 'Additional Settings' but would be nice if it was integrated).

The FRTC is set to 74 because I have a Freesync monitor with a max refresh rate of 75, so this ensures that Freesync is always on :) This is the only setting where I actually know what it does :)
 
I know they don't work the same way, and that's my very point. You can use FRC to limit your FPS to your refresh rate without v-sync's drawbacks. That's why I found the suggestion of moving the bar all the way to the left slightly silly.

Ok, so VSYNC's goal is to prevent 'tearing' by throttling your GPU to 30 fps if it can't handle 60 fps, until it can hit 60 fps again. FRC's job is to try to hit the target number that is set. So you set 60 on the FRC and it will try to hit that number consistently. However, 65 fps or 55 fps, for example, will still cause tearing to occur.

This is my understanding and experience with the functions, if I'm wrong, please correct me!
 
The release of Crimson coincided with me installing a new card, so I didn't really get enough play time to see what difference there was. Having said that, I did install CCC 15.7.1 for a few days to troubleshoot an issue and didn't notice a single difference. Others have reported slight FPS increases; I suspect this is more noticeable on the less powerful cards.

I thought this might be the case, since the Crimson devs really aren't going to be that interested in improvements for a 4-year-old GPU. Having said that, ED runs fine at 1920x1080 at over 60 fps, and apart from some annoying coil whine and loud fan noise on planetary landings, it is well suited to my needs.

I'll wait for Polaris and the updated GPUs in the summer, then install Crimson, thanks.
 
FRTC is not a bit of fluff and can significantly improve your performance in some games. If you like a consistent 60 fps, set it to 65; your framerate and frametimes will improve. Assuming you already get a solid 60, it'll reduce the workload.

Test the higher-quality texture filtering setting and see if you notice a difference in performance. On some cards you won't notice it; others will drop a few FPS.
 
Texture Filtering Quality: formerly "Catalyst AI". Dynamically changes the texture filtering settings to reduce texture lookups and improve performance. May reduce quality on some textures, but should be unnoticeable in motion. 'High quality' leaves filtering unchanged, 'Standard' applies light optimization, and 'Low quality' applies aggressive optimization.

Surface Format Optimization: also part of Catalyst AI. Changes the pixel format of render targets to reduce fillrate requirements; for example, it may change an fp32 target to fp16, or an fp16 to rgb, when that level of precision isn't needed. Can subtly change the look of the game.

Tessellation Mode: Either use the game settings, specify a maximum tessellation level manually, or let the driver decide on a maximum depending on the game.

I don't know if the first two make any difference on a modern card, but the result is usually invisible, so you might as well leave them on. AMD is weak at tessellation, so you should definitely leave that override on, or even turn tessellation off entirely. Sometimes turning optimizations off can actually help things run better or more correctly, but that's almost never the case.
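The fp32-to-fp16 swap mentioned above has a measurable precision cost, which can be illustrated with a few lines of Python/NumPy (this is just an illustration of the number formats, not AMD's actual code path):

```python
import numpy as np

# 1/3 has no exact binary representation, so every float format rounds it;
# compare how much error each render-target precision introduces.
x = 1.0 / 3.0
err32 = abs(float(np.float32(x)) - x)   # typical full-precision target
err16 = abs(float(np.float16(x)) - x)   # the half-precision swap-in

# fp16 keeps ~11 significand bits vs fp32's 24, so its rounding error is
# roughly four orders of magnitude larger -- yet usually still invisible
# for colour data, which is why the driver can get away with the swap.
```

When a render target only ever feeds an 8-bit-per-channel display output, that extra error is typically below what the screen can show.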
 
Ok, so VSYNC's goal is to prevent 'tearing' by throttling your GPU to 30 fps if it can't handle 60 fps, until it can hit 60 fps again. FRC's job is to try to hit the target number that is set. So you set 60 on the FRC and it will try to hit that number consistently. However, 65 fps or 55 fps, for example, will still cause tearing to occur.

This is my understanding and experience with the functions, if I'm wrong, please correct me!

Tearing is completely independent of framerate. 60 fps can tear on a 60 Hz display. Actually, that's not completely true: a closely matched framerate might tear more visibly, because the tear line doesn't move around the screen so much.

It happens because the GPU started drawing the frame at a different time from when it was displayed. Say a frame is drawn at T=0.5, the screen presents at T=1, and the next frame is drawn at T=1.5; you can see where this is going: even though the framerates are matched, the GPU is out of sync with the display.

V-sync forces the GPU to wait for the present before swapping in a new frame. If a frame isn't ready in time, it has to wait for the next present, which means a new frame can only appear every refresh, every second refresh, every third refresh, and so on. That's why you'll drop straight from 60 to 30 fps. Usually it will alternate rapidly, showing frames 1, 2, 2, 4, 5, 5, 7, 8, 8, etc.

Triple buffering solves most of those problems but elite doesn't support it, sadly.
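That "every refresh, every second refresh, every third refresh" behaviour means double-buffered v-sync can only land on refresh/1, refresh/2, refresh/3, and so on. A quick sketch of the quantization (plain Python, simplified: it ignores queueing and assumes a constant render time):

```python
import math

def vsync_fps(refresh_hz, render_ms):
    """Effective framerate under double-buffered v-sync: a finished
    frame must wait for the next refresh, so the only reachable rates
    are refresh/1, refresh/2, refresh/3, ..."""
    interval_ms = 1000.0 / refresh_hz
    refreshes_per_frame = math.ceil(render_ms / interval_ms)
    return refresh_hz / refreshes_per_frame

# On a 60 Hz panel a 16 ms frame still makes every refresh (60 fps),
# but an 18 ms frame misses one and drops straight to 30 fps;
# there is nothing in between.
```

In practice the render time fluctuates around the refresh interval, which is exactly the rapid alternation between 60 and 30 fps described above.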
 

Vsync usually does frame drops in factors of 60. I find it's better to deal with the occasional screen tearing than abysmal frame drops.
 

vsync has nothing to do with the number 60, and everything to do with the refresh rate of your monitor.
 

Eh... No... The way many devs implement v-sync is to have it work in increments. If your machine can't handle a steady 120 fps (say you're hovering around 105), it'll lock your framerate to 60 fps; if you're having problems maintaining 60 fps, it'll lock you down to 30 fps. That's how it maintains a consistent refresh rate and prevents screen tearing: the GPU now has a consistent frame-time target, instead of erratic frametimes from a constantly shifting workload as it tries to work as fast as possible.
 

120 is a multiple of 60; v-sync has nothing to do with the number 60 and everything to do with your 120 Hz monitor.
 
I turned on v-sync to deal with tearing issues, but that's about it. All it really does is prevent the graphics card from sending a new image to the monitor before it's ready to refresh, because when that happens the monitor stops drawing the current image and starts drawing the new one half-way down the screen, leading to tearing. (Simplified, I know, but that's basically it.)

I have discovered that FXAA offers me the best anti-aliasing performance (odd, I know, as it isn't an AMD technology) and that can be set in game.

To be honest, these settings don't seem to make a massive difference to Elite. Or to most other games I've tried them with.
 
FXAA is just a fullscreen shader, I don't know why it would run worse on AMD, especially since AMD typically has more FLOPS to play with.

I wish they would add SMAA T2x, or another temporal filter because those improve quality a lot.
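"Just a fullscreen shader" can be made concrete with a toy, FXAA-flavoured pass (Python/NumPy for readability; the threshold and box blend are gross simplifications, nothing like the real FXAA kernel): estimate luma, find high-contrast pixels, and blend them toward their neighbours.

```python
import numpy as np

def toy_fxaa(rgb, threshold=0.1):
    """rgb: float array of shape (H, W, 3) with values in [0, 1].
    Returns a copy with high-contrast pixels blended toward their
    4-neighbour average (image borders are left untouched)."""
    luma = rgb @ np.array([0.299, 0.587, 0.114])   # perceived brightness
    out = rgb.copy()
    for y in range(1, rgb.shape[0] - 1):
        for x in range(1, rgb.shape[1] - 1):
            vals = [luma[y, x], luma[y - 1, x], luma[y + 1, x],
                    luma[y, x - 1], luma[y, x + 1]]
            if max(vals) - min(vals) > threshold:      # an "edge" pixel
                avg = (rgb[y - 1, x] + rgb[y + 1, x] +
                       rgb[y, x - 1] + rgb[y, x + 1]) / 4.0
                out[y, x] = 0.5 * (rgb[y, x] + avg)    # soften the edge
    return out
```

Because it's one cheap pass over the final image with no extra geometry samples, this style of AA costs about the same on any vendor's hardware, which is the point being made above.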
 