Nvidia DSR / DSR-DL vs AMD VSR vs in-game SS in Elite Dangerous?

My experience with DSR and DSR-DL is that DSR-DL isn’t really better or faster than DSR in Elite Dangerous. I don’t have a capture card, so it’s difficult to really compare. I find that the in-game SS is slower than DSR/DSR-DL.

Is AMD VSR as good as or better than Nvidia DSR/DSR-DL?

I use an RTX 2060 Super on a 27” 1080p screen with DSR-DL 1.78 today. My BenQ XL2720Z monitor has been half broken for several years now (blurry picture at higher framerates and a horizontal line) and I’m planning to upgrade.

I tried a 32” 4K QD-OLED and it’s really nice, but it needs as much supersampling as my old 1080p monitor (I want at least DSR 2.25) and I don’t want to buy an RTX 5090 to run Elite Dangerous at DSR 4.0. Higher resolution should reduce AA problems, but I think the sharper image of the OLED is less forgiving than my old blurry 1080p monitor.

My current plan:
27” 1440p 240Hz QD-OLED
RTX 5070 Ti or RX 9070 XT

I assume that the 9070 XT is faster than the RTX 5070 Ti since there is no ray tracing in Elite?
I understand that the 5000-series cards have a lot of problems with drivers. Does that include Elite Dangerous as well?

I also play Battlefield 4 and Battlefield 2042 and a bunch of other games as well.

Cheapest models in Sweden (2025-05-18):
5070 Ti: ~920 EUR
9070 XT: ~780 EUR

My setup:
GPU: 2060S
CPU: 7600X
Mem: 32GB
Win: 10

The reason I’m asking is that I have a hard time choosing between the 9070 XT and the 5070 Ti.

Note:
Screenshots on NVIDIA cards only show the rendered picture, not the downscaled picture that is sent to the monitor.
 
I find that the in-game SS is slower than DSR /DSR-DL.

There does seem to be slightly less overhead with DSR, but the performance difference vs. the in-game supersampling should be very small at the same internal resolution.

Keep in mind that the in-game SS resolutions are multipliers of the linear dimensions, while DSR demarcates its settings in multiples of total pixel area. 1.78x DSR is the equivalent of 1.33x in-game supersampling (which needs to be set in the config file because it's not a preset), 2.25x DSR is 1.5x in-game SS, and 4.0x DSR is 2.0x in-game SS.
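To make the mapping concrete, here is a small sketch (Python, assuming a 2560x1440 native resolution purely for illustration) that converts a DSR/VSR pixel-area factor into the equivalent linear in-game SS multiplier. NVIDIA rounds the actual DSR resolutions slightly, so the figures can be off by a pixel or two:

import math

native = (2560, 1440)  # assumed native resolution, illustration only

def dsr_to_ss(dsr_area_factor):
    # DSR/VSR factors multiply the total pixel count; in-game SS multiplies width and height,
    # so the equivalent SS value is the square root of the DSR factor.
    return math.sqrt(dsr_area_factor)

for dsr in (1.78, 2.25, 4.00):
    ss = dsr_to_ss(dsr)
    w, h = round(native[0] * ss), round(native[1] * ss)
    print(f"DSR {dsr:.2f}x ~= in-game SS {ss:.2f}x -> {w}x{h} internal")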

Is AMD VSR as good as or better than Nvidia DSR/DSR-DL?

I used to use VSR with Elite: Dangerous on my AMD cards (from Hawaii to RDNA2) and I want to say it produces a slightly softer image than DSR, but that is heavily dependent on the smoothing settings used with DSR and the precise render multiplier.


For the best anti-aliasing, short of resorting to third-party injectors with settings so aggressive that details are destroyed, I recommend using the in-game FXAA (or SMAA, depending on taste), plus the in-game 'supersampling' (at below target resolution) and DSR/VSR. There is some extra overhead to this, but it doesn't seem to harm latency much, and you get two scaling and three filtering passes to soften aliasing. It's not remotely perfect, but it's still subjectively the best balance I've been able to manage between reducing jaggies and blurring.
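As a rough sketch of the arithmetic behind that combination (just the geometry, not anything the game actually exposes), the internal render resolution is the native resolution scaled by the square root of the DSR/VSR factor and then by the in-game SS multiplier:

import math

def internal_resolution(native_w, native_h, dsr_area_factor, ingame_ss):
    # DSR/VSR contributes sqrt(area factor) per axis; in-game SS is applied linearly on top.
    linear = math.sqrt(dsr_area_factor) * ingame_ss
    return round(native_w * linear), round(native_h * linear)

# Hypothetical example: 1440p native, 2.25x DSR/VSR, in-game SS at 0.75
print(internal_resolution(2560, 1440, 2.25, 0.75))  # -> (2880, 1620)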

I haven't had major problems with either AMD or NVIDIA drivers in ED in a very long time, but I don't have any RDNA4 or Blackwell cards (my fastest AMD card is a 6900 XT and my fastest NVIDIA card is an RTX 4090). I also tend to modify my drivers and use settings that preempt many problems.

Anyway, with that gap in price, the RX 9070 XT is the obvious choice. ED doesn't use any proprietary NVIDIA features and doesn't have RT at all.

I'd go 9070XT for the power connectors alone...

The 5070 Ti doesn't pull enough current to make the power connector an issue.

4090s and 5090s melt connectors often enough that one should measure the per-wire current to make sure it's balanced, but power consumption and current draw rapidly fall off as you go down the stack. There are still examples of failures, but the odds of it happening in this segment are infinitesimal. These cards sell in significantly greater quantities than the higher-end parts, while experiencing a tiny fraction of total connector/cable failures.
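A back-of-the-envelope check of that claim, assuming the card's full board power is drawn through the 16-pin connector's six 12 V wires and shared perfectly evenly (the ~300 W / ~575 W board-power figures and the ~9.5 A per-pin rating are approximate):

def per_pin_current(board_power_w, pins=6, volts=12.0):
    # Idealized case: total 12 V current split evenly across the current-carrying pins.
    return board_power_w / volts / pins

for name, watts in (("RTX 5070 Ti (~300 W)", 300), ("RTX 5090 (~575 W)", 575)):
    print(f"{name}: ~{per_pin_current(watts):.1f} A per pin")  # vs. a ~9.5 A pin rating

At roughly 4 A per pin, a 5070 Ti has plenty of headroom even with some imbalance, while a 5090 sits much closer to the rating if any one wire carries more than its share.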
 
Thank you for your very thorough post on this topic.
I know that DSR & VSR settings are a multiplication of the number of pixels and that SS settings are a multiplication of the horizontal and vertical resolution by a factor. Easy to miss, and it’s good that you pointed it out for us.

I read that you cannot change the smoothing in VSR. I use 33% with DSR (I think that is the default setting). I tried 66% but ended up going back to 33%. I didn’t notice much difference between 33% and 66%, but 33% was a little sharper.

I use SMAA with DSR. I will test FXAA as well. Thanks for the tip.

I’m not quite sure I understand the other part. Is this correct with my current 1080p monitor?

The new connector shouldn’t be a problem with a 5070 Ti, but it means an extra adapter since my PSU doesn’t have that connector (I’m sure an adapter comes with the card).
 

Attachments: Morbad AA.png, Morbad AA.xls
I’ve tested Morbad’s suggestion of combining SS below 1.0 + DSR + FXAA instead of DSR + SMAA.

I think that SMAA is slightly better AA than FXAA, but FXAA is ~3% faster.
With SS 0.75 + DSR 3.00, AA quality is on par with DSR 1.78 but it’s ~3% faster.

I’m now using SS 0.75 + DSR 3.00 + FXAA instead of DSR 1.78 + SMAA. I think the AA quality is about the same and I get ~7% more FPS. The rendered resolution is almost the same as before.
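A rough check of why the frame rates end up so close (sketch only; NVIDIA's actual DSR resolutions are rounded a little differently):

import math

def rendered_pixels(w, h, dsr_area_factor, ingame_ss=1.0):
    linear = math.sqrt(dsr_area_factor) * ingame_ss
    return round(w * linear) * round(h * linear)

old = rendered_pixels(1920, 1080, 1.78)        # DSR 1.78x, SS 1.0
new = rendered_pixels(1920, 1080, 3.00, 0.75)  # DSR 3.00x, SS 0.75
print(old, new, f"{1 - new / old:.0%} fewer pixels")  # roughly 5% fewer pixels with SS 0.75 + DSR 3.00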

This is not a silver bullet that fixes AA. At least not on a 27” 1080p monitor.

Thanks Morbad!

Note:
It is very difficult to compare different settings for AA quality.
27" 1080p probably has too low a pixel density to really fix the AA problem (see the quick pixel-density comparison below).
I did the tests in the ”Docking and Travel” training scenario.
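For reference, a quick pixel-density comparison of the panels mentioned in the thread (plain geometry, nothing assumed beyond the stated sizes and resolutions):

import math

def ppi(width_px, height_px, diagonal_in):
    # Pixels per inch along the diagonal.
    return math.hypot(width_px, height_px) / diagonal_in

for label, w, h, d in [("27in 1080p", 1920, 1080, 27),
                       ("27in 1440p", 2560, 1440, 27),
                       ("32in 4K", 3840, 2160, 32)]:
    print(f"{label}: {ppi(w, h, d):.0f} PPI")  # ~82, ~109, ~138 PPI respectively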
 

Attachments: Morbad AA results.png, Morbad AA.xls
FXAA is softer and does a marginally better job obscuring the aliasing on specular highlights, which is where the game needs antialiasing the most. That softness is the main downside as well, because it doesn't just affect edge geometry. At lower resolutions I tend to use SMAA, but at higher resolutions where the sharpness impact isn't as obvious I tend to use FXAA.

As subtle as SMAA and FXAA are, leaving them off doesn't save enough performance to be worthwhile, unless one is injecting third-party AA shaders. I consider the in-game MLAA options to be useless, as they are clearly inferior to SMAA in sharpness and to FXAA in antialiasing for just as much, if not more, of a performance hit.
 
That sounds about right.

I found that MLAA X2 and MLAA X4 weren’t noticeably better than no AA, but they were a little faster than FXAA and slower than no AA. MLAA X2 and MLAA X4 are quite useless.
It will be interesting to see the results with a faster GPU and a 27” 1440p monitor (when I actually decide to upgrade).
 

rootsrat

Volunteer Moderator
Personally I would ditch 4K and go 2K, but with high-quality HDR. Supersampling from 2K to 4K costs a lot less than driving native 4K, and HDR makes a big visual difference.
I run an RTX 5080 at 2K native resolution, DSR it to 4K and then apply 1.5x in-game supersampling. Works flawlessly and the picture quality is amazing.
 
If I understand this correctly:
Elite Dangerous renders at 6K (5760x3240). Supersampling (1.5x) scales it down to 4K (3840x2160), then DSR 2.25 scales it down to 2K (2560x1440), which is sent to the monitor (native resolution).

That means two downscaling passes after each other, which should be the same as DSR 5.06 (which of course doesn’t exist).
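A quick sanity check of that combined factor (sketch only):

dsr_area, ss_linear = 2.25, 1.5
print(dsr_area * ss_linear ** 2)      # 5.0625, i.e. the "DSR 5.06" equivalent

w, h = 2560, 1440                     # native 2K/1440p
scale = dsr_area ** 0.5 * ss_linear   # 1.5 * 1.5 = 2.25 per axis
print(int(w * scale), int(h * scale)) # 5760 3240, the 6K render resolution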

I agree that it is too heavy to drive a 4K monitor with high DSR factors. Combining SS and DSR allows higher downscaling factors than DSR alone offers, so if you have the GPU power this might be a good idea. Combining two different downscaling algorithms might also improve things.

My current GPU would jump out and strangle me if I tried this even on a 1080p monitor…

I didn’t know that HDR was a thing for ED, but the RTX HDR mod seems to be popular, so I will definitely try it out when I get an HDR monitor.

Edit:
Tested rootsrat’s settings on my 2060S and 1080p monitor:

Rendered resolution: 4320x2430
SS downscaled (1.5x) to: 2880x1620
DSR downscaled (2.25x) to: 1920x1080

SMAA: 36 FPS on the pad
FXAA: 37 FPS on the pad

Probably better AA, but a lot slower. I think the image is softer. 1080p is too low a resolution to test this and my GPU is too slow.
 
Personally I would ditch 4K and go 2K, but with high-quality HDR. Supersampling from 2K to 4K costs a lot less than driving native 4K, and HDR makes a big visual difference.
How are you getting the benefit of HDR? Are you using Windows auto-HDR? Are you sure your "benefit" is not just that your HDR display is objectively better and you therefore have more accurate SDR now?
 
I have an RTX 5090 and a Ryzen 7800X3D... My configuration for Elite is:

Native res 4K (4096x2160)
DSR factor 1.78x DL (5461x2880)
2.25x DL (6144x3240) ... this is what I play inside ED
70 FPS cap (to avoid stuttering)
HDR on, SDR slider at 24

For me it’s very nice, but I’ll try KDK Warhead’s configuration.
 
I got 17 FPS average in a satellite (on the pad) at DSR 2.25 + SMAA at 4K (3840x2160) with my 2060S.
DSR 2.25 at 3840x2160 = 18,662,400 px
DSR 2.25 at 4096x2160 = 19,906,560 px

It seems that the RTX 5090 is slightly faster than the RTX 2060S 😉
I think AA was great with DSR 2.25 and higher on a 32” 4K OLED, but it’s very heavy.
 

rootsrat

Volunteer Moderator
How are you getting the benefit of HDR? Are you using Windows auto-HDR? Are you sure your "benefit" is not just that your HDR display is objectively better and you therefore have more accurate SDR now?
Yes, Windows Auto HDR, which looks a lot better than SDR. Somehow the Cobra engine, even though it doesn’t support HDR natively, works surprisingly well with Auto HDR.
 

rootsrat

Volunteer Moderator
Edit:
Tested rootsrat’s settings on my 2060S and 1080p monitor:

Rendered resolution: 4320x2430
SS downscaled (1.5x) to: 2880x1620
DSR downscaled (2.25x) to: 1920x1080

SMAA: 36 FPS on the pad
FXAA: 37 FPS on the pad

Probably better AA, but a lot slower. I think the image is softer. 1080p is too low a resolution to test this and my GPU is too slow.
Yeah, combining DSR and in-game supersampling is a bad idea if you don't have the hardware to support it, especially when starting from 1080p.
 
I got 17 FPS average in a satellite (on the pad) at DSR 2.25 + SMAA at 4K (3840x2160) with my 2060S.
DSR 2.25 at 3840x2160 = 18,662,400 px
DSR 2.25 at 4096x2160 = 19,906,560 px

It seems that the RTX 5090 is slightly faster than the RTX 2060S 😉
I think AA was great with DSR 2.25 and higher on a 32” 4K OLED, but it’s very heavy.
I had that RTX 2060 card and it is very good. I have great affection for it; I enjoyed it a lot and it gave me great satisfaction in many titles :love:. I had not planned to buy the RTX 5090, but I got a good offer 2 months ago and was able to sell the previous card at a good price. I bought it mainly for flight simulators (MSFS, DCS, X-Plane 12) because of how demanding they are, but when I tested it in ED and saw the absolutely amazing sharpness of planets, ships, etc., I was even more convinced that it was worth getting, because seeing this title in 6K is something worth mentioning. Anyway, I want to try your configuration since it is supersampling; I know it is very demanding, and I want to see how it behaves :rolleyes:.
 
Yes, Windows Auto HDR, which looks a lot better than SDR. Somehow Cobra engine even though it's not supporting HDR natively, works surprisingly well with Auto HDR.

If the game natively supported HDR render targets, you wouldn't need (or want) AutoHDR.

Personally, I prefer SpecialK's HDR to the Windows AutoHDR, but configuration is considerably more involved. It's also possible to adjust the game's HDR settings (which are distinct from higher-bitness render targets) to eliminate the eye adaptation/exposure features and directly utilize the extra dynamic range provided by the higher-quality HDR render targets (AutoHDR or third-party), if one's display has a high enough contrast ratio and brightness.
 

rootsrat

Volunteer Moderator
My screen's limit is 609 nits; is it worth digging into the hidden HDR settings? I did something similar (basically enabled native HDR support in UE5) for Oblivion Remastered and it made it look gorgeous!
 