Thank you for your testing, Morbad and Tazpoken!

I tweaked my graphics with the help of this thread over the last 2 days and it looks amazing!
Did I understand right: is a higher DSR factor in the NVIDIA driver better than a high supersampling setting in the game options?
For example, is 0.75x supersampling in game with DSR 4.00x better than 1.00x supersampling in game with DSR 2.25x, at 2560x1440?

You're welcome, it was only fair I gave back a bit of testing to Morbad's amazing work. Everyone in this forum helped me so much since I started playing!

Morbad can definitely answer better than I can, but I think NVIDIA DSR is for sure a less taxing way to do supersampling. Often, in-game supersampling, no matter which game we are talking about, is more demanding on the GPU. NVIDIA achieved quite a thing with DSR, as it does not obliterate the GPU even at 4.00x (which means 2x the width and 2x the height). The price to pay is "blurriness", as the scaling does not map 1:1 and pixels have to be approximated in some way or other weird magical calculation (I don't know the technicalities). So one thing is sure for me: if you compare in-game supersampling at 2.0x vs NVIDIA DSR at 4.00x, you will notice the game is far smoother with DSR. Try it out for yourself; I know on my system even 1.25x in-game shows its weight.

In terms of quality, it could be that in-game supersampling holds a better result, since it's more brute-force upsampling, without the voodoo ;)

That's my lay of the land, I am sure others can elaborate.

I play at 1440p > DSR 4.00x 0% smoothing > In-game 0.5x supersampling, and the game plays as smooth as plain 1440p. Granted, if you do the calculations you will notice I end up at the same resolution, so the question would be: "why go through that?" Because the quality is much better - a tad blurrier maybe, but much more enjoyable for me, and not only for jagged lines (but also for that).
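For anyone who wants to check that arithmetic, here's a minimal Python sketch (my own illustration; it assumes, as described above, that a DSR factor multiplies the total pixel count, so 4.00x means 2x per axis, while the in-game SS slider scales each axis directly):

```python
import math

native = (2560, 1440)          # 1440p display
dsr_linear = math.sqrt(4.00)   # DSR 4.00x = 4x the pixels = 2x width and 2x height
ss = 0.50                      # in-game supersampling scales each axis directly

render = (round(native[0] * dsr_linear * ss),
          round(native[1] * dsr_linear * ss))
print(render)  # (2560, 1440) -- right back at plain 1440p, as noted above
```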

2nd thing: I have supersampling set to 8x in the NVIDIA program settings. How does it affect the in-game supersampling?

I can't find your "supersampling 8x" setting, could you point me to where it is?
 
Did I understand right: is a higher DSR factor in the NVIDIA driver better than a high supersampling setting in the game options?
For example, is 0.75x supersampling in game with DSR 4.00x better than 1.00x supersampling in game with DSR 2.25x, at 2560x1440?

Depends on what you think is better. Using both simultaneously results in two scaling/filter passes, both of which reduce jaggies at the cost of a small amount of blur.

DSR 4.00x is also double the resolution, which can then be scaled to the display with 0% smoothing, which means it's a straight bilinear filter, which means less blur for a similar reduction in jaggies. Doing the same thing with any other DSR setting requires the more complex filtering, and thus more potential loss of fine detail (and slightly more of a performance hit), from higher smoothing settings for an acceptable effect.

2nd thing: I have supersampling set to 8x in the NVIDIA program settings. How does it affect the in-game supersampling?

I'd be surprised if that NVIDIA supersampling setting was doing anything at all, as 8x SSAA would completely cripple the game's performance on almost any combination of hardware. It's functionally equivalent, in number of pixels, to 8.00x DSR (which doesn't exist) or 4.00x supersampling in-game (which also doesn't exist, by default). A 2080 Ti running the game at 1920x1080 would get about 15-20 frames per second inside starports or on planet surfaces at ultra quality if you were actually running 8x SSAA.
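To put numbers on that pixel-count comparison, a small Python helper (my own illustration, not anything from the driver; it assumes DSR factors are pixel-count multipliers and the in-game SS slider is per-axis):

```python
import math

def render_resolution(native_w, native_h, dsr_factor=1.0, ss=1.0):
    """Effective internal render resolution for a DSR factor + in-game SS combo."""
    # DSR factors multiply total pixels, so the per-axis scale is the square root;
    # the in-game supersampling slider already scales each axis directly.
    scale = math.sqrt(dsr_factor) * ss
    return round(native_w * scale), round(native_h * scale)

# A hypothetical 8.00x DSR at 1080p would render roughly 5431x3055 --
# around 16.6 million pixels, eight times the ~2.1 million of plain 1080p.
w, h = render_resolution(1920, 1080, dsr_factor=8.0)
print(w, h, w * h)
```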
 
Good info. Here are some screenshots taken recently with my settings (for what it's worth). Very happy with the results.
[screenshots: LA4YpET.jpg, dYkaxfB.jpg, UyQFfAZ.jpg]

 
OK, I've read that in the NVIDIA control panel you can either use supersampling or DSR, so Morbad is right: it should do nothing.
Yesterday I fiddled with it and was surprised that I still had 80 fps.
 
Damn, those screenshots look great. One of the things that has always bothered me about this game is the anti-aliasing. Currently I run with supersampling 1.25x. I’ve dabbled with DSR before as well, but not this combination. I’ll have to give it a try tomorrow. Would especially be nice if the image quality is truly that good, with less of a performance hit than 1.25x supersampling.
 
Thanks @Deebz__ - glad you like them (here is a new one from last night, returning from Imperial Navy patrol duty ;))
[screenshot: txDp1yCr.jpg]

In my case I had tried 1.25x in-game way back when I started playing, as somehow the jagged lines (which I didn't know were such a common problem with some game engine and anti-aliasing combinations) would not get reduced, and I was already at 2K resolution. I remember my video card taking quite a hit at 1.25x - not unplayable, but worse than using DSR + in-game SS as a combo. I also remember the result wasn't improving a lot, and - although maybe crisper - I still had that feeling of looking at videogame polygons (I hate to describe it as such but have no idea how to give a better explanation ;)).
 
Well I had a feeling it was too good to be true. Or maybe I’m just doing something wrong. Not sure.

I have DSR 4.00x at 0% smoothness enabled. In the game, I have the 5120x2880 resolution selected, and 0.5x supersampling.

Image quality is not really all that much better than 1.25x supersampling. Maybe a bit worse since the final image is being resized so much. Performance is also much lower. I went from 80 fps in stations to 60 fps.

Well, it was worth a shot anyway.
 
Well I had a feeling it was too good to be true. Or maybe I’m just doing something wrong. Not sure.

I have DSR 4.00x at 0% smoothness enabled. In the game, I have the 5120x2880 resolution selected, and 0.5x supersampling.

Image quality is not really all that much better than 1.25x supersampling. Maybe a bit worse since the final image is being resized so much. Performance is also much lower. I went from 80 fps in stations to 60 fps.

Well, it was worth a shot anyway.

You're scaling it back to native resolution with those settings. I'm not surprised it doesn't look better than just 1.25x in-game, since the internal resolution is actually lower, but I am surprised the performance hit is worse.

DSR 4x + 0.65x SS in-game should give the closest pixel count to 1.25x in-game supersampling without any DSR, and should only perform slightly worse.

What GPU do you have?
 
+1 to what Morbad wrote. I am going to try and see if I can add any useful info.

My “default” is 1440p, DSR 1.00x + SS 1.00x. I played like that for months. My system is not too powerful and I can’t indulge in higher settings, but I did want to see if I could get a smoother feel and fewer jagged lines (since anti-aliasing wasn’t cutting it much), so I ended up comparing it with DSR 4.00x + SS 0.50x. This ends up being 1440p once again (remember that DSR 4.00x means “twice the width and twice the height”, just as SS 0.50x means “half the width and half the height” - sometimes it can get confusing).

In your case, you were already playing at SS 1.25x (but with DSR at 1.00x), which means you were already multiplying your native resolution by 1.25x. If you end up with my settings, you will be looking at “fewer” pixels than I do.

To get the exact comparison between DSR “on” and “off” in your case, you should compare:

1) DSR 1.00x + SS 1.25x
2) DSR 4.00x + SS 0.625x

Unfortunately I don’t think it’s possible to set the in-game SS to 0.625x (but it’s worth trying!), so Morbad’s suggestion is correct: SS 0.65x ends up giving you a final “result” of 1.3x, which should sit slightly above your original setting of SS 1.25x.

Still, you might or might not like it, but it’s always worth trying.
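A quick sketch of those two comparisons in Python (my own illustration; again assuming a DSR factor multiplies the pixel count while the SS slider is per-axis):

```python
import math

def linear_scale(dsr_factor, ss):
    # DSR factor multiplies total pixels -> per-axis scale is its square root;
    # the in-game SS slider is already a per-axis scale.
    return math.sqrt(dsr_factor) * ss

native_w, native_h = 2560, 1440
for dsr, ss in [(1.00, 1.25), (4.00, 0.625), (4.00, 0.65)]:
    s = linear_scale(dsr, ss)
    w, h = round(native_w * s), round(native_h * s)
    print(f"DSR {dsr:.2f}x + SS {ss}: {w}x{h} ({w * h / 1e6:.2f} MP)")
# The first two combos land on the same 3200x1800; SS 0.65x sits just above.
```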

UPDATE:
I tried DSR 1.00x + SS 1.25x and it's a couple steps above not playable, for me. Even in deep space I can feel the stuttering. I tried DSR 4.00x + SS 0.65x and it's just beautiful, crisp and definitely playable, only slowing down a tad during station flights, especially in the vanity camera. So for me, going higher in DSR and lower in SS is clearly 20-30% faster and smoother. And quality-wise I even like it better, I must say (or find it practically the same).
Bear in mind my setup is not high-level, and when I say playable it's nowhere near the FPS average you people get. But I can clearly see that in-game SS is very demanding compared to DSR.
 
You're scaling it back to native resolution with those settings.

Well yes, because I was trying exactly what Tazpoken mentioned earlier:

I play at 1440p > DSR 4.00x 0% smoothing > In-game 0.5x supersampling, and the game plays as smooth as plain 1440p. Granted, if you do the calculations you will notice I end up at the same resolution, so the question would be: "why go through that?" Because the quality is much better - a tad blurrier maybe, but much more enjoyable for me, and not only for jagged lines (but also for that).

Anyway, I can't seem to get decent performance by scaling up with DSR and scaling down with in-game supersampling. It's considerably worse in each case. Weird that it behaves exactly the opposite way for some people, but hey... such is the nature of PC gaming. ¯\_(ツ)_/¯

For the record, my specs are:
Nvidia GTX 1080 Ti
Intel i7 8700K
32GB DDR4 @ 3200 MHz
Game installed on Samsung 970 EVO NVMe SSD
 
Agreed.. PCs are weird sometimes and that's why we love them. I mean how else could I get that placebo effect of installing new drivers or even cleaning old ones... aaah dat feel ;)
 
Graphics beyond Ultra?... screen space "raytraced" reflections (look at the floor):

[screenshots: zE3CsVq.png, 4627TZR.png]


Using Marty McFly's qUINT ReShade shader pack: https://reshade.me/forum/shader-presentation/4393-quint

The screen-space reflection shader actually raytraces the reflections rather than rasterizing them, although it's still limited to screen space (so no reflections from things out of the camera's view, etc). It's not dependent on RTX/DXR, so it's not a DX12/NVIDIA-only thing.

He's also made a global-illumination ray-tracing shader that's currently in alpha; that one's only available to those who cough up on his Patreon, though.
 
Anyway, I can't seem to get decent performance by scaling up with DSR and scaling down with in-game supersampling. It's considerably worse in each case. Weird that it behaves exactly the opposite way for some people, but hey... such is the nature of PC gaming. ¯\_(ツ)_/¯

For the record, my specs are:
Nvidia GTX 1080 Ti
Intel i7 8700K
32GB DDR4 @ 3200 MHz
Game installed on Samsung 970 EVO NVMe SSD

Something odd is definitely going on. I'll do some experimenting on my end to see if I can reproduce any such differential.

The screen-space reflection shader actually raytraces the reflections rather than rasterizing them, although it's still limited to screen space (so no reflections from things out of the camera's view, etc).

Very interesting, though the performance hit for the effect seems quite significant given the limitations.

Do the standard in-game cubemap reflections still work with this shader? Can't think of any reason they wouldn't, but thought I'd ask.
 
Super interesting! Marty says that SSR should be used only for screenshots and not in-game? Mh... Anyway, I did install them and took three screenshots.

Things are very subtle, except for bloom, to the point that I had to double-check they were turned on. What do you guys think? With DSR 4.00x it just kills my system; I'd be amazed if you can keep it on with SS at 0.65 too! Let me know...


These are at 1440p, DSR 1.00x + SS 1.0


Vanilla ED (+Morbad's Tweaks)
[screenshot: 0Bfh9wL.jpg]



Morbad's Tweaks + Reshade qUINT MXAO+ADOF+SSR+Bloom
[screenshot: 5s3Cl1m.jpg]



Morbad's Tweaks + Reshade qUINT MXAO+ADOF+SSR
[screenshot: hcWfcez.jpg]




These are at 1440p, DSR 4.00x + SS 0.5

OFF
[screenshot: Elite-Dangerous64-2019-05-07-12-03-16.png]


ON (except bloom which refuses to compile with DSR on)
[screenshot: Elite-Dangerous64-2019-05-07-12-03-40.png]
 
Something odd is definitely going on. I'll do some experimenting on my end to see if I can reproduce any such differential.

Well, this thread has suddenly reminded me that I use ReShade. I would need to test to know for sure, but I'm assuming that its shaders are not affected by the in-game 0.5x supersampling, causing them to render at 5K rather than 1440p. Since I do have a few performance-heavy shaders running, I could see this being the cause...

I'll verify that this evening if I have time.
 
Well, this thread has suddenly reminded me that I use ReShade. I would need to test to know for sure, but I'm assuming that its shaders are not affected by the in-game 0.5x supersampling, causing them to render at 5K rather than 1440p. Since I do have a few performance-heavy shaders running, I could see this being the cause...

As I wrote, ReShade destroyed my system at DSR 4.00x, no matter the in-game SS. It literally went from 40 to 2 fps... that could be the cause.

@Deebz__ it seems they are not working as intended. I have a couple of questions:
1) Do you keep the in-game anti-aliasing, depth of field, bloom and blur ON even with qUINT?
2) Which order do you have the qUINT shaders at and which ones do you enable?

Except for bloom, MXAO and FAO/SSR seem to completely break some of the reflections and the blur/DoF. They also crush my fps. Blur alone can make it a bit dreamier, but not necessarily "better".

Maybe I am doing something wrong? I haven't found a ReShade entry for E:D, so I am not sure the game is entirely compatible with those shaders...
 
OK, I spent some fun, productive time with ReShade and qUINT. Here's a comparison between a starting point (no AA/DoF/Blur/Bloom/AO) and the rest. I still can't get ADOF to work, but I will probably manage soon ;)

 
Sorry, didn’t see your edit. I don’t use qUINT yet, just some of the standard shaders that come with Reshade.
 
Well this thread has suddenly reminded me that I use Reshade. I would need to test to know for sure, but I'm assuming that its shaders would not be affected by the in-game 0.5x supersampling, causing it to render stuff in 5k rather than 1440p. Since I do have a few performance-heavy shaders running, I could see this being the cause...

As it's a 3rd-party post-processing effect, it sounds entirely plausible that the ReShade shaders are using the set resolution rather than the game's internal render resolution.
 
Sorry, didn’t see your edit. I don’t use qUINT yet, just some of the standard shaders that come with Reshade.

I just edited it with a simple OFF/ON comparison (bear in mind the game settings have many things OFF and replaced by qUINT). As soon as I get ADOF to work I will post a comparison between in-game AO/DoF/Blur/Bloom and qUINT. Looks promising!



UPDATE (all 1440p DSR 1.00x SS 1.00x, no Anti Aliasing) - just an example:

Vanilla ED (no DoF/Blur/AO/Bloom)


Vanilla ED (with DoF/Blur/AO/Bloom at highest settings)

qUINT (MXAO, SSR, ADOF, Bloom, Lightroom, custom preset)


qUINT (MXAO, SSR, ADOF, Bloom, Lightroom, custom preset 2 with less dramatic Bloom)

qUINT (MXAO, SSR, ADOF, Bloom, Lightroom, custom preset 2 with less dramatic Bloom + MSAA in-game)
 