NIS - when NVIDIA fixed what FDev won't.

Which is pretty relevant when it comes to judging visual quality.

Yes.

People use TAA because, despite its performance impact, it improves visual quality. A bit more blurry, but less jaggies, and the trade-off is considered to be worth it.

There is no way in hell I'd consider what I see at the beginning of the video you linked, or in the AA comparison of the DF video, to be acceptable. It's not a bit more blurry, it's hugely more blurry. It's also adding blatantly obvious motion artifacts. TAA should not look the way it does in these Death Stranding comparisons; it's possibly the worst in-game implementation I have ever seen. Given the cost, I'd hesitate to call it an improvement over no AA at all, and were DLSS not available, I'd either settle for the in-game FXAA or use SMAA via ReShade.

DLSS vs. TAA: not fair, TAA is so blurry!
DLSS vs. FXAA: not fair, FXAA has terrible jaggies!
DLSS vs. nothing: not fair, you didn't even use any anti-aliasing technique!

DLSS vs. any one of a number of viable options: automatically unfair because when judging subjective IQ, there is no universally best option.

In the case of a comparison, using only the flawed TAA is a problem. Even limiting oneself to wholly in-game options, there are still no-AA and FXAA. I don't know whether they would result in subjectively better IQ than DLSS in that game, because I've never seen anyone test them against DLSS there. Therefore, I do not have the information I need to make an informed decision about the actual value of DLSS in that game. That is my point.

People use TAA because it looks better than FXAA.

Not, in my opinion, in Death Stranding.

People use DLSS because it looks better than TAA.

Sometimes, but far from always.

For example, in Cyberpunk 2077, I'd argue its TAA implementation is quite good and DLSS is generally inferior unless internal resolutions approach the same value (upscaling to 4K or 5K with DSR, then using DLSS to keep performance acceptable on my 1440p-native display, looks pretty good). However, at the same output resolution, even DLSS Quality produces significantly worse IQ, IMO, than the game's default TAA (which cannot be disabled without also disabling DLSS, making the comparison pretty easy for lack of other reasonably accessible options).

When comparing DLSS in motion it is absolutely fair to use TAA.

I'm not suggesting they not use the in-game TAA, but that the other viable options have to be given the same treatment for there to be the ability to make an informed choice.

Edit:
If this were a situation where TAA was clearly and universally better than FXAA or even no AA, sure, only comparing TAA vs. DLSS would make sense. However, I don't see how someone can watch what this TAA implementation is doing to Death Stranding and say, "this is always the best of the non-DLSS options".
 
Just to weigh in on the scaling issue...

I get the best results in ED just using FXAA at native resolution

I'm confused with all the options and settings I could fiddle with that I don't much bother with any of it anymore.
 
Just to weigh in on the scaling issue...

I get the best results in ED just using FXAA at native resolution

I'm confused with all the options and settings I could fiddle with that I don't much bother with any of it anymore.
I agree. At least, FXAA works best for me.
 
I'll give it a "it works" now that I did a quick RTFM and downloaded the updated GeForce Experience.

Been playing Odyssey for a few hours, loading some power tat onto my FC for the CG; with it set to the "ultra" equivalent, FPS is at least as good as the AMD offering and the picture on screen looks good enough to me.

It doesn't require GFE. Enabling scaling in the control panel adds a list of scaling resolutions, which you can select in game. If you enable the indicator, it will show a small "NIS" overlay in the upper left corner; blue if you are running native resolution and only getting sharpening, or green if it's scaling.

In practice this is similar to the old NVIDIA scaling functionality, just with specific presets, rather than having to use resolutions already present, or custom ones.
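For reference, the driver-side presets are fixed percentages of the native resolution. A quick sketch of the list they should generate for a given display mode; the preset percentages (85/77/67/59/50% per axis) are from NVIDIA's NIS documentation, and the rounding to even pixel counts is an assumption on my part:

```python
# Rough sketch: the resolutions the NIS driver presets should add for a
# given native mode. Preset percentages are from NVIDIA's NIS docs;
# rounding each axis to an even pixel count is an assumption.
NIS_PRESETS = [0.85, 0.77, 0.67, 0.59, 0.50]

def nis_resolutions(native_w, native_h):
    """Return a (width, height) pair for each NIS scaling preset."""
    res = []
    for scale in NIS_PRESETS:
        w = int(round(native_w * scale / 2) * 2)  # assumed even rounding
        h = int(round(native_h * scale / 2) * 2)
        res.append((w, h))
    return res

# For a 2560x1440 display; the 50% preset is exactly 1280x720:
for w, h in nis_resolutions(2560, 1440):
    print(f"{w}x{h}")
```

The exact pixel counts the driver exposes may round slightly differently, but this gives the general shape of the list that appears in-game.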

Any pics to show ?

Unfortunately, I'm not able to capture any meaningful screenshots of NIS in software. I'm only capturing the game's internal resolution; you see neither the sharpening nor the scaling, as they appear to be applied later in the output.

At a 1:1 internal resolution comparison, NIS at default 50% sharpening has slightly better antialiasing and a sharper image than the in-game FSR. However, it's also significantly grainier, with more scaling artifacts.

Those using these options purely for performance will have to test them both to see which they feel is subjectively best. For those trying to get the best AA out of the game without paying the full cost of pure supersampling, DSR (with smoothing) + FSR is subjectively the best option, as NIS doesn't appear to work in conjunction with DSR.
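To make the DSR + FSR combination concrete, here's a sketch of the effective internal render resolution when stacking the two. The FSR 1.0 per-axis divisors (Ultra Quality 1.3, Quality 1.5, Balanced 1.7, Performance 2.0) are from AMD's documentation; whether EDO's in-game presets map to these exactly is an assumption:

```python
import math

# Sketch: effective internal render resolution when stacking DSR
# (supersampling the output target) with FSR (upscaling up to that target).
# FSR 1.0 per-axis divisors are from AMD's docs; EDO's exact preset
# mapping is an assumption.
FSR_DIVISORS = {"ultra_quality": 1.3, "quality": 1.5,
                "balanced": 1.7, "performance": 2.0}

def effective_internal(native_w, native_h, dsr_factor, fsr_mode):
    """DSR multiplies total pixel count, so each axis scales by
    sqrt(dsr_factor); FSR then renders each axis at output/divisor."""
    axis = math.sqrt(dsr_factor)  # e.g. DSR 4x -> 2x per axis
    d = FSR_DIVISORS[fsr_mode]
    return (round(native_w * axis / d), round(native_h * axis / d))

# 1440p display, DSR 4x (a 5120x2880 target), FSR Quality:
print(effective_internal(2560, 1440, 4.0, "quality"))  # -> (3413, 1920)
```

The point being that the combination still renders well above the display's native 2560x1440, which is why it anti-aliases better than either option alone.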
 
It doesn't require GFE. Enabling scaling in the control panel adds a list of scaling resolutions, which you can select in game. If you enable the indicator, it will show a small "NIS" overlay in the upper left corner; blue if you are running native resolution and only getting sharpening, or green if it's scaling.
Thanks for that!
I'd read the Nvidia blurb, and part of the instructions was to download the addition to GFE - that means I can now reload the latest driver on my laptop without GFE!

Something to do later (y)
 
Thing is, the 2060 is old and still that expensive. Imagine trying to get the card that runs it well at all times. Here in Hungary the 3060 Ti is now 1500 EUR :ROFLMAO:
A 2060 is only what, 4 years old? Not "old" at all in the colloquial sense. A GTX 970 is "old". Anything that performs better than a 1060 6GB is not "old". OldER? Yes. But not "old" in the sense that you wouldn't expect it to run games.

Please don't let people convince you that hardware only one generation old is somehow "too ancient to run Odyssey;" it's revisionist history to defend the awful optimization.
 
A 2060 is only what, 4 years old? Not "old" at all in the colloquial sense. A GTX 970 is "old". Anything that performs better than a 1060 6GB is not "old". OldER? Yes. But not "old" in the sense that you wouldn't expect it to run games.

Please don't let people convince you that hardware only one generation old is somehow "too ancient to run Odyssey;" it's revisionist history to defend the awful optimization.
Oh I know that. Mostly because I'm happily running Forza Horizon 5 on my 2060, and if it weren't for the VRAM I could probably have everything on Ultra and it'd still work at 1080p just fine.

I just wrote that currently even a 2060 is overpriced. Last I checked, if I sold mine now I would get more money back than when I bought it. And the 2060 could run EDO just fine, if it weren't for all the mystery CPU shenanigans at busy places.

But if you believed those who said "3xxx and everything is fine", yeah, you'd be paying in goat blood and firstborns currently.
 
Quick update.
I've tested different settings in different conditions. In most cases I can finally achieve a stable 60.
Except settlements - those seem to be bottlenecked by the game engine. No matter what settings I've used - even at 720p (NIS enabled) with everything on low - it floats around 40-55 FPS, with an average of ~50.
Low-population small settlements probably perform better, but FDev definitely should work hard on this.
 
From what I gather, the big plus of NIS over FSR plus sharpening is that NIS works with any game. But given that ED already has FSR, I don't see the advantage of NIS in any concrete terms. DLSS would be cool, but other than that, FSR is no worse than NIS.

Kinda makes the OP a bit silly. A correct title would be "Nvidia continues to do what AMD also does and FD already implemented".
 
Except settlements - those seem to be bottlenecked by the game engine. No matter what settings I've used - even at 720p (NIS enabled) with everything on low - it floats around 40-55 FPS, with an average of ~50.
Low-population small settlements probably perform better, but FDev definitely should work hard on this.

Settlements are often CPU limited. Yes, Frontier should work on it, and probably are. However, I'm tempering my expectations about what sort of gains they will be willing to spend the effort to achieve here.

From what I gather, the big plus of NIS over FSR plus sharpening is that NIS works with any game. But given that ED already has FSR, I don't see the advantage of NIS in any concrete terms. DLSS would be cool, but other than that, FSR is no worse than NIS.

The differences between NIS and the game's FSR implementation are situational and mostly subjective, but no less real. They certainly don't look identical, even if one tries to match them.

One potentially major advantage for NIS is that the sharpening filter is tunable. EDO's FSR implementation doesn't allow this. At some point they removed the "FFXRCASIntensity" variable, replacing it with the old "FFXCASIntensity", which doesn't affect FSR. Actually, I can't even be 100% sure the former variable was working when they first introduced FSR and removed the CAS-only option, or whether some methodological failure in my earlier testing led me to mistake some other option I was messing with at the time for a change in FSR's sharpening (I should probably document things better, for posterity). Regardless, in the current live build of the game, FSR has no sharpness control (I tried reinserting "FFXRCASIntensity" into the current build, with no effect), and this gives NIS a pretty large edge in customization.

A big downside, for me, with NIS is that it doesn't seem to apply scaling until very late in the display pipeline, which means I can't capture its effects for comparison, nor for things like video for uploading. This is a big deal for sites like YouTube, which need 1440p+ source media to use VP9 and will scale non-standard resolutions quite unfavorably. At best, this means an extra step of scaling and re-transcoding all captures taken with NIS prior to uploading them if I want the best results from media hosting sites. All the other in- and out-of-game scaling methods I've tried allow me to capture the output resolution in software.
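That extra rescale-and-re-encode step looks roughly like this; the idea is to bump a capture up to a standard ladder resolution (1440p or higher, so YouTube serves VP9) before uploading. The ffmpeg flags used are standard (scale filter with lanczos, libx264), though the CRF value is just a placeholder:

```python
# Sketch: pick the smallest standard upload resolution that covers the
# capture, and build the corresponding ffmpeg rescale/re-encode command.
# The CRF value and filenames are placeholders.
LADDER = [(1920, 1080), (2560, 1440), (3840, 2160)]

def upload_target(src_w, src_h):
    """Smallest standard resolution at least as large as the source."""
    for w, h in LADDER:
        if w >= src_w and h >= src_h:
            return (w, h)
    return LADDER[-1]

def ffmpeg_cmd(src, dst, src_w, src_h):
    w, h = upload_target(src_w, src_h)
    return (f"ffmpeg -i {src} -vf scale={w}:{h}:flags=lanczos "
            f"-c:v libx264 -crf 18 -c:a copy {dst}")

# A capture taken at the internal resolution of a scaled preset:
print(ffmpeg_cmd("capture.mp4", "upload.mp4", 2176, 1224))
```

Every capture taken with NIS needs this pass, whereas captures of the other scaling methods already come out at the output resolution.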
 
EDO's FSR implementation doesn't allow this.
It's a recommendation from AMD to provide standardised profiles that are consistent across games, so it's logical not to allow users to modify them.
But nothing prevents developers from using an internal resizer and simply adding CAS on top to give the user more freedom.
 
A big downside, for me, with NIS is that it doesn't seem to apply scaling until very late in the display pipeline, which means I can't capture its effects for comparison, nor for things like video for uploading.

If you are using the NVIDIA driver scaling, then the scaling is done by the driver, hence why it's done so late.

A proper implementation will use the SDK, and like FSR, hook into the pipeline earlier.

The call into the NVIDIA Image Scaling shaders must occur during the post-processing phase after tone-mapping. Applying the scaling in linear HDR in-game color-space may result in a sharpening effect that is either not visible or too strong. Since sharpening algorithms can enhance noisy or grainy regions, it is recommended that certain effects such as film grain should occur after NVScaler or NVSharpen. Low-pass filters such as motion blur or light bloom are recommended to be applied before NVScaler or NVSharpen to avoid sharpening attenuation.
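The ordering described in that SDK note can be sketched as a simple pipeline; the stage names here are illustrative, not actual SDK identifiers:

```python
# Sketch of the post-processing ordering the SDK notes describe: low-pass
# effects before the scaler, tone-mapping before the NVScaler/NVSharpen
# call, and noisy effects like film grain after it.
PIPELINE = [
    "motion_blur",   # low-pass: before the scaler, avoids sharpening attenuation
    "bloom",         # low-pass: before the scaler
    "tone_mapping",  # the scaler call must come after tone-mapping
    "nvscaler",      # NIS upscale + sharpen
    "film_grain",    # noisy: after the scaler, so the grain isn't sharpened
]
assert PIPELINE.index("tone_mapping") < PIPELINE.index("nvscaler")
assert PIPELINE.index("nvscaler") < PIPELINE.index("film_grain")
print("ordering ok")
```

The driver-level NIS discussed above can't do any of this, since it only sees the final frame, which is exactly why an SDK integration behaves differently.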

 
It's a recommendation from AMD to provide standardised profiles that are consistent across games, so it's logical not to allow users to modify them.
But nothing prevents developers from using an internal resizer and simply adding CAS on top to give the user more freedom.

We can still modify the scaling factor used by EDO's FSR implementation, though only unofficially. It would be nice to be able to control the strength of the sharpener as well, AMD recommendation or not, as not everyone is going to like the defaults (which in EDO's case does the least of all available options to mitigate aliasing).

Yes, the game has always had an internal scaler and more recently (but well prior to FSR) added CAS as well, but it's not the same as FSR.

They can keep the predefined profiles, but I do wish there was an advanced option with sliders for both FSR resolution scale and sharpening.

If you are using the NVIDIA driver scaling, then the scaling is done by the driver, hence why it's done so late.

A proper implementation will use the SDK, and like FSR, hook into the pipeline earlier.




The old version of NIS and AMD's GPU Scaling driver options have the same issue.

That said, the lack of NIS integration with the game is one of its major disadvantages vs. the integrated options (FSR and FidelityFX CAS).
 
Perhaps someone can help me understand why I can't use NIS in Elite Odyssey on my 3440x1440 21:9 monitor.
All NIS resolutions result in a stretched screen, where the image's aspect ratio is not respected. Either I'm doing something wrong or ultra-wide is not supported.
 
Perhaps someone can help me understand why I can't use NIS in Elite Odyssey on my 3440x1440 21:9 monitor.
All NIS resolutions result in a stretched screen, where the image's aspect ratio is not respected. Either I'm doing something wrong or ultra-wide is not supported.

Ultra-wide is probably not supported.
What resolution choices do you have in settings after enabling NIS?
 