TEMPORAL ANTI-ALIASING like this would be awesome in Elite Dangerous.

I strongly believe their R&D team is well aware of the technical solutions.

Btw, congrats on being the first triple Elite Cmdr!
 
Try using SS=2.0. It does a noticeably better job than 1.5. Lately, I've been running 2560x1440 at SS=2.0. I get a tolerable frame rate (~30-60 fps) on my GTX 980 Ti. The crawling lines are still present, but the artifacts are reduced. This seems to be the best compromise for my system. It's a shame that I can't use 4K, which is the native res of my monitor, but it's too slow with SS=2.0, and the crawl is too annoying at SS=1.0 or 1.5.

Thanks for the tips. The original post is now a bit out of date. I have a 980 Ti, and while I could run these resolutions, I've also moved on to playing in VR. Well, that was the plan anyway, but the Vive's implementation in ED and SteamVR is not very satisfactory at the moment. I'm also getting a Rift to do a one-to-one comparison on my rig and I'll see how it goes, but SS=2.0 is not an option in VR regardless of the graphics card, as has been confirmed by most on the VR forum. A good AA solution would go a really long way for those on weaker graphics cards or those who play in VR.
 
Are there any generic 3rd-party TAA injectors that can be used with ED?
Yes, here's one: Reshade Injector 3.0. You can find a link here: https://forums.frontier.co.uk/showt...ovement-with-Reshade-Injector-3-0-f-by-Moiker

Unfortunately, none of them will help fix the crawling edges / anti-aliasing issue.

ED uses deferred rendering, which cannot be combined with standard hardware AA (MSAA). The only thing I've found that helps at all is supersampling. In the ED options, SS=2.0 is the only value that really helps; the other values introduce more artifacts.
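
To illustrate why that's the only lever that works, here's a toy numpy sketch of the supersampling idea (just the concept, nothing from ED's actual renderer): shading several samples per final pixel and averaging them turns a hard stair-step into a gradient, so the crawl shrinks but never fully disappears.

```python
import numpy as np

def render_edge(width, height, ss=1):
    """Rasterize a slanted hard edge with one sample per pixel at ss times
    the target resolution, then box-filter back down to (height, width)."""
    h, w = height * ss, width * ss
    ys, xs = np.mgrid[0:h, 0:w].astype(float) + 0.5   # sample at pixel centres
    img = (ys > 0.37 * xs).astype(float)              # binary coverage: all or nothing
    if ss > 1:
        # Average each ss*ss block into one final pixel (the SS downsample).
        img = img.reshape(height, ss, width, ss).mean(axis=(1, 3))
    return img

for ss in (1, 2):
    levels = len(np.unique(render_edge(64, 64, ss)))
    print(f"SS={ss}: {levels} distinct edge intensities")
# SS=1 gives only 2 levels (hard stairs); SS=2 gives up to 5 (a softer ramp),
# which is why the crawl is reduced but still visible.
```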

NVidia's control panel offers a more advanced form of SS (it includes a blur filter), and fractional settings work better there, but 4x is best. NVidia counts total pixels rather than the per-axis scale ED's SS uses, so NVidia 4x == 2x in x and 2x in y, which is the same as ED 2.0. NVidia 2x is roughly the same as ED 1.5.

To put this in perspective, when I run my 4K monitor at native resolution and ED SS=2.0, I still find the crawling edges to be annoyingly visible (and my framerate is only ~10-25 fps). ED SS=3.0 or 4.0 (which aren't available) would probably be required to reduce the crawl to an acceptable level. That won't be reasonable until more powerful video cards are released in a year or two (say a GTX 1180 or 1280). Until then, the only possible solution would be FD implementing some form of temporal AA. I did try setting ED SS to 2 and NVidia SS to 4 (to achieve 4x in x and 4x in y), but the game crashes, so that's not viable either, even though it should have fixed the crawling (and would yield ~2 fps on my GTX 980 Ti).
 
Thanks for the info, NW3! I was wondering if Reshade or the others might do it, but if you still see the crawling edges/jaggies, then it probably won't help me a great deal.

Upgrading to a future card aside, I hope something else can be done to minimise it a bit until then.
 
Yes, here's one: Reshade Injector 3.0.

NVidia counts total pixels rather than the per-axis scale ED's SS uses, so NVidia 4x == 2x in x and 2x in y, which is the same as ED 2.0. NVidia 2x is roughly the same as ED 1.5.
DSR factor 2.25x actually IS 1.5x SSAA (in ED).
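
For anyone juggling the numbering schemes, the arithmetic is just a square and a square root. A trivial sketch (the helper names are mine, purely for illustration):

```python
# ED's SS slider is a per-axis scale, while NVidia's DSR factor counts total
# pixels, i.e. the per-axis scale squared. Hence DSR 2.25x == ED SS 1.5.

def dsr_to_ed_ss(dsr_factor):
    """Total-pixel DSR factor -> equivalent per-axis ED SS value."""
    return dsr_factor ** 0.5

def ed_ss_to_dsr(ss):
    """Per-axis ED SS value -> equivalent total-pixel DSR factor."""
    return ss * ss

for dsr in (2.0, 2.25, 4.0):
    print(f"DSR {dsr}x -> ED SS {dsr_to_ed_ss(dsr):.2f}")
# DSR 2.0x  -> ED SS 1.41 (why NVidia 2x is only *roughly* ED 1.5)
# DSR 2.25x -> ED SS 1.50
# DSR 4.0x  -> ED SS 2.00
```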
 
Yes please!

The jaggies in ED are annoying!

Most games have their own version of temporal anti-aliasing. Even ReShade provides a form of TAA, or rather temporal reprojection anti-aliasing, albeit a generic version, so not as refined as engine-specific implementations.
There's no reason for FD not to implement it any more. Deferred rendering is here to stay, and developers need to do more to eliminate texture crawling and shimmering with optimised methods such as TAA.
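
For the curious, the core of any TAA is just per-frame sub-pixel jitter plus an exponentially weighted history. A minimal numpy sketch of that generic idea (not FD's or ReShade's actual code; real engines also reproject the history with motion vectors and clamp it against the current frame to avoid ghosting):

```python
import numpy as np

def shade(xs):
    """Stand-in renderer: a hard edge at x = 10.37, one sample per position."""
    return (xs > 10.37).astype(float)

rng = np.random.default_rng(0)
pixels = np.arange(32, dtype=float) + 0.5   # pixel-centre sample positions
history = shade(pixels)                     # frame 0: no history yet
blend = 0.1                                 # weight given to each new frame

for _ in range(200):
    jitter = rng.uniform(-0.5, 0.5)         # fresh sub-pixel offset every frame
    history = blend * shade(pixels + jitter) + (1 - blend) * history

print(np.round(history[9:13], 2))
# The edge pixel converges toward its fractional coverage (~0.63) instead of
# snapping between 0 and 1 frame to frame -- exactly what kills the crawl.
```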
 
AA becomes less relevant the higher the resolution is for your display size. Even at 1080p on a 17-inch screen, I can turn AA off and not notice the difference in the majority of games I've played. High pixel density counters the need for AA. It's a good way of upping fps without losing visuals.
 
AA becomes less relevant the higher the resolution is for your display size. Even at 1080p on a 17-inch screen, I can turn AA off and not notice the difference in the majority of games I've played. High pixel density counters the need for AA. It's a good way of upping fps without losing visuals.

I think you're a little off here... First of all, 17"? No, 27" here. For me, 24" is the minimum. And at 1080p, aliasing becomes very apparent.
Not only that, nowadays we have complex geometry, high-contrast effects and lens-distortion effects that strengthen the presence of aliasing.
In the end, I think it just boils down to how tolerant you and I are of aliasing in general. :)

Edit: Elite Dangerous features some of the worst aliasing I've seen, even at 4K. Frontier need to do better here. Supersampling at 4K just to get rid of the aliasing is a ridiculous solution.
 
I think you're a little off here... First of all, 17"? No, 27" here. For me, 24" is the minimum. And at 1080p, aliasing becomes very apparent.
Not only that, nowadays we have complex geometry, high-contrast effects and lens-distortion effects that strengthen the presence of aliasing.
In the end, I think it just boils down to how tolerant you and I are of aliasing in general. :)

Edit: Elite Dangerous features some of the worst aliasing I've seen, even at 4K. Frontier need to do better here. Supersampling at 4K just to get rid of the aliasing is a ridiculous solution.

17" monitors, where I live, haven't been on sale for years in normal desktop formats. :D
27" 2560x1440 is my absolute minimum nowadays, and that's after downgrading the monitor size from a 32" 1080p a couple years ago (32" 1440p simply wasn't available)

But yeah, even at my current 27" running at 2560x1440 (with maximum graphics) the jaggies are horrible.
I tried going down to 1080p for recording purposes, but with Elite's bad AA, that quality is just unacceptable for me.
 
AA becomes less relevant the higher the resolution is for your display size. Even at 1080p on a 17-inch screen, I can turn AA off and not notice the difference in the majority of games I've played. High pixel density counters the need for AA. It's a good way of upping fps without losing visuals.

I wish I could agree with you, but unfortunately I cannot. I find the crawlies extremely annoying on my 28" screen.

I think you're a little off here... First of all, 17"? No, 27" here. For me, 24" is the minimum. And at 1080p, aliasing becomes very apparent.
Not only that, nowadays we have complex geometry, high-contrast effects and lens-distortion effects that strengthen the presence of aliasing.
In the end, I think it just boils down to how tolerant you and I are of aliasing in general. :)

Edit: Elite Dangerous features some of the worst aliasing I've seen, even at 4K. Frontier need to do better here. Supersampling at 4K just to get rid of the aliasing is a ridiculous solution.
I agree. The worst part is that running 2x supersampling on top of 4K resolution still does not fix the problem; it only reduces the artifacts somewhat (and completely kills the framerate of my GTX 980 Ti).

It's worse on some ships, due to the alignment of the cockpit frames. A Type-7 is absolutely horrible; an Asp is acceptable (barely). The high detail present in ED's models of space stations exacerbates the problem.

I really wish they would add temporal antialiasing ASAP!
 
Hopefully with the PS4 Pro release, checkerboard rendering will find its way to the PC. It would dramatically reduce the cost of DSR.
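
For anyone unfamiliar with the technique, here's a toy sketch of the principle (my reading of the concept, not Sony's actual implementation): shade only half the pixels each frame in an alternating checkerboard and hold the other half from the previous frame. Real versions reproject the held pixels with motion vectors and ID buffers; this static sketch just keeps them.

```python
import numpy as np

def shade_full(h, w, t):
    """Stand-in renderer: a pattern that drifts slowly over time."""
    ys, xs = np.mgrid[0:h, 0:w]
    return np.sin(0.3 * xs + 0.2 * ys + 0.05 * t)

h, w = 8, 8
ys, xs = np.mgrid[0:h, 0:w]
frame = shade_full(h, w, t=0)                  # assume a fully shaded first frame

for t in range(1, 5):
    mask = (xs + ys + t) % 2 == 0              # alternate which half gets shaded
    frame = np.where(mask, shade_full(h, w, t), frame)  # shade half, hold half

err = np.abs(frame - shade_full(h, w, t=4)).max()
print(f"max error vs a fully shaded frame: {err:.4f}")
# Tiny error for slow motion, at roughly half the shading cost per frame --
# which is why it would cut the cost of something like DSR dramatically.
```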
 
17" monitors, where I live, haven't been on sale for years in normal desktop formats. :D
27" 2560x1440 is my absolute minimum nowadays, and that's after downgrading the monitor size from a 32" 1080p a couple years ago (32" 1440p simply wasn't available)

But yeah, even at my current 27" running at 2560x1440 (with maximum graphics) the jaggies are horrible.
I tried going down to 1080p for recording purposes, but with Elite's bad AA, that quality is just unacceptable for me.

If the current quality is unacceptable for you, perhaps you should have invested in a better GPU rather than a bigger/higher-resolution display?
Since I am a peasant, I have only a 23" 1080p display, but with SMAA enabled, shimmering/artifacting is minimal, and I can assure you that TXAA is not some end-all technology that would make other AA techniques irrelevant.
So you can already get rid of aliasing, but you will take a hit in performance. The same would be true of TXAA if it were implemented.
 
If the current quality is unacceptable for you, perhaps you should have invested in a better GPU rather than a bigger/higher-resolution display?
Since I am a peasant, I have only a 23" 1080p display, but with SMAA enabled, shimmering/artifacting is minimal, and I can assure you that TXAA is not some end-all technology that would make other AA techniques irrelevant.
So you can already get rid of aliasing, but you will take a hit in performance. The same would be true of TXAA if it were implemented.
I respectfully disagree. SMAA does not get rid of the artifacts, and I can NOT "already get rid of aliasing", even with a substantial "hit in performance". To a large degree, aliasing is a subjective issue (it bothers some people more than others); I find it highly annoying. The problem is that ED uses deferred rendering, so standard AA techniques can't be used (only post-process blurring filters are available).

FD needs to add temporal AA, period.
 
If the current quality is unacceptable for you, perhaps you should have invested in a better GPU rather than a bigger/higher-resolution display?
A 27" monitor is a fraction of the costs for a new GPU that can handle 2x supersampling at 4k... Also, should consumers be the ones to blame for aliasing? Not on my planet. Technology is developed by the devs to make the experiences better for the consumers.

Since I am a peasant, I have only a 23" 1080p display, but with SMAA enabled, shimmering/artifacting is minimal, and I can assure you that TXAA is not some end-all technology that would make other AA techniques irrelevant.
So you can already get rid of aliasing, but you will take a hit in performance. The same would be true of TXAA if it were implemented.

TXAA and TAA are not the same. TXAA is extremely costly in performance and limited to Nvidia cards; TAA doesn't need to be. Look at today's top games: most of them have their own iteration of TAA which produces a very crisp image.
SMAA is not enough to get rid of aliasing in ED. It wasn't enough in Alien Isolation. It's simply not enough. And just upping the resolution is silly. 1080p is the industry standard, with 1440p rapidly becoming more popular. We haven't entered the 4K era yet, and already I hear devs talking about 16K... I'd rather have better pixel density than resolutions inflated to fantasy numbers.
Deferred rendering is here to stay. And by all means, I agree with you that TAA isn't some kind of end-all tech that makes all other AA obsolete. But it's the best method we have with today's rendering modes, thanks to the offset and temporal parameters. And since TAA is entirely engine-specific, the actual outcome is in the devs' hands: they can optimise their technique to suit their own engine and assets.
Your argument is flawed.
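
To make the "offset" parameter concrete: the per-frame jitter is typically driven by a low-discrepancy sequence such as Halton(2,3), so the sample positions cover the pixel evenly over a short cycle. A minimal generator (the 8-frame cycle length is my assumption, though it is a common choice):

```python
def halton(index, base):
    """Radical inverse of `index` in `base`: a quasi-random value in [0, 1)."""
    result, f = 0.0, 1.0
    while index > 0:
        f /= base
        result += f * (index % base)
        index //= base
    return result

# Jitter offsets in pixels, centred on zero so they stay inside the pixel.
jitter_cycle = [(halton(i, 2) - 0.5, halton(i, 3) - 0.5) for i in range(1, 9)]
for i, (jx, jy) in enumerate(jitter_cycle):
    print(f"frame {i}: ({jx:+.3f}, {jy:+.3f})")
```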
 
A 27" monitor is a fraction of the costs for a new GPU that can handle 2x supersampling at 4k...

SMAA is not enough to get rid of aliasing in ED. It wasn't enough in Alien Isolation. It's simply not enough.
I can run 2x SS at 4K. It doesn't get rid of the jaggies; it just reduces them somewhat. I agree. ED needs temporal AA badly.
 
I can run 2x SS at 4K. It doesn't get rid of the jaggies; it just reduces them somewhat. I agree. ED needs temporal AA badly.

Yeah, there's so much geometry at very long distances that it's impossible to completely get rid of it.
We don't have any depth-of-field or chromatic-aberration effects that mask the aliasing either. In a way, it's a very honest presentation.
 