The reason I asked is that when you google TXAA, loads of Nvidia stuff turns up, e.g.: http://www.geforce.co.uk/hardware/technology/txaa/technology
That's because TXAA is part of Nvidia Gameworks, while TAA is just a generic term.
Got an answer from the TAA mod author:
"If Elite: Dangerous is still under active development, the devs should just do it. Temporal AA should be pretty simple for the developers to implement. The most difficult part is the temporal reprojection (i.e. finding velocity vectors) of complex animated objects, such as animated characters and vegetation. In the case of Elite, it's all just rigid objects as far as I know. It shouldn't take the developers more than a week to implement a basic version of Temporal AA, especially since open-source implementations exist, and considering that they have implemented a gamut of other post-process anti-aliasing methods."
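As a rough illustration of the accumulation step the mod author is describing, here is a minimal Python/NumPy sketch. The function and parameter names are made up for this example, and reprojection is skipped entirely by assuming a static camera; in a real engine the history buffer would be sampled at (uv - velocity) using those velocity vectors.

```python
import numpy as np

def taa_resolve(current, history, alpha=0.1):
    """Exponentially blend the current jittered frame into the history.

    A small alpha keeps more history: smoother edges, but more risk
    of ghosting on moving objects.
    """
    return alpha * current + (1.0 - alpha) * history

# Toy 2-pixel "image": an edge whose coverage alternates with the
# sub-pixel jitter. The accumulated value converges toward ~50%
# coverage for pixel 0, i.e. an anti-aliased edge, while the fully
# covered pixel 1 stays at 1.0.
history = np.array([0.0, 1.0])
for frame in range(50):
    if frame % 2 == 0:
        current = np.array([0.0, 1.0])   # jitter puts edge left of centre
    else:
        current = np.array([1.0, 1.0])   # jitter puts edge right of centre
    history = taa_resolve(current, history)
# history[0] is now close to 0.5; history[1] stays exactly 1.0.
```

The point of the toy loop is that no single frame ever contains the anti-aliased value; it only emerges from blending jittered frames over time, which is why TAA is so cheap per frame.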
I would imagine Nvidia WANT it to be a general solution. After all, AA has been something of a technological bugbear for many years. Cripes, since 1960 when you think about it (apart from on Vector displays...).
I would imagine the Nvidia shader/pipe team will be feverishly working on getting this whole pipe buffed, with documentation out there to help game engine developers implement it.
I was thinking that, for VR, this could be even more efficient. Not only can you re-use samples from previous frames, you can also SHARE these samples across eyes. The eyes render a separate, though similar, pair of views. So an extension of TAA is to combine the sample streams from each eye (each eye ideally rendering at 90fps).
I think Nvidia already have a lot of this in the bag, as the Pascal drivers allow for sharing of samples between eyes. Nvidia did this so that rendering two views of the same basic scene becomes basically free.
I guess that if we can get TAA combined with two-eye views, visuals in VR will take something of a leap forward.
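Purely as an illustration of that speculation, the idea could be sketched as below. The function name, the weights, and the notion of treating the other eye's (reprojected) image as an extra history sample are all assumptions for this sketch, not anything Nvidia has published:

```python
def stereo_taa_resolve(current, own_history, other_eye,
                       w_hist=0.85, w_eye=0.05):
    """Blend the current frame with this eye's own temporal history
    plus a reprojected sample borrowed from the other eye.

    The weights are arbitrary illustration values; a real
    implementation would presumably weight the other-eye sample by
    some reprojection-confidence measure, since the views differ.
    """
    w_cur = 1.0 - w_hist - w_eye
    return w_cur * current + w_hist * own_history + w_eye * other_eye
```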
My guess was that this method is not especially difficult to implement, especially when they drop DX10 and 32-bit support to make the rendering code leaner. Now, where do I go and nag so it gets on Frontier's list again?
I have been messing around in Unity for ages now, and AA is one of the weakest points of the engine. Sadly, Cobra seems to have similar issues, as far as I can see in Elite Dangerous. This type of AA would be amazing, especially if the footprint is really as low as promised. Sadly it was developed for Unity, but maybe FD devs will take notice and try to come up with their own variation (I know it would take a lot of man-hours, but the results are great nevertheless). Gosh, some of the shots from this trailer look like they were almost inspired by Elite Dangerous.
https://www.youtube.com/watch?v=VWRfLhOY2AA
What do you guys think? Or maybe you have tips on how to get better AA in Elite Dangerous without cooking your systems, then please share them.
I have a GTX 760 OC and I can run it on Ultra with AA set to SMAA and 1.5x supersampling, with only rare stutters in the station, but sadly there are still many flickers and jagged lines. I've also tried the FX mod, and at this point I prefer to use its SMAA as it looks a bit better, but still not amazing.
Moral of the day.
Do not do serious projects in casual (Unity) or homebrewed game engines. It wastes the devs' time, money and energy, and in the end you will end up with something subpar.
It was wrong and ED is all the poorer for it.
The problem is not doing everything; the problem is doing everything in a top-tier fashion. That's impossible for a small UK company. Cryo, Unreal or Unigine have large dedicated teams who live and breathe GPU stuff. FDev? Judging from ED's progress... a 9-to-5 attitude.
Comparing VR lab AA with ED is a bit silly. ED throws around significantly more geometry, effects, lighting and textures. VR lab is designed from the ground up to minimise aliasing and to afford supersampling at 90Hz in stereo. You need a beast of a card to do that with ED because it's doing so much more.
Any deferred renderer like ED's required high-cost SSAA until pretty much this year. Games that could afford it were essentially PS3/X360 ports running on better hardware.
Games that have been developed from the ground up to use evolved TAA have only just started appearing.
We've no idea how much shader rework building it into ED would take. We've also no idea how well elements like space dust and orbital lines would be dealt with without serious effort to avoid artifacts.
There is a difference between a technique being around and it being widely used in projects. There's also a difference between using it in a game released in 2016 versus adding it back into a game started in 2013, where there are many competing programmer and artist tasks and more associated risk of breaking something.
I certainly would like the devs to be looking at it too, or I wouldn't have restarted this thread. I disagree with the lazy-devs slant of your post, though.
I have a GTX 760 OC and I can run it on Ultra with AA set to SMAA and 1.5x supersampling, with only rare stutters in the station, but sadly there are still many flickers and jagged lines.
1.5x SS actually looks worse than native resolution (FHD in my case). If you want good 1.5x SS, use Nvidia DSR at factor 2.25; its downsampling filter is far superior to the Cobra engine's.
Try using SS=2.0. It does a noticeably better job than 1.5. Lately I've been running 2560x1440 at SS=2.0. I get a tolerable frame rate (~30-60 fps) on my GTX 980 Ti. The crawling lines are still present, but the artifacts are reduced. This seems to be the best compromise for my system. It's a shame that I can't use 4k, the native res of my monitor, but it's too slow with SS=2.0 and too annoying to run with SS=1.0 or 1.5.
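For anyone confused by mixing the two scales: the arithmetic differs because (as far as I know) Elite's in-game SS slider multiplies each axis by the factor, while Nvidia's DSR factors refer to total pixel count, which is why DSR 2.25x lands on the same 1.5x-per-axis resolution. A small sketch, with hypothetical helper names:

```python
def ingame_ss_resolution(width, height, factor):
    # Elite's SS slider (assumed per-axis) multiplies each axis
    # directly: SS 2.0 at 1920x1080 renders 3840x2160, a 4k pixel load.
    return int(width * factor), int(height * factor)

def dsr_resolution(width, height, dsr_factor):
    # DSR factors scale the total pixel COUNT, so each axis grows
    # by sqrt(factor): DSR 2.25x -> 1.5x per axis.
    scale = dsr_factor ** 0.5
    return int(width * scale), int(height * scale)

# ingame_ss_resolution(1920, 1080, 2.0)  -> (3840, 2160)
# dsr_resolution(1920, 1080, 2.25)       -> (2880, 1620)
```

So "in-game SS 1.5" and "DSR 2.25x" both render 2880x1620 from 1080p; the quality difference the poster describes comes from the downsampling filter, not the resolution.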