
Thread: Anti-aliasing

  1. #16
    Originally Posted by Backer42
    VR stuff is essentially two generations behind graphics-wise, which means you get PS2VR. ;-)
    To be fair, some PSVR games are actually pretty good. Skyrim on my vanilla PS4 looks better in places than Skyrim on PS3 did. I hope to upgrade to a PS4 Pro during a holiday sale, which will make things better still. Heck, the Battlefront VR mission often looks better than 2D Elite Dangerous on PS4! But to your point, there are plenty of PS2-like VR games that are fun but primitive-looking (Ultrawings comes to mind).

  2. #17
    Originally Posted by Backer42
    Well, 1-pixel lines don't work anymore in a scenario where 1 pixel is too small to be visible (4K and up).

    Anyway, classic AA was only about smoothing geometry edges, while texture filtering is about fighting pixel aliasing in textures, and I was interpreting the OP to be about the first one. I still think smoothing geometry edges is a non-issue at (U)HD viewport resolutions.

    Of course, you have to apply some filter to textures to scale them to the viewport resolution without aliasing (while not losing sharpness). However, I never perceived texture aliasing as an issue, or texture filtering as lacking in fidelity, but maybe I'm not Chris Roberts enough. That's why I never missed stuff like TAA and saw no reason to increase my spending on PC GPUs until now.

    Also, somewhere beyond 2K, viewport resolution becomes a non-issue for me too, because my vision doesn't get any better either.
    2K is still the industry standard, even on PC. 4K is the near future, but we're still not there. Most PC gamers use a 1060 or 1050 Ti, which won't cut it for 4K. And the most popular resolution among PC users is still 1080p. This means we still need good anti-aliasing, like temporal AA, which produces a soft image with almost no pixel artifacts (see the sketch at the end of this post).

    Like I said before, in some games 4K is far from enough to eliminate aliasing. In Elite's case, I remember that even at 4K there was still aliasing. And in games with heavy use of post-processing and pixel distortion effects, aliasing is even more visible despite UHD resolutions.

    I guess it all comes down to personal preferences. Personally though, I hate aliasing, and it takes away a huge chunk of immersion when I see it. Like an eyelash in the eye.
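    For anyone curious what temporal AA actually does: at its core it accumulates each new frame into a running history buffer, so every pixel is effectively averaged over several frames. Here's a minimal sketch of that accumulation step in Python/NumPy; the blend weight and the lack of reprojection are simplifying assumptions, since real TAA reprojects the history with motion vectors and clamps it to limit ghosting:

    ```python
    import numpy as np

    def taa_accumulate(history, current, alpha=0.1):
        """Blend the current frame into the accumulated history.

        history, current: float arrays of shape (H, W, 3), values in [0, 1].
        alpha: weight of the new frame; lower = smoother but more ghosting.
        A real TAA pass would first reproject `history` with per-pixel
        motion vectors and clamp it to the neighborhood of `current`.
        """
        return (1.0 - alpha) * history + alpha * current

    # Usage: feed frames in as they are rendered.
    h, w = 1080, 1920
    history = np.zeros((h, w, 3), dtype=np.float32)
    for _ in range(60):
        current = np.random.rand(h, w, 3).astype(np.float32)  # stand-in frame
        history = taa_accumulate(history, current)
    ```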

  3. #18
    Originally Posted by Insomnia
    2K is still the industry standard, even on PC. 4K is the near future, but we're still not there. Most PC gamers use a 1060 or 1050 Ti, which won't cut it for 4K. And the most popular resolution among PC users is still 1080p. ....

    I guess it all comes down to personal preferences....
    I'd personally much rather have higher LOD, AA, and framerate on my nice 1080p display than sacrifice those for 4K. There's a whole lot more that contributes to image quality than pixel count. We still have a way to go before video games start looking like the National Geographic channel in 1080p, let alone 4K.

  4. #19
    Originally Posted by Old Duck
    I'd personally much rather have higher LOD, AA, and framerate on my nice 1080p display than sacrifice those for 4K. There's a whole lot more that contributes to image quality than pixel count. We still have a way to go before video games start looking like the National Geographic channel in 1080p, let alone 4K.
    Ikr.
    It's funny when you see those "how do old games look in 4K" videos on YouTube. It's like people think they'll look awesome just because the pixel count is up. Sure, it almost completely removes the aliasing thanks to the low geometry complexity, but those games don't have the fancy shaders and textures that make games look good.

  5. #20
    Originally Posted by Insomnia
    2K is still the industry standard, even on PC. 4K is the near future, but we're still not there.
    The current console generation (which has been on the market for almost two years now) renders at 2160p (and downsamples to 1080p if needed; see the sketch at the end of this post). So 2K is last-gen in 2018, and the new industry standard is 4K.

    Most PC gamers use a 1060 or 1050 Ti, which won't cut it for 4K. And the most popular resolution among PC users is still 1080p.
    And it won't move much from there, because PC GPUs are now mostly used for mining cryptocurrency, so PC gamers are stuck with what's left.
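    For reference, the "render at 2160p, downsample to 1080p" trick mentioned above is plain supersampling with a tidy 2:1 ratio: each output pixel averages a 2x2 block of rendered pixels, which is why it cleans up aliasing so well. A rough sketch of that box-filter step in Python/NumPy (the frame layout is an assumption for illustration):

    ```python
    import numpy as np

    def downsample_4k_to_1080p(frame_4k):
        """Box-filter a 3840x2160 frame down to 1920x1080.

        frame_4k: float array of shape (2160, 3840, 3).
        Each output pixel is the mean of a 2x2 block of rendered pixels,
        which is what makes 4K-to-1080p downsampling such effective AA.
        """
        h, w, c = frame_4k.shape
        blocks = frame_4k.reshape(h // 2, 2, w // 2, 2, c)
        return blocks.mean(axis=(1, 3))

    frame = np.random.rand(2160, 3840, 3)  # stand-in for a rendered 4K frame
    print(downsample_4k_to_1080p(frame).shape)  # (1080, 1920, 3)
    ```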

  6. #21
    Originally Posted by Backer42
    The current console generation (which has been on the market for almost two years now) renders at 2160p (and downsamples to 1080p if needed). So 2K is last-gen in 2018, and the new industry standard is 4K.


    And it won't move much from there, because PC GPUs are now mostly used for mining cryptocurrency, so PC gamers are stuck with what's left.
    Depends on how you look at it.
    The current console generation is five years old, and Sony has sold 74 million PS4s (MS has sold around 35 million Xbones). The PS4 Pro is a fraction of that number and was only released in late 2016.
    So the majority of people still play at 2K.
    However, that will most likely change when the PS5 is released.

  7. #22
    Originally Posted by Insomnia
    Depends on how you look at it.
    The current console generation is five years old, and Sony has sold 74 million PS4s (MS has sold around 35 million Xbones).
    That is the last generation now. BTW: the PS2 sold over 100 million units and lived on until 2014, yet it hadn't been the current generation since 2006.

    The PS4 Pro is a fraction of that number and was only released in late 2016.
    Microsoft released another one too. Technically these are the current generation, despite the sales.

    So the majority of people still play at 2K.
    That's always the case; people don't upgrade to the newest stuff immediately. Thing is: just because the current generation is forward and backward compatible with the last one doesn't mean that visuals won't degrade on the previous gen. In fact, we already see frame rates taking a dive there because of the much greater GPU power available on the latest tech.

    So while developers could explore AA to help the image quality of five-year-old PS4s, Xbones, and PCs with a GTX 750 Ti, I don't think this is going to happen. Instead they will focus on 4K and cut enough LoD that their stuff barely runs on the 2K-only platforms, with AA most likely disabled to help with performance.

  8. #23
    According to the GeForce specs page, TAA is supported in Jurassic World. Now that it's in the Cobra engine, fingers crossed it'll hit ED with the Q4 graphics update.

    https://www.geforce.com/whats-new/ar...phics-settings

  9. #24
    In ED I have just enough surplus performance to get away with 1.5x SSAA + SMAA at the custom settings I prefer, which does a pretty good job of removing jaggies. I do wish we had more AA modes available, though. SSAA, despite being available for almost everything one way or another and producing some of the best results IQ-wise, is generally a pretty poor trade-off performance-wise.
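    The poor trade-off is easy to quantify: SSAA cost grows with the square of the resolution multiplier, so 1.5x SSAA means shading 2.25x the pixels. A quick back-of-the-envelope check in Python (the 1080p base resolution is just an assumed example):

    ```python
    # Pixel cost of SSAA scales with the square of the resolution multiplier.
    base_w, base_h = 1920, 1080  # assumed 1080p base resolution
    for scale in (1.0, 1.5, 2.0):
        w, h = int(base_w * scale), int(base_h * scale)
        cost = (w * h) / (base_w * base_h)
        print(f"{scale}x SSAA -> {w}x{h}, {cost:.2f}x the pixels to shade")
    # 1.5x -> 2880x1620, 2.25x the work; 2.0x -> 3840x2160, 4.00x the work.
    ```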

    Originally Posted by Backer42
    Well, 1-pixel lines don't work anymore in a scenario where 1 pixel is too small to be visible (4K and up).
    You can easily see a 1-pixel line on even a small 4K display if the contrast between that line and the rest of the scene is great enough.
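    That matches the geometry, too. On an assumed 27-inch 4K panel viewed from 60 cm, one pixel subtends roughly 0.9 arcminutes, close to the classic 1-arcminute acuity limit, and high-contrast lines remain detectable well below that (vernier acuity). A quick sanity check in Python, with the panel size and viewing distance as illustrative assumptions:

    ```python
    import math

    # Angular size of one pixel on an assumed 27" 4K panel at 60 cm.
    diag_in, res_w, res_h = 27.0, 3840, 2160   # assumed display
    diag_px = math.hypot(res_w, res_h)
    pitch_mm = diag_in * 25.4 / diag_px        # ~0.156 mm per pixel
    distance_mm = 600.0                        # assumed viewing distance
    arcmin = math.degrees(math.atan2(pitch_mm, distance_mm)) * 60
    print(f"pixel pitch {pitch_mm:.3f} mm -> {arcmin:.2f} arcmin at 60 cm")
    # ~0.89 arcmin: close to the 1-arcmin acuity limit, yet a high-contrast
    # 1-pixel line is still detectable (vernier acuity works below it).
    ```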

  10. #25
    Anyone try this gizmo with Elite Dangerous?

    https://www.pcgamer.com/this-119-hdm...emove-jaggies/

    If it were $50 instead of $100, I would be very tempted to get one as a console gamer.

  11. #26
    Originally Posted by Old Duck
    Anyone try this gizmo with Elite Dangerous?

    https://www.pcgamer.com/this-119-hdm...emove-jaggies/

    If it were $50 instead of $100, I would be very tempted to get one as a console gamer.
    It's basically a cheaper version of SMAA built into the chip (rough idea sketched below). It doesn't really help at 2K, but it will help at low resolutions like 720p.

    I think Linus Tech Tips reviewed it.
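    For anyone wondering what "a cheaper version of SMAA" amounts to: post-process AA in the SMAA/FXAA family roughly detects high-contrast edges from pixel luma and then blends edge pixels with their neighbors. A toy sketch of that idea in Python/NumPy; the threshold and blend weight are made-up values, and real SMAA does far more sophisticated edge-pattern classification and directional blending:

    ```python
    import numpy as np

    def toy_postprocess_aa(frame, threshold=0.1, blend=0.5):
        """Toy post-process AA in the spirit of FXAA/SMAA.

        frame: float array (H, W, 3), values in [0, 1].
        Flags pixels whose luma contrast against any axial neighbor
        exceeds `threshold`, then blends them toward the local average.
        """
        luma = frame @ np.array([0.299, 0.587, 0.114])
        pad = np.pad(luma, 1, mode="edge")
        # Max luma difference to the up/down/left/right neighbors.
        contrast = np.maximum.reduce([
            np.abs(pad[:-2, 1:-1] - luma),   # up
            np.abs(pad[2:, 1:-1] - luma),    # down
            np.abs(pad[1:-1, :-2] - luma),   # left
            np.abs(pad[1:-1, 2:] - luma),    # right
        ])
        edge = contrast > threshold

        # Average of the 4 axial neighbors, per channel.
        fpad = np.pad(frame, ((1, 1), (1, 1), (0, 0)), mode="edge")
        neighbor_avg = (fpad[:-2, 1:-1] + fpad[2:, 1:-1] +
                        fpad[1:-1, :-2] + fpad[1:-1, 2:]) / 4.0

        out = frame.copy()
        out[edge] = (1 - blend) * frame[edge] + blend * neighbor_avg[edge]
        return out

    frame = np.random.rand(720, 1280, 3)  # stand-in for a rendered frame
    smoothed = toy_postprocess_aa(frame)
    ```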
