cross-posted from: https://lemmy.world/post/11840660

TAA is a crucial tool for developers - but is the impact on image quality too great?

For good or bad, temporal anti-aliasing - or TAA - has become a defining element of image quality in today’s games, but is it a blessing, a curse, or both? Whichever way you slice it, it’s here to stay, so what is it, why do so many games use it and what’s with all the blur? At one point, TAA did not exist at all, so what methods of anti-aliasing were used and why aren’t they used any more?

  • pastermil@sh.itjust.works · 9 months ago

    Interesting take. Do you think that natural image softening would come back in newer technologies?

    • Kolgeirr@sh.itjust.works · 9 months ago

      I’m not that guy, but I don’t think so. The more likely trend is that we get to the point where we render and display at such a high resolution that you can’t see individual pixels anymore. We’re getting there already with smaller 4K displays, where turning on AA doesn’t make an appreciable difference over native 4K rendering.

      • RightHandOfIkaros@lemmy.world · 9 months ago (edited)

        I agree with this. Outside of some media that may release with special effects designed to mimic the softer image of a CRT, I think display technology will just progress to the point where nothing uses AA at all, because the resolution is too high to tell the difference. I mean, it’s already like that with 4K TVs: you sit far enough away that you usually can’t tell the difference between 4K and 1080p.
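
        A rough way to sanity-check that is the usual visual-acuity estimate: a pixel stops being individually resolvable once it subtends less than about one arcminute at the eye. Here’s a minimal back-of-envelope sketch (assuming a 16:9 panel and 20/20 vision; the 55" size and the helper names are just illustrative, not anything from the video):

        ```python
        import math

        def pixel_pitch_mm(diagonal_in, h_res, v_res):
            """Physical pixel pitch (mm) for a panel of the given diagonal and resolution."""
            width_in = diagonal_in * h_res / math.hypot(h_res, v_res)
            return width_in * 25.4 / h_res

        def blend_distance_m(diagonal_in, h_res, v_res, acuity_arcmin=1.0):
            """Distance beyond which a single pixel subtends less than the given
            visual acuity (~1 arcminute for 20/20 vision)."""
            pitch_mm = pixel_pitch_mm(diagonal_in, h_res, v_res)
            acuity_rad = math.radians(acuity_arcmin / 60.0)
            return (pitch_mm / math.tan(acuity_rad)) / 1000.0

        for label, (h, v) in {"1080p": (1920, 1080), "4K": (3840, 2160)}.items():
            print(f'55" {label}: pixels blend together beyond ~{blend_distance_m(55, h, v):.1f} m')
        ```

        For a 55" panel that works out to roughly 2.2 m for 1080p and about 1.1 m for 4K, which lines up with the typical couch-distance point above, and with why aliasing gets harder to spot the denser the display.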