A new tool lets artists add invisible changes to the pixels in their art before they upload it online so that if it’s scraped into an AI training set, it can cause the resulting model to break in chaotic and unpredictable ways.

The tool, called Nightshade, is intended as a way to fight back against AI companies that use artists’ work to train their models without their creators’ permission.
[…]
Zhao’s team also developed Glaze, a tool that allows artists to “mask” their own personal style to prevent it from being scraped by AI companies. It works in a similar way to Nightshade: by changing the pixels of images in subtle ways that are invisible to the human eye but manipulate machine-learning models to interpret the image as something different from what it actually shows.
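The mechanism described above is a form of adversarial perturbation: pixel changes are kept within a small budget (so humans barely notice them) while being optimized to push the image's learned representation toward something else. Here is a minimal toy sketch of that idea — a fixed random linear map stands in for the deep feature extractor that real tools like Glaze would target, and the function names (`embed`, `cloak`), step sizes, and budget are illustrative assumptions, not the actual implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in "feature extractor": a fixed random linear map from a flattened
# 8x8 image to an 8-dim embedding. Real tools attack a deep network instead.
W = rng.normal(size=(8, 64))

def embed(img):
    return W @ img.flatten()

def cloak(img, target_emb, eps=8 / 255, steps=100, lr=0.005):
    """Projected gradient descent: nudge pixels within an L-infinity budget
    of eps so the image's embedding moves toward target_emb."""
    delta = np.zeros_like(img)
    for _ in range(steps):
        # Gradient of ||embed(img + delta) - target_emb||^2 w.r.t. delta
        # (analytic here because the "extractor" is linear).
        err = embed(img + delta) - target_emb
        grad = (W.T @ err).reshape(img.shape)
        delta -= lr * grad
        # Project back into the perturbation budget: imperceptibility constraint.
        delta = np.clip(delta, -eps, eps)
    return np.clip(img + delta, 0.0, 1.0)

img = rng.uniform(0, 1, size=(8, 8))                 # the artwork
decoy = rng.uniform(0, 1, size=(8, 8))               # image with the "decoy" style
target = embed(decoy)

cloaked = cloak(img, target)
print("max pixel change:", np.max(np.abs(cloaked - img)))
print("distance to decoy embedding, before vs after:",
      np.linalg.norm(embed(img) - target),
      np.linalg.norm(embed(cloaked) - target))
```

The key property is that the per-pixel change never exceeds the budget, yet the embedding a model sees has moved measurably toward the decoy — which is what lets a human and a machine-learning model "see" two different images.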

  • @TheWiseAlaundo

    Lol… I just read the paper, and Dr Zhao actually just wrote a research paper on why it’s legally OK to use images to train AI. Hear me out…

    He changes the ‘style’ of input images to corrupt the ability of image generators to mimic them, and even shows that the vast majority of artists can’t tell when this happens with his program, Glaze… Style is explicitly not copyrightable in US case law, so he just provided evidence that the way OpenAI and others use images to train generators is transformative, which would legally mean that it falls under fair use.

    No idea if this would actually get argued in court, but it certainly doesn’t support the idea that these image generators are stealing actual artwork.

    • @[email protected]

      So tl;dr he/his team did two things:

      1. argue the way AI uses content to train is legal
      2. provide artists a tool to prevent their content being used to train AI without their permission

      On the surface it sounds all good, but I can’t help but notice a future conflict of interest for Zhao should Glaze ever become monetized. If it were to be ruled illegal to train AI on content without permission, tools like Glaze would be essentially anti-theft devices, but while it remains legal to train AI this way, tools like Glaze stand to perhaps become necessary for artists to maintain the pre-AI status quo w/r/t how their work can be used and monetized.