A new tool lets artists add invisible changes to the pixels in their art before they upload it online so that if it’s scraped into an AI training set, it can cause the resulting model to break in chaotic and unpredictable ways.

The tool, called Nightshade, is intended as a way to fight back against AI companies that use artists’ work to train their models without the creators’ permission.
[…]
Zhao’s team also developed Glaze, a tool that allows artists to “mask” their personal style to prevent it from being scraped by AI companies. It works in a similar way to Nightshade: it changes the pixels of an image in subtle ways that are invisible to the human eye but cause machine-learning models to interpret the image as something different from what it actually shows.
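
The article doesn’t publish Nightshade’s or Glaze’s actual algorithms, but the broad idea it describes (small, bounded pixel changes that shift what a model perceives) resembles a targeted adversarial perturbation. Below is a minimal, hypothetical PyTorch sketch of that generic technique; the `poison` function, the `eps` budget, and the use of an off-the-shelf ResNet-50 are illustrative assumptions, not either tool’s real implementation.

```python
import torch
import torchvision.models as models

def poison(image, decoy, eps=4 / 255, steps=100, lr=1e-2):
    """Hypothetical sketch: nudge `image` (a [1,3,H,W] tensor in [0,1]) so a
    pretrained model represents it like `decoy`, while every pixel stays
    within +/- eps of the original (small enough to be hard to see)."""
    model = models.resnet50(weights=models.ResNet50_Weights.DEFAULT).eval()
    for p in model.parameters():
        p.requires_grad_(False)  # only the perturbation is optimized
    with torch.no_grad():
        decoy_repr = model(decoy)  # target representation to imitate

    delta = torch.zeros_like(image, requires_grad=True)
    opt = torch.optim.Adam([delta], lr=lr)
    for _ in range(steps):
        # Push the perturbed image's representation toward the decoy's.
        loss = torch.nn.functional.mse_loss(model(image + delta), decoy_repr)
        opt.zero_grad()
        loss.backward()
        opt.step()
        with torch.no_grad():
            delta.clamp_(-eps, eps)  # keep the change visually subtle
            delta.copy_((image + delta).clamp(0, 1) - image)  # stay in [0,1]
    return (image + delta).detach()
```

The published tools are presumably more sophisticated than this, but the core trade-off is the same: a perturbation budget small enough to be imperceptible to humans yet large enough to move the model’s interpretation.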

    • AnonTwo@kbin.social · 1 year ago

      The general legal argument is that the AI retains no exact copy of the copyrighted material.

      But if that’s the case, then these poisoned pixels shouldn’t need to be patched out, because the model wouldn’t remember the material that spawned them.

      That’s just the argument I assume would be used.

      • 📛Maven@lemmy.sdf.org · 1 year ago

        It’s like training an artist who’s never seen a banana or a fire hydrant by showing them pictures of fire hydrants labelled “this is a banana”. When you ask for a banana, you’ll get a fire hydrant. Correcting that mistake doesn’t mean “undoing pixels”; it means teaching the AI what bananas and fire hydrants actually are. (There’s a toy sketch of this after the thread.)

      • FaceDeer@kbin.social · 1 year ago

        Well, I guess we’ll see how that argument plays in court. I don’t see how it follows, myself.

      • FaceDeer@kbin.social · 1 year ago

        In order to violate copyright you need to copy the copyrighted material. Training an AI model doesn’t do that.
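
To make the banana/fire-hydrant analogy above concrete, here is a toy, entirely made-up sketch: a classifier trained on swapped labels learns the wrong association, and the remedy is retraining on corrected labels rather than editing pixels in the training images. The features, labels, and choice of scikit-learn’s `LogisticRegression` are illustrative only.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Made-up 2-D features standing in for "banana" and "fire hydrant" images.
rng = np.random.default_rng(0)
banana_feats = rng.normal(loc=[0.0, 0.0], scale=0.3, size=(100, 2))
hydrant_feats = rng.normal(loc=[3.0, 3.0], scale=0.3, size=(100, 2))
X = np.vstack([banana_feats, hydrant_feats])

clean = np.array(["banana"] * 100 + ["hydrant"] * 100)
poisoned = np.array(["hydrant"] * 100 + ["banana"] * 100)  # labels swapped

bad_model = LogisticRegression().fit(X, poisoned)
print(bad_model.predict([[3.0, 3.0]]))   # ['banana']: hydrant features, wrong name

good_model = LogisticRegression().fit(X, clean)
print(good_model.predict([[3.0, 3.0]]))  # ['hydrant']: fixed by retraining,
                                         # not by "undoing" the pixels
```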