
Artists Strike Back!: New Tool “Poisons” Images Pirated by AI

Nightshade, developed by University of Chicago computer science prof Ben Zhao, makes the AI generator keep giving you a cat when you ask for a dog

Last week, lawyer Richard Stevens warned that the “instant art” produced by AI image generators like Midjourney or DALL-E probably violates the copyrights of the artists whose work is swept into the training data. Many artists are suing. But artists may also have another fix in store for the digital pirates.

Australian digital media professors T. J. Thomson and Daniel Angus explain how artists can fight back through data poisoning:

Imagine this. You need an image of a balloon for a work presentation and turn to a text-to-image generator, like Midjourney or DALL-E, to create a suitable image.

You enter the prompt: “red balloon against a blue sky” but the generator returns an image of an egg instead. You try again but this time, the generator shows an image of a watermelon.

What’s going on?

The generator you’re using may have been “poisoned”.

T. J. Thomson and Daniel Angus, “Data poisoning: how artists are sabotaging AI to take revenge on image generators,” The Conversation, December 17, 2023

How does it happen? Nightshade, a new tool developed at the University of Chicago, lets artists “poison” their images so that the data scraped from them corrupts any model trained on it:

Briefly, Nightshade enables artists to change pixels in a way that the human eye cannot see but the AI can, with monstrous results. The underlying assumption is that manipulators of pirated images will lose interest if, say, a “high school sweetheart” prompt keeps generating images of a dire wolf, and it takes the programmers hours even to figure out where the corrupted data is.

Here’s a more technical description of how Nightshade works, as tested on the AI image generator Stable Diffusion:

Nightshade targets the associations between text prompts, subtly changing the pixels in images to trick AI models into interpreting a completely different image than what a human viewer would see. Models will incorrectly categorize features of “shaded” images, and if they’re trained on a sufficient amount of “poisoned” data, they’ll start to generate images completely unrelated to the corresponding prompts. It can take fewer than 100 “poisoned” samples to corrupt a Stable Diffusion prompt, the researchers write in a technical paper currently under peer review.

Take, for example, a painting of a cow lounging in a meadow.

“By manipulating and effectively distorting that association, you can make the models think that cows have four round wheels and a bumper and a trunk,” Zhao told TechCrunch. “And when they are prompted to produce a cow, they will produce a large Ford truck instead of a cow.”

Morgan Sung, “Nightshade, the tool that ‘poisons’ data, gives artists a fighting chance against AI,” TechCrunch, January 26, 2024
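
To make the cow-to-truck mechanism concrete, here is a minimal sketch of the general idea in PyTorch. This is not Nightshade’s actual algorithm, and every name in it is hypothetical: the tiny encoder is a stand-in for the image feature extractor inside a real text-to-image model. The sketch simply optimizes a pixel perturbation, clamped to an imperceptibly small budget, so that a “cow” image’s features drift toward a “truck” image’s features:

import torch
import torch.nn as nn

# Hypothetical stand-in for the image feature extractor inside a
# text-to-image model (a real attack would target the actual encoder).
encoder = nn.Sequential(nn.Flatten(), nn.Linear(3 * 64 * 64, 128))
encoder.eval()
for p in encoder.parameters():
    p.requires_grad_(False)

def poison(image, target_image, epsilon=8 / 255, steps=200, lr=0.01):
    """Find a small perturbation so `image` looks unchanged to a person
    but its features match those of `target_image` (e.g., cow -> truck)."""
    with torch.no_grad():
        target_feat = encoder(target_image)
    delta = torch.zeros_like(image, requires_grad=True)
    opt = torch.optim.Adam([delta], lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        # Push the poisoned image's features toward the target concept.
        loss = nn.functional.mse_loss(encoder(image + delta), target_feat)
        loss.backward()
        opt.step()
        # Keep the change invisible: clamp to a tiny per-pixel budget.
        with torch.no_grad():
            delta.clamp_(-epsilon, epsilon)
    return (image + delta).clamp(0, 1).detach()

cow = torch.rand(1, 3, 64, 64)     # toy stand-ins for real artwork
truck = torch.rand(1, 3, 64, 64)
poisoned_cow = poison(cow, truck)  # looks like the cow; "reads" like the truck

A production attack also has to survive resizing, cropping, and compression, which makes the researchers’ finding that fewer than 100 such samples can corrupt a Stable Diffusion prompt all the more striking.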

The free tool has certainly proven popular with artists. Zhao told VentureBeat, “Nightshade hit 250K downloads in 5 days since release … I expected it to be extremely high enthusiasm. But I still underestimated it … The response is simply beyond anything we imagined.”

Overall, Nightshade may prove more useful to artists than lawsuits. Picture what would happen to shoplifting if stores could tag their merchandise so that anything removed from the store without being paid for turned instantly into turnips… Shoplifters would really need to like turnips… But let’s see what happens next.

Note: Some of the featured images used here at Mind Matters News are produced by generative AI. However, the generator used by our main supplier, Adobe Stock, is trained only on images that the firm owns or has a license to use. We have not knowingly used images that violate copyright via generative AI.

You may also wish to read: Cyber Plagiarism: When AI systems snatch your copyrighted images. Outright copying of others’ images may put systems’ owners in legal jeopardy. Let’s look at U.S. legal decisions. The AI companies offering the image-creating services need the Robot from Lost in Space in their legal departments, waving its arms and crying out: “Warning! Danger!” (Richard Stevens)

