Generative AI Is Creating a Copyright Crisis for Artists
How does an artist assert copyright when her image was only one of many used to create a new image? How does she make a living if she can’t? A recent article in Issues in Science and Technology asks us to consider what happens when AI systems scarf up images from across the internet:
Stable Diffusion alone has harvested 5 billion images and text captions. The Common Pool offers 12.8 billion.
Kate Crawford, a researcher of AI’s impact based at USC Annenberg, and Jason Schultz, a professor of clinical law at New York University, spell out what that means for artists who hope to make a living from their work:
Their work is being used to train AI systems, which can then create images and texts that replicate their artistic style…
The billions of works produced by generative AI are unowned and can be used anywhere, by anyone, for any purpose. Whether a ChatGPT novella or a Stable Diffusion artwork, output now exists as unclaimable content in the commercial workings of copyright itself. This is a radical moment in creative production: a stream of works without any legally recognizable author.
Kate Crawford, Jason Schultz, “Generative AI Is a Crisis for Copyright Law,” Issues in Science and Technology, January 16, 2024
Essentially, the system is outdated
The problem, Crawford and Schultz say, is that copyright law, as currently framed, does not really protect individuals under these circumstances. That’s not surprising. Copyright law dates back to at least 1710, and the issues then were very different.
For one thing, as Jonathan Bartlett pointed out last December when the New York Times launched its copyright lawsuit against Microsoft and OpenAI, everyone has long accepted that big search engines violate copyright. But if they brought people to your site while saving and using your content for themselves, you were at least getting something out of the bargain.
But it’s different with generative AI and chatbots. They use your content and then stand in for it; users are not going back to you for more. OpenAI freely admits that it violates copyright but relies on loopholes to get around legal responsibility.
As the lawsuits pile up, it’s clear that generative AI and chatbots can’t work without these billions of images and texts. So either we do without them or we find a way to compensate the producers.
But if users pay, remember that the end user is you
Robert J. Marks points out that systems for paying original artists exist now. “Today’s Spotify keeps automatic records of song frequency and, from subscribers’ payments, distributes royalties accordingly. Similar methods could be applied to compensate content creators by generative AI.” (Newsmax, January 18, 2024) But they would require record keeping on the part of the currently secretive AI system owners.
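To make the Spotify analogy concrete, here is a minimal sketch of how a pro-rata payout might be computed, assuming the AI vendor kept a usage log recording how often each creator’s work was drawn on, which, as noted above, today’s systems do not disclose. The creators, counts, and pool amount are all hypothetical.

```python
# Minimal sketch of a pro-rata royalty split, on the Spotify analogy above.
# Assumes the AI vendor logs usage per creator -- something today's systems
# do not disclose. All names and figures are hypothetical.

def split_royalties(usage_counts: dict[str, int], royalty_pool: float) -> dict[str, float]:
    """Divide a pool of subscriber revenue in proportion to logged usage."""
    total = sum(usage_counts.values())
    if total == 0:
        return {creator: 0.0 for creator in usage_counts}
    return {
        creator: royalty_pool * count / total
        for creator, count in usage_counts.items()
    }

# Hypothetical log: how often each creator's work was drawn on this month.
usage = {"photographer_a": 1200, "illustrator_b": 300, "writer_c": 4500}

payouts = split_royalties(usage, royalty_pool=10_000.00)
for creator, amount in payouts.items():
    print(f"{creator}: ${amount:,.2f}")
```

The hard part, of course, is not the arithmetic but the record keeping it presupposes.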
They would require something else too, Marks notes: end users will end up paying somehow to use the glitzy new AI services. That makes sense. The reason chatbots are free and Disney movies are not is partly all the people who aren’t getting paid by the AI vendors, as opposed to the people who are getting paid by Disney.
The deepest issue is that humans are creative and AI is not
When only humans were original producers, copyright law generally provided sufficient protection. AI systems get around that by mixing, melding, and collating at enormous volume, which can stand in for original creation for a time. But AI needs constant rejuvenation from original creators; otherwise we can routinely expect the jackrabbit problem (model collapse), a descent into rubbish that comes from regurgitating the same contents. And people won’t keep producing that content if they are not paid.
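Model collapse is easy to see in a toy setting. The sketch below is a simplification, not a model of any real chatbot: it treats the training data as a pool of distinct works and “retraining” as resampling from the previous generation’s output. Without fresh human contributions, the number of distinct works that survive can only shrink.

```python
# Toy illustration of model collapse: each "generation" is trained only on
# samples of the previous generation's output, with no fresh human input.
# Resampling with replacement stands in for retraining on generated content.
# The count of distinct surviving works can only fall -- the loss of variety
# described above. Illustrative sketch only.
import random

random.seed(0)

pool = list(range(1000))          # 1,000 distinct "original works"
for generation in range(12):
    print(f"generation {generation:2d}: {len(set(pool)):4d} distinct works remain")
    # The next generation sees only output sampled from the current one.
    pool = random.choices(pool, k=len(pool))
```

Run it and the count of distinct works drops sharply within a few generations, which is the gist of why the systems keep needing new human-made material.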
Overall, the issue probably won’t be resolved by the application of current law; we will need to move toward a system that recognizes the new technology as a factor and works out a scheme for compensating the producers of the material it uses.
You may also wish to read: Cyber Plagiarism: When AI systems snatch your copyrighted images. Outright copying of others’ images may put the systems’ owners in legal jeopardy. Let’s look at U.S. legal decisions. The AI companies offering image-creating services need the Robot from Lost in Space in their legal departments, waving its arms and crying out: “Warning! Danger!”
and
Model collapse: AI chatbots are eating their own tails. The problem is fundamental to how they operate. Without new human input, their output starts to decay. Meanwhile, organizations that laid off writers and editors to save money are finding that they can’t just program creativity or common sense into machines.