The capabilities of AI art generators have grown considerably in the past couple of years. Trained on images scraped from across the internet, these tools can produce artistic composites, some sublime, others grotesque. Today, AI art generators hold incredible potential, but their capabilities can also be easily abused.
According to a Wired article from September 21, science fiction novelist Elle Simpson-Edin wanted to generate artwork for her newest book, so she tried AI tools. Her novel unabashedly depicts gore and sex, but most of the AI tools she found included “guardrails” that blocked explicit content. That is, until she discovered Unstable Diffusion, “a Discord community for people using unrestricted versions of a recently released, open source AI image tool called Stable Diffusion.” Will Knight explains,
“The official version of Stable Diffusion does include guardrails to prevent the generation of nudity or gore, but because the full code of the AI model has been released, it has been possible for others to remove those limits.” (Will Knight, “This Uncensored AI Art Tool Can Generate Fantasies—and Nightmares”)
Here, people are generating content from the violent to the pornographic, without oversight or restriction. While most AI companies keep their technologies protected from public use, some, like Stable Diffusion, are mainstreaming their services. With the Unstable Diffusion version, Simpson-Edin got the violent and erotic elements she was looking for and praised the tool for being filter-free. However, she now helps moderate the open-source AI tool, realizing, along with many others, that the abuses of the technology are far-reaching.
What’s the problem? First, the algorithms behind AI art tools prioritize novelty, gathering images from across the net and creating something “new.” But “new” often means grotesque, or outright pornographic. This matters because novelty-seeking is a key driver of pornography addiction: the brain habituates and craves ever-newer material. An unrestricted AI tool therefore makes it easy to indulge in forbidden fantasy.
People are also worried that AI image generators could easily be used to create child pornography. With no boundaries in place, open access to Stable Diffusion’s unrestricted versions gives the darkest personalities on the internet an easy place to live out their perversions. Removing the guardrails may allow writers like Simpson-Edin to add some sex and violence to their promotional websites, but it also enables the worst sorts of human depravity. Knight adds,
“Because tools like Stable Diffusion use images scraped from the web, their training data often includes pornographic images, making the software capable of generating new sexually explicit pictures. Another concern is that such tools could be used to create images that appear to show a real person doing something compromising—something that might spread misinformation.” (Will Knight, “This Uncensored AI Art Tool Can Generate Fantasies—and Nightmares”)
Besides the dangers we just discussed, AI imaging tools can also pull off “deepfakes.” In 2021, TikTok videos of Tom Cruise went viral.
The only problem was that the real Cruise never appeared in them. The videos were fabricated using advanced AI, and to the casual observer they were very convincing. Who’s to say the same can’t be done with politicians and other public figures? And what might deepfakes mean for Hollywood actors? If a young Clint Eastwood can be conjured from thin air, we may see an age of posthumous acting and shrinking demand for human actors. There are many angles to consider.
Can we trust ourselves to be responsible for this technology, or are the guardrails necessary? In one sense, we can’t blame AI tools for the content they produce. Humans invented these tools, and the AI can only draw on images that are already out there, based on text prompts we put in. Just like we can’t praise AI for creating “original” artwork, we can’t hold this technology responsible for creating novel horrors.
At the same time, putting boundaries on technology like this seems like the responsible thing to do as a society, especially if AI image generators become increasingly mainstream. We don’t want this technology in the hands of impressionable children if we can help it. We should also remember that humans are the real creative agents here: our technology reflects both our virtues and our vices, and it often amplifies the vices if we’re not watchful. The more art people themselves produce, the less tempted we’ll be to hand creativity over to the machine.