The Danger of Deepfakes (and Deepcake)
In a metaverse world dominated by AI, life and art are in danger of being eclipsed.
Meta CEO Mark Zuckerberg and other optimistic futurists think the metaverse is our collective future. We will exist in virtual nonexistence. We will eat, shop, worship, communicate, and marry in the metaverse. This is where progress is taking us, so we'd best go along for the ride if we don't want to get left behind. But "living" in a metaverse may be much more complicated than you might think. The question of identity, and of who has the power to distribute identities at will, haunts the metaverse project, and there doesn't seem to be an easy solution.
In September, a Russian deepfake company called "Deepcake" (the name strangely makes me hungry for dessert) pasted the face of Bruce Willis onto a younger actor in a commercial. According to a Wired article by Steven Levy, most viewers thought this humanoid really was Willis. It's a convincing likeness, much like the viral Tom Cruise deepfakes from last year. Levy writes,
“While the figure has Willis’ face, it doesn’t quite convey his trademark insouciance. And for some reason, this Willis has a different voice—a gruff bark that speaks Russian. Still, it looks like Willis—digitized and generated, Chmir says, by algorithms trained on 34,000 images from his earlier films.”
Steven Levy, What’s Deepfake Bruce Willis Doing in My Metaverse? (wired.com)
In addition to celebrity imitation, Deepcake digitally "cloned" someone who didn't want to appear on camera for an educational film project. If you have stage fright, fear not! Contact Deepcake and they'll digitally recreate you to a T.
It is alarming that AI tech can so convincingly replicate human features. The line between reality and lies, especially in a metaverse, will only grow blurrier.
The metaverse, though I've personally never visited it, seems like a malleable creation. It's built to be tampered with. With deepfakes and a plenitude of avatars, it's worth asking how we will differentiate between the online projection and the real thing, or whether we will eventually stop caring about the distinction at all. Suppose an actor is dead. No problem. AI can scan past images of the actor's face and recreate him or her at will. Suppose you want to go on a date with Taylor Swift. Cool! Just put in the order at Deepcake and you'll be on your way to Tompkins Square Park arm in arm. Is a movie problematic, or in need of some retroactive editing? Let's go back and dot some i's and cross some t's.
While AI deepfakes have political ramifications (digitally cloning world leaders, creating false news, etc.), the actual human qualities of our lived experience may grow ever dimmer as we defer to the metaverse and its deepfake minions. If I can choose my own virtual world, the actual created order loses its appeal. Social media and the internet already seduce us into an alternative "reality," charming us with the possibility of ideal online selves and offering an escape from the drudgery of daily life. I already regret how much time I've spent curating an online presence while neglecting the real world and the people around me.
But companies like Deepcake also raise questions about the future of the arts. Why do we need human actors if they can be reassembled digitally? Why write novels if AI can generate stories, or paint if AI tools can do the trick? In an October 10th podcast hosted by Wesley J. Smith, Dr. Robert J. Marks II noted that while AI can be a powerful tool, it can never be creative. Human beings are uniquely imaginative, and that distinguishes them from the machines they create. The danger now, it seems, is merging ourselves too much with our own technology and diminishing our creative capabilities in the process. We created AI, but that doesn't mean we can depend on it to replace human creativity. If we do, we will find ourselves not more but less creative. Perhaps when we discover that technology is using us instead of the other way around, it's time to reevaluate our relationship with it.