[Image: Deepfake concept, matching facial movements with a different face of another person (face swapping or impersonation). Adobe Stock]

Sci-fi Could Come To Life If You Fall For a Deepfake Friend

What if the friend you knew only online turned out to be a starkly believable software synthesis? A Carnegie Mellon prof says it could happen today

That’s the looming scenario astrophysicist and social scientist Simon DeDeo sketched for journalist Kelly Catalfamo, because the required technology — GPT-3, facial GANs, and voice synthesizers — exists now. Catalfamo asks,

Now imagine how you’d feel if you found out your friend didn’t really exist. Their profile turns out to be a Frankensteinian mashup of verbiage dreamed up by the powerful language generator GPT-3 and a face born from a generative adversarial network, perhaps with a deepfaked video clip thrown in here and there. How would it affect you to learn that you had become emotionally attached to an algorithm? And what if that “person” was designed to manipulate you, influencing your personal, financial, or political decisions like a garden-variety scammer or grifter?

Kelly Catalfamo, “Professor Warns of ‘Nightmare’ Bots That Prey on Vulnerable People” at Futurism

Today, GPT-3 generates autobabble by synthesizing masses of sources from the internet, in much the same way that generative adversarial networks (GANs) synthesize thousands of faces to create a plausible one that never existed. And deepfake videos are getting more convincing all the time. Put them all together and aim them at a vulnerable person …
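To see how little effort such autobabble takes, here is a minimal sketch, purely for illustration and not anything from the interview: it uses the freely available GPT-2 model through the Hugging Face transformers library as a stand-in for GPT-3, and the prompt and settings are made up for the example.

```python
# A minimal sketch of machine-generated "autobabble" (assumes the transformers
# library and a backend such as PyTorch are installed; GPT-2 stands in for GPT-3).
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")

# A hypothetical "friendly" opener; the model continues it fluently.
prompt = "It was great catching up with you yesterday. I've been thinking about"
result = generator(prompt, max_new_tokens=40, num_return_sequences=1)

# The continuation reads as grammatical, personable text, but nothing behind it is true.
print(result[0]["generated_text"])
```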

In the interview, DeDeo and Catalfamo focus on convincing GPT-3 autobabble, not on glitzy video fakes like the Tom Cruise deepfake, which was detected simply because so many skeptical people were scrutinizing it:

“I can con you without needing to fake a video,” he said. “And the way I con you is not by tricking your visual system, which the video deepfakes do. I con you by tricking your rational system. I con you at a much higher level in your social cognition.”

Kelly Catalfamo, “Professor Warns of ‘Nightmare’ Bots That Prey on Vulnerable People” at Futurism

What makes it a humiliating risk, he goes on to say, is that communications on social media are “cyborgian” — partly human intelligence and partly artificial intelligence. Facebook determines what to show you about, for example, your old college roommate. But is that what you would notice or care about if you met the former roomie again at a college reunion? Spend enough time on social media and what’s being altered is not an artificial intelligence creation but you.

DeDeo sees an upside to all this: we could critique more carefully what we ourselves do and say:

“There’s a lot of not-thinking that human beings do,” said DeDeo, who, as a professor, has read his share of formulaic essays. “There’s a lot of things people say that sound smart but actually have zero content. There’s a lot of articles that people write that are meaningless. GPT-3 can imitate those to perfection.”

Kelly Catalfamo, “Professor Warns of ‘Nightmare’ Bots That Prey on Vulnerable People” at Futurism

Of course. The machine can shake, stir, and pour out a synthesis of content that has been repeated, without reflection, thousands of times. It need only make grammatical sense. If we’re vulnerable to being used in that way, it’s time to get off social media and somehow make contact with real people, as soon as the COVID-19 lockdowns ease.

Don’t miss the many other interesting revelations in the interview.


You may also wish to read:

Can deepfakes substitute for actors? Would you care if the actor is a real person or not?


Mind Matters News

Breaking and noteworthy news from the exciting world of natural and artificial intelligence at MindMatters.ai.
