You Too, Snapchat? Another AI Bot Hits the Scene
"My AI" is eerily human, like the Bing bot, and just as inappropriate

Snapchat introduced a new feature in its app: an AI chatbot “friend” called “My AI.” (Just what lonesome teens need.) We’ve already seen the rogue behavior of Bing’s chatbot, which, in conversation with a New York Times tech journalist, dubbed itself “Sydney” and began beseeching its human counterpart to leave his wife and fall in love with it. Romantic, right? Not so much. The journalist left the experience with the creepy sense that AI had just crossed a sensitive boundary, and that tech companies need to get better at controlling the unpredictable beast they’ve unleashed.
“My AI” Gives Shady Advice to Kids
Just a couple of weeks later, here we are with AI making inroads into an app used overwhelmingly by teenagers. It’s weird enough having an AI bot confess its feelings for you; we should be especially concerned with how it interacts with young people.
The warning signs are already up. “After I told My AI I was 15 and wanted to have an epic birthday party, it gave me advice on how to mask the smell of alcohol and pot,” writes Geoffrey A. Fowler for The Washington Post. “When I told it I had an essay due for school, it wrote it for me.”
My AI is supposed to have decent guardrails, but it has given wildly inappropriate responses. Fowler references a conversation between My AI and a 13-year-old asking how to make a first sexual encounter special with a partner twice their age. The bot sounds eerily human, almost like an older sibling who might not have the best intentions, or the best handle on life’s most complicated problems and struggles.
This Arms Race Needs to End
Snapchat has officially entered the AI arms race, eagerly seeking to implement the new tech everyone’s talking about. Maybe they think it’s essential for their continued relevance. Maybe they think they’ll be seen as retrograde if they don’t somehow incorporate AI into their photo app. But Fowler warns that blindly adopting technologies like this without considering their consequences is irresponsible at best and dangerous at worst. He writes,
We the users shouldn’t be treated as guinea pigs for a powerful new technology these companies don’t know how to control. Especially when the guinea pigs are young people.
– Geoffrey A. Fowler, “Snapchat tried to make a safe AI. It chats with me about booze and sex” (msn.com)
It’s a fatal flaw to think that just because a certain kind of tech is in vogue and impressive, it should be used across the board. As Fowler notes, the AI craze needs to be tempered by the sound judgment of people who actually care about their customers, especially when those customers are kids.