Image: Woman's face with AI wireframe, illustrating artificial intelligence deepfakes and facial scanning (Credit: Brian / Adobe Stock)

Character.AI Let a User “Recreate” Deceased Woman

I think we can all agree this really crosses the line.

Pandora’s box is an apt metaphor for the age of AI. Questioning the range of impacts of new technologies seems like a simple ethical necessity, but some major AI companies haven’t gotten the memo. Character.AI, an OpenAI competitor founded by a group of former Google employees, lets users create AI avatars that mimic real or imagined characters: celebrities, politicians, or figures from novels and movies. Recently, a user created an avatar on the platform based on Jennifer Crecente, a young woman who was murdered by her ex-boyfriend in 2006.

The company allowed an AI character based on a dead person to chat with the living, much to the horror of Crecente’s family. Jennifer’s uncle, Brian Crecente, as Futurism reports, wrote a furious post on X decrying Character.AI’s blatant ethical failure to protect the young woman’s memory and dignity. Foster Kamer writes,

In response to Brian Crecente’s outraged tweet, Character.AI responded on X with a pithy thank you for bringing it to their attention, noting that the avatar is a violation of Character.AI’s policies, and that they’d be deleting it immediately, with a promise to “examine whether further action is warranted.”

In a blog post titled “AI and the death of Dignity,” Brian Crecente explained what happened in the 18 years since his niece Jennifer’s death: after much grief and sadness, her father, Drew, created a nonprofit that works to change laws and runs game design contests in her honor, a way of finding purpose in the family’s grief.

Although the profile has since been taken down, it’s fair to ask how these billion-dollar AI companies should be regulated to prevent further damage like this. Can incidents like this be avoided at all, or will AI companies simply treat them as unfortunate but inevitable outliers? Character.AI says that an avatar of a deceased person violates its guidelines, but given the power of the technology, it’s hard to see how people won’t find loopholes to do similar things again.

The situation opens up a bigger discussion about AI regulation: where the ethical lines should be drawn, and who should have the power to draw them.


Peter Biles

Writer and Editor, Center for Science & Culture
Peter Biles is a novelist, short story writer, poet, and essayist from Oklahoma. He is the author of three books, most recently the novel Through the Eye of Old Man Kyle. His essays, stories, blogs, and op-eds have been published in places like The American Spectator, Plough, and RealClearEducation, among many others. He is a writer and editor for Mind Matters and is an Assistant Professor of Composition at East Central University and Seminole State College.
