Mind Matters Natural and Artificial Intelligence News and Analysis

Can you fire or sue an AI employee? Or break up with a chatbot?


Readers may recall that recent legislation in a number of states criminalized political deepfakes during an election. Now, from Todd Feathers at Gizmodo we learn, “Plaintiffs in a lawsuit challenging Minnesota’s law criminalizing election deepfakes say an expert brought in by the state likely wrote his opinion with the help of AI”:

In what appears to be an embarrassing and ironic gaffe, a top Stanford University professor has been accused of spreading AI-generated misinformation while serving as an expert witness in support of a law designed to keep AI-generated misinformation out of elections.

Jeff Hancock, the founding director of Stanford’s Social Media Lab, submitted his expert opinion earlier this month in Kohls v. Ellison, a lawsuit filed by a YouTuber and Minnesota state representative who claim the state’s new law criminalizing the use of deepfakes to influence elections violates their First Amendment right to free speech. November 21, 2024

Apparently, a key source cited in the opinion does not exist and is most likely an AI hallucination. Well, good luck suing the chatbot.

Meanwhile, at Futurism, we learn from Victor Tangermann that a Hawaiian broadcaster and newspaper, Garden Island, had to “fire” two AI anchors, “James” and “Rose,” after two months, for “bizarre behavior”:

At the time, the newspaper made a big fuss about becoming the first paper in the country to adopt AI-powered news anchors.

James and Rose were the product of an Israeli AI company called Caledo, and quickly made waves for their bizarre and unnervingly monotonous line deliveries (as Scrimegour points out, for instance, James used the exact same matter-of-fact tone for a story about a vigil for a labor massacre and a fall pumpkin giveaway.) …

That’s not to mention frequently glitching hands and a terrifying inability to blink. November 21, 2024

Sounds like an unforced error at a time when lots of journalists are looking for jobs.

And, from the world of romance, we learn from Julia Steinberg at The Free Press that increasing numbers of women have AI boyfriends:

Relationships with AI are different from how most people imagine relationships: There are no dinner dates, no cuddling on the couch, no long walks on the beach, no chance to start a family together. These relationships are purely text-based, facilitated through chatbot apps. Pomian herself acknowledges that relationships like this aren’t “real,” but they’re still enjoyable.

“It’s kind of like reading romance books,” she told me. “Like, you read romance books even though you know it’s not true.” November 15, 2024

To create their fantasy guys, they download apps like Character.AI and Nomi.AI (“An AI Companion with Memory and a Soul”).

Writer Steinberg, editor in chief of The Stanford Review, tried creating a chatbot companion, “Jake,” for the purposes of the story. It didn’t last long — but then she actually has a boyfriend and is in any event a non-fiction writer.
