AI Psychosis and the Need for Human Exceptionalism
I pronounce you husband and chatbot?

AI is a tool with certain potentials and limits across various fields, but basic anthropological confusion can do a lot of damage. What happens when AI programs cease to be seen as mere tools, meant to be used in limited ways and used wisely, and are instead considered “persons”? It sounds silly to pose the question, but that’s where we are.
Futurism writer Frank Landymore reports on an Ohio legislative measure to ban human-AI marital unions. The bill must be intended as a preventative measure, since AI bots and programs aren’t recognized as legal persons (yet), but it speaks to a cultural trend that, if left unaddressed, could grow out of proportion. Landymore writes,
Popular chatbots are capable of being eerily lifelike, effortlessly playing along with any conversation they’re thrown into. Often, the AI responses are sycophantic, confirming a human’s beliefs no matter how unfounded, and creating an impression that they’re actually alive or intelligent. That makes them quite adept at wrapping lonely sad sacks around their fingers. A recent survey, for example, indicated that nearly a third of US adults said they’ve had an “intimate or romantic” relationship with an AI chatbot.
Landymore goes on to mention “AI psychosis,” a state of delusion that occurs when AI users become subsumed in virtual reality and no longer have a proper grip on what’s real. Marlynn Wei writes in Psychology Today, referring to AI psychosis,
This phenomenon, which is not a clinical diagnosis, has been increasingly reported in the media and on online forums like Reddit, describing cases in which AI models have amplified, validated, or even co-created psychotic symptoms with individuals. Most recently, there have been concerns that AI psychosis may be affecting an OpenAI investor.
AI chatbots may inadvertently be reinforcing and amplifying delusional and disorganized thinking, a consequence of unintended agentic misalignment leading to user safety risks.
If this is true, then people with a pre-existing vulnerability to mental health struggles may be especially susceptible to this kind of AI-induced, delusional mindset.
Some lawmakers are taking action to keep these troubling trends from becoming codified in public policy. It’s easy to see why. When we lose sight of human exceptionalism and devalue true connection, personalized computer algorithms start to seem like a viable option for companionship.
