Could Robots Be Programmed to Feel Ordinary Love?
The question is a bit more complex than we might at first think, as the British TV series Humans demonstrates

Artificial Intelligence (AI) systems can now imitate interactive human conversation so well that people are “falling in love” with them.
The phenomenon worries psychiatrists like Dr. Susan B. Trachman, whose 2024 article, “The Dangers of AI-Generated Romance,” describes the surface benefits of AI-powered romantic relationships but warns of serious downsides. Trachman quotes a colleague who observes that AI romance bots offer “companionship without judgment, drama, or social anxiety but lack genuine human emotion and offer only simulated empathy.”
But can the AI system itself fall in love, too?
I’ve been watching the British TV series Humans (2015–2018), a beautifully produced and acted sci-fi story. It follows humans interacting with human-identical robots called “synthetics” as the “synths” become “conscious.” Every episode presents at least one fascinating moral or philosophical question related to AI.
In Humans, Season 2, Episode 3, the synth Mia, who looks like a human woman, has received software that makes her appear conscious. She talks as though she were conscious. Soon, standing on the sandy seacoast, she expresses her love to Ed, the human man who owns her:
I’m going to say this, and then you’ll never have to see me again, if that’s what you want. I like you more than anything I’ve ever seen or heard or touched. Everything normal is bigger and brighter when I’m with you. You make everything … more.
As I followed the series, I felt empathy for the lovely, fearful young woman, Mia. It was hard to remember that she’s just a robot.
I snapped out of the trance, however, to ask: How could an AI system fall in romantic love? It’s not an abstract question for me. Dr. Harville Hendrix and Helen LaKelly Hunt wrote Getting the Love You Want (1988), and frankly that book changed my life by explaining how romantic attraction works and how a romance-based marriage can prosper. I refined my question: Using Hendrix’s model, could a seemingly “conscious” robot fall in love the way humans do?
Romantic attraction results from an awful struggle
Human babies invest enormous energy in basic survival skills, including acquiring language, learning to operate their bodies in new environments, and figuring out how their caregivers work. Babies study their caregivers’ faces and every move, learning what it takes to encourage the caregivers to feed and protect them. That work is life-or-death intense. Babies and young children grow up with deep knowledge of how to work with or around particular caregivers’ styles, methods, language, moods, and so on.
Hendrix argues that we unconsciously choose spouses or life partners who reflect both the positive and the negative traits of our early caregivers. We unconsciously seek partners who remind us of our early caregivers, not just because we are comfortable with them, but also because we subconsciously believe they hold the key to healing old wounds. Early romantic attraction comes as we idealize the potential life partner, projecting our hopes, dreams, and unmet needs onto them.

Romantic attraction is not random. Hendrix explains that we are often drawn to what feels familiar, even if it is dysfunctional. We may pursue a new relationship, believing that it will “complete” us, echoing childhood desires for unconditional love and acceptance. On that view, partners are attractive when we sense that they will help us recreate and heal unresolved childhood wounds.
Hendrix points out additional elements that build more romantic attraction. We may look for and appreciate people who possess qualities we disowned or suppressed in our youth, or who have qualities we would like in our lives. Extroverts may seek introverts, and vice versa. People from strict upbringings may seek a “free-spirited” partner.
Robot love cannot be human
Drawing on these main points from Hendrix’s approach to romantic attraction and love, we can restate the question: Can an AI system — a robot or synth — fall in love as humans do? This question challenges the concept that AI systems could someday become essentially human or should be treated as if they were human, with thoughts and feelings that make them no different from humans.
Let’s walk through the Hendrix approach. In the Humans episode, Mia speaks to Ed using words that dovetail with human romantic attraction. Mia says she likes Ed more than anything she has ever seen, heard, or touched. She describes her elation because Ed makes everything bigger and better — implying that he fulfills her unmet needs. Ed somehow completes Mia, in the way that Hendrix observes happening in human romantic attraction. So far, Mia is in love!
What’s missing?
Mia was never a baby or child who depended upon caregivers to survive. She never mapped every detail of her caregivers or found ways to manipulate them to secure her survival. Moreover, Mia never had wounds or unmet needs from childhood, so she cannot be attracted to Ed as a potential healer and provider.
What about the other elements that interlock to create romantic attraction? Mia could be attracted to qualities in a human only if her software identified those qualities by observing humans and then decided that associating with a human with those qualities improved or completed her “life.”
AI’s emotional circularity
The key to the Hendrix model is this: Romantic attraction does not arise from marking off a checklist of compatible qualities. It stems from the youngest human mind’s struggle to get caregivers to ensure its survival. That genesis is what makes romantic attraction, or the loss of a romantic relationship, seem so earth-shattering to us. The ballad “Everything You Want” (2000) by Vertical Horizon hauntingly sings to this point.
Mia’s attraction to Ed, by contrast, does not come from the processes that produce human romantic attraction. Her expressions of attraction mimic the way humans would voice such feelings, but her underlying “feelings” do not arise from anything like human processes.
No matter how winningly, even heartbreakingly, spoken, Mia’s words come from software that itself came initially from human ideas about how a synth could “feel.”
It doesn’t matter that software extracted these ideas from the Internet using large language models. Sophisticated software might use Hendrix’s concepts about how to find an “attractive” human. But whatever ways Ed pleased, complemented, or healed Mia — or gave her new hope — those ways came from software that identified and ranked personality traits. It then directed Mia to seek or reject the mix of traits her AI system found. And that software circles back to human creators’ thoughts, not to computers’ independent minds.
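To see that circularity concretely, consider a minimal, purely hypothetical sketch (written here in Python) of the kind of trait-ranking such software might perform. The trait names, weights, and scores below are invented for illustration and come from neither the series nor any real system; the point is simply that however the ranking is done, the criteria trace back to human designers rather than to any childhood struggle of the machine’s own.

```python
# Hypothetical sketch only: nothing here is taken from the series or any real system.
# A synth's "attraction" reduced to a human-authored scoring function over
# personality traits. The trait list and weights are the designers' choices,
# which is the circularity described above.

HUMAN_AUTHORED_WEIGHTS = {   # chosen by programmers, not learned from any childhood
    "kindness": 0.4,
    "patience": 0.3,
    "humor": 0.2,
    "shared_interests": 0.1,
}

def attraction_score(observed_traits: dict) -> float:
    """Rank a person by the traits the software was told to value (each 0..1)."""
    return sum(HUMAN_AUTHORED_WEIGHTS.get(trait, 0.0) * value
               for trait, value in observed_traits.items())

# Example: how the synth's software might score "Ed" after observing him.
ed = {"kindness": 0.9, "patience": 0.7, "humor": 0.6, "shared_interests": 0.8}
print(f"Attraction score for Ed: {attraction_score(ed):.2f}")  # prints 0.77
```

A real system would be vastly more elaborate, but added sophistication would not change where the criteria come from.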
No matter how attractive, intelligent, helpful, sexy, and human-looking they are, the AI-powered synths remain ultimately man-made machines. AI cannot ever feel ordinary love.
Here’s a trailer for the Humans series:
