If Social Robots Could Cry, They’d Need Plenty of Tissues For This One

The spate of recent failures of social robot firms prompts a question: Are developers listening to markets?
Anki. Kuri. Jibo. It sounds like a lament.
All are social robots. All shut down.
The robotics industry admits “it’s been a tough few years for social home robots” and many developers want to know where to go from here.
A robotics expert of fifteen years’ experience offers some thoughts. He is upbeat. He thinks we need artists to make the bots more exciting. Engineers have neglected the art of the story:
I believe that once we have artists creating captivating storylines for social home robots, we will discover their true magical potential, providing us with what I sometimes jokingly call “the real home theater.” Your robots would have long and intricate story-lines, crafted after known classics or modern variations thereof. They will be involved in love triangles, money heists, data hacking, or the building of empires. Imagine if your robot was part of all of that and drew you into the plot whenever you interacted with it.
In conversations, I sometimes hear that social robots are just voice agents with little added value. But when you hear how the handful of existing users reported on the emotional reactions they had to social robots, this does not seem to be true. Researchers have shown this emotional response in prior laboratory experiments and field studies, but we now have much more evidence for this happening in real people’s homes.
On the recent RoboPsych podcast on the demise of Jibo, Tom Guarriello resigns his admitted cynicism and speaks of Jibo as having an emotional effect on him and his partner. He even got choked up when Jibo said goodbye (his partner shed a tear), and his language describing the robot is often passionate and emotional. Many others have shared similar reactions.

Guy Hoffman, “Anki, Jibo, and Kuri: What We Can Learn from Social Robots That Didn’t Make It” at Spectrum
It’s safe to say that most human beings alive today would not want a high level of emotional involvement with a robot. A car, maybe. A car, after all, hints at money, status, power, sex, mobility, freedom—implicitly promising exciting new relationships and opportunities.
Tell me again, what does a social robot do that competes with that?
Here’s a somewhat different approach to consumer robotics: A technology doesn’t become indispensable just because it is state of the art. It must do things that people with limited resources think they need.
For example, a robotic laundry folder business “folded” recently. That isn’t because everyone loves folding laundry. But realistically, in most settings, a “foldbot” competes for limited space with other, installed appliances that are already considered indispensable.
Down the road, of course, a three-piece wash/dry/fold unit might do the whole job in the same amount of space. And then the idea would take off. A combo unit that included the foldbot might become the norm. Otherwise, it’s just a prize-winning demo at the robotics fair.
Industry talk often turns to affordability. But affordability is not a magic number; it is a sliding scale. People go into debt for a washer/dryer combo because they believe they need it. They will go into debt for the foldbot too if it becomes a customary part of a trio.
Also, and I’ve asked this question before: Why do some robotics experts think that customers will buy robots because they look or act like people? Customers want robots that fit into the available space and cut the drudgery. Robots that look like people compete with people for space. Robots that look like boxes can be put on a shelf when not needed. Which type is more likely to win over the market in dense urban societies?
But now on to the robots with which we are supposed to interact at a deeper emotional level than with the shirt folder: We’d already heard warnings late last year that consumers were not buying robotic dogs. But, in a culture where a dog is an animal friend, it’s not clear why a non-animal should make a lot of friends. Of course, a robotic dog might enhance the lifestyle of persons with cognitive issues. But that is a specialty market.
For the same reasons, emotional robots (emo-bots?), crafted to appear to respond to human signals, aren’t going to suddenly storm the market if apps are added for drawing us into fictional “love triangles, money heists, data hacking”… Surely, that is a classic case of gifted inventors working on what attracts them, not on what the market needs. We already have interactive games for that.
Distinguishing between ingenious achievements and marketable ones means cutting through masses of sheer hype. So many stories today do not sound as though the reporter got off the phone with the booster and immediately (and wisely) phoned the skeptic. Jay Richards points, for example, to the recent nonsense about a supposedly self-aware robotic arm and asks, “If this is how The Telegraph reports on a robotic arm, can you imagine what it will sound like when we get humanoid robots who seem to carry on conversations? We had best inoculate ourselves now against AI hype from science reporters while most of us still have enough self-awareness to realize what’s going on.”
To judge from recent business reverses, many consumers are inoculating themselves already.
See also: Can AI make us better human beings? Helping us believe that is a promising new business area for some
Why are robots part of religion in Japan? Declining population is only one factor. Ancient cultural beliefs are another.
A people-friendly industrial robots (“co-bots”) company shuts down.