Emotion Recognition Software Use Spreads While Science Is Doubted
Emotion recognition software has been coming under fire for misuse and racial bias for some time.

An editor at AI Trends notes:
The global emotion detection and recognition market is projected to grow to $37.1 billion by 2026, up from an estimated $19.5 billion in 2020, according to a recent report from MarketsandMarkets. North America is home to the largest market.
John P. Desmond, “Market for Emotion Recognition Projected to Grow as Some Question Science” at AI Trends (June 24, 2021)
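The growth rate implied by those two figures is easy to check. A minimal sketch (the dollar amounts are from the MarketsandMarkets projection quoted above; the compound-annual-growth-rate formula is the standard one):

```python
# Implied compound annual growth rate (CAGR) for the quoted projection:
# an estimated $19.5 billion in 2020 growing to $37.1 billion by 2026.
start, end = 19.5, 37.1          # market size in billions of dollars
years = 2026 - 2020              # 6-year projection window
cagr = (end / start) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.1%}")  # roughly 11% per year
```

In other words, the projection assumes the market nearly doubles in six years, compounding at a little over 11% annually.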
But the software has been coming under fire for misuse and racial bias for some time:
“How people communicate anger, disgust, fear, happiness, sadness, and surprise varies substantially across cultures, situations, and even across people within a single situation,” stated the report, from a team of researchers led by Lisa Feldman Barrett, of Northeastern University, Mass General Hospital and Harvard Medical School.
John P. Desmond, “Market for Emotion Recognition Projected to Grow as Some Question Science” at AI Trends (June 24, 2021)
See, for example, a recent study:
It is commonly assumed that a person’s emotional state can be readily inferred from his or her facial movements, typically called emotional expressions or facial expressions. This assumption influences legal judgments, policy decisions, national security protocols, and educational practices; guides the diagnosis and treatment of psychiatric illness, as well as the development of commercial applications; and pervades everyday social interactions as well as research in other scientific fields such as artificial intelligence, neuroscience, and computer vision. In this article, we survey examples of this widespread assumption, which we refer to as the common view, and we then examine the scientific evidence that tests this view, focusing on the six most popular emotion categories used by consumers of emotion research: anger, disgust, fear, happiness, sadness, and surprise.

The available scientific evidence suggests that people do sometimes smile when happy, frown when sad, scowl when angry, and so on, as proposed by the common view, more than what would be expected by chance. Yet how people communicate anger, disgust, fear, happiness, sadness, and surprise varies substantially across cultures, situations, and even across people within a single situation. Furthermore, similar configurations of facial movements variably express instances of more than one emotion category. In fact, a given configuration of facial movements, such as a scowl, often communicates something other than an emotional state. Scientists agree that facial movements convey a range of information and are important for social communication, emotional or otherwise. But our review suggests an urgent need for research that examines how people actually move their faces to express emotions and other social information in the variety of contexts that make up everyday life, as well as careful study of the mechanisms by which people perceive instances of emotion in one another.
We make specific research recommendations that will yield a more valid picture of how people move their faces to express emotions and how they infer emotional meaning from facial movements in situations of everyday life. This research is crucial to provide consumers of emotion research with the translational information they require.
Lisa Feldman Barrett, Ralph Adolphs, Stacy Marsella, Aleix M. Martinez, and Seth D. Pollak, “Emotional Expressions Reconsidered: Challenges to Inferring Emotion From Human Facial Movements,” Psychological Science in the Public Interest (2019). PMID: 31313636; PMCID: PMC6640856; DOI: 10.1177/1529100619832930
A short piece at Forbes gives some sense of the high hopes that proponents have for the technology.
Desmond notes,
The science behind emotion recognition is increasingly being questioned. A review of 1,000 studies found the science behind tying facial expressions to emotions is not universal, according to a recent account in OneZero. The researchers found people made the expected facial expression to match their emotional state only 20% to 30% of the time.
John P. Desmond, “Market for Emotion Recognition Projected to Grow as Some Question Science” at AI Trends (June 24, 2021)
One sought-after use of the technology is to determine whether, for example, a driver is “asleep at the wheel.” But that can be done without making any inferences about the driver’s emotional state or opinions. The problem, going forward, is that this valuable, possibly life-saving use is offered to make the case for the technology in general, while many of its other premises are far more questionable.
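The point about emotion-free drowsiness detection can be made concrete. A minimal sketch, not from the article, using the eye aspect ratio (EAR), a standard metric from the computer-vision literature: closed eyes yield a near-zero ratio regardless of what the driver is feeling. The landmark coordinates and threshold below are illustrative assumptions, not values from any real system.

```python
# Drowsiness from eye geometry alone: no inference about emotion is needed.
# EAR uses six eye landmarks p1..p6 (p1/p4 = horizontal corners,
# p2/p3 = upper lid, p6/p5 = lower lid):
#   EAR = (|p2-p6| + |p3-p5|) / (2 * |p1-p4|)
# The ratio collapses toward zero when the eye closes.
import math

def eye_aspect_ratio(landmarks):
    """Compute EAR from six (x, y) landmark points ordered p1..p6."""
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])
    p1, p2, p3, p4, p5, p6 = landmarks
    return (dist(p2, p6) + dist(p3, p5)) / (2.0 * dist(p1, p4))

EAR_THRESHOLD = 0.2  # illustrative cutoff; real systems tune this per camera

# Hypothetical pixel coordinates for an open eye and a closed eye.
open_eye   = [(0, 0), (10, -6), (20, -6), (30, 0), (20, 6), (10, 6)]
closed_eye = [(0, 0), (10, -1), (20, -1), (30, 0), (20, 1), (10, 1)]

print(eye_aspect_ratio(open_eye))    # well above the threshold
print(eye_aspect_ratio(closed_eye))  # near zero: candidate drowsiness frame
```

In a real deployment, the check would run per video frame and raise an alert only after the ratio stays below the threshold for several consecutive frames; at no point does it classify an emotion.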
You may also wish to read:
AI researcher sounds alarm: AI “emotion detectors” are faulty science. An industry worth over $30 billion uses emotion recognition technology (ERT) on schoolchildren and potential hires, often without their knowledge or consent. The science behind the claim that AI can recognize six basic universal human emotions is coming under fire amid claims of race bias. (April 26, 2021)