Human Emotions: Now Predictable at Scale?

A lowdown on how the global Emotion Detection and Recognition Technology space is booming

That the COVID-19 pandemic has led to marked growth in several facets of the global business experience through the adoption of Artificial Intelligence, Machine Learning, Deep Learning and IoT technologies should come as a surprise to nobody. With automation now central to business processes, greater demands for operational efficiency and social intelligence in AI agents have accelerated the development of speech-based emotion detection systems that analyse emotional states as well. In fact, a recent study by Markets and Markets Research Pvt. Ltd. reports that in a post-COVID-19 scenario, “the global emotion detection and recognition market size is projected to grow from $19.5 billion in 2020 to $37.1 billion by 2026, at a Compound Annual Growth Rate (CAGR) of 11.3% during the forecast period.”
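The quoted growth rate is easy to sanity-check from the two dollar figures in the report; a quick calculation under the standard CAGR formula:

```python
# Sanity-check the quoted forecast: $19.5B (2020) growing to $37.1B (2026).
start, end, years = 19.5, 37.1, 6

# CAGR = (end / start)^(1/years) - 1
cagr = (end / start) ** (1 / years) - 1
print(f"CAGR: {cagr:.1%}")  # -> CAGR: 11.3%, matching the report
```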

E-motions, Recognised

Emotion detection and recognition (EDR) technology is essentially an evolution of existing facial recognition systems, but far more invasive. EDR systems not only infer how someone is feeling at any given moment, but also attempt to decode their intentions and predict personality traits on the basis of fleeting expressions.
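At its core, this kind of system scores facial features and maps the strongest signal to an emotion label. The toy sketch below illustrates that idea only; the feature names, scores and mapping are invented for the example and do not reflect any vendor's actual system.

```python
# Toy illustration of the EDR idea: score facial features, then label the
# strongest one with an emotion. All names and values here are hypothetical.
FEATURE_TO_EMOTION = {
    "brow_lowered": "anger",
    "lip_corners_up": "happiness",
    "lip_corners_down": "sadness",
}

def classify(feature_scores: dict) -> str:
    """Return the emotion mapped to the highest-scoring facial feature."""
    strongest = max(feature_scores, key=feature_scores.get)
    return FEATURE_TO_EMOTION.get(strongest, "neutral")

print(classify({"brow_lowered": 0.2, "lip_corners_up": 0.8, "lip_corners_down": 0.1}))
# -> happiness
```

The fragility of this mapping, treating one expression as proof of one emotion, is exactly what the accuracy critics discussed below take issue with.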

According to the Financial Times (FT), “Hundreds of firms around the world are working on emotion-decoding technology, in an effort to teach computers how to predict human behaviour. American tech giants including Amazon, Microsoft and Google all offer basic emotion analysis, while smaller companies such as Affectiva and HireVue tailor it for specific sectors such as automotive, advertisers and recruiters.”

The technology has found several novel applications as well: Disney has used the software to test volunteers’ reactions to films such as Zootopia and Star Wars: The Force Awakens; Kia Motors has used it to test driver alertness; and marketing firms such as Millward Brown have used it to gauge audience responses to advertisements for brands such as Intel and Coca-Cola.

And it’s not just for businesses either. EDR-enabled cameras have been installed in public spaces to analyse emotions on faces: in London’s Piccadilly Circus, to gauge people’s emotional reactions to the billboards placed there, and in Xinjiang in north-western China, where the technology was deployed in 2019 to “rapidly identify criminal suspects by analysing their mental state” among the estimated one million Uyghur Muslims held in detention camps. Government entities are adopting it too: the Lincolnshire Police in the UK have funded EDR technologies to help identify suspicious persons in their area.

As FT reports: “No matter the application, the goal is the same: to make humans less inscrutable and easier to predict at scale.”

Primary Concerns: Accuracy and Privacy

Even as corporations and governments continue to roll out EDR technologies at scale, accuracy concerns persist. Research into these algorithms shows that facial expressions do not reliably reveal what a person feels or what they might do next. Consider this excerpt from FT:

“People, on average, the data show, scowl less than 30 per cent of the time when they’re angry,” wrote Lisa Feldman Barrett, a psychologist at Northeastern University and one of the reviewers. “So, scowls are . . . an expression of anger — one among many. That means that more than 70 per cent of the time, people do not scowl when they’re angry. And on top of that, they scowl often when they’re not angry.” The authors added that it was “not possible to confidently infer happiness from a smile, anger from a scowl, or sadness from a frown, as much of current technology tries to do when applying what are mistakenly believed to be the scientific facts.”
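Barrett’s point can be made concrete with a base-rate calculation: even when a scowl is more likely during anger than otherwise, it can still be weak evidence of anger. The ~30% scowl-when-angry figure comes from the review quoted above; the other two rates below are purely illustrative assumptions for the example.

```python
# Illustrative Bayes calculation: how predictive is a scowl of anger?
p_angry = 0.10                  # assumed base rate of anger at any given moment
p_scowl_given_angry = 0.30      # from the review: people scowl <30% of the time when angry
p_scowl_given_not_angry = 0.10  # assumed: people also scowl when not angry

# Total probability of observing a scowl, then Bayes' rule
p_scowl = p_scowl_given_angry * p_angry + p_scowl_given_not_angry * (1 - p_angry)
p_angry_given_scowl = p_scowl_given_angry * p_angry / p_scowl
print(f"P(angry | scowl) = {p_angry_given_scowl:.2f}")  # -> 0.25
```

Under these assumed rates, a scowl implies anger only a quarter of the time, which is why inferring emotion from a single expression is so error-prone.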

Beyond measurement, American psychologist Paul Ekman says companies need to invest in research to prove links between expressions and behaviour. “Simply measuring the face doesn’t tell you whether your interpretation of it in that instance is correct or incorrect. Most of what I was seeing was what I would call pseudoscience — they weren’t doing the research to show the interpretation of the measurements was correct,” he says.

Although such challenges, coupled with privacy issues surrounding consent, remain at the forefront of debates around EDR, it is undoubtedly one of the more promising technologies of our time. Strong AI governance and regulation will be key to the success of Emotion Detection and Recognition going forward.

© 2023 Praxis. All rights reserved.