Using Emotion Recognition Technology at Events
Think about using emotion recognition technology at your next event. Is that a smile or grimace on your face?
Using facial recognition software to analyze event attendees’ emotions was once seen as a bold step into the future; after pushback from attendees and planners, the technology now sits in a bit of limbo. As with any developing technology, developers are making bold claims, consumers and attendees want to understand how it benefits them and whether the collected data will be used responsibly, and event planners want to know that the accumulated data is accurate and actionable.
How Does Emotion Recognition Technology Work?
Humans are complex creatures who go through a wide range of emotions throughout a day. How we feel at a specific time impacts everything we do. When we’re feeling happy, we tend to be more outgoing and gregarious. When we’re sad, we act more withdrawn and insular.
These are social cues that let others know how we feel. If a typically extroverted friend acts quiet and reserved, you ask them what is wrong. You ask because people rarely come out and say how they are feeling. The majority of our communication is nonverbal, so we use facial expressions and context clues to make assumptions about how other people feel.
Emotion recognition technology works in roughly the same manner. Artificial intelligence software is trained by analyzing a vast series of expressions associated with an emotion. The same is done using text and speech patterns. The software is then able to scan a crowd and, using the information in its database, identify how people are feeling.
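The classification step described above can be sketched in a few lines of code. This is a deliberately simplified toy, not any vendor’s actual model: real systems learn from vast labeled datasets, whereas here the “training” is reduced to hand-written prototype feature values per emotion, and the feature names (`lip_corner_pull`, `brow_lower`) are illustrative stand-ins for the facial measurements such software extracts.

```python
# Toy sketch of the classification step: a "trained" model maps measured
# facial features to the emotion whose prototype they most resemble.
# Feature names and values are illustrative, not from a real system.

def classify_expression(features, prototypes):
    """Return the emotion label whose prototype is closest to the observed features."""
    def distance(a, b):
        # Squared Euclidean distance over the union of feature names.
        return sum((a.get(k, 0.0) - b.get(k, 0.0)) ** 2 for k in set(a) | set(b))
    return min(prototypes, key=lambda emotion: distance(features, prototypes[emotion]))

# Illustrative stand-in for training: prototype feature values per emotion.
PROTOTYPES = {
    "happy":   {"lip_corner_pull": 0.9, "brow_lower": 0.1},
    "angry":   {"lip_corner_pull": 0.1, "brow_lower": 0.9},
    "neutral": {"lip_corner_pull": 0.2, "brow_lower": 0.2},
}

print(classify_expression({"lip_corner_pull": 0.8, "brow_lower": 0.2}, PROTOTYPES))
# prints "happy" -- a strong smile with relaxed brows matches the happy prototype
```

The weakness discussed later in this article is already visible in this sketch: the classifier can only report which *expression* the face most resembles, not what the person actually feels.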
Why Use Emotion Recognition Technology?
Market researchers use emotion recognition technology to analyze people’s reactions to products. Comparing the software’s feedback to survey responses can be very revealing and uncover how people genuinely feel about a product.
Events can utilize emotion recognition technology to judge a crowd’s response to a keynote speaker, including reactions to specific aspects of a presentation. Sponsors and exhibitors can receive valuable feedback on the effectiveness of their displays. Event planners can use the tech to see how people felt about previously hard-to-measure aspects of the event, such as navigating the show floor.
Because the tech can rapidly evaluate thousands of faces, event planners can receive real-time information about the “happiest” places on the show floor. Conversely, they can immediately respond to “unhappy” areas to improve the attendees’ experience.
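A minimal sketch of that real-time aggregation: per-face scores from the camera feed are grouped by show-floor zone and averaged, so planners can rank areas from happiest to unhappiest. The zone names and scores below are made up for illustration.

```python
# Hypothetical sketch: average per-face "happiness" scores by show-floor zone.
from collections import defaultdict

def zone_averages(readings):
    """readings: iterable of (zone, happiness_score) pairs from the camera feed."""
    totals = defaultdict(lambda: [0.0, 0])   # zone -> [score sum, face count]
    for zone, score in readings:
        totals[zone][0] += score
        totals[zone][1] += 1
    return {zone: total / count for zone, (total, count) in totals.items()}

readings = [("main_stage", 0.8), ("main_stage", 0.6), ("registration", 0.2)]
averages = zone_averages(readings)
print(max(averages, key=averages.get))   # happiest zone
print(min(averages, key=averages.get))   # zone that may need attention
```

Because each reading is just a pair, the same aggregation works whether the scores arrive in batches or as a live stream.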
The potential of this technology extends well beyond marketing concerns. The healthcare industry can use it to monitor patient wellness and pain levels, especially after surgery. Companies are creating devices that may help people with autism understand others’ emotions so they can better form bonds. Smart cars may implement it to detect if a driver is impaired due to alcohol or exhaustion. The Chinese government is currently experimenting with emotion recognition technology to predict criminal behavior (which raises a whole host of ethical and moral issues).
Does Emotion Recognition Technology Work?
It’s one thing for emotion recognition technology to misread someone’s attitude toward a product, and something else entirely if it misinterprets a person as a potential criminal (whatever that means).
So, does emotion recognition technology work? Well, sort of.
Emotion recognition technology can accurately identify whether someone is smiling or frowning (and hundreds of subtler expressions) and make inferences based on these expressions. However, people are very good at hiding their emotions, and the software cannot account for that. Have you ever seen someone smiling with their mouth but not their eyes? There’s a saying that “the eyes are the windows to the soul.” While that may be true, emotion recognition technology may prioritize the smile and misread the person’s feelings.
Context is crucial to understanding how someone is feeling. Cultural differences can also make it difficult to read emotions or fully comprehend someone’s response to a situation.
In July 2019, a two-year review commissioned by the Association for Psychological Science was published, examining more than 1,000 studies of emotion recognition technology. Five scientists, each with a different specialty within emotion science, took part in the review.
“We weren’t sure if we would be able to come to a consensus over the data, but we did,” Lisa Feldman Barrett, a neuroscientist, professor of psychology at Northeastern University, and one of the review’s five authors, said to The Verge.
The review corroborated many of the issues listed earlier. While it agreed that there are many easily identifiable emotional expressions, it is extremely difficult to use a facial expression as a definitive read of an emotion.
“People, on average, the data show, scowl less than 30 percent of the time when they’re angry. So, scowls are not the expression of anger; they’re an expression of anger – one among many. That means that more than 70 percent of the time, people do not scowl when they’re angry. And on top of that, they scowl often when they’re not angry,” said Barrett.
“Would you want that in a court of law, or a hiring situation, or a medical diagnosis, or at the airport … where an algorithm is accurate only 30 percent of the time?”
In September of the same year, two papers were presented at the Eighth International Conference on Affective Computing and Intelligent Interaction that also interrogated the accuracy of emotion recognition technology. In fact, the papers suggested that certain “emotion analytics” techniques should be paused.
Researchers had participants play a “prisoner’s dilemma” game. Split into pairs, both players had to choose to split or steal a virtual ball. Players were rewarded equally if they both chose to split. However, if one decided to steal and the other chose to split, the stealer received a larger reward. If both elected to steal, both players received a small reward.
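The payoff rules described above can be written out as a small function. The specific reward values below are illustrative placeholders; the article does not state the stakes the researchers actually used.

```python
# The split/steal payoff rules sketched as a function.
# Reward values (3 > 2 > 1 > 0) are illustrative, not from the study.
def payoff(choice_a, choice_b):
    """Return (reward_a, reward_b) for one round of the split/steal game."""
    if choice_a == "split" and choice_b == "split":
        return (2, 2)    # both split: equal reward
    if choice_a == "steal" and choice_b == "split":
        return (3, 0)    # the stealer receives the larger reward
    if choice_a == "split" and choice_b == "steal":
        return (0, 3)
    return (1, 1)        # both steal: both receive a small reward
```

Laying the rules out this way makes the tension clear: every outcome is emotionally charged, which is exactly why the researchers watched what expressions the outcomes produced.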
The participants could not talk or use hand gestures; they could communicate only through facial expressions. Researchers found that the expression participants used most often was a smile, regardless of whether a person was happy, sad, or surprised. For example, when a partner made a surprising move, most participants would smile broadly, with wider smiles indicating a higher level of astonishment.
“You couldn’t infer whether [an outcome] was good or bad for them,” Jonathan Gratch, a computer science professor at the University of Southern California who presented the papers with colleagues, said to OneZero. “That undermines this idea that from looking at someone’s facial expression, you can figure out whether they are lying. The context of what just happened was a better predictor.”
“There is simply no strong evidence that supports the claim that there are universal emotional expressions, such that a certain set of facial muscle movements (e.g., a scowl) can be used to specifically diagnose a person’s emotional state (e.g., anger) with strong reliability,” Barrett said.
These findings indicate that emotion recognition technology, at its current stage, would be more accurately labeled expression recognition technology. That does not mean it should be discounted entirely for events; there are still many areas where emotion recognition would be beneficial. It is too soon, however, to rely on the technology exclusively. Paired with surveys and event data, it could be a useful tool.
Plus, artificial intelligence learns and evolves. The accuracy of emotion recognition technology is likely to improve with time.
If you decide to utilize emotion recognition technology at your event, be sure to be transparent about it with your attendees. Clearly explain your intent for using the technology, describe how any data gathered will be stored and disposed of, and give attendees the option to opt out. When you’re transparent about your data gathering, attendees are more likely to be transparent with their emotions.