Emotion recognition isn't just about facial expressions and body language

UC Berkeley researchers show that humans can guess how someone is feeling from context cues alone


Photo of a baby sitting on a blanket outside, making a grumpy face. Photo by Ryan Franco on Unsplash.

Your roommate bursts through the door, throws her bag down onto the kitchen table, and storms into her room. How do you think she's feeling?

We can generally guess people's emotions from their facial expressions and body language. Your roommate's body language provides pretty clear clues that she's probably feeling angry. But in some situations, physical cues can be ambiguous: you might smile nervously or cry because you're happy. That's why people don't rely on facial expressions or body language alone; they consider the context too. But can people accurately guess the emotions of someone they can't see, based only on the setting that person is in?

In a new study, UC Berkeley scientists set out to figure out how context helps people identify others' emotions. They showed study participants videos in which one person's face and body were masked, then asked the participants to infer the invisible person's feelings based solely on the visual context of the scene. Context clues included factors such as the spatial configuration of the people in the video, the behavior of other people in the scene, and the kinds of interactions those people had with the invisible character.

The researchers discovered that people can infer and track an invisible person's emotions based on context alone. These results suggest that emotional intelligence tests should be updated to include context clues, not just static pictures of faces with no background or movement. The researchers also suggest that the findings could improve computer vision, a technology that, among other applications, lets machines infer people's emotions from social media profiles or security footage.