In what seems like yet another blow to an ability we thought computers would never match, scientists now suggest that AI understands emotions better than we do.
In a new study published May 21 in the journal Communications Psychology, scientists from the University of Geneva (UNIGE) and the University of Bern (UNIBE) administered five widely used emotional intelligence (EI) tests (the STEM, STEU, GEMOK-Blends, GECo Regulation and GECo Management) to several large language models (LLMs), including ChatGPT, Claude 3.5 Haiku, Copilot 365 and DeepSeek V3.
The researchers were investigating two things: first, how the AIs' performance compared with that of human subjects, and second, whether the AIs could create new test questions that met the objectives of the EI tests.
Measured against human responses validated in previous studies, the LLMs chose the "correct" answer on the emotional intelligence tests, as determined by human expert opinion, 81% of the time, compared with 56% for humans.
When ChatGPT was asked to create new test questions, human evaluators judged that these matched the original tests in difficulty and were not perceived as mere rephrasings of the original questions. The correlation between scores on the AI-generated tests and the originals was described as "strong," with a correlation coefficient of 0.46 (where 1.0 indicates perfect correlation and 0 indicates no correlation).
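For context on that figure: the article does not say which correlation statistic the researchers used, but the most common choice for comparing two sets of test scores is Pearson's r. Below is a minimal, illustrative Python sketch of how such a coefficient is computed; the score lists are hypothetical examples, not data from the study.

```python
def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length lists of scores."""
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    sd_x = sum((x - mean_x) ** 2 for x in xs) ** 0.5
    sd_y = sum((y - mean_y) ** 2 for y in ys) ** 0.5
    return cov / (sd_x * sd_y)

# Hypothetical scores: each participant's result on the original test
# versus the AI-generated version of the same test.
original = [0.70, 0.55, 0.80, 0.62, 0.90, 0.48]
generated = [0.66, 0.72, 0.74, 0.50, 0.81, 0.62]
print(round(pearson_r(original, generated), 2))
```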
The overall conclusion was that AI is better at "understanding" emotions than we are.
A deeper story
When Live Science consulted several experts, the common theme of their responses was a caution about methodology. Each of the EI tests used was multiple choice, which, they pointed out, hardly maps onto real-world scenarios where tensions between people run high.
"It is worth noting that humans don't always agree on what someone else is feeling, and even psychologists can interpret emotional signals differently," said Taimur Ijlal, a finance and information security expert. "'Beating' humans on tests like this doesn't necessarily mean the AI has deeper insight. It means it gave the statistically expected answer more often."
Others noted that what this study tested may be something other than emotional intelligence. "AI systems are excellent at pattern recognition, especially when emotional cues follow recognizable structures such as facial expressions or linguistic signals," said Nauman Jaffar, founder and CEO of Cliniscripts, an AI-driven documentation tool built for mental health professionals. "But equating that with a deeper 'understanding' of human emotions risks overstating what AI is actually doing."
Related: People find AI more compassionate than mental health professionals, study finds. What could this mean for future counseling?
Several experts highlighted one key distinction: quizzes in structured, quantifiable settings are where AI shines, rather than the deeper nuance that true emotional understanding requires.
Jason Hennessey, founder and CEO of Hennessey Digital, has spent years analyzing how search and generative AI systems process language. He noted that AI has shown promise on common tests for gauging subjects' emotional states. But change enough variables in such tests, such as the lighting in a photo or the cultural context, and, as Hennessey put it, "AI accuracy drops off a cliff."
Overall, most experts pushed back on the claim that AI "understands" emotions better than humans do.
"Does it show that LLMs can categorize common emotional responses? Sure," said Wyatt Mayham, founder of Northwest IT Consulting. "But it's a bit like saying someone is a great therapist because they aced a multiple-choice test."
There is, however, at least one example suggesting that even if AI relies on pattern recognition rather than true emotional understanding, it can still outperform humans at identifying and responding to emotional states.
Aílton, a conversational AI used by more than 6,000 long-haul truck drivers in Brazil, is a multimodal WhatsApp assistant that works with voice, text and images. According to its developer, Marcos Alves, CEO and chief scientist at HAL-AI, Aílton identifies stress, anger or sadness with about 80% accuracy, roughly 20 percentage points better than its human counterparts.
In one case, after a colleague's fatal crash, a driver sent a distraught 15-second audio note. Aílton responded quickly and appropriately, expressing empathetic condolences, providing mental health resources and automatically alerting the fleet manager.
"Yes, multiple-choice text vignettes simplify emotion recognition," Alves said. "Real empathy is continuous and multimodal. But isolating the cognitive layer is useful: it reveals whether an LLM can pick up emotional cues before situational noise is added."
He added that the ability of LLMs to absorb billions of sentences and thousands of hours of conversational audio means they can encode micro-intonation cues that humans often miss. "Lab setups are limited," he said of the study. "But our WhatsApp data confirms that modern LLMs already detect and respond to emotion better than most people, offering empathy at scale."