Are you 100% sure that the person talking to you is definitely not a robot? Soon, you may not be able to tell.
For the first time, scientists have developed a robot that can move its mouth like a human does, helping it avoid the so-called “uncanny valley” effect, in which a robot feels unsettling because its behavior is uncomfortably close to natural without quite getting there.
Researchers at Columbia University achieved this feat by having the robot, EMO, study itself in a mirror. The robot learned how its flexible face and silicone lips move in response to the precise movements of 26 facial motors, each of which can move with up to 10 degrees of freedom.
They outlined their method in a study published January 14 in the journal Science Robotics.
How did EMO learn to move its face like a human?
EMO uses an artificial intelligence (AI) system called a “vision to action” language model (VLA), which lets it learn how to translate what it sees into coordinated body movements without predefined rules. During training, the humanoid robot made thousands of seemingly random facial expressions and lip movements while looking at itself in a mirror.
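How this kind of self-modeling could work in principle can be sketched in a few lines of code. The toy example below is not the study's actual architecture; the simulated face, the number of tracked landmarks, and the least-squares fit are all illustrative assumptions. The idea is simply that random "babbling" in front of a mirror yields enough paired motor-command and landmark data to learn an inverse model that turns a desired expression back into motor commands.

```python
import numpy as np

rng = np.random.default_rng(0)

N_MOTORS = 26          # facial motors, as described in the article
N_LANDMARKS = 2 * 30   # hypothetical: 30 tracked face/lip points (x, y)

# Stand-in for the real robot plus camera: an unknown mapping from motor
# commands to the facial landmarks the robot sees in the mirror.
TRUE_MAP = rng.normal(size=(N_MOTORS, N_LANDMARKS))

def observe_in_mirror(motor_cmd):
    """Return the (noisy) landmark positions produced by a motor command."""
    return motor_cmd @ TRUE_MAP + 0.01 * rng.normal(size=N_LANDMARKS)

# "Babbling" phase: thousands of random expressions, each paired with the
# landmarks observed in the mirror.
commands = rng.uniform(-1.0, 1.0, size=(5000, N_MOTORS))
landmarks = np.vstack([observe_in_mirror(c) for c in commands])

# Inverse model: landmarks -> motor commands (a least-squares fit here;
# the real system would learn a neural network).
inverse_model, *_ = np.linalg.lstsq(landmarks, commands, rcond=None)

# Given a target expression, recover the motor commands that produce it.
true_cmd = rng.uniform(-1.0, 1.0, size=N_MOTORS)
target_landmarks = observe_in_mirror(true_cmd)
recovered_cmd = target_landmarks @ inverse_model
print("max command error:", np.abs(recovered_cmd - true_cmd).max())
```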
The robot then watched hours of YouTube videos of humans speaking and singing in different languages. This allowed it to connect its knowledge of how its motors produce facial movements to the corresponding sounds, without any understanding of what was being said. In the end, EMO was able to take audio spoken in 10 different languages and achieve near-perfect lip synchronization.
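The second stage, connecting sounds to lip shapes, can be sketched in the same toy style: pair audio-frame features with the lip landmarks visible at the same moment, fit a predictor, and chain it with an inverse model like the one above so that new audio can drive the motors. Again, every detail here (the feature dimensions, the linear fit, the synthetic data) is an assumption for illustration, not the study's method.

```python
import numpy as np

rng = np.random.default_rng(1)

N_AUDIO_FEATS = 13     # hypothetical per-frame audio features (MFCC-like)
N_LIP_COORDS = 2 * 20  # hypothetical: 20 tracked lip points (x, y)

# Stand-in for hours of video: paired audio-frame features and the lip
# landmarks visible at the same instant (no understanding of words needed).
TRUE_AUDIO_TO_LIPS = rng.normal(size=(N_AUDIO_FEATS, N_LIP_COORDS))
audio_feats = rng.normal(size=(20_000, N_AUDIO_FEATS))
lip_landmarks = (audio_feats @ TRUE_AUDIO_TO_LIPS
                 + 0.05 * rng.normal(size=(20_000, N_LIP_COORDS)))

# Fit a predictor from sound to lip shape (least squares as a stand-in for
# the learned model in the study).
audio_to_lips, *_ = np.linalg.lstsq(audio_feats, lip_landmarks, rcond=None)

# "Speaking" time: new audio frames -> predicted lip shapes. These would
# then be handed to an inverse model (like the mirror-trained one above)
# to turn the desired lip shapes into motor commands.
new_audio = rng.normal(size=(5, N_AUDIO_FEATS))
predicted_lips = new_audio @ audio_to_lips
print(predicted_lips.shape)  # (5, 40): one predicted lip shape per frame
```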
“Hard sounds like ‘B’ and pursed-lip sounds like ‘W’ were particularly problematic,” Hod Lipson, professor of engineering and director of Columbia University’s Creative Machines Lab, said in a statement. “However, these abilities can improve with time and practice.”
Many roboticists have tried and failed to create convincing humanoids, so before releasing EMO to the world, the team had to test it in front of real people. The scientists showed 1,300 volunteers videos of the robot talking using the VLA model and two other approaches to controlling its mouth, along with reference videos showing the ideal lip movements.
The other two approaches were an amplitude baseline, in which EMO moves its lips based on the loudness of the sound, and a nearest-neighbor landmark baseline, in which it mimics the facial movements it has seen others make for similar sounds. Volunteers were asked to choose the clip that best matched the ideal lip movements, and they picked the VLA model in 62.46% of cases. The amplitude and nearest-neighbor landmark baselines were chosen in 23.15% and 14.38% of cases, respectively.
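To give a sense of how simple the amplitude baseline is compared with the learned model, here is a hypothetical version of it in code: mouth opening scales with each audio frame's loudness and nothing more. The frame rate, RMS windowing and scaling factor are illustrative assumptions.

```python
import numpy as np

def amplitude_baseline(audio, sample_rate, fps=30, max_opening_mm=12.0):
    """Map per-frame loudness (RMS) to a mouth-opening value.

    This follows the article's description of the amplitude baseline: the
    lips simply open wider when the sound is louder. The frame rate and
    the maximum opening are illustrative assumptions.
    """
    samples_per_frame = sample_rate // fps
    n_frames = len(audio) // samples_per_frame
    frames = audio[: n_frames * samples_per_frame].reshape(n_frames, samples_per_frame)
    rms = np.sqrt((frames ** 2).mean(axis=1))
    rms = rms / (rms.max() + 1e-9)       # normalize loudness to 0..1
    return rms * max_opening_mm          # louder sound -> wider mouth opening

# Example: one second of a 440 Hz tone whose volume rises from silence.
sr = 16_000
t = np.linspace(0.0, 1.0, sr, endpoint=False)
audio = np.sin(2 * np.pi * 440.0 * t) * np.linspace(0.0, 1.0, sr)
print(amplitude_baseline(audio, sr)[:5])  # mouth barely opens during the quiet start
```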
Robot caregivers need a friendly face
Although there are gender and cultural differences in how people distribute their gaze, humans generally rely heavily on facial cues when interacting with each other. A 2021 eye-tracking study found that we look at the face of our conversation partner 87% of the time, and about 10-15% of that time we focus specifically on their mouth. Other studies have shown that mouth movements are so important that they even affect the sounds we hear.
Researchers believe that overlooking the importance of faces is part of the reason other projects have failed to create convincing robots.
“Much of today’s humanoid robotics focuses on leg and hand movements for activities such as walking and grasping objects,” Lipson said. “But facial expression is just as important for robotic applications that involve human interaction.”
As AI technology continues to advance at a breakneck pace, robots are expected to increasingly take on roles that require direct human interaction, such as in education, healthcare and elderly care. That means their effectiveness will depend in part on how well they can match human facial expressions.
“It’s clear that robots with this ability will be much better at communicating with humans, because a significant portion of our communication involves facial body language, an entire channel that is still untapped,” Yuhan Hu, lead author of the study, said in a press release.
But the Columbia team isn’t the only one working on making humanoid robots more realistic. In October 2025, a Chinese company released a video of an eerily realistic robot head created as part of an effort to make interactions between humans and robots feel more natural. The previous year, a Japanese research team unveiled an artificial self-healing skin that could make robot faces look human-like.
