Humanoid Robot EMO, Capable of 'Self-Learning Lip-Syncing,' Begins to Speak Like a Human
2026-01-17
Author: Editor

A research team at Columbia University in the United States has unveiled a new humanoid robot head named EMO. The robot not only synchronizes its lip movements with speech to achieve near-perfect lip-syncing, but also gradually acquires human-like vocalization skills through self-learning, marking a significant step forward in the development of ultra-realistic humanoid robots.

EMO's face is covered with a layer of flexible silicone "skin," beneath which 26 micro-motors are precisely arranged. Working in different combinations, these motors drive a variety of facial expressions and shape the lips into different forms. To teach EMO to control its own mouth shape, the researchers placed it in front of a mirror and let it perform random facial movements without human intervention, observing the feedback reflected in the mirror. Over time, this process established correlations between motor combinations and the resulting changes in expression, yielding a "vision-to-action" model of its own face.

Having mastered the mapping between expressions and motor movements, EMO advanced to the stage of imitating human speech. The research team fed it a large corpus of video footage of people speaking and singing; from this, the system analyzed how speech corresponds to mouth movements and learned the lip-shape characteristics associated with different sounds. Integrating these insights with the earlier self-model allowed speech and lip movements to be synchronized.

EMO still has difficulty pronouncing consonants such as 'B' and 'W,' and its overall lip-syncing coordination requires refinement. With continued practice, however, its precision in lip-shape control and its conversational fluency are expected to improve significantly.
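The mirror-based "motor babbling" stage described above can be sketched in a few lines. The toy below is an illustration only, not EMO's actual architecture (the real system uses cameras and neural networks): a hypothetical `observe_face` function stands in for what the mirror camera sees, random motor commands are issued to collect (command, landmark) pairs, a forward model is fitted by least squares, and the model is then inverted to find the command that produces a desired facial configuration.

```python
import numpy as np

# Hypothetical stand-in for the robot's face: maps a 26-element motor
# command to observed landmark coordinates, playing the role of the
# mirror camera. A fixed random linear map is used for illustration.
rng = np.random.default_rng(0)
N_MOTORS, N_LANDMARK_COORDS = 26, 40  # e.g. 20 landmarks x (x, y)
TRUE_MAP = rng.normal(size=(N_LANDMARK_COORDS, N_MOTORS))

def observe_face(motor_cmd):
    """What the camera 'sees' in the mirror for a given motor command."""
    return TRUE_MAP @ motor_cmd

# 1. Motor babbling: issue random commands and record the resulting
#    facial landmarks -- the self-supervised data-collection step.
commands = rng.uniform(-1, 1, size=(500, N_MOTORS))
landmarks = np.array([observe_face(c) for c in commands])

# 2. Fit a forward model (landmarks as a function of motor commands)
#    by least squares; the real system learns a neural network instead.
model, *_ = np.linalg.lstsq(commands, landmarks, rcond=None)

# 3. Invert the model: given a desired facial configuration, solve for
#    the motor command that best reproduces it ("vision to action").
target = observe_face(rng.uniform(-1, 1, size=N_MOTORS))
cmd, *_ = np.linalg.lstsq(model.T, target, rcond=None)

# Reconstruction error should be tiny in this linear toy setting.
print(np.linalg.norm(observe_face(cmd) - target))
```

In this linear toy the inversion is exact up to numerical precision; the point is the loop structure, babble, observe, fit, invert, which is the essence of learning a self-model without human supervision.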