NEW YORK, April 3 — Engineers at Columbia University in New York have developed Emo, a silicone-coated robotic face equipped with artificial intelligence that enables it to adapt its facial expressions to those of the person it’s interacting with. In the future, it could be grafted onto full-scale humanoid robots to help them interact with humans more effectively.

Thanks to the rise of AI, robots have made huge strides in vocal communication in recent months. Now, work is focusing on facial expressions, with the aim of one day making these robots genuinely sociable. Although still in development, Emo is already capable of making eye contact, and it uses two artificial intelligence models to detect a person’s smile before it even appears, so that it can smile along with them. The first model predicts human facial expressions by analysing subtle facial changes, while the second generates the motor commands needed to adjust the robot’s own expression accordingly. The idea is to anticipate human reactions so that the robot always wears the right expression, thereby helping it earn human trust.
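
A minimal sketch of what such a two-model pipeline could look like is shown below. The class names, the landmark representation and the placeholder logic are assumptions made for illustration; they are not the Columbia team’s actual implementation.

```python
# Illustrative sketch of a two-model pipeline like the one described above.
# Model names, shapes and the landmark representation are assumptions for
# illustration; they are not the Columbia researchers' actual code.
import numpy as np


class ExpressionPredictor:
    """Model 1 (hypothetical): predicts the person's upcoming facial
    expression from subtle changes across recent facial-landmark frames."""

    def predict(self, landmark_history: np.ndarray) -> np.ndarray:
        # landmark_history: (frames, landmarks, 2) -> predicted landmarks (landmarks, 2).
        # Trivial stand-in: linearly extrapolate the most recent motion.
        return landmark_history[-1] + (landmark_history[-1] - landmark_history[-2])


class MotorCommandGenerator:
    """Model 2 (hypothetical): maps a target facial expression to commands
    for the robot's facial actuators (an inverse model of the robot face)."""

    def __init__(self, num_actuators: int = 26):
        self.num_actuators = num_actuators

    def commands_for(self, target_landmarks: np.ndarray) -> np.ndarray:
        # Placeholder mapping; a learned inverse model would go here.
        flat = target_landmarks.flatten()
        return np.tanh(flat[: self.num_actuators])


def mirror_step(landmark_history, predictor, generator):
    """One control step: anticipate the human's expression, then drive the face."""
    predicted = predictor.predict(landmark_history)
    return generator.commands_for(predicted)
```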

The difficulty lies not so much in producing these facial expressions as in their timing. According to the researchers working on this project, Emo can predict an upcoming smile some 840 milliseconds before a person actually smiles, which enables it to smile simultaneously with them.
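
To illustrate the timing aspect, the sketch below schedules the robot’s smile so that it coincides with the person’s predicted one. Only the 840-millisecond lead comes from the article; the actuation delay and the scheduling function are hypothetical.

```python
# A minimal timing sketch, assuming the roughly 840 ms lead reported above and a
# hypothetical actuation delay; every number other than 840 ms is illustrative.
PREDICTION_LEAD_S = 0.840   # smile predicted ~840 ms before it appears (from the article)
ACTUATION_DELAY_S = 0.300   # assumed time the facial motors need to form a smile


def schedule_smile(now_s: float) -> float:
    """Return when to start moving the actuators so the robot's smile
    lands at the same moment as the person's predicted smile."""
    predicted_smile_time = now_s + PREDICTION_LEAD_S
    start_time = predicted_smile_time - ACTUATION_DELAY_S
    return max(start_time, now_s)  # never schedule in the past


# Example: if the prediction fires at t = 0 s, the motors start at roughly
# t = 0.54 s so the robot's smile completes around t = 0.84 s, in sync.
print(schedule_smile(0.0))
```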

Currently, Emo’s head is equipped with 26 actuators that enable a wide range of nuanced facial expressions. To support these interactions, the researchers have also integrated high-resolution cameras into the pupil of each of the robot’s eyes. In this way, it can be trained for hours on end by watching videos of human facial expressions.
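
As a hedged illustration of that kind of video-based training, the sketch below trains a hypothetical model to anticipate future facial landmarks from recent video frames. The dataset, architecture and loss are assumptions, not the researchers’ actual setup.

```python
# Hypothetical sketch of training on videos of human facial expressions;
# the model, window size and loss are assumptions made for illustration.
import torch
import torch.nn as nn


class LandmarkForecaster(nn.Module):
    """Hypothetical model: given a short window of facial landmarks extracted
    from video, predict the landmarks a fraction of a second later."""

    def __init__(self, num_landmarks: int = 68, window: int = 10):
        super().__init__()
        self.net = nn.Sequential(
            nn.Flatten(),                                        # (batch, window, landmarks, 2) -> flat
            nn.Linear(window * num_landmarks * 2, 256),
            nn.ReLU(),
            nn.Linear(256, num_landmarks * 2),                   # predicted future landmarks
        )

    def forward(self, x):
        return self.net(x)


def train_step(model, optimizer, window_frames, future_frame):
    """One step: learn to anticipate the future expression from recent frames."""
    optimizer.zero_grad()
    pred = model(window_frames)
    loss = nn.functional.mse_loss(pred, future_frame.flatten(1))
    loss.backward()
    optimizer.step()
    return loss.item()
```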

The potential of Emo and its future incarnations is broad, since this kind of robot, ultimately capable of a form of ‘empathy,’ could prove useful in fields as diverse as communications, education and therapy. — ETX Studio