Using a quantitative approach, researchers were able to make the robot's face strikingly expressive
Japan's love for robots is no secret. And now, Japanese scientists have taken a further step toward giving android robots more realistic facial expressions.
The faces of android robots are interfaces for communicating with humans. If robots are to interact with humans more effectively, their faces must express realistic emotions. Recently, researchers from Osaka University upgraded the head of their child android robot, Affetto, first developed in 2011. It now has a far more expressive face.
Until now, the facial expressions of android robots have not been examined in detail, and displaying humanlike emotions on a robotic face remains a formidable challenge. This is due to the wide range and marked asymmetry of natural human facial movements, the restrictions imposed by the materials used for robot skin, and, of course, the complex mathematics needed to drive a robot's movements.
Hisashi Ishihara, Binyi Wu and Minoru Asada from Osaka University have developed a method for identifying and quantitatively evaluating the facial movements of their robot Affetto.
The researchers investigated more than a hundred different points on Affetto's face, measuring their three-dimensional movement. Each facial point is driven by a so-called deformation unit, a set of mechanisms that creates a distinctive facial contortion, such as raising or lowering part of an eyelid or lip. Measurements from these units were then subjected to mathematical analysis to quantify the resulting surface motion patterns.
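To give a sense of what "quantifying surface motion" might look like in practice, here is a minimal illustrative sketch (not the authors' actual pipeline; the function name and toy data are invented for this example). It computes the 3D displacement of tracked facial points between a rest pose and an actuated pose, then summarizes a deformation unit by its mean displacement:

```python
# Hypothetical sketch: quantifying the surface motion produced by one
# deformation unit from tracked 3D facial-point positions.
# The data and names here are illustrative, not from the study itself.
import numpy as np

def motion_magnitudes(rest: np.ndarray, actuated: np.ndarray) -> np.ndarray:
    """Euclidean displacement of each tracked facial point.

    rest, actuated: (N, 3) arrays of point coordinates measured with the
    deformation unit at rest and fully actuated, respectively.
    """
    return np.linalg.norm(actuated - rest, axis=1)

# Toy example: three tracked points; the unit pulls them upward (z-axis)
# by different amounts, as real skin deforms unevenly around a mechanism.
rest = np.array([[0.0, 0.0, 0.0],
                 [1.0, 0.0, 0.0],
                 [2.0, 0.0, 0.0]])
actuated = rest + np.array([[0.0, 0.0, 0.5],
                            [0.0, 0.0, 1.0],
                            [0.0, 0.0, 0.2]])

mags = motion_magnitudes(rest, actuated)
print(mags)          # per-point displacement: [0.5 1.  0.2]
print(mags.mean())   # one scalar summary for the whole deformation unit
```

Repeating such a measurement across all of a robot's deformation units would yield a comparable, quantitative map of how each mechanism deforms the face.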
The researchers reported their findings in the journal Frontiers in Robotics and AI. These findings will enable android robots to express a wider range of emotions, resulting in deeper interaction with humans.
Author: Alena Snezhnaya