A robot that adapts its eye shape, color, and movement in response to stimuli
Developed by Professor Heeseung Lee's team at UNIST
Expected to enhance emotional engagement in social and companion robots; selected for ICRA 2025
If someone suddenly taps you on the shoulder, your eyes may widen or your body might flinch.
Humans react instantly to such stimuli, and as the same stimulus is repeated, their responses gradually dull or their manner of expression shifts. Now, a robot has been developed that can mimic this "flow of emotional change."
This technology is expected to be utilized in the development of social robots that offer a higher level of emotional engagement.
The research team led by Professor Heeseung Lee from the Department of Design at UNIST has developed adaptive robot technology that expresses emotions through its eyes and movements, with these reactions changing over time.
This robot expresses a total of six emotions by combining different eye shapes, colors, and movements. Stimuli are delivered by stroking or tapping the robot's head: stroking is registered as a positive stimulus, tapping as a negative one.
For example, if the robot is suddenly tapped, its eyes widen and turn blue, and it leans its body backward to express surprise. If the same stimulus is repeated, the robot does not simply repeat the same reaction; instead, its emotional expression changes depending on its previous emotional state and the accumulated value of the stimuli.
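As a rough illustration of this logic, the sketch below shows how an accumulated stimulus value could make the response to a repeated tap change over time. The emotion labels, thresholds, and expression cues here are hypothetical stand-ins; the article does not disclose the team's actual mapping.

```python
# Hypothetical sketch: expression depends on accumulated stimulus history,
# not just the current stimulus. All values below are illustrative.

EXPRESSIONS = {
    # emotion -> (eye shape, eye color, movement); made-up cues for illustration
    "surprise": ("wide", "blue", "lean_back"),
    "sadness":  ("drooping", "gray", "slump"),
    "joy":      ("curved", "yellow", "bounce"),
    "calm":     ("neutral", "green", "still"),
}

class Robot:
    def __init__(self):
        self.accumulated = 0.0  # running sum of stimulus valence

    def react(self, stimulus: str):
        # Stroking is positive, tapping is negative, as described in the article
        valence = 1.0 if stimulus == "stroke" else -1.0
        prev = self.accumulated
        self.accumulated += valence
        # The reaction depends on the prior emotional state:
        # a first tap startles, but repeated taps shift toward sadness.
        if valence < 0:
            emotion = "surprise" if prev >= 0 else "sadness"
        else:
            emotion = "joy" if prev >= 0 else "calm"
        return EXPRESSIONS[emotion]

robot = Robot()
print(robot.react("tap"))  # first tap -> surprise
print(robot.react("tap"))  # repeated tap -> the expression changes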
This kind of adaptive expression replicates the emotional flow seen in real people. In user evaluations, many participants commented that "the way the robot responds differently to the same stimulus depending on the situation feels impressive and distinct from a simple mechanical reaction." More than 80% of participants rated the emotional expressions as "natural and lively."
Professor Lee's team developed this technology by interpreting emotions not as fixed states but as "vectors" that change over time, and by incorporating this into the robot's control model. Strong stimuli rapidly increase the magnitude of the emotion vector, while weaker stimuli gradually alter the response.
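A minimal sketch of this "emotion as a time-varying vector" idea might look like the following, assuming a two-dimensional valence/arousal vector with decay toward a neutral state; the team's actual model, dimensions, and parameters are not given in the article.

```python
# Hypothetical sketch of an emotion vector updated by stimuli over time.
import numpy as np

class EmotionVector:
    def __init__(self, decay: float = 0.95):
        self.v = np.zeros(2)   # (valence, arousal); neutral at the origin
        self.decay = decay     # pull back toward neutral between stimuli

    def update(self, stimulus: np.ndarray, strength: float) -> np.ndarray:
        """Strong stimuli grow the vector quickly; weak ones shift it gradually."""
        self.v = self.decay * self.v + strength * stimulus
        return self.v

# Example: a sudden tap, modeled as negative valence and high arousal
ev = EmotionVector()
tap = np.array([-1.0, 1.0])
print(ev.update(tap, strength=1.0))  # strong stimulus: large jump in magnitude
print(ev.update(tap, strength=0.3))  # weaker repetition: gradual change
```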
Professor Heeseung Lee said, "Conventional robots only displayed predetermined emotions in response to stimuli, but this model implements the flow of emotional change, making users feel as if the robot is a living being. It could be used in various human-centered robot fields, such as companion robots or emotional support technologies."
This research, led by doctoral student Haeun Park as first author, was accepted to ICRA (the IEEE International Conference on Robotics and Automation), a premier international robotics conference, and was presented at ICRA 2025 in Atlanta, USA, on May 21.
The research was supported by the Ministry of Trade, Industry and Energy.