Real-Time Integration of Workers' Voice and Robot Vision
Revolutionizing Flexibility in Manufacturing Processes
Successful Demonstration at E-FOREST Tech Day
Accelerating the "Human-Centered SDF Factory"
Professor Kim Wontae's research team from the Department of Computer Engineering at Korea University of Technology and Education participated in the "Hyundai Motor E-FOREST Tech Day 2025" and showcased worker-cooperative Physical AI robot technology.
On November 21, Korea University of Technology and Education and Hyundai Motor Company announced that they have developed the nation's first "Physical AI collaborative robot," which interprets workers' voice commands and autonomously decides on and carries out tasks. This technology is being evaluated as a next-generation smart manufacturing solution to address the challenges of an aging workforce and a shortage of skilled workers at manufacturing sites.
In Korea's manufacturing sector, the number of skilled workers is decreasing due to an aging workforce, and as consumer demands diversify, flexibility in production processes has become a critical issue.
To address these on-site challenges, Professor Kim Wontae's research team from the Department of Computer Engineering at Korea University of Technology and Education and the Automation Design Team at Hyundai Motor Company, led by Choi Jungho, launched a joint research initiative.
Through multiple on-site meetings, both teams meticulously identified the required technological elements and specifications for real manufacturing processes. As a result, they successfully developed "worker-cooperative Physical AI robot technology," which combines workers' voice information and robots' visual data in real time to enable autonomous task execution.
This technology was validated through its first public demonstration at the recently held Hyundai Motor E-FOREST Tech Day 2025.
During the demonstration, when a worker said to the robot, "Put the Avante hinge in the box," the robot scanned the work environment and objects using its vision sensors.
Next, a large multimodal model (LMM) simultaneously analyzed the voice and image data to understand the worker's intent and determine whether the task could be performed.
The AI then provided feedback to the worker, saying, "I will perform the requested task," and autonomously estimated the optimal position and posture for grasping the part, sequentially carrying out actions such as moving the robotic arm and gripping the object.
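The demonstration steps above (match the spoken request against detected objects, confirm feasibility, respond, then plan a grasp) can be sketched as a toy pipeline. This is a minimal illustration, not Hyundai's or the research team's implementation: the `DetectedObject` type, the `plan_task` function, and the substring-based intent matching are all hypothetical stand-ins for the vision system and the LMM.

```python
from dataclasses import dataclass

@dataclass
class DetectedObject:
    label: str                          # object class reported by the vision sensor
    position: tuple[float, float, float]  # estimated (x, y, z) in the robot's frame

def plan_task(command: str, scene: list[DetectedObject]) -> dict:
    """Toy stand-in for the LMM step: match the worker's utterance
    against detected objects and decide whether the task is feasible."""
    target = next((o for o in scene if o.label in command.lower()), None)
    if target is None:
        return {"feasible": False,
                "feedback": "I cannot find the requested part."}
    return {
        "feasible": True,
        "feedback": "I will perform the requested task.",
        "grasp_position": target.position,  # a real system would estimate full grasp pose
        "actions": ["move_arm", "grip", "place_in_box"],
    }

# Scene as the vision sensors might report it during the demo
scene = [DetectedObject("avante hinge", (0.42, -0.10, 0.05)),
         DetectedObject("box", (0.60, 0.25, 0.0))]
plan = plan_task("Put the Avante hinge in the box", scene)
print(plan["feedback"])  # "I will perform the requested task."
```

In the actual system, the matching step would be handled by the multimodal model reasoning jointly over audio and images rather than by string comparison, and grasp estimation would involve pose optimization rather than a stored coordinate.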
An on-site worker commented, "It's impressive that I can operate the robot simply by speaking to it as I would to a colleague, without needing to learn separate machine controls," adding, "If robots handle simple tasks, workers can focus on more advanced work without musculoskeletal strain, which will greatly improve the work environment."
A Hyundai Motor Company representative also stated, "This technology aligns perfectly with the core values of Hyundai's ongoing intelligent smart manufacturing (SDF) strategy," emphasizing, "Our goal is to realize a human-centered factory where workers and robots communicate and collaborate in real time."
Professor Kim Wontae said, "This joint research demonstrates that Physical AI can ensure stability and reliability even in complex real-world processes."
Choi Jungho, team leader at Hyundai Motor Company, announced, "We will further strengthen our collaboration with Korea University of Technology and Education to secure leadership in Physical AI technology in the SDF era."
Hyundai Motor Company is considering applying this technology first to a pilot production line, and then expanding its adoption to actual manufacturing processes after a stabilization period.
© The Asia Business Daily(www.asiae.co.kr). All rights reserved.

