No skin needed: Robots learn to feel human touch with internal sensors, AI

The tactile breakthrough enables robots to recognize written digits and execute tasks through intuitive touch-based commands.

Robots can intuitively detect human touch using existing force sensors combined with AI.

Maged Iskandar

Researchers have developed a method to give robots an innate sense of touch by integrating their internal force-torque sensors with machine learning algorithms.

The method, developed by a team at the German Aerospace Center (Deutsches Zentrum für Luft- und Raumfahrt, or DLR), enables robots to sense and interpret human touch without requiring costly synthetic biomimetic skins or extra sensors covering their surfaces.

The team used high-resolution internal sensors and deep learning to give a robotic arm a full-body sense of touch and accurately detect force details.

“The intrinsic sense of touch we proposed in this work can serve as the basis for an advanced category of physical human-robot interaction that has not been possible yet, enabling a shift from conventional modalities towards adaptability, flexibility, and intuitive handling,” the researchers said in a statement.

The robot’s touch sense, achieved through internal sensors and machine learning, accurately interprets writing/drawing and enables various interactions without extra sensors.

Human-robot interaction

Modern robotic systems’ capabilities are advancing rapidly, making them promising collaborators in diverse fields such as manufacturing, space exploration, healthcare, and daily life assistance.

The integration of human problem-solving and reasoning with robotic precision is an active research area, particularly in human-robot interaction (HRI).

Various HRI modalities, such as vision-based, voice-recognition, and physical-contact approaches, have been explored, yet achieving intuitive physical interaction remains challenging.

Equipping robots with a sense of touch is crucial for safe and efficient interaction, as it allows physical contact to be identified precisely. According to the researchers, traditional force-torque sensors are used for control, but explicit tactile sensing is needed to extract detailed contact information.

Although advances have been made with tactile skins and sensors, challenges remain in coverage, wiring, robustness, and real-time capability.

According to the researchers, robots must be equipped with robust, sensitive force sensors to interact with humans physically. This can become costly and complex for large or curved robotic surfaces.

Tactile breakthrough

The researchers at DLR’s Institute of Robotics and Mechatronics (RMC) used integrated sensors to equip a robot with built-in tactile capabilities.

They overcame these difficulties by using hardware already built into the Safe Autonomous Robotic Assistant (SARA), a robotic arm whose joints and base carry force-torque sensors normally used to sense its position and control its movement.

This allows the robot to detect and respond to physical interactions without needing external touch sensors. Using the sensors, the robot can detect where and in what order different forces were applied to its surface.
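The study’s own estimation pipeline is not reproduced here, but the underlying geometry, known in robotics as intrinsic tactile sensing, is compact: a force-torque sensor measures a force f and a torque tau in its own frame, any single contact at point r satisfies tau = r × f, and the minimum-norm solution r0 = (f × tau)/|f|² fixes the contact’s line of action, which can then be intersected with the robot’s surface model. A minimal numpy sketch (function names are our own, not DLR’s):

```python
import numpy as np

def contact_line(f, tau, eps=1e-9):
    """Estimate the line of action of a single contact from one
    force-torque measurement (f, tau) taken at the sensor frame.

    A contact at point r applying pure force f satisfies tau = r x f.
    The minimum-norm solution is r0 = (f x tau) / |f|^2; the true
    contact point lies somewhere on the line r0 + s * f.
    """
    f2 = float(np.dot(f, f))
    if f2 < eps:
        raise ValueError("force too small to localize contact")
    r0 = np.cross(f, tau) / f2
    return r0, f / np.sqrt(f2)  # point on the line, unit direction

# Toy check: a 5 N push in -z applied at (0.1, 0.2, 0.5) m.
r_true = np.array([0.1, 0.2, 0.5])
f = np.array([0.0, 0.0, -5.0])
tau = np.cross(r_true, f)
r0, d = contact_line(f, tau)
print(r0, d)  # r0 differs from r_true only along d, the force direction
```

In practice the remaining ambiguity along the force direction is resolved by intersecting this line with the robot’s known surface geometry, which is how a single internal sensor can stand in for skin covering the whole body.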

The robot recognizes written digits and executes tasks based on touch trajectories. Virtual buttons on its surface can also trigger commands.

They then combined this capability with deep learning algorithms to interpret the applied touch. The team demonstrated that the robot could recognize numbers or letters traced on its surface, using neural networks to predict each character.
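DLR has not released the network it used, but the recognition step can be illustrated with a small classifier over traced trajectories: normalize the sequence of estimated contact points, resample it to a fixed length, and feed it to a network with one output per character. A hypothetical PyTorch sketch; the architecture and preprocessing are illustrative assumptions, not the paper’s:

```python
import torch
import torch.nn as nn

def preprocess(points, n=32):
    """Normalize a traced stroke (k, 2) to zero mean / unit scale and
    resample it to n points by linear interpolation along the index."""
    pts = torch.as_tensor(points, dtype=torch.float32)
    pts = pts - pts.mean(dim=0)
    pts = pts / (pts.abs().max() + 1e-8)
    idx = torch.linspace(0, pts.shape[0] - 1, n)
    lo, hi = idx.floor().long(), idx.ceil().long()
    w = (idx - lo.float()).unsqueeze(1)
    return pts[lo] * (1 - w) + pts[hi] * w  # (n, 2)

class DigitNet(nn.Module):
    """Tiny MLP over a flattened, resampled stroke; 10 digit classes."""
    def __init__(self, n=32, classes=10):
        super().__init__()
        self.net = nn.Sequential(
            nn.Flatten(),
            nn.Linear(2 * n, 128), nn.ReLU(),
            nn.Linear(128, 64), nn.ReLU(),
            nn.Linear(64, classes),
        )
    def forward(self, x):        # x: (batch, n, 2)
        return self.net(x)

model = DigitNet()
stroke = torch.rand(60, 2)       # stand-in for estimated contact points
logits = model(preprocess(stroke).unsqueeze(0))
print(logits.argmax(dim=1))      # predicted digit (untrained here)
```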

The team also extended this mechanism to include virtual “buttons” or sliders on the robot’s surfaces that could be used to trigger specific commands or movements.
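In its simplest reading, a virtual button is just a labeled region of the robot’s surface: if an estimated contact point falls inside the region with enough force, the associated command fires. A minimal sketch under that assumption (region layout and command names invented for illustration):

```python
import numpy as np

# Hypothetical button regions on a link surface, in sensor-frame
# coordinates: name -> (center, radius) in meters.
BUTTONS = {
    "stop": (np.array([0.10, 0.00, 0.30]), 0.03),
    "home": (np.array([0.10, 0.05, 0.45]), 0.03),
    "gravity_comp": (np.array([0.10, -0.05, 0.45]), 0.03),
}

def dispatch(contact_point, force_norm, min_force=2.0):
    """Return the command whose virtual button contains the contact,
    ignoring touches lighter than min_force (simple debouncing)."""
    if force_norm < min_force:
        return None
    for name, (center, radius) in BUTTONS.items():
        if np.linalg.norm(contact_point - center) <= radius:
            return name
    return None

# A 6 N press near the "stop" region triggers that command.
print(dispatch(np.array([0.11, 0.0, 0.31]), force_norm=6.0))
```

A slider works the same way, except the command argument is read off the contact’s position within the region rather than from the region’s identity alone.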

Researchers claim the approach endows the system with an intuitive and accurate sense of touch and increases the range of possible physical human-robot interactions.

The details of the team’s research were published in the journal Science Robotics.

ABOUT THE EDITOR

Jijo Malayil is an automotive and business journalist based in India. Armed with a BA in History (Honors) from St. Stephen's College, Delhi University, and a PG diploma in Journalism from the Indian Institute of Mass Communication, Delhi, he has worked for news agencies, national newspapers, and automotive magazines. In his spare time, he likes to go off-roading, engage in political discourse, travel, and teach languages.