
Science Robotics: intrinsic haptics enable intuitive physical human-robot interaction

Brain-computer interface community 2024/09/05 10:28

Tactile sensors and e-skins are the usual way to give robots a sense of physical contact, but they become complex and expensive when they must cover large areas of the robot. Research recently published in Science Robotics by the German Aerospace Center (DLR) instead uses the robot arm's internal high-resolution joint force-torque sensors to achieve inherent whole-body tactile sensing. Deep learning and artificial neural networks enable the robot to sense the location, direction, and magnitude of forces exerted anywhere on its surface. The robot can recognize and react to characters, such as numbers drawn on its surface, as well as to virtual buttons or sliders, giving users an intuitive way to interact with it.


https://www.science.org/doi/10.1126/scirobotics.adn4008

Innovations:

  1. A method for robotic tactile perception is proposed that relies on neither external tactile sensors nor artificial skin.

  2. A redundant configuration of high-resolution joint force-torque sensors enables the robot to sensitively perceive contact with its surroundings.

  3. A combination of manifold learning and artificial neural networks lets the robot recognize and interpret touch trajectories as machine-readable letters, symbols, or numbers.


Technical Solution:

  1. A momentum-based monitoring method estimates external forces and torques from the integrated joint force-torque sensors.

  2. Manifold learning unfolds the touch trajectory from the robot's curved 3D surface onto a 2D plane, removing the influence of surface curvature.

  3. A rotation-equivariant convolutional neural network classifies and recognizes the unfolded touch trajectories, turning touch input into text.
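The momentum-based monitoring in step 1 can be sketched for a single joint: an observer integrates the commanded torque plus its own residual, and the residual converges to the unknown external torque. This is a minimal illustrative model, not the paper's full implementation; the inertia, gain, torque values, and the omission of gravity/Coriolis terms are all simplifying assumptions.

```python
# Minimal 1-DoF momentum-observer sketch: estimate an external joint torque
# from the residual between measured and predicted generalized momentum.
# All numeric values are illustrative.

def simulate_momentum_observer(I=1.0, K=50.0, dt=1e-3, T=2.0, tau_ext_true=0.5):
    qd = 0.0      # joint velocity (true plant state)
    p_hat = 0.0   # observer's momentum estimate
    r = 0.0       # residual; converges to the external torque
    for k in range(int(T / dt)):
        tau_motor = 0.2  # constant commanded torque (assumption)
        tau_ext = tau_ext_true if k * dt > 0.5 else 0.0  # push starts at 0.5 s
        # True dynamics (gravity/Coriolis omitted): I * qdd = tau_motor + tau_ext
        qd += (tau_motor + tau_ext) / I * dt
        p = I * qd                   # true generalized momentum
        # Observer integrates only what it knows: commanded torque plus residual
        p_hat += (tau_motor + r) * dt
        r = K * (p - p_hat)
    return r  # ≈ tau_ext_true once the transient has decayed

print(simulate_momentum_observer())  # ≈ 0.5
```

The residual obeys a first-order lag toward the true external torque with time constant 1/K, which is why a larger observer gain K yields faster but noisier contact detection.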
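Step 2's surface unfolding can be illustrated in closed form for a cylindrical link, a common approximation of a robot arm segment; the paper's manifold learning handles general surfaces, and the radius and trajectory below are invented for the example.

```python
# Sketch: unrolling a touch trajectory from a cylindrical link surface to 2D.
# A cylinder admits a closed-form unwrap, enough to illustrate how unfolding
# removes the effect of surface curvature before recognition.
import math

def unwrap_cylinder(points, radius):
    """Map 3D points on a cylinder of the given radius to the unrolled plane:
    u = arc length around the circumference, v = height along the axis."""
    return [(radius * math.atan2(y, x), z) for (x, y, z) in points]

R = 0.05  # link radius in metres (illustrative value)
# A helical stroke on the cylinder becomes a straight line after unrolling.
helix = [(R * math.cos(t), R * math.sin(t), 0.01 * t)
         for t in [i * 0.1 for i in range(20)]]
flat = unwrap_cylinder(helix, R)
```

After unrolling, a stroke that wraps around the link looks the same as one drawn on a flat surface, so a single 2D classifier can handle touches anywhere on the robot.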

Experimental Results:

  1. The robot accurately detects and localizes touch trajectories in real time and interprets them as specific commands or intents.

  2. Experiments verify the technique's effectiveness across different robot configurations and user perspectives, achieving high-accuracy recognition of numbers and letters.

  3. The "virtual button" concept is demonstrated: the user can place programmable interactive buttons anywhere on the robot's surface and use them to trigger preset tasks or functions.

  4. A series of experiments, including writing numbers and letters on the robot's surface and using virtual buttons to control the robot's motion and task execution, validates the approach.
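The virtual-button idea from the experiments can be sketched as a simple lookup: contact locations (here assumed already expressed in the unrolled 2D surface frame) are matched against user-defined regions that trigger preset actions. The region shapes, coordinates, and action names are all hypothetical.

```python
# Sketch of "virtual buttons": user-defined circular regions on the robot's
# (unrolled) surface that map a touch location to a preset action.

def make_button(center, radius, action):
    return {"center": center, "radius": radius, "action": action}

def dispatch_touch(buttons, touch):
    """Run the action of the first button whose region contains the touch;
    return None if the touch hits no button."""
    for b in buttons:
        dx = touch[0] - b["center"][0]
        dy = touch[1] - b["center"][1]
        if dx * dx + dy * dy <= b["radius"] ** 2:
            return b["action"]()
    return None

buttons = [
    make_button((0.10, 0.20), 0.03, lambda: "start_task"),
    make_button((0.10, 0.30), 0.03, lambda: "stop_task"),
]

print(dispatch_touch(buttons, (0.11, 0.21)))  # prints "start_task"
```

Because the buttons live in software rather than hardware, they can be moved, resized, or rebound to new tasks at any time, which is what makes the interaction programmable.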

The technology proposed in this paper opens new possibilities for intuitive and flexible interaction between robots and humans, and suggests new research directions for human-robot collaboration.
