A Robotic Hand Employs Touch, Rather Than Vision, to Manipulate and Rotate Objects

Scientists at the University of California San Diego have developed a novel approach, inspired by human dexterity, that enables a robotic hand to rotate objects solely through touch, eliminating the need for visual input.
A robotic hand. Credit: Pixabay

The team equipped a four-fingered robotic hand with 16 touch sensors spread across its palm and fingers. These low-cost, low-resolution sensors, about $12 each, report only a simple binary signal: whether or not an object is touching them.

The robotic hand utilizes this touch-based information to smoothly rotate a wide range of objects, including small toys, cans, fruits, and vegetables, without causing damage.

Enabling Robots to Manipulate Objects in Low-Light and Vision-Limited Environments

This innovative technique shows promise in enabling robots to manipulate objects in darkness or environments where visual perception is limited. The team presented their work at the 2023 Robotics: Science and Systems Conference, highlighting the potential applications of their touch-based rotational method.

In contrast to other approaches that rely on a few high-resolution touch sensors placed at the fingertips, this method disperses many low-cost sensors across a larger area of the robotic hand, offering unique advantages and versatility.

Xiaolong Wang, a professor of electrical and computer engineering at UC San Diego and the study's lead researcher, points to several issues with current methods of robotic hand manipulation.

Challenges in Robotic Hand Sensing and Perception

Firstly, using a limited number of sensors on the robotic hand reduces the likelihood of contact with objects, thus restricting the system’s ability to sense its surroundings. Secondly, the complexity and cost of simulating high-resolution touch sensors that provide texture information make them impractical for real-world experiments. Lastly, many existing approaches heavily rely on visual feedback.

To overcome these challenges, Wang and his research team propose a simple solution. They demonstrate that detailed texture information about an object is unnecessary for the task at hand. Instead, they find that binary signals indicating whether the sensors have made contact with the object or not are sufficient and much easier to simulate and implement in real-world scenarios.
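
To give a rough sense of how lightweight such a signal is, here is a minimal Python sketch that collapses raw force readings into the kind of binary contact vector the paragraph describes. The sensor count matches the article; the threshold value and function name are hypothetical, not taken from the paper.

```python
import numpy as np

NUM_SENSORS = 16          # touch sensors spread across the palm and fingers
CONTACT_THRESHOLD = 0.5   # hypothetical force threshold (arbitrary units)

def read_binary_touch(raw_forces: np.ndarray) -> np.ndarray:
    """Collapse raw force readings into 0/1 contact signals.

    Each sensor reports only whether it is touching the object,
    so no texture or pressure detail has to be simulated.
    """
    assert raw_forces.shape == (NUM_SENSORS,)
    return (raw_forces > CONTACT_THRESHOLD).astype(np.int8)

# Example frame: a few sensors happen to be pressed against the object.
raw = np.random.uniform(0.0, 1.0, size=NUM_SENSORS)
print(read_binary_touch(raw))  # e.g. [0 1 0 0 1 1 0 ...]
```

Because each reading is a single bit, the same thresholding works identically in simulation and on the physical sensors, which is what makes the signal cheap to train on and to deploy.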

Advantages of a Comprehensive Array of Binary Touch Sensors for Robotic Object Rotation

The researchers emphasize that using a comprehensive array of binary touch sensors provides enough data about the object’s 3D structure and orientation, enabling the robotic hand to rotate objects effectively without relying on visual cues.

To train their system, the team utilized simulations of a virtual robotic hand manipulating various objects, including irregularly shaped ones.

The system tracks which sensors on the hand make contact with the object during rotation, along with the positions and previous movements of the hand’s joints. Using this information, the system guides the robotic hand on the necessary joint movements for the next steps in the rotation process.
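
To make that loop concrete, below is a minimal sketch of one control step in Python. The `DummyHand` interface, the joint count, and the placeholder policy are assumptions for illustration only; the actual system uses a network trained in simulation.

```python
import numpy as np

NUM_SENSORS, NUM_JOINTS = 16, 16  # 16 binary sensors; joint count assumed

class DummyHand:
    """Stand-in for the real hand or simulator interface (hypothetical)."""
    def read_binary_touch(self) -> np.ndarray:
        return np.random.randint(0, 2, NUM_SENSORS)      # 0/1 contact flags
    def read_joint_positions(self) -> np.ndarray:
        return np.random.uniform(-1.0, 1.0, NUM_JOINTS)  # current joint angles
    def set_joint_targets(self, targets: np.ndarray) -> None:
        pass  # would command the finger motors on real hardware

def policy(obs: np.ndarray) -> np.ndarray:
    """Placeholder for the learned policy trained in simulation."""
    return np.random.uniform(-0.1, 0.1, NUM_JOINTS)      # small joint movements

hand = DummyHand()
prev_action = np.zeros(NUM_JOINTS)
for step in range(100):
    obs = np.concatenate([
        hand.read_binary_touch(),     # which sensors are in contact
        hand.read_joint_positions(),  # where the hand's joints currently are
        prev_action,                  # the hand's previous movement
    ])
    prev_action = policy(obs)         # decide the next joint movements
    hand.set_joint_targets(prev_action)
```

Each pass through the loop feeds the latest touch pattern, joint positions, and previous movement back into the policy, which is how the hand keeps adjusting its grip as the object turns.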

Real-Life Testing and Object Rotation Performance

After successful simulation training, the researchers tested the system with a physical robotic hand on objects it had never encountered. The hand rotated a variety of objects, including a tomato, a pepper, a can of peanut butter, and a toy rubber duck (the most challenging due to its shape), without stalling or losing its grip. Objects with more complex shapes took longer to rotate, but the hand was still able to turn them around different axes.

In the future, Wang and his team plan to expand their approach to tackle more intricate manipulation tasks, like enabling robotic hands to catch, throw, and juggle objects. The ultimate objective is to equip robots with in-hand dexterity, a skill that comes naturally to humans but poses significant challenges for robots to master.

Accomplishing this would greatly expand the range of tasks robots can perform. The research paper, titled “Rotating without Seeing: Towards In-hand Dexterity through Touch,” lists co-authors Binghao Huang and Yuzhe Qin of UC San Diego, and Zhao-Heng Yin and Qifeng Chen of HKUST.


Read the original article on Science Daily.

Read more: Enabling Autonomous Exploration for Robots.
