
Next time you’re scrolling on your phone, pause to appreciate what’s happening: that simple motion relies on the precise coordination of 34 muscles, 27 joints, and more than 100 tendons and ligaments in your hand. Our hands are, in fact, the most dexterous parts of the body, and replicating their subtle, complex movements has long posed a challenge for robotics and virtual reality.
AI-Powered Ultrasound Hand Tracker
Engineers at MIT have developed an ultrasound wristband that can accurately track a user’s hand movements in real time. The device captures ultrasound images of the wrist’s muscles, tendons, and ligaments as the hand moves, then uses an AI algorithm to continuously convert those images into the positions of the five fingers and the palm.
The system can be trained to recognize an individual’s specific hand motions, allowing it to relay those movements instantly to a robot or within a virtual environment. The research was published in Nature Electronics.
In tests, the team demonstrated that a wearer could wirelessly control a robotic hand that mirrors their gestures and movements. Working the robot much like a wireless puppet, the user guided it to play a simple piano tune and toss a small basketball into a desktop hoop. Using the same wristband, the wearer could also interact with digital objects on a screen—for example, pinching their fingers to zoom in or out of a virtual item.
Hand-Motion Data for Robotics and VR
The team is now using the wristband to collect hand-motion data from a wide range of users with different hand sizes, finger shapes, and gesture styles. They aim to build a large dataset that could be used, for example, to train humanoid robots to perform highly dexterous tasks, including certain surgical procedures. The ultrasound band could also enable users to grasp, manipulate, and interact with objects in video games, design tools, and other virtual environments.
“We believe this technology could quickly replace existing hand-tracking methods in virtual and augmented reality with wearable ultrasound bands,” says Xuanhe Zhao, the Uncas and Helen Whitaker Professor of Mechanical Engineering at MIT. “It could also generate vast amounts of training data for dexterous humanoid robots.”
Zhao, Gengxi Lu, and their colleagues introduced the new wristband design alongside MIT collaborators, including former postdocs Xiaoyu Chen, Shucong Li, and Bolei Deng; graduate students SeongHyeon Kim and Dian Li; postdocs Shu Wang and Runze Li; and Anantha Chandrakasan, MIT’s provost and Vannevar Bush Professor of Electrical Engineering and Computer Science. Additional co-authors include graduate students Yushun Zheng and Junhang Zhang, as well as Baoqiang Liu, Chen Gong, and Professor Qifa Zhou from the University of Southern California.

Seeing Strings
Researchers have explored several ways to capture and replicate human hand dexterity in robots. Some methods rely on cameras to track hand movements as people handle objects or perform tasks. Others use sensor-equipped gloves that record motion and transmit it to a robotic system. However, complex camera setups can be impractical and easily obstructed, while sensor-heavy gloves may restrict natural movement and reduce tactile feedback.
Another approach measures electrical signals from muscles in the wrist or forearm and links them to specific hand actions. Although this method has seen notable progress, it remains vulnerable to environmental noise and lacks the sensitivity to detect fine, continuous motion. For example, it might identify whether the thumb and index finger are pinched together or apart, but not the subtle transitions in between.
Zhao’s team began exploring whether ultrasound imaging could offer a more precise and fluid way to track hand movements. Their group has been developing compact ultrasound “stickers”—miniaturized versions of the transducers used in medical settings, combined with a hydrogel layer that allows them to adhere safely to the skin.
In their latest study, the researchers integrated their ultrasound “sticker” technology into a wearable wristband that continuously images the muscles and tendons in the wrist.
“The tendons and muscles in the wrist act like strings controlling a puppet—your fingers,” Lu explains. “By capturing each snapshot of those strings, you can determine the exact position of the hand.”
Charting Manipulation
The researchers created a wristband featuring an ultrasound sticker about the size of a smartwatch, along with onboard electronics comparable in size to a cellphone. After placing it on a volunteer’s wrist, they verified that the device could generate clear, continuous images as the person moved their fingers through different gestures.
The next hurdle was linking the grayscale ultrasound images of the wrist to precise hand positions. The human hand has 22 degrees of freedom—various ways the fingers and thumb can move or rotate. The team discovered that specific regions within the ultrasound images correspond to each of these movements. For example, changes in one area may indicate thumb extension, while shifts in another reflect motion in the index finger.
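To make the idea of "22 degrees of freedom" concrete, the sketch below represents a hand pose as a named vector of 22 joint angles. The grouping of angles by digit and the specific names are assumptions for illustration; the paper's exact parameterization is not reproduced here.

```python
from dataclasses import dataclass, field

# Hypothetical grouping of the hand's 22 degrees of freedom by digit.
# This breakdown (5 for the thumb, 4 per finger, 1 for the wrist) is an
# illustrative assumption, not the paper's exact parameterization.
DOF_NAMES = (
    ["thumb_%d" % i for i in range(5)]
    + ["index_%d" % i for i in range(4)]
    + ["middle_%d" % i for i in range(4)]
    + ["ring_%d" % i for i in range(4)]
    + ["pinky_%d" % i for i in range(4)]
    + ["wrist_flex"]
)
assert len(DOF_NAMES) == 22

@dataclass
class HandPose:
    """Angles in degrees for each of the 22 degrees of freedom."""
    angles: dict = field(default_factory=lambda: {n: 0.0 for n in DOF_NAMES})

    def set(self, name: str, value: float) -> None:
        if name not in self.angles:
            raise KeyError(name)
        self.angles[name] = value

pose = HandPose()
# E.g., a change in one ultrasound-image region might be decoded as
# thumb extension and written into the corresponding degree of freedom.
pose.set("thumb_0", 35.0)
```

Tracking then amounts to updating this vector continuously as each new ultrasound frame arrives.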
To map these relationships, a volunteer wore the wristband while performing different hand positions, as multiple cameras recorded the movements from surrounding angles. By aligning variations in the ultrasound images with the hand positions captured on camera, the researchers were able to label regions of the wrist images according to each degree of freedom. However, carrying out this translation continuously and in real time would be far beyond human capability.
Enhancing Hand Tracking with AI and Volunteer Testing
To overcome this challenge, the team turned to artificial intelligence. They employed an algorithm capable of learning visual patterns and linking them to specific labels—in this case, the hand’s different degrees of freedom. The researchers trained the system using carefully annotated ultrasound images, marking the regions tied to particular movements. When tested on a new set of images, the algorithm successfully identified the corresponding hand gestures.
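The article does not describe the team's model architecture, but the shape of the learning problem is standard supervised regression: labeled images in, joint angles out. As a minimal stand-in, the sketch below fits a tiny linear model by gradient descent, mapping a few synthetic "image-region intensities" to a single joint angle; the features, labels, and model are all illustrative assumptions.

```python
import random

random.seed(0)

# Toy data: each "image" is reduced to 3 region intensities, and the
# label is one joint angle. The hidden linear relationship below is
# synthetic, standing in for the camera-labeled training pairs.
def make_sample():
    regions = [random.random() for _ in range(3)]
    angle = 40.0 * regions[0] + 10.0 * regions[1]  # hidden ground truth
    return regions, angle

data = [make_sample() for _ in range(200)]

# Fit weights with plain per-sample gradient descent on squared error.
w = [0.0, 0.0, 0.0]
b = 0.0
lr = 0.1
for _ in range(2000):
    for x, y in data:
        pred = sum(wi * xi for wi, xi in zip(w, x)) + b
        err = pred - y
        for i in range(3):
            w[i] -= lr * err * x[i]
        b -= lr * err

# After training, predictions on a fresh sample track the true angle.
x, y = make_sample()
pred = sum(wi * xi for wi, xi in zip(w, x)) + b
```

The real system replaces this linear map with a model that learns spatial patterns directly from grayscale ultrasound frames and predicts all degrees of freedom at once, but the training loop follows the same principle.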
After successfully integrating the AI algorithm with the wristband, the researchers tested it on additional volunteers. In this study, eight participants with varying hand and wrist sizes wore the device while performing different gestures and grasps, including all 26 letters of the American Sign Language fingerspelling alphabet. They also handled objects like a tennis ball, a plastic bottle, scissors, and a pencil. In every case, the wristband accurately tracked and predicted the hand's positions.

Controlling Virtual Objects and Robots
To showcase potential applications, the team created a basic computer program that connected wirelessly to the wristband. As users performed pinching and grasping motions, the program translated these gestures into smooth, continuous actions, such as zooming in and out on objects or moving and manipulating them on the screen.
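The demo program itself is not public; as a hedged sketch of the idea, the snippet below maps a continuous thumb-index "pinch distance" to a zoom factor. The calibration range and zoom limits are illustrative assumptions, not values from the paper.

```python
# Hypothetical calibration: fingers touching at ~1 cm apart, fully
# spread at ~9 cm. These numbers are assumptions for illustration.
PINCH_MIN_CM = 1.0
PINCH_MAX_CM = 9.0
ZOOM_MIN, ZOOM_MAX = 0.5, 3.0

def pinch_to_zoom(distance_cm: float) -> float:
    """Linearly interpolate a pinch distance into a zoom factor, clamped."""
    d = min(max(distance_cm, PINCH_MIN_CM), PINCH_MAX_CM)
    t = (d - PINCH_MIN_CM) / (PINCH_MAX_CM - PINCH_MIN_CM)
    return ZOOM_MIN + t * (ZOOM_MAX - ZOOM_MIN)

# Fully pinched maps to zoomed out; fully spread maps to zoomed in.
```

Because the wristband reports continuous finger positions rather than discrete "pinched/open" states, such a mapping yields smooth zooming rather than stepwise jumps.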
The researchers also demonstrated the wristband’s ability to wirelessly control a commercial robotic hand. A volunteer mimicked keyboard motions while wearing the device, and the robot mirrored the movements in real time, playing a simple piano tune. The same robot was also able to replicate finger taps to play a desktop basketball game.
Zhao aims to make the wristband’s hardware even smaller and to train the AI software on a broader range of gestures and movements from volunteers with diverse hand sizes and shapes. The ultimate goal is to create a wearable hand-tracking device that anyone can use to wirelessly control humanoid robots or virtual objects with high precision.
“We believe this is the most advanced method for capturing dexterous hand movements—through wearable imaging of the wrist,” Zhao says. “These ultrasound bands could offer intuitive and versatile control for both virtual reality and robotic hands.”

Read the original article on: Tech Xplore
