
Engineers at Northwestern University have created a new wearable device that could transform how technology simulates touch. The thin, flexible patch sticks to the skin and can generate sensations like vibration, pressure, and twisting.
Beyond enhancing gaming and virtual reality, the device may have healthcare uses, including support for people with visual impairments and users of prosthetic limbs. The research was recently published in Nature and represents the latest advance by Northwestern bioelectronics expert John A. Rogers.
From Epidermal VR to Advanced Haptic Sensations
The innovation expands on the team’s 2019 “epidermal VR” system, using advanced actuators that deliver controlled, constant force across a range of frequencies. The actuators can also produce a smooth twisting motion on the skin, increasing the realism of the sensation.
The project was a collaborative effort involving researchers from Northwestern University, Westlake University in China, and Dalian University of Technology.
One collaborating group, led by Jiang, focused on creating the miniature components that enable the twisting sensation. The wearable itself features 19 tiny magnetic actuators arranged in a hexagonal pattern and embedded within a flexible silicone mesh, with each actuator capable of producing different tactile effects.
Bluetooth Integration and Energy-Efficient Design
The device connects via Bluetooth, receiving input from a smartphone that translates environmental data into sensory signals. Although it runs on a small battery, its bistable design greatly improves energy efficiency: each actuator can rest in either of two stable states without drawing continuous power.
When the actuators press downward, energy is stored in both the skin and the device’s structure. As they move upward, that stored energy is released, requiring only minimal electrical input. This approach extends the device’s operating time and reduces the need for frequent charging.
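The efficiency argument above can be made concrete with a small sketch. This is an illustrative model, not the authors’ design: the class name, states, and energy figure are hypothetical. The point it demonstrates is that a bistable actuator spends energy only when it flips between states, never while holding one.

```python
# Hypothetical sketch of a bistable actuator's power logic.
# Names and numbers are illustrative, not from the Northwestern device.
class BistableActuator:
    """Actuator with two stable rest states, 'up' and 'down'.

    Energy is drawn only during a transition; holding either state
    costs nothing, which is the efficiency win of a bistable design.
    """

    TOGGLE_ENERGY_MJ = 1.5  # illustrative energy cost per transition

    def __init__(self):
        self.state = "up"
        self.energy_used_mj = 0.0

    def set_state(self, target):
        if target not in ("up", "down"):
            raise ValueError("state must be 'up' or 'down'")
        if target != self.state:
            # Elastic energy stored in the skin and structure on the
            # way down is returned on the way up, so only a small
            # electrical pulse is needed to flip the state.
            self.energy_used_mj += self.TOGGLE_ENERGY_MJ
            self.state = target
        # Holding the current state draws no power at all.


actuator = BistableActuator()
actuator.set_state("down")
actuator.set_state("down")  # no change, so no energy spent
actuator.set_state("up")
print(actuator.energy_used_mj)  # → 3.0
```

Repeated commands to hold the same state cost nothing, so only the two actual transitions consume energy.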
Matthew Flavin, the study’s lead author and now an assistant professor at Georgia Tech, explained that the device uses the skin’s elasticity to store and reuse energy, boosting efficiency much like a stretched rubber band.
Testing Sensory Substitution in Blindfolded Volunteers
Researchers asked blindfolded, sighted volunteers to test the device to see whether mechanical feedback could substitute for visual input. Participants navigated obstacles, adjusted their steps to avoid tripping, and shifted their posture to maintain balance.
In one experiment, a participant walked along a path filled with obstacles. As objects drew nearer, the device delivered gentle signals in the upper-right area of the sensor. These signals increased in strength and moved toward the center as the distance decreased.
After only a brief adjustment period, participants were able to change their movements in real time. This showed that the device could effectively translate visual cues into mechanical sensations, enabling users to interpret their surroundings through touch.
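The experiment described above implies a simple mapping from obstacle distance to a haptic cue. The following sketch is a hypothetical illustration of that idea, not the study’s code: the function name, the 3 m range, and the linear ramp are assumptions. It shows a cue that starts weak at the edge of the array and grows stronger, drifting toward the center, as an obstacle closes in.

```python
# Illustrative mapping (not the authors' implementation) from obstacle
# distance to a haptic cue's strength and position on the array.
def distance_to_cue(distance_m, max_range_m=3.0):
    """Return (intensity, position) for a hypothetical haptic cue.

    intensity: 0.0 (out of range) .. 1.0 (touching)
    position:  1.0 (edge of the array) .. 0.0 (center)
    """
    if distance_m >= max_range_m:
        return 0.0, 1.0  # nothing nearby: no cue
    closeness = 1.0 - distance_m / max_range_m
    intensity = closeness        # stronger as the object nears
    position = 1.0 - closeness   # cue drifts toward the center
    return intensity, position


# Cue evolution as an obstacle approaches from 3 m to contact:
for d in (3.0, 2.0, 1.0, 0.0):
    print(d, distance_to_cue(d))
```

At maximum range the cue is silent; at contact it is at full strength in the center, matching the behavior reported in the experiment.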
Providing “Haptic Vision” Beyond Traditional Tools
Flavin noted that the technology could function in a role similar to a white cane, while also providing additional environmental information beyond what traditional tools offer. The study demonstrated that the system can deliver a basic form of “vision” by sending patterned haptic signals to the skin.
Using data from smartphone-based 3D imaging sensors such as LiDAR, the device converts spatial information into tactile feedback. According to Rogers, this kind of sensory substitution supports meaningful spatial awareness and could significantly enhance mobility and independence for people with visual impairments.
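A minimal sketch of that conversion, under stated assumptions: the grid shape, sector layout, and 4 m range below are illustrative, not taken from the study. Each region of a small depth image is reduced to its nearest obstacle, and nearer obstacles drive stronger actuation.

```python
# Hedged sketch of sensory substitution: a smartphone depth sensor
# (such as LiDAR) yields a depth image; each region is reduced to its
# nearest obstacle, and closer obstacles produce stronger actuation.
def depth_to_actuation(depth_map, max_range_m=4.0):
    """Map each row of a small depth grid (meters) to a drive level
    in [0, 1]: 0 = nothing within range, 1 = very close."""
    levels = []
    for row in depth_map:
        nearest = min(row)
        clipped = min(max(nearest, 0.0), max_range_m)
        levels.append(round(1.0 - clipped / max_range_m, 2))
    return levels


# Three angular sectors of a scene, each sampled at a few points:
scene = [
    [4.0, 3.8, 4.0],  # left sector: essentially clear
    [1.0, 0.8, 1.2],  # center: obstacle close ahead
    [2.5, 2.0, 3.0],  # right: something at mid-range
]
print(depth_to_actuation(scene))  # → [0.05, 0.8, 0.5]
```

In a real system each drive level would address one of the actuators in the array, giving the wearer a coarse tactile picture of where the nearest obstacles lie.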
Advancements in wearable technology highlight the growing potential to merge electronic systems with human biology, broadening how people and machines interact.
Read the original article on: Clickpetroleoegas
Read more: A New Artificial Skin Aims to Give Humanoid Robots the Sensation of Pain
