Science has long acknowledged five core senses—sight, hearing, smell, taste, and touch—but ongoing research suggests this list may be incomplete.
A recent study from Queen Mary University of London and UCL suggests humans may possess an additional sense, “remote touch,” that allows them to detect objects without direct contact.
Nature’s Inspiration: How Birds Detect Hidden Prey
The idea was inspired by birds like sandpipers, which detect prey hidden under sand.
By sensing subtle mechanical signals transmitted through the grains, these birds can detect hidden objects in their environment.
Psychologist Elisabetta Versace of the Prepared Minds Lab conducted experiments showing humans could detect a cube buried under sand without touching it, revealing extraordinary tactile sensitivity.
In a follow-up, researchers used a robotic arm with tactile sensors and an LSTM model to identify the same object.
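To make the robot's side of the comparison concrete, here is a minimal, self-contained sketch of how an LSTM can score a tactile time series for "object present" — pure NumPy, with random untrained weights and hypothetical sizes; it does not reproduce the study's actual architecture or data:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h, c, W, U, b):
    """One LSTM step; gates stacked as [input, forget, cell, output]."""
    z = W @ x + U @ h + b
    n = h.size
    i = sigmoid(z[0:n])          # input gate
    f = sigmoid(z[n:2 * n])      # forget gate
    g = np.tanh(z[2 * n:3 * n])  # candidate cell update
    o = sigmoid(z[3 * n:4 * n])  # output gate
    c = f * c + i * g
    h = o * np.tanh(c)
    return h, c

# Hypothetical setup: a 1-D force signal, 8 hidden units, 50 time steps.
n_in, n_hid, T = 1, 8, 50
W = rng.normal(0, 0.1, (4 * n_hid, n_in))
U = rng.normal(0, 0.1, (4 * n_hid, n_hid))
b = np.zeros(4 * n_hid)
w_out = rng.normal(0, 0.1, n_hid)  # linear read-out: score for "object present"

force_trace = rng.normal(0, 1, (T, n_in))  # stand-in for probe sensor readings
h, c = np.zeros(n_hid), np.zeros(n_hid)
for x in force_trace:
    h, c = lstm_step(x, h, c, W, U, b)
p_object = sigmoid(w_out @ h)  # probability-like detection score in (0, 1)
print(round(float(p_object), 3))
```

In a trained system the weights would be fit on labeled probing trials; the point here is only the shape of the pipeline: a force trace in, a single detection score out.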
Humans Outperform Robots in Tactile Detection
The findings were striking: human participants outperformed the robotic system by roughly 30 percentage points. While people reached an average accuracy of 70.7%, the robot generated numerous false detections and achieved only 40% accuracy overall.
Zhengqi Chen, a PhD student at Queen Mary University of London, said the discovery could inspire tools that enhance human touch.
Applications include detecting artifacts without damage and exploring sandy terrains like Mars or the ocean floor.
Bridging Psychology, Robotics, and AI
Lorenzo Jamone from University College London emphasized the value of integrating psychology, robotics, and artificial intelligence. “Human experiments guided the robots’ training, and the robots’ feedback helped us reinterpret the human data,” he said.
The study, “Exploring Tactile Perception for Object Localization in Granular Media,” published in IEEE Xplore, could reshape our understanding of human touch.
USC researchers have created a wearable system that allows for more natural and emotionally rich interactions in shared virtual environments, expanding opportunities in remote work, education, healthcare, and more.
Restoring the Power of Touch in a Digital World
Touch is essential to human communication and connection, helping build trust, regulate stress, and form emotional bonds from infancy through adulthood. Yet in today’s digital world, where many interactions happen through screens, physical contact is often absent.
To address this, researchers at the USC Viterbi School of Engineering have created a wearable haptic system that allows users to share and feel physical gestures—like handshakes, pats, and squeezes—in virtual reality, even across long distances. They detailed their work in a paper published on the arXiv preprint server.
Wearable Devices Bring Realistic Touch to Virtual Interaction
The system features gloves and sleeves equipped with small vibration motors that mimic pressure and motion, enabling users to engage with both virtual objects and each other through realistic touch feedback.
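The core rendering idea — translating a virtual contact into a motor drive level — can be sketched as follows. This is an illustrative linear mapping with hypothetical limits, not the USC team's actual rendering algorithm, which likely shapes amplitude and frequency per gesture type:

```python
def motor_command(contact_force_n, max_force_n=10.0, pwm_max=255):
    """Map a virtual contact force (newtons) to a vibration-motor PWM level.

    Hypothetical linear mapping with clamping: forces at or above
    max_force_n saturate the motor; negative values are treated as zero.
    """
    frac = min(max(contact_force_n / max_force_n, 0.0), 1.0)
    return int(round(frac * pwm_max))

# A firmer virtual squeeze drives the motor harder, up to saturation.
print(motor_command(2.5))   # light pat  -> 64
print(motor_command(12.0))  # hard squeeze, clamped -> 255
```

A real driver would also vary vibration frequency and spatial pattern across the glove's motors, since those dimensions carry much of the emotional quality of a gesture.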
A user study, also presented at the IEEE World Haptics Conference, found that participants felt more connected and engaged when they could physically feel virtual gestures.
“Even as people spend more time socializing online, we’re seeing increased rates of depression, anxiety, and what’s known as ‘touch starvation,’” said Heather Culbertson, associate professor at USC Viterbi and the study’s senior author. “Virtual interactions are here to stay—but we need to make them better mirror the emotional benefits of real-life experiences.”
Image Credits: Premankur Banerjee
The system allows up to 16 users to connect at once, each represented by a full-body 3D avatar that mimics their real-world movements in a shared virtual space. Unlike video calls, users can move freely, interact with each other, and engage with virtual objects—such as passing items or collaborating on tasks.
“This project came from a simple but powerful human need—to feel close to those we miss,” said Premankur Banerjee, a PhD student in Heather Culbertson’s Haptics Robotics and Virtual Interaction Lab and the study’s lead author.
Making Long-Distance Communication Feel Close
“After spending over five years away from my own family, this work became personal. It’s not just about creating a sense of presence, but about bringing back the feeling of physical closeness in long-distance communication,” he said.
To recreate touch, users wear gloves and armbands with vibration motors that simulate motion and pressure, allowing them to feel gestures and interactions in VR.
Tests showed that participants experienced a stronger sense of connection and presence with tactile feedback. The study also examined how gesture speed and vibration type affect emotional and sensory perception, offering insights for designing more immersive touch-based experiences.
Merging Science and Emotion in Communication Technology
“Building this kind of technology demands collaboration across disciplines,” said Culbertson. “Our team integrates computer science, engineering, neuroscience, psychology, and social science to develop tools that are not just technically effective, but also enable emotionally rich social interaction.”
The global move toward online communication—accelerated by the COVID-19 pandemic—has offered great convenience but also led to unintended effects. Despite being more digitally connected than ever, many people, especially younger generations, continue to struggle with loneliness, anxiety, and depression.
“Platforms like Zoom and FaceTime help us stay visually and verbally connected, but they lack the physical interaction that humans naturally need,” said Culbertson.
She emphasized that while this technology can’t replace real-life contact, it can meaningfully enhance social interaction when being together in person isn’t possible.
Enhancing Care, Collaboration, and Closeness Across Distances
In hospitals and long-term care settings, the system could help patients and loved ones share comforting physical gestures across distances. In remote work or learning environments, it enables more immersive, collaborative engagement. For families and friends separated by travel or deployment, it helps restore a deeper sense of closeness.
“Touch is essential to human well-being. While technology can’t fully replicate it, bringing tactile experiences into virtual spaces is an important step toward more emotionally connected digital communication,” Culbertson said.
Scientists at the University of California San Diego have developed a novel approach, inspired by human dexterity, that enables a robotic hand to rotate objects solely through touch, eliminating the need for visual input.
The team equipped a four-fingered robotic hand with 16 touch sensors on its palm and fingers. These low-cost, low-resolution touch sensors, each costing around $12, can detect whether an object is in contact with them or not, providing simple binary signals.
The robotic hand utilizes this touch-based information to smoothly rotate a wide range of objects, including small toys, cans, fruits, and vegetables, without causing damage.
Enabling Robots to Manipulate Objects in Low-Light and Vision-Limited Environments
This innovative technique shows promise in enabling robots to manipulate objects in darkness or environments where visual perception is limited. The team presented their work at the 2023 Robotics: Science and Systems Conference, highlighting the potential applications of their touch-based rotational method.
In contrast to other approaches that rely on a few high-resolution touch sensors placed at the fingertips, this method disperses many low-cost sensors across a larger area of the robotic hand, offering unique advantages and versatility.
Xiaolong Wang, a professor specializing in electrical and computer engineering at UC San Diego and the lead researcher of this study, has pointed out several issues with current methods of robotic hand manipulation.
Challenges in Robotic Hand Sensing and Perception
Firstly, using a limited number of sensors on the robotic hand reduces the likelihood of contact with objects, thus restricting the system’s ability to sense its surroundings. Secondly, the complexity and cost of simulating high-resolution touch sensors that provide texture information make them impractical for real-world experiments. Lastly, many existing approaches heavily rely on visual feedback.
To overcome these challenges, Wang and his research team propose a simple solution. They demonstrate that detailed texture information about an object is unnecessary for the task at hand. Instead, they find that binary signals indicating whether the sensors have made contact with the object or not are sufficient and much easier to simulate and implement in real-world scenarios.
Advantages of a Comprehensive Array of Binary Touch Sensors for Robotic Object Rotation
The researchers emphasize that using a comprehensive array of binary touch sensors provides enough data about the object’s 3D structure and orientation, enabling the robotic hand to rotate objects effectively without relying on visual cues.
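The kind of coarse 3D cue a binary sensor array yields can be illustrated with a short sketch. Sensor positions and readings here are hypothetical; the point is that even on/off contacts at known locations summarize where the object sits in the hand:

```python
import numpy as np

rng = np.random.default_rng(1)

# 16 binary contact sensors at known (hypothetical) positions on the
# palm and fingers, in meters relative to the palm center.
sensor_xyz = rng.uniform(-0.05, 0.05, (16, 3))

def contact_summary(contacts):
    """contacts: length-16 sequence of 0/1 readings.

    Returns (number of active contacts, centroid of their positions),
    a coarse stand-in for the object-pose cue the policy consumes.
    """
    contacts = np.asarray(contacts, dtype=bool)
    if not contacts.any():
        return 0, None
    return int(contacts.sum()), sensor_xyz[contacts].mean(axis=0)

# Three sensors firing on adjacent fingers localize the contact patch.
n, centroid = contact_summary([0] * 6 + [1, 1, 0, 1] + [0] * 6)
print(n, centroid.round(3))
```

With many sensors spread over the hand, successive summaries like this trace how the object moves during rotation, which is exactly the signal high-resolution fingertip-only sensing tends to miss.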
To train their system, the team utilized simulations of a virtual robotic hand manipulating various objects, including irregularly shaped ones.
The system tracks which sensors on the hand make contact with the object during rotation, along with the positions and previous movements of the hand’s joints. Using this information, the system guides the robotic hand on the necessary joint movements for the next steps in the rotation process.
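The observation the controller consumes at each step can be sketched as a single flat vector. The sizes below (16 contacts, 16 joint angles, 16 previous joint targets) are illustrative; the actual controller's input layout may differ:

```python
import numpy as np

def build_observation(contacts, joint_pos, prev_action):
    """Stack the policy inputs described above into one flat vector:
    binary contact readings, current joint positions, and the previous
    commanded joint targets. All sizes here are hypothetical."""
    return np.concatenate([
        np.asarray(contacts, dtype=np.float32),
        np.asarray(joint_pos, dtype=np.float32),
        np.asarray(prev_action, dtype=np.float32),
    ])

obs = build_observation([1] * 16, [0.1] * 16, [0.0] * 16)
print(obs.shape)  # (48,)
```

A trained policy would map this vector to the next set of joint targets, closing the touch-only control loop without any camera input.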
Real-Life Testing and Object Rotation Performance
After successful simulation training, the researchers tested the system with a physical robotic hand on unfamiliar objects. The robotic hand was able to rotate different objects, such as a tomato, a pepper, a can of peanut butter, and a toy rubber duck (the most challenging due to its shape), without stalling or losing its grip. While objects with more complex shapes required more time for rotation, the robotic hand was still able to rotate them around different axes.
In the future, Wang and his team plan to expand their approach to tackle more intricate manipulation tasks, like enabling robotic hands to catch, throw, and juggle objects. The ultimate objective is to equip robots with in-hand dexterity, a skill that comes naturally to humans but poses significant challenges for robots to master.
Accomplishing this would greatly enhance the range of tasks that robots can perform. The research paper, titled “Rotating without Seeing: Towards In-hand Dexterity through Touch,” lists co-authors Binghao Huang and Yuzhe Qin of UC San Diego, and Zhao-Heng Yin and Qifeng Chen of HKUST; the paper notes equal contributions among some of the authors.