Tag: Touch

  • Humans Have a Seventh Sense called Remote Touch


    Science has long acknowledged five core senses—sight, hearing, smell, taste, and touch—but ongoing research suggests this list may be incomplete.
Image Credits: pplware


A recent study from Queen Mary University of London and UCL suggests humans have a seventh sense, “remote touch,” allowing object detection without contact.

    Nature’s Inspiration: How Birds Detect Hidden Prey

    The idea was inspired by birds like sandpipers, which detect prey hidden under sand.

    By sensing subtle mechanical signals transmitted through the grains, these birds can detect hidden objects in their environment.

    Psychologist Elisabetta Versace of the Prepared Minds Lab conducted experiments showing humans could detect a cube buried under sand without touching it, revealing extraordinary tactile sensitivity.

In a follow-up, researchers used a robotic arm with tactile sensors and a long short-term memory (LSTM) model to identify the same object.

    Humans Outperform Robots in Tactile Detection

The findings were striking: human participants outperformed the robotic system by roughly 30 percentage points. While people reached an average accuracy of 70.7%, the robot generated numerous false detections and achieved only 40% accuracy overall.
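To make the robot side of this comparison concrete, here is a minimal, self-contained sketch of an LSTM-style detector run over a tactile time series. Everything in it is assumed for illustration: the input size, hidden size, and readout are hypothetical, the weights are random and untrained, and the study's actual model and sensor format are not reproduced in this article.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class TinyLSTM:
    """Minimal LSTM cell plus a binary readout (hypothetical shapes)."""

    def __init__(self, n_in, n_hidden):
        s = 1.0 / np.sqrt(n_hidden)
        # one stacked weight matrix for the input, forget, cell, output gates
        self.W = rng.uniform(-s, s, (4 * n_hidden, n_in + n_hidden))
        self.b = np.zeros(4 * n_hidden)
        self.w_out = rng.uniform(-s, s, n_hidden)
        self.n_hidden = n_hidden

    def detect(self, seq):
        """seq: (T, n_in) tactile samples -> probability an object is present."""
        h = np.zeros(self.n_hidden)
        c = np.zeros(self.n_hidden)
        for x in seq:
            z = self.W @ np.concatenate([x, h]) + self.b
            i, f, g, o = np.split(z, 4)
            c = sigmoid(f) * c + sigmoid(i) * np.tanh(g)
            h = sigmoid(o) * np.tanh(c)
        return sigmoid(self.w_out @ h)

# 50 time steps of a 3-axis force signal from a fingertip dragged through sand
model = TinyLSTM(n_in=3, n_hidden=16)
p = model.detect(rng.normal(size=(50, 3)))
print(round(float(p), 3))  # untrained, so this is just a score in (0, 1)
```

In the actual study such a model would be trained on labeled sweeps (object present vs. absent), and its false-detection rate is what the human participants beat.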

Zhengqi Chen, a PhD student at Queen Mary University of London, said the discovery could inspire tools that enhance human touch.

    Applications include detecting artifacts without damage and exploring sandy terrains like Mars or the ocean floor.

    Bridging Psychology, Robotics, and AI

    Lorenzo Jamone from University College London emphasized the value of integrating psychology, robotics, and artificial intelligence. “Human experiments guided the robots’ training, and the robots’ feedback helped us reinterpret the human data,” he said.

The study, “Exploring Tactile Perception for Object Localization in Granular Media,” published on IEEE Xplore, could reshape our understanding of human touch.


    Read the original article on: Pplware

Read more: Scientists Found a Way to Reduce Heat Without Obstructing the View

  • New Haptic Technology Brings the Sensation of Touch to Virtual Reality Experiences


    USC researchers have created a wearable system that allows for more natural and emotionally rich interactions in shared virtual environments, expanding opportunities in remote work, education, healthcare, and more.
    Image Credits: Premankur Banerjee


    Restoring the Power of Touch in a Digital World

    Touch is essential to human communication and connection, helping build trust, regulate stress, and form emotional bonds from infancy through adulthood. Yet in today’s digital world, where many interactions happen through screens, physical contact is often absent.

To address this, researchers at the USC Viterbi School of Engineering have created a wearable haptic system that allows users to share and feel physical gestures—like handshakes, pats, and squeezes—in virtual reality, even across long distances. They detailed their work in a paper published on the arXiv preprint server.

    Wearable Devices Bring Realistic Touch to Virtual Interaction

    The system features gloves and sleeves equipped with small vibration motors that mimic pressure and motion, enabling users to engage with both virtual objects and each other through realistic touch feedback.

    A user study, also presented at the IEEE World Haptics Conference, found that participants felt more connected and engaged when they could physically feel virtual gestures.

“Even as people spend more time socializing online, we’re seeing increased rates of depression, anxiety, and what’s known as ‘touch starvation,’” said Heather Culbertson, associate professor at USC Viterbi and senior author of the study. “Virtual interactions are here to stay—but we need to make them better mirror the emotional benefits of real-life experiences.”

Image Credits: Premankur Banerjee

    The system allows up to 16 users to connect at once, each represented by a full-body 3D avatar that mimics their real-world movements in a shared virtual space. Unlike video calls, users can move freely, interact with each other, and engage with virtual objects—such as passing items or collaborating on tasks.

“This project came from a simple but powerful human need—to feel close to those we miss,” said Premankur Banerjee, a PhD student in Heather Culbertson’s Haptics Robotics and Virtual Interaction Lab and the study’s lead author.

    Making Long-Distance Communication Feel Close

“After spending over five years away from my own family, this work became personal. It’s not just about creating a sense of presence, but about bringing back the feeling of physical closeness in long-distance communication,” he said.

    To recreate touch, users wear gloves and armbands with vibration motors that simulate motion and pressure, allowing them to feel gestures and interactions in VR.
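The article does not specify how virtual contact is mapped onto the vibration motors, but one common approach is a distance-based falloff: each virtual contact drives nearby motors harder than distant ones. The sketch below illustrates that idea; the motor layout, falloff constant, and function name are all hypothetical, not taken from the USC system.

```python
import numpy as np

def gesture_to_motor_intensities(contacts, motor_positions, falloff=0.05):
    """Map virtual contact events to per-motor drive levels in [0, 1].

    contacts: list of (xyz position in meters, pressure in [0, 1])
              reported by the VR scene.
    motor_positions: (N, 3) array of vibration-motor locations on the glove.
    Each motor responds to its nearest contact, decaying with distance.
    """
    levels = np.zeros(len(motor_positions))
    for pos, pressure in contacts:
        d = np.linalg.norm(motor_positions - np.asarray(pos), axis=1)
        levels = np.maximum(levels, pressure * np.exp(-d / falloff))
    return np.clip(levels, 0.0, 1.0)

# A firm squeeze near the palm: the palm motor fires strongly,
# the fingertip motors only faintly. (Hypothetical layout.)
motors = np.array([[0.00, 0.00, 0.0],   # palm
                   [0.08, 0.00, 0.0],   # index fingertip
                   [0.08, 0.03, 0.0]])  # middle fingertip
levels = gesture_to_motor_intensities([((0.005, 0.0, 0.0), 0.9)], motors)
```

A real driver would then convert these levels into PWM duty cycles for the individual motors; that layer is hardware-specific and omitted here.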

    Tests showed that participants experienced a stronger sense of connection and presence with tactile feedback. The study also examined how gesture speed and vibration type affect emotional and sensory perception, offering insights for designing more immersive touch-based experiences.

    Merging Science and Emotion in Communication Technology

“Building this kind of technology demands collaboration across disciplines,” said Culbertson. “Our team integrates computer science, engineering, neuroscience, psychology, and social science to develop tools that are not just technically effective, but also enable emotionally rich social interaction.”

    The global move toward online communication—accelerated by the COVID-19 pandemic—has offered great convenience but also led to unintended effects. Despite being more digitally connected than ever, many people, especially younger generations, continue to struggle with loneliness, anxiety, and depression.

“Platforms like Zoom and FaceTime help us stay visually and verbally connected, but they lack the physical interaction that humans naturally need,” said Culbertson.

    She emphasized that while this technology can’t replace real-life contact, it can meaningfully enhance social interaction when being together in person isn’t possible.

    Enhancing Care, Collaboration, and Closeness Across Distances

    In hospitals and long-term care settings, the system could help patients and loved ones share comforting physical gestures across distances. In remote work or learning environments, it enables more immersive, collaborative engagement. For families and friends separated by travel or deployment, it helps restore a deeper sense of closeness.

“Touch is essential to human well-being. While technology can’t fully replicate it, bringing tactile experiences into virtual spaces is an important step toward more emotionally connected digital communication,” Culbertson said.


    Read the original article on: Techxplore

Read more: Humanoid Robots Symbolize China’s Ambitions in AI

  • A Robotic Hand Employs Touch, Rather Than Vision, to Manipulate and Rotate Objects


Scientists at the University of California San Diego have developed a novel approach, inspired by human dexterity, to enable a robotic hand to rotate objects solely through touch, eliminating the need for visual input.
A robotic hand. Credit: Pixabay


    The team equipped a four-fingered robotic hand with 16 touch sensors on its palm and fingers. These low-cost, low-resolution touch sensors, each costing around $12, can detect whether an object is in contact with them or not, providing simple binary signals.

    The robotic hand utilizes this touch-based information to smoothly rotate a wide range of objects, including small toys, cans, fruits, and vegetables, without causing damage.

    Enabling Robots to Manipulate Objects in Low-Light and Vision-Limited Environments

    This innovative technique shows promise in enabling robots to manipulate objects in darkness or environments where visual perception is limited. The team presented their work at the 2023 Robotics: Science and Systems Conference, highlighting the potential applications of their touch-based rotational method.

    In contrast to other approaches that rely on a few high-resolution touch sensors placed at the fingertips, this method disperses many low-cost sensors across a larger area of the robotic hand, offering unique advantages and versatility.

Xiaolong Wang, a professor of electrical and computer engineering at UC San Diego and the lead researcher of the study, pointed out several issues with current methods of robotic hand manipulation.

    Challenges in Robotic Hand Sensing and Perception

    Firstly, using a limited number of sensors on the robotic hand reduces the likelihood of contact with objects, thus restricting the system’s ability to sense its surroundings. Secondly, the complexity and cost of simulating high-resolution touch sensors that provide texture information make them impractical for real-world experiments. Lastly, many existing approaches heavily rely on visual feedback.

    To overcome these challenges, Wang and his research team propose a simple solution. They demonstrate that detailed texture information about an object is unnecessary for the task at hand. Instead, they find that binary signals indicating whether the sensors have made contact with the object or not are sufficient and much easier to simulate and implement in real-world scenarios.

    Advantages of a Comprehensive Array of Binary Touch Sensors for Robotic Object Rotation

    The researchers emphasize that using a comprehensive array of binary touch sensors provides enough data about the object’s 3D structure and orientation, enabling the robotic hand to rotate objects effectively without relying on visual cues.

    To train their system, the team utilized simulations of a virtual robotic hand manipulating various objects, including irregularly shaped ones.

    The system tracks which sensors on the hand make contact with the object during rotation, along with the positions and previous movements of the hand’s joints. Using this information, the system guides the robotic hand on the necessary joint movements for the next steps in the rotation process.
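The control loop described above (binary contact readings plus joint positions and recent movements in, next joint movements out) can be sketched as follows. This is a stand-in, not the team's code: the observation layout follows the description, but the linear policy, dimensions, and movement bound are assumptions; the real system uses a neural-network policy trained in simulation.

```python
import numpy as np

N_SENSORS, N_JOINTS, HISTORY = 16, 16, 3   # 16 binary pads; joint count assumed

def observation(contacts, joint_pos, past_actions):
    """Stack binary contact flags with proprioception, as described."""
    return np.concatenate([contacts.astype(float),
                           joint_pos,
                           past_actions.ravel()])

def policy_step(obs, W, b, max_delta=0.05):
    """One control step: observation -> small, bounded joint movements.
    A linear map stands in for the trained network; the interface is the point."""
    return np.tanh(W @ obs + b) * max_delta

rng = np.random.default_rng(1)
obs = observation(rng.integers(0, 2, N_SENSORS),                  # which pads touch
                  rng.uniform(-1.0, 1.0, N_JOINTS),               # joint positions
                  rng.uniform(-0.05, 0.05, (HISTORY, N_JOINTS)))  # recent moves
W = rng.normal(scale=0.1, size=(N_JOINTS, obs.size))
delta = policy_step(obs, W, np.zeros(N_JOINTS))                   # next joint deltas
```

Running this step repeatedly, with each output fed back into the action history, is what lets the hand keep rotating an object it never sees.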

    Real-Life Testing and Object Rotation Performance

    After successful simulation training, the researchers tested the system with a physical robotic hand on unfamiliar objects. The robotic hand was able to rotate different objects, such as a tomato, pepper, a can of peanut butter, and a toy rubber duck (the most challenging due to its shape), without stalling or losing its grip. While objects with more complex shapes required more time for rotation, the robotic hand was still able to rotate them around different axes.

    In the future, Wang and his team plan to expand their approach to tackle more intricate manipulation tasks, like enabling robotic hands to catch, throw, and juggle objects. The ultimate objective is to equip robots with in-hand dexterity, a skill that comes naturally to humans but poses significant challenges for robots to master.

Accomplishing this would greatly enhance the range of tasks that robots can perform. The research paper, titled “Rotating without Seeing: Towards In-hand Dexterity through Touch,” lists as co-authors Binghao Huang and Yuzhe Qin of UC San Diego, and Zhao-Heng Yin and Qifeng Chen of HKUST.


    Read the original article on Science Daily.

    Read more: Enabling Autonomous Exploration for Robots.