Tag: Robots

  • Technology that Allows Robots to Grasp Human Intentions could Improve Their Safety and Intelligence

    Image Credits: Unsplash/CC0 Public Domain

    Robots are increasingly present in daily life, from medical settings to helping at home. However, for people to fully trust and work with them, robots must do more than perform tasks—they must understand humans.

    This challenge is central to new research led by Dr. Mehdi Hellou through PRIMI, a project focused on helping robots develop a “theory of mind”—the ability to infer people’s beliefs, preferences, and intentions. The goal is to develop autonomous systems that anticipate when help is needed, adapt their behavior over time, and interact in a more socially intelligent way.

    To make this possible, the team draws on psychology, neuroscience, and artificial intelligence, designing robots that integrate motor intelligence (how they move) with cognitive intelligence (how they think and reason).

    Adaptive Robots for Everyday and High-Risk Use

    As Dr. Hellou explains, developing autonomous systems that support people in everyday life—and in high-risk contexts like health care or nuclear waste management—requires machines that can adapt to different users and environments.

    The team’s most recent study appears in the journal ACM Transactions on Human-Robot Interaction.

    The project will test its approach in clinical pilot trials for stroke rehabilitation, where humanoid robots will support patients during recovery.

    If the results are positive, PRIMI could pave the way for a new wave of socially intelligent robots that learn in real time and feel more approachable and dependable.


    Read the original article on: Tech Xplore

    Read more: Classical Indian Dance is Inspiring New Ways to Teach Robots Hand Movements

  • At a Silicon Valley Event, Robots Do the Laundry as Investors Back the Technology

    Robots from across the globe gathered in Silicon Valley to showcase a possible vision of the future.
    Image Credits: 2025 Los Angeles Times

    Two robots used orange-tipped claws to pick up T-shirts, folding and stacking them neatly. A small, cheerful robot with bright eyes formed a heart with its mechanical hands, while another tiny robot wearing a bear hat threw playful punches. A blue-green robot, styled like an anime character, moved its head and arms.

    A childlike robot designed for teaching shared a message:

    “By teaming up, humans and robots can solve big problems like making education more accessible, caring for people, and protecting our planet,” said Codey, a robot from Mind Children, a startup based in Washington state.

    These robots, along with around 2,000 attendees, participated in the two-day Humanoids Summit at the Computer History Museum in Mountain View. Unlike standard industrial robots, humanoid robots resemble humans and replicate human movements.

    The robots are coming … to fold laundry. Credits: Los Angeles Times

    Summit insights and market developments

    The event brought together robotics companies from the United States, China, Japan, and other countries.

    It included presentations from representatives of Google, Disney, and Boston Dynamics, alongside demonstrations from California startups like Weave Robotics, Dyna Robotics, and Psyonic.

    The summit was organized by California-based venture capital firm ALM Ventures. With investors increasingly funding robotics companies, the race to integrate AI into physical robots that can interact with humans in real-world settings has intensified.

    By early December, U.S. humanoid robotics startups had raised nearly $2.8 billion in venture capital in 2025, a sharp increase from $42.6 million in 2020, according to PitchBook data. California-based humanoid robotics companies received the bulk of that investment, totaling around $1.6 billion.

    Figure, an AI robotics company based in San Jose that created a robot capable of handling dishes, laundry, and other household chores, announced in September that it had raised over $1 billion in funding and reached a valuation of $39 billion.

    Expanding Roles and Innovations in Robotics

    Robots have been developed for a wide range of tasks, including lifting heavy items in warehouses, assisting customers in stores, supporting medical professionals, performing on battlefields, and entertaining visitors at theme parks.

    Startups are also working on building the essential components for robots, such as hands, sensors, and cameras. Meanwhile, tech leaders have made ambitious predictions about their potential.

    This year, Elon Musk claimed that Tesla’s humanoid robot, Optimus, could “eliminate poverty,” outperform humans in productivity, and boost the global economy.

    However, some analysts caution that robots are still far from meeting these expectations and question whether they will be genuinely useful for businesses or consumers.

    “They’re impractical. They have limited capabilities. They’re not nearly as intelligent as they appear in demonstrations,” said Bill Ray, analyst and chief of research at Gartner.

    There are additional worries that robots could replace human jobs and infringe on privacy.

    Robots as Human Partners

    Robot developers, however, emphasize that their products are intended to assist humans rather than replace them.

    Modar Alaoui, founder and general partner of ALM Ventures, believes that robots will first gain traction in manufacturing. His firm recently launched a $100 million early-stage fund, part of which is earmarked for humanoid robot startups.

    He explained that robots are expected to take over the repetitive, hazardous, and monotonous tasks that need to be done daily. “This shift happens naturally, moving from simple automation to advanced, intelligent automation,” he said.

    The Humanoids Summit highlighted the current technical limits of robots. Most of the machines on display are not fully autonomous; many still rely on pre-programmed actions or human control.

    Image Credits: 2025 Los Angeles Times

    Market Development and Real-World Implementation

    The market for human-like robots is expected to expand significantly. Morgan Stanley Research predicts that by 2050, the humanoid robot market could reach $5 trillion—potentially double the size of the automotive industry—and estimates that over 1 billion humanoids could be in operation by then.

    In 2024, a humanoid robot cost around $200,000 in high-income countries, but Morgan Stanley projects that price could drop to $50,000 by 2050 as technology improves and production scales up.

    Weave Robotics Unveils Laundry-Folding Robots

    California-based startup Weave Robotics, known for its laundry-folding robot, has begun placing its machines in laundromats. The company, founded by former Apple engineers Evan Wineland and Kaan Dogrusoz, plans to launch a new robot called Isaac next year, designed to fold laundry and help tidy homes.

    Ahead of an upcoming conference, one of the company’s robots was on display at Sea Breeze Cleaners in San Francisco, folding shirts behind a large window facing the Noe Valley sidewalk.

    The unusual sight drew attention from passersby, who stopped to watch and take photos. While the AI-powered robot folds clothes more slowly than humans, it works methodically, tackling one pile at a time.

    The company and Sea Breeze Cleaners partnered with Tumble, an on-demand laundry delivery service that uses robots to speed up the laundry process.

    Robots for Work, Not Joy

    Kay Astorga, who co-owns Sea Breeze Cleaners with her husband, says the robot has helped draw new customers to their laundromat.

    Working alongside the robot has made her realize she prefers machines that look more mechanical than human, similar to the Disney-Pixar character WALL-E. She doesn’t want robots handling tasks that bring joy, like baking.

    “I don’t want a croissant made by a robot,” she said. “But a shirt folded by a robot? That’s fine with me.”

    While other California companies, such as Figure and 1X Technologies, are developing advanced home robots with human-like forms and legs, Weave Robotics’ laundry-folding machine doesn’t require a full humanoid body. This keeps installation costs below $10,000 and ensures operating expenses remain “extremely low,” according to Wineland.

    The company is also in discussions with businesses in manufacturing and hospitality. It plans to install a third robot at a laundromat in Walnut Creek early next year.

    Weave Robotics’ forthcoming home robot, named after science fiction writer Isaac Asimov, will be more expensive because it will be mobile, feature wheels, and include premium functions. The company envisions users interacting with it via an app to give commands.

    Robots for Dangerous and Specialized Tasks

    Some robots handle hazardous tasks that workers prefer to avoid.

    Agility Robotics, based in Oregon with an office in San Jose, has been deploying its two-legged robot, Digit, in warehouses as well as in manufacturing and logistics operations.

    “There’s a lot of manual work involving heavy lifting, and people can get cut or injured,” said Pras Velagapudi, Agility Robotics’ chief technology officer.

    Digit, which has claw-like grippers instead of hands, can lift up to 35 pounds. Companies such as Amazon have employed the robot for repetitive tasks like moving empty totes.

    Agility charges companies for the labor provided by its robots, and, like others in the industry, the company installs protective barriers around the machines for safety.

    In California, startups are also developing components for robots, and sometimes for human use as well.

    At a recent summit, San Diego-based startup Psyonic showcased robotic hands mounted on multiple arms resembling the Spider-Man villain Doctor Octopus. Psyonic makes the bionic ‘Ability Hand,’ which robots use and humans with limb loss wear. Sensors in the hand allow users to feel touch when gripping objects.

    Aadeel Akhtar, CEO and co-founder of Psyonic, shared that his inspiration for developing bionic limbs came from meeting a girl with a missing limb during a childhood trip to Pakistan with his parents. The company raised funds through equity crowdfunding and the TV show Shark Tank and is now working on prototypes for both arms and legs.

    Looking ahead, Akhtar expects robots to become increasingly common.

    “They will be more integrated into society,” he said. “It’s no longer such a novel idea.”


    Read the original article on: Tech Xplore

    Read more: Scientists Made a Powerful Glue from Recycled Cooking Oil that can Pull a Car

  • Scientists Created a Swarm of Shape-Shifting Mini Robots

    Scientists have developed an unusual material that can be rigid when necessary yet soft and moldable when needed. What makes it remarkable is that tiny robots work together to form it, rather than using a traditional material.
    Image Credits: Tempo

    Tiny Robots That Act Like a Living Material

    While it may sound like science fiction, the concept is surprisingly straightforward. Researchers from UC Santa Barbara and TU Dresden created hockey-puck-sized robots that act as a coordinated, responsive material. As a group, they can shift shape, become solid, flow like a liquid, and even repair themselves.

    The team based their design on embryonic development, studying how cells organize and respond to signals to create tissues and organs. By imitating these cellular behaviors, the robots are able to coordinate their actions.

    “Embryonic tissues are nature’s ultimate smart materials,” said researcher Otger Campàs. “They can shape themselves, heal, and precisely control their mechanical forces over space and time.”

    The research showed that robotic collectives can mimic the way living tissues change shape, paving the way for self-morphing materials. Image Credits: Tempo

    Much like cells that push, pull, and cling together to shape the body, these robots exert forces on one another, move in sync, and link up to maintain a specific form.

    Rather than using muscles, each robot is equipped with eight motorized gears around its edge, allowing it to maneuver around neighboring robots and reposition itself within confined spaces.

    Light sensors coordinate the robots, making them turn together like cells responding to chemical signals. Magnets allow them to connect when necessary, letting the material shift between soft and rigid states.

    Small Signal Changes, Big Shifts in Behavior

    As the system operated, researchers found that the material’s behavior depended not only on the robots themselves but on variations in their signaling. Subtle differences in movement patterns determined whether the collective acted as a solid structure or flowed like a liquid.

    “We had previously shown that in living embryos, variations in the forces produced by cells play a key role in turning solid tissue into fluid tissue,” Campàs said. “So we programmed those force variations into the robots.”

    For now, the setup remains a proof of concept involving only a limited number of robots. Researchers believe the system could be scaled and miniaturized for self-assembling structures, smart materials, and medical applications.


    Read the original article on: Tempo

    Read more: Intense Exercise Improves Muscle Performance in Patients With a Rare Autoimmune Condition

  • Classical Indian Dance is Inspiring New Ways to Teach Robots Hand Movements

    Researchers at the University of Maryland, Baltimore County (UMBC) have analyzed the precise hand gestures in Bharatanatyam, a classical Indian dance, uncovering a more complex "alphabet" of movement than typical hand grasps. This research could enhance robot hand movement training and provide improved tools for physical therapy.
    Ashwathi Menon, co-captain of UMBC’s Indian fusion dance team, helps demo some of the technology in the lab. Here, she demonstrates the Katakamukha mudra as a robotic hand mimics her gesture. Parthan Olikkal, a graduate student working on the project, is in the background. Image Credits: Brad Ziegler / UMBC

    Ramana Vinjamuri, a professor at UMBC and the lead researcher, has dedicated his lab to understanding how the brain controls complex hand movements. Over a decade ago, he and his collaborators began exploring and cataloging the fundamental components of hand motions, using the concept of kinematic synergies, where the brain coordinates multiple joint movements to simplify complex actions.

    This idea allows for the breakdown of a wide range of movements into a limited set of basic units, much like how the English language’s vast vocabulary can be formed from just 26 letters.

    Inspired by Ancient Dance to Unlock ‘Superhuman’ Movements

    Inspiration for further research came during a 2023 conference on the brain at the Indian Institute of Technology Mandi, located in the Himalayan foothills. While brainstorming how ancient Indian traditions could address modern challenges, Vinjamuri came up with a unique way to derive these foundational units from the precise hand gestures, or mudras, used in Indian classical dance to convey storytelling.

    “We observed that dancers age with remarkable grace—they stay flexible and agile due to their training,” says Vinjamuri. “This insight inspired us to explore more complex movement systems. With dance, we’re not just studying healthy movement, but exceptionally healthy movement. So, the question became: could we discover a ‘superhuman’ alphabet through dance gestures?”

    Image Credits: Scientific Reports (2025). DOI: 10.1038/s41598-025-25563-7

    Unrestricted vs. Controlled Movements

    In their newly published research, Vinjamuri and his students began by examining a dataset of 30 natural hand grasps used to pick up objects of various sizes, from large water bottles to small beads. They identified six synergies, similar to an alphabet of six letters, which, when combined, could explain nearly 99% of the movement variations in the dataset.

    Using the same methods, the team then analyzed 30 single-hand mudras. They discovered six synergies that accounted for about 94% of the variations in these gestures.
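    The synergy analysis the team describes is, in essence, dimensionality reduction over recorded joint angles. As an illustration only (synthetic data and an invented joint count, not the authors' actual pipeline), principal component analysis can recover how many movement "letters" are needed to explain a target share of the variance:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for motion capture: 30 hand poses x 20 joint angles.
# Real data would come from a data glove or camera-based hand tracker.
n_poses, n_joints = 30, 20
latent = rng.normal(size=(n_poses, 6))        # 6 hidden "synergies"
mixing = rng.normal(size=(6, n_joints))       # how each synergy drives the joints
angles = latent @ mixing + 0.05 * rng.normal(size=(n_poses, n_joints))

# PCA via SVD on the mean-centered data.
centered = angles - angles.mean(axis=0)
_, s, _ = np.linalg.svd(centered, full_matrices=False)
explained = (s ** 2) / np.sum(s ** 2)         # variance share per component

# Smallest "alphabet" of components covering 99% of the movement variance.
n_synergies = int(np.searchsorted(np.cumsum(explained), 0.99) + 1)
print(n_synergies)
```

    Reconstructing a new gesture is then just a weighted sum of those components, mirroring how the study rebuilds sign-language letters from grasp-derived or mudra-derived synergies.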

    Importantly, the team tested how effectively the six synergies from natural hand grasps could create unrelated hand motions—specifically, 15 letters of the American Sign Language alphabet—compared to those derived from the mudras. The mudras-derived synergies significantly outperformed those from the natural grasps in this task.

    “When we began this research over 15 years ago, we asked ourselves: Can we discover a universal ‘golden alphabet’ that can recreate any movement?” says Vinjamuri. “Now, I’m not sure such a thing exists. However, the mudra-derived alphabet is definitely superior to the natural grasp alphabet because it offers greater dexterity and flexibility.”

    In the long run, Vinjamuri envisions creating libraries of task-specific alphabets that could be used depending on the requirements—whether for everyday activities like cooking or folding laundry, or more intricate tasks such as playing a musical instrument.

    Assistive Robotic Hands

    The team is currently developing methods to “teach” robotic hands the movement alphabets and how to combine them to create new gestures. This approach shifts away from traditional methods where robots simply mimic hand movements, focusing instead on how the human body and brain function.

    The researchers are testing these techniques on both a standalone robotic hand and a humanoid robot, each of which requires a tailored approach to translating the synergies’ mathematical representations into physical motion.

    Additionally, the team has made significant progress in creating cost-effective and practical ways to test and apply their ideas. They use a basic camera and software system to capture, record, and analyze movements—an essential step toward developing affordable technologies that could be used at home, such as a virtual system to guide people through physical therapy, according to Vinjamuri.

    “Once I discovered synergies, I became really curious to see if we could use them to make a robotic hand move and perform just like a human hand,” says Parthan Olikkal, a longtime member of Vinjamuri’s lab and a Ph.D. candidate in computer science. “Contributing my own work to the research and seeing the results has been incredibly rewarding.”


    Read the original article on: Tech Xplore

    Read more: An Aerial Microrobot Can Reach Speeds Similar To a Bumblebee’s

  • Using Magnetic Fields, Soft Robots Can Now Operate More Intelligently On Their Own

    The magnetically actuated manta ray robot is equipped with flexible batteries, soft magnetic elastomer actuators and a lightweight hybrid circuit for sensing and wireless communication. Image Credits: College of Design and Engineering at NUS

    Soft robots excel at flexing and handling delicate objects, which lets them work in tight or fragile environments, such as tending coral larvae in labs or inspecting chemical plant piping. Yet true embodied intelligence, where sensing, actuation, and power all function together without external tethering, remains difficult to achieve in these robots.

    Magnetic Fields Boost Soft Robot Batteries

    Flexible materials can bend and adjust to their surroundings, but their power sources cannot. Traditional batteries tend to rigidify a soft robot’s structure, run out of energy quickly, or deteriorate when stretched, which keeps these robots tethered or short-lived.

    Assistant Professor Wu Changsheng and his team from the National University of Singapore’s Departments of Materials Science and Engineering, and Electrical and Computer Engineering, have converted this drawback into a strength. In a study published in Science Advances, they show that the same magnetic fields used to maneuver soft robots can also boost the performance of the onboard batteries.

    “Magnetic fields are usually applied to drive movement in soft robots—known as actuation—but we discovered they can also stabilize the electrochemical processes inside flexible batteries,” Asst Prof Wu said. “By letting actuation and energy management rely on the same physical principle, we can make the robot genuinely self-sufficient and efficient.”

    Vertically Stacked Batteries Mimic Manta Ray Efficiency

    The researchers created bendable zinc–manganese dioxide (Zn–MnO₂) batteries encased in soft silicone and arranged them in a vertical stack inside a manta ray–inspired robot. Crucially, this upright configuration—rather than the usual side-by-side layout—saves space and preserves the robot’s flexibility.

    “We drew inspiration from the manta ray because its body naturally integrates movement, sensing, and energy use in the way we aim to replicate,” Asst Prof Wu explained. “Its anatomy enables coordinated multifunctionality in a compact, efficient form—an ideal biological template for embodied intelligence.”

    Experiments showed that the magnetic field generated by the robot’s own ferromagnetic actuators helped stabilize the batteries’ internal electrochemistry, lowering the chance of dendrite formation—needle-like metal structures that can trigger short circuits—and preserving power output even after repeated deformation. With magnetic enhancement, the batteries kept 57.3% of their capacity after 200 cycles, nearly twice that of batteries without the magnetic boost.

    “Further analysis revealed the mechanism behind this improvement. The magnetic field produces a Lorentz force on the moving ions in the electrolyte, altering the paths of zinc ions during plating. This creates a more uniform ion flow, encouraging even zinc deposition on the anode and effectively preventing dendrite growth.”

    “At the same time, the magnetic field oriented the electron spins in the manganese oxide lattice, strengthening atomic bonds and protecting the crystal structure from breaking down during charging and discharging,” said Xiao Xiao, a Ph.D. student in Dr. Wu’s group and a co–first author of the study.

    “This combined magneto-electrochemical stabilization, achieved in a completely flexible design, marks a promising advance toward long-lasting onboard power systems for soft robots working in demanding, ever-changing conditions.”
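    For reference, the deflection described above is the standard Lorentz force: an ion of charge $q$ drifting with velocity $\mathbf{v}$ through a magnetic field $\mathbf{B}$ experiences

$$
\mathbf{F} = q\,\mathbf{v} \times \mathbf{B},
$$

    which stirs the zinc-ion flow near the electrode and, as the authors report, promotes more uniform plating.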

    Intelligence Built Into The Body

    To showcase the idea, the researchers created a magnetically driven manta ray robot that incorporates flexible batteries, soft magnetic-elastomer actuators, and a lightweight hybrid sensing and wireless-communication circuit. Its fins move in response to magnetic fields produced by an external coil or array of electromagnets, allowing the robot to maintain stable movement and adjust to varying water conditions.

    As anticipated, the magnetic fields used to propel and guide the robot also help stabilize its energy supply—validating the team’s goal of integrating motion control with power regulation. The robot can carry out fundamental swimming actions, including straight-line movement, sharp 90-degree turns, and more intricate paths, all while sending real-time data to a computer that renders its behavior in a digital-twin model.

    In this framework, the robot demonstrated autonomous responses. When it approached an obstacle, its onboard inertial sensors registered abrupt shifts in acceleration, triggering the control system to reorient and choose an alternate route. It effectively maneuvered through tight spaces by adjusting its posture and performed U-turns when it encountered barriers it could not bypass.

    Integrated Systems Enable Stable, Responsive Movement

    During disturbance tests, the feedback controller quickly corrected shifts in yaw, pitch, and roll caused by waves or contact, keeping its course steady. Built-in temperature sensors also allowed it to monitor the environment, producing maps of thermal variations in aquatic settings.

    “By embedding actuation, sensing, and power systems throughout the robot’s structure, we can optimize its functional surface area while maintaining its softness,” explained Asst. Prof. Wu. “This approach allows the robot to move, sense, and react to its surroundings instantly.”

    Looking forward, the team aims to broaden the robot’s sensing abilities by integrating compact sensors, such as ultrasonics for environmental awareness or chemical sensors for monitoring water quality. They are also investigating how magnetic enhancement could benefit other battery types, like lithium-ion, or alternative forms, such as wearable battery fibers, to boost energy density and extend operational time.

    “Our goal is to create soft robots capable of autonomous thought and action in challenging or hard-to-reach environments—whether inspecting pipelines, observing marine ecosystems, or assisting in surgical procedures,” said Asst. Prof. Wu.

    “By thinking creatively and critically about how energy and intelligence are embedded in the body, we can bring soft robotics closer to the elegance of nature—much like the fluid, majestic movements of a manta ray.”

    Asst. Prof. Wu conducted this research in partnership with teams from Tsinghua University, the University of California, Los Angeles (UCLA), and Dartmouth College.


    Read the original article on: Tech Xplore

    Read more: The Most Peculiar Robot in Beijing may hold the Greatest Significance of the Decade

  • Robots Can Handle Unsecured Loads Thanks To Tactile Sensors

    Moving into a new home often feels like tackling a massive, three-dimensional puzzle when packing the moving truck. Every item must fit perfectly, and any instability or imbalance can make it shift and damage itself during the move.
    Image Credits: Carnegie Mellon University Mechanical Engineering

    Load Balancing: Humans vs. Robots

    Humans naturally balance objects—whether a tray of food or a stack of boxes—through the coordination of their muscles and inner ear. For robots, however, staying balanced while carrying loads is much more complicated, requiring constant monitoring of both their own position and the object’s to make real-time adjustments.

    To address this, researchers at Carnegie Mellon University’s Department of Mechanical Engineering have created a tactile sensor that allows a four-legged robot to transport unsecured cylindrical objects over long distances.

    Previously, quadrupedal robots relied on containers to hold items, restricting the types of objects they could move. LocoTouch, a high-density tactile sensor array covering the robot’s entire back, provides feedback on the object’s position, enabling the robot to adjust its movements and keep the load stable.

    “The tactile sensor uses a piezoresistive film sandwiched between conductive fabric electrodes,” explained Changyi Lin, a Ph.D. candidate in the Safe AI Lab. “Each sensing element sits where the electrodes cross, so when an object shifts and bends the piezoresistive film, the electrodes detect the resulting change in resistance.”
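    Conceptually, reading such a crossbar means scanning every row-column crossing and converting each resistance drop into a local load estimate. A toy sketch (invented resistances and grid size, ignoring real-world crosstalk compensation) of locating an object's center of pressure:

```python
import numpy as np

def center_of_pressure(resistance, r_rest=10_000.0):
    """Estimate where a load sits on a piezoresistive crossbar.

    resistance: (rows, cols) array of measured resistances in ohms;
    pressure lowers resistance, so the conductance increase ~ local load.
    """
    conductance = 1.0 / resistance - 1.0 / r_rest
    load = np.clip(conductance, 0.0, None)    # discard negative noise
    total = load.sum()
    if total == 0:
        return None                           # nothing on the sensor
    rows, cols = np.indices(load.shape)
    return (float((rows * load).sum() / total),
            float((cols * load).sum() / total))

# Simulated 4x4 patch: a cylinder pressing hardest near element (1, 2)
# drops the local resistance from 10 kOhm to 2 kOhm (and 5 kOhm nearby).
r = np.full((4, 4), 10_000.0)
r[1, 2] = 2_000.0
r[1, 1] = 5_000.0
print(center_of_pressure(r))
```

    On the robot, a stream of such position estimates is what the controller compares against the body's own motion to decide how to rebalance the load.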

    Using over 4,000 digital twins of the robot dog combined with reinforcement learning, the team taught the robot to adapt to almost any movement of an object on its back. The abilities acquired in simulation transferred directly to the real world without additional fine-tuning. In lab tests, the robot navigated around cones, traversed obstacles, and responded to external disturbances—such as a person nudging the object—while carrying items of different shapes and sizes over a distance exceeding 60 meters.

    Image Credits: Carnegie Mellon University Mechanical Engineering

    Tactile Sensing: Giving Robots a Human-Like Touch

    “Robots are meant to assist humans, so they must perceive and interact with the world as we do. This is the first time tactile sensing has been implemented in quadrupedal robots, but it’s just the beginning,” said Ding Zhao, assistant professor of mechanical engineering. “With this feedback, robots will be able to perform more complex tasks. Our next goal is to scale the sensors to cover an entire robot.”

    The team believes this technology brings us closer to practical home-helper robots. They also envision outdoor applications, such as carrying sensors to remote areas to monitor landslides. Beyond that, it could assist in hospitals, factories, or even on a truck bed, making it easier to move objects.


    Read the original article on: Tech Xplore

    Read more: Automation and Smart, Data-Driven Tech are Transforming Construction’s Future

  • A Multi-Function Mimicking Neuron Moves Robots Closer to Human-Like Abilities

    Scientists have developed an artificial neuron that can imitate multiple brain regions, bringing us closer to robots that perceive and react to their surroundings much like humans.
    An electronic chip used to create an artificial transneuron – a tiny electronic circuit that replicates how brain cells pass signals between one another by generating small electrical pulses. Image Credits: Loughborough University

    The Power and Limits of Neuromorphic Neurons

    Artificial neurons—small electronic circuits that mimic how brain cells interact—are central to neuromorphic computing, which seeks to give machines human-like intelligence.

    However, current artificial neurons are limited to specific tasks, requiring thousands to perform even simple brain functions. This makes the process expensive and energy-intensive compared with the brain’s natural efficiency.

    Now, brain-like intelligence might be within reach, thanks to an international team led by Loughborough University, collaborating with researchers from the Salk Institute and the University of Southern California.

    In a recent paper, the researchers report that their single artificial neuron, called a “transneuron,” can take on the roles of brain cells involved in vision, planning, and movement—demonstrating a flexibility once considered unique to the human brain.

    Recreating the Human Brain with Transneurons

    The study, titled “Artificial transneurons emulate neuronal activity in different areas of brain cortex,” was published in Nature Communications.

    “Is the human brain an elusive device beyond our reach, or could we one day recreate it with electronics—and perhaps even surpass it?” asks Professor Sergey Saveliev, a theoretical physics expert at Loughborough University and the study’s corresponding author.

    “Our work moves us closer to answering this question. We’ve demonstrated that a single artificial neuron can be adjusted to mimic the behavior of visual, motor, and pre-motor neurons.”

    “This breakthrough could lead to electronic chips capable of executing complex, brain-like tasks—such as processing visual data and controlling movement—using only a few artificial neurons. In the long run, this brings us nearer to creating more human-like robots.”

    Electronic chips used to create artificial transneurons – tiny electronic circuits that replicate how brain cells pass signals between one another by generating small electrical pulses. They are pictured in front of the experimental setup used to capture how they responded to electrical input. Image Credits: Loughborough University

    Study Outcomes

    The researchers evaluated how closely their device replicates brain activity by sending electrical signals into the transneuron and measuring its output pulses. These were then compared to the electrical signals used by real brain cells, recorded from macaque monkeys.

    They concentrated on three brain regions: one responsible for vision, another for movement control, and a third involved in preparing actions. Each region generates a distinct pulse pattern—sometimes steady, sometimes irregular, and sometimes rapid bursts.

    Impressively, by fine-tuning the device’s electrical settings, a single transneuron was able to mimic all three pulse patterns with 70–100% accuracy.

    “Our brains are extremely efficient, capable of handling complex tasks like face recognition or movement control while consuming very little energy,” says Professor Alexander Balanov, Professor of Physics at Loughborough University.

    “By adjusting the electric circuit settings of our devices, such as altering the voltage, a single unit can mimic different types of brain neurons. Our artificial neurons also respond effectively to environmental changes, like pressure and temperature, which could enable artificial sensory systems.”

    “This technology could pave the way for future computers that are faster and more energy-efficient than today’s, as well as robots that can adapt their behavior in real time, much like living organisms.”

    Transneurons Compute Like Neurons

    Importantly, the researchers showed that the transneuron does more than mimic neuron behavior—it actually performs computations like real neurons.

    By altering the electrical signals fed into the device, the transneuron adjusted its pulse frequency, similar to how brain cells change their activity in response to incoming signals.

    When given two signals simultaneously, the transneuron reacted differently depending on whether the signals were synchronized or not, indicating it can distinguish between inputs—something that typically requires several artificial neurons working in concert.
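    This rate-and-timing behavior can be sketched with a textbook leaky integrate-and-fire neuron, a deliberately simplified stand-in for the transneuron (all parameters below are chosen purely for illustration):

    ```python
    def lif_spikes(inputs_a, inputs_b, steps=100, tau=10.0, weight=0.6, threshold=1.0):
        """Count output spikes of a leaky integrate-and-fire unit driven by two
        input spike trains (sets of time steps). Textbook abstraction only --
        not a model of the transneuron's memristor circuit."""
        v, spikes = 0.0, 0
        for t in range(steps):
            v *= 1.0 - 1.0 / tau          # leak: the potential decays each step
            if t in inputs_a:
                v += weight               # each input pulse nudges the potential up
            if t in inputs_b:
                v += weight
            if v >= threshold:            # threshold crossing -> output spike
                spikes += 1
                v = 0.0                   # reset after firing
        return spikes

    train = {10, 30, 50, 70}
    synchronized = lif_spikes(train, train)           # input pulses coincide
    staggered = lif_spikes(train, {15, 35, 55, 75})   # pulses offset by 5 steps
    ```

    In this toy run, the coincident trains drive more output spikes than the staggered ones, mirroring the input-timing sensitivity reported for the transneuron.
    
    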

    Mechanism of Artificial Transneurons

    Like other artificial neurons, the transneuron is a tiny electronic chip that imitates how brain cells communicate by generating small electrical pulses.

    Its brain-like adaptability comes from a newly identified component called a memristor—a nanoscale device that physically changes when electricity passes through it, allowing it to “remember” past signals and adjust its responses, similar to how neurons learn.

    As electricity flows through the transneuron, silver atoms within the memristor shift to form and break microscopic bridges, creating the electrical pulses.

    Environmental factors—such as temperature, voltage, and resistance—affect the memristor, which in turn alters the pulse behavior.
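    The filament dynamics described above can be caricatured as a relaxation oscillator. This is a toy model only; the constants and noise terms below are invented for illustration, not taken from the paper:

    ```python
    import random

    def transneuron_toy(voltage, temperature, steps=2000, seed=0):
        """Toy silver-filament model (illustrative only, not the published
        device physics): the filament grows under applied voltage, conducts
        when it bridges the gap, then ruptures, emitting a current pulse.
        Temperature adds fluctuations that perturb growth and so alter the
        pulse pattern."""
        rng = random.Random(seed)
        filament, pulses = 0.0, []
        for t in range(steps):
            # growth driven by voltage; thermal noise scales with temperature
            filament += 0.01 * voltage + rng.gauss(0.0, 0.002 * temperature)
            filament = max(filament, 0.0)
            if filament >= 1.0:       # filament bridges the electrodes
                pulses.append(t)      # conduction spike recorded
                filament = 0.0        # bridge ruptures and the cycle restarts
        return pulses
    ```

    Raising the voltage shortens the growth cycle and so raises the pulse rate, while stronger thermal noise makes the intervals between pulses more irregular, a crude analogue of tuning the device to different firing patterns.
    
    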

    From left to right, Professor Alexander Balanov, Professor Sergey Saveliev, and Dr Pavel Borisov, of the Loughborough University Department of Physics. The scientists are part of a team of international researchers that have created a new artificial neuron that can mimic different parts of the brain – which could be the key to more human-like robotics. Image Credits: Loughborough University

    This is how the researchers can adjust the transneuron to mimic different brain regions without relying on software.

    “Most of today’s AI runs on computers that process information very differently from the brain,” explains Dr. Sergei Gepshtein, an expert in visual perception and visually guided behavior at the Salk Institute.

    “Laptops and phones handle data with rigid, step-by-step logic, whereas the brain operates through vast networks of neurons firing in irregular, often unpredictable patterns.”

    “Our transneuron brings us closer to hardware that doesn’t just simulate brain-like activity in software—it functions in a genuinely brain-like manner.”

    Designing a Robotic Nervous System

    The researchers’ next goal is to develop a “brain cortex on a chip” by linking multiple transneurons into networks capable of perception, learning, and control.

    They believe this approach could transform robotics, laying the groundwork for a robotic nervous system that allows machines to sense, adapt, and respond to their environment like living organisms.

    “This represents a small but important step toward robots with artificial nervous systems,” says Professor Joshua Yang, an expert in electrical and computer engineering at the University of Southern California.

    “Such systems could enable robots to learn more efficiently, using less energy, time, and data. They could also support continuous, lifelong learning, adapting seamlessly to new experiences—capabilities that remain challenging for today’s AI systems.”

    Potential Uses of Transneurons in the Brain

    Dr. Pavel Borisov, an experimental physicist at Loughborough University, suggests the research could also enhance our understanding of the human brain.

    “This brings us a step closer to recreating at least a small part of the brain in electronic form,” he said.

    “Devices like those described in this study could one day interact with the human central nervous system, potentially replacing or supplementing certain brain regions.”

    “Additionally, these artificial neurons provide a sandbox for neuroscientists to explore how different brain areas communicate and to gain deeper insights into the formation of consciousness.”


    Read the original article on: Tech Xplore

    Read more: A Microrobot Moves Through the Bloodstream to Deliver Medication Precisely

  • Robots Trained on Spatial Datasets Gain Better Awareness and Object Handling

    Robots Trained on Spatial Datasets Gain Better Awareness and Object Handling

    Machines naturally lag behind humans in navigating their environments. To strengthen the visual perception skills robots need to interpret the world, researchers have created a new training dataset designed to boost their spatial awareness.
    Image Credits: CC0 Public Domain


    RoboSpatial Boosts Robots’ Spatial Awareness

    In recent work, experiments revealed that robots trained on the new dataset, RoboSpatial, surpassed those using standard models on the same task, indicating a more advanced grasp of spatial relationships and physical object handling.

    For humans, visual perception underpins how we engage with our surroundings—helping us recognize others, track our movements, and stay aware of our body’s position. But despite earlier attempts to equip robots with similar abilities, most systems still underperform because their training data lacks rich spatial detail.

    Since strong spatial reasoning is essential for natural, intuitive interaction, failing to address these shortcomings could limit future AI systems’ capacity to follow complex instructions and function effectively in dynamic environments, said Luke Song, the study’s lead author and a Ph.D. student in engineering at The Ohio State University.

    “For robots to become truly general-purpose foundation models, they must be able to comprehend the 3D world around them,” he said. “That makes spatial understanding one of their most essential abilities.”

    RoboSpatial Research Presented at CVPR 2025

    The researchers recently presented their work orally at the 2025 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) and published it in the conference proceedings.

    To help robots better understand perspective, RoboSpatial offers over a million real-world indoor and tabletop photos, thousands of high-resolution 3D scans, and 3 million labels encoding detailed spatial information crucial for robotics. With this large dataset, the system links 2D egocentric images to complete 3D scans of the same environment, enabling the model to locate objects using either flat visual cues or full geometric structure.
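    One way to picture what such a record might hold is a minimal schema linking a 2D image, its 3D scan, and a spatial question. The field names here are hypothetical, sketched for illustration, not RoboSpatial’s actual format:

    ```python
    from dataclasses import dataclass

    # Illustrative sketch of what one RoboSpatial-style annotation might
    # contain; the field names are hypothetical, not the dataset's schema.
    @dataclass
    class SpatialAnnotation:
        image_path: str   # 2D egocentric photo of the scene
        scan_id: str      # matching 3D scan of the same environment
        question: str     # e.g. "Is the mug left of the laptop?"
        frame: str        # reference frame: "ego", "world", or "object"
        answer: bool      # ground-truth yes/no label

    ann = SpatialAnnotation(
        image_path="kitchen_042.jpg",
        scan_id="kitchen_042_scan",
        question="Is the mug positioned to the left of the laptop?",
        frame="ego",
        answer=True,
    )
    ```

    Pairing each question with an explicit reference frame is what lets a model learn that “left of” means something different from the robot’s viewpoint than from an object’s.
    
    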

    RoboSpatial Enables Real-World Spatial Reasoning in Robots

    The study notes that this approach closely mirrors how visual signals are interpreted in everyday settings.

    For example, while existing datasets may allow a robot to identify ‘a bowl on the table,’ they typically do not show where it sits, where to place it for easy access, or how it relates to nearby items. RoboSpatial, however, lets researchers test these spatial reasoning abilities thoroughly in real robotic tasks—first by having robots rearrange objects, then by assessing how well models apply their reasoning to unfamiliar spatial scenarios beyond their initial training.

    “Beyond improving individual actions like picking up or placing objects, this also helps robots engage with people in a more natural way,” Song said.

    One of the platforms tested with the new framework was the Kinova Jaco robot, an assistive robotic arm designed to help people with disabilities interact with their surroundings.

    During training, the system successfully answered basic yes–no spatial questions such as “Can the chair go in front of the table?” or “Is the mug positioned to the left of the laptop?”
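    In the simplest case, a yes/no question like the second one reduces to a geometric predicate over detected bounding boxes. Below is a minimal 2D sketch with made-up pixel coordinates; a real system would also consult the 3D scan geometry:

    ```python
    def left_of(box_a, box_b):
        """Is object A left of object B in the image frame?
        Boxes are (x_min, y_min, x_max, y_max) in pixels; we compare the
        horizontal centroids. A deliberately simplistic 2D check."""
        cx_a = (box_a[0] + box_a[2]) / 2
        cx_b = (box_b[0] + box_b[2]) / 2
        return cx_a < cx_b

    mug = (40, 200, 120, 280)       # hypothetical detections
    laptop = (300, 150, 560, 330)
    mug_is_left = left_of(mug, laptop)
    ```

    The hard part, and the point of the dataset, is deciding which frame of reference the question assumes before a simple predicate like this can be applied.
    
    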

    Improved Spatial Perception Could Lead to Safer, More Reliable AI

    According to Song, these encouraging outcomes suggest that strengthening robotic perception by standardizing spatial context could pave the way toward safer, more dependable AI systems.

    Although many aspects of AI development and training remain unresolved, the study concludes that RoboSpatial could become a cornerstone for wider robotic applications, suggesting that numerous new advances in spatial reasoning may emerge from it.

    “I expect we’ll see major breakthroughs and impressive new capabilities in robotics over the next five to ten years,” Song said.

    The research team also included Yu Su of Ohio State, along with Valts Blukis, Jonathan Tremblay, Stephen Tyree, and Stan Birchfield of NVIDIA.


    Read the original article on: Tech Xplore

    Read more: IQ May Affect How Well You Understand Speech

  • Building Robots that Adjust Based On Your Feelings

    Building Robots that Adjust Based On Your Feelings

    Robots may be getting more intelligent, but to genuinely assist people in everyday life, they must also develop greater empathy — the ability to detect and respond to human emotions as they happen.
    Image Credits: University of Manchester


    Why Emotional Intelligence Matters in Human-Robot Interaction

    Intelligence alone isn’t enough when interacting with humans in real-world settings; understanding context, mood, and emotional cues is essential for building trust and meaningful relationships. Whether it’s a caregiving robot helping the elderly, a virtual assistant supporting someone through stress, or an educational tool guiding a student, emotional awareness allows machines to respond more appropriately and compassionately.

    Empathetic AI can adapt its behavior based not only on what a person says, but how they feel — creating interactions that are not just functional, but supportive and human-centered.

    Most facial expression recognition models are trained a single time and assumed to perform well in all situations. However, models trained on a specific dataset often falter when they encounter unfamiliar scenarios, and retraining them from scratch is both time-consuming and inefficient.

    Teaching AI to Evolve Emotionally, One Feeling at a Time

    To address this, Dr. Rahul Singh Maharjan and his team are exploring a new method: enabling AI to learn emotions gradually. Rather than discarding previous knowledge, their system builds upon existing understanding while incorporating new emotional information. This approach enhances the model’s adaptability and makes it more effective in real-world human interactions.
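    The core idea, folding new examples into existing knowledge instead of retraining from zero, can be sketched with a running-mean (nearest-centroid) classifier. This is an illustration of incremental updating only, not the team’s actual method:

    ```python
    # Sketch of incremental learning over emotion classes: each new example
    # refines a running mean without discarding previously learned classes.
    # (Illustrative stand-in only -- real systems use far richer models.)
    class IncrementalEmotionModel:
        def __init__(self):
            self.means = {}    # emotion label -> running mean feature vector
            self.counts = {}   # emotion label -> number of examples seen

        def update(self, label, features):
            """Fold one new example into the running mean for its emotion."""
            n = self.counts.get(label, 0)
            mean = self.means.get(label, [0.0] * len(features))
            self.means[label] = [(m * n + f) / (n + 1)
                                 for m, f in zip(mean, features)]
            self.counts[label] = n + 1

        def predict(self, features):
            """Nearest-centroid classification over emotions seen so far."""
            def dist(mean):
                return sum((m - f) ** 2 for m, f in zip(mean, features))
            return min(self.means, key=lambda lbl: dist(self.means[lbl]))

    model = IncrementalEmotionModel()
    model.update("happy", [0.9, 0.1])
    model.update("sad", [0.1, 0.8])
    model.update("happy", [0.8, 0.2])   # refines "happy" without erasing "sad"
    ```

    Because each update only adjusts one class centroid, earlier knowledge is preserved by construction, which is the property that full retraining sacrifices.
    
    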

    Dr. Maharjan emphasizes that for technology to become a seamless part of our lives, it needs to grasp human emotions. “My aim,” he says, “is to create AI that goes beyond processing data — one that forms meaningful connections with people.”


    Read the original article on: Tech Xplore

    Read more: Even with Encryption, Robots are Vulnerable to Privacy Breaches

  • Interacting with Robots can Reduce Feelings of Loneliness for Caregivers

    Interacting with Robots can Reduce Feelings of Loneliness for Caregivers

    Image credit: Pixabay

    In a quiet room at the University of Cambridge, something remarkable took place—not through technology or programming, but through simple conversation. A group of informal caregivers, emotionally drained from supporting their loved ones, discovered a sense of comfort not from therapy or peer groups, but from talking with a humanoid robot called Pepper. These interactions with a social robot gave caregivers something they deeply miss.

    This wasn’t a scene from science fiction—it was a groundbreaking study exploring how social robots can support people in managing emotional distress, particularly those who seldom have the opportunity to express their own emotions.

    When Emotional Distress Becomes a Daily Reality

    Emotional distress goes beyond simply feeling sad — it’s the ongoing burden that builds when life’s challenges become overwhelming and we feel unable to cope. For caregivers — individuals who look after ill or disabled loved ones without pay or professional training — this pressure can be constant. Many describe feeling alone, exhausted, and emotionally overlooked.

    While conversation can be a powerful relief, caregivers often don’t have the time, environment, or support system to open up. That’s where Pepper stepped in.

    During a five-week study, caregivers met with Pepper twice a week. The robot wasn’t there to diagnose problems or offer advice—it simply engaged in casual conversation and listened. Gradually, something unexpected began to happen.

    “Carers started opening up more,” explained Dr. Guy Laban, the study’s lead researcher. “They spoke with greater ease, reflected more deeply, and told us that interacting with Pepper helped them reconnect with their own emotional needs.”

    Caregivers reported feeling better emotionally, experiencing less loneliness, and finding comfort in Pepper’s presence. The robot acted like an emotional mirror—always present, nonjudgmental, and steady.

    At the core of the research was self-disclosure—the act of expressing one’s inner thoughts and emotions. While this is a powerful tool for emotional well-being, it’s often out of reach for caregivers. Pepper provided a safe, low-pressure space for that expression, allowing participants to process their experiences and view their caregiving roles in a more positive light.

    Following the intervention, many caregivers reported feeling less self-critical and more accepting of their circumstances, and described a renewed sense of meaning in their caregiving roles.

    Groundbreaking Study Explores How Robots Like Pepper Can Support Mental Health Through Conversation

    Published in the International Journal of Social Robotics, this study is the first to investigate the long-term emotional impact of self-disclosure facilitated by a robot. While Pepper isn’t a substitute for human interaction, the results suggest that social robots could serve as meaningful tools in mental health support—particularly for individuals who often feel ignored or emotionally unsupported.

    “Informal caregivers frequently face intense emotional strain and isolation,” said Professor Emily Cross of ETH Zürich, a co-author of the study. “To our knowledge, this is the first time research has shown that a series of personal conversations with a robot can significantly reduce loneliness and emotional stress in caregivers.

    “The intervention also helped participants embrace their caregiving roles more fully and improved their emotional regulation. This points to the potential for assistive social robots to provide emotional support when human connection is limited.”


    Read the original article on: TechXplorist

    Read more: Humanoid Robot Launched for Better Interaction