
Introducing X1: The world’s first multi-robot system combining a humanoid robot with a shape-shifting drone that can launch from its back and later transform to drive independently.
Global Collaboration Advancing Multimodal Robotics
The new multimodal system is the result of a three-year partnership between Caltech’s Center for Autonomous Systems and Technologies (CAST) and the Technology Innovation Institute (TII) in Abu Dhabi, United Arab Emirates. This robotic platform highlights the groundbreaking potential that emerges when global leaders in autonomous systems, AI, robotics, and propulsion collaborate on cutting-edge research.
“Today’s robots can fly, drive, or walk—each is useful in specific situations,” explains Aaron Ames, director of CAST, Booth-Kresa Leadership Chair, and Bren Professor of Mechanical and Civil Engineering, Control and Dynamical Systems, and Aerospace at Caltech. “The question is, how do we integrate these different forms of movement into one system to maximize their advantages while minimizing their limitations?”
To evaluate the capabilities of the X1 system, the team recently carried out a demonstration on Caltech’s campus. The scenario was designed around a hypothetical emergency situation requiring rapid deployment of autonomous robots to the affected area. For the test, researchers customized a commercially available Unitree G1 humanoid robot to carry Caltech’s M4—a versatile robot capable of both flying and driving—on its back, similar to a backpack.
X1 Demo Highlights Multi-Modal Mobility
The demonstration began with the humanoid robot positioned inside the Gates–Thomas Laboratory. It walked through the Sherman Fairchild Library and exited the building, heading to a raised area suitable for deploying M4. Once there, the humanoid leaned forward, enabling M4 to launch in its drone configuration. After takeoff, M4 landed, transformed into driving mode, and continued its journey on wheels.
On the way to its target location, M4 encountered the Turtle Pond. In response, it switched back to drone mode, flew over the obstacle, and proceeded to the simulated emergency site near Caltech Hall. The humanoid and a second M4 later joined the first unit at the scene.
“The challenge lies in integrating different robots so they operate as a single system with multiple capabilities,” says Mory Gharib, Ph.D., the Hans W. Liepmann Professor of Aeronautics and Medical Engineering at Caltech and founding director of CAST. “Through this collaboration, we found the ideal partnership to make that vision a reality.”
Collaborative Expertise Powers Advanced Robotics Innovation
Gharib’s team, which originally developed the M4 robot, specializes in creating robots that can both fly and drive, along with advanced control systems. The Ames Lab contributes its expertise in robotic locomotion and the development of safe control algorithms for humanoid robots. TII adds deep knowledge in autonomy and robotic sensing in urban environments. Additionally, a team from Northeastern University, led by engineer Alireza Ramezani, supports the project with its experience in designing morphing robots.
“We had a highly collaborative environment, with experts from various fields tackling complex robotics challenges—from perception and sensor fusion to control systems, locomotion modeling, and hardware design,” says Ramezani, an associate professor at Northeastern.
M4 Enhanced with Advanced Computing and Navigation
During a visit to Caltech in July 2025, TII engineers worked with their collaborators to build an updated version of M4, incorporating Saluki—a secure onboard computing and flight control system developed by TII. Looking ahead, the team plans to equip the full system with advanced sensors, model-based algorithms, and machine learning to enable real-time navigation and environmental adaptation.
“We equip the robots with various sensors—like lidar, cameras, and range finders—and then fuse that data to give the robot an understanding of its surroundings,” says Claudio Tortorici, director at TII. “This allows the robot to know where it is and navigate autonomously from one location to another.”
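The fusion Tortorici describes can be illustrated with a textbook one-dimensional Kalman filter: a motion prediction and a noisy range measurement are blended, each weighted by how much it is trusted. This is only a sketch of the general idea; the article does not describe X1's actual estimator, and the noise values below are assumptions.

```python
# Minimal 1-D Kalman filter sketch of sensor fusion for localization:
# combine a motion-model prediction with a noisy range-finder reading
# to estimate the robot's position. Noise variances q and r are
# illustrative assumptions, not values from the X1 system.

def kalman_step(x, p, u, z, q=0.05, r=0.4):
    """One predict/update cycle.
    x, p : current position estimate and its variance
    u    : commanded displacement since the last step (motion model)
    z    : position implied by a range-finder reading
    q, r : assumed process and measurement noise variances
    """
    # Predict: apply the motion model and inflate the uncertainty.
    x_pred = x + u
    p_pred = p + q
    # Update: blend in the measurement, weighted by relative confidence.
    k = p_pred / (p_pred + r)            # Kalman gain in [0, 1]
    x_new = x_pred + k * (z - x_pred)
    p_new = (1.0 - k) * p_pred
    return x_new, p_new

# Drive forward 1 m per step while a range finder reports position.
x, p = 0.0, 1.0
for z in [1.1, 1.9, 3.2, 4.0]:
    x, p = kalman_step(x, p, u=1.0, z=z)
```

After four steps the estimate settles near the true position of 4 m, and its variance shrinks well below the initial uncertainty, which is the essential point: fused data is more reliable than either source alone.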
Beyond Imitation: Advanced Autonomous Locomotion in Action
Aaron Ames adds that the demonstration revealed more complexity than was immediately apparent. For instance, the humanoid robot wasn’t just walking across campus—it was performing advanced, autonomous locomotion. Most humanoid robots today rely on motion data captured from humans to replicate specific actions like walking or kicking. These movements are then scaled to fit the robot’s structure, allowing it to repeat the motion. However, in this demonstration, the robot was operating beyond simple imitation.
As Ames points out, “If we truly want to deploy robots in complex, real-world environments, we need to develop their ability to generate actions independently—without relying on human movement data.”
Combining Physics Models and Machine Learning for Adaptive Robotics
To achieve this, his team creates mathematical models that capture the underlying physics of how robots interact with their environment. When combined with machine learning, these models enable robots to perform more adaptable, generalized behaviors, allowing them to handle a wide range of unpredictable situations.
“The robot learns to walk by following the laws of physics,” Ames explains. “This means X1 can walk across various types of terrain, climb up and down stairs, and importantly, carry something like M4 on its back while doing so.”
A key aim of the collaboration is to enhance the safety and reliability of these autonomous systems.
“I believe we’ve reached a point where people are beginning to accept robots,” says Tortorici. “For robots to become a common presence in our lives, they need to be dependable.”
This remains a major focus for the team. “We’re working on safety-critical control to ensure our systems are trustworthy and secure,” Ames adds. “We have several projects beyond this one that explore different aspects of autonomy. These challenges are significant, but through our diverse collaborative efforts, we’re able to tackle them on a larger scale and advance autonomy in a meaningful, coordinated way.”
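One standard tool in safety-critical control, closely associated with the Ames Lab, is the control barrier function (CBF): a filter that minimally overrides a desired command whenever it would push the system toward an unsafe state. The one-dimensional system, obstacle position, and gain below are illustrative assumptions for a sketch, not the controllers actually running on X1.

```python
# Sketch of a control-barrier-function (CBF) safety filter: keep a
# 1-D robot with dynamics x_dot = u behind an obstacle, overriding
# the desired speed only when necessary. The dynamics, gain alpha,
# and obstacle location are illustrative assumptions.

def cbf_filter(x, u_des, x_obstacle, alpha=2.0):
    """Return the command closest to u_des that preserves safety.

    Safety set: h(x) = x_obstacle - x >= 0 (stay behind the wall).
    The CBF condition h_dot >= -alpha * h reduces, for x_dot = u,
    to the closed-form speed bound u <= alpha * h(x).
    """
    h = x_obstacle - x
    u_max = alpha * h                # largest forward speed still safe
    return min(u_des, u_max)

# The robot at x = 0 requests 5 m/s straight toward a wall at x = 1.
x, dt = 0.0, 0.05
for _ in range(100):
    u = cbf_filter(x, u_des=5.0, x_obstacle=1.0)
    x += u * dt
```

The filtered command lets the robot approach the wall but slows it smoothly so it never crosses it, while leaving the desired command untouched whenever it is already safe. That "intervene only when needed" property is what makes such filters attractive for trustworthy autonomy.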
Read the original article on: Tech Xplore
