
Engineers at Penn have created a system that enables robots to see around corners by analyzing radio waves with AI, a breakthrough that could boost the safety and efficiency of self-driving cars and robots working in crowded indoor spaces such as warehouses and factories.
The technology, known as HoloRadar, allows robots to rebuild three-dimensional scenes beyond their direct line of sight, such as detecting pedestrians coming around a corner. Unlike earlier non-line-of-sight (NLOS) methods that depend on visible light, HoloRadar operates dependably in darkness and changing lighting conditions.
“Robots and autonomous vehicles must perceive more than what’s immediately ahead of them,” says Mingmin Zhao, Assistant Professor in Computer and Information Science (CIS) and senior author of the study introducing HoloRadar at the Conference on Neural Information Processing Systems. “This ability is critical for enabling robots and self-driving vehicles to make safer, real-time decisions.”
Using Walls as Reflective Surfaces
A key breakthrough behind HoloRadar stems from a surprising property of radio waves. Unlike visible light, radio signals have much longer wavelengths—something typically considered a drawback for imaging because it reduces resolution. But Zhao and his team recognized that these longer wavelengths are actually beneficial for seeing around corners.
“Because radio waves are far larger than the tiny imperfections on wall surfaces,” explains Haowen Lai, a CIS doctoral student and co-author of the study, “those surfaces essentially act like mirrors, reflecting radio signals in consistent and predictable ways.”
In practice, this means flat surfaces such as walls, floors, and ceilings can redirect radio waves around corners, sending information about hidden areas back to the robot. HoloRadar collects these reflected signals and reconstructs scenes beyond its direct line of sight.
“It’s similar to how drivers use mirrors at blind intersections,” Lai adds. “With radio waves, the entire environment effectively becomes filled with mirrors—without any physical modifications.”
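The "walls as mirrors" argument can be made concrete with the classic Rayleigh roughness criterion: a surface reflects like a mirror when its height variations are small compared to the wavelength. The sketch below (illustrative only; the wall roughness value and the 77 GHz automotive-radar frequency are assumptions, not figures from the study) shows why the same wall is a mirror for radio waves but a diffuse scatterer for visible light.

```python
import math

def is_specular(roughness_m: float, wavelength_m: float, incidence_deg: float = 0.0) -> bool:
    """Rayleigh criterion: a surface acts like a mirror when its height
    variations are small relative to the wavelength (h < lambda / (8 cos theta))."""
    theta = math.radians(incidence_deg)
    return roughness_m < wavelength_m / (8 * math.cos(theta))

C = 3e8                 # speed of light, m/s
wall_roughness = 1e-4   # ~0.1 mm surface height variation (assumed, typical painted wall)

radar_wavelength = C / 77e9   # ~3.9 mm at 77 GHz (common automotive radar band)
visible_wavelength = 550e-9   # ~550 nm green light

print(is_specular(wall_roughness, radar_wavelength))    # True: wall mirrors radio waves
print(is_specular(wall_roughness, visible_wavelength))  # False: wall scatters light
```

The millimeter-scale radar wavelength dwarfs the wall's sub-millimeter imperfections, so reflections stay predictable; at optical wavelengths the same bumps are enormous, which is why cameras cannot exploit walls this way.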

Built for Real-World Environments
In recent years, other teams have built systems with similar goals, often relying on visible light. These approaches interpret shadows or indirect reflections, making them heavily dependent on specific lighting conditions. Efforts to use radio signals, meanwhile, have typically required slow, bulky scanning hardware, limiting their practicality outside the lab.
“HoloRadar is built for the real environments where robots actually function,” says Zhao. “It’s mobile, operates in real time, and doesn’t rely on controlled lighting.”
Rather than replacing existing sensors, HoloRadar enhances the safety of autonomous machines by working alongside them. Self-driving vehicles already use LiDAR—laser-based sensing that detects objects within direct view—but HoloRadar extends perception beyond that range, uncovering hidden hazards and giving robots and vehicles more time to respond.

Analyzing Radio Signals with AI
A single radio pulse can ricochet several times before reaching the sensor again, producing a complex mix of reflections that traditional signal-processing techniques struggle to separate.
To address this, the researchers created a specialized AI system that blends machine learning with physics-based modeling. First, it sharpens the raw radio data and detects multiple “returns” from different reflection paths. Then, using a physics-informed model, it traces those signals backward—counteracting the mirror-like effects of surrounding surfaces to rebuild the true 3D scene.
“In a way, it’s like stepping into a room lined with mirrors,” explains Zitong Lan, a doctoral student in Electrical and Systems Engineering (ESE) and co-author of the study. “You see many reflections of the same object in different spots, and the challenge is figuring out their real positions. Our system learns to reverse that process using physical principles.”
By explicitly modeling how radio waves reflect off surfaces, the AI can tell apart direct and indirect signals and accurately pinpoint the real-world locations of objects, including people.
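The geometric core of that "reversal" can be illustrated with a mirror-image unfolding: an object seen via a wall reflection appears at a virtual position behind the wall, and reflecting that virtual point back across the wall plane recovers its true location. The sketch below is a minimal geometric illustration of this idea, not the authors' implementation; the wall plane and positions are made-up values.

```python
import numpy as np

def unmirror(ghost_pos, plane_point, plane_normal):
    """Reflect a virtual-image position across a known wall plane to
    recover the true object location (inverting the mirror reflection)."""
    n = np.asarray(plane_normal, dtype=float)
    n = n / np.linalg.norm(n)
    d = np.dot(np.asarray(ghost_pos, dtype=float) - plane_point, n)
    return ghost_pos - 2 * d * n

# A wall occupying the plane x = 2, with its normal facing the sensor.
wall_point = np.array([2.0, 0.0, 0.0])
wall_normal = np.array([-1.0, 0.0, 0.0])

# A pedestrian appears "behind" the wall at x = 5 in the raw radar return...
ghost = np.array([5.0, 1.0, 0.0])

# ...but the true position is the mirror image on the sensor's side of the wall.
print(unmirror(ghost, wall_point, wall_normal))  # [-1.  1.  0.]
```

In practice the system must also identify which wall produced each reflection and handle multi-bounce paths, which is where the learned component comes in; this sketch covers only the single-bounce geometry.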
Transitioning from Research to Real-World Use
The team evaluated HoloRadar on a mobile robot navigating real indoor spaces, such as hallways and building corners. In these tests, the system successfully reconstructed walls, corridors, and even people hidden from the robot’s direct view.
Looking ahead, the researchers plan to extend testing to outdoor environments, like intersections and city streets, where greater distances and rapidly changing conditions present new challenges.
“This represents a key step toward giving robots a fuller awareness of their surroundings,” says Zhao. “Our ultimate aim is to enable machines to operate safely and intelligently in the complex, dynamic environments humans encounter every day.”
Read the original article on: Tech Xplore


