
Scientists at Cornell University have created an electronic chip they call a “microwave brain.” Unlike traditional digital chips, this analog chip can process ultrafast data and wireless communication signals at the same time.
Although we usually think of computers as digital, digital processing is only one approach to computing. Throughout history, and even today, many devices that qualify as computers operate on analog principles.
Digital vs. Analog Computers: Understanding the Difference
Modern computers are digital, using on/off switches in logic circuits to process binary data. In contrast, analog computers model real or abstract systems to perform calculations.
A classic analog computer is a mechanical clock, which tracks time through springs, gears, and an escapement. Other examples include slide rules, speedometers, and spring- or liquid-based thermometers.
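The slide rule illustrates the principle neatly: it multiplies two numbers by physically adding lengths proportional to their logarithms. The short Python sketch below mimics that trick (the function name slide_rule_multiply is invented here purely for illustration):

```python
import math

def slide_rule_multiply(a: float, b: float) -> float:
    """Multiply like a slide rule: add lengths proportional to logarithms.

    A real slide rule lines up two logarithmic scales; sliding one scale by
    a length of log10(a) and reading off at log10(b) adds the two lengths,
    and the scale markings exponentiate the sum back into a product.
    """
    length_a = math.log10(a)            # position of 'a' on the fixed scale
    length_b = math.log10(b)            # offset of 'b' on the sliding scale
    return 10 ** (length_a + length_b)  # read the product where scales align

print(slide_rule_multiply(3.0, 7.0))    # ~21.0, to "scale-reading" precision
```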
In the past, advanced analog computers solved complex equations with rods and cams or simulated economies using liquid flows between reservoirs. One 1947 model was designed to be assembled from a Meccano set by aspiring engineers. Notably, it wasn’t long ago that many electronic computers relied on analog circuits with potentiometers and voltmeters to perform calculations.
With digital computers dominating today, why revisit analog designs? Analog circuits offer several advantages: they are far simpler than digital ones, can skip many of the steps a digital computer needs to solve a problem, and run much faster because they perform tasks in parallel. They also consume less power and, because they rely on physical behavior, excel at handling continuously changing or complex systems. And unlike digital systems, which are restricted to discrete numbers, they can process values across a practically continuous range.
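To make the point about physical behavior concrete: a discharging RC circuit "solves" the differential equation dV/dt = -V/(RC) simply by existing, tracing the exact continuous solution, whereas a digital machine must approximate it with many discrete steps. A minimal Python sketch, with component values chosen arbitrarily for illustration:

```python
import math

R, C = 1_000.0, 1e-6           # resistor (ohms), capacitor (farads); arbitrary
tau = R * C                    # circuit time constant: 1 millisecond
V0, t_end, steps = 5.0, 0.005, 50

# Analog: the discharging circuit's physics yields the exact solution
# V(t) = V0 * exp(-t / tau) continuously, with no iteration at all.
analog_v = V0 * math.exp(-t_end / tau)

# Digital: approximate the same answer with discrete Euler steps.
dt, v = t_end / steps, V0
for _ in range(steps):
    v -= (v / tau) * dt        # every step adds a little discretization error

print(f"analog (exact): {analog_v:.4f} V   digital ({steps} steps): {v:.4f} V")
```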
Mimicking Neural Networks with Analog Technology
Cornell is now developing its “microwave brain,” touted as the first fully integrated silicon microchip to act as a true microwave neural network. By replacing digital logic with the analog physics of microwaves, the chip can emulate the way human neurons recognize patterns and learn, streamlining signal processing by eliminating many steps that digital computers typically need.
The chip achieves this while consuming very little power: around 200 milliwatts while operating at tens of gigahertz. In testing, it also classified wireless signal types with 88% accuracy.
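For intuition only, here is a toy Python sketch of signal-type classification by spectral shape, separating a narrowband tone from broadband noise with a single hand-set threshold. This is a hypothetical digital illustration of the task (the names make_signal and peak_energy_fraction are invented here), not the Cornell chip's method, which performs this kind of discrimination directly in analog microwave hardware:

```python
import numpy as np

rng = np.random.default_rng(0)
n, fs = 256, 1000.0            # samples per capture and sample rate (arbitrary)
t = np.arange(n) / fs

def make_signal(kind: str) -> np.ndarray:
    """Two synthetic 'signal types': a narrowband tone vs. broadband noise."""
    if kind == "tone":
        # 125 Hz lands exactly on an FFT bin (125 * n / fs = 32), so the
        # tone's energy concentrates in a single frequency bin.
        return np.sin(2 * np.pi * 125.0 * t + rng.uniform(0, 2 * np.pi))
    return rng.normal(0.0, 0.7, n)   # noise spreads energy across all bins

def peak_energy_fraction(x: np.ndarray) -> float:
    """Fraction of total signal energy in the strongest frequency bin."""
    power = np.abs(np.fft.rfft(x)) ** 2
    return power.max() / power.sum()

# Classify: concentrated spectrum -> "tone", spread spectrum -> "noise".
trials, correct = 200, 0
for _ in range(trials):
    kind = rng.choice(["tone", "noise"])
    guess = "tone" if peak_energy_fraction(make_signal(kind)) > 0.5 else "noise"
    correct += guess == kind
print(f"toy accuracy: {correct / trials:.0%}")
```

The toy separates its two tidy classes trivially; the reported 88% figure is meaningful because the chip distinguishes far messier real-world signal types, and does so in hardware.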
Its compact size suggests it could be integrated into devices like smartwatches and smartphones, providing AI capabilities without relying on cloud servers. Beyond that, the technology has potential applications in enhancing hardware security, detecting anomalies in wireless communications, and improving radar tracking and radio signal decoding.
“In conventional digital systems, increasing task complexity requires more circuitry, higher power, and additional error correction to stay accurate,” explained research lead Bal Govind. “Our probabilistic approach, however, allows us to achieve high accuracy on both simple and complex computations without that extra burden.”
Read the original article on: New Atlas
