
How brain-inspired analog systems could make drones more efficient

AI Learns to See Like the Brain for Efficient Machines

Engineers Pursue Neuroscience for Smarter, Greener Autonomous Systems

Researchers are crafting artificial intelligence that mimics the human brain’s visual processing, aiming to create more energy-efficient AI for drones and self-driving cars.

Advancing Beyond Digital Limits

Current AI, including the neural networks guiding autonomous vehicles, runs on conventional digital computers. While reliable, these systems consume significant power. Engineers at the University of Rochester are pioneering a different approach: analog hardware inspired by predictive coding networks. The approach draws on neuroscience theories suggesting the brain maintains an internal model of its environment that is constantly refined by visual feedback.

“Research by neuroscientists has shown that the workhorse of developing neural networks—this mechanism called back propagation—is biologically implausible and our brains’ perception systems don’t work that way.”

Michael Huang, Professor of Electrical and Computer Engineering, Computer Science, and Data Science and Artificial Intelligence at Rochester

This shift away from traditional digital neural networks, particularly for computer vision tasks, seeks a more biologically plausible and efficient solution. The goal is to move beyond computationally intensive methods like backpropagation, which are not believed to reflect how human brains function.

Predictive Coding: A Brain-Inspired Framework

The prevailing theory driving this research is predictive coding. This neuroscience framework posits a hierarchy of prediction and correction: higher levels predict the activity of lower levels, and the mismatch between prediction and input is fed back to refine the internal model. Think of paraphrasing what you heard back to the speaker and using their feedback to sharpen your understanding, explains Huang.
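For a concrete picture of that prediction-and-correction loop, here is a minimal sketch in Python of a two-level predictive coding network settling on an interpretation of an input. The network size, random weights, and update rule are illustrative assumptions for this toy example, not details of the Rochester team's analog design.

```python
# Minimal two-level predictive coding sketch (illustrative only; not the
# Rochester team's analog circuit design).
import numpy as np

rng = np.random.default_rng(0)

n_sensory, n_latent = 16, 4
W = rng.normal(scale=0.1, size=(n_sensory, n_latent))  # top-down generative weights (assumed random init)

def infer(x, W, steps=100, lr=0.1):
    """Settle a latent explanation of input x by repeatedly reducing the
    local prediction error, rather than running backpropagation."""
    z = np.zeros(n_latent)          # the network's current "belief" about the input
    for _ in range(steps):
        prediction = W @ z          # top-down prediction of the sensory layer
        error = x - prediction      # local mismatch signal (the "correction")
        z += lr * (W.T @ error)     # nudge the belief to better explain the input
    return z, error

x = rng.normal(size=n_sensory)      # stand-in for a small image patch
z, residual = infer(x, W)

# Learning can also stay local: each weight changes using only the error and
# activity available at that connection (a Hebbian-like rule).
W += 0.01 * np.outer(residual, z)

print("residual error after settling:", float(np.linalg.norm(residual)))
```

In this toy version, both inference and learning rely only on signals available locally at each connection, which is the property that makes predictive coding attractive as a biologically plausible candidate for low-power analog hardware.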

The University of Rochester has a long history in computer vision research. Notably, the late computer science professor Dana Ballard was an author on a seminal early paper on predictive coding networks.

A multidisciplinary team including Michael Huang, Hui Wu, and Tong Geng, professors of electrical and computer engineering at Rochester, along with students and research groups from Rice University and UCLA, will lead the initiative. The team has secured up to $7.2 million from the Defense Advanced Research Projects Agency (DARPA) for a 54-month project to develop these biologically inspired predictive coding networks for digital image recognition on analog circuits.

Real-World Impact and Future Applications

The initial prototype will focus on classifying static images. If it can match the performance of today's digital methods, it could pave the way for the more complex perception tasks required by autonomous vehicles and drones. Notably, the system will be built with an existing manufacturing technology, complementary metal-oxide-semiconductor (CMOS), rather than experimental components.

This research could significantly enhance the capabilities of autonomous systems; advances in AI vision processing are critical, for instance, for self-driving cars navigating complex urban environments. A McKinsey study estimated that AI could add $13 trillion to the global economy by 2030, with AI in transportation playing a key role (McKinsey 2023).
