Microsoft Unveils Light-Based Computer Perhaps 100x More Efficient Than Traditional Systems
REDMOND, WA – Microsoft researchers have developed a new type of computer utilizing light instead of electricity, inspired by analog computing technology dating back to the 1940s. The “Analog Optical Computer” (AOC) prototype demonstrates the potential to dramatically improve the energy efficiency of artificial intelligence and other complex calculations, potentially by a factor of 100.
The AOC leverages the principles of analog computing, where data is represented by continuous physical quantities like light intensity rather than discrete bits. This approach, largely abandoned with the rise of digital computers, is experiencing a resurgence because of the energy demands of modern AI workloads. Microsoft’s prototype uses micro-LEDs to process information with light waves, offering a pathway around those limitations.
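The appeal of the analog approach can be seen in a toy simulation (illustrative only, not the AOC's actual design): an optical system computes a matrix-vector product essentially for free, with inputs encoded as light intensities, matrix entries as transmission factors, and detectors summing the light that arrives. Because analog quantities are continuous physical values, the result carries a small amount of noise.

```python
import numpy as np

# Sketch of the analog principle (hypothetical numbers, not the AOC's design):
# inputs become light intensities, matrix entries become transmission factors,
# and each detector sums the light it receives -- a matrix-vector product
# performed by physics rather than by arithmetic circuits.

rng = np.random.default_rng(1)

weights = rng.random((4, 8))   # transmission factors of an optical mask
signal = rng.random(8)         # input light intensities (e.g. micro-LED levels)

exact = weights @ signal                         # what a digital chip computes
analog = exact + 0.01 * rng.standard_normal(4)   # detector reading, with noise

print("max deviation from digital result:", float(np.max(np.abs(analog - exact))))
```

The trade captured here is the one the article describes: the physical computation costs almost no energy per operation, at the price of analog imprecision that the digital baseline does not have.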
“Our goal, our long-term vision is this being a notable part of the future of computing,” said Hitesh Ballani, a researcher on Microsoft’s Cloud Systems Futures team.
In initial tests, the AOC performed comparably to a digital computer on basic machine learning tasks. However, researchers believe that larger AOC models, capable of handling more variables, could surpass digital computers in energy efficiency. The team successfully used a digital twin of the AOC to reconstruct a 320-by-320-pixel brain scan image from 62.5% of the original data, accurately reproducing the scan – a development that could lead to faster MRI times.
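The idea that a scan can be recovered from 62.5% of the data can be illustrated with a toy example (a simplified sketch, not Microsoft's reconstruction method, which solves an optimization problem on real MRI data): natural images are compressible, so a fraction of their frequency-domain samples carries nearly all of the information.

```python
import numpy as np

# Toy partial-data reconstruction on a synthetic "scan" (illustrative only):
# keep the 62.5% largest Fourier coefficients and invert. The small residual
# error shows why undersampled acquisition can still yield an accurate image.

n = 64
y, x = np.mgrid[0:n, 0:n] / n
rng = np.random.default_rng(0)
# Smooth synthetic image with a little added texture.
image = (np.sin(2 * np.pi * x) + 0.5 * np.cos(4 * np.pi * y)
         + 0.05 * rng.standard_normal((n, n)))

coeffs = np.fft.fft2(image)
keep = int(0.625 * coeffs.size)                      # samples retained (62.5%)
threshold = np.sort(np.abs(coeffs).ravel())[-keep]   # magnitude cutoff
partial = np.where(np.abs(coeffs) >= threshold, coeffs, 0)

recon = np.fft.ifft2(partial).real
rel_err = np.linalg.norm(recon - image) / np.linalg.norm(image)
print(f"relative reconstruction error: {rel_err:.4f}")
```

In accelerated MRI the missing samples are never acquired in the first place, which is what shortens scan times; the reconstruction then has to infer them, which is the computationally heavy step the AOC targets.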
Furthermore, the AOC outperformed existing quantum computers in solving complex financial problems involving efficient fund exchange and risk minimization. Michael Hansen, senior director of biomedical signal processing at Microsoft Health Futures, highlighted the potential impact in a recent blog post detailing the research.
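The article does not specify the exact financial problems; a standard example of "risk minimization" is the minimum-variance portfolio, a quadratic optimization of the kind that both quantum annealers and analog optimizers target. With only a budget constraint it even has a closed form, shown in this sketch with a hypothetical three-asset covariance matrix:

```python
import numpy as np

# Minimum-variance portfolio (illustrative, hypothetical data): minimize the
# portfolio variance w' C w subject to the weights summing to 1. The closed
# form is w = C^{-1} 1 / (1' C^{-1} 1).

cov = np.array([[0.10, 0.02, 0.04],   # hypothetical 3-asset covariance matrix
                [0.02, 0.08, 0.01],
                [0.04, 0.01, 0.12]])

ones = np.ones(3)
w = np.linalg.solve(cov, ones)
w /= w.sum()                           # enforce budget: weights sum to 1

risk = float(w @ cov @ w)              # resulting portfolio variance
print("weights:", np.round(w, 3), "variance:", round(risk, 4))
```

A tiny instance like this is trivial for any solver; the hardware question is how efficiently the computation scales once the problem has millions of variables, which is where the article's energy-efficiency claims apply.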
Currently, the AOC remains a prototype. Future iterations are planned with increased micro-LED capacity, aiming to enable computation with millions or even billions of variables. This technology could revolutionize fields requiring massive computational power, including AI, medical imaging, and financial modeling, by significantly reducing energy consumption and processing times.