Meet the chip that uses light to power AI

A team of engineers from the University of Pennsylvania has just presented a technology that could set a new course for the future of computing. Their creation, a chip that uses light to perform artificial intelligence computations, could significantly increase data processing speeds while reducing energy consumption.

The new chip also promises increased privacy for users

At the heart of this innovation is the field of silicon photonics (SiPh), which combines silicon, the common and cost-effective material at the core of conventional computer chips, with the extraordinary ability to manipulate light. The work builds on pioneering research by University of Pennsylvania professor Nader Engheta, who has explored how materials can be engineered at the nanoscale to perform complex mathematical operations using light waves.

This chip is not just another incremental advance; it represents a potential paradigm shift. Traditional computing chips, for all their progress, remain tied to architectural concepts established in the 1960s. The new chip operates on a completely different principle, performing computations with light itself rather than with electrical signals alone.

The development of this chip was a collaborative effort, with significant contributions from Firooz Aflatouni, an associate professor of electrical and systems engineering at the University of Pennsylvania. The team focused on ensuring that the chip could perform vector-matrix multiplication, the core operation of the neural networks that power the artificial intelligence applications increasingly woven into our daily lives.
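To see why vector-matrix multiplication matters so much, here is a minimal sketch of a single fully connected neural-network layer in NumPy. The layer sizes, weights, and the `dense_layer` helper are illustrative assumptions, not details from the Penn team's design; the point is that the `x @ W` step, which dominates the cost of running a neural network, is exactly the operation a photonic chip can carry out with light.

```python
import numpy as np

def dense_layer(x, W, b):
    """One fully connected layer: vector-matrix multiply, then ReLU.

    The x @ W product is the workload a photonic accelerator targets;
    the bias add and activation are cheap by comparison.
    """
    z = x @ W + b            # vector-matrix multiplication (the expensive step)
    return np.maximum(z, 0)  # ReLU nonlinearity

# Illustrative (hypothetical) sizes: a 4-element input mapped to 3 outputs.
rng = np.random.default_rng(0)
x = rng.normal(size=4)       # input vector
W = rng.normal(size=(4, 3))  # weight matrix learned during training
b = np.zeros(3)              # bias vector

y = dense_layer(x, W, b)
print(y.shape)  # (3,)
```

Stacking many such layers, each dominated by one vector-matrix product, is what makes an accelerator for this single operation so valuable.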

One of the chip's ingenious features is its design, which works its computing magic by varying the thickness of the silicon in specific regions of the wafer. These height variations scatter light in precise patterns, so the mathematical operations are carried out as the light propagates through the chip.

In addition to lightning-fast computation and reduced power consumption, the chip also promises improved privacy. Because it can perform many calculations in parallel without storing intermediate data in a computer's RAM, future computers built on this technology could be nearly impossible to hack. For now, we will have to wait for the chip to make its way into everyday devices, and one thing is certain: that could take a long time.
