Optical chips?

Will there ever be a chip that uses light instead of electrical current? Is it even possible?

The idea of using light instead of electricity to power computer chips is moving from science fiction to reality. Photonic chips, or optical processors, transmit information using photons rather than electrons. Because photons travel at the speed of light and produce virtually no heat, they promise enormous gains in speed and energy efficiency. Instead of metal wires, these chips use waveguides to steer light, modulators to encode data, and photodetectors to read signals.
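As a toy illustration of that signal path (the power, loss, and threshold figures below are made-up, not real device parameters), here is how the modulator / waveguide / photodetector chain maps onto a few lines of Python:

# Toy sketch of the signal path described above: a modulator encodes bits
# as light intensity (on-off keying), a waveguide carries the light with
# some loss, and a photodetector turns the received power back into bits.
# All numbers are illustrative only.

LASER_POWER_MW = 1.0        # optical power entering the modulator (assumed)
WAVEGUIDE_LOSS_DB = 3.0     # assumed total propagation loss
DETECT_THRESHOLD_MW = 0.1   # assumed decision threshold at the photodetector

def modulate(bits, power_mw=LASER_POWER_MW):
    """Encode each bit as light on (power) or light off (0)."""
    return [power_mw if b else 0.0 for b in bits]

def propagate(powers_mw, loss_db=WAVEGUIDE_LOSS_DB):
    """Attenuate the optical signal by a fixed loss in dB."""
    scale = 10 ** (-loss_db / 10)
    return [p * scale for p in powers_mw]

def detect(powers_mw, threshold_mw=DETECT_THRESHOLD_MW):
    """Photodetector plus comparator: power above threshold reads as 1."""
    return [1 if p > threshold_mw else 0 for p in powers_mw]

bits_in = [1, 0, 1, 1, 0, 0, 1]
bits_out = detect(propagate(modulate(bits_in)))
print(bits_in == bits_out)  # True: the link recovered the data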

Companies such as Intel, IBM, Ayar Labs, and Lightmatter are pioneering silicon photonics—integrating optical components with traditional transistors. These hybrid systems are already used in data centers to accelerate communication and reduce power consumption. Researchers are also experimenting with purely optical logic systems capable of performing computations through light interference, potentially transforming AI and supercomputing.

 
Between optical chips and large-scale chips like the Cerebras wafer-scale chip, could this change the whole game as far as the size and location of data centers, and even bring that power to offices and the home, cutting the demand for data centers by a significant amount in the future?
 

It is certainly possible. I remember during my undergraduate work I learned to program in LISP, which was billed as an AI language. I thought for sure AI would change the world. 45 years later, I am right!
 
The (silicon photonics) optical chips are used for data transport -- and switching, in Google's case -- but not for processing the data; that is still all-electronic, and there's no obvious route to changing this.

The data center game -- and the demand for optical transport and silicon photonics -- is changing because the massive AI models need data centers which are simply too big and power-hungry to build in one place. So the AI processing has to be distributed over a number of smaller (but still massive!) data centers spread across a local region (typically 10-100 km apart) which need to behave like one big one as far as AI is concerned -- this is what is being referred to as "scale-across". And what it needs is a monumentally huge data bandwidth between these data centers, far bigger than for any normal networking application.
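To put rough numbers on those distances: assuming a typical fiber refractive index of about 1.468 (and remembering that real routes are longer than the straight-line distance), a quick back-of-the-envelope sketch gives roughly 5 microseconds of one-way latency per kilometer of fiber, so even sites 100 km apart are only about a millisecond away round trip -- close enough to stitch together, provided the bandwidth between them is enormous.

# Back-of-the-envelope check on the "scale-across" distances mentioned above:
# how much latency does 10-100 km of fiber add between data centers?
# Assumes a typical fiber index of ~1.468; real routes are longer than the
# straight-line distance, so treat these as lower bounds.

C_KM_PER_S = 299_792.458              # speed of light in vacuum
FIBER_INDEX = 1.468                   # assumed effective index of standard fiber
V_FIBER = C_KM_PER_S / FIBER_INDEX    # ~204,000 km/s in glass

for km in (10, 50, 100):
    one_way_us = km / V_FIBER * 1e6
    print(f"{km:>4} km: one-way ~{one_way_us:,.0f} us, round trip ~{2 * one_way_us:,.0f} us")

# ~5 us per km one way: at 100 km the round trip is still only ~1 ms,
# which is why nearby sites can plausibly behave like one big machine
# as long as the inter-site bandwidth is there.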

Distributing it further into homes and offices simply doesn't work -- this is the equivalent of "edge computing", there's a need for this but it's nowhere near the AI data center demand.
 
Many of these optical connectivity technologies are evolving at ever-increasing speeds, and like everything else they are labeled AI now. In this case they absolutely are key components of AI infrastructure, or the AI foundry. To me, the other two are XPUs and memory.

Optical computing is moving from the idea/research phase toward production. The last hurdle is optical memory, and there are people working on that already.
 
In order to answer this properly, it's important to understand what transistors *really* do in the IT industry. Transistors (MOSFETs) are switching devices, and there are two known facts:

- If we have small unit devices which can control binary signals (0 and 1), then it is possible to build processors (CPUs, GPUs) and memories (SRAM to DRAM), which are the fundamental components of a computer.
- Transistors allow us to control current (the conductivity of the channel) with a voltage.

So transistors are perfect building blocks for processors and memories: we can control the signal (current) by changing the gate voltage. Any device with a similar property can be used as a building block, as long as it allows us to control signals. Vacuum tubes have the same property; they use heat to control the signal (current), but they are big and slow. And for now, we don't have any small unit device which uses light as the signal carrier. This is the reason why we don't have light-based computers. We need super-fast, small unit devices with signal-control capability.
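To make that "switches compose into computers" point concrete, here is a minimal Python sketch in which the transistor is reduced to a Boolean switch abstraction. NAND alone is functionally complete, so once you have a controllable switch you can build every other gate and, from those, adders and eventually processors. It says nothing about how an equivalent optical switch would be built.

# Minimal sketch: a MOSFET-like device reduced to "one signal switches another".
def nand(a: int, b: int) -> int:
    # In CMOS this is two series pull-down switches and two parallel pull-up
    # switches; abstractly, the output is low only when both inputs are high.
    return 0 if (a and b) else 1

# NAND is functionally complete: every other gate can be built from it.
def inv(a):      return nand(a, a)
def and_(a, b):  return inv(nand(a, b))
def or_(a, b):   return nand(inv(a), inv(b))
def xor_(a, b):  return and_(or_(a, b), nand(a, b))

# A one-bit full adder built purely from switches composed into NANDs:
def full_adder(a, b, cin):
    s = xor_(xor_(a, b), cin)
    cout = or_(and_(a, b), and_(cin, xor_(a, b)))
    return s, cout

print(full_adder(1, 1, 1))  # (1, 1): 1+1+1 = 3 = binary 11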

Of course, light can be used in the semiconductor industry. It can be used to transfer signals far away (chip-to-chip communication), but those light signals need to be converted back to current in the end so that transistors can understand what the signals mean. Or it can be used like what Jim351 said above, but that is more like replacing the analog part of the chip (where non-transistor devices do many, many jobs) with optics.
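A small sketch of that conversion step (the responsivity, gain, and threshold below are assumed, illustrative values, not taken from any real part): a photodiode turns received optical power into a photocurrent, an amplifier turns that current into a voltage, and only then can ordinary transistor logic treat it as a 0 or 1.

RESPONSIVITY_A_PER_W = 1.0   # assumed photodiode responsivity
TIA_GAIN_OHMS = 5_000.0      # assumed transimpedance amplifier gain
LOGIC_THRESHOLD_V = 0.5      # assumed comparator threshold

def optical_to_bit(power_w: float) -> int:
    current_a = RESPONSIVITY_A_PER_W * power_w   # light -> photocurrent
    voltage_v = current_a * TIA_GAIN_OHMS        # current -> voltage
    return 1 if voltage_v > LOGIC_THRESHOLD_V else 0

print(optical_to_bit(0.5e-3))   # 0.5 mW received -> 1
print(optical_to_bit(0.01e-3))  # 10 uW received  -> 0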
 