
All-optical computing surfaces again!

Warning.

There is NO opportunity for large-scale integration, the MOST IMPORTANT ASPECT in computing. This is because the de Broglie wavelength of the information carriers, typically 1.5 µm, is so HUGE.




There are some factors to consider here. For visible light, yes, the smallest feature probably won't be smaller than several hundred nanometers. However, optical computing comes with several major advantages over traditional electrical circuits.

The first is that light beams can cross paths without interfering with each other, allowing a level of parallelism and density of signal paths without concern for crosstalk, interference, or shorting. Additionally, the information density of an optical signal is vastly higher than that of an electrical signal, and multiple optical signals can share the same pathway simultaneously. Also, energy usage is greatly reduced, so the constraints imposed by waste heat are much looser.

Having said all that, the idea of optical circuits at VLSI scale is still a very foreign and exotic concept for us, so it's hard to say how far we could take it if we invested at the level we have for electrical ICs. It's naive, though, to dismiss it as infeasible based on an oversimplified view of feature-size limitations.


Unless computation can be accomplished directly on differing, overlapping frequencies, WDM devices, which resemble train switchyards, are huge: on the order of millimeters. So optical computing would have to take place at multi-THz switching rates using very short pulses.


As I understand it, microring resonators can be very small, µm rather than mm.


µm is huge.


Since you can carry 100+ frequencies around 1500 nm, the feature-size per data stream is more like 15 nm/stream, which is closer to electronics, and the heat dissipation of optical waveguides may be much lower than with equivalent electronics. Photonics also have the advantage of better signal propagation over longer distances, for efficient interconnection of multiple devices.

So there is a likely opportunity for large-scale integrated photonics, as long as you have enough parallelism.
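
A quick back-of-envelope sketch of the 15 nm/stream figure above; the waveguide pitch and channel count are just the assumptions stated in the comment, not measured values:

```python
# Back-of-envelope: effective feature size per WDM data stream.
# Both numbers are assumptions from the comment above, not measurements.
waveguide_pitch_nm = 1500   # physical scale of one optical waveguide (~1500 nm)
wdm_channels = 100          # independent wavelength channels sharing that waveguide

feature_size_per_stream_nm = waveguide_pitch_nm / wdm_channels
print(f"{feature_size_per_stream_nm:.0f} nm per data stream")  # -> 15 nm
```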


Referring to computing, not data transmission. Perhaps some day, using quantum computing, all those differing 'portly' photons, superimposed, could yield advances in density. But mixing different wavelength streams on-chip involves WDM devices, which are huge.


Optics/photonics can potentially perform analog as well as digital computation. One trendy thing at the moment is accelerators for neural networks.

Some potential benefits of optics include high data rate, parallel processing of multiple streams, transmission over longer distances, and lower heat dissipation.


Good point.


> There is NO opportunity for large scale integration

What about vertically?

CMOS logic is still "mostly planar". With sufficiently low heat dissipation, you could make a cube and easily overcome the planar density problem.

The main challenge seems like it would be lithography cost for each of the many layers, but if the minimum feature size is 1.5 µm, there might be a clever way to make this work cheaply (DLP projection + gradual extrusion?).


Hundreds of millions of dollars have been raised from naive investors by ignoring this fact. Often, board-member and founder physics PhDs aid in the deception by omission.


Why can’t we reduce the wavelength?


Shorter-wavelength light really doesn't like existing. It takes more energy to produce, there are far fewer possible materials to make mirrors out of, etc. Just look at how much trouble the industry had with EUV lithography.


Even UV wavelengths aren't terribly small, and the shorter the wavelength, the more energy it has and the more likely it is to destroy whatever material your optical CPU is made of.


Sounds like an engineering problem, not a fundamental one.


Not sure what gives you that idea. It seems unlikely that there are materials that can withstand billions of X-ray pulses per second and continue to function without being altered. They might exist, but the higher the energy needed to reach short wavelengths for fast, information-dense computing, the more implausible it becomes that a suitable material is physically possible.


I am shocked, shocked that quantum computing researchers might have figured out this trick too.


Not sure what this means. De Broglie waves are defined for matter (mass is required). While photons have relativistic mass, this isn't the same thing.


The de Broglie wavelength of a photon IS its wavelength. That's why it can't squeeze into nm features, and optical waveguides are still not used on-chip after 35 years of effort.
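
A quick numerical check of this claim, using standard constants; the 1550 nm figure is the telecom wavelength discussed upthread:

```python
# For a photon, momentum p = E/c and E = h*c/lambda, so the de Broglie
# relation lambda_dB = h/p simply returns the optical wavelength.
h = 6.626e-34                 # Planck constant, J*s
c = 2.998e8                   # speed of light, m/s
optical_wavelength = 1.55e-6  # 1550 nm telecom light, in meters

photon_energy = h * c / optical_wavelength   # ~1.28e-19 J (~0.8 eV)
photon_momentum = photon_energy / c          # p = E/c for a massless particle
de_broglie_wavelength = h / photon_momentum
print(de_broglie_wavelength)                 # ~1.55e-6 m, same as the optical wavelength
```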


"Information carrier" means the actual medium the light is travelling through, doesn't it? Which has to be matter of some sort.


Last time I checked, the sun transfers its light to us through the vacuum of space.


Sorry, I must've missed that these optical CPUs contain vacuums of space for the light to travel through.


Well, there is no matter in a strict vacuum, obviously, so they are hard to see.


Indeed, as orlp mentioned, light is self-propagating and does not require a medium. This is broadly true for all EM waves.


But it needs SPACE, on the order of a few µm minimum, that cannot be occupied by other devices.


We're talking about a CPU, not light traveling in a straight line forever.


Why isn't the EM field the medium? Or even spacetime?


You are confusing the geometry with the excitation traveling through that geometry.


If I were designing one of these things, my goal would not be to replace present-day computers, which at this point is nearly impossible given the millions of man-hours spent optimizing them, but to carve out a niche where you outperform them in specific tasks. I have the vague impression that should be possible.


Plasmonics may solve this problem. The interaction of light at an interface can lead to what essentially amounts to photon confinement. This allows for what's called near-field optics, which overcomes the limitations of wavelength and unlocks nanometer-scale optoelectronics. For an example, see the solar sail for the "Starshot" project.


That's why most optical computers lean into quantum computing.


Exponentially more powerful.

But unfortunately, for small N, like the N = 2 bits here, the additional complexity of pure optical + quantum computing just doesn’t pay off!


Right. Modern computers don't know what their purpose is; that's what makes it such a mind-boggling challenge. Nothing is ever good enough. If you define the purpose, it can be better and much cheaper.


Please elaborate?


Current electron-based computers have transistors measured in tens of nanometers. The optical equivalent of a transistor cannot be smaller than 1 µm. An optical CPU equivalent to the one in your smartphone would be the size of several football fields.
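
A rough sketch of the area scaling behind this comparison; the pitches and transistor count below are illustrative assumptions, and practical optical elements (rings, couplers, WDM structures) are typically much larger than the 1 µm minimum feature, which pushes the result further toward the "football fields" end:

```python
# Illustrative area scaling if each transistor were replaced 1:1 by an optical
# element. All numbers are assumptions for the sketch, not measured values.
electronic_pitch_m = 50e-9   # rough effective per-transistor pitch on a modern SoC
optical_pitch_m = 1e-6       # assumed minimum pitch of an optical element
transistors = 15e9           # order of magnitude for a flagship smartphone SoC

area_ratio = (optical_pitch_m / electronic_pitch_m) ** 2   # 400x more area per device
optical_area_m2 = transistors * optical_pitch_m ** 2       # ~0.015 m^2 at a 1 µm pitch
print(f"area ratio {area_ratio:.0f}x, optical die ~{optical_area_m2:.3f} m^2")
```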


Asking from a position of total ignorance: the energy savings mean you can increase clock speeds, right? Assuming a big enough jump, won't that relieve a CPU of the need for most specialised instruction sets, and potentially also for that many cores? In that case, wouldn't it be acceptable for transistors to grow (back) in size?


Energy savings on what basis?

If your gate gets 50x50x50 times bigger, you need some pretty extreme savings per area/volume of circuit if you want to reduce the per-gate usage. Can they save that much?
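
Just to make the arithmetic in that question explicit (the 50x linear scale factor is taken from the comment above, not from any measurement):

```python
# If the gate is 50x larger in each dimension, how much lower must power
# density be just to match electronic per-gate power? Pure arithmetic sketch.
linear_scale = 50                  # optical gate assumed 50x larger per dimension
area_ratio = linear_scale ** 2     # 2,500x more area per gate
volume_ratio = linear_scale ** 3   # 125,000x more volume per gate

print(f"power density must drop {area_ratio}x (planar) or {volume_ratio}x (3D) to break even per gate")
```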


I would think yes, which would be something. Huge rooms of enormous optical computers running lightning-fast on low power would have a kind of retro-future feel.

Light would reduce the time cost of distance and increase the density of connections (optical signals can pass through each other), so this could actually work.


> The optical equivalent of a transistor cannot be smaller than 1 µm

That's for classical optics. There are superlens optics, which use metamaterials and a monochromatic light source and can "see" features much smaller than the wavelength.


While this is true, doesn't it ignore the difference in clock-rate capacity? What if the photonic CPU can run at 10,000x the clock rate without the extreme heat build-up that would melt a smartphone?
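
A hedged sketch of how those two hypotheticals trade off; both ratios come from this thread (the 10,000x clock figure above and the 50x linear size figure upthread), not from real hardware:

```python
# Does a large clock-rate advantage compensate for much larger gates?
# Both ratios are hypotheticals taken from this thread, not measurements.
clock_speedup = 10_000          # hypothetical photonic clock-rate advantage
gate_area_penalty = 50 ** 2     # 50x linear -> 2,500x area per gate (from upthread)

throughput_per_area_ratio = clock_speedup / gate_area_penalty
print(f"raw ops/s per unit area vs. electronics: {throughput_per_area_ratio:.1f}x")  # 4.0x
```

Under those (generous) assumptions, raw operations per second per unit area roughly break even, so the argument hinges on whether multi-THz switching and the implied power density are actually achievable.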


Isn't part of the point that you don't need as many transistor equivalents because you can run them thousands of times faster?


One potential of optics is terahertz frequencies.


Does optical computing need to use visible light?



