Engineers demo first processor that uses light for ultrafast communications (news.berkeley.edu)
120 points by signa11 on Dec 31, 2015 | hide | past | favorite | 24 comments



> The photonic I/O on the chip is also energy-efficient, using only 1.3 picojoules per bit, equivalent to consuming 1.3 watts of power to transmit a terabit of data per second. In the experiments, the data was sent to a receiver 10 meters away and back.

They mentioned this, but didn't compare it to classic electrical chips.

Does anyone know what we have now?
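(For what it's worth, the article's conversion checks out: energy per bit times bit rate gives power.)

```python
# Sanity check of the quoted figure: 1.3 pJ/bit at 1 Tb/s.
energy_per_bit = 1.3e-12  # joules per bit (1.3 pJ, from the article)
bit_rate = 1e12           # bits per second (1 terabit/s)

power_watts = energy_per_bit * bit_rate
print(power_watts)  # ~1.3 W, matching the article
```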


> Does anyone know what we have now?

I can't quote exact figures but to give a ballpark figure, we're talking about ~5 picojoules per bit at a distance of less than 10 millimeters. This is on a modern (20 nm) SoC.

So we're talking about a pretty significant improvement in both power consumption and communication distance. On a modern chip running at 2-4 GHz, a signal takes several clock cycles to travel from one corner of the die to the other. That's becoming a major hurdle in chip design.
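A rough back-of-envelope (all numbers below are my own illustrative assumptions, not from the article) shows why: even the vacuum speed-of-light bound is tens of millimeters per cycle, but RC-limited, heavily buffered on-chip wires are far slower, so crossing a large die really does take multiple cycles.

```python
c = 3e8          # speed of light in vacuum, m/s
clock_hz = 4e9   # assumed 4 GHz clock
die_mm = 20      # assumed ~20 mm die edge

period_s = 1 / clock_hz                  # 250 ps per cycle
light_mm_per_cycle = c * period_s * 1e3  # ~75 mm per cycle in vacuum

# On-chip wires are RC-limited and buffered; their effective signal speed
# is often an order of magnitude or more below c. 0.05c is an assumption
# chosen only to illustrate the order of magnitude.
effective_fraction = 0.05
cycles_to_cross = die_mm / (light_mm_per_cycle * effective_fraction)
print(round(cycles_to_cross, 1))  # several cycles under these assumptions
```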


Does this imply that it will be possible to increase the clock speed 5/1.3 ≈ 3.8 times and still be able to cool the CPU with basic air-cooling?


No, I don't think so. I don't think that light can be used to replace all (metal) conductors on a silicon chip, at least in the near term.

However, this technology has the potential to bring performance gains to on-chip memories such as L1 and L2 caches, as well as software-configurable scratchpad memory. Other possibilities that spring to mind are faster core-to-core communication, and bundling more cores on the same die, since faster communication makes physically larger dies possible.

disclaimer: I work in the semiconductor industry, but this is not my area of expertise, so my speculation is as good as anyone else's.


No it doesn't. This does not affect clock frequency on a single CPU. This would improve bandwidth and latency in I/O with off-chip circuitry such as memory or co-processors (like GPUs).


Why would it improve latency? At most it would halve it; i.e. free-space propagation vs. traces buried in FR4.
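That "at most half" bound follows directly from the dielectric: signal speed in FR4 is c/√εr, and with εr ≈ 4.3 (a typical figure for FR4, my assumption) free space can only be about 2× faster.

```python
import math

c = 3e8        # m/s, free-space propagation
er = 4.3       # assumed relative permittivity of FR4

v_fr4 = c / math.sqrt(er)  # ~1.45e8 m/s in FR4 traces
speedup = c / v_fr4        # = sqrt(er)
print(round(speedup, 2))   # roughly 2x at best
```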


It wouldn't and you're correct. I don't know why I wrote that. Sorry!


Modern 12.5 and 28 Gb/s SerDes run around 15 to 20 picojoules per bit when used for something like PCIe 3 over standard FR4 PCB traces, while longer distances (like the linked work) are in the 30 pJ/bit range. 1.3 picojoules/bit is a huge advancement, as the only things that get that low work over very short distances (I am working with a brand-new SerDes for multi-chip-module communication that achieves an amazing 1 pJ/bit, but only up to ~24 mm).
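Handy equivalence: pJ/bit maps directly onto watts per terabit/s (1 pJ/bit = 1 W at 1 Tb/s), so the figures quoted in this thread line up like this (labels are my shorthand for the cases above):

```python
# pJ/bit figures as quoted in this thread
links_pj_per_bit = {
    "photonic demo (10 m)":    1.3,
    "on-die SoC (<10 mm)":     5.0,
    "PCIe 3 over FR4":         15.0,
    "long-reach electrical":   30.0,
}

for name, pj in links_pj_per_bit.items():
    watts_at_1tbps = pj * 1e-12 * 1e12  # J/bit * bits/s -> watts
    print(f"{name}: {watts_at_1tbps:.1f} W per Tb/s")
```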


It's difficult to compare: for optical you have to add the power usage of the laser itself, and for the electrical chips you may need repeaters (as stated in the article), depending on the length and bandwidth used.


Does anyone have a link to the actual paper?

Key point which people may get confused on: this is light for inter-chip communication, not light-based computation. The achievement is miniaturizing the transceivers, making them with a standard-ish IC process (Germanium isn't entirely standard, but very useful), and presumably finding some way to connect optical fiber in the same way that gold wires are currently bonded to IC pads.

There are probably a lot of sub-advancements in photonics involved as well.



Doesn't seem like a non-paywalled version is easily accessible.


At the 8th ISP and Carrier Lunch in Frankfurt in 2013, I heard a talk from CompassEOS about optical chips for a new generation of routers. In my opinion it's kind of clickbait to call this the "first" processor using light to improve connection speed when I held similar technology in my hands back in 2013.


Two questions:

1. Had they demonstrated the optical processing chips could be manufactured with existing processes by building a working chip?

2. Did the chip(s) in question only route networking signals or were they capable of more general purpose computing?

There's been plenty of research into integrating optical interconnects into a silicon chip, but this is the first general-purpose chip I know of which has done so with existing manufacturing methods, and to me both of those are important milestones.

EDIT: Looks like IBM beat the Berkeley team; it appears they got something very similar working in 2012:

http://researcher.watson.ibm.com/researcher/view_group.php?i...


> 1. Had they demonstrated the optical processing chips could be manufactured with existing processes by building a working chip?

They had a working setup for the demonstration, and we were allowed to hold the chips. Besides, you wouldn't invite the biggest German networking companies to lunch without having something to sell. So, basically, yes.

> 2. Did the chip(s) in question only route networking signals or were they capable of more general purpose computing?

Those chips were limited to routing networking signals chip-to-chip or chip-to-backplane. If I remember right, CompassEOS chose networking to get production-ready as fast as possible, probably also in the interest of their investors, who were mainly network-focused (e.g. Cisco, Comcast and T-Venture).

Edit: So Berkeley is again selling already-developed innovations as its own accomplishment: “This is a milestone. It’s the first processor that can use light to communicate with the external world,”


I see, thanks for the info about the Compass chip.

As for Berkeley, yes it appears their PR department is telling lies. Even though it's promising work, I do wish they could've announced it in a different way. Silicon photonics is an interesting field, I don't need to be told about world firsts to find it interesting.


It's super interesting, and in my opinion the von Neumann architecture's last gasp is going to be a combination of graphene and photonics. Of course not in the next 10 years, but let's hope for the next 30.


It may be hard to correctly anticipate what impact this breakthrough will have, but I get the sense that this, combined with quantum computing, could change our future pretty quickly. I feel excited about it and frightened at the same time...


This breakthrough isn't really anything that spectacular in my opinion. Sure they've integrated photonic communication with a standard CMOS process, but the advantage here is mainly bandwidth and power over existing processes.

I imagine integrating Hybrid Memory Cube or High Bandwidth Memory with a processor die (2.5D or true 3D) would likely give you higher bandwidth at lower TOTAL* power and lower latency.

* In a lot of these photonics papers they conveniently forget to mention the power it takes to shine the lasers, and they only focus on the joules/bit number required in the communication. I suspect if you added up the entire power it would be higher overall than an electronic circuit.
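As a hedged illustration of that point (every number below is my own assumption, not from the paper): a laser's wall-plug power is roughly fixed, so it dominates the per-bit budget at low line rates and gets amortized away at high ones.

```python
link_pj_per_bit = 1.3    # the article's quoted link figure
laser_optical_mw = 1.0   # assumed laser optical output power
wall_plug_eff = 0.10     # assumed 10% electrical-to-optical efficiency

# Electrical power actually drawn by the laser: 1 mW / 10% = 10 mW
laser_electrical_w = laser_optical_mw * 1e-3 / wall_plug_eff

for gbps in (1, 10, 100):
    bitrate = gbps * 1e9
    # Fixed laser power amortized over the bit rate, converted to pJ/bit
    laser_pj_per_bit = laser_electrical_w / bitrate * 1e12
    total = link_pj_per_bit + laser_pj_per_bit
    print(f"{gbps} Gb/s: {total:.1f} pJ/bit total")
```

Under these made-up numbers the laser adds 10 pJ/bit at 1 Gb/s but only 0.1 pJ/bit at 100 Gb/s, so whether the optical link wins overall depends heavily on the operating point.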


Looks like they're pretty unrelated accomplishments; this enables faster interchip communication, while quantum computers enable speedup of a very specific subset of problems if you can get enough qubits within a chip to play nice.


Communicating between quantum computers will probably involve photons. They're pretty good at traveling far and fast while staying coherent.

... Not good enough that we won't need to use a quantum error correcting code and repeaters that pump out the decoherence errors along the way, but still important.


This seems to enable higher bandwidth inside a box at lower cost. Quantum computers offer exponential speedups for very specific types of problems, and with huge constant costs. They seem rather orthogonal to me.


You may be interested in HP's research into 'The Machine'. I disagree with some of the use cases being proposed, but it's certainly possible that the technologies involved could provide big steps forward for computing power.

http://arstechnica.com/information-technology/2014/06/hp-lab...


Could anyone offer some insight into how this would help sensors in the future (apart from lower power consumption)?

Reference from the article - "Further down the road, this research could be used in applications such as LIDAR, the light radar technology used to guide self-driving vehicles and the eyes of a robot; brain ultrasound imaging; and new environmental biosensors. "



