I believe that LIDAR will be to silicon photonics what the accelerometer was to MEMS.
The real challenge that the article only touched upon is to get lasers into the same package and keep the costs down. This is still an active area of pursuit in both research and industry--though, there are several promising methods emerging. The $10 cost used in the article is likely closer to the cost of the bare silicon die. Packaging is always the expensive part of optics (doubly so if the lasers are not monolithically or heterogeneously integrated onto the same die). That being said, even with today's technology, integrating a laser chip and a silicon photonics chip into a package is easily south of $1k, which is what they quoted competing technologies costing.
I look forward to seeing these sensors integrated into my self-driving car in 5-10 years.
From this part of the article, the team seems fairly confident in their ability to integrate the laser onto the same die as the waveguides. This seems very plausible to me, given how standard this integration is in communications applications (e.g., fiber-optic transceivers).
> Our device is a 0.5 mm x 6 mm silicon photonic chip with steerable transmitting and receiving phased arrays and on-chip germanium photodetectors. The laser itself is not part of these particular chips, but our group and others have demonstrated on-chip lasers that can be integrated in the future.
The Watts group at MIT has done some fantastic work on rare-earth-doped silicon waveguides to produce lasers on a silicon platform [0]. However, I believe this work is still very much in the research stage. I'm not convinced their method is scalable to production for this $10, million-unit-per-year LIDAR application, since erbium-doped waveguide lasers still require an off-chip pump laser source.
Even at 2 meter range with centimeter resolution, these devices would be a much better solution to the "local obstacle" problem than ultrasonics today. Mobile platforms moving around in spaces with a lot of miscellaneous obstacles have to either be compliant (or padded) enough to just push through them or slow enough to detect them and move around them.
For those curious about performance: "At the moment, our on-chip lidar system can detect objects at ranges of up to 2 meters, though we hope to achieve a 10-meter range within a year. The minimum range is around 5 centimeters. We have demonstrated centimeter longitudinal resolution and expect 3-cm lateral resolution at 2 meters. There is a clear development path towards lidar on a chip technology that can reach 100 meters, with the possibility of going even farther."
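A quick back-of-envelope on what those figures imply (my own arithmetic, not from the article): 3 cm lateral resolution at 2 m corresponds to an angular resolution of roughly 0.86°.

```python
import math

# Angular resolution implied by 3 cm lateral resolution at 2 m range
# (resolution and range figures are from the article; the conversion
# to an angle is my own back-of-envelope arithmetic).
lateral_res_m = 0.03
range_m = 2.0

angle_rad = math.atan2(lateral_res_m, range_m)
angle_deg = math.degrees(angle_rad)
print(f"Implied angular resolution: {angle_deg:.2f} degrees")

# At their hoped-for 100 m range, the same angular resolution would
# give a lateral spot of about 1.5 m -- so the optics will need to
# improve along with the range.
print(f"Lateral spot at 100 m: {100 * math.tan(angle_rad):.2f} m")
```

That suggests range isn't the only number that has to scale for the 100 m automotive case.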
This is incredible work. The low price point should create a burgeoning open source / hacker dev community, which, in turn, will hopefully greatly accelerate 3D perception algorithms.
My recent EE capstone project was in the area of autonomous vehicle SLAM perception and control, but we were limited by cost to using a PrimeSense-based Kinect for 3D perception, which had pretty lacklustre resolution. This chip-based LiDAR would have been warmly welcomed at the time. Regardless, I'm glad I'll be able to revamp the project when these hit the market! The more sensors, the better!
Quanergy has yet to ship any production units. They've also been delaying for years. I have no doubt that they are building solid state lidar, but their tech isn't really "here" yet.
I was wondering what happened to those guys. They announced another new model, but haven't shipped the first one yet. As of May 2016, they were talking 2017. Quanergy calls itself "The leading provider of LiDAR sensors" without having shipped a product. Bad sign.
It's definitely possible, because ASC has been shipping such devices for years, but at a much higher price point. Somebody is going to get this right soon, but maybe not Quanergy. Those spinning Velodyne things have got to go. Too clunky, and too expensive.
They've got a normal lidar unit they are shipping, or at least offering dev kits for, now (similar to the Velodyne Puck but at a lower price point and with fewer scanning lines). So they are at least doing something in terms of real products.
From what I've managed to gather from poking around a bit, I'd guess they're using a very fancy waveguide-based sensor similar to this, but with a simpler emitter as a way to save cost on their first-generation solid state product.
I've heard 2017 as well, but would also be curious to see any actual progress.
We had one of the mechanical Quanergy sensors at my last job, and I worked on integrating it with ROS and doing some mapping with it. It was slow, not well constructed (it kind of vibrated when rotating), and didn't return intensity information (though it's possible a software update could change that).
At $8000 (later dropped to ~$5600), it was poor value. I've never used a velodyne, but I looked over the marketing materials for the VLP-16 puck and it seems like a much more well thought out product. This makes sense as Quanergy is only selling mechanical LIDARs as a bridge, but the fact remains that they aren't very good.
I have a lidar question that maybe HN can answer. So this work is super cool and in the future I'm sure we'll have lots of lidar-based devices driving around. But if we do, won't they interfere with each other and thus render the lidars very inaccurate?
I think not, because they rely on sending laser light in a specific direction, but also on receiving it from the same direction. The chances that another LIDAR system is looking at the exact same point at the exact same time are really low.
At worst it would cause some outliers that you can filter.
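As a rough sketch of what that filtering could look like (hypothetical readings and a simple local-median filter of my own devising, not from any real lidar driver):

```python
from statistics import median

def reject_outliers(ranges, window=5, max_dev=0.5):
    """Replace range readings that deviate more than max_dev meters
    from their local median -- e.g. a spurious return caused by
    another lidar's pulse -- with that median."""
    filtered = []
    n = len(ranges)
    for i in range(n):
        lo = max(0, i - window // 2)
        hi = min(n, i + window // 2 + 1)
        m = median(ranges[lo:hi])
        filtered.append(m if abs(ranges[i] - m) > max_dev else ranges[i])
    return filtered

# A stray 9.99 m reading amid ~2 m returns gets replaced:
print(reject_outliers([2.0, 2.01, 9.99, 2.02, 2.0]))
```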
There are easy ways to protect a LIDAR against that, but I don't think Velodyne is doing them yet. The duty cycle is low; a ranging operation takes under 1 us, and typical cycle rates are 100 Hz, so each beam is live only about 1/10,000th of the time. A simple denial jammer thus needs far more power than the pulsed laser used for ranging, and it looks like a target at zero range, so you know you're being jammed. Note that jamming must be the same color as the ranging laser; everybody uses narrow-band interference filters that only pass a very narrow color band. That's why this works in sunlight.
To defeat a synchronous jammer, the LIDAR only needs to add some random variation in the transmit timing. Then the jammer won't know when to be on. With some random variation, synchronous jamming looks like noise, rather than a solid false reading.
This new MIT system requires that the received light be coherent (in phase) with the transmitted laser beam. That makes it reasonably immune to anything other than a laser that can sync to a narrow light pulse, and I'm not sure that's even possible.
Even if each ranging operation is fast, they will be scanning it around, so I suspect the effective duty cycle will be higher. Light bounces all over the place, so two sensors could still interfere even if they are measuring two different paths in space. It can probably be resolved with careful, expensive, and coordinated engineering, but I would expect the first-gen sensors to interfere with each other just as PrimeSense cameras do, unless the authors explicitly claim otherwise.
No, but the bigger issue may be from laser jammers[0]. Laser jammers are used to jam police speed lidars to prevent them from obtaining a reading on vehicles. Such devices are legal in most places, because the optical spectrum does not fall under the FCC's control.
IANAE but I would guess that it's the same for laser light as any other EM spectrum comms, in that modulation and some kind of packet system would work in tandem with limited range. It's probably not even as hard to manage as Bluetooth or WiFi, being far more directional as it is.
This is very nice. Phased-array beam steering on both transmit and receive, and coherent detection to ignore any light that's not synced to the transmitter. Now they have to get the range up. If they can get range up to 10m in the near term, that's enough for indoor navigation. That will be useful for VR, AR, robot vacuum cleaners, and similar applications.
I've been expecting the Advanced Scientific Concepts flash LIDAR to be the direction of the future, because it has no moving parts and works in sunlight at range. It just costs too much. But this is potentially even better.
It's a feedback loop, and a very effective one. The stronger the brand is, the more credible your article will seem. The more credible articles, the stronger the brand becomes.
Even if this doesn't have the range for autonomous vehicle hazard avoidance, imagine it being used on regular cars for their blind-spot detectors. No more false alarms on your radar detector.
I saw a demo at SIGGRAPH last week with VR using helmet-LIDAR for model building and geolocating. It was the $8K spinning LIDAR you see on Google cars. Can't wait for the $100 version.
> They also have the potential to be much more robust because of the lack of moving parts, with a non-mechanical beam steering 1,000 times faster than what is currently achieved in mechanical lidar systems.
The LIDAR chips don't use visible light, but the article talks about it as a future project:
> We are also developing visible light phased arrays with applications such as Li-Fi and holography that can be seen by the human eye.
It's a very simple, and very clever mechanism, too. My first guess was that they were going to use piezo mechanisms to squish cavities, but heat is even simpler.
Yes, but your issues will be power output and heat.
My projector is 250 watts and has a ton of cooling due to the light source and is still not all that bright. I can't imagine how difficult it would be to cool a 250 watt light source coming from a .3mm source. Probably impossible for the foreseeable future.
Maybe microfluidic tubes would make it work. But then you have to have watercooling.
It really depends on efficiency, which I don't recall reading about in the article. Or wait, did they say −6 dB loss, I think? That would still be a ton of heat.
Either way, I don't think this will be the breakthrough we need. Lenses are one of the cheaper components in a projector.
It doesn't need anywhere near that much power. Each pulse is only a microsecond long and since it's all focused in a single beam it takes very little power. Even just a 1 W laser can be seen for miles and it can be a big hazard for pilots because people will try to shine them at the cockpit windows. Heat will definitely be manageable.
It isn't just diffused over the entire field of view. It's a lot like how CRTs work with the beam scanning over the whole screen line by line. The lidar only needs a very short pulse of light for each given point so it's a lot like taking a laser range finder and just moving it from point to point. Even though it's a very short pulse, the intensity of the light coming back from whatever it hit is exactly the same as if it was shining on it constantly.
If you took a 1 W laser and scanned it back and forth over a 90° by 90° FOV, it would be impossible to see outside in daylight, but if you could take a snapshot of how it looked at any particular instant, you would see a single bright dot. All it takes is just long enough to trip the photodetector to get the delay between light out and reflection received; any additional time shining the laser at that point is just a waste, so you can scan a massive number of points many times a second.
Fundamentally, a lidar sensor does not need anything higher intensity than what it can detect when it's scanning a single point.
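The energy numbers back this up. A quick check, using the ~1 us pulse, 100 Hz cycle, and 1 W laser figures mentioned upthread (the arithmetic is mine):

```python
# Average power of a pulsed 1 W laser: with 1 us pulses at a 100 Hz
# cycle rate, almost all of the time the laser is off, so the
# average power -- and hence the heat -- is tiny.
peak_power_w = 1.0
pulse_s = 1e-6
cycle_hz = 100

energy_per_pulse_j = peak_power_w * pulse_s   # 1 microjoule per pulse
avg_power_w = energy_per_pulse_j * cycle_hz   # 100 microwatts average
print(f"Energy per pulse: {energy_per_pulse_j * 1e6:.1f} uJ")
print(f"Average power:    {avg_power_w * 1e6:.0f} uW")
```

A 100 uW average dissipation is nothing like the 250 W projector case, which is why heat is manageable for lidar but not for the projector idea.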
1. We are talking about a movie projector. Our eyes are the photodetectors, and they work very differently from CMOS devices.
1.5. FOV does not matter; it's area that does.
2. As I said, it's true that strobed (scanned) lights look brighter to our eyes than continuous lights of the same wattage, but only by a small amount, I think.
I stand by my statement, making a movie projector will be hard with this thing because of heat and power requirements.
It looks like it's small enough to be included in a smartphone. That would be pretty cool for crowdsourced streetview cartography like mapillary. Or 3D scanning objects.
It's still limited to use in the lab, or at least indoors, short-range:
> At the moment, our on-chip lidar system can detect objects at ranges of up to 2 meters, though we hope to achieve a 10-meter range within a year. The minimum range is around 5 centimeters. We have demonstrated centimeter longitudinal resolution and expect 3-cm lateral resolution at 2 meters. There is a clear development path towards lidar on a chip technology that can reach 100 meters, with the possibility of going even farther.
I wonder what the limiting factors are -- laser power, noise, calibration?
Much like Steve Jobs said no one wanted to watch video on a tiny screen until the day he introduced an iPod that could play videos. When LIDAR is cheap enough, Elon will forget all about RADAR and cameras.