
Uber has likely been working on self-driving cars, potentially in partnership with Google. What Tesla has as an advantage, though, is an assembly line that can produce lots of quality cars. In other words, Tesla will likely be ready for the future of transportation before Uber.



Also millions of miles of fleet learning a day, which Uber doesn't have, since they don't own the cars. Self-driving is a supervised learning problem, and he who has the data wins.


It is a supervised learning problem, but that doesn't mean it will be solved by just shoving large amounts of data into a black-box algorithm. This is not Mobileye's approach, and it almost certainly is not Tesla's approach. The data is useless unless it is annotated (e.g. a human labels where the lanes are, where the obstacles are, what the bicyclist is doing, etc.) - that's the bottleneck, not collecting large amounts of raw sensor data + driver actions.
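
To make the dependency on annotation concrete, here's a minimal sketch of a supervised training step - purely illustrative, not Mobileye's or Tesla's actual pipeline; the model, the label format, and the stand-in random tensors are all assumptions:

    # Illustrative only: a supervised step cannot run without labels.
    # The random tensors stand in for real camera frames and for the
    # human annotations (e.g. lane-boundary offsets) discussed above.
    import torch
    import torch.nn as nn

    model = nn.Sequential(nn.Flatten(),
                          nn.Linear(3 * 64 * 64, 128), nn.ReLU(),
                          nn.Linear(128, 4))   # 4 assumed lane-boundary offsets
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)

    frames = torch.randn(32, 3, 64, 64)   # raw sensor data: cheap to collect
    labels = torch.randn(32, 4)           # annotations: the expensive part

    loss = nn.MSELoss()(model(frames), labels)  # no labels -> no loss -> no training
    opt.zero_grad()
    loss.backward()
    opt.step()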


> The data is useless unless it is annotated (e.g. a human labels where the lanes are, where the obstacles are, what the bicyclist is doing, etc.) - that's the bottleneck, not collecting large amounts of raw sensor data + driver actions.

Except Nvidia did exactly that - collected raw sensor data + driver actions with no annotation - and it worked out fine for them: https://arxiv.org/abs/1604.07316


Nope. That car just does lane keeping - it doesn't even do turns or lane changes. This is all stuff that was solved 20 years ago. And even then it only achieves 98% autonomy on lane keeping - a task that needs 100% accuracy. You should not be running into the median every couple of miles.

Furthermore, they augmented the training data with left/right-offset cameras to provide examples of "bad" camera views. These cameras are not present on Tesla's cars (because such sensors are only useful for training purposes).

In fact, the paper actually supports my point. They collected all this data for one task, lane keeping. They subdivided the problem of autonomous driving, and managed to solve one small subproblem (the easiest subproblem of autonomous driving, solved for decades already). They avoided the need for annotators, but only because they used specialized purpose-built cameras to augment the data.


> They collected all this data for one task, lane keeping. They subdivided the problem of autonomous driving, and managed to solve one small subproblem (the easiest subproblem of autonomous driving, solved for decades already). They avoided the need for annotators, but only because they used specialized purpose-built cameras to augment the data.

So why not several autonomous subsystems that use specialized purpose-built cameras and don't need annotators? I'm not saying that like it's easy - obviously it's not. It just seems scalable.


The solution was specific to that subproblem. The left/right-offset cameras served the sole purpose of providing examples of what it would look like if the car were deviating off path. The same trick would not work for other problems. Can you think of similar camera data augmentation tricks for obstacle detection, drivable-path segmentation, bicyclist signaling/intention, pedestrian detection, and so on?
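
For illustration, the trick amounts to roughly this sketch - with the caveat that the correction constant is made up here; the NVIDIA paper derives the adjustment from viewpoint geometry rather than using a fixed offset:

    # Sketch of the offset-camera augmentation described above. A frame
    # from the left-mounted camera looks like the car has drifted left,
    # so it gets paired with "steer right a bit" as its synthetic label -
    # no human annotator needed.
    CORRECTION = 0.25  # illustrative steering offset (positive = steer right)

    def augmented_samples(center_img, left_img, right_img, steering):
        yield center_img, steering               # on-path example
        yield left_img, steering + CORRECTION    # looks drifted left -> steer right
        yield right_img, steering - CORRECTION   # looks drifted right -> steer left

And that synthetic label only makes sense for steering - there's no analogous way to conjure a pedestrian bounding box or a bicyclist's intention out of an offset camera view.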


There's also the matter of that laser scanner on some cars - usually costing more than the car...


Not anymore.

Complete LIDAR units are already available for under $500[1]. There are other, cheaper sensors which aren't as good as full LIDAR but are available for under $100[2].

There are plenty of other options hitting the market soon, too.[3]

[1] http://www.teraranger.com/products/teraranger-lidar/

[2] https://www.pulsedlight3d.com/

[3] https://www.washingtonpost.com/news/innovations/wp/2015/12/0...


It's not quite here yet. I've been following this since the DARPA Grand Challenge, when we had to use a huge SICK LMS just to get a line scanner. There are now several affordable indoor line scanners. Outdoor sunlight-tolerant systems cost more. 3D scanners, which scan in multiple planes, are still expensive. There are some MEMS devices coming along. Flash LIDAR will probably win out in the end, once someone does the sensor IC development to get the price down from $100K.

Back in 2003, I dragged a VC down to see Advanced Scientific Concepts in Santa Barbara. They make the best flash LIDAR. But they were happy being a DoD and aerospace contractor, selling expensive one-offs. The Dragon spacecraft uses an ASC flash LIDAR to dock with the space station. DARPA buys their units. But their price point is around $100K. There's no inherent reason it has to be that expensive, but it takes custom sensor ICs made in small quantities. Last March, Continental AG (German tire/brake/auto parts company) bought the technology from ASC.[1] We'll have to see how that works out. This is the right technology if the price point can be brought down.

[1] http://www.spar3d.com/news/lidar/flash-lidar-company-acquire...



Data isn't that hard to gather. Uber has more robotics scientists than all the others combined.


Funny, given that it started as a GPS-tracking web app.


Even Google (with Boston Dynamics)?


Tesla's other advantage is a global fleet that is actively logging millions of real-world miles a day. I don't think that can be overstated at this stage.


Google may have far fewer miles, but they're much higher-quality miles. The sensor suite on their cars is far superior, so they're getting much better data. Even if the eventual Google car uses cheaper sensors, the data from the superior sensors helps a lot in training the algorithms.


I don't see how this will work, though, unless you plan to equip all those future cars with the same high-quality sensors as those on the Google ones (which I've heard are prohibitively expensive).

If those sensors eventually do become really cheap, then Tesla could use them too and negate any advantages Google might have had.


They're already cheap (see elsewhere in this thread for links). Tesla has opted not to use them because Elon thinks LIDAR isn't needed.[1] Of course, without LIDAR you get situations like Autopilot running into white tractor-trailers because it can't see them...

[1] http://9to5google.com/2015/10/16/elon-musk-says-that-the-lid...


I think the main point would be to compare the images from the high-quality sensors with those from the low-quality ones, so that they can make good guesses as to how accurate the cheap sensors are and what quality they can get away with.
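
As a hedged sketch of that idea - all names and shapes here are assumptions, not Google's pipeline - the good sensor's output can simply serve as the training target for a model that only sees the cheap sensor:

    # Illustrative cross-sensor supervision: depth measured by the
    # expensive LIDAR rig acts as the label for a camera-only model,
    # so the cheap-sensor model inherits quality from the good sensors.
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    camera_model = nn.Sequential(nn.Conv2d(3, 8, 3, padding=1), nn.ReLU(),
                                 nn.Conv2d(8, 1, 3, padding=1))  # per-pixel depth
    opt = torch.optim.Adam(camera_model.parameters(), lr=1e-3)

    frames = torch.randn(16, 3, 64, 64)       # stand-in for cheap camera frames
    lidar_depth = torch.randn(16, 1, 64, 64)  # stand-in for LIDAR-derived depth

    loss = F.l1_loss(camera_model(frames), lidar_depth)  # LIDAR output as "label"
    opt.zero_grad()
    loss.backward()
    opt.step()

The appeal is that the labels come from hardware rather than humans, which is why the high-quality miles stay valuable even if the production sensor suite ends up cheaper.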


Does Google collect high-velocity data like Tesla's? I heard that the Google self-driving car was limited to 25 mph in the areas where they tested it, so their data may only be reflective of that.


Google operates several kinds of cars -- I see the Lexus ones driving at highway speeds on 101 all the time.


As the owner of a Tesla, is it possible to opt out of this logging? Or is it a required condition of buying the car?


Google has Waze data, which I would think serves as a GREAT set of labels for the data that the self-driving cars collect.


Waze is just where people are going... the data Tesla collects could come from sensors like a front-facing camera, radar, 360-degree ultrasonic sensors, etc.


A Google <-> Uber partnership could yield something similar.


But with Tesla's open policies, such as open-sourcing all their patents, maybe they will happily supply Uber with all the self-driving cars Uber might want, plus the APIs to remotely control them?


I'm sure they would be more than willing to supply Uber with cars, but how much access to and control over the cars' data they give Uber remains to be seen.


If Tesla can pull off this transportation + energy revolution, it would be a trillion-dollar company.


Exactly my thinking.


I also think that Tesla will have an easier time acquiring top talent in both autonomy software and manufacturing. Uber/Google have to partner with Old Guard automotive companies.


Not necessarily. A lot of people have wised up to Musk's invasive and destructive management style. Don't think the frequent departures from the company at all levels of the hierarchy have gone unnoticed.



