
That article describes a systematic design defect of Tesla's system. (It's a defect that all camera-based systems have: they are designed to ignore stationary objects because they can't adequately distinguish background objects from obstacles.) Car and Driver tested this last year, and the Tesla ran into the stationary dummy nearly every time. This is something humans have almost no problem dealing with.



It has been a problem, but it is definitely not an intrinsic defect of camera-based solutions. The stopped-truck example has been examined in depth with rooted Teslas. The Tesla actually detects the truck, but fails to fully recognize the scenario and attempts to drive under it! See also greentheonly's research on this on Twitter.

Ignoring some stopped objects is only a temporary limitation.


I said it was a systematic defect camera-based systems share, not that it was intrinsic to camera-based systems. Camera-based systems have much less information about object position than LIDAR-based systems. They filter out non-moving objects because the vision-processing algorithms are not sophisticated enough to distinguish the background from obstacles. If they stopped for every stationary object in the road, they'd routinely brake for phantoms, mistaking the background for an obstacle.
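
To make the trade-off concrete, here is a minimal sketch of that kind of filter (purely illustrative; the detection format and velocity threshold are my assumptions, not anyone's actual pipeline). A tracker that keeps only world-frame movers discards a stopped truck together with the roadside clutter it was built to ignore:

    from dataclasses import dataclass

    @dataclass
    class Detection:
        distance_m: float        # range to the object
        rel_speed_mps: float     # speed relative to our car (negative = closing)

    def moving_obstacles(detections, ego_speed_mps, tol_mps=1.0):
        """Keep only detections that are moving relative to the ground.

        An object whose closing speed roughly equals our own speed is
        stationary in the world frame -- a sign, an overpass, a parked
        car... or a stopped truck. One test throws them all away together.
        """
        keep = []
        for d in detections:
            ground_speed = ego_speed_mps + d.rel_speed_mps  # world-frame speed
            if abs(ground_speed) > tol_mps:                 # moving: keep it
                keep.append(d)
        return keep

    # At 30 m/s, a stopped truck closes at -30 m/s -> ground speed 0 -> dropped.
    scene = [Detection(80.0, -30.0),   # stopped truck
             Detection(60.0, -25.0)]   # slower car ahead, still moving
    print(moving_obstacles(scene, ego_speed_mps=30.0))  # only the slower car

The filter does exactly what it was designed to do; dropping the stopped truck is a consequence of the design, not a malfunction.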

That may or may not be a temporary limitation. It depends on whether vision processing becomes reliable enough that you can almost always distinguish an object in the road from an object in the background.


> I said it was a systematic defect camera-based systems share, not that it was intrinsic to camera-based systems.

And I'm saying that even that is incorrect. Here is a Tesla failing to stop: https://twitter.com/greentheonly/status/1134987596508143617 - The driver clearly noted that aiming for a less clear spot (dangling wires) triggered a stop.

Here is another example, in which greentheonly used root access to overlay what the car sees: https://twitter.com/greentheonly/status/1135011184439169024 - The car sees the truck but treats some of the space under it as drivable space.

Another view, with more details on what the car considers drivable vs non-drivable: https://twitter.com/greentheonly/status/1134490014321131521

Also, a clear case where it did stop: https://twitter.com/greentheonly/status/1134489977801334784

The Tesla system has limitations around object identification, but absolutely does NOT just filter out non-moving objects.
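
As a rough illustration of how "sees the truck but treats the space under it as drivable" can happen (this is a toy model; the grid scheme, height band, and numbers are my assumptions, not greentheonly's findings or Tesla's implementation): if drivability is decided per cell from obstacle evidence near road level, a trailer bed riding high off the ground can leave the cells underneath it looking free:

    # Toy drivable-space grid: a cell is drivable unless an obstacle is
    # detected within a low "height band" above the road surface.
    # All names and thresholds here are assumptions for illustration only.

    ROAD_BAND_M = (0.0, 0.5)   # height range checked for obstruction

    def cell_drivable(obstacle_heights_m):
        """A cell is drivable if nothing occupies the near-road height band."""
        lo, hi = ROAD_BAND_M
        return not any(lo <= h <= hi for h in obstacle_heights_m)

    # A semitrailer bed ~1.2 m off the ground: the truck *is* detected,
    # but its underside sits above the band, so the cells beneath it
    # still test as drivable.
    under_trailer = [1.2, 1.5, 2.0]   # detected surfaces, metres above road
    curb_cell     = [0.15]            # low obstacle inside the band

    print(cell_drivable(under_trailer))  # True  -- looks like open road
    print(cell_drivable(curb_cell))      # False -- correctly blocked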


Assuming what you say is true, is the Tesla system worse than a human driver overall in its operating domain?

Humans have weaknesses too. What do the data say on their relative performance? I’m not taking a position here, just asking for data.
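
The comparison you'd want is something like crashes per million miles under the same conditions. A sketch of the arithmetic, with placeholder numbers that are NOT real data, just to show why raw rates mislead:

    def crashes_per_million_miles(crashes, miles):
        return crashes / (miles / 1e6)

    # Hypothetical placeholder figures only. The pitfall this illustrates:
    # Autopilot miles are overwhelmingly highway miles, so a fair comparison
    # has to condition on the same road type.
    ap_rate   = crashes_per_million_miles(crashes=10, miles=100e6)
    human_hwy = crashes_per_million_miles(crashes=15, miles=100e6)
    human_all = crashes_per_million_miles(crashes=40, miles=100e6)

    print(f"Autopilot (highway):  {ap_rate:.2f} per M miles")
    print(f"Human (highway only): {human_hwy:.2f} per M miles")
    print(f"Human (all roads):    {human_all:.2f} per M miles")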


Look, "autopilot" works on highways, where there is a median. Everything goes the same direction, and there is a minimal amount of clutter.

It's about as easy as driving gets. This is why a Level 2 "autopilot" can parade around as something that is both innovative and safe.

None of the data compares what happens at junctions, which is where the majority of crashes happen.

It can't work in suburban sprawl, it can't cope with poor weather, and it can only drive on highways.


Not sure why you state this. I've used autopilot in a Model 3 on city streets and in the suburbs, and it works pretty well.


Similarly, I have used autopilot in rain so bad that I was scared to drive. It performed perfectly and was much less stressful for me.

I was fully engaged and ready to take over, but never needed to.


I'll echo that. In heavy rain I could barely see the lines, but the car seemed to find them just fine and kept driving straight and smooth.

Until traffic on the other side of the median barrier hit a big, deep puddle, covering my car with a thick sheet of water. Autopilot immediately started screaming at me to take control. I already had my hands on the wheel, but it still scared me because I couldn't see either! After the water washed away a couple of seconds later, I was able to re-enable AP and keep going.



