I think self-driving cars should have high-dynamic-range cameras, LIDAR, and possibly time-of-flight cameras. A LIDAR system would be far more likely to detect that barrier, while computer vision from a camera is far more easily fooled. I suspect an investigation into why the vision system failed to detect a barrier under clear daylight conditions will show negligence on Tesla's part. Lane lines are frequently poorly marked, and sunlight glare is a hard problem for cameras, but you have to be able to detect a concrete barrier in the worst of conditions. Does Tesla have some way of estimating its lane-detection accuracy and then alerting the driver that it is turning off Autopilot when accuracy is low?
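To make the redundancy argument concrete, here is a minimal sketch (hypothetical names and thresholds, not any real vendor's architecture) of OR-fusing camera and LIDAR obstacle detections, where a confident LIDAR return flags the obstacle even when glare has blinded the vision pipeline:

```python
# Hypothetical sensor-fusion sketch. The function name, the 0.5 camera
# threshold, and the 50 m braking range are illustrative assumptions.

def fuse_obstacle_detections(camera_conf, lidar_range_m, braking_range_m=50.0):
    """Return True if the fused pipeline should treat the path as blocked.

    camera_conf     -- vision model's obstacle confidence in [0, 1]
    lidar_range_m   -- distance to nearest LIDAR return in the lane, or None
    braking_range_m -- assumed distance at which an obstacle matters
    """
    lidar_sees_obstacle = lidar_range_m is not None and lidar_range_m < braking_range_m
    camera_sees_obstacle = camera_conf >= 0.5
    # Either sensor alone is enough to flag the obstacle (OR-fusion),
    # which is why a LIDAR hit survives a glare-fooled camera.
    return lidar_sees_obstacle or camera_sees_obstacle

# Camera fooled by glare (confidence 0.1) but LIDAR returns a hit at 30 m:
print(fuse_obstacle_detections(0.1, 30.0))  # True
# Neither sensor reports anything in range:
print(fuse_obstacle_detections(0.1, None))  # False
```

OR-fusion trades false positives for safety: a single sensor's miss can't suppress the other's detection, which is the whole point of carrying redundant modalities.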
Yes, it shuts off with a beep if it can’t handle current conditions. In case you think this kind of abrupt shutoff sounds dangerous, keep in mind this is in the current generation of the system which relies on a human driver being attentive and ready to take over at all times.
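The handoff being described is essentially a confidence-gated disengagement. A minimal sketch, assuming a made-up threshold and alert string (not Tesla's actual logic or values):

```python
# Hypothetical confidence-gated disengagement sketch. The threshold and
# the alert text are assumptions for illustration only.

DISENGAGE_THRESHOLD = 0.6  # assumed minimum lane-detection confidence

def autopilot_step(lane_confidence, engaged):
    """One control cycle: return (still_engaged, alert_or_None)."""
    if engaged and lane_confidence < DISENGAGE_THRESHOLD:
        # Abrupt handoff: beep and return control to the driver.
        return False, "BEEP: take over now"
    return engaged, None

print(autopilot_step(0.9, True))  # (True, None)
print(autopilot_step(0.3, True))  # (False, 'BEEP: take over now')
```

The sketch also shows why this design leans so heavily on driver attentiveness: the alert and the loss of automation arrive in the same instant, leaving no lead time.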
> In case you think this kind of abrupt shutoff sounds dangerous, keep in mind this is in the current generation of the system which relies on a human driver being attentive and ready to take over at all times.
What is a driver to do when the system randomly decides to brake? [1]
Phantom braking has been an issue for a while, and Tesla has yet to acknowledge it.
Where has Tesla acknowledged phantom braking as a persistent issue? It has been around since 2016.
> that video is from a super old software version.
Looks like the video was taken a month ago. The driver had the version of software that Tesla gave him. Drivers can't choose which software version they get, so if it was old, that's on Tesla, not the driver.