
Except he won’t argue that, because he is dead. That, in itself, is the best counterargument to your conclusion that he knew what he was doing and allowed it to happen intentionally. Unless you’re also theorizing that he chose to commit suicide by one of the most creative methods available while making Tesla take the blame. It’s not impossible, I suppose, but it doesn’t seem like the most reasonable conclusion.



Except there are multiple reports from drivers that Autopilot is problematic in that location.


I imagine that a court or some regulatory agency could order Tesla to automate these kinds of reports and use them to shut down Autopilot, or make it unavailable, on certain routes.

It would be interesting to design the UX for this. Depending on the wording, it could even make them more liable if accidents happen outside of the roads with known issues.


Why does it even need to be reported? Tesla should be able to detect where Autopilot is regularly deactivated or does something wrong (the driver takes over with sharp input on the brakes or steering wheel), and use that to preemptively disable it there.
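
Roughly what I have in mind, as a sketch; the event fields and thresholds below are invented for illustration, not anything Tesla actually exposes. The idea is to bucket sharp manual takeovers into coarse location cells and stop offering Autopilot where they pile up:

    from collections import Counter
    from dataclasses import dataclass

    @dataclass
    class TakeoverEvent:
        lat: float
        lon: float
        steering_rate: float  # deg/s of steering input at takeover
        brake_force: float    # 0..1 pedal input at takeover

    def grid_cell(lat, lon, precision=3):
        # ~100 m cells at 3 decimal places; crude, but enough for a sketch
        return (round(lat, precision), round(lon, precision))

    def find_hotspots(events, min_events=10, sharp_steering=90.0, hard_brake=0.6):
        # Count only "sharp" takeovers: hard braking or fast steering corrections
        counts = Counter()
        for e in events:
            if e.steering_rate > sharp_steering or e.brake_force > hard_brake:
                counts[grid_cell(e.lat, e.lon)] += 1
        return {cell for cell, n in counts.items() if n >= min_events}

    def autopilot_allowed(lat, lon, hotspots):
        # Refuse to engage (or hand back control early) inside a flagged cell
        return grid_cell(lat, lon) not in hotspots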


I think they already do the first part of your suggestion (the cars send reports for all disengagements, and detailed reports for some), but not the last part.

Sounds like a sensible idea, particularly in an area where they’ve seen a near collision. The difficulty might be that conditions change all the time (barriers are moved or removed), which makes it hard to make this kind of determination on a case-by-case basis.
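
One way around that, as a rough sketch (the half-life here is an arbitrary number picked for illustration, not a real parameter), would be to weight each reported takeover by recency so a flagged spot eventually heals once the barrier is gone:

    import time

    HALF_LIFE_DAYS = 30.0  # arbitrary illustration value

    def event_weight(event_ts, now=None):
        # Exponential decay: 1.0 when fresh, 0.5 after one half-life
        now = time.time() if now is None else now
        age_days = (now - event_ts) / 86400.0
        return 0.5 ** (age_days / HALF_LIFE_DAYS)

    def cell_score(event_timestamps, now=None):
        # Sum of decayed weights; a location stays flagged only while this
        # exceeds some threshold, so it clears once takeovers stop happening
        return sum(event_weight(ts, now) for ts in event_timestamps)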


Indeed, I meant that the car should report the manual takeovers to Tesla HQ so the information can be shared. In addition, you should be able to add context to it as a human.


Well, at least Tesla drivers can, and do, openly report issues. When I had problems with phantom braking etc. on my BMW while driving on the Autobahn, I never reported them since there was no sensible channel.

You just learned to live with the limitations and used it where it worked under supervision.


Agreed, it’s not impossible but it also seems unlikely. As I was writing my previous message, it struck me as the plot of some crazy courtroom drama show: he couldn’t go on living but needed to support his family!

Seems unlikely, but I just don't understand why he'd be using Autopilot in this area given his past with it.

When I first got Autopilot in my Tesla, I was playing with it on city streets, and it decided to swerve into the divided concrete median at the end of the intersection, apparently treating that as the lane. I grabbed control, no problem, but two years later I’m still extra vigilant around any lane dividers. Mostly, though, I don’t use Autopilot in such restricted areas.



