
It's not sufficient for these systems to merely be "better" than humans in terms of accidents per vehicle-mile; they have to be FAR BETTER than humans.

Today, when accidents occur, blame and liability go to the responsible parties. We're talking about millions of accidents and millions of drivers.

If you suddenly have a car that drives itself, the blame and liability for all those accidents get focused like a giant magnifying glass onto ONE (or a few) companies that make the self-driving cars.

I don't know what the "magic number" for accidents per vehicle-mile is, but I expect it will have to be far, far lower than the current rate. Otherwise, these self-driving car vendors won't be able to survive the legal onslaught.
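A back-of-the-envelope sketch of the concentration effect being described (every number below is a hypothetical placeholder, not a real statistic):

    # Hypothetical illustration of the liability-concentration argument.
    # All figures are made-up placeholders, not real accident statistics.

    annual_accidents = 6_000_000      # assumed accidents per year (hypothetical)
    licensed_drivers = 220_000_000    # assumed number of drivers (hypothetical)
    autonomous_vendors = 3            # assumed number of self-driving vendors

    # Today: liability is diffused across millions of individual drivers.
    per_driver_exposure = annual_accidents / licensed_drivers

    # With self-driving cars: even a much lower accident rate concentrates
    # onto a handful of corporate defendants.
    improvement_factor = 10           # assume autonomous cars are 10x safer
    autonomous_accidents = annual_accidents / improvement_factor
    per_vendor_exposure = autonomous_accidents / autonomous_vendors

    print(f"Accidents attributable per driver today: {per_driver_exposure:.3f}")
    print(f"Accidents attributable per vendor:       {per_vendor_exposure:,.0f}")

Even under a generous assumed safety improvement, each vendor ends up the named defendant for hundreds of thousands of accidents a year, which is the point of the magnifying-glass analogy.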



Not really. The cars only need to be safe enough to overcome users' potential fear of activating them. That threshold may even be lower than human-driver safety; we won't know until they are deployed. For example, it could turn out that present-day Autopilot in Tesla cars is less safe than human drivers, yet people still use it. Humans aren't strictly rational about risk when it comes to transportation. If autonomous cars are convenient and work successfully in everyday experience, people will use them (especially the young, for whom learning to drive is a marginal cost they may skip altogether if autonomous vehicles can get them through their daily lives).

The legal issue you mention seems secondary to the human-factors one. If people get addicted to self-driving, and some become dependent upon it, the laws will bend to the culture and not the other way around, as they always do. Look at the current regime for an example: we gloss over the roughly 30k fatalities that cars cause every year, and we hand out licenses to drive these death machines to people who really should have no right to them given their poor driving skills or physical impairments. We do so because denying someone the ability to drive is so at odds with our culture of freedom that we err on the side of enabling more people to drive, despite the fact that it surely increases the risk for everyone on the road.


Why can't we accept a little better now and keep striving for far better? It seems stupid to take an all-or-nothing approach, especially when plenty of luxury vehicles already have pieces of automated driving in them. Liability is for the insurance leeches to figure out.


Because the proper comparison isn't X accidents vs. Y accidents. It's X accidents plus Z billion dollars spent vs. Y accidents, where those Z billion dollars could have gone to other safety efforts. If the delta between X and Y is minor, Z is wasted.
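To make the opportunity-cost comparison concrete, here is a minimal sketch; X, Y, and Z are placeholder values, not real figures:

    # Hypothetical sketch of the opportunity-cost comparison above.
    # X, Y, and Z are placeholders, not real data.

    X = 6_000_000   # accidents per year under the status quo (hypothetical)
    Y = 5_700_000   # accidents per year with self-driving cars (hypothetical)
    Z = 50e9        # dollars spent developing self-driving cars (hypothetical)

    accidents_averted = X - Y
    cost_per_accident_averted = Z / accidents_averted

    # For comparison, assume some alternative safety effort (better roads,
    # stricter licensing, etc.) averts accidents at a hypothetical fixed cost.
    alternative_cost_per_accident = 50_000   # hypothetical

    print(f"Self-driving: ${cost_per_accident_averted:,.0f} per accident averted")
    print(f"Alternative:  ${alternative_cost_per_accident:,.0f} per accident averted")

If the delta (X - Y) shrinks, the self-driving figure balloons, and the same Z dollars would have averted more accidents spent elsewhere.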


If the added liability of self-driving cars that are only "a little better" makes them unprofitable, companies won't build them.


This isn't as hard a problem as you make it seem.

The people who pay for accidents will be the same people who pay for them now. It will be the insurance companies.

I am sure insurance companies will jump at the chance to save money by insuring statistically safer vehicles.
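A minimal sketch of how an insurer might price the two risk pools on expected loss; the rates and costs below are hypothetical, not actuarial data:

    # Hypothetical premium calculation from expected annual loss per car.
    # All rates and costs are placeholders, not actuarial data.

    avg_claim_cost = 20_000          # dollars per claim (hypothetical)
    loading = 1.25                   # overhead + profit margin (hypothetical)

    human_claims_per_car_year = 0.040        # hypothetical claim frequency
    autonomous_claims_per_car_year = 0.015   # hypothetical, statistically safer

    def annual_premium(claim_frequency: float) -> float:
        """Premium = expected annual loss per car, times a loading factor."""
        return claim_frequency * avg_claim_cost * loading

    print(f"Human-driven premium:   ${annual_premium(human_claims_per_car_year):,.0f}")
    print(f"Autonomous-car premium: ${annual_premium(autonomous_claims_per_car_year):,.0f}")

Under these assumptions the safer pool prices dramatically lower, which is exactly the incentive for insurers to court statistically safer vehicles.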



