Hacker News

> I'd hypothesise that a machine that kills everyone in those 1% edge cases (which are actually less frequent than 1%) but drives perfectly 99% of the time would still outperform humans.

Well, no.

Some quick googling suggests that the fatality rate right now is roughly 1 per 100 million miles driven. So if the machine's edge-case failures are certainly fatal, they'd have to occur less often than about once per 100 million miles just to break even with humans. With a typical car lasting on the order of 200,000 miles, that works out to one fatal failure per roughly 500 car lifetimes. In other words, the car would, for all practical purposes, have to be self-driving.
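A back-of-envelope sketch of that arithmetic (the 200,000-mile lifetime is an assumption, not a figure from the comment):

```python
# Break-even rate for certainly-fatal failures vs. the human baseline.
human_fatality_rate = 1 / 100_000_000  # fatalities per mile (US ballpark)
car_lifetime_miles = 200_000           # assumed average lifetime mileage

# If every machine failure kills the occupants, failures per mile must
# stay below the human fatality rate just to match human drivers.
max_failures_per_mile = human_fatality_rate

# How many whole car lifetimes pass, on average, between such failures:
lifetimes_per_failure = 1 / (max_failures_per_mile * car_lifetime_miles)
print(lifetimes_per_failure)  # → 500.0
```

So even under generous assumptions, a "kills everyone when it fails" system gets one allowable failure per hundreds of car lifetimes.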
