
But let's not measure AV safety against the average human driver; let's measure it against the most capable human driver.



Interesting way to frame it, but I'm not sure if I agree... If the net benefit is less death and injury, shouldn't that be the obvious choice?


No, because society has already accepted the status quo and needs to be convinced to change it. Technology can't gain acceptance by being just as good as the status quo -- it has to be better.

Every time a self-driving car causes an accident in a situation where a human driver probably would not have (and sometimes in pretty obvious ways, like driving into a freeway divider), it will hurt acceptance of the new technology. Politicians will never miss an opportunity to grandstand against tech companies. People will be afraid to ride in autonomous vehicles.


The status quo is that you either have to pay someone ~$10/hr to drive for you, or you have to focus and keep both hands on the wheel and actually drive yourself, and it's still pretty dangerous. Self-driving cars could improve on the status quo significantly without actually being less dangerous.


Driving is dangerous in generally predictable ways, and people accept that. People also accept that there is usually someone to blame for most car accidents. Driving is dangerous but it is under human control.

Autonomous vehicles are dangerous in unpredictable ways. People might be injured or die in accidents that would not have happened with human drivers, as the result of software bugs rather than human decisions. When things like that happen, statistics about accident and death rates being lower with self-driving cars are beside the point.


Hopefully our regulators can regulate rationally, even if our click-bait-emotionally driven general public cannot. If they can manage it, thousands of lives will be saved.


It's not regulators you should worry about.

It's politicians, who thrive on the "click-bait-emotionally driven" grandstanding that brings in votes, that you should worry about.


If you're concerned about safety, driving aids like drowsiness/attention warnings, lane keeping assist, automatic emergency braking, etc. are the way to make things a bit safer, in combination with the continuing march of improvements in collision safety for when crashes do happen.

If you're concerned about the costs of a driver, I grant that the marginal cost per hour of driving is low, but the material costs seem high, unless a significant cost breakthrough is found in lidar production or a technique breakthrough using multiple cameras; and the R&D costs seem pretty enormous -- people have been seriously working on this since the 80s and it's clearly closer than fusion, but it seems perennially 15 years away.


The status quo is that average drivers can get a license, not only the top-tier ones.


Think about it this way. Would you buy a self-driving car that drives worse than you do? Even if it drives better than the average, an average brought down by drunks and the elderly? I know I wouldn't.


Think about it the other way: would you rather have a below-average driver behind the wheel, or a self-driving car?

Everyone likes to think they're better than average (https://en.wikipedia.org/wiki/Illusory_superiority#Driving_a...).

I would rather keep the worst drivers off the road than keep the best.


Then again, bad drivers often make themselves known on the road, and you can give them a wide berth once recognized (granted, not always possible). Watching for the warning signs of a bad / reckless / distracted driver quickly became an almost unconscious habit for me.

A machine suddenly hitting a bug is a whole new ball game, though. I guess over time maybe you could build up a sense of where machines have trouble driving and adjust accordingly.


If you want to keep the worst drivers off the road, it might make sense for the elderly to get self-driving cars. Or cars could detect your blood alcohol level and switch into self-driving mode. With those cases handled, I believe the average human driver would improve enough that self-driving cars would compare worse.


The rational approach would be to license them when it was likely they'd increase the overall average safety of drivers.

That isn't just a question of the safety of the autonomous systems though, it also includes consideration of which drivers end up using them. Some drunks will use auto taxis, lots won't.
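As a rough illustration of that point (all numbers and driver groupings here are made up purely for illustration), the fleet-wide crash rate is just a mileage-weighted average, so it can move in either direction depending on who actually switches:

    # Hypothetical sketch: fleet-wide crash rate depends on *which* drivers switch to AVs.
    # All rates are invented illustrative numbers (crashes per million miles).
    human_rates = {"good": 1.0, "average": 3.0, "drunk_or_impaired": 15.0}
    miles_share = {"good": 0.4, "average": 0.5, "drunk_or_impaired": 0.1}
    av_rate = 4.0  # assumed AV crash rate, worse than the average sober driver

    def fleet_rate(adopters):
        """Overall crashes per million miles if the given groups switch to AVs."""
        return sum(
            share * (av_rate if group in adopters else human_rates[group])
            for group, share in miles_share.items()
        )

    print(fleet_rate(set()))                  # status quo: ~3.4
    print(fleet_rate({"drunk_or_impaired"}))  # only the riskiest switch: ~2.3
    print(fleet_rate({"good", "average"}))    # only safer drivers switch: ~5.1

Under those made-up numbers, an AV that is worse than the average sober driver still lowers the overall rate if the riskiest drivers adopt it first, and raises it if only careful drivers do.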


Sounds like a good plan, can we start by only granting driver's licences to the most capable human drivers?


They should be measured against the average taxi driver.


Well not the best, but at least the 95th percentile probably.


If we are going to measure their safety, are we also going to expect them to obey all traffic laws to the letter? I am curious how speeding is going to be addressed. Will keeping in sync with traffic be acceptable, or do we say thou shalt not exceed the limit for any reason? Because as soon as you let them break the law, logical reason or not, they have one transgression against them in case of an accident. Then the barn door is wide open.

I still think marketing it as a safety issue was the wrong approach this early in their development. Say "safety" and "cars" in the same sentence and people think of seat belts, air bags, and anti-lock brakes. All things that protect you in an accident or an unpredictable situation.



