Hacker News

I'm getting tired of saying this.

Why is it one or the other? Why only a human driver or a robot driver? Could you have a human driver in charge with an accident preventing robot, getting the best of both worlds? If the human falls asleep, the robot can do trivial driving, and the human still makes the more difficult driving decisions.

The question isn't whether a robot car is better than an average car driven by a human. It's whether the same robot car would be safer with the human driving it and the robot in accident-prevention mode. And I don't think that is going to be true in the near future.



> Could you have a human driver in charge with an accident preventing robot, getting the best of both worlds?

We already have this and we've had it for years. What do you think automatic emergency braking is? Adaptive cruise control? Lane keeping assist?

> The question isn't whether a robot car is better than an average car driven by a human. It's whether the same robot car would be safer with the human driving it and the robot in accident-prevention mode. And I don't think that is going to be true in the near future.

No clue what you're trying to say here. Driver assist features don't improve safety?


So call them driver assists.

I am tired of seeing driver assists being sold as driverless systems.


Do you think people buying cars with autopilot believe they are getting fully driverless systems?


People buying current cars were just told that their cars will earn them $30k a year. Is FSD driver assistance or a driverless system?


yes


Human drivers already space out, check their Facebook, text their friends, and otherwise lose track of what's going on without having cars that claim to do most of the driving autonomously. The more autonomy you add, the more drivers will take advantage of the opportunity to do something behind the wheel besides drive.

Autonomous cars are one of those things that really need to be done right or not at all. The notion that human drivers will be ready and able to take control at a moment's notice is nothing short of laughable.

IMO, about the only way partial self-driving cars can work would be to delegate it to a remote building full of 'drone drivers' somewhere. Heck, I'd pay for that today, just out of laziness.


> Could you have a human driver in charge with an accident preventing robot, getting the best of both worlds?

I'm not sure that human+robot would be better than robot alone. E.g. the human is in control and starts drifting toward the adjacent lane without indicating. Does the robot (a) assume the human driver is changing lanes and turn on the indicator light; (b) take control of the car and move it back to the center of the current lane; or (c) do nothing and let the driver change lanes without indicating? The ambiguity in the human driver's intentions means the robot might make a decision that is less safe than if it were in complete control.


That's a fair argument, which is why I limited the robot to intervening only when an accident is imminent.


Sure, the way I'm imagining it is some sort of collision avoidance system, where the person drives but the computer can also apply the brakes. It seems like that could be safer (although it could still cause an accident by braking in certain situations, like in an intersection). If the self-driving system does other things to reduce accidents, like leaving greater margins of safety around other drivers (stopping less suddenly, pulling into intersections differently, etc.), it seems like that would be harder to integrate with a human driving than just having the self-driving system take over.

A lot of the utility of self driving cars is not just increased safety. It's allowing disabled people to be more independent, reducing the need for car ownership with (less expensive) self-driving taxis, reducing the need for parking lots and allowing that land to be used differently, etc. Do you think self driving cars should be prohibited as long as they are less safe than the standard set by human+collision avoidance cars?

I don't think that makes sense unless cars with human drivers and no collision avoidance are also prohibited. Even then I'm not sure. As a society we know that driving is not totally safe but allow people to drive anyway because it's a useful tool. I don't see why the safety standard should be even higher for self-driving cars, which are even more useful, especially for people who can't drive.

One last thing is that anything that slows the adoption of self-driving cars will probably lead to a much greater number of old, less-safe, cars in use for a longer period of time, since it's very unlikely politically that the existing car stock will be just kicked off the road, regardless of what regulations are set for new ones.


> Do you think self driving cars should be prohibited as long as they are less safe than the standard set by human+collision avoidance cars?

If having a human in the car reduces the accident rate, then I propose making it mandatory for the human to be in the car. Are you advocating that we make roads less safe so that disabled people can take taxis more easily?

> I don't see why the safety standard should be even higher for self-driving cars.

I see it as the opposite. The advocates for driverless cars are advocating lowering standards. For me the standard is simple: if the same car is as safe driving itself as it is assisting a human driver, then it can drive itself. You can only remove the human if the human adds no safety in the same car. Please do not compare to "the average car on the road".

I see that there are positive externalities to cheaper taxis. There are serious negative externalities too, in terms of traffic, urban sprawl, and carbon emissions. But I don't think we should deviate from the topic of safety for those externalities.


Safety is important, but I don't think any amount of safety gain is worth any amount of negative externalities. I mean, there are things we could do right now that would make driving safer. We could ban everyone with older cars from driving (since those cars don't have as many safety features). We could ban everyone from the roads who has a somewhat higher than average accident rate. We could ban all people past a certain age. If safety is always more important than access to transportation, we should do those things.

I disagree with those, so I also disagree with the premise that having a human in the car should be required for any amount of increased safety. I think the amount matters. And keep in mind we are only talking about safety increases from the present day; you're proposing to ban self-driving cars in order to achieve a still greater safety increase.

My >90 year old grandmother still drives. She hasn't run anyone over, but I imagine her reaction time isn't the best. Apparently it's common for cops to be lenient with elderly folks who are known bad drivers, since in many areas they wouldn't be able to get groceries or go to the doctor otherwise. I'd argue that maintaining the status quo (by having a blanket policy) for many groups of drivers would probably be safety theater instead of anything real.

Lastly, I touched on this in my earlier post, but if you care about deaths it's very important to consider which cars are actually being replaced, in addition to individual car statistics. Imagine scenario A, where 10 million older cars are replaced by 10 million driver-assist cars, and scenario B, where 30 million older cars are replaced by 15 million self-driving taxis. Assume the driver-assist car is safer than the self-driving car, which is safer than the older car. Even so, the driver-assist scenario may have thousands more deaths, since the uptake rates differ. And this is without taking into account any safety-improving software updates, which are very difficult to estimate but may be statistically significant.
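To make the fleet-level arithmetic concrete, here's a small sketch. All the rates and fleet sizes below are made-up illustrative numbers (not real statistics), chosen only to show how a safer-per-car option can still produce more total deaths when uptake differs:

```python
# Hypothetical fatal-accident rates per car per year (illustrative only).
RATE_OLD = 1.5e-4            # older car, no driver assists
RATE_DRIVER_ASSIST = 0.8e-4  # human driver + collision avoidance (assumed safest)
RATE_SELF_DRIVING = 1.0e-4   # self-driving taxi (assumed worse than driver-assist)

FLEET = 30_000_000  # total older cars currently on the road (made up)

def yearly_deaths(replaced, new_rate, cars_per_replacement=1.0):
    """Deaths per year after `replaced` old cars are taken off the road.

    cars_per_replacement < 1 models one taxi replacing several private cars.
    """
    remaining_old = FLEET - replaced
    new_cars = replaced * cars_per_replacement
    return remaining_old * RATE_OLD + new_cars * new_rate

# Scenario A: 10M old cars replaced by 10M driver-assist cars.
deaths_a = yearly_deaths(10_000_000, RATE_DRIVER_ASSIST)

# Scenario B: 30M old cars replaced by 15M self-driving taxis (one per two cars).
deaths_b = yearly_deaths(30_000_000, RATE_SELF_DRIVING, cars_per_replacement=0.5)

print(f"Scenario A (driver assist): {deaths_a:.0f} deaths/yr")  # 3800
print(f"Scenario B (self-driving):  {deaths_b:.0f} deaths/yr")  # 1500
```

With these toy numbers, scenario B comes out well ahead despite the self-driving car being less safe per vehicle, because the dangerous old cars leave the road faster.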

A simple per-car comparison may lead to all of the negative externalities and thousands more people killed. I really do not see how that outcome could be justified in the name of "safety". At the very least, even if safety is the first and only concern, policy makers should try to minimize vehicular deaths, which requires considering things at the fleet level.


Not that it's relevant, but as someone who can't drive due to a disability, anything short of Level 5 autonomy doesn't change the calculus for me at all.


This basically describes the Toyota Guardian system.


Watch Lex Fridman's interview with Musk (https://www.youtube.com/watch?v=dEv99vxKjVI), where they discuss exactly this topic.

Basically, Musk's point is that if self-driving cars are statistically safer than humans, then allowing humans to drive will make cars less safe again.

Furthermore (regarding some other children here), the study that Lex and his team have done shows that drivers are not less engaged while Autopilot is on (as many critics claim).


> if self-driving cars are statistically safer than humans, then allowing humans to drive will make cars less safe again.

You mean "if driverless cars are statistically safer than humans". It shouldn't be a surprise that driver assistance decrease accidents. But Autopilot isn't a driverless system, and neither is there satisfactory proof that it is safer than a human.



