
Right. Except we are talking about corner cases where pre-programmed computers on wheels kill people. Merely better rule-following does next to nothing, and that's expected. The idea that we can follow rules to make computers on wheels "work" is false, unless one wants to go the China route where nearly everything is monitored and punishment is exacted automatically (and even then... it still won't work).



My whole point is that if you make the environment more predictable, you make the problem easier; not that that's the only thing you need to do. You seem to be attacking a strawman where the whole environment is tailored to self-driving cars by creating a police state and then the software is very simple. What I'm describing is using all the resources we already have for designing and maintaining roads to also help solve the problem, together with all the technology that still has a long way to go. And if we can do that by doing things that also lower risk in normal driving, I don't really see the downside. We already see this in the world: there are countries where it is much safer to drive because there has been a continuous focus on solving exactly this type of issue.


Nope. As I just said, even going the police state route _won't work_. Your premise, that we can make the environment more predictable so that pre-programmed cars are easier to program, is wrong. That would address only a tiny fraction of the real-world problem. I'm entirely comfortable with that prediction. You disagree, and that's fine. I suspect we will both be around to see.

If you want to instead talk about making it safer for human drivers, fine, that's a different subject.


It's not just about making the programming easier, it's about making the problem possible at all. We have a much higher tolerance for fatal accidents with human drivers than we ever will with automated ones. So if you have an environment where many thousands of people are killed today and then expect self-driving to work in the exact same context with airliner-level reliability, you've defined the problem as practically impossible. If someone built a great self-driving technology that, applied to the total US fleet, killed only 20 thousand people a year, I doubt it would ever be accepted. And yet that would be roughly half the fatalities we have today.


There is no making the programming easier, or even possible. It's not a problem you can solve with a program. I blame sci-fi for this disconnect.

Human sensory augmentation will save lives, but that has nothing to do with the marketing ploy that cars can have a "self."



