Yes, it is the worst-case scenario. Also, it's the best you're gonna get. Toyota already got the nod from the courts to kill people via technology bugs in their cars (see the 'unintended acceleration' controversy that killed someone (maybe 2? I forget) a few years ago, due to bugs that were absolutely preventable if they'd splashed out a couple grand on static analysis tools). So if you think companies are going to break with their tried-and-true policy of hiring the cheapest, least experienced people they can find to slap together whatever halfway works, while ignoring every warning from their engineers that they need better tools, that they need more time, that the system needs more testing to be safe, etc. - well, don't hold your breath.
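
And I'm not talking about exotic bugs here. The expert testimony in the Toyota litigation described things like globals shared with interrupt handlers with no protection at all. Here's a hypothetical sketch (made-up names, not Toyota's actual code, which was never made public) of that class of bug, the kind a cheap MISRA-C style static analyzer flags on its first run:

    #include <stdint.h>

    /* Hypothetical hardware stubs for the sketch. */
    extern uint16_t read_pedal_adc(void);
    extern void set_throttle(uint16_t target);

    /* BUG: shared between an interrupt handler and the main loop,
     * but not declared volatile and not guarded by a critical
     * section. The compiler is free to hoist the load out of the
     * loop, and on targets where 16-bit writes aren't atomic the
     * loop can read a half-updated (torn) value. Static analyzers
     * flag exactly this pattern. */
    static uint16_t throttle_target;

    void pedal_isr(void)            /* runs at interrupt time */
    {
        throttle_target = read_pedal_adc();
    }

    void control_loop(void)         /* main task, runs forever */
    {
        for (;;) {
            set_throttle(throttle_target);  /* stale/torn read */
        }
    }

The fix is a one-liner (volatile plus a disciplined atomic access), which is exactly the point: these are cheap bugs to find, if anyone bothers to look.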

This isn't building a bridge. If a company builds a bridge and it collapses and kills people, and it turns out they didn't hire qualified structural engineers, or that the CEO ignored warnings from the engineers and pushed the project to hit a scheduled release window or keep profits high - the CEO goes to prison for criminal negligence. With self-driving cars, it's a COMPLETELY different story. You're talking about SOFTWARE. No company that's killed people with software has ever been found guilty of criminal negligence. And none will be for the foreseeable future.

This is how self-driving cars will go. I'll give you whatever you want if I'm wrong; I'm that confident. A company will rush to be first to market using standard modern business practices - meaning doing absolutely every single thing research has shown destroys productivity and ensures the end product will be as unreliable and poorly engineered as possible: open floor plan offices, forced physical colocation of workers, deadlines set by business goals rather than engineering goals, business school graduates able to override technical folks, objective evidence made subservient to whoever in the room is most extroverted, aggressive, or loud, etc. You probably work in exactly the kind of company I'm talking about, because it's almost every company in existence - following practices optimized over a century to perform optimally at assembly-line manufacturing, and absolutely antithetical to every single aspect of mental work. Maybe they'll sell a lot, maybe not; that's hard to call. What's NOT hard to call is the inevitable result: someone dies. Doesn't matter if it's unavoidable or not. No technology is perfect, and that doesn't matter either. It won't matter what "disclaimers" the company puts up saying the car is in the driver's hands. The courts won't care about that either.

But... they will absolutely get away with it. They will not be fined, they will not be forced to change their practices (most likely they will not even be made to REVEAL their development practices at all). You see, if the courts bother to ask what their practices are, their lawyers will point out it doesn't matter. There's no such thing as "industry standard practices" that you could even CLAIM they failed to follow. So their software had bugs. As far as the court is concerned, that's a fact of life, it's unavoidable, and no company can be held responsible for software bugs. Not even if they kill people.

So they'll get away with it - in the courts. In the court of public opinion? Nope. You see, even if they made their self-driving cars out of angel bones and captured alien predictive technology, and they never so much as disturbed someone's hairdo, they are destined to fail as far as the public is concerned. Because human beings are, shocker, human beings. They have human brains. Human brains have a flaw that we've known about for ages. Well, by "we", I mean psychologists and anyone who's ever cared enough to learn Psych 101 basics about the brain. There is an extremely strong connection between how in control a person feels and how safe they feel. And feeling safe is stupendously important to humans. This is why people are afraid of flying: if things go wrong, there's nothing they can do. (The same is true when they're driving a car, but people simply, wrongly, believe they have some control over whether they have an accident. No evidence suggests they can avoid most accidents.) If the self-driving car goes wrong while they're not paying attention, there's nothing they can do. People will be as afraid of them as they are of flying.

And if you haven't noticed, our society deals poorly with fear. We LOVE it way too much. We obsess over it. We spend almost every waking hour talking about it and patting ourselves on the back about what we're doing, or going to do, to fix our fears and the fears that threaten others. Mostly imagined fears, of course, because we're absurdly safe nowadays. So it will be the only thing talked about until unattended-driving laws get a tiny extension to cover the manufacture of any device which claims to make unattended driving safe. It'll pass with maybe 1 or 2 nay votes from reps paid for by Uber, but that's it.




"People will be afraid of them as they are of flying."

This is a great analogy, because at the dawn of flight many people were really, really, really afraid of flying -- for very good reasons: the airplanes of the day were incredibly dangerous.

Yet people still flew, and flew more and more, despite many very public disasters in which hundreds of people died, and the airline industry grew and flourished.

Now most people don't think twice about flying, as long as they can get a ticket they can afford. Sure, some people are still afraid of flying, but even most of them fly anyway if they have to, and the majority aren't afraid at all, or don't think about it.


Sure, planes were introduced at a time when people were willing to step back and tell themselves 'OK, I don't feel great about this, but that's just my emotions running away with me; I really shouldn't be scared, so I should just do it'. Those times are over. Suggesting people should question, much less actively resist, their most primitive impulses is now seen as a direct threat to their person. It simply isn't done.

When a mom says 'I'm not putting my children in one of those killmobiles' and someone replies 'well actually, ma'am, it's much safer, and you're endangering your child's life significantly by taking the wheel yourself', that person gets punched in the face and lambasted on social media as an insensitive creep. That's just how it goes.


> There's no such thing as "industry standard practices" that you could even CLAIM they failed to follow.

Are you sure? What about IEC 61508 and ISO 26262? The latter especially, as it was derived as a vehicle-specific version of the IEC standard.

It's an industry-wide standard:

https://en.wikipedia.org/wiki/ISO_26262

...geared specifically to ensuring the functional safety of electrical, electronic, and software systems in vehicles.

Look it up - you'll find tons of stuff about it on the internet (unfortunately, you pay out the * for the actual spec, if you want to purchase it from ISO - it's the only way to get a copy, despite the power of the internet, afaict).

...and that's just one particular spec; there are tons of others covering all aspects of automobile manufacturing, attempting to ensure safe and reliable vehicles.

Are they perfect? No. Will they prevent every problem? No.

But to say there aren't any standards to look to isn't true.



