Apologies if these long cut-and-pastes are not in the spirit of HN, but I've saved this comment I found that succinctly explains why the AI trolley problem, as it were, just isn't something we should concern ourselves with at our current technological capacity. Double apologies for the comment's biting tone, but I found it entertaining.
What the hell environment do these non-technical types think the cars are going to operate in. Are they driving through day-cares or something?
These endless moral arguments are coming from non-technical non-business types who are desperate to try to contribute to the revolution that is self-driving cars.
The very few technical types who make these moral pronouncements are identifying themselves as having a skillset so far out of date that moralizing is all they have left to "contribute".
It boils down to some simple "moral" choices. Cars are on roads, people are not. Some people will accidentally end up on roads. The car will have the wisdom to try to predict people being idiots and do its best to avoid them.
But very much like right now. Idiots who jump in front of cars are going to be darwin'd if the car has no easy option to avoid them. If a child jumps out in front of a car, it too should, shall and will be darwin'd if the other option is injury to someone who wasn't a nitwit. I have exactly zero interest in being in a car that would say, "Oh, the occupant of my car is older than the nitwit child that just jumped out into traffic. I am now going to drive off a cliff to save the moron."
I don't care if a crate of orphans spills on the road. I hope my car will swerve if possible. But if any option involves risk to me, then a crate of speedbumps is how it shall be.
Why such "moral" choices? Because they won't be moral choices. The technical capacity to weigh these things is not going to exist any time soon. The car will stay out of situations where it is at fault, such as driving on sidewalks; after that it will do its best to mitigate for idiots, and then its primary purpose will be to keep the occupants safe. Otherwise, you will have the car doing things like avoiding a blowing garbage bag or somesuch that it identifies as a 4 year old child and then driving into a tree to prevent the travesty of scattered garbage.
Arguing that trolley problems are irrelevant because the software can't identify reliably enough whether an obstacle is a pedestrian or a garbage bag to take evasive action that might increase risk to the vehicle's occupants is actually a pretty strong argument against self-driving cars....
(I mean, I sympathise a bit with the Reddit user's contempt for surveys which ask whether self-driving cars should run over "a criminal" in preference to other types of people, as if this were the sort of distinction humans were capable of making, but at the same time if I believed autonomous vehicle programmes were being run by people with his "occupant first" mindset - and I'm sure they aren't - they should be shut down immediately and permanently)
Source: https://reddit.com/comments/9rav8y/comment/e8fps5m?context=3