
a. the self-driving system software classified the pedestrian as an unknown object, as a vehicle, and then as a bicycle with varying expectations of future travel path

All of those things are bad things to hit; why not slow down?
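
A rough sketch of what I mean (hypothetical code, not Uber's; every name here is made up for illustration): once something is predicted to be in your path and close, the braking decision shouldn't depend on whether the classifier says "vehicle", "bicycle", or "unknown".

    from dataclasses import dataclass

    @dataclass
    class TrackedObject:
        label: str         # "unknown", "vehicle", "bicycle", ...
        distance_m: float  # distance ahead along our path
        in_path: bool      # does its predicted trajectory cross ours?

    def desired_deceleration(obj: TrackedObject, speed_mps: float) -> float:
        """Braking request in m/s^2; the label is irrelevant once
        something is in the path and the time to reach it is short."""
        if not obj.in_path:
            return 0.0
        # time-to-collision, ignoring the object's own motion for simplicity
        ttc = obj.distance_m / max(speed_mps, 0.1)
        if ttc < 2.0:
            return 7.0   # hard braking
        if ttc < 4.0:
            return 3.0   # precautionary slowdown
        return 0.0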

b. 1.3 seconds before impact, the self-driving system determined that an emergency braking maneuver was needed

Too late for either a human or computer.
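
Back-of-envelope numbers (my assumptions, not from the report, apart from the 1.3 seconds): assuming a speed on the order of 40 mph, roughly what was reported for this crash, the car covers about as much ground in 1.3 seconds as it would need to stop even with instant, maximum braking.

    speed_mps  = 40 * 0.44704   # ~17.9 m/s, assumed speed
    warning_s  = 1.3            # time between "braking needed" and impact
    decel_mps2 = 7.0            # near the limit for hard braking on dry road

    distance_to_impact = speed_mps * warning_s               # ~23 m if nothing is done
    stopping_distance  = speed_mps ** 2 / (2 * decel_mps2)   # ~23 m with instant max braking
    time_to_stop       = speed_mps / decel_mps2              # ~2.6 s

    print(distance_to_impact, stopping_distance, time_to_stop)

Add any reaction or actuation delay on top of that and an impact is essentially unavoidable, at best at a reduced speed.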

c. operator is responsible for monitoring diagnostic messages

There is a superseding responsibility to drive the car safely. Uber's policy sets up the test driver for failure and puts people's lives at risk.

d. emergency braking maneuvers are not enabled while the vehicle is under computer control

The pedestrian had no chance.

The very feature that should have made the car safer was disabled; the system's judgement was poor and too late anyway; and Uber's policy made the test driver the exclusive (not merely primary) safety mechanism while sabotaging that driver's judgement by distracting them with data that, in almost every way, would have been more diverting than talking on a cell phone or fiddling with the radio.

I hope the family got a lot of money in the settlement.


