Then we're back to the Tesla model - you don't have a self-driving car, you have a car that claims to be self-driving, but a human needs to be 100% aware and in control at all times.
Honestly, I think that's fair. If you're the one who pushes the button and sends the device out into the world, even if you didn't program it, you deserve some share of the responsibility when it malfunctions.
I would think that for a self-driving car to reach an acceptable level of autonomy, it should be like a train. If I get on a subway or a bus and it hits someone, I am not held responsible, even if it stopped specifically to pick me up first.
My biggest problem with this approach is that humans are terrible at paying attention to things that don't demand 100% focus.
A car that drives itself 99% of the time but fails catastrophically the other 1% is doomed to failure. The human operator won't be engaged enough to take over for that 1% - at least not without airline-cockpit levels of squawks and beeps and wheel shakers, and even that might not be enough. Airline pilots don't have to worry about children chasing balls into busy streets, etc. And pilots are highly trained - drivers are not.