Okay, I'll be the one to say it - what does this bode for the future of autonomous cars? If a man drives off a cliff because he was following a GPS and not paying attention, then that's not great, granted - but if your car drives you off a cliff whilst you're reading in the back seat, that's another matter :)
For a standard GPS head unit, the burden of proof that a road exists and is appropriate to drive on may be some purchased telematics database, itself compiled from all kinds of different sources. The burden of proof for a Google self-driving car is that the road recently had every feature (curb, sign, mailbox, lane marking, etc.) mapped by a Google employee.
A handheld GPS typically offers a few options: (a) fastest route, (b) simplest route, (c) shortest route. These problems happen to people who pick (c), or worse, pick (c) with an added "avoid major roads" modifier. That can be a great option if you are hiking, biking, or sightseeing and want to see the "real country" and don't care how long the trip takes or how bad the roads are.
The main thing this says for autonomous cars is: don't do that. If the routing algorithm is biased towards highways, major roads, and roads you've used before, this problem should virtually never come up.
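To make that concrete, here's a rough sketch of what "bias towards major roads" can mean in the cost function - plain Dijkstra with made-up penalty factors, not taken from any real nav stack. The idea is that a minor road only wins if it's dramatically faster, so "shortest" never drags you onto a goat track to save two minutes:

    import heapq

    # Hypothetical penalties: inflate a segment's travel time as the
    # road class gets smaller/sketchier. A "track" has to be 10x faster
    # than a highway before the router will even consider it.
    CLASS_PENALTY = {"highway": 1.0, "major": 1.2, "minor": 2.0, "track": 10.0}

    def best_route(graph, start, goal):
        """graph: node -> list of (neighbor, travel_time_min, road_class)."""
        dist = {start: 0.0}
        prev = {}
        heap = [(0.0, start)]
        while heap:
            cost, node = heapq.heappop(heap)
            if node == goal:
                break
            if cost > dist.get(node, float("inf")):
                continue  # stale heap entry
            for nbr, minutes, road_class in graph.get(node, []):
                # Rank edges by penalized time, not raw time or distance.
                weighted = cost + minutes * CLASS_PENALTY[road_class]
                if weighted < dist.get(nbr, float("inf")):
                    dist[nbr] = weighted
                    prev[nbr] = node
                    heapq.heappush(heap, (weighted, nbr))
        if goal != start and goal not in prev:
            return None  # no route found
        # Walk back from goal to start to recover the path.
        path, node = [], goal
        while node != start:
            path.append(node)
            node = prev[node]
        path.append(start)
        return list(reversed(path))

With these numbers, a 10-minute "track" shortcut costs an effective 100 minutes, so it loses to a 30-minute highway detour. An "avoid major roads" modifier effectively inverts those penalties, which is exactly how you end up on the cliff road.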
Well, self-driving cars are programmed to avoid obstacles. Hopefully the car would stop at the edge of the cliff, dump a segfault onto the nav screen, and then shut off.