Tesla gets a lot of flak about driver safety, but this is incredibly irresponsible.
> Once you engage Drive Pilot, you are no longer legally liable for the car's operation until it disengages. You can look away, watch a movie, or zone out. If the car crashes while Drive Pilot is operating, that's Mercedes' problem, not yours.
If the car crashes, it's your problem primarily because you're the one in the car who could be injured. Maybe they'll pay for repairs or your hospital bills or even a loaner, but it's primarily your problem. We have to remember the limitations of these systems and accept that they're level 2 and require full attention.
Tesla gets a lot of flak for advertising Autopilot in a misleading way: advertising functionality and liability coverage that could exist at some point but simply doesn't yet.
What Mercedes does here is not just empty advertising, it's offering actual level 3 autonomous driving, complete with Mercedes accepting liability in those situations.
Which in practice means Mercedes can actually deliver what Tesla so far has only been advertising: actually taking your attention away from the street to watch a movie or check your emails.
This is not a "problem", it's the literal definition of level 3 autonomous driving: the car takes liability in certain situations to such a degree that the driver is legally pretty much just a passenger.
> Taking your attention actually away from the street to watch a movie or check your emails.
If I were the kind of person who could afford to buy an EQS[1], I would rather hire a driver so I could do it all the time, not just at 40 km/h in a slow-moving highway jam.
> Which in practice means Mercedes can actually deliver what Tesla so far has only been advertising with
Do you have experience with this tech, or are you taking the word of a marketing campaign? Taking "liability" doesn't mean much if you get in a serious accident and have medical issues for the rest of your life. If you get in an unsafe cab, does it give you a lot of assurance that technically the driver is responsible?
How does one differentiate an unsafe cab from a safe cab?
For a more apples-to-apples comparison: Tesla vs Mercedes. Tesla broadcasts L5 confidence in a poorly implemented L2 system, and accepts zero liability for its faults. Mercedes broadcasts L3 confidence in their system (which I'm not familiar enough with to actually characterize), and accepts full liability for its faults. I'd prefer to share the road with a self-driving Mercedes than a Tesla, because mistakes are inevitable and insurance payouts are not. If I'm disabled for life, I'd rather get paid for damages than not.
It means that failing to prevent an accident will cost MB significantly more than they can possibly make with a single car. So they need to have run the numbers to be confident that this will not bankrupt them. This is _far_ more than Tesla ever did.
Tesla, until maybe this year, was never cashflow positive, and the remaining lifetime of the company was always a significant risk. MB is an institution here in Germany. For many, many reasons they will be one of the car manufacturers that disappears last. This is a company that cannot take such a risk without massive personal retribution risk to the executives, especially not after the Diesel scandal.
Of course that doesn't remove the "but I'm in the hospital/disabled/..." part of being personally in an accident. But that can happen _anyway_, irrespective of who is responsible, even if you drive yourself. Compared to any other offering, this is a significant change of the rules of the game.
Getting in a cab where the driver accepts liability (and has insurance) seems much safer than getting in a cab where the driver passes liability to you and your insurance. Neither is intended to be the only way you assess safety, but the former is a much better signal than the latter.
It is in fact the opposite, because it sets the bar at that level, understanding that people treat Tesla's Autopilot the same way. Tesla just hides its relatively lower capability behind a legal out.
People do not care what “level” their driverless control is. After a certain point of “it seems to do everything”, they assume it does everything. If they think they can get away with letting it take control, they will.
Requiring automotive manufacturers to deal with this is manifestly responsible. It is demanding an engineering control over an administrative one in the hierarchy of risk controls, and that expectation is in line with how every non-software discipline of engineering approaches the world.
The point is that whenever you ride in a conveyance (whether as passenger or driver) it's always "your problem" if there's an accident.
Short of MB putting an executive in the seat alongside every passenger (which, you'll note, Uber, the airlines, etc. don't do either), what exactly are they supposed to do above and beyond taking liability for accidents while the system is in operation?
Tesla gets flak because they have repeatedly implied (or flat out said) that their level 2 system drives itself. This is a level 3 system, so it actually does drive itself.
Level 3 is the lowest level where "the car drives itself".