> The Tesla FSD system, on the other hand, aims to be a generic driving AI. ... can it ever get so reliable that the driver can completely hand over to the car?
Using the current machine learning algorithms? Not likely. Imagine releasing a new airplane and telling the pilot - "this airplane flies well in 99% of scenarios, but 1% of the time it refuses to respond to pilot commands and kills everyone on board. We don't know why it happens but we'll keep training the model". That's the state we're currently at with "AI"; as someone put it, it's alchemy. If it works, we don't know why. If it doesn't, we don't know why - but we tune the parameters until it seems to work. We cannot prove it will work for every possible input. ... Not exactly a scientific approach by any means. Such self-driving cars might be accepted in the USA, but I imagine they'd quickly get banned in Europe after a few random fatalities.
In the 4th quarter, we recorded one crash for every 4.31 million miles driven in which drivers were using Autopilot technology (Autosteer and active safety features). For drivers who were not using Autopilot technology (no Autosteer and active safety features), we recorded one crash for every 1.59 million miles driven. By comparison, NHTSA’s most recent data shows that in the United States there is an automobile crash every 484,000 miles.
1) Autosteer and active safety features are not "autopilot"; they're adaptive cruise control.
2) If the system thinks something bad is about to happen, it beeps and hands control back, so it's the driver's fault, as the system wasn't engaged at the time of the crash.
SAE is good for classification but not for evaluating performance. By definition autosteer isn't "driving", but it does effectively drive for you at the level you'd expect: it can take you 300 miles on a highway without a disengagement, and the $12k package can even overtake slower cars.
That's my point. Mercedes might be able to operate unsupervised on some predefined map of highways, under a narrow set of conditions and speeds, but that doesn't mean it's a good product that solves real problems if it can't take over driving unsupervised on a regular road trip where you're doing 70+ mph.
> Autosteer + active safety features is exactly what autopilot is.
If you ask a normal person on the street what "autopilot" means in a car, you'll get a range of answers. But the consistent theme is that autopilot means you don't need to concentrate. Most will say it's automatic driving and that the car will drive for you.
But autosteer requires concentration. It's just sparkling lane assist.
> it's different to FSD - yes.
Which is the point. It's marketing fluff. Autopilot isn't really that; it's adaptive cruise control with lane assist. That's the thing that rankles. All of this stupidity, injury, and noise comes from a marketing decision, one designed to cover up that the CEO over-promised and wildly under-delivered _yet again_.
"NHTSA's Flawed Autopilot Safety Study Unmasked (2019) - The safety regulator's claim that Autopilot reduces crashes by 40% was based on flawed data and analysis, which it attempted to keep secret.":
https://www.thedrive.com/tech/26455/nhtsas-flawed-autopilot-...
1. NHTSA has reiterated that its data came from Tesla, and has not been verified by an independent party (as it noted in a footnote in the report).
2. It says its investigators did not consider whether the driver was using Autopilot at the time of each crash. (Reminder: drivers are only supposed to use Autopilot in very specific contexts.)
3. Airbag deployments are an inexact proxy for crashes.
... which all sound like really flimsy reasons to conclude "doesn't stack up." A more honest summary based on those would be something like "hasn't been verified by a 3rd party yet."
It's incredible how openly the media makes things up about Tesla, likely just to generate page impressions.
The primary issue is that the data doesn't adjust for road classification in any way. Highway driving, where autopilot is used, has considerably fewer crashes per mile than city driving. Tesla compared Autopilot's rates against all driving rather than just highway driving which would be the relevant metric.
That adjustment alone almost completely eliminates any safety advantage of Autopilot before you get into any of the other adjustments like age.
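To make that concrete, here's a rough back-of-the-envelope sketch of the base-rate problem. All the numbers below are invented for illustration; they are not Tesla or NHTSA figures.

```python
# Hypothetical illustration of the road-mix confounder (all numbers invented).
# Assume highway driving has far fewer crashes per mile than city driving.
highway_crash_rate = 1 / 3_000_000   # crashes per mile on highways (assumed)
city_crash_rate = 1 / 300_000        # crashes per mile in cities (assumed)

# Fleet-wide human baseline: a mix of highway and city miles.
highway_share = 0.4
human_all_roads = highway_share * highway_crash_rate + (1 - highway_share) * city_crash_rate
print(f"human, all roads:   1 crash per {1 / human_all_roads:,.0f} miles")

# Autopilot miles are overwhelmingly highway miles, so a system exactly as good
# as a human driver still looks "safer" against the all-roads baseline.
autopilot_highway_only = highway_crash_rate
print(f"autopilot, highway: 1 crash per {1 / autopilot_highway_only:,.0f} miles")

# Apples-to-apples would be highway vs. highway; highway vs. everything inflates
# the apparent advantage even with zero real safety benefit.
print(f"apparent advantage: {human_all_roads / autopilot_highway_only:.1f}x")
```

With these made-up numbers the "advantage" comes out to roughly 6x purely from road mix, before the system has prevented a single crash.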
Tesla's data shows Autopilot is safer (crashes less) than driving without Autopilot. The only question mark I've seen in this discussion is on NHTSA's data regarding Tesla.
Could you explain the flaw in Tesla's data? How is crashes per mile not a good proxy for safety?
>In the 4th quarter, we recorded one crash for every 4.31 million miles driven in which drivers were using Autopilot
But this is not FSD; it's a combo of a human + some driver assistance. Come back when Elon discloses how many times the human was forced to intervene to prevent a crash.
I am wondering why Tesla fans think these stats prove that so-called FSD is safe. Don't you understand stats? Did PR trick you? This proves that some assistance feature, like one keeping a safe distance between cars, is safer. Make all the data public and let us compute the real stats, please.