Tesla driver dies after his Model 3 stops on highway, gets hit by two other cars (businessinsider.com)
161 points by giuliomagnifico on March 4, 2022 | 289 comments



The whole premise of an "almost working" autopilot is logical from Tesla's business perspective but deeply flawed from a safety one.

The allure of having a car that drives itself is to relax the mind from paying attention to the road, handling 99% of the situations correctly through the software. But the 1% that require human input are made much more dangerous because the driver not only needs to handle the situation, they need to instantly catch up and familiarize themselves with the situation they've found themselves in. What was the car trying to do? Why is it suddenly stopping? What do my blind spots look like? Is there anybody behind me? Was anybody in the process of passing me?

Unfortunately, the time it takes to refamiliarize yourself with the situation could exceed the total time budget to respond.

It's similar to not paying attention to the road while in the passenger seat when a junior driver is driving instead. But there are two differences:

- Most people increase the attention they pay to the road when a learner is behind the wheel, preparing to respond and offer corrective guidance.

- Self-driving software is a lot more nondeterministic than a new driver. And software updates can easily change the behavior in non-trivial ways. As if the driving student had Multiple Personality Disorder: overcautious in certain situations one day and overconfident on others. Though I contend that MPD is actually more predictable than an opaque algorithm, as there are other human clues about the behavior mode of the moment that could help guide the response of the trainer.


I'm also of the opinion that automation is not compatible with safe driving. The issue is that AI is able to handle the simple tasks first, trusting humans to take over in outlier scenarios. But then, where does a new driver gain experience to handle the outliers?

We already went through this with airlines and decided to require that pilots fly a certain number of hours without the automation, so that when automation fails they are able to control the aircraft with some confidence.

As it is, self-driving cars are a luxury item for experienced drivers, but I fear a future where a plurality of people panic when control is handed over to them once the computer determines the situation is too risky to continue on autopilot.

Some great backstory on automation in airlines is 99% Invisible's "Children of the Magenta":

https://99percentinvisible.org/episode/children-of-the-magen...


We don't let 80-year-old pilots with bad eyesight fly planes. We don't let pilots fly planes 15 hours a day for multiple days in a row. Pilots must pass tests every year.

If we held car drivers to the same training and restrictions that pilots are held to, we would have far fewer drivers and much safer drivers. But it seems very unrealistic to think that will happen in a democracy. Realistically, technology is the most plausible way to reduce deaths.


The overwhelming majority of mayhem on the roads is caused by the long tail of bad decisions made by humans who would be able to pass whatever dumb requirements you concoct.

Kicking the elderly off the road, throwing the book at the one guy who gets caught doing NYC-DC and back in a day, and imposing onerous equipment regulations are fundamentally a waste of resources chasing a long tail, and will do nearly nothing to actually affect things relative to the expense. Most of the problems are caused by people doing things they knew were risky and stupid at the time and doing them anyway: breaking rules in an irresponsible manner, distractions, intoxication, etc.

Being applicable to a broad cross-section of the actual problem is why low-hanging-fruit stuff like AEB (automatic emergency braking) has such a big impact.


Really, though, the most straightforward technology to implement to reduce automotive deaths isn't self-driving or driving-assisting vehicles; it's to reduce the number of cars on the road, and the number of miles driven. Given that people will still want to get places, it makes sense to replace cars with safer options such as trains (more buses would help, though not as much).


It would cost trillions of dollars to rebuild our cities so that car driving isn't necessary.


So? It cost trillions of dollars to rebuild them in such a way that car driving is necessary. Most cities had functional public transportation before cars reached the tipping point that pushed cities into car dependency[1]. In the meantime, a lot of slack could be taken up by buses, especially by converting existing lanes into dedicated bus lanes.

[1]: https://en.wikipedia.org/wiki/Car_dependency


I’m not arguing we be equally strict, I’m arguing we don’t allow self driving to take over the majority of our hours at the wheel.

But yes, I also think we are much too liberal in handing out licenses, and the roads could be twice as safe by removing the bottom 5% of drivers. At least require a portion of the driving test on expressways; as it is, you take a written test proving you know what the signs mean and you drive around the block proving you know the gas from the brake. Unfortunately, because you can’t get around most places in America without a car, we consider driving an inalienable right.


We can also have people drive less, with more public transport, e-bikes, etc.

Cars aren't the solution to all problems.


We don't let them fly a commercial flight, but they absolutely could fly their own Cessna.


> The allure of having a car that drives itself is to relax the mind from paying attention to the road, handling 99% of the situations correctly through the software. But the 1% that require human input are made much more dangerous because the driver not only needs to handle the situation, they need to instantly catch up and familiarize themselves with the situation they've found themselves in.

There's also the separate recognition that no matter Musk's bombast on Twitter, Tesla's FSD stack is _nowhere near 99%_ correct handling. It's still regularly confused by bike lanes, only recently got roundabouts working, and I could go on. But bottom line: not 99%. I'd put it closer to 60-75%, at best.


> The allure of having a car that drives itself is to relax the mind from paying attention to the road, handling 99% of the situations correctly through the software.

I have this with a Hyundai Palisade with L2.

I still have to watch the road. I still need to identify emergency situations and navigation. I still need to _assess_ many things - just like I would without any support. However, I don't need to take action on everything. This is mentally and physically freeing. On well marked highways, in particular, it means I can just cruise along without much worry.

The problem with Tesla is they set the expectation that you don't have to pay attention. They've set an expectation that the car is capable of _assessing_ every situation when it's clearly not. They further reinforce this with the car's UI and behaviors, such that drivers get lulled into not paying attention.

To me, it almost sounds like Tesla would have better UX and safety if they reframed the capabilities of the self-driving to make it clear that the driver still needs to be assessing.


> They've set an expectation that the car is capable of _assessing_ every situation when it's clearly not.

How did they do that? The materials that new Tesla owners receive specify that the driver needs to pay attention in all circumstances.

From the https://www.tesla.com/support/autopilot page:

> Autopilot and Full Self-Driving Capability are intended for use with a fully attentive driver, who has their hands on the wheel and is prepared to take over at any moment. While these features are designed to become more capable over time, the currently enabled features do not make the vehicle autonomous.


>The materials that new Tesla owners receive specify that the driver needs to pay attention in all circumstances.

Can you please provide an example of material that owners actually receive that states as much? It's safe to assume that we can't expect everyone to read a support page on a website, just like how it's safe to assume that most people don't read the owner's manual when they buy a new ICE vehicle.

>How did they do that?

It's in how they promote it publicly. It's in how Musk constantly says it's "a year away". It's in the fucking name - "autopilot" has certain connotations within society writ large that Tesla can't just handwave away with legalese.


The irony is that to anyone who takes 2 minutes to read a Wikipedia article, Autopilot is like cruise control or cruise control + lane assist. It is not "full self-flying" at all. However, too many people believe that a Tesla is years ahead of the rest of the pack when it comes to "full self-driving."

That's marketing for ya.


You are right, I can't give you an example as I don't own a Tesla. Yet I find it hard to believe that when someone is purchasing $45K+ of hardware and $10K+ of beta software, they can't be arsed to read the support page for any of it.

I hope that people that pay these sums are responsible adults that don't go out there putting everyone in danger because "autopilot" is right there in the name, and of course it's a totally different thing than when the word is used for airplanes.


I found the example in the end, but I bet you'll think it's also part of those documents that you can't expect anyone to read. The subscription agreement[1] for Tesla's FSD contains the following in the "Responsibility" section (sixth paragraph):

> Active Supervision; Responsibility. Full Self-Driving capability features require active driver supervision and do not make the vehicle autonomous.[...]

[1] https://www.tesla.com/legal/additional-resources#full-self-d...


For many things, owner's manuals tell users not to do certain things while the product is clearly designed to be capable of doing them. Actions speak louder than words.

* Tesla has held multiple events highlighting the "full self-driving capabilities".

* Their marketing indicates "full" self-driving while tucking the nuance away (for the owner's manual).

* Their UI is specifically designed to highlight all of the things a Tesla is identifying. From what I've seen, it really doesn't indicate what a Tesla is confused by.

* Their driver engagement check systems clearly allow drivers to be distracted and not paying attention.

* They've had multiple situations where accidents with stationary objects have occurred. This indicates a pattern of drivers believing these vehicles are more capable than they are.


> How did they do that?

By going out of their way to call something FULL self driving that is, by their own admission in your quoted text, not full self driving.


You are right of course, but a lot of things in this world are named for what they will become, not what they are. If I were a Tesla client, I would make sure I knew what they mean by it before I took a nap in the driver's seat.


You can't call a skateboard a car because it may become one.


Can we agree that the name of FSD was selected before it actually was capable of anything? And if that's the case how should they have named it: "someday it might be full self driving"?

I totally understand your point, and I also agree with it, but I also can understand the point of view of someone as ambitious and driven as Musk that saw an idea, and then committed to it by choosing a name that reflects what they intend it to become, not what it's capable of doing at the moment.

You can see this very pattern if you watch Sean Murray's first interviews about No Man's Sky. He shows the game as being capable of doing all these wonderful things that were proven to be bullshit when it was first released, yet over the years the game was built up almost to the state he first envisioned. Was that an intentional way of cheating people, or was it just an ambitious person telling you about what they see at the end of the road for their project? In my opinion it's the latter, but the people that bought into the hype on launch day, I bet they felt pretty angry.

So my advice (and apologies if it sounds patronizing), if anyone is in the market for buying a product which sounds too good to be true, maybe research the fine print and the owner's manual before putting your life and others' in danger by trusting the marketing team's selection of a name.


Most likely scenarios:

-Driver stopped himself due to a medical reason (driver was 74).

-A powertrain issue caused the car to stop. Less likely in a dual-motor configuration. Even less likely to happen suddenly.

-Autopilot was on and driver fell asleep - Autopilot is programmed to come to a stop if driver input is not detected.

Because it's a Tesla there's a great deal of attention given to Autopilot, but in this circumstance, where the car stopped completely, the driver was most likely unable to make the car go (for medical or technical reasons).


> driver fell asleep - Autopilot is programmed to come to a stop if driver input is not detected

Hopefully it slows gently, signals a lane shift, and parks on the shoulder with hazards on, rather than slamming on brakes in the middle of the highway with cars close behind.


Of course, Autopilot gives the driver visual and audio warnings before stopping, and it takes a while to activate. It does so because it's a Level 2 system and requires the driver's supervision. In the absence of supervision it has to stop, as it's not autonomous.


GM Supercruise pulls over as I described: https://gmauthority.com/blog/2016/11/gm-super-cruise-technol...

Tesla brakes gradually and turns on hazard lights: https://www.youtube.com/watch?v=oBIKikBmdN8

As common sense suggests, a sleeping driver does not cause a Tesla to slam on its brakes in the middle of high-speed traffic. That is not the cause of this accident.


What makes you rule out an issue with the AI?


The system is not autonomous, so it requires the driver's supervision. Autopilot slowing down can be quickly overridden with the accelerator pedal, yet the car came to a complete stop.


What leads you to think that the AI could not bring the car to a complete & abrupt stop?


Because the car is not autonomous. The driver has full control of the car and can override Autopilot at any time, especially for such an obvious maneuver as slowing down. In this case, there's no evidence to even suggest it was an abrupt stop.


> In this case, there's no evidence to even suggest it was an abrupt stop.

That's exactly what happened based on this article, assuming we agree that "slam on the brakes" means an abrupt stop.

"At first, the Independence Police Department speculated that a mechanical issue caused the Tesla to lose power, but officials are now saying they are unsure what made the vehicle stop unexpectedly [...]

Last month, the National Highway Traffic Safety Administration launched a preliminary investigation into Tesla after hundreds of owners reported that their vehicles would sometimes unexpectedly slam on the brakes for no apparent reason. According to the agency, these incidents, known as "phantom braking," sometimes happened at highway speeds.

Sudden braking occurred when owners had switched on cruise control or Autopilot, a system that automatically brakes, steers, and accelerates to aid with highway driving. NHTSA is investigating the phenomenon in Model Y and Model 3 vehicles from the 2021 and 2022 model years. Owners of 2019 and 2020 vehicles have also complained to NHTSA about phantom braking."


'Phantom braking' slows the vehicle down; it does not stop it, and it can be overridden at any time by the driver pressing the accelerator. None of the incidents reported to NHTSA resulted in an accident, but the braking is unexpected and unwelcome - likely the reason why drivers reported it.

Given that earlier reports suspected a loss of power, the stopping could not have been abrupt. We don't even know if the car was on Autopilot, but we do know that the driver was 74 years old, with a higher chance of a medical emergency.


The standard argument in favor of FSD vehicles is that they perform better than the average human driver. Except it doesn't take into account the amount of adrenaline involved.


And the issue is the numbers are fake: different populations and different car types are compared. Disengagements are not counted as "20K disengagements = 20K times the human saved himself from the bad AI."

Would people get operated on by an AI doctor that is better than the medical students and the drunk doctors, but only when a good doctor is paying attention and intervenes in case the AI shits itself?


While AP is sometimes touted as "FSD safer than a human driver", in reality it's just a set of driver assists that can help a driver drive more safely. Since AP doesn't drive itself except for very limited definitions of driving, it can't itself be safer than a human driver. Like when a pair of sneakers helps a runner but can't logically be called "better than a human runner". Let's not forget that the human does most of the work in either example.

As long as any "self-driving" car operates only under human supervision it's not relevant to compare that system taken independently to the human. Compare the FSD alone with the human alone and then we have stats. Everything else is candy for people with more money than sense.


I do not see the point of conversing about full self driving when it is nowhere close to existing yet.


Tesla unfortunately uses the names "FSD" and "Autopilot" instead of lane assist or driver assist... we need everyone, including the fanboys, to demand Tesla stop pushing fake capabilities. Maybe name it MFFSD = Maybe in the Future Full Self Driving.


Yes, that was the point of my comment. Tesla does not unfortunately use the words “full self driving”; they intentionally use the words “full self driving” for marketing purposes to mislead people. How it is not a clear case of false advertising for regulators to go after, I do not know.

I could even envision the plausible deniability/marketing bullshit behind “self driving”, but to go out of your way and declare FULL self driving, and not get smacked for false advertising is beyond me.


It also doesn't take account how much the average is skewed by a minority of really terrible drivers.


Do we know this was the result of FSD and not another malfunction, such as a seized component, power failure, etc.? It seems we should not speculate if we don't know the RCA (root cause analysis).


No, the article says police speculate it was mechanical failure and loss of power, but I'm sure we'll see plenty of rigorous, unbiased followup in the following weeks.

It's funny to see the pro and anti Tesla crowd arrive at truth through verbal combat, at least.


Let's make sure we wait for the investigation to be done before accusing the AI so quickly.


If a Model 3 decides to emergency brake without warning, can the driver still steer it to the shoulder of the highway, or are steering commands from the driver ignored?

I'm guessing the first car crashed into the rear of the Model 3, causing the Model 3 to spin out into another lane. Then a second car crashed into the Model 3 (probably side impact?) as it was now very dangerously exposed to oncoming traffic in the other lane. If the driver could have steered the vehicle during the emergency braking to the shoulder of the highway they'd likely still be alive. Unfortunately if the highway was heavily congested and this driver was sandwiched between vehicles on either side, there is probably nothing additional the driver could have done to improve the situation.

From the linked Facebook post there is the following comment:

My husband unfortunately was one of the drivers that struck the Tesla. He said the car in front of him almost hit the car so they swerved barely missing him and then there was the Tesla right there . He absolutely could not of stopped in time. It's a very traumatic situation. My husband is now dealing with survivors guilt and is taking it extremely hard. We are so very sorry this happened....I have no doubt that the gentleman that lost his life was loved by many and God help us all to find some strength to deal with the grief.


> He said the car in front of him almost hit the car so they swerved barely missing him and then there was the Tesla right there.

This is not a criticism of the driver, because everyone does it, but one of my pet peeves is hardly anyone follows at a safe distance. Most of the time you can’t even if you want to because someone will merge in front of you with that much space.


Everyone does it because the law isn't enforced, and because nobody is following the law, enforcing it is seen as unreasonable.

On the spectrum between charging people for involuntary manslaughter for cases like this and saying "there's nothing they could have done" we are very far on the latter part of the spectrum.


Sure there is. Every driver has a duty to be able to stop within the assured clear distance and to maintain a proper lookout. The vehicle that struck the Tesla breached these duties and is the proximate cause of the loss. Tesla/the driver of the Tesla shares some contributory negligence for assumption of risk. 80% liability on the rear-ending vehicle and 20% on the Tesla.


My car and that of my wife warn audibly if the distance to the car in front is below a configurable distance. I have to actively ignore that warning.


I saw someone was downvoted for calling you an asshole. I guess that doesn't really follow the spirit of HN, but it's crazy to me how much we have normalized driving dangerously as a society.

We have seemingly decided it's perfectly fine to drive in a way that puts others at risk of death or serious injury. It's to the point that if you call someone out on it, like the other commenter did, you're the one in the wrong. Yet they were just calling out a person for putting other people's lives at risk. We should be condemning driving like this, just like we condemn many other activities that put others' lives at risk.


There is some ambiguity in jhoechtl's comment. It isn't clear if they are saying they choose to routinely ignore the warning, or if they are saying that they would have to ignore the warning in order to routinely follow too closely and end up in a situation like was being discussed.

One of the site guidelines is to interpret a comment in the best possible way, so I am choosing to interpret it as the latter.


The latter was the first way I read it. It is a common expression.


Same here. And it will display a constant icon on both the console and my HUD if I have insufficient stopping distance.


> Most of the time you can’t even if you want to because someone will merge in front of you with that much space.

This annoys me so much. It creates a game theory situation where everybody has to drive like an asshole in order to avoid being cut off by assholes.


Until you realize being cut off doesn’t actually impact you much, so you play a much safer game of giving everyone a wide enough berth to act like an idiot and not ruin your day. Driving habits are extremely psychologically interesting, from road rage at a stranger who did something you didn’t like, to people feeling compelled toward, and indeed proud of, unsafe driving habits that put multiple lives at risk to save a couple minutes on a trip. An argument could be made that curing these tendencies would stand to save more lives than almost any other mass behavioral correction (save maybe dietary choices).


Absolutely. In the context of highway driving especially, with its high speeds, the slight slowing down to maintain follow distance, even if people were constantly moving in, is negligible to the overall trip time, and it’s highway driving where that follow distance is the most critical to saving lives.

If people sat down with the numbers and realized how little that slight slowing down in the moment someone moves in front of them impacts their trip time, and how little it matters to the original trip time that someone passed them, the roads would be a lot safer, less stressful, and faster overall.
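
Just to put rough numbers on that, here is a back-of-envelope sketch in Python (the trip length, speeds, slowdown duration, and cut-in count are all assumptions for illustration, not figures from anywhere):

    # Rough trip-time cost of briefly slowing to re-open a following gap.
    trip_miles = 30.0   # assumed highway leg
    cruise_mph = 70.0
    slow_mph   = 65.0   # assumed speed while the gap re-opens
    slow_secs  = 15.0   # assumed duration of each slowdown
    cut_ins    = 10     # assumed number of times someone merges in

    base_secs = trip_miles / cruise_mph * 3600
    # Each slowdown covers less distance; the shortfall is made up at
    # cruise speed, so the time penalty per cut-in works out to:
    penalty = slow_secs * (1 - slow_mph / cruise_mph) * cut_ins
    print(f"base trip: {base_secs/60:.1f} min, total penalty: {penalty:.0f} s")

With those numbers the trip is about 26 minutes, and ten cut-ins cost around 11 seconds total.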


If everyone slowed down, there would be far more congestion backing up due to reducing the throughput of the road.

I would argue that the people driving at distances closer to the car in front of them are reducing congestion by increasing the throughput of the road. Obviously this is mitigated by decreased throughput due to increased probability of collisions.

But if EVERYONE drove such that they could fully stop if the car in front of them fully stopped, you would have a public uproar over how much backed-up traffic there was.

The road is basically constantly in flux, and politicians could enforce minimum following distances very easily with cameras and drive down collisions, but it would be massively unpopular and they would get voted out immediately. I presume that is why we do not already have speed cameras in the US.

It would be trivial to fine everyone breaking the speed limits, or even driving too close to each other. For a tolled road, you can go back however many years you want and retroactively fine people for going past the speed limit based on toll time stamps. But I figure the lack of this happening means the public wants to make this risk vs convenience trade off.


> If everyone slowed down, there would be far more congestion backing up due to reducing the throughput of the road.

This isn't true. The greater the following distance, the more time a driver has to respond. More time to respond means fewer and gentler inputs. In turn, this reduces propagation to following vehicles, and as a result traffic moves faster. It's the difference between turbulent and laminar flow.
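
To make the turbulent/laminar point concrete, here is a toy chain-braking calculation (a sketch with assumed speeds, gaps, and reaction times, not a real traffic model). If the leader stops from speed v at deceleration a, it travels v²/(2a); a follower covers v·T during its reaction time T and must then stop within gap + v²/(2a) − v·T. Gaps longer than v·T damp the braking wave; shorter ones amplify it down the chain:

    # Required braking for each car in a chain, given a reaction delay.
    def braking_chain(v, gap, reaction, a, n_cars=8, a_max=9.0):
        for i in range(n_cars):
            stop_dist = v * v / (2 * a)           # car ahead's stopping distance
            room = gap + stop_dist - v * reaction # space left once we react
            if room <= 0:
                print(f"car {i+2}: collision (no braking room)")
                return
            a = v * v / (2 * room)                # this car's required decel
            print(f"car {i+2}: needs {a:4.1f} m/s^2"
                  + ("  CRASH" if a > a_max else ""))

    v = 30.0  # ~110 km/h
    print("20 m gaps:"); braking_chain(v, 20.0, 1.0, 5.0)
    print("60 m gaps:"); braking_chain(v, 60.0, 1.0, 5.0)

With 20 m gaps the required deceleration grows with every car back until it exceeds what tires can deliver; with 60 m gaps it shrinks and the wave dies out.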

> I figure the lack of this happening means the public wants to make this risk vs convenience trade off.

Joe Public has never sat down to think this through, nor is he qualified to do so.


Joe Public decides this by demanding their politicians not enforce speed limits. A campaign would be dead in the water the second it talks about having cops issue more speeding fines, install speed cameras, or use toll time stamps to issue fines. The US public is not sufficiently scared of injury or death from collisions to make it a political priority.

> In turn, this reduces propagation to following vehicles, and as a result traffic moves faster. It's the difference between turbulent and laminar flow.

And in turn, this decreases the rate at which vehicles enter the highway, which backs up onto the streets that lead to the highway.


There are several major traffic accidents in Atlanta every morning during rush hour on the most important arteries. Most of these are probably caused by tailgating or aggressive behavior during merging. They turn a 30 minute commute into a 90 minute commute if any lanes remain open.

I can’t imagine that the downstream effects of maintaining a safe following distance and permitting people to merge would cause anywhere near the inconvenience of these daily, destructive events.


There are varying levels of safe driving distance. You can drive close enough to the car in front of you such that you can slow down quickly enough if they slow from 70mph to 50mph, or if they slow from 70mph to 10mph gradually.

You can also drive further back such that you can slow down if the car in front slams on their brake and comes to a complete stop. This is the distance I am referring to which would be politically unpalatable to enforce.

I agree that people weaving in and out and tailgating and cutting people off are doing more harm than justified, but it all starts getting murky once the road hits capacity limits and people start trying to figure out the best way to optimize their travel times. Eventually, with enough congestion, the situation will deteriorate from stop-and-go traffic to crawling traffic, to downtown Manhattan, to Delhi, and so on.


> it all starts getting murky once the road hits capacity limits and people start trying to figure out the best way to optimize their travel times.

that actually happens sooner when everyone is crawling up each other's asses.

once you get that first compression wave due to tailgaters slamming on their brakes then you've started the traffic jam.


I think that's an extremely suspect hypothesis. Most congestion is caused by:

- Construction

- Accidents

- Directly counter to your point: people driving like assholes and cutting others off or forcing others to take abrupt maneuvers that then cascade through the tightly packed traffic

Slow speeds and ample space do not cause congestion, changes in the flow of traffic do, and usually those are caused by people following too close and making sudden maneuvers. If enough space is given and the flow of traffic is not disrupted you could go as fast as you want with as many cars as you want and not experience congestion.

It's the mentality you're espousing here that I find so fascinating about road travel. There appears to be such dissonance that one will justify speeding or tailing or road rage in all kinds of ways.


Too many cars on a certain section of the road at a certain point in time is congestion. This can be caused by many things, such as roadwork, collisions, but more often than not, it is simply a result of lots of people wanting to drive in the same direction on the same road at the same time. Aka rush hour.

I am not justifying road rage or tailgating. I am explaining that it is inevitable once the road gets close to carrying capacity, which can only be increased in the immediate term by driving faster and/or decreasing the distance of gaps between cars.


You say you're not justifying tailgating, but it sounds like your prescription for rush hour is exactly that.

I suppose there are different characteristics in different types of congestion. Stop-and-go congestion is usually caused by disruptions to the flow of traffic, while slower-than-usual congestion is indeed caused by over-capacity as you indicate. Such times would just mean moving slower if not for people tailing and cutting others off, which leads to the stopping part.


Even moving slower will not prevent stop and go traffic if the road has reached its capacity, due to the various different accelerations and decelerations of cars.

My overarching point is something like this video:

https://m.youtube.com/watch?v=Suugn-p5C1M

There simply is not a way to avoid the consequences of being near or above a road’s capacity. Of course, I recommend everyone to play it safe and stay far enough back to avoid any liability of being too close to the car in front. But the reality is we make lots of calculations and choices in driving and at various times, we choose farther distances and other times nearer distances (even though we know it is riskier) and these collective decisions will propagate down through the whole road.


> due to the various different accelerations and decelerations of cars.

Right, changes to the flow of traffic.

> There simply is not a way to avoid the consequences of being near or above a road’s capacity.

I agree, that is a valid bullet point. But that is certainly not the only cause of congestion, and driving aggressively in congestion and attempting to justify it by keeping the roads clearer doesn't hold a lot of water I don't think.

But I see you're saying you're not advocating unsafe driving so that's good, and I take your greater point, thanks for expounding on it.


You're doing static analysis on a dynamic situation. The biggest issue with overcapacity is the start of compression waves of traffic, which are generated much sooner by tailgaters slamming on their brakes in a panic with each following vehicle having to brake harder than the one in front, eventually coming to a complete stop (or just causing an accident).


As soon as someone crashes, that congestion is going to go way up. In very specific conditions where a road is congested already, slowing down does cause a cascading effect, BUT, if everyone is already maintaining a proper follow distance, it takes longer for the road to reach max congestion in the first place. Consider it like a bunch of springs all lined up end to end: if you have the same number of springs and you're trying to insert a new spring (someone is merging), the springs that were less compressed will more easily accept a new spring, while the ones that are already compressed will experience more strain.

And imo, I’d rather be going slow because “we’re all keeping the follow distance” than going slow because “someone went too fast in an unsafe condition and crashed (or worse)”. In the face of sheer car volume, we’re all going slow(er, though note that we’re usually still getting there faster than a horse would have) pretty much no matter what; might as well be safe while doing it.


> BUT, if everyone is already maintaining a proper follow distance, it takes longer for the road to reach max congestion in the first place.

I do not think this is a realistic model of the way roads work due to rush hours and sudden increases in demand. The ideal rules are great for when demand ebbs and flows gradually, but things change when everyone wants to be on the road between 8AM and 8:30AM. Then, you can have all the protocols you want, but the system has reached max capacity and there is no avoiding that other than dealing with the consequences (congestion -> stop and go traffic -> gridlock) and of course a collision or two.

I am not saying this justifies driving like a maniac and cutting people off. I simply expect people to act a certain way as the boundary conditions of a situation are approached, and I do not expect it to be the same as how they act when we are not pushing the limits.


> Until you realize being cut off doesn’t actually impact you much

I keep a long distance, and I drive fast. Germany, so fast means actually fast. My large safety margin means I have to keep braking because slower cars move past me and into my safety margin. Then, once the original blockage in front of me is cleared, I have to pass all those slower idiots who just overtook me again.

If driving wasn’t such a rare occurrence (usually 3-4 times a year) for me, I’d probably stop keeping a large safety distance like every other asshole.

Granted, race car drivers who think distance is measured in centimeters are even worse.


This is another reason why I think of commuting as a religion. Where else would you see such zeal and self-righteous behavior?


Frankly it's fine to create enough of a gap that people are passing in front of you. My response is usually to widen the gap even more. It actually has the effect of helping to smooth traffic, reducing traffic jams.


You are also reducing throughput, and causing more congestion behind you. There is no having our cake and eating it too. Once a road hits maximum capacity at the safest distance gaps between cars, the only way to increase the road capacity is to reduce the distance gaps between cars.


In addition to increasing the gap, you have to drive steadily, with no braking. This makes you become the most predictable car on the road, people behind you can follow steadily instead of constantly accelerating & braking. This leads to reduced congestion.

This CGP Grey video covers some of this: https://www.youtube.com/watch?v=iHzzSao6ypE


There is no steady in high congestion areas with lots of on and off ramps. The rules are nice when there is no congestion and plenty of room to spread out, but the problems are inevitable at rush times or at chokepoints.


Not all congestion is created by those chokepoints. It's created because someone at some point hit the brakes, and then the cars behind get into a pattern of accelerating & braking constantly until the congestion resolves. Steady driving with lots of gaps helps resolve this faster and prevents congestion in general.


Unless you do this in the left lane or something, it probably doesn't change throughput that much, or at all. A major cause of traffic jams is cascades of braking, so leaving a safe distance allows for coasting instead of overreacting to brake lights, and prevents a traffic jam. You can get higher capacity temporarily by reducing distance, but eventually you just cause a jam that destroys throughput.


> It actually has the effect of helping to smooth traffic, reducing traffic jams.

I hear this a lot from people who practice this sort of driving, but don't the simulations suggest that the opposite is true, traffic actually improves when drivers are more aggressive?


Having cars be "aggressive" in this context basically means they flow into the voids more readily.

An example would be moving left into a "this would be an impolite move in any other context" sized gap between two cars in order to let someone merge, and then getting right back out of the lane.



Seems like the game theory being applied there is a bit of a prisoner's dilemma. Is playing "asshole" the right move?


In California, unfortunately, the answer seems to be "yes." When I moved to the northeast US, drivers were more sane on highways.


When I moved to Boston, the first lesson I learned was any gap in front of me that can fit a car will soon have one there. I think it is big cities in general not just regions.


It is not big cities per se, it is any road that has surpassed carrying capacity. Obviously, this will be seen more in areas more densely populated with cars relative to road capacity.

If you drive in south or east Asia, you will see what naturally happens when the number of vehicles on the path greatly exceed the capacity of the path traveling at a certain speed. You can basically model it as fluid dynamics at a certain point in places like Mumbai and Bangkok.


I once took a defensive driving class, and the teacher's primary point throughout the whole thing is the only area of your car you have control over is the area in front of you. You obviously can't control whether people are beside you or not. You can't stop the person behind you from tailgating. All you can really do is ensure you have adequate space in front of you to respond to events.


You don't have complete control of what's behind/beside you but you have some control.

Maintaining situational awareness of these areas and adjusting your behavior to coax other traffic to behave in a way that is best/easiest/safest/whatever-ist for everyone is a prerequisite to being a "good" driver IMO.


Unless it is just heavy traffic, I constantly tweak my speed and following distance to avoid driving in other driver's blind spots and ideally not be driving beside them.


My driver's ed teacher said a similar thing about right of way - you never /have/ right of way, you are given it. I try to drive defensively, but it's a challenge when others are able to choose not to in ways that affect my driving.


That should be "priority". Right of way is about laws allowing walkers to cross private lands etc.


>This is not a criticism of the driver, because everyone does it,

It is a valid criticism of the driver and not everyone does it. It is a major problem that so many people treat safe following distances as great places to cut someone off to save .05 seconds on their commute while risking the lives of others.


You can count lines at the side of the road. I’ve never found it particularly hard to silently count to two. If someone merges in front, we just slow down a bit and repeat the exercise.


It's not the counting that's hard. It's that others will merge in front of you constantly.


This does happen, but I find it doesn't result in me getting to my destination any slower.

If someone merges close in front of me, I reduce my speed by 5 mph or so for as long as I need, probably usually around 10-20 seconds, until there's an adequate gap again. Often someone will pull into that gap within a few mins, and I do the same thing. Overall it reduces my average speed by an imperceptible amount. It might add up to a couple minutes over a several hour journey, but I take that as worth it for being less likely to rear end someone.

Additionally, it really only happens in dense traffic when in the outside lane to exit the highway. When you're in the fastest moving lane and keeping up with traffic, few drivers are going to over/undertake you just to get into the gap. More typically they'll overtake the car in front of you too so it's not a problem.

Admittedly most of my driving is in UK and NW Europe rather than the US. US drivers seem a lot more suicidal and US highways tend to be less predictable by design.


Also as you adjust to increase the gap in front of you, the person behind starts tailgating, if they were not already.


In my experience, tailgaters are never satisfied by the speed of the car in front of them. The only good play is to leave enough space in front to stop without getting rear-ended.

In Atlanta, people will tailgate in spite of a wide open lane to the left and the driver in front going the speed of the road. I’m not going to tailgate the person in front of me just because the person behind me is tailgating.


Yeah, tailgating the person in front of me because the person behind me is tailgating has always felt like a: "well, now I have two problems" kind of situation.


If someone is tailgating me, I purposefully slow down, to 10-15% below the speed limit. If someone is determined to crash into my car, I prefer them to do so at as low a speed as I can get.


I drove a rental a year or so ago, and it had adaptive cruise control. This is where it would be like cruise control, except it would slow down if a car in front of you was going slower, and adjust the car speed to match their speed.

I'd never seen this before. I found that I couldn't really use it, because the distance you could set for it to follow the car in front of you was too close for my taste. I found you could decrease the distance by quite a bit, but I wasn't able to increase it to what I wanted.
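
For what it's worth, the usual implementation is a time-gap controller rather than a fixed distance: the "distance" setting picks a headway in seconds, so the actual gap scales with speed. A toy sketch of the idea (illustrative only; the gains and the 2-second default are assumptions, not any manufacturer's logic):

    # Toy adaptive cruise control: hold the set speed unless the time
    # gap to the lead car drops below the configured headway.
    def acc_accel(own_speed, set_speed, gap_m, lead_speed, headway_s=2.0):
        a_cruise = 0.4 * (set_speed - own_speed)   # speed-holding term
        if gap_m is None:                          # no lead car detected
            return a_cruise
        gap_error = gap_m - headway_s * own_speed  # target gap = headway * v
        a_follow = 0.1 * gap_error + 0.5 * (lead_speed - own_speed)
        return min(a_cruise, a_follow)             # follow term wins when close

    # Lead car 55 m ahead doing 28 m/s while we're set to 30 m/s:
    print(acc_accel(30.0, 30.0, 55.0, 28.0))       # -1.5 -> ease off gently

So a complaint like the one above is really that the largest selectable headway was too short, not the distance per se.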


Which brand was it? The ones I've seen adapt to the speed, so 2 bars in the cruise control adjustment means a certain distance at 100 km/h, and a longer distance at 130.


So true!


To reply to your first question: if the car decides to emergency brake, you can still override it by pressing the accelerator. In my experience, the braking that happens is "hard" but no more hard than you would do yourself if something suddenly appeared in your drive path, and if you are paying attention you can press the accelerator within a second or so. If the person behind you has adequate following distance it shouldn't be a problem.

If the car loses power entirely then it gives you a few seconds warning in which you may or may not be able to cross to the shoulder before the car coasts to a stop. In this scenario the braking is not "hard", but obviously there is no real way to recover and you'll need to get towed.

I have not experienced any scenario where the car does not respond to steering or acceleration commands when it was physically able to do so.


> if something suddenly appeared in your drive path, and if you are paying attention you can press the accelerator within a second or so.

So if something appears in your path, the right course of action is to accelerate into it?

within a second or so is great but not if you have 1/4 second time to respond.

> If the person behind you has adequate following distance it shouldn't be a problem.

Nice assumption.

> If the car loses power entirely then it gives you a few seconds warning in which you may or may not be able to cross to the shoulder before the car coasts to a stop.

may or may not indeed.


> So if something appears in your path, the right course of action is to accelerate into it?

>> ...the braking that happens is "hard" but no more hard than you would do yourself if something suddenly appeared in your drive path...

The parent post was just giving a comparison to the deceleration snap experienced during phantom braking, and saying that in the event of a phantom brake, you can cancel the event by tapping the gas (battery?) pedal.


> So if something appears in your path, the right course of action is to accelerate into it?

Not into it, but around it – if possible, of course. It's something you learn in motorcycle school, for example.

The idea is that a motorcycle is harder to maneuver while under hard braking, so you have a better chance of avoiding the obstacle or falling by going around it.

>> If the person behind you has adequate following distance it shouldn't be a problem.

> Nice assumption.

Of course. This also feeds into the above. As in, not only do you have to stop, but also to make sure that the person following you will stop without hitting you.


> So if something appears in your path, the right course of action is to accelerate into it?

You mis-parsed what I said here. My sentence was probably too long. Let me try using shorter ones. Say you are driving. Suddenly a squirrel appears on the road 50m away. You apply brakes "hard" to avoid hitting it. This amount of braking is what the car does. You can still press the accelerator. The accelerator works as expected, cancelling the software's braking.

> if you have 1/4 second time to respond.

Under what circumstances would you have this short amount of time? IMO only if somebody is following you too closely. Again, remove the software from the equation and insert the aforementioned squirrel. You brake hard for it. The person behind rear-ends you because they were following too closely. Who is at fault?

> Nice assumption

Thank you


if it emergency brakes, does steering still work or not until you accelerate?


Yes - a slight nudge on the steering wheel or pedals deactivates autopilot. You always have full control like when driving like normal.

One should therefore always have the hands on the wheel and feet by the pedals and be ready to drive.


> slight nudge

To clarify, this only applies if the car doesn't think you're about to exit the lane. It will momentarily fight you, and it will take more than a slight nudge to override it.


It does work, yes.


Self-reply to add: despite (occasionally) experiencing phantom braking and my wife once experiencing the "loss of power" scenario due to a defective drive unit, I'm still very happy with the Tesla overall. In my experience it is the best car I've owned despite the flaws that seem to disproportionately make the headlines.


> He absolutely could not of stopped in time.

Somebody (not necessarily the husband) was driving too close to the car in front. You should have enough time to see the car stopping and stop yourself. A 2 second gap at 60mph is 176 feet. A second to react still gives you 88 feet to slow down in before you even get to the point where the Tesla started to slow, reducing the impact speed to something far lower -- low enough that it shouldn't kill anyone.
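
Working that through (a sketch assuming 0.8 g of braking; 60 mph is exactly 88 ft/s):

    mph_to_fps = 5280 / 3600
    v   = 60 * mph_to_fps        # 88 ft/s
    gap = 2 * v                  # 2-second gap: 176 ft
    d   = gap - 1 * v            # braking room after a 1 s reaction: 88 ft
    a   = 0.8 * 32.2             # assumed hard braking, ft/s^2
    # v^2 = v0^2 - 2*a*d, evaluated where the Tesla began to slow:
    v_there = max(0.0, v * v - 2 * a * d) ** 0.5
    print(f"{v_there / mph_to_fps:.0f} mph")   # ~39 mph

And that ~39 mph is measured at the point where the Tesla *started* slowing; the Tesla itself travels another 150 ft or so while braking from 60, so with a true 2-second gap the following car should stop before contact.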


Not necessarily the case.

A friend of mine was in the middle lane on the motorway when somebody switched into his lane and braked hard.

He crashed against but wasn't deemed at fault by insurance.

Not saying that this is what happened, btw.


crowded expressways have a way of filling gaps


> If a Model 3 decides to emergency brake without warning, can the driver still steer it to the shoulder of the highway, or are steering commands from the driver ignored?

Yes - a slight nudge on the pedals or steering wheel will deactivate autopilot and let you drive like normal.


After you've spent another second (and 150 feet) fighting the AP to say, "no, I'm not drifting out of my lane, this is intentional".


150 feet a second? Where are you driving where you're doing that speed and changing lanes and are concerned about 150 feet?

Why doesn't the lane detection deactivate when you indicate?


Sure, maybe 100 ft/s.


> He said the car in front of him almost hit the car so they swerved barely missing him and then there was the Tesla right there .

This is how my friend lost all 4 wheels on his car. Following too close to react to stuff that the vehicle in front of him could. Hit the raised island for a slip lane when someone swerved out of the right turn lane at the last minute.


So, the car in front was able to swerve, but from 2 cars back, he couldn’t and ended up killing a person. Somebody wasn’t paying attention. Not that he’s unique in this, but his statement is baloney.


The car in front saw the car before them suddenly start to brake and was still moving fast enough to get out of the way. Which revealed an almost stationary car to the person behind, who was still driving 70mph.

I can see how that is hard to react to.

I wasn’t sure how hard though, so I did a thought experiment.

Say there is 50m between each car, and they’re all driving 25m/s. The first car starts braking at 5 m/s². After one second the distance between 1 and 2 is 47.5m. After two seconds, 40m. After 3s, 27.5m, and car 2 is evading. After 4s, car 2 is passing by car 1, which is now moving at only 5m/s. Car 1 is finally revealed to car 3, who now has a maximum of about two seconds to do something before it crosses the 50m gap.

And that’s assuming an adequate gap. They were likely closer.
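
Running the same setup through exact kinematics (a sketch using the assumptions above: 50 m gaps, 25 m/s, 5 m/s² lead braking, car 2 swerving clear around t = 4 s):

    v0, a_lead = 25.0, 5.0             # m/s, m/s^2
    lead_start, car3_start = 100.0, 0.0

    lead_final = lead_start + v0**2 / (2 * a_lead)  # lead stops after 62.5 m
    t_reveal = 4.0                     # car 2 swerves clear
    car3_pos = car3_start + v0 * t_reveal           # car 3 never slowed

    room = lead_final - car3_pos                    # 62.5 m
    room_after_react = room - 1.0 * v0              # 1 s reaction: 37.5 m left
    a_needed = v0**2 / (2 * room_after_react)
    print(f"{a_needed:.1f} m/s^2 (~{a_needed/9.81:.2f} g)")  # 8.3, ~0.85 g

Even with a full 50 m gap and a 1-second reaction, car 3 needs about 0.85 g of sustained braking, right at the limit of what a passenger car manages on dry pavement. Anything less than the full gap and it's a crash.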


Even at a safe distance, if the car was at a complete standstill, that's something completely unexpected on the highway. Having to react in such a short timeframe is close to impossible for a human, even when you keep a "safe distance".

The "safe distance" is only "safe" if all the traffic is moving at least.


I've seen cars at a complete standstill on the highway more than once. It's fair to say it is unusual but it should be fully expected, IMO.

I don't understand the scare quotes around safe distance here. Maintaining safe distance (and the implied safe speed and open eyes) absolutely would help in these situations.


Have you ever had someone in front of you swerve out of the way, leaving you facing a stopped car? It's exciting, let me tell you. The idea of safe following distance is predicated on the car in front of you not being able to defy physics when they stop; presumably their braking power is similar to yours.


Sure. But, in this case, there were three+ cars. The Tesla that stopped, the car immediately behind the Tesla swerved, and the third car ran into the Tesla.

The driver of the third car claims there was nothing he could do. That's baloney, as the car that was EVEN CLOSER to the stopped Tesla was able to react. Fact is, he was following too close and/or not paying attention, and crashed.


I agree that the third car is almost certainly at fault. "Didn't have time to stop" is nonsense in every case except if some idiot cuts into your lane within your safe stopping distance and then immediately brakes.

Drivers need to think of every vehicle as being in a moving block[1] signalling system, with each block being ~290m long (actual vehicle in the middle of this block) in wet conditions at 110 km/h[2] and ~230m long in dry conditions at 110 km/h[2]. No vehicle should ever enter another vehicle's moving block. Obviously the type of vehicle, tires, road surface material, etc. can reduce (or extend) those required block lengths, but you generally have to assume the worst-case scenario, because how are you to know whether another vehicle has bald tires or is operated by a person who is fatigued or distracted and taking much longer to respond than they should.

[1] https://en.wikipedia.org/wiki/Moving_block

[2] https://www.qld.gov.au/transport/safety/road-safety/driving-...
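
Those block lengths reconstruct from the standard stopping-distance formula, d = v·t_react + v²/(2a), doubled because the vehicle sits in the middle of its block. A quick sanity check (the 1.5 s reaction time and the decelerations are assumptions picked to match the QLD figures, not values from the source):

    v = 110 / 3.6              # 110 km/h in m/s
    t_react = 1.5              # assumed reaction time, s

    for label, a in [("dry", 7.0), ("wet", 4.7)]:   # assumed decel, m/s^2
        stop = v * t_react + v**2 / (2 * a)
        print(f"{label}: stop ~{stop:.0f} m, block ~{2*stop:.0f} m")

This gives roughly 112 m / 225 m dry and 145 m / 290 m wet, matching the cited numbers.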


If the driver of car 3 could not see past car 2, and car 2 hadn't realised the Tesla had stopped and instead swerved out of the way, say, 1 second / 90 feet (assume 60mph) before collision, that would give car 3, traveling at a reasonable gap, no more than 3 seconds / 270 feet to recognise what was going on and stop.

Should have recognised something odd (a broken down car) within 1.5 seconds, leaving 1.5 seconds to slow down.

Applying the brakes from 60mph would mean you'd still be traveling at over 40mph when you hit the back of the car.

Thus you need to give more than 2 seconds if you can't see beyond the car in front


Yes. But because I was following at a safe distance from the car in front, that gave me adequate time to slow down and stop without hitting the stopped car.

Safe following distance isn't predicated on defying physics, it means assume the car in front might stop dead for any reason - hitting an unseen obstacle, swerving to reveal an obstacle, sudden mechanical failure, animal collision, whatever. The distance you need to react and stop is around 100 metres at 70mph. Therefore, that's the distance you should keep from the car in front.

As a bonus, it gives any tailgaters more time to notice you're slowing down too. Hopefully this means you're less likely to get pancaked between two cars.


>Safe following distance isn't predicated on defying physics, it means assume the car in front might stop dead for any reason

No, it's based on assuming the car in front might brake at standard non-physics-defying rates at any time. The 'safe' following distances that are advised don't allow for encountering stationary objects. That's why they're called following distances, as in, you're following another vehicle that obeys the laws of physics.


But that Tesla didn't defy physics. Nor did it fall out of the sky. It braked and slowed like any other car.

Maybe there truly wasn't enough time to completely avoid a crash. But there should have been enough time to avoid a serious (deadly) crash.


>It braked and slowed like any other car.

Imagine you're following - at a safe distance - a vehicle. The driver of the vehicle you're following doesn't notice that the car ahead of them is emergency braking. They notice at the last second and swerve around it to avoid a collision.

You are then face to face with a nearly-stationary object which was concealed from you by the vehicle ahead of you.

From your perspective, it fell from the sky. If you don't swerve, a collision is unavoidable.


I think we are in agreement about the situation, I just disagree and think you're either driving too fast or following too close. Your logic leads to pile-ups: in the event of a pile-up, a car stops dead unexpectedly in front of you and you have nowhere to swerve.

I can't speak for the USA, but in the UK, if someone swerves and you rear end a stationary object, you are at fault. Indeed if you rear end another vehicle, you are nearly always at fault (unless e.g. they pulled in front of you then immediately slammed the brakes).

For stopping distances, this is what we are taught in the UK (rule 126): https://www.gov.uk/guidance/the-highway-code/general-rules-t...

Stopping distance at 70mph is about 100m and "The safe rule is never to get closer than the overall stopping distance". This is a minimum - in adverse conditions the distance should be even greater.

This means when you see the stationary object, you're about 100m away, which is adequate time to stop IF you're paying attention and your car is well maintained (we have annual MOTs to ensure this).


I don't know about the USA, but in the UK our stopping distances assume the car might become stationary at any point. That is, they are actually stopping distances - not following distances. If you follow closer than that on your test, you will fail.


Frankly, I can never imagine seeing an empty football field's worth of space between two cars on a busy freeway. Even if you attempt to create such a space, other drivers will fill it.

This assertion about following distance feels purely theoretical for busy freeways.


Caveat: I drive in the UK so driving quality may be different here. I always maintain adequate stopping distances on busy motorways. People do fill those gaps occasionally and it does result in me driving slightly slower on average by a couple mph maybe, but it's perfectly possible to maintain a gap.

A lot of experienced drivers here claim it's impossible too, but there's a good way to know that it's not: on your driving test, you'll fail for following too closely, so most people manage to keep an adequate stopping distance for the test. Following closely is usually a habit picked up when someone becomes a more confident driver.

The real reason is probably people don't want to take the tiny average speed hit or find it unfair that they keep letting people in.


The question is whether the accident would still have happened, if the second car drove a Tesla, also with the autopilot engaged?


This seems like when Windows decides to shut down and apply patches in the middle of a work session. I’ve been in situations where Windows thought it essential and unavoidable to reboot and stay offline for 40 minutes starting at 1:25pm as I was screen sharing and presenting to 50 people. We all watched the countdown.

It was surreal and an example of such stupid, anti-user design.

I spoke with the “IT” person who set up this policy and they explained that I should have left my computer on to patch overnight. So basically it was ok that I, and my audience, suffered because I forgot to do something I didn’t even know was necessary.

In my case no one died, but I hope Tesla doesn’t apply the same logic to patching.


I’ve never had Windows do this, but my machine also isn’t managed by corporate IT. I think Windows' downfall is really that it provides too many knobs for these people to control.

Corporate IT is inherently anti-user in my experience. They only care about their policies and not enabling you to get your job done.


My home gaming PC has done this, and I frequently dust off old VMs where Windows requires patches just to start. I didn’t set any policy and wish I could turn this option off.

My cohabitants not being able to play games when they like isn’t a huge discomfort, but does make me wish more games would run on Linux or Mac (where this never happens).


> Corporate IT is inherently anti-user in my experience

As someone who did help-desk in the past, corporate IT is not anti-user, but pro-efficiency, though the outcomes may be indistinguishable.

IT is typically under-resourced and has to deal with too many users with too many requests, plus prima donna users - more of them than you'd imagine - who demand personalized white-glove service (but don't want support offered that way to others, because they want service expedited for themselves).


We’re all understaffed. But I’ve never designed such a stupid process and then blown off complaints.

Scripting patches to only run at off hours or prompting to patch at the end of the day is not rocket science. Perhaps they are overworked. Perhaps they are stupid. Perhaps they are just jerks.

Having non-idiotic patch policies isn’t asking for “white-glove service”, and responding to user complaints when an IT process breaks a user's workflow is something we get paid to do.

Comically stupid IT processes lead to more user requests. So “doing it right” is actually easier than doing it stupid and having people complain.


This happens on personal, non-managed Windows as well. In college, my friend dropped out in the middle of a LAN Starcraft game because of it.

Windows simply isn't a good choice for anything but a toy system or a specialized one to be used rarely (e.g. for gaming).


>Corporate IT is inherently anti-user in my experience.

This is a two-way street; there is a reason they still always ask if it's been rebooted. They can tell it hasn't been, and that they are being lied to.

>They only care about their policies and not enabling you to get your job done.

Simply not true; it's almost always the first concern. Judging by your tone, you are not considering that they have a job to do as well, and it relies on your assistance; otherwise you get a forced activity like that.


This “leave it on overnight” policy is a minor point of contention between me and the IT department currently, because I work from home and electricity prices are insane.

Our Windows is set up to patch overnight, which means it wakes up at some random time and then just sits there consuming power. That doesn’t fly with me, so I unplug the power so it can't do that…


Same complaint from me. They also require vpn access for that overnight. So I have to leave my laptop on every night with an active vpn session just in case they want to patch.

It seems smarter just to download the patch in a background process, then prompt to patch at the end of the day.

A “hey, please leave your laptop on instead of logging out and we’ll shut down when we’re done” would be well received.


I hate that. I leave my workspace open and go home, come back the next morning, and am on a fresh desktop because Windows decided it's totally fine to reboot in the middle of the night.


And then on top of that, they disable the "resume last session" setting in your browser, so you come back to find the dozen or so tabs you were using for your current task are gone, and you need to spend time going out and finding all those pages again.


Tesla’s software is dangerous. I own one and I have to be a lot more alert than when I am actually driving. Just yesterday, driving through a rural town, it was stopping every time it saw a truck in the oncoming lane. I suspect something like that happened in this case as well.


I can't imagine having a Tesla with the drive-assist software in its current state. It's like having a mentally ill spouse at home: you always wonder what irrational thing is going to come next; you have no peace any second of the day. Now try that during something safety-critical, like driving. The constant hypervigilance is exhausting, and when you aren't looking, the Tesla drives itself into a traffic bollard. Why is such crap road-legal?


I have a Model Y, and I don't feel comfortable with autopilot or lane assist. Both of them do "dumb stuff" regularly. The day I got the car, lane assist attempted to steer me into the center median of a highway when the car was going around 50 miles per hour. After that, I realized I'm not interested in "beta" features that put my life in danger. Autopilot drives like a paranoid grandma. It brakes constantly when it shouldn't. All that said, the car is a lot of fun, and I really enjoy it, but I choose to maintain full control at all times.


Because our current form of government can't keep pace with the rate of new things.


> The constant hypervigilance is exhausting

This is exactly it, IMO.

Whether you like AP or not largely comes down to how your driving style compares to the computer's. If you are a defensive driver, you won't really care for the kinds of obviously dumb situations AP will happily drive you into. Eventually it reacts, but it takes a lot longer for the sensors to notice what a human brain can easily predict is about to occur.

I found AP to be an interesting toy, but it never made driving more relaxing for me, because I had to be more aware, not less, of everything around me.


Autopilot is not as bad as you think it is; for one, it does not stop abruptly. Thousands of drivers are using it regularly. Drivers report less fatigue and better attention to things that matter, rather than just keeping the car in its lane. Of course, there are some drivers that are not comfortable with it yet.

The EU seems to have more stringent regulations - for example, to ensure that AI maneuvers are less unexpected, EU regulations put a hard rate limit on the steering angle. The trouble with regulating at that level is that in some cases (curved roads, hazard avoidance) it makes the car less safe.

Overall, Level 2 systems, like Autopilot, are not autonomous, and their performance depends in large part on the driver's judgment. I think regulations need to focus on human/AI interface requirements a bit more.
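
For readers wondering what a hard rate limit on steering angle looks like in code, here's a toy sketch; the rate and tick values are made up for illustration and are not the actual EU limits:

    # Toy steering-rate limiter: clamp how far the commanded angle may
    # move per control tick. Values are illustrative, not regulatory.
    def limit_steering(prev_deg, target_deg, max_rate_dps=5.0, dt=0.02):
        max_step = max_rate_dps * dt
        step = max(-max_step, min(max_step, target_deg - prev_deg))
        return prev_deg + step

    # A sudden 10-degree avoidance command gets doled out 0.1 degrees per
    # 20 ms tick: smoother for following traffic, but slower to evade a hazard.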


I agree. Plus it is a lot more stressful to imagine that the car could stop randomly rather than hitting an object in front of you (which is already quite bad). The former is literally mentally exhausting.


and paying $10k for it, too. I imagine a number of customers paid for it, so they want to use it, no matter the state it is in.


that is the only way it will ever get better.


Maybe the lawmakers are asleep at the wheel?


As a non-owner, it seems kinda scary that the behavior can change with the regular updates. One day it stops for trucks in small towns, the next day it doesn't.


Does Tesla force OTA updates, or can the owner turn them off and accept them manually?


There are a couple controls. You can opt-in to get releases earlier, or wait until they're a bit more fully baked. And you can refuse the update when it pops up. I don't recall it ever being forced, but I could be mistaken.


Your experience is not universal. In my case the combination of me+software is certainly safer than me alone, and also less taxing for me.


It comes down to your driving style. Defensive drivers will not like how AP drives, and it makes driving much more tiring, not less. If you are comfortable with the way AP drives, though, I could see it being relaxing. Hell, some people sleep when AP is cruising down the highway, so clearly there is a spectrum.


AP somehow doesn't brake soon enough for my comfort when approaching slower cars, and it doesn't accelerate quickly enough when traffic speeds back up.


Can you explain how you come to the conclusion that it is safer?


There have been a number of occasions where I would have been in an accident had it not been for the software saving me. Obviously it is possible that one day the software will result in an accident that would not have happened had I been driving alone, but thus far that has not happened, and so, thus far, I can conclude that the combination is better than me alone.


It’s fairly rare for a driver to be involved in “a number of accidents” in just a few years. Most drivers go decades with zero collisions, so if you’re experiencing multiple per year, saved only by Tesla software, you might try to see if there are other avenues that you could explore to reduce your risk to be more like the population average.


I get what you're trying to say, but if anything, you're just making the case for Tesla software.

Consider that there is a range of drivers, from "good" to "bad". If most drivers go decades with zero collisions, that's great, and it probably puts me closer to the "bad" end of the spectrum. By your own admission drivers like me should explore avenues to reduce our risk. Why is Tesla software not a valid avenue?

If you're a "good" driver, you don't need it, and that's fine, you can get some other car or drive with autopilot off or whatever. But for us "bad" drivers, the software makes us safer (both personal risk and to others on the road) so why not use it? What other avenues would you suggest exploring?


I would say close calls are fairly common, especially in urban areas with lots of traffic. It’s not just your driving but the people around you. It takes two to get into an accident.


Right. I suspect that the net effect is that the Tesla software transforms what would have been close calls into…still close calls, where the Tesla software gets credit for a “save”.

If a Tesla “saved” a driver 5 times in 20K miles, my first question is always going to be “how many collisions did they have in the prior 20K miles in their other car?”


This is a good point, and I agree. If there was a close call you can't actually say for sure if it would have been a collision without the software.


This effect has a large influence on the overall A/B analysis. If a pre/post analysis suggests that Tesla saved you from only 0-1 collisions rather than the 4-5 you had estimated, your tolerance for newly introduced collisions would naturally be much lower.
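
As a sketch of what such a pre/post comparison looks like statistically (all numbers hypothetical): condition on the total event count and test whether the split between the two periods is consistent with equal underlying rates.

    from scipy.stats import binomtest

    # Hypothetical driver history: 1 collision in 20k miles before,
    # 0 collisions in 20k miles with driver assist.
    before_events, before_miles = 1, 20_000
    after_events, after_miles = 0, 20_000

    # Under equal rates, before_events | total is Binomial(total, p0).
    total = before_events + after_events
    p0 = before_miles / (before_miles + after_miles)
    print(binomtest(before_events, total, p0).pvalue)  # 1.0: counts far too small to tell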


One person cannot drive enough miles in their lifetime to allow making a determination that one system is definitely safer than the other. The reliability that we require from an autonomous system means you might never experience a safety failure (again, even if you drive every minute of the rest of your life), but the system is still less safe.


True, but manufacturers can look at aggregate miles traveled and come to some conclusions about the safety of vehicles without any safety systems, with active safety systems like automatic emergency braking, and in Tesla’s case, Autopilot. They publish the statistics regularly and crashes are far less common per million miles driven when autopilot is engaged.


I think these stats aren’t very useful because 1) Tesla drivers are different from other drivers, as they are rich. Comparing miles driven by rich people to all miles driven isn’t useful for knowing if Teslas are safer; 2) Tesla has really low numbers, so it’s hard to compare a small sample to a huge one. It would be like comparing walking accidents among Seventh-day Adventists: they may walk a million miles as a group, but that’s nothing compared to the trillions of miles walked by the entire population.


Tesla's claims that self-driving is safer were recently debunked:

https://mobile.twitter.com/Tweetermeyer/status/1488673180403...


There have been difficulties in independently verifying at least some of these claims. I do not know whether they have been cleared up.

http://www.safetyresearch.net/blog/articles/quality-control-...

https://jalopnik.com/feds-cant-say-why-they-claim-teslas-aut...


> crashes are far less common per million miles driven when autopilot is engaged.

Does this count crashes that occur after autopilot disengages?


>They publish the statistics regularly

The data is not transparent; they publish what they want. How many times did drivers intervene? Does Tesla (or the fans) count those as a +1 for the human and a -1 for the AI in the stats? Nope.


It's my understanding there is no independent, peer-reviewed scientific study showing Tesla cars are safer.


Exactly. I mean, how many accidents do you expect to be involved in over your lifetime? Probably 2-3 (a scratched bumper) and 0 severe ones.


I agree, and I think most Tesla owners feel this way, given that the AP outrage seems to manifest itself exclusively on Hacker News.


My wife forbade me from using AP [with her in the car] after a few phantom braking occurrences. I got sorta used to it and could jab the go pedal pretty fast, but it scared the daylights out of her. Can't really blame her. I get nervous enough as a passenger when there's a human driving, much less when a computer is driving that mistakes shadows for obstacles.


Exactly. Correcting software errors is a whole lot more effort than just sticking to your own way of driving.


Tesla/Elon Musk have this weird irrational obsession with using only cameras for their autopilot. Yes, humans can drive using only two eyes (and also actually ears, touch feedback and acceleration sensors), but they also have a lot of accidents. It's not obvious that you can do better with software only, and in any case you're crippling yourself by not using additional inputs for no good reason.


> irrational obsession with using only cameras for their autopilot

it's so they can upsell FSD with the highest margin using the cheapest hardware (they're not even good cameras: 1280×960 resolution, because they want to push pixels directly into their neural net, and the dynamic range is poor; I will never understand why they didn't spring for infrared night vision, Cadillacs had it 20 years ago)
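
Back-of-the-envelope on what 1280-pixel-wide frames buy you (the ~60 degree horizontal FOV below is my assumption, not a published Tesla spec):

    import math

    # How many pixels wide does a 1.8 m car appear at distance d,
    # given a 1280 px sensor behind an assumed 60-degree horizontal FOV?
    def pixels_on_target(distance_m, h_pixels=1280, hfov_deg=60.0, width_m=1.8):
        angle_deg = 2 * math.degrees(math.atan(width_m / (2 * distance_m)))
        return h_pixels * angle_deg / hfov_deg

    for d in (50, 100, 200):
        print(d, round(pixels_on_target(d)))  # ~44, ~22, ~11 px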


I think you could also say that people seem to have an irrational obsession with Tesla’s decision to only use cameras. Camera only systems aren’t uncommon among other automakers (Subaru Eyesight, for example.)


There is nothing irrational about being concerned by the fact that complaints of phantom braking incidents increased significantly after Tesla dropped radar from its data sources.

https://arstechnica.com/cars/2022/02/teslas-radar-less-cars-...


Your comparison does not apply. I do not believe Subaru is selling Eyesight as a hands-off driving system. AFAIK it's purely marketed as a safety system for emergency braking, lane keep assist, and the like. On Tesla's side, they market it as the holy grail, "autopilot".


The reason I’m making the comparison is that there are reports of phantom braking when Teslas aren’t in Autopilot.

The same vision-based system is used for AEB in Teslas. Subarus use a vision-based system for AEB but don’t seem to experience phantom braking.

I think you could reasonably conclude that autopilot aside, a vehicle can safely use a vision-based system for AEB without phantom braking being a huge issue.


And its lane assist isn’t “keep the car centered”… it’s the steering equivalent of emergency braking, a sudden jerk away from the line it thinks you’re about to cross.


The newer versions have lane centering which isn’t quite equivalent to Autosteer in a Tesla, but it’s much closer to it than the older versions that only have lane keep assist, which only nudges you away from a lane line you are about to cross.


Our human "cameras" also scan constantly, have crazy dynamic range, employ dynamic shades (hands), and move around several feet inside a protective weatherproof shield, as well as employing "sensor fusion" with "microphones" and "vibration detectors".


While I think there's a good reason to suspect Tesla's software is at fault, I also note this:

> 74-year-old Terry L. Siegel

Every few years there are reports of a (usually foreign) car manufacturer having problems where the car "just accelerates on its own". But what's strange is that it only affects people over 65. Almost like above a certain age, people occasionally mistake the brake and gas pedals and don't react to the mistake quickly enough.

I think it's worth waiting for the full report on this accident before loudly proclaiming that this is Tesla's fault. It may be, but it also may not be.


I'm remembering the Santa Monica Farmers Market Crash: https://en.wikipedia.org/wiki/Santa_Monica_Farmers_Market_cr...


This is a good point. There are many elderly people struggling to keep up with technological advances. I’ve seen these poor people get confused by things like WiFi passwords and self-checkout kiosks at the grocery store. Of course, these same people would likely never set foot in a Tesla. But it is important to think about these drivers when taking into account the safety of these systems.


Yet another reason for me to avoid private transportation like the plague. As if humans driving 1 ton death machines wasn't bad enough, we've decided to add experimental technology into the mix.

Nope. I'm sticking to trains whenever possible, thank you very much.


They're usually 2 tons now.


OTOH, most of us will die from a heart attack or cancer. So to the extent that I can control risk factors, driving a car is fairly low on the list. Especially given the degradation of lifestyle I would suffer if I tried hard to avoid it.


That's totally understandable. I certainly don't think driving a car is a moral failure or anything. I just personally try to avoid it as much as I can because, from my perspective, it's not worth it for me or the other people involved. Ideally I would live in a world where nobody needs cars to live comfortably, but for now all I can do is optimize for myself and organize to make it happen for my grandchildren.


But what if the bus/train stops unexpectedly and another drives into it? It happens. Mechanical failures happen. Human error happens. Accidents happen.


I think if we compare deaths per passenger-mile between cars and trains, trains come out very favorably. Possibly only beaten by airplanes.

Edit: Numbers

In the US

> Over the last 10 years, passenger vehicle death rate per 100,000,000 passenger miles was over 9 times higher than for buses, 17 times higher than for passenger trains, and 1,606 times higher than for scheduled airlines

That sounded rather high for trains (dunno what the US is doing wrong), so I checked some other country.

In the UK:

> Over the last 10 years, passenger vehicle death rate per billion passenger kilometers was over 5 times higher than for buses, 1100 times higher than for scheduled airliners, and an infinity higher than for passenger trains (no deaths were reported).

Deaths per billion passenger-km:

    Car           1.1
    Van           0.4
    Water         0.4
    Bus or coach  0.2
    Air           0.01
    Rail*         0

And of that already almost negligible death rate, it seems that rail deaths are mostly people at rail crossings and maintenance personnel, not passengers.
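
To put the US and UK figures above on the same footing (they use different units), the conversion is a one-liner:

    # Convert between "per billion passenger-km" (UK) and
    # "per 100,000,000 passenger-miles" (US).
    KM_PER_MILE = 1.609344

    def per_bn_km_to_per_100m_miles(rate_per_bn_km):
        return rate_per_bn_km / 1e9 * KM_PER_MILE * 1e8

    # e.g. the UK car figure of 1.1 deaths per bn passenger-km works out
    # to about 0.18 deaths per 100M passenger-miles:
    print(round(per_bn_km_to_per_100m_miles(1.1), 2))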


How about when you compare a Tesla in driver assist mode per-mile with a human driver per-mile?


Given that Tesla drops you out of assist mode as soon as it encounters a confusing situation, that's a tricky question.


Not really a fair comparison unless we can somehow only count non-autopilot accidents that happen in places where autopilot actually works.


Human error does happen, but it results in death an order of magnitude less frequently in buses, trains and airplanes than in private vehicles.

https://injuryfacts.nsc.org/home-and-community/safety-topics...


Nice for you that that’s an option.


Indeed. I'm fighting for a world where it's an option for everyone.


The headline emphasises the vehicle make, and we assume this is another criticism of Tesla’s autopilot tech. But that determination has not been investigated yet.

If the headline instead emphasised that the driver of this vehicle was seventy-four years old, we might assume it was a different criticism entirely.


They’re just trying to dogpile on the earlier “brake check” reporting. A brake check is not the same as a car coming to a complete stop. There’s a lot more to this.


This is sad, but it's worth noting that even with a human driver, a car that stops suddenly is not responsible for any subsequent rear end collision(s). The colliding driver has the responsibility to keep a safe stopping distance at all times.


This isn't how driving works in practice. Suddenly stopping on a fast freeway isn't a reasonable thing to do unless it's absolutely necessary, and it's not reasonable to expect the following driver to constantly be considering the possibility that the driver in front will slam on the brakes for no reason. Insurance companies (at least in some parts of the world) have a standing agreement to just hold the following driver responsible when there's a collision like this, because in the majority of these collisions, the following driver is at fault. It's efficient for them to agree to this convention to avoid the cost of investigating each collision individually. This has led to the common misconception that the following car is necessarily responsible if there is a front-to-rear collision.


> it's not reasonable to expect the following driver to constantly be considering the possibility that the driver in front will slam on the brakes for no reason.

It's totally reasonable. It's not hard to maintain a safe following distance. Sudden braking isn't the only reason to do so, and insurance claims aren't the only motivating factor. You drive defensively so as to reduce the likelihood of death and destruction while making life easier for other drivers.


> It's not hard to maintain a safe following distance.

So if a car swerves into your lane 10 meters in front of you, then slams on the brakes, you're at fault because it should have been so easy to maintain a safe following distance? (This isn't hypothetical, I've had this exact scenario happen multiple times -- on/off ramps seem to bring out the worst in some drivers).


I don't think anyone is saying what you're suggesting. If you're following a car and they stop, you should be able to stop as well. If a car pulls in front of you and slams on their brakes, then that is quite a different scenario.


> it's not reasonable to expect the following driver to constantly be considering the possibility that the driver in front will slam on the brakes for no reason

Whether or not it's reasonable is immaterial. If you constantly consider the possibility of the cars around you to do something unexpected, then you're more likely to avoid death and destruction. As well as prevent the death and destruction of the people around you. It's called defensive driving.

Maybe it's unfair, or too much to ask, or just too difficult. But it's reality.

I'm rooting for self driving cars because I believe that, theoretically, they'll do a much better job than people. In the long run.

However, today, you have to pay attention or you might die. And even if you pay attention you still might die.

My dad told me a story about a coworker he had who was in a car accident. He was sitting in a long line on the highway at a tollbooth. It was foggy. My dad's coworker was watching the fog behind him like a hawk because he didn't feel safe just sitting completely stopped on the highway. A semi-truck appeared out of the fog moving at high speed. My dad's coworker cranked his steering wheel and tried to accelerate off to the side of the road but got rear-ended by the truck. He was lucky. Because he was already headed away, his car got bumped out into a field. His car was destroyed and he got a bunch of injuries, but he lived. The people in front of him in the line were not lucky. A bunch of them died.

It's not fair or reasonable. But constantly considering possibilities saved this guy's life. The other people are dead.


Speaking from the UK, if you leave a safe stopping distance on a motorway then, invariably, the gap will be filled quickly with another car.

The willingness of so-called intelligent beings to needlessly risk their lives (and the lives of those around them) for no gain whatsoever is truly astounding.


If people drove such that they had enough stopping distance to come to a full stop on highway at 65+mph, traffic would get backed up very quickly.

Road capacity is a function of number of lanes, the velocity of the vehicles traveling in those lanes, and the density of the vehicles traveling in those lanes. If you reduce the latter, you reduce road capacity.

Theoretically, everyone would be safer if they drove far enough back from the car in front to be able to come to a full stop. Practically, the political uproar of enforcing this would be untenable, but I would be entertained to see the forthcoming overpopulation vs pave-over-all-the-land-with-roads vs mass transit vs variable-rate tolling conversations that ensue.
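
To put rough numbers on the capacity hit (a sketch; the reaction time, friction, and car length are stock textbook assumptions, not measurements):

    # Vehicles/hour/lane if every driver keeps a full stopping-distance gap.
    G = 9.81

    def flow_per_lane(speed_ms, reaction_s=1.5, mu=0.8, car_len_m=4.5):
        stopping_m = speed_ms * reaction_s + speed_ms**2 / (2 * mu * G)
        headway_m = stopping_m + car_len_m   # road consumed per vehicle
        return speed_ms / headway_m * 3600

    v = 65 * 0.44704  # 65 mph in m/s
    print(round(flow_per_lane(v)))  # ~1000 veh/h, roughly half of typical observed lane capacity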


I’m dubious that the equation is so simple.

> Road capacity is a function of number of lanes, the velocity of the vehicles traveling in those lanes, and the density of the vehicles traveling in those lanes. If you reduce the latter, you reduce road capacity.

And yet study after study shows that increasing the number of traffic lanes only worsens congestion.


https://en.wikipedia.org/wiki/Induced_demand

If you have 10 cars wanting to use a stretch of road capable of carrying 10 cars at 8AM at speed X with the safest distance gaps, and you add 10 more cars, then there will be some combination of congestion behind this stretch of road, the distance gaps getting smaller, or cars driving faster now that there are 20 cars.

If you add a lane so that 20 cars can travel at the safest driving distance gaps at 8AM at speed X, but this then causes 10 more cars to come on the road because they see the possibility of using a road that is maybe just under maximum capacity, then you now have 30 cars on a 20 capacity road, and once again you end up with previous situation as 20 cars on a 10 car capacity road.

Obviously these things are impossible to exactly predict and constantly in flux and so engineers and politicians and drivers are all crudely estimating the shifts of supply and demand curves, and the overestimating and underestimating will cause congestion, or complaints of excess road, etc.


You are correct but I'm afraid this is not an excuse to drive at speed 2cm from the rear bumper of the car in front of you.

I should add this is my near-daily experience driving to my office in a rural area at 5am, when it's just me and a random car behind me, with 2 miles or more of empty road behind him.


> You are correct but I'm afraid this is not an excuse to drive at speed 2cm from the rear bumper of the car in front of you.

I anticipated this response, hence I tried to pre-empt it by making my first qualifier that I am not talking about tailgating 2cm from the car in front.

> If people drove such that they had enough stopping distance to come to a full stop on highway at 65+mph,

This qualifying statement was supposed to convey that the theoretical safest distance to travel behind someone at 65mph, 75mph, or more is so big that I would not expect it to be followed anytime other than the dead of night.


You are correct, again, but please consider that in order to avoid killing or seriously injuring someone in a road accident you don't need to come to an absolute stop.

You can actually avoid a collision at moderate speed by simply giving your brain time to react to events happening in front of you. And, even if you do hit something, doing so at a moderately low speed may not turn out so bad?


For sure, I keep as much space in front of me as possible to avoid touching the car in front. I just do not expect everyone else to.


Multiple studies in The Netherlands have shown that decreasing the speed limit during rush hour increases the road throughput. So this isn't the simple equation you try to paint, there is a nonlinear interplay between the average velocity, the variance of the velocity, and the vehicle/lane density.
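
A textbook illustration of that nonlinearity is Greenshields' toy model, where throughput peaks at half the jam density rather than at free-flow speed (the parameters below are invented for the example):

    # Greenshields: v = v_f * (1 - k/k_j); flow q = k * v, peaking at k_j/2.
    V_F = 120 / 3.6   # free-flow speed in m/s (120 km/h)
    K_J = 0.125       # jam density: one car per 8 m of lane

    def flow_per_hour(k):
        v = V_F * (1 - k / K_J)
        return k * v * 3600

    for k in (0.02, 0.0625, 0.10):
        print(k, round(flow_per_hour(k)))  # 2016, 3750, 2400: the middle density wins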


That will work up to a point, but I would bet there are many situations where the number of cars that want to be on the same piece of road at the same time overwhelms those kinds of measures. And with the same result, since congestion forces everyone's speed to be lower anyway.

I can think of quite a few chokepoints in the US (Manhattan/SF-San Jose/LA/Seattle/Austin) where no matter what you do, the crush of cars will be too much for the roads that exist.

In these cases though, I think public transit is the only real solution, but that is a different topic.


> Speaking from the UK, if you leave a safe stopping distance on a motorway then, invariably, the gap will be filled quickly with another car.

Absolutely correct. It is then your responsibility to put a safe stopping distance between yourself and _that_ car.

You are in a death machine surrounded by other death machines. Sometimes you have to choose the less convenient option.


Not entirely true. Some states in the United States have "brake checking" traffic laws, and others that do not have such laws can issue a citation for reckless driving in a case of brake checking. Like a lot of "road rage" stuff it probably goes underreported and under-prosecuted. (brake checking is going to be a lot easier to prove in the near future when cars are littered with cameras)

Neither would seem to apply to an automated system that randomly stomps on the brakes. I would think a system like that might affect fault on the civil, rather than criminal, side, however. For example, it would seem a bit mad for the insurance companies not to blame the vehicle whose screwy brakes caused a multi-car accident.


Actually stopping on the highway is not allowed in the Netherlands. So no, you're wrong.


Same in the UK. Nevertheless, I have encountered stationary traffic on many occasions anyway, and I've had to use the brakes to bring my car to a halt as well.


Where I live that's only true if the stop was warranted. Whether the human driver or an assistance system is the cause is not relevant.


But isn't that the whole point? What's "safe stopping distance"? Enough to react to the vehicle in front of you slamming the brakes at what speed? It all changes based on road visibility, if there are turns, if there are other cars, etc.

It's completely different to drive behind a vehicle going at 100 km/h on an empty, well-maintained (yeah, the 3rd world where I live has this variable too), straight, ascending, dry road, vs on a sinuous downhill mountain route, with fewer lanes, in some weather condition, etc.

If a car in front of me SLAMS the brakes in the first scenario, I'm certainly not responsible for not being 3km away from him. In the second? Totally different.

"Safe stopping distance" is not a fixed number. And I doubt any car can properly calculate that because there's just too much context to be accounted for.


>What's "safe stopping distance"?

At least 2-3 seconds, minimum. Did you not take any driver's ed classes?

>Enough to react to the vehicle in front of you slamming the breaks at what speed?

Time, not speed. It will take the average person around 1.5 seconds to react, and a car at least a second to brake. Time is not hard to judge regardless of distance: you can see when somebody in front of you passes literally anything (a sign, a tree, a rock, a road marker, whatever), then just count. FFS.

>If a car in front of me SLAMS the brakes in the first scenario, I'm certainly not responsible for not being 3km away from him

Not responsible for not being 3000m away? Sure. ABSOLUTELY RESPONSIBLE for not being 50-80m away? Yes. Both legally and morally. Maintaining at least 2-3 seconds of distance just isn't a big fucking ask. And yes, that's just as true if it's an "empty, well maintained" road; well, why the fuck are you tailgating in that case? Shit happens. Animals suddenly leap across the road, a pothole opens up, someone collapses in a weird place, etc. I literally was walking with a friend along an empty straight road in a rural area a few weeks ago and they suddenly just blacked out while walking and collapsed directly into the road. Scared the crap out of me. They'd had a pulmonary embolism not too long ago and their lung function was badly affected, and while they're trying to build back up, sometimes they just... black out. And if you're in a car you won't see someone lying down until you're fairly close. Low probability event? Absolutely. But over hundreds of millions of people it happens.

People sometimes need to stop as fast as possible. It is in every single way your job not to rear end them if they do. Period.
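
For reference, here's what a fixed 2-second gap works out to in metres at various speeds (trivial arithmetic sketch):

    # A time gap is easy to apply at any speed (count past a landmark),
    # and the distance it buys scales linearly with speed.
    def gap_metres(speed_kmh, gap_s=2.0):
        return speed_kmh / 3.6 * gap_s

    for v in (50, 80, 110, 130):
        print(v, round(gap_metres(v)))  # 28, 44, 61, 72 m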


Well, good luck with that. You can apply that kind of thinking to yourself, but not to everyone else.

People don't operate like that. Sometimes, if you keep that distance at all times like you're saying, you'll end up putting yourself and other people in danger, just because you're the only one thinking like that. I've had my fair share of furious truck drivers cutting me off at high speed in terrible situations at night, while our baby is crying in the backseat, his mother is exhausted, and we can't find a single spot to stop and breathe before progressing. It has happened during the day on empty roads too. I've been in danger for acting like that while everyone else doesn't expect me to. Just like the safe-braking assistant in this situation.


> At least 2-3 seconds, minimum. Did you not take any driver's ed classes?

GP is right, though, it is highly context dependent. If you are following another car and paying attention, then two seconds is enough if they slam on the brakes. There is an implicit assumption that two cars have roughly equivalent braking ability.

But what if their stop is "assisted"? Bridge abutment. Head-on collision. Or what if there is a stopped car in front of the car in front of you, and the car in front of you swerves out of the way, leaving you bearing down on an unexpectedly stopped car in the lane? 3 seconds won't be enough.

I still remember vividly many years ago coming around a high speed sweeping corner behind another car, at a perfectly reasonable stopping distance, and then they jerked the wheel into the oncoming lane and suddenly I was facing the back of a stopped car. That was exciting. I did not hit the car, but it was close, very close. The only way to avoid that is a stopping distance much greater than 2-3 seconds.


> "Safe stopping distance" is not a fixed number. And I doubt any car can properly calculate that because there's just too much context to be accounted for.

If people truly cannot calculate this, we shouldn't have cars.

Personally, I know my car quite well and its stopping distance in most conditions. I always keep more than the stopping distance. If I couldn't, I wouldn't drive. It's not worth the risk for myself or anyone else.


Well, we should not have knives nor anything that poses any contextual danger in any situation either. Oh, nor should we use fire.


These things are significantly less likely to cause the death of a stranger, especially as they're things we use in our own home.

However if you don't know the basics of knife safety (you shouldn't throw them, they're sharp, etc), maybe you shouldn't operate one.


Unfortunately, this is a statistic without context. If a random Toyota fails on the highway and causes a fatal collision, it's unlikely to yield an article of this ilk.

So, how frequently has this type of failure occurred? How often does that type of failure result in a fatality?

The analog here is that polls suggest parents believe the probability of a child abduction is higher now than 50 years ago, though the data show the opposite. The difference? The discussion on cable news when the event occurs. Availability bias is powerful. The car industry is likely to incur the same phenomenon.


I think if a Toyota’s e-brake triggered on a highway and caused a fatality, we’d hear about it in the national news.

But it doesn’t happen, so we don’t.

I could be wrong, but I think the interest of this article is that this type of accident never occurs with other cars, thus the newsworthiness.


To my understanding this is an issue with pretty much all auto-braking cars

https://arstechnica.com/cars/2022/02/nhtsa-to-investigate-ho...

The Hacker News audience has an extreme hatred of Tesla. I think it's mostly political, personally. Musk disagrees with a couple of the left's untouchable rich elites, and all of a sudden you get a million submissions every time a Tesla is involved in an accident.


Thanks for the research and posting the article. So it does exist in other cars, but much less frequently. This article has 1.7 million affected cars and “278 complaints, the NHTSA says that only six of them alleged collisions with minor injuries, and no severe incidents have been reported.”

Compared to Tesla where there’s like half a million cars and at least one fatality.

Of course, this could be user error (e.g., fall asleep, wake up, slam on the brakes), so I don’t think blame can be assigned, but I don’t think Tesla gets an inappropriate amount of news coverage here.



>But it doesn’t happen, so we don’t.

Seriously? https://www.motorbiscuit.com/thousands-honda-cr-v-models-str...

It's all about clickbaiting Tesla in headlines for eyeballs. And HN loves it.


No one died there. And Honda doesn’t advertise their automatic braking as a beta of full self-driving. I don’t think it’s a close comparison.


You're confusing Autopilot with FSD.

Also, we don't know yet if the Tesla crash is due to the auto braking, do we?


From the article:

"Last month, the National Highway Traffic Safety Administration launched a preliminary investigation into Tesla after hundreds of owners reported that their vehicles would sometimes unexpectedly slam on the brakes for no apparent reason."


> If a random, Toyota fails on the highway and causes a fatal collision, it's unlikely to yield an article of this ilk.

That's an interesting way to frame it. Surely the cars travelling behind the car in question are the ones who caused it, by travelling without leaving a safe stopping distance?


Phantom braking in the Tesla Model 3 is very real and can be scary; I've experienced it several times. I've noticed that it seems to occur if it sees shadows when going under bridges on the highway.

If you're alert, you can override it pretty quickly by hitting the accelerator. The problem is that when using driving assists you get a false sense of security: you put too much trust in the system, relax, pay less attention. Unfortunately, this can be fatal.


Do you mean after it sees the shadow up ahead but not under it yet, or once it enters the shadow?

If the latter... Then Tesla is in a tricky situation with camera-only Autopilot.

With the low dynamic range of the cameras, if you make the software ignore big white (overexposed) rectangles, the car drives straight ahead into white trucks. If you make it recognize big white rectangles normally as obstacles, then the car sees overexposed bridge/tunnel exits as obstacles and gets rear-ended.


Agreed wrt overexposures and thresholds. Just to speculate aloud, thinking of the HDR mode on smartphones: is it possible to rapidly switch between a 1/1000 sec exposure and a 1/60 sec exposure, interwoven, such that you receive 2 video streams? And it could still be adaptive, like the duty cycle of a PWM signal. You could just use whatever frame has the highest entropy; you don't necessarily have to fuse them (maybe impossible under realtime constraints, I'm no CV expert).
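
A crude sketch of the "pick the highest-entropy frame" idea, assuming interleaved 8-bit grayscale frames (numpy only; a real ISP pipeline does far more than this):

    import numpy as np

    # Shannon entropy of an 8-bit frame's histogram: crushed-black or
    # blown-out frames score low, well-exposed frames score high.
    def entropy(frame):
        hist, _ = np.histogram(frame, bins=256, range=(0, 255))
        p = hist / hist.sum()
        p = p[p > 0]
        return float(-(p * np.log2(p)).sum())

    # Given one short and one long exposure of the same scene,
    # keep whichever frame carries more information.
    def pick_frame(short_exp, long_exp):
        return short_exp if entropy(short_exp) >= entropy(long_exp) else long_exp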


Seems like an issue of only using cameras as sensors.

It's probably triggering Automatic Emergency Braking.

That is a useful feature in cities. I personally heard of one instance of a car stopping automatically for a kid that just jumped on the road...


I'd say "False sense of security"


Driving is a pretty dangerous activity. The forces involved are enormous. In the distant past, car accidents killed an enormous number of people. Modern cars are safer but driving is still not as safe as other activities, such as sleeping in an underground bunker. Surprisingly, people also die while sleeping in underground bunkers, so I guess safety is relative.

Cars stop on highways for a variety of reasons. Cars fail, things obstruct the road and cars need to stop, drivers fall asleep or suffer medical incidents. Ideally stopping in the middle of the road shouldn't cause you to be killed by someone else who is either tailgating you or not paying attention to a car stopped in the middle of the road up ahead.

A pedestrian was struck and killed by a car on the street 1/3rd of a mile from my home; I have to cross this street if I want to walk to the train station. This happens on this corner roughly every 18 months.


I'm not sure what is going on over at Tesla, but the regular, ordinary software doesn't really seem to be as much of a priority as before. We had the crappy UI refresh. Phantom AP and emergency braking are still a big problem. But we still get Elon happily chirping about v-whatever of FSD coming out, which the vast majority of customers don't actually use or have.

I don't know if it's a case of nobody wanting to work on the boring things because there's no incentive (or an active disincentive) to. But I'm sick of basic features not working well.


My Model 3 has this habit of randomly braking on the motorway and it's quite dangerous. There is a particular spot I pass almost daily where I already know the car is going to suddenly brake. I turn off all automation, drive for a km, and then go back to autopilot. Super dangerous...


Why don't you just keep it turned off then if it's so dangerous?


So you know it will consistently fail in some cases and yet still use it?


My Tesla slams on the brakes every time it sees a car at an intersection waiting for me to pass; it assumes that the car won't stop or has not stopped. It does the same with stationary cars (parked on the road) that I am steering around while under cruise control.


How about highway/turnpike driving? Good there?


I've not experienced this on a motorway as of yet.


It's funny: when Tesla came out I was so excited to have an electric dumb car that looked nice and performed like a top-end sports car. I think all this self-driving business could've waited. Why they had to start producing smart cars with built-in self-driving AI assistants is the part I don't understand; at the least, it seems premature. There was already enough appeal to Tesla; stories like this just ruin it.

Recently I drove a friend's new Volkswagen with similar driver-assist / self-driving features to the Tesla. It was unnerving. The car wanted to drive itself, until the lane markers suddenly disappeared on a remote icy road at night. After about 5 seconds I noticed the car had silently (except for a small icon on the dash) decided it couldn't stay in the non-existent lane anymore. I was effectively in a car going 60mph with no one paying attention behind the wheel. It freaked me out.


Everyone, and the news piece, seems to be jumping to conclusions.

"the Tesla he was driving stopped" - assumes all Teslas are self-driving all the time. There is nothing yet pointing to a software issue instead of human error, or a medical emergency since the driver was 75 years old.


Also in the article: "Last month, the National Highway Traffic Safety Administration launched a preliminary investigation into Tesla after hundreds of owners reported that their vehicles would sometimes unexpectedly slam on the brakes for no apparent reason."


Technically, from an auto liability perspective, this is still the fault of the vehicle that did the rear-ending. Traffic comes to sudden stops all the time; there's no excuse for not being able to stop for a car stopped in the roadway, assuming it had its lights on.


Cars break down all the time, new or old; until we know more details, this is not a unique situation. Did it break down and come to a rolling stop? Did it brake itself? Was Autopilot involved? Did the driver stop the car himself? Was there an opportunity to get onto the hard shoulder?

Nearly 40,000 people die in traffic every year in the US alone [0]. They're all tragedies, but I don't see why it gets so much more media coverage if it's a Tesla.

[0] https://www.iihs.org/topics/fatality-statistics/detail/state...


Within the article it claims:

"Last month, the National Highway Traffic Safety Administration launched a preliminary investigation into Tesla after hundreds of owners reported that their vehicles would sometimes unexpectedly slam on the brakes for no apparent reason."


Phantom braking happened to me 3 times when I test drove a Model Y on CO-470 in Colorado near the Park Meadows mall last year. I was keenly aware of my surroundings, as this was my first time driving a Tesla; the roads were clear, with no obstructions that I could connect the dots to. I'm very thankful I wasn't rear-ended any of those times. I was pretty disappointed, as I had always been a fan of the concept of self-driving cars and Tesla in general. After I brought up the behavior with the sales associate, he stated it was a known issue on CO-470 and they were working on a fix.


I've seen videos of Teslas on 2-lane roads slamming on the brakes for cars coming in the opposite direction. Absolutely terrifying.


I'm terrified of this happening to me in my model 3. It's happened a few times when no one was behind me.


Holy shit!

A similar thing happened to my Model 3 early on (late 2019), and I had a furious but futile conversation with the service centre about how dangerous my situation was. They stonewalled me with “kernel panic, don't know why”.

The car had its own “blue screen of death” in the first lane, after a turn, on Central Expressway right next to their Santa Clara service center (https://goo.gl/maps/sPVNDNJZaR9zQ66c9).

Luckily there is a signal right there, so people were slowing down naturally, but man, the car turning into a brick with not even the emergency blinkers (parking lights) working was a shitty experience!

I’m fuming that someone paid a price for Tesla’s similar failure.

Edit: I was on “auto pilot”


Whilst the software failure sucks, the agreement by most of society (as reflected in most law) is that the fault lies with the driver of the car behind for not leaving a safe distance.

Cars stop for all sorts of reasons, hence the laws about leaving these distances.


This happened in one of America’s great judicial hellholes (i.e., plaintiff-friendly law, judges, and juries), so it may turn out to be quite interesting. There are multiple attorneys with multiple 9-figure verdicts within a stone’s throw of this incident.


This kind of thing is why I think we are much farther away from true L5 self-driving than many others think. To handle current road conditions you need true AGI with the ability not just to see as well as humans can but to interpret what you see, as well as theory of mind and contextual awareness of the drivers around you.

What I think will ultimately happen instead is we’ll adapt the roads to allow self-driving by dumber computers. For example, a self-driving lane on highways where there are sensors on the road to help the cars, and where all the cars in the lane must be self-driving and in communication with each other to do things like maintain stopping distance.


Please don't draw conclusions from this article; it is missing a lot of info. Even if it was FSD... if you are in a car that starts stopping, you take over, right? This is just a very weird situation that needs more context.


Please read the article:

"Last month, the National Highway Traffic Safety Administration launched a preliminary investigation into Tesla after hundreds of owners reported that their vehicles would sometimes unexpectedly slam on the brakes for no apparent reason. According to the agency, these incidents, known as "phantom braking," sometimes happened at highway speeds."


I almost died in a Tesla when it did something similar to this. The Model S just abruptly and heavily braked itself on the interstate. My family was in the car, and a semi going about 80mph almost hit us from behind.


I wonder which politician will have the cojones to stand up first and declare these autopilot features illegal, as the safety threats can't currently be mitigated.


What’s Tesla’s exposure here - do they disclaim liability for a failure like this?

Edit: assuming this was a software driver-assist issue


I don't think they can do that for assist features like this triggering falsely, assuming that's what happened here too. For it not reacting to something, there's always the line of "it's an assistance feature; marketing aside, we tell you clearly it's an assistance feature and you need to pay attention and react yourself if needed", but that doesn't work for emergency braking triggering randomly, since there's nothing the driver could have done "correctly" to prevent it.


We literally have zero idea if it's even Tesla's software at fault.


Agreed - edited to include my assumption


I believe all car manufacturers have a clause for liability in case of e.g. mechanical breakdowns (if that is the case here); if not, the car companies would have been sued out of existence a long time ago. Cars break down regularly, new or old, and sometimes in a dangerous situation.


But I would think that only relates to mechanical breakdowns due to normal wear and tear or lack of proper maintenance? Wouldn’t a manufacturer have responsibility if the brakes failed at high speed the day I drove my new car off the lot?


This isn't even autopilot. It's basic cruise control that they can't even get right.


Okay, screw the AI. What else can Tesla do? I bet a programmable car can be programmed in more deterministic ways than just "with probability so-and-so this is another car and the machine must stop". Wth.

Also I wonder how many of these incidents are buried under NDAs.


One of the things a human does when emergency braking is consider what's behind them. Perhaps Tesla's software doesn't do that?


Private transportation and American society’s total reliance on it is a disaster


Why is this post buried under 500 other posts that aren't nearly as popular?


This was confirmed to be an electrical failure, not Autopilot or FSD.


I don't understand what the fuss is all about. The car received a security OTA update and had to restart to apply it. Everybody does it. You wouldn't use an insecure car, would you?


“Move fast and break things”, TSLA edition.



