
It’s pretty accurate even if you’re a fan of the company. The more accurate description is that they shipped a feature which breaks traffic laws, which is a serious error in a company asking us to trust their judgement in a safety-critical system. Given how many other problems they’ve had and the consistent overselling of their capabilities and safety, that’s an important conversation to have.



> they shipped a feature which breaks traffic laws, which is a serious error in a company asking us to trust their judgement in a safety-critical system

You're not wrong per se, but there is significantly more nuance with self-driving technologies than you're suggesting.

A more famous example in the self-driving car world is the Pittsburgh left [0]: in Pittsburgh, a driver turning left is customarily given precedence over oncoming traffic at the start of a green light, despite there being no protected left-turn signal. The move is technically illegal, but when the self-driving cars refused to do it, they regularly brought traffic to a halt and held up every intersection where they were turning left. Eventually the behavior had to be added to the software.

Examples like these are why self-driving technologies are so hard to get 100% right: driving is a mix of intuition and rules. A significant amount of driving is doing what other drivers expect you to do. If the cars don't behave the way human drivers expect, it often causes more problems than doing what the car is "supposed" to do.

That said, as a pedestrian who has frequently almost been hit by people rolling stop signs, I'm with the NHTSA here...

[0]: https://en.wikipedia.org/wiki/Pittsburgh_left
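
To make that concrete, here is a purely hypothetical sketch of how a local custom like the Pittsburgh left might be layered on top of the normal yield rule. Every name and threshold is invented for illustration; none of it comes from any real self-driving stack.

    # Hypothetical illustration only: encoding a local driving convention
    # (the "Pittsburgh left") as an exception to the default yield rule.
    from dataclasses import dataclass

    @dataclass
    class LeftTurnContext:
        green_just_started: bool     # first moments of the green phase?
        oncoming_gap_seconds: float  # time gap to the nearest oncoming vehicle
        oncoming_is_moving: bool     # has opposing traffic started to accelerate?

    def take_unprotected_left(ctx: LeftTurnContext, convention: str = "strict") -> bool:
        # Baseline legal behavior: turn only when there is a comfortably safe gap.
        if ctx.oncoming_gap_seconds >= 6.0:
            return True
        if convention == "pittsburgh_left":
            # Local custom: the first left-turner goes the instant the light
            # turns green, before oncoming traffic gets rolling. Technically
            # illegal, but it is what surrounding human drivers expect.
            return ctx.green_just_started and not ctx.oncoming_is_moving
        return False

    # take_unprotected_left(LeftTurnContext(True, 2.5, False), "pittsburgh_left") -> True
    # take_unprotected_left(LeftTurnContext(True, 2.5, False), "strict")          -> False

The point is not the specific thresholds but that the "illegal but expected" behavior ends up as an explicit branch somewhere, which is exactly the kind of judgement call regulators are now scrutinizing.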


> driving is a mix of intuition and rules

Intuition and rules and rule breaking.

I remember back in Driver's Ed I was alarmed to discover that through sloppy definitions Colorado legislators had managed to make it illegal to take a right turn within 150ft of a stop sign (or similar, I forget the details). Of course, in reality nobody follows the sloppily defined portion of the rule, nobody enforces the sloppily defined rule, and it isn't a problem -- but the gap between rules and realities is substantial, self-driving cars are going to tease apart this gap, and spiders will come crawling out.


> Examples like these are why self-driving technologies are so hard to get 100% right, modern driving is a mix of intuition and rules.

Indeed! The hardest part of switching to driverless cars overall is that driverless vehicles have to exist on the road for some number of years surrounded by ones driven by primates, with primate reflexes and a variety of ad-hoc behaviors.

If every car on the road were to go driverless at midnight tonight, would the number of accidents and death and disability plummet compared to yesterday's stats?


There are better solutions to this problem than coding cars to behave like bad drivers. Roundabouts. Banning left turns during peak hours. Adding a dedicated left-turn signal at the beginning of the cycle. The Pittsburgh left is a terrible convention... It puts pedestrians at risk (assuming the crosswalk signal follows the traffic signal). And nobody from outside Pittsburgh knows it's a thing - if I were visiting Pittsburgh, I'd run into the car turning in front of me (well, hopefully not, but it's a risk).


Rolling stops, while all too common, are not at all like a Pittsburgh left. Police everywhere will pull you over for rolling through a stop sign. Police in Pittsburgh probably won’t, unless they have something against you personally.


It’s not serious at all imho. People roll stop signs all the time, it’s part of driving culture. If you feel obligated to obey the letter of every driving law you could have turned it off in settings.

Tesla also has an option to override the speed limit, you want to remove that too?


> Tesla also has an option to override the speed limit, you want to remove that too?

Yes. The alternative is to encode “break the law” in software, and that makes for a very bad option. Stick to the law, computer.


Unfortunately, self driving cars drive in the real world. There are a great many roadways in the US where driving the speed limit is legitimately dangerous in one direction or another.

I live in New York, and there is one such roadway I drive often: the Palisades Interstate Parkway. The speed limit is 55 and no trucks are allowed, but if you are driving 55 on this road you are in danger. You will get run off the road by everyone else traveling at a minimum of 65-70, with many of them going 80+.

There may come a day where humans are completely out of the equation, but until that time I believe self driving cars are safer if they drive more like human drivers. That means keeping up with traffic and other human quirks.


Agreed, and these roadways are all over the country. It's a pretty widely known fact that the safest speed is the natural flow of traffic, but municipalities all over the country are much more inclined to listen to the vocal minority over traffic engineers. Not a lot of "trust the science" going on there.

https://qz.com/969885/almost-every-speed-limit-is-too-low/


The solution to that is not that everybody gets to ignore speed limits and point to a QZ article as justification. The solution is not that Tesla unilaterally decides that 10% more is fine, everywhere, all of the time. The solution is what the advocate in your article proposes - make a conscious decision to raise the speed limit on certain roads, backed with suitable data.


That’s a lot of snark for such an obtuse comment. Obviously we should fix our laws, but in the meantime we shouldn’t imperil ourselves and everyone around us just to pacify the people who lobby their municipal governments for unsafe speed limits.


Next week’s argument: “all first gen self driving cars are going 70 where 55 is allowed, we need to keep this setting.”


Wait, so you're saying that people would crash into you if you drove the speed limit?

I'm a very calm driver and regularly drive at or sometimes below the speed limit if visibility or other factors don't allow higher speeds. People do slow down and I never felt in danger - granted, this is usually at around 40 km/h instead of the limit of 50 km/h, but I can't imagine people are so careless they'd "run you off the road" if you weren't speeding.


>Wait, so you're saying that people would crash into you if you drove the speed limit?

Maybe not literally crash into you, but that is certainly possible. People will swerve around you, ride your bumper, flash their high beams at you, honk, pull in front of you and hit the brakes, and other dangerous road rage type behavior. It is absolutely unsafe to drive the speed limit.

In general the safest thing to do is just to keep up with traffic.


> People will swerve around you, ride your bumper, flash their high beams at you, honk, pull in front of you and hit the brakes, and other dangerous road rage type behavior.

I don't think the one being unsafe in this scenario is the one driving the speed limit!


Does it matter if the result of the behavior is that you are unsafe?


That’s exactly what the driver in front of you and behind you use as justification for being above the limit: I had to go with the flow of traffic. A self-perpetuating force that forces everyone to be too fast.


Disagree. Traffic engineers study roadways and recommend speed limits based on safety and human behavior. And then cities and states ignore them and use speed limits to generate revenue.

>A self-perpetuating force that forces everyone to be too fast.

If it were actually too fast, most people wouldn't travel that speed. Have you ever noticed that traffic speed ebbs and flows with the road conditions or time of day? People will drive the speed they feel safe driving, and for most drivers it's (typically) much faster than the posted speed limit. And if the conditions are poor, it's much slower.


> Traffic engineers study roadways and recommend speed limits based on safety and human behavior.

This isn't really true: what usually happens is that they either go with a default or they do a study and set the limits at the 85th percentile. For a separated highway, that works fairly well but it often has bad results for mixed spaces: the people commuting through a neighborhood, for example, are trying to go as fast as possible but the people who live there are more concerned about safety, the impacts of those decisions on how they use their space (I grew up hearing that “nobody walks in California” which really meant “nobody wants to walk 3 miles further to use the few signaled crosswalks”), etc. A big problem here is the outliers: most of the risk comes from the top of the speed distribution — even if half of the drivers scrupulously follow the speed limit, the speeders are the ones who will influence people's safety perception of the road.


> And then cities and states ignore them and use speed limits to generate revenue.

I think for highways and other limited-access roads they're often more concerned with minimizing complaints, much like the commentary we're seeing up and down this thread, so they slap a small number on the sign knowing full well that traffic will ignore it, and call it a job well done.


When you're driving significantly below the speed limit (or more accurately, below the average speed of the other drivers), it becomes impossible to maintain a proper distance behind you. Add to that the fact that people randomly drive in the left/middle lanes regardless of whether they're actually passing anyone on their right, and you're forcing people to brake, make dangerous merges, etc.

I know nothing about your driving, but I find (anecdotally) that drivers who drive significantly slower than traffic are also ones who never look in their rear view mirrors.


> Wait, so you're saying that people would crash into you if you drove the speed limit?

Yes. If you're driving significantly above or below the speed of traffic, you're likely to cause an accident. https://qz.com/969885/almost-every-speed-limit-is-too-low/


The human driver is the one that told the car to disobey the stop sign[0]. Should every cruise control on every car with speed limit detection forbid the cruise/adaptive cruise to go above the speed limit?

0: the chill setting doesn't roll stop signs https://twitter.com/cooperlund/status/1488549356873695232?s=...


> Should every cruise control on every car with speed limit detection forbid the cruise/adaptive cruise to go above the speed limit?

Certainly. Why should cruise control be set above the speed limit? Adaptive cruise control should reduce speed to remain within legal boundaries.


I have a car with adaptive cruise control and speed limit detection, and this would absolutely drive me nuts. Not for any philosophical reasons, but because speed limit detection is fallible. On the expressway my apartment is a block away from, which has a speed limit of 50 mph, my car fairly frequently tells me the speed limit is actually 30. It's fairly common for it to read speed limit signs that have conditions on them -- only in effect certain hours, or during school hours, or when light is flashing, or if you're driving a truck -- and incorrectly assume that's the speed limit.

Maybe you'd be perfectly happy with "if you don't want the cruise control to make mistakes, just never use it, because gosh darn it, that's better than allowing people to set the adaptive cruise control five miles an hour over the speed limit like they've been able to do with non-adaptive cruise control since it was a thing." I would not, and I would argue I am not the one taking an unreasonable stance.
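
For what it's worth, the failure mode is easy to see even in a toy example. This is a hypothetical sketch, not any manufacturer's actual detection code: a reader that keys only on the number will happily treat a conditional sign as the blanket limit.

    # Hypothetical sketch of the failure mode: conditional speed limit signs
    # misread as unconditional ones. Not any real vendor's detection code.

    def naive_limit(sign_text: str) -> int | None:
        # Grab the first number on the sign and call it the limit.
        digits = [int(tok) for tok in sign_text.split() if tok.isdigit()]
        return digits[0] if digits else None

    def cautious_limit(sign_text: str, fallback_mph: int) -> int:
        # Ignore limits we can't verify apply right now (school hours,
        # flashing beacons, truck-only limits) and keep the map/default value.
        text = sign_text.upper()
        if any(kw in text for kw in ("WHEN FLASHING", "SCHOOL", "TRUCKS", "AM", "PM")):
            return fallback_mph
        limit = naive_limit(text)
        return limit if limit is not None else fallback_mph

    print(naive_limit("SPEED LIMIT 30 WHEN FLASHING"))        # 30 -- wrongly clamps a 50 mph road
    print(cautious_limit("SPEED LIMIT 30 WHEN FLASHING", 50)) # 50 -- keeps the default until verified

If cruise control were required to clamp to whatever the camera thinks it saw, every misread like this would become a sudden 20 mph slowdown in live traffic.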


Speed limits change, signs are obstructed, and some speed limits are only valid for certain times of day. There are plenty of legal reasons to override the posted speed limit.


Drive on manual then, if your computer is not clever enough.


Do you never drive above the speed limit on the freeway? I highly doubt this...


I’m not a computer, so that point is moot. (And yes, I don’t. Not deliberately, though it likely happens accidentally from time to time.)


"Whether you're willing to hold computer drivers to higher standards than human drivers" is moot? Or you just don't feel obligated to reconcile the inconsistency?


I am willing to hold a computer to a higher standard, that’s the promise that gets made left and right for self driving cars. That they achieve a higher standard.


What if it meets the higher standard of “being more safe” but not the higher standard of “obey current traffic laws that aren’t enforced against humans”?


In the end, it is my choice whether I stick to the law. I'm fine with instituted consequences. I'm not fine with having the option to break a law removed.

Some laws are stupid. Some are unjust. Some are racist.

Discretion is valuable, probably even essential, for society.


Do I really need to link Moxie Marlinspike's talk about why it is important to be able to break the law?


And if the traffic law gets you killed, what then?


It’s not the law that gets you killed. It’s the other drivers that disregard the law. And soon, the self-driving cars that disregard the law.


This axiomatic approach about the law has a clever solution: just make it legal for cars to disobey stop signs and then problem solved.


> People roll stop signs all the time

Those people are called bad drivers.


No, I don't think so. I think if you tried to strictly obey the law at all stop signs you would have a lot of trouble.


When does obeying stop signs cause a lot of trouble?


Spread out over millions of drivers, it can cause many lifetimes' worth of time lost from wasteful and inefficient practices, more damage to transmissions and driveshafts from stop-and-go, and increased environment-destroying pollution.


Just stop at the stop sign so you don't kill a random pedestrian you didn't see while California coasting.


People are killed or injured by drivers all the time, too. Remember when the selling point of AVs was that they’d reduce the ~40k / 300k Americans so impacted annually? Telling manufacturers that it’s okay to ignore laws if it gets you there faster will have the opposite effect.


Safety is measured in accidents per mile driven, lower being better. While there may be a loose correlation, strict obedience to the law is not in itself a measure of safety.


That metric is misleading since it’s skewed by highway driving where people cover lots of mileage with far fewer points of contention. A better metric would be something like operating time but even there you’d probably want more nuance since different environments have different safety characteristics (someone stuck in traffic on the freeway just doesn’t have as many things to get right as a driver on a city street even if they’re both averaging 5mph).

The law actually does have a fairly strong correlation with safety — this is the basis of the claims that something over 90% of collisions are due to driver error – but there is a growing recognition that it’s not enough, and design changes to streets and cars are needed to reduce the number of times where obedience to the law is the only thing protecting someone else.
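
To make the denominator point from the first paragraph concrete, here's a toy calculation. The numbers are invented purely for illustration, not real crash data:

    # Toy numbers, purely illustrative: the same crash count looks very
    # different depending on the exposure denominator you choose.

    highway = {"crashes": 2, "miles": 10_000, "hours": 160}  # ~60 mph average
    city    = {"crashes": 2, "miles": 1_600,  "hours": 160}  # ~10 mph average

    for name, d in (("highway", highway), ("city", city)):
        per_million_miles  = d["crashes"] / d["miles"] * 1_000_000
        per_thousand_hours = d["crashes"] / d["hours"] * 1_000
        print(f"{name}: {per_million_miles:.0f} per million miles, "
              f"{per_thousand_hours:.1f} per thousand hours")

    # highway: 200 per million miles, 12.5 per thousand hours
    # city:   1250 per million miles, 12.5 per thousand hours

Per mile, the highway driving looks roughly six times safer even though both drivers spend the same amount of time exposed to the same number of incidents, which is why a fleet that does most of its miles on highways can quote flattering per-mile numbers.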


You can use whatever denominator you want but blind obedience to the law isn’t ever going to be a measure of safety.


I have a tingling feeling that this general argument will become very popular, and decisive, in the future...


Yes, and not only that, Elon Musk should be in prison for vehicular homicide. If a human chooses to speed and kills someone as a result, they are going to jail.

Tesla has willfully chosen to violate traffic laws resulting in numerous deaths. Where is the accountability?


I'd normally disagree since most of these systems ask you to keep paying attention or keep your hands on the wheel, and they make it clear that this is assistive technology and not a replacement for an alert, licensed driver being ultimately in control. But Elon's statements, and calling it "fully self driving", are really asking for a lawsuit or legal trouble. It's incredibly irresponsible and reckless, and is also the one big negative on an otherwise pretty awesome car.

I really wish we could divorce all the self-driving/big brother electronic stuff from electric cars, and make available a dumb electric car.


The idea that a driver can anticipate when a self-driving car is going to fuck up and do anything to prevent it is bullshit and the charlatans pushing these cars know it.

These cars need to be taken off the roads until they can safely drive unattended.


> The idea that a driver can anticipate when a self-driving car is going to fuck up and do anything to prevent it is bullshit and the charlatans pushing these cars know it.

There is no factual basis for this claim. Cruise control and adaptive cruise control have been in use for a long time, require this type of attention, and are safe to use.


> It’s not serious at all imho. People roll stop signs all the time, it’s part of driving culture

This is a terrible line of reasoning.

If Tesla ever releases their vaporware humanoid bot, would you expect them to program it to rape women in cultures where raping women is part of the culture?


I’m gonna go out on a limb and say that rolling stop signs is not a slippery slope to robot rapists.


I think you're right.

Programming a several-ton machine to disregard laws and potentially kill many pedestrians without warning isn't really comparable to programming a machine to rape -- it's unfathomably worse.

I've worked in factories and the idea that a car company can release a machine onto the streets that wouldn't be allowed anywhere near a factory floor is flabbergasting.

If a company tried to release a product that had this safety profile in a factory setting the governments and unions would be all over them.

The fact that some jackass yokel or senile old lady rolls through stop signs daily doesn't justify Tesla releasing a product that does the same.


Equating rolling stops to rape has to be the worst argument against FSD I’ve ever heard. I’m honestly at a loss for words. Safely proceeding through an intersection without coming to a complete stop hurts nothing, except maybe the feelings of traffic law puritans.


> If a company tried to release a product that had this safety profile in a factory setting the governments and unions would be all over them.

I think the same is probably true of cars/trucks in general.


Sometimes it is more dangerous to follow traffic laws exactly than to drive like a normal human driver when you are surrounded by normal human drivers. It's not as simple as you are making it seem.


How many people are going to be hurt because someone stops at a stop sign?


Multiply the extra few seconds by however many millions of people end up using automated driving, times however many stop signs they end up at. The number of lifetimes lost to stopping at stop signs has to be somewhere in the hundreds. The real question is why anyone bothers stopping at all if all directions are clear.
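
Roughly checking that claim, with completely made-up inputs (the driver count, stops per day, and seconds saved are all assumptions, not data):

    # Back-of-the-envelope check of the "lifetimes lost to stop signs" claim.
    # Every input here is an assumption chosen only for illustration.

    seconds_per_stop = 3             # extra time for a full stop vs. a roll
    stops_per_day    = 4
    drivers          = 100_000_000   # assumed automated-driving users
    days_per_year    = 365

    seconds_per_year   = seconds_per_stop * stops_per_day * drivers * days_per_year
    person_years       = seconds_per_year / (60 * 60 * 24 * 365)
    lifetimes_per_year = person_years / 80   # assuming an 80-year lifetime

    print(f"{person_years:,.0f} person-years, ~{lifetimes_per_year:.0f} lifetimes per year")
    # -> 13,889 person-years, ~174 lifetimes per year under these assumptions

Whether that is a fair way to weigh time saved against pedestrian risk is a separate question.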


This is a fallacy popular with bad drivers since it lets them rationalize their decision to put other people at risk. It's designed to shift the focus away from agency — note how the people most at risk aren't given a decision? — and it relies on an assumption which becomes monstrous as soon as you think about it. Trying to phrase it as a math problem with the implication that time is fungible makes it sound like a minor optimization but you really want to consider just how uneven the impact is: you're saving an amount of time so small you'll barely even remember it but the person you hit may be losing the rest of their life or spending the remainder of it significantly degraded. Around here, a lot of the people getting hit are kids so it's an especially unfavorable cost in years of quality life.

The other unquestioned assumption is that this is even a serious time savings. In my experience, the average driver _massively_ overestimates how much time aggressive driving actually saves them. In most cases, all it means is that they're spending more time at the next backup — I've had people weaving around me for literally hours on the interstate without appearing to notice that they were doing a LOT more work failing to improve their relative position at all, and in the city I routinely see the same driver “passing” me on every block.

Then, of course, if we're talking about lost time, how many years of quality life do people lose because they're deterred from healthier transportation options? It would take rather an awful lot of stop signs to cancel out the savings from even 5% of people switching from driving to bicycling, walking, or transit.

> The real question is why anyone bothers stopping at all if all directions are clear.

Because the most common excuse for hitting someone is “I didn't see them” or “They jumped in front of me!” (which in the vast majority of cases really means that the driver's attention was directed somewhere else). It's not like traffic engineers don't know yield signs exist: they use them because they're intentionally trying to prevent drivers from staying at a high speed due to the risk to pedestrians and cross traffic. Most people who live in neighborhoods which are popular with commuters have to spend years begging to get a stop sign, often requiring a serious collision to force the local DOT to act.


It's a popular fallacy to consider the lives lost that you can see but not the lifetimes lost from wasteful practices. "Think of the children" is a nice trope to strut around to scared soccer moms during the elections, though.

You can't make the presupposition that a rolling 'stop' is always more dangerous than a full stop, either.


> You can't make the presupposition that a rolling 'stop' is always more dangerous than a full stop, either.

Actually, you can. Drivers who roll through a stop are far more likely not to notice pedestrians, bicyclists, or often even other drivers. They’re also unpredictable since everyone else has to figure out what they’re doing and react accordingly. I see the chaos on a daily basis where everyone else has to adjust their timing – and the selfish driver almost inevitably ends up no more than one car length ahead at the next back-up.

In contrast, a legal stop is easy to understand - it works just like you learned in driver’s Ed and since you can’t read another driver’s mind, you have to plan on stopping anyway.


Thankfully I design radar detectors for a living so that people like you who inevitably support insane and irrational laws, on the entirely convincing proof of just 'actually you can' and 'think of the children', have minimized impact on people like me. My whole life is making sure people such as yourself and the thieves from government cause me minimal legal hassle.

So enjoy your opinion, I'll keep working to make sure your opinion has the least influence possible.


Thank you for at least being honest about your motivations. I hope you never hit someone.


No prob. I haven't. Also, I make more money the more traffic laws, enforcement, and cameras come out (it drives consumers to seek ways to become educated about police and camera presence), so ironically it's really in my interest to support you.


I’ve known a few people with the same attitude. All but one of them was right up until the time they weren’t, and that changed a stranger’s life forever.


The 'attitude' I'm intending to express is that safety and law are not interchangeable. Sometimes they are in harmony, sometimes they are at odds, and sometimes they're simply completely separate concepts. This can vary from minute to minute, place to place, from one environmental condition to another and even from one driver to another. If how often someone gets in an accident is a guide as you're implying, well then I'm below the national average considering I'm halfway dead, I've been driving more years than not, and I'm definitely below the average person's accident rate of one per 18 years (sitting at zero).


Bikes and pedestrians? Although I understand a lot of cities don’t have those.


Anecdotally, as a bicyclist, I found it infinitely easier to predict rolling-stop drivers than those who stop. People who are stopped are rarely intelligent enough to follow the order of who's next, and would randomly choose when to start up (sometimes just in time to almost hit me). The one rolling through I already know is going, and I know the time they're going is now, so there's little risk of an accident. Biking isn't a game of who wants to kill you (all drivers) but rather _when_ they're going to attempt it.


Even this reply is framing it like it sounds ridiculous. I'm not sure this stop sign incident is the best example, but look into all the reports of the Waymo vehicle causing road rage due to its "safety-first" driving.


See also regular driving. The solution for road rage is taking away bad drivers’ licenses, not saying everyone should drive like them. If the police refuse to do their jobs, that problem isn’t one Waymo can solve.


Again, you are removing all nuance. It is not an issue of "bad drivers", it is an issue of local driving norms. If every single driver is doing something, it is not an issue of removing bad drivers.

If you are driving under 70 mph on the 101 at 6:00 am, you are endangering yourself and those around you.


This thread is about stop signs. That’s a common way for pedestrians and bicyclists to get hit when drivers ignore a safety mechanism because it’s mixing transportation modes, unlike someone going modestly over the speed limit on a controlled highway where everyone else is also in a protective steel cage with various safety mechanisms.


The article is about stop signs. My point, and this discussion, is that it's not as simple as "always follow traffic laws exactly, at all times."


Illegal actions and unsafe actions are not the same thing. Laws exist to make us safer, but they aren't perfect.

I get why the law is enforced here. But I think cars doing full stops are more dangerous than cars doing rolling stops.


Ahhhhh. I've struggled with this all my life, where the rules say one thing, but you're "just supposed to know" those cases where they aren't really serious about that.

Now the contradiction is so painful, they'll have to address it. Either a) update the rules to reflect actual practice, or b) admit to "yes, your self-driving car has to obey traffic laws we don't enforce on humans".



