My Tesla will allow me to engage Autopilot in a school zone, obeying the speed-limit offset I'd set, while using not the correct school-zone speed limit but the non-school-hours limit. It would allow me to go 30 mph over the school-zone speed.
How can Tesla claim self-driving if the car can't read a sign that says "speed limit 25 mph during school hours" and adjust properly? Humans just look around to determine whether school is likely in session by the number of cars in the parking lot during normal school hours, or they know the school calendar.
How does a self driving car make that determination? Query the school district website for the school, identifying their bell schedule and tacking on a buffer ahead and behind? Assume a school schedule that’s M-F? What if it’s a religious school that operates Sun-Thursday? Now the car has to determine which religious sects obey which calendar? Is it different in each country?
Just another example of a massive hurdle self-driving cars face…
Just a reminder of how awful Tesla's "self driving" cars are at actually stopping.
Please do NOT rely upon Autopilot, FSD, or any other "Tesla tech". They're incapable of seeing objects like small children in time.
This was a test done at CES on January 6th, 2022.
-------
In contrast, the people who set up the demo were showing a more advanced self-driving car that could actually stop when the child suddenly runs out onto the street.
This is great, but I think it's always good to be aware of the consequences of stopping quickly (being rear-ended, which might cause its own problems of the same magnitude).
That, and/or possibly killing several other people. I'm not advocating that people run over the kid at all, I just mean: always be aware of people behind you when braking quickly.
I don't know about other people but as a driver I try to be aware of what's behind me at all times. Where I live (Sydney Australia) it's extremely rare to be tailgated on a suburban street — but when I am, I'll drive to the conditions. That means in an area where road incursions are probable (e.g. where there are pedestrians or playgrounds) I will drop my speed below the limit. If someone is being persistent or aggressive, I'll pull over and let them pass.
The drivers behind you are responsible for maintaining a safe following distance so that you can slam on the brakes when a kid runs into the street. It’s not your job to worry about them. Eyes on the road in front of you.
You should worry about what's in front of your car first and what is behind it after. There's a reason we ~always assign blame to the car that does the rear ending.
Don't tailgate, people. You never know when the car in front of you is going to slam on the brakes because a ball, small child, or plastic bag jumps out in front of it.
I totally agree with you that this tech needs to get better, but I really want to see an apples-to-apples comparison. I would expect Tesla to also stop if a child was running across the movement path in broad daylight.
The night example looks to be specifically stacked against Autopilot. Tesla Vision is notoriously bad at detecting stationary objects, and it needs a lot of light to function well. Lidar and radar are significantly better than cameras at detecting straight-ahead obstacles in low-light conditions. I would really like to hear Tesla defend their decision not to use them.
In any case, this testing is great because it lets us know when the autopilot requires extra supervision.
> but I really want to see an apples-to-apples comparison.
EDIT: Luminar's car is in the other lane, and there's also a balloon child in Luminar's lane. You can see Luminar's car clearly stop in the head-to-head test.
There's also the "advanced" test here, where the kid moves out from behind an obstacle. Luminar's tech does well:
This "tech" can't even see a firetruck in broad daylight. Why do you think it can see a child?
This isn't a one-off freak accident either. "crashing into stopped emergency vehicles with flashing lights in broad daylight" is common enough that NHTSA has opened up an investigation into this rather specific effect: https://static.nhtsa.gov/odi/inv/2021/INOA-PE21020-1893.PDF
I'm in Sweden, and the sun shining directly into your eyes from barely above the horizon, while the road is wet or covered with snow and reflects that sun at you, is a regular occurrence during the winter months. I doubt Tesla's camera will be able to see anything.
This is the reason why a single camera alone is not capable of being the sole source of information for a self-driving system. The technology currently available for camera systems does not capture a high enough dynamic range to see details in darkness when the Sun is in frame. You could use multiple cameras, each with a different sensitivity to light, and combine them, but it's going to be very difficult.
I really don't see what's difficult. You don't even need multiple cameras, you can simply use very short exposures and combine short exposure shots into a longer exposure one when needed. Multiple cameras are useful to handle glare though.
Why would it be very difficult? You can split the same light beam after the lens and send it to two sensors with different apertures or sensitivities. You'd then synthesize a perfectly aligned HDR picture.
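For what it's worth, the short-exposure idea is simple to sketch. This is a toy illustration only (the function and the numbers are my own, not any actual automotive or camera pipeline): each frame approximates radiance times exposure time, so you divide by the exposure time and average the estimates, skipping clipped pixels.

```python
import numpy as np

def fuse_exposures(frames, exposure_times, saturation=0.95):
    """Merge several exposures (linear sensor values in [0, 1]) into one
    HDR radiance estimate. Each pixel value approximates
    radiance * exposure_time, so dividing by the exposure time recovers a
    radiance estimate; saturated pixels carry no information and are masked."""
    frames = np.asarray(frames, dtype=np.float64)
    times = np.asarray(exposure_times, dtype=np.float64).reshape(-1, 1, 1)
    valid = frames < saturation                 # mask out clipped pixels
    radiance = np.where(valid, frames / times, 0.0)
    count = valid.sum(axis=0)                   # how many usable samples per pixel
    return radiance.sum(axis=0) / np.maximum(count, 1)

# A bright pixel clips in the long exposure but not the short one,
# so the fused result still recovers its radiance.
true_radiance = np.array([[0.3, 5.0]])
short = np.clip(true_radiance * 0.1, 0, 1)   # 0.1 s exposure
long_ = np.clip(true_radiance * 1.0, 0, 1)   # 1.0 s exposure: bright pixel clips
hdr = fuse_exposures([short, long_], [0.1, 1.0])
print(hdr)   # → [[0.3 5. ]]
```

The clipped pixel in the long exposure is simply ignored, so the fused image keeps both the shadow detail and the highlight.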
It's because Tesla cars are regularly causing "phantom braking" events.
Tesla is trapped between a rock and a hard place. Their "phantom braking" events are causing a lot of dismay to their drivers (https://electrek.co/2021/11/15/tesla-serious-phantom-braking...). But if they reduce phantom-braking, they increase the chance of hitting that child on the road.
Elon claims that the radar was the primary source of phantom braking. He said that matching up a high fidelity sensor (the cameras) with a lower fidelity sensor (the radar) was proving near impossible. I also suspect the supply chain pains massively factored into his decision to remove the radar from all vehicles since roughly late January of last year.
Anyone in the car industry would know this is obviously false. Radar-based emergency braking is available and works really well in many cars from 5+ years ago.
Radar was removed in May 2021, which predates the article I quoted by multiple months.
I'm sure Elon was blaming Radar for phantom braking in the April / May time period. We can give a few months for the cars to update to the newest version as well.
But by November 2021, RADAR was no longer a relevant excuse. I think you may be mistaken about when Elon said what and when. You gotta keep the dates in mind.
Respectfully, you’re incorrect on the date of the Tesla Vision-only hardware release. My wife got a Model Y in early Feb 2021 and it was in the first batch of Tesla Vision vehicles that did not ship with a radar. It was manufactured in January, as that’s when we got the VIN. This is first-hand experience, not hearsay. Elon announced it after they’d been shipping those vehicles for a bit. I was both amused and surprised. She was pissed off that Autopilot was nerfed for max speed compared to my 2018 Model 3 while they were working out bugs in the Tesla Vision branch of the code.
I also never said a date about when Elon said those things in my comment, but now understand what you mean about post-vision. But the FSD Beta and Autopilot codebases are so different I am not sure I’d compare them for phantom braking (though recent FSD Beta appears to have way less of this occurrence).
But maybe I’m biased. We have two Teslas, one with, and one without a radar. We’ve seen much more phantom braking with my radar equipped model 3. Anecdotally, I find it happening less in the Y. Also, I didn’t click the article originally as Fred is a click diva and generally disliked by the Tesla community for his questionable reporting. Electrek is an EV fan blog, not much else.
WashPo reports a huge spike of federal complaints from Tesla owners starting in Oct 2021, well into the Tesla Vision-only era.
These are some pretty respectable sources. Federal complaints are public.
> “We primarily drove the car on two-lane highways, which is where the issues would show themselves consistently,” he said in an email. “Although my 2017 Model X has phantom braked before, it is very rare, the vision-based system released May 2021 is night and day. We were seeing this behavior every day.”
So we have Electrek, Washington Post, and the official NHTSA Federal registry in agreement over these phantom braking events spiking in October / November timeframe of 2021. I don't think this is an issue you can brush off with anecdotal evidence or anti-website kind of logic.
That’s totally fair. I’m not pretending it isn’t a problem. Phantom braking is scary as hell when you’re on the highway. I misread your comment on the date and think that’s the thing you really focused on, when I didn’t. You’re right. This is a serious problem.
Tesla partnered with Luminar by the way and even tested their LiDAR on a model 3 last year. I guess they weren't impressed though, since they seem to still be all-in on passive optical recognition.
> I guess they weren't impressed though, since they seem to still be all-in on passive optical recognition.
That's one take. The other take is that they have been selling cars claiming they are capable of full self-driving without lidar, and have been selling FSD as a $5k bolt-on, so swapping to lidar at this point would be a PR nightmare even if it were the better solution...
That's the cynical view, though... (Although I also wouldn't want to be the one to tell the people who spent lots of money on Autopilot that they bought total vaporware, or be the CFO who announces they are retrofitting lidar sensors.) Once you are all-in on 'lidar is shit', it makes it hard to reverse course, despite rapidly falling costs.
>Once you are all-in on 'lidar is shit' it makes it hard to reverse the trend
It can be done, if there's good cause. Just partner with your lidar oem of choice, get them to do a white paper about how the latest point increase version of hardware or firmware is "revolutionary!" and then claim that your earlier criticisms of lidar have been fully addressed by the groundbreaking new lidar tech.
I've actually been suspecting this will happen once solid-state LIDAR technology crosses a certain threshold.
Traditional old-school LIDAR units with spinning scan heads are why quite a few self-driving cars have odd bumps and protrusions on them. It's easy to imagine someone who wants to make a "cool car" looking at those protrusions, deciding "lidar is shit", and doing everything possible to avoid it. There are some good engineering reasons to avoid traditional lidar units. Meanwhile, solid-state LIDAR has only been on the market for a few years and is still quite expensive compared to traditional models, but it's definitely superior for a lot of places people want to use LIDAR, or where LIDAR would be an excellent competitor to technology currently in use such as 3D depth mapping and time-of-flight cameras. I briefly looked into some of this when considering work on an "art game" using VR and various 3D scanning technologies to construct a "fake" augmented-reality experience as part of the project's deliberate aesthetic choices.
Solid state LIDAR will definitely be pushed forward by market demand for wider fields of view, lower costs, and smaller module size. All of which will eventually lead to a situation where it will be stupid not to augment the self driving technology due to the massive benefits with zero downsides.
One way out of the LIDAR PR dead end would be for Tesla:
1.) When solid state LIDAR is ready, re-brand it something like SSL technology (Solid State LIDAR) and put it on new high end Teslas.
2.) Wait for all 'camera only' enabled Teslas with FSD beta to age out of service and upsell the owners on a heavily discounted FSD subscription for their brand new Teslas with SSL.
A third path would be to frame the addition of solid-state LiDAR purely as an enhancement to the existing cameras, presenting it as a camera upgrade rather than a new, separate sensor.
That's straight out of Apple's playbook. I recall how Tim Apple ridiculed the OLED displays, until it became impossible to ignore. So I guess it can be done.
> The accusations could be valid or totally baseless
Read the listed report. All 11 accidents were confirmed to involve:
1. Tesla vehicles,
2. on Autopilot / Full Self-Driving,
3. hitting a stopped emergency vehicle with flashing lights or road flares.
These facts are not in dispute. The accusations aren't "baseless"; the only question remaining is how widespread this phenomenon is.
These 11 accidents have resulted in one fatality and 11 injuries.
--------
We are _WAY_ past debating the "validity" of the claims. We're at "let's set up demos at CES to market ourselves using Tesla as a comparison point", because Tesla is provably that unreliable at stopping in these conditions.
I'm fine doing away with Uber's self-driving as well. Although I think Tesla's is the worst of the lot, I'm not confident in or thrilled by any self-driving tech on public roads in the next decade.
The exact situation in which "Uber self-driving" killed a pedestrian was: the driver was literally watching a movie on the job, while she was supposed to be driving the car and training a self-driving system.
Sure, but this was supposed to be fully autonomous. Nobody is arguing the human didn’t make a mistake. The autonomous system, however, definitely also did.
This may be technically true (I actually don't know what the driver's full intended purpose at the time was), but it doesn't negate some extremely sketchy software practices on a safety-critical system, like "action suppression" to avoid nuisance braking.
As in most accidents of this nature, there is a chain of mistakes. It's bad practice to ignore some mistakes simply because we can also point to other failures in the chain of events.
Volvo's emergency braking system, which detected it and would have braked in time, had been restricted by Uber to not be able to take any action.
Uber's system was set up so that "unidentified object in my way" didn't trigger an immediate slowdown, but instead a "sleep it off for a second and check again". It saw the impact coming, and instead of notifying the driver it waited a full second first, because it was programmed to do so. Any programmer can recognize this as the "turn it off and on again" idiom, which tells us their system was misidentifying lots of things.
What the driver did or did not do once notified doesn't change that. That car would have killed someone, somewhere, sometime, because it was programmed to not avoid it.
Wasn't this not a pedestrian, but a cyclist crossing in a completely inappropriate place? Granted, an SDC should still react to this, while many humans in the same situation would not.
Pedestrian slowly walking a bike, with basically no reflective clothing, on a dark night. This is exactly how humans kill pedestrians with cars all the time.
Sure, but that's also why that specific car was equipped with radar-based obstacle detection... which the company specifically disabled. There's a very good chance that system would have saved the person's life. Also, while yes, humans are bad at this, it's very rare that you'd plow into someone at full speed without even attempting to slow down or swerve, which is exactly what the car did.
Tesla didn't use LIDAR because it is more expensive [0]. Quoting Musk:
> Anyone relying on LIDAR is doomed. Doomed. Expensive sensors that are unnecessary. It’s like having a whole bunch of expensive appendices... you’ll see.
Cost is not the only point he was making. The problem you need to solve is not just “Is there something?”, but also “What is it? And where is it going to move?”. LIDAR cannot do that. Or at least if you get LIDAR to do that, then you would have also been able to get it done with a camera, in which case you wouldn’t have needed LIDAR in the first place.
LIDAR certainly is the low-hanging fruit when it comes to the former question, though (i.e. what is in my path right now).
One question I've always had about Tesla's sensor approach: why not use binocular forward facing vision? Seems like it would be a simple and cheap way to get reliable depth maps, which might help performance in the situations which currently challenge the ML. Detecting whether a stationary object (emergency vehicle or child or whatever) is part of the background would be a lot easier with an accurate depth map, or so it seems to me.
Plus using the same cameras would help prevent the issues with sensor fusion of the radar described by Tesla due to the low resolution of the radar.
I know the b-pillar cameras exist, but I don't think their FOV covers the entire forward view, and I don't think they have the same resolution as the main forward cameras (partly due to wide FOV).
Sure, but they're not getting that 3d map from binocular vision. The forward camera sensors are within a few mm of each other and different focal lengths.
And the tweet thread you linked confirms it's a ML depth map:
> Well, the cars actually have a depth perceiving net inside indeed.
My speculation was that a binocular system might be less prone to error than the current net.
Sure. You're suggesting that Tesla could get depth perception by placing two identical cameras several inches apart from each other, with an overlapping field of view.
I'm just wondering if using cameras that are close to each other, but use different focal lengths, doesn't give the same results.
It seems to me that this is how modern phones are doing background removal: The lenses are very close to each other, very unlike the human eye. But they have different focal lengths, so depth can be estimated based on the diff between the images caused by the different focal lengths.
Also, wouldn't turning a multitude of views into a 3D map require a neural net anyway?
Whether the images differ because of different focal lengths or because of different positions seems to be essentially the same training task. In both cases, the model needs to learn "This difference in those two images means this depth".
I think with the human eye, we do the same thing. That's why some optical illusions work that confuse your perception of which objects are in front and which are in the back.
And those illusions work even though humans actually have an advantage over cheap fixed-focus cameras, in that focusing the lens on the object itself gives an indication of the object's distance. Much like you could use a DSLR as a measuring device by focusing on the object and then checking the distance markers on the lens's focus ring. Tesla doesn't have that advantage. They have to compare two "flat" images.
> I'm just wondering if using cameras that are close to each other, but use different focal lengths, doesn't give the same results
I can see why it might seem that way intuitively, but different focal lengths won't give any additional information about depth, just the potential for more detail. If no other parameters change, an increase in focal length is effectively the same as just cropping in from a wider FOV. Other things like depth of field will only change if e.g. the distance between the subject and camera are changed as well.
The additional depth information provided by binocular vision comes from parallax [0].
> Also, wouldn't turning a multitude of views into a 3D map require a neural net anyway?
Not necessarily, you can just use geometry [1]. Stereo vision algorithms have been around since the 80s or earlier [2]. That said, machine learning also works and is probably much faster. Either way the results should in theory be superior to monocular depth perception through ML, since additional information is being provided.
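To make the geometry concrete, here's a minimal sketch of the classic pinhole-stereo relation. The focal length, baseline, and disparity numbers are illustrative assumptions, not any real camera's specs:

```python
def depth_from_disparity(disparity_px, focal_length_px, baseline_m):
    """Classic pinhole stereo: a point at depth Z projects into two
    horizontally offset cameras with a disparity d = f * B / Z pixels,
    so Z = f * B / d. Pure geometry, no learning required."""
    if disparity_px <= 0:
        return float("inf")   # zero disparity: point at infinity
    return focal_length_px * baseline_m / disparity_px

# Assumed numbers: 1400 px focal length, 12 cm baseline, 8 px disparity.
z = depth_from_disparity(disparity_px=8, focal_length_px=1400, baseline_m=0.12)
print(z)   # → 21.0 (metres)
```

The hard part in practice is the correspondence problem (finding which pixel in the left image matches which in the right), which is where the 80s-era block-matching algorithms and their modern learned successors come in.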
> It seems to me that this is how modern phones are doing background removal: The lenses are very close to each other, very unlike the human eye. But they have different focal lengths, so depth can be estimated based on the diff between the images caused by the different focal lengths.
Like I said, there isn't any difference when changing focal length other than 'zooming'. There's no further depth information to get, except for a tiny parallax difference I suppose.
Emulation of background blur can certainly be done with just one camera through ML, and I assume this is the standard way of doing things although implementations probably vary. Some phones also use time-of-flight sensors, and Google uses a specialised kind of AF photosite to assist their single sensor -- again, taking advantage of parallax [3]. Unfortunately I don't think the Tesla sensors have any such PDAF pixels.
This is also why portrait modes often get small things wrong, and don't blur certain objects (e.g. hair) properly. Obviously such mistakes are acceptable in a phone camera, less so in an autonomous car.
> And those illusions work even though humans actually have an advantage over cheap fixed-focus cameras, in that focusing the lens on the object itself gives an indication of the object's distance
If you're referring to differences in depth of field when comparing a near vs far focus plane, yeah that information certainly can be used to aid depth perception. Panasonic does this with their DFD (depth-from-defocus) system [4]. As you say though, not practical for Tesla cameras.
>different focal lengths won't give any additional information about depth, just the potential for more detail.
This is also why some people have each eye optimized for a different focal distance when getting laser eye surgery. When your lens is too stiff from age, it won't provide any additional depth perception, but it will give you more detail at different distances.
Wow. Ok. I did not know that. I thought that there is depth information embedded in the diff between the images taken at different focal lengths.
I'm still wondering. As a photographer, you learn that you always want to use a focal length of 50mm+ for portraits. Otherwise, the face will look distorted. And even a non-photographer can often intuitively tell a professional photo from an iPhone selfie. The wider angle of the iPhone selfie lens changes the geometry of the face. It is very subtle. But if you took both images and overlayed them, you see that there are differences.
But, of course, I'm overlooking something here. Because if you take the same portrait at 50mm and with, say, 20mm, it's not just the focal length of the camera that differs. What also differs is the position of each camera. The 50mm camera will be positioned further away from the subject, whereas the 20mm camera has to be positioned much closer to achieve the same "shot".
So while there are differences in the geometry of the picture, these are there not because of the difference in the lenses being used, but because of the difference in the camera-subject distance.
So now I'm wondering, too, why Tesla decided against stereo vision.
It does seem, though, that they are getting that depth information through other means:
Perhaps it helps that the vehicle moves? That is, after all, very close to having the same scene photographed by cameras positioned at different distances, except that Tesla uses the same camera and has it moving.
Also, among the front-facing cameras, the two outermost are at least a few centimeters apart. I haven't measured it, but it looks like a distance not unlike between a human's eyes [0]. Maybe that's already enough?
> But, of course, I'm overlooking something here. Because if you take the same portrait at 50mm and with, say, 20mm, it's not just the focal length of the camera that differs. What also differs is the position of each camera. The 50mm camera will be positioned further away from the subject, whereas the 20mm camera has to be positioned much closer to achieve the same "shot".
Yep, totally.
> Perhaps it helps that the vehicle moves? That is, after all, very close to having the same scene photographed by cameras positioned at different distances.
I think you're right, they must be taking advantage of this to get the kind of results they are getting. That point cloud footage is impressive, it's hard to imagine getting that kind of detail and accuracy just from individual 2d stills.
Maybe this also gives some insight into the situations where the system seems to struggle. When moving forward in a straight line, objects in the peripheral will shift noticeably in relative size, position and orientation within the frame, whereas objects directly in front will only change in size, not position or orientation. You can see this effect just by moving your head back and forth.
So it might be that the net has less information to go on when considering objects stationary directly in or slightly adjacent to the vehicles path -- which seems to be one of the scenarios where it makes mistakes in the real world, e.g. with stationary emergency vehicles. I'm just speculating here though.
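That speculation lines up with the geometry of forward motion. Under a simple pinhole model, a point imaged x pixels from the focus of expansion flows outward by roughly x * (v * dt) / Z pixels between frames, so depth falls out of the optical flow, except dead ahead, where the flow shrinks to zero. A toy sketch (all numbers are assumptions of mine, not real specs):

```python
def depth_from_forward_motion(x_px, shift_px, speed_mps, dt_s):
    """For a camera translating forward along its optical axis, a point
    imaged x_px from the focus of expansion flows outward by roughly
    shift_px ~= x_px * (v * dt) / Z between frames, giving
    Z ~= x_px * (v * dt) / shift_px. Points near the focus of expansion
    (dead ahead) barely move, so their depth is poorly constrained."""
    baseline_m = speed_mps * dt_s   # distance travelled between frames
    if shift_px <= 0:
        return float("inf")
    return x_px * baseline_m / shift_px

# Assumed: 30 m/s (~108 km/h), 36 fps, point 300 px off-centre, 10 px flow.
z = depth_from_forward_motion(x_px=300, shift_px=10, speed_mps=30.0, dt_s=1 / 36)
print(round(z, 1))   # → 25.0 (metres)

# The same object dead ahead (x_px near 0) produces almost no flow:
print(depth_from_forward_motion(x_px=0, shift_px=0, speed_mps=30.0, dt_s=1 / 36))
```

Which is consistent with the failure mode discussed here: a stationary obstacle directly in the lane is exactly where motion parallax gives the least signal.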
> Also, among the front-facing cameras, the two outermost are at least a few centimeters apart. I haven't measured it, but it looks like a distance not unlike between a human's eyes [0]. Maybe that's already enough?
Maybe. The distance between the cameras is pretty small from memory, less than in human eyes I would say. It would also only work over a smaller section of the forward view due to the difference in focal length between the cams. I can't help but think that if they really wanted to take advantage of binocular vision, they would have used more optimal hardware. So I guess that implies that the engineers are confident that what they have should be sufficient, one way or another.
Because Tesla have demonstrated that it's unnecessary. The depth information they are getting from the forward-facing camera is exceptional. Their vision stack now produces depth information that is dramatically superior to that from a forward-facing radar.
(It's also worth noting that depth information can be validated when the vehicle is in motion, because a camera in motion has the ability to see the scene from multiple angles, just like a binocular configuration. This is how Tesla trains the neural networks to determine depth from the camera data.)
It makes intuitive sense, since you can, say, play video games with one eye closed. Yes, you lose field of view. Yes, you lose some depth perception. But you don't need to touch your fingertips together, and all your ability to make predictive choices and scan for things in your one-eyed field of view remains intact.
In fact, we already have things with remote human pilots.
So increasing the field of view with a single camera should intuitively work as long as the brains of the operation was up to the task.
What I was talking about largely doesn’t apply to the legacy Autopilot stack currently deployed to most Tesla cars.
Personally I wish Tesla would spend a couple of months cleaning up their current beta stack and deploying it specifically for AEB. But I don’t know if that’s even feasible without affecting the legacy stack.
> Their vision stack now produces depth information that is dramatically superior to that from a forward-facing radar.
RADAR is lower fidelity, though: blocky, slow, and it doesn't handle changes in direction or dimension very well. RADAR isn't as good as humans at depth. The main benefit of RADAR is that it works well in weather, at night, and at near range. I assume the manholes and bridges that confuse RADAR are due to the low-fidelity, blocky feedback.
LiDAR is very high fidelity and probably more precise than camera pixels. LiDAR is better than humans at depth and at distance. LiDAR isn't as good in weather, but neither is computer vision. Great for 30m-200m: precise depth, dimension, direction, and size of an object, in motion or stationary.
See the image at the top of this page and overview on it. [1]
> High-end LiDAR sensors can identify the details of a few centimeters at more than 100 meters. For example, Waymo's LiDAR system not only detects pedestrians but it can also tell which direction they’re facing. Thus, the autonomous vehicle can accurately predict where the pedestrian will walk. The high-level of accuracy also allows it to see details such as a cyclist waving to let you pass, two football fields away while driving at full speed with incredible accuracy.
> Because Tesla have demonstrated that it's unnecessary. The depth information they are getting from the forward-facing camera is exceptional.
Sure! Here's a Tesla using its exceptional cameras to decide to drive into a couple of trucks. For some strange reason the wretched human at the wheel disagreed with the faultless Tesla:
That was an issue with the path planner, not depth perception, as demonstrated by the visualisation on screen. The challenge of path planning is underrated, and it's not a challenge that gets materially easier with the addition of LIDAR or HD maps. At best it allows you to replace one set of boneheaded errors with another set of boneheaded errors.
No! It was an issue with the trucks! They shouldn't have been in the way in the first place! Don't they know a Tesla is driving through? They mustn't have been able to see it since they lack exceptional cameras.
Because the software running in release mode is a much, much older legacy stack. (Do we know if the vehicle being tested was equipped with radar or vision only?)
But AI and ML aren't as good as a human brain, or maybe any brain. I imagine the gap has to be closed with better and multiple sensors, or with fundamental leaps in computing technology.
I've never understood their reasoning. It sounds like a Most Interesting Man in the World commercial: "I don't always tackle the hardest AI problems known to mankind, but when I do, I tie one hand behind my back by not fusing data from every possible sensor I can find in the DigiKey catalog."
IR lidar would be pretty useful in rain and fog, I'd think. But I'd rather have all three -- lidar, radar, and visual. Hell, throw in ultrasonic sonar too. That's what Kalman filters are for. Maybe then the system will notice that it's about to ram a fire truck.
The puzzle piece you are missing is that sensor fusion is not an easy problem either. The Tesla perspective is that adding N different sensors into the mix means you now have N*M problems instead of M.
I hope that's not their perspective, because that perspective would be wrong. There are entire subdisciplines of control theory devoted to sensor fusion, and it's not particularly new. Rule 1: More information is better. Rule 2: If the information is unreliable (and what information isn't?), see rule 1.
Some potential improvements are relatively trivial, even without getting into the hardcore linear algebra. If the camera doesn't see an obstacle but both radar and lidar do, that's an opportunity to fail relatively safely (potential false braking) rather than failing in a way that causes horrific crashes.
Bottom line: if you can't do sensor fusion, you literally have no business working on leading-edge AI/ML applications.
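The "more information is better" rule is easy to demonstrate with the static core of a Kalman update: inverse-variance weighting. A toy sketch with made-up sensor readings (the variances and distances are invented for illustration, not real sensor specs):

```python
def fuse(measurements):
    """Inverse-variance weighted fusion of independent range estimates.
    A noisy sensor is down-weighted, never discarded, and the fused
    variance is always smaller than any single sensor's variance."""
    num = sum(z / var for z, var in measurements)
    den = sum(1.0 / var for _, var in measurements)
    return num / den, 1.0 / den   # fused estimate, fused variance

# Hypothetical distance-to-obstacle readings: (metres, variance).
camera = (42.0, 4.0)   # e.g. fooled by glare, hence large variance
radar  = (30.0, 0.5)
lidar  = (30.4, 0.2)
est, var = fuse([camera, radar, lidar])
print(round(est, 2), round(var, 3))   # → 30.69 0.138
```

The fused estimate lands near the agreeing radar and lidar readings, the glare-blinded camera is politely outvoted, and the fused uncertainty is lower than any individual sensor's. A full Kalman filter just repeats this update over time with a motion model in between.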
Every Pro iPhone has one. So it already got pretty cheap by now. Looking at Mercedes' Level 3 Autopilot tech you can also see how well you can integrate the sensors into the front of a car.
At the time of comment, a LiDAR rig would cost around $10,000. A few years before that, they were more like $100,000. Presumably the cameras are much cheaper.
I would be willing to bet that production efficiencies will be found that will eventually drive that cost down significantly.
To be fair, it's not a computer performance benchmark being gamed here. If nighttime is problematic, Autopilot shouldn't be running at night, because if I'm a pedestrian, the odds are stacked against me in any physical encounter with a vehicle. Fairness in the scenario setup shouldn't really be part of the conversation unless the scenario goes beyond the claims of the manufacturer, i.e., if Tesla had said "Autopilot does not function in these conditions and should not be used at those times" and "nighttime" was one of the conditions listed. If Tesla hasn't said that a scenario is outside the scope of Autopilot, then the scenario is an appropriate test and comparison point.
> if Tesla had said "autopilot does not function in these conditions and should not be used at those times" and "nighttime" was one of the conditions listed. If Tesla hasn't said that a scenario is outside the scope of Autopilot, then the scenario is an appropriate test & comparison point.
I'd go further and say add "and set the software not to engage this feature at nighttime". Simple disclaimers are not enough when lives are at stake.
I'd never trust a "self-driving" car without lidar. It should be a requirement. There's tons of research on how easy it is to completely fool neural nets with images.
The night example looks to be specifically stacked against autopilot.
I don't think so. If the autopilot can't be used at night, I - who live in Norway - just can't use it during the winter as there isn't enough light. I don't even live above the arctic circle and am lucky enough to get 4-5 hours of (somewhat dimmed) daylight during the darkest times.
If it doesn't do night, it is simply a gimmick to lure you into paying for a promise of a car.
Reminds me of this incident that happened to me last December: I was driving my kid to school and I noticed some pedestrians on the sidewalk. The mom was walking and texting, and the little boy was dribbling a soccer ball as they walked to the school. Suddenly the soccer ball rolled onto the road and the kid dove after it, in the middle of the road, inches from my car. I am so grateful to whatever braking system my car had for stopping just in time. I honked, and the mom flipped me the bird and cussed me out in what I think was Russian.
Kids are stupid and unpredictable, and AI/ML can't work out all the insane ways kids can put themselves in harm's way. No autopilot or FSD can. People should not rely upon them.
I think the main point is to know the limitations of the technology and to deploy it appropriately. For instance, I don't rely on old-school cruise control to stop for small children, either, even though I engage it in school zones.
This isn't limited to "Tesla-tech". The same rules apply to ALL technology.
> Where does Tesla provide a list of such limitations for its customers?
One specific place is first sentence of the FSD Beta welcome email:
"Full Self-Driving is in limited early access Beta and must be used with additional caution. It may do the wrong thing at the worst time, so you must always keep your hands on the wheel and pay extra attention to the road. Do not become complacent."
That's been my experience with it. Right now, the beta doesn't reduce my workload, it increases it. When I want to "just drive", I turn the beta off.
That said, Tesla can and should do more. They need to better frame the capabilities of the system, starting with the silly marketing names.
So, basically, I need to somehow predict that FSD will do the wrong thing and react myself, _before_ the worst time, because the worst time is when it's already too late.
Or, in other words, whereas any other car manufacturer has fallbacks for when the driver is not doing what they're supposed to, Tesla treats the driver as the fallback instead. I just don't understand what is this magic that is supposed to allow the driver to predict incorrect AI behavior.
> So, basically, I need to somehow predict that FSD will do the wrong thing and react myself, _before_ the worst time, because the worst time is when it's already too late.
Don't confuse prediction and anticipation. Prediction requires that you know what's going to happen. Anticipation is getting ready for something that might happen. Anticipation is a normal part of defensive driving every day, not prediction.
Let's go back to defensive driving 101: defensive driving allows mistakes to be made. It allows bad things to occur and still lets you recover from them safely. Bad things happen because mistakes are made by humans in the car, by humans outside the car, and also by the computer in the car. The change here is that the computer is being given much more latitude to make mistakes. It does NOT grant the computer the ability to remove defensive margins from driving.
If you drive (regardless of FSD) with no defensive driving margins, you immediately enter "too late" territory whenever a mistake is made.
Definitely, and that's exactly why I claim this is not even close to FSD, and why I absolutely do not want this in my car.
If I have to be on the wheel and ready to react at any point in time, then I'd rather be the one driving and that's it.
I think regular "old" adaptive cruise control and lane assist are vastly superior to this. I am on the wheel and in charge for 99.9% of the time, as I should be regardless of FSD, and the technology saves me in the 0.1% when I am not.
FSD will never be FSD without a complete redesign of infrastructure, which will not happen in our lifetime.
What does FSD really give you, then? It doesn't reduce the mental toll of staying alert and anticipating the road. It's probably only safe to use on the highway. On the highway, it provides the same automated acceleration/braking you could get from radar-assisted cruise control you can find on any modern car. Like cruise control, it's probably a good idea not to rely on it to avoid large stationary obstacles like a turning freight truck. It does purport to control the steering wheel too, but you can't really trust it not to steer into highway medians either.
They should just rename it to something boring like "camera-assisted cruise control" and remove the beta label so everyone can use it.
It may work somewhat like airplane autopilot, but the environments are not comparable. A plane has nothing to hit but terrain which is easily identified and almost all other obstacles in the air are transmitting their position.
In addition, pilots are required to have thousands of hours of training for that specific model airplane. I'm sure the limitations of autopilot come up.
Meanwhile, in most US states, an adult can walk into a DMV, demonstrate the ability to turn on the vehicle and do a 3-point/k turn, and walk out with a license.
That's not that bad, Belgium used to have no driving licenses for normal cars (everyone could drive) and the accident figures were similar to neighbour countries.
A small misunderstanding - "legal reasons" are that it is the person in the driver's seat who is legally liable for damage done while the car was driving itself, not Tesla.
The important thing here is that for over half-a-decade, Tesla has been lying to its customers about its capabilities.
When in actuality, Tesla will reliably crash into pedestrians and stationary firetrucks. To the point where people at other companies are confident to make live-demos of this at electronic shows.
---------
Calling it "autopilot" or "fsd" isn't the problem. The problem is that Tesla actively lies about its capabilities to the public and its customers. It doesn't matter "how" they lie or exactly what weasel words they use. The issue is that they're liars.
We can tell them to change their name or change their marketing strategy. But as long as they're liars, they'll just find a new set of weasel words to continue their lies.
Does autopilot make sense? Aviation autopilot seems to be many orders of magnitude more reliable than Tesla's autopilot.
In fact, autopilot in aviation contexts is regularly used when human pilots are worse, such as landing at airports that regularly experience fog & low visibility conditions. As in, autopilot is the fallback for humans, not the other way around.
Surely autopilot is an easier problem to solve compared to self-driving cars? Air traffic is controlled, road traffic is chaotic. Aerial vehicles move through what's essentially empty space with pretty much no obstacles, cars must navigate imperfect always changing urban mazes full of people whose actions are unpredictable.
I’m not familiar with aviation and the only reason I’m aware that airplane autopilot is actually not a self-flying system is because of Tesla and their weasel excuses for their reckless marketing.
I am not seeing the relation between cruise control and crashes in bad weather?
If I bought something that says it can drive itself, then I expect I do not need to pay attention to the road because it can drive itself. Just like if my friend can drive themselves and I am a passenger, I can trust them to handle paying attention to the road.
To go out of your way and call something "full" self driving only indicates that I should have zero qualms about trusting that I do not need to pay attention to the road.
I'm guessing the 'bad weather' comment is referring to the common belief[1], possibly exaggerated[2], that cruise control can be dangerous and cause crashes when the road is slippery. Not sure what's changed with newer traction control systems. I'd have to believe this has gotten even less likely but I don't know; my cars are too old to even have ABS.
One of the anecdotes in the Jalopnik article mentions that the vehicle is a Town Car, which is significant because those are rear wheel drive and handle very differently from most cars on the road in slick conditions. I would certainly expect more issues with older RWD cars and trucks because they tend to fishtail and spin if the rear wheels are given power without traction.
I personally have a tendency to match the speed of the cars around me. IMNHO, most cars speed through school zones. I use cruise control as a tool to prevent me from accidentally matching the speed of the cars around me and breaking the school zone speed limit.
That's crazy, I've never seen anybody get a ticket for 1-2mph over the limit. Problems with that: cops would be wasting resources because 1-2mph over the limit isn't significantly dangerous. Also, the radar guns can't be easily calibrated to that level of accuracy.
If our local cops did this, I'd just make an online post about it so everybody knew the cops were doing it and then it would stop.
My experience (based on a few tickets and observing many cops) is that they don't really care unless you're about 10+ MPH over the limit and also doing unsafe things. That's not to say they don't snag people just driving 5MPH over the limit, but it's not a core activity unless the department is using tickets as a revenue source or trying to make some sort of weird point.
I was in a drivers education class (I’d rear ended someone, and was trying to keep points off my license), and we went around the room explaining what law we broke to wind up in the class. One attendee was there for 1 MPH over in a school zone. Was it probably racial profiling? Probably. But I now stick to exactly 0 MPH over in school zones, and have routinely seen police monitoring speed while dropping my kids off at school. There appears to be zero tolerance even for the most politically connected soccer mom.
Acknowledged. If I received such a ticket I would sign the form the officer gave me and then immediately protest (appeal) the ticket and go to court. In particular, giving a ticket for 1mph over the limit doesn't make sense because the marginal danger (i.e., how much more dangerous it is) of driving 1mph over the limit is tiny; it's 10-15mph over that is dangerous. The police actually have to make a case justifying the value of spending the time to stop you.
I just looked into the details. In my state, CA, 1-15mph over the limit is specially treated, with one point that eventually gets cleared.
I'm amused because (as I mentioned) I live in a school zone and I just drove home at about 5mph, because the streets were so crowded that anything faster would have been impossible. A cop could not have parked in any location near my house because every spot was taken, and all sightlines were blocked by SUVs or buses.
> it's not a core activity unless the department is using tickets as a revenue source or trying to make some sort of weird point.
Or the officer is racist. I know we’re veering into very off topic discussion here but your experience and resulting list does miss a key component for an experience often described as “driving while black”. 1mph over the speed limit would absolutely see you get pulled over.
That’s a situation where the cameras were wrong and a court would have likely forced a change.
The scenario under discussion is a case where the police are within their rights. A simple blog post would never force any change in most municipalities, much less immediately.
I forgot to mention that cars don't really have high accuracy speedometers; they could be 10% off which could easily cause a conscientious driver to speed. What's the point of giving somebody a ticket for driving 26mph when their speedometer says 24? That's, like, just silly.
I’m not disagreeing with the premise that the police are being overzealous. I just find it hard to imagine a world where someone complaining about it online is guaranteed to result in changes.
In some locations the speed enforcement is autonomous.
>My experience...
Will of course be much different from someone who lives in a different locale or is a different ethnicity and/or social class than you.
>If our local cops did this, I'd just make an online post about it so everybody knew the cops were doing it and then it would stop.
Yeah, ok.
I've met a lot of people with inflated egos but believing you can dictate local law enforcement policy with your internet posts is on a whole nother level.
Autonomous? You mean, like a system that takes photos and sends you a bill? Yes, most such things were removed in our town after it turned out they were set wrong (sending tickets to people who didn't break the law).
Our city manager reads Patch, Reddit, and other things for our town, and occasionally engages with the community around policy. This is absolutely something where, if you wrote a careful post on Reddit saying "Hey, are our cops doing the right thing stopping people going 1mph over near a school instead of stopping <whatever>?", there would be an argument, a few people would say 1mph over is 1mph over the law, but really, the outcome would be that the ticket appeal would be approved and the cops in my town would be told not to do that.
Pointing out to somebody who says "in my experience" that others would have a different experience is pointless. I know that. If cops are giving people tickets for ethnicity (or even deadheads driving through georgia, which used to happen) that's an entirely different problem from pointless enforcement.
That's not really valid in the age of Swype-style keyboards. It's less effort to do one swipe than to tap out individual letters for txt-style abbreviations.
Also, if you text and drive I sincerely hope you hit a tree or something else solid that doesn't hurt the road traffic and pedestrians around you. Alleged FSD or no, we don't have Level 5 fully autonomous cars yet, so not paying attention to the road is just as bad as drunk or drugged driving.
I agree with you. With a little bit of driving experience, you have a natural sense for what the safe speed is on a road, and that speed is almost always the speed limit, in my experience.
On a busy road with lots of pedestrians crossing, I naturally want to go much slower than I would on the same road if there were no other pedestrians or traffic. "School zones" just codify that into law - when you expect lots of kids to be crossing a road, the speed limit of the road should be lower.
The issue, for me at least, is the ambiguity. When is the school zone in effect? This creates a cognitive load. The road was clearly meant for 45 mph travel, because that is the normal speed limit. So if I let my "autopilot" brain take over, I will probably go over the 25 mph school zone limit.
It's a special case. So when I see a school zone, I unconditionally set the cruise control to the school zone speed limit. This frees my brain from any cognitive load about whether school is in session. It also guarantees that I am not influenced by the guy tailgating me.
The ability to set the speed of your car exactly, without monitoring, is really useful.
That's definitive proof that it still doesn't work reliably, not to mention the system confusing the moon with a traffic light. [0] It shows that it is even worse at night.
I have to say that FSD is so confused, you might as well call it 'Fools Self Driving' at this point.
While I 1000% agree that the current Tesla FSD beta is in serious need of work, comparing it to unreleased specialized hardware in trials set up by the makers of said specialized hardware is a little disingenuous.
But I'm not comparing it against the technology. I'm simply pointing out that Tesla __regularly__ crashes into balloon-children, to the point where a competitor literally used Tesla as a marketing-mule to show off how much progress they made.
--------------
This entire marketing scheme that Luminar did only works because they 100% trust that the Tesla will run into that balloon child on a regular basis (!!!!). This is literally a live demonstration in a major convention center full of a live audience.
I don't have any idea how seeded or unfair Luminar's test is in favor of Luminar's hardware. I don't even trust that Luminar's hardware works (I don't know if they're faking their tech either).
But what I do trust, is for that Tesla to plow into that balloon over-and-over again on a reliable basis.
That's how far away Tesla is from "full self driving", or whatever the heck they want to call it.
This is nonetheless comparing a product currently on sale with a development platform not currently on sale. Surely a fairer test would be to compare a Luminar tech development rig with a Tesla tech development rig.
Luminar doesn't appear to have access to Tesla's under-development technology beta which is, as YouTube videos of FSD beta clearly shows, markedly superior to the technology currently deployed in most Tesla cars. I can't say whether the FSD beta would reliably stop for balloon-children, but from the warts-and-all videos being posted on YouTube, it seems highly likely that it would.
The "test", as far as I'm concerned, is whether or not Tesla will kill children on our streets. And given this test, it proves that Tesla's emergency braking capabilities are awful.
I don't care about Luminar tech at all, frankly. If Luminar becomes available to buy one day, maybe I'll care then.
---------
At the _ROOT_ of this discussion is a poster who openly admits to using "Autopilot" in school zones. Given that Tesla will _NOT_ stop for balloon-children until it's way too late (the braking action was _AFTER_ the balloon-child was smacked), I have to immediately reprimand the original poster. This is a critical safety issue that people must understand: "Autopilot" and "FSD" are NOT ready to be used in school zones, or other locations where children might run out into the street randomly.
This has nothing to do with tech-companies swinging their tech around at a show. This has everything to do with stopping the next child from getting smacked by an ignorant driver.
By that logic, the fairer test is for an independent arbiter to compare Tesla's AEB with other cars on sale, modelling a variety of common collision scenarios. All that's been proven here is that a motivated competitor was able to identify one scenario where their development tech works and Tesla's does not.
Let me be clear: I'm not saying that Tesla's current release version of AEB is anywhere near good enough. Clearly there is substantial room for improvement. If I had any influence over Tesla's priorities, I would make them package up the subset of FSD Beta necessary to perform AEB and get that deployed to the entire fleet as a matter of urgency.
------
As for whether a driver chooses to use any L1 or L2 driver assistance features in an inappropriate context, that is always the responsibility of the licensed driver. Engaging these systems is always a manual task performed by the driver. They are not on by default.
If there is a legitimate concern here, perhaps the correct response is to institute a blanket ban on the use of any assistance technologies (e.g. adaptive cruise, lane centring) while in school zones.
This is such a slimy fake test it makes me side with Elon. The cardboard cutout doesn't behave anything like a child. It behaves like a bit of cloth rubbish which I would hope the car would run straight through.
By a school that "cloth rubbish" could be accompanied by child chasing it (see comment near this one), so one would hope it would at least slow down...
This isn't an honest test. Think through the reality and then mimic that - but the reality isn't a child standing still in the middle of the road in the middle of the night.
Also, Tesla still requires that you pay attention. You can rely on it, but they tell you NOT to rely on it 100%, so in this demo the driver is at fault for not watching ahead of them and braking. So your claim that they're awful at stopping is pretty disconnected.
Even if you are paying attention, wouldn’t there be an inherent delay between passively paying attention and taking over a self-driving car? I don’t have any evidence but it seems inevitable. Furthermore, by design, even attentive, well-intentioned drivers are more likely to be distracted or to underestimate risks if the car is doing the driving 99% of the time. The ‘well, technically it wouldn’t have murdered that clearly visible child if you were paying attention’ defence is both specious and obviates a lot of the value of Tesla's autonomous driving tech.
I won’t argue there aren’t some inherent benefits to even basic driver aids, but if you have to pay 100% attention 100% of the time, the tech loses its lustre. This is a problem largely of Tesla’s own making, IMO: their marketing and Elon’s public statements paint a much, much rosier picture and people buy into it.
You can carefully construct a case where it wouldn’t work, obviously. Even humans have blind spots and are prone to visual illusions. Stop spreading FUD. I have AP and I know it works reasonably well: as long as I have my eyes on the road and hands on the steering wheel, it is safe, while reducing a great deal of stress.
Absolutely. My 2021 Audi has traffic sign recognition, and recognizes school zone signs, flags them as such on the console and heads up display. It also recognizes the flashing light indicating that the school zone is "active".
But yet, Tesla, the "not-dinosaur", screws this up completely?
Oof.
AND, if you have adaptive cruise, it will absolutely recognize the discrepancy between your speed and school zones, and will decelerate the car to that speed.
Frankly, as a human I also find "during restricted hours" signs frustrating. How do I know which hours those are?
As a human being, you weigh the risks and make a choice. Which has the worse outcome — getting to your destination 18 seconds later, or killing a child?
Then just respect those limitations at all times. The school zone will last a whole 200m, you can live with driving 20 in there all the time instead of wondering if you can go full speed in a school zone.
A zone near me has two different sign indicator systems. They appear to go off at different intervals. In both cases I don't see humans around. I would not be surprised if the time the crossing guards and children were present was a completely different, third time.
Did you try to google for it? Did you look up the actual laws where you are?
It isn't hard to find the answer. If you're in California, then Vehicle Code 22352 defines it. The same will be true for every other place in the world.
That's a fair point. I grew up in Washington, where I believe it was not defined at all. It was a bit of a hobby of mine to look up vehicular RCWs when I began driving, and the Internet was available.
But it's also possible it's something that puzzled me pre-Internet, and I never bothered to look up post-Internet. There are plenty of those as well.
I asked a CHP officer once and they said "In or near the crosswalks, including sidewalks near crosswalks." Note that police officers are not necessarily up on all of the intricacies of the law though.
It is frustrating that it isn't clearly delineated, but you should be fine basing your guess on some reasonable assumptions: schools are busiest during drop-off and pickup, with kids out and about near the road, so there's a very good chance those times are included in the "restricted hours", possibly the hours in between as well. Public and secular schools are typically not open on weekends, so any weekend hour is very likely not restricted. Schools are usually closed during the summer months and for a time during winter break. This can be tricky because schools can still have very different schedules, and you may not know them if you don't have kids in school, but again, context can help: is the parking lot completely empty? No one around? School's probably not in session, and the restricted hours probably don't apply. If you were ticketed for speeding in a school zone during morning drop-off, I think you'd have a hard time arguing you didn't know it was a restricted time. Maybe during lunch you could make a case; I guess it would depend on the judge you got.
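Those rules of thumb boil down to a small decision procedure. A toy sketch, where every window and month is an illustrative assumption rather than any real jurisdiction's rule:

```python
from datetime import datetime

def likely_restricted(now: datetime,
                      drop_off=(7, 9), pick_up=(14, 16),
                      summer_months=(6, 7, 8)) -> bool:
    """Toy heuristic for whether a 'during restricted hours' school-zone
    limit probably applies. The drop-off/pick-up windows and summer
    months are made-up defaults, not actual school-district data."""
    if now.weekday() >= 5:            # weekend: schools usually closed
        return False
    if now.month in summer_months:    # summer break
        return False
    in_drop_off = drop_off[0] <= now.hour < drop_off[1]
    in_pick_up = pick_up[0] <= now.hour < pick_up[1]
    return in_drop_off or in_pick_up

# A Tuesday morning in October at 8am: probably restricted.
print(likely_restricted(datetime(2022, 10, 4, 8, 0)))  # True
```

Of course, this is exactly where the thread's religious-school and holiday-calendar objections bite: every branch of this function encodes an assumption that fails somewhere, which is the point being made about self-driving cars.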
In the US at least you can typically contest a ticket. If you feel it was unjust or unfair you can make your case before a judge, doesn’t mean you get it overturned.
If the sign didn’t state hours and it wasn’t clear it was a restricted time, maybe you could make a case.
I'm sure I saw cars with an option to read speed limit signs about 10 years back. Really boggles the mind that Tesla have gotten away with calling their cars "Full Self Driving".
How does the adaptive cruise control know whether or not school is in session and children are on their way to/from school? Or does it slow you down even if it's Christmas morning? If your car forces you to go 20mph below the speed limit that seems like a major safety issue
An easy solution is to always slow down in a school zone.
If you want/need to drive faster in the school zone, you simply step on the accelerator and deactivate self-driving. You can turn it on again when through the school zone.
I think this is very much car-dependent, and probably even user-configurable.
For example, the cruise control on my 2021 Mazda 3 doesn't enforce school zone speed limits (or any speed limits), although it recognises them and will flash on the HUD if you're exceeding the limit. Since it's not enforced, I can just ignore it if it has misidentified a speed sign.
Although, school zone signs around my area actually include a bit of genius design: they are hinged in the middle [1], so during school holidays, the signs are covered up by closing the hinge.
Finally, at least in suburban areas in Australia, a school zone covers a couple hundred metres at most. So even if you or your car gets it wrong, driving 20km/h under the speed limit is a very temporary inconvenience for drivers behind you (unlike, for example, driving 20km/h under the speed limit for 50 kilometres on a single-lane highway).
Interestingly, Tesla’s earliest Autopilot software (made by Mobileye) could read and respond to speed limit signs, but Mobileye patented that ability, so when Tesla switched to their in-house software, they lost it.
Seems insane to me that you can patent reading a speed limit sign, since reading signs is what signs are for and is necessary to obey the law, but there we go… “with a computer” seems sufficient grounds to make something patentable.
I was wondering what happened. I knew they used to be able to actually read the signs, but now it's all a database that can be quite wrong. I think the DB is nice to have, since signs can be few and far between, but I would really like to see it go back to reading signs.
I don't understand how that could possibly stand up as a patent. It shouldn't pass the obviousness test. Reading a sign is a super obvious thing to do. But you would still have to spend millions fighting it in court. Which is insane.
What model do you have? The 3 definitely doesn't read speed limit signs, or at least doesn't use the information if it does read them. It will drive right past a sign that says 25MPH and still say the limit is 35MPH, or drive past a 45MPH sign and say it's 35MPH. Things like that. And there is no way to even report it being wrong, that I have been able to find.
Ha, subscription fee to pay for the license to the patent troll.
Then people will install modded bionic-eye firmware downloaded from Ukrainian websites[1]. The future will be like Cyberpunk 2077, except the hacks are just to bypass paywall DRM.
There are no truly self driving cars, and probably won't be. When a traffic cop or school teacher holds up a hand to stop cars or say "turn left NOW", what training data exists?
There should be a much stronger involvement from cities / states legislating a ban on any kind of self-driving in these areas. (self-braking -- sure!) It will have to wait for a child to die, unfortunately.
I'd like to see a self-lane-keeping lane on interstates with a 120 mph speed limit and concrete barriers. If we can have "Zero emissions" cars incentivized with access to HOV lanes, why not cars that can do a good job at lane-keeping and merge-scheduling in areas where there are exactly zero other distractions?
This reminds me of when I was driving in rural Texas a few months back and came across an apparently very recent accident involving a jack-knifed truck. The firefighters had just arrived on scene. While I was slowly and carefully driving around the accident, a firefighter got out and seemed quite upset with me, yelling something at me as loudly as he could, but I couldn't make out a word. I still have no idea if I did something wrong. Driving is 99.9% routine and boring, and that 0.1% is ambiguous and quite potentially life threatening. I share the skepticism of self-driving cars.
I had someone go in front of me while circling his flashlight as if to proceed. When I did, he casually walked in front of me. He wasn't actually paying attention and was just flinging his light for directing traffic without thinking about it.
Yeah, the .1% can be really ambiguous and dangerous.
Cars are made to be completely quiet inside, which means there are limited ways to communicate with those inside, and also that horns are obnoxiously loud.
I agree with you. Real self driving is impossible, IMO. Current "AI" tech will never get us there. And even if we cracked real AGI, I don't see a reason to expect the computer intelligence to be a better driver than a human. AGI does not mean the absence of emotions, distractions, or miscalculations.
We as a society should be realistic about the advantages and limitations of self driving technology. On a highway with well marked lanes and no construction, pedestrians, etc, self driving is awesome. That is the use case that should be optimized and encouraged by states. Everything else should realistically be banned.
You're assuming an AGI would have all the characteristics of a machine algorithm and enough intelligence to do exactly what a human would/should (or better) in all driving situations.
And yet there is no evidence that self-driving systems are any safer than human drivers, or that they'll ever even be as safe as human drivers, let alone safer than them.
What do you mean "no evidence"? The companies testing self-driving cars now have hundreds of millions of miles driven and I haven't seen any reliable report showing they have higher kill rates than human drivers.
It would seem entirely possible that large amounts of highway driving can be handled by self-driving cars with slightly lower rates of accidents (tired and drunk people kill a lot of people!)
Well we haven't invented AGI yet, and so we don't even know if it will be possible to control them with an objective function. So your opinion is entirely speculative and not based on any science.
It's still vaporware for now, but next year GM is supposedly going to start selling a system that will allow true hands-off self driving on highways. It will be interesting to see if they really deliver. The claims seem to be well beyond what Tesla currently sells.
> General Motors is adding another tier to its hands-free driving technology with its new Ultra Cruise system that it claims will work in 95 percent of driving situations.
This seems to be exactly what I mean. They admit that their system can't handle the 5% of edge cases, and market it appropriately. This is not a full self driving car.
Unless the death is being exploited in order to remove constitutionally protected liberties, the US has pretty close to zero tolerance for the death of minors.
> When a traffic cop or school teacher holds up a hand to stop cars or say "turn left NOW", what training data exists?
This seems like one of the scenarios (multiple, but limited in number) explicitly listed in traffic laws which might be scarce in randomly sampled driving data, but which is trivial to manufacture artificially in as large quantities as you need. Go to a local movie producer for their supply of uniform costumes and other varied clothing, and the actors/extras to play the role of a traffic cop, and you can get quite a lot of training data in a single week on a budget that's trivial compared to most other self-driving car experiments.
If a cop is directing traffic, you should obey them. But what if they’re next to a construction worker, a cowboy, an Indian chief, a biker, and a sailor?
You're basically asking the question of what is the minimum sufficient level of general intelligence required to allow self-driving cars to go forward.
I instead have a different criterion. If somebody could show that their self-driving car can drive 250 million miles without killing a person (pedestrian, driver, or passenger) across a fleet and range of conditions, then that's good enough for me (currently, people drive 100 million miles before somebody gets killed).
I figure 2.5X more safe than the average human would lead to an enormous savings in lives and one might even make the argument, at that point, that there was a moral imperative to disallow people driving!
(BTW I live next to an elementary school and people drive past exceeding the posted limit all the time. I struggle to move my car safely through all the distractions. The one thing that helps the most is the radar which hits the brakes if it thinks I'm gonna back into a person or car.)
The roads self-driving cars drive on at this point are heavily self-selected as well. I want to see a few thousand self-driving cars interact on the streets of Delhi, or see one go through a snowy mountain pass with barely any signs.
When people make these safety stat comparisons nowadays, they often seem to ignore how much more ambiguous the conditions are under which humans still drive safely.
I do not expect FSD to handle that. Nor would I consider it an explicit design goal.
However, I was just driving in my neighborhood in northern CA and saw conditions very much like that (I live next to a school and I was driving right when school let out).
snowy mountain pass maybe not but your average traffic in the world is closer to the cities of India, China, Indonesia or Brazil than a grid system in the suburbs of Phoenix Arizona.
Keep in mind the US and Europe account for 10% of the world's population, and I wouldn't even have a lot of confidence in self-driving in your average traffic situation in Rome
Tbh I'm not even sure if self-driving as it is covers 65% of the American population. Has anyone ever tried to use a self-driving car on a rural American road?
The counterpoint is that places and culture can change. I’m sure there was a huge kerfuffle when cars started sharing roads with horse-drawn carriages in London, and I’m sure some people thought cars would never have a place, but here we are. If self-driving cars hit the mainstream in North America and Europe, I guarantee seemingly unfit cities will adjust because truly autonomous vehicles have sooooo many benefits.
yes, people ride teslas in some sort of automated mode on rural roads.
I see your point about the larger driving population, but self driving will roll out in wealthy countries with new car models first, and I think solving the problem there will predate a large solution, for many different reasons.
All my comments about self driving are basically limited to urban and semi-urban wealthy countries since those are the groups subsidizing the research now.
I'm not a fan of Tesla personally but it is worth mentioning that "autopilot" and "self driving" are not the same. Autopilot is, and has always been, cruise control on steroids. Full self driving hasn't reached the consumer market. To expect your Tesla to be that is lying to yourself.
Tesla attempts to bury the lead by saying drivers shouldn't use these features without being "fully attentive" but uses names like "Full Self-Driving" all over their marketing material.
Five years ago Tesla claimed that all cars being produced had the "hardware needed for full self-driving capability at a safety level substantially greater than that of a human driver".
Doesn't Tesla have beta software the driver can turn on with the name "Full Self Driving"? And isn't it intended to be used, "beta tested", on public roads?
It's also worth pointing out that the feature set you get with "Autopilot", and even how well those features operate, varies a lot depending on what country you are in. In the USA, it's gone quite a bit beyond just cruise control on steroids, and will now follow _any_ road no matter how curvy so long as there is a painted center line. In other words, you can turn it on outside schools, in complex urban neighborhoods etc and it will just keep following the road until you disable it or the painted line stops.
In many European countries, the system is restricted to a much more traditional lane assist and does not follow tight bends like this. I have a Model 3 with the standard fit "Autopilot" (not FSD) in US and have tried same model in the UK.
If you follow Bjorn Nyland on YouTube (popular electric car YouTuber), he also recently discovered how much more advanced "Autopilot" is on the American cars when he drove a US import in Thailand and compared to his previous Norwegian Tesla, despite feature being labelled the same.
This all makes having any kind of reasonable discussion on the internet about it really, really hard.
To complicate matters further, conventional Traffic Aware Cruise is technically the third driver assist mode Tesla offer in their cars in addition to "AP"/"FSD", this of course also works without the painted line.
Are the signs not delimited by day of week, time and month where you are? E.g.
7:30-18:30
Mon - Fri
Sep - Jun
Where I live, police can and will ticket you for speeding during those times. Regardless of if there's students around or if school is even in session or not.
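A schedule like the one on the sign quoted above ("7:30-18:30, Mon-Fri, Sep-Jun") is mechanical enough to check in code. A minimal sketch, with the sign's values hardcoded from this one example; real signs vary by locality and would need per-sign data:

```python
from datetime import datetime, time

def school_zone_active(now: datetime) -> bool:
    """Check a '7:30-18:30, Mon-Fri, Sep-Jun' style sign restriction.

    Hypothetical example based on the sign quoted above.
    """
    in_months = now.month >= 9 or now.month <= 6   # Sep-Jun wraps the year
    in_days = now.weekday() <= 4                   # Mon(0) through Fri(4)
    in_hours = time(7, 30) <= now.time() <= time(18, 30)
    return in_months and in_days and in_hours

# A Wednesday morning in October falls inside the restriction
print(school_zone_active(datetime(2022, 10, 5, 8, 0)))  # True
# A July morning does not (school's out Jul-Aug)
print(school_zone_active(datetime(2022, 7, 5, 8, 0)))   # False
```

Of course, this only works for signs that actually post a schedule; as other comments note, "when children are present" signs carry no machine-checkable schedule at all.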
In my state the school zone speed limit is only in effect if it's a school day and there are children around. If all the kids are all inside it's the normal speed limit. I guess Tesla could just use the school zone limit all the time during school times, but people would be annoyed.
At least in Georgia, there's at least one yellow light with a speed limit sign attached. During active school hours, the yellow light will flash on and off.
On my daily commute, there are two overhead flashing yellow lights. Previously, my Model Y would begin to brake and then speed back up, thinking it was a normal traffic light (prior to FSD). With FSD, it's at least smart enough to know not to brake, but it certainly doesn't read the speed limit from the sign, as it normally would.
Can be times of day, "when flashing", or "when children are present". Time of day isn't great because of irregular after school activities, events, delayed starts because of snow, etc.
The idea of present might be a difficult one. I'm not up to speed on any court cases that have determined what present means, but it's likely more nuanced than a simple "can Tesla identify a child in sight". A human would be more likely to play it safe and just obey regardless.
> On a school day when school children are present and so close thereto that a potential hazard exists because of the close proximity of the motorized traffic
This doesn't have to be state law, it can be municipal law. All it takes is city council putting up a sign that says "15 MPH when children are present".
>Are the signs not delimited by day of week, time and month where you are?
Depends on the locale. A lot of places just say "Speed Limit is X When Children Present", or "Speed Limit is Y when Lights [on the sign] are Flashing".
Yeah I've seen many such signs and it seems like the clear best approach. People should drive carefully around crowds of children and normally otherwise, regardless of some schedule posted on a given school's website.
Where I live, the school zone speed limits are only in effect when school is actually in session, and the yellow lights are flashing to let you know that it is (but we are also talking about the difference between 15MPH and 25MPH). Sounds like a plum income stream for your city's police department.
Where I live, the signs will state either a time of enforcement of school zones or "When Children Present". Generally, elementary schools will have school zones active during all hours that kids are on site. Middle Schools and high schools tend to be "When Children Present".
I believe signaling has a lot to do with the law in my state; since I'm living in a big city, the signaling is always present. And like I said before, the normal speed limit is 25MPH, so it's not like normal traffic is too fast; it just goes down to 15MPH during school-in and school-out times (there aren't kids hanging around outside the schools otherwise, it is an urban environment).
> Are the signs not delimited by day of week, time and month where you are? E.g. 7:30-18:30 Mon - Fri Sep - Jun
In my local area the signs say either "during school hours", "while children are present" or they say nothing at all and you just have to know what hours and days the lower speed limit is enforced.
In Australia, it's not just little side roads that run by school entrances that have this rule; the school zone thing applies even on fairly major free-flowing roads. Two examples I can think of are
As the usual speed limit is quite a bit higher at 70km/h, driving at 40km/h (25mph) outside these times would make you, at best, a rather annoying obstacle to surrounding traffic.
It's generally a bad idea to fit safety regulations around the safety limitations of the item they're regulating. We set speeds, sometimes based on time of day or presence of children. Humans handle this just fine.
If your car can't, then the car needs to be fixed or it's "self-driving" functionality entirely disabled. Changing speed limit laws to compensate for these limitations is entirely the wrong direction.
My point is that FSD needs to be at least as capable as a human to follow speed limit signs.
The problem generalises: it's also unacceptable for FSD not to keep up with traffic on a freeway, or to randomly slam on the brakes to avoid spurious hazards, for the same reasons.
I remember the story on here a while ago about a self-driving car that got rear-ended because it stopped in the merging lane of an empty highway rather than accelerating like any human would...
Even worse, there are school signs that say 15mph when children are present. Kids could be behind cars, bushes, people, etc. That’s a very hard task to deal with.
I mean if the kids are in the bushes I'm not sure a human would be able to figure that out either. It's been said before: self driving cars don't have to be perfect, just better than humans. And humans are super flawed.
Human drivers "can" be way safer; they aren't always. There's likely some balance where overall, self driving is statistically safer than some group of suboptimal drivers. It remains to be seen but there's always hope.
Unless the unsafe parts of self driving only apply to previously unsafe drivers it will still struggle to take off.
Not every human driver has the same risk, but every self driving car will. (Or it will be based on which car you are in rather than how safe you are.) In other words, relatively safe human drivers could actually see their risk levels go up in a self driving car, even it if it statistically safer than all human drivers.
Could you clarify what interactions that figure includes? I.e. is it fatalities for people inside motorized vehicles, or does it include something like a car-bicycle accident?
So many ifs. Look, even once you're past the school zone, kids play around. In rural areas, this is very common. Yeah, you could go at the speed limit, but people are just careful or make a judgment call.
Tesla doesn't claim that the car is responsible for setting the correct speed limit -- they actually claim the opposite. From the Owner's Manual:
> Warning
> Do not rely on Traffic-Aware Cruise Control or Speed Assist to determine an accurate or appropriate cruising speed. It is the driver's responsibility to cruise at a safe speed based on road conditions and applicable speed limits.
A problem with that is the car sometimes decides to change the set speed based on the signs, and it's not always obvious when you're in traffic until the car decides to take off. Alternatively it will slam the brakes on if it misreads a sign and you have your speed set high. They need an option for a dumber cruise control that ignores speed limits.
That's in the manual; the fan marketing essentially says it reads speed limit signs so you don't miss any of the action while your passenger plays Cuphead (I think they eventually removed the ability for the passenger to play while you drive).
We should mail a copy of the owners manual to Elon Musk then because just a few days ago he, for about the fourth time, announced that he would be shocked if "Tesla does not achieve Full Self-Driving that is safer than human drivers this year (and 5 years ahead of everyone)".
Maybe we have different definitions of FSD and being safer than a human but to me that includes obeying the speed limits and stop signs
I think what we'll end up seeing are the developments of sensors and protocols that are invisible to people, but can be sensed by cars nearby. Traffic lights and road signs will have sensors that emit electronic messages identifying what they are and what rules they are communicating that self-driving vehicles should obey. Other cars will start to have them as well. We may even see small barriers constructed between roads and sidewalks that exist only to discourage pedestrians from entering the road to make automated driving even safer. Your vehicle will be rented from a third party. The owner will legally have to have it registered with the city so that the interior cameras can conduct facial recognition to identify the drivers and passengers. Law enforcement can remotely disable a car and could also send signals to get all other cars to pull over to ease the arrests of carjackers and other criminals. Possibly coming to a small town near you.
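The "signs broadcast their rules" idea above could be sketched roughly like this. Everything here is invented for illustration; a real deployment would presumably use a standardized V2X message set (something like SAE J2735) rather than ad-hoc JSON, and these field names are not from any real protocol:

```python
import json

# Hypothetical payload a school-zone sign might broadcast to nearby
# vehicles. All field names are made up for illustration.
beacon = {
    "type": "speed_limit",
    "limit_mph": 25,
    "condition": "school_hours",
    "active": True,           # the sign flips this when the zone is live
    "zone_id": "school-042",  # made-up identifier
}

def limit_for(msg: dict, default_mph: int) -> int:
    """Use the broadcast limit only while the sign reports it as active."""
    if msg.get("type") == "speed_limit" and msg.get("active"):
        return msg["limit_mph"]
    return default_mph

# Round-trip through JSON to mimic an over-the-air message
received = json.loads(json.dumps(beacon))
print(limit_for(received, 45))  # 25 while the zone is active
```

The appeal is that the car no longer has to infer "are children present" from pixels; the sign's controller, which already knows the schedule, just says so.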
Humans have general intelligence and are pretty bad drivers. Why would a computer intelligence be any better? If anything, a computer intelligence would be more likely to get bored or distracted if their "brain" is running faster than ours.
I agree that it's a miracle (or perhaps, a testament to modern engineering) that fatalities are so low. But I don't think fatalities is a great measurement.
I've been rear ended twice, and I've seen 4 rear endings happen. Nobody was hurt in any of these accidents, but it was definitely a pain in the ass for everyone involved.
I do not understand how cars can really be self driving, IMHO. For example, say I'm going down a residential street and there are a few children playing with a football on the side of the road ahead of me. Well, in that situation I'm thinking "one of them could kick the ball into the road and run out", so I'll preemptively slow down and be on "alert" just in case that happens.
If the worst happens and a child runs out, even though I may have a slower reaction speed than the fancy car, I was already going slower and in the back of my mind I was prepared.
I don't see how a car can do that kind of reasoning.
Another example I can think of, at traffic lights with a pedestrian crossing, my driving instructor taught me to look out for the lit up "Wait" text, if it's on, I can slow down and preempt that it might change before I pass.
This is the exact example I use too. No self-driving till AI has a proper world model, so maybe not ever.
Another case: I drive and see an empty paper bag blown onto the road by the wind. I know it is empty because of the way it flew, but "autopilot" may mistake it for a rock and suddenly stop or swerve. Though mistaking a rock for a paper bag and driving on would be much worse.
OR maybe the Tesla is so good at driving it doesn’t need to slow down. Maybe it should actually speed up to give people less time to get in front of it.
Another example of a walkable "neighborhood" (with sports, after school, and community events, community usage of facilities at 6-7 days a week) that we ignore the safety of in favor of cars to get wherever else they are going faster.
Yeah I’m a bit surprised to see the problem phrased as “car needs to understand complicated signs to deal with 55mph road having variable speed limit to 25mph during school times” rather than “city needs to figure out how to not have a road with highway speeds going right next to a school”.
I don’t want to make a more general argument about changing things to be easier for self driving cars. Merely that higher speeds should be for roads between places rather than the kind of streets used to access ordinary buildings like schools.
It's standard car-centric thinking. If you start looking into sustainable transport or reducing car usage etc you'll find most of the resistance comes from people unable to imagine a life without a car.
The GP said "The simplest thing would be to just get rid of the temporary speed limit and set it to 25mph at all times. 25mph is fast enough."
The reason why the speed limit drops to 25 mph during certain times around schools is because that is when children and adolescents are traveling to or from the campus grounds. It makes sense that you would want the speed limit reduced during a period of increased chaos. But it doesn't make the same sense to keep that reduction when the school is out of session, or when classes are taking place and the attendees are almost entirely contained within the campus.
And, in real terms, no one likes dropping down to 25 mph on a road that is normally 45 mph at all times. Like it or not, most drivers are not as anti-car as the folks on HN and they absolutely will complain to the city about the low speed limit or just ignore it. At which point, they will go back to conditional 25 mph limits.
To be clear, I think the problem is having a fast road right by a school and making a world where the road next to the school is not fast really means putting the fast road somewhere that isn’t full of turnings so drivers can go quickly and without interruptions all day.
The road outside my school when I was young was a 30 but too narrow and bendy for people to drive that fast. When I was older at a larger school in a bigger town it was also a 30 I think with no speed limit changes through the day. The school was on the edge of the town and as you left the urban area it became a little wider and a 40. Outside my place of work in a big city, I’m not even sure what the speed limit is but it would certainly be unwise to attempt more than 20, but that doesn’t matter because the road is for accessing local buildings not crossing the city.
I (or my parents) didn’t choose this arrangement because of some car hatred, that’s just the way it happened to be.
One question is what happens if you reverse the status quo bias. Suppose there's a 25mph street outside a school and 3 proposals:
1. Do nothing
2. Build a fast 55mph bypass road for the traffic that wants to go from roughly one side of the school to roughly the other
3. Widen the street next to the school and increase its speed to 55mph, but only outside of school hours. Keep the turnings for the school and other buildings on the street.
Perhaps there’s some yawning cultural gap between us but option 3 seems pretty terrible to me as it looks like it does little to help the traffic or the school. But maybe it would be the only feasible thing to do and building a new road wouldn’t be an option. When I look at my schools I described above, the first went for option 2 (well the bypass was more for the slow narrow bendy streets than the school) long before I attended the school, and the second went for option 1.
But we're talking about self-driving cars. There won't be any drivers. When I'm on the train I don't complain about the speed limit because I know fuck all about what speed is safe on that line. Same thing.
Also there shouldn't be a fast road right next to a school. It's insane that such a thing exists.
I've seen many signs that say 25 mph when children are present. That becomes an interesting judgment call: if children are walking down the street or on the sidewalk, it's clearly triggered. But what if children are playing in a fenced in part of the school? How would a computer know that those kids are not able to easily climb over the fence, and are therefore not 'present' quite in the same way as if they were on the sidewalk?
I've talked about this with my lawyer friends (I am a lawyer also), and we've never been able to figure out exactly what this phrase means. Surely it isn't meant to be interpreted literally, as this would imply that the lower speed only applies when multiple children (not a single child) are present.
It's not meant to be interpreted like a computer program, it's meant to be interpreted like law. I'm pretty confident that a judge could reasonably conclude there's an implied "any" quantifier for children, with 1 child satisfying the criteria. Regardless, here's some caselaw that said it only means when children are arriving and leaving:
Indeed, here in Michigan most of these signs have hours that they apply, sometimes it's like 7:00 am-4:00pm, other times they split it into a morning and afternoon session...
In my pre-law days, I tried to argue to an Illinois county judge that it meant kids outside, not just that kids were inside the school that day. Got the ticket anyway.
Wow, the judge considered kids to be present when there were no kids around? That seems pretty outrageous, if the sign said "present".
That doesn't seem to comport with how people actually drive when students are in school but not visible. It also seems questionable to require people to know whether kids are in a school at any given time. You can be fairly sure they're there Mon-Fri during most non-summer weeks of the year, but there are also kids at the school after school for many hours, practicing sports/drama/science bowl.
I'm pretty sure I'd get honked at if I drove 25 MPH past a school in a 35 MPH zone at 5pm on a weekday, if there were no kids around.
This isn't just a problem for self-driving cars; school zone laws are ambiguous even for human drivers... and the rules can differ by locality. This is a problem that states and localities need to figure out.
The rule was created to lower the risk of kids getting hit by a car.
So let's focus on achieving the end goal and updating whatever law that is necessary along the way rather than taking the existing laws as gospel.
Here it could be a signal to the car marking where the school or construction zone is, so that in addition to lowering its speed it also biases its perception of kids or construction workers toward false positives rather than false negatives.
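The biasing idea above can be sketched in a few lines: inside a flagged zone, lower the detector's confidence threshold so borderline pedestrian detections are treated as real (more false positives, fewer misses). The threshold values here are invented for illustration:

```python
# Minimal sketch of zone-conditioned perception biasing; both
# thresholds are made-up numbers, not from any real system.
NORMAL_THRESHOLD = 0.7
ZONE_THRESHOLD = 0.4  # school/construction zone: err toward caution

def is_pedestrian(score: float, in_zone: bool) -> bool:
    """Accept a detection score as a pedestrian, with a lower bar in zones."""
    threshold = ZONE_THRESHOLD if in_zone else NORMAL_THRESHOLD
    return score >= threshold

# The same borderline detection is ignored normally but honored in a zone
print(is_pedestrian(0.5, in_zone=False))  # False
print(is_pedestrian(0.5, in_zone=True))   # True
```

The trade-off, as other comments point out, is phantom braking: every false positive the zone bias admits is a potential sudden stop.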
Tesla have two very distinct cruise/steering product grades, one is called Autopilot and another called Full Self-Driving. It seems unlikely to me that Tesla would claim that their first product was their second product.
This has nothing to do with the article really, but stupid signs will always be a problem as long as they exist. In Austria, there are signs on the highway that allow electric cars to obey a higher speed limit (130 km/h instead of 100 km/h; https://autorevue.at/ratgeber/ig-l-immissionsschutzgesetz-lu... ) provided they have (optional, but now standard) green number plates. My car doesn't know that it doesn't have the green plates, so it cannot know what limit to use. Also, it can't look up the paragraph quoted on the sign to read the current version of the law, presumably.
Are you not running FSD beta? That's the subject at hand here. And FWIW: my car sees and honors lit school zone signs in my neighborhood.
> How does a self driving car make that determination?
How does a human driver make that determination? They don't. Those signs don't work, they never have and never will, and they certainly won't in the age of autonomy. Frankly if you *did* want to come up with a solution to this problem, I'd bet on it working better as a data management problem (i.e. do exactly what you said and then geocode the results for the car to interpret) than as a signage problem (i.e. trust people to honor rules that they'll never understand).
They don't. This is why stocks like TSLA should NOT be subject to quarterly earnings reports, they should be subject to quarterly safety reports instead and earnings should not be public for a long period of time.
Tesla leads the world in terms of self-driving fatalities. It's not even close; the total deaths involving self-driving vehicles from every other automaker in the world combined is less than 1/10th the number of Tesla self-driving fatalities.
Note that I include autopilot in the meaning of "self-driving" here because Tesla does in its marketing.
In terms of total deaths per miles driven, all factors considered, Tesla is very close to the top of the best in terms of safety.
Yeah, there are idiots misusing autopilot but there are about a couple more orders of magnitude more people who die due to drowsy driving a car without driver assistance features.
When has Tesla ever described Autopilot as being Full Self Driving? Regardless what you think of these names, they are nonetheless distinct product names under the Tesla umbrella. Conflating them seems akin to Ford mistakenly describing an Ecosport as being an F150.
We have an economy run by a corpus of idiot shareholders who neither use the product nor are affected by the product, and are free to crash its value at will, and will crash it when earnings doesn't "beat" expectations, so the company is forced to prioritize the entirely wrong things.
If a company wants to "accelerate the transition to clean energy", quarterly earnings is NOT the thing to be prioritizing. Earnings are important for business sustainability, but on much longer cycles than quarterly.
School zone signs around my area actually include a bit of genius design: they are hinged in the middle [1], so during school holidays, the signs are covered up by closing the hinge.
Can they handle construction zones where lanes split all over the place and speeds change? I'm guessing no. Based on the constant amount of scuff marks along the guards of the 5 freeway, and pretty much every other freeway in California, I'm assuming that's a hard challenge even for human drivers.
All they need to do is recognize a construction zone a minute ahead of time and get the human to take over. This would allow level 4 through construction zones. Level 4 is self driving "in the easy parts", giving humans enough time to take over in the "hard parts" (level 5 is everywhere, and as you note a much harder problem). Note though that you can't just stop driving; you need to allow time for the human to figure out what is going on, and how to properly hand over to humans is itself a hard problem.
For what it’s worth, the one thing my Tesla has been consistently good at is picking the correct lines out of a jumble of nonsense on the road.
I’ve used AP heavily in highway construction zones, and at night with bad visibility, and in inclement weather, and in combinations of the above. AP does better than I can manually do at picking the correct set of road marks to follow, even in cases where the construction has left partial incorrect marks underneath / conflicting_with the right ones.
AP has a ton of other issues of varying levels of severity, but if you’re asking “can I trust it in a construction zone”, I’d say yes based on my usage.
No, they fail spectacularly at ad-hoc road work lane setups or setups where the "official" lanes are temporarily blacked out. I think that was the cause of the Tesla smashing into a freeway median a couple years ago.
To be fair, I wouldn't know that some religious schools are Sun-Thurs unless it was spelled out on a sign too. Guess a thorough understanding of all types of religious schools and their operating days of the week should be compulsory for getting a driver's license.
>Just another example of a massive hurdle self driving cars have……
These are my thoughts as well: the promises of AV may have grossly underestimated the problem space. Solving the problem of perception does not equate to solving the problems of driving.
That, among other things, is pretty much why fully autonomous cars == artificial general intelligence; and that's also why it won't happen this year, nor the next one, nor in the next 10. It will happen someday, for sure, just not soon.
Having the ability to reason about what you see can be helpful but it's not necessary.
Whatever the most developed version of the Tesla FSD software is, it already recognizes that people (children) can hide behind objects.
If you look at the way the FSD development is made you'd notice that they changed from coding to designing AI around a particular task. This is something they started doing quite recently.
Inferring speed limits from sign data is one problem, but you could extend this to classify driving conditions based on image data from all cameras, trying to detect a reasonable speed limit from that and forgoing sign detection entirely. Sure, the car can get it wrong, but if you err on the side of caution (which you do) the car will pick the slowest option based on conditions, and you won't care because you aren't driving anyway.
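The "pick the slowest option" policy above amounts to fusing several independent speed estimates (sign reading, map data, a condition classifier) by taking the minimum. A tiny sketch, with the fallback value chosen arbitrarily for illustration:

```python
# Conservative fusion of speed estimates: take the minimum of whatever
# sources are available; None means a source produced no estimate.
def safe_speed(estimates: list) -> int:
    available = [e for e in estimates if e is not None]
    if not available:
        return 25  # arbitrary conservative fallback for this sketch
    return min(available)

# Sign says 45, map says 45, but the condition model thinks 30 is wise
print(safe_speed([45, 45, 30]))        # 30
# No source has an opinion: fall back to the conservative default
print(safe_speed([None, None, None]))  # 25
```

The catch, raised elsewhere in this thread, is that "err low" has its own failure mode: a car crawling well under the prevailing traffic speed is a hazard in itself.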
As horrible as this may sound, ultimately, the system will prove itself when it is involved in fewer accidents, and at that point it is a benefit to all. If this doesn't happen then Tesla is wrong, but it sure looks to me like they are moving along this path nicely.
Driving a car is a simple problem. At least compared to something like general intelligence. You only need to know if it is safe to drive in a particular direction and sure this is a complex problem as you go into the details but it isn't general intelligence hard. We don't even know what intelligence is but we do have something that is looking a lot like cars that can drive themselves.
One way to address this problem is to designate which sections of road are school zoned and which are not. Then, include information about the school zone schedule. I can't imagine this information hasn't been digitized in some way yet.
It's an easily solved problem: the state legislature writes a law mandating a statewide database, and schools are required to enter their information.
Like some other commenters pointed out, ad-hoc situations like police directing traffic and one-direction-at-a-time utility work are a much bigger concern.
I know some municipalities publish this information on their city websites. Granted, I do not know much about municipalities and GIS, but I imagine it is possible this is in some format that can be made available to map data services.
The mapping data that Tesla uses does know about school zones, as well as other time or season based speed zones, and even weather-dependent speed zones. They all get rendered on screen, with the currently-active zone shown.
Couldn’t there just be some google maps / waze API that jurisdictions can enter speed limit information into for different days and times of the week, and just have the car query that?
As far as I know, these signs always have either flashing lights when active or a printed schedule. I don't think it would be enforceable against human drivers otherwise.
Very very far from "always". Many say "when kids are present" or "on school days". Some rely on external signs to indicate whether the zone is active or not, and some localities within the US have completely unique signs.
And this is just the US. Imagine how much variety there's gonna be outside of it.
Okay, so after checking the MUTCD, the "when children are present" sign is allowed, along with either a list of times or when flashing.
One state defines it as any of the below:
> (1) School children are occupying or walking within the marked crosswalk.
> (2) School children are waiting at the curb or on the shoulder of the roadway and are about to cross the roadway by way of the marked crosswalk.
> (3) School children are present or walking along the roadway, either on the adjacent sidewalk or, in the absence of sidewalks, on the shoulder within the posted school speed limit zone extending 300 feet, or other distance established by regulation, in either direction from the marked crosswalk.
So this would be a bit difficult to implement. Probably best handled by defaulting to slow down and let the driver override if they determine no children are present.
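That default-to-caution policy is easy to state as logic. A hypothetical sketch, with all parameter names invented:

```python
def school_zone_limit(zone_may_be_active: bool,
                      children_detected: bool,
                      driver_override: bool,
                      school_limit: int,
                      normal_limit: int) -> int:
    """Err on the side of caution: apply the school limit whenever the
    zone could plausibly be active, unless the driver explicitly confirms
    no children are present AND the car's own sensors agree."""
    if not zone_may_be_active:
        return normal_limit
    if driver_override and not children_detected:
        return normal_limit
    return school_limit
```

Note the detection check still wins over the override: if the car sees children, the driver's "no children present" claim is ignored.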
That's not the biggest problem: All drivers have to obey hand signals from policemen and other people who are authorized to direct traffic. Which means your car has to know what police officers and state troopers and highway patrol and mounties look like where you happen to be located right now, and not be confused by some dude dressed up as one. Oh, and it has to understand the hand signals as well.
I don't think it's reasonable to expect either car or human to determine the distinction between "person dressed as cop" and "sworn police officer acting in accordance with his duties".
Probably shouldn't anyway. It's not unusual for civilians at the site of some unexpected hazard to warn traffic.
"How can Tesla claim self driving if the car can’t read a sign that says - speed limit 25 mph during school hours, and properly adjust?"
Self driving will always be dangerous unless overall traffic infrastructure is updated.
Can you imagine a train where 100% of the onus of auto-braking falls on the train itself, with zero input from sensors and towers outside the train?
> Self driving will always be dangerous unless overall traffic infrastructure is updated.
I don't see how people can propose this kind of thing with a straight face, when we live in a world where we can't even afford to replace the paint on the road when it gets worn away.
Yeah, sure, governments everywhere will be lining up to pay billions of dollars for putting up and maintaining new infrastructure to provide us with some low ROI shiny toys. And I have a bridge on a blockchain to sell you.
We would need to re-think resource allocation. Fewer deaths means smaller healthcare expenditure. Disability claims. Improved road infrastructure doesn't need to be that expensive.
Yes? That is pretty easy to imagine actually. It would be much less efficient than the current system with a central train controller, but it is definitely imaginable.
In the case of Tesla massively failing to drive safely in a school area: If you cannot operate safely, don't allow autopilot to engage at all.
Yes! I would rather spend resources to standardize “smart” traffic control infrastructure, where vehicles and road/street constantly communicate with each other even if it would just be for augmenting drivers’ awareness. For example in-car warnings about abrupt stop ahead, train approaching level crossing, positions of nearby vehicles, traffic light cycles, actual speed limits etc… Training “AI” to make sense of (sometimes barely) human readable signs and clues is waste of time in my opinion. Maybe just for helping with low speed obstacle avoidance…
I think the central thesis of OP is that the current infrastructure is built for humans. We seem to do OK. So, if anything, these kinds of issues are an indictment of the failure of self-driving tech that was boosted to insane hype around 2017-2018. Now we're getting to a phase called the "Trough of Disillusionment" in Gartner terminology. If we require the rest of the infrastructure to be rebuilt for self-driving tech, then it is an irrefutable admission of failure.
Reading speed limits signs is the easy part compared to figuring out how to handle all the kids walking to school on the sidewalk next to the road (any kid might get pushed into traffic, or decide to start crossing right there...). Even figuring out school hours is easy. Of course it isn't just kids, I've had to handle a bear on the road in front of me before.
> Can you imagine a train where 100% of the onus of auto-braking falls on the train itself, with zero input from sensors and towers outside the train?
I'm not hugely familiar with trains, but as I understand it, trains in the general case have a much worse braking-distance-to-visibility ratio than cars do.
Roads are generally designed to be safely navigable in good conditions when their occupants are obeying the speed limit, without external sensors. Rail lines are designed to be safely navigable only with the aid of external sensors. That's why trains can take blind corners at speed.
Many years ago I was surprised to find out how traffic-detecting intersection lights worked.
I assumed computer vision had been solved and they had reliable enough detection of cars, so you could just plop a few cameras in the intersection to detect waiting cars and start the transition when safe.
Nope! Instead, large numbers of intersections are dug up and electromagnets are installed to detect cars. These are very sensitive (but can miss cyclists) and super-simple; a microcontroller can reliably detect a car as a pulse on a line.
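The controller-side logic really can be that simple. A toy sketch of the debouncing a microcontroller might do on the sampled loop line; the sample representation and threshold are assumptions for illustration, not how any particular controller actually works:

```python
def count_vehicles(samples, threshold=3):
    """Count vehicle detections from a sampled loop-detector line
    (1 = metal present over the loop, 0 = clear).
    A vehicle is a run of at least `threshold` consecutive 1-samples,
    which filters out single-sample electrical noise."""
    count = 0
    run = 0
    for s in samples:
        if s:
            run += 1
            if run == threshold:  # run just became long enough to count
                count += 1
        else:
            run = 0
    return count
```

The real detectors sense a change in the loop's inductance when a car sits over it; by the time the signal reaches the controller, though, it is essentially this kind of pulse counting.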
Not specific to Tesla, but I guess the answer is in your question. Most likely all the current self-driving efforts across the board will fail until we have smart roads. A road that can help a vehicle self-drive will reduce the complexity of self driving. And of course that might mean these roads will have to be free of human drivers and pedestrians.
I was rear-ended in Colorado when I didn't do a rolling stop. Had a road-rage incident where the other driver was livid that I would stop under those circumstances.
My dad (when I was growing up) was pulled over by a cop for doing a California stop in Colorado. The police officer let us off because we had California plates.
When I got my Tesla, it was immediately evident that _no one_ travels the speed limit. They travel 5-10mph over it on anything other than a local road, or they drive way under it.
This is the problem with our mental model - FSD/Autopilot has to co-exist with human drivers - and human drivers don't follow the law. FSD/Autopilot has to exist in a world where roads do not follow standards, lines are not marked, deer jump in front of cars, people do California stops (it's called that for a reason), cars have accidents.
> When I got my Tesla, it was immediately evident that _no one_ travels the speed limit. They travel 5-10mph over it on anything other than a local road, or they drive way under it.
It is amazing to me how angry and insane people driving on the roads can be.
I don't want to have a constant battle with every other driver on the road. I want to set cruise control, stay in my lane, and turn off at my exit when I need to. Doing this almost always triggers a tailgating session and a driver becoming enraged at me (I am not in the passing lane, for the record).
99% of the pain in driving is other drivers deciding to just be assholes and not let me merge onto a highway for no apparent reason. Perception of "me in front me go fast" maybe?
Driving today, it is pathetic. We are all locked into our little cages and make each other miserable for no benefit.
I'm completely fed-up. There should be a system where I can send dash-cam footage for review and get a ticket mailed to their house. Don't defund the police: direct them at the asshole-fucking-drivers making everyone's life miserable while they play speed racer on the freeway.
> Doing this almost always triggers a tailgating session and a driver becoming enraged at me (I am not in the turning lane, for the record).
Or the fast/left most lane I hope. I cruise control in middle lanes and still have people tailgate like there isn't one or two open lanes to the left of me
One of the most annoying behaviors on the section of I-5 nearest where I live are people cruise controlling in the middle lane. I try so hard to drive courteously and safely, but middle-lane cruisers force me to make exceptions to that philosophy, just so I can make it from the left side of the freeway to the right.
It also causes other traffic to resort to high speed passing on the right and left simultaneously because they are aggravated with the moving blockade.
So I have no sympathy for people who cruise control unnecessarily in the middle lane. I wish the cops would write them tickets mercilessly until they stop. Traffic on the freeway flows smoothest (and safest!) when there is a comfortable difference in speed between the lanes, predictably from slowest to fastest as you go farther left.
I’m confused. I usually like to cruise control in the middle lane (at around 3-5 over the speed limit); how does this cause problems for you?
If I were to stay in the right lane I’d be stuck going 5-10 under because of all the semis & constantly having to dodge merging traffic. If I stayed in the left lane I’d piss off everyone who wants to go 20 over. This seems to be how most people drive, and it achieves your graduated-speed-difference standard: the right lane is for semis & getting on and off, the middle lane is for most traffic traveling at a comfortable speed, and the left lane is for passing.
> I’m confused. I usually like to cruise control in the middle lane (at around 3-5 over the speed limit); how does this cause problems for you?
If it were one car, it would be a minor annoyance, at worst. But this style of driving is really popular, so it's a steady line of cars cruising the middle lane in a line. It divides the freeway into two one lane roads, bottlenecking traffic and making it difficult for people who are in the left lane to get to the right in order to take their exit.
It may be that the dynamics are different in your area. Between Portland and Salem, Oregon, the exits are several miles apart so there's not much merging traffic.
> What would you propose I do differently?
In my experience the best freeway experience is when people pass only on the left, and there is enough of a speed differential between lanes that when you need to change lanes you can always find a spot you fit in. It gets aggravating when traffic is pacing, especially at high speeds, because it makes it a lot tougher to make comfortable lane movements. It's worse when the left lane is filled with cruisers, because they always bunch up and follow each other with a 1-second separation, making lane switching hard (and when you turn your signal on, at least half of them will just speed up to make sure the gap is even tighter).
I'm not going to tell you how I think you should drive. There are a lot of factors that are not necessarily the same between where I drive and where you drive. But I can tell you that the dynamics on this section of I-5 up here are just awful. It's a well known local phenomenon, everyone I've ever ridden with through that section of I-5 comments on how terrible the traffic gets with passing on the right, left, and locked up down the center.
Thanks for clarifying! I frequently drive that same stretch between Portland and Salem. It can definitely be a really annoying drive, especially when there's congestion.
The problem with staying in the right lane for any period of time is all the semi traffic. It is in my experience very rare for the right lane to be going at or above the speed limit; realistically, if I stay in the middle lane I am always passing people in the right lane. I will of course get over if the right lane is clear and traffic behind me wants to pass, but this doesn't happen often. I do get passed on the right occasionally, but only by aggressive drivers who use the 5-second gaps between semis to weave across the road -- not by the normal flow of right lane traffic.
I share your frustration with the difficulty of changing lanes, especially in congestion. However, I would argue this is more of an issue with people following too close than using the wrong lanes -- if the right lane was a solid wall of cars, you wouldn't be able to get in there either. In any case, I usually don't have too much trouble if I give myself a few miles before my exit to get over -- though sometimes that can require forcing my way into an annoyingly small gap.
That all being said, I do agree that there are a few spots where the lane arrangements can become really dangerous, especially during congestion. For instance: the first few miles of I-205 northbound near Tualatin/West Linn, the weird bridge in Oregon City and just about anywhere on or near OR-217.
This is exactly my point, rootusrootus. You string words together, but you do not have a coherent point on how roads should work.
So how the fuck am I supposed to drive to please you? Does everyone need to drive at a reasonable pace, in their proper lane? Or do they need to drive 10 miles over the speed limit trying to go as fast as possible and pass the person in front of them? Oh, should we also have a fucking neural uplink into your brain to make sure you don't have to waste precious time signaling to get into the lane everyone is SUPPOSED to drive in for long distances?
There is going to be unpleasantness driving. Sometimes car go slow. Try not to shit your pants when you can't swerve into whatever lane you need to be in.
You don’t need to try that hard. It isn’t that much effort. Just accept the fact that going +5-10mph will hardly save 5-10 mins on a 100 mile drive but going near speed limit (+ 2 to 5 mph) will result in everyone having a safe and easy drive.
People trying “hard” to drive safe are the most dangerous on roads as they think they are constantly changing lanes and going around others for safety reasons but actually result in more danger and annoyance for everyone else.
At what percentage of the speed limit are these cruisers you’re talking about? Assuming it isn’t >10% under the speed limit, what’s your issue actually?
In the situation you’re describing, if it’s a 3-lane highway and you’re in the fastest lane you need to plan and prepare properly to start transitioning well before you want to exit. If one or two cars at constant speed are preventing you from transitioning in time, the failure is on you.
Having a constant safe speed in the middle lane sounds ideal.
> Assuming it isn’t >10% under the speed limit, what’s your issue actually?
Slower traffic keep right is a thing. You are not entitled to stay in the passing lanes just because you are going at least the speed limit. My issue is that it really mucks up the capacity of the freeway and increases the suffering for everyone using it.
> you need to plan and prepare properly to start transitioning well before you want to exit
I think I didn't make myself clear. It doesn't matter how early you plan, whether 10 seconds or 10 minutes. You will just have to put your signal on and take a spot, even if it puts you much too close to the guy in front with someone much too close to your bumper. Because the cruise controllers drive like a train, and don't care to leave any reasonable gap between them.
Really, the reason traffic sucks in so many places is because people think laws like 'slower traffic keep right' are there to infringe on their rights, when really it's because traffic engineers have studied how traffic flows and made regulations to encourage it to be safe and as little congested as possible.
I see this thread has really gotten active with you, and I mostly agree with it, but I didn't see anyone pointing out your description of cruise-controlled cars being a blockade in the middle. I definitely don't see it that way with my adaptive cruise control (I never ever use regular cruise control anymore). Even at the closest follow-distance setting, I am always about 2 car lengths from the one I am following. I have only ever used my one particular adaptive cruise control (Subaru Impreza), so I am not sure if other car models tend to follow closer and block the ability to change lanes.
>You will just have to put your signal on and take a spot, even if it puts you much too close to the guy in front with someone much too close to your bumper
This description makes me think you're talking about stop and go traffic, or at least where the entire highway is mostly saturated. At that point I hardly think any one particular lane is going faster/slower than the rest but that could be wrong. In this case though, I don't use cruise control because I mentally switch over to the mode of "gas and brake the least possible" to even the flow of traffic and not contribute to brake snakes.
edit:
Oh, I see in some other comments you're talking about the 3-lane I-5. I have only been on mostly 2-lane sections of I-5, where it's just trucks on the right and people waiting in a line to pass said trucks on the left, but with three lanes I haven't had any issues.
This'll be my last reply on this topic, because I figure I'm talking to a brick wall. No offense intended :). I'm sorry that people don't really understand why the whole 'please stay to the right except when passing' rule makes the freeway flow much better. A line of cars bumper-to-bumper in the middle lane on cruise control causes so much congestion. But people do it because they don't want to be inconvenienced by the occasional traffic getting on the freeway.
If it were a four or five lane freeway it would be less egregious. But what it does to I-5 when it's only three lanes is really terrible.
Ah okay. Where I've been the right lane is not occasionally impeded with traffic merging and exiting, it's almost constant, so lots of people hogging the left lane actually causes blockages from people trying to enter and exit.
I also haven't noticed the middle lane blockage problem that much. It's very easy to merge when people are traveling a constant speed in a predictable lane.
I can see where your problem is coming from, though. Don't feel like you're talking to a brick wall; it's the nature of the forum. It looks like a bunch of people all asked the same question around the same time (I didn't see the others when I hit reply), so you're actually responding to 3 different people, not one person who keeps being unable to follow.
> My dad (when I was growing up) was pulled over by a cop for doing a California stop in Colorado. The police officer let us off because we had California plates.
I'm confused. A "California Stop" isn't even legal in California.
I don't disagree. It was the comment about the California plate - almost like it was supposed to validate the practice - that I don't understand.
I commented as I have a concern that an uneducated reader may walk away with the idea that it's legal in that state. It most certainly is not.
Apparently, though, Idaho does have some provision for bicyclists and rolling through stop signs. So, there you go - an "Idaho Stop" is a thing for any potential bicyclists in Idaho.
I am also more familiar with the term "california stop" than "rolling stop", and I was taught it by my parents who have never lived west of Texas. Looks like this admonition did a California stop all the way to the east coast.
It's not, but approximately everybody does it. I've even seen cops do it. Once, I sat at a stop sign for a while, just to see if I could observe anybody actually stop. Nobody did.
> A "California Stop" isn't even legal in California.
I'm informed that these are actually legal as long as you don't disrespect the police by doing it in front of them. It's the same way that going slightly over the speed limit on freeways is legal as long as you don't insult the police by passing them.
Going over the speed limit is not legal, but no officer will pull you over at +1, unless you piss them off for some other reason, like by being black in a white community, for example. Racial biases aside, this is actually a really good thing. If every law was enforced perfectly today, everyone would be arrested, fined, or jailed tomorrow.
Realistically, everywhere on I-5 between Seattle and the border, people do 5 over, and often 10 over. I'd wager it's near impossible to get pulled over for doing 65 in the 60 zone north of Seattle, or, when the speed limit changes to 70, for doing 75 in the 70 zone on cruise control.
Usually when I have cruise control set to exactly 75 and I'm in the right lane, I'm very often passed by people who are probably doing 80 to 82 mph.
In New York, the highway norm is 13-15 over for the "I want to go fast but not get a ticket" crowd. (i.e. 80 in a 65).
AFAICT, the reason is that there's a big jump in the median fine if you're going >15mph over the speed limit vs <= 15mph over. Economically-speaking, it's not worth the police officer's time to pull somebody over who's "merely" 10 over, because somebody going 16+ over will appear in a few more minutes of waiting.
Interestingly, the state website says the official "maximum" fine bump occurs at 11+ mph over. I suspect they wait for 15mph to rule out the plausible cover stories "I was only going that fast to overtake the other car" or "new here, I didn't realize the incline was so sharply downhill with nobody in front of me".
You probably wouldn't get pulled over for going 80-82 in Washington (i.e. the people passing you probably have their cruise controls set for 80), but it might be a less enjoyable drive. You have to pay more frequent, closer attention, because you're overtaking people more often.
Ironically, California dives deep into low fines for speeds under 15 over: the under-15-over base fine is a meager $35! For 16-25 over, it jumps to $70, which is still paltry.
> I'm informed that these are actually legal as long as you don't disrespect the police by doing it in front of them. It's the same way that going slightly over the speed limit on freeways is legal as long as you don't insult the police by passing them.
You and I have a very different view of what makes something legal.
Many years ago had to do traffic school in california. Instructor took us out of class to go watch people at a four-way stop sign. Nobody stopped. We even got to watch a cop not stop.
One thing I'd like modern cars to implement, given that they're supposedly close to self-driving, is to cap your speed at some percentage above the speed limit.
There is no reason why a driver should be capable of driving the car 130 km/h in an 80 km/h zone.
I've always wanted a slightly less draconian version of this- a speed limit setting that makes the throttle pedal harder to push down once you've hit the a given speed. Sort of like a natural cruise control. This would keep you from accidentally creeping up over the limit, which is certainly something that I do, but still allow you to go faster if you make the conscious choice.
Mercedes had a version of this called a "haptic pedal" in their previous gen C-Class PHEV. It would provide a small bump of resistance before you engaged the gas engine. It also would gently push back when it saw the car ahead was slowing down, to encourage efficient coasting. A quick search shows they spread it to a few other cars.
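The "harder to push past the limit" idea sketches out as a simple pedal-force curve. The numbers and names here are arbitrary illustrations, not anything a manufacturer actually uses:

```python
def pedal_resistance(speed_kmh: float, cap_kmh: float,
                     base_force: float = 1.0,
                     ramp: float = 0.5) -> float:
    """Illustrative pedal-feel curve: normal resistance up to the cap,
    then linearly increasing resistance the further past the cap you go.
    The driver can still exceed the cap; it just takes deliberate effort,
    which prevents accidental creep without blocking a conscious choice."""
    if speed_kmh <= cap_kmh:
        return base_force
    return base_force + ramp * (speed_kmh - cap_kmh)
```

Because the resistance ramps rather than hard-stops, the emergency-override objection below largely evaporates: flooring it still works, it just feels unmistakably different.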
Emergency situations exist, and cars aren't infallible, so there would need to be a way to override such a system or disable it. Otherwise, imagine the chaos of a track day or autocross when someone's car suddenly thinks it is on the road next to the track and slams on the brakes...
If you gathered up every single one of these edge cases and multiplied them by 10 you wouldn't get anywhere near the actually-existing, non-hypothetical, measurable problem of speeding.
So, sure, we should talk about edge cases and people who implement this technology should work on them, but even if they didn't, the tech would still be a good idea given fairly pedestrian calculations of the relevant tradeoffs.
And even if we accept your premise that the edge cases are a big deal, the solution presents itself quite readily: let there be an override with serious consequences for frivolously engaging it. Simple.
For every suggestion to curb the car problems of society, there will always be someone bringing up edge cases to defend status quo.
How many meters are done on a track each year, compared to public roads? I'd wager most people have never even set foot on a track. Cases like that should have absolutely no bearing on policy making.
And emergency is always used as an excuse as well to not limit car use or rebuild streets. If anything, getting rid of traffic means that it's now easier for emergency vehicles to get where they are going. Same for disabled people. It's always people shouting "you can't ban cars here, think of the disabled", when the truth is that getting rid of most of the cars and building better infrastructure would make their day easier.
I think the issue of "always bringing up edge cases" is meant to illustrate that a LOT can go wrong with some of these hard and fast rules. It is why humans are simultaneously great and terrible at driving. One slightly bad driver can be accounted for by several slightly better drivers navigating around the bad one's behavior. This smooth "self-correction" on the roadway leads to lots of rule breaking but also probably saves a lot of time and lives by allowing individual humans to navigate a situation (especially since lots of people forget the exact rules of the road, e.g. when do you yield in a turning situation, and what are the exceptions?).
However, I really do like your last point: the easiest way to solve this is to remove cars from as much of the equation as possible, introduce transit that follows rules (trains and buses, with higher compliance with road rules perhaps), and let humans navigate the last meter themselves.
> I think the issue of "always bringing up edge cases" is meant to illustrate that a LOT can go wrong with some of these hard and fast rules.
This makes more sense as an objection in a vacuum than it does in the reality where the alternative is that speeding is a big problem within cities. We don't have to solve every edge case. We just need the edge cases to be less of an issue than speeding already is. And that's a much easier bar.
Asking, "what if we do this?" is fine, but one should also always consider, "what if we don't?"
But that is the point... if we don't do something here, humans can continue to take their own actions to resolve the situation (speed up with traffic or move to the right or choose an alternate route). That level of flexibility allows for each person to make the choice they want and feel comfortable with (as opposed to the car making a decision for you)
Let's be clear here that "letting the car make the decision for you" in context means "you can't go as fast as you want in an urban area because the car will prevent it." As in, you can't drive many multiples of the speed limit in urbanized areas packed with buildings and pedestrians and other cars. The car should prevent that in practically all cases. The car does know that that's a good idea in practically every conceivable case. A human driver who thinks they should drive many multiples of the speed limit in an urban area is wrong.
It's so manifestly wrong that it shouldn't even be technically possible. There's no good reason to manufacture a car that responds so stupidly to such a dangerous and irresponsible input.
And when the day finally comes that the conditions are just right so that a driver dies attempting to flee an avalanche because this tech prevented them from escaping at speed, we will put a lone checkmark in the "lives lost to this technology" column that's adjacent to a "lives saved" file 10 miles long.
More often it's people trying to implement solutions without thinking the problem through to the end.
Like trying to ban the sale of combustion engines without having a robust charging infrastructure. Or making parking in a downtown area a nightmare without improving the public transit infrastructure.
Or in this case, trying to implement maximum speeds in cars when the technology (GPS? Accurate speed limit maps? Extra sensors on cars and roads?) isn't good enough or isn't there.
> How many meters are done on a track each year, compared to public roads? I'd wager most people have never even set foot on a track. Cases like that should have absolutely no bearing on policy making.
It absolutely should have a bearing, as you are essentially arguing that those cases should be illegal and banned. You can't ban something and then argue that "well, not many people were doing it anyway, so we can ignore it even though we're banning it."
Your policy impacts it, therefore your policy must account for it. Even if that account is to decide that the impact is justified, you are making significant changes to things outside of your stated & claimed goal.
> And emergency is always used as am excuse as well to not limit car use or rebuild streets.
I did no such thing?
All I said is there needs to be an 'off' switch, just like there is for traction control in the majority of vehicles today. The overwhelming majority of people leave it enabled (as that's the default), and the world is better. Rules don't need to be black & white to have broad societal improvements.
> It's always people shouting "you can't ban cars here, think of the disabled", when the truth is that getting rid of most of the cars and building better infrastructure would make their day easier.
Except you were arguing that cars should be banned for uses outside of being on the road as part of normal traffic. As in, your "rule" wasn't improving your hypothetical excuse situation here. It was instead making it the only legal usage of a vehicle, to compete with mass transit.
Good luck passing that law when we can't even gather political will to put in average speed cameras in most cities, which would also solve speeding without needing self driving.
British Columbia once had radar cameras on the highways. Instead of getting pulled over for speeding, you’d get a ticket in the mail. People hated it and consequently we elected a thoroughly corrupt government on the promise to rescind the practice.
But for a while there we all drove at around the same speed, within about 10% of the posted limit, and it was fucking glorious. It was so much less stressful: very little passing of one another, fewer idjits weaving in and out, left turns off the highway less of a guessing game, fewer accidents … it was just fantastic when most everyone did the same speed.
Of course now we’re worse off than ever, because without cameras the cops essentially gave up enforcing speed limits and now there’s an over 50% difference in speed between the slower and faster vehicles.
Tesla supports setting a speed over/under current area's speed limit (not sure if it's disabled in certain jurisdictions - worked for me both in California and Florida)
I believe there is a push in Europe to include this in cars based on computer vision. It was pitched as "not above speed limit". I would at least hope it is "not above speed limit for long" or have some form of slack.
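A "not above the limit for long" rule could amount to a simple grace-period check. A toy sketch of that idea — thresholds and names are invented for illustration, not taken from any actual ISA specification:

```python
import time

class SpeedAssist:
    """Toy Intelligent Speed Assistance logic: tolerate a brief
    overshoot (e.g. while overtaking) but intervene if the car
    stays above the detected limit for too long. All thresholds
    here are made-up illustrative values."""

    def __init__(self, grace_s=10.0, tolerance_kmh=3.0):
        self.grace_s = grace_s            # seconds allowed over the limit
        self.tolerance_kmh = tolerance_kmh
        self.over_since = None            # when the overshoot started

    def should_limit(self, speed_kmh, limit_kmh, now=None):
        now = time.monotonic() if now is None else now
        if speed_kmh <= limit_kmh + self.tolerance_kmh:
            self.over_since = None        # back under the limit: reset
            return False
        if self.over_since is None:
            self.over_since = now         # overshoot just began
        return (now - self.over_since) > self.grace_s
```

Under this sketch a car doing 60 in a 50 zone gets left alone for ten seconds, then throttled back — enough slack for a short overtake without permitting sustained speeding.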
So, you're entirely removing the option of "escape" from people caught in bad situations then. I'm not sure that's wise. Further, you'd have to ask, how hard would it be to disable that? Then, given that some users will have this disabled, how does that impact the roadway as a whole?
I believe that this is one of the worst ideas about modern cars I have ever heard.
It's almost equivalent to saying that we should replace all metal knives in kitchens with plastic knives.
"There is no reason why a person should be able to use a metal knife to hurt someone else."
Some of the problems with your idea :
1) A lot of places don't have speed signs - then should we allow people to drive 130 in 80 zone?
2) Outdated / incorrect signs - not every place on the planet has properly maintained roads.
3) AI mistake - if an AI mistake leads a car to read 50 km/h as 30 km/h, what then? Should the driver suffer?
4) Emergencies - have you ever been in one? Imagine if your car led to someone's death because you didn't trust people with their own car
5) Empty roads - if you have an empty road in front of you, you can go 110-120 in 80 zone. This practice is followed at every single place on the planet.
etc. etc.
You know the worst part about your idea? It's so insane that it actually will be implemented in the next 5-10 years. That's the part I hate the most. Bad ideas are not a problem, but actually implementing them is.
All it takes is 1 exec in Tesla who thinks like you or who has had a loved one die by overspeeding.
Any opposition to your idea can be crushed by saying "do you want people to die by overspeeding"?
We won't ban alcohol, but we want to ban overspeeding. The irony!
1) in my country the default is 80 km/h when no sign. Use that, then. Like seriously, do you think people today think "wow, no sign, I can drive 200"??
2) a sign isn't outdated, it's by definition the limit
3) so what? No one gets hurt by someone driving a bit slower for a short stretch.
4) if you care so much about emergencies, shouldn't you be for this policy? Tens of thousands die each year in traffic accidents, isn't that worth solving vs some obscure edge case that might perhaps sometime cost someone their life??
5) no, follow the speed limit. You're the kind of driver that makes this kind of law necessary. It doesn't matter that the road ahead is empty, if you misjudge the next turn and kill someone. What a crazy, selfish and entitled attitude you have.
> All it takes is 1 exec in Tesla who thinks like you or who has had a loved one die by overspeeding.
Your lack of empathy is astonishing. This happens all the time, and is why it should be implemented.
You can buy metal kitchen knives in most parts of the world but you may not be allowed to walk around town pointing them at people, and you may not be able to buy a claymore even if it is just for cutting cheese or opening letters. And speed limits can be >60mph even when 30-40 may be dangerous. So I don’t really understand what your analogy is, because in the real world there are reasonable restrictions on things, and where the proposed restriction still allows things that are unsafe.
You missed the point of my analogy. You CAN point your knife at anyone and even attack with it. You will be punished for it. That's the point.
The knife is not going to stop you from pointing it at someone. If we want the car to force people not to overspeed, then by the same logic we should replace metal knives with plastic ones to prevent people from attacking each other.
My entire point is that the responsibility to drive safely should be on the driver. The car should not force anything.
Also, you should read the rest of my answer to see why I think it's a really bad idea.
In 2020, the US had 1,700 deaths caused by knives and other cutting instruments[1] and 38,000 deaths from cars[2]. Cars clearly need more safety regulation than knives.
You need to design the road to the speed you want, not the other way around like it mostly happens in some countries. That solves quite a lot of problems.
I think it's reasonable to say "nobody" follows all the rules of the road. The ones you deem important enough to rigorously follow come down to local social dynamics/norms more than written law.
I’m from Colorado and drivers there are much tamer than the Bay Area - and rightly so! Traffic conditions in SF/peninsula (where I live now) are way more hectic & dense than Colorado. It’s almost weird to expect uniformity across all regions
I don't think it's that weird to expect people who are licensed and tested on the rules of the road to actually follow them. Granted, if you do have that expectation, you'll be disappointed approximately 100% of the time, but I don't think the expectation itself is that weird.
The more insidious problem is that much of this is by design. Legislators know that people don't obey the speed limits. Police officers too. It just gives them license to stop and / or charge whomever they please. It is an obvious and vulgar workaround to undermine the principle of equal protection.
Colorado highways seem to have a weird thing I haven't seen elsewhere. The right lane is consistently going ~10mph below the speed limit while the left lane goes ~10mph over. Any and all middle lanes are also a bit of a crapshoot.
In my experience in Colorado, the right lane is usually empty and it's often easier to go faster than traffic in the right lane (at least on sections of freeway without exits close together). The left lane is usually traveling at least 20mph faster than the limit (on I-25 anyway). 85mph in a 65mph zone is the norm.
The actual implementation is a little more nuanced than always roll or never roll, with seven conditions required to avoid coming to a complete stop:
1. The functionality must be enabled within the FSD Beta Profile settings; and
2. The vehicle must be approaching an all-way stop intersection; and
3. The vehicle must be traveling below 5.6mph; and
4. No relevant moving cars are detected near the intersection; and
5. No relevant pedestrians or bicyclists are detected near the intersection; and
6. There is sufficient visibility for the vehicle while approaching the intersection; and
7. All roads entering the intersection have a speed limit of 30 mph or less.
If all the above conditions are met, only then will the vehicle travel through the all-way-stop intersection at a speed from 0.1 mph up to 5.6 mph without first coming to a complete stop. If any of the above conditions are not met, the functionality will not activate and the vehicle will come to a complete stop.
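Stated that way, the policy is just a conjunction of seven checks; any single failure forces a full stop. A minimal sketch of the gate — the field names are invented for illustration and are not Tesla's actual interface:

```python
from dataclasses import dataclass

@dataclass
class IntersectionState:
    # Hypothetical snapshot of what the planner knows; all field
    # names are illustrative, not from any real Tesla code.
    feature_enabled: bool            # 1. enabled in FSD Beta Profile
    all_way_stop: bool               # 2. approaching an all-way stop
    speed_mph: float                 # 3. must be below 5.6 mph
    moving_cars_nearby: bool         # 4. relevant moving cars
    vulnerable_users_nearby: bool    # 5. pedestrians or bicyclists
    visibility_ok: bool              # 6. sufficient visibility
    max_entering_limit_mph: float    # 7. all entering roads <= 30 mph

def may_roll_through(s: IntersectionState) -> bool:
    """True only if every one of the seven conditions holds;
    otherwise the car must come to a complete stop."""
    return (s.feature_enabled
            and s.all_way_stop
            and s.speed_mph < 5.6
            and not s.moving_cars_nearby
            and not s.vulnerable_users_nearby
            and s.visibility_ok
            and s.max_entering_limit_mph <= 30)
```

Note that condition 5 is exactly the one the runner downthread is worried about: the gate is only as good as the perception feeding `vulnerable_users_nearby`.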
Speaking as a city runner, you are 100000% wrong, and I have had numerous close calls demonstrating this.
If you roll through at 5.6mph while I am jogging into the intersection from your side, I will often be hidden by your A pillar the entire time right up until you hit me. And a 5.6mph point-blank hit to a pedestrian can be serious.
(edit: I have avoided these hits so far by watching people's eyes. If I can't see them see me, I stop regardless of right of way. This usually gives them a nice scare as they're rolling into the intersection and glance out their side window to see a person standing right there 'out of nowhere'. Not sure how I'll tell with FSD.)
As someone who did actually get run over when running through a crossroad as a kid, I _never_ run through crossroads since then, regardless of eye contact or not.
Runners (and the much loved sidewalk cyclists) are much farther from the road than a walking pedestrian. The drivers are NOT seeing you coming.
> Would you rather be hit by a semi-truck going 1mph or a bike going 28mph?
I'm actually not sure. The initial impact would hurt more with the bike, suggesting that perhaps momentum or something else proportional to velocity (not v^2) is a good heuristic there.
But if I tripped after impact, the bike would do no further damage to me and the semi would be ~1s away from killing me.
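For what it's worth, the physics backs up the ambivalence: momentum scales with v while kinetic energy scales with v². With rough illustrative masses (a ~15,000 kg loaded semi, ~90 kg for bike plus rider — both assumptions, not measured figures):

```python
MPH_TO_MS = 0.44704  # miles per hour -> meters per second

def momentum_kgms(mass_kg, speed_mph):
    """Momentum p = m * v, linear in speed."""
    return mass_kg * speed_mph * MPH_TO_MS

def kinetic_energy_j(mass_kg, speed_mph):
    """Kinetic energy E = (1/2) * m * v^2, quadratic in speed."""
    v = speed_mph * MPH_TO_MS
    return 0.5 * mass_kg * v * v

semi_p = momentum_kgms(15_000, 1)     # ~6,700 kg*m/s
bike_p = momentum_kgms(90, 28)        # ~1,100 kg*m/s
semi_e = kinetic_energy_j(15_000, 1)  # ~1,500 J
bike_e = kinetic_energy_j(90, 28)     # ~7,100 J
```

So under these assumed masses the semi carries several times the momentum (it can shove and then roll over you), while the bike carries several times the energy (the impact itself hurts more) — consistent with the comment's intuition that both answers are defensible.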
5.6 miles per hour is a fast jogging speed if you're on foot, twice as fast as typical walking speed. It is not a stop, doubly so if you're in a big metal box. Please don't do this.
Never having been stopped for doing this and confirming with police that they do not stop people for this are separate things, and I'm willing to bet that in your case it's the former, not the latter.
So presumably the Tesla is aware of all local laws and, more importantly, how strictly they are enforced from town to town, in every market the Tesla is sold or imported, in order to use this functionality safely and without resulting in ticketed traffic violations?
> more importantly, how strictly they are enforced from town to town
Does this dataset actually exist anywhere? I would bet that at best it might be able to be inaccurately inferred from other data.
Regardless, if we as a society decide to build self-driving cars, should we really be optimizing for the financial well-being of the driver (via tickets) over the physical wellbeing of the humans in the society (who are hit by FSD cars that roll stop signs in areas with less traffic enforcement).
Presumably because it matches the behavior of human drivers.
If self-driving cars were widely rolled out already, any large manufacturer could probably completely break traffic nationwide by rolling out an update that makes the cars actually follow every law and speed limit. (Just like work-to-rule can be as effective as a strike.)
Which, to me, just means the laws are wrong. The police often don't obey basic traffic laws like speed limits and signaling (in my experience of following police on the freeway).
I recently read an interview with a highway patrol officer whose opinion was that it's not really feasible for them to drive at the speed limit. Since a marked police car is already effectively a pace car -- no one wants to pass it for fear of being pulled over -- they would just distort the natural flow of traffic and probably create tailbacks everywhere they went, which would end up being more dangerous.
I'm not sure I agree completely with this reasoning, but it was an interesting perspective.
That's the same as saying that the speed limit (on freeways) is wrong because if everyone obeyed the speed limit, it would cause significant traffic congestion. In fact, you can see this when there IS a police officer driving the speed limit on the freeway and no one passes them. Traffic backs up for miles.
I recall seeing a forums post by a traffic engineer once and they said part of the process of deciding speed limits is, once the actual optimal speed for the road segment has been determined, subtract ~10mph to pre-emptively avoid complaints by the elderly that the speed limit is too high.
That was regarding suburban road planning though, not sure if the same applies to freeways and other major arteries.
It's not the same. The police officer is a moving speed limit restriction so cars driving at the normal speed will catch up to it and slow down and they will continue banking up for as long as the car is driving slower than normal.
You wouldn't get that banking up with everybody following the limit.
It becomes harder to change though because if the culture is to go x over the speed limit, people will probably still go x over the new higher speed limit.
This is ironic because next time there is a bike article, you get all the professional drivers in this thread commenting "but they never stop at stop signs!".
Can't have it both ways. The statistical evidence is clear: the average US driver is tremendously unsafe, untrained, unobservant and unskilled despite their country being built around driving everywhere. Their remonstrations on all the rule breaking they can perform safely stems purely from ignorance of their own inabilities.
in my country (the netherlands) i have seen people stop at stop signs at times when there is very, very little traffic (04:00 at night).
The point is also much more about creating a habit in which this kind of behaviour is just done, regardless of the state of the traffic on the road.
The law says you must stop for a stop sign, stop signs are placed in places in which sudden traffic participants could enter your field of vision at a time in which it is too late to react properly.
Also, people get fined for ignoring stop signs, even if no one is present.
Driving education in the netherlands is quite strict and so are punishments for drivers.
For instance, the driver of a car is always at fault in an accident with a "weak" traffic participant (foot/bike traffic), even if technically they weren't at fault. (There is a process to fight this in court if you believe ill intent/fraud is at play, although it is rarely used.)
The reasoning being that the driver of a car has had a driver's education and can thus act responsibly while driving a hunk of metal down the road at lethal speeds.
> in my country (the netherlands) i have seen people stop at stop signs at times when there is very, very little traffic (04:00 at night).
This is also normal in the US, despite what a few commenters on HN may have you believe. Hell, if anything, regular people out and about at 4am are better about following the basic laws, because you stand out a lot more when you flout them, and in the wee hours of the night the proportion of drunk drivers is far higher and so cops are looking for it.
i wonder if it’s a city vs suburbs thing. when i lived in the ‘burbs, doing highway driving, i got super accustomed to spotting the next red light and then coasting to a lower speed, hoping it’d turn green before i got there so that my average speed through the light was actually higher. city driving, you accelerate fast and brake fast. once you’re accustomed to braking with force and braking often, a hard stop at a four-way stop becomes a whole lot more natural than the alternative.
i don’t mean to overgeneralize. i expect there is at least some small correlation between density and how much you brake at a four-way stop, and i’m curious what the other correlates are and how dominant full-stops are.
We humans often roll a stop sign (hopefully at a lower speed than 5.6mph) because we are impatient and because we have quite a lot of faith in our general intelligence and situational awareness to be able to make a reasonable call in these situations.
Still we often get this wrong and somebody gets hurt.
A computer is not cursed with our human impatience so why program it in?
But more importantly the computer has none of the general intelligence and situational awareness that a dim human has, and they won’t for a long time.
Have you ever been a pedestrian? The fact that most drivers don't give a shit about stopping for crosswalks isn't a reason for Tesla to say "fuck it let's go".
I wouldn't say outrageous, but it's pretty audacious to actually program in law-breaking behavior. I would imagine this would instantly expose the company to liability?
Maybe my part of LA/SoCal is weird, but almost all of the people at the nearest stop signs stop at the signs, even when no other cars are there and no pedestrians are near the intersection.
In fact, the most annoying thing is that they will stop for too long despite the lack of cross-traffic or pedestrians traversing the intersection (in any direction).
Rolling stops are how a fair number of folks drive, especially when there is no traffic on a cross street. I was in a small down growing up, and you'd slow to maybe 1-3 MPH, then turn.
I know everyone on HN gets outraged by these decisions with FSD (and regulators do too). One thing I think this exposes is actually human driving. This is in fact common - the system is reflecting things back to us.
I was on the road recently and the number of people on their phones is mind-boggling. They will literally sit at a green because they are so checked out; they will be texting while merging. It's madness.
Rolling stops were only enabled in the "Aggressive" FSD Beta driving profile, one of three available profiles. Several FSD Beta testers used the Aggressive profile in areas where rolling stops are common. They did this because when using the other profiles, they would be embarrassed or honked at because their car insisted on always coming to a full stop.
Anyone familiar with driving in Los Angeles knows that when it's safe to do so (e.g., when there is no cross traffic on a turn), a rolling stop is extremely common. So common that it is plausible that people behind you might honk if you insist on doing a full stop and then cautiously proceed. Especially during rush-hour.
By the same token, most people in Los Angeles drive 5 to 10 MPH over the posted speed limits. Driving at the posted speed limit will cause other drivers to hate you.
I truly don't understand this line of reasoning. "Oh no, I've been honked at! Someone in a car, who I'll never meet, doesn't like me! I guess I'd better start driving a bit more unsafely so that people don't honk at me!". I'll ignore someone honking at me if it means I'm driving more safely.
And I've driven in LA plenty of times. It might be the norm for LA drivers, sure, but I would also argue that that whole region could use at least a bit more patience when on the road.
The speed limit differential gets into a bit more of a grey area, though, in regards to laws that require you to maintain a speed that's in line "with the flow of traffic". It's similar in Atlanta; posted speed limit might be 55, but you're likely going to be going at least 70 in the slow lane in order to maintain the flow.
Last time I was there I had someone get out of their vehicle and get angry as fuck at me - screaming, calling me a piece of shit, mother fucker, etc. - because they turned into oncoming traffic (my lane) to get around a garbage truck, found me there driving in my right of way, and demanded I back up and make room for them. Everyone else on the road in LA seem generally grumpy and impatient.
Granted, I usually drive pretty fast and am aware of my tendency to be impatient behind the wheel, but it seems, anecdotally at least, like LA is worse than many other areas in that regard. The first 20 minutes driving around after I leave LAX are usually spent thinking, "I'm crazy, but these people put me to goddamn shame and I need to adapt quickly," lol.
There are some truly idiotic drivers in California. On the one hand you have people in their beamers who go as fast as possible whenever they have an open lane, that might mean 50mph on residential streets and over 100 on the highway. Then you have the other end of the spectrum, people who think the speed limit is a limit, not a minimum, and drive like 40mph on the freeway with no one in front of them in one of the middle lanes. People weave aggressively left and right to get around them and it causes accidents.
All of this is the direct result of little enforcement of traffic rules. I've never seen someone pulled over in California unless there's been an accident. I've never seen a cop with a radar gun. I've actually seen the LAPD speed past me and hang out in the left lane with no lights on, because that's just how the standard of driving is in LA apparently. Try that in the midwest and you will be pulled over for going too fast, for going too slow, for having a 10-foot-tall stack of scrap metal in your truck, for spending too much time in the left lane, and for failing to signal. The highways feel truly lawless in California.
One time, a CHP car turned its lights off and deliberately hung out in my blind spot for a few miles on the 405. I can only assume they were waiting for me to do something they could use as an excuse to pull me over.
> I truly don't understand this line of reasoning. "Oh no, I've been honked at! Someone in a car, who I'll never meet, doesn't like me! I guess I'd better start driving a bit more unsafely so that people don't honk at me!". I'll ignore someone honking at me if it means I'm driving more safely.
Sometimes a "bit" of safety isn't worth the tradeoff in throughput. But that also assumes it's safer to stop completely. It's very possible that smoother traffic flow is safer. Or maybe there's more risk that you get rear-ended than is caused by a rolling stop.
Then let's get the road designed that way. I'm onboard with the argument you're making about flow, but - and perhaps this is just me - traffic is safer when it's predictable.
Let's say you have a stop sign in your neighborhood that "everyone" rolling-stops through. Sure, great, let's look into changing that stop into maybe a yield, or a roundabout, or whathaveyou that safely allows for the flow that that street necessitates. But in the meantime? Stop at the stop sign! Yeah, sure, we can make the argument that the rolling stop is now "predictable" to everyone in the neighborhood, but what happens when you get an out-of-towner in front of you on your morning commute and you slack off on watching for them to stop, resulting in you rear-ending them?
Improving the flow of traffic is great, but ignoring rules for the sake of flow adds additional layers of unpredictability that ultimately result in roads that are less safe for drivers/pedestrians, at least IMO.
From my experience honking is exclusively used for frustration over driving style, with vehicular faults more often reported by less aggressive means like flashing brights or shouting the info from an opened window.
My suggestion would be for people to stop honking out of impatience. If there's an actual issue, honk. Someone not moving as quickly or dangerously as you'd like isn't an excuse to honk.
I really only honk at people for screwing up. If they cut me off I will wail on the horn for a long time, just holding it, letting them know. I love embarrassing asshole drivers like this, especially if they have some passenger. Whenever cars honk at me for using the crosswalk I actually enjoy it, because then I just stand in front of their car for the rest of the light cycle absolutely cussing them out and embarrassing them while they can do nothing because the walk sign is on. I love pointing to the walk sign and saying "What does that sign say? Drive? Does that stick figure walking with a countdown say Drive to you?" as loud as possible. Other pedestrians have even fist-bumped me. It is so cathartic!
> If they cut me off I will wail on the horn for a long time, just holding it, letting them know. I love embarrassing asshole drivers like this...
I can say confidently that I've never looked at any other driver doing this and thought anything but, "Jesus Christ, calm the fuck down." I've been that guy doing the honking as well, and I'm fully aware that nobody around me gives a shit and they just find me annoying. Hell, half the time most of them don't even know what happened; they just look over and see some overly-angry dude screaming at someone.
>... especially if they have some passenger.
Did you really embarrass them in front of the passenger, or yourself? What if the transgression you're honking at was a genuine mistake with a reason that actually makes sense? I've been the passenger in situations like this before and have found myself thinking, "Yeah, oops, but I don't totally fault my driver or the other driver for this and the chud doing the honking needs to chill out because clearly they don't understand the driver's perspective."
I don't think cutting me off without a turn signal has any logic other than "me me me to the red light first!" If I can get them to have a reaction like "jesus Christ, calm the fuck down" then I am a happy camper. Again there are honest mistakes, and clearly asshole illegal behavior that drivers do like speeding like crazy, cutting people off, weaving through the freeway like a snake, running reds, cutting into lanes and blocking traffic because you must get to the turn lane at the last possible moment before there is a barrier, and willfully ignoring signage like no turns on reds. I'm also not embarrassed about calling out drivers as a pedestrian. Why would I be? I do it when have right of way and they don't and they had the audacity to honk at me for walking when the walk sign is lit. It's fun to call out assholes. I love shouting at them, so satisfying.
If they hit me at speed there's nothing I could do whether I argued or not, I'd be laid out before I knew what happened. If they stopped, then hit me, that wouldn't be a bad collision to take, they'd only get up to a few mph in the few feet in front of me even if they hammered the throttle. I'd slide off the hood and get their plates and then they'd be in jail and I would walk into the courtroom with a neck brace and get my payout.
I'd be getting that anyway when I get old, or I could be arthritic and also riding off the dividends of my massive settlement I had in my youth which I promptly would throw in the market.
Go to some town with less than 100k population at least 50 miles away from a metro area and you'll see this sort of ideal world: everyone goes the speed limit or 1-5 MPH below it, honking is only if you haven't moved for at least 5 seconds after the light turns green, nobody's in a hurry at all.
> Anyone familiar with driving in Los Angeles knows that when it's safe to do so (e.g., when there is no cross traffic on a turn), a rolling stop is extremely common. So common that it is plausible that people behind you might honk if you insist on doing a full stop and then cautiously proceed. Especially during rush-hour.
Anyone who drives in LA during rush hour knows that it is never safe to do a rolling stop, because during rush hour there is going to be cross traffic and you need to check that there is cross-traffic at the intersection before turning.
I've witnessed a lot of accidents where some moron thought that saving 3 seconds was more important than safety, and rolled right in front a car that had the right-of-way, or into a pedestrian in the crosswalk that the driver hadn't seen.
Right, but the promise of self-driving vehicles is about them being safer than humans. Humans are irrational and impulsive. Those actions disrupt the flow of traffic. The idea is that a self-driving vehicle can be safer than a human because it will do things like follow speed limits, always come to a complete stop, etc.
If self-driving vehicles are simply going to operate like facsimiles of human driving behavior their benefits are greatly diminished.
I don’t see anything wrong with copying human driving behavior, minus the risks of drunk driving, texting while driving, falling asleep at the wheel, or having a heart attack or stroke at the wheel. The average truck driver appears to be morbidly obese and gets little to no physical exercise, yet they drive enormous trucks that can do insane amounts of damage if they suffer a heart attack or stroke. Personally, I can’t wait for self-driving vehicles to come online.
EDIT: Not really sure why I am getting downvoted for this, you can search google and get maybe hundreds if not thousands of results about truck/bus drivers and heart attacks/strokes causing deadly crashes, but they get maybe 1% of the media attention of an FSD video rolling through a stop sign
https://fox4kc.com/news/family-says-driver-of-tractor-traile...
Seriously! I was going to make the same observation about truck drivers. It takes so little to exercise effectively. Why not pull over your truck for 30 minutes twice a day and do some basic calisthenics/bodyweight exercises? Who's going to notice?
I think as long as autonomous drive systems are intermingling on regular roads with human drivers, they need to behave like human drivers. Human drivers will get angry with autonomous systems, especially during rush hour, if they adhere strictly to all laws such as full stops and speed limits.
As others have pointed out, this may force us to contend with the laws first since it's clear local enforcement and laws are not aligned.
When a majority of driving is autonomous, laws will likely evolve slightly. If/when all driving is autonomous, then things get really interesting: short following distance, peer-to-peer intersections without stops, etc.
Humans get outraged all the time. If you drive exactly at the speed limit, they get angry, if you drive 5km/h over they get angry that you don't drive 10km/h over it, others get angry that you are tailgating them when they are driving exactly as fast as allowed and will brake check you --- I really think autonomous cars should just mechanically stick to the rules, then they are at least predictable.
To be fair, most self-driving cars don't have the blind spots that make rolling stops unsafe for pedestrians. So they can be much safer than human drivers while still doing rolling stops in this manner.
How and why does stopping completely at a stop sign, with good visibility and no present pedestrians or cross traffic, increase safety? Will wait for an answer.
The human behavior here (slow rolling stop signs) at least in theory can prevent being rear-ended by someone who is used to everyone else in the area doing that (of course it's the rear-ender's fault in a crash... but still an annoyance for everyone involved)
There's no benefit to the computer swerving in and out of the lane like a drunk driver, or taking an intentionally inefficient route like a crooked taxi driver, or other pure-downside criminality.
The human behavior can also result in an accident when a car or pedestrian that expects the car to stop walks out in front of the car.
In theory, the car should be paying attention in all directions at the same time and won't get into an accident, but car sensors are not infallible, especially when detecting pedestrians, moreso if they are partially obscured by foliage, newspaper stands, etc.
Automated cars should stop 100% of the time at intersections where they are required to do so. When the stop signs are changed to "Self-driving yield" signs, then the cars can do a rolling stop.
So my point wasn't that cars should act like drunk drivers, but that modeling human behaviors is not the right mindset for self-driving cars.
today i approached a 4 way stop. the car to the left of me arrived later, as I was coming to a complete stop. well the driver to the left used a rolling stop and continued through the stop line. then he had the audacity to beep at me as he cut me off.
The issue there is not so much the rolling stop as not respecting right of way.
Personally I think almost all stop signs should be replaced with roundabouts as it better represents how people want to drive and allows for rolling stops but that’s just me
We have a heavily trafficked roundabout in our neighborhood and it's a big hazard to pedestrians because drivers take it way too fast and fail to yield to pedestrians in the crosswalks around it. Aside from that, the roundabout is great but takes up a lot more space than an intersection would. Maybe some bumps to calm the traffic approaching the roundabout would help.
The problem that I see is just insufficient respect for traffic laws. I'm not sure what to do about it, aside from more police presence.
I do like the Australian perspective, where I learned to drive before I moved to the US.
There's no concept of "right of way" on Australian roads. There's only "duty to yield". It might seem like a subtle concept/difference, but when it comes time to stand up in court/ be pulled over/ avoid an accident, it frames things better.
Too many in the US are "it's my road, asshole, I got the right of way".
A stop sign at the end of a neighborhood road at the intersection of a main thorough street makes sense.
But four-way stop signs absolutely should not be a thing. Anywhere a 4-way stop exists should certainly be a roundabout.
The only problem is that roundabouts in the USA are exceedingly rare in most of the country. There would need to be a massive education campaign for them to work for most people. There are an alarming number of Americans who think that the people already on the roundabout have to yield to the people coming on.
It depends on the location. There are plenty of city intersections where 4-way stops are completely appropriate, and a roundabout would be impossible because it would have a much bigger footprint (and would block trucks and emergency vehicles).
Some places use virtual roundabouts, where the roundabout part is simply a small painted circle on the road. Emergency vehicles can drive straight over it.
There are also formed roundabouts where most of it is raised, but only by a small amount, so that emergency vehicles with large tires can drive straight over it while still discouraging light vehicles from doing the same.
If you lack the situational awareness to recognise a police car in the vicinity, it's probably better to actually come to a stop. I suspect that's part of why the law works well enough.
I was going to argue with you that rolling stops are dangerous, but I ran into this interesting article which recommends replacing stop signs with more informative indicators:
If a police officer sees you do a donut in the middle of an intersection where I live (also Southern California), you can just flee home and do it again tomorrow night. Traffic enforcement varies strikingly in California, it seems.
Is it, or do we simply have more miles driven per capita? We are a big country, not Monaco, where you could walk anywhere else in Monaco to join a friend for breakfast.
Looks to me like the US is still on the highish side for deaths per vehicle-mile compared to the EU and Canada, although not that high by worldwide standards.
The Tesla crept into the parked cars to see around the corner, and the driver applied the brakes. I don't see any exceptional behavior here, other than the driver using the touchscreen while driving.
That's exactly the problem -- the driver had to hit the brakes because his car had started to pull into the path of a fast-moving UPS truck that his Tesla doesn't pick up until the last second. The "360º view" scanned "200 times per second" and acted upon with "superhuman latency" still randomly decides to pull out in front of traffic. Either the cameras are buggy or the software is, but we can do better.
It is a right turn onto a 2-lane road, where the right lane is protected by parked cars.
The Tesla pulls into the PROTECTED lane to see around the parked cars.
You are assuming that it would have then decided to pull into traffic, but that belief is pretty unfounded. When FSD cannot see well enough to decide that a turn is safe, it will literally sit there for minutes. Yes, I am quite certain it would have done that in the protected parking lane.
I'm not assuming anything - what I told you happened is literally how the driver -- who likes Tesla and has owned three others before this one -- reported it.
"If I didn’t slam the brakes, we would’ve been taken out by the UPS truck."
But sure, go off about how certain you are when there's video evidence proving you wrong...
Setting aside legality, it's hard to believe that Tesla's FSD has the ability to successfully evaluate its current context (weather, time of day, visibility, vehicle "body language") and decide when a rolling stop is safe and when it's not. Not that humans always do this well either, but resorting to rolling stops always, while ignoring the circumstances, seems like a bad move.
In the Assertive profile, it can do rolling stops, but only under circumstances where it is very safe, such as four-way stops with no other driving vehicles present.
I have FSD Beta as well. Even in Assertive mode, I've never actually seen it do a rolling stop, including rural areas with nothing around. I'm not saying it doesn't happen. In fact, I'd prefer it didn't--or at least provide an option to toggle it on and off.
That Tesla headline is misleading. Should be “Tesla Removes Rolling Stops from Assertive Driving Profile”. Tesla added rolling stops to make the car behave more human like. NHTSA said no, so Tesla removed the feature.
It’s pretty accurate even if you’re a fan of the company. The more accurate description is that they shipped a feature which breaks traffic laws, which is a serious error in a company asking us to trust their judgement in a safety-critical system. Given how many other problems they’ve had and the consistent overselling of their capabilities and safety, that’s an important conversation to have.
> they shipped a feature which breaks traffic laws, which is a serious error in a company asking us to trust their judgement in a safety-critical system
You're not wrong per se, but there is significantly more nuance with self-driving technologies than you're suggesting.
A more famous example in the self-driving car world is the Pittsburgh left [0], where in Pittsburgh a driver turning left will often get conventional precedence over vehicles going straight through the intersection despite there being no explicit left-turn light. This move is technically illegal, but when the self-driving cars didn't do it, they regularly brought traffic to a halt and held up every intersection where the self-driving car was turning left. Eventually this had to be added to the software.
Examples like these are why self-driving technologies are so hard to get 100% right, driving is a mix of intuition and rules. A significant amount of driving is doing what other drivers expect you to do. If the cars don't behave like human drivers expect, it often causes more problems than doing what the car is "supposed" to do.
That said, as a pedestrian who has frequently almost been hit by people rolling stop signs, I'm with the NHTSA here...
I remember back in Driver's Ed I was alarmed to discover that through sloppy definitions Colorado legislators had managed to make it illegal to take a right turn within 150ft of a stop sign (or similar, I forget the details). Of course, in reality nobody follows the sloppily defined portion of the rule, nobody enforces the sloppily defined rule, and it isn't a problem -- but the gap between rules and realities is substantial, self-driving cars are going to tease apart this gap, and spiders will come crawling out.
> Examples like these are why self-driving technologies are so hard to get 100% right, modern driving is a mix of intuition and rules.
Indeed! The hardest part of switching to driverless cars overall is that driverless vehicles have to exist on the road for some number of years surrounded by ones driven by primates, with primate reflexes and a variety of ad-hoc behaviors.
If every car on the road were to go driverless at midnight tonight, would the number of accidents and death and disability plummet compared to yesterday's stats?
There are better solutions to this problem than coding cars to behave like bad drivers: roundabouts, banning left turns during peak hours, adding a dedicated left-turn signal at the beginning of the cycle. The Pittsburgh left is a terrible convention. It puts pedestrians at risk (assuming the crosswalk signal follows the traffic signal), and nobody from outside Pittsburgh knows it's a thing; if I were visiting Pittsburgh, I'd run into the car turning in front of me (well, hopefully not, but it's a risk).
Rolling stops, while all too common, are not at all like a Pittsburgh left. Police everywhere will pull you over for rolling through a stop sign. Police in Pittsburgh probably won't, unless they have something against you personally.
It’s not serious at all imho. People roll stop signs all the time, it’s part of driving culture. If you feel obligated to obey the letter of every driving law you could have turned it off in settings.
Tesla also has an option to override the speed limit, you want to remove that too?
Unfortunately, self driving cars drive in the real world. There are a great many roadways in the US where driving the speed limit is legitimately dangerous in one direction or another.
I live in New York and there is one such roadway I drive often. Palisades Interstate Parkway. The speed limit is 55 and there are no trucks allowed, but if you are driving 55 on this road you are in danger. You will get run off the road by everyone else traveling at a minimum of 65-70 with many of them 80+.
There may come a day where humans are completely out of the equation, but until that time I believe self driving cars are safer if they drive more like human drivers. That means keeping up with traffic and other human quirks.
Agreed, and these roadways are all over the country. It's a pretty widely known fact that the safest speed is the natural flow of traffic, but municipalities all over the country are much more inclined to listen to the vocal minority over traffic engineers. Not a lot of "trust the science" going on there.
The solution to that is not that everybody gets to ignore speed limits and point to a QZ article as justification. The solution is not that Tesla unilaterally decides that 10% more is fine, everywhere, all of the time. The solution is what the advocate in your article proposes - make a conscious decision to raise the speed limit on certain roads, backed with suitable data.
That’s a lot of snark for such an obtuse comment. Obviously we should fix our laws, but in the meantime we shouldn’t imperil ourselves and everyone around us just to pacify the people who lobby their municipal governments for unsafe speed limits.
Wait, so you're saying that people would crash into you if you drove the speed limit?
I'm a very calm driver and regularly drive at or sometimes below the speed limit if visibility or other factors don't allow higher speeds. People do slow down and I never felt in danger - granted, this is usually at around 40 km/h instead of the limit of 50 km/h, but I can't imagine people are so careless they'd "run you off the road" if you weren't speeding.
>Wait, so you're saying that people would crash into you if you drove the speed limit?
Maybe not literally crash into you, but that is certainly possible. People will swerve around you, ride your bumper, flash their high beams at you, honk, pull in front of you and hit the brakes, and other dangerous road rage type behavior. It is absolutely unsafe to drive the speed limit.
In general the safest thing to do is just to keep up with traffic.
> People will swerve around you, ride your bumper, flash their high beams at you, honk, pull in front of you and hit the brakes, and other dangerous road rage type behavior.
I don't think the one being unsafe in this scenario is the one driving the speed limit!
That’s exactly what the driver in front of you and behind you use as justification for being above the limit: I had to go with the flow of traffic. A self-perpetuating force that forces everyone to be too fast.
Disagree. Traffic engineers study roadways and recommend speed limits based on safety and human behavior. And then cities and states ignore them and use speed limits to generate revenue.
>A self-perpetuating force that forces everyone to be too fast.
If it were actually too fast, most people wouldn't travel that speed. Have you ever noticed that traffic speed ebbs and flows with the road conditions or time of day? People will drive the speed they feel safe driving, and for most drivers it's (typically) much faster than the posted speed limit. And if the conditions are poor, it's much slower.
> Traffic engineers study roadways and recommend speed limits based on safety and human behavior.
This isn't really true: what usually happens is that they either go with a default or they do a study and set the limits at the 85th percentile. For a separated highway, that works fairly well but it often has bad results for mixed spaces: the people commuting through a neighborhood, for example, are trying to go as fast as possible but the people who live there are more concerned about safety, the impacts of those decisions on how they use their space (I grew up hearing that “nobody walks in California” which really meant “nobody wants to walk 3 miles further to use the few signaled crosswalks”), etc. A big problem here are the outliers: most of the risk comes from the top of the speed distribution — even if half of the drivers scrupulously follow the speed limit, the speeders are the ones who will influence people's safety perception of the road.
> And then cities and states ignore them and use speed limits to generate revenue.
I think for highways and other limited access roads they're often concerned with minimizing complaints much like the commentary we're seeing up and down these comments so they slap a small number on the sign knowing full well that traffic will ignore it and call it job well done.
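The 85th-percentile method mentioned above can be sketched in a few lines. The speed readings below are made-up illustrative data, not a real study:

```python
import statistics

# Hypothetical free-flow speeds (mph) observed in a speed study.
observed_speeds = [28, 31, 33, 34, 35, 35, 36, 37, 38, 38,
                   39, 40, 41, 42, 44, 45, 47, 50, 53, 58]

# The 85th-percentile rule sets the limit near the speed that
# 85% of drivers do not exceed.
p85 = statistics.quantiles(observed_speeds, n=100)[84]

# Posted limits are typically rounded to the nearest 5 mph.
suggested_limit = 5 * round(p85 / 5)
print(suggested_limit)
```

Note the outlier problem the comment raises: the handful of 50+ mph drivers barely move the 85th percentile, yet they dominate how unsafe the road feels to everyone else.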
When you're driving significantly below the speed limit (or, more accurately, below the average speed of the other drivers), it becomes impossible to maintain proper distance behind you. Add to that the fact that people randomly drive in the left/middle lanes regardless of actually passing anyone on their right, and you're forcing people to brake, make dangerous merges, etc.
I know nothing about your driving, but I find (anecdotally) that drivers who drive significantly slower than traffic are also ones who never look in their rear view mirrors.
The human driver is the one that told the car to disobey the stop sign[0]. Should every cruise control on every car with speed limit detection forbid the cruise/adaptive cruise to go above the speed limit?
I have a car with adaptive cruise control and speed limit detection, and this would absolutely drive me nuts. Not for any philosophical reasons, but because speed limit detection is fallible. On the expressway my apartment is a block away from, which has a speed limit of 50 mph, my car fairly frequently tells me the speed limit is actually 30. It's fairly common for it to read speed limit signs that have conditions on them -- only in effect certain hours, or during school hours, or when light is flashing, or if you're driving a truck -- and incorrectly assume that's the speed limit.
Maybe you'd be perfectly happy with "if you don't want the cruise control to make mistakes, just never use it, because gosh darn it, that's better than allowing people to set the adaptive cruise control five miles an hour over the speed limit like they've been able to do with non-adaptive cruise control since it was a thing." I would not, and I would argue I am not the one taking an unreasonable stance.
Speed limits change, signs are obstructed, some speed limits are only valid for certain times of day. There’s plenty of legal reasons to override the posted speed limit.
"Whether you're willing to hold computer drivers to higher standards than human drivers" is moot? Or you just don't feel obligated to reconcile the inconsistency?
I am willing to hold a computer to a higher standard, that’s the promise that gets made left and right for self driving cars. That they achieve a higher standard.
What if it meets the higher standard of “being more safe” but not the higher standard of “obey current traffic laws that aren’t enforced against humans”?
In the end, it is my choice whether I stick to the law. I'm fine with instituted consequences. I'm not fine with having the option to break a law removed.
Some laws are stupid. Some are unjust. Some are racist.
Discretion is valuable, probably even essential, for society.
Spread out over millions of drivers, it can cost many lifetimes' worth of time lost to wasteful and inefficient practices, more damage to transmissions and driveshafts from stop-and-go, and increased environment-destroying pollution.
People are also killed or injured by drivers all of the time, too. Remember when the selling point of AVs was that they'd reduce the ~40k / 300k Americans so impacted annually? Telling manufacturers that it's okay to ignore laws if it gets you there faster will have the opposite effect.
Safety is measured in accidents per mile driven, lower being better. While there may be a loose correlation, strict obedience to the law is not in itself a measure of safety.
That metric is misleading since it’s skewed by highway driving where people cover lots of mileage with far fewer points of contention. A better metric would be something like operating time but even there you’d probably want more nuance since different environments have different safety characteristics (someone stuck in traffic on the freeway just doesn’t have as many things to get right as a driver on a city street even if they’re both averaging 5mph).
The law actually does have a fairly strong correlation with safety — this is the basis of the claims that something over 90% of collisions are due to driver error – but there is a growing recognition that it’s not enough, and design changes to streets and cars are needed to reduce the number of times where obedience to the law is the only thing protecting someone else.
Yes, and not only that, Elon Musk should be in prison for vehicular homicide. If a human chooses to speed and kills someone as a result, they are going to jail.
Tesla has willfully chosen to violate traffic laws resulting in numerous deaths. Where is the accountability?
I'd normally disagree since most of these systems ask you to keep paying attention or keep your hands on the wheel and they make it clear that this is assistive technology and not a replacement for an alert, licensed, driver being ultimately in control. But Elon's statements, and calling it "fully self driving", is really asking for a lawsuit or legal trouble. It's incredibly irresponsible and reckless, and also is the one big negative on an otherwise pretty awesome car.
I really wish we could divorce all the self-driving/big brother electronic stuff from electric cars, and make available a dumb electric car.
The idea that a driver can anticipate when a self-driving car is going to fuck up and do anything to prevent it is bullshit and the charlatans pushing these cars know it.
These cars need to be taken off the roads until they can safely drive unattended.
> The idea that a driver can anticipate when a self-driving car is going to fuck up and do anything to prevent it is bullshit and the charlatans pushing these cars know it.
There is no factual basis for this claim. Cruise control and adaptive cruise control have been in use for a long time, require this type of attention, and are safe to use.
> It’s not serious at all imho. People roll stop signs all the time, it’s part of driving culture
This is a terrible line of reasoning.
If Tesla ever releases their vaporware humanoid bot, would you expect them to program it to rape women in cultures where rape is part of the culture?
Programming a several-ton machine to disregard laws and potentially kill many pedestrians without warning isn't really comparable to programming a machine to rape -- it's unfathomably worse.
I've worked in factories and the idea that a car company can release a machine onto the streets that wouldn't be allowed anywhere near a factory floor is flabbergasting.
If a company tried to release a product that had this safety profile in a factory setting the governments and unions would be all over them.
The fact that some jackass yokel or senile old lady rolls through stop signs daily doesn't justify Tesla releasing a product that does the same.
Equating rolling stops to rape has to be the worst argument against FSD I’ve ever heard. I’m honestly at a loss for words. Safely proceeding through an intersection without coming to a complete stop hurts nothing, except maybe the feelings of traffic law puritans.
Sometimes it is more dangerous to follow traffic laws exactly than to drive like a normal human driver when you are surrounded by normal human drivers. It's not as simple as you are making it seem.
Multiply the extra few seconds by however many millions of people end up using automated driving, times however many stop signs they end up at. The number of lifetimes lost to stopping at stop signs has got to be somewhere in the hundreds. The real question is why anyone bothers stopping at all if all directions are clear.
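Spelled out, the back-of-envelope arithmetic the comment above is gesturing at looks like this. Every input here is an assumption for illustration, not measured data:

```python
# All of these inputs are hypothetical, chosen only to show the shape
# of the "aggregate seconds vs. lifetimes" argument.
seconds_saved_per_stop = 3            # rolling vs. full stop
stops_per_driver_per_day = 10
drivers = 50_000_000                  # cars using automation, hypothetically
days_per_year = 365

seconds_per_year = (seconds_saved_per_stop * stops_per_driver_per_day
                    * drivers * days_per_year)

# A ~75-year lifetime expressed in seconds.
seconds_per_lifetime = 75 * 365.25 * 24 * 3600

lifetimes_per_year = seconds_per_year / seconds_per_lifetime
print(round(lifetimes_per_year))
```

Under those assumptions the total comes out in the low hundreds of lifetime-equivalents per year, which is the "circa-hundreds" figure claimed; the replies below dispute whether aggregating time that way is a meaningful comparison at all.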
This is a fallacy popular with bad drivers since it lets them rationalize their decision to put other people at risk. It's designed to shift the focus away from agency — note how the people most at risk aren't given a decision? — and it relies on an assumption which becomes monstrous as soon as you think about it. Trying to phrase it as a math problem with the implication that time is fungible makes it sound like a minor optimization but you really want to consider just how uneven the impact is: you're saving an amount of time so small you'll barely even remember it but the person you hit may be losing the rest of their life or spending the remainder of it significantly degraded. Around here, a lot of the people getting hit are kids so it's an especially unfavorable cost in years of quality life.
The other unquestioned assumption is that this is even a serious time savings. In my experience, the average driver _massively_ overestimates how much time aggressive driving actually saves them. In most cases, all it means is that they're spending more time at the next backup — I've had people weaving around me for literally hours on the interstate without appearing to notice that they were doing a LOT more work failing to improve their relative position at all, and in the city I routinely see the same driver “passing” me on every block.
Then, of course, if we're talking about lost time, how many years of quality life do people lose because they're deterred from healthier transportation options? It would take rather an awful lot of stop signs to cancel out the savings from even 5% of people switching from driving to bicycling, walking, or transit.
> The real question is why anyone bothers stopping at all if all directions are clear.
Because the most common excuse for hitting someone is “I didn't see them” or “They jumped in front of me!” (which in the vast majority of cases really means that the driver's attention was directed somewhere else). It's not like traffic engineers don't know yield signs exist: they use them because they're intentionally trying to prevent drivers from staying at a high speed due to the risk to pedestrians and cross traffic. Most people who live in neighborhoods which are popular with commuters have to spend years begging to get a stop sign, often requiring a serious collision to force the local DOT to act.
It's a popular fallacy to consider the lives lost you see but not the lifetimes lost from wasteful practices. "Think of the children" is a nice trope to strut around to scared soccer moms during elections, though.
You can't make the presupposition that a rolling 'stop' is always more dangerous than a full stop, either.
> You can't make the presupposition that a rolling 'stop' is always more dangerous than a full stop, either.
Actually, you can. Drivers who roll through a stop are far more likely not to notice pedestrians, bicyclists, or often even other drivers. They’re also unpredictable since everyone else has to figure out what they’re doing and react accordingly. I see the chaos on a daily basis where everyone else has to adjust their timing – and the selfish driver almost inevitably ends up no more than one car length ahead at the next back-up.
In contrast, a legal stop is easy to understand - it works just like you learned in driver’s Ed and since you can’t read another driver’s mind, you have to plan on stopping anyway.
Thankfully I design radar detectors for a living so that people like you who inevitably support insane and irrational laws, on the entirely convincing proof of just 'actually you can' and 'think of the children', have minimized impact on people like me. My whole life is making sure people such as yourself and the thieves from government cause me minimal legal hassle.
So enjoy your opinion, I'll keep working to make sure your opinion has the least influence possible.
No prob. I haven't. Also I make more money the more traffic laws, enforcement, and cameras that come out (it drives consumers to seek ways to become educated about police and camera presence), so really it's in my interest to support you ironically.
I’ve known a few people with the same attitude. All but one of them was right up until the time they weren’t, and that changed a stranger’s life forever.
The 'attitude' I'm intending to express is that safety and law are not interchangeable. Sometimes they are in harmony, sometimes they are at odds, and sometimes they're simply completely separate concepts. This can vary from minute to minute, place to place, from one environmental condition to another and even from one driver to another. If how often someone gets in an accident is a guide as you're implying, well then I'm below the national average considering I'm halfway dead, I've been driving more years than not, and I'm definitely below the average person's accident rate of one per 18 years (sitting at zero).
Anecdotally, as a bicyclist it was infinitely easier to predict rolling-stop drivers than those who stop. People who have stopped are rarely intelligent enough to follow the order of who's next, and would constantly choose randomly when to start up (sometimes just in time to almost hit me). The one rolling through I already know is going, and I know the time they're going is now, so there's little risk of accident. Biking isn't a game of whether drivers want to kill you (they all do) but rather _when_ they're going to attempt it.
Even this reply is framing it like it sounds ridiculous. I'm not sure this stop-sign incident is the best example, but look into all the reports of Waymo vehicles causing road rage due to their "safety-first" driving.
See also regular driving. The solution for road rage is taking away bad drivers’ licenses, not saying everyone should drive like them. If the police refuse to do their jobs, that problem isn’t one Waymo can solve.
again, you are removing all nuance. It is not an issue of "bad drivers", it is an issue of local driving norms. If every single driver is doing something, it is not an issue of removing bad drivers.
If you are driving under 70 mph on the 101 at 6:00 am, you are endangering yourself and those around you.
This thread is about stop signs. That’s a common way for pedestrians and bicyclists to get hit when drivers ignore a safety mechanism because it’s mixing transportation modes, unlike someone going modestly over the speed limit on a controlled highway where everyone else is also in a protective steel cage with various safety mechanisms.
Ahhhhh. I've struggled with this all my life, where the rules say one thing, but you're "just supposed to know" those cases where they aren't really serious about that.
Now the contradiction is so painful, they'll have to address it. Either a) update the rules to reflect actual practice, or b) admit to "yes, your self-driving car has to obey traffic laws we don't enforce on humans".
The idea that you’d add a feature that’s clearly illegal blows my mind. “Product poses legal risk to user” is a specific risk severity rating that mechanical engineers are trained to flag in school. Any other engineering company would have caught this in their DFMEA (Design Failure Modes and Effects Analysis). These sort of sloppy failures show that Tesla is far from a path to competing with major car companies on reliability.
This is why I've been a fully-self-driving skeptic over the last few years (initially I was not): our driving system is inherently broken.
Full rule-following will create unsafe situations merely because humans expect a certain amount of rule-breaking. The person "at fault" will be the human, but politically that will be tough when humans are mad at the machine for "acting weird by following the rules." I fear that our inherent contradictions around rule-following on the road will make it impossible for ML, both to understand how to behave and to behave predictably to humans.
This is why I'm much more bullish on self-driving-similar vehicles, which look and behave differently, like how buses and trolleys behave differently in the road and we have different expectations from them.
I think the US has a particular problem here -- lots of bad traffic laws, and lots of bad drivers. IIRC road accident rates are relatively high in the US (some other countries are as high or higher, but not many) (edit to add: I guess the US is only high compared to most of Europe, Canada, Australia, NZ and parts of Asia. Which is a big fraction of the world, but not the whole world by any means.)
I wonder whether other countries might fare better? Although the regulations will be really fragmented, and few countries can come close to the US as a market for new cars.
I really, really care about traffic and transit alternatives.
The short answer is yes, the US has significantly higher death rates than Europe (at least double, last I checked); however, I think it's difficult to pin that on "driving behavior." I would speculate that much of it has to do with post-automobile city development. Having been on the road in many countries in Europe, I can assure you that I've seen wildly unsafe driving in places like the UK and Italy, in contrast to Germany/Sweden/the Netherlands, where I saw very safe driving. I presume the drastically lower fatality rates have much more to do with the road widths and awkward intersections that prevent speed.
I would need a ton of data to accept that "culture" is to blame for road deaths in a country, given the fact that most people treat driving as a background activity, while they are listening to music/podcasts/news/etc. I presume the vast majority of safety is done automatically by removing unsafe road designs that favor throughput, speed, and traffic reduction over forced safe usage of vehicles.
This is just anecdata, but I’ve spent time in the UK and in several US states, and it seems to me that very bad driving is a good bit more common in the US.
There are aggressive drivers all over, at probably similar rates, but I feel like weaving across lanes, talking on phones while driving, etc is much more of a US thing. Also drunk driving, at least in certain states.
I guess I attribute this to a) low standards in driving exams, b) poorly designed laws (eg turning right on red), c) lax law enforcement and compliance (eg accepting drunk driving), d) misdirected law enforcement (eg aimed at maximising fines through entrapment, rather than actually improving safety). I don’t know to what extent I’m right about any of those, though.
So, I do think there’s a qualitative difference between the US and UK and “culture” is part of it (vs purely road layouts).
But I definitely agree that other countries do it even better -- we should look to them for lessons, and look for ways to diversify transport beyond cars.
Edit to add: to put a bit of colo(u)r on this: often in the UK, I find driving stressful because the lanes are so narrow, both on highways and in country roads. Then in eg Texas, you usually get lovely wide lanes, so driving should feel a lot safer; and I do feel like my own driving is easier, but it feels more dangerous overall because there are so many awful drivers around me! I don’t believe it’s a speed thing (plenty of people barrel down narrow UK roads at high speeds), it’s a driver attentiveness thing.
Here, you're conflating "bad driving" with "dangerous driving." The idea that "bad drivers" are the ones causing collisions, and that they can be corrected is not correct. I agree that bad driving is bad. I agree that bad driving will lead to collisions, but it doesn't follow necessarily that "bad drivers" are the reason that there are more injury and deaths. My contention is that, in general, we have a vehicle collision casino, and American roads just have better odds at creating collisions.
The most dangerous roads are the ones where people aren't predictable. Every time you pass a hidden driveway, every time someone crosses oncoming traffic, changes lanes, or steps into a crosswalk, society rolls the dice, and some level of randomness could cause a collision. The more you roll those dice, and the higher the speeds involved, the more injuries and deaths result. That happens even if all the drivers are driving reasonably. This is why controlled-access highways are surprisingly safe despite their high speeds.
Road design is everything in this model. "Stroad"-type roads, where people regularly enter and cross the road at random, generally unpredictable intervals, are notorious for their dangers. Semi-rural roads, with regular blind driveways and little-to-no room for error, are the other type of extremely dangerous road, because the speed and narrowness lead to the same high levels of injury and death.
Neither of these road types is dangerous because of bad driving; they are inherently dangerous, though I agree bad driving makes them worse. I contend that the design is integral to the danger of the road. This is the only reason you can have roads, like interstate highways, with the most deadly-seeming conditions (extreme speeds, large speed deltas during traffic, near-constant rule breaking) that are nonetheless some of the least dangerous we have.
I mean, sure when you frame it like that it sounds insane, but most cars these days have GPS that can look up the speed limit, so it would be relatively easy for modern cars to limit cruise control to refuse to allow speeding. Do you think every major car manufacturer is ignoring engineering safety standards and behaving recklessly and irresponsibly by implementing a driver assist feature (cruise control) that behaves illegally when the user asks it to?
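The speed-limit cap described here is mechanically trivial. A minimal sketch, assuming a map lookup that may return no data for a given road segment (function and parameter names are invented for illustration, not any manufacturer's actual API):

```python
from typing import Optional

def governed_set_speed(requested_mph: float,
                       mapped_limit_mph: Optional[float]) -> float:
    """Return the speed the cruise controller is allowed to hold,
    capping the driver's request at the mapped speed limit."""
    if mapped_limit_mph is None:
        # No map coverage for this segment: fall back to the driver's request.
        return requested_mph
    return min(requested_mph, mapped_limit_mph)

print(governed_set_speed(75, 60))  # capped at the mapped limit -> 60
print(governed_set_speed(55, 60))  # request already legal -> 55
```

The hard part in practice is not the `min()` but the map data: stale limits, school-hour limits, and segments with no coverage all have to degrade gracefully.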
The paragraph remains accurate but gets funnier if "rolling stop" is replaced with "oxymoron":
"Tesla Removes oxymoron from Assertive Driving Profile". Tesla added oxymoron to make the car behave more human-like. NHTSA said no, so Tesla removed the feature.
Did anyone in Tesla's Legal Department review the rolling stop feature? This should have screamed illegal. Heck, I'm no legal expert but it screams illegal to me.
It's about as illegal as breaking the speed limit. If self driving cars aren't given the same leniency as humans to break the traffic laws that 99% of humans break then they will be a nuisance on the road.
Ideally the traffic laws should be updated to be more realistic.
So, self-driving cars are supposed to be, at the same time...
...much safer than human-driven cars, because they never tire and they never break traffic rules and thus never cause accidents
and
...supposed to bend and break traffic rules like humans quite often do in order to be efficient drivers
How should that be possible? Ah yeah, by having self-driving cars that are so extremely intelligent that they don't just follow the legal rules, but some idealized set of rules deviating from the written laws, one that no human has ever been able to write down concisely, let alone bring into algorithmic form, yet that is at the same time 100% safe and very efficient at allowing a maximum number of vehicles to reach their individual destinations in the shortest time possible.
Anyone not convinced that L5 self-driving cars are impossible without first developing a strong AGI way above human intelligence levels should have a really good solution for this obvious contradiction.
It's not complicated. Traffic laws are not perfectly aligned with safety. Plenty of legal things are unsafe and plenty of things that are illegal are not always dangerous.
This isn't surprising either. The process of making laws is political and politics are clearly not an ideal way to decide on a perfect and exhaustive and precisely detailed set of rules for anything. The laws do not constitute an algorithm for driving. They are necessarily simplified. And yes, a self driving car is going to need a set of driving rules that are much more detailed than anything humans have written down before.
Pragmatic rulemaking for large numbers of people also needs to take into account the fact that the rule will naturally be viewed as a loose guideline by some people, and even as a thing to be defied for its own sake by others. Setting the limit stricter than the actual desired behavior can push average behavior closer to the target.
> Setting the limit stricter than the actual desired behavior can push average behavior closer to the target.
Absolutely not, at least in the context of speed limits. Where I live, most of the speed limits are at least 15mph lower than they should be (why is a 6 lane road 30mph???), so you get:
* people who follow the speed limit exactly: 30mph
* people who follow the speed limit the road was designed for: 45mph
* people who see the limit as a "limit": 25mph
It's a mess. Roads need to be designed to the desired speed limit. Don't make a 6-lane straight road 30mph; people unfamiliar with the area will assume the speed limit's 50% higher than it is, and people from the area will feel like they're going incredibly slow.
In short, it should be uncomfortable to go >= 20mph over by design if you want people to follow your speed limits.
It actually is incredibly complicated. You can tell, because every individual person you speak to will have a different interpretation of which rules are there for safety and which ones can be ignored.
The fact that you start your post about how an AI can make such a determination with the words "it's not complicated" discredits you.
Obviously rules for self driving cars are very complicated. What's not complicated are the reasons why the road laws made for humans are not directly suitable as unbreakable principles for self driving cars to always follow.
That seems like an appropriate generalized take on "laws" but in the specific case of ignoring a stop sign, I don't think it applies. Stop signs are a traffic control device that other drivers rely on to predict the intent of a vehicle approaching the stop sign.
Stop signs are anything but "ignored". The actual behavior only applies when the car recognizes a stop sign and also assesses good visibility to ensure that no other cars are around, and the car still slows regardless. It's not just blowing through stop signs at top speed whenever it feels like it.
1) they have to be simple enough to fit in an AVERAGE DRIVER's head (it is a miracle to me that highways in cities are as functional and low-error as they are; yes, I know the death statistics on driving)
2) the law makers know the laws will be bent, so they set a baseline of VERY VERY SAFE and hope that most people only bend the law to VERY SAFE or at worst MOSTLY SAFE.
It seems weird given how "bad" self-driving algs are right now, but with a detailed database of locations and context, I absolutely believe the "rolling stop" logic in the car would be safer and more effective than a human observing complete stops.
I definitely do not agree with Tesla's vision-only no-location-context algorithm approach. Such a thing is a baseline, but if you know the location to where you are going, then route-specific data should be downloaded that has been optimized with multiple specific AI computation passes.
That's how humans work: we are slow and dumb when we are driving in unfamiliar locations and routes, but for our commute routes we know where potholes are, cracks in the road, which lane to be in, dangerous sidestreets and intersections and congestion points, etc.
So I disagree that L5 is impossible. Well, it depends on "way above human intelligence". I think what you are trying to say is you need a supersapient intellect with IQ 195.
But AIs can be "dumb" but have access to a far larger memory space and database and respond to a far wider bandwidth of sensor information. The AI programs may not be solving Fermat's Last Theorem in the margin of a book, but they can be smarter in the sense that it's like having a million gnomes watching at once.
> Well, it depends on "way above human intelligence". I think what you are trying to say is you need a supersapient intellect with IQ 195.
I'd say even "normal human intelligence" would suffice for at least getting L5 to work (maybe not with perfect safety and efficiency, but working at all). But you cannot remove the need for actual "intelligence" from the equation, and that's the problem: our AIs are not "intelligent" at all. They are idiot savants, and driving a car isn't exactly a problem where you need savant-like capabilities. It is a very general, broad-range problem, which is much more applicable for the broad mental capabilities of even the not-so-smart part of humanity.
Case in point: there are human savants in areas like playing chess or Go or recognizing text in many different spoken languages, but there are no human savants in the area of driving cars on public roads.
Or possibly they just know that real world driving involves so many context-dependent decisions that trying to cover it all with laws is about as reasonable as making a self driving car.
A human breaks the rolling stop rule on a case by case basis and is responsible for the consequences, while a rolling-stop-avoidance feature acts on your behalf and Tesla is responsible for it. An analogy might be that it's the difference between getting food poisoning from home cooking vs. from a restaurant. Even though you're consuming food in both instances, the liability is assigned differently.
If you break rules when you think you don't need to follow them you'll find that you're not 99.9% perfect and you will inevitably cause a crash.
If you follow the rules all the time you won't make that error.
The reason why computers should ultimately replace human drivers is precisely because we're dangerously incompetent and overconfident and shouldn't be breaking any rules.
Easy. There is no contradiction. If I can think faster, see better, and react faster than you, I can break exactly the same traffic laws as you and still be a much safer driver. Computers can do all of those things currently, and it is a matter of time before that is applied to driving.
> because they never tire and they never break traffic rules and thus never cause accidents
No, because they never tire and almost never break traffic rules in dangerous ways, and thus cause accidents much less frequently than humans. Realistically, this means that instead of following the speed limit they'd follow speed limit + X, instead of keeping the prescribed distance they'd keep whatever distance human drivers typically maintain (while still keeping enough distance to react and stop), and it also means rolling through 4-way stops when they have sufficient confidence that it is safe to do so.
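"Keeping enough distance to react and stop" is easy to make concrete. A back-of-envelope check, using reaction-time travel plus the standard braking distance v²/(2a); the reaction time and deceleration below are textbook-style assumptions, not measured values for any vehicle:

```python
def stopping_distance_m(speed_mps: float,
                        reaction_s: float = 1.5,
                        decel_mps2: float = 7.0) -> float:
    # Distance covered while reacting, plus braking distance v^2 / (2a).
    return speed_mps * reaction_s + speed_mps ** 2 / (2 * decel_mps2)

def gap_is_safe(gap_m: float, speed_mps: float) -> bool:
    return gap_m >= stopping_distance_m(speed_mps)

# ~67 mph is 30 m/s: 45 m covered while reacting plus ~64 m braking.
print(round(stopping_distance_m(30.0), 1))  # -> 109.3
print(gap_is_safe(80.0, 30.0))              # -> False
```

A computer's advantage is on the reaction term: shaving a second off `reaction_s` at 30 m/s removes 30 m from the required gap, which is why an automated car can safely run shorter following distances than the prescribed human ones.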
Which, last I checked for the state of WA, is also illegal. Just because there isn’t a cop ticketing every rolling stop at the intersection next to the elementary school by my house doesn’t make it “kind of okay because a lot of people do it”. It just means the municipality sucks at enforcement.
FWIW, if we're trying to maximize safety, then violating speed limits is often a must.
If the speed limit is 60 mph, but everybody is going 75 mph, then the safest speed for you as an individual is going to be 75 mph. Otherwise, you're a rolling road block, likely to get rear-ended, forcing tons of cars to change lanes to get around you.
Chances are the turns on the highway are designed to be taken at the speed limit. Sure, go ahead and go 75mph with the crowd, but don't be surprised when one day it's you who is leaving a bumper behind on the freeway. Also, on residential streets speeding is simply unacceptable. At 35mph an impact with a pedestrian will kill them 70% of the time. At 50mph that goes up to nearly all of the time. We shouldn't favor serving impatient people over the safety of others. If the issue is that everyone is going 75mph and it's dangerous for self-driving cars to go 65mph, then we should install speed cameras that ticket people going 75mph rather than letting self-driving cars also break the law and drive at dangerous speeds that our road infrastructure is not designed to handle.
Technically, highway (construction) code is that roads are engineered such that turns and such can be navigated safely at 15mph above the speed limit (or of 'cautionary speeds').
I state the above entirely on its own merits, not as a judgment of "should I or should I not travel at speed X on this road".
Maybe that's true with modern code, but thousands of interchanges across the country that aren't being reconfigured anytime soon have those yellow recommended-speed signs that, imo, should be heeded in many cases.
Well, that’s the purely driver oriented point of view. Speed limits exist for other reasons, e.g. when going through pedestrian heavy areas. Going above the speed limit there endangers lives (regardless of if everyone else is doing it or not).
On surface streets, speed limits are usually quite reasonable. In all the streets I drive regularly, I think the limits are good, except for one stretch where it has two lanes in each direction, and so I often want to go 45 mph, but the limit is 35, likely because there are some houses that directly face the street.
It's highways that often have unreasonably slow speed limits, at least here in Oregon.
All you've done is to make an argument for speed governors. If the speed limit is 60mph, but everyone is going 75mph, then the point of "automated self-driving can't come fast enough, humans can't be trusted to pilot automobiles!" is self-evident. Argue all day if you like about whether the number on the sign is wrong, but that discussion belongs elsewhere.
I see from another comment that you’re in OR. That explains why you think freeway limits are too low. :-)
(The joke here is that OR's speed limit on highways is typically 10mph below that of other PNW states, even in the "least populated county in the U.S.")
Having driven a lot in central and eastern OR, the roads are a bit winding, and even on the interstate going through the blue mountains can be a bit seat of the pants. They probably just never saw the roads safe enough to raise the speed limit.
But it's not just that there isn't a cop watching, it's that if there was, he still won't pull you over unless you're going 10+ mph over (more where I'm from!). So the official speed limit isn't really the law. Most cops also won't get you for a rolling stop unless it's egregious.
Are you going to build in a random number generator, then, so self-driving sometimes adds speed above the posted limit? Or a button to boost the speed above what state law allows?
It makes no sense when all you have to do is not use self driving.
It is neither obvious, nor possible. There are zero real world drivers that actually follow all the laws. Do you think that people are just all bad? Or is there maybe a higher level principle at play here?
Self-driving cars should be mimicking only the "best" drivers. If you qualify "best" as something like "lowest lateral g-forces on passengers" then you end up with a very comfortable drive that absolutely breaks laws all over the place and is also safe. So... which would you rather have? A driver that is a delight to experience and does not cause you anxiety, or a driver that strictly follows all laws to the letter but drives like a robot and kind of throws you around?
In the longer term - I suspect we will develop two different sets of laws as we understand what things robots do well and what things they do not. For example: I suspect that robots might be allowed to drive slightly faster on highways because they are better at maintaining the proper following distance, but always drive at the lowest speed limit for roads with variable conditions because they struggle to evaluate the environment.
Human drivers respond to incentives. Lax policing, lax drivers. Strong policing, speed traps, and bigger fines result in people driving better. It's not that people can't drive, it's just that they don't care to drive well, but they will drive well if the enforcement is frequent and heavy enough.
Erring towards 100% cautious at all times isn't always better, it needs to be a balance between controlling the risk of accidents with the costs: time taken for individual drivers, general congestion of roads, even things like fuel consumption.
Risks vary according to weather conditions, traffic levels, times of day, etc etc. A prescriptive approach (e.g. always drive at X speed) isn't usually optimal.
Are you joking? The risk of driving is injuries and death, primarily borne by those _not_ in a car. How somebody weighs that against speeding or rolling stops, or any other form of careless driving, is beyond me.
Most importantly, safe driving is good form. Speeding is trashy.
It's always safest to not drive at all, or never leave the house. On the other hand, exceeding the speed limit when driving through the desert doesn't endanger pedestrians to any significant degree (how many zeroes is sufficient?)
How many miles are driven through the literal desert compared to where people are around? If anything, the speed limit in the desert could be adapted to local conditions, but not base the default rules on that exception.
Why are they called "speed traps"? If you're driving over the speed limit and get a citation, it isn't a trap. This isn't like a sting operation for vehicle theft with a car left parked and running with keys in the ignition.
There are some states where it was found illegal to force drivers to slow down too fast, i.e. you can't have a 70 mph zone followed immediately by a 30 mph zone. That would in fact constitute a speed trap of sorts, since no one can safely shed 50 mph in an instant. The practical effect of this is a series of 3 or 4 signs spaced about 100 ft apart, each one lowering the speed limit by no more than 15 mph. It's the equivalent of a 'braking zone' on a race track, but on the street.
It's a trap because it's below the natural speed of the road. My city, for example, lowered the speed limit to 25mph, but the roads are designed for 35-45mph, so naturally traffic flows at that speed.
I'm a little incorrect in my use of "speed trap," my apologies. I meant where the police are hanging out out-of-view and ready to spring on anyone speeding or doing other obvious violations, not sudden speed transitions which are mainly used as a revenue stream.
How are you defining better? Because personally I find that taking a rolling stop when it won't cause any harm is better than always stopping, even if it's not legal.
I think that if your country has signs for rolling stops (a yield sign, basically) and chose not to put one at this very intersection (even if it's a new road that hasn't seen much usage yet), you should do a full stop, even with a car doing a full stop in front of you.
In some countries it's not clear if you don't read the local language (Morocco, I'm looking at you!), but if the inverted red triangle exists, just do a full stop.
When driving I follow the spirit of the law, not necessarily the letter. If I'm approaching a stop sign and I can see clearly there are no other vehicles, bikes, people, etc, I slow down to very near a stop, but not a full stop. Why bother? The spirit of the law is to make sure I'm correctly yielding to other vehicles.
Perhaps a better example is stop lines at intersections; sometimes stop lines are so poorly painted that you cannot see any oncoming traffic unless you pull forward another 5-6 feet. The letter of the law says you still must completely stop at the stop line, pull forward and completely stop again, then continue. Does anybody actually do that?
The law is just a tool. It routinely does not permit the optimal choice for neither yourself nor society as a whole. Like other tools, sometimes the best choice is to ignore it.
----------
>The comment you are replying to said nothing about the law.
>> Why bother?
>Because you're operating heavy machinery that regularly kills people.
The context of the quote by globular-toast is from this paragraph:
>When driving I follow the spirit of the law, not necessarily the letter. If I'm approaching a stop sign and I can see clearly there are no other vehicles, bikes, people, etc, I slow down to very near a stop, but not a full stop. Why bother? The spirit of the law is to make sure I'm correctly yielding to other vehicles.
> That's exactly wrong. Human drivers need to get better, or get off the road. Robots shouldn't mimic careless driving habits.
One of those things will get better as technology advances and more R&D is done on it, the other thing will not and hasn't for a very long time. Unless I misunderstood your comment and you are actually referring to some form of transhumanity?
How many times do human drivers need to actually stop at stop signs? Probably hundreds of millions of stop signs are rolled per day. How much more carbon is released, if all of those vehicles actually came to a complete stop?
Change 4-way stop signs for roundabouts like most of the world. Not only does that increase throughput by allowing drivers to yield, it also increases safety by forcing a curve and avoiding head-on crashes. (Also, pedestrian crossings should be, and normally are, set back a bit from the intersection, given some traffic calming like raised crossings and a median island, with better visibility overall.)
You can't just replace every 4-way stop with a roundabout, and the "rest of the world" doesn't do so either. The vast majority of such roundabouts would have to cut across residential property.
Now consider how many people have died or been permanently injured from rolling stops. An extremely common mistake is a failed right-turn on red where the driver doesn't stop and rolls over a pedestrian in the crosswalk. And how many drivers will live with the guilt for the rest of their lives.
Who cares how much carbon it saves when people get horrific injuries from it? If you want to save the planet, build intersections designed for rolling like roundabouts, crosswalks with signaling, and yield signs.
How many people have died or been permanently injured from rolling stops?
There are loads of stop signs at intersections where pedestrians would rarely if ever cross, maybe it would be possible to differentiate those intersections from crosswalks.
If you want to go over the speed limit then don't use self driving.
If you increase the speed limit by 10mph, humans will just adjust up too.
I do agree that interstate speed laws need to be updated. The most dangerous spot I drive on an interstate is crossing a state line that drops from 70mph to 55mph. The slower zone is so dangerous because some people are then going 55mph and some still want to go 80mph.
I don't see how self driving can be anything other than the most perfect letter of the law driver though.
They're not equivalent illegalities. Speeding involves a much higher energy state than a rolling stop. If there's an accident, one is very clearly much worse.
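The "higher energy state" point is straightforward physics: kinetic energy grows with the square of speed (KE = mv²/2). A quick illustrative calculation, using an assumed round-number car mass and speeds:

```python
def kinetic_energy_j(mass_kg: float, speed_mps: float) -> float:
    # KE = 1/2 * m * v^2, in joules
    return 0.5 * mass_kg * speed_mps ** 2

CAR_KG = 1800  # rough mid-size car, illustrative

rolling = kinetic_energy_j(CAR_KG, 2.2)    # ~5 mph creep through a stop sign
speeding = kinetic_energy_j(CAR_KG, 33.5)  # ~75 mph on a highway

print(round(rolling))             # -> 4356
print(round(speeding))            # -> 1010025
print(round(speeding / rolling))  # roughly 230x the energy
```

The square term is the whole argument: the car rolling a stop at walking pace carries a tiny fraction of the energy of the same car speeding on a highway.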
I feel like a Unicorn, but I obey the speed limit as closely as I can. My speedometer says a perfect 60 on the highway in my area, and I am always being passed like crazy. I've never received a speeding ticket.
I'm not a "nuisance" for not going faster and breaking the speed limit. Everyone else is the "nuisance" and I vote for stronger enforcement where possible.
One of the most frustrating things I deal with is people driving too fast on the main street that is (basically) just outside my cul-de-sac. It's a 35 mph limit but people drive 40-45. That makes it harder to see them coming, making the road less safe.
Too many people seem to believe that the only thing that matters is the road in front of them, they don't consider the side roads at all. The city is talking about narrowing the lanes on the main street to reduce speed and add bike lanes, and it can't happen soon enough.
Also, your chance of surviving an impact at like 50mph as a pedestrian is next to zero. You have a higher chance of surviving an impact at 35mph, even though it's still pretty small.
It's not just the highway, I am frequently passed on roads of all kinds for following the speed limit and going no faster.
They are absolutely, for breaking the law, the nuisance in these situations. The number of people breaking the law does not change what the law is or, even necessarily, what the law should be.
The number of people breaking the law definitely changes what the law should be for traffic laws that primarily are about coordination. If 80% of the population starts driving on the opposite side of the road, or stopping in green and going on red we should absolutely change the law. Speed limits are primarily a similar law. It hardly matters what they are as long as people go the same speed.
People generally drive the safe speed for the road they are travelling on. The fact that you're being passed (probably on the right a lot of the times) means you are driving slower than traffic and are actually a danger to other drivers. You might think you are being morally high and mighty but you are actually putting innocent lives in danger. Please have some introspection here.
> People generally drive the safe speed for the road they are travelling on.
Can you substantiate that assertion somehow? People drive the speed that they feel safe at, but that feeling is not necessarily accurate. It may be safe most of the time, but change due to external circumstances. It's safe most of the time to drive 50km/h in a residential zone, except that one time when a child jumps out from behind a car.
The taxi study suggested that drivers tend to take greater risks in cars equipped with ABS (although the difference in collision rates was not significant). In short, ABS may do more harm than good.
No, you are just more conscious about speed-related issues than your local council. They should engage in road work to reduce the road width, making everyone slow down.
You are behaving dangerously. Accidents are caused by speed differences (one reason). People changing lanes to pass you introduces all sorts of additional complexity, increases speed deltas, reduces reaction time, limits visibility, requires quicker maneuvers, forces other drivers to double and triple check that others aren't trying to pass simultaneously, increases the risk to pedestrians and cyclists, etc.
Deliberately going the speed limit when it results in other drivers constantly passing you significantly increases the danger to everyone around you.
Please give this some thought.
Ideally, the German model would be enforced. Slower traffic keeps right, left lanes are for overtaking. And many places are unrestricted, meaning there's no speed limit. The system works quite well, and the flow of traffic is _vastly_ more predictable, and feels a lot safer than the US interstate system.
It's a bad law that endangers people and should be changed. Going an unsafe speed is another law, and its entirely possible that there is no speed that does not violate one of those laws. The law is a relic of when cars were less safe, and the same safety can be achieved with much higher speeds.
You drive the speed you feel you can drive without undue risk to others, those around you may do the same. I take the exceptional viewpoint that neither of you are necessarily 'nuisance.' Reasonable rate of speed depends on vehicle type/design, driver skill, sobriety, wakefulness, attentiveness, safety features of those on the road, density of traffic and environmental conditions. Reasonable rate of speed does not depend on whatever arbitrary number is on a sign.
All roads are not built to perfect standards. Chances are your freeway interchanges are not built to be taken at full highway speed. If you tried to go 75mph through the 101-to-110 interchange in LA, you would crash into the wall like many have done, because the recommended speed is 35mph (and speaking from experience, it is VERY sketchy going above that: the turn is entirely unbanked, the lanes are narrow, and there is zero shoulder to speak of).
There is also the issue of meatspace. Maybe your modern car can exceed the speed limit of the road surface and safely handle itself. Meanwhile, we still have our biology to deal with, which has not hardened itself to survive an impact at a high speed with a car. Only after millions of years of pedestrian deaths will that evolution be possible. When you find yourself driving 45-50mph on a road signed for 35mph, just know that if a pedestrian were to step out and appear in front of you, you are virtually guaranteed to kill them at this speed, and if you respected the signage, they have a better chance of surviving this impact.
Most accidents happen at intersections, and you don't know how fast the other car is going. I would consider a rolling stop far more "dangerous" in that regard. If I were programming it and laws were not a concern, I would probably look at the speed limits on both streets and disallow it at higher-speed intersections, or those with obstructed vision (data allowing, of course).
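That kind of rule is easy to sketch: allow a rolling stop only when both roads at the intersection are low-speed and sight lines are clear. The 30 mph cutoff and all field names below are invented for illustration, not taken from any real system or regulation:

```python
from dataclasses import dataclass

@dataclass
class Intersection:
    own_limit_mph: int      # limit on the road you're on
    cross_limit_mph: int    # limit on the intersecting road
    visibility_clear: bool  # unobstructed view of cross traffic

MAX_ROLLING_LIMIT_MPH = 30  # assumed cutoff

def rolling_stop_allowed(x: Intersection) -> bool:
    return (x.visibility_clear
            and x.own_limit_mph <= MAX_ROLLING_LIMIT_MPH
            and x.cross_limit_mph <= MAX_ROLLING_LIMIT_MPH)

print(rolling_stop_allowed(Intersection(25, 25, True)))   # -> True
print(rolling_stop_allowed(Intersection(25, 55, True)))   # -> False (fast cross street)
print(rolling_stop_allowed(Intersection(25, 25, False)))  # -> False (obstructed view)
```

The interesting engineering question is the `visibility_clear` input: the speed-limit lookup is just map data, but assessing sight lines reliably is exactly where perception systems struggle.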
There’s nothing magical about a full and complete stop. The reason we do it is to force humans to slow down and give themselves some objective standard for how much time they need to evaluate the conditions of the intersection. And the reason rolling stops are so common is that in many situations, it’s very obvious that the full stop is an unnecessary bit of ceremony. I don’t think it’s at all unreasonable to say that a competent self driving car should be equally capable of making the necessary safety decisions whether it’s paused for a full stop or not. Given the premise that we are approving computers to make high frequency decisions about multi-ton hunks of metal driving around on our streets, there is simply no way in my mind that the difference between a full stop and a rolling stop for them is actually a safety issue.
It's more a question of whether it's a problem for the other driver. But yes, I agree the best thing would be to change the laws. Instead they have gotten even more ceremonious in my area over the last few years. Tesla is taking on a ton of liability even by going 5 mph over the limit. This recall will be used in a court case over another FSD feature for sure.
This makes the big assumption that the other driver is playing by the rules, respecting the speed limit, and not going to ignore the stop sign (which I see happen more and more lately in SoCal).
If someone rolls a stop at a non-4-way stop while a car is approaching at 55mph on the cross road, then a close-to-55mph t-bone crash is prone to happen. (And the other vehicle involved might be a truck with a heavy load vs. a Tesla rolling a stop.)
Presumably the cars driving on the intersecting lane of travel could very well be in the aforementioned higher energy state when an accident during a rolling stop occurs.
Humans aren't given leniency, there's just an issue enforcing laws on the books. The answer isn't to give self driving cars the same pass to endanger others especially when its lines of code we could implement trivially.
They ticket everything in Ohio. Drive too fast? Ticket. Drive too slow? Ticket. Lights not on after the legal definition of dusk? Ticket. Hanging out in the left lane? Ticket. Not using the blinker? Ticket. Leaving the blinker on? Also ticket. The sucky part is that when you are a teenager they sometimes couple this with throwing all your belongings on the side of the road in search of the weed you must have.
At least when I grew up in Australia, part of the road laws was a statement, very close to this effect:
"You are required to drive at the maximum speed that is 1) within the speed limit, 2) is appropriate for the conditions, and 3) within your ability to control the vehicle".
And that was the rationale for ticketing 'too slow' drivers: within those constraints, there was no reason for you to go more slowly, and doing so posed unnecessary risk to those who were driving within those criteria.
This. Elon is all "move fast and break things"; "things" in this case being Teslas and the people inside them. And why wouldn't he? The federal government barely pays attention to wanting to audit self-driving technologies as it is.
This should initiate a conversation into following the spirit of laws vs. authoritarian rigidity of law without nuance.
In high school there was only a handful of things I learned that were actually useful, outside of social experiences. One of them was a teacher who taught business and law classes. In a business class he shared with us - first saying he could probably get fired for telling us this - that if we had good ideas, knew we had a good idea for a business, then instead of going to university and getting $40,000+ into debt over 4 years, we should get a job and/or apprenticeship and work on the idea. At the end of those 4 years he suggested you'd be in a much better position than those who went into higher education; of course it depends on what someone's goals are. In law class he gave an example: in some states in the US there are very long stretches of road where you won't see anyone for a while, and sometimes there are traffic lights on a straight road - with no intersection. He put forward the question: what do you do if that traffic light is red when you come up to it? There are zero vehicles near you in either direction, there's no intersection to worry about cross-traffic, so do you stop and wait for the light to go green, or do you go even though it's red? I think every reasonable person would answer that they would go. E.g. a rolling stop in some circumstances isn't dangerous for anyone; and I also see police doing it all the time.
Of course, AI deciding when to do it - when it may not yet account for enough of the environment - does raise questions, and because it's not critical to self-driving, I believe it's a good idea not to allow it until that conversation can be thoroughly hashed out, and the technology much more thoroughly tested and evolved.
Edit: Lazy people downvote - find something better to do with your time, or some other form of entertainment.
>This should initiate a conversation into following the spirit of laws vs. authoritarian rigidity of law without nuance.
That's why rolling stop laws don't always end up with you getting a ticket. I was pulled over a year or two after first getting my license for executing a rolling stop. The cop reminded me what I'd done, recognized that I was a dumb kid and a relatively new driver, told me to cross my heart and promise I'd never do it again, and let me go. Enforcement is circumstantial and nobody's going to follow that rule if they're not aware that they could get in trouble for it.
>In law class he gave an example: in some states in the US there are very long stretches of road where you won't see anyone for a while, and sometimes there are traffic lights on a straight road - with no intersection. He put forward the question: what do you do if that traffic light is red when you come up to it? There are zero vehicles near you in either direction, there's no intersection to worry about cross-traffic, so do you stop and wait for the light to go green, or do you go even though it's red? I think every reasonable person would answer that they would go. E.g. a rolling stop in some circumstances isn't dangerous for anyone; and I also see police doing it all the time.
Many such stop lights exist because they provide a safe space to cross the road for pedestrians. In fact there are a few such lights near me in long stretches of road, among woods, that allow people walking through trails in the woods to cross the road. I stop at those lights every single time. Why? Because I don't know if someone has crossed yet. Could be that it's a family trying to cross the street, and they had to go back to the trail to corral a kid that wandered the other way and they'll be crossing the street in a second. Or, it could be a group of people - some of them have crossed, others are shortly behind and will be coming out in a second.
The point is that, often, I don't know what is or isn't there. And unless I know, I'm going to stop.
>Edit: Lazy people downvote - find something better to do with your time, or some other form of entertainment.
You're right, and it's the same with jaywalking - it's usually only enforced or a fine or charge laid if the action causes a collision or harm.
In the example I gave there wasn't a pedestrian crossing, and in it you also stop at the light first. In your scenario it sounds like there are blind spots too, whereas I guess I left out some language, like: the road was in a desert with full visibility everywhere. Of course, you always need to fully stop, or be rolling slowly enough - say if you're making a right-hand turn at a red light [where legal] - that you can stop quickly if you see past the blind spot that traffic is coming.
>... whereas I guess I left out some language, like the road was in a desert with full visibility everywhere.
I'd guess that's a speed deterrent. Long stretches of flat, open road with nothing around to crash into are very inviting for people looking to race. Get going fast enough and you risk losing control of your vehicle and crashing into other people, say an oncoming car. Those red lights, assuming I'm understanding your scenario correctly, encourage people to maintain a safer pace of travel.
>Of course, you need to always fully stop or be rolling slowly enough, say if you're making a right-hand turn at a red light...
... no. There is no "or be rolling slowly enough". Stop.
> I'd guess that's a speed deterrent. Long stretches of flat, open road with nothing around to crash into are very inviting for people looking to race. Get going fast enough and you risk losing control of your vehicle and crashing into other people, say an oncoming car. Those red lights, assuming I'm understanding your scenario correctly, encourage people to maintain a safer pace of travel.
Sure, or perhaps it gives people who have zoned out an opportunity to see how fast they're going and slow down, and to see whether they slowed down fast enough to stop at the red light - before reaching the next red light, which may be in a little town up ahead; if they didn't stop in time for the first light, they've been notified to be more careful next time. But the exercise was to ask: if it's 100% safe to go at a red light [once you've stopped at it], with zero potential for anyone getting hurt, do you go through the red light, or do you wait until it turns green?
It makes for quite the interesting psychological test seeing how different people answer - similar, I suppose, to the classic trolley-problem scenario, where your trolley is going to crash and you have a few options; Will Smith's character in I, Robot has related trauma as well: the robot's AI determined to save him, an adult, over the young girl, even though he was trying to command the robot to save her.
> ... no. There is no "or be rolling slowly enough". Stop.
If you're at a stop sign or red traffic light, and it's legal in your jurisdiction to turn right on red, you have to start rolling forward - and are allowed to even if there's a blind spot and you can't see any traffic coming yet [the road could be clear or not] - so if you're going slowly enough and there's no traffic then you continue; if you can suddenly see past the blind spot and there's enough time to safely go then you continue; if there isn't enough time to pull out then you stop.
>But the exercise was to ask: if it's 100% safe to go at a red light [once you've stopped at it], with zero potential for anyone getting hurt, do you go through the red light, or do you wait until it turns green?
Stop and wait. Keep those good habits a part of your second nature.
>If you're at a stop sign or red traffic light, and if it's legal in your jurisdiction to turn right on, you have to start rolling forward - and are allowed to even if there's a blindspot and can't see any traffic coming yet [the road could be clear or not...
This is true, but you are missing one very important caveat; you pull up to the stop sign, THEN YOU STOP, and then you slowly inch forward to see past the blind spot and continue on per the rest of your comment.
I agree this is worth a conversation so here is mine.
I think the law as-is is already ambiguous enough (due to its complex nature) that our society wastes a huge amount of energy and resources arguing about its meaning. Anything that can be "rigidly" defined (and therefore enforced) with little downside* is a win in my book, purely from a practical perspective.
* Just like in this particular case (rolling stop), the "downside" of enforcing full stop compared to a subjective "safe rolling stop" is close to none.
And about the spirit of the law, to me it's pretty simple: if an illegal practice works better in this regard than the "legal" alternative, sure, we should seriously consider if we need to punish people doing so.
But in most of the cases people bring this up, both practices align perfectly with the spirit of the law. Paired with the practicality argument above, there isn't much point in allowing rolling stops, as they don't make the road "safer" than full stops.
Also, as someone coming from a country that doesn't have stop signs: people here are spoiled and don't know how ingenious this idea is.
In an ideal world where everyone follows traffic law and pays full attention to the road all the time, stop signs aren't really needed. However, people are not machines. The whole point of stop signs is to force distracted drivers to pay attention at intersections, even if only out of fear of getting a ticket. This kind of "foolproof" safety technique is not as excessive as it may look at first glance. Therefore, I'd argue having people make full stops is exactly the spirit of the stop sign.
Okay, so let's say the safety side is figured out. People and planners also like to take traffic flow into account, and throughput wants to be maximized; you may then conclude that there are circumstances where rolling stops pose practically no safety concerns - and if AI can become good enough [let's say there are 100,000 baseline safety experiments that need to be conducted and passed] to account for and catalogue those scenarios, then we could arguably loosen restrictions in at least some contexts.
> Simple, binding, easy to follow rules are important when the cost of mistakes is death or significant damage
There is nuance. Pilots are the final authority when it comes to the safety of the craft, and you're in the clear if your actions were justified. With self driving cars, we're discussing where the boundaries are and when the vehicle's decision can take precedence over coarse legal code (as occurs with human drivers every day).
Of course! And you should definitely run a red light if staying where you are would put you or others in danger. I'm commenting about running a red light just because the road seems otherwise clear.
And you're being naive or arrogant enough at this moment to think this is "cry[ing] about internet points."
Maybe brainstorm as to what the actual implications of the dopamine hit/easy reward to downvote/suppress content is vs. simply having an upvote mechanism, and then share your thoughts and I'd be happy to get into a conversation with you. It doesn't sound like you've spent the time to actually extrapolate to the full consequences of the downvote mechanism.
Your response here, though, is one prime example of why downvotes for most content types shouldn't exist. That you spent the tiny effort to click downvote to react to what you perceived as my "whining" - that it was a strong enough trigger or annoyance for you emotionally - says more about your emotional regulation than about the content of what I said. Likewise, by actually commenting you shared your actual qualitative reaction, so now there's an opportunity for a conversation, to broaden your understanding or perhaps get educated by seeing things more from my perspective.
Don't you think having a qualitative response vs. a single quantitative digit changing to suppress content in an algorithm is more valuable to you, to society?
P.S. Upvoted you for commenting. Now maybe your comment won't be at the bottom, interesting how the "worst" or less valuable or lowest quality [qualitative] comments naturally make their way to the bottom - without requiring the downvote mechanism, isn't it?
P.P.S It'd be neat if HN/dang would offer a parallel view of posts, and then in the actual thread view, have 2 columns of comments - one not influenced by downvotes, and the other as the status quo - so people can start to experience and contrast; because AFAIK downvotes/upvotes aren't available in the API, so a third-party can't develop this? I'd certainly develop this system if HN's API could facilitate it.
You can get away with it a hundred times, or a thousand times, but eventually you'll be tired and do it and clobber a pedestrian you didn't see because "you were tired" (so it's not your fault, even though it 100% is).
Stop thinking the way you do and stop being one of the 88% of American drivers who think they're above average. Follow the goddamn rules, because you're not 99.99% perfect and it's the 0.01% that is going to hurt someone else.
And "I see the police doing it all the time" clearly isn't the right moral barometer, if you haven't been paying attention.
No, but you're attempting to put words into my mouth.
You're making assumptions too, it seems, about what scenarios I believe are safe for rolling stops - or what state a person will be in when doing them. For example, I don't drive when I'm very tired, and whether I'm tired or not, if weather or traffic conditions are poor I adjust my driving accordingly.
Maybe we shouldn't allow airplanes to be flown anymore because "you can get away with it a hundred times, or a thousand times, but eventually you'll" crash?
I wonder if you're conflating different rules, like your assumptions, and not differentiating that some rules are more serious than others - giving the same weight to less serious rules as to those that are more serious. E.g. speeding through a red light during rush hour is different than a pedestrian jaywalking - yet both are illegal.
And before someone comes in to say a jaywalker can't do the same damage as a vehicle, here's my personal story: I was riding my bicycle, going a normal speed, vehicles parked along the side as they were allowed to be - and the perfect scenario for a collision occurred: a tall, strong man walked out into the street - looking the opposite direction first - from behind a box van with no windows, and stepped right into my path with no time for me to put my brakes on. I crashed into him - he didn't actually move - and I had whiplash, my jaw slammed shut, I bit the right side tip of my tongue 80% off in a deep cut, and multiple teeth were split and chipped; he and his girlfriend didn't stick around - they actually laughed about it as they walked away - and I was in shock and in pain, so I didn't realize I should have called the police.
If you bend the rules around rolling stops, I don't trust you not to bend your rules you claim to have about driving tired or in bad weather.
I'm almost positive you'll violate those rules as well.
I just believe that you're a flawed human being like all the rest of us. And so its safer if you follow rules consistently and stop trying to optimize things away that don't need to be optimized. Same thing with using your turn signal every single time because even if you think you 'need' to you should always do it for the pedestrian, vehicle or bicycle that you don't see -- or even the person violating the law and driving down the shoulder of the highway.
And I don't even understand how your example is relevant. But you should take the lane and stay away from the door zone if you're bicycling (and when you're driving you should be careful about the door zone as well). They were in the wrong, but it sounds like you're not very aware either, which kind of only reinforces my point.
You're making assumptions again. From your perfectionistic-like attitude, you're probably a worse driver than I am - how's that for an assumption? Though I'm generally good at orienting myself compared to others, it'd be neat to actually learn our different driving styles and see who's a better, more responsible, more aware driver.
And arguably people who rigidly follow rules could be far more dangerous on the roads than not. For example, people who go exactly 100 kilometres per hour on highways in Ontario - the speed limit on most highways here - even in the slow lane, cause traffic to have to move into the passing/fast lane to go around them.
The back of a box van doesn't have an outward-swinging door zone - and it was impossible to see that a pedestrian was about to step out because the box van had no windows; nice try cherry-picking and not fully or accurately understanding/visualizing my argument in order to make a point.
So do you go exactly the speed limit where you live - or perhaps you even below the speed limit on highways?
I do always put my turn signal on because that's respectful as a warning to let people know your intentions ahead of time, and the blinking alerts the brain to a change before the movement actually starts (at least that's how they're meant to be used).
You didn't respond, however, to my example of rolling forward at an intersection, say where you're turning right but there's a blind spot to the left - so you have to roll out, as normal behaviour, to check if it's safe to continue with the full turn. So are you saying people should never do this, even though it's accepted, common practice (and what they actually teach in driving schools)? It's the same behaviour as a rolling stop - except the person comes to a full stop first and then slowly rolls out.
With all the assumptions you're making, I'm curious how quickly you're imagining the rolling stops I do are - in the limited circumstances/scenarios/contexts that I actually very occasionally do them?
I agree strongly with your overall sentiment, but you're quite wrong on this:
> I think every reasonable person would answer that they would [run the red light in the middle of nowhere].
I tend not to do it. Not because I have some overinflated worship of the letter of the law, but simply because it doesn't matter to me. If I'm in a hurry, I'll probably blow through it, but by default, probably not.
Edit: I also agree that you're being downvoted simply because people disagree with you or your style. It's something I'm getting somewhat tired of on HN. I think they need some concrete guidance on what's acceptable to downvote and try to keep it as much to "rules violations" as possible, because it's increasingly being used as a disagreement flag lately, in my observation, which is just unhealthy.
In a few months: Tesla baffled as regulators crack down on their "I'm in a rush" feature, which makes Tesla cars with FSD ignore any and all speed limits as long as the car deems that's safe to do and won't bother anybody.
It can be a real pleasure to enjoy a buttery-smooth ride, and robots will excel at it. And of course manufacturers will optimize for it since it demonstrates confidence and safety. OTOH, you know what can also be fun? Having a Tesla kick your butt with almost 1 G.
There should be a reasonably-smooth default setting, a "show me your skills" super-smooth setting, a "sporty" setting, and "madman". The latter would probably cost extra due to the extra wear and tear, and I'm sure I'd pay for it, at least once, for the experience.
I'm not sure I understand the joke here. That's the way it works. The cars already allow you to speed (up to a limit, obviously). Just roll the right scroll wheel. And you can control its default choices via a menu (e.g. "limit + 5mph", etc...). Being able to match traffic speeds is a good and useful feature.
But yes, it's technically "illegal", just like a rolling stop. Which raises the question of why one is subject to regulatory oversight and the other is not.
> But yes, it's technically "illegal", just like a rolling stop. Which raises the question of why one is subject to regulatory oversight and the other is not.
Because one is a safety issue when done, and one is a safety issue when not done. Not obeying the speed limit is (mostly) safe when matching the traffic around you. Not stopping fully at a stop sign is unsafe since it can lead to accidents regardless of what everyone else around you does.
Or to flip it around... obeying the speed limit but not matching the traffic around you is generally unsafe. While obeying the stop sign and stopping fully is always safe.
Limit + 5mph is an explicit control by the driver, not a decision by the car. "I'm in a rush" described here is not explicit, the car is making decisions on how much faster to go. Self driving regulations naturally only regulate decisions by the car, not decisions by the driver. "break all speeding laws however you want" isn't a decision as much as a directive on how to make driving decisions.
Same with rolling stop in fsd, it's even more clearly a decision by the car which would naturally fall under self driving regulation while decisions by the driver naturally wouldn't. I.e. this isn't a "perform rolling stops" button the driver pressed to force the decision.
Virtually all humans roll stop signs from time to time. It's safe when done in appropriate circumstances, yet it's illegal.
The law is at complete odds with normal human behavior, and we've been ignoring the contradiction for, what, decades?
The most interesting possible outcome of widespread self-driving vehicles is reconciling the legal fictions on the books with reality as code is unambiguously breaking the laws that aren't based in any semblance of reality.
> Virtually all humans roll stop signs from time to time.
Are you in California by chance?
Because I rarely see that behavior in Washington state. Dead of night empty roadway people will come to a full stop at a stop sign.
> It's safe when done in appropriate circumstances, yet it's illegal.
You mean like when a driver thinks they don't see anyone else around? While driving a car with giant A pillars that block a large % of the street and sidewalk from view?
Just, like, always stop at stop signs. It isn't that hard.
Not justifying rolling stops, but I can understand why people do it. I've lived in infuriating neighborhoods with a 4 way stop sign at every... single... intersection... You could drive a mile and hit fifteen of them. Cities get a complaint about an intersection and the cheapest way to "calm traffic" is to plop down a 4 way stop sign. You've basically created stop-and-go traffic without the traffic.
Most stop signs I encounter in my commute and trips into town could be safely converted into roundabouts while still keeping neighborhood driving speeds low, but since this is the USA, nobody understands how to use roundabouts, so we sit there like idiots stopping every 300 feet of travel.
There is one intersection in Kirkland WA that needs to be a roundabout. Every day during rush hour one stop sign is responsible for a 3-5 minute line of traffic.
But you know what? Everyone stops at it. Even though it is stupid, and there is almost no chance of cross traffic, people stop at it.
(It is on 116th Ave in Bridle Trails; anyone local reading this knows the stop sign I'm talking about!)
So yes, some very stupid stop signs exist.
The neighborhood I live in now has a compromise stop signs every block going east/west, but no stop signs going north/south. It works out well enough!
Indeed, roundabouts would solve a lot of these problems.
One private gated community around here just uses giant speed bumps. Another, somewhat jarring, solution to the problem!
It certainly is illegal, but everywhere else I've ever lived, 99% of stop signs would simply be a 'give way' or 'yield'. That's typically what people instinctively do at those signs. Give way/yield junctions are absolutely safe if used properly and in the right circumstances.
Your anecdotal evidence also contradicts mine, so we have a stalemate here.
I don't really understand the A-pillar comment either. Not going to apply this broadly because you're just a single person, but navigating your blind spots is something you learn to do in other countries at least.
I know we're playing the anecdote game here, but I've lived and driven all over the US and can't recall a place where stop signs were respected by anyone (crosswalks are a different story).
You are in an exceptional area then. The entire city of Philadelphia operates on the idea that stop means slightly tapping the brakes. I've even seen cops regularly roll them.
"Even" cops? Here in the Bay Area, cops blow through every intersection without turn signals and barely any hint of slowdown. And everyone's happy, because only our blue domestic warfighters stand between us and the unwashed mobs robbing Lululemon stores.
If a law leaves "appropriate circumstances" to the discretion of the public then it is effectively useless. If you get hit by a driver who refused to stop at a stop sign, you're most certainly not going to think that rolling stops are fine.
It’s attitudes like this that make me bullish about the future of self-driving cars.
Cars can be programmed to never speed, double-park, or roll through stops (I'm aghast that Tesla even tried this). Loads of drivers seem to think this kind of shit is just fine and normal, and desperately need to be taken off the road.
Self-driving car skeptics seem to think that the barrier is "being good at driving." I think the barrier is "being better at driving than most people." That's a low bar.
>Loads of drivers seem to think this kind of shit is just fine and normal, and desperately need to be taken off the road.
For every few hundred cars going 65-80 on a 55mph stretch of interstate highway there's maybe a couple going the speed limit. Who is contributing the most danger to the roadway on a per-capita or per vehicle basis?
> That's a low bar.
You only think that because you mentally bucket people who casually violate the letter of the law when they deem it safe to do so - and whose judgment on such matters is roughly in line with everyone else's - into the "bad driver" category.
It's also more environmentally friendly: it takes more energy (and thus more exhaust) to get moving from a complete stop, more brake dust is created coming to a complete stop, etc.
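The size of that energy effect can be estimated from the kinetic-energy difference; the mass and speeds below are assumptions for illustration. Since kinetic energy goes with the square of speed, rolling through at 5 mph instead of stopping, then accelerating back to 30 mph, saves only a small fraction of the propulsion energy:

```python
# Rough estimate of the propulsion energy a rolling stop saves versus a
# full stop. Vehicle mass and speeds are assumed for illustration only.

MPH_TO_MS = 0.44704  # 1 mph in m/s

def accel_energy_kj(mass_kg: float, v_from_mph: float, v_to_mph: float) -> float:
    """Kinetic-energy difference 1/2 * m * (v2^2 - v1^2), in kilojoules."""
    v1 = v_from_mph * MPH_TO_MS
    v2 = v_to_mph * MPH_TO_MS
    return 0.5 * mass_kg * (v2 ** 2 - v1 ** 2) / 1000

mass = 1800  # kg, roughly a mid-size car (assumed)
from_full_stop = accel_energy_kj(mass, 0, 30)  # stopped, back up to 30 mph
from_rolling = accel_energy_kj(mass, 5, 30)    # rolled through at 5 mph
saving = 1 - from_rolling / from_full_stop     # fraction of energy saved

print(f"full stop: {from_full_stop:6.1f} kJ")
print(f"rolling:   {from_rolling:6.1f} kJ")
print(f"saving:    {saving:.1%}")
```

With these numbers the saving works out to (5/30)^2, about 2.8% per stop: a real effect, though a modest one at these speeds (the brake-dust point is a separate mechanism).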
> The law is at complete odds with normal human behavior
> that aren't based in any semblance of reality.
Dunno how true it is, but way back when I was in driving school, the instructor said the reason for coming to a complete stop was to accurately gauge the speed of cross/oncoming traffic - which is much harder to judge while you are in motion - in order to know if you can pull out or turn safely.
Indeed. Similarly, we may eventually need to contend with the fact that posted speed limits are, in most jurisdictions that I know, at least 5 to 10 MPH lower than the enforced limit. Virtually all human drivers drive some small amount over posted speed limits.
If we were to force all autonomous drive systems to always perform full stops and always obey the speed limit, the human drivers mixed among them will lose their minds.
At least so far, NHTSA has not said that Tesla and others need to ensure that in autonomous drive modes, the speed limit may not be exceeded.
The problem is that judging whether the circumstances are appropriate is nearly impossible from the limited perspective of the driver. You can't see things hidden by A/B/C pillars, you can't see short objects (like children) obstructed by your doors, and too often people using cars forget that things like kids on bikes, motorcycles, pedestrians, etc. exist.
This is indeed normal human behaviour, but we've also seen a dramatic increase in drivers killing people with their cars.
A lot of traffic laws are guidelines until you do something worse, or fit a cop's profile of "suspicious." In the same vein as the old "broken taillight" ruse (the trope is that you get pulled over for no reason and the cop breaks your taillight).
Roll through a stop sign during rush hour? in most places you're fine.
Roll through a stop sign late at night when bars close? Maybe you'll get pulled over for it and a sobriety check.
The laws are rooted in safe ideals, but are applied on a whim based on police bias.
Personally I think self-driving cars should err on the side of safe ideals (even if they're slightly less practical), and the laws should be applied to human drivers in a more uniform manner.
I've had some bad run-ins with police myself (pulled over and searched for having a mirror ornament, pulled over for doing 5 over the limit, etc)... and follow every asinine road rule because of it.
And there are circumstances where it's both reasonable and expected to pass a vehicle over a double yellow line (a slow-moving agricultural vehicle or a horse and buggy, for instance).
And there are definitely cases when it's preferable to closely cut a yellow light than risk a collision if you're being tailgated and don't have much faith in the person behind you stopping if you do.
At best, the law codifies best practices for the typical case.
At worst it's a tool used to selectively target some people over others. See, for example, the concept of "driving while Black".
> The feature, which appeared to violate state laws that require vehicles to come to a complete stop and required drivers to opt-in for what it dubbed "Assertive" mode, drew attention on social media and prompted NHTSA to raise questions with Tesla.
Emphasis mine; this apparently wasn't the default behavior. Though, it may not have been clear to users opting-in exactly what "assertive mode" changes about the system's behavior.
Tesla would vastly prefer a situation where the NHTSA allows it to do anything it wants, so it does just that, hoping to normalize this deviance from how the law intends highway safety to be regulated. By this action, NHTSA is trying out a more assertive mode itself.
It's not the default, you have to turn it on in a menu. It's arguably not even "shipped", as it's part of the must-qualify-in FSD beta program. And most beta users probably didn't know it existed.
It also engages only when approaching an empty intersection. Any obstacles/pedestrians or moving vehicles cause a full stop. To be perfectly honest: I turned it on when I saw it, but haven't seen it actually do it yet. And at this point I guess I never will.
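The gating just described reduces to a simple decision rule. This is a hypothetical sketch, not Tesla's actual implementation: opt-in mode enabled, intersection completely empty, and approach speed below a threshold (the recall reporting cited rolling stops at up to about 5.6 mph; that figure is assumed here):

```python
# Hypothetical sketch of the rolling-stop gating described above.
# Not Tesla's actual code; field names and the speed threshold are assumed.
from dataclasses import dataclass

ROLLING_STOP_MAX_MPH = 5.6  # threshold reported in recall coverage (assumed)

@dataclass
class IntersectionState:
    pedestrians_detected: bool
    obstacles_detected: bool
    moving_vehicles_detected: bool
    approach_speed_mph: float

def may_roll_stop(state: IntersectionState, assertive_mode: bool) -> bool:
    """Roll the stop only if the opt-in mode is on, the intersection is
    completely empty, and the car is already crawling; otherwise full stop."""
    if not assertive_mode:
        return False  # feature is opt-in, default is a full stop
    if (state.pedestrians_detected
            or state.obstacles_detected
            or state.moving_vehicles_detected):
        return False  # anything present forces a full stop
    return state.approach_speed_mph <= ROLLING_STOP_MAX_MPH
```

Under this sketch, any detected pedestrian, obstacle, or moving vehicle forces the full stop, matching the behavior described above; the recall effectively removes the one branch that returns `True`.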
I mean, speeding is equally illegal and inarguably more dangerous. Yet no one is upset that the car lets you speed.
"FSD beta" refers to the specific autonomy-in-all-circumstances product in testing. You have to request it, then prove you can win a game vs. the car's Safety Score feature for a few weeks or months, then wait to be upgraded. It's available to the public, but only in limited release.
"Full Self Driving" is the name of the vehicle option that you can purchase or license, which includes a bunch of different features (light/sign recognition, autonomous navigation on highways, lane changes, stuff like that).
> You have to request it, then prove you can win a game vs. the car's Safety Score feature for a few weeks or months, then wait to be upgraded. It's available to the public, but only in limited release.
Or have enough social media clout (in the appropriately Tesla-positive direction).
Are you referencing anything in particular? It's true that the first few hundred non-Tesla-employee installs went to a bunch of known fans and influencer types. But since September it's been a completely public thing with objective rules. They have 60k of these cars on the roads now per the linked article; it's absolutely not just a marketing thing.
I agree with you, for what it's worth. I'm not sure why this is at all controversial. Even if you argue that it's sometimes morally permissible to do a rolling stop, it's still baffling that Tesla would explicitly program its AI to perform illegal acts. Why open themselves up to criticism and scrutiny? Should they run red lights at empty intersections, too? Ignore speed limits in quiet residential areas? Maybe tailgate other drivers going under the speed limit?
You make it sound as if something, just by being legal, adheres to the platonic form of truth. There are so many dumb laws. For example, in a certain state I can't remember, it's illegal to have an ice cream in your back pocket on Sundays. There are many, many idiotic laws that one would do well to at least question, instead of shaming others or being "baffled" that others don't put up with them.
Illegal isn't binary - just look at speed limits. Everyone speeds at least a little, and FSD/AP had to be allowed to speed to be safe.
Illegal is binary for most driving laws (especially 'do what this sign says' rules), but some things fall into an ambiguous category of illegal-but-rarely-enforced. The problem is that computers don't do well with ambiguous rules. I strongly suspect that when most cars are using FSD, the rule of being allowed to drive a little over the speed limit will be removed, and cars will have to stick to the limits. Hopefully the limits will be raised.
It is not legal to speed to pass in most states. I've only found Wyoming, Idaho, Minnesota and Washington have such a law on the books, up to 10mph over the posted speed limit.
For sure in Colorado it's not legal to exceed the speed limit to pass. But it is illegal to drive below the speed limit while in the passing lane on a highway with a speed limit 65 mph or higher. There's exceptions for safety and congestion, but otherwise the left lane is considered a passing lane. If you're not passing, you're not supposed to be in that lane.
Only a couple of states have laws allowing speeding to pass, and the vast majority of states have an absolute speed limit rule disallowing speeding in any situation. A handful of states won't give you points for a speeding ticket <6 mph over, though.
Of course these are the laws, not the practice, which is what I think GP was trying to say.
For what it's worth, in Ohio I've never been stopped by a cop for going 5 miles above the speed limit on or off the interstate, despite doing so in the presence of cops many times. I got my only ticket for going 15 miles over, though.
I've gotten a ticket on US 24 (going Fort Wayne to Toledo) for doing 69 in a 65. I'd been through there probably nearing a hundred times by then (both family and work out that way) but have only been pulled over for it that once. That was one of the two tickets I've ever gotten; the other was also in Ohio, but that one was much more obvious - I missed the speed change on a normal road when I was younger and was doing 45 in a 35. The cop knocked that one down quite a bit, though; I can't remember what actually got put on the ticket.
Of all of the places I've been Chicago was probably the worst at speeding, especially in the dead of night when the roads are "too" open. Recently they got a bit stricter with the speed cameras though https://www.illinoispolicy.org/chicagos-speed-cameras-ticket...
It is? That is a surprise. In germany and most countries I drove, it was definitely illegal. Of course, it is common and it makes sense, but the law is that the speed limit is absolute.
Why does the autopilot need to speed to be safe? When does "speeding to be safe" become "unsafe speeding"? 10 miles faster than the one that you're trying to overtake? What if they're speeding by 10 already? Why is the speed limit not 10 higher than it is, if that's the actual safe speed? How can Tesla unilaterally decide that exceeding the speed limit is perfectly good and safe?
Because people passing you is very slightly more dangerous than people following you. It’s not a big deal most of the time, but when everyone passes you across thousands of hours it adds up to a significant risk.
Why would anyone want to pass you when you’re moving at the speed limit? You are at the speed limit and everyone passing you would be beyond the speed limit.
I honestly can't tell if you're trolling or not. They would want to pass you because human drivers aren't rigidly law abiding machines. Is there a large portion of people that go exactly the speed limit, or even lower? Sure. Is there also a large portion of people that speed virtually every moment they're behind the wheel? Yes, absolutely. And I wouldn't be surprised if that were the larger population in most areas. No one who actually drives with any regularity would ever be surprised that people are speeding to pass them.
Nobody is saying autopilot "needs" to speed to be safe at all times. But it needs the ability to go faster than the posted speed limit. I.e., if the speed limit is 45mph, going 47mph shouldn't be a problem that the car freaks out over. For a while, on city streets, if you were using Autopilot you'd be able to go up to 5mph over the posted limit without issue. I think in the FSD Beta you can go more.
The car already has a ruleset for when to not obey speed limits. If you're on the highway and your lane is going 65mph but the other lanes are moving very slowly, the car will slow down accordingly. Similarly, this could be implemented for the inverse to an extent.
And again, I don't think the car should speed by default, but a car going 2-3mph over the posted limit should be acceptable rather than an error state because the world is not black and white.
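The tolerance behavior these comments describe could be sketched roughly as follows. This is purely illustrative: the function name, thresholds, and interface are invented for this sketch, not Tesla's actual logic or values.

```python
def target_speed(posted_limit_mph: float, traffic_speed_mph: float,
                 tolerance_mph: float = 3.0) -> float:
    """Pick a cruise target that treats a small overage as acceptable
    rather than as an error state (thresholds are illustrative)."""
    # If surrounding traffic is slower than the limit, match it.
    if traffic_speed_mph < posted_limit_mph:
        return traffic_speed_mph
    # Otherwise allow a small, bounded overage instead of hard-capping.
    return min(traffic_speed_mph, posted_limit_mph + tolerance_mph)
```

Under this sketch, a 47mph flow in a 45mph zone is tracked as-is, while a 60mph flow is capped at 48mph, which is the "not black and white" behavior the comment argues for.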
You're like this close to the heart of the issue.
If you can determine in real-time whether the maneuver you're about to perform or the speed you're going is safe, then why even have speed limits? The speed limit for highways is still 65 whether it's bone dry, pitch dark, pouring rain, or completely iced over. "Any speed under 65" can't possibly be a safe speed for all these conditions while allowing for the highest safe speeds possible in ideal, or even average, conditions. And this doesn't even begin to take into account the huge vehicle variance and tire wear. The safe operating speeds for a top-heavy Honda Fit with narrow tires vs a low-to-the-ground, wide-tired Corvette are going to be wildly different.
And then you have to deal with other drivers. If traffic is going 75 you're gonna have a hell of a time merging capped at 65. And in an ideal world nobody would pass on the right making it possible to get off the highway without increasing speed but real life hits hard.
If you can create an all-knowing AI that can predict the road conditions around a corner or beyond the crest of a hill, maybe. Remember, we are not discussing individual decisions made by people based on a current situation, but the defaults encoded into the software of thousands of cars. And if that default is lax, it will result in lax behavior.
> Why does the autopilot need to speed to be safe?
For the same reason driving below average speed is dangerous. If you drive 10km/h slower than everyone else you are a problem, even if everyone else is driving at or slightly above the speed limit (very common in Germany).
You are arguing that everyone should be moving faster. But we want everyone moving at the speed limit - building cars that intentionally break the speed limit will make the effective speed creep up. It needs to creep down.
But we want the flow of traffic to be at the speed limit, that’s why there is a speed limit. So more cars need to go slower, not more cars need to go faster.
With respect to speed limits (and not stop signs, so this is explicitly a bit of a digression), it's also worth noting that sometimes you have to choose between "safe" and "legal" since most municipalities set speed limits which are not safe (the safest speed being the speed at which traffic naturally flows). So should a self-driving car (or human, for that matter) drive safely or legally?
That's...preposterous. There are a great many illegal acts that are so rarely enforced that the act is normalized, with enforcement surprising the culprits - speeding and rolling stops being the primary examples. Such are normalized because individual enforcement is practically impossible; it's when the behavior gets codified by a business as/in a product that gov't has a chance to crack down on it.
I would strongly discourage you from trying that line of argument with the cops when you're pulled over for speeding. I'm certain that a court of law with decent standards does not interpret speeding "at least a little" as anything except a binary.
The "argument" is tried against cops when you pass them while driving faster than the speed limit. In the vast majority of cases, if you're going less than 10 over, they will ignore you.
A thing is not legal just because law enforcement lets you get away with it, but I don't think this is a good example. It's silly to have this conversation without at least acknowledging the way driver behavior expands into "grey areas" and gaps in enforcement, and I don't think it's trivially obvious that "self driving" cars should rigidly follow the letter of the law even if that means they'll be the only cars on the road doing so.
It isn't a "recall" in common English. So, the title and story are wrong. Very disappointing on the part of Reuters. This is not quality journalism.
Since they wrote it the way they did, I would guess there is some regulatory document that causes this to be classified as a "recall" for legal purposes. But it's incorrect to substitute legal language for common English when there is a conflict between the two, except in legal contexts. That rule goes for technical language in general.
Is this practice more common in British English? I feel that it seems to appear more often in writing by British people, but that's anecdotal and could be incorrect.
It is literally a recall, though. A recall is a defined process with the NHTSA (and similar processes with other government safety agencies) to track vehicles that are somehow unsafe, and need some kind of service to be made safe again. There's a website that lists them: https://www.nhtsa.gov/recalls
That can be anything from "these airbags might not open when they're supposed to", like the Takata problem, or "if water splashed just right on the bottom of the open door, you would have water next to electrical wires which could maybe start a fire" (an issue my car had a few years ago).
Tesla met with the NHTSA and agreed to issue a recall. How is it possibly bad journalism to call it what it is?
What you are describing is a recall in a regulatory and legal sense, but not in a common English sense. In common English, a recall is when the product has to be returned to the manufacturer or taken to a dealer for a fix. In other words, "recall" implies physical movement of a product. I guess you won't agree with me on what "recall" means in common English, and that's fine. And we probably have a deeper disagreement on where words get their meanings from. But I would maintain that a software update is not a "recall." If it is, Microsoft performs a "recall" every time it issues a security patch for Windows.
The update is not the "recall". The update is the remediation. The "recall" is the notification that the product may contain a safety defect in its current configuration, that those products require diagnosis and, potentially, remediation before they are safe again, and that Tesla is legally required to make reasonable efforts to diagnose and remediate safety defects in those products for free [1]. To fix your analogy, a "recall" is more like Microsoft releasing a security advisory or notifying users of a security vulnerability.
As for semantic arguments, obviously the current colloquial usage of the term "recall" means to call for products in use to be removed from use and potentially remediated. However, the usage of the term in the article is the precise, legal, technical usage meaning what I stated above. This usage and definition by NHTSA predates the colloquial definition and was thus not confusing at the time it was defined; it is both precise and has been precisely used by NHTSA for the duration of its usage, so their usage of the term has not materially changed in the interim. Therefore, it is technically, semantically, and culturally correct to use the term "recall" in this specific instance even though the colloquial usage of the term has changed underneath them. This is in contrast with Tesla's usage of the term Autopilot, which does correspond with your concerns, as it was coined after the colloquial usage had already shifted, and limited effort was made to precisely define and inform potential stakeholders of any differences in terminology with respect to the colloquial usage.
> This usage and definition by NHTSA predates the colloquial definition
I don't think that's right, because countries could "recall" ambassadors, or parliament could be "recalled." Essentially, "recall" means "to bring back." If you aren't bringing something back, it isn't a recall. I don't have time to do the etymological research, so this could be wrong, but it isn't likely. (IIRC I used to be able to access the Oxford English Dictionary for free online, but I'm finding that now I can't.)
Sorry to hear that. This practice degrades the signal to noise ratio of the language and in my view is absolutely indefensible. It's possible for languages to evolve in ways that make them objectively better or worse; we should strongly resist the latter. If we are sloppy and lazy, we make the language sloppy.
For reference, there is specifically a flag/toggle called "California stop" (rolling stop) in the feature flag control panel for FSD / autopilot features. Screenshot[0] and full scroll-through on Greentheonly's twitter[1]. This panel is only available to Tesla employees, as no FSD beta tester had seen it before it was posted to Twitter.
That was back in December 2020, so I assume the 'chill/average/assertive' setting changes how it works, but it is true that Tesla intentionally allowed the cars to roll through a stop sign[2].
I wish all stop signs would be replaced with roundabouts, or mini roundabouts like those in the UK. Stop signs are wildly time inefficient, less gas efficient, and more dangerous. Not necessarily best practice, but I'm all for rolling stops in the meantime.
Roundabouts are awesome (for cars) when the intersection was designed with space for one. Retrofitting them into intersections often doesn't work. They do it a lot in my city in smaller residential intersections. They can be so tight that the turning radius of a mid-to-large-size car or SUV won't allow circumnavigation. People drive over the center, and still make left turns, because it is so hard to get around.
Where I am, they fix the "drive over the center" problem by making it a giant concrete barrier so you can't drive over the center unless you have a monster truck.
Paramedic here. Have been called to more than one MVA where your regular sedan has successfully "climbed" a 20" vertical concrete raised center of a roundabout. Energy and inertia is a powerful thing!
There is one in my city in a popular shopping center, so it gets heavy usage. In the center there are trees and rocks as decorations, designed to prevent anyone from driving over the center. Another city has a small (1-lane) roundabout whose center is protected with a chain "rope-like" barrier and a sign to drive right.
My friend crashed through the center of a recently retrofitted roundabout and went flying for a few meters. The policeman was very happy, since he had won a bet that it would take less than 3 days for the first car to go flying.
In that use case, the goal is simply a rule for determining right-of-way. Doesn't really matter if they drive over the center as long as they wait their turn to do so.
The issue is space to put roundabouts. The majority of American cities already have roads set up, and it would be difficult to put in roundabouts to replace existing infrastructure. A roundabout requires more space than a simple 2-lane-wide 4-way intersection.
And there is a driver problem: the majority of Americans have almost no experience driving on roundabouts. I've seen some simply cut off the roundabout traffic and proceed on their way without yielding.
I think most crossroads could accommodate a mini roundabout, maybe lopping off each corner a bit. Where I live, converting a crossroad to a roundabout seems to take about a year: they close and completely excavate the existing roadway, relocate storm drains, build a huge center island, and take an additional 10' of property all around. It seems excessive.
Many stop signs could be replaced by yield signs. The energy savings would be enormous. A full stop should only be needed when making a left hand turn or needed for sighting, and there are many stop signs in places where left turns are impossible.
There are issues with mini roundabouts, two I regularly see in my home town:
- If one 'entrance' to the roundabout is particularly busy, say at rush hour, it becomes near impossible to enter the roundabout from a different direction. With a normal roundabout, many more cars can be on the roundabout at once and a natural flow, even from less used entrances, just happens. This does not work on a mini roundabout.
- Where they have replaced a T junction, people often ignore it causing (near) accidents when going "straight on" when they don't have right of way.
They are definitely not a one size fits all solution.
I'd wish new construction aimed for that, but converting existing four-way stops to roundabouts would require widening of the road at all four corners which is impractical (i.e. you may have to knock down buildings, but certainly have to re-place sidewalks).
The data shows that roundabouts are safer and even improve traffic flow. But they're also a nightmare to install after the fact, which is why it is so uncommon. All we can do is stop making the same mistakes going forward.
Same thing with burying power lines: Far cheaper to do with new construction, substantially more reliable and better able to withstand natural disasters, but we aren't. Society is just bad at planning for tomorrow if it costs us a little today at every level.
I live in an area with roundabouts. The frequency of collisions actually seems higher because people do not know the rules and do not yield before entering the roundabout.
I live in an area with roundabouts. There are very few accidents in roundabouts, and it's impossible to get a license if you don't know the rules for a roundabout.
What the studies say is; "they can substantially reduce crashes that result in serious injury or death." That is not the same as a reduction in the total number of accidents. The number of accidents may actually be higher in a roundabout, but the severity is lower and they are more survivable.
In Iceland they have many roundabouts all over, and at most low residential intersections there's simply a yield for one of the intersecting roads. Having a yield instead of having to come to a stop is so refreshing.
You are also required to have your lights on when you are driving. It notably increases visibility even during the day.
I came back to the US really wishing those policies would go into effect nationwide.
Yes I would love to see the elimination of the four-way stop intersection. One road should be designated as the "arterial" road and should not stop. The crossing road should yield. There are very few cases where a four-way stop is justified.
They're almost the default now in my area of the US if at all plausible.
Having said that a lot of roads in my area are already built around 4 way stops and it's not easy to undo that, but new construction in my area of suburbia US has adopted the roundabout.
Then ask our legislatures to change the law. Don't let private companies make cars that disobey the laws developed by elected officials and those they appoint.
We have one roundabout in my small town, and there is strong hatred towards it. There's a perception that it's far more dangerous than any alternatives.
Recalls apply to OTA patches that have safety implications, because you can't always guarantee that the user is in a place where the OTA can be applied.
> The feature, which appeared to violate state laws that require vehicles to come to a complete stop and required drivers to opt-in for what it dubbed "Assertive" mode
I have the FSD beta on my Model X, and opted in to assertive not knowing that it enabled rolling stops. I enabled "assertive" in hopes of making the car less timid about unprotected turns.
BTW, the FSD beta is terrible. It's what I imagine a senior citizen taking their first drive after getting a learner's permit would be like. The worst part is the horribly timid behavior pulling into traffic or making unprotected left turns. It's also quite annoying on rural 2-lane roads, with phantom braking frequent enough to make passengers queasy and hence make me turn off FSD. It has no knowledge of potholes, so I frequently have to take over before it costs me a wheel and a tire by hitting a monster pothole.
It's frustrating that I paid $3000 for this 4+ years ago, waited 4 years for the feature, and was made to drive overly gently (no hard cornering or braking) for quite a while to get a good "safety score" before I could even try it. The $3000 would have been much better spent had I invested it in Tesla stock.
If this is where they are after 4 years, I don't hold out much hope.
When one of your main complaints about your self driving car is that it is too timid and doesn’t avoid potholes, I think things are progressing just fine. Just step back a little and think about where the technology was 5, 10 years ago. Heck even 20 years ago. Imagine telling someone on a forum “yeah my self driving car is fine but a little timid for me and ugh, potholes!”
Well, I have lots of others, but I didn't want to look too much like I'm ranting.
They include:
- Driving too close to parked cars for comfort on un-laned side streets when there is no oncoming traffic. I personally like to leave enough room to avoid somebody opening their door into me, if I have the room.
- Leaving way too much space in front of the car at stoplights. This is a problem when it leads to blocking the entrance to left turn lanes, etc.
- Totally blowing through stop signs in parking lots.
- Freaking out and making me take over on a mildly tricky interstate interchange where 2 lanes narrow to one.
And I'm sure there are a million others that I'm forgetting about.
I'm confused by this response. "Horribly timid behavior pulling into traffic" sounds like they're saying that when a car pulls into traffic it pulls in too slow, which puts other cars at risk of hitting the Tesla that's not properly getting up to speed. And when I read, "has no knowledge of potholes," I picture a Tesla slamming down into a giant pothole and damaging the car; needing to pull over to get it towed, fixed, etc.
Are things really "progressing just fine" if the car is repeatedly putting itself into dangerous situations?
Edit: I suppose that one could make the argument that we didn't even have this kind of tech on the roads ten years ago, and sure, that's progress in one sense. But I wouldn't call it "just fine" by a long shot, nor would I dismiss these concerns as whininess.
The point is that you can't rely on FSD to fully self-drive. Somewhat decent doesn't cut it. In particular, if it can't avoid potholes, it's not safe to let it drive on the road.
> The $3000 would have been much better off had I invested it in Tesla stock.
It'd be ~$40,000 >.<
Could be worse. If the people who put a $50,000 down payment on a Roadster 2.0 the day they announced them had bought $TSLA instead, they would have enough money now to pay cash for TWO Roadsters and have enough money left over to add a Model S Plaid.
What version of FSD beta are you on? After 10.6 most of my complaints were addressed and I am very happy using it to get to work every day, although I live in a suburban / semi-urban area and have never used it on rural roads. I have an original model 3. Pretty much any left-handed turn is protected with a stoplight around me, although it negotiates unprotected ones well for the ones I’ve faced under 35mph
We have solid infrastructure so can’t speak to the potholes complaint. A few months back FSD beta started acknowledging speed bumps and taking them slowly so I’d imagine potholes are in the works.
Note that rolling stops can seem safe but in some cases there are surprising circumstances where intersections can leave you blind to other road users. https://www.wired.com/story/the-physics-of-the-69-degree-int... for instance discusses an intersection where the angle caused a person-size blind spot behind a car's pillar.
Good thing cameras mounted under the windshield glass are not obstructed by pillars.
Direct sun glare can be an issue though. I have yet to see how Tesla plans to solve this, although having five different forward facing cameras surely helps.
Bartleby the Tesla at a stop sign: I'd prefer not.
My most Luddite view is that AVs are not worth it right now. Maybe in 10 years, but I see too many idiots on the freeway sleeping or playing video games in Teslas while in the driver seat. If you're too busy to command a vehicle that can kill you and those around you, Uber to work or take the train. I don't trust the roadways to be a massive experiment in how close a corporation can cut it on self-driving tech with some acceptable margin of death. Sure, human drivers are probably worse in normal circumstances. But I'd happily outlaw AVs in any peculiar circumstance till 99.9999% of the kinks are worked out away from other drivers who haven't opted into the grand experiment of robots on the road.
That's an interesting challenge for autonomous vehicles though... On one hand, the idea of programming your self-driving system to violate the law seems baffling. On the other hand, there are unwritten local rules everywhere. I suspect that even though rolling stops are illegal in California, the traffic density requires drivers to adopt a more aggressive stance if they want to move effectively, and a rolling stop helps that, especially when multiple cars are stopped in line waiting at a stop sign.
If most people adopt that unsafe, unlawful behaviour because it helps reduce congestion, what are you supposed to do as a car maker if respecting the law will make other road users road-rage against your client using your self-driving system?
I guess we're headed towards yet another slice of future dystopia - where once we saw the opportunity for clean and efficient automated driving by the time it's here it'll have been so heavily influenced by current bad driving habits that they'll have to keep the bad traditions up, if not to be defensive against people but instead to be defensive around other older AI drivers that are still assuming bad habit.
I don't think we should program-in unsafe driving behaviours to accommodate road-ragers. As those self-driving features get more main-stream, this will just become more normal. And if road-ragers are still around, they should get fined.
If the company accepts the liability, I can't be charged with a driving crime while the car is out of my control, and it is statistically safer to let the car drive itself, then I am on board. But the courts haven't cleared up the liability problem, and there are safety concerns with large white trucks being detected, so driving your own car seems like the right choice for 2022.
At the moment, Tesla's liability position is "heads I win, tails you lose," since you're supposed to be hands-on-wheel and alert at all times while using FSD. ¯\_(ツ)_/¯
Just throwing in my experience here. Come down to Southern California. The rolling stop is everywhere. Not so much in SF Bay Area where I grew up. People get irritated if you drive the speed limit and make full stops. Especially at 4 way stop signs, I often feel like people are confused by a full stop, lol.
I am also an FSD beta user, and I have my car set to standard assertiveness and it does not roll through stops.
And, one last comment on the vision of the car - it's a whole different beast with the new FSD beta code. The highway autopilot code is not safe on the streets, and until you see the FSD beta code at work, it's easy to think that it will perform poorly on local roads. Yes, it can do stupid things and cause an accident, but not in the way the autopilot code will. It's different.
Personally I am curious how this is done in the AI. Do they simply flip a switch that says “if you see stop sign make a full stop,” or do they now have to update and retrain the neural net to acquire this behavior. Does anyone know?
There is a mix of a "neural net planner" and "explicit planning and control" in traditional code[1]. The explicit planner has the last word, and uses input from both the "vector space" and the neural net planner.
Karpathy has commented about the neural nets gradually replacing the traditional code. See his presentation[2] around 18:45.
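Based on the public descriptions above, the hybrid arrangement might look roughly like this. A sketch only: every name and interface here is invented for illustration, not Tesla's actual code, and the "neural planner" is a trivial stand-in.

```python
from dataclasses import dataclass

@dataclass
class Trajectory:
    speed_mph: float
    stops_at_sign: bool

def neural_planner(scene: dict) -> Trajectory:
    # Stand-in for a learned proposal; a real net would consume sensor data.
    return Trajectory(speed_mph=scene.get("suggested_speed", 25.0),
                      stops_at_sign=False)

def explicit_planner(scene: dict, proposal: Trajectory) -> Trajectory:
    """Hand-written rules get the last word over the learned proposal."""
    if scene.get("stop_sign") and not proposal.stops_at_sign:
        # Override: enforce a full stop regardless of the net's suggestion.
        return Trajectory(speed_mph=0.0, stops_at_sign=True)
    return proposal

scene = {"stop_sign": True, "suggested_speed": 8.0}
final = explicit_planner(scene, neural_planner(scene))
```

In this structure, ending the rolling-stop behavior could be a one-line rule change in the explicit planner rather than a retrain of the net, which may be why the fix shipped as a quick software update.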
So they're recalling a feature that was explicitly and intentionally built to do something that runs afoul of the law, at the user's behest. Again this is by design, meeting their own requirements. For their self driving technology that is already labeled "beta".
Why anyone in their right mind would willingly use this stuff in a risk-intensive life/death scenario like driving on public roads remains beyond me.
(Yes I lose points every time I criticize Tesla online but I will keep doing it until someone makes it make sense or the company finally goes out of business for continuing this type of irresponsible behavior.)
Rolling stops mimic human behavior. Going 5-10 MPH over the speed limit also mimics human behavior, and both of these settings are controllable by the human driver behind the wheel in FSD Beta[0]. At what point does the Beta (or Ford's BlueCruise or GM SuperCruise) force you to go exactly the speed limit?
Sounds almost as if "fully self-driving" cars coexisting with human-driven cars is not feasible to do safely, because it's not purely a technical problem. My contention is simply that Tesla is acting irresponsibly so long as they continue to design and market otherwise.
Self-driving cars need to bend the rules enough to be attractive to a critical mass of purchasers. Then regulations will require strict adherence to laws. Then owners of these cars will be numerous enough to push for regulatory reform.
If all self-driving or driving assisted cars were limited to exactly the speed limit and following the letter of laws that no human follows, they will never achieve mass adoption.
FWIW, cruise control has always let you set any reasonable speed. Doesn't quite seem right to hold a car to a different standard just because it's advanced enough to sometimes know the actual speed limit.
The problem with the assertive setting is that it didn't really work. The car would still do a rolling stop even in chill. I think in those cases it might have been creeping for visibility, but often the creep is a full-blown drive into the intersection.
They really need to add some explicit controls over this, like Navigate on Autopilot has. I've had to disengage so many times due to it making really dumb moves that could be fixed by just asking me first. I know that's against their end goal, but their models just are not there yet.
It seems like the mission statement at Tesla is "let's see what we can get away with." I've never seen a company with so much to lose as fearless of regulators or customers.
This is one of the really hard issues of automation: humans normalize disobedience to the point that strict obedience ranges from irritating to dangerous; the gap is not an engineering problem.
It's not a matter of software reliability. The rolling stop was an intentionally added feature to make the car behave more naturally/human-like at stop signs. Tesla knows how to make the car stop at stop signs; they simply chose not to under certain circumstances. You can certainly argue that it was a bad decision on Tesla's part, but using it as a signal for software quality is ridiculous.
Personally I think the removal of the rolling stop behavior is a minor tragedy.
If software engineers at Tesla chose to break the law on purpose, which is what seems to have happened, then I believe they should be criminally charged.
I write this as a massive proponent of self-driving vehicles. Musk has himself said that the self-driving has to be proven to be orders of magnitude safer than humans.
Given that, the last thing you want is a bunch of accidents delaying the mass-adoption of this tech. So, the fact that they shipped this is mind-boggling. I wonder who made the call on this.
So it was a software feature in FSD that, when encountering a stop sign, did not stop because it was unable to read those signs? That's much worse than the time the FSD system was confusing the moon with a traffic light and slowing down on the highway.
No wonder it is eligible for the nickname of 'Fools Self Driving'.
When I moved to San Francisco from Europe I had to take a driving test to get a US license. At every single intersection I did a rolling stop, and got something like 15 minor "errors" on my score sheet. However, since I had no other errors, I got the license. Perhaps the FSD beta would also pass this test.
Does the Tesla know not to turn right on red in Europe, or is FSD not yet available there? Also, in Switzerland for example, a driver is required to stop at a crosswalk if a person is standing to cross or about to cross. This is not the case in Italy.
'Recall' is not the correct word, even if it is officially what NHTSA uses. This is an 'option' in a 'beta' version that applies below 5mph. The recall is just an OTA update in a few days.
It will be hilarious if FSD vehicles (which are programmed to follow the letter of the law) get halted by the 80% of humans committing rolling stops, and are never allowed to cross the street.
The tension between "Follow the Law" and "Drive like a human" is interesting.
On the one hand, humans violate the law all the time: Rolling Stops, Speed Limits, Driving without Registration / Insurance, Passing on the Right, Left Lane Camping, Carpool lane violations, Stopping after the white line at a stop sign / light, not using turn signals properly, following too closely.
Any FSD System that drives on the freeway and follows the speed limit is doomed to failure. Likewise, as evidenced by the crazy amount of "right on red" fines in cities, humans don't come to a full 3-second stop.
"Drive like a human" seems a reasonable goal for now.
One thing I don’t recall seeing discussed before: why have 4-way stops at all?
Far, far better to have clearly delineated major and minor roads at intersections. You yield when going from minor to major - no confusion, less temptation to “jump the queue” (although some drivers will still try).
The main downside is that it can take a long time to get onto the main road safely when it’s busy. But traffic lights fix that.
I don't know much about the tech behind Tesla, but as an engineer I'd never trust a "self-driving" car without lidar. This has nothing to do with the stop sign issue, but the absence of lidar gives me the sense that Tesla is trying to cut corners to get to something that's merely good enough. Waymo and Cruise seem to be way ahead despite Musk's promises.
This is BS. Most human drivers do rolling stops. If you didn't want your Tesla to do a rolling stop, you had the option to set your FSD beta profile to a more conservative setting.
Full stops, when no cross traffic is present, waste time and energy.
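The energy part of that claim is easy to sanity-check with a back-of-envelope sketch. All figures here are assumptions (a ~2,000 kg car, no regen, no drivetrain losses, no rolling resistance):

```python
# Back-of-envelope: energy needed to regain cruising speed after a
# full stop vs a 5 mph rolling stop, using KE = 1/2 * m * v^2.
MASS_KG = 2000            # assumed vehicle mass
MPH_TO_MS = 0.44704       # miles-per-hour -> metres-per-second

def kinetic_energy_j(speed_mph):
    v = speed_mph * MPH_TO_MS
    return 0.5 * MASS_KG * v * v

cruise = kinetic_energy_j(25)   # energy at a 25 mph cruise
roll = kinetic_energy_j(5)      # energy retained by rolling at 5 mph

print(f"Regain from 0 mph: {cruise:,.0f} J")
print(f"Regain from 5 mph: {cruise - roll:,.0f} J")
print(f"Saved per stop:    {roll:,.0f} J ({roll / cruise:.0%})")
```

Since energy scales with speed squared, rolling at 5 mph only preserves (5/25)^2 = 4% of the cruise energy, so the savings per stop are real but small; the time saved is arguably the bigger factor.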
Teslas have fixes all the time, unlike other cars which just stay broken. Try putting any car on cruise control on a street with stop signs and see what happens. It will be way worse than what any Tesla does.
“But those cars don’t claim self driving.”
Neither does Tesla. The capability is still under development.
Sure, but you're talking to someone who doesn't believe in "true" FSD inside the next 10 years. I suspect Tesla will be forced to stick with an ever improving FSD-lite that still needs to be monitored. Which (in my own use case) is actually less than pointless ... you lose the joy of actual driving, to become a supervisor of your car.
There is still value in lane assist on motorways and cruise control, but we have that anyway.
The difference is that all the other car manufacturers haven't been selling FSD and robotaxis as "right around the corner" for the last few years. I understand the development problems and overambitious deadlines ... but if you took away the ebullient narrative and put just the facts on paper, I wouldn't be able to distinguish what's gone on from fraud.
FSD might be a complete joke at this point. Tesla should open-source all of their data, code, and documents. Clearly neither Tesla nor any of these vulture-capital-funded companies can get the job done.
What is it with this headline? Tesla is deploying new firmware OTA. No vehicles are being recalled. If deploying new software is a "recall" then I do recalls dozens of times a week.
Are they recalling the software version for an update to fix a safety issue? If so, then it is a recall. You can't play semantics with words like that for these issues.
>A recall is issued when a manufacturer or NHTSA determines that a vehicle, equipment, car seat, or tire creates an unreasonable safety risk or fails to meet minimum safety standards. Manufacturers are required to fix the problem by repairing it, replacing it, offering a refund, or in rare cases repurchasing the vehicle.
So basically, regardless of whether the fix is a software or hardware update, any issue a car has that "creates an unreasonable safety risk or fails to meet minimum safety standards" falls underneath the "recall" banner. I can see the benefit here; a recall generally gets a certain level of publicity that a "software update" might otherwise not. It might not be a bad idea for people crossing at stop signs to think, "Hmm, Tesla approaching, let's exercise just a bit more caution", until this is resolved.
> Tesla will perform an over-the-air software update that disables the "rolling stop" functionality, NHTSA said.
I was perplexed by the wording as well. Apparently software updates can be labeled recalls now? Did someone inform Microsoft, maybe this would help Windows 11 adoption.
"Recall" is the only term available to express a software update via the NHTSA's public notification process. It's just the word we have. And yes, it implies things that are salaciously untrue, which is one of the reasons it finds its way into headlines like this.
That incident is a tragedy - but seems like it is completely unrelated to this matter. It was a speeding driver in a van, who sped through a stop sign & crossing walk and killed a child.
"Police said the van came to a complete stop and proceeded through a stop sign when Hart was unable to stop her bicycle and entered the intersection into the path of the moving vehicle."
The accident you cite has pretty much nothing to do with the debate here since the stop wasn't actually rolled.
I want the government to change the rules and make Tesla responsible for all accidents caused by FSD. Then let's see how long until that feature no longer exists and Tesla has to admit that what they have been peddling is vaporware.
Laws can be dumb: if there are no other cars and you roll through a stop sign, why is that unsafe? This is also called a California stop. People in California have places to be, and it's a big place, so the effects compound.
As soon as you make the rolling stop legal people will push the limits of that law and drive straight through if they think the way is clear.
The numbers of drivers driving through reds (at least here in coastal San Diego) has noticeably increased over the past two years, previously people would hit the gas if the lights turned yellow, now they are hitting the gas if they see yellow (and crossing through the intersection on red). Accident statistics seem to back that up https://www.cnn.com/2019/08/29/us/red-light-deaths-trnd/inde...
Making traffic rules less strict seems like a recipe for more accidents and deaths.
Traffic lights are a completely separate issue. The cars are traveling much faster. At a stop sign you will have much more time to react since you're going 5-7mph max.
I don't think there are any federal laws about traffic signs. California's laws require you to come to a complete stop at stop signs. Programming a computer to intentionally break the law "because I have places to go" seems like poor judgement to me.
It would be fantastic if one day, with a high degree of certainty, self driving cars were legally allowed to run stop signs if they deemed it safe.
There must be some fascinating cost/benefit analysis here, looking at the waste that comes from braking and starting again vs. a small chance of causing an accident.
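A toy version of that cost/benefit calculation might look like the following. Every number here is a made-up placeholder; the point is only the shape of the expected-cost comparison:

```python
# Toy expected-cost comparison: full stop vs rolling stop at an
# empty intersection. All constants are invented placeholders.
FULL_STOP_COST = 0.02    # $ of energy + time wasted per full stop
ROLLING_COST = 0.005     # $ per rolling stop
P_ACCIDENT = 1e-7        # extra accident probability per rolling stop
ACCIDENT_COST = 50_000   # $ expected cost if an accident occurs

def expected_cost(rolling):
    if rolling:
        return ROLLING_COST + P_ACCIDENT * ACCIDENT_COST
    return FULL_STOP_COST

print(expected_cost(rolling=True), expected_cost(rolling=False))
```

With these invented numbers the rolling stop comes out cheaper, but the conclusion flips entirely on the accident probability and cost, which is exactly why real fleet data would make this analysis interesting.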
Think about all the cool ways you could alter urban design or traffic law with self driving vehicle data.
This sort of conflict highlights the difference between the traffic laws (which are often rooted in revenue generation instead of safety) and the way people really drive. Speed limits have the same issue. Some governments (such as Germany) have roads with no speed limit. Why don't we have those everywhere?
In the end, it comes down to personal responsibility. Apparently the people in Germany drive more responsibly than everywhere else?
Click-bait title. Tesla was forced to remove the rolling-stop option that applied when the driver was using the beta version of FSD in "Assertive" mode. No one has reported any injuries or problems with this feature. Seems to me having this option is good: the behavior is expected in many locations, saves time and energy, and is often safer than coming to a full stop. Cops should feel free to ticket these cars when they do this if they think it's not safe, but NHTSA shouldn't be involved in this very important emerging technology at this low a level.
"Tesla said as of Jan. 27 it was not aware of any warranty claims, crashes, injuries or fatalities related to the recall." It is good for automatic cars to behave similar to humans for safety and social acceptance.
For some context: California allows “rolling stops” in which the driver is not required to come completely to 0mph (fully stopped) at a stop sign. Most US states require a complete stop at stop signs.
Strangely, Tesla exported this rolling stop behavior everywhere, despite it being illegal in many (maybe most) states.
It’s strange that this was allowed to get so far. This violation should have been obvious.
Unless something changed in the 30 years since I learned to drive (in California), the "California Rolling Stop" is a pejorative and is not at all legal. In California, and every other US state, a stop sign means come to a full and complete stop.
>For some context: California allows “rolling stops”...
False. I present to you California Vehicle Code #22450:
>The driver of any vehicle approaching a stop sign at the entrance to, or within, an intersection shall stop at a limit line, if marked, otherwise before entering the crosswalk on the near side of the intersection.
California isn’t OK with it. Police routinely ticket vehicles which perform a rolling stop. The police aren’t everywhere, so you could get away with it for a while, but eventually you’ll probably do it in front of a police car and be cited.
Because, based on my experience, most people don't yield at yield signs in CA, and prefer to try to outgun whoever is not letting them in rather than back off. Sounds to me like a good way to increase the number of accidents at 4-way stops.
It is NOT legal to do a rolling stop in California. Like everywhere else in the nation, you must (legally speaking) come to a complete stop at sign/light. Failure to do so will result in a ticket (if you're caught).
I'm not sure why it's rational or desirable to let companies violate the law until they get a letter that says "the law applies to you too". Let's fine Tesla for rolling a stop sign times a large multiple based on their data of how often it happened.
We can use the national average fine or use the GPS to match it to the proper jurisdiction. I don't care which.
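A sketch of that proposal follows. The jurisdiction codes, fine amounts, and counts are all invented placeholders; the punitive multiple comes from the comment above:

```python
# Hypothetical fine calculator: charge each rolled stop at the local
# jurisdiction's penalty (falling back to a national average), times
# a punitive multiple. All dollar amounts are invented placeholders.
NATIONAL_AVG_FINE = 238                     # $ fallback per violation
FINES = {"CA": 238, "NY": 150, "TX": 200}   # $ per jurisdiction
MULTIPLIER = 10                             # punitive multiple

def total_fine(rolled_stops_by_state):
    """Sum fines over {state_code: violation_count}."""
    total = 0
    for state, count in rolled_stops_by_state.items():
        total += FINES.get(state, NATIONAL_AVG_FINE) * count * MULTIPLIER
    return total

# "WA" is absent from FINES, so it falls back to the national average.
print(total_fine({"CA": 3, "WA": 1}))
```

The GPS-matching part is the hard bit in practice (map data, overlapping municipal vs. state rules); the arithmetic itself is trivial once each violation is tagged with a jurisdiction.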
Rolling stops are illegal in California and you will get ticketed if the cops catch you doing it, even if it was perfectly safe.
This seems to be an example of Tesla “pushing the boundaries” in a way that puts their customers at risk. I’m all for small-l libertarianism, but they could spend effort where it actually mattered instead of this childish petulance.
>How can Tesla claim self driving if the car can’t read a sign that says - speed limit 25 mph during school hours, and properly adjust? Humans just look around to determine if school is likely in session by the number of cars in the parking lot during normal school hours, or they know the school calendar.
>How does a self driving car make that determination? Query the school district website for the school, identifying their bell schedule and tacking on a buffer ahead and behind? Assume a school schedule that’s M-F? What if it’s a religious school that operates Sun-Thursday? Now the car has to determine which religious sects obey which calendar? Is it different in each country?
>Just another example of a massive hurdle self driving cars have……
And another recall that should be issued.