Tesla owner who drove with 'arms folded' cleared of dangerous driving (irishtimes.com)
25 points by dmulholl on Oct 27, 2023 | hide | past | favorite | 65 comments


> Asked if the unexpected happened, would he be able to react as quickly as a person with their hands on the steering wheel, he said “I would be unsure of that.”

> Judge Hughes asked if he accepted it would be better to have two hands on the wheel to take control of the vehicle, and he answered: “Probably yes”.

If these were candid responses, rather than coached by a lawyer, he sounds like a real engineer.

And the interaction with the judge sounds civilized and good faith, in both directions.

(Not that I approve of Tesla's autonomous driving tech when other people's lives are at risk, nor with having hands off wheel even if somehow monitoring diligently; just commenting on this judicial interaction.)


They sound like honest answers. The question is more whether it is dangerous enough to constitute actual negligence. I'd argue that self-driving cars are statistically safe enough that having your arms crossed doesn't really constitute a risk.


I saw a Tesla driving down I-5 yesterday and the person in the driver's seat was wearing headphones and holding their phone in a way that made it seem like they were playing video games (landscape mode, both thumbs hovering). In the passing lane. Barely doing the speed limit. Simply unacceptable on this highway where the right lane typically travels 0-5mph below the speed limit and the left lane travels 5-15mph over.

We locked eyes and you could tell they were embarrassed but I highly doubt they stopped. It looked like actual tech/screen addiction to me. I think they were just worried I was a cop, but when they saw just a normal car they went right back to it.

In some ways we can't blame Tesla for this kind of thing. But it's hard to believe they're doing anything to discourage or mitigate antisocial behaviors.

At the very least Tesla should have a "merge right when appropriate" priority in their Autopilot offering. It seems drivers all over the Bay Area have forgotten this basic traffic-mitigating rule too. Many transplants seem oblivious to why people passing on the right are so much more aggressive than those passing on the left.


> At the very least Tesla should have a "merge right when appropriate" priority in their Autopilot offering. It seems drivers all over the Bay Area have forgotten this basic traffic-mitigating rule too.

You expect this from a company that writes software that will intentionally roll through stop signs?!? Or that lets you set parameters on how much to exceed the speed limit by?


I love the European style of keeping right, but blaming Tesla for not following it in California, where doing the opposite is standard, seems like going too far. I think Tesla is reasonably trying to match local conditions, and why shouldn't they? Making California drivers not pace each other on two-lane roads and block all traffic, or making them cruise in the right lane rather than the left when they have the option, is a tough mission to add on top of creating FSD.


> is a tough mission to add on top of creating FSD

To be blunt, "sucks to be them". They want to make billions from the technology; there are going to be some tough problems along the way. That doesn't make it incumbent on anyone else to allow, excuse, or otherwise give them a pass.


So you're saying that if anyone wants to interact with FSD & American drivers, you're requiring them to also solve this problem?

How far would you go here, balancing this added feature against the benefit of making FSD release sooner and lowering traffic accidents? What's your exchange rate in time - there are ~100 fatalities nationally per day - would delaying FSD by 10 days to fix this annoying behavior be worth it? What if the delay were longer? How long would you wait? Or what if the delay were unknown - would you assert you're sure it's short, and ram it through? And if you were wrong, and including this behavior caused a long delay in FSD - would you feel bad then?
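
To make that exchange rate concrete, here's a back-of-envelope sketch - the ~100/day figure is from above; the fatality-reduction fraction and the delay are pure placeholder assumptions:

    # Back-of-envelope only: ~100/day is the figure cited above; the
    # reduction fraction and delay are assumed placeholders.
    daily_fatalities = 100      # approx. US road deaths per day
    assumed_reduction = 0.5     # ASSUMPTION: fraction of deaths FSD prevents
    delay_days = 10             # hypothetical delay to add the feature

    lives_cost_of_delay = daily_fatalities * assumed_reduction * delay_days
    print(lives_cost_of_delay)  # 500.0 statistical lives, under these guesses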

Or are you certain that they're just being obtuse by not including it? Or do you have another theory for why Tesla isn't bundling this specific software behavior with their efforts to create superhuman-safety FSD?


    # pseudocode: merge right whenever it's safe and won't slow the trip
    if isSafeToMergeRight and not rightLaneTrafficPaceAddsTimeToEstimate:
        mergeRight()
So complicated... the hard part is the first test but that's expected to be part of FSD already. And the second test is a simple state estimation procedure on top of the relatively harder detect-cars function, which also should be part of FSD already. So I'm not sure what you think this suggestion adds in complexity.
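
A rough sketch of that second test, assuming FSD already exposes detected cars with lane and speed estimates (all names here are hypothetical, not Tesla's actual API):

    from collections import namedtuple

    # Hypothetical stand-in for FSD's detect-cars output.
    Car = namedtuple("Car", ["lane", "speed"])  # speed in m/s

    def right_lane_pace_adds_time(cars, my_speed, margin=1.0):
        """True if merging right would slow the estimated trip time."""
        right_speeds = [c.speed for c in cars if c.lane == "right"]
        if not right_speeds:
            return False          # an empty right lane costs nothing
        # Conservative: compare against the slowest detected car ahead.
        return min(right_speeds) < my_speed - margin

    # e.g. cruising at 30 m/s with a 26 m/s truck in the right lane:
    print(right_lane_pace_adds_time([Car("right", 26.0)], 30.0))  # True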


I would rather let people who are experienced in low-cost manufacturing of cars, batteries, rocket engines, etc. decide which features would add complexity for them, rather than decide from the outside what I'd like to force them to do as a prerequisite to tackling the number one cause of death for large age ranges of Americans.

Would you delay solving this problem in favor of improving the annoying way people drive on the left in California? I hate it too but if we can get FSD soon, it'd save a lot of people's lives.


> in California where doing the opposite is standard

This wasn't true ten years ago.


In California, Tesla is not allowed to roll through stops at the present time.

Source: I have a Model Y and regularly use FSD on roads in CA with stop signs, and I observe it coming to a full stop.


FSD does have a merge-right rule, but it can be turned off. The sort of person who'd do what you describe sounds like someone who would turn it off too. :facepalm

Of course, it could also have been one of the cases where the car just navigated poorly and thinks it needs to be in the left lane for some reason. The vehicle's lane selection is not great, IMO.


Autopilot doesn't, but FSD will change lanes to go better with the flow of traffic. (And does a semi-decent job at it, in my experience.)


A recipe for a road-rage incident where I live - they must be in laid-back California.


Waymo claims to already have a better record than human drivers. How long before we read a similar case where a driver is accused of creating unsafe conditions by taking control of the car?


A while. There's only one carmaker, selling one car to the general public right now, that's taking legal liability for crashes while its system is active.


That Mercedes-Benz system is a glorified adaptive cruise control and has several limitations that Tesla autopilot does not have.

> There must be a vehicle in front of your car, reasonable road conditions with readable markings and lines, and clear weather and light conditions. Drive Pilot can’t be used at night or in the rain, and the headlights and wipers must be set to auto for it to work.

> It’s also only available on freeways that have been mapped by Mercedes, with GPS positioning that is precise to the centimeter and even accounts for continental drift. Drive Pilot can’t be used in construction zones.


It is just very fancy cruise control, but it's still very significant to the whole "will a person ever get in trouble for taking over from the automated system" discussion that they're the only ones taking legal responsibility while the system is active. Everyone else, including Tesla, puts all the legal weight on you to stay constantly alert and take over for the system.


It'd be interesting to read the legalese; one article states that it's handled on a case-by-case basis. Since there are so many limitations, how will it deal with, say, there no longer being a car in front of you?


> they're the only ones taking legal responsibility while the system is active

They are the only ones using the consequences of general principles of product liability as a selling point, might be more accurate.


Don’t worry, Mercedes will also put the legal weight on you by disabling the system well in advance of any potentially risky segments of road. So their promise is essentially PR fakery. Don’t fall for it.


As long as they're not cutting out moments before a crash to shift liability at the last second, I don't see the problem. Disabling the system early when it's not sure it can operate safely is way better than not doing that and plowing into the side of a truck because the driver enabled it on a road with unsignaled cross traffic. (When the car could know from map data that it's on a road it's not well equipped to handle, mind you.)


AP will disengage or work poorly in poor weather too.

Mercedes is actually being cautious about slowly rolling things out, not just yeeting it over the fence like Tesla.

Little things like:

> Mercedes shows us an EQS with an important visual indicator that is not yet legal: in the hopefully very near future, when Drive Pilot is active, the headlights, taillights, and side mirrors will have turquoise marking lights so other drivers know the Level 3 system is in use.

and

> Drive Pilot also has redundant braking and steering actuators and a separate onboard electrical system just in case one of them fails.

are good things.

And it drives better than Autopilot, despite being "glorified cruise control":

> It does feel more precise and accurate in how the car stays in its lane and reacts to surrounding traffic, with fewer jerky movements and constant tiny adjustments. Drive Pilot even reacts to larger trucks or motorcycles that are lane splitting by moving slightly over in the lane without crossing the lines, and not only does it work in carpool lanes but also it can tell the difference between the carpool and FasTrak lanes and will let the driver decide which to use.

This doesn't sound like Tesla at all:

> and there’s not a single moment where Drive Pilot makes a mistake or reacts badly to a situation, even when surrounded by nightmarish LA drivers.

Some of us applaud these kinds of things.

Remember the whole phantom braking thing a while back? And then it disappeared with an update, but another phantom braking problem arose, so then they reverted that update, and then released another update that fixed those two problems...

That's the safety culture: a component manufacturer whose firmware took around 36 hours to work through the full test suite released new firmware to Tesla in response to issues, got an email from Tesla 3 or 4 hours later saying "Thanks, this is working so much better", and when they responded with confusion about the test suite, etc., were told: "Oh, we just flashed it onto one of our cars here and took it out on the road and tried it".


Except that system turns itself off on the slightest hair trigger if it gets any inkling that another car might be present within ten miles.


I haven't seen any video of it actually in use or reviews. Have a link showing that?


No.


I guess until all autonomous vehicles are fully equipped with real sensors (likely both lidar and radar) so that cars can almost mechanically stop before foreseeable collisions without worrying about AI hiccups. This is technically possible today, but it's still far too expensive.


Why should lidar be a requirement when it's not a requirement for humans to be able to drive? The goal is to reduce traffic fatalities, and humans are not reliable. If you say we can't have automation until it's 100% perfect, then you are actively choosing to have more traffic fatalities.


This is a bad argument, because humans use two eyes that move constantly in unison, have massively higher dynamic range than any digital camera, and can accurately measure distance in a fraction of a second by changing their focus distance. A webcam really can't replace human sight; if it were two very high-speed cameras that moved like human eyes, then your argument would stand, but it is not!


Because of worrying about AI hiccups.

99.99% of daytime highway dashcam frames don't include a firetruck stopped in the lane, dealing with an accident.

But let's say, hypothetically, that a self-driving car accelerated into a clearly visible, stationary firetruck in clear daylight. Despite being 99.99% perfect.

Legislators might feel that, in order to be an above-average driver, the vehicle shouldn't hit any clearly visible stationary objects in clear daylight, because the median human driver doesn't do that.


>when that's not a requirement for humans to be able to drive?

If it gets cheap enough, it will be.

But the AI isn't human, and things a human would recognize - like a large grey truck up ahead doing a turn on an overcast day - an AI can just miss.

> if you say we can't have automation until its 100% perfect,

That's why we need the lidar.


Depends - is Tesla currently as good as Waymo? I don't think the OP was arguing for perfection; I think they were recognizing the current state of everything. Our ability to make AI as good as a human with sensors as limited as a human's is not great, currently. Throw in more sensors, though, and you can offset deficiencies in AI detection that relies only on what's humanly visible.

To me, more sensors means faster implementation. Quicker to get on roads. Fewer total deaths. At least in theory, and in the successes I'm seeing.


What else are you planning to use to scan the environment? Why should we not require multiple instruments?


Unless a lot of money is paid out in campaign contributions, I suspect you're right.

I'm worried about a time when the _statistical_ safety of AI driving is better than humans' across the board, but there are horrifying correlations between AI-induced accidents, like "children on tricycles are not well recognized" or "slight fog in Seattle induces a 150-car pileup". Or AI causes worse accidents less frequently than humans, whereas humans have lots more fender benders and minor injuries to older adults.

Not saying this is true, but it's a scenario I think about a lot.


A lot more minor injuries? You are aware that over 100 people are killed in cars every single day in the US alone, right?


I think you missed the part where I'm not saying this is current reality, just saying it's something I worry about.

Continuing your example just for exposition: if those are 100 individual deaths with varying causes, vs 99 children run over by AI b/c they were walking in a line holding hands and were interpreted as road markers, well, that's not _better_, is it?

This is purely hypothetical, but statistics gloss over those human details.


TBH, that stat seems low.


Given how autonomous vehicles drive today (cautiously, within the speed limit), I feel pretty good about autonomous vehicles easily beating out humans on highway driving. I'd love to see a highway-oriented true full self-driving mode developed, and I'd probably pay an additional ~$10-20k to get it.


Highways/motorways should be the perfect ground for "self-driving trains" of cars: self-regulating convoys of vehicles going at similar speeds between exits, "attached" at safe distances. It would dramatically reduce stress and fatigue (and hence accidents) among professional drivers and other heavy users. Particularly in the US, it's an absolute no-brainer. The problem is finding who will pay for the necessary infrastructure.


Highway self driving is already here. Any new car has lane-keeping and adaptive cruise control. That's all you need. For fair weather anyway.


The same cars that come to a halt because people put a traffic cone on the bonnet.


>Waymo claims to already have a better record than human drivers

Elon Musk claims the same, but there's a "terms and conditions apply", and they don't agree to assume responsibility.


What about all these porn stars who have sex while driving Teslas? They should definitely be guilty of dangerous driving.

edit: link to the article about it https://www.dailymail.co.uk/news/article-7015229/Elon-Musk-r...


What's the point if you can so easily fake this without any of the safety issues? Hook up the car and tow it, remove the tow bar in post, done.


At least in Poland, you cannot be in a car that is being towed (unless it's necessary to "drive" it).


I, perhaps naïvely, assume the car they are in is actually being towed while they fuck in the driver's seat; otherwise it seems insane - what if they were to bump the wheel and drive into a Jersey barrier?


I haven't seen any of the videos. I'd assume that there's some creative use of a camera to give the illusion that no one is behind the wheel.

After all, in the "two girls one cup" video the girls eat chocolate ice cream. There was a camera cut between filling the cone (with something that is not ice cream) and the girls eating from the cone. It's clear that in the cut where the girls are eating from the cone, the brown runny soft substance is chocolate ice cream. Your imagination fills in that it was the "not ice cream" from the first shot.

Edit: You must have added the link while I wrote the above comment.


> There was a camera cut between filling the cone

Somewhere on the internet, someone is now fact checking this claim.


Thanks for the reminder of that video's existence!


Whatever Autopilot is, it's definitely some wild energy. The number of times I've seen a Tesla try to zipper-merge onto the interstate while some do-nothing behind the wheel scrolls Instagram is just terrifying. Wobbling between lanes like a drunk, pumping the brakes like a madman, until finally some poor landscaping truck takes pity on them and just lets them slowly drift into traffic like a plastic bag.


For those unfamiliar: the road mostly has four lanes in each direction, and the speed limit is 100-120 km/h depending on which part you are on.


Don't you have to defeat a safety interlock to allow hands off the wheel driving in a Tesla?


Have you never used your legs to kind of hold the wheel or even steer in a car when you've taken your hands off for a moment? That's what I'm going to guess was going on here.


Genuinely, I have never ever done that, and I have been driving for over 20 years.


How else can you eat while you commute?


I rarely eat while driving, and when I do, I eat stuff that can be eaten quickly with one hand (chocolate etc.).

I really hope the question was meant to be funny.


Only half-joking. I can eat two-handed, e.g. a sandwich or something similar, while briefly holding the wheel with my knees. It's not a huge hardship, especially in slow and predictable rush-hour traffic.


It is not that I consider it a hardship. I consider it distracted driving.


I think it "depends on the market".

My 2023 Tesla Model Y in the Netherlands complains a lot if I take my hands off the steering wheel. Heck, it even complains sometimes when my hands are on the steering wheel!

However, I rented a Model 3 in California through Hertz last month and it was much more lenient. I reckon I had to take my hands off for at least five times longer before it started complaining, and it didn't complain at all when my hands were steadily placed on the steering wheel.

In my opinion the one in California made long drives a bit safer, as you don't have as much pressure to apply the right pressure (pun intended).


Exactly, there are different regulations that cause them to have less capability and stricter nags in Europe.


You have to nudge the wheel every 30s or so, and respond to audible prompts to "wiggle" the wheel or steering control will be disabled for the rest of the drive.
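
As a toy model of that escalation (the ~30s interval is from above; the rest is my guess at the logic, not Tesla's actual code):

    # Toy model only: the ~30s nudge interval is as described above;
    # the escalation steps and disable rule are assumptions.
    NAG_INTERVAL_S = 30

    def nag_state(seconds_since_wheel_input, audible_prompt_ignored):
        if audible_prompt_ignored:
            return "steering control disabled for rest of drive"
        if seconds_since_wheel_input >= NAG_INTERVAL_S:
            return "visual nag, then audible 'wiggle the wheel' prompt"
        return "ok"

    print(nag_state(35, False))  # escalating nag
    print(nag_state(60, True))   # disabled until the drive ends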


I think it's by distance. On the freeway it's more "whiny" about you wiggling the wheel.


It also nags much more in certain "dangerous" situations. For example, too many traffic cones nearby or detected emergency vehicles. It has gotten pretty good at that last one, though it often alerts to emergency vehicles on the opposite side of a divided highway.


What's interesting is that BlueCruise in my car specifically says "hands off" when it's active. Though it does also say "be ready to take over at any time" and yells at you if you look away.

I didn't realize that Ford was ahead of Tesla in that department!


According to the subreddit, some Tesla owners can fool it with fishing weights.



