Hacker News: erikerikson's comments

About fivish years ago I interviewed with a Wi-Fi device maker and the engineer I interviewed with was bragging that they could watch users walk around their home.

> how the Middle East should be divided up

Given how every group claims it is a holy place, I'd expect each group would want it held in a state of peace, prosperity, and reverence for the benefits of creation. Instead they all seem bent on holding their holy lands in states of violence, discord, and waste.

You're not wrong that there is deep external interference but wouldn't holy peoples rise above any of that to do better from every side?


Sensory overload is one of the terms of practice

Not to ignore that the complex manifold of interlocking conflicting social rules can add further friction


For most of human history, there were no formal schools.

> relying on in-car cameras rather than the radar and sensors ... It is betting that its approach will be cheaper

Is this due to reduced processing requirements, reduced sensor costs, or what?


This decision was made at the height of the pandemic supply chain shortages, but then was never reversed when they could get sensors again. FSD will never work with pure vision and it's folly that Tesla / Musk insists that it will.

I’m willing to believe that machine vision would eventually become good enough to match or exceed human visual perception given the same inputs.

But humans in a car have a massive advantage over little cameras that no one seems to discuss much: we have two sensors (eyeballs) mounted on a servo (our head) that can move around and is looking through a truly monstrous aperture (the windshield), and that aperture is equipped with fancy cleaning devices (wipers and cleaning fluid spray), and the car’s operator is motivated to clean the windshield and maintain the windshield, wipers, and spray system to be able to see.

A Tesla car has little tiny camera lenses that are every bit as exposed as the windshield but don’t have all the compensating machinery.

Go stick a pair of nice cameras on a three-axis servo mount with a range of motion of a whole foot (or a camera array and no servo), stick that two feet behind the windshield, train it well (use that massive parallax!) and I’d believe the result would be competitive in performance but definitely not cost. Also the car would lose an entire seat.

Or use radar and lidar and achieve super-human performance.

For what it’s worth, the military was and is fully aware that lidar and similar tech can outperform human eyeballs in “battlefield conditions”, and I’m aware of old DARPA projects to do things like pulsed laser range-gated imaging to see through fog and such. (You still get attenuation and scattering, but you can mostly disambiguate the additive signal from the fog from the stuff behind it.) Lidar can do something similar. Humans can move their heads to acquire more data. Little cameras are at the mercy of the fog and can only use fancy image processing to try to compensate.
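The range-gating idea is just timing arithmetic: fire a pulse, keep the shutter closed, and open it only when photons from the target distance would be arriving back, so near-field backscatter from fog is rejected. A minimal sketch, with an illustrative 100 m target range and 10 m depth slice (not figures from any specific DARPA system):

```python
# Range-gated imaging timing: gate delay is the round-trip time to the
# target; gate width sets how thick a depth slice gets imaged.
C = 299_792_458.0  # speed of light, m/s

def gate_delay_ns(target_range_m: float) -> float:
    """Open the gate this long after the pulse to see the target range."""
    return 2 * target_range_m / C * 1e9

def gate_width_ns(depth_slice_m: float) -> float:
    """Hold the gate open this long to image a slice of the given depth."""
    return 2 * depth_slice_m / C * 1e9

print(f"gate opens {gate_delay_ns(100):.0f} ns after the pulse")
print(f"gate stays open {gate_width_ns(10):.0f} ns for a 10 m slice")
```

Photons scattered by fog in the first tens of metres return well before the gate opens, which is why the additive fog signal can mostly be separated from the scene behind it.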


The craziest part is that many Teslas already have these sensors physically installed, yet their input is coded out/ignored!

The ultrasonic parking sensors in older Teslas only have a range of a few feet. That'd be useful for automated parking, but not for self driving above 5mph.

>only have a range of a few feet

At the very least, couldn't they detect imminent collisions with pedestrians? Walls?

----

My non-Tesla has radar for cruise control and imminent crash detection; while annoying at first, it has never failed me. As I've aged (¡perfect driving record!) my thought process has gone from "more horsepower!" to "more safety!". Now when my car beeps at me I'm more inclined to listen cautiously.


Ultrasonic isn’t particularly useful for fast reactions. If it’s navigating around a parking lot slowly, sure, but if you’re going any reasonable speed it’s basically useless.
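A quick way to see why: at the few-metre range typical of ultrasonic parking sensors (the 5 m figure below is an assumed round number, not a spec), the time between first detection and impact collapses as speed rises:

```python
# Time budget between an obstacle entering sensor range and impact,
# assuming a ~5 m maximum ultrasonic detection range.
SENSOR_RANGE_M = 5.0
MPH_TO_MS = 0.44704  # miles per hour -> metres per second

def reaction_window_s(speed_mph: float) -> float:
    """Seconds from first detection to reaching the obstacle."""
    return SENSOR_RANGE_M / (speed_mph * MPH_TO_MS)

print(f"at  5 mph: {reaction_window_s(5):.2f} s")   # parking-lot speed
print(f"at 45 mph: {reaction_window_s(45):.2f} s")  # arterial speed
```

A couple of seconds is plenty for a parking maneuver; a quarter of a second at road speed leaves no time to brake, let alone stop.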

>if you’re going any reasonable speed it’s basically useless

My active cruise control works even when set to 99mph (e.g. it will slow me down upon approaching another vehicle). I'm sure it'd work at higher speeds, but won't "set" above 99.


Not really. Stopping distance at 20mph is 20ft on dry pavement, and it goes up quadratically the faster you're going.
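The quadratic growth is easy to check numerically. A minimal sketch, using an assumed textbook friction coefficient of 0.7 for dry pavement and ignoring driver/system reaction time:

```python
# Braking distance from the kinematic relation d = v^2 / (2 * mu * g).
MU, G = 0.7, 9.81       # friction coefficient (assumed), gravity (m/s^2)
MPH_TO_MS = 0.44704
FT_PER_M = 3.28084

def braking_distance_ft(speed_mph: float) -> float:
    v = speed_mph * MPH_TO_MS
    d_m = v ** 2 / (2 * MU * G)
    return d_m * FT_PER_M

for mph in (20, 40, 60):
    print(f"{mph} mph -> {braking_distance_ft(mph):.0f} ft")
```

This reproduces roughly 20 ft at 20 mph, and doubling the speed quadruples the distance, which is why a few feet of ultrasonic range only helps at parking speeds.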

Andrej Karpathy should be on that list.

And there should be some criminal liability since people have died.


Can you cite some practical failure scenarios, besides a Wile E. Coyote billboard, where a camera inherently won't be able to accomplish what lidar/radar do?

Cameras can obviously work at least as well as a human if they're attached to a human brain. The question is whether you can put enough compute and data together into a car to be able to do it live.

Why even bother when we can make artificial eyes that see depth? The price of LIDAR has plummeted and will continue to plummet. We already know that it works really, really well for self-driving with today's available compute+data.


It's not a given that a camera will even work as well as a human if it's attached to a human brain. The human eye is stereo, it has a focusable lens, an iris, it's incredibly sensitive and the foveated retina has a very high resolution. Can't say the same for the cheap-ass cameras on the Tesla. I'm not sure there is a camera on Earth that is the equal to the eye (yet).

Great points! The eye (really everything attached to a biological system) is fucking amazing.

> Cameras can obviously work at least as well as a human if they're attached to a human brain.

Eh, I mean I think that that’s necessary, but maybe not sufficient. Eyes are _really good_.


I can't see in the dark, can't see in fog or rain, can't see UV, my eyes only see in the rough direction my head is facing, and there's a limited ability to track objects. Bicycles coming from behind are particularly easy to miss. Speaking of easy to miss, there's a hole in your vision that you don't notice, where the optic nerve is. Hell, there's a whole body of work for the times when human vision falls short and gets fooled, called optical illusions. There's another whole field of study about failures of the lenses and other parts of the eye itself. An electronic camera system will share some of those failure modes, along with its own reliance on components not being broken.

Given the number of shortcomings of human vision, why shouldn't our self-driving cars be designed to have better than human vision, especially if the goal is not to get into crashes? Humans, with human vision, human object-tracking skills, and human reaction times, get into crashes all the time. Shouldn't we want better and more sensors, which would lead to fewer crashes, simply because better sensors have better data available?


As other commenters have noted, the kind of cameras that see better than human eyes cost more than a car.

Cheaper than that and you're making significant trade-offs.


Fair point! Is there some advantage eyes have that wouldn't be surmountable with simply more cameras (i.e. to capture different exposure ranges etc)? I actually haven't thought about this side of it super closely, but I think you're right.

Resolution; extreme dynamic range in a single exposure; sensitivity with short exposures in low light; focal range and speed of refocusing; white point adjustments (happening to a large degree in the retina as receptors become temporarily desensitized to a given wavelength with uninterrupted exposure). I’m sure there are more.
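The dynamic-range gap can be made concrete by converting photographic "stops" into contrast ratios and decibels. The stop counts below are rough, commonly cited assumptions (eye with adaptation 20+ stops, a typical automotive sensor around 12), not measurements:

```python
import math

def stops_to_ratio(stops: float) -> float:
    """Each stop doubles the brightest-to-darkest contrast ratio."""
    return 2.0 ** stops

def stops_to_db(stops: float) -> float:
    """Express the same contrast ratio in decibels (20*log10)."""
    return 20 * math.log10(stops_to_ratio(stops))

for label, stops in (("eye with adaptation (assumed)", 20),
                     ("typical automotive sensor (assumed)", 12)):
    print(f"{label}: {stops_to_ratio(stops):,.0f}:1, {stops_to_db(stops):.0f} dB")
```

Eight stops of difference is a factor of 256 in contrast handling, which is the gap between reading a road sign next to low sun and seeing only a white blob.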

It's a question of cost and technology. Cinema cameras don't cost hundreds of thousands of dollars just because studios like spending money. Humans can see differences between even the best cameras on the market.

It's also a bit of a false analogy. Cameras don't really work like human vision. We do things like mesopic vision that simply aren't possible with current sensors. We have massively high resolution. We have async "pixels" that can respond immediately without waiting for a framing signal. Our brains process color in truly weird ways.

It's not like there's some physical law preventing computer vision from being better than human vision, but it's an incredibly difficult engineering problem that we've spent the better part of a century attacking without clearly winning.


Shadows. Everyone I know who's tried FSD learns very quickly that it's a random chance whether the car will see an oncoming shadow (from a big truck, bridge, tree, etc.) as a wall and either slam on the brakes or swerve.

If you have reliable depth information, à la LIDAR, you'll know that there's nothing actually there.


Shiny truck trailers have been a failure case. A reflection of the sky in a truck back looks a lot like the sky from the right angle.

There are advertisements on the sides of trucks. A better question is why you are willing to dismiss the Wile E. Coyote failure demonstration.

The Wile E. Coyote failure was using Tesla's cruise control rather than FSD. Binocular vision is sufficient.

Lidar will likely be outlawed anyway as it burns your retina. Dare you to put your eye (or cellphone camera) next to a waymo sensor for 10 sec and see what happens.


It is funny you should say that binocular vision is sufficient when Tesla Vision hardware does not have binocular vision.

HW3 has three front-facing cameras that are not only too close together to provide binocular vision with adequate disparity, but also have different fixed focal lengths making them unable to establish binocular focus even if they were far enough apart [1]. HW4 has two front-facing cameras with the same limitations.

This is of course ignoring the fact that humans with visual acuity comparable to the HW3 cameras would be almost legally blind and would not meet the minimum vision requirements to operate a motor vehicle in, I believe, every state. HW4 cameras are better; with comparable acuity you would merely fail the minimum vision requirements in most states, including California and Texas.

[1] https://en.wikipedia.org/wiki/Tesla_Autopilot_hardware#Tesla...
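The baseline problem follows from the stereo depth equation Z = f·B/d: differentiating gives a depth uncertainty of roughly Z²·Δd/(f·B), so halving the baseline doubles the error, and the error grows quadratically with distance. A minimal sketch; the focal length, disparity error, and baselines are illustrative assumptions, not HW3/HW4 specs:

```python
# Stereo depth uncertainty: dZ ≈ Z^2 * disparity_error / (focal * baseline).
# A short baseline makes long-range depth estimates very noisy.
def depth_error_m(z_m: float, baseline_m: float,
                  focal_px: float = 1000.0,
                  disparity_err_px: float = 0.5) -> float:
    return z_m ** 2 * disparity_err_px / (focal_px * baseline_m)

for baseline_m, label in ((0.065, "human eyes (~6.5 cm)"),
                          (0.30, "wide camera pair (30 cm, assumed)")):
    print(f"{label}: +/-{depth_error_m(50, baseline_m):.1f} m at 50 m")
```

Even human-eye geometry yields tens of metres of uncertainty at 50 m, which is why humans rely on motion parallax and context beyond a few car lengths, and why closely spaced cameras with mismatched focal lengths cannot lean on disparity at driving distances.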


Virtually all LIDARs are class 1 devices, safe for human eyes to look at. Do you have any reason to believe Waymo LIDARs aren't class 1?

The rapid rotation and your blink reflex are apparently what make it class 1 (supposedly). I don't see how a blink occurs if it isn't in the visible spectrum. The science around the safety is very shaky; many tests look only at retina damage and not cornea damage.

https://www.laserfocusworld.com/blogs/article/14040682/safet...

Many of the class 1 lidars do damage camera sensors, is your eye really that much more resilient?


The damage at those wavelengths is just overheating. Your eyes are mainly water with its absurdly high heat capacity compared to pretty much anything in an image sensor. I'd expect eyes to be much more resilient in short duration, high intensity exposures and significantly better at long duration, low intensity exposures. LIDARs have been shifting towards short duration exposures to get higher intensities (for range) in recent years, simply to stay under the class 1 limits.

Heat damage doesn't make it safe; eyes are very sensitive to heat, and while the cornea provides some protection, not everyone has a thick cornea (due to surgery, for example), and there is also evidence that heating the cornea causes cataracts.

The short-exposure argument puts a lot of faith in these device manufacturers. How do they guarantee short duration? Are you confident that each manufacturer has sufficient electronics to stop the laser if the spinning slows slightly due to dirt, low power, or a crash? Should the exposure limits be the same during the day as at night, when your pupils are dilated?


FSD catapults families into highway dividers, I'm pretty sure LIDAR will be fine

LIDAR uses wavelengths that are invisible to the receptors in the human eye (but not to the sensors in many brands of camera).

So, safe for human eyes but deadly for a camera.


Sure, anything that involves fog.


I was driving in a convoy over a closed (to non-convoy driving) mountain pass in Norway, in winter. Everything was white; you could barely see the road. The snow was so heavy that visibility stretched only as far as the car before me; you couldn't see further than that. At some point a snow plow passed on the other side of the road and completely covered my windshield with snow, to the point that for a few seconds I had no visibility whatsoever. Good luck to cameras then.

In case it's helpful to know, Lidar also struggles to perform well in heavy snow due to the scattering and reflection of laser beams by snowflakes, which reduces the detectable range and can create false readings.

But you are also only a camera without LIDAR or RADAR, and you apparently navigated that situation successfully...


Watching this from the outside, camera-only seems to be a religious decision, based on how strongly people react to the question.

5 years ago I agreed that you'd need the other sensors. ML vision has improved so quickly now I'm really not sure you do. From what I've seen the system available to consumers also performs well IRL.

I’ve been using Tesla FSD since the original beta rollout. Collision avoidance has never been a real problem. Phantom braking has gotten a lot better, though still not 100.0% fixed.

Most of the problems I have now are things like lane selection or turning into the wrong parking lot, which seem solvable in software given enough time, and the Robotaxi project should ramp up the urgency on that front.


I think the correct answer here is that when you are producing vaporware to run your hype machine, you avoid hardware costs like the plague.

Lidar units suitable for this sort of thing used to be extremely expensive, but they've come down a good bit and will likely continue to. At this point it's hard to read the decision as anything other than obstinacy.

Costs. Lidar means extra sensors (and also extra signal integration), though not having lidar requires a lot of extra video processing to recover information the lidar would give you directly.

Allegedly, they believe a system with inputs similar to human vision is best suited for interpreting signals on roads designed for human eyes, and conflicting signals from LIDAR makes disambiguation challenging when combining sensor types. Per a recent Musk interview.

I always thought it was because of patent licensing. Basically, extra unit costs.

Both? They did try to get to market faster than cars with full lidar rigs, etc.

They didn’t actually get to market before Waymo in the robotaxi market.

Welcome to Hacker News!

It is generally recommended to upvote a comment you appreciate rather than making a comment that isn't adding substance. It helps keep the signal rate higher.


Sides are a distraction.

Violence and conflict creators and propagators are bad.


The stability of society and the law based facilitation of peace are absolutely within the mission of police forces and highly facilitative to the prosperity of a society.

I was once involved with a project that returned the determination of land ownership from people's physical custody to the courts, and the resulting drops in assault and homicide rates (for the entire country) were in the double digits over a period of months.


Wow, super interesting! Where was this if I might ask?

Sorry for the delay but this is such a public setting. The governments involved never approved the release of that information.

"There can be only one!" /s

* Please be advised that the writer of this post in no way advocates murder and any such actions taken will be considered the sole, whole, and complete responsibility of the actor.


As an adopted person, "biological <insert title>" has worked well for the parents that had sex to make me. For a donor parent, I'd probably just use "donor <insert title>". I'd advise not worrying too much about the language, though. Being kind and thoughtful is far more important than selecting the correct words. A snap-judgment selection of proximal words is sloppy, but it's impractical to pause and select exactly the right language in all cases for all statements. Still, with something this sensitive it might be good to slow down a little.

