
>Imagine what kind of perspectives are possible when thousands or millions of input sources are your senses.

Doesn't this exactly describe human sensory input? Granted, the brain achieves its efficiency by throwing out most of the data early in the signal chain (as research on vision and hearing has shown). Will future AI need to be similarly efficient?



>Doesn't this exactly describe human sensory input?

Well, consider having a 360º array of 30 cameras all integrated into a perfect spherical sensory experience. It's something we can't really imagine experiencing natively, but it would be trivial for eBrains to coalesce visual systems that way from eBirth.

Our bodies have lots of low-bitrate sensors, like the billions of individual sensory nerves distributed throughout our bodies (each individually addressable in the brain), but we don't think of "touch" as a sense to "computationalize" the way we do vision or sound or language.

One amusing thing about AI sensors: nobody ever talks about superhuman smell. Where are the quantum AI noses?


Biochemical sensors, like what the discredited blood-test company Theranos tried to build, are a sort of superhuman smell.


But at a completely different scale. You have to imagine a whole planet's sensors as your sources, combined with the whole planet's knowledge, and so on.

I don't think it would make much sense to compare that with the limited POV from which we experience the world.


Deep neural nets do precisely this. The specifics vary from implementation to implementation, but in general the number of hidden units drops sharply at each layer of the architecture.
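
A minimal sketch of that shrinking-width idea in PyTorch (the layer sizes are arbitrary, purely for illustration):

    import torch.nn as nn

    # Toy encoder: each hidden layer has far fewer units than the one
    # before, so the network is forced to discard most of the raw input
    # early on. The widths (10000 -> 1024 -> 128 -> 16) are made up.
    encoder = nn.Sequential(
        nn.Linear(10000, 1024),
        nn.ReLU(),
        nn.Linear(1024, 128),
        nn.ReLU(),
        nn.Linear(128, 16),
    )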



