Hacker News

Humans have physically limited sensory bandwidth. Your eyes, ears, nose, etc. can only take in so much data per second.

So higher bandwidth can shift some processing around and make different architectures possible, but it can't make the experience much richer.
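The bandwidth-limit claim can be made concrete with a back-of-envelope estimate. The numbers below are my own hedged assumptions, not figures from the thread: roughly a million ganglion-cell axons per optic nerve is a standard textbook figure, and ~10 bits/s of useful information per axon is a commonly cited rough estimate.

```python
# Back-of-envelope estimate of raw visual input bandwidth.
# Illustrative assumptions, not measurements:
#   - ~1 million ganglion-cell axons per optic nerve (textbook order of magnitude)
#   - ~10 bits/s of useful information per axon (rough published estimate)

AXONS_PER_EYE = 1_000_000    # optic nerve fibers per eye
BITS_PER_AXON_PER_S = 10     # hedged per-fiber information rate
EYES = 2

def visual_bandwidth_mbit_s() -> float:
    """Rough upper bound on data leaving both retinas, in Mbit/s."""
    return EYES * AXONS_PER_EYE * BITS_PER_AXON_PER_S / 1e6

print(visual_bandwidth_mbit_s())  # -> 20.0 (Mbit/s)
```

Under these assumptions the eyes' total output is on the order of tens of Mbit/s, which is why a fatter pipe past that point stops enriching the experience.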




Interesting point, but the thing about eyes is that they can look in unpredictable directions, so you need far more data available than just what the eye can process at any moment.

This wouldn't be true if we had no lag (then we could just stream the right images for wherever the eye is looking). But our eyes are extremely sensitive to lag: even rendering 100% locally, it's difficult to be fast enough (or at least it used to be).

To compensate, we can download the whole environment up front and track the eyes locally.
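The latency argument above can be sketched numerically. During a fast saccade the gaze can sweep many degrees within one network round trip, so a server streaming only the gazed-at region must pad it with a large safety margin. The velocity and latency figures below are illustrative assumptions, not values from the thread:

```python
# How far can the gaze move during one round trip?  A streamed foveal patch
# must be padded by at least that angular margin on each side.
# All numbers are illustrative assumptions.

SACCADE_DEG_PER_S = 500.0   # rough peak saccade velocity
FOVEA_DEG = 2.0             # high-acuity foveal region, ~2 degrees wide

def required_patch_deg(latency_ms: float) -> float:
    """Angular patch size (degrees) needed to cover gaze motion during latency."""
    margin = SACCADE_DEG_PER_S * latency_ms / 1000.0
    return FOVEA_DEG + 2 * margin  # pad both sides of the fovea

# Local eye tracking (~5 ms) vs. streaming over a network (~50 ms):
print(required_patch_deg(5))    # -> 7.0 degrees
print(required_patch_deg(50))   # -> 52.0 degrees
```

Under these assumptions, network-level lag forces the patch to grow toward the whole field of view, which is why downloading the environment and tracking the eyes locally is the simpler architecture.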





