Hacker News

This is basically one of my problems with self-driving cars and putting your life in the hands of neural networks. It could theoretically be possible to find vulnerabilities in these systems. Someone could possibly use some sort of genetic algorithm to evolve a sign that causes a car to veer off the road and crash.
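That evolved-sign idea can be sketched with a toy example. Everything here is made up for illustration: the "classifier" is just a fixed linear score over 8 pixel values, not a real network, and a real attack would have to query an actual trained model. The point is only to show the shape of a genetic search that perturbs an input until the decision flips.

```python
import random

# Toy stand-in "classifier": a fixed linear model over 8 pixel values.
# (Purely illustrative; a real attack would query a trained network.)
WEIGHTS = [0.9, -0.4, 0.7, 0.1, -0.8, 0.3, -0.2, 0.5]

def score(img):
    # Positive score -> "stop sign", negative -> "not a stop sign".
    return sum(w * p for w, p in zip(WEIGHTS, img))

def mutate(img, rate=0.3, step=0.2):
    # Randomly nudge a few "pixels" by a small amount.
    return [p + random.uniform(-step, step) if random.random() < rate else p
            for p in img]

def evolve_adversarial(start, generations=500, pop_size=30):
    """Evolve a perturbed image whose score flips sign (misclassification)."""
    population = [mutate(start) for _ in range(pop_size)]
    for _ in range(generations):
        # Fitness: lower (more negative) score is better for the attacker.
        population.sort(key=score)
        if score(population[0]) < 0:
            return population[0]          # classifier fooled
        survivors = population[:pop_size // 2]
        population = survivors + [mutate(random.choice(survivors))
                                  for _ in range(pop_size - len(survivors))]
    return population[0]

random.seed(0)
original = [1.0] * 8                      # scores positive: "stop sign"
adversarial = evolve_adversarial(original)
print(score(original), score(adversarial))
```

Real adversarial-example research uses gradient-based methods rather than this kind of black-box search, but the unsettling part the parent describes is the same: small, deliberately crafted input changes can flip the model's output.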

Maybe I'm just scaremongering or worrying about nothing. I just don't feel comfortable putting my life in the hands of a system that no one knows exactly how it works.





