
>driving a car is a walk in the park

Only in the average situation, not in the edge cases. The way you should think about it is: "If I had a black-box oracle that could drive a car exactly like a human, could I use it to simulate an artificial general intelligence?"

The answer is probably yes. For example, to "ask" the AGI a yes-or-no question X, you could contrive for the car to find itself at a fork in the road with a road sign that says "If the answer to X is 'yes', then the road to the right is closed. Otherwise, the road to the left is closed."
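
A rough sketch of that reduction in code, assuming a hypothetical black-box drive() routine (the placeholder body is only there so the sketch compiles):

    #include <iostream>
    #include <string>

    enum class Turn { Left, Right };

    // Hypothetical black-box driver that behaves exactly like a human.
    // Placeholder body so the sketch compiles; the thought experiment
    // assumes a driver that actually reads and obeys the sign.
    Turn drive(const std::string& /*scene*/) { return Turn::Left; }

    // Encode a yes/no question X into a road scene and read the answer
    // off of whichever fork the driver takes.
    bool ask(const std::string& x) {
        std::string scene =
            "Fork ahead. Sign: if the answer to '" + x +
            "' is yes, the right road is closed; otherwise the left road is closed.";
        return drive(scene) == Turn::Left;  // yes => right closed => driver turns left
    }

    int main() {
        std::cout << (ask("X") ? "yes" : "no") << "\n";
    }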




What does "drive a car exactly like a human" mean? I'm going to assert that you don't need AGI for self-driving cars. And in your example: that's not how driving works. The driver - even a human one - is not expected to answer random questions while driving.


>the driver - even a human one - is not expected to answer random questions as they are driving.

In the same way, C++ templates were never designed to emulate arbitrary programs at compile time (i.e. to be Turing complete), but some ingenious people found a way to make the compiler do exactly that. They did it by writing some extremely unusual, edge-casey code, but you can't just wave that aside and say "C++ templates aren't really Turing complete, because the proof involves code that nobody would really write!"
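
For anyone who hasn't seen it, the classic demonstration is compile-time computation through template instantiation: e.g. a factorial that the compiler evaluates while compiling. (A minimal sketch, but it should build with any modern compiler.)

    #include <iostream>

    // The compiler evaluates this recursion during template
    // instantiation, i.e. at compile time.
    template <unsigned N>
    struct Factorial {
        static const unsigned long long value = N * Factorial<N - 1>::value;
    };

    template <>  // base case stops the recursion
    struct Factorial<0> {
        static const unsigned long long value = 1;
    };

    int main() {
        std::cout << Factorial<5>::value << "\n";  // prints 120, computed at compile time
    }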


Ahhh, a Turing car; if you can make a Turing car, you can generalize it to make an AGI. I like it!



