
"keep in mind that Eliezer Yudowsky is but a human (though a very smart one), yet he did got out of the box"

Not quite sure what this means. Any references?




Edit: you're right, I wasn't clear. An experiment was run in which Eliezer and another person talked for 2 hours over IRC. Eliezer played the AI, and the other person played the Gatekeeper. The AI's goal was to convince the Gatekeeper to "let it out" by the end of those 2 hours. No word play or tricks: the Gatekeeper had to make a conscious decision for the AI to win, and the losing party had to acknowledge the result publicly by PGP-signed e-mail. Eliezer won twice, against people who had publicly stated that there was no way an AI could convince them. Even though they could simply have said no, they didn't, and they later sent the e-mail acknowledging that they had let the AI out.
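
For the curious, the acknowledgment step is just an ordinary PGP signature check. Below is a minimal sketch of how one might verify such a signed e-mail with GnuPG from Python; the file name "acknowledgment.asc" and the helper function are hypothetical, and it assumes gpg is installed and the signer's public key has already been imported.

    # Minimal sketch: verify a clearsigned acknowledgment with GnuPG.
    # Assumes gpg is on PATH and the signer's public key is in the keyring.
    import subprocess

    def verify_signed_acknowledgment(path="acknowledgment.asc"):
        # gpg --verify checks the signature and reports the signing key;
        # a zero return code means the signature verified.
        result = subprocess.run(
            ["gpg", "--verify", path],
            capture_output=True,
            text=True,
        )
        print(result.stderr)  # gpg writes verification details to stderr
        return result.returncode == 0

    if __name__ == "__main__":
        if verify_signed_acknowledgment():
            print("Signature is valid: the acknowledgment is authentic.")
        else:
            print("Signature could not be verified.")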

Relevant links:

http://yudkowsky.net/singularity/aibox

http://en.wikipedia.org/wiki/AI_box

http://rationalwiki.org/wiki/AI-box_experiment





