Hacker News

This is obviously nonsensical?

A sense of self is meaningful (that is, has consequences for some operation or other) only if those consequences aren't random.

The concept of "self" serves a purpose (or rather many), the most obvious being self-protection.



An AGI may not have a sense of self. A sense of self is not really necessary to pursue a goal in the most efficient manner possible. Plenty of algorithms already pursue goals as efficiently as possible in a limited context without any hint of a sense of self.
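As a minimal sketch of that point (a hypothetical example, not from the thread): a plain hill-climbing optimizer pursues a goal as efficiently as its neighborhood search allows, and nowhere in it is there any representation of a "self".

```python
def hill_climb(f, x, step=0.1, iters=1000):
    """Greedily move x toward higher f(x); stop at a local maximum.

    The 'goal' is maximizing f. There is no self-model anywhere:
    just a loop comparing neighboring candidate values.
    """
    for _ in range(iters):
        candidates = [x - step, x + step]
        best = max(candidates, key=f)
        if f(best) <= f(x):
            break  # no neighbor improves the goal; local maximum reached
        x = best
    return x

# Goal: maximize -(x - 3)^2, whose peak is at x = 3.
result = hill_climb(lambda x: -(x - 3) ** 2, x=0.0)
```

The algorithm "wants" nothing and protects nothing; it simply climbs. That is the sense in which goal pursuit and a sense of self come apart.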



