Sorry, I don’t get it. Why is it a requirement for a superintelligence (whatever it may be) to be able to create another superintelligence (I assume, of comparable “super-ness”)?


Sufficient: if entity A can build an entity B that solves task C, then A can solve C by building B.

Necessary: if superintelligences are much smarter than humans (in the sense that they can do anything humans can do), and humans can build superintelligences, then superintelligences can build superintelligences.

And if humans categorically can't build superintelligences, then it's not that consequential for our definition of superintelligence to be wrong.
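
To make the hidden premises explicit, here's a minimal Lean 4 sketch of both steps. Everything in it is my own framing, not from the thread: Agent, Task, solves, builds, and dominates are illustrative names, and the two load-bearing assumptions are that delegation counts as solving, and that "much smarter" implies being able to solve every task a human can.

    -- Minimal sketch; all names below (Agent, Task, solves, builds,
    -- dominates) are illustrative assumptions, nothing standard.
    variable {Agent Task : Type}
    variable (solves : Agent → Task → Prop) (builds : Agent → Agent → Prop)

    -- Sufficiency: assuming delegation counts as solving, an A that can
    -- build a B which solves C thereby solves C itself.
    example (delegation : ∀ (a b : Agent) (c : Task),
        builds a b → solves b c → solves a c)
        (a b : Agent) (c : Task)
        (hBuild : builds a b) (hSolve : solves b c) : solves a c :=
      delegation a b c hBuild hSolve

    -- Necessity: treat "build a superintelligence" as just another task.
    -- Assuming "much smarter" means the SI can solve every task a human
    -- can, and a human can build an SI, the SI can build an SI too.
    example (human si : Agent) (buildSI : Task)
        (dominates : ∀ (t : Task), solves human t → solves si t)
        (hHuman : solves human buildSI) : solves si buildSI :=
      dominates buildSI hHuman

Both proofs are one-liners; all the logical work is done by the two assumptions (delegation, dominance), which is exactly where one could push back on the definition.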



