Sorry, I don’t get it. Why is it a requirement for a superintelligence (whatever it may be) to be able to create another superintelligence (I assume, of comparable “super-ness”)?
sufficient: if entity A can build an entity B that solves task C, then A can itself solve C, by building B and letting it do the work
necessary: if superintelligences are much smarter than humans, and humans can build superintelligences, then superintelligences can also build superintelligences (whatever humans can do, something much smarter than humans can presumably do too)
And if humans categorically can’t build superintelligences, then it’s not that consequential for our definition of superintelligence to be wrong.
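To make the two directions above explicit, here is a rough logical sketch (Builds and Solves are my own placeholder predicates, and SI stands for a superintelligence; none of this notation is from the original argument):

% Sufficiency: building a solver is itself a way of solving the task.
\forall A, B, C:\; \big(\mathrm{Builds}(A, B) \land \mathrm{Solves}(B, C)\big) \rightarrow \mathrm{Solves}(A, C)

% Necessity: an SI can do anything humans can do, and humans can build SIs,
% so an SI can build an SI (treating "build an SI" as just another task).
\Big(\forall C:\; \mathrm{Solves}(\mathrm{human}, C) \rightarrow \mathrm{Solves}(\mathrm{SI}, C)\Big) \land \mathrm{Solves}(\mathrm{human}, \text{build an SI}) \rightarrow \mathrm{Solves}(\mathrm{SI}, \text{build an SI})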