This is a false dichotomy. Clearly getting money and control are the main objectives here, and we're all operating over a distribution of possible outcomes.


I don't think so. If Altman is prepping for an exit (which I think he is), I'm having a very hard time imagining a world in which he also sincerely believes his company is about to achieve AGI. An exit only makes sense if OpenAI is currently at approximately its peak valuation, not if it is truly likely to be the first to AGI (which, if achieved, would give it a nearly infinite value).


What's the effective difference, to him personally, between exiting now and holding on until OpenAI reaches what you call "nearly infinite value"?

Either way he is set for life, truly one of the wealthiest humans to have ever existed... literally.



