
There's a reason OpenAI had this as part of its charter:

“We are concerned about late-stage AGI development becoming a competitive race without time for adequate safety precautions. Therefore, if a value-aligned, safety-conscious project comes close to building AGI before we do, we commit to stop competing with and start assisting this project. We will work out specifics in case-by-case agreements, but a typical triggering condition might be “a better-than-even chance of success in the next two years.””



The operative word there seems to be "had."



