> 3) Its leadership - or some of it, anyway - is extremely egotistical and borderline cult-like to begin with

I'm always surprised at how common this is in rationalist and EA organizations. The revelations about the cult-like behavior at MIRI / CFAR / Leverage are eye-opening: https://www.lesswrong.com/posts/MnFqyPLqbiKL8nSR7/my-experie...

The issues with sexual misconduct and drug-assisted assault in these communities even made the mainstream news: https://www.bloomberg.com/news/features/2023-03-07/effective...

It's equally fascinating to see how quickly these issues get retconned out of rationalist discourse. Many of the leaders and organizations that get outed were respected and frequently discussed before the revelations; afterward, they're dismissed as an inconsequential sideshow.

> TLDR: isolation, very strong in-group defenses, logical "doctrine" that is formally valid and leaks in hard-to-notice ways, apocalyptic utility-scale, and being a very appealing environment for the kind of person who goes super nuts -> pretty much perfect conditions for a cult.

I still think cults are a rare outcome. More often, I've seen people become "rationalist" because it gives them tools to amplify their pre-existing beliefs (#4 in your list). They then link up with like-minded people in rationalist communities, which further strengthens their conviction that they are not only correct, but systematically more correct than anyone who disagrees with them.
