Hacker News

When I was 15-18, around 2017, I got extremely into Eliezer Yudkowsky, "the sequences", LessWrong, and the rationalist community. I don't think many people realize how it appeals to vulnerable people in the same way that Atlas Shrugged, Christianity, the furry community (/jk), the self-help world, and Andrew Tate-style "manosphere content" do.

It provides answers, a framework, AND the underpinnings of "logic". Luckily, this phase only lasted around six months for me, during a very hard and dangerous time in my life.

I basically read "From AI to Zombies", and then moved into LessWrong and the "community". It was joining the community that immediately turned me off:

- I thought Roko's basilisk was mind-numbingly stupid. (Does anyone else who had a brief stint in the rationalist space think it's fucking INSANE that Grimes and Elon Musk "bonded" over Roko's basilisk? Fucking depressing world we live in.)

- Eliezer Yudkowsky's fanboys once stalked and harassed someone all over the internet, and, when confronted about it, Eliezer said he'd only tell them to stop after the target issued a very specific formal apology, including a LARGE DISCLAIMER on the target's personal website alongside the apology...

- Eugenics, eugenics, eugenics, eugenics, eugenics

- YOU MUST DONATE TO MIRI, OTHERWISE I, ELIEZER (having published no useful research), WON'T SOLVE THE ALIGNMENT PROBLEM FIRST AND THEN WE WILL ALL DIE. GIVE ALL OF YOUR MONEY TO MIRI NOWWWWWWWWWWWWWWWWWWWWWWW

It's an absolutely wild place. Honestly, I would say it is difficult to define "rational" when it comes to a human being and their actions, especially in any absolute sense, and the rationalist community is basically very similar to any other religion, or perhaps a light cult. I do not think it would be fair to say "the average rationalist is a better decision maker than the average human", especially considering that most of the important decisions we have to make are emotional ones.

Also, yes, I agree; you hit the nail on the head. What good is rational/logical reasoning when it typically requires first principles / a formal system / axioms / priors / whatever? That kind of thing doesn't exist in the real world. It's fine to apply ideas from rationality to your everyday life, but not to questions like "what is human existence" or "what is the most important thing to do next".

I'm kinda rambling, so I apologize. Seeing the rationalist community seemingly underpin some of the more disgusting developments of the last few years has left me feeling a bit disturbed, and I've always wanted to talk about it, but nobody IRL has any idea what any of this is.


