> “You’re Scott Aaronson?! The quantum physicist who’s always getting into arguments on the Internet, and who’s essentially always right, but who sustains an unreasonable amount of psychic damage in the process?”
Give me strength. So much hubris with these guys (and they’re almost always guys).
I would have assumed that a rationalist would look for truth and not correctness.
Oh wait, it’s all just a smokescreen for know-it-alls to show you how smart they are.
The basic trope is showing off how smart you are, plus what I like to call "intellectual edgelording," which is basically a fetish for contrarianism. The big flex is to stake out a maximally contrarian position -- contrarian relative to what one imagines is the prevailing view -- and then defend it in the most creative way possible.
Intellectual edgelording gives us shit like neoreaction ("monarchy is good actually" -- what a contrarian flex!), timeless decision theory, wild-ass stuff like the Zizians, and effective altruists deciding that running a crypto scam is the best path to maximizing their utility.
Whether an idea is contrarian is unrelated to whether it's a good idea. I think the fetish for contrarianism might have started with VCs playing public intellectual, since as a VC you make the big bucks when a contrarian bet pays off. But that's an out-of-context misapplication of a lesson from investing to the sphere of scientific and philosophical truth. Believing a lot of shitty ideas in the hopes of finding gems is a good way to drive yourself bonkers. "So I believe in the flat Earth, that vaccines cause autism, and loop quantum gravity -- one big win in this portfolio and I'm a genius!"
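To make the mismatch concrete, here's a back-of-the-envelope sketch. The numbers are mine and purely illustrative (a 5% hit rate, a 60x winner), but they show the shape of the problem: convex payoffs can rescue a portfolio of mostly-wrong bets, and nothing analogous rescues a portfolio of mostly-wrong beliefs.

```python
# Toy numbers (mine, purely illustrative) for why the VC logic doesn't
# transfer to beliefs. In venture investing, payoffs are convex: a loss
# is capped at the stake, while a win can return many multiples, so a
# few hits carry the whole portfolio.
stake, p_hit, multiple, n_bets = 1.0, 0.05, 60.0, 20

ev_per_bet = p_hit * multiple * stake - stake                  # 0.05 * 60 - 1 = +2.0
print(f"VC portfolio EV: {n_bets * ev_per_bet:+.1f} stakes")   # +40.0

# Beliefs have no such convexity: a false belief you act on keeps
# costing you, and a true contrarian belief pays off about the same
# as a true boring one.
cost_false, gain_true = 1.0, 1.0
ev_per_belief = p_hit * gain_true - (1 - p_hit) * cost_false   # -0.90 per belief
print(f"belief portfolio EV: {n_bets * ev_per_belief:+.1f}")   # -18.0
```

Same 5% hit rate in both cases; only the payoff structure differs, and that's the whole ballgame.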
Then there's the cults. I think this stuff is to Silicon Valley and tech what Scientology is to Hollywood and the film and music industries.
Another thing that's endemic to Rationalism is a specialized variety of the Gish gallop.
It goes like this:
(1) Assert a set of priors (with emphasis on the word assert).
(2) Reason from those priors to some conclusion.
(3) Seamlessly, without skipping a beat, take that conclusion as valid because the reasoning appears internally consistent, and fold it into a new set of priors.
(4) Repeat -- or rather, recurse, since the new set of priors is built on previous iterations.
The entire concept of science is founded on the idea that you can't do that. You have to stop and touch grass, which in science means making observations or doing experiments if possible. You have to see if the conclusion you reached actually matches reality in any meaningful way. That's because reason alone is fragile. As any programmer knows, a single error or a single mistaken prior propagates and renders the entire tree invalid. Do this recursively and one error anywhere in this crystalline structure means you've built a gigantic tower of bullshit.
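To put a number on that fragility, here's a toy sketch (mine, with a made-up 90% per-step reliability) of how fast confidence decays when every conclusion becomes the next round's prior:

```python
# A toy model (my own illustration, with an assumed 90% per-step
# reliability) of the reason-on-priors loop above: every conclusion
# becomes the next round's prior, and any single unsound step poisons
# everything downstream.
p_step = 0.9       # assumed probability a single inference step is sound
confidence = 1.0
for step in range(1, 21):
    confidence *= p_step                                  # errors compound multiplicatively
print(f"P(no error anywhere after 20 steps) = {confidence:.2f}")  # ~0.12
```

Twenty rounds of individually plausible reasoning and you're down to about a 1-in-8 chance the whole tower is sound -- which is exactly why science keeps forcing you back to observation.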
I compare it to the Gish gallop because of how enthusiastically they do it, and because at that speed it becomes nearly impossible to argue against. You end up trying to counter a firehose of Oh So Very Smart, complicated, exquisitely reasoned nonsense.
Or you can just, you know, conclude that this entire method of determining truth is invalid and throw the whole thing in the trash.
A good "razor" for this kind of thing is to judge it by its fruit. So far the fruit is AI hysteria, cults like the Zizians, neoreactionary political ideology, Sam Bankman Fried, etc. Has anything good or useful come from any of this?