Einstein's Arrogance (overcomingbias.com)
17 points by Neoryder on Nov 18, 2007 | 17 comments


I don't really like this view of hypotheses. Or maybe I just don't understand it.

Take this hypothesis: 2 + 2 = 4. There are an infinite number of things 2 + 2 could equal, but I, like Einstein, believe it to be 4 without any empirical testing of my mathematical hypothesis. If you believed the sum to be 5, again without any testing, I don't see how you or I have dedicated any different number of bits to our answers. Your bits, however, are wrong.


There's plenty of empirical evidence for 2 + 2 = 4; it's just so pervasive that we fail to recognize it as such. Take two apples and place them on a table. Now take two more apples and place them on the table. How many apples are on the table? 3? 4? 5? I have an astronomically high probability assigned to "2+2=4", but if I kept doing this experiment and suddenly ended up with 3 apples every time, I'd have to decrease the probability I assign to "2+2=4" and increase the probability I assign to "2+2=3".

The apples example is from "How to Convince me that 2 + 2 = 3": http://www.overcomingbias.com/2007/09/how-to-convince.html

Interesting essay on Bayesian reasoning: http://yudkowsky.net/bayes/technical.html
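
A minimal sketch of that updating process in Python (the priors and likelihoods below are illustrative assumptions I've made up, not numbers from either essay):

    # Bayesian updating on repeated "2 apples + 2 apples" experiments.
    priors = {"2+2=3": 1e-6, "2+2=4": 1 - 2e-6, "2+2=5": 1e-6}

    # Assumed likelihoods: P(we count 3 apples | hypothesis).
    likelihood_count_is_3 = {"2+2=3": 0.99, "2+2=4": 0.001, "2+2=5": 0.001}

    def update(beliefs, likelihoods):
        """One step of Bayes' rule: posterior is proportional to prior times likelihood."""
        unnormalized = {h: beliefs[h] * likelihoods[h] for h in beliefs}
        total = sum(unnormalized.values())
        return {h: p / total for h, p in unnormalized.items()}

    beliefs = priors
    for trial in range(4):  # four tabletop experiments, each yielding 3 apples
        beliefs = update(beliefs, likelihood_count_is_3)
        print(trial + 1, {h: round(p, 6) for h, p in beliefs.items()})

After a few surprising trials, nearly all the probability mass has migrated from "2+2=4" to "2+2=3", which is the point of the apples example.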


From the article:

"But from a Bayesian perspective, you need an amount of evidence roughly equivalent to the complexity of the hypothesis"

But what is the "complexity of the hypothesis"? Without a proper definition, there is not much left to the article, or is there?
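
One way to make it concrete (my formalization, not the article's wording) is minimum description length: the complexity of a hypothesis is roughly the number of bits needed to write it down in some fixed encoding. A crude Python sketch, using compressed length as an upper bound:

    import zlib

    def description_bits(hypothesis: str) -> int:
        """Crude upper bound on a hypothesis's description length, in bits."""
        return 8 * len(zlib.compress(hypothesis.encode()))

    print(description_bits("2+2=4"))
    print(description_bits("2+2=4, except on Tuesdays in Belgium, where it equals 5"))

The second hypothesis takes more bits to state, so on this account it needs correspondingly more evidence before you're justified in believing it.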


Fixed with linky.


By my contribution, I am guilty by association. But do you realize how pointless this discussion is getting? Ha. Only on Hacker News.

2+2 = 4. Thus it is written.


IMHO, Bayesianism is more interesting than a lot of the stuff discussed here. The idea that all reasoning can be done as Bayesian inference is pretty damn cool. If someone looks at the above and thinks "Whoa, Bayesianism is neat! I should go read more about it!" then I'm happy, but if everyone is just thinking "Duh, of course 2 + 2 = 4"... well, shit.


I don't disagree, but my hypothetical assumes that no one has done the experiment before making their belief known.


Sounds like an unreasonable assumption. Either way, it's not about "wrong" or "right"; it's only about what can be derived from evidence. Without evidence you have no justification for assigning a higher probability to "2 + 2 = 4" than to "2 + 2 = 3".

Caveat: It's true that some knowledge does have to be put into the prior distribution. Apparently you can't just bootstrap from a zero-information ("maximum entropy") prior. So it's possible that 2 + 2 = 4 is burned into our brains at birth.


> So it's possible that 2 + 2 = 4 is burned into our brains at birth.

Yup, there is plentiful evidence... not for the addition specifically, but 2 and 4 are innately perceptible quantities.


I don't even think it is a good idea to think about theoretical physics the way the blogger does. Math doesn't require any probabilistic validation whatsoever. Biology needs lots of it because there are too many unknowns. Physics is much closer to math than to biology, and evaluating it like one does a biological experiment is, to me, perplexing. For 2 + 2 = 4, there doesn't seem to be any need for "bits of information." And even though the author is probably talking about hunches and intuition, the probability that a theoretical physicist can go about his field thinking in hunches seems small enough... ok, maybe that statement required 27 bits of information?

Maybe I am missing something.


"I myself often slip into this phrasing, whenever I say something like, 'To justify believing in this proposition, at more than 99% probability, requires 34 bits of evidence.'"

Error: Social robot could not compute. Please input valid parameters into auditory interface again.
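
The arithmetic behind a sentence like that is just Bayes' rule in odds form: posterior odds = prior odds x 2^(bits of evidence), so each bit of evidence doubles the odds. A sketch with an assumed prior (the 2^-27 is my invention, chosen to make the numbers land near 34):

    import math

    prior_odds = 2 ** -27            # assumed prior: about 1 in 134 million
    target_odds = 0.99 / (1 - 0.99)  # 99% probability = 99:1 odds

    bits_needed = math.log2(target_odds / prior_odds)
    print(round(bits_needed, 1))     # ~33.6, i.e. roughly 34 bits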


I just want to point out that Einstein wasn't 'arrogant' here. He said what he did because he 'knew' GR had to be right -- it was just the generalization of special relativity, which was already known to be right.

Of course, it turns out that GR is wrong, at least at small scales. While special relativity is compatible with quantum field theory (indeed, SR was the motivation for moving from quantum mechanics to QFT, as quantum mechanics does not respect SR), general relativity is not.


I'm going to go out and swim in the deep water with this comment, but I didn't care for the article that much.

All science is provisional; this much is true. Math, however, is a formal symbolic system for representing things in reality. 2 + 2 = 4 not because of some inner truth in math but because when we observe nature and combine 2 things and 2 things we have what we call 4 things. We could change the symbols around all day and they would still work. So math is just a generic way of talking about that which we can observe.

The interesting thing happens when our symbolic system escapes that which can be observed, or when it is incomplete, say in the case of negative numbers (then rationals, imaginaries, irrationals, etc.). At this point the exercise becomes one of either bringing the system of symbols to some application that has observable impact (applied physics) or changing the symbolic tools. There's nothing Bayesian about 2+2=4 -- that's the way the symbols are supposed to work.

Now whenever we get "stuck" we have to go back and check our symbolic systems. Just like geeks build an O/S as a hobby or college experiment, I imagine physicists and mathematicians build calculi, or systems of symbols and rules for working with them. Wolfram came up with a great question in NKS -- what if the universe is really discrete and not continuous? In other words, when Newton created the integral he might have taken math down a path that ends up breaking when you try to put a GUT together. I think that's a helluva question, but it's above my pay grade.

There was a book George Gamow wrote, One Two Three... Infinity, about the way various counting systems and numbers play together. Go read it -- it's better than this blog article.


Actually it took Whitehead and Russell a lot of work just to prove 1+1=2. Not kidding. See http://en.wikipedia.org/wiki/Image:Principia_Mathematica_the...

It all depends on the axioms you use, IIRC. I can think of cases where 2+2 does not equal 4. For example (very roughly -- this example isn't really 100% accurate, but you get the idea), 2 protons plus 2 protons != 4 protons. In fact, you get a helium nucleus: 2 protons plus 2 neutrons. Before you say this is an esoteric example, I might point out that this is the reaction that powers the sun, and you wouldn't be alive right now without it. (For the actual reaction, see here: http://en.wikipedia.org/wiki/Proton-proton_chain_reaction)

Someone with more maths training might correct me here, but 2+2=4 is true because you're using the usual axioms you learned in school. You might also say it's obvious that 3 times 3 = 9, or that ab = ba, but there is a lot of interesting math around anti-commutative algebras where ab = -ba, for example. It's all about the axioms you use.
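
For a tamer example than fusing protons, modular arithmetic is a perfectly consistent axiom system in which 2 + 2 really isn't 4. A short Python sketch:

    def add_mod(a, b, n):
        """Addition in the integers modulo n."""
        return (a + b) % n

    print(add_mod(2, 2, 3))   # 1, because 4 = 1 (mod 3)
    print(add_mod(2, 2, 10))  # 4, the familiar schoolbook answer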


"There's nothing Bayesain about 2+2=4 -- that's the way the symbols are supposed to work."

How did you come to this decision that "that's the way the symbols are supposed to work"? I bet it's some sort of process of taking in information and updating your beliefs. And the idealized optimal version of that is Bayesian inference.

"Wolfram came up with a great question in NKS -- what if the universe is really discrete and not continuous?"

Sorry for the anal-retentive nitpicking, but this question isn't due to Wolfram. I'm not qualified either, but it's definitely a fascinating thing to think about. Maybe our universe is just a small computer program. http://en.wikipedia.org/wiki/Digital_physics


I did not come to any decision that that's the way the symbols are supposed to work -- that's my whole point. The symbols represent the way it works. They are a self-referencing representation. There's no decision involved here at all. You can call it an axiom, but that misses the point. An "axiom" only has meaning inside certain formal symbolic systems. It. Just. Is. It exists. Different people with different numbers and operators? They'd be asking why %^ $% $#^ #$$%, for instance. But it's the same thing. The answer is in the question. It's like asking "Why is red a color?"

I didn't mean to imply Wolfram came up with computational reality, I simply mentioned that he brought it up in his book. It may turn out the integral was just a nifty little shortcut that took a lot of impossible-to-calculate math off the table for physicists. Thanks for letting me clarify that.


"I did not come to any decision about that's the way the symbols are supposed to work -- that's my whole point. The symbols represent the way it works. They are a self-referencing representation. There's no decision involved here at all."

There's a difference here between what you believe and what is true in external reality. (Yes, I know you can spend lots of time debating whether the latter is even a coherent idea, blah blah blah, but all that stuff isn't relevant.) You had to come to a belief about what "+", "=", "2", and "4" mean. You were not born with this information embedded inside your head! :) This is true regardless of whether the symbols by themselves unambiguously and uniquely pin down the meaning of the expression. Even if a certain symbol sequence has a "unique unambiguous" interpretation, you still had to read those symbols and interpret them, and you learned something in the process -- which means you updated your beliefs about the symbols. I agree that the "real" interpretation in external reality of these symbols didn't change (if such a thing even exists), but your understanding of the symbols changed.

In general, probabilities are subjective and a property of the observer. They describe degrees of belief that the observer has in various propositions, and don't directly have anything to do with external reality. The only way they're connected to external reality is that you make observations in reality and use the gathered information to update the probabilities you assign. This means the claim that the symbols "2", "4", "=", and "+" are unambiguous is irrelevant even if it's true.

BTW - Thanks for the link on your blog to the NYT graph of the price of gas in constant dollars.



