The Trouble with Theories of Everything (2015) (nautil.us)
63 points by dnetesn on June 2, 2018 | 32 comments


In my view one of the puzzles of physics is actually its success over the past couple of centuries. That we have such precise and comprehensive models of anything could not have been anticipated.

But this success doesn't guarantee future success or a predictable rate of future progress. Whenever we agree to work on a fundamentally new question, we are never sure whether the answer will arrive in 5 minutes or 5 centuries. Reconciling quantum mechanics and gravity might be a 5 century problem for all we know.

Likewise, we can never claim with assurance that any non-trivial question is un-answerable. When problems get hard, books get written about the boundaries of science.

As I understand things, it took humanity some 1000 years to solve quadratic equations.


Depending on how you count (no pun intended) it took 4,000 years to invent the number zero.


What are these 1000/4000 year starting points? From when mankind evolved?


Actually it turns out I was off by an order of magnitude. The earliest recorded number systems date back 40,000 years [1]. Zero was invented about 5,000 years ago [2].

[1] https://en.wikipedia.org/wiki/History_of_ancient_numeral_sys...

[2] https://www.scientificamerican.com/article/what-is-the-origi...


I enjoyed reading the article. This reminded me of my own work, in which I apply various theories in every interaction with my clients. Some of the models with which I'm familiar are very detailed and provide some amazing scaling features such that they work on multiple levels. However, no sooner do they do that than they begin to require the construction of abstract models just to understand and leverage them, to say nothing of properly testing them in depth. Even so, each one of these models gives me a slightly different lens on a problem, and those lead to new and varied leverage points. For that they are extremely useful.

As an aside, I was also reminded of role-playing simulation systems like Fate, GURPS, and Risus--all of which can model entire worlds, but each of which does so with varying experiential 'leverage points'. GURPS can offer the experience of quantifying a solar system, if you want that. You can then look at Risus and maybe it's simple and not substantive in that way, but that doesn't give due credit to its capability within specific boundaries like time, experience level, etc. IMO the boundary concept is important to embrace and appreciate in any theoretical or even any human endeavor, otherwise theories / models can start to leak leverage, so to speak.


> nature, as Feynman once speculated, could be like an onion, with a huge number of layers

Much of this went whoosh over my head admittedly, in between the Greek; still, I'm happy to see that even distinguished scientists with degrees seem to think that we are nowhere near understanding 'the mind of God', as the late Stephen Hawking once or twice proposed we might be, may he rest in peace.

So a TOE seems more and more improbable. The sanest and most reasonable stance IMO would then be to suspect, but not claim to know, an infinite number of layers, scales and dimensions, be it in terms of time, space or speeds.

Such a position would seem useful in theory-creation as it would free our assumptions about the universe from our natural human-centered prejudices in a radical way.

And philosophically, place us in the center of the universe again in a way (since the center of an infinite universe is everywhere).

And tentatively allow us to get a glimpse of, albeit not understand and certainly not acquire, 'the mind of God'.

But this is obviously where science ends and belief begins.


This has been roughly my position for the past few years and it has, without a doubt, multiplied my ability to understand new concepts and perceive causal relationships, among many other positive radical changes.


A similar idea was worked out in more detail by the philosopher of science Nancy Cartwright under the heading of "the dappled world":

https://www.amazon.co.uk/How-Laws-Physics-Nancy-Cartwright/d...

https://www.amazon.co.uk/Dappled-World-Study-Boundaries-Scie...

It's interesting to see a physicist suggest something similar. Review/summary of the second (more recent) book here:

https://philosophynow.org/issues/28/The_Dappled_World_A_Stud...


> There is no known physics theory that is true at every scale—there may never be

The article mentions how GR and QM have never been successfully combined into a testable theory, how SR and QM produce infinities that need to be "renormalized" when combined (a toy sketch of that cutoff-and-subtract idea is below), and how there's no "direct" evidence that QED and QCD can be combined.

The article didn't mention thermodynamics, which operates at scales between GR and QM, and whose reliance on a directed time dimension means it also doesn't combine cleanly with time-reversible GR: e.g. where has the information gone after a black hole evaporates via Hawking radiation, or what happens at the Cauchy horizon of a rotating black hole? QM is also time-reversible, so I guess TD can't be combined with QM either.
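For a feel for what "renormalized" means, here's a toy sketch in Python (my own illustration, not QED's actual procedure): the divergent sum 1 + 2 + 3 + ... acquires a finite, cutoff-independent part once you regulate it and subtract the divergence.

    import math

    # toy regularization: with an exponential cutoff,
    #   S(eps) = sum_{n>=1} n * exp(-n*eps) = 1/eps^2 - 1/12 + O(eps^2),
    # so subtracting the cutoff-dependent divergence 1/eps^2 leaves a
    # finite, cutoff-independent remainder; renormalization has this flavor
    def regulated_sum(eps, n_max=100_000):
        return sum(n * math.exp(-n * eps) for n in range(1, n_max))

    for eps in (0.1, 0.05, 0.01):
        finite_part = regulated_sum(eps) - 1.0 / eps**2
        print(f"eps={eps:4.2f}  S - 1/eps^2 = {finite_part:.6f}")  # -> -1/12

The -1/12 that survives is the same finite part that zeta-function regularization assigns to this sum; the point is only that the infinity can be isolated and removed in a controlled, cutoff-independent way.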


Not a physicist, but I do not think thermodynamics is a physical theory in the same sense that QED or GR are: it isn't about any specific physical system or interaction, but rather a general pattern when one looks at certain properties of bulk matter on (relatively) long timescales. Feynman had a good explanation of this, but for the life of me I can't remember the reference right now -- it's either in one of the lectures in the Character of Physical Law (try the one on the distinction between past and future) or one of the Feynman Lectures chapters on thermo.

IIRC thermodynamics per se does not have to "change" to adapt to systems where quantum effects play a role. Linking abstract thermodynamic concepts to the microscopic dynamics of a specific quantum system does require a quantum theory, but for that we have quantum statistical mechanics (https://en.wikipedia.org/wiki/Quantum_statistical_mechanics), which is a fairly well-developed subject (minimal sketch below).

(This leaves out connections to relativity, special or general, about which I don't know much.)
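To make that linkage concrete, here's a minimal sketch for the simplest quantum system I can think of, a harmonic oscillator; the choice of units with hbar*omega = 1 is my own assumption:

    import math

    HBAR_OMEGA = 1.0  # assumption: units where the energy quantum hbar*omega = 1

    def avg_energy(beta, n_max=10_000):
        # thermal average over oscillator levels E_n = hbar*omega*(n + 1/2),
        # weighted by Boltzmann factors exp(-beta * E_n)
        energies = [HBAR_OMEGA * (n + 0.5) for n in range(n_max)]
        weights = [math.exp(-beta * e) for e in energies]
        z = sum(weights)  # partition function
        return sum(e * w for e, w in zip(energies, weights)) / z

    for kT in (0.1, 1.0, 10.0):
        beta = 1.0 / kT
        closed = HBAR_OMEGA * (0.5 + 1.0 / math.expm1(beta * HBAR_OMEGA))
        print(f"kT={kT:4.1f}  <E> summed={avg_energy(beta):.4f}  closed={closed:.4f}")

At high temperature <E> approaches kT, the classical equipartition result; at low temperature it freezes out at the zero-point energy hbar*omega/2, which is exactly the kind of quantum correction that classical thermodynamics alone can't supply.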


My understanding is that the black hole's entropy gets encoded into a hologram on its event horizon, and then is distributed with the Hawking radiation. Last I read, it sounded like it was still a somewhat open question, though.


Thermodynamics is time reversible in general. It's just not reversible once you postulate a low-entropy starting condition (see the toy sketch below).

I am not aware of any incompatibility between thermodynamics and either QM or GR, except for stuff involving black holes.
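On the first point, the Kac ring model is a nice toy demonstration: the dynamics are deterministic and exactly reversible (after 2N steps the ring provably returns to its initial state), yet from a low-entropy start it looks irreversible on intermediate timescales. A sketch, with parameters chosen arbitrarily by me:

    import random

    N = 1000                                              # ring sites
    markers = [random.random() < 0.3 for _ in range(N)]   # fixed marker layout
    state = [0] * N                                       # all white: a low-entropy start

    def step(balls):
        # every ball moves one site clockwise, flipping color at marked edges
        return [balls[i - 1] ^ markers[i - 1] for i in range(N)]

    for t in range(2 * N + 1):
        if t % 250 == 0:
            print(f"t={t:5d}  fraction flipped = {sum(state) / N:.3f}")
        state = step(state)
    # the fraction relaxes toward 1/2 ("entropy increase"), yet at t = 2N
    # every ball has passed every marker exactly twice, so the ring is
    # back to all white: reversibility was never violated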


Every domain of study actively tries to create a unifying generalized theory. It's basically implied that research is attempting to completely model a system into the "Grand Unified Theory of [insert domain]."

I think it's a quirk of humanity to think that systems trend toward stability and immutability, but I'm not sure one can say for certain. In other words, there is a slight possibility that the fundamental "laws" governing systems could actually change over time. That is not to say that our perception of them changes with better tools, but that the fundamental interactions themselves may not be immutable.


To summarize the article: maybe we'll never discover a unified theory and will have to live with approximations for each context. But maybe we will. But nah, we should be skeptical about that. Probably not.

Just like pretty much any philosophy ever.


Gödel's incompleteness is the closest we have come to a theory of everything.


If there is no theory of everything, then reality simply cannot be captured by a formal system of any kind.

I'm not sure that this possibility agrees with what we do know about reality, like the unreasonable effectiveness of mathematics.


The integers can't be captured by a formal system of any kind either.


Well that's not true. You can't prove every true statement about the integers, but that's not the same thing.


For every unprovable true statement about the integers, you can find a structure of "non-integers" that has every provable property of the integers, yet in which the unprovable statement is false.

I don't think a formal system can be said to capture the integers if it also allows such "non-integers".


I honestly don't follow your first claim or how it has any bearing on this question. Perhaps you could provide a link to the actual theorem you're describing to clarify.


GP is alluding to the fact that any unprovable true statement about the integers has a counter-model in which the statement is false, one the formal system cannot distinguish from the integers. Therefore no formal system characterizes the integers; there are always these false integers lingering.

These objects are highly pathological artifacts of the formal system in question, but that just goes to show how slippery the integers are.


> GP is alluding to the fact that any unprovable true statement about the integers has a counter-model in which the statement is false, one the formal system cannot distinguish from the integers.

Again, do you have a link or a name for the theorem you're describing?


This is a consequence of Gödel's completeness theorem: a statement in a first-order theory is provable if and only if it is true in every model of the theory.

The counter-models obtained for true but unprovable arithmetic statements are the non-standard models; their extra elements are called non-standard integers.
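Spelled out (assuming PA is consistent; this is standard model theory, nothing exotic), the argument runs:

    % let G be a true arithmetic sentence that PA does not prove
    % (incompleteness guarantees such a G exists)
    \mathrm{PA} \nvdash G
        \;\implies\; \mathrm{PA} + \neg G \ \text{is consistent}
        \;\implies\; \exists\,\mathcal{M}.\ \mathcal{M} \models \mathrm{PA} + \neg G
        \quad \text{(completeness)}
    % G holds in the standard model N but fails in M,
    % so M cannot be isomorphic to N: it has non-standard elements
    \mathbb{N} \models G \ \wedge\ \mathcal{M} \models \neg G
        \;\implies\; \mathcal{M} \not\cong \mathbb{N}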


Ok, now we're getting somewhere [1]. So various theorems can be used to prove the existence of non-standard models of arithmetic. I'm not sure why I should find this compelling.

There are infinitely many possible models of arithmetic, each of which has different properties. Some of these are isomorphic to the standard model in various ways, and some are not.

But, if the world is governed by mathematics, then at least one such model would best match reality. How would we even speak of a world that was not internally consistent enough to be formalized in such a fashion?

[1] https://en.m.wikipedia.org/wiki/Non-standard_model_of_arithm...


I'm not sure what you're asking. Any formalization of the integers won't be "the" integers, the totality of which perhaps isn't even a reasonable idea.

Mathematics is a part of the world and does not stand apart from it. There is no reason why such a "best" model should exist, and by any reasonable definition, it doesn't. Mathematics is irrefragably incomplete. It is perhaps the only human discipline that demonstrates its own limits.


> Any formalization of the integers won't be "the" integers, the totality of which perhaps isn't even a reasonable idea.

"The integers" is whatever formal model we all agree has the properties we want of the integers. This is a language game, it's merely a label for a set of properties.

> Mathematics is a part of the world and does not stand apart from it.

If you mean that mathematics follows from naturalism, this is at best a conjecture. We could very well live in a mathematical universe (à la Platonism), in which case our world is itself a particular mathematical structure. This is actually a far less problematic view of mathematics, philosophically.

> There is no reason why such a "best" model should exist

There are many reasons why a "best" model of reality should exist. For one, as I mentioned above, mathematical monism is a less problematic philosophy of mathematics. It then immediately follows that positing a separate physical world is multiplying entities unnecessarily, ergo, considering the universe to be mathematical is the most parsimonious theory.

But even going a different route, as I said many posts up, denying an accurate and precise mathematical description of reality entails some irreconcilable inconsistency in reality, which so far has completely failed to manifest itself. We very well could be a brief island of stability in a random output generator, but ordering my hypotheses by likelihood, this one would necessarily be dead last (because it's incompressible, à la Solomonoff induction).


What?

Every model of arithmetic has every standard integer though?

Are there nonstandard models of second order ...

Ok Wikipedia says that "the axioms of Peano arithmetic with the second-order induction axiom have only one model under second-order semantics.", so...
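The reason a single second-order axiom can do that is that it quantifies over all subsets of the domain at once, whereas first-order PA can only approximate it with a schema:

    % second-order induction: one axiom, ranging over every property P
    \forall P\, \bigl[\, P(0) \,\wedge\, \forall n\, (P(n) \to P(n+1))
        \;\to\; \forall n\, P(n) \,\bigr]
    % the first-order schema: one instance per definable formula \varphi(x);
    % the countably many instances leave gaps that non-standard models
    % slip through
    \varphi(0) \,\wedge\, \forall n\, (\varphi(n) \to \varphi(n+1))
        \;\to\; \forall n\, \varphi(n)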


Second-order induction could hardly be called a formal system: there is no complete, effective proof system for it. This is basically the same limitation as on first-order PA. The standard model for second-order arithmetic involves the power set of the naturals, which is immediately a far more complex object than any computable system; it essentially presupposes the object that you're trying to formalize in the first place.


I don't see why the lack of a complete effective proof system for it should be a problem.

I do in fact consider the power set of the naturals to exist, so I'm not sure I see the problem? Is it just that uncountable sets have elements that we cannot pick out? I don't have a problem with that.


It matters because it means second-order arithmetic isn't a finite object in the way first-order effective theories are, where you might have "infinitely" many axioms but they are packaged up in a finite object, an algorithm.

The whole point of a formal system, its formality, is that it doesn't require any semantic notions to describe it. Second-order arithmetic with its full model is no such thing. It is inherently infinitary and therefore not formal at all.

And it's a treacherous object. Cf. Richard's paradox, which demonstrates that "subset of the integers" is by no means a naive idea.


Isn't it true that the only countable model of the natural numbers (including ordering) which is computable is the standard model of the naturals?


I thought that was just for first order systems?



