Hacker News
All of Physics in 9 Lines (motionmountain.net)
199 points by harperlee on Aug 20, 2023 | 113 comments



Unfortunately this website (and the cheat sheet here) seems to be motivated and evidenced at least in part by the speculative theories of the author so it is difficult for the layperson to determine what here is of value.

For example, the derivation of General Relativity (quite a significant part of physics!) apparently follows from the author's own "Maximum Force" theory in Item (3).

ps. previously discussed:

https://news.ycombinator.com/item?id=30733666


Yes, and one should be very wary of someone who speaks in absolutes.

That said, there IS a valid point to be made to highlight the inordinate success of modern physical theories. It's annoying when the only physics news is sensationalist pseudo-science "heresy" from those who don't even know what they're rebelling against. Sean Carroll does a great job of making this argument in an hour-long video "Quantum Field Theory and the Limits of Knowledge - Sean Carroll - 12/7/22": https://www.youtube.com/watch?v=REITVohWOO0

He makes a much more precise, and I think powerful, argument that we actually know quite a lot.


Yeah, I'm not so convinced that Maximum Force implies GR (and by extension gravity, even if Newtonian); it seems to be a very wishy-washy assertion.


To anyone interested in learning physics, I suggest reading https://www.susanrigetti.com/physics

Do not try to learn anything from the submitted article, it promotes a, shall we say, "non-traditional" view of physics that is unlikely to be helpful.


I wouldn't call all of it "non-traditional" (though some of it is). Either way, the author is being very sloppy with his "proofs", so it's hard to learn anything from the article unless you already know the details.


(5) (the minimum value of entropy) seems to come from an extremely hand-wavy argument on page 5 here:

https://arxiv.org/abs/2307.09914

It appears to be an example of argument by italics. I’m not convinced that it has any real content — it seems to be trying to say that the author thinks all systems are either being observed or they aren’t, and that there are therefore at least two microstates.

I think this is entirely missing the point and is mostly wrong. Even in the highly classical case of a gas, you have a box full of gas, with n particles and some volume V, etc. Those parameters form the macro state. Now you count microstates, take the log, and get the entropy. But nowhere in the count is a set of states where the gas is observed and a set where it isn’t. If you have a box of gas with n particles, etc., it’s implied that the gas exists!

So I don’t buy it. I’ll stick with S >= 0.


You're right, I was only looking at line (5) briefly and thought he was talking about the minimum change of entropy.

S ≥ k ln2 is incorrect, as you say. A system whose state is fully known has no entropy. One can either see this through counting the number of microstates (which is 1) or by applying Shannon's definition of entropy to the trivial probability distribution which is 1 for precisely one state.
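A two-line check of that claim (my own sketch, not from the comment above, using Shannon entropy in natural units):

```python
import math

def shannon_entropy(probs):
    # H = -sum(p * ln p), with the 0 * ln 0 term taken as 0.
    return -sum(p * math.log(p) for p in probs if p > 0)

# A fully known state: one microstate with probability 1 -> zero entropy.
assert shannon_entropy([1.0]) == 0.0

# Two equally likely microstates -> ln 2, the article's claimed "minimum".
assert abs(shannon_entropy([0.5, 0.5]) - math.log(2)) < 1e-12
```

Multiplying by k recovers the thermodynamic units; the point stands that nothing forbids zero entropy.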


> (the minimum value of entropy) seems to come from an extremely hand-wavy argument

So does the "maximum force" argument, which is given in this paper:

https://arxiv.org/abs/physics/0309118

It is, to say the least, not generally accepted as a valid argument.


Wow, there’s not a whole lot of argument at all in there.

Maximum power seems especially odd — power is extensive. If you have two maximum power systems, don’t you end up with double the power?


> power is extensive

So is force. Two maximum force systems should also end up with double the force.


True, but “force” as used in that paper is so vague I gave up. Force is applied somewhere, and it makes very little sense to add up forces in different places. Although if you ignore GR and add up forces everywhere, you get F=0 (Newton’s third law), which is well below the supposed maximum. And trying to decide whether the net internal force of a system (0) is an extensive property is a bit silly :)

Admittedly, power has much the same problem. Saying that P <= P_max is some sort of physical law requires a lot more explanation of what it would mean than the paper even tries to give.


Your link points to texts for 9 years of full-time learning.


You can skip the areas you've already learned, but other than that, that's physics. It's not something you can learn in a weekend.


Humans, arguably some of the smartest and most capable who've ever lived, have worked tirelessly over 400 years to produce the corpus. I don't think you can really learn it all in 9 years - I think a normal bright person will take 20 and even then only have real intuition about parts of it. Most of the modern young wunderkinds of physics are ridiculously good symbol manipulators, and pay little to no attention to intuition.


Well do you want to learn it or not? There's no royal road.


Just start with the Feynman undergraduate lectures (listed). The easy mode of that is "Six Easy Pieces" (read it on Kindle), which collects the easiest 6 lectures.

At some point you'll need math, I recommend https://www.amazon.com/No-bullshit-guide-linear-algebra/dp/0... (I actually started here), and for calculus, "No BS Guide to Math/Physics" by the same author. These books both include a review of high school math (i.e. trig), which I needed. For DiffEq I currently recommend Logan's "A First Course in Differential Equations"; this is where I am now, and I found it the most gentle after trying several textbooks recommended on r/math. Context: I am an adult with an engineering degree from 20 yrs ago.


Reminds me of a machine learning article that recommended starting studies with the following https://webspace.science.uu.nl/~gadda001/goodtheorist/


Given how heavily mathematical physics is this isn't surprising. Some mathematical concepts themselves can take a couple of months to learn to a sufficient level, especially if you include the prerequisites.


Might actually teach physics then


There is a full textbook behind the article, for what it's worth (haven’t read it myself, so no opinion on it): https://www.motionmountain.net


Just skimmed the EM section: I didn't catch anything egregiously wrong, per se (but again, I only skimmed it), but it's mostly fluff, and the actual physics content that does exist is targeted at wildly different levels. For instance the author goes from "an electric field is like a tiny arrow attached to every point in space" (day 1 of intro physics for non-physics majors) in chapter 1 to assuming the reader knows what covariant derivatives are (advanced undergraduate) in chapter 2. I would strongly recommend against using these books.


I think a really important point that Susan doesn't go over is her course needs to be paired with something like Anki or you'll forget so much by the time you're all the way through.


Relevant Feynman: https://www.feynmanlectures.caltech.edu/II_25.html

(At the bottom, cf. “unworldliness”)

Physicist here, I don’t want to be too dismissive because reducing physics down to its basic principles is part of the game. However, I’m not sure this covers everything. We need a description or at least characterization of 4D Minkowski space. Continuity and even differentiability of functions is assumed. I also am not sure that canonical quantization is covered by this list. Quantizing the EM field, for example, is a big pain and not simply implied by the EM Lagrangian or even any quantization rules. I do not think you can go from this list to non-commuting Hermitian operators acting on a Hilbert space.


Thanks for your input. Since you're knowledgeable about this, can I ask about the thermodynamics part? My understanding is that thermodynamics involves macroscopic laws that are based on microscopic laws. This is similar to how chemistry depends on physics.

So why must thermodynamics be included in this list? Isn't all of macroscopic (thermodynamics) physics derived from the microscopic just like chemistry?


I will be a bit sloppy with my description here: There are two ways to "prove" thermodynamics. You can start from the axioms of probability and statistics *together* with an assumption that matter is made out of atoms or similar constituents, and you can derive, through statistical physics, most of thermodynamics. That is a perfectly reasonable and practical way to do it.

However, it is extremely valuable to also understand that the vast majority of thermodynamics (including our understanding of entropy) can be derived from completely independent axioms that have no relationship to information theory or the knowledge of discreteness of matter. You need to define temperature as a transitive observable (zeroth law of thermodynamics), you need to define heat as a type of energy (first law of thermodynamics), and you need some version of the second law (there are a few equivalent ones, but the easiest is that "heat flows only from high temperature to low temperature"). Lastly, you need equations of state as a starting point (instead of assumptions about the microscopic constituents of matter). Tadaaa, you have all that is necessary to derive all our knowledge of thermodynamics without knowing statistics or atoms.

Aesthetically this is incredibly pleasing and intellectually it is quite surprising that these two sets of axioms that look nothing alike produce the same self-consistent physics. To be productive in cutting edge research you usually need to be comfortable with both.

Edit: corrected associative -> transitive as pointed out below.
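The equivalence is easy to see in the simplest case. As an illustrative sketch (mine, not the commenter's): for an isothermal ideal-gas expansion, the Clausius route and the microstate-counting route give the same entropy change.

```python
import math

k = 1.380649e-23  # Boltzmann constant, J/K
N = 100           # keep the particle number small so 2**N stays computable
V1, V2 = 1, 2     # volume doubles isothermally

# Macroscopic route (Clausius): PV = N k T and dS = dQ/T give
# delta-S = N k ln(V2/V1), with no mention of microstates at all.
dS_macro = N * k * math.log(V2 / V1)

# Microscopic route (Boltzmann): each particle has W proportional to V
# accessible cells, so the microstate count grows by (V2/V1)**N and
# delta-S = k ln(W2/W1).
W_ratio = (V2 // V1) ** N  # exact integer ratio of microstate counts: 2**100
dS_micro = k * math.log(W_ratio)

assert abs(dS_macro - dS_micro) < 1e-12 * dS_macro  # same answer, both routes
```

Two sets of axioms that look nothing alike, one number.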


Sure enough, the principles of Carnot's thermodynamics and the premises of statistical mechanics look quite different. The thing is: since both yield the same thermodynamics, there must be a connection.

I submit: the qualification 'completely independent axioms' is incorrect. There is the observation: temperature is transitive. This transitive property is a statement of conservation. (In terms of Carnot's thermodynamics it used to be thought of as conservation of Caloric.) The concept of conservation of a quantity correlates with information.

We have that statistical mechanics subsumed Carnot's thermodynamics.

The laws of Carnot's thermodynamics are theorems of statistical mechanics. (Those theorems weren't necessarily stated explicitly. I'm saying the principles of statistical mechanics are sufficient to imply the laws of Carnot's thermodynamics.)


Statistical mechanics needs to assume the existence of atoms to give you the results of Carnot's. Carnot's thermodynamics does not. That makes it a bit less straightforward to say one strictly follows from the other. Otherwise the gist of what you say seems reasonable.


> it is quite surprising that these two sets of axioms that look nothing alike produce the same self-consistent physics

Why is that surprising? This sort of thing seems to be ubiquitous in our universe. F=ma is equivalent to the principle of least action. Turing machines are equivalent to the lambda calculus. Electrodynamics is equivalent to the existence of a finite reference velocity.

Also:

> You need to define temperature as an associative observable

Did you mean transitive? My understanding is that the required axiom is that if system A is in thermal equilibrium with system B, and B is in equilibrium with C, then A will be in equilibrium with C. That's transitivity, not associativity.


Indeed, this type of "surprise" is how you know you are probably on the right track with your choice of abstraction. But I still find the word "surprise" reasonable for the emotion I feel when I observe something like that. A very pleasing, reassuring, intellectually satisfying surprise. A "sufficiently smart armchair theorist" might discover these "surprises" in advance simply from consistency and beauty and simplicity considerations, but these armchair theorists are usually just a made up thing we imagine when teaching.

I corrected the wrong use of the word associative, thanks!


Here is one way of looking at it: statistical mechanics introduced the concept of entropy.

Years ago, in school, the physics teacher gave the following vivid demonstration:

The demonstration involved two beakers, stacked, the openings facing each other, initially a sheet of thin cardboard separated the two.

In the bottom beaker a quantity of Nitrogen dioxide gas had been added. The brown color of the gas was clearly visible. The top beaker was filled with plain air, so it was colorless.

Nitrogen dioxide is denser than air. If the gases would not mix then all of the Nitrogen dioxide would stay in the bottom beaker. But of course the two do mix.

When the separator was removed we saw the brown color of the Nitrogen dioxide rise to the top. In less than half a minute the combined space was an even brown color.

And then the teacher explained the significance: in the process of filling the entire space the heavier Nitrogen dioxide molecules had displaced lighter molecules. That is: a significant part of the population of Nitrogen dioxide had moved against the pull of gravity. This move against gravity is probability driven.

Statistical mechanics provides the means to treat this process quantitatively. You quantify by counting numbers of states. Mixed states outnumber separated states - by far.

The climbing of the Nitrogen dioxide molecules goes at the expense of the temperature of the combined gases. That is, if you make sure that in the initial state the temperature in the two compartments is the same then you can compare the final temperature with that. The temperature of the final mixture will be a bit lower than the starting temperature. That is, some kinetic energy has been converted to gravitational potential energy.

So in this particular demonstration probability was acting in a direction opposite to gravity, and overall probability had the upper hand.

Probability effects fall in the category of emergent phenomena. An emergent phenomenon is somewhat of an in-between category. Not quite as fundamental as the law of gravity, but there is no denying that it has an existence of its own.
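The "mixed states outnumber separated states" counting can be made concrete with a toy model (my own sketch, not from the demonstration above): distribute a handful of particles between the two halves of a box and compare the counts.

```python
from math import comb

n = 30                               # 30 gas particles, each on the left or right half
total = 2 ** n                       # every arrangement is equally likely a priori
separated = comb(n, 0) + comb(n, n)  # "all left" or "all right": just 2 arrangements
mixed_evenly = comb(n, n // 2)       # arrangements with an even 15/15 split

# Even one maximally mixed macrostate dwarfs all fully separated ones.
assert mixed_evenly > 100_000 * separated
assert separated + mixed_evenly < total
```

With a mole of particles instead of 30, the ratio becomes astronomically larger, which is why the mixing looks irreversible.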


The article is highly opinionated. This basically opens the floodgates for critical comments. But like you, I sense some value here, because distilling to fundamental principles is part of the game in physics. I am not knowledgeable enough to comment on how accurate it all is, but I am reminded of amazing things like Noether's theorem, where conservation principles come from symmetries. I would love to understand all of this better, and my assumption is that producing results from symmetries is what much of particle physics is focused on.

I think I'm okay with how opinionated this article is because, while I am not qualified to judge, it gives me an incredible sense of the wonder of physics. And I think opinionated intuitions, albeit somewhat wrong or imprecise, are useful for guiding one's understanding.


Similar stuff by this guy was already posted here: https://news.ycombinator.com/item?id=32367085 It is crackpot pseudoscience.


Line 1:

  > Action W = ∫ L dt is minimized in local motion. The lines below fix the two fundamental Lagrangians L.
Well, you lost me. As a non-physicist maybe I'm not ready for the distillation of the field into 9 lines.


The author is also a non-physicist, though he'd disagree with that characterization. He is a famous crackpot. Ask him to actually derive all of physics from these principles and he'd fail completely.


L is a path, and W its sum along time, furthermore dW = 0 means W is always minimized (things take the cheapest, lowest .. whatever metaphor for minimum makes sense in a context)

that's my understanding as a non-physicist .. but dW means local minima or maxima.. so I'm confused :)


To be additionally pedantic as a physicist, dW = 0 means we have extremized the action. Technically, we are looking for a path of stationary action (i.e. a small change to the path does not change the action), so maxima, minima, and even saddle points are all allowed solutions.


thanks, appreciated the pedantry :)


L is the Lagrangian, not a Path. The action W is the Lagrangian integrated (or summed) along a path.


If you want to learn, and potentially really understand, some of this, I would ignore the physics entirely and start with the calculus of variations [0]. It's fairly straightforward (at least insofar as anyone with some basic differential calculus background ought to be able to understand the formulation and follow the derivation of the Euler-Lagrange equation), and it's useful. You can use it to derive Snell's Law or to design gears or to make pretty curves or the fastest possible ramps. Or you can figure out what L=T-U means, and use the Euler-Lagrange equation to recover F=ma from L=T-U, and you have classical mechanics!

And then you can go off the deep end and figure out how this all applies to quantum field theory :)

(L isn't a path. It's a scalar-valued function of the state of a system. In C, with one classical particle, it might be double L(double t, double x, double v) where t is time, x is position, and v is velocity. IMO the simplest example from physics that actually makes sense as a minimization problem is the principle of least time: light going from point A to point B takes the fastest path. This seems a bit silly at first glance -- light travels along without having any idea where it's going. But it really is self-consistent: light propagates through space and through materials such that, wherever it ends up, it gets there by a locally fastest possible route. From this, you can derive specular reflection as well as Snell's Law (refraction) and you can start to understand mirages, e.g. why hot roads look shiny in the distance.)

[0] https://en.wikipedia.org/wiki/Calculus_of_variations is a so-so start -- IMO it's a bit rambly and doesn't do a great job of getting to the point.
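The "least action" idea can also be checked numerically with a toy free-fall problem (my own sketch, not from the linked article): discretize a path, compute S as the sum of (T - U) over time steps, and verify that perturbing the true parabola only increases the action.

```python
import math

def action(path, m=1.0, g=9.8, dt=0.01):
    # Discretized action S = sum over segments of (T - U) * dt,
    # for a 1-D vertical path sampled every dt seconds.
    S = 0.0
    for i in range(len(path) - 1):
        v = (path[i + 1] - path[i]) / dt         # finite-difference velocity
        x = 0.5 * (path[i] + path[i + 1])        # midpoint height
        S += (0.5 * m * v * v - m * g * x) * dt  # L = T - U, with U = m g x
    return S

N, dt, g = 100, 0.01, 9.8
true_path = [-0.5 * g * (i * dt) ** 2 for i in range(N + 1)]  # free fall from rest

# Perturb the interior of the path (endpoints fixed) and compare actions:
# the true parabola should have strictly smaller action than any bumped path.
for eps in (0.1, 0.01):
    bumped = [x + eps * math.sin(math.pi * i / N) for i, x in enumerate(true_path)]
    assert action(bumped) > action(true_path)
```

For uniform gravity the potential is linear in x, so the classical path really is a minimum, not just a stationary point.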


Kind of cheating, if the "9 rules" have numerous bullet points. Rule eight has 18 non-specified bullet points. Rule nine, 27.


The title of this HN post could be "all of physics in 1 line"!


The beginning of the International Obfuscated Physics Contest.


It's like those "3 step recipes" where each step is 2 paragraphs of steps.


This is kind of like "LLaMa implemented in 9 lines of Python" and 8 of those 9 lines are import statements followed by a huge one-liner composing library functions

If civilization were to collapse tomorrow and the survivors were left in the stone age with "All of Physics in 9 lines", it would arguably be useless. There's so much missing/buried context.


I'm not a physicist, so I'm not really qualified to comment on whether a "real physicist" would say the fundamental assertion is true or not, but from a simple linguistics point of view, this article is pure click bait.

At school, I studied physics for 7 years. Nothing I was taught is hinted at by any of these 9 points, or at least in any way I'm able to recognise after receiving all that teaching, except perhaps some of the constants (they're not listed, but I assume there will be an overlap). Did I, in fact, learn absolutely nothing about physics in all those 7 years? It's much more likely to be the case that, from a simple linguistics point of view, these 9 lines do not contain "all of physics", but rather just the parts of physics that the author considers to be the most fundamental. Maybe if you have decades of highly specialised study you have a chance of remembering most of the rest of what's needed to fill in all the (known) gaps and assumptions those 9 lines are based on and hint at. And the fact that each "line" links to an entire volume that tries to fill in those gaps (I have no idea if they succeed in doing that or not), shows that this isn't anywhere close to what it claims to be.

It's about as vacuous a statement as trying to summarise all of maths as "the set of all possible sets" or all of philosophy as "cogito ergo sum".


>but rather just the parts of physics that the author considers to be the most fundamental.

Fundamental is what "contains all of physics" means in this context (key word: contains). The rest would be emergent properties.


Can the emergent properties be derived from just the nine lines?


No. Some of them are simply wrong. For instance:

- action is extremized or at a saddle point, not necessarily minimized

- Action is not quantized in QM (and quantum-mechanical action is not quite the same as classical action). There's another very indirectly related notion of "action" which shows up in historical semiclassical approximations and which is quantized. The author has likely confused the two.

- SU(2) is not the gauge group of the weak interaction, because there is no such thing. A purely weak gauge theory with no EM is not possible, on account of having massive force carriers. The SU(2) and U(1) in SU(3) x SU(2) x U(1) correspond to weak isospin and weak hypercharge respectively, not weak charge and (EM) charge.

But even if all the errors were corrected there's a huge amount of missing context. (For instance, the actual content of the standard model Lagrangian - the fact that there are 19 free parameters is in no way sufficient to determine it) And even if you had all of that, a full reduction of macrophysics to microphysics is likely computationally intractable, even in principle.


I'd say that if a capable scientist like Einstein was given them (assuming they didn't know the rest of the theories), they could build upon them to derive or get to close guesses of large parts of physics. The lines are quite loaded with broad equations that covers very fundamental stuff - including constants and elementary particles.

A few equations being fundamental for big theories built upon them wouldn't be unheard of. Maxwell's equations for example cover several important fields of Physics. Newton's 3 laws cover classical mechanics.


Except the article has a categorical statement: "The 9 lines contain all present knowledge about nature, including all textbook physics and all observations ever made."

Take something simple like "F=ma" which is in lots of textbooks. Nothing in those 9 lines seem to suggest anything that could be used to derive that.

I'm sure that for a domain expert, who already knows all this stuff, they'd be useful as a memory aid to help them trot it out, but do they "contain all present knowledge about nature, ..."? Not so much, if you need a ton of other knowledge to make use of them.


Physicist here. I have no idea what they mean by the claim that the entropy having to be larger than k ln 2 “implies” thermodynamics. Implies it how?


Based on the book he points to, it goes somewhat like this:

(...) people asked what entropy was microscopically. The answer can be formulated in various ways. The two most extreme answers are:

Entropy measures the (logarithm of the) number 𝑊 of possible microscopic states. A given macroscopic state can have many microscopic realizations. The logarithm of this number, multiplied by the Boltzmann constant 𝑘, gives the entropy.

Entropy is the expected number of yes-or-no questions, multiplied by 𝑘 ln 2, the answers of which would tell us everything about the system, i.e., about its microscopic state.

In short, the higher the entropy, the more microstates are possible. Through either of these definitions, entropy measures the quantity of randomness in a system. In other words, entropy measures the transformability of energy: higher entropy means lower transformability. Alternatively, entropy measures the freedom in the choice of microstate that a system has.

(...)

Before we complete our discussion of thermal physics we must point out in another way the importance of the Boltzmann constant 𝑘. We have seen that this constant appears whenever the granularity of matter plays a role; it expresses the fact that matter is made of small basic entities. The most striking way to put this statement is the following:

There is a characteristic entropy change in nature for single particles: Δ𝑆 ≈ 𝑘. This result is almost 100 years old; it was stated most clearly (with a different numerical factor) by Leo Szilard. The same point was made by Léon Brillouin (again with a different numerical factor). The statement can also be taken as the definition of the Boltzmann constant 𝑘.

The existence of a characteristic entropy change in nature is a powerful statement. It eliminates the possibility of the continuity of matter and also that of its fractality. A characteristic entropy implies that matter is made of a finite number of small components.

The existence of a characteristic entropy has numerous consequences. First of all, it sheds light on the third principle of thermodynamics. A characteristic entropy implies that absolute zero temperature is not achievable. Secondly, a characteristic entropy explains why entropy values are finite instead of infinite. Thirdly, it fixes the absolute value of entropy for every system; in continuum physics, entropy, like energy, is only defined up to an additive constant. The quantum of entropy settles all these issues.
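For what it's worth, the two quoted definitions do agree with each other, since k ln W = (log2 W) · k ln 2; a quick numerical sanity check (my own sketch):

```python
import math

k = 1.380649e-23  # Boltzmann constant, J/K

for W in (2, 8, 1024):                  # numbers of equally likely microstates
    S_boltzmann = k * math.log(W)       # definition 1: S = k ln W
    questions = math.log2(W)            # yes/no questions to pin down one state
    S_questions = questions * k * math.log(2)  # definition 2: questions * k ln 2
    assert abs(S_boltzmann - S_questions) < 1e-12 * S_boltzmann
```

What the quote does not establish is why W = 1 (and hence S = 0) should be forbidden, which is the objection raised upthread.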


Funny. I feel like this is a highly compressed description of physics.

Only that the codec needed to actually decompress (understand) it is huge.

Edit: from an information theory perspective.


Whatever equation you have, bring everything to the left side, define a new variable Ω as whatever you have on the left side. Behold the ultimate theory of the universe.

  Ω = 0


There are some inequalities too, so we need some slack variables. /s


> No known observation and no known measurement contradicts these 9 lines, not even in the last significant digit.

Surely this is technically incorrect; there are countless observations that disagree with these 9 lines. We just suspect they are caused by measurement or operator error. There is evidence of all sorts of impossible things if you stick to just what is recorded. There can't be that many machines capable of measuring the extreme least-significant digit of these constants.


Many of the "27 numbers" have fairly wide (relative) error bars as well. We literally do not have exact values for several of these.


>there are countless observations that disagree with these 9 lines. We just suspect they are caused by measurement or operator error

That's not his point though. Obviously those are not included in the observations that matter.

Same way "there can't be any evidence of a perpetual motion machine" is not refuted by people coming up with their "evidence".


His point is technically incorrect. He can't exclude all the contradicting observations and then conclude that all the observations agree. That is a tautology by construction. And it is also unscientific; the physical constants don't have hard boundaries. Each significant digit gets fuzzier until we just can't guess what the next one is. There are going to be measurements that disagree about the least significant digit.

We can't even be that confident they are constants. We've only had good measurements of them for a few centuries. He's right that proving that would be a big deal, but it isn't true to say that there are no inconsistent measurements.


Which of these talks about the speed of light being constant for all frames of reference? This to me is one of the most fundamental and bizarre aspects of the universe.


That was point #2.


Non-physicist here. Was hoping this would do something for me, but unfortunately... Very first line, I look up "Lagrangian" to see what L means, and I get ten different definitions: https://en.wikipedia.org/wiki/Lagrangian. I'm guessing it's the one that says kinetic minus potential energy, but that seems arbitrary (unlike the sum which would be total energy). Couldn't this be more specific?

Most of the other lines have terms/constants/variables that I've never heard of or don't remember from school, and usually it's "Lagrangian" again. I don't know what it means to fix, restrict, yield, or complete a Lagrangian. I could keep reading about that, but at this point I don't think I'm learning physics the right way. If there's one thing I remember from class, it's that there aren't many shortcuts here; usually you have to start at fundamentals and prove your way up.

Also, idk how entropy ≥ (some constant) implies thermodynamics, and it seems the physicists here don't see it either.


> I'm guessing it's the one that says kinetic minus potential energy, but that seems arbitrary

It is arbitrary, in some sense: Lagrangians are specifications of a physical system in the same way that laws of motion are. Instead of directly describing a trajectory as the solution to some set of differential equations, the Lagrangian approach describes it as a stationary point (an extremum or a saddle point) of the action, which is the integral of the Lagrangian over time. The resulting equations of motion are then given by the Euler-Lagrange equation. L = K - V is (one formulation of) the Lagrangian for a system that obeys Newton's laws. Other systems have other Lagrangians, from which we get other laws of motion.

The other important formulation of classical mechanics is the Hamiltonian one, which very roughly speaking reparameterizes the Euler-Lagrange equation in terms of (generalized) positions and momenta, instead of just position. This turns your n-dimensional second order differential equation into a 2n-dimensional first order one, which is often more convenient to work with.

For simple "ideal billiard balls in a vacuum" situations it makes very little difference what approach you use, but Newtonian mechanics generalizes poorly. All of modern physics is based on the Lagrangian and Hamiltonian approaches.
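A tiny sketch of the Hamiltonian picture for the harmonic oscillator (my own toy example, not from the comment above): starting from L = mv²/2 - kx²/2, the conjugate momentum is p = ∂L/∂v = mv, so H = p²/2m + kx²/2, and Hamilton's equations dx/dt = ∂H/∂p, dp/dt = -∂H/∂x reproduce F = ma = -kx.

```python
m, k = 1.0, 1.0          # unit mass and spring constant, so omega = 1
x, p = 1.0, 0.0          # released from rest at x = 1
dt, steps = 1e-4, 62832  # integrate for ~2*pi seconds, i.e. one period

for _ in range(steps):
    p -= k * x * dt      # dp/dt = -dH/dx = -k x  (this is just F = -k x)
    x += p / m * dt      # dx/dt =  dH/dp =  p / m

# After one full period the oscillator should be back near x = 1.
assert abs(x - 1.0) < 1e-2
```

The update order (momentum first, then position) makes this a symplectic integrator, which is exactly the kind of structure the Hamiltonian formulation makes visible.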


"The theoretical minimum" series by Leonard Susskind et al. (along with the video series in YT) is a supergentle introduction & gloss of this whole table. If you stick to the books you'll never have to read popsci crap, again, except to join in with the eye rolling.


I was trained as a physicist and I love physics, but I believe it is a genuinely open question whether all these laws, which probably do encompass physics per se as we know it, actually therefore encompass the whole world.

If you genuinely believe in emergent phenomena, then these laws genuinely do not describe all the things that happen in the universe. I do not believe in emergence in this form, but some philosophers of science do.


My best friend growing up was always into physics, and at one point in high school I remember him committing to a deterministic view of the world, because everything could be described by physics, and if you knew the position and velocity of every particle in the universe, you could fast forward or rewind everything, and so on. After he got his Ph.D. we were talking about similar topics, and I asked him if he still believed that. He blinked a couple times and said, "absolutely not, it's much more complicated," which I have to admit I was glad to hear him say.

But what I ought to have asked is whether he thinks it's because the rules we know are incorrect or missing a few key features, or whether at some fundamental level they just can't do what he used to think they did. To my worldview, it borders on magical thinking to believe they couldn't describe everything in the universe. On the other hand, intuitively, it seems deeply unsatisfying (not to mention empirically unproven) to claim they do. It's a big question for sure.


In our current understanding of the world (quantum mechanics), the world is inherently probabilistic ("random", not deterministic) in the microscopic realm. There are differing opinions among physicists whether or not there could be a more fundamental layer of reality beyond this, and if so, whether it would be deterministic. Notably it could be said the mainstream view is that reality is not deterministic.


If you were shown a screen, told it consists of individual pixels, but no matter what microscope you grab you can't discern those pixels, does that screen consist of pixels or is it a continuous canvas?

That's kind of where physics is at, no? Until you succeed in building an apparatus that lets you see individual pixels, it's a continuous canvas for all intents & purposes.

Some discrete & deterministic layer underneath it all is a more elegant possibility imho. Might suit people who prefer "nature at its deepest level is math" worldview. But why would reality 'bother' to fit into that shoe? It just is. Whatever that is. Discrete or continuous, deterministic or probabilistic.


I think there are good epistemological reasons to at least consider the fact that this is not what quantum mechanics is about. There seem to be ways you could kind of try to make what you are talking about work, but they are incomplete and pretty incondite and have, at any rate and to the extent that I understand them, some pretty unappealing philosophical characteristics.

The real meaning of the commutation relation is not that there is a fundamental _relation_ between sets of observables but, I would argue, that at a deep level pairs of non-commuting observables like x and p share a single ontological substance, which we can view partially as either position or momentum depending on how we arrange our measuring apparatus.


The second most popular view is that the universe branches deterministically, which means there will be some observers in a very small percentage of branches who observe seemingly miraculous events (very low probability). Their notion of probability would be different from ours.


> if you knew the position and velocity of every particle in the universe

That's a big if, which quickly shot down my then early and naive absolute determinism view when Heisenberg uncertainty principle reared its head:

    σx•σp >= ħ/2 = h/4π
IOW try to increase your knowledge of momentum and you lose a corresponding amount of knowledge of position.


How do you mean emergent phenomena? Like with gliders and stuff in Conway's game of life? Where the rules of the game are trivial, but the higher level behavior is not.

That said, if that's what you mean, then it's been my belief for a long time that if we were given a book with all the true laws of physics at the lowest level we would still need to spend centuries deriving useful approximations for our practical concerns at the larger scales.
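To make the Game of Life point concrete, here is a minimal sketch in plain Python (the glider pattern and rules are standard; the coordinate layout is my own choice):

```python
from collections import Counter

def step(live):
    """One Game of Life generation; `live` is a set of (x, y) cells."""
    # Count how many live neighbours every nearby cell has.
    neigh = Counter((x + dx, y + dy)
                    for x, y in live
                    for dx in (-1, 0, 1) for dy in (-1, 0, 1)
                    if (dx, dy) != (0, 0))
    # Birth on exactly 3 neighbours; survival on 2 or 3.
    return {c for c, n in neigh.items()
            if n == 3 or (n == 2 and c in live)}

# A glider: trivial local rules, yet it propagates itself diagonally.
glider = {(1, 0), (2, 1), (0, 2), (1, 2), (2, 2)}
state = glider
for _ in range(4):
    state = step(state)
# After 4 generations the same shape reappears, shifted by (1, 1).
assert state == {(x + 1, y + 1) for x, y in glider}
```

Nothing in the two-line rule mentions "glider" or "motion" — that behavior only exists at a higher level of description.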


Well, consider the art installation called Descension by Anish Kapoor[1]. The vortex which makes up the piece emerges from the constraints imposed upon the fluid by the artist. If we replaced the water in the installation with any fluid of similar viscosity, the vortex would remain. Thus it appears to be the case that the vortex does _not_ supervene upon the actual substances which make it up per se.

In the case of Descension the constraints are somewhat artificial and imposed by the artist. Arguably, the phenomenon supervenes upon the physical structure of those constraints. But consider some living organism. Its various physical parts might be replaced by other parts with similar properties (with respect to their function) and the organism would keep existing as itself. But there isn't any particular substance which maintains the constraints that define the organism. Some philosophers of science (e.g. Terrence Deacon) hold that this means emergent phenomena do not supervene upon their material constituents per se, and are thus not determined by the laws of physics which dictate their properties. I don't buy it, but it is a position.

[1] https://www.thisiscolossal.com/2015/02/anish-kapoor-decensio...


Interesting. I definitely don't agree with that myself.

It sort of reminds me of the Philosophy of Mathematics[1], where there are some schools of thought that I consider vaguely ridiculous too. Like some schools of Platonism suggesting mathematical objects actually exist somehow instead of it just being a convenient linguistic shorthand.

[1] https://en.wikipedia.org/wiki/Philosophy_of_mathematics


It's a pretty standard belief. We've spent centuries as a species investigating the consequences of simple approximate rules like F=ma.


> The 9 lines contain all natural sciences. They contain physics, chemistry, material science, biology, medicine, geology, astronomy and all engineering disciplines.

so, can you go from these to predicting the structure of DNA/RNA and how they work?

also, I don't like the use of "nature" as some sort of conscious entity that prefers some behaviors to others.


"All science is either physics or stamp collecting" - Sir Ernest Rutherford


Evolution, geology, meteorology are just stamp collecting? They have their own explanatory models. Maybe they entirely reduce to extremely complicated physics, but we have no means of explaining them as such.


if he had studied molecular biology he might have had a different opinion - though probably not.


nature is left as an exercise to the reader


The physical world cannot be sufficiently described or predicted without superfluids, n-body gravity problems, and emergence.


Line 10: Nature dislikes anthropomorphization


This is not “all of physics” because we are still pretty far from deriving all physics from the microscopic scale: there's still a good chunk of physics that relies on macro-scale empirically-derived formulas. For instance: fluid mechanics and granular material mechanics.


Not a physicist.

How would someone go from these to Maxwell's equations?

Wondering just how big a leap is required?


This is line 6. It basically implies[0] a specific form for the electromagnetic Lagrangian from which Maxwell's equations follow immediately.

For the Lagrangian and the derivation of Maxwell's equations see https://en.m.wikipedia.org/wiki/Covariant_formulation_of_cla... .

[0]: The author is being very sloppy – tons of additional structure and additional assumptions are silently implied by his "9 lines".
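For reference, a sketch of the step being glossed over (standard textbook material, not specific to the author's framework): requiring Lorentz invariance, U(1) gauge invariance and renormalizability essentially fixes the photon Lagrangian,

```latex
\mathcal{L} = -\tfrac{1}{4} F_{\mu\nu} F^{\mu\nu} - J^{\mu} A_{\mu},
\qquad F_{\mu\nu} = \partial_{\mu} A_{\nu} - \partial_{\nu} A_{\mu}
```

whose Euler–Lagrange equations are the inhomogeneous Maxwell equations \partial_{\mu} F^{\mu\nu} = J^{\nu}, while the homogeneous pair follows identically from the definition of F_{\mu\nu}.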


Too bad that quantum theory and general relativity don't go together, and most probably all the listed equations are wrong at the conceptual level: the concepts that are denoted by the letters do not exist.


Ok, now I just need to know what a Lagrangian is.


Lagrangian mechanics[0] is a compact way to describe the dynamics of a physical system using a single object/equation. You can think of it as a generic description of a system, from which you can extract the equations of motion of the particles forming that system: which is generally an important part of what physicists want to achieve, i.e. predicting the motion of objects.

I'm not qualified to provide you with a more precise, still concise and (hopefully) still accurate description. However, if you want to dig further, the most beginner friendly introduction that I've stumbled upon is Susskind's The Theoretical Minimum - Classical Mechanics [1](there's a book & a series of videos).

[0]: https://en.wikipedia.org/wiki/Lagrangian_mechanics

[1]: https://en.wikipedia.org/wiki/The_Theoretical_Minimum
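One way to see what a Lagrangian buys you, as a numerical sketch (assuming a unit-mass, unit-stiffness harmonic oscillator): the physical trajectory is the one that makes the action — the time integral of the Lagrangian L = T − V — stationary, so any same-endpoint perturbation of it increases the discretized action here.

```python
import math

def action(path, dt):
    """Discretized action S = ∑ (½v² − ½x²) dt for a unit harmonic oscillator."""
    s = 0.0
    for i in range(len(path) - 1):
        v = (path[i + 1] - path[i]) / dt       # finite-difference velocity
        x = 0.5 * (path[i + 1] + path[i])      # midpoint position
        s += (0.5 * v * v - 0.5 * x * x) * dt  # L = T − V
    return s

n = 1000
dt = 1.0 / n
ts = [i * dt for i in range(n + 1)]
# True solution of x'' = −x with these endpoints:
true_path = [math.sin(t) for t in ts]
# Perturbation that vanishes at both endpoints, so endpoints stay fixed:
perturbed = [math.sin(t) + 0.1 * math.sin(math.pi * t) for t in ts]

# The physical path has the smaller (stationary) action.
assert action(true_path, dt) < action(perturbed, dt)
```

The equations of motion themselves come out of the same object via the Euler–Lagrange equation, which is what makes the formalism so compact.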


I assume you are joking, but for others reference, this is a great way to answer that question:

http://ppc.inr.ac.ru/uploads/476_Hamill.pdf


Amazing that almost everything after 1 (apart from 5) is circa 100 years old. And almost nothing is newer than circa 50 years ago.


Doesn't matter much since this seems to be not really true anyway...


That's not what's wrong with it.

There has been almost no definitive progress in fundamental physics in the past 50-100 years.


40-50 years, maybe. 100 years is greatly overstating it - for example, the unification of all forces except gravity dates to the 70s.


That's exaggerated. The Schrödinger equation was published in 1926 and tons of particles were discovered in the decades after. But most would agree that progress has been slow in the past 50 years, with the discovery of dark energy about 25 years ago being one of the most significant.


That's just the way science and technology works. You get the largest gains at the beginning as you're picking the low hanging fruit.

You could say the same of chemistry too. Or even about industrialization in general.


Why does this read like a cult manifesto?


Obey the 9 Lines.


As a programmer this looks unusable because of all the overloads implied.


> What determines the principles in lines 1 to 5? They imply that there is a smallest length and time interval.

I would love if someone could elaborate this point.


Planck length and Planck time.

Not a physicist, but from what I recall, the Planck length is presumed to be the smallest measurable distance before stuff turns into a black hole, which isn't very useful

Planck time is the time required for light in a vacuum to travel the Planck length.

Physics goes to shit if you try to go shorter on either measurement.


Important addition: Most statements regarding Planck units are hypotheses or educated guesses at best. (Other people call them numerology – because people simply combined some fundamental constants and out came the Planck units.) We don't really know whether there is a minimum length or minimum time, what happens at the Planck scales etc.


These are the scales at which neither General Relativity nor the Standard Model (QM, QFT, etc.) alone dominates, so you cannot just ignore one and do the calculations with the other. To solve systems at these scales, you need a working theory of quantum gravity. We do not yet have one of those. These quantities are essentially where our current models of physics break down and we can no longer make anything resembling a prediction.


Not a physicist either but currently self learning quantum physics. Note that Planck length and time are just units - they do not mean that space and time are quantized. Quantum theory treats space and time as continuous variables. While it has been speculated that space and time are quantized I do not believe this has been experimentally verified because the scales involved are beyond current experimental reach.


> the Planck length is presumed to be the smallest measurable distance before stuff turns into a black hole

This is not correct. The Planck length happens to be very very roughly the scale at which we expect effective theories that ignore quantum-gravitational effects (i.e. the Standard Model) to break down. It has no special physical significance.


Unreadable on iOS 16.6 and browser Firefox 116.2 (33536). Text overflows right border and can’t be scrolled

Toggle “ask for desktop version” doesn’t make any difference


Use Reader mode


This is confusing the representation of something for the thing itself. It’s like thinking that the word John is enough to know the person John.


Is this New Physics like the New Math?

Make 10 from 8 + 5... or ... reduce Maxwell's equations into U(1)?


Ha! First thing I was looking for is Maxwell’s equations and more shocked that they got summarized in U(1) only :)


Wightman axioms better


There goes my next month reading these books.


This post is wrong. Get better sources to study.

There are a few comments here that explains the errors, and there are a few additional comments about other problems in the previous submission https://news.ycombinator.com/item?id=30733666


All of Physics in 1 Line:

1. All of physics




Guidelines | FAQ | Lists | API | Security | Legal | Apply to YC | Contact

Search: