* These are not elementary particles.
* It was predicted for a long time that these composite particles should exist. So it is not "new" in the sense that it was unexpected, but we finally have the resolution (energy, luminosity etc.) to detect these with statistical significance.
* These belong to the same class as protons and neutrons - they are hadrons, made of multiple quarks.
Here's a super simple overview of Quantum Field Theory & Particle Physics:
Everything is made of force fields and matter fields.
* We discovered that force fields (e.g. electromagnetism) are quantized, giving rise to quantum mechanics etc.
* Matter fields are also quantized, hence their excitations behave like discrete particles - the ones we observe at the atomic scale, for instance.
* In hindsight, we should've been able to predict this once E = mc^2 told us energy <-> matter: if energy is quantized, so should matter be.
Energy isn't necessarily quantized though. For instance, the spectrum of an unbound particle is continuous. So why should it follow from E = mc^2 that matter is quantized?
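For concreteness, here is the standard textbook contrast (nothing specific to this thread, just the usual pair of examples): it is confinement that produces a discrete spectrum, not energy as such.

    % Bound particle (infinite square well of width L): discrete levels
    E_n = \frac{n^2 \pi^2 \hbar^2}{2 m L^2}, \qquad n = 1, 2, 3, \ldots
    % Free (unbound) particle: E varies continuously with k
    E(k) = \frac{\hbar^2 k^2}{2 m}, \qquad k \in \mathbb{R}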
I assume you've heard of particle/wave duality, or how really tiny things are never exactly in only one place.
There are a set of wave equations that describe how this works.
There are also two kinds of things. Some things can go through each other, like photons. Other things bounce off each other, like neutrons (and other things considered matter).
The equations for the kind of things that bounce off each other would be a matter field.
> There are also two kinds of things. Some things can go through each other, like photons. Other things bounce off each other, like neutrons (and other things considered matter).
This is not true. The difference between bosons and fermions lies in the way swapping two of them works. The carriers of the weak force, the W and Z bosons, for example interact with each other (the W bosons are even electrically charged) and can therefore scatter off each other. Gluons, the carriers of the strong force, also interact with each other. Even photon-photon scattering is a thing. On the other hand you can try to collide two neutrinos, which are fermions, for quite some time and not much will happen.
> The difference between bosons and fermions lies in the way swapping two of them works.
That sounds like what Wikipedia says is the rigorous version of the Pauli exclusion principle[1].
I was trying to get close to the non-rigorous version (first paragraph of the link) in terms that are easily understandable without having taken a university QM course. I guess "can/can't be in the same place at the same time" would be a better approximation of it?
Loosely speaking, the wave function of a quantum mechanical system specifies, for each possible state of the system, the probability of finding the system in that state. Actually it is not the probability but the probability amplitude, a complex number from which you can derive the probability by taking its squared magnitude.
Assume you have two identical fermions, say two electrons, the first one in state x and the second one in state y. State means everything required to fully describe the particle, for example position and spin. Therefore x stands for the first electron being in a specific position and having a specific spin, similarly for y and the second electron.
Let A(x, y) be the probability amplitude for finding the first electron in state x and the second electron in state y, i.e. the first argument of A is the state of the first electron, the second argument is the state of the second electron. Now swap the two electrons, take the first and put it where the second one is, take the second one and put it where the first one was. Also change the spins as necessary. The probability amplitude is now A(y, x), the first electron is now in state y, the second electron is now in state x.
The important thing now is that the two electrons are identical: you can not tell the difference between the situations before and after swapping the two electrons. Had you painted one electron blue and one red, then you could easily tell the difference, but without that you can not. That was the entire point of swapping the electrons, bringing each into exactly the state of the other one.
But if you can not distinguish the two situations, then it had better be the case that they have the same probability, i.e. A(x, y) = A(y, x). But that is not quite right; A is the probability amplitude, not the probability. It turns out that there are actually two valid possibilities, A(x, y) = A(y, x) and A(x, y) = -A(y, x). As mentioned at the beginning, you get the probability by taking the squared magnitude of the probability amplitude, so the minus sign in the second case vanishes.
The first possibility is how bosons (particles with integer spin, for example photons and gluons) behave, the second one is how fermions (particles with half integer spin, for example quarks and electrons but also helium-3) behave. Now we finally arrive at the important point, what happens if both electrons are in the same state, i.e. if the first electron is in state x and the second electron is also in state x. Then the probability amplitude is A(x, x) and we have to satisfy A(x, x) = -A(x, x) because electrons are fermions.
But there is only one complex number identical to its negative, and that is of course zero. Therefore the probability amplitude, and in consequence the probability obtained from its squared magnitude, are both zero. This means the probability of finding the system in the state where the first electron is in state x and the second electron is also in state x is zero: two electrons, or more generally two fermions, can never be in the exact same state.
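The whole argument above, compressed into symbols:

    % Swapping two identical particles: + for bosons, - for fermions
    A(x, y) = \pm A(y, x)
    % Fermions, with both particles in the same state x:
    A(x, x) = -A(x, x) \;\Rightarrow\; A(x, x) = 0 \;\Rightarrow\; P = |A(x, x)|^2 = 0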
I have actually! And I also know about bosons and fermions. I didn't realize the phrase "matter wave" was just the name for the wave in wave-particle duality, that's all :) thanks!
They are properly called fermionic fields [1] as opposed to bosonic fields. I also want to note that the »super simple overview of Quantum Field Theory & Particle Physics« really is super simple. As far as I can tell - I am not a physicist - none of the statements is actually correct.
Matter fields are indeed a real thing; a Google Books search will reveal hundreds of books etc. They're also called fermionic fields.
Here's the full range of elementary particles, all 17 of them, in the Standard Model (see the quick tally sketched below):
Fermions:
- Leptons (6) (electrons, neutrinos etc.)
- Quarks (6) (protons and neutrons are made of this)
Bosons:
- Gauge Bosons (aka Force Carriers) (4)
- Higgs Boson (gives mass to stuff) (1)
What classical physics called force fields (e.g. Maxwell's equations for electromagnetism), we now describe in terms of gauge bosons.
Electromagnetism : Photon
Strong Force: Gluon
Weak Force: W & Z bosons
Gravity: No one knows!
One extra boson that gives everything mass - the Higgs.
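Here's a quick tally of the above as a sketch (names only; antiparticles, and the gluon's colour copies, are not counted separately):

    # Tally of Standard Model elementary particles.
    standard_model = {
        "leptons": ["electron", "muon", "tau",
                    "electron neutrino", "muon neutrino", "tau neutrino"],
        "quarks": ["up", "down", "charm", "strange", "top", "bottom"],
        "gauge bosons": ["photon", "gluon", "W", "Z"],
        "scalar bosons": ["Higgs"],
    }

    total = sum(len(group) for group in standard_model.values())
    print(total)  # 17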
Force fields are quantized (see the photoelectric effect and Einstein's 1905 paper, which won him a Nobel Prize).
Matter fields aka fermionic fields are also quantized, which makes these fields behave as if they are composed of discrete states - giving rise to particles.
Indeed, in the Wikipedia article you link to, the first sentence reads: "fermionic field is a quantum field whose quanta are fermions".
i.e. matter/fermionic fields are quantum fields whose quanta are fermions, such as electrons and quarks.
So, actually, all my statements are correct. Do you care to give a concrete example as to which one is not correct?
This is the best summary I've read (not that I've read much) but it really puts things in perspective.
Correct me if I'm wrong, but I'm going to try to summarize to make sure I understand:
The properties of protons, neutrons, and electrons that make them unique/distinct from each other (mass, electric charge) arise from the composition of each out of smaller particles, which each are/carry/act as the respective mass + charge + the other properties.
Is there any theory as to what this "looks" like? Or is the best we can do "it's a bunch of these things mashed together and the only way to see them individually is to bash them together until they break"?
If the strong and weak forces are particles, does that mean they're 1: literally everywhere, not necessarily stuck to any larger particle and 2: like glue?
I'm also confused about the relationship between gravity and mass, given that the higgs is stated as corresponding to mass, mass is traditionally thought of as what gravity acts upon, but the wikipedia chart states that gravity acts upon all particles.
There are a few errors in your comment. I hope my comment can clarify some of them.
Electrons, Positrons, Neutrinos, Muons, ... are elementary particles. You can't break them.
Protons, Neutrons, ... are composed of three quarks. Quarks are elementary particles that you can't break.
The quarks inside the proton and neutron are bound by the strong force. The strong force is really strong, so no one has ever seen an isolated quark.
We only know they are formed by three quarks because if we make them collide at high speed the quarks from one of them can be recombined with the quarks of the other and form a few new particles.
It's more complicated, because during the collisions it is possible to create a pair of quark-antiquark. So if you collide a proton and an antiproton at a high speed, after the collision you have to rearrange the 3 quarks from the proton, the 3 antiquarks from the antiproton, and all the quarks and antiquarks that appeared in the collision.
The exact number of pairs of quarks and antiquarks and their flavors are determined by probabilities derived from difficult calculations.
And actually, all this mess is not instantaneous: the quarks can rearrange themselves into some particles that later decay into other particles with other quarks.
In particular in this experiment they didn't see the "new" particles directly, because they live for a very short time. They only saw the particles that were formed after the "new" particle decayed.
---
About
> If the strong and weak forces are particles
No. It's important to distinguish between a force and the particles that are the carriers of the force. The differences and relations are subtle, so it's better to delay that discussion for another day.
> gravity ... mass ... higgs
They are also different things, interrelated but distinct.
You've got it backwards. Fields are fundamental. Nature is just one big complicated multi-component field. The field is quantized in various ways that make states 'clump' in discrete ways: these are particles. The strong and weak forces are mediated by particles because strong and weak interactions are composed of quantized state transitions.
Gravity is a field, and mass is a property of how it is quantized. The Higgs particle is a description / side-effect of this quantization.
> The properties of protons, neutrons, and electrons that make them unique/distinct from each other (mass, electric charge) arise from the composition of each out of smaller particles, which each are/carry/act as the respective mass + charge + the other properties.
Protons and neutrons are composite, made out of two up quarks and one down quark, and two down quarks and one up quark, respectively. Plus gluons holding them together. There are also four other quarks: bottom, top, charm, and strange. And of course one antiquark for each of the six quarks. There is a huge number of particles made out of quarks, called hadrons. Hadrons are either baryons, like the proton and neutron, made out of three quarks, or mesons, made out of one quark and one antiquark. There are also exotic things like tetraquarks.
Electrons are, as far as we know, fundamental and not made out of other particles. The same goes for the muon, the tau, and the three accompanying neutrinos. There is again an antiparticle for each particle. This group is called leptons.
The properties of composite particles are determined by their constituents, but not in a trivial way. The mass for example is usually bigger than the mass of the constituents because the binding energy contributes to the mass.
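Electric charge is the one property that does combine trivially, by simple addition, so here is a minimal worked sketch of "determined by their constituents" (the quark charges, in units of e, are well established):

    from fractions import Fraction

    # Electric charge of up- and down-type quarks, in units of e.
    charge = {"u": Fraction(2, 3), "d": Fraction(-1, 3)}

    proton = ["u", "u", "d"]    # two up, one down
    neutron = ["u", "d", "d"]   # one up, two down

    print(sum(charge[q] for q in proton))   # 1  -> proton charge +1e
    print(sum(charge[q] for q in neutron))  # 0  -> neutron is neutral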
> Is there any theory as to what this "looks" like? Or is the best we can do "it's a bunch of these things mashed together and the only way to see them individually is to bash them together until they break"?
Quantum chromodynamics is the theory of quarks and gluons.
> If the strong and weak forces are particles, does that mean they're 1: literally everywhere, not necessarily stuck to any larger particle and 2: like glue?
The electromagnetic, the strong, and the weak interaction are mediated by their respective bosons; we also suspect this for gravity. You can observe the bosons on their own, they are not like springs and rubber bands connecting the particles between which they mediate forces. Actually there are not really any photons bouncing back and forth between two electrons pushing them away from each other due to their like charges. But I can not offer any good model, that is something I never managed to really understand.
> I'm also confused about the relationship between gravity and mass, given that the higgs is stated as corresponding to mass, mass is traditionally thought of as what gravity acts upon, but the wikipedia chart states that gravity acts upon all particles.
Most mass comes from [binding] energy; the Higgs mechanism contributes only a small bit. The Higgs boson has nothing to do with that at all, it is just an excitation in the Higgs field. Gravity acts on energy. As far as I can tell mass is just an abstraction. If you put massless photons into a mirror box to bounce around, they add energy to the box, which makes the box harder to move, i.e. you have to push against the photons hitting the wall you are pushing on. As a convenient abstraction we say the box got heavier, it has more mass, but there is actually nothing fundamentally heavy in the box; the photons have no mass, only energy and momentum with which they hit the wall, making it harder for you to push it.
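A back-of-the-envelope sketch of that mirror box; the photon count and wavelength below are made-up numbers, purely for illustration:

    # The box's rest mass grows by m = E / c^2, even though photons are massless.
    h = 6.626e-34        # Planck constant, J*s
    c = 2.998e8          # speed of light, m/s

    n_photons = 1e30     # hypothetical number of trapped photons
    wavelength = 500e-9  # green light, m

    energy = n_photons * h * c / wavelength  # total photon energy, J
    extra_mass = energy / c**2               # added rest mass of the box, kg
    print(extra_mass)                        # ~4.4e-6 kg for these numbers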
I am not a physicist, take all this with grains of salt.
> We discovered that force fields (e.g. electromagnetism) are quantized, giving rise to quantum mechanics etc.
You are confusing quantum mechanics with relativistic quantum field theory. Quantum mechanics was developed as a consequence of many different experiments showing quantization phenomena or requiring quantization to explain them, among them the spectrum of black-body radiation and the photoelectric effect, but also the quantization of charge and spin. Quantum mechanics and non-relativistic quantum field theory are unable to describe electromagnetic fields, the former because it can not handle the creation and annihilation of particles, the latter because photons are always relativistic particles and therefore require a relativistic description. Only with the development of quantum electrodynamics, a relativistic quantum field theory, was a quantum mechanical treatment of the electromagnetic field possible. But this was 20 or 40 years after the inception of quantum mechanics, depending on from where you count.
> Matter fields are also quantized, hence their excitations behave like discrete particles - the ones we observe at the atomic scale, for instance.
It is important to understand that quantum field theory is very different from quantum mechanics. In quantum mechanics a wave function describes the state of a system in Hilbert space; it describes properties of particles. But this runs into problems if the system contains many particles. Two electrons, for example, are indistinguishable, and swapping them does not change the state, at least up to a sign change of the wave function, which is the difference between fermions and bosons. This complicates the mathematical treatment. And as mentioned before, this approach is unable to handle the creation and annihilation of particles. Quantum field theory therefore takes a very different approach and describes with occupation numbers in Fock space how many particles are in each state.
In consequence the fields in quantum field theory are just a mathematical tool to handle many particle states. We started with particles and introduced fields to describe them mathematically, we did not discover that a field is quantized and therefore looks like a collection of particles. Admittedly this is a contentious issue, there are people claiming that those fields are real and more than a mathematical tool.
> In hindsight, we should've been able to predict this once E = mc^2 told us energy <-> matter: if energy is quantized, so should matter be.
We knew or at least suspected that matter is quantized long before we discovered the photon; atoms and particles in general are a very old idea. So at best we could have inferred that energy is quantized from the quantization of matter, not the other way round. But the idea of photons actually predates E = mc², too. You are also probably misinterpreting what E = mc² actually says; you can not use it to link the quantization of energy to the existence of particles, at the very least not in any obvious way. The relationship between mass, energy, and particles is complicated.
And again, I am not a physicist, do not take what I say as the final truth, use it as a starting point. Corrections from actual physicists welcome.
Yes, but these experiments serve at least three purposes.
One, to verify the Model (after all, if its predictions fail in some part it will have to be revised).
Another, to verify the characteristics of the predicted particles (there might be some differences from the prediction that are important, but not hypothesis-breaking).
Finally, however unlikely and yet most excitingly (to me at least), to open new avenues of questioning and hypothesis up by really throwing into question some things.
It's worth remembering that the Higgs particle was also predicted by the standard model, and no one underestimates the importance of that confirmation.
The Higgs is an elementary particle, so its discovery was much more exciting. There are a lot of particles made up of quarks, like the five mentioned in the article:
What is at least one piece of replicated scientific evidence that these tables do not contain merely socially constructed crap, modern-day alchemy deduced from flawed models and/or instruments?
These tables are edited by the Particle Data Group; they summarize and distill the empirical evidence of thousands of high-energy physics experiments. Not every experiment is replicated, but some are, hence the different ratings of how sure the editors are that the particles exist.
From their website:
"In the 2014 Review, the listings include 3,283 new measurements from 899 papers, in addition to 32,000 measurements from 9,000 papers that appeared in earlier editions. Evaluations of these properties are abstracted in summary tables."
> probability, statistics, accelerators and detectors
Surely, there could not be any error here. Nothing socially constructed.
Doesn't it already border on nonsense that in a universe sustained by a couple of fundamental conservation laws there are supposed to be hundreds of elementary (presumably fundamental) particles which emerge and disintegrate in time - which is not even an intrinsic property of "more real" things like photons?
> Surely, there could not be any error here. Nothing socially constructed.
There could be. Physicists have been trying very hard for the last 40 years to find something the standard model cannot explain, without success. Some experimentalists are actually disappointed about this.
> Doesn't it already border on nonsense that in a universe sustained by a couple of fundamental conservation laws there are supposed to be hundreds of elementary (presumably fundamental) particles which emerge and disintegrate in time - which is not even an intrinsic property of "more real" things like photons?
You are confused. The standard model has 17 elementary particles (+ antiparticles). There are a lot of composite particles, like the ones the article talks about. Not sure what you mean about decays, but photons are a special case because they move at the speed of light. Other elementary particles, like the heavier quarks, can and do decay.
Given that science is done with models and instruments I am confused what answer you expect. Those models and instruments are validated in thousands of different experiments, at different energy and length scales, from astronomical observations to man-made particle accelerators. Each of the experiments and observations tests a slightly different part of the theory, all of them mostly in agreement. It is the rare disagreements that are actually most exciting, because they pave the way forward to pieces of the theory we do not understand yet.
Here are two independent examples that agree in their results: experimental measurements at the LHC; theoretical predictions from QFT.
Why, there are way too many well-documented instances of socially constructed and socially accepted bullshit in the history of mankind. Actually, it is a much more difficult task to find instances of accurate approximations to the truth.
There have been times when Hegelian "logic" was accepted, published by Oxford University Press, peer-reviewed, highly praised and successfully taught to students. I have read parts of the Encyclopedia of the Philosophical Sciences. I have read it after The Principles of Mathematics and things like the Haskell Prelude.
I have read peer-reviewed commentaries to The Highest Yoga Tantra and such crap as commentaries to the Hatha Yoga Pradipika by some Australian lady with some funny Hindu nickname.
I have read even beautiful sufi texts in which they freely mix and match anthropomorphic qualities to produce a beautiful carpet of linguistic patterns which describe nothing that exists. Peer-reviewed and highly praised, of course.
Socially constructed nonsense is not something rare and uncommon. To the contrary, most of publicly available information is bullshit, or at least highly inaccurate, full of meaningless generalizations, flawed logic and amounts to nothing but mere compilation of current memes.
So, I am more or less familiar with how such things can emerge. My question is - what is a single falsifiable experiment which proves that this is not a socially constructed, highly sophisticated sectarian set of beliefs supported by complex (but meaningless) simulations (instead of merely a book of dogma)?
There is plenty of stuff in science that is imperfect and deeply flawed. Look at all the poorly reproduced studies on various nutrition and health products, for example. The Standard model is not part of that weak set.
The way to oppose social construction is increased rigor and experimentation. The Standard Model has some of the highest rigor and largest amount of experiments backing it.
Skimming through your comment history it's obvious you have a bone to pick with modern physics. Is there some discrete point in history where you think we started diverging from falsifiable experiments so we can perhaps compare before and after?
Not in the same way. There was no experimental evidence for the Higgs field before the Higgs was observed. These new particles arise naturally out of parts of the Standard Model that are already experimentally tested.
You're not wrong, but you'd have been really hard-pressed to find someone of significance who didn't already believe the Higgs mechanism was there. In the sense that mass exists, there was a high degree of expectation there too. It's actually quite amazing to have so many of these recent discoveries, also be confirmations. It's been over a century of this, starting with Relativity.
Professor Tim Gershon, Professor of Physics at University of Warwick and UK spokesperson for the LHCb experiment:
“After the LHCb experiment is upgraded in the next long shutdown of the LHC (during 2019-20), it will be able to move to the next stage in the search for new particles: namely, doubly heavy baryons. These states – which contain two charm quarks or two beauty quarks or one of each – have long been predicted, but never yet observed. Their discovery will help to address important unsolved questions about how hadrons are bound together by the strong interaction.”
So I would assume that yes, they have been predicted, and this is opening the door for further confirmations?
Please take more care in quoting. What you've quoted does not describe the current work, but rather something they haven't demonstrated yet called "doubly heavy baryons". That sentence was immediately preceded by this one:
Professor Tim Gershon, Professor of Physics at University of Warwick and UK spokesperson for the LHCb experiment, explained what will come next for the LHCb experiment: “After the LHCb experiment is upgraded in the next long shutdown of the LHC (during 2019-20), it will be able to move to the next stage in the search for new particles: namely, doubly heavy baryons.
> Their discovery will help to address important unsolved questions about how hadrons are bound together by the strong interaction.
If the particles were already predicted by the standard model, what kind of unsolved questions are to be addressed here, besides validating the predictions of the standard model even further? (serious question)
The standard model provides a set of postulates that can be used to predict possible composite particles, their masses, and decay times.
However it is computationally infeasible to calculate them directly, without using various approximations. Physicists try to solve these problems numerically (see for example the field called lattice QCD), but it is not always possible, and it leads to introducing various approximations that produce errors and other artifacts in the numerical predictions.
So the particles were allowed by the standard model, but we didn't know their properties for sure. This provides a way to verify numerical predictions that have already been made (I don't really know whether they were done for these exact particles or not) and gives us data about the exact properties of these particles.
One could possibly draw an analogy with (quantum) chemistry here.
I'm not sure about this particular one, but a general idea is that the properties you measure in a particle depend on a lot of virtual particles.
It has 9 Feynman diagrams. If you look at the top left diagram, there is an electron that enters from the bottom right corner, then it emits a photon that goes out through the top left corner, then the electron goes out through the top left corner.
The following two diagrams show the case where the electron emits a second (and third) photon and reabsorbs it, so the second (and third) photons are not visible to the experimenter; they are virtual photons. These additional photons are only important because they slightly change the properties of the electron.
In the next three diagrams the photon is so strong that it can spontaneously split into an electron and a positron. It looks like a loop/circle, because positrons are like electrons traveling backward in time. They are virtual electrons, and again they are not visible in the lab; they are only important to make a tiny correction to the result of the experiment.
The other three diagrams have two virtual electrons, which make even smaller corrections.
And in addition to the virtual electrons, there can be virtual muons and tauons. They are like electrons but with more mass. So the probability of having one of them is smaller, and so the correction is smaller. In this case, I think the correction is so small that it's impossible to measure it.
And you can have other virtual particles, like virtual quarks and virtual Ws, anything that has a charge. Moreover you can have virtual unknown particles (with charge), because nature doesn't care whether we know about the particle yet or not. But they are heavier, so the correction is negligible.
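For context, the textbook quantity these diagrams feed into is the electron's anomalous magnetic moment; the leading one-photon correction is Schwinger's famous result, and each extra loop costs roughly another power of the fine-structure constant α ≈ 1/137, which is why the corrections shrink so quickly:

    a_e = \frac{g - 2}{2} = \frac{\alpha}{2\pi} + O(\alpha^2) \approx 0.00116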
If you change the experiment, and for example make an electron collide with a positron, then the calculations are very similar, but there is more energy lying around, and the corrections from heavy particles are more important, so this variation is more useful for discovering new particles.
Back to your question ...
The new particles are composed of three quarks, but actually they also contain a lot of gluons and virtual quarks and antiquarks. To do any calculations you have to include a lot of diagrams like in the figure linked above, and a lot more, many many more.
IIRC the calculation is so complex that it's not possible to compare the experimental results with theoretical calculations directly. Perhaps they have some heuristic to compare the results with the results of similar particles.
This was probably part of a bigger experiment that produces a lot of particles, and they are trying to classify them into families. And perhaps in the classification they can spot some strange pattern that may provide a hint that there is a new elementary particle.
Because a prediction is just an assumption (theory), and it can become a house of cards when basing future science on that assumption. Observation is proof, so future science can use that proof without worry.
You start with a hypothesis with no assumption of truth.
Using that hypothesis you make a prediction and then use observation to test your prediction.
During your observation you may find proof that your prediction was correct, which in turn provides support for your hypothesis.
Once sufficient evidence is found for a hypothesis, it becomes a theory.
I'd say you have a theory from which you deduce a model that consists of various assumptions plus a hypothesis. If this hypothesis has not yet been compared to a set of observations, then it is also a prediction about that set of observations.
Also, the distinction between assumption and hypothesis is subjective, it depends what aspect of the phenomenon you care about at that time. Another term for assumption could be "auxiliary hypothesis".
Proof refers to the set of logical deductions (from theory + assumptions) that lead to the model; it has nothing to do with the observations.
It's important not to confuse theory with scientific theory. They have very different meanings. In everyday speech a theory is roughly equivalent to a guess. In science, a theory is a well tested explanation of some phenomenon.
It's always hypothesis then theory. Your hypothesis may be based on other theories, but it is itself not a theory.
From Wikipedia: "The scientific method involves the proposal and testing of hypotheses, by deriving predictions from the hypotheses about the results of future experiments, then performing those experiments to see whether the predictions are valid. This provides evidence either for or against the hypothesis. When enough experimental results have been gathered in a particular area of inquiry, scientists may propose an explanatory framework that accounts for as many of these as possible. This explanation is also tested, and if it fulfills the necessary criteria (see above), then the explanation becomes a theory. This can take many years, as it can be difficult or complicated to gather sufficient evidence."
Math tells me there must be 216, no? 3 quarks make a baryon, there are 6 types of quarks, so 6^3?
Idk if up up up baryons are allowed though, or any other baryon made of 3 equal quarks.
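If the order of the three quarks doesn't matter, the count is combinations with repetition rather than 6^3. A quick sketch, ignoring spin, excitations, and the fact that the top quark decays too fast to hadronize:

    from itertools import combinations_with_replacement

    flavors = ["u", "d", "s", "c", "b", "t"]

    # Unordered quark triples: C(6+3-1, 3) = 56, not 6^3 = 216.
    contents = list(combinations_with_replacement(flavors, 3))
    print(len(contents))                 # 56
    print(("u", "u", "u") in contents)   # True - uuu exists (the Delta++ baryon)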
You can also have excited particles that have the same quark content. They are called "resonances", are unstable and can decay. For example, the five particles discovered here are excitations of the particle containing two strange and one charm quark.
AFAIK, the experimentally determined rest masses of these excited states/particles (same thing, different ways of looking at it) agree well with the calculated ones. So yes, they were predicted. No surprises sadly; of course it's still a huge achievement!
Yes, and note that these are not fundamental particles (like the Higgs was, for example), but composite particles of yet another combination of the fundamental quarks. The SM predicts the existence of hundreds (thousands?) of these.
Actually it is even just one new particle (the Omega_c baryon), but they observed five different excited energy states of it (like observing different excited states of a hydrogen atom), so-called resonances [1]. But the discovery is still exciting because it must have been really hard to find something we don't really know what it looks like in such a large amount of noise.
The problem is that it is hard for us to predict the masses/energies of new composite particles: although the Standard Model provides a way to do it in principle, it is computationally infeasible.
>"it should have been really hard to find something that we don't really know how looks like in such large amount of noise."
But if there are thousands of different such "surprising-to find-particles" to possibly detect, is it actually surprising to observe one of them?
Edit:
Also, from the top answer at your link: "The first generation of elementary particles are by observation not composite and therefore not seen to decay...The Standard Model of elementary particles, with the three generations of matter, gauge bosons in the fourth column and the Higgs boson in the fifth."
From wikipedia: "In the Standard Model, the Higgs particle is a boson with no spin, electric charge, or colour charge. It is also very unstable, decaying into other particles almost immediately."
https://en.wikipedia.org/wiki/Higgs_boson
> But if there are thousands of different such "surprising-to find-particles" to possibly detect, is it actually surprising to observe one of them?
It is hard to correctly identify exactly which particles from those possible thousands are observed in the given data. There are two main problems: the properties of those not-yet-observed particles are not well known (because it is computationally hard to predict them from the standard model), and the number of useful events is much, much smaller than the number of events that correspond to already known particles.
> So do elementary particles decay or not?
I don't see a contradiction here: the first generation of elementary particles does not decay (or has not been observed to decay yet), and the Higgs boson is not among them, because the author of the answer is talking about the first generation of fermions and the Higgs boson is not one of them.
Yes. They're baryons (three-quark states). This is analogous to the discovery/synthesis of new isotopes in the middle of the last century. Technically they were "predicted", but it's still important to take the measurements. Occasionally there are surprises.
Your comment is being downvoted for a number of reasons. I'd like to use it as soapbox for a moment.
One reason I left academia was the constant cycle of grant funding proposals followed by a flurry of publication followed by more proposals. These proposals often carried some requirement for language that, in my view, was a direct response to a certain kind of taxpayer, in order to justify on some tangible level the "need" for the research to be funded.
Basic science, and theoretical research alike deserve far more funding than they get. That a significant portion of our country views the funding model we have for scientific discovery as "beg[ging]" is one of the greatest failures we have as a culture.
Oh, some of it does come down to almost begging. It shouldn't, because funding discovery is important. People doing long hours of hard work for the common good shouldn't have to beg for funding. Then again, we shouldn't have to beg for school lunches or housing for military vets either but look at social media right now.
I think on a philosophical level everyone should be able to justify their work at any time. But on a practical level this can be distracting to the people doing the work and it gets us into the begging/funding/publishing cycle we're stuck in now. Maybe the reward system just needs to be changed.
How much of that attitude do you think is due to a cultural shared failure to value basic science and theoretical research, and how much is due to a failure to communicate the benefits of such activities?
I think it's quite possible that when people hear "it's complicated" or "we won't see practical applications for a decade", they wonder whether it's money well-spent. Especially if funding for things which have an immediate impact (e.g: healthcare, schools) is perceived as desperately lacking.
It seems to me that the joy of discovering and understanding the universe does not get communicated to the general public, and that's a failing of those who are asking that the general public fund their research (whether individual scientists or funding bodies such as EPSRC).
Also I think that popular media accounts of some ridiculous-sounding research that got funding tend to get extrapolated (by at least some folks) into a characterization of all taxpayer-funded research. One example I remember hearing about a few years ago was a study of how cocaine affects the sexual behavior of Japanese Quail. A lot of people hear about that sort of thing and question why their money is being spent on such useless-sounding research.
And I imagine a fair number of people who are just working to get by don't see much point in the study of esoteric particle physics either. Never going to affect their lives.
That's the problem. They can't answer that question and neither can the market. The market is great at assigning value to things based on scarcity and demand, but there will never be scarcity of the Navier-Stokes equations because they can just be copied and shared so the market is useless for saying what they're worth.
And before a discovery is made literally nobody knows how it might change the world. I somehow doubt a thousand investors could have predicted the future value of the transistor any better than the handful of electrical engineers working on that frontier.
That's great for solving known problems, like maybe Coke could estimate that reducing turbulence in their pipes in Warehouse ABC would save them a half million dollars a year. Then they can set aside money, maybe insure the project against failure, the insurance company has some idea what to charge them for such a policy, and so on.
But when Einstein was working on Relativity nobody could have foreseen what it would make possible. In that case the discovery was made and then eventually the business sector found a way to sell it, decades afterwards. Today we know what GPS makes possible so we know what it's worth, but in 1905 there was basically no market value for it.
Your comments would be more positively received if you acknowledged the minority position your views represent, and take that into consideration when trying to make points to others.
Many people support large amounts of free market activity and decentralized economic agency, but most people believe that only relying on this model of human organization will bring significant harm to society.
Very few people consider the idea of "taxation is theft" to be even remotely reasonable.
Thus there is a fairly large gap for you to bridge when trying to make points that rely on such principles.
I have a hard time seeing taxation as always theft. In a democratic society the public either is or was part of the decision on whether or not to fund things like the LHC. If society agrees to pay for it then by definition there's no coercion.
I think the LHC has over a dozen participating countries, each of which volunteered to be part of it. Ideally there should be checks and balances making sure a project of this scale is free from corruption.
Neither theft nor markets exist without property rights. And to have property rights you need a legal system, you need democratic institutions, law makers, an executive branch that enforces your legal rights, etc.
That costs money. It requires the work and agreement of others. You are not entitled to that for free and without entering into any sort of negotiation that involves give and take.
Your demand for protection against that thing you call "theft" is essentially a demand for the existence of a system of government, and that requires taxes. Therefore the claim that taxes are theft contradicts itself.
People who just want funding don't work on things like the LHC. If you want to solve a problem while someone is throwing money at you, then you work in defense. You shouldn't be so quick to harshly judge people who are making the choice to explore the edges of what we can achieve rather than make a quick buck developing the best new way to kill someone.
https://explorable.com/research-grant-funding - "Let's start by saying money does matter. It matters in every sector including science. Even if the researcher has a pure love of the scientific method, given the right circumstances, such as pride or the right amount of money, there may be some consideration given to skewing the data or holding back on publishing results for a little longer than they should."
I have to admit, I rather admire their putting a brave face on things, with their mildly-too-insistent claim that the LHC is vital for finding new physics despite the increasing likelihood it's not going to find any.
In that sense the LHC is a political failure, because it has failed at its primary political job, which is to make the argument for an even bigger collider.
You know, it's ok not to understand much about high energy physics; it's a complicated and somewhat obscure field. It's ok not to understand what's already been achieved at the LHC, and what might still be over the decades.
What's not ok is to pontificate from that seat of ignorance.
How'd it fail at its primary task? Wasn't the Higgs a big part of that? Or is the argument that it hasn't found anything outside of the standard model?
It succeeded at its primary task, and the Higgs was a big part of it. It's still doing its job, so give it time. Finding nothing can often be just as important as finding something unexpected.
I think it's a good analogy. Or just talk about the original periodic table. It was put together with many holes, but with the assumption that those elements would later be found, and sure enough they were. I suppose scientists of the day could have openly accepted that model and decided there was no reason to prove silicon existed, but I think anyone would see that as silly. It's necessary to prove these things.
I think it's a fundamental problem with teaching science history. We tend to look at achievements in history as being inevitable, like of course we discovered flight after the locomotive was invented, that's just how it happened! But those discoveries were all products of circumstance and chance. If we're going to continue to make progress in science we need to accept that new, groundbreaking discoveries don't happen inevitably; they happen from people doing a lot of the dirty work and from imaginative people putting the puzzle pieces together in new ways. There's no inevitability to discovery.
The physics community does not consider finding the Higgs new physics. Not finding Higgs would've been more exciting, or a more massive / lighter Higgs, etc. The Higgs showed up exactly as predicted with the predicted mass, so we're stuck with the Standard Model - i.e. no "new physics".
And yet, that was precisely one of the stated goals of the LHC. Observing things is not a bad way to find new physics: radiation was observed before there was a theory for it. Electrons were also generated as a "ray" before people had any concept of electrons. We've observed the microwave background without knowing about the big bang, or cosmic expansion.
Most new physics starts out as experimental observations. Einstein's theory of relativity is one of the amazing (partial) exceptions.
Well, I don't know if the LHC is a vital instrument, but it will certainly decide where the community of particle physics will go after the plug is pulled on it.
The little I know about it is that the LHC is a hadron collider (of course, duh), which means the collisions are not as "clean" as they were with the LEP collider, because the energy is not "deposited" on elementary particles like electrons but on composite hadrons. Which apparently makes things more complicated when it comes to knowing exactly how the energy is distributed inside the hadrons when they collide...
I'm not sure, I'm not an expert :) I've read that the next collider should use electrons again for collisions, up to 1 TeV or more. Maybe using linear accelerators...
Even if the analogy is not very good, it's a little bit like the James Webb space telescope: it will open new horizons and open the way for more precise exploration with 50m-class optically stabilized terrestrial telescopes... Just like the actual 10m-class telescopes have studied Hubble discoveries in more detail.
I see the LHC the same way. The results they gather from it will decide what to explore next, with a more "precise" collider using leptons and "clean" high energy collisions.
Maybe I'm wrong though. On the political side of things, sure, it's a costly adventure, but I'd rather see money spent on those big scientific projects than thrown at military spending to engage in useless and illegal wars.
That doesn't mean we can't debate big scientific projects. ITER is also a monstrous one when it comes to budget, and is even more risky because of the disruption problem when it comes to confining unstable plasmas in a tokamak chamber - an apparently 70-year-old unsolved problem. Some physicists have warned about powerful disruptions, because of the size of ITER and the currents induced in it, potentially harmful for the installation and the people who will work on it if the tritium breeding blankets are destroyed...
Still, it will be built. And an even bigger experimental reactor is planned, DEMO, that will cost even more. I'm more skeptical about ITER than the LHC...
In computer science it's used to express asymptotic behaviour. This suppresses lower-order terms and constant factors, so O(2n + log(n)) = O(n). The most common ones used are upper bounds O(·), asymptotically equal behaviour Θ(·), and lower bounds Ω(·).
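For reference, the formal definition behind that simplification:

    % f is asymptotically bounded above by g, up to a constant factor
    f(n) = O(g(n)) \iff \exists\, c > 0, \exists\, n_0 : |f(n)| \le c\, g(n) \text{ for all } n \ge n_0
    % e.g. 2n + \log(n) \le 3n for all n \ge 1, hence 2n + \log(n) = O(n)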
I'm guessing shmageggy thought the notation was Big-O notation. It's a bit of CS notation that defines an upper bound on the number of operations for an algorithm as a function of input size.
I don't know enough about particle physics to know whether you and him/her are even remotely close to saying the same thing.
It's math notation invented for analytic number theory that got popularized in CS by Donald Knuth.
By definition, constant (nonzero) arguments denote the same class, so using constants different from 1 isn't really useful. However, the O stands for 'order' or 'order of', and it sometimes gets (ab)used to denote orders of magnitude (e.g. powers of 10) of finite values instead of the limiting behaviour of functions.
If you're a physicist, you've probably seen it used as the final term of Taylor expansions. That's the proper usage, whereas the usage above is an abuse of notation insofar as (nonzero) constant arguments are equivalent.
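A typical instance of that Taylor-expansion usage; note the limit here is x -> 0 rather than the n -> infinity of the CS convention:

    e^x = 1 + x + \frac{x^2}{2} + O(x^3) \qquad (x \to 0)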
LHC is a science experiment. Its job is to, first, check older results [1], and second, put nature to the test in a well-motivated and important way. In particular, LHC is the Higgs hunter, a messy [2] pathfinder before a precision Higgs-studying tool. It succeeded at that job [3].
That LHC has found no new physics beyond the standard model is not a failure. Finding nothing where you thought you might find something is as useful as finding something. If anything, it is more interesting, as it means there is still far more to be learned.
Politics are about people. The science case is a driver for the political case, but it generally takes a backseat to other goals when funding is at hand. First, science keeps us sharp. It pushes harder on materials and technology than anything ever done. An investment in science is also an investment in the entire science supply chain. The recent explosion of quantum-computing hardware investment has only been possible because scientists have built and sustained the tooling and companies that manufacture the necessary subcomponents at a reasonable cost.
More important, it teaches us how to learn; our most important product as scientists is our students. Students keep the field alive, yes, but most students take their new skills, knowledge, and curiosity with them to share outside of academia. Furthermore, governments support scientists to retain the skills for when they are needed by the populace in general; when the Fukushima accident occurred, our laboratory dropped everything it was doing in order to focus on atmospheric monitoring [4].
Finally, the fundamental knowledge we glean from each halting step forward moves us forward as a species. The device on which you are reading this text is the aggregate product of millennia of fundamental research and refinement.
LHC, at least from my outside perspective (I would benefit personally if less money were directed to colliders), has been well-run, successful, and worth the price, both scientifically and politically.
Where we go next is an interesting question -- I'd place my money on exotic accelerator technology. The detectors are wonderful, and linear accelerators are the century-scale path forward. The trick is finding a revolutionary new accelerator idea.
[2] Protons are full of quarks and gluons, so when they collide, it's difficult to know which component hit which other component. Lepton-antilepton colliders, on the other hand, are harder to build, but extremely clean.