This misses whole swathes of the double slit experiment.
It doesn't cover the Observer problem, which means it doesn't acknowledge duality, which (you would have to assume) means it doesn't even recognise delayed-choice quantum erasers.
At the risk of getting downvoted to oblivion I feel like the author needs to study up on quite a bit more of this research as there's a much larger body of evidence than that which is countered in the article.
The author of TFA has little understanding of what they're talking about and misses all of the nuance of the double slit experiment and its relationship to other QM phenomena like the Dirac three polarizer experiment[1] and the delayed choice quantum eraser experiment[2]. The three polarizer experiment in particular makes it almost impossible to argue that light is a particle even if it had hidden state, unless that hidden state was propagating faster than light.
TFA reads like /r/iamverysmart content or a medium effort troll post. I don't understand how this ended up on HN frontpage.
I didn't know about Duane's hypothesis even though I read a few books mentioning the double slit experiment so I'm glad I read the article. In the end, as much as the wave/particle duality seems mysterious at the moment, I'm convinced we'll find a quite boring/straightforward explanation for it.
Actually the mysteries are deeper. This one can't explain "Delayed Choice Quantum Eraser" experiments, which are also double slit experiments. But the arrangement allows you to detect the path AFTER the photon has traveled through one of the slits and hit the surface. Now after the photon has hit the surface you can decide whether to detect its path or not. A delayed choice. But based on your later choice, the interference pattern from the photon's earlier passage either appears or doesn't.
This is the main reason why some scientists came up with the many-worlds interpretation. This article doesn't even scratch the surface of it.
tl;dr: the detector collapses the state, detection does not mean both paths were not "taken". This is the same as detecting a photon on a screen after a double slit.
Disclaimer: I only skim QM stuff, I am not a physicist
Question: what's the name for this experiment where light is interfering with a "slower" beam of itself (legend: ⫽ are beam splitters, \ and / are mirrors, arrows are photon):
A common example is dropping two balls of different mass connected by a chain off a tower.
The point is not that it demonstrates something empirically, which of course it doesn't. But it does say something about which hypotheses make sense within whatever assumptions you're making.
All mathematical proofs are thought experiments, for starters. Similarly, all predictive output of scientific models is technically thought experiments, although we nowadays use computers to achieve superhuman precision and accuracy.
Well, from the article on the dual-slit experiment ...
> What many don’t realize is that the double slit experiment (with particles), proposed by Feynman in 1963, was for decades only a thought experiment. Finally, in 2013, it was successfully performed with electrons.
The author implied it was a thought experiment for "particles" like electrons, though it was actually demonstrated from the very beginning for non-particles such as photons.
The quantum eraser doesn't have anything to do with wave-particle duality imo. It is purely testing the effects of entanglement and choice.
I see entanglement differently than most people.
They tend to believe it's a state of identically positioned particles acting in unison. To quote the wiki
"that converts the photon (from either slit) into two identical, orthogonally polarized entangled photons with 1/2 the frequency of the original photon"
But entanglement and superposition are all about balance. Too much interference and your state is lost.
Our galaxies and solar systems are macro representations of entanglement. I mean, it's got to be pretty obvious. We are essentially using the same technology to look at both.
So if entanglement is all about balance, no shit whatever you measure on one will be the opposite for the other.
There are two sides to a coin.
We are messing up where the axiom of choice is taking place. It always happens at the point of entanglement. Based on that choice every step from there is deterministic "in theory".
> What many don’t realize is that the double slit experiment (with particles), proposed by Feynman in 1963, was for decades only a thought experiment. Finally, in 2013, it was successfully performed with electrons.
Total crackpot. Electron diffraction was observed in 1927 by several different groups[1][2], some of whom later shared the 1937 Nobel Prize in Physics[3].
Came here to say this, but the 1927 experiment was only the first with electrons. The first experiment with light was in 1803 by Thomas Young. This was the foundation of the wave theory of light.
It is very common in physics that the same phenomenon can be explained using different concepts. Treating particles hitting a regular structure as diffraction of de Broglie waves gives exactly the same results as the momentum transfer approach the author prefers. In fact, both can be derived from standard quantum mechanics and both are widely used, depending on which is easier in the particular case.
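To make that concrete, here's a minimal numeric sketch (my own, with made-up values for the period d and the de Broglie wavelength): the wave picture's maxima at d·sin(theta) = n·lambda versus the Duane picture of a transverse momentum kick of n·h/d on a particle of momentum h/lambda. The two agree to first order in lambda/d, which is the sense in which the descriptions are interchangeable for this kind of scattering.

    import numpy as np

    h = 6.626e-34        # Planck constant [J*s]
    d = 100e-9           # period of the structure [m] (illustrative value)
    lam = 5e-9           # de Broglie wavelength [m] (illustrative value)
    p = h / lam          # incoming momentum

    for n in range(1, 4):
        theta_wave = np.arcsin(n * lam / d)       # wave picture: d*sin(theta) = n*lambda
        theta_duane = np.arctan(n * (h / d) / p)  # Duane picture: sideways momentum kick of n*h/d
        print(n, np.degrees(theta_wave), np.degrees(theta_duane))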
However, scattering particles off objects is only one of many experiments that can be done. For instance, one will have a hard time explaining how atoms work without using some kind of wavefunction description of the electrons. Furthermore, as other commenters have mentioned, there are more sophisticated versions of the double-slit experiment, such as the delayed-choice quantum eraser, where the "quantum weirdness" simply cannot be argued away.
Frankly, it is pretty insulting by the author to assume that physicists intentionally choose the "more complicated" explanation over the "simpler" one.
The delayed choice quantum eraser is often explained as being more 'spooky' than it actually is.
The interference patterns only appear when you compare coincidences between the detectors. It's easy to see you could get an interference pattern after the fact by taking a bell curve (no interference) and subtracting out a sine curve (interference).
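Here's a toy numeric version of that bookkeeping (my own made-up numbers, not data from any actual run): two coincidence-sorted subsets carrying opposite-phase fringes add up to a smooth, fringe-free envelope, so the fringes only become visible once you sort by coincidences.

    import numpy as np

    x = np.linspace(-5, 5, 1001)
    envelope = np.exp(-x**2 / 4)               # smooth pattern seen by the signal detector alone
    d1 = 0.5 * envelope * (1 + np.cos(3 * x))  # coincidences with idler outcome A: fringes
    d2 = 0.5 * envelope * (1 - np.cos(3 * x))  # coincidences with idler outcome B: opposite-phase fringes

    total = d1 + d2
    print(np.allclose(total, envelope))        # True: the fringes cancel in the unsorted total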
But how do you get the sine curve in the first place? Because you're not subtracting a sine curve, you're subtracting non-coincidences, and there is no a priori reason those would lie on the sine curve.
The author is frustrated by their apparent misunderstanding of a theory, and of that theory's history, which is very hard to understand, is not yet completely understood, and is rightly subject to new questions and outlooks.
It seems just as selfish to be offended, personally or on a group's behalf, by broad accusations of innate intellectual prejudice as it is for the intellectually isolated individual to entertain and express such accusations.
>I tend to agree, the notion of matter being in two places at once is nonsense.
Based on what? Empirical experience with macroscopic objects? There are too many phenomena at odds with everyday empirical experience for that to matter (relativity has all kinds of such cases).
Agreed. Remember he started by saying the double slit experiment has been performed on much larger molecules at this point. People who come across this will go "wait a second?!" and question what they've been told.
> According to Ballentine, Duane's proposal of quantum translational momentum transfer is no longer needed as a special hypothesis; rather, it is predicted as a theorem of quantum mechanics.[22] It is presented in terms of quantum mechanics by other present day writers also.[23][24][25][26][27][28]
Duane's hypothesis doesn't explain the double slit experiment. It says momentum exchange is quantized if you direct a particle at a lattice, and that this gives the same pattern as interference.
If you consider a single electron in this interpretation, a point-like particle without any associated wave or ability to be in more than one point of space at the same time ... when it is directed at and passes through one of the slits, it produces the pattern, but only if it potentially could have been directed at the other slit.
How the hell can what could have happened to a particle (but didn't) influence what actually happened to it (quantized, rather than ordinary, momentum exchange)?
--
For me the way to deal with the weirdness of the double slit experiment is that electrons and such are just waves, all the time. Fuzzier or sharper, but always waves. The only thing that makes us think that's not the case is that, when they exchange energy and momentum, it's governed by the same equations as if they were two balls bouncing off each other.
The truth in my opinion is that electrons don't bounce like tiny balls, but that balls exchange energy and momentum as if they were huge, very sharp electrons. Macroscopic balls are waves too, but imperceptibly so, because they are built out of a bazillion particle waves all tangled together into a very constraining packet.
If you have wave-systems with high enough fidelity (Q-factor in a cavity for example), they behave a lot like particles in certain energy/time regimes. People yelling from a mountain top hear echoes.
People have always thought about "atoms" of matter as being the primary units of existence, and waves as some second-order, compound behavior. That's why it was so hard to get rid of the idea of the ether.
> Duane’s Law says that if a body is periodic in space in a given direction then momentum exchanges in that direction are quantized. That means that the emerging electron is more likely to go in certain directions than in others, depending on how many quanta of sideways momentum are exchanged. Guess what? When a stream of electrons is sent through, a ‘diffraction’ pattern results.
If this "direction preferentiality" were all that were needed to cause a diffraction pattern, then we should see it if we run the experiment with one slit closed for a while, and then the other slit closed for a while.
But that's not what happens! If you send particles through only one slit, you do not get a diffraction pattern! (Actually you do, but the valleys are spaced far wider than those with both slits open.) No diffraction pattern = no valleys. Running the experiment again with the other slit closed can't add those valleys in. This is logically no different than individual electrons simply "randomly" entering alternate slits, so that can't make a diffraction pattern either.
What am I missing? It seems like the author is trying to explain double-slit purely in terms of statistical ensembles of quantum-ish particles rather than exploiting the full generality of quantum fields, which makes no sense simply by looking at the results of a single- vs. double-slit setup. (e.g. https://en.wikipedia.org/wiki/Double-slit_experiment#/media/...)
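For what it's worth, here is a toy Fraunhofer calculation (my own sketch, arbitrary units) of the parent's point: adding the counts from "slit 1 open" and "slit 2 open" runs never produces a new dark fringe, while adding the amplitudes for both slits open does.

    import numpy as np

    def sinc(u):                      # sin(u)/u with the u = 0 limit handled
        return np.sinc(u / np.pi)

    a, d = 1.0, 4.0                   # slit width and slit separation (arbitrary units)
    x = np.pi / 4                     # screen coordinate of the first two-slit dark fringe,
                                      # where cos(d * x / 2) = 0

    single = sinc(a * x / 2) ** 2                      # intensity with one slit open
    one_then_other = 2 * single                        # run slit 1, then slit 2, add the counts
    both_open = 4 * single * np.cos(d * x / 2) ** 2    # amplitudes added first, then squared

    print(one_then_other)   # ~1.9: still bright at this spot
    print(both_open)        # ~0.0: a genuinely new dark fringe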
QM is easy to describe. There is wave-particle duality, but not at the same 'time', only in a strict alternating sequence of propagations and interactions.
Propagation is completely wave-like. It is unitary and time reversible, without loss of information. There is no particle there, and no 'there' there, as the propagation is non-local. Waves have complex amplitudes. Interference happens because that's what waves do. Propagation is described by the Schrodinger/Dirac equation.
Interaction is completely particle-like. The interaction is non-unitary and time-irreversible. The interaction may also be known as detection or measurement.
The wave only contributes the probability of the interaction happening, which is described by the Born Rule. Amplitudes are complex, so they can cancel when summed before the Born Rule squares them, which is why the double-slit experiment gives many fringes.
The interaction is local, because it creates the 'there'. Positions and distances are created by the interaction as a specific spacetime event. Hence the aphorism, inverted: "spooky distance at an action", meaning that (inter)action is fundamental, and distance is a strange thing created as a byproduct.
There are conservation laws that balance some incoming and outgoing properties, scattering laws that determine how the interaction happens for various forces, and constraints between the knowable values of non-commuting pairs of observables (particle properties) that are revealed (entangled) by the interaction. These constraints are also known as Heisenberg's Uncertainty Principle, but they are really just direct and obvious mathematical properties of wave packets, as described by Fourier analysis.
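A quick numeric check of that Fourier point (my own sketch, with hbar set to 1): for a Gaussian wave packet the product of the position spread and the wavenumber spread lands right at the lower bound of 1/2.

    import numpy as np

    N, L = 4096, 200.0
    x = np.linspace(-L / 2, L / 2, N, endpoint=False)
    dx = x[1] - x[0]

    sigma = 3.0
    psi = np.exp(-x**2 / (4 * sigma**2))             # Gaussian wave packet
    psi /= np.sqrt(np.sum(np.abs(psi)**2) * dx)      # normalize in position space

    phi = np.fft.fftshift(np.fft.fft(psi)) * dx      # momentum-space amplitude
    k = np.fft.fftshift(np.fft.fftfreq(N, d=dx)) * 2 * np.pi
    dk = k[1] - k[0]
    phi /= np.sqrt(np.sum(np.abs(phi)**2) * dk)      # normalize in k space

    def spread(u, w, du):
        prob = np.abs(w)**2 * du
        mean = np.sum(u * prob)
        return np.sqrt(np.sum((u - mean)**2 * prob))

    print(spread(x, psi, dx) * spread(k, phi, dk))   # ~0.5, the Fourier/Heisenberg lower bound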
After the interaction, there will be propagation of new waves for the outgoing particles (meaning bundles of state). The interaction results in an entanglement (a correlation of future possible interactions) between these distinct outgoing waves.
Though the duality comes from the measurements. We are literally measuring kinetic vs potential energy. The reference frames are completely different. One is when an object is at rest ("particle") and the other is when the object is in motion ("wave").
That's the reason for the entire uncertainty principle!
What if we took two properties that weren't intrinsically linked?
Physicist here. This is just self-congratulatory nonsense. Though I shouldn't be surprised, because I originally signed up for this site to comment against some other self-congratulatory nonsense.
Quantum mechanics is built on extraordinarily simple rules, in fact simpler ones than classical mechanics: all you do is sum up the amplitudes for various paths, and square it to get a probability. The problem is that the objects obeying these rules thus act neither like particles nor waves. So obsessing over whether quantum objects are "really" particles or waves is not useful.
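As a bare-bones illustration of that rule (my own toy example, not anything from the article): assign each path a complex amplitude, sum them, and square the magnitude. The cross term between the two paths is the interference.

    import numpy as np

    def two_path_probability(phase1, phase2):
        a1 = np.exp(1j * phase1) / np.sqrt(2)   # amplitude for the path through slit 1
        a2 = np.exp(1j * phase2) / np.sqrt(2)   # amplitude for the path through slit 2
        return np.abs(a1 + a2) ** 2             # sum the amplitudes, then square

    print(two_path_probability(0.0, 0.0))       # 2.0: twice the no-interference value (bright fringe)
    print(two_path_probability(0.0, np.pi))     # 0.0: complete cancellation (dark fringe)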
It certainly works sometimes. There are many limits where quantum objects behave just like particles, or just like waves (there must be, since we must be able to recover classical mechanics from quantum mechanics). And sometimes you can describe a quantum calculation in terms of classical wave or particle intuition, which is what's going on here. The double slit experiment is in fact so simple that I know of at least 5 ways to derive the result, using all sorts of different intuition. "Duane's law" is one of them, though I never knew it had a name, as it pops out rather straightforwardly from the mathematics.
The reason that we don't try to extend this to cover all of quantum mechanics is because it doesn't work. Treating the quantum mechanics as if it were classical underneath leads to all kinds of paradoxes. Bell inequality violations tell us that the classical variables would have to either influence each other faster than light, or be pre-set since the beginning of the universe in a malicious conspiracy. The delayed choice quantum eraser tells us that if you want to treat the photon as a classical particle, you need to accept that it can retroactively change the past. Yet another failed attempt is quantum logic: if you insist on treating quantum superpositions as representing some unknown classical position of a particle, you need propositions about those positions to violate the rules of logic themselves.
Hundreds of experiments probing quantum mechanics have been performed, all with the same results: committing to a classical ontology in terms of "waves" or "particles" costs you more than just accepting that quantum objects are different from either.
Physicists have spent a lot of energy thinking about this. All of the author's claims were under active debate in the 1920s. He is just completely ignoring the following century of investigation, and claiming that it never happened!
> the classical variables would have to either influence each other faster than light, or be pre-set since the beginning of the universe in a malicious conspiracy
Why is superdeterminism almost universally rejected by working physicists? From the outside, it feels like the most parsimonious explanation: there is just a single way things are and will be, set in motion from the very beginning. It doesn’t feel like a conspiracy of any kind to me.
Because if you take superdeterminism seriously, you can't do science.
Sure, it is logically possible that the world runs on classical mechanics, but that random factors in the light emitted from opposite ends of the observable universe billions of years ago were delicately set at that point so that whenever you run an experiment with it, naively assuming that no such correlations exist, quantum mechanics appears to be true instead.
Unfortunately, if you are consistent with applying this principle you can't conclude anything at all. If you see the Sun and planets move every day, would you eventually accept orbital mechanics, or would you prefer the explanation that "orbital mechanics is false, everything actually moves however it wants according to random unknown factors, but those factors were arranged just so that whenever we make observations, standard orbital mechanics appears to be true"?
Suppose we're betting on horse races, but I win 100 times in a row while you always lose. Would you accept my explanation that "this was merely superdetermined to be the case"? Or would you start to suspect that I'm stealing your money?
Superdeterminism is philosophically equivalent to Cartesian skepticism. It's just a minor twist on being a brain in a vat. Scientists reject this not because we can disprove it, but because you can't do anything if it's true. I mean, you might as well not bother getting out of bed in the morning.
de Broglie-Bohm has the same problems as usual for such theories. I wrote an article once [0] about such things and I'll just quote from it.
> To understand the working physicist’s position, it is useful to consider pilot wave theory. This alternative to the Copenhagen interpretation is often championed by realist philosophers, who would prefer that measurable properties, such as the position of a quantum particle, should always be well-defined. In this interpretation, the wavefunction is a classical field, like the electromagnetic field, called the pilot wave. Every particle always has a definite location, but it is “guided” by interaction with its pilot wave, allowing it to perform feats impossible for ordinary classical particles, such as quantum tunneling. Hence, the subtlety of quantum mechanics is dispelled by describing a quantum particle as both a classical particle and a classical field, in a literal interpretation of wave-particle duality.
> For those who have struggled with the subtleties of quantum mechanics, this simple story sets off alarm bells. For one thing, where is the probability? If the electron really has a definite position, then why is it measured to be seemingly random, even after the electron settles into its lowest energy state? The answer is that the pilot wave is postulated to unpredictably shuffle about the location of the electron until its position reaches “quantum equilibrium” and matches with the predictions of the Copenhagen interpretation. This shell game is assumed to occur too quickly to detect. Worse, once it is over, the electron is predicted to hover in midair, perfectly still. Velocity measurements indicate otherwise, so pilot wave theory simply assumes they are all mistaken. Apparently, there is a real velocity, but it cannot be measured, and any attempt to do so yields something else. These ad hoc fixes allow pilot wave theory to avoid contradiction with the empirically verified uncertainty principle.
> The pilot wave itself also has strange properties. We expect to be able to measure classical fields, but the pilot wave cannot be directly measured. Classical fields were introduced to avoid nonlocal “action at a distance”, but when the particle is measured, the pilot wave instantly collapses. The collapse is postulated to be faster than light but coincidentally completely undetectable, making pilot wave theory almost impossible to reconcile with relativity. Furthermore, “Bell test” experiments confirm that this problem cannot be removed in any refinement of the theory.
> The reason most physicists are hesitant to accept pilot wave theory is that it appears to have the ether’s flaws. In exchange for the classical intuition of definite particle trajectories, the theory drastically increases the complexity of our world. It suggests many natural questions about the nature of the pilot wave and particle, then gives them unnatural answers which are hidden from observation. That is why, when physicists working on quantum foundations were polled at the 2011 conference Quantum Physics and the Nature of Reality, precisely zero vouched for the pilot wave. The interpretation is like an art teacher who sings the praises of creative freedom, yet berates any who draw outside the lines.
I don't personally have anything against Bohmian mechanics except for one thing: its supporters consistently ignore all of these well-known issues and present it as if it's a completely perfect quasi-classical model for the world, only kept down by conspiracy. This leads to all kinds of crazy narratives that I think are actively harmful to the field. At the very least, people who go out and talk about the many worlds interpretation face the weirdness of quantum mechanics head on, and honestly.
This is as good a description of the ontological limitations of pilot wave theory as any, but many of the criticisms apply to almost literally any interpretation of quantum mechanics whatsoever.
As a physicist with a passing interest in the subject of foundations, it's my feeling that what quantum mechanics tells us is that at least one aspect of our classical ontologies is not correct. My personal bet is that it's locality which is the problem, but as far as I understand it the question is still wide open.
de Broglie-Bohm merely casts the ontological tensions with classical physics in one particular light, but I'm aware of no interpretation which is free of the tension.
I thought that locality had had to be abandoned long ago, at least for entangled particles.
Of all the oddities pressed on us by alternative interpretations, non-locality between particles that have once interacted seems least troubling. Or is that not enough?
There isn't any consensus on what aspects of our classical ontology are the right ones to get rid of. You can preserve local realism if you have superdeterminism, for instance.
You can check my website, which I linked above, for several papers and a dissertation.
I'm not talking about any of "my" "theories" at all, I'm just describing a bit of what has been done in the past century by thousands of smart and dedicated people.
Please don't take HN threads further into flamewar. Whatever the substantive points on the topic are, posting like this just makes the thread considerably worse.
When I watched a goofy layman’s explanation[1] about the double slit, it stressed the importance of the effects of observation on the behavior or nature of the electrons.
When observed to see “which” slit the light passed through, they no longer created the interference pattern described by Duane’s law. And when left unobserved (at the slits) the interference pattern emerged even when shooting “one electron at a time” at the slits.
Does this article or Duane’s Law cover or appreciate this? It seems like it only describes the interference pattern.
The focus on “observation” in quantum mechanics is dumb and damaging. In quantum mechanics observation means something very different from what people assume when you use the word observation.
When you say "if we observe the position of the electron" people think: "we put an observer who, without interfering in any active way, looked at the electron to see where it is". What QM actually means is more akin to "if we take a 200-ton bulldozer and force the electron into one of two positions and then test which one it's in".
The very issue arising from observer effects is that you cannot have the system in a single state where, for instance, momentum and position are both observable; observing either means forcing the system into a state where the other is "uncertain", that is to say that it will have a distribution of outcomes if you use a different 200-ton bulldozer to observe that parameter afterwards. The uncertainty principle basically tells you that the technical requirements for a "position bulldozer" and a "momentum bulldozer" make it fundamentally and axiomatically impossible to build a single bulldozer that carries out both "observations".
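A small numeric aside on that point (my own sketch, with hbar = 1 and a crude finite-difference momentum operator): position and momentum fail to commute as operators, which is the formal statement behind "no single bulldozer can do both observations".

    import numpy as np

    N, dx = 200, 0.05
    x = (np.arange(N) - N / 2) * dx

    X = np.diag(x)                               # position operator on a grid
    P = np.zeros((N, N), dtype=complex)          # momentum ~ -i d/dx via central differences
    for j in range(1, N - 1):
        P[j, j + 1] = -1j / (2 * dx)
        P[j, j - 1] = +1j / (2 * dx)

    psi = np.exp(-x ** 2)                        # a smooth test state
    lhs = (X @ P - P @ X) @ psi                  # apply the commutator [X, P] to it
    print(np.allclose(lhs[1:-1], 1j * psi[1:-1], atol=1e-2))   # True: [X, P] acts as i (hbar = 1)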
>The focus on “observation” in quantum mechanics is dumb and damaging. In quantum mechanics observation means something very different from what people assume when you use the word observation.
Which is still neither here nor there concerning this case.
Even with the different meaning of observation from "people assume when you use the word observation", Duane’s law still doesn't explain the outcome.
The point being made is that all observations change the thing being observed, so you have to be very careful what you measure and how you measure it. That is all. This is a basic point in science that is often missed. As here, apparently.
The hallmark for any reinterpretation of QM in terms of more classical ideas would be that it explains Planck's constant in terms of other constants. If an explanation of a quantum phenomenon succeeded in eliminating one of the fundamental constants, all physicists would lend a listening ear; no conspiracy involved...
Single slit diffraction and double slit diffraction are very different. This difference is elegantly and simply explained by the wave model. Without it, you'd expect the double slit experiment to create a simple sum of the pattern from each of the slits, but what you get is a completely new pattern that depends on the spacing between the slits, and the wavelength of the thing passing through.
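For a sense of scale, here's a quick calculation with illustrative numbers of my own, using the small-angle fringe-spacing formula lambda·L/d; the spacing is set by the slit separation and the wavelength, exactly as the parent says.

    wavelength = 633e-9                      # HeNe laser [m]
    L = 2.0                                  # slit-to-screen distance [m]
    for d in (0.1e-3, 0.2e-3, 0.4e-3):       # slit separations [m]
        print(d, wavelength * L / d)         # bright-fringe spacing [m]: halves as d doubles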
With respect I simply disagree with Feynman that the double slit experiment contains everything... Maybe in suggestive outline?
As an example of how QM is weird, my go-to is a game that I call Betrayal. It is a challenge for a 3 person team. The teammates go through rounds where they are relativistically separated and unable to talk with each other; each such room has a big computer screen and two buttons labeled 0 and 1, and during each round each teammate has to press one or the other of these buttons; then their selections are summed up to a number that we call The Sum.
The reason we call this Betrayal is that we are trying to force one of the teammates to betray the other two. So 25% of the time we run a "control round" where we ask everyone to make The Sum even, and the team passes the round if The Sum is even. The rest of the time we secretly select one of the three at random to betray their colleagues: we tell the other two the true goal (make The Sum odd), and the team passes the round only if it is odd; but we lie to the traitor and tell them to make The Sum even, so they will work at cross purposes to the other two.
The key result is that there are 4 cases and classical strategies can only work for at most 3 of them, so your pass rate is limited to 75%. But quantum players who share a so-called "GHZ state" can in principle pass 100% of trials (real quantum computers have a finite probability of simply failing, but if you could improve that to 90% fidelity, then over the course of 100 rounds you would still score visibly ahead of the 75s, occasional 80s, and rare 85s produced by random chance).
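Here's a brute-force check of that classical bound (my own sketch of the game exactly as described above): enumerate every deterministic strategy, where each player's bit depends only on the instruction they received, and see that none satisfies all four cases. Shared classical randomness is just a mixture of deterministic strategies, so it can't beat this bound either.

    from itertools import product

    best = 0
    for strat in product([0, 1], repeat=6):      # (bit_if_told_even, bit_if_told_odd) for each of 3 players
        a = [strat[0:2], strat[2:4], strat[4:6]]
        wins = 0
        # Control round: everyone is told "even"; the team wins if the sum is even.
        wins += (a[0][0] + a[1][0] + a[2][0]) % 2 == 0
        # Traitor rounds: player t is told "even", the other two "odd"; the team wins if the sum is odd.
        for t in range(3):
            told = [0 if i == t else 1 for i in range(3)]
            wins += sum(a[i][told[i]] for i in range(3)) % 2 == 1
        best = max(best, wins)

    print(best, "of 4 cases")    # 3 of 4 -> at best a 75% pass rate classically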
The key is that quantum systems come out with these strange global correlations that cannot be explained classically. You also cannot see them locally; nobody knows the round succeeded until everyone comes back together to compare what bits everyone committed to.
1. It’s very nice to be introduced to Landé and Duane.
2. After so much emphasis on reason and experiment and evidence, it is jarring to switch gears to a conspiracy theory about scientists concealing or ignoring theories in order to make themselves seem “superhuman.”
3. It’s very nice to be introduced to Landé and Duane.
You'd be surprised how much scientists hush and push aside dissenting voices which don't suit a particular theory that one cabal wants to be true. Sean Carroll has been advocating strongly in public to go back and take a look at the measurement problem, but nobody wants to do it. He speaks more about this on his podcast (Mindscape) and also on Joe Rogan's podcast. The fact that he had to bring it up on Joe Rogan's podcast to let the public know what's up is pretty bizarre and indicative of the kind of schemes that go on in academia.
This kind of conspiratorial thinking is precisely the problem. There is a constant avalanche of papers being published on interpretations of quantum mechanics, in my mind, too many. To believe that Sean Carroll is the only person thinking about this is frankly insulting.
This is comparable to hearing one C++ developer speak, then getting the impression that they are the only programmer in the world who uses a strongly typed language, and that there's a conspiracy to cover up strong typing.
It's not hidden by a conspiracy; the problem is that a few (10?) years later Duane's result was explained as a special case of the general theory using quantum mechanics and/or the Fourier transform. So the result is now just an exercise, one that is explained without naming the author (or perhaps with his name attached, but nobody remembers him once the exercise is finished).
Scientists generally don't like to talk about things they can't say anything meaningful about. (String theorists are in another category.) "Of that which we cannot speak, we must remain silent." -- Wittgenstein; maybe the only useful thing he said.
So, for example, plasma fluid dynamics is way, way harder than regular fluid dynamics, which is itself very hard. So, anything that takes understanding plasma fluid dynamics to talk about, most try not to discuss. If they can get an interpretation together that describes a phenomenon without reference to plasma fluid dynamics, but only, e.g. heat and gravity, they will prefer that, even if it doesn't account for everything. Then, they will prefer not to talk about the parts not accounted for. (Thus, the universe is full of "hot gas", not plasma. It's there, but it doesn't do anything.)
If it needs dark matter and dark energy, or perfectly parabolic nozzles in the geysers of Enceladus, or volcanoes on Io that bob about like ducks in a bath, then so be it. They're only human, and used to being right.
Solar physicists are awesome because they take on plasma fluid dynamics head-on, and get meaningful results. They are hot on the trail of the apparently-million-degree solar corona. So to speak.
I'm not saying anything about whether it is true or not.
I'm saying there is a completely different class of reasoning between talking about how the two-slit experiment might not be mysterious, and describing the use of the scientific method to validate it, versus attributing the obscurity of a result to a systemic bias problem in science.
I completely accept that science has human bias problems... The community is made of people, after all. I personally subscribe to the idea that it is systemically sexist.
So I do not argue that there is a bias, I only state that the transition from one kind of discussion to the other was jarring in this post.
I walked away from the article not sure if it was primarily about a theory I hadn't heard of, with a footnote about why I hadn't heard of it, or if it was primarily about a systemic coverup of a theory, with some introductory material about the theory presented to understand why it would be covered up by the establishment.
One can say, "Why not both?" But truly, good writing has a focus, and an essay really doesn't have a lot of room to stray too far from a single, primary message.
It could be that one good essay is about how the grand mystery isn't that mysterious, and another good essay is about how Science with a capital S is biased towards the mysterious and inscrutable.
It wouldn't take much editing for this essay to be transformed into either of those. And perhaps this is a matter of taste, but I would prefer one of those essays to this one.
But that being said, whatever else I might think about its focus, I'm glad it introduced me to something intriguing about the two-slit experiment!
> Admittedly there’s still a bit of mystery left. How does periodicity in space quantize momentum (I think of it as a sort of resonance phenomenon). But it’s hardly an impenetrable, soul searing mystery that makes you question reality itself.
I don't find this to be true. What's left is the key issue. The entire mystery is left. The old QM worked exactly like that explanation: you take a classical explanation and postulate that some quantity in it is quantised. This is ad-hoc and not a general approach. A major breakthrough in the new QM was an explanation why things are quantised. And yes, it's a resonance phenomenon. Resonance of what? Probability amplitude waves. Once you have that explanation you can do away with the classical+quantised part of the explanation, because it becomes superfluous.
The problem/mystery is more that the double-slit experiment and other quantum phenomena are always presented in a hand-wavy way where it's not clear how it derives from the mathematical formalisms.
The author has gotten confused about which bit of the double slit experiment is "mysterious".
The mysterious bit isn’t what happens when you transmit a stream of particles.
It’s what happens when you only send one at a time. If the movement were purely changes to particle momentum, you would get a uniform spread across your detector.
But that’s not what happens. What happens is that you get diffraction, which only makes sense if your particle goes through both slits.
Reading things like this article and the replies, it seems apparent that physics research, or at least teaching, often blurs the distinction between experiments (in thought or otherwise) that explain a theory, and those that motivated theory by empirically disproving previously held beliefs.
Also I’ve noticed a strange form of entropy in physics that tends to show more evidence for models that have the largest number of equivalent formulations based on different assumptions. You could say that this is a result of how physicists work to prove their own theories, but perhaps the gods work the same way.
Waves of probability amplitudes with quantized action is a nice abstraction that works so far, but so was continuum mechanics. Wave-like observations could easily be a simple consequence of the central limit theorem in the same way.
Has the double slit experiment been done and verified in a vacuum? I'm able to produce the superficial results with a simple neon laser, but it always struck me that outside a vacuum, there is a lot available to skew those results.
> Has the double slit experiment been done and verified in a vacuum?
99.9999999% sure yes, but I can't find a source with the actual experiment. The versions with electrons are done in vacuum, but I'm not sure they use a true double-slit arrangement.
You can reproduce the experiment with light in air. It's expected that you will get the same result in vacuum. An actual experiment would be nice, but the setup is more complicated, so it's not done in a normal lab class.
It should be easy to set up for someone with access to a vacuum pump and some free time (1 week tops). It would be nice to see how the lines move when the refractive index changes from that of air, which is almost 1, to that of vacuum, which is exactly 1.
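For a rough sense of how much the lines should move (my own back-of-the-envelope, assuming a HeNe laser and a plausible slit separation): inside a uniform medium the wavelength is lambda/n, so going from air to vacuum changes the fringe spacing by only about 0.03%.

    wavelength_vac = 633e-9       # HeNe laser wavelength in vacuum [m]
    n_air = 1.000277              # refractive index of air (approximate, visible light)
    L, d = 2.0, 0.2e-3            # screen distance and slit separation [m]

    spacing_vac = wavelength_vac * L / d            # fringe spacing in vacuum
    spacing_air = (wavelength_vac / n_air) * L / d  # in air the wavelength is lambda / n
    print(spacing_vac, spacing_air)
    print((spacing_vac - spacing_air) / spacing_vac)   # ~2.8e-4: a shift of about 0.03%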
It's hard to convince someone to do the actual experiment, because it looks like a lot of work without an insightful result. Perhaps you are lucky and someone already did this. I guess this is the case, but I can't find it. Perhaps you can convince someone like Cody of "Cody's Lab" to do the experiment and upload a video.
I expect these results in the open air, since there is so much for the photons to bounce off of. They're probably even bouncing off the sides of the actual slits, unless the block is thin and precise enough, and anti-reflective enough to not allow light bouncing off of it - straight polarized lines only.
This is why I ask such a specific question. I don't deny the effect is happening at all - I question the influence of other parts not taken care of in the experiment.
We speak of "noise" (interference) when discussing related observations in regards to quantum computing. I'm hoping we're at least as stringent about these experiments. But so far, I've only ever seen them done in open-air environments, which to me leaves too many nagging questions. I doubt the quantum effects I can produce in my dusty office are as precise and pure as they would be in a proper lab, but all the evidence I've seen is that I'm doing it pretty much precisely the same (sloppy) way.
Electrons do indeed show the same effect and I recall reading once that the experiment was first proposed using electrons. There are other particle/wave effects as well, but those two are the easiest to demonstrate with, apparently.
Note that interference here has a very technical meaning that is NOT noise. (Sorry for the all-caps, but it's very important in these experiments.)
I think you don't understand the important point of the double slit experiment. Let me try to explain it, but it's a long explanation.
---
First you must understand the single slit experiment. If the slit is thin enough you get a nice diffraction pattern.
If this spread were caused by dust or something like that, a good guess is that the image would get a Gaussian blur like exp(-x^2).
In a diffraction experiment, the image you get is something like sin²(x)/x². (You should draw them in Wolfram Alpha or something.)
The main difference is that the first one is always positive, so you get light everywhere, but it gets dimmer very quickly. The second one has a lot of zeros, so you get stripes of light and darkness. The math details are complicated, so let's ignore them.
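If you'd rather "draw them" with code than with Wolfram Alpha, here's a tiny sketch of my own comparing the two profiles: the Gaussian blur stays positive everywhere, while sin²(x)/x² keeps returning to zero, which is what gives genuinely dark stripes.

    import numpy as np

    x = np.linspace(-10, 10, 2001)
    blur = np.exp(-x ** 2)                  # Gaussian blur: positive everywhere
    diffraction = np.sinc(x / np.pi) ** 2   # (sin x / x)^2, with the x = 0 limit handled by np.sinc

    print(blur.min() > 0)                   # True: the blur never reaches zero
    print((diffraction < 1e-5).sum())       # several grid points essentially at zero: the dark stripes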
You can do this experimentally. In the lab we used two razor blades with some screws to move them accurately. There should be plenty of recipes on the Internet to reproduce it. You can try it in dustier and cleaner environments, with sharp and not-so-sharp razor blades, with wind and without wind. Try it a few times until you get used to the images it produces. You will notice that the cleaner environment produces the better results, so I hope you will be convinced that it's not caused by dust or something.
You can probably buy single slits from an educational supplier. Buy a few of different widths to compare the results with your homemade version.
Now, the double slit version looks more difficult to make at home. I think you should buy one (or a few), and a single slit version with the same width.
The idea is that if you have only one slit open you would see the old image. If you have only the other slit open you would see the old image. If the screen is far away and the slits are very close, you can't notice the difference. So you would expect to see the same image, but brighter.
The weird thing is that you don't get the same image. You get almost the same image but with a lot of additional stripes. There are the old dark lines and many new dark lines. These new dark lines are the interesting ones. On these lines you have something like

    light + light = darkness

You can try with a dusty, windy environment, and it should make the lines more fuzzy. You can try with a clean environment and get better results. The lines are not caused by the dust, they are just weird.
(If you are going to buy things, add a diffraction grating. It's like using many slits instead of two. The math is more difficult but they create nice rainbows. https://en.wikipedia.org/wiki/Diffraction_grating )
The dark lines are not so surprising when you use light, which is usually approximated as a wave. But if you get more advanced equipment you can repeat the same experiment with electrons or neutrons and get the same weird lines. (When you understand this, you can scream again.)
I actually don't like the double slit experiment because it's so easy to confuse yourself. Good for learning the math, maybe, but bad for presenting the "weirdness." Tests of Bell's inequalities are much more interesting and IMO cut straight to the heart of the weirdness of QM. Likewise with tests of the quantum Zeno effect. For me, at least, these two experiments go straight to "okay fine, measurement is weird and let's just treat superposition as its own thing" while double slit experiments beg for hand waving mysticism.
If you like crackpot "science" check out Rex Research. (Fair warning, not everything there is BS, but you can find all sorts of weird stuff to gawk at.)
The wave/particle duality notion isn't really meant to hold an accurate deep insight into QM, it's just meant to inform intuitions, and sorta communicate it to laypeople.
This is not correct. The duality is the fact that an elementary particle is point-like but requires wave mechanics to describe its motion. Another related discovery was that energy waves, too, consist of quanta, which in many respects behave as particles. Those are valuable physical insights (similar in its power to the idea that “everything consists of atoms”).
I disagree with your disagreement, but perhaps we all three agree.
The electrons/photons/whatever must be described by quantum field theory, but the math is too complex to use unless it's a very simple experiment with a few particles, or you use a computer. So it's good to have some simplifications.
In some cases, you can approximate an electron/photon/whatever as a wave. The math is much simpler, and with some training you can do it with paper and pencil, or in some simple cases with handwaving.
In other cases, you can approximate an electron/photon/whatever as a particle. The math is much simpler, and with some training you can do it with paper and pencil, or in some simple cases with handwaving.
I don't like the name "wave–particle duality" but it is probably too late to change it. It's not true that an electron/photon/whatever is sometimes a particle and sometimes a wave. It is always a weird quantum thing that sometimes can be approximated as a particle and sometimes can be approximated as a wave.
For historical reasons, people discovered the approximations first, named them the "wave–particle duality", and only later understood what is really happening.
> What many don’t realize is that the double slit experiment (with particles), proposed by Feynman in 1963, was for decades only a thought experiment. Finally, in 2013, it was successfully performed with electrons. It’s easy to see why it took so long: the slits were 62 billionths of a meter apart.
Who is this clown? I did the two slit experiment using a CRT (aka using electrons) in undergrad experimental physics class in the late 80s, and unless I had some amazing experience, I'm pretty sure every undergraduate physics major does the same thing. The '37 Nobel in physics was awarded more or less for doing this in the 1920s after de Broglie said it would be there.
Seriously! We literally did it in our modern physics lab. For me that was ~2004. It's not a terrible difficult experiment to do at home. Also, 62nm is not a particularly small distance in electro-optics. You can buy polarizers and the like off Thorlabs for ~$9.
“The general perception is that the electron double-slit experiment has already been performed. This is true in the sense that Jönsson demonstrated diffraction from single, double, and multiple (up to five) micro-slits [2], but he could not observe single particle diffraction, nor close individual slits. In two separate landmark experiments, individual electron detection was used to produce interference patterns; however, biprisms were used instead of double-slits [3, 4]. First, Pozzi recorded the interference patterns at varying electron beam densities. Then, Tonomura recorded the positions of individual electron detection events and used them to produce the well known build-up of an interference pattern. It is interesting to point out that the build up of a double-slit diffraction pattern has been called 'The most beautiful experiment in physics' [5, 6], while the build-up for a true double-slit has, up to now, never been reported.”
I did the experiment in 1989, closing individual slits and everything. There are entire fields of material science based on photo electron interference (ARPES, XPS and other photo emission spectroscopies). Quoting some clown who claims "well you never did it my (irrelevant) way" is not useful here.
If you think that citing a paper by a physics professor in a peer-reviewed physics journal is “quoting some clown” you’ll probably understand that I don’t give much weight to a comment from a random person on HN.
You're perfectly free to believe in ghosts, the flat earth or DMT elves as well. People who know what they're talking about are still going to laugh at you.
Do you think clowns who publish physics papers (and yes, those guys are grade-A quacks for writing that paper with such claims) are immune to the temptations of PR baloney? Do you really think "Ultramicroscopy" is a reputable physics journal? Had someone actually done something as innovative as his claims, it would be in Science, Nature and PRL. But I guess you don't actually know anything about physics and think Google substitutes for knowledge and "muh links" substitute for argument.
For what it’s worth, one of the clowns who coauthored that Ultramicroscopy paper is credited with the first observation of the build-up of the interference pattern from individual electron detections in 1974 (using an electron biprism, not slits in a plate).
Let’s accept for the sake of the argument that it’s a clownish thing to do in the 21st century to report that you’re sending electrons one by one (at intensities low enough so the interaction is necessarily that of independent particles, not that of a beam) towards a metal plate with two slits (which can be opened and closed at will) and detecting the arrival of individual electrons at the other side as independent events. Because that’s a trivial thing to perform and thousands of students were doing it already in the eighties.
You say that if it were innovative it would be on the cover of Nature or something. When would you say it was done for the first time, then? Was it published in a top-tier journal when it was innovative? If not, why not?
I'd say you either don't understand the English language or are a bored grad student with nothing better to do than attempt to defy objective reality on the internet.
Maybe there is a misunderstanding. What I originally said is that “Doing it with actual slits and standalone particles is not so easy”.
Are you claiming that it is absolutely trivial to do it with a double slit and single electron detections, and you were doing it in 1989, or not?
Probably not, but then I don't know why you chose to reply to my comments in the way you did.
(I don't know either what you imagine I think the paper I cited means, or what you believe the paper claims that is so innovative that it would belong in a much more reputable journal.)
The problem I have with this: quantization must be a function of the wavelength and perhaps the width of one slit. Yet the interference pattern produced by the double slit experiment depends on the distance between the two slits. A diffraction pattern is not the same as an interference pattern, something the original researchers were well aware of.
I'd recommend reading the book The Physics of God.
The author goes into a whole different theory on how it's possible for a photon to be either a wave or a particle depending on whether there is an Intelligent Observer/sensor.
This explanation doesn't seem to leave any room for the experimental observation that simply knowing which slit the electron went through - not some physical interaction that disturbs it but simply having that knowledge - changes the pattern.
You can decide afterwards whether to figure out which way it went or not, and make your observation then. While the English language isn't great at describing this sort of thing, it seems fairly safe to suggest that I cannot choose now to physically interact with an event that happened in the past.
I absolutely agree. We tend to forget that nothing is still in our universe.
Everything, down to the smallest quanta, oscillates. The only way to get something to stop moving is by freezing it. You need to hit absolute zero to stop movement completely, and I don't believe we have even observed absolute zero.
If you think about it, absolute zero shouldn't be possible. There are an infinite number of degrees between 1 and 0. Aren't we only limited by the sensitivity of our equipment?
They don't stop at absolute zero, either, because then position and momentum would both be fixed.
But there is a continuous range of temperature all the way down, and there is plenty of work at increasingly tiny fractions of the last degree. It is meaningful to talk about what happens at a billionth of a degree.