The Cutting Edge

As this is the first real entry of The Cutting Edge, I will explain its purpose once more. The Cutting Edge is an effort to help communicate science and technology to the public. I will explain, as best I can and in plain language, what new discoveries mean, so that their reality is not accessible only to those with an advanced degree. There will be a thread in The Hub, the forums of Seize My Future, for those who wish to ask questions about the story.

Wheeler’s Delayed Choice Experiment

The basis of the experiment recently carried out is Wheeler’s Delayed Choice Experiment, a collection of thought experiments posed by the physicist John Wheeler. One of the basic foundations of quantum physics is that a quantum object can behave as either a particle or a wave. Wheeler’s thought experiments sought to pin down when the “choice” between particle behavior and wave behavior is made. Particle behavior is the behavior of all objects in classical mechanics, from the Sun and the planets all the way down to a baseball, and even to the cellular level: a predictable, singular path from point A to point B. Wave behavior, on the other hand, is not so simple. Wave behavior allows a quantum object to assume a “superposition” of states, meaning it isn’t in just one state or another but has a probability of being in each. Instead of taking a single path from point A to point B, the object can be described as taking all possible paths between them. The famous double slit experiment showed that when measured, a quantum object such as a photon (the basic particle of light) switches from wave behavior to particle behavior. This poses an interesting problem: how does the photon “know” it’s being measured? Wheeler’s Delayed Choice Experiment hoped to answer this by pinning down when that choice is made.
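The difference between the two behaviors can be caricatured with a little arithmetic. The sketch below is my own illustration, not part of the experiment: each path contributes an amplitude, and wave behavior adds the amplitudes before squaring, while particle behavior adds the probabilities directly. Only the wave case produces bright and dark fringes.

```python
import math
import cmath

def wave_intensity(phi):
    # Wave behavior: add the two path amplitudes, THEN square.
    # phi is the phase difference between the paths at this screen spot.
    amp = (1 + cmath.exp(1j * phi)) / math.sqrt(2)
    return abs(amp) ** 2          # equals 1 + cos(phi): fringes from 0 to 2

def particle_intensity():
    # Particle behavior: add the two path probabilities directly.
    return 0.5 + 0.5              # flat: 1 everywhere, no fringes

print(round(wave_intensity(0), 3))        # 2.0 (a bright fringe)
print(round(wave_intensity(math.pi), 3))  # 0.0 (a dark fringe)
print(particle_intensity())               # 1.0, the same at every spot
```

The punchline is in the order of operations: squaring after adding lets the two paths cancel or reinforce each other, which is exactly what “taking all possible paths” buys you.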

Bose-Einstein Condensates

The basic particle in the recent experiment is a single helium atom. This atom isn’t a solid, a liquid, or an ordinary gas. Instead, it comes from a cloud cooled into a “Bose-Einstein Condensate”, an extremely low-energy state of matter that allows particles to exhibit quantum behavior (wave-particle duality). This is a rather interesting state of matter, and there’s actually a planned experiment in 2016 aboard the International Space Station to attempt to create a Bose-Einstein Condensate the size of a human hair. This would be almost impossible on Earth, but it’s hoped to be possible in the microgravity of space.

Experiment to Confirm Quantum Weirdness

Stages
Enter Associate Professor Andrew Truscott of Australian National University. While the original delayed choice experiment was envisioned with a photon of light between mirrors, Dr. Truscott got the idea to do the experiment with an atom from a Bose-Einstein Condensate. The experiment consists of five stages.

Stage one (t=0): a helium atom that has been cooled into the Bose-Einstein Condensate phase is released from a magnetic trap.
Stage two (t=1): the atom hits a scattering grating made of laser light. This grating acts like a material grate, which would separate light beams, leaving two possible paths for the atom.
Stage three (t=2): a computer generates a random number. Remember, this happens after stage two, when the atom has already been scattered.
Stage four (t=3): depending on the outcome of stage three, a second laser light grating is turned on. This grating recombines the paths scattered at stage two.
Stage five (t=4): the outcome is detected on a screen.

This is the base of the experiment.
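The timing of the five stages can be sketched as a toy simulation. This is a classical caricature of my own, not a model from the paper, and the names in it are made up; the point is simply that the random bit is drawn after the atom has already been split, and only the trials without recombination end on one definite path.

```python
import random

def run_trial(rng):
    # Stages 1-2: the atom is released and split into a superposition
    # of paths A and B by the first laser grating.
    # Stage 3: AFTER the split, a random bit decides what comes next.
    second_grating_on = rng.random() < 0.5
    if second_grating_on:
        # Stages 4-5: the paths recombine -> interference (wave behavior).
        return ("wave", None)
    # No recombination -> the atom is found on one definite path,
    # with equal probability for A and B (particle behavior).
    return ("particle", rng.choice(["A", "B"]))

rng = random.Random(1)
results = [run_trial(rng) for _ in range(10_000)]
particle = [path for kind, path in results if kind == "particle"]
print(len(particle) / len(results))         # roughly 0.5 of all trials
print(particle.count("A") / len(particle))  # roughly 0.5 follow path A
```

Of course, a classical program has no trouble “deciding” after the fact, because here the behavior really is only fixed at the end; the puzzle in the real experiment is that a physical atom seems to do the same.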

The Data from the Experiment

So, what happened? Well, if the random number left the second grating off, one of two things happened: either the atom followed path A, or the atom followed path B. While this was not measured while the atom was traveling, it could be determined afterwards by detecting a slight change in the atom’s momentum. The atom was found on path A or path B with about equal probability. This outcome is consistent with particle behavior. If the random number turned the second grating on, the atom was found to have traveled down both path A and path B. When the laser light recombined paths A and B, an interference pattern was observed: over many runs, the atoms built up a pattern of minimums and maximums on the screen. Minimums are places the atoms essentially never hit, which show up as dark spots. Maximums are places where the contributions of the two paths add up, causing spots brighter than particle behavior alone would generate. While this is not an entirely accurate representation of the “why” of the interference pattern, it’s close enough to get the point.
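As a rough picture of how minimums and maximums build up over many runs, here is a toy model of my own (the screen coordinate and fringe spacing are invented for illustration): wave-trial hits land with probability proportional to 1 + cos(x), so a dark fringe near x = π collects far fewer hits than a bright fringe near x = 0.

```python
import math
import random

def sample_hit(rng, wave):
    # Rejection-sample a screen position x in [0, 4*pi).
    # Wave trials land with probability proportional to 1 + cos(x)
    # (bright and dark fringes); particle trials land uniformly.
    while True:
        x = rng.uniform(0, 4 * math.pi)
        if not wave:
            return x
        if rng.uniform(0, 2) < 1 + math.cos(x):
            return x

rng = random.Random(2)
wave_hits = [sample_hit(rng, wave=True) for _ in range(20_000)]

# Count hits near a minimum (x = pi) and near a maximum (x = 0 / 4*pi).
near_min = sum(abs(x - math.pi) < 0.3 for x in wave_hits)
near_max = sum(x < 0.3 or x > 4 * math.pi - 0.3 for x in wave_hits)
print(near_min < near_max)   # True: the dark fringe stays nearly empty
```

A particle-behavior run (`wave=False`) would fill both windows about equally, which is exactly the flat, fringe-free pattern described above.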

Interpreting the Data

So, the data indicates that the events of stage three and stage four determine what happens immediately after stage two – or so it would seem. If stage three generated a random number that caused stage four to leave the lasers off, the atom followed a single path and exhibited particle behavior. If stage three generated a random number that caused stage four to turn the lasers on, the atom followed both paths and exhibited wave behavior. In the traditional picture, the atom exists in a consistent state throughout the experiment: either it’s a wave, or it’s a particle. The outcome of this experiment, however, means one of two things. Either what happens immediately after stage two is determined by the events at stages three and four (meaning the future measurement changes the past), or the atom’s behavior doesn’t actually exist until it’s observed (meaning wave or particle behavior is only determined at measurement).

While both possibilities seem maddening and counter-intuitive, the more accepted explanation is that the atom’s behavior isn’t fixed beforehand: whether it shows wave or particle behavior isn’t determined at stage two, but at stage four. A large body of evidence leads physicists to believe that affecting the past is all but impossible, while nothing requires a particle’s properties to exist before they’re measured – nothing, that is, aside from common sense, which quantum physics has violated multiple times. So, the seemingly more consistent explanation is that the atom only becomes “real” when it’s measured.

An Alternative Explanation

So, the popular explanation is that the atom isn’t “really there” until it’s measured. This would seem to answer the age-old question “if a tree falls and there’s no one there to hear it, does it make a sound?” with a big, fat no. However, what if the opposite were true? What if the future really was influencing the past? Should we not at least examine the possibility? I have done just that, and have personally come up with a plausible explanation that is at least consistent with the majority of knowledge in physics.

The idea is that the particle, between measurements, is in a state known as “entangled” with itself across time. Entanglement across time has no direct evidence as of yet, but neither is there a physical law directly forbidding it – at least in the manner I propose. Quantum entanglement is when two particles essentially become “twins”: when a measurement is made on one, it affects the other, and if the same measurement is then done on the other particle, it’s found to give the same result.

This explanation would treat the particle immediately after stage two and the particle at stage four as two separate particles entangled with each other. When the particle at stage four is subjected to a laser grating, the effect this has on the particle at t=3 influences the particle just after t=1 in the identical manner – consistent with quantum entanglement. This then causes the particle, as it goes from t=1 to t=3, to behave as a wave, creating a consistent history. If the paths are not recombined by a laser grating, the particle strikes the screen as a particle, ending the period between measurements, and it too influences itself just after t=1 to create a consistent history. In a manner, this is almost a hybrid explanation – perhaps it really is best viewed as a combination of both. Still, it would make those who would prefer to have nothing to do with the future influencing the past uneasy.
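Ordinary two-particle entanglement, in its simplest form, can be caricatured in a few lines. This is a toy model of my own, not a simulation of real quantum states: neither “twin” carries a definite value on its own, and the first measurement fixes the result for both.

```python
import random

def measure_entangled_pair(rng):
    # Toy model: the pair has no definite value until one member is
    # measured. The first measurement picks a random outcome, and the
    # partner is then guaranteed to give the same result.
    shared = rng.choice([0, 1])
    return shared, shared

rng = random.Random(0)
outcomes = [measure_entangled_pair(rng) for _ in range(1_000)]
print(all(a == b for a, b in outcomes))   # True: the results always match
```

My proposal simply relabels the two “twins” as the same atom at two different times, t=1 and t=3, so that the stage-four measurement fixes a result that the earlier self has no choice but to share.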

So, what do you think? Discuss in the comments!
