
In particle physics we try to understand reality by looking for smaller and smaller building blocks. But what if that has been the wrong philosophy all along? 

---

The year is 1925 and the young Werner Heisenberg is striving to understand the mechanics of the newly discovered electron orbitals of hydrogen. His approach is strange and radical - rather than trying to map the detailed inner workings of the invisible atomic structure - the traditional reductionist approach - he sought a model that ignored the fundamentally unobservable internal mechanics. His mathematical description should depend only on observable quantities - in this case, the mysterious frequencies of light produced as electrons jump between orbitals. This philosophy led to a series of seemingly miraculous mathematical insights, with the final result being the birth of modern quantum theory and the first complete formulation of quantum mechanics - matrix mechanics.

Other representations of quantum mechanics soon followed - for example, wave mechanics driven by the Schrödinger equation and Paul Dirac’s notation representing a space of quantum states. These became better known than matrix mechanics, but the underlying philosophy of the Heisenberg representation was not forgotten. In fact the great Niels Bohr passionately advocated it, insisting that what matters are the observables - the measurable start and end points of an experiment. According to this philosophy, the unobservable details that happen in between are not only irrelevant - it may be meaningless to even talk about them as real, physical events.

Despite its importance in the foundation of quantum mechanics, and being championed by Bohr and Heisenberg, most physicists over the following decades did not subscribe to this philosophy - at least not in practice. They remained reductionists, and the quest continued for a detailed, mechanical description of the hidden inner workings of atoms and of the universe. This search for the underlying clockwork of reality led to quantum field theory, in which all particles are described by vibrations in elementary fields that fill the universe, and all interactions are calculated by adding up the exchanges of an infinite number of virtual particles.

But one ignores the wisdom of Heisenberg and Bohr at great peril. Early quantum field theory was plagued by problems - for example, how do you add up an infinite number of interactions? And how do you avoid the infinite interaction strengths produced by some of those sums? Some clever hacks - perturbation theory and renormalization - worked in many cases to tame the infinities and yielded the incredibly accurate predictions of quantum electrodynamics, which describes the interactions of the electromagnetic field.

But problems returned when we started to peer into the atomic nucleus. At the beginning of the 1960s the atom was understood as fuzzy, quantum electron orbits surrounding a nucleus of protons and neutrons. Those nuclear particles were originally thought to be elementary - to have no internal structure, just like the electron. But new experiments were revealing that they seemed to have some real size - as though they were made of yet-smaller particles. These were scattering experiments - particles were shot into atomic nuclei, and the internal structure was probed by the way those or other particles emerged. Such experiments revealed that the forces binding these sub-nuclear particles together must be so strong that space and time seemed to break down at those scales, and even our best field theory hacks seemed to fail.

And so a number of physicists turned back to Heisenberg’s old idea. What if it were possible to understand a scattering experiment - like those used to probe an atomic nucleus - not by modeling all the cogs and wheels inside the nucleus, but rather by understanding the observables only? In this case the observables were the particles that entered and left the nucleus in a scattering experiment.

In fact, Heisenberg was way ahead of the game. He’d already laid the groundwork in the early 40s with his work on something called the scattering matrix, or S-matrix. The S-matrix is a map of the probabilities of all possible outgoing particles, or out-states, for a given set of colliding particles - in-states. The idea was invented by John Archibald Wheeler in the late 30s as a convenient way to express the possible results of a quantum interaction. In fact, it’s still a very important tool in quantum mechanics today. But Heisenberg took it in a very different direction.
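The two defining features of the S-matrix - it maps in-states to out-states, and the probabilities of all possible outcomes must sum to one (unitarity) - can be sketched numerically. Here’s a toy two-state example; the mixing angle is an arbitrary illustrative choice, not a model of any real interaction:

```python
import numpy as np

# A toy 2-state S-matrix: columns are in-states, rows are out-states.
# The mixing angle is purely illustrative.
theta = 0.3
S = np.array([[np.cos(theta), 1j * np.sin(theta)],
              [1j * np.sin(theta), np.cos(theta)]])

# Unitarity (S times its conjugate transpose gives the identity)
# encodes conservation of probability.
assert np.allclose(S @ S.conj().T, np.eye(2))

# Probability of each out-state for in-state |0>: |S_f0|^2
probs = np.abs(S[:, 0])**2
assert abs(probs.sum() - 1.0) < 1e-12
```

Any unitary matrix of the right dimension would do here; the physics lies in which unitary matrix nature actually picks.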

In standard use, the S-matrix can be calculated if you understand the forces in the interaction region - for example, in the nucleus of an atom. But what if you don’t know those internal interaction forces? Heisenberg sought a way to ignore that internal structure and, rather, treat the S-matrix as fundamental. The S-matrix was to become the physics of the interaction, rather than an emergent property of more fundamental, internal physics. Heisenberg made some progress in the 40s, but the approach came into its own 20 years later when the atomic nucleus refused to give up its mysteries.

Through the 60s and 70s Geoffrey Chew and others took Heisenberg’s work on the S-matrix and his anti-reductionist philosophy and developed S-matrix theory. At the time, nuclear scattering experiments were producing a startling variety of different particles. For example, many different mesons were discovered, which we now know to be composed of two elementary particles - a quark and an antiquark. But at the time, prior to the discovery of quarks, no point-like, elementary nuclear particles were known. Rather than searching for smaller and smaller particles, Chew and collaborators promoted a “nuclear democracy”, in which no nuclear particle is more elementary than any other. They attempted to build scattering matrices with no elementary particles at all, and with no details of nuclear structure.

But how is this even possible? Remember, quantum field theory fastidiously adds together a complete set of virtual interactions that contribute to the real interaction. S-matrix theory sought to avoid this, and instead to model a scattering experiment - to build an S-matrix - by applying some general consistency conditions and then looking for the only scattering results consistent with those conditions. These conditions include things like conservation of energy and momentum, the behavior of quantum properties like spin, and the assumption of a family of particles that can be involved in the interaction.

But in order to avoid those sums of Feynman diagrams, S-matrix theory also relies on symmetries within those virtual interactions - in particular, something called crossing symmetry. An example of this is the fact that antimatter can be treated as matter traveling backwards in time - that folds together large sets of Feynman diagrams and helps us ignore the actual causal structure within an interaction region.

Here’s another example of crossing symmetry. Imagine two particles scattering off each other. Two go in, and two go out - the out particles could be different from the in particles, or they could be the same, just with different momenta. Two broad ways this can happen are as follows: 1) the ingoing particles exchange a virtual particle which deflects or transforms them into the outgoing particles - this is called the T-channel; or 2) the particles annihilate each other, briefly forming a virtual particle, which then decays into the two outgoing particles - that’s the S-channel. In regular quantum field theory you’d need to add up all the different versions of both of these channels separately. Before quarks and their interactions were properly understood, doing that sum seemed impossible in the case of strong force interactions.
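The channel names come from the Mandelstam variables s and t, which are built directly from the observable in- and out-going four-momenta - no internal details needed. Here’s a small numpy sketch, with made-up equal masses and momenta for elastic scattering, that computes them and checks the standard identity that s + t + u equals the sum of the squared masses:

```python
import numpy as np

# Minkowski inner product, signature (+, -, -, -)
def mdot(p, q):
    return p[0] * q[0] - np.dot(p[1:], q[1:])

m = 1.0       # common particle mass (illustrative units)
p_mag = 2.0   # centre-of-mass 3-momentum magnitude (made up)
theta = 0.7   # scattering angle (made up)
E = np.hypot(p_mag, m)  # energy: sqrt(p^2 + m^2)

p1 = np.array([E, 0.0, 0.0,  p_mag])
p2 = np.array([E, 0.0, 0.0, -p_mag])
p3 = np.array([E,  p_mag * np.sin(theta), 0.0,  p_mag * np.cos(theta)])
p4 = np.array([E, -p_mag * np.sin(theta), 0.0, -p_mag * np.cos(theta)])

s = mdot(p1 + p2, p1 + p2)  # squared total centre-of-mass energy
t = mdot(p1 - p3, p1 - p3)  # squared momentum transfer
u = mdot(p1 - p4, p1 - p4)  # the third, crossed variable

# Mandelstam identity for 2-to-2 scattering
assert abs(s + t + u - 4 * m**2) < 1e-9
```

Each variable describes the same four particles viewed through a different pairing - which is exactly the structure crossing symmetry exploits.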

But in 1968, Italian physicist Gabriele Veneziano figured out a hack. It had been postulated that the S-channel and the T-channel should lead to identical scattering amplitudes. That fact enabled Veneziano to ignore the fiddly details of the separate channels and derive a scattering matrix, which in turn allowed him to explain the peculiar relationship between the mass and spin of mesons.
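Veneziano’s amplitude is the Euler beta function evaluated on linear Regge trajectories, and the channel symmetry is manifest in the formula itself. A sketch with scipy - the trajectory intercept and slope below are illustrative choices, not fits to real meson data:

```python
from scipy.special import gamma

def alpha(x, intercept=0.5, slope=1.0):
    # Linear Regge trajectory: spin grows linearly with mass squared.
    # These parameter values are illustrative, not fitted to meson data.
    return intercept + slope * x

def veneziano(s, t):
    # Veneziano amplitude: the Euler beta function B(-alpha(s), -alpha(t)).
    # Its poles at non-negative integer alpha correspond to resonances -
    # the mass-spin relationship of the mesons.
    a, b = alpha(s), alpha(t)
    return gamma(-a) * gamma(-b) / gamma(-a - b)

# Crossing symmetry: swapping s and t leaves the amplitude unchanged.
A_st = veneziano(0.2, 0.3)
A_ts = veneziano(0.3, 0.2)
assert abs(A_st - A_ts) < 1e-12
```

The s-t symmetry holds by construction, so a single closed form stands in for what would otherwise be two separate infinite sums of diagrams.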

The S-matrix approach to solving problems in quantum mechanics based on these global consistency conditions and taking advantage of symmetries is also called a bootstrap model - from the expression “pull yourself up by your bootstraps” - the idea of raising yourself up without a concrete starting point to push off of.

So S-matrix theory looked extremely promising … until it didn’t. It presented severe challenges on par with those plaguing quantum field theory - and, as it happened, physicists solved the QFT challenges first. Breakthroughs in our understanding of the behavior of quarks and gluons revealed that the strong nuclear force does not actually approach infinite strength as was once feared, and so a full quantum field theoretic description of the strong nuclear force was possible after all. The result is quantum chromodynamics - our modern description of sub-nuclear physics. QCD deserves its own episode, so I’ll skip the details for now. But the result was that S-matrix theory was sidelined, and quantum field theory reigns supreme to this day as our reductionist description of the subatomic world.

So do we really now have a perfect mechanical description of the smallest scales of reality? Well, not so fast. The most precise tests of quantum chromodynamics still involve the aforementioned hack of perturbation theory - sums over large numbers of intermediate virtual states. And as we discussed in our episode on virtual particles, the physical-ness of these states is questionable at best. There are non-perturbative formulations of quantum chromodynamics, like lattice QCD, but these are severely limited in application due to their computational demands.

Quantum field theory surely gives us insights into the nature of the fundamental workings of the universe. At first glance S-matrix theory now seems less fundamental - it seems like an emergent set of relationships - what we call an “effective” theory - but it turns out that it has led to deep insights that even quantum field theories couldn’t reach. 

So I said that S-matrix theory got sidelined - that’s not exactly true. Remember that clever little bit of work by Gabriele Veneziano? It turned out that the Veneziano amplitude for meson scattering represents something rather more profound than just predicting the results of a scattering experiment. Other physicists quickly realised that it was telling us that mesons could be described by a very particular type of physical system: a vibrating string. And so string theory was born - at first as a description of strong force interactions before quantum chromodynamics took over - but then as a theory of quantum gravity. So our leading, and perhaps only, current contender for a theory of everything was first derived using a bootstrap model - an S-matrix theory.

And physicists are bringing the S-matrix back. Here’s an especially awesome recent example. We think that the largest structures in the universe today - galaxies and galaxy clusters - collapsed from quantum fluctuations in the extremely early universe, represented in some cases by individual particles. Princeton’s Nima Arkani-Hamed and collaborators have performed what they call a cosmological bootstrap to understand the nature of those early subatomic scale interactions based only on current observables - the distribution of gigantic galaxies on the sky. 

That’s a cool result, but Arkani-Hamed’s work on something called the amplituhedron has hinted that the S-matrix approach can be taken much, much further. The amplituhedron takes Heisenberg’s old philosophy - “only consider the observables” - to the extreme: it doesn’t just eliminate the fiddly mechanics of quantum field theory, it removes the very concepts of space and time. These only emerge later as a consequence of spaceless, timeless particle scattering. But all of these new efforts deserve their own episode, when we’ll see how a simple insight by a young scientist back in 1925 may take us as close as we’ve ever come to understanding the fundamental nature of space time.

*********************

SHOOTING SCRIPT

In particle physics we try to understand reality by looking for smaller and smaller building blocks. But what if that has been the wrong philosophy all along? 

The year is 1925 and the young Werner Heisenberg is striving to understand the mechanics of the newly discovered electron orbitals of hydrogen. His approach is strange and radical - rather than trying to map the detailed inner workings of the invisible atomic structure - the traditional reductionist approach - he sought a model that ignored the fundamentally unobservable internal mechanics. His mathematical description should depend only on observable quantities - in this case, the mysterious frequencies of light produced as electrons jump between orbitals. This philosophy led to a series of seemingly miraculous mathematical insights, with the final result being the birth of modern quantum theory and the first complete formulation of quantum mechanics - matrix mechanics.

Other representations of quantum mechanics soon followed - for example, wave mechanics driven by the Schrödinger equation and Paul Dirac’s notation representing a space of quantum states. These became better known than matrix mechanics, but the underlying philosophy of the Heisenberg representation was not forgotten. In fact the great Niels Bohr passionately advocated it, insisting that what matters are the observables - the measurable start and end points of an experiment. According to this philosophy, the unobservable details that happen in between are not only irrelevant - it may be meaningless to even talk about them as real, physical events.

Despite its importance in the foundation of quantum mechanics, and being championed by Bohr and Heisenberg, most physicists over the following decades did not subscribe to this philosophy - at least not in practice. They remained reductionists, and the quest continued for a detailed, mechanical description of the hidden inner workings of atoms and of the universe. This search for the underlying clockwork of reality led to quantum field theory, in which all particles are described by vibrations in elementary fields that fill the universe, and all interactions are calculated by adding up the exchanges of an infinite number of virtual particles.

But one ignores the wisdom of Heisenberg and Bohr at great peril. Early quantum field theory was plagued by problems - for example, how do you add up an infinite number of interactions? And how do you avoid the infinite interaction strengths produced by some of those sums? Some clever hacks - perturbation theory and renormalization - worked in many cases to tame the infinities and yielded the incredibly accurate predictions of quantum electrodynamics, which describes the interactions of the electromagnetic field.

But problems returned when we started to peer into the atomic nucleus. At the beginning of the 1960s the atom was understood as fuzzy, quantum electron orbits surrounding a nucleus of protons and neutrons. Those nuclear particles were originally thought to be elementary - to have no internal structure, just like the electron. But new experiments were revealing that they seemed to have some real size - as though they were made of yet-smaller particles. These were scattering experiments - particles were shot into atomic nuclei, and the internal structure was probed by the way those or other particles emerged. Such experiments revealed that the forces binding these sub-nuclear particles together must be so strong that space and time seemed to break down at those scales, and even our best field theory hacks seemed to fail.

And so a number of physicists turned back to Heisenberg’s old idea. What if it were possible to understand a scattering experiment - like those used to probe an atomic nucleus - not by modeling all the cogs and wheels of the field theory inside the nucleus, but rather by understanding the observables only? In this case the observables were the particles that entered and left the nucleus in a scattering experiment.

In fact, Heisenberg was way ahead of the game. He’d already laid the groundwork in the early 40s with his work on something called the scattering matrix, or S-matrix. The S-matrix is a map of the probabilities of all possible outgoing particles, or out-states, for a given set of colliding particles - in-states. The idea was invented by John Archibald Wheeler in the late 30s as a convenient way to express the possible results of a quantum interaction. In fact, it’s still a very important tool in quantum mechanics today. But Heisenberg took it in a very different direction.

In standard use, the S-matrix can be calculated if you understand the forces in the interaction region - for example, in the nucleus of an atom. But what if you don’t know those internal interaction forces? Heisenberg sought a way to ignore that internal structure and, rather, treat the S-matrix as fundamental. The S-matrix was to become the physics of the interaction, rather than an emergent property of more fundamental, internal physics. Heisenberg made some progress in the 40s, but the approach came into its own 20 years later when the atomic nucleus refused to give up its mysteries.

Through the 60s and 70s Geoffrey Chew and others took Heisenberg’s work on the S-matrix and his anti-reductionist philosophy and developed S-matrix theory. At the time, nuclear scattering experiments were producing a startling variety of different particles. For example, many different mesons were discovered, which we now know to be composed of two elementary particles - a quark and an antiquark. But at the time, prior to the discovery of quarks, no point-like, elementary nuclear particles were known. Rather than searching for smaller and smaller particles, Chew and collaborators promoted a “nuclear democracy”, in which no nuclear particle is more elementary than any other. They attempted to build scattering matrices with no elementary particles at all, and with no details of nuclear structure.

But how is this even possible? Remember, quantum field theory fastidiously adds together a complete set of virtual interactions that contribute to the real interaction. S-matrix theory sought to avoid this, and instead to model a scattering experiment - to build an S-matrix - by applying some general consistency conditions and then looking for the only scattering results consistent with those conditions. These conditions include things like conservation of energy and momentum, the behavior of quantum properties like spin, and the assumption of a family of particles that can be involved in the interaction.

But in order to avoid those sums of Feynman diagrams, S-matrix theory also relies on symmetries within those virtual interactions - in particular, something called crossing symmetry. An example of this is the fact that antimatter can be treated as matter traveling backwards in time - that folds together large sets of Feynman diagrams and helps us ignore the actual causal structure within an interaction region.

Here’s another example of crossing symmetry. Imagine two particles scattering off each other. Two go in, and two go out - the out particles could be different from the in particles, or they could be the same, just with different momenta. Two broad ways this can happen are as follows: 1) the ingoing particles exchange a virtual particle which deflects or transforms them into the outgoing particles - this is called the T-channel; or 2) the particles annihilate each other, briefly forming a virtual particle, which then decays into the two outgoing particles - that’s the S-channel. In regular quantum field theory you’d need to add up all the different versions of both of these channels separately. Before quarks and their interactions were properly understood, doing that sum seemed impossible in the case of strong force interactions.

But in 1968, Italian physicist Gabriele Veneziano figured out a hack. It had been postulated that the S-channel and the T-channel should lead to identical scattering amplitudes. That fact enabled Veneziano to ignore the fiddly details of the separate channels and derive a scattering matrix, which in turn allowed him to explain the peculiar relationship between the mass and spin of mesons.

The S-matrix approach to solving problems in quantum mechanics based on these global consistency conditions and taking advantage of symmetries is also called a bootstrap model - from the expression “pull yourself up by your bootstraps” - the idea of raising yourself up without a concrete starting point to push off of.

So S-matrix theory looked extremely promising … until it didn’t. It presented severe challenges on par with those plaguing quantum field theory - and, as it happened, physicists solved the QFT challenges first. Breakthroughs in our understanding of the behavior of quarks and gluons revealed that the strong nuclear force does not actually approach infinite strength as was once feared, and so a full quantum field theoretic description of the strong nuclear force was possible after all. The result is quantum chromodynamics - our modern description of sub-nuclear physics. QCD deserves its own episode, so I’ll skip the details for now. But the result was that S-matrix theory was sidelined, and quantum field theory reigns supreme to this day as our reductionist description of the subatomic world.

So do we really now have a perfect mechanical description of the smallest scales of reality? Well, not so fast. Standard QCD employs sums over large numbers of intermediate virtual states. And as we discussed in our episode on virtual particles, the physical-ness of these states is questionable at best.  

Quantum field theories like QCD surely give us insights into the nature of the fundamental workings of the universe. Given their astounding predictive success, S-matrix theory now seems less fundamental - it seems like an emergent set of relationships - what we call an “effective” theory - but it turns out that it has led to deep insights that even quantum field theories couldn’t reach.

So I said that S-matrix theory got sidelined - that’s not exactly true. Remember that clever little bit of work by Gabriele Veneziano? It turned out that the Veneziano amplitude for meson scattering represents something rather more profound than just predicting the results of a scattering experiment. Other physicists quickly realised that it was telling us that mesons could be described by a very particular type of physical system: a vibrating string. And so string theory was born - at first as a description of strong force interactions before quantum chromodynamics took over - but then as a theory of quantum gravity. So our leading, and perhaps only, current contender for a theory of everything was first derived using a bootstrap model - an S-matrix theory. Oh, and another example of bootstrapping a scattering experiment without understanding the internal physics: Stephen Hawking’s derivation of Hawking radiation.

And physicists are bringing the S-matrix back. Here’s an especially awesome recent example. We think that the largest structures in the universe today - galaxies and galaxy clusters - collapsed from quantum fluctuations in the extremely early universe, represented in some cases by individual particles. Princeton’s Nima Arkani-Hamed and collaborators have performed what they call a cosmological bootstrap to understand the nature of those early subatomic scale interactions based only on current observables - the distribution of gigantic galaxies on the sky. 

That’s a cool result, but Arkani-Hamed’s work on something called the amplituhedron has hinted that the S-matrix approach can be taken much, much further. The amplituhedron takes Heisenberg’s old philosophy - “only consider the observables” - to the extreme: it doesn’t just eliminate the fiddly mechanics of quantum field theory, it removes the very concepts of space and time. These only emerge later as a consequence of spaceless, timeless particle scattering. But all of these new efforts deserve their own episode, when we’ll see how a simple insight by a young scientist back in 1925 allowed us to pull ourselves up by our bootstraps towards a better understanding of the quantum weirdness of space time.
