Entropy is surely one of the most perplexing concepts in physics. It’s variously described as a measure of a system’s disorder - or of how much useful work you can no longer extract from the system - or of the information hidden by the system. Despite the seeming ambiguity in its definition, many physicists hold entropy to be behind one of the most fundamental laws of physics.

The great astrophysicist Arthur Eddington once said, “The law that entropy always increases holds, I think, the supreme position among the laws of Nature... if your theory is found to be against the Second Law of Thermodynamics I can give you no hope; there is nothing for it but to collapse in deepest humiliation.” Eddington wasn’t the only one to see the 2nd law as fundamental. As we’ve discussed in previous episodes, the 2nd law of thermodynamics may be responsible for the arrow of time and is a key ingredient in solving the black hole information paradox - a solution that may one day unite quantum physics with gravity.

But despite these clues to the importance of entropy and the 2nd law, it’s not obvious what makes them so fundamental. A system’s entropy is what we call an emergent property, and the 2nd law seems to be an emergent law. Emergent properties and laws arise from the statistical behavior of large numbers of particles. For example, a room full of air has a temperature, which is a measure of the average energy of motion of all the individual air molecules. But a single air molecule doesn’t have a temperature - at least not in the same way. Instead, that molecule has a velocity and a mass and so on, which define how it bounces off the walls or other particles, giving rise to what we perceive as temperature, and giving rise to the laws of thermodynamics.

So we tend to think of emergent properties and laws as less fundamental than the properties and laws governing individual particles. Entropy IS a thermodynamic property, and the 2nd law is the … well, second law of thermodynamics, so it’s also statistical in nature. Why, then, are these things considered so fundamental? To answer that we need to understand what these things emerge from. What underlying property of nature do all these different definitions of entropy describe?

As is often the case, getting more fundamental means getting quantum - and there is indeed a type of entropy that applies to quantum systems like our air molecule - it’s von Neumann entropy, and understanding it may help us understand not just the 2nd law, but also the arrow of time and how the large scale world emerges from the quantum in the first place.

Before we dive into that, let’s review the more familiar definitions of entropy. To really understand these we have a full playlist on the mysteries of what I’ll call classical entropy and its relationship to information. But let’s run through the essentials here.

Rudolf Clausius came up with the first definition of entropy, as essentially a measure of how much useful work can still be extracted by moving heat energy around. If heat energy is perfectly mixed inside and outside of an engine then no work can be extracted, while if heat energy is separated - hotter inside the piston chamber than outside - then it will tend to become mixed, and we can harness that flow.

Ludwig Boltzmann recast entropy in terms of the number of configurations of particles that give the same set of crude thermodynamic properties. For example, there are more configurations of particles in which energy is perfectly mixed than in which energy is concentrated in one spot - like in our piston chamber. Systems will tend towards the more common configurations.
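To make Boltzmann’s counting concrete, here’s a minimal Python sketch. The toy system - a set of oscillators sharing indivisible energy quanta, a so-called Einstein solid - and the function names are my own illustrative choices, not anything from the episode:

```python
# A minimal sketch of Boltzmann's counting: S = k_B * ln(W), where W is the
# number of microstates consistent with the crude macroscopic description.
# The toy system (an "Einstein solid" of oscillators sharing energy quanta)
# is an illustrative assumption, not something from the episode.
import math

K_B = 1.380649e-23  # Boltzmann constant, in joules per kelvin

def microstate_count(quanta: int, oscillators: int) -> int:
    # Ways to distribute indistinguishable quanta among distinguishable
    # oscillators: the "stars and bars" count C(quanta + oscillators - 1, quanta).
    return math.comb(quanta + oscillators - 1, quanta)

def boltzmann_entropy(microstates: int) -> float:
    return K_B * math.log(microstates)

# Energy concentrated in one spot: all 100 quanta pinned in a single known
# oscillator is exactly one microstate, so the entropy is zero.
print(boltzmann_entropy(1))                          # 0.0
# The same energy free to mix among 50 oscillators: W is astronomically
# larger, so this "mixed" configuration is where systems tend to end up.
print(boltzmann_entropy(microstate_count(100, 50)))  # ~1.3e-21 J/K
```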

Here we can start to see the connection between entropy and information. If all the air is in a corner of the room then you know more about the positions of the individual particles - they’re all in the corner - versus if they’re spread through the room.

But it took the invention of information theory to really understand this. It was Claude Shannon who founded the field of information theory, and also invented the entropy of information - Shannon entropy. Shannon entropy can be thought of as the amount of hidden information in a system - or, more precisely, the amount of information we can hope to gain by making a measurement on the system. If all the particles are bunched up in the corner then measuring their exact positions gains you only a little information. But if you measure their positions when they’re spread through the room you increase your information by a lot.

Another way to think about Shannon entropy is in terms of the outcomes of events. The more possible outcomes, the more entropy the event has. For example, a flipped coin is a low-Shannon-entropy event with only two outcomes, while flipping a million coins is a high-entropy event. Shannon entropy is actually more fundamental than thermodynamic entropy, in that it generalizes the more familiar entropy: it applies to any system of information and any type of action that reveals that information.
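If you want to see the formula behind this, here’s a quick Python sketch of Shannon’s measure, H = -Σ p·log2(p), in bits. The function name is my own, but the formula is Shannon’s standard one:

```python
# A quick sketch of Shannon entropy, H = -sum(p * log2(p)), in bits: the
# average information gained by learning the outcome of an event.
import math

def shannon_entropy(probabilities):
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# One fair coin: two equally likely outcomes hide exactly one bit.
print(shannon_entropy([0.5, 0.5]))  # 1.0

# A million independent fair coins: entropies of independent events add,
# so the combined flip is a (much) higher-entropy event.
print(1_000_000 * shannon_entropy([0.5, 0.5]))  # 1000000.0
```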

When he first came up with this theory, though, Shannon didn’t fully realize its importance. As the (perhaps apocryphal) origin story goes, he only started calling his invention “entropy” after talking to the great Hungarian mathematician and physicist John von Neumann. Supposedly, Von Neumann said that he should call it entropy for 2 reasons: 1. it looks exactly like the equation for thermodynamic entropy, and 2. nobody knows what entropy is, so nobody would argue with him.

But von Neumann probably knew perfectly well that Shannon’s entropy was the real deal - A) because von Neumann was a savant-level genius who had out-math’d and out-physic’ed many of the greatest minds of the last century; and B) because, as part of A), von Neumann had already invented his own brand of entropy. Von Neumann entropy is the entropy of quantum systems, and because everything else is made of quantum systems, it may be the most fundamental definition of entropy. Even Shannon entropy is just a special case of von Neumann entropy - at least so says von Neumann himself.

The concept is at least incredibly powerful. It’s at the heart of quantum information theory, enabling us to calculate how much quantum information is contained in a system, and it can also be used to determine how many bits of classical information we can get out of the system when we make a measurement. But perhaps most interesting of all - von Neumann entropy tells us the amount of entanglement in a system. In fact it’s driven by entanglement - that mysterious connection between quantum particles that Einstein called “spooky action at a distance”. As a bit of a spoiler - von Neumann entropy seems to reveal that the evolution of entanglement connections is what drives the 2nd law of thermodynamics.
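For the mathematically inclined: the standard definition is S(ρ) = -Tr(ρ log ρ), where ρ is the density matrix describing the quantum system. Here’s a minimal Python sketch of that calculation - my own illustration, with log base 2 so the answer comes out in bits:

```python
# A minimal sketch of von Neumann entropy, S(rho) = -Tr(rho * log2(rho)),
# computed from the eigenvalues of the density matrix rho. Illustrative
# only - the helper name is my own.
import numpy as np

def von_neumann_entropy(rho: np.ndarray) -> float:
    eigenvalues = np.linalg.eigvalsh(rho)  # rho is Hermitian
    # Zero eigenvalues contribute nothing (p * log p -> 0 as p -> 0).
    p = eigenvalues[eigenvalues > 1e-12]
    return float(-np.sum(p * np.log2(p)))

# A qubit we know nothing about - a maximally mixed state - hides
# exactly one bit of information.
print(von_neumann_entropy(np.eye(2) / 2))  # 1.0
```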

Ok, we’re getting ahead of ourselves. To get a glimmer of understanding of what von Neumann entropy is about, let’s think about information in quantum mechanics. Quantum systems are described by what we call the wavefunction - that’s the distribution of probabilities of all the possible properties the system could have if you tried to measure it.

According to the best accepted interpretations of quantum mechanics, the wavefunction contains all the information needed to perfectly define a quantum system.

As an example, imagine we have a quantum coin. It has a wavefunction that just describes which side is up - heads or tails. When you flip the quantum coin it enters what we call a superposition of states - it is simultaneously heads AND tails until you look at the result, at which point it becomes either heads or tails.

By the way, the quantum coin is just like the both-alive-and-dead Schrödinger’s cat, and while these are just illustrative examples, there are many real quantum systems that can exhibit these superposition states - like a particle simultaneously having spin up and spin down, as revealed in the Stern-Gerlach experiment, or a particle simultaneously passing through two slits in the double-slit experiment. And we’ve talked before about how these experiments verify these overlapping split realities.

Here’s a counter-intuitive thing about superposition: after you flip the quantum coin, you actually DO know its current unrevealed state. That’s because its state is entirely defined by its superposition wavefunction - it is in a pure state of 50% heads, 50% tails. This is subtly different to it being either heads OR tails, with a 50% chance of finding each when you look. That superposition really represents the coin’s current reality from your perspective. But if you have full knowledge of the unrevealed coin’s current state, then there is no hidden information - which means its entropy - its von Neumann entropy - is zero. Observing the coin doesn’t reveal new information about the unrevealed state. Rather it changes the quantum state in a random way - now 100% heads or 100% tails - but the information about which way it would go wasn’t hidden in the unrevealed wavefunction.

This is very different from the result of flipping a regular coin, which is definitely heads OR tails before you reveal it. That information IS embedded in its wavefunction - it just isn’t known to you. So its entropy - in this case its Shannon entropy - is positive.
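To put numbers on that contrast, here’s a hedged sketch comparing the two coins - the unrevealed quantum coin as a pure superposition state, and the unrevealed classical coin as a 50/50 statistical mixture. The representation choices are mine:

```python
# The flipped quantum coin: a pure superposition (|H> + |T>)/sqrt(2).
# Its density matrix |psi><psi| has a single nonzero eigenvalue, so its
# von Neumann entropy is zero - nothing about its state is hidden.
import numpy as np

def von_neumann_entropy(rho):
    p = np.linalg.eigvalsh(rho)
    p = p[p > 1e-12]
    return float(-np.sum(p * np.log2(p)))

psi = np.array([1.0, 1.0]) / np.sqrt(2)   # equal heads/tails amplitudes
rho_quantum = np.outer(psi, psi.conj())   # pure state
print(von_neumann_entropy(rho_quantum))   # ~0 (up to float rounding)

# The flipped classical coin: definitely heads OR tails, we just don't
# know which - a 50/50 statistical mixture hiding one full bit.
rho_classical = np.diag([0.5, 0.5])
print(von_neumann_entropy(rho_classical))  # 1.0
```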

In case superposition wasn’t weird enough, let’s bring in quantum entanglement. That means we need a second quantum coin. It’s spookily connected to the first, in that when both coins are flipped they have to land opposite to each other. After the flip we say that the flip results are entangled. They’re correlated - even though we don’t know the individual results. If we reveal one we’ll immediately know the result of the other.

So you flip your pair of entangled quantum coins. Again there are two ways this can turn out - either the first coin lands tails and the second heads, or vice versa. Before they’re revealed they exist in a superposition of states - both possibilities exist simultaneously. The unrevealed wavefunction is an equal superposition of the two: 50% heads-tails and 50% tails-heads. The von Neumann entropy of that entire wavefunction is still zero, because the combined wavefunction of the two coins holds all the information about their current state. But what if we consider only a single coin? Because of the entanglement, the part of the wavefunction corresponding to a single coin does not contain all the information about that coin’s state. Its von Neumann entropy is no longer zero - information IS hidden - it’s hidden in the part of the wavefunction corresponding to its entangled partner.
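Here’s a sketch of that bookkeeping: the joint state of the pair has zero von Neumann entropy, but tracing out the second coin leaves the first in a maximally mixed state hiding one full bit. The basis ordering and helper names are my own choices:

```python
# The entangled pair from the text: (|HT> + |TH>)/sqrt(2), with two-coin
# basis states ordered |HH>, |HT>, |TH>, |TT> (my own convention here).
import numpy as np

def von_neumann_entropy(rho):
    p = np.linalg.eigvalsh(rho)
    p = p[p > 1e-12]
    return float(-np.sum(p * np.log2(p)))

bell = np.array([0.0, 1.0, 1.0, 0.0]) / np.sqrt(2)
rho_pair = np.outer(bell, bell.conj())
print(von_neumann_entropy(rho_pair))   # ~0: the joint state is fully known

# Partial trace over the second coin: view rho as a (2,2,2,2) tensor and
# sum over the second coin's matched indices. What remains describes coin
# one alone.
rho_coin1 = np.trace(rho_pair.reshape(2, 2, 2, 2), axis1=1, axis2=3)
print(rho_coin1)                       # identity / 2: heads OR tails
print(von_neumann_entropy(rho_coin1))  # ~1.0: one bit is now hidden
```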

And this is where we can connect von Neumann entropy to all of the other forms of entropy, and glimpse the real origin of the second law.

When viewed WITH its entangled partner, the coin exhibits quantum weirdness like superposition. That could be revealed in experiments like a Bell test, which we covered previously. But treated individually, each separate entangled quantum coin behaves kind of like a regular classical coin - for example it doesn’t by itself exist in a pure superposition of states; that superposition only appears when you include its entangled partner. It’s in what we call a mixed state - heads or tails, not heads and tails. And, just like the regular, classical coin - it has non-zero entropy.

This similarity between the entangled-but-isolated quantum coin and the regular classical coin is no coincidence. Entanglement is the first step in the transition between the quantum and classical worlds. Our capacity to observe quantum effects like superposition depends on being able to access the entire wavefunction. With one entangled partner that’s merely more difficult - but as a quantum object interacts with the countless particles of a macroscopic environment, and those particles interact with each other, the web of entanglement grows so quickly that it soon becomes impossible to access the entire wavefunction. We call this process decoherence - it’s how the ordinary macroscopic world emerges from its very weird quantum pieces - and for a deeper dive into decoherence we have you covered.

Looking at it this way, our classical coin is just like our isolated entangled coin. Except now we don’t have a simple entanglement between two coins - the entanglement is between the coin’s countless constituent quantum parts and every particle they’ve ever interacted with. That network of entanglement IS in a superposition of states - heads AND tails - but you can’t ever access it. In part because you’re part of the network - you’ve already become entangled with the coin and live in the slice of the wavefunction - the mixed state - where the coin result is heads OR tails.

The propagation of entanglement leads to our experience of the very un-quantum macroscopic world, but it also drives the growth of entropy.

Information about the detailed quantum states of all particles becomes increasingly inaccessible, leaving only crude observable properties - for example, thermodynamic properties like temperature. In the language of quantum Darwinism, these properties that survive the spread of entanglement are called pointer states. Over time, systems move towards a state of maximum entanglement, at which point most information is hidden and the systems are describable by the fewest properties - for example, when a system equalizes to a single temperature.

So there you have it - the growth of entanglement drives both the 2nd law of thermodynamics AND the emergence of the classical world from the quantum. And, as an extra trick, it also defines the arrow of time, which itself points in the direction of increasing entropy and multiplying entanglement - as, of course, we’ve discussed. But there’s a lot more to talk about, because von Neumann’s insights lead us to a picture of reality that is more informational than physical. But that information won’t stay hidden for long - only until our next episode on this information-theoretic space time.

