
If I roll a pair of dice and you get to bet on one number, what do you choose? The smart choice is 7 because there are more ways for 2 dice to come up 7 than any other number. Well, it turns out that you can apply the same logic to predicting the behavior of the universe. Let’s see how some of our most powerful tools in physics are really a game of cosmic craps.

---

Back in the 1800s, scientists were busily working out a series of laws relating the crude observable properties of matter. By relating things like the temperature, pressure, and volume of fluids, they discovered the laws of thermodynamics, but the origin of these laws eluded them.

Things started to make a lot more sense when we realized that matter is made up of atoms and molecules, and that thermodynamic properties and laws result from the motion of those particles and their interactions with each other.

Now we can’t just calculate the behavior of, say, a room full of air by applying the laws of motion to every air molecule. There are just too many. But fortunately you don’t need to—simply understanding the particles’ motion on a statistical level lets us predict their behavior. That’s exactly what the field of statistical mechanics does. It tells us exactly why the laws of thermodynamics do what they do. But “stat mech” also makes some incredible predictions beyond what any 19th century physicist could have imagined. It has predicted the existence of highly exotic states of matter like superfluids and superconductors, and even allows us to know whether a star will collapse into a black hole when it dies.

To get started, let’s try an experiment. We have a room with no gravity, pumped down to a vacuum, and into it we blast 100 rubber balls. Each ball is small, perfectly bouncy, and shot in at a random angle and speed. They’re small enough that they almost never hit each other, but they bounce off the walls without losing energy. They’re also moving fast enough that they quickly fill the room with a bouncy death-storm. Let’s focus on just one of them. If we measure its starting speed and direction, we can determine its location after a few bounces. But if we don’t know these things, that location will seem completely random.

In fact, if we allow enough bounces, every possible location for that ball has effectively equal likelihood. It could be anywhere in the room. But that also means that each configuration of locations for all 100 balls is equally likely. They could by chance all hit you in the head at the same time, or maybe line up in a perfect lattice, or they could be haphazardly scattered through the room with no obvious pattern. That last one seems more natural, but actually, any specific arrangement of balls is equally likely.

The reason we’re more likely to see a random-looking configuration is just that there are so many more room-filling, random-looking configurations than there are highly patterned ones. To make this easier to see, let’s divide the room into 10 boxes. Spraying balls into the room is equivalent to randomly dropping them into those boxes. With 100 balls we’re much more likely to get a relatively smooth spread across the boxes versus them all landing in the same box.
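You can see this with a quick simulation (a sketch; the function name and parameters are my own):

```python
import random
from collections import Counter

def drop_balls(n_balls=100, n_boxes=10, seed=0):
    """Drop each ball into a uniformly random box and return box occupancies."""
    rng = random.Random(seed)
    counts = Counter(rng.randrange(n_boxes) for _ in range(n_balls))
    return [counts.get(box, 0) for box in range(n_boxes)]

occupancy = drop_balls()
print(occupancy)       # a fairly even spread, hovering around 10 balls per box
print(max(occupancy))  # nowhere near 100 -- "all in one box" is absurdly rare
```

Run it with any seed and you'll get some random-looking spread; what you'll essentially never get is all 100 balls piled into one box.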

But what if we label each ball and specify that each ball belongs in a specific box? Now the chance of every single ball randomly falling into its pre-allocated box is just as small as that of all the balls falling into the same box. Both have a probability of one over the number of boxes raised to the power of the number of balls: 1/10^100—a probability so small that if you randomized the balls once every second, the Milky Way’s supermassive black hole would have evaporated before you got the balls into their right boxes.
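The arithmetic is easy to check exactly, using rational numbers so nothing underflows:

```python
from fractions import Fraction

n_balls, n_boxes = 100, 10

# Each labeled ball independently has a 1-in-10 chance of landing in
# its one pre-assigned box.
p_assigned = Fraction(1, n_boxes) ** n_balls

# All balls landing together in one *particular* box: also 1/10 per ball.
p_same_box = Fraction(1, n_boxes ** n_balls)

print(p_assigned == p_same_box)            # True
print(p_assigned.denominator == 10**100)   # True
```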

We can use this same idea to think about the state of molecules of air in the room. Now we have 10^27 particles instead of 100, and so the chance of getting any one specific configuration is staggeringly unlikely. And there are also vastly more configurations in which the air molecules are roughly evenly spread through the room than there are more ordered configurations.

In statistical mechanics, we call each specific configuration of balls or air molecules a microstate, while the general shape of the distribution—say, smoothly spread out or bunched up—is a macrostate. A macrostate represents the observable properties—typically the thermodynamic properties like temperature, pressure and volume. Microstates represent hidden properties like the position of each particle, and each macrostate will correspond to many potential microstates. We almost always observe the world in the macrostate that has close to the maximum possible number of microstates.
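Counting microstates per macrostate is just a multinomial coefficient. A sketch comparing the "evenly spread" macrostate to the "all in one box" macrostate (helper name is my own):

```python
from math import factorial, prod

def microstates(occupancy):
    """Number of ways to assign labeled balls to boxes with this occupancy:
    n! / (n1! * n2! * ... * nk!)."""
    n = sum(occupancy)
    return factorial(n) // prod(factorial(k) for k in occupancy)

even   = [10] * 10        # 10 balls in each of 10 boxes
lumped = [100] + [0] * 9  # all 100 balls in one box

print(microstates(lumped))           # 1 -- a single microstate
print(microstates(even) > 10**90)    # True: astronomically many microstates
```

That ratio—one microstate versus more than 10^90—is why we only ever observe the spread-out macrostate.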

We talked about this when we discussed entropy a while back. Entropy is a measure of how close your macrostate is to having the maximum microstates, which corresponds to having the least amount of order. Systems always tend towards high entropy.

But we can use these ideas about counting states to go a lot further than entropy and the laws of thermodynamics. To realize the full power of “stat mech” we need to stop thinking about particle positions and instead think about energy. Back to our bouncy balls. Let’s imagine they can also bounce off each other. As they move around the room they change position, and they also change velocity as they collide, and so exchange energy with each other.

We can think about their energy distribution in the same way that we thought about the distribution of positions. Divide all possible energies the balls can have into a series of “energy bins”. The energy distribution of the balls is just the number of balls per energy bin.

Just as with the positions, balls will move between energy bins as they interact. And just as with position space, over time the system will explore all possible configurations of this energy space. Each specific distribution of energies is again a microstate, while the macrostate is a particular shape to the energy distribution.

For example these are different macrostates. And these are a few microstates within those macrostates. And again, we’re overwhelmingly likely to observe a macrostate—an energy distribution—that results from the largest number of microstates.

The positions of the balls tended to become evenly distributed within the bounds of the walls of the room. The energies of the balls are also constrained, but in a different way, and so the distribution of energies is not flat across all possible energies. The constraint is that energy is conserved. If one ball loses energy, another gains it. And the total energy of all balls always adds up to the same.

There are various ways you could arrange our particles in their energy bins to get the same total energy - for example, splitting them between a high and a low energy bin, or piling them all in the bin corresponding to the average energy. Each of these distributions can be built with multiple different microstates, just by swapping balls around within that basic spread.
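You can brute-force this counting for a toy system—here, 4 labeled particles sharing 8 integer units of energy (the sizes are my own choice, small enough to enumerate every microstate):

```python
from itertools import product
from collections import Counter

n_particles, total_energy = 4, 8

# Enumerate every way to hand out integer energy units to labeled particles
# such that total energy is conserved. Each assignment is a microstate;
# its sorted energy list (the "shape" of the distribution) is the macrostate.
macro = Counter()
for energies in product(range(total_energy + 1), repeat=n_particles):
    if sum(energies) == total_energy:
        macro[tuple(sorted(energies))] += 1

best = max(macro, key=macro.get)
print(best, macro[best])  # the macrostate with the most microstates wins
```

Macrostates where every particle has a different energy allow the most relabelings (4! = 24 each), while "all particles at the average energy" is a single microstate.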

But there’s a particular energy distribution that allows the most “swapping”—that has the most microstates. And it’s something like this. This is the energy distribution resulting from Maxwell-Boltzmann statistics, named for James Clerk Maxwell and Ludwig Boltzmann.

Mathematically it looks like this—it tells us the number of particles in each energy bin, and it depends on the temperature of the material. From this we also get a distribution of particle velocities, called the Maxwell-Boltzmann distribution.
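The on-screen formula isn't reproduced in this transcript; written out, the standard Maxwell-Boltzmann occupation number—the average number of particles per state of energy $E$—is:

```latex
\langle n(E) \rangle = \frac{1}{e^{(E-\mu)/k_B T}} = e^{-(E-\mu)/k_B T}
```

where $\mu$ is the chemical potential, $k_B$ is Boltzmann's constant, and $T$ is the temperature.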

All of this really just comes from counting the number of ways we can randomly put balls in different energy bins. The Maxwell-Boltzmann distribution represents “maximum disorder”—the highest-entropy distribution of energies—just as high entropy in position space is represented by particles filling the room in a random-looking way.

To give you some intuition about why it looks like a wonky bell curve rather than a flat spread, consider rolling a pair of dice. The most common number to roll is 7, and that’s just because you can roll it the most ways. Six ways, in fact: 1 and 6, 2 and 5, 3 and 4, 4 and 3, 5 and 2, 6 and 1. Each of these number combinations is a microstate, all corresponding to the same macrostate of having rolled a 7. The next most common “macrostates” are 6 and 8, with 5 ways to roll each. The least common are 2 and 12, because for each there’s only a single microstate—rolling two 1s or two 6s respectively.

I did gloss over something pretty important in all of this. When I was counting energy states, I acted like it really matters which order you put the balls in their energy bins. That’s fine if the balls represent meaningfully distinct objects. But if the particles in question are truly indistinguishable from each other, then simply swapping two particles with each other shouldn’t count as two different microstates. Changing the detailed spatial configuration counts, but not just mixing around the particle labels. That means we overcounted the states.

When we count states correctly, this is the equation we get for the energy distribution. It was worked out by Satyendra Nath Bose and extended by Albert Einstein in 1924–25, and so we call this statistical behavior Bose-Einstein statistics. Any type of particle that behaves this way is called a "boson", and all bosons are A) indistinguishable from each other, and B) able to pile up in unlimited numbers in any given energy bin. We’ll come back to that last bit.
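Again, the on-screen equation isn't in the transcript; the standard Bose-Einstein occupation number is:

```latex
\langle n(E) \rangle = \frac{1}{e^{(E-\mu)/k_B T} - 1}
```

identical to the Maxwell-Boltzmann form except for the minus one in the denominator.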

The only difference from the Maxwell-Boltzmann formula is this minus one here. It makes hardly any difference in most circumstances, and so we can often just use the Maxwell-Boltzmann formula. But when it matters, it really matters. For systems with extremely low energies, that minus one leads to a big difference in the number of particles per energy bin—making it possible to cram way more particles into the lowest energy states. This results in an entirely new state of matter: the Bose-Einstein condensate, which manifests as superconductivity, superfluidity, and various other weird behaviors depending on the material.
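You can see both regimes numerically, writing each occupancy as a function of the dimensionless energy x = (E − μ)/(k_B·T):

```python
from math import exp

def mb(x):
    """Maxwell-Boltzmann occupancy: 1 / e^x."""
    return 1.0 / exp(x)

def be(x):
    """Bose-Einstein occupancy: the same formula with the 'minus one'."""
    return 1.0 / (exp(x) - 1.0)

for x in (10.0, 1.0, 0.1, 0.01):
    print(x, mb(x), be(x))
# At x = 10 the two agree to within about one part in 20,000; at x = 0.01
# the Bose-Einstein occupancy is ~100 while Maxwell-Boltzmann stays below 1.
```

That blow-up of the occupancy as x approaches zero is the mathematical seed of Bose-Einstein condensation.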

We’re going to come back to Bose-Einstein Condensates in more detail another time. For now, we’re going to wrap up with one last way of counting states that leads to wildly different behavior. For Bose-Einstein and Maxwell-Boltzmann statistics, there’s no limit to the number of particles per energy bin. But that’s not true of all particle types. Some particles refuse to share their energy bin, so that each energy state can only be occupied by a single particle. It would be as though rolling a 1 on one die meant the other could not also roll a 1. You need to count the possible states of such particles very differently, and the result is Fermi-Dirac statistics, named for Enrico Fermi and Paul Dirac, two of the founders of quantum theory. Particles obeying these statistics are called Fermions. Fermions are indistinguishable and restricted to one per energy bin.

We talked about why exactly some particles should suffer this restriction in our spin statistics episode—in short, particles with a quantum spin value that’s a half-integer—½, 3/2, 5/2, etc— are forbidden from overlapping … because quantum mechanics. These are Fermions, and include the particles that make up most of matter - electrons, protons and neutrons. Particles with integer spin are bosons, and include the force-carrying particles like photons, but you can also make bosons out of combinations of spin-half Fermions — for example the helium-4 atom.

Fermi-Dirac statistics looks similar to our other statistics types, but now the -1 in the Bose-Einstein formula becomes a +1. Again, this makes little difference at high temperatures, but at low temperatures things get weird. Because particles can’t all fall into the single lowest energy state, they instead fill up all of the available low energy states, one particle per state. This leads to nice things, like the fact that each electron orbital in an atom can hold only one electron. Well, actually two electrons per orbital, because opposite-pointing spins let two electrons remain distinguishable from each other. But the result is that atoms are much larger than they would be if electrons were bosons. And it’s why we have chemistry and structure at all in this universe.
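For completeness, the standard Fermi-Dirac occupation number is:

```latex
\langle n(E) \rangle = \frac{1}{e^{(E-\mu)/k_B T} + 1}
```

Because the denominator is always greater than 1, this occupancy can never exceed one particle per state—the exclusion principle built directly into the statistics.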

The strange behavior of Fermi-Dirac statistics also leads to some of the weirdest objects in the universe—white dwarfs and neutron stars, where all the lowest energy states are filled, resulting in what we call degenerate matter. These objects are supported against gravitational collapse only by the fact that their Fermions can’t be packed any closer together. And it was by using Fermi-Dirac statistics that we figured out how massive a star needs to be in order to collapse through the white dwarf phase and through the neutron star phase into a black hole. Of course, we’ve talked about that previously.

OK, so that’s all I have to say about statistical mechanics for now. It’s all just counting. We perceive the crude properties of the universe, and its details are hidden from our view. But simply by counting the different hidden ways by which those crude observable properties may have come about, we gain great power in predicting how those observables will behave. By mastering the game of cosmic craps, we come to a new understanding of the origin of the physical laws as the rolling of dice hidden beneath the surface layer of spacetime.

