Statistically speaking, you are a disembodied brain that just randomly assembled out of the chaotic void. Your memory of your entire life also just came into being through a chance arrangement of particles. That includes your memory of that last statement. And that one. And that one. You, my friend, are a Boltzmann Brain. Statistically speaking.

Ludwig Boltzmann revolutionized physics in the latter half of the 19th century. His most incredible insight was the kinetic theory of gases. He showed that the laws of thermodynamics can be explained by thinking of a gas as a collection of microscopic particles in constant, random motion. Take entropy, for example. Prior to Boltzmann, entropy was only understood as a measure of the proportion of a system’s energy that is unavailable for useful work. For example, the hot, dense gas in a recently fired engine piston only expands because it’s surrounded by a colder, less dense environment. The energy in that burning gasoline does the useful work of driving your car down the street.

But if you put the compressed piston in an environment that’s full of equally hot, dense gas, nothing happens. The gas in the piston still contains the same amount of energy, but it’s useless energy. Why? Because in the latter case, the gas inside the piston is in thermal equilibrium with the gas outside the piston – everything at the same temperature and pressure, perfectly mixed. Entropy is a measure of how far from equilibrium a system is: the lower the entropy, the further from equilibrium. Any part of the universe, left to its own devices, always tends to flow back to equilibrium. That’s the second law of thermodynamics right there: entropy never decreases in a closed system. It’s that flow back to equilibrium – that increase in entropy – that can be harnessed to do work. But WHY does the universe have to return to equilibrium? Why does entropy always increase?

Boltzmann figured this out. Entropy is just a measure of the “specialness”, or degree of order, of the current arrangement of particle positions and velocities. It’s the proportion of possible states that are indistinguishable from the current state. Instead of an engine, imagine a room full of air. The air molecules are moving randomly, and over time they pass through all possible arrangements that they could have. In most of those random arrangements the room is filled pretty evenly, and we can’t tell those arrangements apart from each other. That’s a high-entropy situation, because a high proportion of all possible states look just like this one. But in a tiny fraction of the possible distributions, all of the air molecules end up randomly bunched together in one corner, or, I don’t know, produce a sound density wave playing the Ballad of Serenity over and over. If you count up all possible arrangements of particles, only a tiny proportion do weird, highly ordered stuff like that, so they’re very low-entropy situations. Entropy increases because particle positions and velocities get randomized over time.
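Boltzmann’s counting argument can be made concrete with a toy model. The sketch below (Python, with toy numbers assumed purely for illustration – they’re not from the discussion above) splits a room into two halves, counts microstates, and applies Boltzmann’s entropy formula S = k·ln(W):

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K


def entropy(W):
    """Boltzmann entropy S = k_B * ln(W) for a macrostate with W microstates."""
    return k_B * math.log(W)


N = 100  # toy number of molecules; a real room holds more like 10**27

# Each molecule is independently in the left or right half: 2**N microstates.
# The macrostate "all N bunched in the left half" has exactly one microstate.
W_bunched = 1

# The macrostate "evenly split, N/2 on each side" has C(N, N/2) microstates.
W_even = math.comb(N, N // 2)

print(entropy(W_bunched))  # 0.0 -- the maximally ordered, lowest-entropy state
print(entropy(W_even))     # much larger: about 67 * k_B
print(W_even / 2 ** N)     # ~0.08: the single most common macrostate
```

Even at a mere 100 molecules, the evenly mixed macrostate alone accounts for about 8% of all microstates, while the bunched-up one accounts for less than one part in 10³⁰ – which is why “evenly filled” is what we always see.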

Boltzmann’s interpretation of entropy leads to a prediction that seems innocuous but has some astounding implications. His statistical interpretation doesn’t prohibit entropy from decreasing. In fact it allows it. For example, tiny localized dips in entropy happen all the time when you get a chance convergence of a few particles in one part of the room. The larger the random dip in entropy, the less probable it is. But improbable isn’t impossible. There’s an incredibly tiny chance that all of the particles in a room of gas WILL happen to end up in one corner of the room due to their random motion. It would take vastly longer than the age of the universe for that to happen, so in practice we never observe the second law of thermodynamics broken on macroscopic scales.
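How that improbability scales with the size of the dip is easy to sketch. Assuming each of N molecules independently has a 50% chance of sitting in the left half of the room (a toy independence model I’m assuming here, not something from the argument above), the probability of the whole room spontaneously bunching up falls off exponentially:

```python
import math


def log10_prob_all_left(N, p=0.5):
    """log10 of the chance that all N independent molecules sit in the left half."""
    return N * math.log10(p)


# The improbability of a dip grows exponentially with its size:
for N in (10, 100, 1000):
    print(N, log10_prob_all_left(N))
# N=10   -> ~10^-3:   tiny dips like this happen constantly
# N=100  -> ~10^-30
# N=1000 -> ~10^-301: effectively never within the age of the universe
```

The probability is never exactly zero, though – and that loophole is what the rest of the argument exploits.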

But given infinite time, any non-impossible arrangement will happen. Imagine an infinitely large room – a universe – that’s in perfect thermal equilibrium for infinite time. All sorts of dips in entropy will happen – particles will occasionally converge into a dense arrangement like a black hole or a galaxy, or into a complex arrangement like a teapot or a boxed DVD set of the never-made seasons 2 through 8 of Firefly. All incredibly improbable. However, there’s one arrangement that those particles could randomly fall into that would be even less probable than all of the above: all the particles in a region much larger than our universe could randomly end up in almost exactly the same location. Such an arrangement would give us the Big Bang.

It’s not known whether the Big Bang originated as a low-entropy dip in an otherwise high entropy universe. It may have. But however it happened, entropy WAS extremely low at the instant of the Big Bang, and has been increasing ever since. The “useful work” performed by that increase in entropy includes the formation of galaxies, stars, planets, Alan Tudyk, indeed the entire process of evolution. In the far future the universe will reach maximum entropy. The black holes will evaporate and the last proton will decay, and all of that cool stuff will cease. The universe will spend almost all of its time in that high entropy state.

Nonetheless, it shouldn’t really be so surprising that we observe a low-entropy blip in an otherwise mostly high-entropy universe. After all, our existence is a by-product of the universe’s progression towards that high-entropy state. What other time could we possibly observe? This is an application of the Anthropic Principle: we can only observe an environment capable of producing observers. It’s not surprising that we view the universe from the comfy biosphere of a terrestrial planet, even though the volume of all biospheres is minuscule compared to the volume of the universe. Similarly, we must have appeared at a time and in a universe capable of producing biospheres.

The Anthropic Principle may explain why we exist in monstrously improbable or rare circumstances. But the principle doesn’t allow us to assume a circumstance for our existence that is any more improbable than is absolutely necessary. In fact, we’re most likely to be in the most common, most probable circumstance that could possibly explain our current experience. This is just Occam’s razor, or the Law of Parsimony – don’t add unnecessary complexity to your explanation – but it can also be thought of as an extension of the Copernican Principle: we observe the universe from as typical a vantage point as is consistent with our experience.

So aren’t there more probable, smaller dips in entropy that could lead to conscious observers? For example, why collapse a whole universe worth of particles? A single galaxy should be enough. Such systems should massively outnumber larger Big Bang-scale collapses, and so should the conscious observers that evolve in them. But we can go further. Why not just have particles converge directly into a single human brain, in exactly the right arrangement to have an illusion of memory and sensation that duplicates exactly our current experience, even if just for an instant? That would be a Boltzmann brain. In a universe where structure results from entropy fluctuations, the vast majority of conscious experiences that ever occur should be Boltzmann brains, rather than ones that arise from evolution. It sounds ridiculous, but it’s the logical conclusion if we assume a Big Bang from entropy fluctuations. The hypothesis is also impossible to prove wrong. Every experiment I do may be the randomly-assembled delusion of a Boltzmann brain that happened to come into existence with the memory of trying to prove it isn’t a Boltzmann brain. The hypothesis is unfalsifiable, but that alone may be enough reason to reject it.
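The counting behind that conclusion can be sketched numerically. In the fluctuation picture, a dip that lowers entropy by ΔS has probability roughly proportional to e^(−ΔS/k), so comparing two kinds of dip only requires their entropy difference. The ΔS values below are rough order-of-magnitude assumptions of mine (expressed in units of k), chosen purely to illustrate the shape of the argument – they are not measured quantities:

```python
import math

# Probability of a fluctuation of depth dS (in units of k_B) ~ exp(-dS).
# Toy order-of-magnitude assumptions, NOT measured values:
dS_brain = 1e25      # assumed entropy cost of fluctuating a single brain
dS_universe = 1e104  # assumed cost of a Big Bang-sized fluctuation

# log10 of the ratio P(brain) / P(Big Bang) = exp(dS_universe - dS_brain).
log10_ratio = (dS_universe - dS_brain) / math.log(10)

print(f"brain fluctuations outnumber Big Bang fluctuations by ~10^{log10_ratio:.3g}")
```

Whatever the exact numbers, as long as a lone brain costs vastly less entropy than a whole universe, the exponential makes isolated brains overwhelmingly more common than Big Bangs – which is the heart of the paradox.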

Sean Carroll has another nice argument against Boltzmann brains. He says the idea is cognitively unstable. By accepting that we’re a Boltzmann brain we’re admitting to a state of fantastic delusion, and in doing so admit our incapacity to deduce our own nature. To expand on that, we can also go back to parsimony and the Copernican argument. If we’re Boltzmann brains then we’re the most common type of Boltzmann brain that has an experience that is indistinguishable from … this. So surely it’s vastly simpler to accidentally manifest a brain with an instantaneous delusion about its ability to understand the world in complex ways than it is to assemble one with true intelligence that can trust its own conclusions. Conclude that you’re a Boltzmann brain and you must also deny your capacity to reach that conclusion.

This is a cute philosophical point. However, I think the real interest of the idea of Boltzmann brains is as a lesson in caution. Caution in arguing probabilities before really understanding the prior assumptions. In this case, there’s no evidence that the Big Bang arose from a random fluctuation. Pondering the cause of the extremely low-entropy Big Bang is probably more fruitful than wondering if you existed a second ago. There’s a similar idea for which we should also exercise caution – the idea that we live in a simulation. It’s been argued that simulated minds should be vastly more abundant than real minds, ergo, we’re pokemons. We’re actually going to jump into that rabbit hole next time, and it’ll be with a very special guest. That’s assuming we don’t blink out of existence in the next instant as momentary fluctuations in the infinite chaos of a maximally-entropic spacetime.
