
Content

[This is a transcript with links to references.]

Life needs order. This isn’t just what exhausted parents say, it’s a property of nature. Life requires structure. The human body for example isn’t just a bag of mixed atoms – the atoms are ordered, they’re in very specific places. Like, erm, organs and stuff. Look, what do I know, I’m a physicist, not a physician.

Physics doesn’t tell you much about life, but it tells you that entropy cannot decrease. And as entropy increases, order gets destroyed. This is why things break, why we age, and why the universe will eventually become a big, disordered bag of mixed particles. At some point, life in the universe will become impossible.

That’s kind of depressing. But is it right? What does physics really tell us about entropy increase? What is entropy to begin with? And how will everything end? That’s what we’ll talk about today.

Imagine you’re driving through the rain. Drops fall onto your window making little splashes. The street gets wet, you drive through puddles and realize that the wiper on your car really needs replacing. It sounds normal enough. But why do we never see water collect on the street to form drops which then fly into the sky and make clouds? Why do parts of your car break on their own but never repair themselves?

Physicists call it the “arrow of time” and it’s the reason why we get older but never younger. Its origin has remained somewhat of a mystery. That’s because according to the equations that we use to describe nature, it’s totally possible for people to get younger, or for water drops to flow up into the sky. The equations we have found work both forward and backward in time. We say they are “time reversible”. We can totally write down an equation that makes water flow into the sky. In reality though, this doesn’t happen. So, what gives?

The reason is that to explain what goes on in nature you don’t just need an equation that tells you how things change. You also need what’s called an initial state. You need to know what changes. This initial state is the entire configuration of the system at one moment in time, for example the positions and momenta of all particles. You take this initial state, and then you calculate what it does. Physics basically comes down to the unforgettable TikTok “I had an initial state and guess what happened next.”

The thing is that there are a lot of initial states that will make water drop down from the sky, but very few that will make it flow back up. To get this done, you’d have to very precisely arrange all the molecules in the ground to spit the water back out, and the molecules in the air to create just the right drift for the drops to fly back up. It’s so incredibly unlikely that it never happens. Physicists quantify this probability with entropy.

Entropy is a measure of how likely a system is to be in a certain configuration. Suppose you have a box with air inside. The air is made of molecules that move around. It’s very unlikely that they all sit on the right side of the box. If that were so, the entropy would be small. It’s very likely that they are almost evenly distributed in the box. The entropy in this case would be large.
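
To get a feeling for how unlikely “all on one side” is, here is a minimal sketch, a toy model of my own rather than anything from the video: if each of N molecules independently sits in the left or right half of the box with probability 1/2, the chance that all of them end up on the same side is 2 × (1/2)^N.

```python
from fractions import Fraction

# Toy model: each of N molecules independently sits in the left or
# right half of the box with probability 1/2.
def prob_all_on_one_side(n_molecules: int) -> Fraction:
    """Probability that every molecule sits on the same (either) side."""
    return 2 * Fraction(1, 2) ** n_molecules

for n in (10, 100, 1000):
    print(f"N = {n:4d}: probability ~ {float(prob_all_on_one_side(n)):.2e}")
# N = 10 gives ~2e-03, N = 1000 gives ~2e-301. For the roughly 10^23
# molecules in a real box, the probability is for all practical
# purposes zero.
```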

So what’s going to happen if you put air in only one side of a box? Well, it’s going to spread out. Because that’s the likely thing to happen. And once the air is in a likely state, it’s most likely to stay there. This means the distribution is going to remain approximately even. And luckily so because it’d be inconvenient if you entered a room and all the air went into a corner.
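
A minimal sketch of this relaxation, using the classic Ehrenfest urn model (my choice of illustration, not something from the video): at each step, one randomly chosen molecule hops to the other half of the box. Starting with all molecules on one side, the count drifts toward an even split and then just fluctuates around it.

```python
import random

def ehrenfest(n_molecules: int = 1000, n_steps: int = 5000, seed: int = 0):
    """Ehrenfest urn model: one randomly chosen molecule hops sides per step."""
    rng = random.Random(seed)
    left = n_molecules  # start with all molecules in the left half
    history = []
    for _ in range(n_steps):
        # A molecule picked uniformly at random changes sides; the
        # chance of picking a left-side molecule is left / n_molecules.
        if rng.random() < left / n_molecules:
            left -= 1
        else:
            left += 1
        history.append(left)
    return history

history = ehrenfest()
for step in (0, 500, 1000, 2000, 4999):
    print(f"step {step:4d}: {history[step]} of 1000 molecules on the left")
# The count relaxes toward ~500 and then jitters around it.
# That is equilibrium.
```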

We call such approximately even distributions an “equilibrium”. It’s not that the air molecules stopped moving when they’re in equilibrium, they’re still zipping around. But on the *average the distribution stays the same for a very long time.

Physicists talk about entropy not just to confuse people on YouTube, but because it has practical purposes. If you have a system with small entropy you can do “work” with it. In physics “work” means useful energy. You can do something with it. For example, when the air is on one side of the box, you can put a divider in the middle and the air pressure will move it. You could convert this motion to electric energy and then do something else with it. Charge your phone, power a light, drill a hole into the wall and ruin your neighbour’s podcast recording. Every time you create structure, you do it by drawing on a reservoir of low entropy.
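
To put a number on the divider example, here is the standard textbook result (my addition, not stated in the transcript): an ideal gas of N molecules at temperature T, expanding from volume V₁ to V₂ while pushing on the divider, can deliver at most

```latex
W_{\max} = N k_B T \ln\frac{V_2}{V_1}
```

of work. If you instead just let the gas spread out freely, its entropy increases by the corresponding amount, \(\Delta S = N k_B \ln(V_2/V_1)\), and that work is gone for good.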

When a system has reached equilibrium and the average remains the same, there’s still *energy in it, because the molecules are still moving around, but it’s not *useful energy. You can’t do work with it anymore. Physicists call this useless energy “heat”.

Formally, entropy measures the number of microstates per macrostate. A microstate is the exact state of the system. For the molecules in the box, that could be the positions and momenta of all the particles. A macrostate, on the other hand, is a state that we, as macroscopic objects, are interested in. For our box, that could be how many molecules are on the left and how many are on the right side of the box. A macrostate is an average over many microstates.

If you want to know the entropy you then ask how many possible ways there are for the air molecules to zip around so that there are approximately the same number on each side. Answer is, there are a lot of those, so that’s likely and the entropy is large. If they’re all on one side, there are far fewer ways to get this done, so that’s unlikely and the entropy is small.
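
To put rough numbers on this counting, here is a back-of-the-envelope sketch, assuming the simple left/right macrostate from the video: the number of microstates compatible with the macrostate “k molecules on the left” is the binomial coefficient C(N, k), which is enormous near the even split and tiny at the extremes.

```python
from math import comb

N = 100  # total molecules in the box

# Number of microstates compatible with the macrostate
# "k molecules on the left, N - k on the right".
for k in (0, 10, 50):
    omega = comb(N, k)
    print(f"k = {k:3d}: {float(omega):.2e} microstates")

# k = 0  (all on the right): exactly 1 microstate.
# k = 10:                    about 1.7e13 microstates.
# k = 50 (even split):       about 1.0e29 microstates.
# The even split is overwhelmingly more likely, hence higher entropy.
```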

Technically the entropy is the logarithm of the number of microstates, multiplied by a constant called Boltzmann’s constant. If you remember one thing about the logarithm it’s probably that the logarithm of 1 is zero. This means the entropy is zero if and only if there is exactly one microstate for your macrostate.
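
Written as a formula, in standard textbook notation with \(\Omega\) the number of microstates compatible with the macrostate:

```latex
S = k_B \ln \Omega , \qquad
\Omega = 1 \;\Longrightarrow\; S = k_B \ln 1 = 0 .
```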

Ok, so why does entropy increase? Entropy increases because that’s the likely thing to happen. And that removes *part of the mystery about the arrow of time. Because it explains why we observe some things and not others, even though the laws of nature work both forward and backward in time. Because one way is more likely than the other. An egg that falls to the ground breaks. It doesn’t jump up and unbreak. Alright, we sorted it out.

But this brings up another question. If low entropy states, like eggs, are unlikely, why do we have them to begin with? Why isn’t the universe in equilibrium already when that’s most likely?

We don’t know. We *do know that the universe must have started in a state of low entropy, a very unlikely state, but we have no idea why. It’s often called the “Past Hypothesis”, a term introduced by the philosopher David Albert. It just posits that this is how it was, but doesn’t explain why.

The entropy of our universe has been increasing ever since the big bang because that’s what’s likely to happen. But this is the *total entropy. In some parts of the universe, entropy can, and does, remain small for a very long time, because it doesn’t have to be evenly distributed. Indeed, we can shovel around entropy ourselves.

If you have a reservoir of low entropy, you can use that to lower the entropy elsewhere. And luckily we have a big low entropy reservoir nearby, though it’s more commonly called the sun. The sun started out at low entropy. As it fuses atomic nuclei, its entropy increases. Again, because that’s the likely thing to happen. As the entropy of the sun increases, it sends energy with low entropy to us in the form of sunlight. We can use that to create electricity and lower the entropy of something else. We can use it to run a fridge and keep the eggs cool. Or power a light and find out what’s under the couch. Or maybe better not.

It's a similar story with fossil fuels. They’re low in entropy. If we burn them, we increase the entropy of those fuels, or their remains, and we can use that to decrease the entropy of something else. For example, we can manufacture goods and products. These are all low entropy objects with specific structures that we have created.

We can even use low entropy to live longer, to reverse some of the damage that the human body accumulates over time. So far we’ve only figured out how to repair some damaged parts. Maybe in the future we’ll even figure out a way to stop or reverse cell aging, and slow down entropy increase some more.

But eventually this will stop working because we will run out of low entropy reservoirs. Our sun will run out of fuel. Earth’s core will go cold. Other stars will run out of fuel. And eventually, 10 to the 100 years or so from now, the universe will come into thermal equilibrium, entropy will be as large as it can get, and nothing will happen, on the average.

Physicists call this “heat death”. The word is somewhat misleading because it doesn’t mean it’ll be hot. To the contrary, it’ll be very cold and very dark. “Heat” refers to the useless energy that I mentioned previously. A better name would be “high entropy death”.

What does entropy have to do with order? This has always confused me. I now think the brief answer is “nothing”. Because “order” doesn’t have a well-defined meaning. For example, imagine you are pouring milk into tea. At the beginning they are neatly separated, ordered, you could say. At the end they are evenly mixed. I would also call this ordered. But in physics this even distribution doesn’t count as order.

This is why I don’t find this reference to order useful: what looks ordered to us is due to human perception and expectation, rather than a property of nature.

This reference to order causes a lot of confusion. For example, as we saw earlier, the entropy in the early universe must have been small. Back then, matter was very evenly distributed with small fluctuations in the density which then grew to galaxies. At the end of the universe, matter will again be evenly distributed with small fluctuations in density. Both cases seem equally ordered, if you wish. So how can it be that the one has low entropy and the other high?

It’s because in the early universe the density of the matter is high, and this means the gravitational force is strong. And the gravitational force is attractive, so it wants to draw the stuff together. An even distribution of matter when gravity is strong is incredibly unlikely. It’s unstable. It wants to clump. This even distribution therefore had low entropy. At the end of the universe, however, matter will be very thinly distributed, and gravity will be weak, so it doesn’t want to clump again. This is a very likely situation. So the entropy is high.

And what about information? Ah, yes, the relation between entropy and information is subtle. It comes from the definition of macrostates and microstates. Remember that to calculate the entropy you average over microstates. This means you throw away information. In a macrostate, there’s something you don’t know. So high entropy means low information, and low entropy high information.
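
One way to make this quantitative is a small sketch of my own, using Shannon’s formula rather than anything stated in the video: if pᵢ is the probability you assign to microstate i given what you know, the entropy −Σ pᵢ ln pᵢ is zero when you know the exact microstate (one pᵢ equals 1) and maximal when all microstates are equally likely.

```python
from math import log

def shannon_entropy(probs):
    """Entropy -sum(p * ln p) of a probability distribution, in nats."""
    return -sum(p * log(p) for p in probs if p > 0)

# You know the exact microstate: one outcome has probability 1.
print(shannon_entropy([1.0, 0.0, 0.0, 0.0]))      # 0.0      -> full information

# You know nothing: all four microstates equally likely.
print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))  # ln 4 ~ 1.386 -> no information
```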

All that I told you so far is pretty standard textbook stuff and if you picked a random physicist on the street and asked them if they agree, you should let them go because they have better things to do. Now I want to tell you why I don’t think heat death is how life will end.

Remember that entropy counts the number of microstates per macrostate. But what is a macrostate? A macrostate is a state that we as humans are interested in. It’s something we chose to calculate a quantity that is useful for our purposes. It’s an average description that we’re interested in, maybe because we want to understand how efficiently a fridge will work, or how much power an engine will have.

Those are good reasons, but a macrostate has no fundamental significance, and it makes no sense to use it to talk about the fate of the universe.

See, a system is always only in one microstate. That is true for the molecules in the box and also for the entire universe. They are in one particular configuration, one particular state. The probability of this microstate is 1, and that of all other states zero. And as the state changes in time, this will remain so. With one microstate per macrostate, the entropy is always zero and stays zero. The second law of thermodynamics doesn’t say that entropy increases. It says it doesn’t decrease. But it can full well remain constant.

The reason we get more complicated probabilities, and the reason we say entropy increases, is because *we lack information about some systems. If we put air into one corner of the box, we don’t know where the molecules are going, but at least we know where they are. If we now let the air spread into the entire box, it’s still the same microstate. It’s still a microstate that one minute ago was in the corner of the box. But we can’t tell. We used to have information about the state. We no longer do. That’s why entropy increases. Because we lose access to information.

But there are always macrostates that will turn a high entropy system into a low entropy system. For the molecules in the box for example you could put in a divider like this. It’s just that this would require information that we cannot easily access. We can access it, but for that we’d need another low entropy reservoir, so nothing would be gained.

The relevant point is now that our notion of entropy is based on the physical properties of macroscopic devices that we humans have easy access to. It isn’t a fundamental property of nature.

I believe that as the universe gets older and entropy increases according to us, new complex systems will emerge that rely on different macrostates, macrostates that we ourselves could never use. And for those complex systems, call them living beings, the entropy will be small again. So life will go on, but in a form very different from us.

And what about quantum mechanics, I hear you ask. Quantum mechanics doesn’t change anything about this. In quantum mechanics, the universe is still in exactly one microstate. It’s just that now it’s one big wave-function.

I sometimes hear people say that the Heisenberg uncertainty principle implies there are quantities that you cannot measure at the same time, for example position and momentum. But this is a misunderstanding. You can full well measure both. It’s just that if you can predict the outcome of one of those measurements very well, you can say very little about the outcome of the other measurement. I talked about this in more detail in an earlier video.
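
For reference, the relation in its standard form, where \(\Delta x\) and \(\Delta p\) are the spreads (standard deviations) of the position and momentum measurement outcomes:

```latex
\Delta x \, \Delta p \;\geq\; \frac{\hbar}{2}
```

It bounds how sharply you can predict both outcomes, not whether you can measure them.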

So if, in 10 to the 100 years, you arrive at the end of the universe and they’ve put up a sign saying “sorry, we’re closed”, tell them Sabine sent you.

This was the most optimistic and uplifting video I’ve ever done and can ever do. It can only go downhill from here.

Files

Will entropy increase kill the universe?

Learn more about differential equations (and many other topics in maths and science) on Brilliant using the link https://brilliant.org/sabine. You can get started for free, and the first 200 will get 20% off the annual premium subscription.

Life needs order, but entropy increase will inevitably destroy order and eventually make life in the universe impossible. Or will it? In this video I'll explain why I don't believe this argument.

💌 Support us on Donatebox ➜ https://donorbox.org/swtg
👉 Transcript with links to references on Patreon ➜ https://www.patreon.com/Sabine
📩 Sign up for my weekly science newsletter. It's free! ➜ https://sabinehossenfelder.com/newsletter/
🔗 Join this channel to get access to perks ➜ https://www.youtube.com/channel/UC1yNl2E66ZzKApQdRuTQ4tw/join
🖼️ On instagram ➜ https://www.instagram.com/sciencewtg/

00:00 Introduction
1:00 The Arrow of Time
3:04 Entropy, Work, and Heat
7:07 The Past Hypothesis and Heat Death
9:34 Entropy, Order, and Information
11:38 How Will the Universe End?
15:46 Brilliant Sponsorship

Comments

Anonymous

It seems to me that some classic decision-making problems -- path dependency, joint decision-traps, obstacles to reconsidering decisions, resistance to correcting mistakes, overinvestment in past decisions, opportunity costs of decision versus further research -- are partly due to entropy and the arrow of time. But I have a hard time formulating exactly how. Could you help me on that and clarify the relation?

Anonymous

Being optimistic is the best to do, thank you for the nice gift

Anonymous

I'm confused, this paradigm is a fair chunk to digest. 😵 Btw, my cat wanted me so she called out to me; when I again asked 'Wo sind meine Katze?' ("Where is my cat?"), she called out again so I could find her. My German speaking is being put to some use at least.

Anonymous

Thank you Sabine for a very interesting video. I've been trying to get a better feeling for entropy, often described as a measure of randomness, but as some have said, one person's randomness is another person's information. Clearly, the logarithm of the number of microstates is important in some contexts, but your statement that the universe is in one microstate seems a fundamental truth.

Anonymous

>>“So how can it be that the one has low entropy and the other high? It’s because in the early universe the density of the matter is high, and this means the gravitational force is strong. And the gravitational force is attractive, so it wants to draw the stuff together. An even distribution of matter when gravity is strong is incredibly unlikely. It’s unstable. It wants to clump. This even distribution therefore had low entropy. At the end of the universe, however, matter will be very thinly distributed, and gravity will be weak, so it doesn’t want to clump again. This is a very likely situation. So the entropy is high.”<< So you say, with strong gravity, an even distribution has a low entropy. And later, with a weak gravity, an even and thin distribution means a high entropy. – Is this really logical? The understanding is much simpler if we do not follow Einstein. Because then, in a constant and extended space from the beginning, matter fills initially only a small portion of the space, and this means a low entropy like in your earlier example about molecules. And when later matter fills more and more of the space, entropy increases of course permanently. Isn’t this another example that Einstein's understanding of space / space-time is making things so hard to understand?

Anonymous

Colleen, congratulations, your german is better every week, your cat must be a very good teacher (and you a good learner of course)

Anonymous

Sabine, thank you for writing on this; your comments about entropy being anthropocentric help me understand why the concept was so confusing to me as an undergrad. You write "The sun started out at low entropy... At the end of the universe, however, matter will be very thinly distributed, and gravity will be weak, so it doesn’t want to clump again." Is this because the universe is expanding, and so the same amount of matter is filling a much larger space? I think the sun formed when the universe was already pretty big and matter was already pretty spread out. Where is the line between "strong gravity" and "weak gravity"? How big does the universe need to be, for bits of matter to be so far apart that they no longer feel each others' gravitational pull strongly enough to bother moving together?

Tanj

If entropy controls the direction of time, then we would see the sun 8 minutes in the future as well as 8 minutes in the past, because the relative entropy is the same. But in practice we have never seen even one photon from the future. The arrow of time is built into the source-destination flow at the level of single photons, not statistics of a swarm. If anything, the flow of photons gives direction to the flow of entropy, since entropy can only increase through the flow of energy.

Anonymous

Hah. I thought she was doing what *I* wanted her to, when I told her what I wanted of her.

Anonymous

So as I understood, the low entropy in the early universe with its homogeneity in matter distribution is the consequence of strong gravity, that's in GR just the curvature of space-time, that's caused by the matter. Sounds like circular reasoning. But there's another ingredient, the expansion, lambda, cosmological constant, or however you call it. The forming of structures of low entropy was guided by the interaction of both. Not sure, if my description lacks.

Anonymous

Thank you for your reply. I envy you cause of your language-gifted cat. Have a nice week.

Anonymous

Was the distribution of matter homogeneous in the early universe? Is the Big Bang process sufficiently understood to say so? And wasn't the curvature of space infinite at the moment of the Big Bang? And how can a space with infinite curvature develop? It all sounds pretty mysterious to me. But the formation of structures was clearly not caused by anything other than gravity. Regardless of the space formation. At least after the process of nucleogenesis. And how we understand the expansion described by lambda depends on our general understanding of spacetime. If we follow the thinking of Hendrik Lorentz and not Albert Einstein, space does not expand. Only the matter created or released in the Big Bang moves away like in a normal explosion. This is a simple but well-functioning understanding of cosmogenesis.

Anonymous

Science does not know, and predictively can never know, about the early beginning. Most astrophysicists don't trust in a singularity at the beginning and invented inflation to get an isotropic and homogeneous distribution.

Anonymous

I also believe that most astrophysicists do not trust a singularity in the beginning and later. And that's my opinion too: Nature has no infinities. - And here I sharply criticize Einstein's GRT, which has singularities and therefore also infinities. Lorentz' theory of relativity, on the other hand, does NOT have this.

Anonymous

First, very good explanation of entropy. Thermodynamics was always difficult. Second, isn't the "reversible arrow of time" thing just a "lost in math" thing? Gravity, space-time, in the real world provides nothing for masses to spontaneously move away from one another while the math might; A broken cup smashing on the floor spontaneously reversing direction to be reformed is not only highly improbable but impossible because that's simply not how space-time works with masses.

Anonymous

Dr Hossenfelder, your comment on macrostates struck me as novel and hugely interesting. Would you please consider elaborating on it in a future video? "I believe that as the universe gets older and entropy increases according to us, new complex systems will emerge that rely on different macrostates, macrostates that we ourselves could never use. And for those complex systems, call them living beings, the entropy will be small again. So life will go on, but in a form very different from us."