
I have good news and bad news. Bad news first: two years ago we reported on the crisis in cosmology. Since then, it’s only gotten worse. The good news? The crisis in cosmology has gotten worse, which means we may be onto something!

The most exciting thing for any scientist is when something they thought they knew turns out to be wrong. So it’s no wonder that many cosmologists are starting to get excited by what has become known as the Hubble tension, or the crisis in cosmology. The “crisis” is the fact that we have two extremely careful, increasingly precise measurements of how fast the universe is expanding which should agree with each other, and yet they don’t. We first reported on the growing hints of this tension 2 years ago. Back then, the most likely explanation was that new, refined measurements would bring the numbers into agreement. So far that’s not been the case. But just recently, ONE of these methods received a massive refinement due to the Gaia mission and its unprecedented survey of a billion stars in the Milky Way. And guess what - the tension is now even tenser. So is it time to rethink all of cosmology?

Before we can even think about that, we should probably do a refresher on what the issue actually is. So you may have heard that the universe is expanding. Space on the largest scales is stretching, throwing galaxies apart from each other. We’ve talked about how astronomers first figured this out. Long story short - when a distant galaxy’s light travels to us through the expanding universe it gets stretched out - its wavelength increases. If we also know how far that light traveled - the distance to the galaxy - then we can figure out the rate at which space is expanding - at least along the path to that galaxy. Combine the redshifts and distances of many, many galaxies and you have the expansion rate of the universe, typically expressed as the Hubble constant - after Edwin Hubble, the guy who first properly measured this back in 1929.
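To make that concrete, here’s a minimal sketch of the arithmetic in Python. The galaxy’s redshift and distance are made-up illustrative numbers, and the v = cz approximation only holds for relatively nearby galaxies.

```python
# Toy illustration: one galaxy's redshift and distance give a local
# estimate of the expansion rate. Numbers are invented for illustration.

C_KM_S = 299_792.458  # speed of light in km/s

def hubble_constant(redshift, distance_mpc):
    """Estimate H0 in km/s/Mpc, using v = c*z (valid only for small z)."""
    velocity_km_s = C_KM_S * redshift
    return velocity_km_s / distance_mpc

# A hypothetical galaxy at 100 Mpc with redshift z = 0.023:
print(hubble_constant(0.023, 100.0))  # ~69 km/s/Mpc
```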

By comparison, getting the distances is much, much trickier than getting the redshifts. It depends on a long chain of distance measurements that we call the cosmic distance ladder. First you measure distances to objects in the solar system - then use those to measure distances to nearby stars, then more distant stars, then nearby galaxies, then distant galaxies, etc. If one of those distance measures is wrong, all the subsequent rungs of the distance ladder are off.

Hubble’s distance measurements were based on a method pioneered by Henrietta Swan Leavitt. She developed one of the first so-called standard candles. These are objects whose true brightness, or luminosity, can be known. Knowing the true luminosity of an object means you can figure out its distance just by observing how its brightness has been dimmed by that distance. Swan Leavitt realized that a type of pulsating star known as a Cepheid variable has a rate of pulsation that depends on its luminosity. Measure the pulsation rate and you know its true brightness, and so can find its distance. And if the Cepheid is in another galaxy, you have the distance to that galaxy as well.
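As a rough sketch of how that works in practice: the Leavitt law converts a pulsation period into an absolute magnitude, and the distance modulus converts that into a distance. The coefficients below are ballpark illustrative values, not a real calibration.

```python
import math

def cepheid_absolute_magnitude(period_days):
    # Leavitt law, M = a*(log10(P) - 1) + b, with illustrative
    # coefficients roughly in the range of published V-band fits.
    return -2.43 * (math.log10(period_days) - 1.0) - 4.05

def distance_parsecs(apparent_mag, absolute_mag):
    # Distance modulus: m - M = 5*log10(d / 10 pc), inverted for d.
    return 10.0 ** ((apparent_mag - absolute_mag) / 5.0 + 1.0)

# A hypothetical 10-day Cepheid observed at apparent magnitude 20:
M = cepheid_absolute_magnitude(10.0)   # about -4
print(distance_parsecs(20.0, M))       # ~650,000 pc, i.e. ~0.65 Mpc
```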

Cepheid variables are great standard candles, but they’re just stars, and are too faint to see beyond a certain distance. In the 1990s, two teams of astronomers employed a new type of standard candle - the incredibly bright “type 1a” supernovae that result when a white dwarf star explodes after cannibalizing its binary partner. Using these supernovae to get distances to galaxies halfway across the universe, the teams found something totally unexpected - not only is the universe expanding, but that expansion is accelerating. And so dark energy was discovered - a mysterious and ubiquitous energy that grows as the universe grows, speeding up its expansion.

Dark energy very likely holds deep, deep clues about the fundamentals of reality. With its discovery it suddenly became VERY important to perfect our measurements of the expansion rate - both to confirm dark energy’s existence and to learn about its nature. And this is where our story splits. There are, broadly, two approaches to improving that measurement. One is to double down on the old method - find more type 1a supernovae and improve those distance measures. The other is to find a totally independent measurement of the expansion rate. A good reason to do the latter is that the supernova method is a pretty high rung on the cosmic distance ladder - which means if any rung below it is broken, the method fails.

So different teams of astronomers pursued both approaches - and this is where the crisis emerged. One alternative method for getting the expansion rate is to study the oldest light in the universe - the cosmic microwave background. This light was released only a few hundred thousand years after the big bang, and carries with it vast information about the universe’s early state. I’ll leave you to watch our previous video on the subject to see how our map of the CMB using the Planck satellite can give us the expansion rate. The Planck team calculated a Hubble constant of 67.6 km/s/Mpc - let’s not worry about the weird units right now. They also claim an uncertainty of about half a percent, making it the most precise measurement of the expansion rate ever made.
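Incidentally, those “weird units” are just an inverse time in disguise - divide out the kilometres and megaparsecs and 1/H0 gives the Hubble time, a rough age scale for the universe. A quick sanity check:

```python
# Convert H0 from km/s/Mpc to 1/s, then invert to get the Hubble time.

KM_PER_MPC = 3.0857e19        # kilometres in one megaparsec
SECONDS_PER_GYR = 3.156e16    # seconds in a billion years

def hubble_time_gyr(h0_km_s_mpc):
    h0_per_s = h0_km_s_mpc / KM_PER_MPC
    return 1.0 / h0_per_s / SECONDS_PER_GYR

print(hubble_time_gyr(67.6))  # ~14.5 billion years for the Planck value
```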

Meanwhile, Adam Riess, one of the Nobel-winning discoverers of dark energy, has doubled down on the supernova method. A couple of years ago his team published a new Hubble constant of 73.5 +/- 1.5 km/s/Mpc.

That’s in the same ballpark, but far enough off to raise many, many eyebrows.
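How many eyebrows, exactly? A crude gauge is the gap between the two values in units of their combined uncertainty, treating the errors as independent and Gaussian - a simplification, but it conveys the scale:

```python
# Rough tension estimate: difference divided by combined uncertainty.
# Planck's ~0.5% error on 67.6 works out to about 0.34 km/s/Mpc.

def tension_sigma(a, sigma_a, b, sigma_b):
    return abs(a - b) / (sigma_a**2 + sigma_b**2) ** 0.5

print(tension_sigma(67.6, 0.34, 73.5, 1.5))  # ~3.8 sigma
```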

One possible explanation for the difference is that the nature of dark energy has changed over time. The Planck team’s Hubble constant assumes that dark energy has had a constant density for the entire age of the universe.

That’s what you expect for the simplest models of what dark energy might be. But if dark energy HAS changed over time, it could explain the discrepancy AND indicate that dark energy is even weirder than we thought. It’s hard to overstate how huge a discovery that would be.

You can see how it might be nice to find out one way or the other whether the difference between the Planck and supernova results is real. Most people still think that there are unknown errors affecting one or both measurements. For example, the cosmic distance ladder could have a broken rung. The supernova standard candles are calibrated based on distances from our good ol’ Cepheid variables, in galaxies where both are observed. But those distant Cepheids are in turn calibrated based on Cepheids in our own galaxy, for which we can get distances by a method that’s much more reliable. That method is stellar parallax - and it’s about as direct a method as you can get, short of building a giant space ruler. Ultimately, refining the supernova distance measurements comes down to refining parallax measurements - and that’s what we’ve finally achieved.

You’re already familiar with parallax. Hold a finger in front of your eyes and alternately close one eye and then the other. Your finger appears to jump relative to the background - which I guess is me, in this case. Move your finger farther away and the displacement shrinks; bring it closer and it grows.

We can use this same trick to measure the distance to stars. As the earth orbits the sun over the course of the year, nearby stars appear to move relative to more distant stars. That’s stellar parallax, and our quest to measure it has been central to understanding our universe for hundreds of years. Prior to the invention of the telescope, the fact that we didn’t see obvious stellar parallax was taken as evidence that the Earth is NOT orbiting the Sun. It turned out that the stars are just so far away that you need careful observations with quite a good telescope to see parallax in even the nearest stars.
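The payoff of parallax is that the geometry is trivial: with the Earth-Sun distance as the baseline, a star’s distance in parsecs is simply the reciprocal of its parallax angle in arcseconds. For instance:

```python
# Parallax to distance: d (parsecs) = 1 / p (arcseconds).

def distance_pc(parallax_arcsec):
    return 1.0 / parallax_arcsec

print(distance_pc(0.768))   # Proxima Centauri, ~1.3 pc
print(distance_pc(0.001))   # a 1 milliarcsecond parallax -> 1000 pc
```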

And so it was that in 1912 Henrietta Swan Leavitt used parallax measurements of Cepheids in the Milky Way to turn these stars into standard candles and so founded our distance ladder, which ultimately led to the discovery of dark energy. But this feels like a bit of a house of cards - the ladder was entirely dependent on the relatively few Cepheids that are close enough for parallax measurements. Things started to get better when we put telescopes in space - above the blurring effect of Earth’s atmosphere it’s possible to make better position measurements. The Hubble Space Telescope has done great work here, and so has the European Space Agency’s HIPPARCOS satellite, which tracked the motion of 100,000 stars in our local patch of the galaxy.

But to really nail down the lowest rung of the distance ladder, we need a lot more Cepheid parallaxes out to much greater distances. And that’s what ESA’s Gaia mission has given us. Parked in an orbit just past the moon, Gaia scans the sky year after year, mapping the structure and motion of a good fraction of the Milky Way galaxy. Gaia is building the most accurate catalogue yet of parallax measurements - for the nearest, brightest stars, it’s 200 times more accurate than any previous measurement.

Gaia has allowed us to recalibrate Cepheid variables as standard candles, which in turn enabled a recalibration of type 1a supernovae - which in turn gave Adam Riess and team a refined measure of the Hubble constant. So what do you think - do the supernova and Planck results agree? Not in the least. The Gaia-based Hubble constant of 73.2 km/s/Mpc seems to confirm the previous type 1a supernova result, now with more confidence in the distance ladder it’s based on.

Before we start jumping up and down and yelling about new physics, remember that we’re level-headed scientists. Two independent methods aren’t enough. We need more - and we have some great options that should either break the tie between the Planck and supernova teams, or confirm that the difference is real.

We’ve already talked about one of these options. It’s to look for vast ring-like patterns in the way galaxies are scattered across the universe and use those rings as a sort of standard ruler. These “baryon acoustic oscillations” are the fossils of ancient sound waves that reverberated through the hot, dense plasma of the early universe. Those ripples are now frozen into the distribution of galaxies that formed from that matter. The baryon acoustic oscillations seem to be coming in on the side of the Planck result - a Hubble constant in the high 60s.
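A standard ruler is the mirror image of a standard candle: if you know a feature’s true size, its apparent angular size gives its distance. Here’s a minimal sketch with illustrative numbers - the BAO scale is roughly 150 Mpc in the nearby universe, but the angle below is invented:

```python
import math

# Standard ruler: distance = true size / angular size (small angles).

def angular_diameter_distance(true_size_mpc, angle_rad):
    return true_size_mpc / angle_rad

theta = math.radians(4.0)   # a hypothetical ring spanning ~4 degrees
print(angular_diameter_distance(150.0, theta))  # ~2150 Mpc
```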

Another extremely promising method is gravitational lensing - the bending of light around massive objects due to their warping of spacetime. One manifestation of this is when a distant quasar - a giant, gas-guzzling black hole - happens to be closely aligned behind a nearer galaxy. Then that quasar’s light travels multiple paths through this gravitational lens, resulting in multiple images of the quasar from our point of view. Quasars are violent beasts - the maelstrom of gas fluctuates in brightness as it spirals into the black hole. And so we see the lensed quasar images flicker - but they flicker out of sync. There’s a time offset due to the fact that these different paths through the universe have slightly different lengths. By measuring the time lags in these flickering lenses, we can get a measurement of cosmic distances, and with that a measurement of the expansion rate that’s independent of the cosmic distance ladder. So far we’ve only done this with a small number of lenses, and so the uncertainty is large - but the published results give a Hubble constant in the low 70s, in agreement with the supernova camp. This game is about to take off, though, with upcoming giant surveys set to discover thousands of new lenses that should massively improve this measurement.
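The core of the trick is simple: light paths of different lengths arrive out of sync by the path difference divided by c. The real analyses work backwards from measured delays to a combination of cosmic distances, but the toy version looks like this - the path difference here is invented:

```python
# Toy time-delay calculation: two lensed images whose light paths differ
# in length arrive out of sync by (path difference) / c.

C_KM_S = 299_792.458
KM_PER_LIGHT_DAY = C_KM_S * 86_400.0   # km light travels in one day

def time_delay_days(path_difference_km):
    return path_difference_km / KM_PER_LIGHT_DAY

# A path difference of ~2.6e12 km (under a tenth of a parsec):
print(time_delay_days(2.6e12))   # ~100 days
```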

Before too long we may even be able to use gravitational waves from merging black holes to measure the Hubble constant. These waves get stretched by the expanding universe, just like light does. But unlike light, they also encode information about the distance they’ve traveled, and so can be used to measure the expansion rate without the cosmic distance ladder. We’re calling these black hole mergers ‘standard sirens’, and while the error bars they give are still large, they’ll only get smaller over time.
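At low redshift the standard-siren arithmetic reduces to the same Hubble law as before - the gravitational waves supply the distance, and an electromagnetic counterpart supplies the redshift. Here’s a sketch using round numbers in the ballpark of the GW170817 merger, not the exact published analysis:

```python
# Standard siren sketch: the GW signal's amplitude gives a luminosity
# distance; pair it with a redshift and, at low z, H0 = c*z / d.

C_KM_S = 299_792.458

def siren_h0(redshift, distance_mpc):
    return C_KM_S * redshift / distance_mpc

# Round illustrative numbers roughly matching GW170817:
print(siren_h0(0.0098, 42.0))   # ~70 km/s/Mpc, with large error bars
```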

So that’s where the crisis stands - it’s increasingly clear that there’s a hole in our understanding of the universe - whether it’s a crack in the rung of the distance ladder or something more fundamental about how the universe expands. Scientists love being wrong - because when you find the source of that wrongness, it can only lead to greater understanding - in this case, of the strange forces driving our ever-expanding spacetime.
