The search for a single number – the Hubble constant, the rate of expansion of our universe – has consumed astronomers for generations. Finally, two powerful and independent methods have refined its measurement to unprecedented precision. The only problem is that they don’t agree, and that calls into question some of our most basic assumptions about the universe.

In the 1920s, Edwin Hubble discovered the universe. He gave us our first incontrovertible proof that there are galaxies outside the Milky Way by measuring the distances to the spiral nebulae. They were many millions of light years from us – far outside the Milky Way – and so must be galaxies in their own right. Combined with the Doppler-shift velocity measurements of Vesto Slipher, Hubble revealed that the galaxies are not only receding from us, but receding at a rate proportional to their distance.

An impossibly vast universe had been discovered beyond the Milky Way, and at the same time that universe was revealed to be expanding. The galaxies appear to be racing away from us because the intervening space is expanding. We encapsulate the expansion of the universe with a single number called the Hubble constant – H-naught. It tells us how fast the galaxies appear to be retreating from us as a function of their distance. But more fundamentally, H0 tells us the rate of expansion of the universe in the modern era. Ever since Hubble’s great discovery, the search for H-naught has been the all-consuming obsession of thousands of astronomers across the generations.

Understandably so: the rate of expansion of the universe, combined with the gravitational effect of the matter and energy it contains, can be used to determine its entire expansion history, from the Big Bang to its final fate. And it’s fundamental for interpreting our observations of the distant universe, whose light has traveled billions of years through this expanding cosmos. So you can imagine the alarm when the two most powerful methods used to define this fundamental parameter gave different results.

But before we get to that, let’s talk about the great quest to measure the Hubble constant. Until the new millennium, the best we could do was to estimate H0 to within a factor of two – somewhere between 50 and 100 km/s/Mpc. These strange units warrant some explanation. Kilometers per second is for the recession speed of a given galaxy; megaparsecs is for its distance, with 1 Mpc being around 3.26 million light years. If the Hubble constant were 75 km/s/Mpc, then for every 1 Mpc of distance we expect a galaxy to be retreating from us at an additional 75 km/s.
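Hubble’s law is simple enough to put in a few lines. Here’s a minimal sketch using the illustrative 75 km/s/Mpc figure from above (not a measured value):

```python
# Hubble's law: recession velocity is proportional to distance, v = H0 * d.
H0 = 75.0  # illustrative Hubble constant in km/s/Mpc, not a measured value

def recession_velocity(distance_mpc):
    """Apparent recession speed in km/s for a galaxy distance_mpc megaparsecs away."""
    return H0 * distance_mpc

print(recession_velocity(1))    # 75.0  -> 75 km/s at 1 Mpc
print(recession_velocity(100))  # 7500.0 -> 7500 km/s at 100 Mpc
```

Inverting the same relation, v / H0, is how a measured redshift velocity becomes a rough distance once H0 is known.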

Historically, measuring the Hubble constant has meant measuring the recession velocity and distance for as many galaxies as possible. The velocity part is relatively easy – just do what Vesto Slipher did and measure redshift. This is the lengthening of the wavelengths of light from that galaxy, stretched as it traveled to us through an expanding universe. But distance… that’s tricky. Hubble used Cepheid variables: giant stars in the last phases of their lives. They pulsate with a period that’s related to their true brightness, as discovered by Henrietta Leavitt. Measuring Cepheid periods in other galaxies gave Hubble their true brightnesses, as though undimmed by distance. Cepheids became what we call standard candles: objects of known luminosity, whose observed brightness therefore tells us their distance.
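The inverse-square law is what turns a standard candle’s known luminosity into a distance. A minimal sketch, with arbitrary illustrative units (real surveys work in magnitudes and parsecs):

```python
import math

def standard_candle_distance(luminosity, flux):
    """Distance implied by the inverse-square law: flux = L / (4 * pi * d**2)."""
    return math.sqrt(luminosity / (4 * math.pi * flux))

# An identical candle observed at a quarter of the flux must be
# twice as far away (units here are arbitrary).
near = standard_candle_distance(1.0, 1.0)
far = standard_candle_distance(1.0, 0.25)
print(far / near)  # ~2.0
```

The whole standard-candle method is this one equation: know L from the period-luminosity relation (or supernova physics), measure the flux, solve for d.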

But this calculation involves assumptions and uncertainties. For one thing, the Cepheid period-luminosity relationship first had to be calibrated against nearby Cepheids, whose distances can be figured using stellar parallax – tracking their tiny apparent motions on the sky as the Earth orbits the Sun. This stepwise determination of astronomical distances is called the cosmic distance ladder. With each step on the ladder, uncertainties compound. Add to this our uncertainties in the behavior and observation of Cepheids themselves, and the precise measurement of the Hubble constant has been a slow, laborious process.
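To see how ladder uncertainties compound: independent fractional errors are commonly combined in quadrature, so every rung inflates the total. The percentages below are hypothetical, purely for illustration, not real survey figures:

```python
import math

def ladder_uncertainty(*fractional_errors):
    """Combine independent fractional errors from each rung in quadrature."""
    return math.sqrt(sum(e * e for e in fractional_errors))

# Hypothetical rung errors: 2% from parallax calibration, 3% from the
# Cepheid period-luminosity fit, 4% from supernova standardization.
print(f"{ladder_uncertainty(0.02, 0.03, 0.04):.3f}")  # 0.054
```

Note the total (about 5.4%) exceeds any single rung: a better calibration of one rung improves every distance built on top of it.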

As larger telescopes and more expansive surveys were completed, we gradually whittled down the errors in H0. An important advance was the development of new standard candles. Cepheids are good, but can only be seen out to a certain distance. Supernovae can be seen much further, and Type Ia supernovae are the key. These result when white dwarfs – ancient remnants of dead stars – accrete too much material from a binary partner. Runaway fusion causes them to detonate. The resulting explosion has a highly predictable brightness, making them excellent standard candles.

In the 1990s, astronomers were using these supernovae to better nail down the Hubble constant when they inadvertently discovered that the expansion of the universe is actually accelerating, revealing the existence of dark energy. One of the Nobel Prize-winning researchers behind this discovery is Adam Riess. Riess has continued the quest to refine our measurement of H0 to ever-greater precision. A big part of his work is improving the calibration of Type Ia supernovae as standard candles. Riess’s Supernova H0 for the Equation of State – SH0ES – project uses the Hubble Space Telescope to match old supernova observations with new, more reliable Cepheid measurements. By improving this rung on the cosmic distance ladder, all past supernova distances are also improved.

Riess and team have now narrowed the Hubble constant down to 73.5±1.7 km/s/Mpc. That roughly 2% uncertainty is a hell of a lot better than the old factor-of-two uncertainty. So where’s the crisis? To fully believe a measurement like this, we’d prefer it to be confirmed by independent methods. The SH0ES project measures the recession of galaxies up to around 2 billion light years away, so it’s a more-or-less direct measure of the current expansion rate. But there’s another way to go. What if we could measure the expansion rate of the universe at the very beginning? Then we could figure out what its current expansion rate should be, given our best understanding of all of the gravitational influences that have affected that expansion since. We’d better hope that it gives the same result, or there’s a big problem with either our supernova measurements or our understanding of how the universe evolved. Spoiler: there’s a problem.

There’s good reason to try to calculate H0 from observations of the early universe: the observation I’m referring to is far more reliable than Cepheids and supernovae. I’m talking about the cosmic microwave background radiation. The CMB.

This is a topic we’ve been over, so for now just the TL;DR: the cosmic microwave background is the remnant heat-glow of the universe’s initial hot, dense state, released around 400,000 years after the Big Bang, when the universe had finally cooled enough to become transparent to light. We still see it today, now stretched in wavelength by a factor of around 1100 by its near-14-billion-year journey through an expanding universe. This is the map of the CMB across the entire sky, created by the Planck satellite. The speckles are tiny differences in temperature corresponding to tiny differences in density – the blue regions are around 1 part in 100,000 cooler than the red regions, and also slightly more dense. These over-densities would go on to collapse into the vast clusters of galaxies of the modern universe.

So how can the CMB tell us the hubble constant? The key is the sizes of those speckles. In the era just before the release of the CMB, matter and light were trapped together. Matter wanted to collapse under its own gravity while light generated a powerful pressure to resist that collapse. These counteractive forces produced oscillations, really vast sound waves, that rippled across the universe. These are the baryon acoustic oscillations, and they occurred on all different size scales, sloshing between high and low density over 400,000 years. Then, the release of the CMB meant that light and matter were no longer coupled together, and so the oscillations stopped. The state of the oscillations at the moment of that release is imprinted on the CMB in these speckles.

We usually show the distribution of speckle sizes with what we call a power spectrum, which basically shows the abundance of speckles of different sizes. The locations of its peaks tell us which oscillation modes just happened to be at their extremes at the moment the CMB was released. This in turn depends on the density of matter and radiation, as well as the expansion rate of the universe in that early epoch.

So how do we get the Hubble constant – the current expansion rate – from all this? First you figure out what starting cosmological parameters – what combination of matter, both dark and ordinary, radiation, early expansion rate, and so on – could give the power spectrum observed by Planck. Then you figure out how the universe described by these parameters should evolve to the present day. This sounds involved, but the Planck power spectrum is so rich with information that the Planck team claim to have calculated H0 to even better precision than SH0ES. The problem is, the results don’t agree.
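That second step – evolving a parameter set forward in time – can be sketched with the Friedmann equation for a flat universe of matter, radiation, and a cosmological constant. The parameter values here are illustrative placeholders, not the actual Planck best fit:

```python
import math

def hubble_rate(z, h0=67.0, omega_m=0.315, omega_r=9e-5):
    """Expansion rate H(z) in km/s/Mpc for a flat LambdaCDM universe.

    h0, omega_m, omega_r are illustrative values, not the Planck fit.
    """
    omega_lambda = 1.0 - omega_m - omega_r  # flatness: densities sum to 1
    return h0 * math.sqrt(
        omega_m * (1 + z) ** 3      # matter dilutes with volume
        + omega_r * (1 + z) ** 4    # radiation also redshifts
        + omega_lambda              # a cosmological constant stays constant
    )

print(hubble_rate(0))     # at redshift zero this is just H0
print(hubble_rate(1100))  # around the CMB's release: millions of km/s/Mpc
```

The actual Planck analysis fits all the parameters jointly to the full power spectrum, but the logic is the same: pin down the ingredients early on, then let this equation carry the expansion rate to today.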

The Planck H0 is 66.9±0.6 km/s/Mpc, compared to the supernova result of 73.5±1.7 km/s/Mpc.

They’re actually remarkably close given that we figured them from data at opposite ends of time. But they also seem irreconcilably different – 3.7 sigma different, in fact, which means around a 1-in-7000 chance that this level of difference arose through random errors alone. This is the crisis in cosmology.
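A naive Gaussian combination of the two quoted values lands close to those numbers. This is only a sketch – the published analyses use the full likelihoods, not this shortcut:

```python
import math

h0_sn, err_sn = 73.5, 1.7     # SH0ES supernova measurement (km/s/Mpc)
h0_cmb, err_cmb = 66.9, 0.6   # Planck CMB measurement (km/s/Mpc)

# Tension: the difference in units of the combined standard error,
# treating the two measurements as independent Gaussians.
sigma = (h0_sn - h0_cmb) / math.hypot(err_sn, err_cmb)

# One-tailed Gaussian probability of a chance fluctuation at least this large
p = 0.5 * math.erfc(sigma / math.sqrt(2))

print(f"{sigma:.1f} sigma, roughly 1 in {round(1 / p)}")
```

The key point: random errors alone almost never produce a gap this large, so either a systematic bias or new physics is hiding somewhere.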

This discrepancy first emerged in 2016, when Riess’s new calibration of the supernova-derived H0 revealed it to be in real conflict with the Planck result from a couple of years earlier. Since then, calibrations have been improved, results have been re-checked, and independent methods have been used to calibrate the supernovae as standard candles. The difference is real, and in fact the error bars are only getting smaller. OK, before we declare all of cosmology broken, let’s think about the two main possible sources of this discrepancy.

First: there are unknown systematic sources of uncertainty in either the supernova or Planck measurements – biases that are driving one or the other to be too high or too low. Perhaps we don’t understand Cepheid variables like we thought, or perhaps gravitational lensing alters the Planck speckles differently to how we thought. Ongoing efforts are ruling out systematic errors one by one, but it’s possible there’s still something we haven’t thought of.

Second: there’s some unknown physics that needs to be taken into account for the CMB calculation. This is the most exciting possibility. There are a few options, so let’s start a new list:

One: a new type of very fast-moving particle in sufficient numbers would skew the energy balance of the early universe and mess up the calculation. That particle could be the sterile neutrino – a hypothetical non-interacting neutrino that isn’t part of the standard model.

Two: dark matter behaves differently to how we thought. Perhaps it interacts more strongly with matter and radiation, which would shift the sizes of those CMB speckles.

Three: dark energy isn’t constant. The current calculations assume that dark energy is described by the cosmological constant, which by definition doesn’t change. But if dark energy increases, that could explain why we observe a higher H0 in the modern universe than is predicted by extrapolating from the early universe.

The answer will depend on whether the more correct Hubble constant comes from Planck or SH0ES. New observations and new telescopes will refine these numbers even further. Independent methods – like using gravitational lensing and gravitational waves – will weigh in on one side or the other. Perhaps the uncertainties will be refined and the two results will converge. That’d be cool; the near century-long quest to measure the expansion rate of the universe would be concluded. Or perhaps the discrepancy will persist. That’d be even cooler; we’d have a new tool to investigate the mysterious physics of dark energy, dark matter, or unknown particles beyond the standard model. For now, we continue our obsessive quest for H0, and for what it will tell us of the origin and fate of our expanding spacetime.

