Bad Astronomy | Measurements of the expansion of the universe show large differences

There is a problem with the universe.

Or there may be a problem with the way we observe it. Either way, there is something strange.

In short, the universe is expanding. There are a lot of different ways to measure that expansion. The good news is that these methods all get roughly the same number. The bad news is they don’t get exactly the same number. One group of methods gets one number, and another group gets a different one.

This discrepancy has been around for a while and is not getting any better. In fact, it’s getting worse (as astronomers like to say, there is growing tension between the methods). The big difference between the two groups is that one set of methods looks at relatively nearby things in the universe, and the other at very distant things. Either we are doing something wrong, or the universe is doing something different far away than it is doing near here.

A new paper just published uses a clever method of measuring the expansion by looking at nearby galaxies, and what it finds is right in line with the other “nearby object” methods. Which may or may not help.

Okay, let’s back up … we’ve known for a century or so that the universe is expanding. We see galaxies all moving away from us, and the farther away a galaxy is, the faster it appears to be moving. As far as we can tell, there is a tight relationship between the distance of a galaxy and how fast it appears to be moving away. So, say, a galaxy 1 megaparsec away (Mpc for short) might be moving away from us at 70 kilometers per second, and one twice as far away (2 Mpc) moves twice as fast (140 km/sec).

That ratio seems to hold true over large distances, so we call it the Hubble constant, or H0 (pronounced “H naught”), after Edwin Hubble, who was one of the first to propose the idea. It’s measured in the odd units of kilometers per second per megaparsec (speed per distance: the farther away something is, the faster it moves).
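To make that relationship concrete, here is a minimal sketch (my own illustration, not anything from the paper) of Hubble’s law, velocity = H0 × distance, using the round 70 km/sec/Mpc figure from the example above:

```python
# Hubble's law sketch: recession velocity = H0 * distance.
# The H0 value here is just the round number from the example above, not a measurement.
H0 = 70.0  # km/s per Mpc (illustrative)

def recession_velocity(distance_mpc: float, h0: float = H0) -> float:
    """Speed (km/s) at which a galaxy at `distance_mpc` appears to recede."""
    return h0 * distance_mpc

for d in (1.0, 2.0, 10.0):
    print(f"{d:5.1f} Mpc -> {recession_velocity(d):7.1f} km/s")
# 1 Mpc -> 70 km/s, 2 Mpc -> 140 km/s, and so on.
```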

Methods that look at closer objects, such as stars in nearby galaxies, exploding stars, and the like, get an H0 of about 73 km/sec/Mpc. But methods that use more distant things, such as the cosmic microwave background and baryon acoustic oscillations, get a smaller number, closer to 68 km/sec/Mpc.

They are close, but they are not the same. And since each set of methods seems internally consistent, that’s a problem. What is happening?

The new paper uses a cool method called surface brightness fluctuations. It’s quite a name, but the idea behind it is actually pretty intuitive.

Imagine standing at the edge of a forest, right in front of a tree. Because you are so close, you see only that one tree in your field of view. Step back a bit and you will see more trees. Back up farther still and you can see even more.

Same with galaxies. Look at a nearby one with a telescope. In a particular pixel of your camera, you might see ten stars, all blurred together into that one pixel. Due to statistics alone, another pixel might catch 15 (making it 50% brighter than the first pixel), and another 5 (half as bright as the first).

Now look at a galaxy that is identical in every way, but twice as far away. In one pixel you might see 20 stars, and in others 27 and 13 (a ~35% difference). At 10 times the distance you’d see 120, 105, and 90 (about a 10% difference). Note that I am way oversimplifying here and just making up numbers as an example. The point is, the farther away a galaxy is, the smoother its brightness distribution looks (the difference between pixels gets smaller compared to the total in each pixel). Not only that, it’s smoother in a way that you can measure and assign a number to.
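The smoothing comes from counting statistics: if a pixel averages N stars, the pixel-to-pixel scatter is roughly the square root of N, so the relative fluctuation shrinks the more stars get crammed into each pixel. Here is a minimal sketch of that idea, assuming pure Poisson statistics and made-up star counts (this is just to illustrate the scaling, not the paper’s actual analysis):

```python
import numpy as np

rng = np.random.default_rng(42)

# Surface-brightness-fluctuation sketch, assuming pure Poisson statistics.
# Moving a galaxy k times farther away packs ~k^2 times as many stars into
# each pixel, so the relative pixel-to-pixel scatter falls off like ~1/k.
stars_per_pixel_nearby = 10   # illustrative mean count at the "near" distance
n_pixels = 100_000            # pixels sampled per fake image

for distance_factor in (1, 2, 10):
    mean_count = stars_per_pixel_nearby * distance_factor**2
    counts = rng.poisson(mean_count, size=n_pixels)
    relative_scatter = counts.std() / counts.mean()
    print(f"{distance_factor:2d}x distance: ~{mean_count:4d} stars/pixel, "
          f"relative fluctuation = {relative_scatter:.1%}")
# The fluctuation amplitude encodes the distance: smoother galaxy, farther away.
```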

In reality it is more complicated than that. If a galaxy is busy making stars in one section, the statistics get thrown off, so it’s best to look at elliptical galaxies, which haven’t made new stars in billions of years. The galaxy also needs to be close enough to get good statistics, which limits this method to galaxies maybe 300 million light years away and closer. You also have to account for dust and background galaxies in your images, and star clusters, and the fact that galaxies have more stars toward their centers, and … and … and …

But these are all known and fairly easy to correct.

When they did all this, the number they got for H0 was (drum roll …) 73.3 km/sec/Mpc (with an uncertainty of about ± 2 km/sec/Mpc), right in line with the other nearby methods and quite different from the group of distant methods.

In a way that’s expected, but it once again gives credence to the idea that we are missing something important here.

All the methods have their issues, but their uncertainties are quite small. Either we are really underestimating those uncertainties (always possible, but looking less likely at this point) or the universe is behaving in a way we did not expect.

If I had to bet I would go for the latter.

Why? Because it has done this to us before. The universe is tricky. We have known since the 1990s that the expansion rate has not been constant. Astronomers saw that very distant exploding stars were consistently farther away than a simple measurement indicated, leading them to realize that the Universe is expanding faster now than it used to, which in turn led to the discovery of dark energy, the mysterious entity accelerating the Universal expansion.

When we look at very distant objects, we see them as they were in the past, when the Universe was younger. If the expansion rate of the universe was different back then (say, 12-13.8 billion years ago) than it is now (within the last billion years or so), we can get two different values for H0. Or maybe different parts of the universe are expanding at different rates.

If the expansion rate has changed, that has profound implications. It means the universe is not the age we think it is (we use the expansion rate to trace back its age), which means it is a different size, which means the time it takes for things to happen is different. It means the physical processes that took place in the early Universe happened at different times, and perhaps other processes are involved that affect the expansion rate.
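To see how the measured H0 feeds into the inferred age, a rough back-of-the-envelope check is the “Hubble time”, 1/H0, which sets the scale for the age of the universe. The real age also depends on the mix of matter and dark energy, so treat this sketch as an illustration of the scaling, not the published ages:

```python
# Rough illustration: the "Hubble time" 1/H0 sets the scale of the universe's age.
# The true age also depends on the matter/dark-energy mix; this is only a sketch.
KM_PER_MPC = 3.086e19      # kilometers in one megaparsec
SEC_PER_GYR = 3.156e16     # seconds in a billion years

def hubble_time_gyr(h0_km_s_mpc: float) -> float:
    """1/H0 expressed in billions of years."""
    return (KM_PER_MPC / h0_km_s_mpc) / SEC_PER_GYR

for h0 in (68.0, 73.3):
    print(f"H0 = {h0:.1f} km/s/Mpc -> Hubble time = {hubble_time_gyr(h0):.1f} billion years")
# A ~7% spread in H0 translates directly into a ~7% spread in the inferred timescale.
```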

So yes, it is a mess. Either we don’t understand how the universe behaves well enough, or we aren’t measuring it properly. Either way, it’s a huge pain. And we just don’t know which one it is.

This new paper makes it even more likely that the discrepancy is real, and that it’s the fault of the universe itself. But it is not conclusive. We have to push through this, keep whittling away at the uncertainties, keep trying new methods, and hopefully at some point we’ll have enough data to point at something and say, “AHA!”

That will be an interesting day. Our understanding of the cosmos will take a giant leap when that happens, and then cosmologists will have to find something else to argue about. Which they will. It’s a big place, this universe, and there’s plenty of it to discuss.


A parsec is a unit of length equal to 3.26 light years (or 1/12th of a Kessel Run). It’s a weird unit, I know, but it has a lot of historical significance and is connected to many of the ways we measure distance. Astronomers looking at galaxies like to use the distance unit of megaparsecs, where 1 Mpc is 3.26 million light years. That’s a little longer than the distance between us and the Andromeda galaxy.
