Monday, October 05, 2009

Olbers' Paradox

I was walking in the woods with Clare yesterday afternoon (pictured) when, apropos of nothing at all, I mentioned the marvel of the night sky.

If you look up on a clear night you can see the stars of course, but most of the sky is black. From this a profound conclusion emerges.

"If the universe," I said, "were infinite in space and time, then in every direction you looked in the sky, your line of sight would eventually intersect the surface of a star. So the night sky would be uniformly bright, not dark at all. This is called Olbers' paradox."

She thought for a moment. "Why exactly? The stars that are really far, far away are too faint to see. The darkness of the night sky actually proves nothing."

Good point. There are lots of stars out there fainter than magnitude 6 which we can't see with the naked eye at all. Collapse of my argument, and I resolved to check when I got home.

According to the Wikipedia article, the explanation is that in a uniform infinite universe, as you increase the distance from your eye by a given amount, the luminance from each star does indeed go down, but the number of stars goes correspondingly up. It sounds convincing in a hand-wavy way, but can we make this more precise?

Suppose we take a particular line of sight - a one-dimensional line from your eye to a spot in the infinite sky - and assume the stars are equally spaced along it.

So the first star is at a distance of one "unit" (which might be a thousand light years), the next strung along this eye-line is at a distance of two thousand light years, the next three thousand ... and so on. Suppose each star sends one unit of brightness to your eyes. Then, by the inverse-square law, the total brightness from this infinite string of stars will be:

Total brightness = 1 + (1/4) + (1/9) + ... + (1/n²) + ...

OK. So what's the sum of this infinite series? I sat down with a pen and paper and tried to work it out: it's surprisingly hard. I did some approximations and guessed it was just over one and a half.

Wikipedia tells us that this was a legendary problem in early-modern times - the Basel problem - solved by the famous Euler in 1735. The answer is π²/6 ≈ 1.645.
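You can watch the partial sums creep up towards Euler's answer numerically. Here's a quick sketch in Python (the function name is mine, not anything from a reference; note the series converges slowly, so we take a lot of terms):

```python
import math

def basel_partial_sum(n_terms):
    """Partial sum of 1 + 1/4 + 1/9 + ... + 1/n^2 (the Basel problem)."""
    return sum(1.0 / n**2 for n in range(1, n_terms + 1))

print(basel_partial_sum(100000))  # just under pi^2/6, as the tail is tiny
print(math.pi**2 / 6)             # 1.6449340668...
```

The partial sum to N terms falls short of π²/6 by roughly 1/N, which is why my pen-and-paper estimate of "just over one and a half" wasn't far off.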

But this doesn't really solve the problem. If the star was pretty faint in the first place, then all of its further-away clones only make it about 65% brighter. You still wouldn't see it.

What we're not capturing is the increase in the number of stars in a given solid angle as we project the eyeline farther and farther away. For a given patch of sky, the area at a distance r from your eye is proportional to r² - think of the surface area of a sphere, 4πr². So at a distance r we have to consider not one star, but r² stars. The true brightness you would see is:

Total brightness = 1 + 4(1/4) + 9(1/9) + ... + n²(1/n²) + ...

= 1 + 1 + 1 + ...
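The divergence is easy to see in a small sketch (again my own illustrative code): each "shell" of sky at distance n holds n² stars, each delivering 1/n² units of brightness, so every shell contributes exactly one unit and the total just keeps climbing.

```python
def sky_brightness(n_shells):
    """Brightness from the first n shells: n^2 stars per shell,
    each contributing 1/n^2, so each shell adds exactly 1 unit."""
    return sum(n**2 / n**2 for n in range(1, n_shells + 1))

print(sky_brightness(10))    # 10.0
print(sky_brightness(1000))  # 1000.0 - grows without bound
```

Unlike the Basel series, there is no limit here: the partial sums grow linearly with the number of shells.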

So the correct statement of Olbers' paradox is that in an infinitely-old, infinitely-sized universe, the night sky would be infinitely bright.

I think we would have noticed.

Darkness tells us the universe had a definite beginning some finite time ago, or that it has a finite size, or both. Quite a big conclusion from darkness at night.