But the Greeks – gifted essay-test-takers who could fill in the blanks as well as any culture before or since – knew that observers could at least count on two stellar qualities that required no fevered imagination: color and brightness. A man who lived on Rhodes, Hipparchus, devised a scheme for sorting out their luminosities. His system, invented 2,200 years ago, is still in use today: He assigned every star one of six different magnitudes.
Unfortunately, as the entertaining Trojan Horse incident taught the world, we should sometimes beware of Greeks bearing gifts. Hipparchus’ magnitude system was a counterintuitive, reverse-order scheme that still plagues beginners, for it’s structured so that the brightest stars are assigned the lowest numbers and the dimmest get the highest. This is like awarding the worst diner “six stars” and the finest cuisine only one. Yet the system endured.
It took William Herschel, who had already blown everyone’s mind by finding the first-ever new planet (Uranus, in 1781), to suggest a way to measure brightness accurately, and Norman Pogson, in the middle of the 19th century, to spell it out. Both had noticed that what the eye sees as equal intervals of brightness are actually equal ratios. In other words, to make an impression on our sense organs, you’ve got to whack them with dramatic differences. A pair of socks that seems “slightly” smellier than another is probably twice as smelly. So too with brightness.
Turns out that a five-magnitude jump – from a “first magnitude” to a “sixth magnitude” star, which were the terminals of the Greek system – exactly equals a hundredfold brightness change. This means that each magnitude step is the fifth root of 100, about 2.512 – every magnitude is roughly two-and-a-half times brighter than the next-higher number.
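That ratio falls straight out of the definition – five equal steps multiplying to a hundredfold change means each step is the fifth root of 100. A quick sketch of the arithmetic in Python:

```python
# One magnitude step: five equal steps must multiply to exactly 100x,
# so a single step is the fifth root of 100 (Pogson's ratio).
step = 100 ** (1 / 5)
print(round(step, 3))      # 2.512

# Five consecutive steps recover the full hundredfold change.
print(round(step ** 5))    # 100
```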
Astronomers now use photometers and other electronic methods of accurately measuring light, but we still express it the way Hipparchus did. The only real change is that the magnitude system burst through the ancient Greek restriction that merely went from one to six. By this new, more accurate reckoning, a half-dozen stars were promoted all the way up to zero (what a concept: a promotion to zero!), and a few objects had to spill over to even-brighter realms of negative numbers.
Zero-magnitude stars are sparsely sprinkled around the sky; this month check out creamy Capella in the northwest or orange Arcturus in the east. Or take a quick trip to the dazzling negative numbers by visiting the Dog Star, Sirius, blazing in the southwest at magnitude minus-1.5. When the Full Moon is out, you’re looking at magnitude minus-12.6. Then, when the Sun rises, you’re gazing (not too long) at an object of minus-26.5. Only a nuclear blast is brighter – a sight, let’s hope, no one ever sees again.
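Any two of those magnitudes can be turned into a brightness ratio with the same fifth-root-of-100 rule; a small illustrative helper (the function name is my own):

```python
def brightness_ratio(m_faint, m_bright):
    """How many times brighter the lower-magnitude object appears."""
    return 100 ** ((m_faint - m_bright) / 5)

# Sirius (-1.5) versus a star at the 6th-magnitude naked-eye limit:
print(round(brightness_ratio(6.0, -1.5)))     # 1000

# The Sun (-26.5) versus the Full Moon (-12.6): a few hundred thousand.
print(round(brightness_ratio(-12.6, -26.5)))
```

A 7.5-magnitude gap works out to an exact factor of 1,000 – three full “hundredfold” half-steps.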
But it is toward the opposite direction – the murky depths below the sixth magnitude – that astronomy turns nearly all its affection. That naked-eye barrier was resolutely crossed in 1609, when Galileo first aimed a telescope skyward. He discovered that telescopes are brightness-enhancing devices more than enlarging devices. That is, even an unmagnified, one-power peek at the sky would be breathtaking if only we could amplify everything’s brilliance. The Andromeda galaxy, for example, already appears large enough – some six times bigger than the Moon – to look absolutely amazing, if only its brightness could be cranked up.
Telescopes do exactly that: enhance brightness, typically by a factor of a thousandfold; and only then is the now-brightened image magnified, usually by just 100x or so. By focusing extra light into your pupil, even a small $50 telescope takes you to the 12th magnitude, meaning that you’ve increased every star’s apparent luminosity by a factor of 600. Compared with the naked eye, you’ve gained seven magnitudes at a cost of just eight bucks per magnitude.
As telescopes get bigger, the cost-per-magnitude-boost climbs dramatically. A telescope with a two-foot-wide mirror, costing $20,000, will brighten things by just 1.5 magnitudes over a 12-inch ‘scope that costs $2,000. By the time you get to the Keck class, you’re in over your head to the tune of millions of dollars per magnitude. Brightness isn’t cheap.
So the low end is where you find the bargains. An ordinary pair of binoculars dives you down to the 9th magnitude, where nine times more stars appear than can be seen by the naked eye.
Now, if there were just as many stars and galaxies at any one magnitude as at the next, astronomers wouldn’t be so obsessed with plunging into the muddy Land of the Obscure. But even a quick skyward glance reveals a basic celestial fact: There are many more dim stars than bright ones. Indeed, a small telescope shows 50,000 stars, instead of the mere 6,000 seen by the eye alone.
In 1996 an extraordinary 100-hour exposure through the Hubble Space Telescope picked up stars and galaxies of the 30th magnitude. Such an object is some four billion times dimmer than the faintest naked-eye stars. Its light is equal to the glow of a single cigarette – as seen from the Moon.
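The size of that gap follows from the same fifth-root-of-100 arithmetic – counting down from the magnitude-6 naked-eye limit, 24 magnitudes works out to roughly four billion:

```python
# Magnitude 30 versus the magnitude-6 naked-eye limit:
delta_m = 30 - 6                 # 24 magnitudes
ratio = 100 ** (delta_m / 5)     # brightness ratio across that gap
print(f"{ratio:.1e}")            # 4.0e+09 – about four billion times dimmer
```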