Magnitude Scale

When measuring the brightness of objects in the sky, astronomers use the magnitude scale: the lower the value, the brighter the object. The scale was invented by the ancient Greek astronomers, who classified all the stars visible to the naked eye into six magnitudes. The brightest stars were said to be of magnitude 1, the next brightest magnitude 2, and so on. The faintest stars visible to the naked eye were given a magnitude of 6.

The magnitude scale was standardised by nineteenth-century astronomers so that an increase of 5 in magnitude corresponds to a decrease in apparent brightness by a factor of exactly 100, and vice versa. So, for example, a star of magnitude 6 is a hundred times fainter than one of magnitude 1.
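This relationship is easy to sketch in code. The short Python snippet below (the function name `brightness_ratio` is just an illustrative choice, not a standard library function) computes how many times brighter one object is than another from their magnitude difference, using the standardised rule that 5 magnitudes equal a factor of 100, so each single magnitude step is a factor of 100^(1/5) ≈ 2.512.

```python
def brightness_ratio(m_faint, m_bright):
    """Return how many times brighter the object of magnitude m_bright
    is than the object of magnitude m_faint.

    Uses the standardised rule: a difference of 5 magnitudes
    corresponds to a brightness ratio of exactly 100.
    """
    return 100 ** ((m_faint - m_bright) / 5)

# A magnitude-6 star compared with a magnitude-1 star:
print(brightness_ratio(6, 1))   # 100.0 (a hundred times fainter)

# A single magnitude step:
print(brightness_ratio(2, 1))   # about 2.512
```

Running this confirms the example in the text: magnitudes 1 and 6 differ by a factor of 100 in brightness, and each individual magnitude step is a factor of roughly 2.512.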

For more details, see the following video.
