The brightness of a celestial object, measured on a scale in which lower numbers mean greater brightness. The magnitude system stems from the ancient Greeks, who ranked stars from first to sixth magnitude: those of first magnitude being the first to appear after sunset, those of sixth magnitude being at the limit of naked-eye visibility in a dark sky. In the 19th century, when it became possible to accurately measure the relative brightness of stars, the system was put on a strict quantitative footing by the English astronomer Norman Pogson (1829–1891). On this new scale (known as the Pogson scale), defined so that the traditional magnitudes of most stars stayed roughly the same, a difference of 5 magnitudes corresponds to a brightness ratio of exactly 100; each single magnitude therefore corresponds to a brightness factor equal to the fifth root of 100, approximately 2.512.
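The Pogson relation described above can be sketched in a few lines of Python; the function name here is illustrative, not part of any standard library.

```python
def brightness_ratio(delta_mag):
    """Brightness ratio corresponding to a magnitude difference.

    Pogson's definition fixes 5 magnitudes as a factor of exactly 100,
    so one magnitude is a factor of 100 ** (1/5), roughly 2.512.
    Lower magnitudes mean brighter objects, so a positive delta_mag
    (fainter minus brighter) gives the ratio brighter / fainter.
    """
    return 100 ** (delta_mag / 5)
```

For example, `brightness_ratio(5)` returns exactly 100, and `brightness_ratio(1)` returns about 2.512, matching the figures quoted above.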
Apparent magnitude measures how bright an object looks from Earth. Absolute magnitude measures an object's intrinsic brightness and is defined as the apparent magnitude an object would have if viewed from a distance of 10 parsecs (32.6 light-years). Bolometric magnitude measures brightness over all wavelengths, not just those of visible light.
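The definition of absolute magnitude, combined with the inverse-square law and the Pogson scale, yields the standard distance-modulus relation M = m − 5 log₁₀(d / 10 pc). A minimal sketch, with an illustrative function name:

```python
import math

def absolute_magnitude(apparent_mag, distance_pc):
    """Absolute magnitude M from apparent magnitude m and distance in parsecs.

    By definition, M equals the apparent magnitude the object would have
    at 10 parsecs; the inverse-square law then gives
    M = m - 5 * log10(d / 10).
    """
    return apparent_mag - 5 * math.log10(distance_pc / 10)
```

An object at exactly 10 parsecs has M equal to its apparent magnitude, and every factor of 10 in distance shifts the two magnitudes apart by 5, as the Pogson scale requires.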
Related category ASTRONOMICAL QUANTITIES
Copyright © The Worlds of David Darling