The magnitude scale is a calibration of the brightness of objects in the sky. The first such scale was devised by Hipparchus in about 120 B.C. He called the brightest stars visible to the eye "first magnitude" and those at the limit of naked-eye visibility "sixth magnitude." With the advent of photometers that could accurately record the amount of light received from an object in the sky, it became possible to put this scheme on a scientific footing. In 1856 the magnitude scale was fixed so that a difference of 5 magnitudes corresponds to a ratio of apparent brightness of exactly 100. A difference of one magnitude therefore corresponds to a brightness ratio equal to the fifth root of 100, approximately 2.512.
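The relation above can be sketched numerically: a magnitude difference Δm corresponds to a brightness ratio of 100^(Δm/5). The function name below is illustrative, not from any standard library.

```python
def brightness_ratio(delta_mag):
    """Ratio of apparent brightnesses for a magnitude difference delta_mag.

    By definition, a 5-magnitude difference corresponds to a factor of 100,
    so a difference of delta_mag corresponds to 100 ** (delta_mag / 5).
    """
    return 100 ** (delta_mag / 5)

# A 5-magnitude difference gives exactly a factor of 100.
print(brightness_ratio(5))            # 100.0
# One magnitude gives the fifth root of 100, about 2.512.
print(round(brightness_ratio(1), 3))  # 2.512
```

Note that the scale is logarithmic: each additional magnitude multiplies the brightness ratio by the same factor of about 2.512, which is why five steps compound to exactly 100.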