The brightness of a star or other celestial body is called its magnitude. On the magnitude scale, bright stars have low numbers and faint stars have high ones. This might seem the wrong way round, but it’s like exam grades or race winners – coming first is better than coming 14th.
Originally, the brightest stars were 1st magnitude and the faintest visible to the naked eye were 6th magnitude, but when the scale was put on a mathematical basis it was decided to make a difference of five magnitudes equal to a brightness ratio of exactly 100 times, so a single magnitude corresponds to a factor of about 2.512 (the fifth root of 100). This matches the traditional scale quite well, but it means that the very brightest stars now have negative magnitudes: Sirius, the brightest star in the sky, is magnitude –1.47, for example.
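The scale described above can be turned into a simple formula: a difference of Δm magnitudes corresponds to a brightness ratio of 100^(Δm/5). A minimal sketch (the function name is illustrative, not a standard library routine):

```python
def brightness_ratio(mag_faint, mag_bright):
    """How many times brighter the lower-magnitude object is,
    using the rule that five magnitudes equal a factor of 100."""
    return 100 ** ((mag_faint - mag_bright) / 5)

# One magnitude step is the fifth root of 100, about 2.512:
print(round(brightness_ratio(2.0, 1.0), 3))

# Sirius (magnitude -1.47) compared with a just-visible
# 6th-magnitude star comes out roughly 970 times brighter:
print(round(brightness_ratio(6.0, -1.47)))
```

Note the sign convention: the *fainter* object has the *larger* magnitude, so the difference is taken faint minus bright.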
There is no perfect, non-varying standard star to provide a zero point in the way that the freezing point of water defines the zero of the Celsius temperature scale, although Vega is regarded as being very nearly magnitude zero.
The naked-eye limit (known as the limiting magnitude) is about magnitude 6.5, though this varies with the quality of your sky and your vision. In average country skies it is about 5.7, and in suburbs you will be hard pressed to see anything fainter than about magnitude 4.5. With binoculars you can see stars about three or four magnitudes fainter, and with a medium-sized telescope you might get down to about magnitude 14 on a good night in the country.
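These limiting magnitudes translate into large brightness ratios under the same five-magnitudes-equals-100-times rule. A quick check of the figures quoted above (the function name is illustrative):

```python
def ratio_for_magnitudes(delta_mag):
    """Brightness ratio for a given magnitude difference:
    each magnitude is a factor of 100 ** (1/5), about 2.512."""
    return 100 ** (delta_mag / 5)

# Binoculars reaching three to four magnitudes fainter show
# stars roughly 16 to 40 times fainter than the naked eye:
print(round(ratio_for_magnitudes(3)))
print(round(ratio_for_magnitudes(4)))

# A telescope reaching magnitude 14 from a naked-eye limit of 6.5
# (a difference of 7.5 magnitudes) shows stars 1000 times fainter:
print(round(ratio_for_magnitudes(14 - 6.5)))
```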
The magnitudes of some stars in the spring sky