Magnitude is a number expressing the relative brightness of a celestial object. The Greek astronomer Ptolemy compiled a catalog of visible stars, dividing them into six classes: he assigned magnitude "1" to the brightest stars and magnitude "6" to those just barely visible to the naked eye (the telescope had not yet been invented).
In 1856, Norman Pogson proposed the rule that was soon adopted as the standard: a difference of five magnitudes corresponds to a brightness ratio of exactly 100:1, so each whole magnitude step is a factor of about 2.512, with larger numbers being dimmer. This was an attempt to reconcile the scientific with the subjective, and the system is essentially still in use today.
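Pogson's rule is easy to express in code. The sketch below (function name is my own) converts a magnitude difference into a brightness ratio:

```python
def brightness_ratio(delta_m: float) -> float:
    """Flux ratio corresponding to a magnitude difference delta_m."""
    # Pogson's rule: 5 magnitudes = a factor of exactly 100 in brightness,
    # so one magnitude = 100 ** (1/5), approximately 2.512.
    return 100 ** (delta_m / 5)

# One magnitude step:
print(brightness_ratio(1))   # about 2.512
# Five steps (a magnitude-1 star versus a magnitude-6 star):
print(brightness_ratio(5))   # exactly 100
```

Note that the oft-quoted 2.512 is a rounding of the irrational fifth root of 100; the exact anchor of the scale is the 100:1 ratio over five magnitudes.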
The "brightness" of a star depends, among other things, on the wavelength of light considered and on the sensitivity of the detector at that wavelength. Consequently, magnitudes are published for blue light, red light, infrared, visual observation, photographic emulsions, and so on. Negative numbers are used for objects brighter than magnitude zero.
A strict scientific definition would be the photon count per unit time at a specific wavelength for an idealized point source and detector, but no celestial object is stable enough to serve as an absolute standard. In practice, magnitude is a relative value. Polaris (Alpha Ursae Minoris) is about magnitude 2.0, Sirius (Alpha Canis Majoris) is magnitude -1.47, the full Moon is about -12.6, and the Sun is -26.8. All of these are visual magnitudes, referenced to the average sensitivity of the human eye.
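The figures above can be turned into apparent-brightness ratios directly with Pogson's rule; a minimal sketch, using the Sun, Moon, and Sirius values quoted in the text:

```python
def ratio(m_dim: float, m_bright: float) -> float:
    # Each magnitude step is a factor of 100 ** (1/5) in brightness,
    # so the ratio follows from the magnitude difference alone.
    return 100 ** ((m_dim - m_bright) / 5)

# How much brighter the Sun (-26.8) appears than the full Moon (-12.6):
print(f"Sun vs Moon:   {ratio(-12.6, -26.8):,.0f}x")   # roughly 480,000x
# How much brighter the Sun appears than Sirius (-1.47):
print(f"Sun vs Sirius: {ratio(-1.47, -26.8):.1e}x")    # on the order of 1e10
```

This illustrates how quickly the logarithmic scale compresses enormous brightness differences into small, manageable numbers.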