## Absolute magnitudes and distance

The apparent magnitude is an observed quantity which depends on where in the universe we make our measurement. This is because the flux of light that we measure is inversely proportional to the square of the distance between the observer and the emitter. If we want to relate this to the intrinsic luminosity of the object, we need to know how far away it is. For a perfectly isotropic light emitter, the flux is given by

$$f = \frac{L}{4\pi d^2}$$

where $L$ is the intrinsic luminosity of the source and $d$ is the physical distance between the source and the observer.
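The inverse-square law can be sketched in a few lines of Python (the constants and function name below are illustrative choices, not from any particular library):

```python
import math

# Illustrative constants (one significant figure, as in the text)
L_SUN_ERG_S = 3.8e33   # solar luminosity in erg/s
PC_CM = 3.086e18       # one parsec in cm

def flux(L, d):
    """Flux (erg s^-1 cm^-2) from an isotropic source of luminosity L (erg/s)
    at physical distance d (cm): f = L / (4 pi d^2)."""
    return L / (4 * math.pi * d**2)

# The Sun observed from 10 pc:
f_sun_10pc = flux(L_SUN_ERG_S, 10 * PC_CM)
```

Doubling the distance cuts the flux by a factor of four, as the inverse-square law demands.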

In the wacky world of magnitudes, the luminosity of a source can be represented by the absolute magnitude, which is defined as the apparent magnitude that an object would have if observed from 10 parsecs (1 parsec $\approx 3\times 10^{18}$ cm $\approx 2\times 10^5$ AU is the distance from the Sun which would cause a parallax of 1 arcsecond to be observed from Earth). For many applications, this distance scale is entirely inappropriate. For instance, 10 pc away from the Milky Way Galaxy makes no sense, as it would still be *inside* the Galaxy. Nevertheless, it's the convention which has been used for nearly everything from the Sun to the most distant galaxies.
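A convenient consequence of the parsec's definition is that converting a measured parallax to a distance is trivial: $d\,[\text{pc}] = 1/p\,[\text{arcsec}]$. A minimal sketch (the function name is made up for illustration):

```python
def parallax_to_distance_pc(parallax_arcsec):
    """Distance in parsecs from annual parallax in arcseconds: d = 1 / p."""
    return 1.0 / parallax_arcsec

parallax_to_distance_pc(1.0)   # 1 pc, by definition
parallax_to_distance_pc(0.1)   # 10 pc
```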

The distance can be translated into its impact on magnitudes by computing what is called the distance modulus. Since flux falls off as $d^{-2}$, moving a source from 10 pc out to a distance $d$ changes its magnitude by $-2.5\log_{10}\left[(10\text{ pc}/d)^2\right]$, so

$$\mu = 5\log_{10}\left(\frac{d}{10\text{ pc}}\right)$$

Note that the distance modulus (thank goodness) monotonically increases with distance.
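As a sanity check on that monotonic behavior, here is a minimal Python version of the distance modulus (the function name is our own):

```python
import math

def distance_modulus(d_pc):
    """Distance modulus mu = 5 log10(d / 10 pc), with d in parsecs."""
    return 5 * math.log10(d_pc / 10.0)

distance_modulus(10.0)    # 0.0 -- zero at exactly 10 pc
distance_modulus(1000.0)  # 10.0 -- and growing with distance
```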

If you have a measurement of the magnitude (flux) of a source and its distance, then the absolute magnitude is simply given by

$$M = m - \mu$$
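Putting the two pieces together, converting an apparent magnitude and a distance into an absolute magnitude is a one-liner (again, a sketch with made-up names):

```python
import math

def absolute_magnitude(m, d_pc):
    """M = m - mu, where mu = 5 log10(d / 10 pc) is the distance modulus."""
    return m - 5 * math.log10(d_pc / 10.0)

# A star with apparent magnitude 10 at 100 pc:
absolute_magnitude(10.0, 100.0)  # 5.0
```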

Alternatively, if you know the intrinsic luminosity of a source, then the absolute magnitude is defined analogously to the apparent magnitude, and so the difference in absolute magnitudes is a measure of the ratio of luminosities.

$$M = -2.5\log_{10}\left(\frac{L}{L_\text{ref}}\right)$$

We can use this definition to calculate the luminosity of a source. Often, luminosities are reported in solar units (i.e., $L_\odot \approx 4\times 10^{33}\ \text{erg s}^{-1}$). The Sun then becomes our new standard of reference, and we can look up the zero-point offsets as the absolute magnitude of the Sun in our particular passband. For example, the absolute magnitude of the Sun in the $B$ band (in the Vega reference system) is about 5.4. If we measure the $B$ band absolute magnitude of a source as described above, then we can use the definition of magnitudes:

$$ M_B - M_{B, \odot} = -2.5\log_{10}\left(\frac{L_B}{L_{B, \odot}}\right)$$

where $M_{B, \odot} = 5.4$, and thus

$$ \frac{L_B}{L_{B, \odot}} = 10^{-0.4 (M_B - M_{B, \odot})}$$
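The final inversion is easy to capture in code. The sketch below takes the solar absolute magnitude in the chosen band as a parameter rather than hard-coding it (function and parameter names are ours):

```python
def luminosity_ratio(M, M_sun_band):
    """L / L_sun in a given band from absolute magnitudes:
    L / L_sun = 10^(-0.4 (M - M_sun))."""
    return 10 ** (-0.4 * (M - M_sun_band))

# A source 5 magnitudes brighter than the Sun in some band
# is 100 times more luminous in that band:
luminosity_ratio(0.0, 5.0)  # 100.0
```

The factor of 100 per 5 magnitudes is a handy check: it is baked into the magnitude system's definition.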