* Smaller magnitude = Brighter object
* Larger magnitude = Fainter object
Here's the breakdown:
* Apparent magnitude is how bright a star *appears* to us on Earth. This depends on the star's intrinsic brightness (luminosity) and its distance from Earth.
* Absolute magnitude is a standardized measure of a star's intrinsic brightness. It's defined as the apparent magnitude the star would have if it were placed at a distance of 10 parsecs (about 32.6 light-years) from Earth.
So, if a star's absolute magnitude is greater than its apparent magnitude (M > m), the star is *closer* to Earth than 10 parsecs. Being nearer than the standard distance makes it appear brighter (smaller apparent magnitude) than it would at 10 parsecs.
Here's an example:
* A star with an apparent magnitude of +5 and an absolute magnitude of +2 is *farther* than 10 parsecs — distance dims it below its intrinsic brightness.
* A star with an apparent magnitude of +2 and an absolute magnitude of +5 is *closer* than 10 parsecs — proximity makes it look brighter than it intrinsically is.
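These examples can be checked with the distance-modulus formula, m − M = 5 log₁₀(d / 10 pc), which relates the two magnitudes to distance. A minimal sketch (the function name is just illustrative):

```python
import math

def distance_pc(apparent_mag, absolute_mag):
    """Distance in parsecs from the distance modulus: m - M = 5 * log10(d / 10)."""
    return 10 ** ((apparent_mag - absolute_mag) / 5 + 1)

# m = +5, M = +2: distance modulus is +3, so the star is farther than 10 pc
print(distance_pc(5, 2))  # ~39.8 pc

# m = +2, M = +5: distance modulus is -3, so the star is closer than 10 pc
print(distance_pc(2, 5))  # ~2.5 pc
```

A positive distance modulus (m > M) always means the star lies beyond 10 parsecs; a negative one means it lies within.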
It's easy to get confused by the inverse relationship between magnitude and brightness. Just remember that a smaller magnitude always means a brighter object, whether we're talking about apparent or absolute magnitude.