By Ariel Balter, Ph.D. Updated March 24, 2022
A hygrometer is a precision instrument that quantifies the moisture content of the air—specifically, the relative humidity. This metric is crucial for assessing indoor comfort, predicting weather patterns, and safeguarding plants, animals, and building materials. Low humidity levels typically make air feel more comfortable in both cold and hot conditions, while high humidity increases the likelihood of dew, fog, clouds, and precipitation.
Relative humidity is expressed as a percentage, indicating how close the air is to its maximum moisture capacity at a given temperature. It is not the same as absolute humidity, which measures the actual mass of water vapor per unit volume of air. Hot air can hold more moisture than cold air; as temperature drops, water vapor condenses into droplets, forming clouds, fog, dew, or fogged windows. Thus, at a specific temperature, there is a theoretical maximum absolute humidity, and relative humidity represents the actual absolute humidity as a fraction of that maximum.
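The claim that hot air can hold more moisture than cold air can be made concrete with the Magnus approximation, a standard empirical formula for saturation vapor pressure. This is a minimal sketch; the coefficients used here are one commonly published set, not values from this article, so treat the outputs as approximate:

```python
import math

def saturation_vapor_pressure(temp_c: float) -> float:
    """Approximate saturation vapor pressure in hPa (Magnus formula)."""
    return 6.112 * math.exp(17.62 * temp_c / (243.12 + temp_c))

# Warmer air can hold more water vapor: the saturation pressure
# roughly doubles for every ~10 °C rise in temperature.
print(saturation_vapor_pressure(10))  # ~12.3 hPa
print(saturation_vapor_pressure(20))  # ~23.4 hPa
print(saturation_vapor_pressure(30))  # ~42.4 hPa
```

When the air cools below the temperature at which its current vapor content equals this saturation value, the excess condenses, which is why falling temperatures produce dew, fog, and clouds.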
1. Mechanical Hygrometers – These devices use hair or other fibers that expand or contract with humidity changes. They are simple, reliable, and often used in HVAC systems.
2. Electronic Hygrometers – Sensors made from hygroscopic materials detect changes in electrical resistance or capacitance, providing rapid digital readouts.
3. Psychrometers – Consist of a dry-bulb and a wet-bulb thermometer. Evaporation cools the wet bulb, and the temperature difference between the two bulbs allows calculation of relative humidity.
4. Dew-Point Hygrometers – The most accurate type, they cool a mirror or surface until condensation appears. The temperature at which dew forms (the dew point), compared with the ambient air temperature, determines the relative humidity.
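The dew-point reading from the fourth instrument type can be converted to relative humidity by comparing the saturation vapor pressure at the dew point with that at the air temperature. A minimal sketch, assuming the Magnus approximation (the coefficients and the 25 °C / 15 °C example values below are illustrative, not from this article):

```python
import math

def saturation_vapor_pressure(temp_c: float) -> float:
    """Approximate saturation vapor pressure in hPa (Magnus formula)."""
    return 6.112 * math.exp(17.62 * temp_c / (243.12 + temp_c))

def relative_humidity_from_dew_point(air_temp_c: float, dew_point_c: float) -> float:
    """Relative humidity (%) from air temperature and the measured dew point."""
    return 100.0 * (saturation_vapor_pressure(dew_point_c)
                    / saturation_vapor_pressure(air_temp_c))

# Example: air at 25 °C, with condensation first appearing at 15 °C.
print(round(relative_humidity_from_dew_point(25.0, 15.0)))  # ~54 %
```

When the dew point equals the air temperature, the ratio is exactly 1 and the relative humidity is 100 percent, which matches the intuition that dew forms when the air is saturated.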
At 20 °C (68 °F), the theoretical maximum absolute humidity ratio is 0.015 (1.5 %). If the actual absolute humidity is 0.005 (0.5 %), the air contains one‑third of the moisture it could hold at that temperature, yielding a relative humidity of roughly 33 %.
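The arithmetic in this example reduces to a single ratio, sketched below with the figures from the text:

```python
# Values from the worked example above, at 20 °C (68 °F).
max_absolute_humidity = 0.015     # theoretical maximum moisture ratio
actual_absolute_humidity = 0.005  # measured moisture ratio

# Relative humidity is the actual amount as a fraction of the maximum.
relative_humidity = 100 * actual_absolute_humidity / max_absolute_humidity
print(round(relative_humidity, 1))  # 33.3
```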