By Joe Turner – Updated Mar 24, 2022
A calorimeter is a laboratory device that measures the heat exchanged during a chemical reaction or the heat capacity of a substance. The term derives from the Latin word “calor,” meaning heat.
Water’s specific heat capacity is 4.18 J g⁻¹ °C⁻¹, among the highest of any common liquid. This means a large amount of energy is needed to change its temperature, so temperature shifts during a reaction are gradual and controlled, and can be tracked reliably with standard thermometers.
Because water remains liquid over a wide temperature range, the reaction mixture stays in a stable phase, reducing the risk of evaporation or boiling that could skew results. Its high heat capacity also damps sudden temperature fluctuations, supporting accurate energy calculations.
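The relationship behind these measurements is q = m · c · ΔT: the heat exchanged equals the water's mass times its specific heat capacity times the temperature change. A minimal sketch in Python, using the 4.18 J g⁻¹ °C⁻¹ figure above (the mass and temperature values are illustrative, not from any particular experiment):

```python
# Basic calorimetry relation: q = m * c * delta_T
# c is water's specific heat capacity, as given in the text.

SPECIFIC_HEAT_WATER = 4.18  # J per gram per °C

def heat_absorbed(mass_g: float, delta_t_c: float,
                  c: float = SPECIFIC_HEAT_WATER) -> float:
    """Heat in joules gained by a sample of given mass and temperature rise."""
    return mass_g * c * delta_t_c

# Example: 100 g of water warming by 2.5 °C absorbs
# q = 100 * 4.18 * 2.5 = 1045 J
print(heat_absorbed(100.0, 2.5))  # → 1045.0
```

In practice the heat released by the reaction is taken as the negative of the heat absorbed by the water, assuming negligible losses to the surroundings.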
Tap water is inexpensive, widely available, and requires no special handling or disposal procedures, unlike many organic solvents that can be hazardous or costly.
By definition, a calorie is the energy needed to raise the temperature of 1 gram of water by 1 °C. Because water is the reference medium for this unit, its properties make it the cornerstone of calorimetric measurements.
For more detailed information, see the National Institute of Standards and Technology (NIST) data tables or the LibreTexts chemistry section on heat and energy.