    How to Test a Resistance Temperature Detector (RTD) for Accurate Performance

    By Jill Kokemuller, Updated Mar 24, 2022


    Resistance Temperature Detectors (RTDs) rely on the predictable change in a metal's electrical resistance as temperature varies. Platinum, with its stable and nearly linear resistance-temperature relationship, is the industry's material of choice. As temperature rises, platinum's resistance increases, allowing RTDs to translate temperature into an electrical signal with high precision.
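
    To see how this relationship works in numbers, here is a minimal sketch (not part of the original procedure) of the linear RTD approximation in Python. The rtd_resistance name is made up for illustration, and the 100 Ω base resistance and 0.00385 coefficient assume a standard Pt100 element.

        # Linear approximation R(T) = R0 * (1 + alpha * T), where R0 is the
        # resistance at 0 °C and alpha is about 0.00385 Ω/Ω/°C for a standard Pt100.
        def rtd_resistance(temp_c: float, r0: float = 100.0, alpha: float = 0.00385) -> float:
            """Approximate RTD resistance in ohms at temp_c degrees Celsius."""
            return r0 * (1 + alpha * temp_c)

        # Expected readings at the three test points used in this article:
        for t in (0.0, 25.0, 100.0):
            print(f"{t:6.1f} C -> {rtd_resistance(t):7.2f} ohms")
        # prints roughly 100.00, 109.63, and 138.50 ohms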

    Step 1 – Verify at Room Temperature

    Set your multimeter to the resistance (Ω) setting and measure across the RTD terminals. A standard Pt100 RTD (100 Ω at 0 °C) should read about 110 Ω at 25 °C. Slight variations are normal depending on the sensor's tolerance class and the resistance of the lead wires.

    Step 2 – Test in Ice Water

    Submerge the RTD in ice water and wait a few minutes for thermal equilibrium. The resistance should drop to roughly 100 Ω. A lower reading confirms the sensor’s responsiveness to colder temperatures.

    Step 3 – Test in Boiling Water

    After allowing the RTD to return to room temperature, place it in boiling water. The resistance should rise well above the 25 °C value; a standard Pt100 reads about 138 to 139 Ω at 100 °C. A higher reading indicates proper functionality.
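
    If you want to record the three checks, a small script like the sketch below can flag readings that fall far from the expected values. This is only an illustration: the EXPECTED_OHMS table, the reading_ok helper, and the 2 Ω tolerance are assumptions for a Pt100 element, not part of the published procedure.

        # Expected Pt100 resistance (ohms) at each test bath temperature (deg C).
        EXPECTED_OHMS = {0.0: 100.0, 25.0: 109.6, 100.0: 138.5}

        def reading_ok(measured_ohms: float, bath_temp_c: float, tol_ohms: float = 2.0) -> bool:
            """True if the measured resistance is within tol_ohms of the expected value."""
            return abs(measured_ohms - EXPECTED_OHMS[bath_temp_c]) <= tol_ohms

        print(reading_ok(100.4, 0.0))    # ice water
        print(reading_ok(109.9, 25.0))   # room temperature
        print(reading_ok(138.1, 100.0))  # boiling water

    The tolerance is deliberately loose to allow for meter error, lead-wire resistance, and a bath that is not exactly at 0 °C or 100 °C.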

    Tools Needed

    • Multimeter with accurate Ω measurement
    • Ice water (0 °C)
    • Boiling water (100 °C)

    TL;DR (Quick Summary)

    Resistance ratio: R/R₀ = 1 + (α × ΔT), where R₀ is the resistance at 0 °C and ΔT is the temperature above 0 °C. For a platinum RTD, α ≈ 0.00385 Ω/Ω/°C.
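
    For example, heating the sensor from 0 °C to 100 °C gives ΔT = 100 °C, so R/R₀ = 1 + 0.00385 × 100 = 1.385; a 100 Ω (Pt100) element therefore reads about 138.5 Ω in boiling water, consistent with Step 3.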
