By John Brennan
Updated Mar 24, 2022
An oscilloscope records voltage changes over time and displays them graphically, making it an indispensable tool for troubleshooting circuits, conducting research, and teaching electronics. Before each measurement, always verify the instrument’s accuracy—factory settings can drift, and uncalibrated probes lead to misleading data. Calibration involves feeding a known signal into the oscilloscope and adjusting the device until the displayed waveform matches the reference.
Locate the small metal cylinder or knob labeled Probe Adjust on the oscilloscope front panel. This is the built‑in calibration signal source.
Connect the Probe Adjust output to Channel 1. Attach the probe's hook or alligator clip to the Probe Adjust terminal and plug the BNC end of the probe into the Channel 1 input.
Adjust the horizontal (time/div) and vertical (volts/div) scale knobs for Channel 1 until a clean square wave appears on the screen. The square wave should fill most of the display without clipping.
Fine‑tune the Focus knob until the waveform’s edges are crisp and not blurred.
Measure the peak‑to‑peak voltage: count the vertical divisions from the bottom of the waveform to its top, then multiply by your vertical scale setting. For instance, if the scale reads 1 V/div, each vertical box represents one volt; if it reads 1 mV/div, each box is one millivolt.
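The arithmetic behind this reading can be sketched in a few lines of Python. The function name here is illustrative only, not part of any instrument's API; the inputs are simply what you count and read off the front panel.

```python
def peak_to_peak_voltage(vertical_divisions, volts_per_div):
    """Peak-to-peak voltage = divisions spanned by the waveform x vertical scale."""
    return vertical_divisions * volts_per_div

# A square wave spanning 4 divisions at 1 V/div:
print(peak_to_peak_voltage(4, 1.0))    # 4.0 (volts)

# The same 4-division waveform read at 1 mV/div:
print(peak_to_peak_voltage(4, 0.001))  # 0.004 (volts, i.e. 4 mV)
```

The same waveform height therefore means very different voltages depending on the scale knob, which is why noting the volts/div setting before reading the screen matters.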
Turn the voltage calibration knob until the displayed peak‑to‑peak voltage matches the value printed beneath the Probe Adjust on the front panel.
Measure the period of the square wave—the time from the start of one peak to the start of the next. Count the horizontal divisions spanned by one full cycle and multiply by your horizontal scale (e.g., 1 s/div means each horizontal box equals one second).
Calculate the frequency by taking the reciprocal of the period (frequency = 1 / period). For a 0.5 s period, the frequency is 2 Hz.
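The period and frequency calculation can likewise be sketched in Python; again, the function names are made up for illustration, and the numbers reproduce the example above.

```python
def period_seconds(horizontal_divisions, seconds_per_div):
    """Period = divisions spanned by one full cycle x horizontal scale."""
    return horizontal_divisions * seconds_per_div

def frequency_hz(period):
    """Frequency is the reciprocal of the period."""
    return 1.0 / period

# One cycle spanning half a division at 1 s/div gives a 0.5 s period:
period = period_seconds(0.5, 1.0)
print(period)             # 0.5 (seconds)
print(frequency_hz(period))  # 2.0 (Hz)
```

A shorter period means a higher frequency, so compressing the waveform horizontally (more cycles per division) corresponds to a larger reciprocal.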
Adjust the frequency calibration knob until the oscilloscope’s displayed frequency aligns with the value listed under the Probe Adjust label.
Setting the correct scales to visualize the square wave can be challenging at first, but with repeated practice the process becomes intuitive.