How Accuracy Is Expressed in Digital Measuring Instruments
- Author: Daisuke Kobayashi
When describing the accuracy of the measuring instruments we build, we sometimes use notation like this:
2.0% rdg. ± 2 digit
I often forget what it means, so I am leaving a note here.
- rdg. -> short for "reading", the value measured (displayed) by the instrument
- digit -> one count of the instrument's resolution (the smallest unit it can display)
Suppose an instrument displays one digit after the decimal point, the measured value is 100.0, and the accuracy specification is 2.0% rdg. ± 2 digit.
In that case, the worst-case error is ±(100.0 × 0.02 + 0.1 × 2) = ±2.2.
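The calculation above can be sketched as a small helper. This is a minimal illustration, not a standard formula implementation; the function name `error_band` and its parameter names are my own choices.

```python
def error_band(reading, rdg_percent, digits, resolution):
    """Worst-case error for a spec of the form '<rdg_percent>% rdg. ± <digits> digit'.

    reading    -- the value displayed by the instrument
    resolution -- the smallest unit the instrument can display (one digit)
    """
    return reading * rdg_percent / 100 + digits * resolution

# The example from the text: 2.0% rdg. ± 2 digit, reading 100.0, resolution 0.1
err = error_band(100.0, rdg_percent=2.0, digits=2, resolution=0.1)
print(f"±{err:.1f}")  # ±2.2
```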
The key part of this notation is the digit term. Because it is tied to the instrument's resolution, it contributes a fixed absolute error, so it accounts for a larger share of the allowable range as the output gets smaller. The signal-to-noise ratio generally worsens as the output shrinks, so adding the digit term makes it easier for the manufacturer to guarantee performance in the low-output range.
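To see concretely how the digit term dominates at small readings, the relative error can be tabulated for the same hypothetical spec (2.0% rdg. ± 2 digit, resolution 0.1); the helper name `worst_case_error` is mine.

```python
def worst_case_error(reading, rdg_percent=2.0, digits=2, resolution=0.1):
    """Worst-case absolute error under a '% rdg. ± digit' accuracy spec."""
    return reading * rdg_percent / 100 + digits * resolution

# Relative error grows sharply as the reading approaches the resolution
for reading in (100.0, 10.0, 1.0):
    err = worst_case_error(reading)
    print(f"reading {reading:6.1f}: ±{err:.2f} ({err / reading:.1%} of reading)")
```

At a reading of 100.0 the digit term adds only 0.2 to a 2.0 reading-proportional error, but at a reading of 1.0 it is ten times larger than the proportional part, which is exactly why it eases the guarantee in the low-output range.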