The accuracy of a measurement instrument defines how much a measured value may deviate from the true value of the quantity being measured.
The total error can be broken down into component errors such as non-linearity, hysteresis, repeatability, temperature errors, stability, zero offset and span offset; one common way of combining these is shown below.
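As a rough illustration (conventions differ between manufacturers, and some quote a worst-case arithmetic sum instead), statistically independent component errors are often combined by root sum of squares:

```latex
E_{\text{total}} = \sqrt{E_{\text{NL}}^2 + E_{\text{H}}^2 + E_{\text{R}}^2 + E_{\text{T}}^2}
```

Here E_NL, E_H, E_R and E_T denote non-linearity, hysteresis, repeatability and temperature error respectively, each expressed in the same units, typically as a percentage of full scale.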
Accuracy is closely connected with precision, but unlike precision it is tied to a series of fixed reference points or absolute values against which all readings are compared to define the exactness of the instrument's measurement. Typically these reference values are read from a more accurate instrument, ideally with a 10 to 1 ratio difference in accuracy, so that the reference does not contribute significantly to the overall error calculation. In practice, however, such a high accuracy ratio is not always achievable. In such cases the overall accuracy should also include the reference instrument's uncertainty, which better reflects the true accuracy of the device, as shown in the sketch below.
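To make this concrete, here is a minimal Python sketch. The function names and the example figures are illustrative assumptions, not from any standard: it checks whether the 10 to 1 test accuracy ratio is met and, if not, combines the device error with the reference uncertainty by root sum of squares (assuming independent errors).

```python
import math

def combined_accuracy(device_error, reference_uncertainty):
    """Combine device error with reference uncertainty by root sum
    of squares (assumes independent errors - a common convention)."""
    return math.sqrt(device_error**2 + reference_uncertainty**2)

def meets_ratio(device_error, reference_uncertainty, ratio=10.0):
    """True if the device-to-reference accuracy ratio is at least
    `ratio` (e.g. 10:1), so the reference contribution is negligible."""
    return device_error >= ratio * reference_uncertainty

# Hypothetical example: a sensor specified to 0.1% of full scale,
# calibrated against a reference good to 0.025% of full scale.
device = 0.100      # % FS
reference = 0.025   # % FS

if meets_ratio(device, reference):
    print(f"10:1 ratio met; quote device accuracy as {device}% FS")
else:
    total = combined_accuracy(device, reference)
    print(f"Ratio not met (only {device / reference:.0f}:1); "
          f"combined accuracy is {total:.3f}% FS")
```

With these figures the ratio is only 4:1, so the combined accuracy of about 0.103% FS, rather than the bare 0.1% FS specification, better represents the calibrated device.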
Glossary of Accuracy technical terms
- BSL – Best Straight Line
- Compensated Temperature Range
- Digital Compensation
- g Effect
- Hysteresis
- LHR – Linearity, Hysteresis and Repeatability
- Long Term Stability/Drift
- NL – Non-Linearity
- PPM – Parts Per Million
- Precision
- Pressure Hysteresis
- Repeatability
- RTE – Referred Temperature Error
- Secondary Pressure Standard
- TEB – Temperature Error Band
- TEB – Total Error Band
- Temperature Compensation
- Temperature Error
- Thermal Hysteresis
- Threshold
- TSL – Terminal Straight Line
- TSS – Thermal Span or Sensitivity Shift
- TZS – Thermal Zero Shift
Help from Accuracy resources
- Pressure Sensor Accuracy Specifications
- Measurement Accuracy
- Determining calibration error of Bourdon tube pressure gauge
- What is the difference between zero offset and zero drift?