Precision is a measure of how closely a set of readings lies to a reference line drawn through the middle of all the points.
Precision differs from accuracy: an instrument that measures precisely may still not measure accurately, because of a zero offset or a full scale shift.
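To illustrate the distinction, the minimal sketch below (hypothetical readings, test points expressed as a percentage of full scale) fits a best straight line through a set of readings that carry a small offset: the deviation from the fitted line (precision) is small, while the deviation from the true applied values (accuracy) is larger.

```python
import numpy as np

# Hypothetical test points and readings, both in % of full scale
applied = np.array([0.0, 25.0, 50.0, 75.0, 100.0])   # reference pressure applied
reading = np.array([0.5, 25.4, 50.6, 75.5, 100.4])   # sensor output (small offset, small scatter)

# Best straight line through the readings (least-squares fit)
slope, intercept = np.polyfit(applied, reading, 1)
fit = slope * applied + intercept

precision_pct_fs = np.max(np.abs(reading - fit))      # scatter about the fitted line
accuracy_pct_fs = np.max(np.abs(reading - applied))   # deviation from the true values

print(f"Precision: ±{precision_pct_fs:.2f} %FS")      # ~±0.12 %FS
print(f"Accuracy:  ±{accuracy_pct_fs:.2f} %FS")       # ~±0.60 %FS
```

The readings are precise (they sit close to their own straight line) but less accurate (the whole line is shifted away from the applied values).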
A sensor's measurement error is normally specified as precision, because tight electrical tolerances are difficult to achieve with an analog output signal. Once the zero and full scale points have been ascertained, and the device has been scaled via a digital panel meter or other analog-to-digital converter instrument, the error can then be expressed as accuracy rather than precision.
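As a sketch of that scaling step, assume a 4-20mA output sensor whose zero and full scale outputs have been measured against a reference (the current values, pressures and range below are hypothetical, not from any particular product):

```python
# Measured output at the zero and full scale reference points (assumed values)
ZERO_MA = 4.02          # output at 0 psig
SPAN_MA = 15.96         # output span between 0 and full scale
FULL_SCALE_PSI = 1500.0

def to_pressure(current_ma: float) -> float:
    """Scale a 4-20mA reading to pressure using the measured zero and span."""
    return (current_ma - ZERO_MA) / SPAN_MA * FULL_SCALE_PSI

applied_psi = 750.0     # reference pressure applied during the check (hypothetical)
measured_ma = 12.03     # indicated output at that pressure (hypothetical)

indicated_psi = to_pressure(measured_ma)
accuracy_pct_fs = abs(indicated_psi - applied_psi) / FULL_SCALE_PSI * 100
print(f"Indicated: {indicated_psi:.1f} psi, error: ±{accuracy_pct_fs:.2f} %FS")
```

With the zero and span corrections applied, the remaining deviation from the reference value can be quoted as accuracy.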
Featured high accuracy measurement products
High Accuracy, Precision & Resolution Pressure Gauges - Explore high accuracy pressure gauges with superior precision & digital resolution, ideal as secondary standards for calibration labs & ISO 9000 procedures.
Material tension and compression testing machine pressure sensor with 1,500 psig range and 4-20mA output - Ensure accurate force control in your material testing machine with this reliable and robust 1,500 psi pressure sensor.
Glossary of Accuracy technical terms
- Accuracy
- BSL – Best Straight Line
- Compensated Temperature Range
- Digital Compensation
- g Effect
- Hysteresis
- LHR – Linearity, Hysteresis and Repeatability
- Long Term Stability/Drift
- NL – Non-Linearity
- PPM – Parts Per Million
- Pressure Hysteresis
- Repeatability
- RTE – Referred Temperature Error
- Secondary Pressure Standard
- TEB – Temperature Error Band
- TEB – Total Error Band
- Temperature Compensation
- Temperature Error
- Thermal Hysteresis
- Threshold
- TSL – Terminal Straight Line
- TSS – Thermal Span or Sensitivity Shift
- TZS – Thermal Zero Shift
Help from Accuracy resources
- Pressure Sensor Accuracy Specifications
- Measurement Accuracy
- Determining calibration error of Bourdon tube pressure gauge
- What is the difference between zero offset and zero drift?


