Calibration Tolerance Assignment – Balancing Manufacturer and Process Requirements

Keep simple simple!

A simple gauge calibration shouldn't take much time to document.

Calibration tolerance assignment is one of those areas in a calibration program where you can easily get caught up in the pursuit of just the right value. The primary purpose of a calibration tolerance is to serve as a trigger that notifies you when the measurement accuracy of the instrument may be insufficient for its intended use. For the sake of this article, let's assume that there are no other factors contributing to measurement error (such as significant figures, I/O conversions, vibration, etc.).

Assigning a calibration tolerance allows you to be notified (when the unit under test fails to meet tolerance) that the instrument is operating outside of its normal performance range, in a way that may begin to significantly affect the usability of its readings. As such, before assigning a tolerance you need to know at least two things: 1) how well the instrument can be expected to perform, and 2) how much deviation from the true value you can tolerate before it negatively affects your operations.

Typically, you can determine the expected performance of the instrument from the manufacturer specifications. Most manufacturers publish inaccuracy specifications for multiple factors, and you can combine these over the operating range of interest. Within the range of use this is usually not difficult to determine, and often one component of the inaccuracy dominates, rendering the others insignificant by comparison.
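As a concrete illustration, here is a minimal sketch of combining spec components, assuming a root-sum-square combination (a simple worst-case sum is also common). The spec structure and the numeric values are hypothetical examples, not from any particular datasheet:

```python
import math

def combined_spec(reading, full_scale,
                  pct_reading=0.0, pct_full_scale=0.0, fixed=0.0):
    """Combine manufacturer inaccuracy components at a given reading.

    Components (all hypothetical example values) are combined
    root-sum-square and expressed in the reading's units.
    """
    parts = [
        pct_reading / 100.0 * reading,        # % of reading term
        pct_full_scale / 100.0 * full_scale,  # % of full scale term
        fixed,                                # fixed offset term
    ]
    return math.sqrt(sum(p * p for p in parts))

# Example: 0.1% of reading, 0.05% of full scale (100 °C span),
# and a 0.02 °C fixed term, evaluated at a 15 °C reading.
print(round(combined_spec(15.0, 100.0, 0.1, 0.05, 0.02), 4))  # ~0.0559 °C
```

In a case like this, the percent-of-full-scale term dominates near the bottom of the span, which is the kind of single-component dominance described above.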

The inaccuracy that you can actually tolerate may be another matter entirely; it is not related to the selected instrument, but to the process in which you use it. Typically, a process has a defined acceptable range, and operating outside of that range may affect the final product. Within the acceptable process range there is a narrower operating range: the range within which the readings should fall during normal operation. If you assume that the process indicators will display within the operating range, then what becomes critical is whether your instrument is off by enough to erroneously indicate that the process is inside the operating range when it is actually outside the acceptable range. Thus the calibration tolerance should never be set to a value greater than the difference between the operating range limit and the corresponding acceptable process range limit.

It is further desirable to have some margin for error; in a typical case you would set the calibration tolerance at about 1/4 of the difference between the operating range and the acceptable process range. So if your operating range is 10-15 °C and the acceptable range is 5-20 °C, the margin is 5 °C on each side, and the temperature indicator should have a tolerance of 1.25 °C, which gives you plenty of room to manage an out-of-tolerance condition. It is also likely to be much wider than the manufacturer's instrument tolerance.
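For illustration, here is a minimal sketch of that arithmetic; the function name and parameters are hypothetical, and the 1/4 fraction is this article's rule of thumb rather than a universal requirement:

```python
def recommended_tolerance(op_low, op_high, acc_low, acc_high, fraction=0.25):
    """Suggest a calibration tolerance from the operating and acceptable ranges.

    The tolerance should never exceed the smaller margin between the
    operating range and the acceptable process range; the rule of
    thumb here takes about 1/4 of that margin.
    """
    low_margin = op_low - acc_low     # room below the operating range
    high_margin = acc_high - op_high  # room above the operating range
    max_tolerance = min(low_margin, high_margin)
    return fraction * max_tolerance

# Article example: operating range 10-15 °C, acceptable range 5-20 °C.
print(recommended_tolerance(10, 15, 5, 20))  # -> 1.25
```

Taking the smaller of the two margins keeps the recommendation conservative when the operating range is not centered within the acceptable range.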

There are many other factors to consider, but this basic method can help keep you out of the weeds of the manufacturer specification and keep you from over-complicating your tolerance assignment method. The important thing is that you can justify your tolerance and that you have an established procedure for how tolerances and intervals are assigned before you enter them into your validated calibration management system. The CMMS is only as good as the data you put into it.