4.1 Principles of calibration
Calibration consists of comparing the output of the instrument or sensor
under test against the output of an instrument of known accuracy when the same
input (the measured quantity) is applied to both instruments. This procedure is
carried out for a range of inputs covering the whole measurement range of the
instrument or sensor. Calibration ensures that the measuring accuracy of all
instruments and sensors used in a measurement system is known over the whole
measurement range, provided that the calibrated instruments and sensors are
used in environmental conditions that are the same as those under which they
were calibrated. For use of instruments and sensors under different
environmental conditions, appropriate correction has to be made for the ensuing
modifying inputs, as described in Chapter 3. Whether applied to instruments or
sensors, calibration procedures are identical, and hence only the term
instrument will be used for the rest of this chapter, with the understanding
that whatever is said for instruments applies equally well to single
measurement sensors.
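The comparison procedure described above can be sketched in a few lines of code. The instrument names and readings below are purely illustrative assumptions, not data from any real calibration:

```python
# Hypothetical calibration check: compare a test instrument's readings
# against those of a reference instrument of known accuracy when the
# same inputs, covering the whole measurement range, are applied to both.

def calibration_errors(test_readings, reference_readings):
    """Return the error of the instrument under test at each calibration point."""
    return [t - r for t, r in zip(test_readings, reference_readings)]

# Readings of the standard (reference) and of the instrument under test
# at the same applied inputs across the full range (illustrative values):
reference = [0.0, 25.0, 50.0, 75.0, 100.0]
under_test = [0.2, 25.3, 50.1, 75.4, 100.6]

errors = calibration_errors(under_test, reference)
worst_case = max(abs(e) for e in errors)
print(f"errors at each point: {errors}")
print(f"worst-case error over the range: {worst_case}")
```

The worst-case error over the range, rather than the error at any single point, is what determines whether the instrument is within its specified accuracy.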
Instruments used as a standard in calibration procedures are usually
chosen to be of greater inherent accuracy than the process instruments that
they are used to calibrate. Because such instruments are only used for
calibration purposes, greater accuracy can often be achieved by specifying a
type of instrument that would be unsuitable for normal process measurements.
For instance, ruggedness is not a requirement, and freedom from this constraint
opens up a much wider range of possible instruments. In practice, high-accuracy,
null-type instruments are very commonly used for calibration duties, because
the need for a human operator is not a problem in these circumstances.
Instrument calibration has to be repeated at prescribed intervals because
the characteristics of any
instrument change over a period. Changes in instrument characteristics are
brought about by such factors as mechanical wear, and the effects of dirt,
dust, fumes, chemicals and temperature changes in the operating environment. To
a great extent, the magnitude of the drift in characteristics depends on the
amount of use an instrument receives and hence on the amount of wear and the
length of time that it is subjected to the operating environment. However, some
drift also occurs even in storage, as a result of ageing effects in components
within the instrument.
Determination of the frequency at which instruments should be calibrated
is dependent upon several
factors that require specialist knowledge. If an instrument is required to
measure some quantity and an inaccuracy of ±2% is acceptable, then a certain
amount of performance degradation can be allowed if its inaccuracy immediately
after recalibration is ±1%. What is important is that the pattern of
performance degradation be
quantified, such that the instrument can be recalibrated before its accuracy
has reduced to the limit defined by the application.
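Once the degradation pattern has been quantified, the arithmetic for scheduling recalibration is straightforward. The sketch below assumes a simple linear drift; the drift rate used is an illustrative figure, not one from the text:

```python
# Scheduling recalibration from an empirically determined drift rate.
# Assumes linear drift; the 0.25%/month rate below is a hypothetical
# value of the kind that practical experimentation would establish.

def recalibration_interval(limit_pct, post_cal_pct, drift_pct_per_month):
    """Months until inaccuracy grows from its post-calibration value
    to the maximum acceptable limit for the application."""
    return (limit_pct - post_cal_pct) / drift_pct_per_month

# ±2% acceptable for the application, ±1% immediately after
# recalibration, drift measured at roughly 0.25% per month:
months = recalibration_interval(limit_pct=2.0, post_cal_pct=1.0,
                                drift_pct_per_month=0.25)
print(f"recalibrate within {months:.0f} months")  # 4 months
```

The instrument must then be recalibrated at or before this interval so that its inaccuracy never exceeds the ±2% limit quoted in its specifications.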
Susceptibility to the various factors that can cause changes in
instrument characteristics varies
according to the type of instrument involved. Possession of an in-depth
knowledge of the mechanical construction and other features involved in the
instrument is necessary
in order to be able to quantify the effect of these quantities on the accuracy
and other characteristics of an instrument. The type of instrument, its
frequency of use and the prevailing environmental conditions all strongly
influence the calibration frequency necessary, and because so many factors are
involved, it is difficult or even impossible to determine the required
frequency of instrument recalibration from theoretical considerations. Instead,
practical experimentation has to be applied to determine the rate of such
changes. Once the maximum permissible measurement error has been defined,
knowledge of the rate at which the characteristics of an instrument change
allows a time interval to be calculated that represents the moment in time when
an instrument will have reached the bounds of its acceptable performance level.
The instrument must be recalibrated either at this time or earlier. This
measurement error level that an instrument reaches just before recalibration is
the error bound that must be quoted in the documented specifications for the
instrument.
A proper course of action must be
defined that describes the procedures to be followed when an instrument is
found to be out of calibration, i.e. when its output is different to that of
the calibration instrument when the same input is applied. The required action
depends very much upon the nature of the discrepancy and the type of instrument
involved. In many cases, deviations in the form of a simple output bias can be
corrected by a small adjustment to the instrument (following which the
adjustment screws must be sealed to prevent tampering). In other cases, the
output scale of the instrument may have to be redrawn, or scaling factors
altered where the instrument output is part of some automatic control or
inspection system. In extreme cases, where the calibration procedure shows up
signs of instrument damage, it may be necessary to send the instrument for
repair or even scrap it.
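Where the instrument output feeds an automatic control or inspection system, the scaling-factor alteration mentioned above can be applied in software. The zero and span figures below are hypothetical examples of what a calibration might reveal:

```python
# Illustrative software correction of a calibration discrepancy, for the
# case where the instrument output is part of an automatic system.
# A simple zero (bias) and span (gain) adjustment model is assumed.

def corrected_reading(raw, zero_offset=0.0, span_factor=1.0):
    """Apply zero and span corrections determined during calibration."""
    return (raw - zero_offset) * span_factor

# Suppose calibration shows the instrument reads 0.5 units high at zero
# input and its span is 2% low (hypothetical figures):
print(corrected_reading(50.5, zero_offset=0.5, span_factor=1.02))
```

A pure bias, with no span error, corresponds to `span_factor=1.0` and is equivalent to the small mechanical adjustment made on the instrument itself.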
Whatever system and frequency of
calibration is established, it is important to review this from time to time to
ensure that the system remains effective and efficient. It may happen that a
cheaper (but equally effective) method of calibration becomes available with
the passage of time, and such an alternative system must clearly be adopted in
the interests of cost efficiency. However, the main item under scrutiny in this
review is normally whether the calibration interval is still appropriate.
Records of the calibration history of the instrument will be the primary basis
on which this review is made. It may happen that an instrument starts to go out
of calibration more quickly after a period of time, either because of ageing
factors within the instrument or because of changes in the operating
environment. The conditions or mode of usage of the instrument may also be
subject to change. As the environmental and usage conditions of an instrument
may change beneficially as well as adversely, there is the possibility that the
recommended calibration interval may decrease as well as increase.