Calibration is the process of standardizing an instrument by determining its deviation from a desired standard. It is through the calibration process that one obtains the proper correction factors for the transducer's deviation. Calibration is essentially the comparison of a transducer's output to a reference standard.
Every transducer is shipped with a sensitivity value stated on its calibration certificate so that the electronic equipment associated with the transducer can be set up correctly.
Note: Honeywell expresses the sensitivity of the transducer by stating a calibration factor rather than a sensitivity. The calibration factor for a transducer is the transducer's output value at full scale after the output has been normalized (i.e., zeroed). The line drawn through normalized zero and the transducer's calibration factor is the best-fit straight line of the transducer output. Thus, the transducer's calibration factor in effect establishes the transducer's sensitivity.
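The relationship above can be sketched in a few lines of code: zero the raw output, then scale it by the slope of the line through normalized zero and the calibration factor. The numeric values below are illustrative assumptions, not figures from any actual calibration certificate.

```python
# Sketch: converting a raw transducer reading to engineering units
# using a full-scale calibration factor. All values are assumed
# example numbers, not taken from a real datasheet or certificate.

FULL_SCALE_LOAD = 1000.0   # rated capacity in engineering units (assumed)
CAL_FACTOR_MV_V = 2.004    # normalized output at full scale, mV/V (assumed)
ZERO_OFFSET_MV_V = 0.012   # raw output at zero load, mV/V (assumed)

# Sensitivity is the slope of the best-fit line through normalized
# zero and the calibration factor: engineering units per mV/V.
sensitivity = FULL_SCALE_LOAD / CAL_FACTOR_MV_V

def to_engineering_units(raw_mv_v: float) -> float:
    """Normalize (zero) the raw output, then scale by the sensitivity."""
    return (raw_mv_v - ZERO_OFFSET_MV_V) * sensitivity

# A raw reading of 1.014 mV/V normalizes to 1.002 mV/V, which is
# half of the 2.004 mV/V calibration factor, i.e. half of full scale.
print(round(to_engineering_units(1.014), 1))  # → 500.0
```

The key point the sketch illustrates is that the calibration factor alone fixes the slope: once the zero offset is removed, a single full-scale output value defines the entire best-fit line.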