Standard
All instruments are calibrated during manufacture against a measurement standard. A standard of measurement is a physical representation of a unit of measurement, i.e. a known, accurate measure of a physical quantity. Other physical quantities are compared with standards to obtain their values.
A unit is realised by reference to an arbitrary material standard or to a natural phenomenon, including physical and atomic constants. For example, the fundamental unit of mass, the kilogramme, was originally defined as the mass of a cubic decimetre of water at its temperature of maximum density, 4 °C. This unit is represented by a material standard, the international prototype kilogramme: a platinum-iridium alloy cylinder preserved at the International Bureau of Weights and Measures at Sèvres, near Paris. The unit of length, the metre, is represented by the distance between two fine lines engraved on gold plugs near the ends of a platinum-iridium alloy bar at 0 °C, mechanically supported in a prescribed manner. Similarly, standards have been developed for all the units, fundamental as well as derived, and these standards are preserved at the International Bureau of Weights and Measures at Sèvres, near Paris.
The different types of standards of measurement are classified as
1. International Standards
2. Primary Standards
3. Secondary Standards
4. Working Standards
1. International Standards
International standards are defined by international agreement. These standards are maintained at the International Bureau of Weights and Measures and are periodically evaluated and checked by absolute measurements in terms of the fundamental units of physics. International standards are not available to ordinary users for calibration purposes. To improve the accuracy of absolute measurements, the international units were replaced by absolute units in 1948; absolute units are more accurate than international units.
2. Primary Standards
These are highly accurate absolute standards which can be used as ultimate reference standards. Primary standards are maintained at national standards laboratories in different countries. These standards, representing the fundamental units as well as some electrical and mechanical derived units, are calibrated independently by absolute measurements at each of the national laboratories. They are not available for use outside the national laboratories. The main function of the primary standards is the calibration and verification of secondary standards.
3. Secondary Standards
As mentioned above, the primary standards are not available for use outside the national laboratories, yet the various industries need reference standards. So, to protect the highly accurate primary standards, secondary standards are maintained; these are designed and constructed from the absolute standards. They are used by calibration laboratories in industries and are maintained by the particular measurement industry to which they belong. Each industry has its own standards. For example, the National Bureau of Standards has set up national secondary standards in the United States of America. The particular industry maintaining the secondary standards is responsible for their calibration. These standards are periodically sent to the national standards laboratories for calibration, and the national laboratories send them back to the industries with a certification of comparison against the primary standards.
The certification indicates the measuring accuracy of secondary standards in terms of the primary standard.
4. Working Standards
These are the basic tools of a measurement laboratory and are used to check and calibrate the instruments used in the laboratory for accuracy and performance. For example, a resistor manufacturer maintains a standard resistor in the laboratory for checking the values of the manufactured resistors. The manufacturer verifies that the values of the manufactured resistors are well within the specified accuracy limits. The working standards are somewhat less accurate than the primary standards.
Thus, the working standards are used to check and calibrate general laboratory instruments for accuracy and performance.
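As a minimal sketch of the resistor example above, the tolerance check can be written as follows. The standard value, the tolerance figure, and the measured readings are all illustrative assumptions, not values from the text.

```python
# Sketch: checking manufactured resistors against a working-standard resistor.
# The standard value, tolerance, and readings below are assumed for illustration.

standard_ohms = 100.00    # value of the working-standard resistor
tolerance = 0.01          # specified accuracy limit: +/- 1 %

measured = [99.4, 100.6, 101.2, 99.9]   # readings of manufactured resistors

for r in measured:
    # A resistor passes if its relative deviation from the standard
    # is within the specified accuracy limit.
    ok = abs(r - standard_ohms) / standard_ohms <= tolerance
    print(f"{r:7.1f} ohm -> {'within' if ok else 'out of'} tolerance")
```

With these assumed numbers, the 101.2 ohm resistor deviates by 1.2 % and would be flagged as out of tolerance.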
Calibration
Calibration is the procedure of determining the correct values of a measurement by comparison with a standard. The device with which the comparison is made is called the standard instrument. The instrument of unknown accuracy, which is to be calibrated, is called the test instrument. Thus, in calibration, the test instrument is compared with the standard instrument.
Calibration Methodology
There are two fundamental methodologies for comparing a test instrument with a standard instrument. These methodologies are:
1. Direct comparisons
2. Indirect comparisons
Direct Comparison Calibration Methodology
In a direct comparison, a source or generator applies a known input to the meter under test. Comparing what the meter indicates with the known generator value gives the meter's error. In this case the meter is the test instrument while the generator is the standard instrument. The deviation of the meter from the standard value is compared with the allowable performance limit; if the deviation exceeds the allowance, the meter is considered to be out of tolerance.
Fig.: Meter Calibration
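The direct-comparison check just described can be sketched as below. The function names, readings, and tolerance value are assumptions for illustration only.

```python
# Minimal sketch of a direct-comparison meter calibration check.
# All names and values here are illustrative assumptions.

def meter_error(indicated, applied):
    """Deviation of the test meter's indication from the known standard value."""
    return indicated - applied

def is_within_tolerance(indicated, applied, tolerance):
    """True if the meter's deviation is within the allowable performance limit."""
    return abs(meter_error(indicated, applied)) <= tolerance

# Example: a standard generator applies 10.000 V; the test meter reads 10.035 V.
applied_v = 10.000     # known value from the standard generator
indicated_v = 10.035   # reading of the meter under test
allowed_v = 0.050      # allowable performance limit (assumed)

print(meter_error(indicated_v, applied_v))                     # approx. 0.035
print(is_within_tolerance(indicated_v, applied_v, allowed_v))  # True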
With the help of direct comparison, a generator or source also can be calibrated. In such calibration, the meter acts as a standard instrument while the generator acts as a test instrument.
A transducer converts a signal from one form to another. Hence, if a transducer is to be calibrated using direct comparison, both the generator and the meter are standard instruments, while the transducer acts as the test instrument. The transducer characteristics are then expressed as the ratio of the device's output to its input, in the appropriate input and output measurement units.
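As a small sketch of this output-to-input ratio, consider a hypothetical pressure transducer; the units and readings below are assumed for illustration.

```python
# Sketch: transducer characteristic as an output/input ratio (direct comparison).
# The transducer type, units, and values are illustrative assumptions.

def transducer_characteristic(output_reading, known_input):
    """Ratio of the measured output to the known applied input,
    e.g. mV of output per kPa of applied pressure."""
    return output_reading / known_input

# A standard generator applies 200.0 kPa; a standard meter reads 41.2 mV.
print(transducer_characteristic(41.2, 200.0))  # 0.206 mV/kPa
```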
Indirect Comparison Calibration Methodology
In the indirect comparison, the test instrument is compared with the response of a standard instrument of the same type i.e. if the test instrument is a meter, the standard instrument is also a meter, if the test instrument is a generator, the standard instrument is also a generator and so on.
If the test instrument is a meter, then the same input is applied to the test meter as well as to a standard meter. Thus the indication of the test meter is compared with the indication of the standard meter for the same stimulus or input. Care must be taken that, during the comparison process, the source supplying the input to both has the required level of stability; the exact magnitude of the input is not important.
Fig.: Meter Calibration
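A minimal sketch of this indirect meter comparison follows; the readings are assumptions, and the point is that only the difference between the two indications matters, not the absolute input level.

```python
# Sketch of an indirect meter comparison: the same (stable) stimulus drives
# both meters at once. Readings below are illustrative assumptions.

test_readings = [5.012, 5.013, 5.011]       # test meter, same stimulus
standard_readings = [5.002, 5.002, 5.003]   # standard meter, same stimulus

# Average deviation of the test meter relative to the standard meter.
pairs = zip(test_readings, standard_readings)
deviation = sum(t - s for t, s in pairs) / len(test_readings)
print(f"test meter reads {deviation:+.3f} units relative to the standard")
```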
In the case of generator calibration, the outputs of both the generators, test as well as standard, are set to the same nominal level. Then a transfer meter is used, which measures the outputs of both the standard and the test generator. From the linearity and the resolution of the transfer meter, the generator is calibrated.
Fig.: Generator Calibration
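The transfer-meter comparison can be sketched as follows; the nominal level and the two transfer-meter readings are assumed values for illustration.

```python
# Sketch of indirect generator calibration with a transfer meter.
# Both generators are set to the same nominal level; the transfer meter
# measures each output in turn. Values are illustrative assumptions.

nominal = 1.000                  # common nominal setting, e.g. volts
transfer_on_standard = 0.9980    # transfer-meter reading of the standard generator
transfer_on_test = 1.0065        # transfer-meter reading of the test generator

# The transfer meter need not be absolutely accurate: its linearity and
# resolution let the two outputs be compared on the same scale.
offset = transfer_on_test - transfer_on_standard
print(f"test generator is {offset:+.4f} V relative to the standard "
      f"at the {nominal:.3f} V nominal setting")
```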
The transducer calibration using the indirect method is similar to the generator calibration. The same input is given to the test transducer and the standard transducer, and their outputs are measured using a transfer meter. The sensitivity of the test transducer is obtained by multiplying the determined ratio of the two outputs by the known sensitivity of the standard.

Fig.: Transducer Calibration
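This sensitivity calculation is sketched below; the output readings and the standard's sensitivity figure are assumptions for illustration.

```python
# Sketch: indirect transducer calibration. The same input excites both the
# test and the standard transducer; a transfer meter measures both outputs.
# All numbers below are illustrative assumptions.

output_test = 8.40            # transfer-meter reading for the test transducer
output_standard = 8.00        # transfer-meter reading for the standard transducer
sensitivity_standard = 2.50   # known sensitivity of the standard, e.g. mV/g

# Sensitivity of the test transducer: ratio of the two outputs
# multiplied by the known sensitivity of the standard.
sensitivity_test = (output_test / output_standard) * sensitivity_standard
print(sensitivity_test)       # 2.625 mV/g
```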