
Standards and Calibration

Standard

All instruments are calibrated during manufacture against a measurement standard. A standard of measurement is a physical representation of a unit of measurement: a known, accurate measure of a physical quantity. Other physical quantities are compared with the standards to obtain their values.

A unit is realised by reference to an arbitrary material standard or to a natural phenomenon, including physical and atomic constants. For example, the fundamental unit of mass, the kilogramme, was originally defined as the mass of a cubic decimetre of water at its temperature of maximum density, 4 °C. This unit is represented by a material standard: the mass of the international prototype kilogramme, a platinum-iridium alloy cylinder preserved at the International Bureau of Weights and Measures at Sèvres, near Paris. The unit of length, the metre, is represented by the distance between two fine lines engraved on gold plugs near the ends of a platinum-iridium alloy bar at 0 °C, mechanically supported in a prescribed manner. Similarly, standards have been developed for all the units, fundamental as well as derived. All these standards are preserved at the International Bureau of Weights and Measures at Sèvres, near Paris.

The different types of standards of measurement are classified as:
1. International Standards
2. Primary Standards
3. Secondary Standards
4. Working Standards

1. International Standards

International standards are defined by international agreement. These standards are maintained at the International Bureau of Weights and Measures and are periodically evaluated and checked by absolute measurements in terms of the fundamental units of physics. These international standards are not available to ordinary users for calibration purposes. To improve the accuracy of absolute measurements, the international units were replaced by absolute units in 1948; absolute units are more accurate than the international units.

2. Primary Standards

These are highly accurate absolute standards, which can be used as ultimate reference standards. These primary standards are maintained at National Standard Laboratories in different countries. These standards, representing fundamental units as well as some electrical and mechanical derived units, are calibrated independently by absolute measurements at each of the national laboratories. They are not available for use outside the national laboratories. The main function of the primary standards is the calibration and verification of secondary standards.

3. Secondary Standards

As mentioned above, the primary standards are not available for use outside the national laboratories, yet the various industries need reference standards. So, to protect the highly accurate primary standards, secondary standards are maintained. These are designed and constructed from the absolute standards, are used by calibration laboratories in industries, and are maintained by the particular measurement industry to which they belong. Each industry has its own standards; for example, the National Bureau of Standards has set up national secondary standards in the United States of America. The particular industry maintaining the secondary standards is responsible for their calibration. These standards are periodically sent to the national standard laboratories, where they are compared with the primary standards, and the national laboratories send them back to the industries with a certification.

The certification indicates the measuring accuracy of secondary standards in terms of the primary standard.

4. Working Standards

These are the basic tools of a measurement laboratory and are used to check and calibrate the instruments used in the laboratory for accuracy and performance. For example, the resistor manufacturing industry maintains a standard resistor in its laboratory for checking the values of the manufactured resistors; the manufacturer verifies that the manufactured values are well within the specified accuracy limits. Thus, the working standards, though somewhat less accurate than the primary standards, are used to check and calibrate general laboratory instruments for accuracy and performance.
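
As a rough illustration, the following minimal Python sketch shows how such a working-standard check might look in software; the standard value, tolerance limit, and sample readings are hypothetical, not data from any real laboratory.

# Sketch: checking manufactured resistors against a working standard.
# All numeric values are hypothetical illustrations.
STANDARD_OHMS = 100.0   # value of the laboratory's working standard
TOLERANCE_PCT = 0.5     # specified accuracy limit for the product

measured_ohms = [99.7, 100.2, 100.6, 99.9]  # sample readings

for r in measured_ohms:
    error_pct = (r - STANDARD_OHMS) / STANDARD_OHMS * 100
    status = "PASS" if abs(error_pct) <= TOLERANCE_PCT else "FAIL"
    print(f"{r:7.1f} ohm  error {error_pct:+.2f}%  {status}")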

Calibration

Calibration is the procedure for determining the correct values of measurement by comparison with standard ones. The standard device with which the comparison is made is called the standard instrument. The instrument of unknown accuracy, which is to be calibrated, is called the test instrument. Thus, in calibration, the test instrument is compared with the standard instrument.

Calibration Methodology

There are two fundamental methodologies for obtaining the comparison between test instruments and standard instruments. These methodologies are:
1. Direct comparisons
2. Indirect comparisons


Direct Comparison Calibration Methodology

In a direct comparison, a source or generator applies a known input to the meter under test. The ratio of the meter's indication to the known generator value gives the meter's error. In such a case, the meter is the test instrument while the generator is the standard instrument. The deviation of the meter from the standard value is compared with the allowable performance limit. If the meter's deviation exceeds the allowance, the meter is considered to be out of tolerance.




Fig.: Meter Calibration
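
To make the direct-comparison procedure concrete, here is a minimal Python sketch; the applied input, the meter reading, and the allowed error limit are hypothetical values chosen only for illustration.

# Sketch of direct-comparison meter calibration: a standard generator
# applies a known input and the meter's indication is compared with it.
known_input = 10.000        # value applied by the standard generator (V)
meter_reading = 10.035      # indication of the meter under test (V)
allowed_error_pct = 0.25    # hypothetical performance limit

error_pct = (meter_reading - known_input) / known_input * 100
if abs(error_pct) > allowed_error_pct:
    print(f"Out of tolerance: error {error_pct:+.3f}%")
else:
    print(f"Within tolerance: error {error_pct:+.3f}%")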


With the help of direct comparison, a generator or source can also be calibrated. In such a calibration, the meter acts as the standard instrument while the generator acts as the test instrument.


Fig.: Generator Calibration



The transducer converts a signal from one form to another. Hence, if a transducer is to be calibrated using direct comparison, both the generator and the meter are standard instruments, while the transducer acts as the test instrument. The transducer characteristics are then expressed as the ratio of the device's output to its input, in the appropriate input and output measurement units.

Fig.: Transducer Calibration
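
The ratio calculation can be sketched as follows, assuming a hypothetical pressure transducer with its input in kPa and its output in mV.

# Sketch of direct-comparison transducer calibration: a standard source
# supplies a known input and a standard meter reads the output.
applied_input = 50.0      # known input from the standard source (kPa)
measured_output = 250.4   # output read by the standard meter (mV)

sensitivity = measured_output / applied_input
print(f"Transducer characteristic: {sensitivity:.3f} mV/kPa")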


Indirect Comparison Calibration Methodology

In the indirect comparison, the test instrument is compared with the response of a standard instrument of the same type: if the test instrument is a meter, the standard instrument is also a meter; if the test instrument is a generator, the standard instrument is also a generator; and so on.

If the test instrument is a meter, the same input is applied to the test meter as well as the standard meter. Thus, the indication of the test meter is compared with the indication of the standard meter for the same stimulus or input. Care must be taken that, during the comparison process, the source supplying the input to both has the required level of stability; the exact magnitude of the input is not important.
                                  
Fig.: Meter Calibration
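
A minimal sketch of this comparison, with hypothetical readings, might look as follows.

# Sketch of indirect meter calibration: one stable source drives both
# meters at once and the two indications are compared.
standard_reading = 5.002   # indication of the standard meter (V)
test_reading = 5.031       # indication of the test meter (V)

error_pct = (test_reading - standard_reading) / standard_reading * 100
print(f"Test-meter error relative to the standard: {error_pct:+.3f}%")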


In the case of generator calibration, the outputs of both the generators, test as well as standard, are set to the same nominal level. A transfer meter is then used to measure the outputs of both the standard and the test generators. From the linearity and the resolution of the transfer meter, the generator is calibrated.
                         
Fig.: Generator Calibration
The transducer calibration using the indirect method is similar to the generator calibration. The same input is given to the test transducer and the standard transducer, and their outputs are measured using the transfer meter. The sensitivity of the test transducer is obtained by multiplying the ratio of the two outputs by the known sensitivity of the standard.

                    
Fig.: Transducer Calibration
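
The sensitivity calculation can be sketched as below; the transfer-meter readings and the standard's sensitivity are hypothetical numbers.

# Sketch of indirect transducer calibration: the same input drives both
# transducers and a transfer meter reads both outputs.
out_test = 24.6       # transfer-meter reading, test transducer (mV)
out_standard = 25.0   # transfer-meter reading, standard transducer (mV)
s_standard = 0.500    # known sensitivity of the standard (mV/kPa)

s_test = (out_test / out_standard) * s_standard
print(f"Test-transducer sensitivity: {s_test:.4f} mV/kPa")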

Calibration Curve:
Calibration yields the error at a number of points on the instrument's scale. The line joining all such error points gives a curve called the calibration curve.
Fig.: Calibration Curve
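
As a simple sketch, the data for a calibration curve might be assembled as follows; the scale points and true values are hypothetical.

# Sketch: the error is found at several points on the scale; joining
# the (indication, error) points gives the calibration curve.
scale_points = [0, 25, 50, 75, 100]           # instrument indications
true_values = [0.0, 24.6, 49.5, 74.8, 100.4]  # values from the standard

for ind, true in zip(scale_points, true_values):
    print(f"indication {ind:3d}  error {ind - true:+.1f}")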
