https://instrumentationtools.blogspot.com/2015/05/what-is-calibration-of-instrument.html
WHAT IS CALIBRATION?
There are as many definitions of calibration as there are methods. According to ISA’s The Automation, Systems, and Instrumentation Dictionary, the word calibration is defined as “a test during which known values of measurand are applied to the transducer and corresponding output readings are recorded under specified conditions.” The definition includes the capability to adjust the instrument to zero and to set the desired span. An interpretation of the definition would say that a calibration is a comparison of measuring equipment against a standard instrument of higher accuracy to detect, correlate, adjust, rectify and document the accuracy of the instrument being compared.
Different terms may be used at your facility. Just be careful not to confuse the range the instrument is capable of with the range for which the instrument has been calibrated.
WHAT ARE THE CHARACTERISTICS OF A CALIBRATION?
Calibration Tolerance: Every calibration should be performed to a specified tolerance. The terms tolerance and accuracy are often used incorrectly. In ISA’s The Automation, Systems, and Instrumentation Dictionary, the definitions for each are as follows:
Accuracy: The ratio of the error to the full scale output or the ratio of the error to the output, expressed in percent span or percent reading, respectively.
Tolerance: Permissible deviation from a specified value; may be expressed in measurement units, percent of span, or percent of reading.
As you can see from the definitions, there are subtle differences between the terms. It is recommended that the tolerance, specified in measurement units, is used for the calibration requirements performed at your facility. By specifying an actual value, mistakes caused by calculating percentages of span or reading are eliminated. Also, tolerances should be specified in the units measured for the calibration.
For example, you are assigned to perform the calibration of the previously mentioned 0-to-300 psig pressure transmitter with a specified calibration tolerance of ±2 psig. The output tolerance would be:

(±2 psig ÷ 300 psig) × 16 mA = ±0.1067 mA

Rounding to 0.11 mA would exceed the calculated tolerance. It is recommended that both ±2 psig and ±0.10 mA tolerances appear on the calibration data sheet if the remote indications and output milliamp signal are recorded.
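As a quick check, the conversion above can be sketched in a few lines of Python. The 0-to-300 psig range, ±2 psig tolerance, and 16 mA span are the values from the example; the function name is illustrative:

```python
# Sketch: convert a calibration tolerance given in engineering units (psig)
# into the equivalent tolerance on a 4-20 mA output signal, whose span is 16 mA.

def output_tolerance_ma(tolerance_eng, span_eng, span_ma=16.0):
    """Scale an engineering-unit tolerance onto the mA output span."""
    return tolerance_eng / span_eng * span_ma

tol_ma = output_tolerance_ma(2.0, 300.0)
print(round(tol_ma, 4))  # 0.1067
```

Note that the result is rounded down to ±0.10 mA on the data sheet, since rounding up to 0.11 mA would exceed the specified ±2 psig tolerance.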
Note the manufacturer’s specified accuracy for this instrument may be 0.25% full scale (FS). Calibration tolerances should not be assigned based on the manufacturer’s specification only. Calibration tolerances should be determined from a combination of factors. These factors include:
• Requirements of the process
• Capability of available test equipment
• Consistency with similar instruments at your facility
• Manufacturer’s specified tolerance
Accuracy Ratio: This term was used in the past to describe the relationship between the accuracy of the test standard and the accuracy of the instrument under test. The term is still used by those that do not understand uncertainty calculations (uncertainty is described below). A good rule of thumb is to ensure an accuracy ratio of 4:1 when performing calibrations. This means the instrument or standard used should be four times more accurate than the instrument being checked. Therefore, the test equipment (such as a field standard) used to calibrate the process instrument should be four times more accurate than the process instrument, the laboratory standard used to calibrate the field standard should be four times more accurate than the field standard, and so on.
With today's technology, an accuracy ratio of 4:1 is becoming more difficult to achieve. Why is a 4:1 ratio recommended? Ensuring a 4:1 ratio will minimize the effect of the accuracy of the standard on the overall calibration accuracy. If a higher level standard is found to be out of tolerance by a factor of two, for example, the calibrations performed using that standard are less likely to be compromised.
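The 4:1 rule of thumb is easy to verify numerically. A minimal sketch, assuming both accuracies are expressed as absolute values in the same units (the function name and example values are illustrative, not from the source):

```python
# Sketch: check the 4:1 accuracy-ratio rule of thumb. Both accuracies must be
# in the same units (e.g. psig, or percent of the same span).

def meets_ratio(standard_accuracy, instrument_accuracy, ratio=4.0):
    """True if the test standard is at least `ratio` times more accurate
    than the instrument under test."""
    return instrument_accuracy / standard_accuracy >= ratio

# Transmitter accurate to +/-0.75 psig, field standard to +/-0.15 psig:
print(meets_ratio(0.15, 0.75))  # True (a 5:1 ratio)
```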
Traceability is accomplished by ensuring the test standards we use are routinely calibrated by “higher level” reference standards. Typically the standards we use from the shop are sent out periodically to a standards lab which has more accurate test equipment. The standards from the calibration lab are periodically checked for calibration by “higher level” standards, and so on until eventually the standards are tested against Primary Standards maintained by NIST or another internationally recognized standard.
Figure 1.1 Traceability Pyramid
Uncertainty: Parameter, associated with the result of a measurement that characterizes the dispersion of the values that could reasonably be attributed to the measurand. Uncertainty analysis is required for calibration labs conforming to ISO 17025 requirements. Uncertainty analysis is performed to evaluate and identify factors associated with the calibration equipment and process instrument that affect the calibration accuracy. Calibration technicians should be aware of basic uncertainty analysis factors, such as environmental effects and how to combine multiple calibration equipment accuracies to arrive at a single calibration equipment accuracy. Combining multiple calibration equipment or process instrument accuracies is done by calculating the square root of the sum of the squares, illustrated below:
Combined accuracy = √(a₁² + a₂² + … + aₙ²), where a₁ … aₙ are the individual equipment accuracies expressed in the same units.
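The root-sum-square combination described above can be sketched directly. The example accuracies below are illustrative values in percent of span, not figures from the source:

```python
import math

# Sketch: combine several independent equipment accuracies into a single
# overall figure by taking the square root of the sum of the squares.

def combined_accuracy(accuracies):
    """Root-sum-square of the individual accuracies (all in the same units)."""
    return math.sqrt(sum(a * a for a in accuracies))

# e.g. pressure module 0.05%, voltmeter 0.02%, reference 0.01% of span:
print(round(combined_accuracy([0.05, 0.02, 0.01]), 4))  # 0.0548
```

Because the terms are squared, the largest single accuracy dominates the combined result, which is why improving the worst piece of test equipment matters most.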
WHY IS CALIBRATION REQUIRED?
[Figures: calibration span error, zero error, combined zero and span error, and linearization error]
Zero and span errors are corrected by performing a calibration. Most instruments are provided with a means of adjusting the zero and span of the instrument, along with instructions for performing this adjustment. The zero adjustment is used to produce a parallel shift of the input-output curve. The span adjustment is used to change the slope of the input-output curve. Linearization error may be corrected if the instrument has a linearization adjustment. If the magnitude of the nonlinear error is unacceptable and it cannot be adjusted, the instrument must be replaced.
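The ideal input-output relation of a transmitter is a straight line, which makes the two adjustments easy to picture: zero shifts the line up or down, span changes its slope. A minimal sketch, using the 0-to-300 psig, 4-20 mA transmitter from the earlier example (the error magnitudes below are illustrative):

```python
# Sketch: the ideal linear input-output curve of a 4-20 mA transmitter.
# The zero adjustment produces a parallel shift of this line;
# the span adjustment changes its slope.

def output_ma(pressure, lrv=0.0, urv=300.0, zero_ma=4.0, span_ma=16.0):
    """Ideal output current for a 0-300 psig, 4-20 mA transmitter."""
    return zero_ma + (pressure - lrv) / (urv - lrv) * span_ma

# Zero error: every reading is offset by the same amount (fix with zero adjust).
reading_with_zero_error = output_ma(150.0) + 0.3         # 12.3 mA instead of 12.0
# Span error: the slope is wrong, so the error grows with input (fix with span adjust).
reading_with_span_error = 4.0 + (150.0 / 300.0) * 16.5   # 12.25 mA instead of 12.0
```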
Understanding of Processes: One thing that sets technicians apart is an understanding of the process, particularly how the instruments monitor and control the process. There is a difference between calibrating an individual component and calibrating an instrument as part of the bigger process control loop. For example, knowing when a controller can be placed in manual without affecting the process and what to do while that controller is in manual, requires an understanding of the process. Additionally, when an operator says there is a problem with his indication, a technician who knows the instrument loop and process will be more capable of identifying the cause of the problem.
• Loop Calibration vs. Individual Instrument Calibration
• Bench Calibration vs. Field Calibration
• Classification of Instruments as Critical, Non-Critical, For Reference Only, etc.
LOOP CALIBRATION VS. INDIVIDUAL INSTRUMENT CALIBRATION
DISADVANTAGES OF INDIVIDUAL CALIBRATION
2. Mistakes on re-connect
3. Less efficient use of time: one calibration must be performed for each loop instrument, as opposed to a single calibration for the entire loop
ADVANTAGES OF INDIVIDUAL CALIBRATION
1. Correct instrument will be adjusted
2. More compatible with multifunction calibrators
WHY SHOULD WE CALIBRATE?
Calibration is required for:
- Testing a new instrument
- Testing an instrument after it has been repaired or modified
- Periodic testing of instruments
- After a specified time period or amount of usage has elapsed
- Prior to and/or after a critical measurement
- When observations are not accurate or instrument indicators do not match the output of a surrogate instrument
- After events such as:
  - An instrument has had a shock, vibration, or exposure to adverse conditions, which can put it out of calibration or damage it
  - Sudden weather changes
Risk Involved in Not Calibrating an Instrument
- Safety hazards: For instruments used with perishable products such as food, or thermometers used in sensitive applications, an uncalibrated instrument may create potential safety hazards.
- Wastage: If the instrument is not correctly calibrated, resources and operating time may be wasted, increasing overall expenses.
- Faulty or Questionable Quality: If the instrument is improperly calibrated, the chances of faulty or questionable quality of finished goods arises. Calibration helps maintain the quality in production at different stages, which gets compromised if any discrepancy arises.
- Fines or litigation: Customers who have incurred damage may return the product for a full refund, which is tolerable; but if they litigate over the damages, the costs in reputation and restitution payments can be serious.
- Increased downtime: Poor quality of finished goods is the first indicator of disrepair in your equipment. Regular calibration programs identify warning signs early, allowing you to take action before any further damage is caused.
The calibration of instruments is the most basic maintenance requirement: an established procedure that every business using machinery or instruments must carry out periodically, as specified in the equipment's requirements.