
Calibration

Calibration is the process of comparing an instrument's measurements against a known standard or reference value and, where necessary, adjusting the instrument to match. Calibration is essential for maintaining the reliability and consistency of measurement instruments, ensuring that they provide accurate results.
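
As a rough illustration of the idea, here is a minimal Python sketch; the readings and the simple zero-offset correction are hypothetical, and real adjustments often correct span as well as zero:

```python
# Compare an instrument's reading against a known reference value,
# then apply a simple offset adjustment so it reads correctly.
# All values are hypothetical, for illustration only.
reference_value = 100.0     # known standard applied to the instrument
instrument_reading = 101.5  # what the instrument reports

offset = reference_value - instrument_reading  # -1.5: correction to apply
adjusted_reading = instrument_reading + offset
print(adjusted_reading)  # 100.0
```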

Written by Perlego with AI-assistance

5 Key excerpts on "Calibration"

Index pages curate the most relevant extracts from our library of academic textbooks. Each page is created using an in-house natural language model (NLM) and adds context and meaning to key research topics.
  • Calibration: A Technician's Guide

    ...1 CALIBRATION PRINCIPLES

    After completing this chapter, you should be able to:
      • Define key terms relating to calibration and interpret the meaning of each.
      • Understand traceability requirements and how they are maintained.
      • Describe characteristics of a good control system technician.
      • Describe the differences between bench calibration and field calibration, and list the advantages and disadvantages of each.
      • Describe the differences between loop calibration and individual instrument calibration, and list the advantages and disadvantages of each.
      • List the advantages and disadvantages of classifying instruments according to process importance (for example, critical, non-critical, reference only, OSHA, EPA, etc.).

    1.1 WHAT IS CALIBRATION?

    There are as many definitions of calibration as there are methods. According to ISA's The Automation, Systems, and Instrumentation Dictionary, the word calibration is defined as "a test during which known values of measurand are applied to the transducer and corresponding output readings are recorded under specified conditions." The definition includes the capability to adjust the instrument to zero and to set the desired span. An interpretation of the definition would say that a calibration is a comparison of measuring equipment against a standard instrument of higher accuracy to detect, correlate, adjust, rectify and document the accuracy of the instrument being compared. Typically, the calibration of an instrument is checked at several points throughout the calibration range of the instrument. The calibration range is defined as "the region between the limits within which a quantity is measured, received or transmitted, expressed by stating the lower and upper range values." The limits are defined by the zero and span values. The zero value is the lower end of the range. Span is defined as the algebraic difference between the upper and lower range values...
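
As a concrete illustration of the range, zero, and span definitions in this excerpt, here is a minimal Python sketch; the 0-300 psig transmitter and the five check points are hypothetical, not from the text:

```python
# Hypothetical transmitter with a calibration range of 0-300 psig.
lower_range_value = 0.0    # the zero value: lower end of the range
upper_range_value = 300.0  # upper end of the range

# Span: the algebraic difference between the upper and lower range values.
span = upper_range_value - lower_range_value  # 300.0

# A calibration is typically checked at several points across the range,
# e.g. at 0%, 25%, 50%, 75%, and 100% of span.
check_points = [lower_range_value + span * f for f in (0.0, 0.25, 0.5, 0.75, 1.0)]
print(check_points)  # [0.0, 75.0, 150.0, 225.0, 300.0]
```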

  • Industrial Process Automation Systems
    Design and Implementation

    ...So the output of an instrument accurately corresponds to its input throughout a specified range. The word standard refers to an authorized basis for comparison of a unit of measure. It is important to calibrate an instrument for the following reasons:

    1. Even the best instruments drift and lose their ability to give accurate measurements; this drift makes calibration necessary.
    2. Environmental conditions, elapsed time, and type of application can all affect the stability of an instrument.
    3. Even instruments of the same manufacturer, type, and range can show varying performance from one unit to another.
    4. To maintain the credibility of measurements.
    5. To maintain the quality of process instruments at a "good-as-new" level.
    6. Safety and environmental regulations.

    17.1.1 Instrument errors

    Instrument error can occur due to a variety of factors: drift, environment, electrical supply, addition of components to the output loop, process changes, and so on. Errors are detected by performing a calibration, in which a known signal is applied to, or compared with, the instrument under test. An error is the algebraic difference between the indication and the actual value of the measured variable...
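
A short numeric sketch of the error definition above; the readings and the span figure are hypothetical:

```python
# Error: the algebraic difference between the instrument's indication
# and the actual value of the measured variable.
# All values below are hypothetical, for illustration only.
actual_value = 150.0  # known signal applied to the instrument (psig)
indication = 151.2    # what the instrument under test reads (psig)

error = indication - actual_value  # +1.2 psig
span = 300.0                       # hypothetical instrument span (psig)
error_pct_of_span = 100.0 * error / span
print(f"error = {error:+.1f} psig ({error_pct_of_span:+.2f}% of span)")
# error = +1.2 psig (+0.40% of span)
```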

  • Measurement Technology for Process Automation
    • Anders Andersson (Author)
    • 2017 (Publication Date)
    • CRC Press (Publisher)

    ...First, we will discuss calibration and traceability, as these are the basis for uncertainty.

    Calibration

    By definition, in measurement standards, 'calibration' is equal to 'comparison'. A calibration does not therefore automatically include any adjustment activities. This is a bit controversial and is often a cause of confusion in discussions and reports. Adjustments to bring a reading 'as close to correct as possible' can be performed at the same time as a calibration, but they are then something extra. Besides, if adjustments are performed, it might be a good idea to calibrate twice, before and after the adjustment. In many calibration reports, these two values are labelled 'as found' and 'as left'.

    The fact is that a calibrated instrument is not guaranteed to be a 'good' instrument; rather, it is documented how accurate it is. To be able to decide how large the measuring error is, a better device or method (or at least a device with a known uncertainty) is needed. This is the reference. The precision, stability and confidence of the reference are very important.

    All calibrations must be recorded in calibration reports (or certificates). These documents shall include not only the result of the calibration (the deviation between the indication of the reference and that of the tested instrument, sometimes called the device under test, or DUT) but also the method used, the environmental conditions, the references used and the estimated uncertainty of the calibration. It must be possible to trace the calibration of the reference as well [this does not have to be stated in the report itself, but may be shown in a quality assurance (QA) system of the laboratory or similar]. As the reference used in the calibration also needs to be calibrated, you can follow this chain of calibration all the way to the international standard for that specific unit...
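
As a minimal sketch of what a calibration record with 'as found' and 'as left' values might capture, assuming hypothetical field names and readings (real certificate formats vary by laboratory and QA system):

```python
from dataclasses import dataclass

@dataclass
class CalibrationRecord:
    """One calibration point on a device under test (DUT).

    Field names are illustrative; real certificates also record the
    method used, environmental conditions, and references used.
    """
    reference_reading: float  # indication of the traceable reference
    as_found: float           # DUT indication before any adjustment
    as_left: float            # DUT indication after adjustment, if any
    uncertainty: float        # estimated uncertainty of the calibration
    reference_id: str         # ties the reference to its own calibration

    def deviation_as_found(self) -> float:
        # Deviation between DUT and reference before adjustment.
        return self.as_found - self.reference_reading

    def deviation_as_left(self) -> float:
        # Deviation after adjustment; documented, not guaranteed, accuracy.
        return self.as_left - self.reference_reading

rec = CalibrationRecord(reference_reading=100.00, as_found=100.35,
                        as_left=100.02, uncertainty=0.05,
                        reference_id="REF-0042")
print(round(rec.deviation_as_found(), 2), round(rec.deviation_as_left(), 2))
# 0.35 0.02
```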

  • Synthetic Instruments: Concepts and Applications
    • Chris Nadovich (Author)
    • 2004 (Publication Date)
    • Newnes (Publisher)

    ...CHAPTER 8 Calibration and Accuracy

    Calibration and metrology form an immense topic with numerous aspects. Calibration is particularly vital with respect to synthetic instruments because the SI is the new kid on the block and will be scrutinized carefully by expert and tyro test engineers alike. During this scrutiny, proponents of the new approach want to make sure the evaluations are fair, unbiased, and based on a valid scientific methodology.

    Metrology for Marketers and Managers

    When somebody decides they want to buy a measurement instrument, the only thing they may know for sure right off the bat is that they want to pay as little as possible for it. But if this were the only specification for a measurement instrument, it's doubtful anything useful would be purchased. Sometimes instrument shoppers will specify that the instrument they want to buy should do whatever it does exactly like some old instrument they currently have. But assuredly, if "doing exactly what X does" is the only performance criterion, the system that best performs just like X will be X itself. That's not likely to be what they wanted, or they wouldn't be shopping for a replacement for X. Intelligent people shopping for measurement instruments will think about the measurements the old instruments made and make a list of the measurements they still want. They will present this list of requirements to various vendors. In this way, they allow suppliers the freedom to offer them something other than what they already have. But they don't want to give the vendors too much freedom, so they need to specify the accuracy they require for the measurements. Customarily they take the accuracy and measurement capabilities from the specifications of X. If the new system meets these abstract specifications gleaned from the specifications of their old system X, they figure they will get something at least as good as X...

  • Manufacturing of Quality Oral Drug Products
    Processing and Safe Handling of Active Pharmaceutical Ingredients (API)
    • Sam A. Hout (Author)
    • 2022 (Publication Date)
    • CRC Press (Publisher)

    ...15 Calibration

    DOI: 10.1201/9781003224716-15

    Routine calibration of equipment and documentation of the calibration results is outlined by reflecting on methods of calibrating and documentation practices for miscellaneous devices that measure products or processes. All inspection, measuring, and test equipment that can affect product quality or processes is to be calibrated against Standard Measurement and Test Equipment (M&TE) traceable to the National Institute of Standards and Technology (NIST). If NIST or other nationally traceable standards are not available for the parameter being measured, an independent reproducible standard shall be used. The following establishes the approach to the Control of Measuring and Test Equipment Calibration Program.

    Calibration – Set of operations that establish, under specified conditions, the relationship between values of quantities indicated by a measuring asset or measuring system, or values represented by a material measure or a reference material, and the corresponding values realized by standard M&TE.

    CMMS – Computerized Maintenance Management System: a centralized computer system controlled by the client's global IT department, with all system servers residing in the client's outsourced data center. The CMMS is a web-based computer enterprise planning system application developed by IBM that can be accessed from any computer on the client network using the Internet. All asset records are maintained electronically in the computer enterprise planning system, which is also used for all maintenance records.

    Measurement Standard/M&TE (Measurement and Test Equipment) – Material measuring asset, reference material, or measuring system intended to define, realize, conserve, or reproduce a unit or one or more values of a quantity to serve as a reference.

    Test Accuracy Ratio (TAR) – Ratio of the accuracy of the unit under test (UUT) to that of the reference standard used to calibrate the UUT...
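
A small worked example of the TAR definition above; the accuracy figures are hypothetical, and the 4:1 minimum mentioned in the comment is a common metrology rule of thumb, not something stated in the excerpt:

```python
# Test Accuracy Ratio (TAR): accuracy (tolerance) of the unit under
# test divided by the accuracy of the reference standard used to
# calibrate it. Both figures are hypothetical, in the same units.
uut_tolerance = 1.0       # UUT must read within +/-1.0 psig
reference_accuracy = 0.2  # reference standard is good to +/-0.2 psig

tar = uut_tolerance / reference_accuracy
print(f"TAR = {tar:.0f}:1")  # TAR = 5:1, which meets a 4:1 rule of thumb
```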