METHODS FOR CERTIFYING MEASUREMENT EQUIPMENT. Scott Crone


METHODS FOR CERTIFYING MEASUREMENT EQUIPMENT

Scott Crone, North American Sales Manager
AMETEK Calibration Instruments, 8600 Somerset Drive, Largo, Florida 33773 USA

INTRODUCTION

Like any other piece of equipment, a measurement artifact must be maintained. Obviously, it has to be in working order in general. More important, however, is that it operate within specified parameters and provide measurements that are traceable to a known source or sources. This paper provides a general overview of calibration and certification, and discusses some key terminology and methods.

BASIC TERMINOLOGY

Metrology is the doctrine of measuring and includes three major subjects: the international definition of units, realization of the units by means of scientific methods, and the establishment of traceability to document the accuracy of the measurement. Metrology is also defined simply as the science that deals with measurement. Metrology is important because it is the basis for calibrating, or certifying, all measurement equipment.

Legal metrology, which applies in many industries, is defined as securing the accuracy of a measurement when the measurement could have any influence on human health or safety or on the transparency of financial transactions. This is evident in custody transfer applications and is one of the main reasons that having certified equipment traceable to a standard is necessary; it also requires understanding by the parties involved in the transaction.

A measurement, as defined by BIPM/ISO, is "the set of operations having the objective of determining the value of a quantity." It sounds complex but essentially means taking a reading, where that reading is easily understandable and shows what was measured and how much. A calibration, by contrast, as defined by ISO 10012, is "the set of operations which establish, under specified conditions, the relationship between values indicated by a measuring instrument or measuring system, or values represented by a material measure or reference material, and the corresponding value of a quantity realized by a reference standard." Again, this sounds fairly complex, but it ties to the measurement in that it specifies particular conditions, defines the system used for determining how much was measured, and then goes to the next level to standardize that measurement so that like measurements are comparable to a known standard.

METHODS OF CALIBRATION

There are two basic situations for performing a calibration: in the field and in a lab. Field calibrations are performed in situ, where conditions are other than stable and controllable. There are three basic types of field calibrations: Line/End-to-End/Complete Loop, Separate Line/Separate Loop, and Comparative/On-Line Process. These are discussed in the following paragraphs. By comparison, lab calibrations are performed in a controlled environment under very controlled conditions and typically concentrate on a sensor or a measurement artifact.
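The calibration definition above reduces, in practice, to comparing the readings of a device under test against a reference standard at several points under specified conditions. A minimal sketch of that comparison (the test points, tolerance, and readings below are hypothetical, not from this paper):

```python
# Sketch of a basic comparison calibration: device-under-test (DUT) readings
# vs. a reference standard at several test points. Data are hypothetical.

def calibration_report(points, tolerance):
    """points: list of (reference_value, dut_reading); tolerance: max |error| allowed."""
    report = []
    for ref, dut in points:
        error = dut - ref
        report.append((ref, dut, error, abs(error) <= tolerance))
    return report

# Five-point check of a 0-100 psi transmitter against a hypothetical ±0.25 psi tolerance.
points = [(0.0, 0.1), (25.0, 25.2), (50.0, 50.1), (75.0, 74.8), (100.0, 99.7)]
for ref, dut, err, ok in calibration_report(points, tolerance=0.25):
    print(f"ref {ref:6.1f}  dut {dut:6.1f}  error {err:+.2f}  {'PASS' if ok else 'FAIL'}")
```

The last point fails its tolerance, which is exactly the kind of result a calibration exists to surface and document.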

Line/End-to-End/Complete Loop Calibration

This method is used to test the entire system, from sensor through to the read-out. It allows all inaccuracies to be collected in one calibration result and therefore minimizes the risk of accumulating errors after a calibration. Only one reference instrument is necessary, which makes for a small collected uncertainty. It also makes for easier paperwork. This approach does have drawbacks: it requires portable calibrators and can, in most cases, limit the calibration to one instrument at a time. The major disadvantage is that it most often causes downtime in a production environment, since the instrument must be taken out of service.

Separate Line/Separate Loop Calibration

This method is used to test the sensor and the loop separately. It can only be used when a sensor can be tested outside of the loop associated with the process, and it is sometimes used when calibrated spares are maintained. It does allow for a fast on-site calibration, in that only the loop is tested electronically. The shorter on-site calibration time means less downtime in a production facility, and the sensor calibrations can be performed when convenient: both are desirable conditions. The downside is that the error must be calculated from both procedures, and two certificates must be maintained to cover the full calibration.

Comparative/On-Line Process Calibration

This method is used in cases where the sensor cannot be removed from the process or the facility cannot have downtime. All of the inaccuracies are collected in a single certificate and documentation is easy. However, the significant drawback is that the process has limited variability, so the number of data points typically suffers with this approach. This method also requires an extra thermal pocket or pressure tap for the reference instrument.
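For the separate line/separate loop case, where the error must be calculated from two certificates, one common approach (assuming the two contributions are independent; the numbers below are hypothetical) is to combine them in quadrature, with the arithmetic sum as a conservative bound:

```python
import math

# Sketch: combining sensor and loop contributions from two separate
# calibration certificates. Root-sum-square (RSS) combination assumes the
# two error sources are independent; values here are hypothetical.

def combine_rss(*uncertainties):
    """Root-sum-square combination of independent uncertainty contributions."""
    return math.sqrt(sum(u * u for u in uncertainties))

sensor_u = 0.08   # e.g. sensor certificate contribution, °C
loop_u = 0.06     # e.g. electronic loop certificate contribution, °C

worst_case = sensor_u + loop_u          # conservative arithmetic sum
rss = combine_rss(sensor_u, loop_u)     # independent-sources estimate
print(f"worst case ±{worst_case:.2f} °C, RSS ±{rss:.2f} °C")
```

Either way, both certificates must be retained, since each documents only part of the full calibration.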
Lab Calibration

This method is used where possible to calibrate sensors, and potentially the loop instruments, outside of the process area. It is also the method for calibrating reference instruments. Typically, automated calibration processes are used, which require little hands-on time from the staff. The calibration may be carried out as the schedule permits, and this approach also establishes calibrated spares for fast and trusted replacement.
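Automated lab calibrations typically log repeated readings at each point, which lends itself to a statistical (Type A) uncertainty estimate. A sketch with hypothetical readings, using a coverage factor of k = 2 (roughly 95 % confidence, a common but not universal choice):

```python
import statistics

# Sketch: Type A (statistical) uncertainty from repeated readings at one
# calibration point. Readings are hypothetical; coverage factor k = 2 gives
# approximately 95 % confidence.

def type_a_uncertainty(readings, k=2.0):
    """Expanded uncertainty of the mean: k * s / sqrt(n)."""
    n = len(readings)
    s = statistics.stdev(readings)   # sample standard deviation
    return k * s / n ** 0.5

readings = [100.02, 99.98, 100.01, 100.00, 99.99, 100.03, 99.97, 100.00]
mean = statistics.mean(readings)
u = type_a_uncertainty(readings)
print(f"mean {mean:.3f}, expanded uncertainty (k=2) ±{u:.3f}")
```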

UNITS OF MEASUREMENT

Engineering units are a precisely specified quantity in terms of which the magnitudes of other quantities of the same kind can be stated. Simply put, according to one online resource: what is Y, and how many X units are required to make up a Y? These units are used to quantify measurement parameters so that they are measured in a uniform manner regardless of location. A physical constant is a unit derived from a physical principle and is typically the basis for a system of engineering units; the speed or wavelength of light is an example. These are very important, as the units of measure are the base part of any measurement: the unit determines the base amount of any measured parameter, and the number of units is what specifically constitutes the measurement.

The physical fundamental units are defined by Le Système International d'Unités. These are the 7 fundamental SI units, agreed upon internationally, which can describe all physical phenomena:

Base quantity               SI base unit    Symbol
Length                      metre           m
Mass                        kilogram        kg
Time                        second          s
Electric current            ampere          A
Thermodynamic temperature   kelvin          K
Amount of substance         mole            mol
Luminous intensity          candela         cd

(Celsius temperature is related to the kelvin by t/°C = T/K − 273.15.)

From those 7 base units, we have the derived units:

Derived quantity    SI derived unit             Symbol
Area                square metre                m²
Volume              cubic metre                 m³
Speed               metre per second            m s⁻¹
Acceleration        metre per second squared    m s⁻²
Force               newton                      N (kg m s⁻²)
Pressure            pascal                      Pa (N m⁻²)
Energy              joule                       J (kg m² s⁻²)
Power               watt                        W (J s⁻¹)
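The idea that every derived unit reduces to exponents of the 7 base units can be sketched directly. This is simplified dimensional bookkeeping for illustration, not a metrology tool:

```python
# Sketch: representing SI units as exponents of the base units
# (m, kg, s, A, K, mol, cd) and deriving compound units by arithmetic.

from collections import Counter

def combine(*factors):
    """Multiply units together by summing base-unit exponents."""
    total = Counter()
    for f in factors:
        total.update(f)
    return {k: v for k, v in total.items() if v != 0}

def inverse(unit):
    """Reciprocal of a unit: negate every exponent."""
    return {k: -v for k, v in unit.items()}

metre    = {"m": 1}
kilogram = {"kg": 1}
second   = {"s": 1}

speed        = combine(metre, inverse(second))                    # m s^-1
acceleration = combine(speed, inverse(second))                    # m s^-2
newton       = combine(kilogram, acceleration)                    # kg m s^-2
pascal       = combine(newton, inverse(metre), inverse(metre))    # kg m^-1 s^-2
joule        = combine(newton, metre)                             # kg m^2 s^-2

print(newton)  # {'kg': 1, 'm': 1, 's': -2}
```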

Designated organizations maintain the references for the units of measure. In the USA, the Commerce Department is responsible for the maintenance of the standards; the National Institute of Standards and Technology (NIST) is the division of the Commerce Department directly responsible. Other countries have their own standards organizations. Calibration of a measuring instrument to the reference standards establishes traceability.

INSTRUMENTS AND STANDARDS

Understanding the use of an instrument and the type of an instrument are key to the process of certifying instruments and of having instruments certified. A calibrator is an instrument with the capability to generate a known physical unit with known accuracy. A measuring instrument is a tool used in addition to a calibration instrument; the measuring instrument can only measure a parameter or electric signal. Calibrators are used to calibrate measuring instruments. Measuring instruments are only capable of checking, not calibrating, parameters.

Conversion tables and software may be used to convert one unit to another, provided that the conversion is within the same parameter. It is very important to use exact conversion factors. ISO 80000, Quantities and Units, and NIST Special Publication 811, 2008 edition, Guide for the Use of the International System of Units (SI), both describe the proper conversion factors and their use. The NIST publication is available at no charge and may be downloaded from the NIST website.

TRACEABILITY AND ACCREDITATION

Traceability and accreditation are two very different things. As discussed, traceability documents an unbroken chain for the measurement parameter back to a recognized standard for that parameter. Accreditation deals with the method for performing a calibration and documents a repeatable process for the same.

ISO/IEC 17025, General Requirements for the Competence of Testing and Calibration Laboratories, is one of the more recognized of the accreditations internationally. It is commonly referred to as Guide 25 and establishes stringent requirements for calibration reports, uncertainty information, and calibration methods.

A reference standard, according to ISO/BIPM/OIML, is "a standard, generally of the highest metrological quality at a given location, from which measurements made at that location are derived." These instruments would most certainly be traceable to a standard. Another type of standard that may be in the chain is a measurement standard, which ISO 10012 defines as "a material measure, measuring instrument, reference standard, material or system intended to define, realize, conserve or reproduce a unit or one or more values of quantity in order to transmit them to other measuring instruments by comparison." Both of these types of standards establish measurements that are utilized as a standard for measurements, calibrations, or transfers made from one site to another, typically through a physical means.

In the process of legal metrology, where something may be transferred from one organization to another, a consensus standard may be established through agreement of both parties. A consensus standard is an artifact or process that is used as a de facto or consensus standard by mutual consent between parties in cases where no national standard or physical constant is available.

In contrast to the general standards definitions in the preceding, there are other defining qualities of standards more widely utilized. A primary standard is operationally defined by ISO 10012 as a standard which provides a magnitude or value of the highest

accuracy (lowest uncertainty) in a given metrology discipline, for a given parameter or quantity. Strictly speaking, a primary standard is a device that is directly traceable to the physical standards, with all errors eliminated or evaluated. In comparison, a secondary standard is a reference instrument that typically is not directly based on the fundamental units of measure: the parameter measured is translated into a signal for reading, such as an electrical output or a dial gauge. ISO/BIPM/OIML defines a secondary, or working, standard as a standard whose value is fixed by comparison with a primary standard. The relation of a secondary standard to a primary standard holds regardless of the accuracy of the instrument. A transfer or check standard is one used as an intermediary to compare standards, material measures, or measuring instruments; this is the level of a typical field calibrator.

Instruments should be chosen based on the task at hand. A primary standard may not be the better choice for the field calibration of an analog indicator. A rule of thumb is to have a 4:1 ratio of accuracy of the calibrator to the instrument being calibrated. With the improvements in instrument technology, this may not always be achievable; however, a minimum of 2:1 should be used. In addition to the accuracy specification, the choice of standards should also include future use considerations, confidence, and compliance with internal and external quality or regulatory systems.

FACTORS IN MEASUREMENT

Having a working knowledge of factors in measurement is vital; without it, measurement and calibration cannot fully be understood. One term that will always be discussed, analyzed, and debated is accuracy.

Accuracy and Uncertainty

Accuracy is defined by ISO 10012 as "the closeness of the agreement between the result of a measurement and the true value." This is in actuality a qualitative term without a numerical value.
However, the industrial and operational use of this term has come to describe a characteristic with a numerical value. ISO/IEC 17025 has brought the defining term of uncertainty into the language of measurement and calibration, and this shift will provide more clarity as it becomes more widely accepted. Until that time, users will continue to review specifications with an accuracy value stated for the performance of an instrument.

Uncertainty is a term used to quantify the potential deviation of a measuring device or instrument from a nominal or specified value. This is a quantitative term and has a numerical value. When it is determined statistically, it is reported as a Type A uncertainty and includes a confidence level or coverage factor. If the uncertainty is determined in a manner other than through the use of statistics, it is called a Type B uncertainty. So, for now, accuracy typically refers to the maximum expected measuring error for a given unit against a known value. Unfortunately, this is typically all of the data that is given for consideration.

There are factors within that performance statement which are rarely mentioned: hysteresis, linearity, and repeatability. Hysteresis is the property of an instrument whereby its response to a given stimulus depends on the sequence of the preceding stimuli, commonly discussed as upscale versus downscale. Linearity is the ability of an instrument to maintain a constant deviation throughout the measurement range. Repeatability is sometimes called precision and refers to the quantified, repeated assessment of the same parameter with consistent results. All of these attributes factor into the accuracy performance of an artifact or measuring instrument and must be considered.

Accuracy Expressions

There are two typical methods to mathematically express accuracy in the instrumentation industry.

They may be used alone or in combination. These are percentage of full scale and percentage of reading. Percentage of full scale is typically applied to secondary standards and yields a performance with an error that is consistent throughout the operational range. For example, a unit with a ±0.1% of full scale accuracy on a 3,000 psi instrument would read accurate to ±3.0 psi anywhere from 0 to 3,000 psi. A percentage of reading instrument would have a smaller error lower in the scale of the unit. For example, a unit with a ±0.1% of reading accuracy on a 3,000 psi instrument would have an error of ±0.1 psi at 100 psi but ±3 psi at 3,000 psi. Percentage of reading accuracy statements are typically used for primary standards. Combination specifications are starting to be used more frequently; they basically designate that there is a floor specification to the overall accuracy performance of the unit, below which there is an expected constant error. Understanding the accuracy statement is very important. Knowing whether other factors are included in that overall accuracy is just as important.

Other Considerations

Some other important factors deal with the actual readout on the device itself and the ability to view a response to a stimulus. Scale resolution is the smallest, or least, scale division on the display. The resolution should support visualizing the stated accuracy of the instrument. If the resolution does not allow for viewing the stated accuracy of the instrument, it is difficult to verify that accuracy statement. The industry standard allows for one half of the smallest division to be considered. Scale sensitivity, or sensitivity, is the proportional, linear, least scale displacement. Again, if the stimulus cannot cause the instrument to deflect slightly enough to allow for viewing of the stated accuracy, then verification is difficult.
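The two accuracy expressions, and the half-of-smallest-division resolution rule of thumb, reduce to simple arithmetic. A sketch reproducing the 3,000 psi examples above (the display resolution at the end is hypothetical):

```python
# Sketch: error bands for percent-of-full-scale vs. percent-of-reading
# accuracy statements, using the 3,000 psi examples from the text, plus the
# half-of-smallest-division resolution rule of thumb.

def error_full_scale(full_scale, pct):
    """Constant error band: pct % of full scale, at any reading."""
    return full_scale * pct / 100.0

def error_of_reading(reading, pct):
    """Error band proportional to the reading: pct % of reading."""
    return reading * pct / 100.0

FS = 3000.0   # psi
PCT = 0.1     # ±0.1 %

for reading in (100.0, 1000.0, 3000.0):
    fs_err = error_full_scale(FS, PCT)        # ±3.0 psi everywhere
    rd_err = error_of_reading(reading, PCT)   # shrinks at low readings
    print(f"{reading:6.0f} psi: FS spec ±{fs_err:.1f} psi, reading spec ±{rd_err:.1f} psi")

# Resolution rule of thumb: half the smallest scale division should be no
# larger than the error band being verified (1.0 psi division is hypothetical).
smallest_division = 1.0   # psi
assert smallest_division / 2 <= error_full_scale(FS, PCT)
```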
Scale discrimination is simply the ability to provide the smallest perceptible indication from a stimulus.

RESULTS

Results should be documented for future reference and review. Most quality systems and auditing agencies require that documentation be maintained for critical calibrations. Part of that documentation is the certification for the site reference standards, to include performance, accuracy or uncertainty, and traceability for the calibration. Without documentation as evidence, all downstream calibrations are in question.

REFERENCES

http://dictionary.reference.com/search?q=metrology
http://dictionary.reference.com/search?r=2&q=units
http://cancerweb.ncl.ac.uk/cgi-bin/omd?query=calibrator&action=search+omd
BIPM, Bureau International des Poids et Mesures
ISO, International Organization for Standardization; ISO 10012
OIML, International Organization of Legal Metrology
Le Système International d'Unités
NIST, National Institute of Standards and Technology, http://www.nist.gov