And how to do them. Denise L Seman City of Youngstown


Quality Control (QC) is defined as the process of detecting analytical errors to ensure both the reliability and accuracy of the data generated. QC can be used to ensure the equipment is functioning properly, the reagents are good, and the technician is skilled at the analysis.

QC involves a statistical evaluation of data and the use of quality control materials, ideally third-party, certified materials. Statistical evaluation acceptance criteria may be defined within the approved methodology.

QC must be performed on a regular basis. QC must be included in regular runs, not a special QC-only run. QC materials should be treated the same as samples, from the beginning to the end of the run.

Good QC means regular calibration of the equipment and frequent checks of known standards against the calibration to verify the calibration is working (do NOT verify the calibration with the same standard used to create it).

Mean: commonly called the average. Standard deviation: a statistic quantifying how close the numbers in a data set are to each other. Accuracy: how close to the true value you are. Precision: how close your duplicates are to each other.

Coefficient of variation (CV): the ratio of the standard deviation to the mean, expressed as a percentage. Coefficient of variation ratio: a comparison of your lab's CV to the CV of a peer group. Standard deviation index (SDI): also peer based; the mean of your lab minus the mean of the group, divided by the standard deviation of the group.
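As an illustration of how these statistics fit together, here is a minimal Python sketch; the function name, peer-group numbers, and QC results are hypothetical, not taken from the presentation.

```python
import statistics

def qc_statistics(lab_results, peer_mean, peer_sd, peer_cv):
    """Illustrative calculation of CV, CV ratio, and SDI for one QC level."""
    lab_mean = statistics.mean(lab_results)
    lab_sd = statistics.stdev(lab_results)          # sample standard deviation

    cv = (lab_sd / lab_mean) * 100                  # coefficient of variation, %
    cv_ratio = cv / peer_cv                         # your lab's CV vs. the peer group CV
    sdi = (lab_mean - peer_mean) / peer_sd          # standard deviation index

    return {"mean": lab_mean, "sd": lab_sd, "cv_%": cv,
            "cv_ratio": cv_ratio, "sdi": sdi}

# Hypothetical QC results and peer-group values, for illustration only
print(qc_statistics([4.1, 4.0, 4.2, 3.9, 4.1, 4.0, 4.3],
                    peer_mean=4.0, peer_sd=0.12, peer_cv=3.0))
```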

These two, accuracy and precision, will have the most impact on your analytical data. You can be accurate and always fall within the acceptable range of a known standard, yet still be imprecise, with results scattered all over the spectrum within that acceptable range.

Control charts are an important part of lab QC. Levey-Jennings charts are used to track the day-to-day statistics of any analytical parameter. Charts are created for each test, and for each QC value within the test.

A control chart gives you a visual display of method stability or instability over a period of time

Every method has some normal variation. Use of control charts will allow you to track the normal variation and quickly observe any abnormal variations.

Some variation is simply the result of numerous, ever-present differences in the method. This is common cause variation.

Common cause variation includes chance occurrences and random issues that can't be controlled.

Some variation may be the result of causes which are not normally present in the method. This could be special cause variation.

Incorrect reagents, expired reagents, inaccurate measurements, dirty glassware.

Control Charts differentiate between these two types of variation

A corrective action investigation will isolate the possible special causes of a set of data

Control charts are tools used to determine whether or not an analytical process is in a state of statistical control, or stability

Stability is defined as the state in which a method has displayed a degree of consistency in the past and is expected to continue to do so in the future.

This consistency is demonstrated by a stream of data falling within control limits based on plus or minus 3 standard deviations of the calculated mean

Control charts can help you determine if any analysis is in control (statistically acceptable for all parameters) over a period of time

If the analysis is in control, the data generated can be reported with a relative degree of certainty

If the control chart indicates the analysis is NOT in control, a corrective action investigation should be conducted to correct the errors, and samples re-run before reporting

Control chart parameters should never be adjusted just to be able to report data. This will result in skewed, inaccurate data, AND you will be reporting false data to the regulating agency.

To begin, calculate the mean and standard deviation. These two values will be used to determine the zones of the chart.

The mean will be used as the center line on your chart. The standard deviation will be used to develop the various zones of acceptance on the chart.

When an analytical method is in control, about 68% of the data will fall within +/- 1 std dev, and about 95.5% of the data should fall within +/- 2 std dev.

About 99.7% of the data will fall within +/- 3 std dev. Data within +/- 2 std dev is considered reportable, and the method is well within control.

Data that falls outside of +/- 2 std dev but within +/- 3 std dev is considered reportable, but you should start checking the method for any possible sources of problems.

Things to look at: the age of reagents, whether the probe needs to be cleaned, and whether the equipment is due for maintenance.

If you monitor your data as it falls outside of +/- 2 std dev, but before it falls outside +/- 3 std dev, you can almost eliminate having any data fall outside +/- 3 std dev (which indicates the method is out of control).
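The reporting decision described above (report within 2 std dev; report but investigate between 2 and 3 std dev; stop and investigate beyond 3 std dev) can be written as a small check. This is a minimal sketch assuming the mean and standard deviation have already been established; the function name and example values are illustrative only.

```python
def evaluate_qc_point(value, mean, sd):
    """Classify one QC result against the +/- 2 SD (warning) and +/- 3 SD (control) limits."""
    deviation = abs(value - mean)
    if deviation <= 2 * sd:
        return "in control - report"
    elif deviation <= 3 * sd:
        return "warning - report, but check reagents, probe, and maintenance"
    else:
        return "out of control - investigate and re-run before reporting"

# Example with an illustrative mean of 10.0 and SD of 0.5
print(evaluate_qc_point(10.8, mean=10.0, sd=0.5))   # falls in the warning zone
```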

You should have at least 20 data points before developing a representative chart for quality control

Data should be updated frequently to incorporate new results into the chart values. You don't need to do this daily.

Ideally, you should work on developing a continuously growing set of data for the control charts. This will give you more reliable QC than tossing half of your data set to add in a new group of data.

After calculating the mean and standard deviation, it's time to set up a chart. The chart is used as a visual aid for analysis.

Plot the data on a graph, draw the center line (mean), draw in the warning limits, and draw in the control limits.

Establish your zones on the chart: Zone A is between 2 and 3 std dev, Zone B is between 1 and 2 std dev, and Zone C is within 1 std dev.
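A minimal sketch of building such a chart in Python with matplotlib, assuming the QC history is already in hand; the data values and styling choices are illustrative only.

```python
import statistics
import matplotlib.pyplot as plt

def levey_jennings(qc_values):
    """Sketch of a Levey-Jennings chart: center line, 1/2/3 SD zone boundaries."""
    mean = statistics.mean(qc_values)
    sd = statistics.stdev(qc_values)

    plt.plot(range(1, len(qc_values) + 1), qc_values, marker="o", linestyle="-")
    plt.axhline(mean, color="green", label="Center line (mean)")
    for k, (color, label) in {1: ("gray", "Zone C boundary (1 SD)"),
                              2: ("orange", "Warning limit (2 SD)"),
                              3: ("red", "Control limit (3 SD)")}.items():
        plt.axhline(mean + k * sd, color=color, linestyle="--", label=label)
        plt.axhline(mean - k * sd, color=color, linestyle="--")
    plt.xlabel("Run number")
    plt.ylabel("QC result")
    plt.legend(loc="upper right", fontsize="small")
    plt.title("Levey-Jennings chart (illustrative)")
    plt.show()

# Hypothetical QC history (at least 20 points are recommended before charting)
levey_jennings([10.1, 9.8, 10.0, 10.2, 9.9, 10.1, 10.3, 9.7, 10.0, 10.1,
                9.9, 10.2, 10.0, 9.8, 10.1, 10.0, 10.2, 9.9, 10.1, 10.0])
```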

New charts should be started for new methods or for any method that undergoes a significant change/ modification

All data points should be included in the evaluation of the control charts

Data that is affected by normal variability should be left in use

Data should ideally only be excluded if and when it can be linked to specific cause variability.

Specific cause variability would be something linked directly to an error in the method: incorrect reagents, dirty glassware, or poor technique.

Whether the data is inside or outside the control limits is not the only consideration; there are other factors, or trends, that must be taken into account to determine if a method and its data are in control.

Look for systematic patterns of points (e.g., means) across samples, because such patterns may indicate that the process average has shifted.

A run of 9 consecutive points on the same side of the mean indicates the method is out of control

The probability of this happening by chance is 0.5^9 ≈ 0.00195, roughly the same as the probability that a single point will fall outside the 3 standard deviation limit.

If this pattern is detected, chances are the process average has changed.

Successive samples with less-than-average variability may be worth investigating, since they may provide hints on how to decrease the variation in the method.

Note that it is assumed that the distribution of the data points will be symmetrical around the mean

6 points in a row steadily increasing or decreasing. This also signals a drift in the average

Often, such drift can be the result of equipment wear, deteriorating maintenance, improvement in skill, etc

14 points in a row alternating up and down. If this trend is present, it indicates that two systematically alternating causes are producing different results

2 out of 3 points in a row in Zone A or beyond. This provides an "early warning" of a process shift.

15 points in a row in Zone C (+/- 1 std dev) (above and below the center line). This test indicates a smaller variability than is expected (based on the current control limits)

8 points in a row in Zone B, A, or beyond, on either side of the center line (without points in Zone C). This indicates that different samples are affected by different factors, resulting in a bimodal distribution of means

This may happen, for example, if different samples were processed by two different techs, where one follows good measurement protocol and the other doesn't.
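A few of the pattern rules above can be automated. The sketch below checks an illustrative subset (9 points on one side of the mean, 6 points steadily trending, 2 of 3 points beyond 2 std dev); the function name and example data are hypothetical.

```python
def check_run_rules(points, mean, sd):
    """Check a QC data stream for a few of the pattern rules described above (illustrative subset)."""
    flags = []

    # Rule: 9 consecutive points on the same side of the mean
    for i in range(len(points) - 8):
        window = points[i:i + 9]
        if all(p > mean for p in window) or all(p < mean for p in window):
            flags.append(f"9 points on one side of the mean starting at run {i + 1}")
            break

    # Rule: 6 points in a row steadily increasing or decreasing
    for i in range(len(points) - 5):
        window = points[i:i + 6]
        diffs = [b - a for a, b in zip(window, window[1:])]
        if all(d > 0 for d in diffs) or all(d < 0 for d in diffs):
            flags.append(f"6 points steadily trending starting at run {i + 1}")
            break

    # Rule: 2 out of 3 consecutive points in Zone A or beyond (more than 2 SD from the mean)
    for i in range(len(points) - 2):
        window = points[i:i + 3]
        if sum(abs(p - mean) > 2 * sd for p in window) >= 2:
            flags.append(f"2 of 3 points beyond 2 SD starting at run {i + 1}")
            break

    return flags or ["no pattern rules triggered"]

# Hypothetical data showing a slow upward drift, for illustration
print(check_run_rules([10.0, 10.1, 10.2, 10.3, 10.4, 10.5, 10.6],
                      mean=10.0, sd=0.2))
```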

A trend indicates a gradual loss of reliability in the test system. Trends are often subtle; the data slowly shifts from the norm to a new pattern.

Trends may be caused by: deterioration of the instrument light source or membrane; gradual accumulation of dirt or contaminants within the test system; age of reagents; gradual deterioration of control materials; or gradual deterioration of calibration.

An abrupt change in the control mean is a shift. Shifts in the QC data represent a sudden and dramatic positive or negative change in performance.

Shifts may be caused by: sudden failure of the light source or fouling of the membrane; a change in reagent formulation; a change of reagent lot; major instrument maintenance; a change of room temperature or humidity; failure of the sampling system; failure of the automatic equipment; or bad calibration.

Once the data has been evaluated and a known source of error has been determined, outliers can be removed from the statistical pattern.

Using the outlier rules: if you continue to eliminate points without confirming there was a special cause for them, you will eventually end up with a control chart so tight that you will almost always be out of control.

Every time you eliminate outliers, you need to recalculate the mean and the standard deviation.

When evaluating the run, look at the QC from that run as it relates to the parameters generated from the control chart

If the data is in control, report the results. If a data point is out of control, look for special causes and reanalyze the sample(s) affected.

Good data is data that is accurate and precise and can be documented to demonstrate those characteristics.

Quality assurance (QA) is the program developed for the laboratory's QC. This involves rules on when to perform the QC, how to perform the QC, and storage of the QC data.

This also includes documenting instrument maintenance. Written SOPs for all methods, based on the exact equipment you use to perform the analysis, are part of a good QA program.

Quality assurance is a necessary component of a good laboratory practice

Method detection limit

The level at which you know the analysis can actually generate/detect values within a given sample with a relative degree of certainty.

This limit is based on the entire analytical process, including any distillation and/ or digestion steps, as it incorporates all avenues of potential error

Detection limits are a statistical evaluation of a series of known analytical results from the same sample/ standard

The concentration in the sample being used for the determination should be within 2 to 10 times the anticipated detection limit

If you are prepping a standard (a spiked blank), the concentration should be 1 to 5 times the expected limit.

A series of seven analyses of the same sample is done, and the results are calculated to determine the MDL.

The average of the data is calculated, as well as the standard deviation. The standard deviation is multiplied by the t-value for a 99% confidence level (for 7 samples it is 3.14).

The resulting value is your MDL. For a more realistic value, use the pooled standard deviation (S-pooled) to calculate the result.
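A minimal sketch of that single-set calculation, assuming exactly seven replicates and the 3.14 t-value cited above; the replicate concentrations are hypothetical.

```python
import statistics

# Student's t value for a 99% confidence level, n - 1 = 6 degrees of freedom (from the method's table)
T_99_N7 = 3.14

def mdl_single_set(replicates):
    """MDL from one set of 7 replicate analyses: standard deviation times the 99% t value."""
    if len(replicates) != 7:
        raise ValueError("this sketch assumes exactly 7 replicates")
    s = statistics.stdev(replicates)
    return T_99_N7 * s

# Hypothetical replicate results (mg/L), spiked at 2-10x the anticipated detection limit
print(round(mdl_single_set([0.051, 0.048, 0.053, 0.049, 0.050, 0.052, 0.047]), 4))
```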

From your seven samples, calculate the variance of the data. Using the previous set of data for the MDL of this analysis, calculate the variance of that set.

X_i, i = 1 to n, are the analytical results in the final method reporting units obtained from the n sample aliquots, and Σ refers to the sum of the X values from i = 1 to n. Compute the MDL as follows: MDL = t(n-1, 1-α=0.99) × S, where MDL is the method detection limit, t(n-1, 1-α=0.99) is the Student's t value appropriate for a 99% confidence level and a standard deviation estimate with n-1 degrees of freedom (see table), and S is the standard deviation of the replicate analyses.

Divide the larger variance by the smaller variance to determine the ratio. If this value is greater than 3.05, redo the analysis.

If the ratio is less than 3.05, use this equation to determine the S-pooled value: S-pooled = SQRT(((variance 1 × 6) + (variance 2 × 6)) / 12). Multiply by 2.681, the t-value t(12, 1-α=0.99), to determine the MDL.
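And a sketch of the pooled calculation just described, including the 3.05 variance-ratio check; both replicate sets are hypothetical.

```python
import statistics

T_99_12DF = 2.681   # Student's t for a 99% confidence level with 12 degrees of freedom

def mdl_pooled(current_replicates, previous_replicates):
    """Pooled MDL from two sets of 7 replicates, using the variance-ratio check described above."""
    var_1 = statistics.variance(current_replicates)
    var_2 = statistics.variance(previous_replicates)

    ratio = max(var_1, var_2) / min(var_1, var_2)
    if ratio > 3.05:
        raise ValueError("variance ratio exceeds 3.05 - redo the analysis instead of pooling")

    # Each set contributes 6 degrees of freedom (n - 1), giving 12 in the denominator
    s_pooled = ((var_1 * 6 + var_2 * 6) / 12) ** 0.5
    return T_99_12DF * s_pooled

# Hypothetical current and previous MDL study data (mg/L)
print(round(mdl_pooled([0.051, 0.048, 0.053, 0.049, 0.050, 0.052, 0.047],
                       [0.050, 0.049, 0.052, 0.048, 0.051, 0.050, 0.049]), 4))
```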

MDLs should be revisited at least once a year. MDLs should be done for any new methods and method modifications. MDLs are NOT the limit for the most accurate reporting level, but the point at which you are sure you CAN detect results.

Questions? Contact info: DSeman@CityofYoungstownOH.com oweastatelac@gmail.com