Radiochemistry Webinars Data Verification and Validation


National Analytical Management Program (NAMP) U.S. Department of Energy Carlsbad Field Office Radiochemistry Webinars Data Verification and Validation In Cooperation with our University Partners

2 Meet the Presenter Tom Rucker Dr. Rucker is the manager of the Radiological Assessment and Protection business area for Science Applications International Corporation (SAIC) in Oak Ridge, TN. Dr. Rucker earned a PhD in Analytical Chemistry (Radiochemistry Emphasis, Health Physics Minor) in 1988 from the University of Tennessee. Dr. Rucker joined SAIC in 1987 as a Senior Radiochemist, and he now provides technical and project leadership for radiological characterization, assessment, and protection services. Dr. Rucker specializes in radionuclide measurement and dose/risk assessment for human health and environmental protection. He also provides expertise in other areas of analytical chemistry, environmental chemistry, and health physics, including environmental monitoring, waste management, and nuclear safeguards and security. Before joining SAIC, Dr. Rucker served as a Research Associate at the Oak Ridge National Laboratory from 1984 to 1987, where he supported the development and upgrade of radiochemistry procedures and radiation counting facilities. Dr. Rucker also worked at the Oak Ridge Gaseous Diffusion Plant from 1977 to 1984 as Group Leader of the Radioanalysis Group. Dr. Rucker has extensive experience in radioanalytical data evaluation, validation, and management and served as co-chairman of the ANSI/ANS-41.5-2012 writing group. Contact information: Phone: (865) 481-2993 Email: THOMAS.L.RUCKER@saic.com

Verification and Validation of Radiological Data for Use in Waste Management and Environmental Remediation Thomas Rucker, PhD National Analytical Management Program (NAMP) U.S. Department of Energy Carlsbad Field Office TRAINING AND EDUCATION SUBCOMMITTEE

4 ANSI/ANS-41.5-2012 Sponsorship American Nuclear Society (ANS) Nuclear Facilities Standards Committee (NFSC) Subcommittee on Decommissioning and Site Remediation Standards (ANS-23) Manages the development and maintenance of standards that address the cleanup of radioactive materials and radioactivity mixed with hazardous substances.

5 ANS-41.5 Working Group Members S. R. Salaymeh (Chair), Savannah River National Laboratory (Retired) T. L. Rucker (Co-Chair), Science Applications International Corporation A. E. Rosecrance, Oilfield Environmental Compliance D. E. McCurdy, Independent Technical Consultant J. E. Chambers, Fluor-B&W Portsmouth D. W. Poyer, U.S. Army Center for Health Promotion and Preventive Medicine C. King Liu, U.S. Department of Energy J. G. Griggs, U.S. Environmental Protection Agency J. C. Jang, U.S. Nuclear Regulatory Commission (Deceased) P. D. Greenlaw, U.S. Department of Homeland Security

6 A 20-year Process Working group first assembled in 1992. Volunteers wrote sections that were reviewed and revised at working group meetings once or twice a year. Initial draft completed in 2002 and submitted to the ANSI committee for review. Working subgroup reviewed committee comments and revised the draft at meetings once or twice a year. Obtained ANS committee approval in February 2012. Published August 2012 and available for purchase from the ANS Store: http://www.new.ans.org/store/i_240288

7 Contents Section 1 Purpose and scope Section 2 Acronyms and definitions Section 3 General principles Section 4 Sample-specific parameters Section 5 Batch control parameters Section 6 Instrument parameters Section 7 Personnel qualifications

8 Appendices Contents Appendix A Discussion of Data Life Cycle and Supporting Documents and Information Appendix B Recommended Validation Report Contents Appendix C Discussion of Compliance Verification and Validation Parameters Appendix D Explanation of Equations for Verifying Compliance to Required Sample-Specific Detection Level Appendix E Explanation of Equations for Decision Level and Detection Decisions

9 Purpose of the Standard Provide requirements (shall statements) and recommended practices (should statements) for determining the validity of radioanalytical data for waste management and environmental remediation. Applications include site characterization, waste acceptance, waste certification, waste treatment design, process control, litigation, and other applications requiring data verification and validation.

10 Purpose of the Standard (Cont.) Provide a minimum set of checks and tests that will ensure a consistent approach for compliance verification and validation of data produced by any radioanalytical laboratory. Eliminate many of the inconsistencies in the approaches, evaluation algorithms, parameters evaluated, qualifiers, and qualifications of validators used in existing site-specific data compliance verification and validation programs.

11 Scope of the Standard The standard applies to radioanalytical data for waste management and environmental remediation. The standard applies to data generated by field measurements and radioanalytical laboratories, which require independent review as specified by the data quality objectives (DQOs). Some of the elements of the standard may apply to non-destructive assay and in situ measurements. The standard does not apply to non-radioassay measurement methods (e.g., ICP-MS, KPA, X-ray diffraction).

12 Scope of the Standard (Cont.) Applies only to independent compliance verification and validation processes and should not be construed to apply to any actions taken by laboratories to internally generate or review data, including audits and performance evaluation studies. However, this standard expects that certain laboratory quality control (QC) and programmatic quality assurance (QA) measures have taken place that feed data for review by the data verifiers and validators, including results of audits and performance evaluation studies.

13 Scope of the Standard (Cont.) Acceptance criteria for the test and checks are intentionally not provided in most cases. Waste management or environmental remediation programs or projects may have unique measurement quality objectives (MQOs) based on the intended use of the data. The Standard assumes a DQO process has been used by the project to define the quality of data needed for the decision process and to develop corresponding MQOs of accuracy, precision, sensitivity, selectivity, and representativeness to be met. The DQO process should also provide guidance for the frequency, percentage, and extent of data validation.

Data Life Cycle 14

15 Compliance Verification The systematic process of checking data for completeness, correctness, consistency, and compliance with written analytical specifications (e.g., SOW, contract, project plans). (Does not assume or require laboratory work be performed under contract.) Verification evaluates all aspects of compliance and attempts to resolve non-compliance. Verification does not provide qualification of the data. Non-correctable non-compliance items are flagged and forwarded to the validation step in a verification report.

16 Validation Process of examining a verified data package to provide a level of confidence in the reported analyte's identification, concentration (including detection status), and associated measurement uncertainty. Analyte- and sample-specific, and extends beyond the method or written analytical specification (e.g., SOW, contract, project plans) compliance. Produces a data set with a limited number of qualifiers associated with the result based on the data's fitness (suitability) for their intended use, as defined by the MQOs and DQOs. Results are documented in a validation report and forwarded for use in data quality assessment.

17 Validation Qualifiers The standard recommends use of traditional EPA Contract Laboratory Program qualifiers. The actual qualifiers and associated reasons assigned to each result shall be recorded in an organized manner for final evaluation and reporting. One result may receive multiple (even repeated) qualifiers, each with its own reason. If qualifiers are combined for recording purposes, the rules should be developed during project planning and addressed in the validation plan.

18 Validation Qualifiers (Cont.)
<none>: The analyte has been detected and any problems that exist are minor or irrelevant to the intended use. The uncertainty in the result is fairly represented by the reported uncertainty.
U: Undetected. The analyte result is less than the critical level.
J: Estimated. An unusually uncertain or biased, but usable, result. The uncertainty associated with the result significantly (relative to the MQOs) exceeds the reported uncertainty.
R: Unusable. The problems are so severe that the data cannot be used because they significantly affect the decisions based on them.
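As a concrete illustration of the record-keeping requirement above, the following minimal Python sketch (not part of the standard; all names are illustrative) shows one way a validator could record multiple qualifiers, each with its own reason, against a single result:

from dataclasses import dataclass, field
from enum import Enum

class Qualifier(Enum):
    NONE = ""   # detected; any problems are minor or irrelevant to the intended use
    U = "U"     # undetected; result is less than the critical level
    J = "J"     # estimated; unusually uncertain or biased, but usable
    R = "R"     # unusable; problems significantly affect decisions based on the data

@dataclass
class ValidatedResult:
    sample_id: str
    analyte: str
    # Each entry pairs a qualifier with the reason it was assigned, so that
    # multiple (even repeated) qualifiers are preserved for final reporting.
    qualifiers: list = field(default_factory=list)

    def add_qualifier(self, qualifier: Qualifier, reason: str) -> None:
        self.qualifiers.append((qualifier, reason))

# Example: one result carrying two qualifiers, each with its own reason.
result = ValidatedResult(sample_id="SAMPLE-001", analyte="Am-241")
result.add_qualifier(Qualifier.J, "LCS percent difference outside QC acceptance limits")
result.add_qualifier(Qualifier.U, "Reported concentration less than the critical level (Lc)")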

19 Use of Audit Information The standard anticipates that specific information gathered and evaluated during audits will be available and referenced during the compliance verification and validation processes, rather than contained in the data package for evaluation. Audit information may include generic audit items, items reviewed during a post-award lab audit, or items reviewed on a regular basis for updates based on a desk audit. Desk audits may be included in the validation process if included in the validation plan.


21 Generic Audit Information Radiochemical procedures Equations Initial Instrument calibrations NIST traceability for equipment and standards Historic internal QC and external performance testing sample results When an audit report is available, deficiencies are evaluated during verification for corrective actions and during validation for data qualification relative to MQOs.

22 Results of External PE Programs to be Forwarded to the Verifier PE program results reviewed during compliance verification and used as a feedback mechanism to the laboratory to correct any major deficiencies. Data verifier should verify that participation meets the requirements of the SOW.

23 Results of External PE Programs to be Forwarded to the Validator The magnitude of the bias and precision shown by the PE program should be viewed in terms of project-identified action levels and the magnitude of the sample data results. External PE program results cannot be applied to qualify any one batch of samples, but the validator may make recommendations to the data quality assessor based on the PE results and their effect on overall data usability.

Questions? 24

25 Sample Specific Parameters Sample Preservation Holding Times Sample-Specific Chemical Yield Required Detection Limit Nuclide Identification Quantification and CSU Detectability Sample Aliquot Representativeness

Sample Preservation 26 Audit Information Relevant procedures Sample preservation documentation Documentation on calibration and maintenance of relevant equipment or instrumentation (refrigerators, pH meters, thermometers, etc.)

27 Sample Preservation (Cont.) Compliance Verification Review laboratory data sheets and/or chain-of-custody records for evidence of sample preservation.

28 Sample Preservation (Cont.) Validation If sample preservation requirements were not followed, all affected sample results are questionable. Qualify all affected sample results as either estimated (J) or unusable (R), depending on the magnitude of the uncertainty introduced compared to the established MQOs.

29 Holding Times Audit Information Relevant procedures Sample documentation Compliance Verification Determine total elapsed time from date of sample collection to date of analysis. If this time exceeds the specified holding time for a given nuclide or matrix, a notation should be made in the verification report.

30 Holding Times (Cont.) Validation Determine whether the results were adversely affected by exceeding the holding time. Qualify all affected sample results as either estimated (J) or unusable (R), depending on the magnitude of the uncertainty caused by the holding time exceedance compared to the established MQOs.
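As a sketch only (the standard does not prescribe an implementation), the elapsed-time check described above could be automated as follows; the holding-time table and its values are hypothetical placeholders:

from datetime import date

# Hypothetical project-specific holding times, in days, keyed by (nuclide, matrix).
HOLDING_TIMES_DAYS = {
    ("H-3", "water"): 180,
    ("Sr-90", "soil"): 180,
}

def holding_time_exceeded(nuclide, matrix, collected, analyzed):
    """Return (exceeded, elapsed_days) so the verifier can note exceedances
    in the verification report."""
    limit_days = HOLDING_TIMES_DAYS[(nuclide, matrix)]
    elapsed_days = (analyzed - collected).days
    return elapsed_days > limit_days, elapsed_days

# Example usage.
exceeded, days = holding_time_exceeded("H-3", "water", date(2013, 1, 10), date(2013, 8, 15))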

31 Sample-specific Chemical Yield Audit Information Relevant procedures Sample documentation Relevant QC data and corrective action on outliers Certification and traceability of tracers Tracer preparation log Chemical yield trending

32 Sample-specific Chemical Yield (Cont.) Compliance Verification If the chemical yield does not meet the method or project requirements, a notation should be made in the verification report.

33 Sample-specific Chemical Yield (Cont.) Validation Calculate the uncertainty of the chemical yield. If the tracer uncertainty (1 σ) is > 10% (or other limits as specified by the APS or MQOs), qualify the sample result as estimated (J) unless the tracer uncertainty has been propagated into the reported measurement uncertainty. If chemical yield is greater than 110%, qualify the sample result as estimated (J) or unusable (R) based on the amount of bias allowed by the MQOs.
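The two yield checks above can be expressed compactly; this is a minimal sketch using the slide's default limits (10% tracer uncertainty, 110% yield), which a project's APS or MQOs may override:

def qualify_chemical_yield(tracer_rel_unc_1sigma, yield_fraction, unc_propagated,
                           tracer_unc_limit=0.10):
    """Return a list of (qualifier, reason) pairs for the sample-specific yield checks."""
    qualifiers = []
    if tracer_rel_unc_1sigma > tracer_unc_limit and not unc_propagated:
        qualifiers.append(("J", "Tracer uncertainty exceeds the limit and was not "
                                "propagated into the reported measurement uncertainty"))
    if yield_fraction > 1.10:
        # J or R depends on the amount of bias allowed by the MQOs; J is shown here.
        qualifiers.append(("J", "Chemical yield greater than 110%"))
    return qualifiers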

34 Required Detection Limit Audit Information SOPs Documentation for the equations and experimental data from which the typical values of the parameters are obtained for calculation of the Lc.

35 Required Detection Limit (Cont.) Compliance Verification For each result that is less than Lc, test whether the RDL has been met, i.e., whether k × CSU ≤ RDL, where CSU is the combined standard uncertainty and k is 3.5 for most applications, assuming α and β probabilities of 0.05 each. However, the appropriate value of k can vary depending on the number of background counts (see MARLAP). If this test is not met, a notation is made in the compliance verification report.

36 Required Detection Limit (Cont.) Validation If the RDL is not met, note in the validation report that the RDL has not been met. If the result is less than Lc and the result plus 1.65 times its CSU is greater than the action level, qualify the result as unusable (R).
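A minimal sketch of the two RDL checks above (assuming k = 3.5 and that all quantities share consistent units); it is illustrative only, not the standard's algorithm:

def check_rdl(result, csu, lc, rdl, action_level, k=3.5):
    """Required detection limit checks for a result reported below the critical level (Lc).
    k = 3.5 assumes alpha and beta probabilities of 0.05 each (see MARLAP for
    adjustments at low background counts)."""
    notes = []
    qualifier = None
    if k * csu > rdl:
        notes.append("RDL not met")                 # flagged during compliance verification
    if result < lc and result + 1.65 * csu > action_level:
        qualifier = "R"                             # cannot rule out exceeding the action level
    return notes, qualifier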

37 Nuclide Identification Audit Information Documentation on measured resolution of the various detectors and the achieved process alpha, beta, and gamma-ray resolutions for typical final sample mounts. Spectral or mathematical unfolding routines/algorithms used in the identification of radionuclides. Basis and/or mathematical algorithms for energy determinations of alpha, beta, and gamma-ray spectra.

38 Nuclide Identification (Cont.) Compliance Verification Verify that the raw spectral data and/or peak search and identification reports have been included in the data package for each analysis, if required by the SOW or other planning documents. Exceptions are noted in the compliance verification report.

39 Nuclide Identification (Cont.) Validation The alpha, beta, or gamma-ray spectra are inspected for the following determinations, if required by the validation plan: obvious misidentification due to improper position of peaks, nonlinear energy response, or skewed spectral peak positions; unresolved multiple peaks or overlapping peak interferences; degradation of resolution resulting from improper sample mounts or final geometry; quenching of liquid scintillation solutions; insufficient counts in the peak for proper determination of the peak centroid.

40 Nuclide Identification (Cont.) Validation (Cont.) For alpha spectrometric applications involving radiotracers, the resolution and centroid position of the peak associated with the radiotracer are evaluated, if required by the validation plan. Independent calculations are performed from instrument QC data to verify the detector resolution and energy calibration parameters (gain and offset values) of the spectrometry system, and the peak centroid energy, if required by the validation plan.

41 Nuclide Identification (Cont.) Validation (Cont.) If the analyte has been misidentified or its identification is highly questionable, the results are qualified as unusable (R). If there is a possibility of several radionuclides present in the sample and the energy resolution of the measurement does not permit proper identification, the affected results are qualified as unusable (R).

42 Nuclide Identification (Cont.) Validation (Cont.) If the quench of a sample being counted by liquid scintillation is severe and no correction has been made for the resulting energy shift, the affected results are qualified as estimated (J) or unusable (R), depending on the severity of the problem. If the energy resolution of the alpha spectral measurement has deteriorated to the point that multiple radionuclide peaks overlap significantly, the affected results are qualified as estimated (J) or unusable (R), depending on the severity of the problem.

43 Quantification and CSU Audit Information Detailed radiochemical and/or radiometric procedures. Documentation of procedure validation. Equations used to calculate the analytical result, CSU, MDC, and L c or DL. Documentation relative to the expected range and boundary values for various parameters used in the quantification process and the calculation of the CSU and a priori MDC. Verification of accurate transfer of information from analytical instrument to a database or reporting system.

44 Quantification and CSU (Cont.) Compliance Verification Spot checks (a percentage defined by the validation plan) are performed to evaluate: The occurrence of transcription errors. The consistency between hard-copy and electronic data submissions. The quantification calculations by independent calculations if required by the verification and validation plan.

45 Quantification and CSU (Cont.) Compliance Verification (Cont.) All quantification data and calculation parameters are verified against requested analyses and reporting requirements.

46 Quantification and CSU (Cont.) Compliance Verification (Cont.) The raw data are reviewed to ensure the following: Procedures and equations are consistent with those required in the written analytical specifications or validated in the audit. Correct dates and time intervals are used in the equations for radioactive decay and ingrowth.

47 Quantification and CSU (Cont.) Validation For parametric, spectral resolution, and calculation outliers/errors, the data are qualified as estimated (J) or unusable (R), depending on the magnitude of the bias introduced compared to the established MQOs. For concentrations greater than ten times the MDC, if CSU > 0.25 × R_S (where R_S is the reported sample result), there is excessive uncertainty in the measurement and further review may be necessary. An estimated (J) qualifier may be assigned to the sample data.

48 Quantification and CSU (Cont.) Validation (Cont.) If a net negative result is more negative than 2 times its CSU (i.e., the result is less than −2 × CSU), there is a negative bias resulting from improper background subtraction, and the data shall be qualified as estimated (J) or unusable (R), depending on the magnitude of the error introduced compared to the established MQOs.
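The two CSU reasonableness checks above could be sketched as follows (illustrative only; the choice between J and R for the negative-bias case depends on the MQOs):

def check_quantification(result, csu, mdc):
    """Return (qualifier, reason) for the quantification/CSU checks, or (None, '')."""
    if result > 10 * mdc and csu > 0.25 * result:
        return "J", "Relative CSU exceeds 25% for a result well above the MDC"
    if result < 0 and result < -2 * csu:
        # J or R depends on the magnitude of the bias relative to the MQOs; J is shown here.
        return "J", "Net negative result more negative than -2 x CSU (background subtraction bias)"
    return None, ""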

49 Detectability Audit Information When Lc is calculated by the laboratory, the equation and data used for its derivation are normally reviewed during the audit or desk audit. When an audit report is available, Lc calculation deficiencies identified are evaluated relative to the MQOs during verification and validation to determine when corrective actions and/or data qualification are to be performed. When documentation of changes in the method used to calculate Lc is provided in the data package, the experimental data and equations from which the Lc values are obtained are reviewed.

50 Detectability (Cont.) Compliance Verification When Lc is not calculated by the laboratory, the sample CSU may be used for its derivation: Lc = 1.65 × CSU_R (the combined standard uncertainty of the reported result), or Lc may be derived from a set of blank data: Lc = [(t × S_B) + R_B] / (E × R × IDF × W). Validation If the analyte concentration is found to be less than Lc, an undetected (U) qualifier shall be applied to the data result. Lc = critical level; CSU = combined standard uncertainty; t = Student's t factor; S_B = standard deviation of the set of blanks; R_B = average blank count rate; E = fractional detector efficiency; R = fractional chemical yield; IDF = ingrowth/decay factor; W = weight or volume
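As a sketch of the blank-based derivation transcribed above (assuming the blank count rates, efficiency, yield, ingrowth/decay factor, and aliquot size are already in consistent units; names are illustrative):

import statistics

def lc_from_blanks(blank_count_rates, t_factor, efficiency, chem_yield, idf, aliquot_size):
    """Critical level in concentration units: Lc = (t * S_B + R_B) / (E * R * IDF * W).
    Requires at least two blank count rates."""
    s_b = statistics.stdev(blank_count_rates)   # standard deviation of the blank count rates
    r_b = statistics.mean(blank_count_rates)    # average blank count rate
    return (t_factor * s_b + r_b) / (efficiency * chem_yield * idf * aliquot_size)

def lc_from_csu(csu_of_result):
    """Alternative derivation from the sample CSU: Lc = 1.65 * CSU."""
    return 1.65 * csu_of_result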

51 Sample Aliquot Representativeness Audit Information Subsampling procedures, subsample sizes, and homogenization methods. Compliance Verification Verify that required homogenization techniques and aliquot sizes were used. Exceptions are noted in the compliance verification report.

52 Sample Aliquot Representativeness (Cont.) Validation Results from different but comparable analytical techniques applied to different subsample aliquots of the same sample should be compared for consistency. If the laboratory did not follow the required homogenization techniques or use the required aliquot sizes, qualify the affected results as estimated (J) or unusable (R), depending on the magnitude of the uncertainty introduced compared to the established MQOs.

Questions? 53

54 Batch Control Parameters Laboratory Control Sample; Matrix Spike Sample; Duplicate and MS Duplicate Sample Analysis; Batch Method Blank Analysis

55 Batch Control Parameters (Cont.) Batch = A group of samples prepared at the same time, by the same analyst, in the same location, and using the same method.

56 Laboratory Control Sample Audit Information Algorithm used to calculate the LCS percent difference (LCS %D). Quality control charts. LCS_M = LCS measured value; LCS_E = LCS expected value

57 Laboratory Control Sample (Cont.) Compliance Verification Review the results for each batch to ensure that the required number or frequency of LCSs was included with the sample batch. Review the results for each batch to determine if the percent difference was within the QC acceptance limits. Exceptions are noted in the compliance verification report.

58 Laboratory Control Sample (Cont.) Validation If LCSs were not performed at the frequency specified in the MQOs, qualify the data for all samples analyzed with the batch as estimated (J). If the percent difference for the LCSs was not within the QC acceptance limits as established in the MQOs (accuracy), qualify the data for all samples analyzed with the batch as estimated (J).
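The %D equation on the audit-information slide did not survive transcription; the sketch below assumes the conventional definition, %D = 100 × (LCS_M − LCS_E) / LCS_E, and applies the two validation rules above (illustrative only):

def lcs_percent_difference(lcs_measured, lcs_expected):
    """LCS %D under the conventional definition (an assumption, not quoted from the standard)."""
    return 100.0 * (lcs_measured - lcs_expected) / lcs_expected

def qualify_lcs_batch(percent_difference, lower_limit, upper_limit, frequency_met):
    """Return 'J' for all samples in the batch if the LCS frequency or accuracy criterion fails."""
    if not frequency_met or not (lower_limit <= percent_difference <= upper_limit):
        return "J"
    return None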

59 Matrix Spike Analysis Audit Information Algorithm used to calculate the MSS percent difference (MSS %D). Quality control charts. SSR = Sample spike result; SR = Sample result; SA = Spike activity

60 Matrix Spike Analysis (Cont.) Compliance Verification Review the results for each batch to ensure that the required number or frequency of MSSs was included with the sample batch. Review the results for each batch to determine if the percent difference was within the QC acceptance limits. Exceptions are noted in the compliance verification report.

61 Matrix Spike Analysis (Cont.) Validation If MSSs were not performed at the frequency specified in the MQOs, qualify the data for all samples analyzed with the batch as estimated (J). If the percent difference for the MSSs was not within the QC acceptance limits as established in the MQOs (accuracy), qualify the data for all samples analyzed with the batch as estimated (J).
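Similarly, the matrix spike %D algorithm itself is reviewed from the laboratory's audit records; the sketch below uses the common spike-recovery convention, 100 × (SSR − SR) / SA, as an assumption rather than the standard's exact formula:

def matrix_spike_percent_recovery(ssr, sr, sa):
    """Matrix spike recovery from the spiked sample result (SSR), unspiked sample
    result (SR), and spike activity added (SA); convention assumed, not quoted."""
    return 100.0 * (ssr - sr) / sa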

62 Duplicate and MSD Sample Analysis Audit Information Algorithm used to calculate the relative percent difference (RPD) and duplicate error ratio (DER). Quality control charts. S = Sample result; D = Duplicate result; CSU_S = Combined standard uncertainty of sample; CSU_D = Combined standard uncertainty of duplicate

63 Duplicate and MSD Sample Analysis (Cont.) Compliance Verification Review the results for each batch to ensure that the required number or frequency of duplicates and/or MSDs was included with the sample batch. Review the results for each batch to determine if the duplicates (laboratory, field, or MSD) were within the QC acceptance limits. If the RPD was not within QC limits, verify that the DER is within limits set by the DQO process. A limit of 2 provides a 5% false conclusion rate and a limit of 2.58 provides a 1% false conclusion rate. Exceptions are noted in the compliance verification report.
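A minimal sketch of the RPD and DER tests referenced above, assuming the usual definitions (RPD as a percent of the pair mean; DER as the difference normalized by the combined standard uncertainties of the pair):

import math

def relative_percent_difference(s, d):
    """RPD between a sample result (S) and its duplicate (D)."""
    return 100.0 * abs(s - d) / ((s + d) / 2.0)

def duplicate_error_ratio(s, d, csu_s, csu_d):
    """DER: |S - D| normalized by the pair's combined standard uncertainties.
    A limit of 2 corresponds to roughly a 5% false-conclusion rate; 2.58 to about 1%."""
    return abs(s - d) / math.sqrt(csu_s ** 2 + csu_d ** 2)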

64 Duplicate and MSD Sample Analysis (Cont.) Validation If laboratory duplicates and/or MSDs were not performed at the frequency specified in the MQOs, qualify the data for all samples analyzed with the batch as estimated (J). If the precision for the duplicates (laboratory, field, or MSD) was not within the QC acceptance limits as established in the MQOs (precision), qualify the data for all samples analyzed with the batch as estimated (J).

65 Batch Method Blank Analysis Compliance Verification Review the results for each batch to ensure that the required number or frequency of blanks was included with the sample batch. Review the results for each batch to verify the batch method blank is less than 1.65 CSU (when background is subtracted) and/or within control limits (when the method blank is subtracted). Exceptions are noted in the compliance verification report.

66 Batch Method Blank Analysis (Cont.) Validation If batch method blanks were not performed at the frequency specified in the MQOs, qualify the data for all samples analyzed with the batch as estimated (J). If the net batch method blank result was not less than 1.65 CSU or was outside control limits (depending on applicable test): Qualify the results for all samples analyzed with the batch that are less than ten times the net batch method blank value as estimated (J). Qualify the results for all samples analyzed with the batch that are less than the sum of the net batch method blank and its 1.65 CSU as undetected (U).
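The batch method blank qualification logic above could be sketched as follows (illustrative; whether the 1.65 CSU test or the control-limit test applies depends on how the laboratory treats the blank):

def qualify_against_method_blank(sample_result, net_blank, blank_csu,
                                 frequency_met, blank_acceptable):
    """Return a list of (qualifier, reason) pairs for one sample in the batch."""
    qualifiers = []
    if not frequency_met:
        qualifiers.append(("J", "Method blank frequency requirement not met"))
    if not blank_acceptable:  # blank exceeded 1.65 CSU or fell outside control limits
        if sample_result < 10 * net_blank:
            qualifiers.append(("J", "Result less than ten times the net method blank"))
        if sample_result < net_blank + 1.65 * blank_csu:
            qualifiers.append(("U", "Result less than the net method blank plus 1.65 x its CSU"))
    return qualifiers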

Questions? 67

68 Instrument Parameters Counting Efficiency Calibration Energy Calibration Background Determination

69 Counting Efficiency Calibration Audit Information Date of efficiency calibration and date that the new calibration factors were effective. Standard preparation log for the counting geometry standard with traceability to a certified reference material or standard. Certificate for certified reference material or standard. Counting time for the standard. Raw count results for the standard. Calculations showing derivation of the counting efficiency factor or statistical curve fit.

70 Counting Efficiency Calibration (Cont.) Audit Information (Cont.) If the laboratory recalculates the efficiency calibration after the audit and documentation of changes is provided in the data package, the experimental data and equations from which the efficiency calibrations are obtained are reviewed via a desk audit or during data validation.

71 Counting Efficiency Calibration (Cont.) Compliance Verification Verify that the instrument's most recent efficiency calibration was performed at the required frequency. Verify that efficiency performance checks are analyzed prior to the counting of samples each day that samples are counted. Verify performance check count-rate results are within properly established tolerance limits (based on system performance and analytical MQOs) or that recalibration was performed whenever the limits were exceeded. The limits are related to the mean count-rate value established at the time of calibration for each detector. Evaluate check source counting statistics to verify that the counting uncertainty (1σ) was less than or equal to one-fifth of the MQO.
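A sketch of the daily efficiency performance-check tests above; expressing the tolerance as a fraction of the calibration mean count rate is an assumption, since the standard leaves the limits to system performance and the MQOs:

def efficiency_check_acceptable(check_count_rate, calibration_mean_rate,
                                tolerance_fraction, counting_rel_unc_1sigma, mqo_rel_unc):
    """Return True if the daily check source count rate is within tolerance of the
    mean rate established at calibration and its 1-sigma counting uncertainty is
    no more than one-fifth of the MQO."""
    within_tolerance = abs(check_count_rate - calibration_mean_rate) \
        <= tolerance_fraction * calibration_mean_rate
    statistics_ok = counting_rel_unc_1sigma <= mqo_rel_unc / 5.0
    return within_tolerance and statistics_ok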

72 Counting Efficiency Calibration (Cont.) Validation If the specified efficiency calibration and/or verification frequency is not followed, the efficiency or quench curves are not smooth, or the QC performance check results fall outside the appropriate tolerance limits, qualify the results for all samples analyzed between acceptable calibration verifications as estimated (J) or unusable (R), depending on the magnitude of the error based on the established MQOs. When significant errors are found in the calculation, qualify all affected results as either estimated (J) or unusable (R), depending on the magnitude of the error based on the established MQOs.

73 Energy Calibration Audit Information Date of energy calibration and date that the new calibration factors were effective. Certificate for calibration standard. Peak centroid for all peaks used for calibration. Procedure and calculations showing derivation of the energy calibration gain and offset factors or other curve-fit parameters. Voltage, gain, and cross-talk calibration data, depending on specific instruments. Discriminator or region of interest setting determinations. Energy resolution (full-width at half-maximum) calibration data for spectroscopy systems.

74 Energy Calibration (Cont.) Audit Information (Cont.) If the laboratory recalculates the energy calibration after the audit and documentation of changes is provided in the data package, the experimental data and equations from which the energy calibrations are obtained are reviewed via a desk audit or during data validation.

75 Energy Calibration (Cont.) Compliance Verification Verify that the instrument's most recent energy calibration was performed at the required frequency. Verify that energy performance checks are analyzed prior to the counting of samples each day that samples are counted. Verify performance check peak centroid or calculated energy is within properly established tolerance limits (based on system performance and analytical MQOs), or that recalibration was performed whenever the limits were exceeded. The limits are the energy tolerance used for peak identification for the samples.

76 Energy Calibration (Cont.) Validation If the specified energy calibration and/or verification frequency is not followed, the efficiency or quench curves are not smooth, or the QC performance check results fall outside the appropriate tolerance limits, qualify the results for all samples analyzed between acceptable calibration verifications as unusable (R) if the error is great enough to cause misidentification of the radionuclide (outside the peak identification energy tolerance limit). When significant errors are found in the calculation, qualify all affected results as either estimated (J) or unusable (R), depending on the magnitude of the error based on the established MQOs.

77 Background Determination Audit Information Date of background determination and date that the new factors were effective (for calculation). Counting time for the background determination. Raw background count results. Calculations showing derivation of the background count-rate factor. Review data fit when multiple factors are used to produce a background (i.e., quench) curve.

78 Background Determination (Cont.) Audit Information (Cont.) Review of background individual energy ranges for spectroscopy measurements. If the laboratory recalculates the background values after the audit and documentation of changes is provided in the data package, the experimental data and equations from which the background values are obtained are reviewed via a desk audit or during data validation.

79 Background Determination (Cont.) Compliance Verification Verify that the instrument's most recent background determination was performed each time there was a significant instrument operational change and at the required frequency as stated in the SOW or QAPP. Verify that background performance checks are analyzed at the required frequency as stated in the SOW or QAPP. Verify background performance check count rates are within properly established tolerance limits (based on system performance and analytical MQOs) or that redetermination was performed whenever the limits were exceeded. The limits are related to the mean count-rate value established at the time of background determination for each detector.

80 Background Determination (Cont.) Validation If the specified background determination and/or verification frequency is not followed, the background quench curves are not smooth, or the QC performance check results fall outside the appropriate tolerance limits, qualify the results for all samples analyzed between acceptable background verifications as either estimated (J) or unusable (R), depending on the magnitude of the error based on the established MQOs. When significant errors are found in the calculation, qualify all affected results as either estimated (J) or unusable (R), depending on the magnitude of the error based on the established MQOs.

Questions? 81

82 Personnel Qualifications Verifier Validator Auditor

83 Verifier Qualifications A high school diploma or AA degree. 2 years of radiochemical laboratory experience including chemical separations, nuclear instrumentation, and record keeping. Familiarity with radiochemical, nuclear instrumentation, and QC procedures.

84 Validator Qualifications BS or BA degree in chemistry or related physical sciences or engineering disciplines. 3 years of radiochemical laboratory experience including sample preparation, radiochemical procedures, and measurement instrumentation. 2 years of experience in data interpretation and review. Familiarity with the DQO process and statistical concepts, inferences, interpretation, and tests.

85 Auditor Qualifications BS or BA degree in chemistry or related physical sciences or engineering disciplines (years of related experience may substitute). 4 years of radiochemical laboratory experience including sample preparation, radiochemical procedures, and measurement instrumentation. 3 years of experience in data interpretation and review. Completion of internal or external auditor training. Familiarity with the DQO process and statistical concepts, inferences, interpretation, and tests.

Questions? 86

Upcoming Webinars in the Environmental/Bioassay Radiochemistry Series EPA Incident Response Guide and Rapid Methods Overview Traceability and Uncertainty Subsampling Mass Spectrometry Gamma Spectrometry (Parts I & II) Radiobioassay GUM