
1 A Historical Survey of Radiochemistry Laboratory Audit Findings
Bob Shannon
Contact: BobShannon91@earthlink.net
Presented June 27, 2006 at the 16th Annual RETS-REMP Workshop

2 Changes in Radioanalytical Capabilities and Needs
- Transition from site labs to contract labs
- Increasingly complex SOWs
- Data processing enables complex data work-up and reporting
- Increasing concern about defensibility of data
- Rules, rules, rules: raw data, and rules, rules, rules...

3 Integrated Contractor Purchasing Team (ICPT) Basic Ordering Agreement (BOA)
- Mid-1990s: GAO pointed out a lack of coordination in oversight of analytical services
- 1999: formation of the ICPT BOA, a national contract vehicle for DOE environmental lab services

4 Increased Efficiency and Quality through Standardized Requirements
- A single SOW with standardized requirements levels the playing field
- Site-specific SOWs address unique needs (...wants?)
- With fewer SOWs and deliverables, labs should be able to operate more efficiently and concentrate on quality

5 DOECAP: DOE Consolidated Audit Program
- National team of technically qualified auditors supplied by DOE sites
- Fewer audits save sites and labs time and money

6 The QSAS is NELAC... and more...

7 QSAS: Quality Systems for Analytical Services (Revision 2.2, November 2005)
- A single, integrated QA program for environmental analytical laboratories supporting DOE
- A step toward harmonizing analytical quality requirements across Federal agencies

8 Technical Basis for QSAS
- Chapter 5 of the NELAC Standard (2003)
- Gray text supplements, notes deviations from, and clarifies the NELAC Standard
- Compliant with DOE Order 414.1A, Quality Assurance
- The QSAS and audit checklists are available at: http://www.oro.doe.gov/DOECAP

9 An evaluation of historical findings and observations was conducted:
Priority 1 Findings: 9 (2%)
Priority 2 Findings: 289 (50%)
Observations: 276 (48%)
Total Findings and Observations: 574 (100%)

10 Frequency of Findings / Observations

11 Frequency of Findings / Observations (breakdown of the "Other" category)
Corrective action process: 1.4%
Personnel: 1.0%
RM licensing: 0.9%
Communication: 0.5%
Quality systems: 0.5%
Capability: 0.3%
Narrative: 0.3%
Software QA: 0.3%
Configuration control: 0.2%
Miscellaneous: 2.3%

12 Findings and Observations by Deficiency Type
Established criteria inadequate: 14%
Documentation inadequate: 10%
Proceduralization inadequate: 10%
Requirement not performed: 8%
Criteria not established: 8%
Technically inadequate: 6%
DQOs not satisfied: 5%
Review inadequate: 4%
Frequency requirements not satisfied: 4%
Corrective action not performed: 3%
Implementation inadequate: 3%
Cannot demonstrate control over process: 3%
Standard expired: 2%
Tracer not added before prep: 1%
No expiration date: 1%
No second source: 1%
QC and field samples treated differently: 1%
Training inadequate: 1%
Tracer contaminant check not performed: 0.9%
Minimum count criteria not satisfied: 0.9%
Configuration control inadequate: 0.7%
Point source used for plateau: 0.7%
Geometry mismatch: 0.5%
Applied beyond calibrated range (quench or attenuation): 0.5%
Wrong constant (2.71) used in MDA: 0.5% (see the note below)
Others: 9%
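For context on the 2.71 entry: it is the constant in the widely used Currie-style detection limit. A minimal reminder of the common form, assuming paired sample and background counts of equal duration and 5% false-positive/false-negative rates (symbols here are generic, not taken from the presentation):

\[
L_D = 2.71 + 4.65\sqrt{B}, \qquad
\mathrm{MDA} = \frac{2.71 + 4.65\sqrt{B}}{\varepsilon\, Y\, t}
\]

where B is the background counts in count time t, ε the counting efficiency and Y the chemical yield. Omitting or misplacing the 2.71 term is the deficiency tallied above.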

13 General Calibration Practices (16%)
Almost all labs perform technically adequate calibrations for efficiency, self-absorption, quench, energy and shape. The most common weaknesses include:
- Processes not adequately proceduralized:
  - Definition of nuclide, geometry, activity, matrix
  - Calibration standard preparation
  - Counting procedures
  - Calculations, documentation and review
- Standards issues:
  - Standards not traceable, expired, or not verified
  - Standards use or preparation not adequately documented
  - Calibration source not independent of QC samples

14 General Calibration Practices (16%)
- Acceptance criteria for calibrations:
  - Criteria not established
  - Curve-fit acceptance criteria not documented
  - Count precision requirements not met
  - Frequency requirements not met
- Background determinations:
  - Frequency requirements not met
  - Background count time shorter than associated samples (see the counting-statistics note below)
  - Geometry or mount not representative of samples (e.g., LSC / filters)
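The background count time item has a simple statistical basis. From standard Poisson counting statistics (a textbook relation, not a QSAS formula), the net count rate uncertainty is

\[
u(R_{\mathrm{net}}) = \sqrt{\frac{R_s}{t_s} + \frac{R_b}{t_b}}
\]

where R_s and R_b are the gross and background count rates and t_s, t_b the respective count times. A background count shorter than the sample count (small t_b) inflates the second term, which dominates the uncertainty of near-background environmental samples.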

15 GPC Set-up / Calibration Practices
GPC plateau measurements / discriminator set-up:
- Slope does not meet acceptance criteria (note that the slope at the HV set point may approach or exceed 5%; see the slope sketch below)
- Discriminators not set consistently between detectors
- Instrument not recalibrated when HV is reset
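One common way to express the plateau slope is percent change in count rate per 100 V. A minimal sketch in Python; the function name, scan data and any acceptance limit are illustrative assumptions, not values from the presentation:

# Hypothetical sketch: estimate a GPC plateau slope (% per 100 V) from an HV scan.
def plateau_slope_pct_per_100v(hv, rate, v1, v2):
    """Percent change in count rate per 100 V between two points on the plateau."""
    r1 = rate[hv.index(v1)]
    r2 = rate[hv.index(v2)]
    return 100.0 * (r2 - r1) / r1 / ((v2 - v1) / 100.0)

hv   = [1200, 1300, 1400, 1500, 1600]       # volts (example scan)
rate = [410.0, 480.0, 495.0, 503.0, 530.0]  # counts per minute (example)
slope = plateau_slope_pct_per_100v(hv, rate, 1300, 1500)
print(f"plateau slope = {slope:.1f} %/100 V")  # compare against the lab's criterion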

16 GPC Performance Checks
- Discrete alpha and beta sources not used for daily checks to track major and minor channel response (these support both efficiency and crosstalk calibrations)
- Limits do not reflect conditions at the point of calibration (a rolling mean does not reflect conditions prior to the first point averaged; see the sketch below)
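To illustrate the rolling-mean caveat, a small sketch with synthetic numbers (nothing here comes from audit data): limits recomputed over a moving window recenter themselves on instrument drift, while limits anchored at calibration expose it.

# Illustration of the rolling-mean pitfall: limits recentered on a moving window
# drift along with the detector instead of reflecting conditions at calibration.
def control_limits(points, k=3.0):
    n = len(points)
    mean = sum(points) / n
    sd = (sum((x - mean) ** 2 for x in points) / (n - 1)) ** 0.5
    return mean - k * sd, mean + k * sd

baseline = [100.2, 99.8, 100.5, 99.6, 100.1, 99.9, 100.3, 100.0]  # checks at calibration
drifting = baseline + [101.0, 102.1, 103.2, 104.0]                # slow efficiency drift

fixed_limits   = control_limits(baseline)        # anchored to the calibration state
rolling_limits = control_limits(drifting[-8:])   # recentered window: the drift hides inside
print(fixed_limits, rolling_limits)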

17 GPC Calibration Practices
GPC efficiency calibrations:
- Choice of reference nuclide inappropriate
  - Th-230 and Cs-137 or Sr-90/Y-90 are the required reference nuclides for gross alpha and beta drinking water compliance testing
  - Other nuclides (non-DW) per application: consider the technical basis; document in raw data and case narrative
- Standards not representative of samples (range of masses, chemical composition, source geometry and configuration)
- Self-absorption curve does not span the range of masses for analysis

18 GPC Calibration Practices
- Self-absorption and crosstalk calibrations not performed using the reference nuclide
- Crosstalk calibrations not performed (required for gross alpha / beta)
- Crosstalk calibration does not span the self-absorption curve mass range (see the range-check sketch below)
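One way a lab might guard against applying a curve beyond its calibrated span is to refuse to extrapolate. A hypothetical sketch; the masses, efficiencies and linear interpolation are assumptions for illustration only:

# Hypothetical guard: only interpolate the self-absorption curve inside
# the calibrated mass range; flag samples outside it rather than extrapolating.
CAL_MASSES = [0.0, 25.0, 50.0, 100.0, 200.0]   # mg of residue on planchet (example)
CAL_EFFS   = [0.42, 0.38, 0.34, 0.28, 0.20]    # measured efficiencies (example)

def efficiency_at(mass_mg):
    if not CAL_MASSES[0] <= mass_mg <= CAL_MASSES[-1]:
        raise ValueError(f"mass {mass_mg} mg outside calibrated range "
                         f"{CAL_MASSES[0]}-{CAL_MASSES[-1]} mg")
    for (m0, e0), (m1, e1) in zip(zip(CAL_MASSES, CAL_EFFS),
                                  zip(CAL_MASSES[1:], CAL_EFFS[1:])):
        if m0 <= mass_mg <= m1:  # linear interpolation between bracketing points
            return e0 + (e1 - e0) * (mass_mg - m0) / (m1 - m0)

print(efficiency_at(75.0))   # OK: inside the calibrated range
# efficiency_at(250.0)       # raises: beyond the calibrated range -> audit finding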

19 Technical Issues (14%)
- Most technical practices at most labs are adequate to provide acceptable data
- Internal and external QC results, and experience with reported data, reveal these problems
- Technical issues tend to reflect staff expertise and the level of scrutiny to which the lab's data is subjected

20 Technical Issues (Examples)
Gamma spectroscopy:
- Software not configured properly for intended purpose
- Software parameter set-up not proceduralized / documented
- Gamma spec library not based on technically defensible assumptions (e.g., U-238 determined in a water sample by measurement of Th-234, although secular equilibrium cannot be defended; see the ingrowth sketch below)
- Sample geometry does not conform to calibration geometry
Other examples:
- Digestion processes inadequate
- LSC background not representative of samples
- Method validation not technically adequate or documented
- Reagent preparation instructions inaccurate
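The defensibility of the U-238 via Th-234 example hinges on ingrowth time since the last chemical separation. A quick check of the equilibrium fraction, assuming no Th-234 at separation and its roughly 24.1-day half-life:

import math

# Ingrowth of Th-234 toward secular equilibrium with U-238,
# assuming no Th-234 present at separation (half-life ~24.1 d).
TH234_HALF_LIFE_D = 24.1

def ingrowth_fraction(t_days):
    lam = math.log(2) / TH234_HALF_LIFE_D
    return 1.0 - math.exp(-lam * t_days)

for t in (7, 30, 60, 120):
    print(f"{t:4d} d: {ingrowth_fraction(t):.1%} of equilibrium")

Equilibrium is only approached after roughly five half-lives (about four months), which is why the assumption must be defended rather than taken for granted.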

21 Standard Reference Materials (14%)
- Standards preparation documentation inaccurate or ambiguous
- Standards use ambiguously documented
- Standards not NIST traceable (or per ASTM C1128)
- Standards verification requirements not met:
  - Annual reverification not performed
  - Criteria not formalized
  - Tracers not tested for contaminants (e.g., Am-241 in Am-243)
  - Independent (second) source not used for verification
- Expiration date issues

22 Quality Control Practices (7%)
- Historical performance (control charts) does not support the QC limits applied
- QC samples not subjected to the same process as samples, especially blanks!
- QC sample evaluation acceptance criteria ambiguous or not defined

23 Quality Control Practices (7%)
- PT failure without corrective action
- Chemical yield practices inadequate:
  - Yield monitor not added early in the process (prior to digestion / separation)
  - Acceptance criteria not established, proceduralized or implemented
  - Overcorrection for yields >> 100% (see the yield-check sketch below)
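A minimal sketch of a yield acceptance check that catches the >>100% case before the result is corrected; the window limits and function are illustrative assumptions, not QSAS requirements:

# Hypothetical yield acceptance check: a yield far above 100% is physically
# impossible, and dividing by it would overcorrect (bias the result low).
YIELD_MIN, YIELD_MAX = 0.30, 1.10   # example acceptance window

def check_yield(tracer_recovered, tracer_added):
    y = tracer_recovered / tracer_added
    if not YIELD_MIN <= y <= YIELD_MAX:
        raise ValueError(f"chemical yield {y:.0%} outside {YIELD_MIN:.0%}-{YIELD_MAX:.0%}; "
                         "investigate before correcting the result")
    return y

activity = 12.4 / check_yield(0.85, 1.00)   # net result corrected by an acceptable yield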

24 Calculations (7%)
- Inadequate software V&V
- Inadequate configuration control
- Spreadsheets not protected
- Instrument settings not documented or checked
- Calculations incorrectly implemented
- CSU (combined standard uncertainty) not reported, not proceduralized, or technical basis not documented / not adequate (see the propagation formula below)
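For the CSU item, a common first-order propagation, shown only as a sketch under the assumptions of uncorrelated inputs and a result of the form A = R_net / (ε Y V):

\[
\frac{u_c(A)}{A} = \sqrt{\left(\frac{u(R_{\mathrm{net}})}{R_{\mathrm{net}}}\right)^2
+ \left(\frac{u(\varepsilon)}{\varepsilon}\right)^2
+ \left(\frac{u(Y)}{Y}\right)^2
+ \left(\frac{u(V)}{V}\right)^2}
\]

An audit-ready implementation documents which terms are included and why, which is exactly the "technical basis" the finding refers to.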

25 Standard Operating Procedures (7%)
- Procedures don't support practice
- Details inadequate or ambiguous
- Uncontrolled SOPs in use
- No SOP for method (especially for hard-to-detect nuclides, HTDs)
- Deviations from regulatory reference methods not documented

26 Contamination Control (6%)
- Program for contamination control not proceduralized, or inadequate
- Instrument background not checked after high-activity samples (see the background-check sketch below)
- No corrective action after a background check failure
- Detector cleaned prior to background checks (prevents identification of cross-contamination)
- Inadequate separation of work areas for environmental versus elevated-activity operations
- Glassware cleaning practices inadequate
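A sketch of a post-run background check, comparing a background count taken after high-activity samples to the expected (calibration) background; the 3-sigma Poisson criterion and the counts are illustrative assumptions:

import math

# Illustrative post-run background check using a simple Poisson criterion.
def background_ok(observed_counts, expected_counts, k=3.0):
    sigma = math.sqrt(expected_counts)      # Poisson std. dev. of the expected counts
    return observed_counts <= expected_counts + k * sigma

if not background_ok(observed_counts=148, expected_counts=110):
    print("background elevated: investigate cross-contamination before counting samples")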

27 Conclusions
- Technical quality has improved over the lifetime of DOECAP
- Labs are less successful at keeping pace with increased QA and documentation requirements
- Labs and their customers must work to maintain both the technical and the legal defensibility of their data; besides, that is nothing more than good science
- Harmonizing requirements around quality standards (e.g., NELAC) simplifies requirements for labs and leads to more effective operation, lower costs and better data
- Caveat emptor!

28 Findings and Observations by Deficiency Type (repeat of slide 12)

29 Acknowledgments
DOECAP Rad Labs, DOECAP Auditors, DOECAP Management Team
Graphic design support: Maya Shannon
Correspondence:
Bob Shannon
QRS, LLC
BobShannon91@earthlink.net
2349 Bellaire St, Denver, CO 80207
303-432-1137

