
4-1 Demonstration of Method Applicability (DMA) for Field Portable XRF. CLU-IN Studios Web Seminar, August 14, 2008. Stephen Dyment, USEPA Technology Innovation Field Services Division.

4-2 Logistics  How to »Ask questions: use the “?” button on the CLU-IN page »Control slides as the presentation proceeds: manually advance slides »Review archived sessions »Contact the instructor

4-3 A Demonstration of What?  Relax »It can be as easy or as complex as you need it to be  Commensurate with... »Complexity of project »Expected use of data  Fairly simple for most XRF applications

4-4 DMA History  Concept founded in the SW-846 performance-based measurement system (PBMS) initiative  Initial site-specific performance evaluation »Analytical and direct sensing methods »Sample design, sample collection techniques, sample preparation strategies »Used to select information sources for field and off-site analysis  Goal is to establish that proposed technologies and strategies can provide information appropriate to meet project decision criteria. “I think that in the discussion of natural problems we ought to begin not with the Scriptures, but with experiments, and demonstrations.” Galileo Galilei

4-5 DMA Technical Bulletin  New publication of the Brownfields Technology Support Center (BTSC)  15-page technical bulletin describing »DMA components, benefits, considerations »Common products, data evaluation »3 short case studies  Audience: technical team members and stakeholders

4-6 Why Do You Need a DMA?  Triad usually involves real-time measurements to drive dynamic work strategies (DWS)  Greatest sources of uncertainty are usually sample heterogeneity and spatial variability  Relationships with established laboratory methods often required to make defensible decisions  Provides an initial look at CSM assumptions

4-7 What’s Involved?  There is no template for DMAs! »Format, timing, documentation, etc. depend heavily on site specifics, existing information, and intended data use  Performed early in the program  Go beyond simple technology evaluation to optimize full-scale deployment »Sample design, decision and unit designations »Sample prep, throughput, other logistics »Data management issues  Documentation

4-8 What to Look For...  Effectiveness: does it work as advertised?  QA/QC issues »Are DLs and RLs for site matrices sufficient? »What is the expected variability? Precision? »Bias, false positives/false negatives? »How does sample support affect results? »Develop initial relationships between collaborative data sets that provide the framework for a preliminary QC program  Matrix issues?  Do collaborative data sets lead to the same decision?  Assessing alternative strategies as contingencies

4-9 Is a DMA Always Appropriate?  No »SI guidance indicates <20 samples »Some activities with limited scope or resources »Projects with adequate resources to employ established mobile or fixed lab methods at sufficient density*  But »Usually appropriate and simple for XRF (continued)

4-10 Is a DMA Always Appropriate?  The Brownfields perception »“a property, redevelopment, or reuse which may be complicated by the presence or potential presence of a hazardous substance, pollutant, or contaminant” »Underscores the need for higher density information, collaborative data sets »Facilitates stakeholder communication and public presentations

4-11 More Benefits  Augment planned data collection and CSM development activities  Test drive communication and data management schemes, decision support tools (DSTs) »Sampling and statistical tools »Visualization tools, data management tools  Develop relationships between visual observations and direct sensing tools  Give flexibility to change tactics based on DMA rather than full implementation  Establish initial decision logic for DWS  Evaluate existing contract mechanisms  Optimize sequencing, staffing, load balance, unitizing costs

4-12 Choosing Samples for a DMA  Ideally across the expected range of concentrations with a focus around action levels  Limited by difficulty in obtaining material (depth, drilling, etc.)  Multiple matrices?  Problematic, interesting, or strange samples make a great choice »Remember dynamic range issues

4-13 Considerations for Evaluation  Instrument operation »Count times, in-situ, ex-situ, calibration  Sample Preparation »Bags, cups, drying, sieving, grinding  Sample variability »How will you assess and then manage?  Collaborative Samples »Consider decision rules for choosing lab samples  QC program »Frequencies and level of QC required

4-14 Typical DMA Products – Summary Statistics

4-15 Typical DMA Products  Parametric – Linear regressions

4-16 Appropriate Regression Analysis  Based on paired analytical results, ideally from the same sub-sample  Paired results focus on concentration ranges pertinent to decision-making  Non-detects are removed from the data set  Best regression results obtained when pairs are balanced at opposite ends of the range of interest  No evidence of inexplicable “outliers”  No signs of correlated residuals  High R² values (close to 1)  Constant residual variance (homoscedasticity) is nice but unrealistic
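The fit diagnostics above (slope, intercept, R², residual behavior) are easy to check programmatically. A minimal sketch with NumPy; the paired XRF/ICP values here are hypothetical, and the lag-1 residual correlation is only a quick screen for the correlated-residual problem:

```python
import numpy as np

# Hypothetical paired DMA results (ppm): XRF field readings vs. fixed-lab
# ICP values for the same sub-samples, spanning the range of interest.
xrf = np.array([12.0, 18.0, 25.0, 40.0, 55.0, 80.0, 120.0, 150.0])
icp = np.array([10.0, 20.0, 27.0, 38.0, 60.0, 75.0, 115.0, 160.0])

# Ordinary least-squares fit: icp ~ slope * xrf + intercept
slope, intercept = np.polyfit(xrf, icp, 1)
predicted = slope * xrf + intercept
residuals = icp - predicted

# R^2: fraction of lab-result variance explained by the XRF readings
ss_res = np.sum(residuals ** 2)
ss_tot = np.sum((icp - icp.mean()) ** 2)
r_squared = 1 - ss_res / ss_tot

# Lag-1 autocorrelation of residuals (ordered by concentration) as a
# rough check for correlated residuals
order = np.argsort(xrf)
r = residuals[order]
lag1 = np.corrcoef(r[:-1], r[1:])[0, 1]

print(f"slope={slope:.2f} intercept={intercept:.2f} "
      f"R2={r_squared:.3f} lag1={lag1:.2f}")
```

A real DMA evaluation would repeat this on the full and trimmed data sets, as in the Fort Lewis lead example.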

Example: Lead at Fort Lewis  Full data set: »Wonderful R² »Correlated residuals »Dynamic range issue? »What does the slope indicate? »Y-intercept?  Trimmed data set: »Balanced data »Correlation gone from residuals »XRF NDs? »Good slope »R² drops significantly

4-18 Typical DMA Products  Non-parametric techniques – Ranges or bins

4-19 Comparability: A Dirty Word?  Vague references in QAPPs »PARCC — How do you measure comparability?  Our use of the term »The frequency with which results from different techniques agree with respect to a declared reference point
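That working definition of comparability is directly computable. A small sketch, with hypothetical paired values and a hypothetical 400 ppm reference point:

```python
def comparability(pairs, reference):
    """Fraction of paired results where both methods agree on which side
    of the declared reference point (e.g., an action level) the sample
    falls -- the 'frequency of agreement' definition used above."""
    agree = sum((a >= reference) == (b >= reference) for a, b in pairs)
    return agree / len(pairs)

# Hypothetical (XRF ppm, lab ppm) pairs against a 400 ppm reference point
pairs = [(120, 150), (310, 420), (500, 480), (450, 390)]
print(comparability(pairs, 400))  # 2 of 4 pairs agree -> 0.5
```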

4-20 Weight of Evidence vs. Collaborative Data Sets  The Triad perspective »Weight of Evidence — Combining information from various sources into a holistic picture, advancing the CSM »Collaborative data — Using 2 or more analytical methods to measure the same compound, analyte, surrogate, or class — Using established relationships, one method can be used to inform the user when analysis by another is warranted or beneficial

Typical DMA Products  Uncertainty evaluations – Example: Navy Uncertainty Calculator

4-22 Navy Uncertainty Calculator

4-23 Typical DMA Products  QC program worksheets

4-24 Translating DMA Results  Develop.... »Field based action levels »SOPs »QA/QC programs »Statistical sampling design »Dynamic work strategies — Decision rules, decision logic diagrams »Contingencies — Based on cost and performance

4-25 Field Based Action Levels  Action levels for field analytics or direct sensing tools that trigger action »Collection of collaborative data »Step outs, additional sampling or analysis, well placement, etc. »Remedy implementation — Removal — Confirmation of clean (sometimes required) (continued)

Field Based Action Levels  59 total pairs »True positives: 19 pairs »True negatives: 36 pairs »False negatives: 1 (5%) »False positives: 3 (7.7%)

Typical DMA Products (continued)  59 total pairs »True positives: 20 pairs »True negatives: 29 pairs »False negatives: 0 (0%) »False positives: 10 (26%)

Typical DMA Products  3-way decision structure with region of uncertainty »59 total pairs »True positives: 19 pairs »True negatives: 26 pairs »False negatives: 0 (0%) »False positives: 3 (7.7%) »11 samples sent for ICP analysis
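The error tallies above come from classifying each paired XRF/lab result against an action level. A minimal sketch of that tally; the pairs and both action levels here are hypothetical (the field level is set below the lab level to trade false positives for fewer false negatives, as in the slides):

```python
lab_action_level = 400.0    # ppm, regulatory action level (assumed)
field_action_level = 300.0  # ppm, conservative field trigger (assumed)

pairs = [  # (xrf_ppm, lab_ppm) -- illustrative values only
    (120, 150), (250, 220), (310, 420), (500, 480),
    (350, 390), (90, 80), (450, 410), (250, 450),
]

tp = fp = tn = fn = 0
for xrf_val, lab_val in pairs:
    field_exceeds = xrf_val >= field_action_level
    lab_exceeds = lab_val >= lab_action_level
    if field_exceeds and lab_exceeds:
        tp += 1   # both flag the sample: true positive
    elif field_exceeds:
        fp += 1   # field flags, lab does not: false positive
    elif lab_exceeds:
        fn += 1   # field misses a lab exceedance: false negative
    else:
        tn += 1   # both agree the sample is clean: true negative

total = len(pairs)
print(f"TP={tp} TN={tn} FP={fp} ({fp/total:.0%}) FN={fn} ({fn/total:.0%})")
```

Lowering the field action level shifts errors from the false-negative column to the (usually more tolerable) false-positive column, which is the trade the slides illustrate.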

4-29 Optimize SOPs and QC Programs  XRF »Count times — seconds »Battery change frequency — temperature and use can impact »Media for collection/analysis — bags, cups, in-situ, sample prep »Frequency of QC — blanks, SRMs, SSCS, spikes

4-30 XRF Tool Box  A beta version of a series of spreadsheets we have developed to provide QC support »Monitoring trends, control charting »Monitoring instrument precision »Evaluating total measurement precision »Estimating number of measurements needed in a particular bagged sample »Estimating number of samples required to make a decision (clean vs. dirty) in a decision unit at a required confidence

DMA Data to QC program  QC program worksheets – Control charts
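The control-chart worksheets boil down to a centerline and limits derived from baseline QC readings. A minimal Shewhart-style sketch, using hypothetical SRM (standard reference material) check values:

```python
from statistics import mean, stdev

# Hypothetical daily SRM check readings (ppm). The baseline establishes
# the centerline and 3-sigma control limits; later readings are flagged
# when they fall outside those limits.
baseline = [98, 102, 100, 97, 103, 99, 101, 100]
center = mean(baseline)
sigma = stdev(baseline)
ucl = center + 3 * sigma  # upper control limit
lcl = center - 3 * sigma  # lower control limit

new_readings = [101, 96, 110, 99]
flags = [not (lcl <= r <= ucl) for r in new_readings]
print(f"center={center:.1f} limits=({lcl:.1f}, {ucl:.1f}) flags={flags}")
```

With this baseline the centerline is 100 ppm and the limits are 94–106 ppm, so the 110 ppm reading is flagged for investigation before further field measurements are accepted.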

Optimize Statistical Sampling Design  Characterize or verify clean-up  Statistical confidence desired  How close to each other are the true mean and AL  How much variability is present in soil concentrations
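The three inputs listed above (confidence, distance between the true mean and the action level, and soil-concentration variability) are exactly the terms in a standard sample-size formula. A sketch using the normal approximation for a one-sided, one-sample mean test; the site numbers are hypothetical:

```python
import math
from statistics import NormalDist

def samples_needed(action_level, expected_mean, stdev, alpha=0.05, beta=0.20):
    """Approximate sample count for a one-sided, one-sample mean test:
    n = ((z_{1-alpha} + z_{1-beta}) * sigma / delta)^2, where delta is
    the gap between the action level and the expected mean."""
    delta = abs(action_level - expected_mean)
    z = NormalDist().inv_cdf
    n = ((z(1 - alpha) + z(1 - beta)) * stdev / delta) ** 2
    return math.ceil(n)

# Hypothetical lead decision unit: AL = 400 ppm, expected mean ~300 ppm,
# standard deviation ~150 ppm, 95% confidence, 80% power
n = samples_needed(400, 300, 150)
print(n)
```

The dependence is intuitive: doubling the variability (or halving the gap to the action level) quadruples the number of samples required.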

Statistical Sampling Design

Decision Logic

4-35 XRF Simple Decision Rule Example  Bagged samples, measurements taken through the bag  Need a decision rule for the number of measurements per bag  Action level: 25 ppm  3 bagged samples measured systematically across the bag 10 times each  Average concentrations: 19, 22, and 32 ppm »30 measurements total

4-36 Example Simple Decision Rule  »If 1st measurement is less than 10 ppm: stop, no action level problems  »If 1st measurement is greater than 50 ppm: stop, action level problems  »If 1st measurement is between 10 and 50 ppm: take another three measurements from the bagged sample
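The rule above translates directly into code. A minimal sketch using the slide's 10 and 50 ppm cutoffs (chosen around the 25 ppm action level):

```python
def xrf_decision(first_ppm, low=10.0, high=50.0):
    """Apply the slide's simple decision rule to the first XRF reading
    from a bagged sample. Readings well below the action level stop
    early; readings well above it stop early; ambiguous readings
    trigger additional measurements."""
    if first_ppm < low:
        return "stop: no action level problems"
    if first_ppm > high:
        return "stop: action level problems"
    return "take another three measurements from the bagged sample"

print(xrf_decision(8))   # clearly clean
print(xrf_decision(60))  # clearly above the action level
print(xrf_decision(25))  # near the 25 ppm action level: measure again
```

Rules like this are what let most bags be dispositioned with a single measurement while concentrating effort on samples near the action level.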

Contingencies

4-38 EPA TIFSD DMA Lessons Learned  Linear regression - can be helpful or misleading  Heterogeneity - large scale, small scale, and within sample »Don’t expect collaborative data to compare any better than 2 labs or even the same lab  Focus on decision quality  Structure vendor contracts to include some DMA principles  Particular instruments ≠ technology generalizations

4-39 DMA Example  Fort Lewis, Tacoma, Washington  2 former small arms firing ranges and a skeet range, overgrown with trees »Lead primary COC »DMA supported risk pathway evaluation, characterization, and removal actions »XRF sufficient for decisions at all action levels (50, 250, 400, 1000 ppm)  Goals »Delineate the vertical and horizontal extent of lead contamination above 50 ppm »Manage uncertainty around contaminant volume estimates greater than 250 ppm, 400 ppm, and 1,000 ppm

4-40 SOP Modifications Based on DMA  For development of potential remedial boundaries »Additional precision analyses when results fall near action levels (0-100 ppm, ppm, and ppm) — When precision analysis average falls within region of uncertainty... – Collect and analyze cup sub-samples from bags – Determined necessity of co-located sample duplicates (2 feet from primary sample) based on cup and bag precision comparison — Collaborative samples focused near DLs — Collection of deeper samples (2-3 feet) in some locations

SADA Optimize Sampling Locations

4-42 SADA Estimated Removal Volumes  [Table: Action Level (mg/kg), Volume (yd³), Excavation Effort ranging from Maximum to Moderate to Minimal]

4-43 Resources  EPA Technology Integration and Information Branch  Case studies  US EPA Technical Bulletin, “Performing Demonstrations of Method Applicability Under a Triad Approach,” due out August 2008  Examples on the US Triad CoP Web site, discussions with practitioners

4-44 Questions?

4-45 Thank You  Links to Additional Resources / Feedback Form  After viewing the links to additional resources, please complete our online feedback form.