Return of DQOs - Data Interpretation and Risk Assessments Amy Yersavich, Susan Netzly-Watkins and Mike Allen.


1 Return of DQOs - Data Interpretation and Risk Assessments Amy Yersavich, Susan Netzly-Watkins and Mike Allen

2 Next Phase of the Project Life Cycle
1. PLANNING: Plan for data collection using the DQO process
2. SAMPLING: Collect data using a SAP and FSOPs
3. ASSESSMENT: Verify that the data meet the DQOs
4. EVALUATION: Make data-based project decisions

3 Five Steps of Data Quality Assessment
1. Review the DQOs and Sampling Design
2. Conduct a Preliminary Data Review
3. Select the Statistical Test
4. Verify the Assumptions of the Statistical Test
5. Draw Conclusions from the Data

4 Is Your Data What You Expected?
– Are results what you expected based on site conditions?
– Are duplicates similar in concentration?
– Did you get all your results, including subcontracted analyses?
– Can you match your results with sample locations?
– Do you have the right analyses to make a decision? [Data Gaps]

5 Wide World of Data Packages
Data packages vary by what is requested by the customer. If you don't specify:
1. You get what the lab sends you.
2. You may get summary sheets with no laboratory data sheets and no laboratory quality control information.
3. You may get laboratory data sheets with limited QA parameters (blanks and surrogates), but no QA narrative, qualifiers, or receipt information.
4. You may get more sheets of paper than you know what to do with (full data packages and chromatograms).
5. You may get VAP or RR data packages with or without affidavits.

6 Wide World of Data Packages: What I minimally need from a laboratory report!
1. Case narrative – notes QA issues with samples and analysis (may be qualifiers only).
2. QA samples – surrogates (organics); spikes (inorganics); blanks; laboratory control samples (LCS).
3. Chain of custody – sample tracking.
4. Reporting limits – some reports may only state "ND" with no reporting limit listed. Needed for risk assessment.
5. Units and methods – ppm or ppb? Also the analytical method used; does it match the certification?
6. Cross references – match sample numbers through processing.
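The completeness check the slide describes can be sketched as a simple gatekeeper. The element names below are illustrative labels for the six items above, not a lab's actual deliverable schema:

```python
# Minimal sketch: flag required lab-report elements that are missing before
# accepting a data package. Field names are hypothetical placeholders.
REQUIRED_ELEMENTS = [
    "case_narrative",    # QA issues noted by the lab (may be qualifiers only)
    "qa_samples",        # surrogates, spikes, blanks, LCS results
    "chain_of_custody",  # sample tracking record
    "reporting_limits",  # "ND" alone is not enough for risk assessment
    "units_and_methods", # ppm vs ppb, analytical method, certification match
    "cross_references",  # field sample IDs matched to lab IDs
]

def missing_elements(report: dict) -> list:
    """Return the minimally required elements absent from a lab report."""
    return [e for e in REQUIRED_ELEMENTS if not report.get(e)]

report = {"case_narrative": "...", "chain_of_custody": "...",
          "units_and_methods": "mg/kg, EPA 8260"}
print(missing_elements(report))  # ['qa_samples', 'reporting_limits', 'cross_references']
```

Anything returned by `missing_elements` is a reason to go back to the lab before the data are used in a decision.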

7 Ohio EPA's basic review in checklist form, with guidance (DERR-DI-00-034). Use it as a reminder to ensure you have quality laboratory data and everything you need! Two checklists are available: Laboratory Data, and Field Screening Data (including mobile lab and immunoassay).

8 What's in the Laboratory Data Review Checklist? Checklist item numbers correspond to guidance numbers in this document. Item 1.0: Consistency with the approved QA documents for the project. Did the lab do what it was supposed to per the team meetings? Were any issues noted that differ from your QA documents or direction?

9 Laboratory Data Review Checklist Item 2.0: Case Narrative – Can vary, but should include information on QA from the laboratory. May also include sample receipt information. A QA summary or qualifiers may be used by CLs – ask in the kickoff meeting.

10 Laboratory Data Review Checklist Item 3.0: Chain of Custody: This defines holding times, methodology, preservation, bottle type, sample identifications, and the number of samples collected. The field samplers may have included notes or special instructions on the samples.

11 Laboratory Data Review Checklist Item 4.0: Laboratory Report Basics:
– Reference sample numbers to lab numbers and locations (and subcontractors' data);
– Dates sampled and analyzed – were holding times met?
– Date extracted (SVOCs) – changes holding times;
– Matrix – check units (mg/kg or mg/L?) and sample ID;
– Method used – VAP certified? Per the QAPP? Watch "b" and "c" designations;
– Units, reporting limits, and qualifiers – appropriate limits? Data qualified? Concerns?
Note: Qualifiers may be used.

12 Laboratory Data Review Checklist Item 5.0: Sample Results – Were all results provided? Subcontracted results – can you match them to your sample locations? Dilutions – can you still evaluate results against standards? Item 6.0: Dry or Wet Weight – Very important to your risk assessor and for comparison to sediment standards in ecological assessments.

13 Laboratory Data Review Checklist Item 7.0: Receipt of Samples – Was my sample compromised (broken, open seals, temperature issues)? Item 8.0: Holding Times – You may need to compare the sample date with the sample preparation and/or analysis date to determine validity.
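The holding-time comparison in Item 8.0 is simple date arithmetic. The limits below are placeholders for illustration; the real limits come from the analytical method and the project QAPP:

```python
from datetime import date

# Illustrative holding times in days; treat these values as assumptions, not
# method-verified limits.
HOLDING_TIME_DAYS = {"VOCs": 14, "SVOC_extraction": 7, "metals": 180}

def within_holding_time(sampled: date, analyzed: date, analysis: str) -> bool:
    """Compare the sample date to the prep/analysis date against the limit."""
    return (analyzed - sampled).days <= HOLDING_TIME_DAYS[analysis]

print(within_holding_time(date(2023, 5, 1), date(2023, 5, 10), "VOCs"))  # True
print(within_holding_time(date(2023, 5, 1), date(2023, 5, 20), "VOCs"))  # False
```

For extracted methods (e.g., SVOCs), you would run this check twice: sampling to extraction, then extraction to analysis.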

14 Laboratory Data Review Checklist Item 9.0: Blanks – Were blanks included in the sampling? Were they analyzed? Did any blanks (method or field) show contamination? Consider the effect on data validity. Trip blanks apply only to VOCs. Item 10.0: Duplicates – Were duplicates run? If so, are there the correct number of duplicates per the QAPP?
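Duplicate agreement is usually judged by relative percent difference (RPD). A minimal sketch; the 30% acceptance limit shown is an assumed, QAPP-specific value:

```python
def relative_percent_difference(result: float, duplicate: float) -> float:
    """RPD = |a - b| / mean(a, b) * 100, the standard duplicate-precision check."""
    mean = (result + duplicate) / 2.0
    if mean == 0:
        return 0.0
    return abs(result - duplicate) / mean * 100.0

# Hypothetical soil duplicate pair, compared against an assumed 30% limit:
rpd = relative_percent_difference(12.0, 9.0)
print(round(rpd, 1), rpd <= 30.0)  # 28.6 True
```

If the RPD exceeds the limit set in the QAPP, the associated results may need qualification before use.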

15 Laboratory Data Review Checklist Items 11.0 & 12.0: Matrix Spikes – The lab can give you a range for making QA determinations. If you have questions, ask the lab! Item 13.0: Surrogates – Organic analyses only! Per the QAPP, were they within limits? If not, was corrective action taken to address the issues (see the case narrative)?
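Matrix spike results are evaluated as percent recovery of the added analyte. A sketch under assumed numbers; the 75–125% window is only an illustrative control range, since the real limits come from the lab and method:

```python
def percent_recovery(spiked_result: float, unspiked_result: float,
                     spike_added: float) -> float:
    """Matrix spike recovery: how much of the added analyte the lab got back."""
    return (spiked_result - unspiked_result) / spike_added * 100.0

# Hypothetical values; control limits vary by method and lab.
rec = percent_recovery(spiked_result=18.0, unspiked_result=4.0, spike_added=15.0)
print(round(rec, 1), 75.0 <= rec <= 125.0)  # 93.3 True
```

Recoveries well outside the lab's range can indicate matrix interference, which ties directly into the elevated-detection-limit discussion later in the talk.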

16 Laboratory Data Review Checklist Item 14.0: Laboratory Control Samples – You may or may not see these; a purchased standard is run through the specific analysis to determine whether the instrument is running properly. An LCS may be noted in the case narrative when a surrogate is outside control limits, to demonstrate matrix issues or method efficiency and validate data quality. The lab can provide a range for comparison.

17 Field Screening Data Evaluations – All data should be of appropriate quality for use in your projects. Only you know your site and what quality you need under your DQOs. The field screening data checklist is similar to the lab data checklist, but field screening has its own specific QA parameters – duplicate readings, calibrations, and spikes. Get what you need to make informed decisions.

18 Data Usability Determination Under the VAP
– Were analytical problems noted, and did corrective action address them?
– Were any data "rejected" or possibly invalid?
– Do the data meet the DQOs and QA requirements for the project per VAP Phase II?
– Do the data include blank contamination that needs to be considered when evaluating contaminant levels (RAGS, risk assessor)?
– Was any analysis non-certified per affidavit? Is the certification appropriate, and can results be compared with applicable standards per VAP Phase II?
– Do you have the affidavits for VAP analyses, OAC 3745-300-13(O)?

19 Data Usability Determination Under the VAP – I've been shattered... hoop... sha doobie... Where have we been, and where are we going? Where were we, and where do we think we are going? Where are we, and where do we want to go? Scattered all over... Manhattan... Oh yeah, the DQOs... and inside the DQO, the DQA: Data Quality Assessment.

20 Data Usability Determination Under the VAP – This last section of today's talks:
– Assess or validate
– DQO revisit, VAP and otherwise
– Applicable standards in light of conceptual site models

21 Data Quality Objectives Fundamentally.. Start with the end in mind… So we’re at the end of our day here and I’m pointing you back to the beginning Heuristic… or Iterative

22

23 Next Phase of the Project Life Cycle
1. PLANNING: Plan for data collection using the DQO process
2. SAMPLING: Collect data using a SAP and FSOPs
3. ASSESSMENT: Verify that the data meet the DQOs
4. EVALUATION: Make data-based project decisions

24 Statistical Evaluation of the Data Whoa big fella.. Hang on there Jake.. Validate or Assess? Is there a difference? Yeah.. it’s a mule.. Can it pull?

25 Data Quality Objectives – Two competing interests: human health and the environment versus money and time. Risk-goal-based cleanups in the VAP and in the other programs across the state are constructed to maintain the human health aspect – that is, there is some assurance or agreement as to what is clean enough, or safe enough.

26 Data Quality Objectives – Rule 3745-300-07 is constructed so that you, as the CP, can provide a defensible NFA letter – defensible in both the legal and the technical sense. The language throughout reads "sufficient to determine..." Determine what? The applicable standards.

27 Data Quality Objectives in ~07
(1) Identify the goals of the phase II property assessment.
(2) Identify the data and information... to support the objectives.
(3) Define the boundaries... the identified areas... current and reasonably anticipated...
(4) Develop an approach to identify contaminants.
(5) Specify how the data... will be used in the decision-making process. Clarify performance and acceptance/rejection criteria for the data.
(6) Develop a sampling and analysis plan.
(7) Develop a conceptual site model that illustrates... exposure scenarios that identify the environmental media, chemicals of concern, current and reasonably anticipated future land use, and receptor populations.

28 Data Quality Objectives ~USEPA
(1) State the Problem
(2) Identify the Decision
(3) Identify the Inputs to the Decision
(4) Define the Study Boundaries
(5) Develop a Decision Rule
(6) Specify the Limits on Decision Errors
(7) Optimize the Design for Obtaining Data

29 Team Planning Coordination – Conceptual Site Models
Identify inputs for the decision (or #3 in the VAP). Define the boundaries... the identified areas... current and reasonably anticipated future use... using a conceptual site model:
– Exposure scenarios
– Define potential applicable standards using property understanding
– Types of COCs – risk drivers?
– Media
– Identified areas
– Receptors
– Pathways
– Exposure units?
– Engineering control? Institutional control?

30 Conceptual Site Models

31 Team Planning Coordination – Conceptual Site Models
Identify inputs for the decision (or #3 in the VAP). Define the boundaries... the identified areas... current and reasonably anticipated future use... using a conceptual site model:
– Detection limits
– Define specific methods
– Identify concerns about sample size, preservation, and holding times
– Identify special analytical considerations, potential matrix issues, suspected high concentrations

32 Considering Applicable Standards and Detection Limits – COCs, risk drivers, identified areas, multiple chemical adjustment, media and aggregate assumptions, exposure units. Generic numerical standards rest on single-chemical assumptions. As stated earlier, handling these types of considerations can be a team effort with the lab.

33 Considering Applicable Standards and Detection Limits – For example, we may have potentially 8 risk drivers within an identified area (IA), 3 of which may not have a carcinogenic toxicological endpoint and can be dealt with separately. The other 5 have GNS values that we divide by 5 to get an approximate "applicable standard" that we communicate to the lab... before we head out to sample.
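The divide-by-5 adjustment above is simple arithmetic worth making concrete. The chemical names and GNS values below are placeholders, not real standards:

```python
# Sketch of the multiple-chemical adjustment: with five carcinogenic risk
# drivers sharing one identified area, each generic numerical standard (GNS)
# is divided by 5 so the combined risk stays within the risk goal.
# All values are hypothetical.
gns = {"chem_A": 10.0, "chem_B": 2.5, "chem_C": 50.0, "chem_D": 0.5, "chem_E": 5.0}

n_drivers = len(gns)
adjusted = {chem: value / n_drivers for chem, value in gns.items()}
print(adjusted["chem_B"])  # 0.5
```

These adjusted values are what you would communicate to the lab, so that reporting limits land below the working "applicable standard" rather than below the unadjusted GNS.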

34 Considering Applicable Standards and Detection Limits – For example, we realize that indoor air samples cannot be taken (no building yet), but we know VOCs in ground water may impact receptor risk in aggregate, and we plan to estimate indoor risk using VISL assumptions. The balance of that risk may drive our direct contact applicable standards. A use restriction? Soil remediation plans? Possibly wait to sample until after remediation?

35 Considering Applicable Standards – Hydrogeologic information (ground water flow direction, saturated units) from newly installed borings and wells is inconsistent with information from existing monitoring wells and borings. Prior to proceeding with additional sampling or exposure pathway evaluation, the CP should resolve the inconsistency by reviewing all hydrogeologic data and, if necessary, collecting additional subsurface information to fill data gaps. The conceptual site model should be updated accordingly to meet project DQOs (e.g., determining exposure scenarios).

36 Considering Applicable Standards – While investigating part of an identified area, a CP encounters high levels of soil contamination that were not expected given past activities in the identified area. Because of matrix interference associated with the contaminated soil, detection limits for several COCs are elevated, and the CL cannot report two COCs below their applicable standards. The CP should consider re-evaluating the delineation of the identified area (perhaps the location of the highly contaminated soil should be defined as a separate identified area). In addition, the CP should re-evaluate the COC list (should additional COCs be added?) and the exposure pathways to meet project DQOs (i.e., characterization of identified areas and associated risk).
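Spotting the problem in this scenario amounts to comparing each nondetect's reporting limit against its applicable standard. A sketch with placeholder analytes and values (the elevated limits stand in for a matrix-interference or dilution problem):

```python
def unresolved_cocs(reporting_limits: dict, standards: dict) -> list:
    """COCs the lab cannot report below their applicable standards, e.g.
    when matrix interference elevates detection limits."""
    return [coc for coc, rl in reporting_limits.items()
            if rl > standards.get(coc, float("inf"))]

# Hypothetical reporting limits and standards, same units assumed throughout.
rls = {"benzene": 0.5, "naphthalene": 2.0, "lead": 5.0}
stds = {"benzene": 0.03, "naphthalene": 5.0, "lead": 400.0}
print(unresolved_cocs(rls, stds))  # ['benzene']
```

Any COC this flags cannot be cleared by a nondetect result, which is exactly the situation that should send the CP back to the IA delineation, the COC list, and the lab.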

37 Conclusions
– Start with the end in mind
– Communicate with the laboratory
– Consider the questions you want to answer first
– Design the sampling to answer those questions
– Design alternative possibilities using the CSM
– Use risk levels to construct the appropriate sampling
– Revisit the CSM often

