Slide 1: Evaluating Stormwater Technology Performance, Module II. Training in Support of the TARP Stormwater Protocol: Stormwater Best Management Practice Demonstrations. April 2004. Prepared by Eric Winkler, Ph.D., and Nicholas Bouthilette, Center for Energy Efficiency and Renewable Energy, University of Massachusetts Amherst.

Slide 2: Meet the Instructors
- Nancy Baker, Massachusetts Dept. of Environmental Protection, 1 Winter Street, Boston, MA 02108. (617) 654-6524 (voice), (617) 292-5850 (fax), Nancy.Baker@state.ma.us
- Eric Winkler, Ph.D., University of Massachusetts, 160 Governors Drive, Amherst, MA 01003-9265. (413) 545-2853 (voice), (413) 545-1027 (fax), winkler@ceere.org

Slide 3: Meet the Sponsors
TARP Stormwater Work Group member states: California, Maryland, Massachusetts, New Jersey, Pennsylvania, and Virginia. The State of Washington, Illinois, New York, and ETV are collaborating with TARP.

Slide 4: Goals of the TARP Stormwater Work Group
- Use the Protocol to test new BMPs
- Approve effective new stormwater BMPs
- Get credible data on BMP effectiveness
- Share information and data
- Increase expertise on new BMPs
- Use the Protocol for appropriate state initiatives

Slide 5: Evaluating Stormwater Technology Performance
Learning Objectives:
- Use the TARP Stormwater Demonstration Protocol to review a test plan and a technology evaluation.
- Recognize data gaps and deficiencies.
- Develop, implement, and review a test plan.
- Understand, evaluate, and use statistical methods.
Logistical Reminders:
- Phone audience: keep your phone on mute (* 6 to mute and again to un-mute); do NOT put the call on hold.
- Simulcast audience: use the control at the top of each slide to submit questions.
- Course time: 2 hours, with 2 question-and-answer periods, links to additional resources, and your feedback.

Slide 6: Project Notice
Prepared by the Center for Energy Efficiency and Renewable Energy, University of Massachusetts Amherst, for submission under agreement with the Environmental Council of States. The preparation of this training material was financed in part by funds provided by the Environmental Council of States (ECOS). This product may be duplicated for personal and government use and is protected under copyright laws for the purpose of author attribution. Publication of this document shall not be construed as endorsement of the views expressed therein by the Environmental Council of States/ITRC or any federal agency.

Slide 7: Disclaimer
The contents and views expressed are those of the authors and do not necessarily reflect the views and policies of the Commonwealth of Massachusetts, its agencies, or the University of Massachusetts. The contents of this training are offered as guidance. The University of Massachusetts and all technical sources referenced herein do not (a) make any warranty or representation, expressed or implied, with respect to the accuracy, completeness, or usefulness of the information contained in this training, or assert that the use of any information, apparatus, method, or process disclosed in this report may not infringe on privately owned rights; or (b) assume any liability with respect to the use of, or for damages resulting from the use of, any information, apparatus, method, or process disclosed in this report. Mention or images of trade names or commercial products do not constitute endorsement or recommendation of use.

Slide 8:
Module I: Planning for a Stormwater BMP Demonstration
1. Factors Affecting Stormwater Sampling
2. Data Quality Objectives and the Test QA Plan
3. Sampling Design
4. Statistical Analyses
5. Data Adequacy: Case Study
Module II: Collecting and Analyzing Stormwater BMP Data

Slide 9: 3. Sampling Design
- Stormwater Data Collection Guidance
- Locating Samples & Stations
- Selecting Water Quality Parameters
- QA/QC
- Sample Handling and Record Keeping
- Field Measures

Slide 10: Sampling Plan Elements
TARP Data Collection Criteria (Section 3.3, TARP Protocol):
- Any relevant historic data
- Monthly mean rainfall and snowfall data (12 months over the period of record)
- Rainfall intensity over 15-minute increments

Slide 11: Protocol Minimum Criteria
Identifying a Qualifying Storm Event (Sections 3.3.1.2 and 3.3.1.3, TARP Protocol):
- Minimum rainfall event depth is 0.1 inch.
- Minimum inter-event duration of 6 hours (duration beginning at cessation of flow to the unit).
- Base flow should not be sampled.
- Identification of a qualifying event requires verifying flow to the unit and rainfall concurrently (see the screening sketch below).
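These depth and inter-event criteria lend themselves to a simple screening script. The sketch below is illustrative only: the record format is a made-up tipping-bucket log, and it uses gaps in rainfall as a stand-in for the protocol's cessation-of-flow test, which in practice must be verified against flow to the unit.

```python
from datetime import datetime, timedelta

# Hypothetical rain-gauge record: (timestamp, rainfall depth in inches).
record = [
    (datetime(2004, 4, 1, 2, 0), 0.02),
    (datetime(2004, 4, 1, 2, 15), 0.05),
    (datetime(2004, 4, 1, 3, 0), 0.06),
    (datetime(2004, 4, 2, 18, 0), 0.04),  # more than 6 h later: new event
]

MIN_DEPTH_IN = 0.1            # minimum qualifying event depth (TARP)
MIN_GAP = timedelta(hours=6)  # minimum inter-event duration (TARP)

def split_events(record, min_gap=MIN_GAP):
    """Group rainfall increments into events separated by >= min_gap."""
    events, current = [], []
    for t, depth in sorted(record):
        if current and t - current[-1][0] >= min_gap:
            events.append(current)
            current = []
        current.append((t, depth))
    if current:
        events.append(current)
    return events

for event in split_events(record):
    total = sum(d for _, d in event)
    status = "qualifies" if total >= MIN_DEPTH_IN else "below 0.1 in minimum"
    print(f"{event[0][0]:%Y-%m-%d %H:%M}: {total:.2f} in -> {status}")
```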

Slide 12: Analysis for Determining Number of Samples
Paired sampling approach: the number of sample pairs needed to detect a significant difference in mean concentration depends on:
- the coefficient of variation (CV) of the data,
- a statistical power of 80%, and
- a 95% confidence level.
[Nomograph: number of sample pairs as a function of the coefficient of variation and the difference in sample set means (%).]
Example: detecting an 80% difference in means at 100% CV requires about 20 sample pairs. A rough cross-check of this example appears below. (Urban Stormwater BMP Performance Monitoring, US EPA, 2002.)
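As a rough cross-check on the nomograph, a normal-approximation power formula can be coded directly. This is a simplified sketch, not the EPA nomograph calculation, so its answer differs somewhat from the 20 pairs quoted above.

```python
import math
from scipy.stats import norm

def pairs_needed(cv, diff, alpha=0.05, power=0.80):
    """Approximate number of sample pairs to detect a relative difference
    `diff` between means when the data have coefficient of variation `cv`
    (two-sided test, normal approximation)."""
    z_alpha = norm.ppf(1 - alpha / 2)  # ~1.96 for 95% confidence
    z_beta = norm.ppf(power)           # ~0.84 for 80% power
    return math.ceil(((z_alpha + z_beta) * cv / diff) ** 2)

# Slide example: 80% difference in means at 100% CV. This crude formula
# gives about 13 pairs; the EPA guidance nomograph, which accounts for the
# log-normal character of stormwater data, gives about 20.
print(pairs_needed(cv=1.0, diff=0.8))
```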

Slide 13: Qualifying Event Sample
TARP Protocol criteria:
- 10 water quality samples per event: 10 influent and 10 effluent. If compositing, 2 composites of 5 sub-samples each.
- Data for flow rate and flow volume.
- Sampled events must cover at least 50% of the total annual rainfall (CA requires monitoring 80-90% of rainfall).

Slide 14: Qualifying Event Sample (continued)
- Preferably 20 storms; 15 minimum.
- Sample over the course of a full year to account for seasonal variation.
- Flow-weighted composite samples must cover at least 70% of the storm flow (and as much of the first 20% as possible).
- Examples of variation within the TARP community: PA sizes temporary BMPs using the 2-year event; NJ bases water quality design on the volume from a 1.25-inch event.

Slide 15: Sampling Locations
- Locate in close proximity to the BMP technology inlet and outlet.
- Consider field personnel safety during equipment access.
- Secure equipment to avoid vandalism.
- Provide a scaled plan view of the site that indicates all buildings, land uses, storm drain inlets, and other control devices.

Slide 16: Automated Sampling Equipment
- The water surface elevation should be less than 15 feet below the elevation of the pump in the sampler.
- Access to the equipment should take into consideration confined-space safety.
- Flow measurement equipment should be located to avoid measuring turbulent flow.

Slide 17: Adequacy of Sampling and Flow Monitoring Procedures
- Primary and secondary flow measurement devices are required.
- Programmable automatic flow samplers with continuous flow measurements are recommended.
- Time-weighted composite samples are not acceptable unless flow is monitored and the event mean concentration can be calculated.

Slide 18: Monitoring Location Recommendations
- Monitoring location design should consider whether the upgradient catchment is served by a separate storm drain system.
- Pay attention to potential combined sewer systems or illicit connections, which may contaminate the stormwater system.
- The storm drain system should be well understood to allow reliable delineation and description of the catchment area.
- Flow-measuring monitoring stations in open channels should have suitable hydraulic control and, where possible, the ability to install primary flow measurement devices.

Slide 19: Monitoring Location Recommendations (continued)
- Avoid steep slopes, pipe diameter changes, junctions, and irregular channel shapes (due to breaks, roots, or debris).
- Avoid locations likely to be affected by backwater and tidal conditions.
- Pipe, culvert, or tunnel stations should be located to avoid surcharging (pressure flow) over the normal range of precipitation.
- Use of reference watersheds and remote rainfall data is discouraged.

Slide 20: Selecting Applicable Water Quality Parameters
- Designated uses of the receiving water: consider stormwater discharge constituents.
- Overall program objectives and resources: adjust the parameter list according to resources (test method capability, personnel, funds, and time).
- The keystone pollutant used may vary from state to state.

Slide 21: Resources for Standardized Test Methods (Section 3.1, TARP Protocol)
- EPA Test Methods (pollutant analysis): www.epa.gov/epahome/Standards.html
- ASME Standards and Practices: pressure flow measurements
- ASCE Standards: hydraulic flow estimation
- ASTM Standards: precision open-channel flow measurements for water constituent analysis
- National Field Manual for Collection of Water Quality Data, Wilde et al., USGS: water.usgs.gov/owq/FieldManual/
- Guidance Manual: Stormwater Monitoring Protocols, Caltrans: www.dot.ca.gov/hq/env/stormwater/

Slide 22: Examples of Analytical Laboratory Methods
- Total Phosphorus: SM 4500-P E
- Nitrate and Nitrite: EPA 353.1
- Ammonia: EPA 350.1
- Total Kjeldahl Nitrogen: EPA 351.2
- TSS: SM 2540D
- SSC: ASTM D3977-97
- Enterococci: SM 9230C
- Fecal Coliform: SM 9222D
- Chronic Microtox Toxicity Test: Azur Environmental
References: ASTM, EPA, American Public Health Association. (Other non-standard methods or tests may be approved through an acceptable regulatory process, EPA, or state authority.)

Slide 23: Constituent Listing with Detection Limits
[Table of constituents and detection limits not transcribed.] Source: Urban Stormwater BMP Performance Monitoring, US EPA, 2002.

Slide 24: Representative Sampling
- Sample collection uniformity varies between staff and techniques.
- Automated sampling allows for reproducible samples.
- Standard Operating Procedures (SOPs) and quality assurance help track variability.

Slide 25: Required Elements in Lab QA/QC (Section 3.3.5, TARP Protocol)
The Test QA Plan (QAPP) and Sampling and Analysis Plan should include:
- Laboratory and sampling equipment decontamination
- Sample preservation and holding times
- Sample volumes
- QC samples (spikes, blanks, splits, field and lab duplicates)
- QA on sampling equipment (calibration)
- Packaging and shipping
- Identification and labeling
- Chain of custody
- Lab certification

Slide 26: Quality Control Tests and Acceptance Limits for Physical/Chemical Parameters Analyzed in the Laboratory

Parameter | Accuracy QC Tests (acceptance limits, % recovery) | Precision QC Test (acceptance limit, RPD a) | Frequency
Total Phosphorus | QCS b 90-110; LFB c 85-115; LFM d 80-120 | Duplicates (20) | Each sample batch of 20 samples (5%)
Nitrate and Nitrite N | QCS 90-110; LFB 85-115; LFM 80-120 | Duplicates (45) | Each sample batch of 20 samples (5%)
Ammonia-N | QCS 90-110; LFB 85-115; LFM 80-120 | Duplicates (52) | Each sample batch of 20 samples (5%)
Total Kjeldahl N | QCS 90-110; LFB 85-115; LFM 80-120 | Duplicates (28) | Each sample batch of 20 samples (5%)
Total Suspended Solids | QCS 75-125 | Duplicates (25) | Each sample batch of 20 samples (5%)
Suspended Sediment Concentration | QCS 75-125 | Duplicates (25) | Each sample batch of 20 samples (5%)

a RPD = relative percent difference among duplicates
b QCS = quality control sample from a source outside the laboratory
c LFB = laboratory fortified blank sample
d LFM = laboratory fortified matrix sample
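The precision limits in the table are expressed as relative percent difference (RPD) between laboratory duplicates. A small sketch of that check follows; the duplicate values are made up for illustration.

```python
def rpd(x1, x2):
    """Relative percent difference between duplicate results."""
    return abs(x1 - x2) / ((x1 + x2) / 2) * 100

# Acceptance limits (RPD) from the table above.
LIMITS = {"Total Phosphorus": 20, "Nitrate and Nitrite N": 45,
          "Ammonia-N": 52, "Total Kjeldahl N": 28,
          "Total Suspended Solids": 25, "Suspended Sediment Concentration": 25}

# Hypothetical duplicate pair for TSS, in mg/L.
param, dup1, dup2 = "Total Suspended Solids", 48.0, 55.0
value = rpd(dup1, dup2)
verdict = "within" if value <= LIMITS[param] else "exceeds"
print(f"{param}: RPD = {value:.1f}% ({verdict} {LIMITS[param]}% limit)")
```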

Slide 27: Laboratory Records
Sample data:
- Number of samples, holding times, location, deviations from SOPs, time of day, and date.
- Corrective procedures for samples inconsistent with the protocol.
Management of records:
- Consider electronic filing.
- Document data validation, calculations and analysis, and data presentation.
- Review data reports for completeness, including requested analyses and all required QA.

Slide 28: Field Sample QC
Field matrix spike:
- A sample prepared at the sampling point by adding a known mass of the target analyte to a specified amount of sample.
- Used to determine the effect of sample preservation, shipment, storage, and preparation on analyte recovery efficiency.
Field split sample:
- A sample split into two representative portions that are sent to different laboratories.
- Estimates inter-laboratory precision.

Slide 29: Quality Control
Field blanks:
- A clean sample carried to the sampling site, exposed to sampling conditions, and returned to the laboratory.
- Provides useful information about pollutants and error that may be introduced during the sampling process.
Field duplicates:
- Identical samples taken from the same sampling location at the same time (not split).
- Identical sampling and analytical procedures are used to assess the variance of sampling and analysis.

Slide 30: Quality Assurance / Quality Control Documentation and Records
Documentation:
- Include all QC data.
- Define critical records and information, as well as the data reporting format and document control procedures.
Reporting:
- Field operation records
- Laboratory records
- Data handling records

Slide 31: Field Operation Records
Sample collection records:
- Show that the proper sampling protocol was used in the field by recording personnel names, sample numbers, collection points, maps, equipment/methods, climatic conditions, and unusual observations.
Chain of custody records:
- Document the progression of samples as they travel from the sampling location to the lab to the disposal area.
QC sample records:
- Document QC samples such as field blanks, trip blanks, and duplicate samples.
- Provide information on frequency, conditions, level of standards, and instrument calibration history.

Slide 32: Field Operation Records (continued)
General field procedures:
- Record field procedures and areas of difficulty in gathering samples.
Corrective action reports:
- Where deviations from standard operating procedures occur, report the methods and/or procedural details used and a plan to resolve noncompliance issues.

Slide 33: Questions and Answers (1 of 3)

Slide 34: Working With Stormwater Data
1. Factors Affecting Stormwater Sampling
2. Data Quality Objectives and the Test QA Plan
3. Sampling Design
4. Statistical Analyses
5. Data Adequacy: Case Study

Slide 35: 4. Statistical Analyses
- Data Reporting and Presentation
- Statistical Method Review
- Appropriate Forms of Analysis
- Results Interpretation

Slide 36: Presenting Statistical Data and Applicability
Efficiency calculation implications: determine the category of BMP.
- BMPs with well-defined inlets and outlets whose primary treatment depends upon extended detention storage of stormwater.
- BMPs with well-defined inlets and outlets that do not depend upon significant storage of water.
- BMPs that do not have a well-defined inlet and/or outlet.
- Widely distributed BMPs that use reference watersheds to evaluate effectiveness.

Slide 37: Analysis to Address Data Quality Objectives
- What degree of pollution control does the BMP provide under typical operating conditions?
- How does efficiency vary from pollutant to pollutant?
- How does efficiency vary with input concentrations?
- How does efficiency vary with storm characteristics?
- How do design variables affect performance?
- How does efficiency vary with different operational and/or maintenance approaches?

Slide 38: Analysis to Address Data Quality Objectives (continued)
- Does efficiency improve, decay, or remain stable over time?
- How do efficiency, performance, and effectiveness compare to other BMPs?
- Does the BMP reduce toxicity?
- Does the BMP improve or protect downstream biotic communities?
- Does the BMP have potential negative downstream impacts?

Slide 39: Efficiency Ratio (ER)
TARP Protocol recommended method:

ER = 1 - (mean outlet EMC / mean inlet EMC) = 1 - (Σ_{j=1..m} EMC_out,j / m) / (Σ_{j=1..m} EMC_in,j / m)

where the event mean concentration (EMC) for a single event j is the flow-weighted average

EMC_j = Σ_i (V_i × C_i) / Σ_i V_i

with V_i = volume of flow during period i, C_i = average concentration associated with period i, and m = number of events measured. A code sketch follows.
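Translated to code, the two formulas above are a minimal sketch; the volumes and concentrations below are made-up discrete sample data, not values from the Protocol.

```python
def emc(volumes, concs):
    """Event mean concentration: flow-weighted average of discrete samples."""
    return sum(v * c for v, c in zip(volumes, concs)) / sum(volumes)

def efficiency_ratio(inlet_emcs, outlet_emcs):
    """TARP-recommended ER: 1 - mean outlet EMC / mean inlet EMC."""
    mean_in = sum(inlet_emcs) / len(inlet_emcs)
    mean_out = sum(outlet_emcs) / len(outlet_emcs)
    return 1 - mean_out / mean_in

# Hypothetical two-storm data set (volumes in ft^3, concentrations in mg/L).
storm1_in = emc([100, 300, 200], [120, 80, 40])   # -> 73.3 mg/L
storm1_out = emc([100, 300, 200], [60, 30, 20])
storm2_in = emc([50, 150], [200, 150])
storm2_out = emc([50, 150], [90, 60])

er = efficiency_ratio([storm1_in, storm2_in], [storm1_out, storm2_out])
print(f"Efficiency ratio: {er:.0%}")
```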

Slide 40: Efficiency Ratio Interpretation
- EMCs weight all storms equally.
- Most useful when loads are directly proportional to the relative magnitude of the storm; accuracy varies with BMP type.
- Minimizes the impact of smaller/cleaner storms.
- Allows use of data sets where portions of the data are missing, since this would not significantly affect the average EMC.
- Log normalization can be applied to avoid equal weighting of events.

Slide 41: Summation of Loads (SOL)
The sum-of-loads efficiency is calculated from concentration and flow volume, as follows:

SOL = 1 - (Σ_j L_out,j) / (Σ_j L_in,j)

where the load for each event j is L = EMC × event flow volume.
- Removal efficiencies calculated by the summation of loads tend to be dominated by larger storm events.
- The summation of loads method uses a mass balance approach (see the sketch below).
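A sketch of the mass-balance calculation, assuming per-event loads (EMC times event volume) have already been computed; the load values are hypothetical.

```python
def summation_of_loads(inlet_loads, outlet_loads):
    """SOL efficiency: 1 - (sum of outlet loads / sum of inlet loads)."""
    return 1 - sum(outlet_loads) / sum(inlet_loads)

# Hypothetical per-event loads (e.g., in kg). Note how the single large
# event dominates the result, as cautioned above.
inlet = [1.0, 0.8, 25.0]
outlet = [0.2, 0.2, 15.0]
print(f"SOL efficiency: {summation_of_loads(inlet, outlet):.0%}")  # -> 43%
```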

Slide 42: Regression of Loads (ROL)
The regression of loads fits outlet event loads against inlet event loads:

L_out = α + β × L_in

The percent reduction in loads is approximated from the slope as (1 - β) × 100%, where β is the slope term in the regression analysis.
- The fit can be dominated by large storm events.
- It is recommended that the line not be forced through the origin.
- A polynomial fit may be required to achieve a higher R² (see the sketch below).
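A sketch using scipy's linear regression on hypothetical paired event loads. Per the recommendation above, the intercept is estimated rather than forced through the origin.

```python
import numpy as np
from scipy.stats import linregress

# Hypothetical paired event loads (e.g., in kg).
inlet_loads = np.array([0.5, 1.2, 2.0, 4.5, 9.0])
outlet_loads = np.array([0.2, 0.5, 0.9, 2.3, 5.2])

# Fit outlet = intercept + beta * inlet; beta approximates the fraction
# of inlet load passed through, so (1 - beta) approximates removal.
fit = linregress(inlet_loads, outlet_loads)
print(f"beta = {fit.slope:.2f}, intercept = {fit.intercept:.2f}, "
      f"R^2 = {fit.rvalue ** 2:.3f}")
print(f"Approximate load reduction: {1 - fit.slope:.0%}")
```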

Slide 43: Mean Concentration (MC)
The mean concentration efficiency is defined as:

MC = 1 - (average outlet concentration / average inlet concentration)

- May be useful for threshold-level pollutants, e.g., bacteria or toxics.
- Weights samples equally, and may be biased by variances in sampling protocols.
- Not amenable to a mass balance approach.
- The flow measure must represent total event characteristics.
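The calculation is a one-liner; the sketch below uses hypothetical grab-sample concentrations to emphasize that every sample counts equally regardless of the flow it represents.

```python
def mean_concentration_efficiency(inlet_concs, outlet_concs):
    """MC efficiency: 1 - average outlet conc. / average inlet conc."""
    mean_in = sum(inlet_concs) / len(inlet_concs)
    mean_out = sum(outlet_concs) / len(outlet_concs)
    return 1 - mean_out / mean_in

# Hypothetical grab-sample concentrations (mg/L), unweighted by flow.
mc = mean_concentration_efficiency([120, 80, 40], [60, 30, 20])
print(f"MC efficiency: {mc:.0%}")
```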

Slide 44: Efficiency of Individual Storm Loads
The efficiency of the BMP for a single storm j is given by:

E_j = (L_in,j - L_out,j) / L_in,j

Average efficiency over the m monitored storms can then be calculated as:

E_avg = (1/m) Σ_{j=1..m} E_j

- Weights all storms equally.
- Requires paired inflow and outflow data.
- Does not account for interrelationships between storm events (see the sketch below).
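The per-storm calculation, using the same hypothetical loads as the SOL sketch above so the two estimators can be compared directly.

```python
def storm_efficiency(load_in, load_out):
    """Removal efficiency for a single storm: (L_in - L_out) / L_in."""
    return (load_in - load_out) / load_in

# Hypothetical paired event loads (inlet, outlet); every storm counts
# equally in the average, unlike the load-summation approach.
pairs = [(1.0, 0.2), (0.8, 0.2), (25.0, 15.0)]
efficiencies = [storm_efficiency(li, lo) for li, lo in pairs]
avg = sum(efficiencies) / len(efficiencies)
print(f"Average of individual storm efficiencies: {avg:.0%}")  # -> 65%
```

On these same three storms the summation-of-loads sketch returns 43%, illustrating how strongly the choice of estimator can change the reported efficiency.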

Slide 45: Interpreting Results and Inappropriate Analyses
- Efficiency ratio: most useful when loads are directly proportional to the relative magnitude of the storm; accuracy varies with BMP type.
- Summation of loads: a small number of large storms can significantly influence results.
- Regression of loads: assumes removal efficiency is uniform over a range of operating conditions and concentrations.
- Mean concentration: not appropriate where flow-weighted sampling is performed; weights all events equally.
- Average efficiency of individual storms requires that inflow and outflow are related.

Slide 46: Using Statistics Inappropriately
- Each method is likely to produce a different efficiency.
- A method should be chosen for its relevance and applicability to each scenario, not for the efficiency value it produces.
- Be aware of statistics being misused to support claims.
- Reporting ranges may be more appropriate under certain test conditions, e.g., when the data are determined to be qualitative rather than quantitative.

Slide 47: Questions and Answers (2 of 3)

Slide 48: Working With Stormwater Data
1. Factors Affecting Stormwater Sampling
2. Data Quality Objectives and the Test QA Plan
3. Sampling Design
4. Statistical Analyses
5. Data Adequacy: Case Study

Slide 49: 5. Case Study I: Test Plan and Data Adequacy
Review of data:
- Test QA Plan and Data Quality Assurance Project Plan
- Field and lab data adequacy
- Data reporting and depiction of performance claims
Potential problems:
- Field and lab violations
- Data reduction issues

Slide 50: Background
- Study of structural BMP performance; claim made of 80% TSS removal.
- Study commenced prior to the TARP Protocol and other published, public-domain stormwater BMP monitoring protocols.
- Resources for conducting the study were borne by a single private entity.
- The study design included only limited information on the site, sampling equipment, sampler programming, calibration, and sample collection and analysis.
- TSS/SSC were the primary water quality parameters of interest.
- Flow measurement and rain gauge equipment were to be installed.
- Analytical testing was to be performed by an outside laboratory; sample collection by the technology developer.

Slide 51: Sample Plan Adequacy: Issues Identified
Provide complete engineering drawings of the system, including:
- The entire drainage area connected to the system.
- Pipe sizing and inlet locations.
- A description of pervious and impervious surfaces.
- Design calculations used to size the unit.
- Climatic data used to design structures.
- Any additional structures or site details that may have a bearing on the system.
Provide use characteristics of the site, including:
- Vehicle and industrial usage.
- Site maintenance relative to sweeping, gutter maintenance, spill containment, and snow stockpiling/disposal.
Indicate the condition and maintenance of the system prior to commencing the test, e.g., was the system cleaned out during or before initiation of this phase of the study?

Slide 52: Sample Plan Adequacy: Test Equipment and Sampling Issues
- Provide a description of, and/or reference to, the manufacturer's documentation showing how the flow meter and sampling equipment are calibrated, and their location in the system (cross-section and plan view).
- Provide a detailed description of the sampling program by design and storm event; indicate when and how much sample is to be taken.
- Provide the explanation and methodology for discrete sampling: indicate who took these samples, and when and where in the system they were taken; identify the equipment and calibrations used for sampling.
- Provide a discussion/explanation for sampling without detention delay in the system between inlet and outlet.

Slide 53: Sample Plan Adequacy: Lab and Data Analysis Issues
- Identify the laboratory and controls for sample handling and QA/QC on sample data.
- Identify the methodology used for particle size distribution (PSD), including sieve sizes and sample sizes.
- The analytical method for TSS is stated in the protocol: EPA 160.2.
- Explanation and equations used to calculate EMCs are needed, including calculation worksheets.

Slide 54: Field Test Issues
- Flow meters were not calibrated for the first 10 events.
- Truncated sampling protocol.
- No documented instrument calibration.
- Once the flow measurement error was identified, an adjusted flow factor was applied to the first 10 events, based on the outcome of the second 10 events.
- Measured and calculated storm volumes vary by up to a factor of 10, resulting in adjustments to flow volume.

Slide 55: Data Analysis Issue

Storm | Rainfall Depth (in) | Flow Volume (ft³)¹ | Sub-Samples | Influent EMC (mg/L) | Effluent EMC (mg/L) | Event Removal Efficiency
1 | 0.32 | 1,600 (2880) | 28 | 65.9 | 50.3 | 24%
2 | 0.52 | 12,051 (1607) | 17 | 1010.7 | 149.2 | 85%
3 | 0.32 | 9,117 (1216) | 12 | 1364.9 | 63.6 | 95%
4 | 0.32 | 13,203 (1760) | 13 | 857.6 | 49.4 | 94%
5 | 0.53 | 9,630 (1284) | 11 | 367.6 | 145.9 | 60%
6 | 0.46 | 6,831 (911) | 18 | 533.2 | 57.8 | 89%
7 | 0.55 | 14,441 (1925) | 30 | 43 | 31 | 28%
8 | 0.75 | 18,045 (2406) | 40 | 1088.8 | 52 | 95%
9 | 0.10 | 2,183 (291) | 6 | 37.2 | 33.6 | 10%
10 | 0.17 | 4,559 (608) | 12 | 61 | 38 | 38%
11 | 5.45 | 147,586 | 123 | 88.8 | 59.1 | 33%
12 | 0.48 | 1,284 | 40 | 111.6 | 47.3 | 58%
13 | 0.53 | 13,210 | 70 | 46.2 | 19.8 | 57%
14 | 0.13 | 2,908 | 12 | 69.2 | 14.7 | 79%
15 | 0.43 | 9,543 | 40 | 33.1 | 12.6 | 62%
16 | 1.91 | 71,607 | 40 | 164.1 | 93.2 | 43%
17 | 1.02 | 29,378 | 80 | 233.6 | 102.4 | 56%
18 | 0.27 | 6,858 | 33 | 93.3 | 25.5 | 73%
19 | 0.25 | 6,614 | 32 | 57.4 | 21 | 63%
20 | 0.30 | 7,753 | 37 | 188.4 | 70.3 | 63%

¹ Values in parentheses (shown in italics in the original table) are untransformed flow values in ft³.

Slide 56: Predicted / Measured Runoff Values
[Table comparing predicted and measured runoff volumes not transcribed; inconsistent values were highlighted in red.]
- The highlighted values indicate that the relationship between rainfall and total runoff is inconsistent.
- Applying the rational method (Q = CiA) leads reviewers to believe that the measured runoff is inaccurate when compared to predicted values (see the sketch below).
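A sketch of the reviewers' plausibility check. The rational method proper predicts peak flow (Q = CiA); the depth-based variant used here predicts total runoff volume. The runoff coefficient and drainage area below are hypothetical, since the test plan did not document them, so only the inconsistency of the ratios between events is meaningful, not their absolute values. The measured volumes are the untransformed values for storms 1 and 2 from the table on slide 55.

```python
# Plausibility check of measured runoff volumes against a rational-method
# style prediction: V_pred = C * (P / 12) * A, with P in inches, A in ft^2.
C = 0.9            # hypothetical runoff coefficient (mostly impervious site)
AREA_FT2 = 43560   # hypothetical 1-acre catchment

def predicted_volume_ft3(rain_in, c=C, area_ft2=AREA_FT2):
    """Runoff volume predicted from rainfall depth over the catchment."""
    return c * (rain_in / 12.0) * area_ft2

# (rainfall in inches, measured volume in ft^3): storms 1 and 2, slide 55.
for rain, measured in [(0.32, 2880), (0.52, 1607)]:
    pred = predicted_volume_ft3(rain)
    print(f"P={rain} in: predicted {pred:,.0f} ft^3, "
          f"measured {measured:,.0f} ft^3 (ratio {measured / pred:.2f})")
```

With any fixed C and A, storm 1 yields far more runoff per inch of rain than storm 2, which is exactly the rainfall-runoff inconsistency the reviewers flagged.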

Slide 57: Variation in Performance Values
Removal efficiencies for all events:
- Efficiency ratio: 83%
- Summation of loads: 73%
- Regression of loads: 77%
- Efficiency of individual storm events: 60%
Removal efficiencies for events 11-20:
- Efficiency ratio: 57%
- Summation of loads: 44%
- Regression of loads: 40%
- Efficiency of individual storm events: 59%

Slide 58: Data Analysis and Presentation Issues
- The impact of the adjusted flow on average net TSS/SSC removal resulted in a positive bias in performance efficiency.
- Missing raw data, including documented deviations from the sampling plan, lab analysis, and data management.
- Missing laboratory and data management QA/QC.
- No independent validation of calibrations, sampling, and analysis.
- Missing statistical analysis, including relative percent differences (precision) and percent recovery (accuracy), as well as number of samples (n), standard deviations, means (arithmetic or geometric must be specified), and standard errors.
- Missing documentation of the chain of custody protocol.

Slide 59: Ensuring Adequate Data
- Provide QC activities including blanks, duplicates, matrix spikes, lab control samples, surrogates, or second-column confirmation.
- State the frequency of analysis and the spike compound sources and levels.
- State the required control limits for each QC activity, and specify corrective actions and their effectiveness if limits are exceeded.

Slide 60: Summary of Data Quality Assessment: 5 Steps
1. Review the DQOs and sampling design.
2. Conduct a preliminary data review.
3. Select the statistical test.
4. Verify the assumptions of the statistical test.
5. Draw conclusions from the data.

Slide 61: Summary: Sampling Design
- Use of TARP Protocol data collection criteria
- Consideration of issues relating to sampling locations
- Sampling water quality parameters
- Lab analysis of water quality parameters

Slide 62: Summary: Statistical Analysis
- The Protocol recommends the efficiency ratio.
- Summation of loads, regression of loads, and mean concentration may be applicable.
- Each method gives somewhat different results.
- Check statistics to be sure they support claims.

Slide 63: Summary: Case Study
- Learn to recognize limitations in test plans, deficiencies in equipment data and documentation, and problems with the statistical analysis.
- Make decisions on the usability of the data.
- Share the data among TARP states and others.

Slide 64: Module II: Retrospective
- Sampling Design: planning, contingency planning, and flexibility.
- Statistical Analysis: methods show different results; use caution interpreting results.
- Data Adequacy (Case Study): learning to deal with sampling and data deficiencies.

Slide 65: What Have You Accomplished?
- Guidance for using the TARP Protocol
- Exposure to key issues in a technology demonstration field test
- Knowledge of the TARP Stormwater Work Group and others evaluating technologies

Slide 66: Questions and Answers (3 of 3)

Slide 67: Evaluating Stormwater Technology Performance: Links to Additional Resources
TARP: http://www.dep.state.pa.us/dep/deputate/pollprev/techservices/tarp/

