QAPP Outline
2 Element 1: Title Page with Approval Signatures Title of QAPP Name(s) of organizations implementing project Approval personnel Assistance agreement or contract number(s)
3 Element 2: Table of Contents List of all required elements and their page numbers Appendices References
4 Element 3: Distribution List Lists people who will get original and revised QAPP Everyone who does the work Everyone who manages them Funding agency
5 Element 4: Organization of Project Governmental entities, contractors, and key individuals. Roles and responsibilities. How often will these tasks be done? How will each person do their job? To whom will they report?
6 Examples of Agencies Required agencies: Tribal Environmental Program, USEPA. Other agencies: TAMS Center, state and local partners
7 Possible Contractors Sampling Laboratory Data Analysis QA/QC Audit
8 Examples of Key Individuals Required Tribal employees: Air Quality Program Manager, Quality Assurance Coordinator. Optional positions (may be contractors): Environmental Specialist, Environmental Technician
9 Program Manager: Roles & Responsibilities Oversees monitoring project Prepares or reviews quarterly & annual reports for submittal to EPA Ensures staff is hired and trained
10 Program Manager: Roles & Responsibilities (cont.) Prepares & maintains project work plan & budget Communicates with Environmental Director & EPA Project Officer Responsible for approval & modifications of project QAPP
11 QA Coordinator: Roles & Responsibilities Prepares or coordinates preparation of QAPPs Reviews and approves corrective actions Conducts system audits
12 QA Coordinator: Roles & Responsibilities (cont.) Oversees or conducts method performance audits Prepares QA reports Conducts or oversees data verification, validation and assessment
13 Environmental Specialist: Roles & Responsibilities Conducts sample transport, handling & exchange Delivers samples to laboratory (by mail or actual drop off) Signs off on chain of custody forms
14 Environmental Specialist: Roles & Responsibilities (cont.) Conducts quarterly calibrations, quarterly audits Records sample information on data forms Reports all aspects of monitoring project to Program Manager
15 Project Organizational Chart (Example) John W. Smith, Director, Navajo Tribe; Alexandria Washington, Air Quality Program Manager; Samuel Vaughn, Air Quality Specialist; Michelle Winston, Air Quality Specialist; Tom Lamb, Air Quality Technician; Sue Jones, QA Officer; plus the laboratory and EPA as appropriate
16 Element 5: Project Background History. Context. Assume an ignorant reader (e.g., a member of the public)
17 Why is this work important? Are there health effects in your community that may result from this problem (asthma, bronchitis)? Reduced visibility?
18 More reasons why this work is important: Concern about possible regional transport of pollutants (ozone precursors)? Increased development, more roads, businesses, residents? Concern about children's exposure?
19 Summarize Existing Information Previous results from earlier studies? Results from nearby areas? (if you did not gather the data, you may not be able to use it with your data but it can be useful to you in planning) Results from emissions inventory? Results from compliance monitoring of nearby facilities?
20 Who are the Decision-makers? Tribal Council Tribal Environmental Office director EPA Region
Element 6: Project Description Ondrea Barber Salt River Pima-Maricopa Indian Community
22 Summarize Purpose of Project Why are we making these measurements?
23 Standards What standard will the measurement results be compared against (if applicable)?
24 Field Work-Summary What kinds of measurements are being made? What kind of samplers are being used? How many measurements over what time period? Site locations
25 Field Activities- Summary Routine field sampling Sample collection Monthly calibrations/audits Instrument maintenance
26 Laboratory Work How are the samples being analyzed? (State the method; you can refer to a standard method.) Who is doing the analysis?
27 Schedule 1. Hiring deadlines 2. Training (dry runs with equipment) 3. Field measurements 4. Analysis 5. Reporting. Example activity/date table: Test sampler: October 1, 2005; First sampling: December 1, 2005
28 Assessments How will you check on your work and the lab's work to ensure the data are good (summarize)? Who is involved and what are their roles (summarize)?
29 Records of Assessment Internal assessments (readiness review) External assessments a. PEP audits for PM2.5, NPAP for others b. Technical Systems Audits
30 YOUR Assessments of the Analysis Lab Initial review of their QAPP and calibration certificates when you agree to the contract Onsite visit during the contract Ongoing review of their QC results
31 Records BRIEF description of the project's records and information on where they are stored. Ensure that detailed information is in Section 19. What reports are required?
Element 7: Project Quality Objectives Mathew Plate US EPA Region 9
33 Project Objectives Why are we making these measurements? Conformance with NAAQS Obtain baseline data To determine need for additional monitoring Health risk evaluation
34 Project Objectives What will we do with the results? Compare with NAAQS Report to community, EPA, health officials
35 Systematic Planning Required by grant regs: 48 CFR 46 Performance criteria QAPP or equivalent Data assessment Corrective action QA training for management and staff
36 Why Systematic Planning Quality is the extent to which our data are sufficient for the purposes for which they are used. We need a process that defines objectives for our monitoring data and ensures that we know when these objectives are met. Program objectives should be developed in consultation with decision makers
37 Decision Maker(s) Those who use data for decisions or conclusions, such as Is a standard violated? Should we take action to improve the air quality? What is the air quality now, so that we will know if it gets worse or better? Should we be taking more measurements?
38 Who are the Decision Makers for Air Monitoring Data? Required decision makers for EPA grants: EPA, the Tribe's Environmental Program. Other decision makers the Tribe may consider in quality planning: state and local organizations, researchers
39 Types of Objectives Project and Program Objectives Data Quality Objectives (DQOs) Based on Program Objectives Qualitative and quantitative Measurement Quality Objectives (MQOs) Specific criteria which when met should produce data of acceptable quality Quantitative
40 Information in DQOs How data will be used Type of data needed How data should be collected
41 DQOs Work Backwards... From: the degree of uncertainty you can tolerate. To: the acceptable degree of uncertainty in each measurement & the number of measurements to take
42 DQO Functions DQOs Link answers to actual measurements Set limits on uncertainty so that data produce required uncertainty in the answer
43 Example "We know that we meet the standard with 80% confidence." This means there is a 20% chance that we could be wrong and are actually higher than the standard.
44 Balancing Cost vs. Degree of Uncertainty Balances the cost of taking many samples against the desired uncertainty: taking many samples with expensive devices yields low decision errors; taking few samples at low cost yields high decision errors. Result: you may have to change objectives, e.g., use minivols to see if you need to monitor and ask for more $
45 Accuracy DQOs are concerned with determining the accuracy of measurements
46 Accuracy Accuracy = Total error Includes both bias and precision Measured by an audit against a true value and/or by evaluating measurement quality objectives
47 Translating DQOs into Usable Criteria DQOs should be defined in terms of data quality indicators. Criteria set for the data quality indicators are measurement quality objectives (MQOs). MQOs are set using empirical data, conservative assumptions, statistical assumptions, and/or common sense.
48 Data Quality Indicators These are sometimes called the PARCCS: Precision (P), Bias (A) (bias is sometimes called accuracy), Representativeness (R), Completeness (C), Comparability (C), Detectability (S) (also called sensitivity)
49 Precision...how well different measurements of the same thing under prescribed similar conditions agree with each other Random component of error sometimes high, sometimes low
50 Precision Precision = wiggle (variability within many measurements of the same thing). You are trying to estimate the variability within the population of all your measurements of the same thing (concentration). Two ways to estimate precision for a single instrument: If you have enough equipment, run side-by-side: two or more devices measuring the same concentration. If you have only one continuous instrument, you must estimate precision by how much the measurement fluctuates over time when it is measuring the same concentration.
51 Precision for Manual Methods Relative percent difference (RPD): RPD = (difference between two monitors) / (their average). Example: a difference of 1.0 between two monitors averaging 20.0 gives RPD = 0.05, or 5%
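The RPD calculation on this slide can be sketched in Python; the function and the example readings below are illustrative, not from the QAPP itself:

```python
def relative_percent_difference(a: float, b: float) -> float:
    """RPD between two collocated monitor readings, as a percentage."""
    average = (a + b) / 2.0
    return abs(a - b) / average * 100.0

# Illustrative readings: monitors at 19.5 and 20.5 (average 20.0, difference 1.0)
print(relative_percent_difference(19.5, 20.5))  # 5.0
```

The absolute value makes the result independent of which monitor is listed first.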
52 Precision Exercise for Manual Methods #1
53 Precision Exercise for Manual Methods #2
54 Bias A systematic distortion of a measurement process, which causes errors in one direction (i.e., generally positive or generally negative)
55 Bias Bias = how far from the truth you are, in terms of percentage. Bias = (audit result – your result) / audit result. Expressed as a percentage, so multiply by 100: 0.03 would be 3%, 0.11 would be 11%
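A minimal Python sketch of the bias formula above (the audit and sampler numbers are hypothetical):

```python
def percent_bias(audit_result: float, your_result: float) -> float:
    """Bias = (audit result - your result) / audit result, times 100 for percent."""
    return (audit_result - your_result) / audit_result * 100.0

# Hypothetical audit: the standard reads 100.0, your sampler reads 97.0
print(percent_bias(100.0, 97.0))  # 3.0 (a fraction of 0.03, i.e., 3%)
```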
56 Bias Exercise #1
57 Bias Exercise #2
58 Bias Exercise #3
59 Representativeness...a measure of how well your measurements represent the entire population of what you are trying to measure
60 Detectability …how low can this be reliably measured with this equipment? Is this low enough to measure trends and evaluate regulatory compliance? This value is found in instrument manuals or the laboratory QA plan
61 Completeness... the amount of valid data obtained from a measurement system compared to amount expected to be obtained under correct, normal conditions
62 Completeness (cont.) Completeness = (# results that are usable (valid)) / (# measurements necessary). Example: 8 valid results out of 10 necessary for Quarter 1 = 80% completeness. You can also do make-up runs to cover missed days
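The completeness calculation is a single ratio; a small Python sketch using the slide's 8-of-10 example:

```python
def percent_completeness(valid_results: int, required_results: int) -> float:
    """Valid (usable) results divided by the number of measurements needed."""
    return valid_results / required_results * 100.0

# Quarter 1 example from the slide: 8 valid results, 10 necessary
print(percent_completeness(8, 10))  # 80.0
```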
63 Comparability...measure of confidence with which one data set can be compared to another Beware of apples and oranges in disguise EPA tries to control this by the method designation process
64 Manual Methods Our precision, measured by side-by-side monitors, is 10% or less for all checks. Our bias, measured by independent audits (PEP for PM 2.5, NPAP for other criteria pollutants and PM 10 flow), is 10% or less for all checks. NPAP is a mail-in or van program required for all SLAMS and PSD monitors
65 Automatic (Continuous) Methods Our precision, measured by bi-weekly one-point precision checks (flow rate for PM, span gas for other criteria pollutants), is 15% or less for all checks. Our accuracy, measured by independent audits, is 15% or less for all audits
67 EPA's DQO Standards for the PM 2.5 NAAQS The question: Does this air meet the annual standard of 15 µg/m³?
68 Decision Error Limits EPA decision: acceptable to make correct decision on attainment 95% of time 5% of time, decision may be wrong
69 EPA Measurement Assumptions EPA based their calculations on 137 samples in a 3-year period, which is 75% completeness with a 1 in 6 day schedule
70 MQO Calculations Using acceptable uncertainty and the other assumptions, EPA calculated that each measurement must have a bias and a precision error (MQOs) of 10% or less (See element 7 in the EPA PM2.5 Model QAPP)
71 MQOs If these MQOs are met, then the conclusion about attainment will be correct 95% of the time (Maybe - - This assumption needs to be verified when the data is assessed and the DQOs are reviewed)
Element 8: Special Training Requirements Melinda Ronca-Battista ITEP/TAMS
73 Specialized Training Identify & describe for each person all specialized training/certifications needed How will training be provided? How will training be documented (memos to the file of on-the-job training, personnel files with certificates of training, annual appraisals)?
74 What Training Is Necessary? Technical (equipment, QA/QC, etc.): from the instrument manufacturer, from TAMS. Allot TIME allowed/expected for reading manuals and doing on-line courses (40 hrs?). Document that you have spent 16 (or more) hours reading the QAPP and understand its requirements, using a sign-off sheet; this will count as necessary training for anyone working on the project
75 What Training Is Necessary? (cont.) Computer use: Word processing Excel (see excellent on-line courses) Access GIS/GPS? On-the-job training/staff orientation Safety courses? (24-hour field safety, first aid/CPR etc.) TAMS AQS training
76 Examples of Training Options: EPA (regional, headquarters, OAQPS, etc.) ITEP: AIAQTP and TAMS Center California Air Resources Board (CARB) Local or state agencies with whom you have a good relationship: borrow their manuals, accompany them in the field, and document this in your personal logbook (# hrs spent, activity, instruments, your initials and date; make a photocopy for the personnel file)
77 Examples of Training Options (cont.) US EPA's Air Pollution Training Institute (APTI); can view tapes Tribal/state/local agencies University/college courses (college credit, CEUs, etc.) Air & Waste Management Association (www.awma.org)
Element 9: Documentation and Records Melinda Ronca-Battista ITEP/TAMS
80 This section is needed to: Protect the legal and financial rights of the agency and persons affected by the agency's activities Ensure that data are legally defensible (e.g., all changes are documented [who/when/why WRITTEN DOWN])
81 This section: Is short: complete data management is described in element 19. Think of data management as: What (requirements), Why, Who, How, Where, When. The details go in element 19; Element 9 can be very brief if you reference SOPs!
82 What This element lists the requirements for the records: what do you require of these records? Records of planning Records of operation Records of data management
83 Records created during project planning: Hiring and training records Initial contracts with instrument vendors, contractors Budget records Plans for instrument specs, network, site locations, site visits LIST THESE RECORDS, WHO WILL WRITE/REVISE/APPROVE THEM, HOW OFTEN, WHERE YOU WILL STORE THEM
84 Records created during project operations: Site visits Data transfer Audits and assessments QC checks (internal and external) Calibrations LIST THESE RECORDS, WHO WILL WRITE/REVISE/APPROVE THEM, HOW OFTEN, WHERE YOU WILL STORE THEM
85 Records created during data management: Download and transfer File naming, moving, pw-protecting, archiving Review, range checks, flagging, calculations Reporting to tribe, EPA LIST THESE RECORDS, WHO WILL WRITE/REVISE/APPROVE THEM, HOW OFTEN, WHERE YOU WILL STORE THEM
86 A little detail, please… In the text or a table, list for each item: Any rules: must be written in pen, dated and initialed; changes must be made so that the original data stay legible; computer files are password-protected for making changes. Duplicate copies made and stored where? How long are they kept, and who decides to throw them away? File naming conventions?
87 Requirements for data transfer: Specify rules for: never over-writing files; frequency of downloading files from the instrument and backing up files; keeping files on paper (may be a summary if very long), on disk, and on a PC (designate the PC); file structure (see example)
88 File naming convention example: All your files follow the same format (example): pppp_ssss_dddddd_nnn_rev# Post this on a wall and make changes to it to add fields Where pppp = pollutant, ssss = site name or number, dddddd = date when measurements were made, nnn = initials of the person saving the file, rev# = the revision number
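The naming convention could be enforced with a small helper; the field formats below (lower-case codes, a YYYYMMDD date) are assumptions for illustration:

```python
def data_file_name(pollutant: str, site: str, date_yyyymmdd: str,
                   initials: str, rev: int) -> str:
    """Build a file name following the pppp_ssss_dddddd_nnn_rev# convention."""
    return f"{pollutant}_{site}_{date_yyyymmdd}_{initials}_rev{rev}"

print(data_file_name("pm25", "site01", "20051201", "mrb", 0))
# pm25_site01_20051201_mrb_rev0
```

Generating names from one function, rather than typing them by hand, keeps every file consistent with the posted convention.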
89 List all documents: SOPs, QAPP (this one and that of any lab analyzing your samples) Site logbooks, personal logbooks Repair and maintenance records Reports (drafted, final, sent out to tribal authorities, EPA) Photos of sites Letters from the community, EPA, etc. are kept and filed
90 Data handling procedures: Example: state that this project will use an Access database for all transcribed or input records and data State that the requirements for data handling are in element 19; this element only lists the records and the requirements for the records
91 List your plans for reports received and sent out: Quarterly & Annual Reports submitted to tribe, EPA Quarterly Laboratory Data Report Audit reports Quarterly AIRS-AQS Data submittal to EPA List these in a table
92 References QA documents SOPs Other documents Have all references available in your office
Element 10: Network Design Ondrea Barber Salt River Pima-Maricopa Indian Community
95 Rationale for Measurement Location and Frequency Refer to the data quality objectives (Element 7) Use the objectives when you decide where/how often to monitor Near where people live? Overall community background? Near sources? Discuss purpose of primary & collocated samplers
96 Design Assumptions Document what you are assuming: That a source will impact air quality in the monitored area? Weather patterns, road use, community development in the future, whatever Are your locations reasonable in terms of CFRs and other guidance documents (see ITEP CD)
97 Data Generation/Collection Design What type of equipment will be used to generate/collect data? How many will be used? Primary and collocated samplers, meteorological equipment Frequency of data collection (24 hours)? Saturdays & Sundays? How many calendar years? Is the monitoring equipment FRM or FEM? If so, what is the designation number?
98 Sampling Network Design Do you have special purpose samplers (SPMS) measuring for baseline conditions? Do you have a SLAMS-designated site that will provide EPA with national data? Discuss siting requirements (40 CFR Part 58 App. D & E; although these may be impossible to meet so do your best, and confer with your EPA regional office) Monitor location: Roof? Platform? Other? Spatial scale of representativeness (40 CFR Part 58 App. D)
99 Monitoring Objectives could include: 1.To determine highest concentrations expected to occur in area covered by network 2. To determine representative concentrations in areas of high population density 3. To determine impact on ambient pollution levels of significant sources or source categories
100 More possible monitoring objectives 4.To determine general background concentration levels 5.To determine extent of regional pollutant transport among populated areas, and in support of secondary standards 6.To determine culture [EPA calls welfare]- related impacts in more rural & remote areas (i.e., visibility impairment, effects on vegetation)
101 Table 1. Relationship Among Monitoring Objectives and Scale of Representativeness. Population: micro, middle, neighborhood (sometimes urban¹). Source impact: neighborhood, urban. General background: neighborhood, urban, regional. Regional transport: urban/regional. Welfare-related impacts: urban/regional. ¹ Urban denotes a geographic scale applicable to both cities and rural areas
102 Critical / Non-critical Measurements Critical measurements are those required to achieve project objectives or limits on decision errors: field measurement requirements (i.e., ambient temperature, barometric pressure, etc.), what you would include when submitting data to AQS. Non-critical measurements are those that are nice to know but not make-or-break
103 Standard Measurements Federal Reference Methods (FRMs) and Federal Equivalent Methods (FEMs) provide standard measurements as required by EPA for comparison to the NAAQS Use of any non-standard measurement method can provide useful information Also pertains to filter-weighing laboratories – EPA requirements must be met
Element 11: Sampling Methods Requirements Melinda Ronca-Battista ITEP/TAMS
106 Sampling Method Requirements Requirements, not procedures, go in the QAPP Attach your SOPs!
107 Sampling Equipment, Preservation, & Holding Time Requirements Requirements for getting samples (data) to lab without losing what you are measuring or making it stink…
108 Requirements for the Sample Volume of air or whatever medium you are collecting Object with which you are collecting the medium Data for automatic methods (complete, copied onto floppy, file named correctly, never overwritten)
109 Requirements for PM 2.5 Samplers Installed & operated in adherence to the requirements in 40 CFR Parts 50, 53 and 58; Section 2.12 of the EPA QA Handbook; the sampler manufacturer's operation manual; SOPs; and this QAPP
110 Sample (Data) Collection Overview Schedule of receiving filters (data) Schedule of sampling Sample route Describe procedures (may be in SOP) for when retrieving samples (data) do not go as planned
111 Prevention of Contamination Requirements for Temperature Humidity Time Integrity Custody Data handling
112 Sample Preservation Requirements for PM2.5 Filter cassettes stored in filter cassette storage containers Stored with the particulate side up. Temperature (40 CFR Part 50, Appendix L) No direct sunlight or closed-up car during summer Time before sample recovery and time before weighing
113 Set-Up Requirements When Where How (ref. SOPs) Who Documentation
114 Sample (or data) Recovery Requirements When: PM2.5 recovery must occur within 96 hours of the end of the sample period for that filter Where (sites) How (order) Who Documentation
115 Support Facilities for Sampling Methods Office, trailer, truck, & cooler must be consistent with requirements for Temperature Humidity Integrity Custody Storage capacity
116 Field Safety: State that safety comes before getting the sample (data) Reference health and safety plan Provide training if appropriate
117 Field Corrective Action Who is responsible for fixing it? Verifying that it is fixed? Reporting the fix to? Where do they write how they fixed it? When do they have to fix it by? How do you make sure it does not happen again? Where is the documentation stored?
118 Corrective Action Responsibilities (cont.) Who makes sure the problem was solved Who makes sure it doesn't happen again Who approves changes to sampling locations, personnel, SOPs, QAPP
Element 12: Sample & Data Custody Ondrea Barber Salt River Pima-Maricopa Indian Community
121 Sample (Data) Integrity Packing List Shipping Receipt Annual Shipping Schedule Log Book Other
122 Sample Custodian Who is responsible for sample or data custody? Verify sample/data security. Track sample/data storage location.
124 Internal Sample Custody Procedure Sampling media (filter) receipt Tracked in Logbooks and/or Computers Storage Access Archive Disposal
125 Data Custody Procedure Data storage Data transfer Data security Data backup (logger/strip chart)
Element 13: Analytical Methods Requirements Melinda Ronca-Battista ITEP/TAMS
128 This element is important for comparability: The requirements for all FRM monitors are the same, so that data from different sites can be compared The performance of all labs meeting these requirements is the same, so that data from different labs can be compared
129 Summarize method: In one paragraph, summarize how the lab or instrument conducts the measurement If FRM or FEM, cite the method number List special components, modifications, inlets List requirements for equipment you are using (scales, thermometers)
130 List in this element: Requirements for how well the lab or instrument performs List analytical methods and equipment
131 Instrument or lab environment: Limits for temperature, humidity If the instrument requires a shelter, describe the requirements here If lab or instrument changes their procedures or performance, who approves it? What are the requirements they/it must meet?
Element 14: Quality Control Melinda Ronca-Battista ITEP/TAMS
134 QC: An Ongoing System Measuring Comparing with MQO Graphing it Fixing it when needed Everything must be documented and, when significant, reported
135 Evaluate Where Things Can Go Wrong, and How To Check Preparing for the field Sampling in the field Analyzing the samples Entering the data Reporting the data
136 Error …the difference between your answer and the truth
137 Bias Error Minimized by calibrating your equipment against a standard Make sure the standard has TRACEABILITY to a NIST standard Comparing the standard to field and/or lab equipment measures any bias in your equipment
138 Verifying the Accuracy of Your Transfer Standard If necessary, send your standard to the manufacturer, who has NIST-traceable equipment, and they send you back your standard with a certificate of traceability If appropriate, use natural standards, such as freezing ice water for temperature, and carefully document your calibration according to a standard method
139 Single (One) Point Verifications A single-point verification is a check conducted at one flow rate, concentration, or value A span check is an example of a one-point check Verifications DO NOT involve any changes to your equipment or settings
140 Multi-Point Verifications Check of the response of the instrument to more than one flow rate, concentration, or value Usually zero, a low level, and at the upper end of the range expected to be measured
141 Example of Verification Perform a check of your equipment: flow rate for PM, concentration for gas, internal mass for TEOM This may be single-point or multi-point If this is within specifications, record this and continue
142 Calibrations If results are NOT within specs, then the instrument must be adjusted This adjustment means that the response of the instrument is changed, which is defined as a CALIBRATION Must be multi-point After calibrating, check again at a single point
143 Example for PM2.5 Corrective Action if the flow rate difference between the transfer standard and the sampler is > 4% Check sampler for internal and external leaks Ensure that temp. and pressure are within normal ranges Run the check a 2nd time If still > 4%, perform a multipoint calibration followed by a single-point verification
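The acceptance test in this corrective-action example can be expressed as a simple check. The function name and readings are illustrative; 16.67 L/min is the PM2.5 FRM design flow rate:

```python
def flow_within_spec(standard_lpm: float, sampler_lpm: float,
                     tolerance_pct: float = 4.0) -> bool:
    """True if the sampler flow agrees with the transfer standard within +/-4%."""
    diff_pct = abs(sampler_lpm - standard_lpm) / standard_lpm * 100.0
    return diff_pct <= tolerance_pct

print(flow_within_spec(16.67, 16.67))  # True
print(flow_within_spec(16.67, 16.00))  # False: about 4.02% low, so recheck, then calibrate
```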
144 Who can conduct verifications? Routine checks of instrument stability can be conducted by the site operator Periodic assessments (may be every six months, may be every year, should be quarterly check of flow rate for PM2.5) should be conducted by someone OTHER than the site operator QAPP specifies how often these checks are conducted, by whom, what to do if results are off
145 Calculations for Results of Verifications Consider the standard to be the ideal truth for your equipment The difference between the ideal truth and your equipment is the instrument error (this may include bias [error usually high or usually low] and imprecision [wiggle: sometimes high and sometimes low]) How is instrument error quantified?
146 Quantifying instrument error: Truth = the standard's result Y = your equipment's result Your equipment's error = (Truth – Y) / Truth
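As a Python sketch of the error formula above (the check values are hypothetical):

```python
def instrument_error(truth: float, measured: float) -> float:
    """Fractional instrument error: (Truth - Y) / Truth."""
    return (truth - measured) / truth

# Hypothetical check: the standard reads 20.0, the instrument reads 19.0
print(instrument_error(20.0, 19.0))  # 0.05, i.e., a 5% error
```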
148 Precision Error Some imprecision is unavoidable Sometimes up, sometimes down– random Estimated by measuring the same thing several times Minimized by carefully following procedures
149 Two Sources of Precision Error Field Lab Cows can be problems….
150 Estimating Precision in the Field Estimate the random wiggle error If you have more than one of the same type of instrument, place side-by-side (measuring the same sample of air within a room or small area) If you only have one instrument make repeated measurements (same sample of air, quickly in time so the air does not change) Verify that results are within limits
151 Estimating Precision in the Lab Repeated filter weighings OF THE SAME FILTER or a standard weight Verify that results are within limits; if results are very different, then there is a lot of imprecision, and there may be an intermittent power draw, a breeze blowing onto the scale, changes in humidity…
152 Next steps of QC If you are outside the limits: Review procedures & logs to identify the problem Go back and review the data; you may have to throw out data back to the last good check Fix, document and report (when significant)
153 Calculating Precision Error Because we don't know which device is better, there is no truth, so use the average as truth X = one sampler Y = the other sampler Then precision error is (X – Y) / [(X + Y)/2] (multiplied by 100 to yield a value in percent)
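A sketch of this precision-error calculation (the sampler readings are made up):

```python
def precision_error_pct(x: float, y: float) -> float:
    """Difference between two collocated samplers, relative to their average, in percent."""
    average = (x + y) / 2.0
    return (x - y) / average * 100.0

# Made-up pair: samplers reading 21.0 and 19.0 (average 20.0)
print(precision_error_pct(21.0, 19.0))  # 10.0
```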
154 Coefficient of Variation (COV) COV = s / (average), where s is the sample standard deviation See the Tribal Data Analysis spreadsheets for example calculations that you can use
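The COV formula maps directly onto the standard library's `statistics` module; the three repeat measurements below are hypothetical:

```python
import statistics

def cov_percent(values) -> float:
    """Coefficient of variation: sample standard deviation over the mean, in percent."""
    return statistics.stdev(values) / statistics.mean(values) * 100.0

# Three hypothetical repeat measurements of the same concentration
print(cov_percent([9.0, 10.0, 11.0]))  # 10.0
```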
156 Accuracy = Total Error Accuracy for EPA means total error Comparison conducted with completely different system can be used to estimate total error For PM2.5, this is a performance evaluation conducted with a side-by-side FRM device, with the filter weighed by a different lab Total error includes both precision and bias errors
157 Blanks Measure anything that affects the result outside of what you are measuring May make the result greater (contamination) Or decrease the result
158 Types of Blanks For real-time measurements, zero checks display the value with no air Manual methods using filters must use field blanks, which accompany real samples Labs must measure their own blanks to assess whether there is any contamination in the lab If it is possible that samples get damaged or contaminated during shipping, use shipping blanks (trip blanks)
159 Field Blanks Handled exactly as field samples Some field blanks go everywhere field samples go With each operator, site, procedure
160 Control Charts You try to keep something steady, but it naturally varies: 95% of the time within 2 s of the average, 99.7% of the time within 3 s of the average Red lines are drawn at ±2 s of the average s = sample standard deviation
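The ±2 s "red lines" can be computed from past QC checks; this is a minimal sketch with hypothetical values chosen so the mean is 10.0 and s is 1.0:

```python
import statistics

def control_limits(values, k: float = 2.0):
    """Mean +/- k sample standard deviations (k=2 for the slide's red lines)."""
    m = statistics.mean(values)
    s = statistics.stdev(values)
    return m - k * s, m + k * s

# Hypothetical QC-check history: mean 10.0, sample standard deviation 1.0
print(control_limits([9.0, 10.0, 11.0]))  # (8.0, 12.0)
```

A new check falling outside these limits is a signal to investigate, per the corrective-action steps elsewhere in this QAPP.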
Element 15: Equipment Testing, Inspection, and Maintenance Requirements (General Principles) Ondrea Barber Salt River Pima-Maricopa Indian Community
163 Purpose of T.I.&M. Increase system reliability Data completeness Minimize down time Document credibility of data
164 Types of instruments Ensuring your data meet their quality objectives: Emphasize those instruments that impact data quality Acceptance testing of equipment Outside person to review final tests Provide a table (referencing SOPs is okay!)
165 Checks Check equipment before you go into field Check in field before you make measurements Check after you make measurements Ongoing checks and maintenance Checklists! (include in QAPP, even if you know it will be revised)
166 Personnel Delineate responsibilities: Person(s) to do T.I.&M. Person(s) to order equipment and supplies Person to report to for replacement parts or potential problems Person(s) to report problems to and to contact for corrective actions
167 Procedures Describe how T.I.&M. will be done Schedule of T.I.&M. Documentation in QAPP appendices (maintenance checklists kept in sampler case and logbook, standard forms with boxes to list values for parameters or check that tubing and wiring are in satisfactory condition) Location/storage of completed checklists Maintenance history: inventory of replacement parts, suppliers, spare parts, other consumables
168 Visual Inspections Inspect for Damage to monitor Condition of filter and surroundings (i.e., cleanliness) Consistent power supply so that start/stop times are reliable O-rings in place and not torn Wires and tubes all attached
169 Inspections Specify what is done in the field or in the office Be practical Allow adequate time to do inspections and document their results Take spare parts such as O-rings with you; add this to the checklist
170 Inspection Considerations Frequency Inspection parameter: what do you look for? Action if an item fails inspection: how do you fix it? Documentation (logbooks for each piece of equipment & inventory of spare parts, oil, etc.; instrument or site visit checklists; include these in the QAPP or reference the owner's manual)
171 Site-Specific Factors Temperature Precipitation Wind Curious people Birds, bugs, leaves
172 Toolbox Screwdrivers (Phillips, flat head, tiny) Wrenches Gloves Digital multimeter – troubleshoot electrical components Extra batteries – to charge computer board, etc. Duct tape State the toolbox contents in the QAPP
173 Maintenance Spare parts Location of parts Inventory of parts Where parts or equipment are purchased
174 Schedule of Maintenance Include the maintenance schedule, with field or transfer standard traceability to NIST or a certificate of calibration referenced
175 Cleaning Supplies Ammonia-based general purpose cleaner – clean hub unit Cotton swabs – clean vent tubes Soft bristle brushes – air screen, bug screen, etc.
176 Cleaning Supplies (cont.) KimWipes – lint-free Silicone-based lubricant – ensure leak-free fit of rain jar, O-rings, other connections Powder-free rubber gloves – sample handling, transport
177 Be sure to include: How you conduct T.I.&M. (reference SOPs, pages from owner's manuals) or briefly summarize Schedule Spare parts Corrective action Checklists, documentation sheets
Element 16: Instrument Calibration Ondrea Barber Salt River Pima-Maricopa Indian Community
180 Calibration Calibration is defined as Comparison of instrument response to a standard and Adjusting response to fall within planned limits (remember that if you just check the response and it is okay, then that is a verification)
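The verification-versus-calibration distinction above can be sketched as a simple decision: compare the instrument response to a standard, and only adjust (calibrate) when the difference exceeds a planned limit. This is an illustrative sketch only; the 4% limit and all names are placeholders, not values from any QAPP or regulation.

```python
# Hypothetical sketch: decide whether a check against a standard is a
# passing verification or requires a calibration (adjustment).
# The 4% acceptance limit is an illustrative placeholder.

LIMIT_PCT = 4.0

def needs_calibration(instrument_reading, standard_value, limit_pct=LIMIT_PCT):
    """True if the percent difference from the standard exceeds the limit."""
    pct_diff = abs(instrument_reading - standard_value) / standard_value * 100.0
    return pct_diff > limit_pct

print(needs_calibration(16.2, 16.7))  # -> False (verification passes, no adjustment)
print(needs_calibration(15.0, 16.7))  # -> True  (outside limit, recalibrate)
```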
181 List Equipment that Requires Calibration Identify all tools, gauges, instruments, and anything that produces values Make a table that lists the frequency of calibration and how it is to be conducted and who is responsible for ensuring that it gets done
182 List Equipment that Requires Calibration (cont.) Limit to equipment you will be responsible for If you use transfer standards (temperature, flow rate, BGI delta-cal), then these must be periodically recalibrated, so that you know they are producing valid results and this is documented
183 Reference Attached SOPs and... Describe briefly or reference SOP: How calibration is done When to calibrate Summarize calculations Summarize calibration records (logbooks, forms, reports)
184 Calibration Standards Primary standards: keep as the gold standard Field, transfer, or working standards are used in the field These apply to flow rate, temp., pressure, etc.
185 Types of Calibration Multiple point Zero-level Repetitions at each concentration Always verify stable operation after a calibration by checking at least one point again
186 Changes to Calibration Schedule You may have to recalibrate if you: Move, repair, or reassemble equipment If QC checks show degradation If QC checks show great stability, then may not have to recalibrate so soon Change in weather Change in pollutant concentration
187 Justification for Changes Documentation: write a memo to the files (see example on CD) External reviewer: get a reality check from another person Periodic verifications of your calibration schedule and procedures
188 Do Calibrations Yourself? Requires careful documentation Use standards calibrated by vendor or another certified lab Use these standards to calibrate your instruments
189 If you need a Laboratory Get a copy of the laboratory's QAPP, include as an appendix to your QAPP Conduct a tour of the facility if possible Communicate regularly with facility personnel Make sure the lab documents everything, provides reports
190 Documents to you from analysis lab: Copy of their QAPP Copies of their internal performance evaluations within the last year, and throughout the project Agreement that they provide you data on paper and electronically (via e-mail) in Excel or whatever format you agree to
191 Documents to you from the lab that certifies your calibration standards: Copy of calibration certificate showing what standard they use to calibrate your equipment Traceability of this standard to the National Institute of Standards and Technology (NIST) Agreement that they provide you a calibration certificate and detailed report on paper and electronically (via e-mail) in Excel or whatever format you agree to Outline of calibration procedure
Element 17: Inspection and Acceptance for Supplies Ondrea Barber Salt River Pima-Maricopa Indian Community
195 Requirements What are the requirements for the equipment / supplies you will use during the project?
196 Make a List All supplies & consumables that may directly or indirectly affect the quality of the project Filters Hoses Oil Batteries Disks
197 Provide a Table Listing Description Vendor Specifications Model number Call the sales rep. and ask them to fax you a list of what parts will be needed
198 Two tables may be needed: (1) listing of critical supplies, parts (hoses, filters, disks, etc.) (2) list of their acceptance criteria (diameter/type of hose, type of filter, disks preformatted, whatever is appropriate)
199 (1) List of Critical Supplies: Example First Table for PM2.5: Critical Supplies and Consumables (excerpt)
200 (2) Acceptance Criteria List criteria for each item Should be consistent with the project's overall technical and quality criteria Should reflect common sense, but need to document the basic requirements, even if they seem obvious to you
201 (2) Acceptance-Criteria Table Second Table: Example Acceptance Criteria for Supplies and Consumables
Element 18: Data Acquisition Requirements for Non-Direct Measurements Melinda Ronca-Battista ITEP/TAMS
204 What is non-direct data? Any data that you do not gather, such as Weather information Housing information Physical constants Instrument parameters
205 Examples Chemical and physical properties data Operations manuals Geographic data – site, boundary conditions, met sites Data from previous studies Meteorological information – U.S. Weather Service Data, wind rose information Census data, housing office data
206 Be sure to include: Any non-direct data you plan to use How you plan to use it If you are using historical monitoring data, how you will keep track of which data you gather and which data you have gotten from another source
207 Why is the QA Significant? If you plan to use the data with yours, data that you do not gather should: Have undergone QA review Meet quality objectives established for your program This means that you need to get a copy of their QAPP or QA report and compare with this QAPP
208 Step 1 – Describe your Requirements: How you are going to use this data Whether this could change conclusions What are your quality objectives for this data If the data is critical, describe some or all QC parameters
209 Step 2 – Describe the Data Sources: Nationally recognized source (USGS, NWS, & NIST) Peer Reviewed Source (published in a peer reviewed journal) Monitoring data from another study
210 Step 3 – Determine if Data Validation will be Performed: If the data is from a nationally recognized source and is being used for a similar purpose, validation is probably not needed If the data has not undergone external review and/or the data will be used to meet a different purpose than intended (when collected), some level of validation is required
211 Step 4 – Describe Data Validation for data you plan to use but did not gather Review QC checks Review QA evaluations Compare to other data sets Perform QC measurements Don't be afraid to discard data that you don't trust Reference validation in elem. 24
Element 19: Data Management Melinda Ronca-Battista ITEP/TAMS Ondrea Barber Salt River Pima-Maricopa Indian Community
214 Six Elements of Data Management (1) Data processing and transmission (WHAT) (2) Data end use and integrity protection (WHY) (3) Data access (WHO)
215 Six Elements of Data Management (cont.) (4) Data dissemination (WHERE) (5) Data storage and retrieval (HOW) (6) Data disposal (WHEN) Document everything!
216 Describe in the QAPP: Brief description of Data management objectives (all data is retrievable) How you will meet these objectives (no data erased, no files overwritten, use Sharpies, date and initial) Describe (briefly) your paper filing system Describe logbooks: personal, instrument Electronic filing system, file naming system
217 Data Flow Step by step data tracking Who has access to read Who has access to change data Version tracking of files How changes are approved How file integrity is checked
218 Files for each instrument and any extra sensors: Request as much information as possible electronically and on paper from vendors to allow you to make extra copies Maintain file for each instrument with contact information, calibration records, manuals
219 Material Receipt and Storage Use logs for everything (see examples in template QAPPs) Use indelible markers Use tags Use designated shelves with labels Use SOP that is POSTED
220 Types of Field Data Sheets Site Data Sheets document the site information in site files and in database Sampler run data sheets go into the field; input info into the database Verification data sheets every 4 weeks (1-point flow rate, temp, pressure, time) Internal audit data sheets for the sampler and extra sensors (your own audits)
221 Before leaving office for the field: Review the number of each item you will need, and bring backups (use a checklist) Check field data sheet from previous visit to site PM2.5: Ensure that there are filter cassettes for routine, field blanks, and collocated samples PM2.5: Ensure there are enough field transport containers, ice substitutes, max/min thermometers, preprinted mailing labels, if mailing immediately
222 Write in pen and update the documents: Continuously update the checklist in pen; make a photocopy and put it in the to-do pile to: add the information to the database, and change the form in the computer if appropriate At the site, draw a map on the field data sheet Take photos if possible
223 Sampler Placement Records Unobstructed air flow for a minimum of 1 m in all directions Inlet at a height of 2 to 15 m above ground level If collocated with any other PM sampler, the spacing between sampler inlets must be > 1 m for other PM2.5 samplers and > 2 m for PM10 All samplers that are called collocated must be within 4 m Sampler inlet must be level Vertical distance between two inlets < 1 m
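The siting criteria above are simple numeric rules, so they lend themselves to a checklist-style check. The sketch below encodes a few of them (inlet height, clearance, collocated spacing); it is illustrative only, the function and parameter names are invented, and the collocated-spacing check is simplified to the PM2.5 case (more than 1 m between inlets, within 4 m overall).

```python
# Hypothetical sketch of a siting checklist: thresholds are copied from the
# slide above; names are made up for this example.

def check_placement(inlet_height_m, clearance_m, collocated_spacing_m=None):
    """Return a list of siting problems (an empty list means the site passes)."""
    problems = []
    if not (2.0 <= inlet_height_m <= 15.0):
        problems.append("inlet height outside 2-15 m above ground")
    if clearance_m < 1.0:
        problems.append("less than 1 m unobstructed air flow")
    if collocated_spacing_m is not None:
        # Simplified PM2.5 case: > 1 m between inlets, within 4 m overall
        if not (1.0 < collocated_spacing_m <= 4.0):
            problems.append("collocated spacing outside 1-4 m")
    return problems

print(check_placement(3.0, 1.5, collocated_spacing_m=2.0))  # -> []
print(check_placement(1.0, 0.5))  # two problems reported
```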
224 Sampler Maintenance and Cleaning Records Plan all required maintenance 12 months aheadpost 3 months on a calendar Use a checklist that includes items to bring with you and what to do for: Every visit (filter change data sheet) Every month (1/3 day schedule) or quarter (1/6 day) (verification data sheet) Every quarter (flow rate audit data sheet) See examples
225 Computer Backups Every two weeks or as data are gathered, add "back up today" to the calendar Alternate between two sets of backup ZIP or JAZ disks Store backup disks in a relatively fireproof location (another building, garage) Plan for the time and money it takes to save copies of files
226 Supplementary records must be kept for only 3 years Chain of custody forms Notebooks Field data sheets
227 Keep records organized in readiness for an audit Paper copies must be available to auditor to attach to their reports in the categories of: Management and organization Site information Field operations Raw data Data reporting QA THE BEST WAY TO GET RID OF AN AUDITOR QUICKLY IS TO GIVE HER PAPER COPIES OF EVERYTHING SHE WANTS
228 Site information files to include: Site Data Sheets Site maps Site photos Summary of instruments at each site and shelter, trailer information Addresses, names, phone numbers
229 Field Operations files to include: Instrument manuals, warranties, calibration certificates in a file for each instrument Standard operating procedures (SOPs) Field notebooks and communications Copies of most recent field sheets Inspection/maintenance records
230 Raw Data files to include: Any original data (routine and QC data on disk and paper) Reports from laboratory or external audits Strip charts Disks Supporting data, such as NWS data
231 Data Reporting files to include: Internal reports Weekly/monthly summaries Corrective action reports Reports to EPA Copies of presentations to community
232 QA files to include: For each instrument, copies of instrument check reports (verifications) Calibration reports (multipoint using some standard) Archived control charts QA reports Audit reports and reports on how problems were solved (corrective action reports)
233 Measurement results should be kept indefinitely Data must be accessible for 5 years All official reports kept for 5 years Paper copies can be discarded after 5 years and electronic copies archived on disk/CD
234 Data Transformation & Analysis Data analysis requirements contained in 40 CFR Part 58, Appendix A Show equations you will use (usually simple relative percent difference) Show examples of charts (use example control chart in the template QAPP)
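The "simple relative percent difference" mentioned above is the usual precision statistic for paired results (for example, a routine sampler and its collocated duplicate): the absolute difference divided by the mean of the two values, times 100. A minimal sketch, with illustrative names and data:

```python
# Sketch of the relative percent difference (RPD) calculation:
# RPD = |a - b| / ((a + b) / 2) * 100

def relative_percent_difference(routine, duplicate):
    """RPD between a routine result and its collocated duplicate."""
    mean = (routine + duplicate) / 2.0
    if mean == 0:
        raise ValueError("mean of the two results is zero; RPD is undefined")
    return abs(routine - duplicate) / mean * 100.0

# Example: collocated PM2.5 results of 12.1 and 11.5 ug/m3
print(round(relative_percent_difference(12.1, 11.5), 1))  # -> 5.1
```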
235 QAPP outlines plans for data transmittal: How data gets from equipment into your computer How (who, how often) data is input How (who, how often, how much) someone double checks data entry How data is sent electronically How you will transfer data into AQS
236 Describe plans for data flagging Criteria for flagging data Flags may be generated by the instrument Flags may be noted by you on the site data sheet (e.g., high winds) and entered into the database later Flags may be automatically written into a column in excel if values are outside a specified range (see example on CD)
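The last bullet above describes writing a flag automatically when a value falls outside a specified range. A minimal sketch of that logic, assuming placeholder limits (the real acceptance range would come from your QAPP, not from this example):

```python
# Hypothetical sketch of automatic range flagging; the limits are
# placeholders, not regulatory values.

LOW, HIGH = 0.0, 500.0  # example acceptance range for a raw reading

def flag_value(value, low=LOW, high=HIGH):
    """Return a flag string if the value is outside the range, else ''."""
    if value < low:
        return "LOW"
    if value > high:
        return "HIGH"
    return ""

readings = [12.4, -3.0, 612.9, 48.0]
print([flag_value(v) for v in readings])  # -> ['', 'LOW', 'HIGH', '']
```

The same rule can be expressed as a spreadsheet formula in a flag column, which matches the Excel approach the slide describes.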
237 Describe plans for data storage & retrieval Data archival policies: where, how long, on paper and disk Security of data; locked cabinet; password-protected Database management
Element 20: Assessments & Response Mathew Plate US EPA Region 9
240 Corrective Action Designed to identify and correct flaws in your system Anyone can start and document corrective action Documented by a formal process (a review by a supervisor and a memo to the file; see example)
241 Internal Assessments May be qualitative, such as a review of whether documentation is in order, or technical, checking procedures, and May be conducted by your tribal air organization Someone who has common sense and is technically competent Someone from another program, such as water, solid waste
242 Internal Assessments, cont. May be informal and consist of a review between you and your supervisor on progress IF this is to be considered an assessment it must be DOCUMENTED; use a checklist or memo
243 External Assessments Performed by outsiders such as another tribe, EPA region, or consultant who is technically qualified and Understands the project's QA requirements A real auditor
244 Types of Assessments Surveillance Over-the-shoulder monitoring of project & records; this can be internal or external Technical Systems Audit (TSA) On-site examination of facilities, equipment, personnel, training, procedures, and record keeping, usually conducted by EPA Performance Evaluation (PE) involves numeric comparison of results between the auditor's equipment and yours
245 Performance Evaluations: Internal that you conduct as a test with a borrowed standard, during an inter-comparison with other agencies; VERY useful, esp. before an external audit External side-by-side on-site with another device independently calibrated Comparison of your results with equipment mailed to you or in an EPA van (NPEP)
246 Definitions of Assessments Audits of data quality – qualitative are you working toward your objectives? Do your measurements make sense? Data Quality Assessments – quantitative comparison of results with someone else's Management Systems Reviews review of QA system; usually conducted by EPA Network Reviews are your locations/instruments appropriate?
247 All Assessments: Basically compare What is actually being done in the field and the office Against what is stated in the QAPP and SOPs
248 Readiness Review: one type of Internal Audit Conducted before starting routine measurements to assess: are we ready? Technical components – equipment Training The report should be approved by an uninvolved person Readiness review can be counted as an internal assessment, IF it is documented and any problems are resolved and documented as well
249 Technical Systems Audits Every three years Look at reports, computer files, logbooks, control charts Follow people around Compare what is happening with QAPP and SOPs
250 Technical Systems Audit (cont.) Field operations: sampling, shipping QA – corrective action QC – field checks, data flagging, record keeping Data management: security Reporting – accuracy
251 Audit of Data Quality Evaluating your data before you report it How data are handled What judgments were made Do your conclusions make sense? Annually and as part of a technical systems audit
252 Network Review 40 CFR Part 58 Appendices D and E How well is your network meeting its objectives? How should it be modified? Conducted formally annually, but you should be continually assessing your results
253 Describe in your QAPP Number, frequency, and types of assessments People or organizations doing the assessments Schedule Criteria for assessments Reporting and responsibility for follow- up
254 Element 20 includes a table listing assessments:
255 Responsibility for Follow-Up and Verification of Corrective Action CLOSE THE LOOP (fix problems and take action to make sure they do not happen again) And make sure it is documented
Element 21: Reports to Management Ondrea Barber Salt River Pima-Maricopa Indian Community
258 Purposes of Reports Communicate Document Track CYA
259 PM2.5 Reporting Pictures are worth a thousand words CHART YOUR DATA: by site, by date Flow Rate Audits conducted quarterly by someone other than the routine site operator Results of collocated FRM samplers Results in ranges of concentrations (see tribal data analysis spreadsheet) 40 CFR 58 Appendix A (Section 3.5)
261 Report Tracking List Those responsible for generating and reviewing reports What is in each report (brief) How often each type of report is issued Who receives each type of report
Element 23: Data Review Methods Melinda Ronca-Battista
264 Data Validation Can the data be used for the purpose it is intended? Is the data invalid or can it be used with qualifications? Is the data generation process likely to produce invalid data in the future?
265 Data Validation Templates Developed by OAQPS, EPA Regions and monitoring organizations Three tables generated
266 Data Validation Tables Critical – In CFR with acceptance requirements
267 Data Validation Tables Operational – In CFR without acceptance criteria or identified in guidance
268 Data Validation Tables Systematic – important for correct interpretation of data but do not usually impact validity of sample or group of samples