Examples of Key Individuals
- Required Tribal employees:
  - Air Quality Program Manager
  - Quality Assurance Coordinator
- Optional positions (may be contractors):
  - Environmental Specialist
  - Environmental Technician

Program Manager: Roles & Responsibilities
- Oversees the monitoring project
- Prepares or reviews quarterly and annual reports for submittal to EPA
- Ensures staff are hired and trained

Program Manager: Roles & Responsibilities (cont.)
- Prepares and maintains the project work plan and budget
- Communicates with the Environmental Director and EPA Project Officer
- Responsible for approval and modification of the project QAPP

QA Coordinator: Roles & Responsibilities
- Prepares or coordinates preparation of QAPPs
- Reviews and approves corrective actions
- Conducts system audits

QA Coordinator: Roles & Responsibilities (cont.)
- Oversees or conducts method performance audits
- Prepares QA reports
- Conducts or oversees data verification, validation, and assessment

Environmental Specialist: Roles & Responsibilities
- Conducts sample transport, handling, and exchange
- Delivers samples to the laboratory (by mail or drop-off)
- Signs off on chain-of-custody forms

Environmental Specialist: Roles & Responsibilities (cont.)
- Conducts quarterly calibrations and quarterly audits
- Records sample information on data forms
- Reports all aspects of the monitoring project to the Program Manager
Project Organizational Chart (Example)
- John W. Smith, Director, Navajo Tribe
- Alexandria Washington, Air Quality Program Manager
- Sue Jones, QA Officer
- Samuel Vaughn, Air Quality Specialist
- Tom Lamb, Air Quality Technician
- Michelle Winston, Laboratory
- EPA, as appropriate
Element 5: Project Background
- History
- Context
- Assume an "ignorant" reader (e.g., a member of the public)

Why is this work important?
- Are there health effects in your community that may result from this problem (asthma, bronchitis)?
- Reduced visibility?

More reasons why this work is important:
- Concern about possible regional transport of pollutants (ozone precursors)?
- Increased development: more roads, businesses, residents?
- Concern about children's exposure?

Summarize Existing Information
- Previous results from earlier studies?
- Results from nearby areas? (If you did not gather the data, you may not be able to combine it with your data, but it can be useful in planning.)
- Results from an emissions inventory?
- Results from compliance monitoring of nearby facilities?

Who are the Decision-makers?
- Tribal Council
- Tribal Environmental Office director
- EPA Region
Element 6: Project Description
Ondrea Barber, Salt River Pima-Maricopa Indian Community

Summarize Purpose of Project
- Why are we making these measurements?

Standards
- What standard will the measurement results be compared against (if applicable)?

Field Work—Summary
- What kinds of measurements are being made?
- What kind of samplers are being used?
- How many measurements, over what time period?
- Site locations

Field Activities—Summary
- Routine field sampling
- Sample collection
- Monthly calibrations/audits
- Instrument maintenance

Laboratory Work
- How are the samples being analyzed? (State the method; you can refer to a standard method.)
- Who is doing the analysis?

Schedule
1. Hiring deadlines
2. Training
3. Dry runs with equipment
4. Field measurements
5. Analysis
6. Reporting

Example schedule table:
  Activity        | Date
  Test sampler    | October 1, 2005
  First sampling  | December 1, 2005

Assessments
- How will you check on your work and the lab's work to ensure the data are good (summarize)?
- Who is involved and what are their roles (summarize)?

Records of Assessments
- Internal assessments (readiness review)
- External assessments:
  a. PEP audits for PM2.5, NPAP for others
  b. Technical Systems Audits

YOUR Assessments of the Analysis Lab
- Initial review of their QAPP and calibration certificates when you agree to the contract
- Onsite visit during the contract
- Ongoing review of their QC results

Records
- BRIEF description of the project's records and where they are stored
- Ensure that detailed information is in Element 19
- What reports are required?
Element 7: Project Quality Objectives
Mathew Plate, US EPA Region 9

Project Objectives
- Why are we making these measurements?
  - Conformance with the NAAQS
  - Obtain baseline data
  - Determine the need for additional monitoring
  - Health risk evaluation

Project Objectives
- What will we do with the results?
  - Compare with the NAAQS
  - Report to the community, EPA, health officials

Systematic Planning
- Required by grant regs: 48 CFR 46
- Performance criteria
- QAPP or equivalent
- Data assessment
- Corrective action
- QA training for management and staff

Why Systematic Planning?
- Quality is the extent to which our data are sufficient for the purposes for which they are used
- We need a process that defines objectives for our monitoring data and ensures that we know when these objectives are met
- Program objectives should be developed in consultation with decision makers

Decision Maker(s)
- Those who use data for decisions or conclusions, such as:
  - Is a standard violated?
  - Should we take action to improve the air quality?
  - What is the air quality now, so that we will know if it gets worse or better?
  - Should we be taking more measurements?

Who are the Decision Makers for Air Monitoring Data?
- Required decision makers for EPA grants:
  - EPA
  - The Tribe's Environmental Program
- Other decision makers the Tribe may consider in quality planning:
  - State and local organizations
  - Researchers

Types of Objectives
- Project and program objectives
- Data Quality Objectives (DQOs)
  - Based on program objectives
  - Qualitative and quantitative
- Measurement Quality Objectives (MQOs)
  - Specific criteria which, when met, should produce data of acceptable quality
  - Quantitative

Information in DQOs
- How data will be used
- Type of data needed
- How data should be collected

DQOs Work Backwards...
- From: the degree of uncertainty you can tolerate in the answer
- To: the acceptable degree of uncertainty in each measurement, and the number of measurements to take

DQO Functions
- DQOs link answers to actual measurements
- They set limits on measurement uncertainty so that the data produce the required uncertainty in the answer

Example
"We know that we meet the standard with 80% confidence—this means there is a 20% chance that we could be wrong and we are higher than the standard."

Balancing Cost vs. Degree of Uncertainty
- Balances the cost of taking many samples against the desired uncertainty in the answer
- Taking many samples with expensive devices yields low decision errors
- Taking few samples at low cost yields high decision errors
- Result—you may have to change objectives, e.g., use MiniVols to see whether you need to monitor, then ask for more $

Accuracy
- DQOs are concerned with determining the accuracy of measurements

Accuracy
- Accuracy = total error
- Includes both bias and precision
- Measured by a true audit and/or by evaluating measurement quality objectives

Translating DQOs into Useable Criteria
- DQOs should be defined in terms of data quality indicators
- The criteria set for the data quality indicators are the measurement quality objectives (MQOs)
- MQOs are set using empirical data, conservative assumptions, statistical assumptions, and/or common sense

Data Quality Indicators
- Sometimes called the PARCCS:
  - Precision (P)
  - Bias (A) (bias is sometimes called accuracy)
  - Representativeness (R)
  - Completeness (C)
  - Comparability (C)
  - Detectability (S) (also called sensitivity)

Precision
- ...how well different measurements of the same thing, under prescribed similar conditions, agree with each other
- The "random" component of error—sometimes high, sometimes low

Precision
- Precision = "wiggle" (variability within many measurements of the same thing)
- You are trying to estimate the variability within the population of "all" your measurements of the same thing (concentration)
- Two ways to estimate precision for a single instrument:
  - If you have enough equipment, run two or more devices side-by-side, measuring the same concentration
  - If you have only one continuous instrument, estimate precision by how much the measurement fluctuates over time while it is measuring the same concentration

Precision for Manual Methods
- Relative percent difference (RPD):
  RPD = |difference between two monitors| / (their average)
- Example: RPD = 0.05, or 5%
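The RPD calculation above can be sketched in a few lines of Python (the function name `rpd` and the example concentrations are ours, for illustration only):

```python
def rpd(x, y):
    """Relative percent difference between two collocated measurements.

    RPD = |x - y| / mean(x, y) * 100
    """
    return abs(x - y) / ((x + y) / 2.0) * 100.0

# Two collocated PM2.5 results (hypothetical values, ug/m3):
print(round(rpd(10.25, 9.75), 1))  # 5.0 -> 5% RPD
```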
Representativeness
- ...a measure of how well your measurements represent the entire population of what you are trying to measure

Detectability
- ...how low can this be reliably measured with this equipment?
- Is this low enough to measure trends and evaluate regulatory compliance?
- This value is in instrumentation manuals or the laboratory QA plan

Completeness
- ...the amount of valid data obtained from a measurement system compared to the amount expected to be obtained under correct, normal conditions

Completeness (cont.)
- Completeness = (# results that are usable (valid)) / (# measurements necessary)
- Example: 80% completeness for the quarter means, e.g., 8 valid results out of 10 necessary
- You can also do make-up runs to cover missed days
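The completeness ratio above is simple enough to check by hand, but a small helper keeps the arithmetic consistent (function name and example counts are ours; the 1-in-6 schedule figure is illustrative):

```python
def completeness(valid, expected):
    """Percent of expected measurements that produced valid results."""
    return 100.0 * valid / expected

# e.g. roughly 15 scheduled runs in a quarter on a 1-in-6 day schedule,
# of which 12 produced valid results:
print(completeness(12, 15))  # 80.0
```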
Comparability
- ...a measure of the confidence with which one data set can be compared to another
- Beware of apples and oranges in disguise
- EPA tries to control this through the method designation process

Manual Methods
- Our precision, measured by side-by-side monitors, is 10% or less for all checks
- Our bias, measured by independent audits (PEP for PM2.5, NPEP for other criteria pollutants and PM10 flow), is 10% or less for all checks
- NPEP is a mail-in or van program required for all SLAMS and PSD monitors

Automatic (Continuous) Methods
- Our precision, measured by bi-weekly one-point precision checks (flow rate for PM, span gas for other criteria pollutants), is 15% or less for all checks
- Our accuracy, measured by independent audits, is 15% or less for all audits

Measurement Quality Objectives (MQOs)
- PM2.5 example considering:
  - Bias error
  - Precision error
  - Completeness

EPA's DQO Standards for the PM2.5 NAAQS
- The question: Does this air meet the annual standard of 15 µg/m³?

Decision Error Limits
- EPA decision: it is acceptable to make the correct decision on attainment 95% of the time
- 5% of the time, the decision may be wrong

EPA Measurement Assumptions
- EPA based its calculations on 137 samples in a 3-year period, which is 75% completeness on a 1-in-6 day schedule

MQO Calculations
- Using the acceptable uncertainty and the other assumptions, EPA calculated that each measurement must have a bias and a precision error (the MQOs) of 10% or less
- (See Element 7 in the EPA PM2.5 Model QAPP)

MQOs
- If these MQOs are met, then the conclusion about attainment will be correct 95% of the time
- (Maybe—this assumption needs to be verified when the data are assessed and the DQOs are reviewed)
Element 8: Special Training Requirements
Melinda Ronca-Battista, ITEP/TAMS

Specialized Training
- Identify and describe, for each person, all specialized training/certifications needed
- How will training be provided?
- How will training be documented (memos to the file for on-the-job training, personnel files with certificates of training, annual appraisals)?

What Training Is Necessary?
- Technical (equipment, QA/QC, etc.)
  - From the instrument manufacturer
  - From TAMS
- Allot the time allowed/expected for reading manuals and doing on-line courses—40 hrs?
- Document that you have spent 16 (or more) hours reading the QAPP and understand its requirements, using a sign-off sheet; this counts as necessary training for anyone working on the project

What Training Is Necessary? (cont.)
- Computer use:
  - Word processing
  - Excel (see the excellent on-line courses)
  - Access
  - GIS/GPS?
- On-the-job training/staff orientation
- Safety courses? (24-hour field safety, first aid/CPR, etc.)
- TAMS AQS training

Examples of Training Options
- EPA (regional, headquarters, OAQPS, etc.)
- ITEP: AIAQTP and the TAMS Center
- California Air Resources Board (CARB)
- Local or state agencies with whom you have a good relationship:
  - Borrow their manuals
  - Accompany them in the field
  - Document this in your personal logbook (# hrs spent, activity, instruments, your initials and date; make a photocopy for your personnel file)

Examples of Training Options (cont.)
- US EPA's Air Pollution Training Institute (APTI); you can view tapes
- Tribal/state/local agencies
- University/college courses (college credit, CEUs, etc.)
- Air & Waste Management Association (www.awma.org)
Element 9: Documentation and Records
Melinda Ronca-Battista, ITEP/TAMS

This section is needed to:
- Protect the legal and financial rights of the agency and persons affected by the agency's activities
- Ensure that data are legally defensible (e.g., all changes are documented [who/when/why WRITTEN DOWN])

This Section: Element 9
- Is short—complete data management is described in Element 19 (can be very brief if you reference SOPs!)
- Think of data management as: What (requirements), Why, Who, How, Where, When
- The "what" (requirements) goes here in Element 9; the rest goes in Element 19

"What"
- This element lists the requirements for the records—what do you require of these records?
  - Records of planning
  - Records of operation
  - Records of data management

Records created during project planning:
- Hiring and training records
- Initial contracts with instrument vendors, contractors
- Budget records
- Plans for instrument specs, network, site locations, site visits
- LIST THESE RECORDS, WHO WILL WRITE/REVISE/APPROVE THEM, HOW OFTEN, AND WHERE YOU WILL STORE THEM

Records created during project operations:
- Site visits
- Data transfer
- Audits and assessments
- QC checks (internal and external)
- Calibrations
- LIST THESE RECORDS, WHO WILL WRITE/REVISE/APPROVE THEM, HOW OFTEN, AND WHERE YOU WILL STORE THEM

Records created during data management:
- Download and transfer
- File naming, moving, password-protecting, archiving
- Review, range checks, flagging, calculations
- Reporting to the tribe, EPA
- LIST THESE RECORDS, WHO WILL WRITE/REVISE/APPROVE THEM, HOW OFTEN, AND WHERE YOU WILL STORE THEM

A little detail, please...
- In the text or a table, list for each item:
  - Any rules—must be written in pen, dated, and initialed
  - Changes must be made so that the original data are kept legible
  - Computer files are password-protected against changes
  - Duplicate copies made and stored where?
  - How long are they kept, and who decides to throw them away?
  - File naming conventions?

Requirements for data transfer:
- Specify rules for:
  - Never over-writing files
  - Frequency of downloading files from the instrument and backing up files
  - Keeping files on paper (may be a summary if very long), on disk, and on a PC (designate the PC)
  - File structure (see example)

File naming convention example:
- All your files follow the same format (example): pppp_ssss_dddddd_nnn_rev#
- Post this on a wall, and change it to add fields as needed
- Where:
  - pppp = pollutant
  - ssss = site name or number
  - dddddd = date when the measurements were made
  - nnn = initials of the person saving the file
  - rev# = the revision number
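A convention like the one above is easy to enforce in code. The sketch below is ours, not from the slides: it assumes a yymmdd date format for the dddddd field, and the pollutant, site, and initials values are hypothetical:

```python
from datetime import date


def data_filename(pollutant, site, when, initials, rev):
    """Build a file name following pppp_ssss_dddddd_nnn_rev# (slide convention).

    Assumes dddddd is the measurement date formatted as yymmdd.
    """
    return f"{pollutant}_{site}_{when:%y%m%d}_{initials}_rev{rev}"


# Hypothetical example: PM2.5 data from site "mesa", saved by operator "mrb"
print(data_filename("pm25", "mesa", date(2005, 12, 1), "mrb", 0))
# pm25_mesa_051201_mrb_rev0
```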
List all documents:
- SOPs, QAPPs (this one and that of any lab analyzing your samples)
- Site logbooks, personal logbooks
- Repair and maintenance records
- Reports: drafted, final, and sent out to tribal authorities, EPA
- Photos of sites
- Letters from the community, EPA, etc. are kept and filed

Data handling procedures:
- Example: state that this project will use an Access database for all transcribed or input records and data
- State that the requirements for data handling are in Element 19; this element only lists the records and the requirements for the records

List your plans for reports received and sent out:
- Quarterly and annual reports submitted to the tribe, EPA
- Quarterly laboratory data report
- Audit reports
- Quarterly AIRS-AQS data submittal to EPA
- List these in a table

Have all references available in your office:
- QA documents
- SOPs
- Other documents
Element 10: Network Design
Ondrea Barber, Salt River Pima-Maricopa Indian Community

Rationale for Measurement Location and Frequency
- Refer to the data quality objectives (Element 7)
- Use the objectives when you decide where and how often to monitor:
  - Near where people live?
  - Overall community background?
  - Near sources?
- Discuss the purpose of primary and collocated samplers

Design Assumptions
- Document what you are assuming:
  - That a source will impact air quality in the monitored area?
  - Weather patterns, road use, community development in the future, whatever
- Are your locations reasonable in terms of the CFRs and other guidance documents (see the ITEP CD)?

Data Generation/Collection Design
- What type of equipment will be used to generate/collect data?
- How many will be used? Primary and collocated samplers, meteorological equipment
- Frequency of data collection (24 hours)? Saturdays and Sundays? How many calendar years?
- Is the monitoring equipment FRM or FEM? If so, what is the designation number?

Sampling Network Design
- Do you have special purpose samplers (SPMs) measuring baseline conditions?
- Do you have a SLAMS-designated site that will provide EPA with national data?
- Discuss siting requirements (40 CFR Part 58, App. D & E; these may be impossible to meet, so do your best and confer with your EPA regional office)
- Monitor location: roof? platform? other?
- Spatial scale of representativeness (40 CFR Part 58, App. D)

Monitoring Objectives could include:
1. To determine the highest concentrations expected to occur in the area covered by the network
2. To determine representative concentrations in areas of high population density
3. To determine the impact on ambient pollution levels of significant sources or source categories

More possible monitoring objectives:
4. To determine general background concentration levels
5. To determine the extent of regional pollutant transport among populated areas, and in support of secondary standards
6. To determine culture-related [EPA calls them welfare-related] impacts in more rural and remote areas (i.e., visibility impairment, effects on vegetation)
TABLE 1.—Relationship Among Monitoring Objectives and Scale of Representativeness

  Monitoring Objective     | Appropriate Siting Scales
  Population               | Micro, middle, neighborhood
  Source impact            | Micro, middle, neighborhood (sometimes urban¹)
  General background       | Neighborhood, urban
  Regional transport       | Neighborhood, urban, regional
  Welfare-related impacts  | Urban/regional

  ¹ Urban denotes a geographic scale applicable to both cities and rural areas.
Critical / Non-Critical Measurements
- Critical measurements are required to achieve the project objectives or limits on decision errors
- Field measurement requirements (i.e., ambient temperature, barometric pressure, etc.)—what you would include when submitting data to AQS
- Non-critical measurements are those that are "nice to know" but not make-or-break

"Standard" Measurements
- Federal Reference Methods (FRMs) and Federal Equivalent Methods (FEMs) provide standard measurements as required by EPA for comparison to the NAAQS
- Use of a non-standard measurement method can still provide useful information
- This also pertains to filter-weighing laboratories—EPA requirements must be met
Sampling Method Requirements
- Requirements, not procedures, go in the QAPP
- Attach your SOPs!

Sampling Equipment, Preservation, & Holding Time Requirements
- Requirements for getting samples (data) to the lab without losing what you are measuring or making it stink...

Requirements for the Sample
- Volume of air (or whatever medium you are collecting)
- Object with which you are collecting the medium
- Data for automatic methods (complete, copied onto floppy, file named correctly, never overwritten)

Requirements for PM2.5 Samplers
- Installed and operated in adherence to the requirements in:
  - 40 CFR Parts 50, 53, and 58
  - Section 2.12 of the EPA QA Handbook
  - the sampler manufacturer's operation manual
  - SOPs
  - and this QAPP

Sample (Data) Collection Overview
- Schedule for receiving filters (data)
- Schedule of sampling
- Sample route
- Describe procedures (may be in an SOP) for when retrieving samples (data) does not go as planned

Prevention of Contamination
- Requirements for:
  - Temperature
  - Humidity
  - Time
  - Integrity
  - Custody
  - Data handling

Sample Preservation Requirements for PM2.5
- Filter cassettes stored in filter cassette storage containers
- Stored with the particulate side up
- Temperature (40 CFR Part 50, Appendix L)
- No direct sunlight or closed-up car during summer
- Time before sample recovery and time before weighing

Sample (or Data) Recovery Requirements
- When: PM2.5 recovery must occur within 96 hours of the end of the sample period for that filter
- Where (sites)
- How (order)
- Who
- Documentation

Support Facilities for Sampling Methods
- Office, trailer, truck, and cooler must be consistent with requirements for:
  - Temperature
  - Humidity
  - Integrity
  - Custody
  - Storage capacity

Field Safety
- State that safety comes before getting the sample (data)
- Reference the health and safety plan
- Provide training if appropriate

Field Corrective Action
- Who is responsible for fixing it? For verifying that it is fixed? For reporting the fix, and to whom?
- Where do they write down how they fixed it?
- When do they have to fix it by?
- How do you make sure it does not happen again?
- Where is the documentation stored?

Corrective Action Responsibilities (cont.)
- Who makes sure the problem was solved
- Who makes sure it doesn't happen again
- Who approves changes to sampling locations, personnel, SOPs, the QAPP
This element is important for comparability:
- The requirements for all FRM monitors are the same, so that data from different sites can be compared
- The performance of all labs meeting these requirements is the same, so that data from different labs can be compared

Summarize the method:
- In one paragraph, summarize how the lab or instrument conducts the measurement
- If FRM or FEM, cite the method number
- List special components, modifications, inlets
- List requirements for the equipment you are using (scales, thermometers)

List in this element:
- Requirements for how well the lab or instrument performs
- Analytical methods and equipment

Instrument or lab environment:
- Limits for temperature, humidity
- If the instrument requires a shelter, describe the requirements here
- If the lab or instrument changes its procedures or performance, who approves it? What are the requirements it must meet?
Element 14: Quality Control
Melinda Ronca-Battista, ITEP/TAMS

QC: An Ongoing System
- Measuring
- Comparing with the MQO
- Graphing it
- Fixing it when needed
- Everything must be documented and, when significant, reported

Evaluate Where Things Can Go Wrong—and How To Check
- Preparing for the field
- Sampling in the field
- Analyzing the samples
- Entering the data
- Reporting the data

Error
- ...the difference between your answer and the "truth"

Bias Error
- Minimized by calibrating your equipment against a standard
- Make sure the standard has TRACEABILITY to a NIST standard
- Compare the standard to field and/or lab equipment
- This measures any bias in your equipment

Verifying the Accuracy of Your Transfer Standard
- If necessary, send your standard to the manufacturer, who has NIST-traceable equipment; they send back your standard with a certificate of traceability
- If appropriate, use natural standards, such as freezing ice water for temperature, and carefully document your calibration according to a standard method

Single (One-Point) Verifications
- A single-point verification is a check using one flow rate, concentration, or value
- A span check is an example of a one-point check
- Verifications DO NOT involve any changes to your equipment or settings

Multi-Point Verifications
- A check of the instrument's response to more than one flow rate, concentration, or value
- Usually zero, a low level, and the upper end of the range expected to be measured

Example of Verification
- Perform a check of your equipment:
  - Flow rate for PM
  - Concentration for gas
  - Internal mass for a TEOM
- This may be single-point or multi-point
- If it is within specifications, record this and continue

Calibrations
- If results are NOT within specs, then the instrument must be adjusted
- This adjustment changes the response of the instrument, which is defined as a CALIBRATION
- Must be multi-point
- After calibrating, check again at a single point

Example for PM2.5
- Corrective action if the flow rate difference between the transfer standard and the sampler is > 4%:
  - Check the sampler for internal and external leaks
  - Ensure that temperature and pressure are within normal ranges
  - Run the check a second time
  - If still > 4%, perform a multi-point calibration followed by a single-point verification
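The >4% corrective-action trigger above can be sketched as a small helper that flags when the sampler disagrees with the transfer standard. The function name and example flows are ours; 16.67 L/min is the nominal PM2.5 design flow:

```python
def flow_check(standard_lpm, sampler_lpm, limit_pct=4.0):
    """Compare sampler flow to the transfer standard.

    Returns the percent difference and whether it exceeds the
    corrective-action limit (4% in the slide's PM2.5 example).
    """
    pct_diff = (sampler_lpm - standard_lpm) / standard_lpm * 100.0
    return pct_diff, abs(pct_diff) > limit_pct

# Hypothetical check against a 16.67 L/min transfer standard:
diff, needs_action = flow_check(16.67, 17.5)
print(round(diff, 1), needs_action)  # 5.0 True -> leak-check, re-run, then calibrate if still off
```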
Who can conduct verifications?
- Routine checks of instrument stability can be conducted by the site operator
- Periodic assessments (maybe every six months, maybe every year; the flow rate check for PM2.5 should be quarterly) should be conducted by someone OTHER than the site operator
- The QAPP specifies how often these checks are conducted, by whom, and what to do if results are off

Calculations for Results of Verifications
- Consider the standard to be the "ideal truth" for your equipment
- The difference between the "ideal truth" and your equipment's result is the instrument error (this may include bias [error either usually high or usually low] and imprecision [wiggle, sometimes high and sometimes low])
- How is instrument error quantified?

Quantifying instrument error:
- Truth = the standard's result
- Y = your equipment's result
- Your equipment's error = (Truth − Y) / Truth
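In code, the fraction above is one line (the function name and example readings are ours):

```python
def instrument_error(truth, y):
    """Fractional error of your instrument against the standard's 'ideal truth'.

    Positive means your instrument reads low; negative means it reads high.
    """
    return (truth - y) / truth

# Standard says 100.0, our instrument says 95.0 (hypothetical values):
print(instrument_error(100.0, 95.0))  # 0.05 -> reading 5% low
```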
Precision Error
- Some imprecision is unavoidable
- Sometimes up, sometimes down—"random"
- Estimated by measuring the same thing several times
- Minimized by carefully following procedures

Two Sources of Precision Error
- Field
- Lab
- Cows can be problems... (the cow is going to knock over the met tower; the milk was spilled in the lab)

Estimating Precision in the Field
- Estimate the random "wiggle" error
- If you have more than one of the same type of instrument, place them side-by-side (measuring the same sample of air within a room or small area)
- If you have only one instrument, make repeated measurements (same sample of air, quickly in time so the air does not "change")
- Verify that results are within limits

Estimating Precision in the Lab
- Repeated weighings OF THE SAME FILTER or a standard weight
- Verify that results are within limits—if results are very different, then there is a lot of imprecision; there may be an intermittent power draw, a breeze blowing onto the scale, changes in humidity...

Next Steps of QC
- If you are outside the limits:
  - Review procedures and logs to identify the problem
  - Go back and review data—you may have to throw out data back to the last good check
  - Fix, document, and report (when significant)

Calculating Precision Error
- Because we don't know which device is better, there is no "truth," so use the average as "truth"
- X = one sampler; Y = the other sampler
- Then the precision error is RPD = |X − Y| / [(X + Y)/2] (multiplied by 100 to yield a value in percent)

Coefficient of Variation (COV)
- COV = coefficient of variation
- COV = s / (average), where "s" is the sample standard deviation
- See the Tribal Data Analysis spreadsheets for example calculations that you can use
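The COV formula above, using the standard library's sample standard deviation (the function name and example weighings are ours):

```python
import statistics


def cov_percent(values):
    """Coefficient of variation: sample standard deviation over the mean, in %."""
    return statistics.stdev(values) / statistics.mean(values) * 100.0


# Hypothetical repeated weighings of the same filter (mg):
print(round(cov_percent([10.0, 10.1, 9.9, 10.0]), 1))  # 0.8
```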
Remember...
- 0.05 = 5%
- 0.10 = 10%
- Precision calculation exercises

Accuracy = Total Error
- "Accuracy" for EPA means "total error"
- A comparison conducted with a completely different system can be used to estimate total error
- For PM2.5, this is a performance evaluation conducted with a side-by-side FRM device, with the filter weighed by a different lab
- Total error includes both precision and bias errors

Blanks
- Measure anything that affects the result other than what you are trying to measure
- May make the result greater (contamination)
- Or decrease the result

Types of Blanks
- For real-time measurements, zero checks display the value with no air
- Manual methods using filters must use field blanks, which accompany the "real" samples
- Labs must measure their own blanks to assess whether there is any contamination in the lab
- If samples could be damaged or contaminated during shipping, use shipping blanks (trip blanks)

Field Blanks
- Handled exactly as field samples
- Some field blanks go everywhere field samples go
- With each operator, site, procedure

Control Charts
- Red lines are drawn within 2 s of the average (s = sample standard deviation)
- You are trying to keep something steady, but it naturally varies
- 95% of the time the value is within 2 s of the average
- 99.8% of the time it is within 3 s of the average
- Drift may not yet exceed the criteria but can still show that you should recalibrate
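The "red lines" above can be computed from a history of past checks; a minimal sketch, assuming we draw warning limits at ±2 sample standard deviations (function name and example flow checks are ours):

```python
import statistics


def control_limits(history, k=2):
    """Warning limits at +/- k sample standard deviations around the mean."""
    m = statistics.mean(history)
    s = statistics.stdev(history)
    return m - k * s, m + k * s


# Hypothetical past one-point flow checks (L/min):
checks = [16.6, 16.7, 16.7, 16.8, 16.6, 16.7]
lo, hi = control_limits(checks)
new_check = 16.9
print(lo <= new_check <= hi)  # outside the red lines -> consider recalibrating
```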
Element 15: Equipment Testing, Inspection, and Maintenance Requirements (General Principles)
Ondrea Barber, Salt River Pima-Maricopa Indian Community

Purpose of T.I.&M.
- Increase system reliability
- Data completeness
- Minimize down time
- Document the credibility of data

Ensuring your data meet their quality objectives:
- Types of instruments
- Emphasize those instruments that impact data quality
- Acceptance testing of equipment
- Outside person to review final tests
- Provide a table—referencing SOPs is okay!

Checks
- Check equipment before you go into the field
- Check in the field before you make measurements
- Check after you make measurements
- Ongoing checks and maintenance
- Checklists! (include them in the QAPP, even if you know they will be revised)

Personnel
- Delineate responsibilities:
  - Person(s) to do T.I.&M.
  - Person(s) to order equipment and supplies
  - Person to report to for replacement parts or potential problems
  - Person(s) to report problems to and to contact for corrective actions

Procedures
- Describe how T.I.&M. will be done
- Schedule of T.I.&M.
- Documentation in QAPP appendices (maintenance checklists kept in the sampler case and logbook; standard forms with boxes to list values for parameters or to check that tubing and wiring are in satisfactory condition)
- Location/storage of completed checklists
- Maintenance history—inventory of replacement parts, suppliers, spare parts, other consumables

Visual Inspections
- Inspect for:
  - Damage to the monitor
  - Condition of the filter and surroundings (i.e., cleanliness)
  - Consistent power supply, so that start/stop times are reliable
  - O-rings in place and not torn
  - Wires and tubes all attached

Inspections
- Specify what is done in the field and what in the office
- Be practical
- Allow adequate time to do inspections and document their results
- Take spare parts such as o-rings with you—add this to the checklist

Inspection Considerations
- Frequency
- Inspection parameter—what do you look for?
- Action if an item fails inspection—how do you fix it?
- Documentation (logbooks for each piece of equipment and an inventory of spare parts, oil, etc.; instrument or site visit checklists—include them in the QAPP or reference the owner's manual)
179Element 16: Instrument Calibration Ondrea BarberSalt River Pima-Maricopa Indian Community
180Calibration Calibration is defined as Comparison of instrument response to a standard andAdjusting response to fall within planned limits (remember that if you just check the response and it is okay then that is a verification)
181List Equipment that Requires Calibration Identify all tools, gauges, instruments, and anything that produces valuesMake a table that lists the frequency of calibration and how it is to be conducted and who is responsible for ensuring that it gets done
182List Equipment that Requires Calibration (cont.) Limit to equipment you will be responsible forIf you use transfer standards (temperature, flow rate, BGI delta-cal) then these must be periodically recalibrated, so that you know they is producing valid results and this is documented
183Reference Attached SOPs and... Describe briefly or reference SOP:How calibration is doneWhen to calibrateSummarize calculationsSummarize calibration records (logbooks, forms, reports)
184Calibration Standards Primary standards—keep as the “gold standard”Field, transfer, or working standards are used in the fieldThese apply to flow rate, temp., pressure, etc.
185Types of Calibration Multiple point Zero-level Repetitions at each concentrationAlways verify stable operation after a calibration by checking at least one point again
186Changes to Calibration Schedule You may have to recalibrate if you:Move, repair, or reassemble equipmentIf QC checks show degradationIf QC checks show great stability, then may not have to recalibrate so soonChange in weatherChange in pollutant concentration
187Justification for Changes Documentation—write a memo to the files (see example on CD)External reviewer—get a reality check from another personPeriodic verifications of your calibration schedule and procedures
188Do Calibrations Yourself? Requires careful documentationUse standards calibrated by vendor or another certified labUse these standards to calibrate your instruments
189If you need a Laboratory Get copy of laboratory’s QAPP, include as appendix to your QAPPConduct tour of facility if possibleCommunicate regularly with facility personnelMake sure lab documents everything, provides reports
190Documents to you from the analysis lab: Copy of their QAPP
Copies of their internal performance evaluations within the last year, and throughout the project
Agreement that they will provide you data on paper and electronically, in Excel or whatever format you agree to
191Documents to you from the lab that certifies your calibration standards: Copy of the calibration certificate showing what standard they use to calibrate your equipment
Traceability of this standard to the National Institute of Standards and Technology (NIST)
Agreement that they will provide you a calibration certificate and detailed report on paper and electronically, in Excel or whatever format you agree to
Outline of the calibration procedure
194Element 17: Inspection and Acceptance for Supplies Ondrea Barber, Salt River Pima-Maricopa Indian Community
195Requirements What are the requirements for the equipment and supplies you will use during the project?
196Make a List All supplies & consumables that may directly or indirectly affect the quality of the project:
Filters
Hoses
Oil
Batteries
Disks
197Provide a Table Listing Description
Vendor
Specifications
Model number
Call the sales rep and ask them to fax you a list of what parts will be needed
198Two tables may be needed: (1) a listing of critical supplies and parts (hoses, filters, disks, etc.)
(2) a list of their acceptance criteria (diameter/type of hose, type of filter, disks preformatted, whatever is appropriate)
199(1) List of Critical Supplies: Example First Table for PM2.5: Critical Supplies and Consumables (excerpt)
200(2) Acceptance Criteria List criteria for each item
Should be consistent with the project’s overall technical and quality criteria
Should reflect common sense, but document the basic requirements even if they seem obvious to you
201(2) Acceptance-Criteria Table Second Table: Example Acceptance Criteria for Supplies and Consumables
203Element 18: Data Acquisition Requirements for Non-Direct Measurements Melinda Ronca-Battista, ITEP/TAMS
204What is non-direct data? Any data that you do not gather, such as:
Weather information
Housing information
Physical constants
Instrument parameters
205Examples Chemical and physical properties data
Operations manuals
Geographic data – site, boundary conditions, met sites
Data from previous studies
Meteorological information – U.S. Weather Service data, wind rose information
Census data, housing office data
206Be sure to include: Any non-direct data you plan to use
How you plan to use it
If you are using historical monitoring data, how you will keep track of which data you gathered and which data you got from another source
207Why is the QA Significant? If you plan to use the data with yours, data that you do not gather should:
Have undergone QA review
Meet quality objectives established for your program
This means that you need to get a copy of their QAPP or QA report and compare it with this QAPP
Discussion examples: TRI modeling data carry uncertainty; approximate times from watches may be okay, but exact times can make a difference; a Walmart thermometer compared to a NIST-traceable one—so how did you measure temperature?
208Step 1 – Describe your Requirements: How you are going to use this data
Whether this could change conclusions
What your quality objectives are for this data
If the data is critical, describe some or all QC parameters
209Step 2 – Describe the Data Sources: Nationally recognized source (USGS, NWS, NIST)
Peer-reviewed source (published in a peer-reviewed journal)
Monitoring data from another study
210Step 3 – Determine if Data Validation will be Performed: If the data is from a nationally recognized source and is being used for a similar purpose, validation is probably not needed
If the data has not undergone external review, and/or will be used for a different purpose than intended when collected, some level of validation is required
211Step 4 – Describe Data Validation for data you plan to use but did not gather Review QC checks
Review QA evaluations
Compare to other data sets
Perform QC measurements
Don’t be afraid to discard data that you don’t trust
Reference validation in Element 24
213Element 19: Data Management Melinda Ronca-Battista, ITEP/TAMS; Ondrea Barber, Salt River Pima-Maricopa Indian Community
214Six Elements of Data Management (1) Data processing and transmission (WHAT)
(2) Data end use and integrity protection (WHY)
(3) Data access (WHO)
215Six Elements of Data Management (cont.) (4) Data dissemination (WHERE)
(5) Data storage and retrieval (HOW)
(6) Data disposal (WHEN)
Document everything!
216Describe in the QAPP: Brief description of:
Data management objectives (all data is retrievable)
How you will meet these objectives (no data erased, no files overwritten; use Sharpies; date and initial)
Your paper filing system (briefly)
Logbooks: personal, instrument
Electronic filing system, file naming system
217Data Flow Step-by-step data tracking
Who has access to read data
Who has access to change data
Version tracking of files
How changes are approved
How file integrity is checked
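As a sketch of one way to check file integrity (the file name and contents below are hypothetical): record a SHA-256 checksum when a data file is archived, and recompute it later; any change to the file changes the checksum.

```python
import hashlib

def file_checksum(path):
    """Return the SHA-256 checksum of a file, read in chunks so large
    data files do not need to fit in memory."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Hypothetical archived data file
with open("example_data.csv", "w") as f:
    f.write("date,pm25\n2004-01-01,12.3\n")

stored = file_checksum("example_data.csv")
# Later: a matching recomputed checksum shows the file is unchanged
assert file_checksum("example_data.csv") == stored
```

Store the checksum alongside the archive (on paper or in a log) so it cannot be altered with the file.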
218Files for each instrument and any extra sensors: Request as much information as possible, electronically and on paper, from vendors so you can make extra copies
Maintain a file for each instrument with contact information, calibration records, manuals
219Material Receipt and Storage Use logs for everything (see examples in the template QAPPs)
Use indelible markers
Use tags
Use designated shelves with labels
Use an SOP that is POSTED
220Types of Field Data Sheets Site data sheets – document the site information; kept in site files and in the database
Sampler run data sheets – go into the field; info is input into the database
Verification data sheets – every 4 weeks (1-point flow rate, temp, pressure, time)
Internal audit data sheets – for the sampler and extra sensors (your own audits)
221Before leaving the office for the field: Review the number of each item you will need, and bring backups (use a checklist)
Check the field data sheet from the previous visit to the site
PM2.5: Ensure that there are filter cassettes for routine, field blank, and collocated samples
PM2.5: Ensure there are enough field transport containers, ice substitutes, max/min thermometers, and preprinted mailing labels if mailing immediately
222Write in pen and update the documents: Continuously update the checklist in pen
Make a photocopy
Put it in a “to-do” pile to add the information to the database
Change the form in the computer if appropriate
At the site, draw a map on the field data sheet
Take photos if possible
223Sampler Placement Records Unobstructed air flow for a minimum of 1 m in all directions
Inlet at a height of 2 to 15 m above ground level
If collocated with any other PM sampler, the spacing between sampler inlets must be > 1 m for other PM2.5 samplers and > 2 m for PM10
All samplers that are called collocated must be within 4 m
Sampler inlet must be level
Vertical distance between two inlets < 1 m
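The placement criteria above lend themselves to a checklist. Here is a hypothetical helper that encodes a few of them; the function name and the simplified thresholds are illustrative only, not a substitute for the full siting requirements in 40 CFR Part 58.

```python
def siting_problems(clearance_m, inlet_height_m, collocated_spacing_m=None):
    """Check one PM2.5 site against a few of the placement criteria
    listed above; returns a list of problems (empty list = criteria met)."""
    problems = []
    if clearance_m < 1.0:
        problems.append("need >= 1 m unobstructed air flow in all directions")
    if not 2.0 <= inlet_height_m <= 15.0:
        problems.append("inlet must be 2 to 15 m above ground level")
    if collocated_spacing_m is not None:
        # Collocated PM2.5 inlets: more than 1 m apart but within 4 m
        if not 1.0 < collocated_spacing_m <= 4.0:
            problems.append("collocated inlets must be > 1 m apart and within 4 m")
    return problems

# A site with 2 m clearance, a 3 m inlet, and a collocated sampler 2 m away
result = siting_problems(2.0, 3.0, collocated_spacing_m=2.0)  # no problems
```

Recording the returned list on the site data sheet documents that the criteria were actually checked.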
224Sampler Maintenance and Cleaning Records Plan all required maintenance 12 months ahead—post 3 months on a calendar
Use a checklist that includes items to bring with you and what to do for:
Every visit (filter change data sheet)
Every month (1-in-3 day schedule) or quarter (1-in-6 day schedule) (verification data sheet)
Every quarter (flow rate audit data sheet)
See examples
225Computer Backups Every two weeks or as data are gathered—add “back up today” to the calendar
Alternate between two sets of backup ZIP or JAZ disks
Store backup disks in a relatively fireproof location (another building, garage)
Plan for the time and money it takes to save copies of files
226Supplementary records need to be kept for only 3 years Chain-of-custody forms
Notebooks
Field data sheets
227Keep records organized in readiness for an audit Paper copies must be available for the auditor to attach to their reports, in the categories of:
Management and organization
Site information
Field operations
Raw data
Data reporting
QA
The best way to get rid of an auditor quickly is to give her paper copies of everything she wants
228Site information files to include: Site data sheets
Site maps
Site photos
Summary of instruments at each site and shelter, trailer information
Addresses, names, phone numbers
229Field Operations files to include: Instrument manuals, warranties, calibration certificates in a file for each instrument
Standard operating procedures (SOPs)
Field notebooks and communications
Copies of most recent field sheets
Inspection/maintenance records
230Raw Data files to include: Any original data (routine and QC data on disk and paper)
Reports from laboratory or external audits
Strip charts
Disks
Supporting data, such as NWS data
231Data Reporting files to include: Internal reports
Weekly/monthly summaries
Corrective action reports
Reports to EPA
Copies of presentations to community
232QA files to include: For each instrument, copies of instrument check reports (verifications)
Calibration reports (multipoint, using some standard)
Archived control charts
QA reports
Audit reports and reports on how problems were solved (corrective action reports)
233Measurement results should be kept indefinitely Data must be accessible for 5 years
All official reports kept for 5 years
Paper copies can be discarded after 5 years if electronic copies are archived on disk/CD
234Data Transformation & Analysis Data analysis requirements are contained in 40 CFR Part 58, Appendix A
Show the equations you will use (usually simple relative percent difference)
Show examples of charts (use the example control chart in the template QAPP)
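One common form of the relative percent difference mentioned above is the absolute difference divided by the mean of the two values; check your own QAPP or SOP for the exact equation it specifies. The flow rates below are invented for illustration.

```python
def relative_percent_difference(measured, reference):
    """RPD: absolute difference as a percentage of the mean of the two
    values; used to compare a routine measurement against an audit or
    collocated value."""
    mean = (measured + reference) / 2.0
    if mean == 0:
        raise ValueError("mean of the two values is zero")
    return 100.0 * abs(measured - reference) / mean

# Hypothetical flow check: sampler reads 16.7 L/min, audit standard 16.5 L/min
rpd = relative_percent_difference(16.7, 16.5)  # about 1.2%
```

Showing a worked example like this in the QAPP makes the equation unambiguous for whoever enters the data.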
235QAPP outlines plans for data transmittal: How data gets from equipment into your computer
How (who, how often) data is input
How (who, how often, how much) someone double-checks data entry
How data is sent electronically
How you will transfer data into AQS
236Describe plans for data flagging Criteria for flagging data
Flags may be generated by the instrument
Flags may be noted by you on the site data sheet (e.g., high winds) and entered into the database later
Flags may be automatically written into a column in Excel if values are outside a specified range (see example on CD)
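The automatic range flag described above can be sketched in a few lines; the "R" flag code, the value/flag pairing, and the plausibility range are made up for illustration.

```python
def flag_out_of_range(values, low, high, flag="R"):
    """Pair each value with a flag: empty string if it falls inside
    [low, high], otherwise the range flag (mirrors an Excel column
    that marks out-of-range results)."""
    return [(v, "" if low <= v <= high else flag) for v in values]

# Hypothetical hourly PM2.5 values (ug/m3) with a plausibility range of 0-100
flagged = flag_out_of_range([15.0, 42.5, -3.2, 120.0], low=0.0, high=100.0)
```

Flagged values stay in the data set; the flag just tells the reviewer to look before the value is validated or reported.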
237Describe plans for data storage & retrieval Data archival policies: where, how long, on paper and disk
Security of data: locked cabinet, password protection
Database management
239Element 20: Assessments & Response Mathew Plate, US EPA Region 9
240Corrective Action Designed to identify and correct flaws in your system
Anyone can start and document a corrective action
Documented by a formal process (a review by a supervisor and a memo to the file; see example)
241Internal Assessments May be qualitative, such as a review of whether documentation is in order, or technical, checking procedures
May be conducted by your tribal air organization: someone who has common sense and is technically competent, or someone from another program, such as water or solid waste
242Internal Assessments, cont. May be informal and consist of a review between you and your supervisor on progress
IF this is to be considered an assessment it must be DOCUMENTED; use a checklist or memo
243External Assessments Performed by outsiders (another tribe, an EPA region, or a consultant) who are technically qualified and understand the project’s QA requirements
A real auditor
244Types of Assessments Surveillance: “over-the-shoulder” monitoring of project & records; this can be internal or external
Technical Systems Audit (TSA): on-site examination of facilities, equipment, personnel, training, procedures, and record keeping, usually conducted by EPA
Performance Evaluation (PE): numeric comparison of results between the auditor’s equipment and yours
245Performance Evaluations: Internal: one you conduct as a “test” with a borrowed standard, or during an inter-comparison with other agencies—VERY useful, esp. before an external audit
External: side-by-side on-site with another independently calibrated device, or comparison of your results with equipment mailed to you or in an EPA van (NPEP)
246Definitions of Assessments Audits of data quality (qualitative): are you working toward your objectives? Do your measurements make sense?
Data Quality Assessments (quantitative): comparison of results with someone else’s
Management Systems Reviews: review of the QA system; usually conducted by EPA
Network Reviews: are your locations/instruments appropriate?
247All Assessments: Basically compare What is actually being done in the field and the office
Against what is stated in the QAPP and SOPs
248Readiness Review—one type of Internal Audit Conducted before starting routine measurements to assess: “are we ready?”
Technical components – equipment
Training
The report should be approved by an uninvolved person
A readiness review can be counted as an internal assessment IF it is documented and any problems are resolved and documented as well
249Technical Systems Audits Every three years
Look at reports, computer files, logbooks, control charts
Follow people around
Compare what is happening with the QAPP and SOPs
250Technical Systems Audit (cont.) Field operations: sampling, shipping
QA – corrective action
QC – field checks, data flagging, record keeping
Data management: security
Reporting – accuracy
251Audit of Data Quality Evaluating your data before you report it:
How data are handled
What judgments were made
Do your conclusions make sense?
Annually and as part of a technical systems audit
252Network Review 40 CFR Part 58 Appendices D and E
How well is your network meeting its objectives?
How should it be modified?
Conducted formally annually, but you should be continually assessing your results
253Describe in your QAPP Number, frequency, and types of assessments
People or organizations doing the assessments
Schedule
Criteria for assessments
Reporting and responsibility for follow-up
254Element 20 includes a table listing assessments:
255Responsibility for Follow-Up and Verification of Corrective Action CLOSE THE LOOP (fix problems and take action to make sure they do not happen again)
And make sure it is documented
259PM2.5 Reporting Pictures are worth a thousand words: CHART YOUR DATA, by site and by date
Flow rate audits conducted quarterly by someone other than the routine site operator
Results of collocated FRM samplers
Results in ranges of concentrations (see the tribal data analysis spreadsheet)
40 CFR 58 Appendix A (Section 3.5)
263Element 23: Data Review Methods Melinda Ronca-Battista
264Data Validation Can the data be used for the purpose intended?
Is the data invalid, or can it be used with qualifications?
Is the data generation process likely to produce invalid data in the future?
265Data Validation Templates Developed by OAQPS, EPA Regions, and monitoring organizations
Three tables are generated
266Data Validation Tables Critical – In CFR with acceptance requirements
267Data Validation Tables Operational – In CFR without acceptance criteria or identified in guidance
268Data Validation Tables Systematic – important for correct interpretation of data but do not usually impact validity of sample or group of samples