1 Dec. 11, 1997 LEADS Quality Control Systems Robert Brewer (512) 239-1618 Monitoring Operations Division Network QA Manager

2 Dec. 11, 1997 Quality Control (QC) Quality Control is the overall system of technical activities that measure the attributes and performance of a process, item, or service against defined standards to verify that they meet the stated requirements established by the customer.

3 Dec. 11, 1997 LEADS QC Checks
During Calibrations:
- Monitor Voltage and Concentration Outlier Checks
- Concentration Spacing Check
- Slope/Intercept Checks (Cal. only)
- Zero/Span Checks
- Precision/Linearity Check (Cal. only)
- Converter Efficiency Checks (NOx & H2S)
During Span Checks:
- Monitor Voltage and Concentration Outlier Checks
- Concentration Spacing Check
- Zero/Span Checks
- Linearity Check (Span only)
- Converter Efficiency Checks (NOx & H2S)
Span Source Audit:
- Checks the accuracy of the DASIBI 5008 Calibrator every 45 days

4 Dec. 11, 1997 Calibration Sequences
- Sequences are composed of Levels.
- Levels: A Level consists of a set concentration from the DASIBI 5008 calibrator introduced into a monitor for a set number of 5-min average updates. Each Level is assigned a letter code (M, R, S, T, or G) by the datalogger.
- Updates: A set number of 5-min updates (usually 3) in each level are allowed for instrument stabilization. The remaining updates (usually 4) are processed by LEADS.
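As an illustration of how a Level's updates might be handled, the sketch below drops the stabilization updates and averages the rest. The function and variable names are hypothetical; the slides do not show LEADS's actual internals.

    # Minimal sketch (assumed names): split a Level's 5-min updates into
    # stabilization and processed updates, then average the processed ones.
    def process_level(updates, n_stabilization=3):
        """Drop the stabilization updates and average the remaining ones."""
        processed = updates[n_stabilization:]   # usually 4 updates remain
        average = sum(processed) / len(processed)
        return processed, average

    # Example: 7 five-minute updates for one Level (illustrative values)
    updates = [0.412, 0.405, 0.402, 0.401, 0.400, 0.399, 0.401]
    processed, avg = process_level(updates)
    print(processed, round(avg, 4))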

5 Dec. 11, 1997 O3, SO2, & CO Cal. Sequence
[Table: O3/SO2 and CO concentrations (ppm) for calibration levels M*, R, S, T*, and G* (e.g., 0.4 ppm at the M level).]
*Note - Levels used during Span Checks

6 Dec. 11, 1997 NO, NO2, & NOx Cal. Sequence
[Table: NO, NO2, and NOx concentrations (ppm) for calibration levels G*, M*, M†*, R, R†, S, S†, T*, and T†*.]
*Note - Levels used during Span Checks
†Note - Gas Phase Titration Levels

7 Dec. 11, 1997 H2S Cal. Sequence
[Table: H2S and SO2 concentrations (ppm) and scrubber status (bypassed: No/Yes/No) for calibration levels M*, R, S, T*, T1*, T2*, and G*.]
*Note - Levels used during Span Checks
T1 - H2S Converter Efficiency Check
T2 - SO2 Scrubber Efficiency Check

8 Dec. 11, 1997 Outlier Tests
[Diagram: monitor voltage or concentration plotted against the 5-min updates of a level, showing the stabilization updates, the processed updates, the average of 4 updates, and the limits; an outlier is found, so the test is repeated without the outlier.]

9 Dec. 11, 1997 Outlier Tests - Second Pass
[Diagram: the same plot with the first outlier ignored; the remaining 3 updates, averaged, are within limits. Result: Pass with Warning.]
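A rough sketch of the two-pass outlier logic described on slides 8 and 9 follows. The limit value, data, and function names are assumptions for illustration, not LEADS code.

    # Sketch of the two-pass outlier test: check each processed update against
    # a limit around the average; if an outlier is found, repeat the test
    # without the first outlier and report "pass with warning" if it now passes.
    def outlier_test(processed, limit):
        def outliers(values):
            avg = sum(values) / len(values)
            return [v for v in values if abs(v - avg) > limit]

        bad = outliers(processed)
        if not bad:
            return "pass"
        # Second pass: ignore the first outlier and re-test the rest.
        remaining = [v for v in processed if v not in bad[:1]]
        if not outliers(remaining):
            return "pass with warning"
        return "fail"

    print(outlier_test([0.401, 0.400, 0.399, 0.401], limit=0.01))  # pass
    print(outlier_test([0.430, 0.400, 0.399, 0.401], limit=0.01))  # pass with warning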

10 Dec. 11, 1997 Concentration Spacing Test
[Diagram: voltage vs. concentration for levels M, R, S, T, and G, with a warning or failure limit drawn around each level's ideal value; one point falls outside its limit.]

11 Dec. 11, 1997 Slope and Intercept Tests (Cal. Only)
[Diagram: voltage vs. concentration for levels M, R, S, T, and G with a regression line fitted through them; the slope (Δy/Δx) and intercept are compared to the ideal slope and to slope and intercept limits, and the example slope falls outside its limit.]
Note: Both Warning and Failure Limits are used in these tests.
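A minimal sketch of this kind of slope/intercept check is shown below: fit a least-squares line through the (concentration, voltage) pairs and compare slope and intercept to their ideal values. The concentrations, responses, ideal values, and limits are illustrative assumptions.

    # Sketch: fit a regression line through the level responses and compare
    # the slope and intercept to assumed ideal values and warning limits.
    def fit_line(x, y):
        n = len(x)
        mx, my = sum(x) / n, sum(y) / n
        sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
        sxx = sum((xi - mx) ** 2 for xi in x)
        slope = sxy / sxx
        intercept = my - slope * mx
        return slope, intercept

    conc    = [0.00, 0.07, 0.15, 0.30, 0.40]   # ppm (illustrative G, T, S, R, M)
    voltage = [0.02, 0.71, 1.52, 3.01, 3.99]   # monitor response (illustrative)

    slope, intercept = fit_line(conc, voltage)
    slope_error = slope - 10.0          # ideal slope assumed to be 10 V/ppm
    intercept_error = intercept - 0.0   # ideal intercept assumed to be 0 V

    print(round(slope, 3), round(intercept, 3))
    print(abs(slope_error) < 0.5, abs(intercept_error) < 0.1)  # within assumed warning limits?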

12 Dec. 11, 1997 Zero and Span Tests
[Diagram: voltage vs. concentration for levels M, R, S, T, and G; the zero and span voltages are compared to the previous calibration's zero and span voltages (the ideal values) within zero and span limits, and the example zero voltage is out of limits.]
Note: Both Warning and Failure Limits are used in these tests.

13 Dec. 11, 1997 Precision/Linearity Test (Cal. Only)
[Diagram: voltage vs. concentration for levels M, R, S, T, and G with a regression line and limits around it; one point falls out of limits.]
Note: Both Warning and Failure Limits are used in this test.

14 Dec. 11, 1997 Linearity Test (Span Check Only)
[Diagram: voltage vs. concentration for levels M, R, S, T, and G with a line drawn between the zero and span points and limits around it; one point falls out of limits.]
Note: Both Warning and Failure Limits are used in this test.
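The span-check linearity test can be read as measuring how far each intermediate level falls from the straight line drawn between the zero and span responses. The sketch below illustrates that idea with made-up values and an assumed limit.

    # Sketch: draw a line between the zero (first) and span (last) responses
    # and compute how far the intermediate levels fall from that line.
    def linearity_errors(points):
        """points: list of (concentration, voltage); first = zero, last = span."""
        (x0, y0), (x1, y1) = points[0], points[-1]
        slope = (y1 - y0) / (x1 - x0)
        return [y - (y0 + slope * (x - x0)) for x, y in points[1:-1]]

    levels = [(0.00, 0.02), (0.07, 0.70), (0.15, 1.55), (0.30, 2.99), (0.40, 4.01)]
    errors = linearity_errors(levels)
    print([round(e, 3) for e in errors])
    print(all(abs(e) < 0.10 for e in errors))   # within an assumed warning limit?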

15 Dec. 11, 1997 NO2 Converter Efficiency Test
- The NO2 Converter Efficiency is calculated for the M, R, S, and T concentration levels and compared to an ideal of 100%.
- Efficiency is calculated from the change in response of the NOx channel when a level of nitric oxide is titrated with ozone to produce NO2.
- Both Warning and Failure Limits are used in these tests.
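One common gas-phase-titration formulation consistent with this slide's description is sketched below; it is not necessarily the exact LEADS computation, and the values are illustrative.

    # Sketch: efficiency from the change in the NOx channel when NO is
    # titrated with ozone to produce NO2 (assumed formulation).
    def no2_converter_efficiency(no_before, no_after, nox_before, nox_after):
        """Percent of the NO2 produced by titration that the converter reports."""
        no2_produced = no_before - no_after    # NO consumed by the ozone titration
        nox_drop = nox_before - nox_after      # NOx response "lost" through the converter
        return 100.0 * (1.0 - nox_drop / no2_produced)

    # Example: titration converts 0.100 ppm of NO; the NOx channel drops 0.003 ppm.
    print(round(no2_converter_efficiency(0.400, 0.300, 0.405, 0.402), 1))  # ~97.0 %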

16 Dec. 11, 1997 H2S Converter and Scrubber Efficiency Tests
- H2S Converter and Scrubber Efficiency are calculated for the T level only and compared to an ideal of 100%.
- Converter Efficiency is calculated from the monitor’s response to the T level of H2S (T) as compared to the T level of SO2 with the scrubber bypassed (T1).
- Scrubber Efficiency is calculated from the monitor’s response to the T level of SO2 with the scrubber bypassed (T1) as compared to the T level of SO2 through the scrubber (T2).
- Both Warning and Failure Limits are used in these tests.
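The slide states which responses are compared but not the exact ratios, so the formulas below are assumptions that match the stated ideal of 100%; values are illustrative.

    # Sketch (assumed formulas) of the T/T1/T2 comparisons on this slide.
    def converter_efficiency(resp_t_h2s, resp_t1_so2_bypassed):
        """H2S fully converted to SO2 should match the direct SO2 response."""
        return 100.0 * resp_t_h2s / resp_t1_so2_bypassed

    def scrubber_efficiency(resp_t1_so2_bypassed, resp_t2_so2_scrubbed):
        """An ideal scrubber removes all SO2, so the scrubbed response goes to zero."""
        return 100.0 * (1.0 - resp_t2_so2_scrubbed / resp_t1_so2_bypassed)

    print(round(converter_efficiency(0.098, 0.100), 1))   # ~98.0 %
    print(round(scrubber_efficiency(0.100, 0.002), 1))    # ~98.0 %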

17 Dec. 11, 1997 Ideal Values
- Ideal values for each of the tests are listed in the Calibration and Span Check reports available from the LEADS Network Status Report Web Pages.
- For each test, the Ideal Value is subtracted from the Measured Value to obtain the test error. This error is then compared to the Warning and Failure Limits.
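A minimal sketch of the error comparison described on this slide, with hypothetical names and numbers:

    # Sketch: error = measured - ideal, compared to the warning and failure limits.
    def classify(measured, ideal, warning_limit, failure_limit):
        error = measured - ideal
        if abs(error) > failure_limit:
            return error, "failure"
        if abs(error) > warning_limit:
            return error, "warning"
        return error, "pass"

    print(classify(0.412, 0.400, warning_limit=0.010, failure_limit=0.015))  # -> warning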

18 Dec. 11, 1997 Test Limits
- All QC tests performed on LEADS Calibration or Span Check data have both Warning and Failure Limits except the Outlier Tests. Each Outlier Test uses only one Limit, but the test is repeated once if an outlier is detected.
- Each Warning Limit is chosen statistically to represent the 3-standard-deviation value about the mean error of a test. This means that there should be only a 0.27% probability of exceeding a warning limit if the monitoring system is working properly. DO NOT IGNORE WARNINGS.
- Each Failure Limit is chosen to be at least 1.5 times the Warning Limit and is intended to represent the maximum error that will be tolerated without invalidation of the affected data.
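The sketch below shows how limits like these could be derived from a test's historical errors, following the slide's description (warning limit at 3 standard deviations, failure limit at 1.5 times the warning limit). The error data are made up for illustration.

    # Sketch: derive warning and failure limits from historical test errors.
    from statistics import stdev

    historical_errors = [0.002, -0.001, 0.003, 0.000, -0.002, 0.001, -0.003, 0.002]

    warning_limit = 3 * stdev(historical_errors)   # 3-standard-deviation value
    failure_limit = 1.5 * warning_limit            # at least 1.5x the warning limit

    print(round(warning_limit, 4), round(failure_limit, 4))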

19 Dec. 11, 1997 Automatic Data Validation Rules
- Automatic invalidation of data is based only on whether a QC test passes or fails. Warnings are not considered in this processing.
- Failure of a Concentration Outlier test or a Concentration Spacing test indicates a problem with the calibration system but not with the monitor (except NO2). The Calibration or Span Check event involved is considered invalid and the ambient pollution data is unaffected.
- Calibration QC Tests:
  - Failure of a Monitor Outlier Test, Precision/Linearity Test, Converter Efficiency or Scrubber Efficiency Test causes invalidation of ambient data back to the last good Cal. or Span Check and forward to the next good Cal.
  - Failure of a Slope or Intercept Test causes ambient data invalidation forward to the next good Cal.
  - Failure of a Zero or Span Test causes ambient data invalidation back to the last good Cal. or Span Check.

20 Dec. 11, 1997 Automatic Data Validation Rules (Cont.)
- Span Check Tests:
  - Failure of any test, except the Conc. Outlier and Spacing Tests, causes invalidation of ambient data back to the last good Cal. or Span Check and forward to the next good Cal.
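One way to picture the calibration-test rules on slides 19 and 20 is as a lookup from failed test to the direction(s) in which ambient data is invalidated. This is an illustrative encoding, not LEADS code.

    # Sketch: invalidation windows for failed calibration QC tests.
    CAL_INVALIDATION = {
        "monitor_outlier":      ("back to last good Cal/Span", "forward to next good Cal"),
        "precision_linearity":  ("back to last good Cal/Span", "forward to next good Cal"),
        "converter_efficiency": ("back to last good Cal/Span", "forward to next good Cal"),
        "scrubber_efficiency":  ("back to last good Cal/Span", "forward to next good Cal"),
        "slope":                ("forward to next good Cal",),
        "intercept":            ("forward to next good Cal",),
        "zero":                 ("back to last good Cal/Span",),
        "span":                 ("back to last good Cal/Span",),
        # Conc. Outlier and Spacing failures invalidate the Cal/Span event only,
        # not the ambient data.
    }

    print(CAL_INVALIDATION["slope"])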

21 Dec. 11, 1997 Span Source Audits QC
- Span Source Audits evaluate the accuracy of the CAMS DASIBI 5008 calibrator. Corrective action is required if audit limits are exceeded.
- If the audit passes, then the pollutant monitor must be adjusted to agree with the M-Level concentration produced by the CAMS DASIBI calibrator. This sets the Slope of the monitor’s calibration curve to the ideal.
- If a Slope Test warning is reported thereafter, then action must be taken to determine whether the error was caused by monitor drift or span source drift. A Span Source Audit may be needed if there are no obvious instrument problems.