
Time-Resolved & In-Depth Evaluation of PM and PM Precursors using CMAQ Robin L. Dennis Atmospheric Modeling Division U.S. EPA/ORD:NOAA/ARL PM Model Performance Workshop U.S. EPA/OAQPS February 10-11, 2004 Research Triangle Park, NC

Objectives of Diagnostic/In-Depth Model Evaluation Test the model to check: –Reliability of the predictions (right reason): right answer for the right reason; wrong answer for the right reason, or an understandable reason –Right response: a reasonably accurate response (a major focus of this work) Separate sources of error; discern among: –Emissions input error –Meteorological error –Chemistry/aerosol physics and chemistry error Aid model developers in identifying and treating problem areas

We need to understand what is behind the comparisons in order to interpret them. Importantly, we have to understand how the model's state aligns with the real-world state: –The model as predictor. –The model as imperfect. This talk focuses on the inorganic PM system, and on urban areas, where people live. The complementary probing with PM box models is very important, but will not be discussed in this talk.

Overview of Talk –The issue of model "structure", specifically meteorology and Kz –Two issues relating to emissions inputs –A reminder of the issue of oxidized-nitrogen chemistry (total nitrate) –An assessment of the inorganic system state of the model –A conclusion: we lack critical, key measurements to evaluate the model system, leaving us partially blind in our examination of the model as predictor

Issue of Model Structure/Meteorology We see a persistent premature collapse of the boundary layer, and a morning rise of the mixed layer that is too slow. This behavior has always been present, and we see it with the conservative species. The premature collapse also occurs in rural areas.

There is a clear, rapid rise to overprediction in the evening as the PBL collapses around 17:00 EST. Atlanta, August 1999: EC

The pattern of overprediction in the evening and morning occurs day-in and day-out.

We see similar behavior for NOy and CO, especially the evening overprediction. The observed evening rise is larger than it is for EC.

EC, NOy, and CO have the same diurnal pattern.

Suburban/Rural NOy We also see the pattern of overprediction at suburban/rural and rural sites.

Suburban/Rural CO We also see the pattern of overprediction at suburban/rural and rural sites.

There is a systematic problem with MM5 that leads to a premature collapse of the boundary layer, and we need to be aware of how this affects comparisons. For the nighttime concentrations, we have a situation of compensating errors. One should not arbitrarily change CMAQ's default Kz to get better performance statistics for O3 without a thorough analysis of the simulated period with regard to conservative species such as EC, CO, and NOy. For CMAQ, concentrations during the daylight hours, when the atmosphere is well mixed, are the best for checking the model for issues such as bias.

Emission Input Issues EC –We are not discerning the bias with 24-hr averages. NH3 –Our ignorance of ammonia's diurnal emissions profile makes it difficult to model ammonia concentrations.

Daylight-hour predicted EC concentrations are low, indicating that the EC emissions are biased low. Atlanta: EC

The daylight-hour EC underprediction holds for almost every day of the month. Atlanta: EC

While the synoptic-scale agreement is quite good, the agreement of the 24-hr averages is for the wrong reason. Emissions of EC are actually biased low, something not discernible from 24-hr averages. Atlanta: EC
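How 24-hr averaging can hide a daytime bias is easy to see with a toy example. The hourly values below are synthetic, chosen only to make the point; they are not the Atlanta EC data: a model that underpredicts by 40% in daytime and overpredicts at night can still match the observed 24-hr mean exactly.

```python
# Toy illustration of compensating errors in 24-hr averages.
# Synthetic numbers only; not the Atlanta EC measurements.

obs = [1.0] * 24                       # flat observed diurnal profile
mod = [0.6 if 8 <= h < 20 else 1.4     # daytime low, nighttime high
       for h in range(24)]

mean_obs = sum(obs) / 24
mean_mod = sum(mod) / 24

day = range(8, 20)
day_bias = sum(mod[h] - obs[h] for h in day) / len(day)

print(f"24-hr mean bias: {mean_mod - mean_obs:+.2f}")   # ~0: looks unbiased
print(f"daytime bias:    {day_bias:+.2f}")              # -0.40: 40% underprediction
```

The daytime underprediction and the PBL-driven nighttime overprediction cancel in the 24-hr mean, which is exactly the "wrong reason" agreement described above.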

Ammonia We use inverse modeling to set the overall monthly level of ammonia: –A factor of 1.2 times the 1999 NEI annual average, parsed into monthly twelfths, for August –A factor of 0.4 times the 1999 NEI annual average, parsed into monthly twelfths, for January Where we can test it against NHx (= NH3 + NH4+), it works pretty well.
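The scaling described above is a simple adjustment of the annual inventory. A minimal sketch, where the function name and the example annual total are illustrative and only the two monthly factors come from the slide:

```python
# Sketch of the seasonal NH3 scaling described above: the 1999 NEI annual
# total is parsed into monthly twelfths, then multiplied by the
# inverse-modeling factor for the month (1.2 for August, 0.4 for January).

SCALE = {"august": 1.2, "january": 0.4}

def monthly_nh3(annual_total, month):
    """Monthly NH3 emissions from an annual inventory total (same units)."""
    return SCALE[month] * annual_total / 12.0

# e.g. a hypothetical annual total of 1200 units
aug = monthly_nh3(1200.0, "august")    # ~120.0
jan = monthly_nh3(1200.0, "january")   # ~40.0
```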

The CMAQ NHx predictions track the synoptic signal quite well, but they do not track the measured diurnal pattern. Atlanta: NHx

While the NHx pattern is not as pronounced as the EC pattern, it is most likely also caused by the MM5 issue, along with possible errors in the ammonia diurnal emissions profile in SMOKE. How do we separate the two? Atlanta: NHx

While there is an issue with NHx, for sulfate the diurnal pattern is inverted, the range of variation is smaller, and the model and measurements are in much better agreement. Atlanta: SO4^2-

Diurnal biases in NHx show up as biases in aerosol nitrate, especially in the early morning. Atlanta: NO3^-

The predicted NHx also has a more pronounced diurnal swing in winter, with the evening peak showing the largest deviation, or bias. Pittsburgh, January 2002: NHx

>>Reminder<< Oxidized Nitrogen Chemistry: Total Nitrate Heterogeneous N2O5 reaction: –2002 release of CMAQ: reaction probability γ = 0.1, as recommended by Dentener and Crutzen (JGR 1993). Makes a lot of HNO3 at night. –Recent studies show a wide range of γ values, with dependence on humidity, temperature, and chemical composition (sulfate, nitrate, and organic content). –2003 release of CMAQ: reaction probability γ depends on NO3/(SO4+NO3), following Riemer et al. (JGR 2003), based on laboratory measurements by Mentel et al. (PCCP 1999).
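Why γ matters so much for nighttime HNO3 can be seen from the standard free-molecular uptake expression for the heterogeneous rate, k = γ · c̄ · A / 4, where c̄ is the mean molecular speed of N2O5 and A the aerosol surface-area density. A sketch, in which the surface-area value and the smaller comparison γ of 0.02 are illustrative assumptions, not values from the slide:

```python
import math

# Heterogeneous N2O5 uptake rate: k = gamma * c_bar * A / 4.
# gamma: reaction probability; A: aerosol surface-area density (m2/m3).
# A = 1e-4 m2/m3 (~100 um2/cm3) is an assumed, illustrative value.

R = 8.314          # gas constant, J/(mol K)
M_N2O5 = 0.108     # molar mass of N2O5, kg/mol

def k_het(gamma, temp_k=298.0, area=1e-4):
    c_bar = math.sqrt(8.0 * R * temp_k / (math.pi * M_N2O5))  # mean speed, m/s
    return 0.25 * gamma * c_bar * area                        # first-order rate, 1/s

k_2002 = k_het(0.1)    # 2002 release: gamma = 0.1
k_low = k_het(0.02)    # a composition-reduced gamma (illustrative value)
```

Since k scales linearly with γ, cutting γ by a factor of five cuts the nighttime HNO3 production rate by the same factor, which is why the composition-dependent γ in the 2003 release brings HNO3 down substantially.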

HNO3 concentrations are significantly reduced with the updated CMAQ. All production from N2O5 must be turned off to get down to the observed levels of HNO3. Atlanta: HNO3 (average diurnal cycle), urban and suburban

Suburban Atlanta: HNO3 (average diurnal cycle) Daytime overproduction of HNO3 is also an issue.

The same HNO3 overprediction is observed at Pittsburgh. The overprediction of HNO3 appears relatively smaller in summer (no daytime issue) than in winter; winter may have bigger issues. Pittsburgh: winter; Atlanta: summer

At Pittsburgh the wintertime relative overprediction of total-NO3 is larger than the relative overprediction of NHx. Pittsburgh: total-NO3, January '02; Pittsburgh: NHx, January '02

Also seen in the 24-hr data: at Pittsburgh the wintertime relative overprediction of total-NO3 is larger than the relative overprediction of NHx. Pittsburgh: total-NO3, January '02; Pittsburgh: NHx, January '02

Setup of the Inorganic System State of the Model –What sort of problem do these biases appear to create in setting the model up to predict the PM response to emissions reductions? –We will use the Gas Ratio suggested by Spyros Pandis to examine the system state. –First, what do the time-resolved patterns look like relative to the average diurnal patterns? –We will include model sensitivities to help us learn.

Sulfate tracks pretty well, except for a few excursions.

Total-nitrate is overpredicted. Zeroing the heterogeneous production brings total-nitrate very close to the observations.

The NHx predictions track fairly well, but with periods of overprediction. There is not much difference between model versions.

Gas Ratio (per S. Pandis)

GR = Free Ammonia / Total Nitrate = (NHx − 2·SO4^2−) / (HNO3(g) + NO3^−(p))

Calculated in molar units.
–GR > 1 => HNO3 limiting
–0 < GR < 1 => NH3 limiting
–GR < 0 => NH3 severely limiting (can't form NH4NO3)
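The Gas Ratio is straightforward to compute from measured or modeled mass concentrations once they are converted to molar units. A minimal sketch; the function name and the example concentrations are illustrative, while the molecular weights and the regime thresholds follow the definition above:

```python
# Gas Ratio diagnostic (per S. Pandis), from concentrations in ug/m3.

# Molecular weights (g/mol)
MW = {"NH3": 17.0, "NH4": 18.0, "SO4": 96.0, "HNO3": 63.0, "NO3": 62.0}

def gas_ratio(nh3, nh4, so4, hno3, no3):
    """All inputs in ug/m3; returns (GR, regime), with GR in molar units."""
    # Convert mass concentrations to molar concentrations (umol/m3)
    nhx = nh3 / MW["NH3"] + nh4 / MW["NH4"]            # NHx = NH3 + NH4+
    free_ammonia = nhx - 2.0 * so4 / MW["SO4"]         # NHx left after neutralizing sulfate
    total_nitrate = hno3 / MW["HNO3"] + no3 / MW["NO3"]
    gr = free_ammonia / total_nitrate
    if gr > 1:
        regime = "HNO3-limited"
    elif gr > 0:
        regime = "NH3-limited"
    else:
        regime = "NH3 severely limited (no NH4NO3 formation)"
    return gr, regime

# Example with made-up concentrations: ammonia-rich, so HNO3 is limiting
gr, regime = gas_ratio(nh3=2.0, nh4=1.0, so4=4.0, hno3=1.0, no3=0.5)
```

Because GR is a ratio of two small differences of molar quantities, it inherits errors in NHx, sulfate, and total nitrate together, which is why biases in any one of these species can put the modeled system in the wrong limiting regime.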

The CMAQ 2003 release looks best, even though it has clear biases.

Observations –The gas-ratio analysis shows that the model will need the "right" combination of offsetting errors to come close to the control-response state of the atmosphere. This may require some bias. –Uncertainty in the ammonia inventory is a serious issue. PM predictions are very sensitive to errors in NHx. We need NHx measurements, and a top-down engineering examination using measurements. –Other sources of error combine differently with the MM5 (meteorological) error, so the errors do not consistently affect the PM predictions across different sections of the diurnal cycle. Errors are not necessarily consistent across space (this needs further testing). –We have a dilemma: do we want the model to look good, using official inputs? Or do we want the model to be a good predictor?

Observations (cont.) –High time resolution is necessary to check for bias. Agreement on 24-hr averages may be for the wrong reason. –Comparisons must involve multiple species, including conservative tracers. –It is important to assess the model's state relative to emissions changes. Currently this is not possible, because we are missing key gas-phase species and the temporal coverage is inadequate. –Without measurements of NH3 and HNO3 to go along with aerosol measurements (setting size aside for the moment), we are walking into the SIP process partially blind as to the quality of the models. –We need NH3 and HNO3 24-hr averages at minimum, preferably hourly. –We need measurements every day, in all seasons.