Presentation on theme: "WRAP 2002 Visibility Modeling: Emission, Meteorology Inputs and CMAQ Performance Evaluation" — Presentation transcript:

1 WRAP Regional Modeling Center, Attribution of Haze Meeting, Denver 7/22/04
WRAP 2002 Visibility Modeling: Emission, Meteorology Inputs and CMAQ Performance Evaluation
Gail Tonnesen, Bo Wang, Chao-Jung Chien, Zion Wang, Mohammad Omary – University of California, Riverside
Zac Adelman, Andy Holland – University of North Carolina
Ralph Morris et al. – ENVIRON International Corporation, Novato, CA
WRAP Attribution of Haze Meeting, Denver, CO, July 22, 2004

2 Summary of RMC 2002 Modeling
– Annual MM5 simulations run at the RMC
– Emissions processed with SMOKE (preliminary 2002 Scenario C used here)
– CMAQ version 4.3 (released October 2003)
– Data summaries, QA, and results are posted on the RMC web page: www.cert.ucr.edu/aqm/308

3 MM5 Modeling Domain (36 & 12 km)
– National RPO grid: Lambert Conformal Conic projection; center −97°, 40°; true latitudes 33°, 45°
– MM5 domains: 36 km (165 × 129 × 34); 12 km (220 × 199 × 34)
– 24-category USGS land-use data: 10 min (~19 km) for the 36 km grid; 5 min (~9 km) for the 12 km grid

4 MM5 Physics
Physics Option     | Configuration            | Configure.user
Microphysics       | Reisner 2 (with graupel) | IMPHYS = 7
Cumulus scheme     | Kain-Fritsch             | ICUPA = 6
PBL                | Pleim-Chang (ACM)        | IBLTYP = 7
Radiation          | RRTM                     | FRAD = 4
Land-surface model | Pleim-Xiu                | ISOIL = 3
Shallow convection | No                       | ISHALLO = 0
Snow cover effect  | Simple snow model        | ISNOW = 2
Thermal roughness  | Garratt                  | IZ0TOPT = 1
Varying SST        | Yes                      | ISSTVAR = 1
Time step          | 90 seconds               | (PX uses an internal 40-second timestep)

5 Subdomains for 36/12-km Model Evaluation
1 = Pacific NW, 2 = SW, 3 = North, 4 = Desert SW, 5 = CenrapN, 6 = CenrapS, 7 = Great Lakes, 8 = Ohio Valley, 9 = SE, 10 = NE, 11 = MidAtlantic

6 Evaluation Review
Evaluation methodology:
– Synoptic evaluation
– Statistical evaluation using METSTAT and surface data (WS, WD, T, RH)
– Evaluation against upper-air observations
Statistics:
– Absolute bias and error, RMSE, IOA (index of agreement)
Evaluation datasets:
– NCAR dataset ds472: airport surface met observations
– Twice-daily upper-air profile observations (~120 in US): temperature, moisture

7 METSTAT Evaluation Package
Statistics:
– Absolute bias and error, RMSE, IOA
– Daily and, where appropriate, hourly evaluation
Statistical performance benchmarks:
– Based on an analysis of > 30 MM5 and RAMS runs
– Not meant as a pass/fail test, but to put modeling results into perspective
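The statistics METSTAT reports (bias, gross error, RMSE, and the index of agreement) are standard definitions. A minimal sketch in Python, with an illustrative function name — this is not the METSTAT code itself:

```python
import numpy as np

def met_stats(model, obs):
    """Surface-met evaluation statistics: signed mean bias, mean
    absolute (gross) error, RMSE, and Willmott's index of agreement
    (IOA), computed over paired model/observation values."""
    m = np.asarray(model, dtype=float)
    o = np.asarray(obs, dtype=float)
    bias = np.mean(m - o)                  # signed mean bias
    error = np.mean(np.abs(m - o))         # mean absolute (gross) error
    rmse = np.sqrt(np.mean((m - o) ** 2))  # root-mean-square error
    obar = o.mean()
    # IOA: 1.0 is perfect agreement; values near 0 indicate no skill.
    ioa = 1.0 - np.sum((m - o) ** 2) / np.sum(
        (np.abs(m - obar) + np.abs(o - obar)) ** 2)
    return {"bias": bias, "error": error, "rmse": rmse, "ioa": ioa}
```

Note that wind direction differences must first be reduced to the smallest angular difference (the 360°/0° wrap) before formulas like these are applied.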

8 Evaluation of 36-km WRAP MM5 Results
Model performed reasonably well for eastern subdomains, but not the West (WRAP region):
– General cool, moist bias in the western US
– Difficulty resolving western US orography? May get better performance with higher resolution
– Pleim-Xiu scheme optimized more for the eastern US? More optimization needed for desert and rocky ground?
MM5 performs better in winter than in summer:
– Weaker forcing in summer
– July 2002 Desert SW subdomain exhibits a low temperature and high humidity bias
Source: "2002 MM5 Model Evaluation 12 vs. 36 km Results," Chris Emery, Yiqin Jia, Sue Kemball-Cook, and Ralph Morris (ENVIRON International Corporation) and Zion Wang (UCR CE-CERT), WRAP National RPO Meeting, May 25, 2004

9 WRAP 36 km/12 km July Wind Performance Comparison
[Scatter plot: wind speed RMSE (m/s, 0–3.5) vs. wind direction error (degrees, 0–120); 36 km and 12 km subdomain results plotted against the benchmark box and prior MM5/RAMS runs; labeled points: Desert SW, North, SW, Pacific NW]

10–12 [Chart slides; no transcript text]

13 MM5 Implications for AoH
– The RMC is continuing to test alternative MM5 configurations, to be completed by the end of 2004.
– Expect some reduction in bias and error in the WRAP states; however, even in the best case there will be error and bias in MM5 that must be considered when using CMAQ for source attribution.

14 Emissions Inventory Summary
– Preliminary 2002 Scenario C: based on the 1996 NEI, grown to 2002, with many updates by WRAP contractors and other RPOs
– Processed for CMAQ using SMOKE
– Extensive QA plots on the web page (both SMOKE QA and post-SMOKE QA)

15 Emissions Sources by Category & RPO

16 WRAP 2002 Annual NOx Emissions
[Chart by source category: Area, Biogenic, On-Road, Non-Road, Road Dust, Point, Rx Fire, Ag Fire, Wildfire, Offshore]

17 2002 WRAP NOx Emissions by Source & State
[Bar chart, tons/yr (0–1,400,000), by state (Arizona through Wyoming) and source: Ag Fire, Rx Fire, Wildfire, Area, Point, Nonroad, Onroad]

18 WRAP 2002 Annual SO2 Emissions
[Chart by source category: Area, Biogenic, On-Road, Non-Road, Road Dust, Point, Rx Fire, Ag Fire, Wildfire, Offshore]

19 2002 WRAP SO2 Emissions by Source & State
[Bar chart, tons/yr (0–300,000), by state and source: Onroad, Ag Fire, Rx Fire, Wildfire, Area, Nonroad, Point]

20 2002 WRAP NH3 Emissions by Source Category
[Bar chart, tons/yr (0–250,000), by state and source: Nonroad, Ag Fire, Rx Fire, Point, Onroad, Wildfire, Area]

21 Emissions Summary
Preliminary 2002 EI used here. Updates for the final 2002 EI will include:
– New EI data from other RPOs and Canada
– 2002 NEI to replace the grown 1996 NEI
– Reprocessing in SMOKE with the final MM5
– All final inputs are ready now except Canada & MM5

22 CMAQ Simulations
– CMAQ v4.3
– 36-km grid, 112 × 148 × 19
– Annual run
– CB4 chemistry
– Evaluated using IMPROVE, CASTNet, NADP, STN, and AIRS/AQS

23 PM Performance Criteria
Guidance from EPA not yet ready:
– Difficult to assert that the model is adequate.
– Therefore, we use a variety of ad hoc performance goals and benchmarks to display CMAQ results.
We completed a variety of analyses:
– Computed over 20 performance metrics
– Scatter plots & time-series plots
– Soccer plots
– Bugle plots

24 Goal of Model Evaluation
We completed a variety of analyses:
– Computed over 20 performance metrics
– Scatter plots & time-series plots
– Soccer plots
– Bugle plots
The goal is to decide whether we have enough confidence to use the model for AoH:
– Is this a valid application of the model?

25 Soccer Goal Plots
Plot error as a function of bias.
Ad hoc performance goal:
– 15% bias, 35% error, based on O3 modeling goals
– Larger error & bias are observed among different PM data methods and monitoring networks
Performance benchmark:
– 30% bias, 70% error (2× the performance goals)
– PM models can achieve this level in many cases
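The goal and benchmark boxes on a soccer-goal plot amount to a simple classification of each (bias, error) point. A sketch with illustrative names, using the thresholds quoted above as defaults:

```python
def soccer_zone(frac_bias_pct, frac_error_pct,
                goal=(15.0, 35.0), benchmark=(30.0, 70.0)):
    """Classify a (fractional bias %, fractional error %) pair against
    the ad hoc goal box (±15% bias, 35% error) and the benchmark box
    (±30% bias, 70% error) drawn on a soccer-goal plot."""
    if abs(frac_bias_pct) <= goal[0] and frac_error_pct <= goal[1]:
        return "goal"       # inside the inner (goal) box
    if abs(frac_bias_pct) <= benchmark[0] and frac_error_pct <= benchmark[1]:
        return "benchmark"  # outside the goal but inside the benchmark box
    return "outside"        # misses both boxes
```

Points land closer to the origin as performance improves, which is why both boxes are anchored at zero bias and zero error.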

26 Annual CMAQ vs IMPROVE

27 [Seasonal panels: Spring, Summer, Fall, Winter]

28 Annual CMAQ vs CASTNet

29 [Seasonal panels: Spring, Summer, Fall, Winter]

30 Annual CMAQ vs STN

31 [Seasonal panels: Spring, Summer, Fall, Winter]

32 Annual CMAQ vs NADP

33 [Seasonal panels: Spring, Summer, Fall, Winter]

34 Performance Goals and Criteria (proposed by Jim Boylan)
– Based on FE and FB calculations
– Vary as a function of species concentration
– Goals: FE ≤ +50% and FB ≤ ±30%
– Criteria: FE ≤ +75% and FB ≤ ±60%
– Less abundant species should have less stringent performance goals and criteria
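The FB and FE statistics these goals are stated in are well defined: each model-observation pair is normalized by the average of the two values, which bounds FB to ±200% and FE to 0–200%. A sketch (illustrative function name):

```python
import numpy as np

def fractional_stats(model, obs):
    """Mean fractional bias (FB) and mean fractional error (FE), in
    percent, over paired model/observation values. Normalizing each
    pair by the mean of the two values bounds FB to [-200, +200]
    and FE to [0, 200]."""
    m = np.asarray(model, dtype=float)
    o = np.asarray(obs, dtype=float)
    denom = (m + o) / 2.0                        # per-pair normalizer
    fb = 100.0 * np.mean((m - o) / denom)        # signed fractional bias
    fe = 100.0 * np.mean(np.abs(m - o) / denom)  # fractional error
    return fb, fe
```

This symmetric normalization is why FB/FE goals can be relaxed for less abundant species: small absolute concentrations inflate the fractional statistics.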

35 Performance Goals and Criteria (proposed by Jim Boylan)
[Charts: PM performance goals; proposed PM performance criteria]

36 Monthly SO4 Fractional Bias

37 Monthly SO4 Fractional Error

38 Monthly NO3 Fractional Bias

39 Monthly NO3 Fractional Error

40 Monthly NH4 Fractional Bias

41 Monthly NH4 Fractional Error

42 Monthly OC Fractional Bias

43 Monthly OC Fractional Error

44 Monthly EC Fractional Bias

45 Monthly EC Fractional Error

46 Monthly PM25 Fractional Bias

47 Monthly PM25 Fractional Error

48 CMAQ & EI Versions
– TSSA results are run in CMAQ v4.4 with emissions version Preliminary 2002 C
– Performance evaluation used CMAQ v4.3
– Previous CMAQ runs used CMAQ v4.3 with Preliminary 2002 B emissions (no fires)

49 CMAQ v4.3 & v4.4 versus IMPROVE, July

50 CMAQ Ozone Performance
CMAQ v4.3 mean fractional bias (no filter):
– January: +25% MFB
– July: −20% MFB
Slightly worse January O3 performance in v4.4

51 CMAQ Emissions B & C versus IMPROVE, Summer

52 Issues for AoH
Is this set of emissions/MM5/CMAQ adequate for studying AoH?
Analysis of CMAQ performance on best & worst days is still in progress:
– However, we expect CMAQ will tend to overpredict lows and underpredict highs.
– Should we use CMAQ results unpaired in time?

53 Options for Future Work
– Continue CMAQ source apportionment with current data sets
– Wait for new MM5 and emissions
– Investigate other CMAQ configurations (unlikely to see large improvements)

