Numerical Weather Prediction: Parametrization of Diabatic Processes. Clouds (4): Cloud Scheme Validation. An ECMWF lecture by Richard Forbes and Adrian Tompkins (forbes@ecmwf.int).

Cloud Validation: The issues. AIM: to perfectly simulate one aspect of nature: CLOUDS. APPROACH: validate the model-generated clouds against observations, and use the information about the apparent errors to improve the model physics, and hence the cloud simulation. Cloud observations → error → parametrization improvements → cloud simulation, and around again. Sounds easy?

Cloud Validation: The problems. How much of the 'error' derives from the observations? Both the observations (error e1) and the cloud simulation (error e2) contribute to the apparent difference, so not all of it should be attributed to the model.

Cloud Validation: The problems. Which physics is responsible for the error? The simulated clouds depend not only on the cloud scheme itself but also on radiation, convection, dynamics and turbulence, so an apparent cloud error may originate in any of these.

The path to improved cloud parametrization: climatological comparison, NWP validation, case studies, composite studies, cloud validation... leading, we hope, to parametrization improvement.

ECMWF Model Configurations. The ECMWF global atmospheric model (IFS) runs in many different configurations at different resolutions:
- TL159 (125 km) L62: Seasonal Forecast System (to 6 months)
- TL255 (80 km) L62: Monthly Forecast System (to 32 days)
- TL399 (50 km) L62: Ensemble Prediction System (EPS) (to 15 days)
- TL799 (25 km) L91: current deterministic global NWP (to 10 days)
- TL1279 (16 km) L91: NWP from 2009 (to 10 days)
- Future: 150 levels (Δz ≈ 250 m at z = 5 km)? 10 km horizontal resolution? ...
We need verification across a range of spatial scales and timescales, i.e. a model with a "climate" that is robust to resolution. (A rule-of-thumb conversion from spectral truncation to grid spacing is sketched below.)
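The quoted grid spacings follow from a rough rule of thumb, not the exact IFS reduced Gaussian grid definition: a linear-grid truncation TL with wavenumber N fits about 2N grid points around a great circle of the Earth (radius a), so

```latex
\Delta x \;\approx\; \frac{2\pi a}{2N} \;\approx\; \frac{40\,000\ \mathrm{km}}{2N},
\qquad \text{e.g. } N = 799:\ \Delta x \approx 25\ \mathrm{km},
\quad N = 1279:\ \Delta x \approx 16\ \mathrm{km}.
```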

Model climate: broadband radiative fluxes. We can compare top-of-atmosphere (TOA) radiative fluxes with satellite observations, e.g. TOA shortwave radiation (TSR), JJA 1987: old model version (CY18R6) minus observations (ERBE). The stratocumulus regions are bad, and so is North Africa (an old cycle!).

Model climate: cloud radiative "forcing". Problem: can we associate these "errors" with clouds? Another approach is to examine the "cloud radiative forcing" (CRF): JJA 1987 SWCRF, old model version (CY18R6) minus observations (ERBE). Cloud problems: stratocumulus YES, North Africa NO! Note that CRF is sometimes defined as F_clr - F, and there are also differences in how models calculate it.
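For reference, writing F for the all-sky and F_clr for the clear-sky TOA flux, the convention assumed here takes CRF as all-sky minus clear-sky (as the slide notes, the opposite sign convention also appears in the literature):

```latex
\mathrm{CRF} = F - F_{\mathrm{clr}},
\qquad
\mathrm{SWCRF} = F^{SW} - F^{SW}_{\mathrm{clr}},
\qquad
\mathrm{LWCRF} = F^{LW} - F^{LW}_{\mathrm{clr}}.
```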

Model climate: "cloud fraction" or "total cloud cover". We can also compare other variables to derived products: total cloud cover (TCC), JJA 1987, old model version (CY18R6) minus observations (ISCCP). References: ISCCP, Rossow and Schiffer, Bull. Am. Meteorol. Soc., 1991; ERBE, Ramanathan et al., Science, 1989.

Comparison with satellite data: a problem. If more complicated cloud parameters are desired (e.g. vertical structure), the retrieval from a few satellite channels can be ambiguous: many different vertical arrangements of liquid cloud, ice cloud and vapour are consistent with the same measured radiances. Another approach is to go the other way: simulate the radiances from the model fields, using a radiative transfer model plus assumptions about the vertical structure.

Comparison with satellite data: simulating satellite radiances. This gives more certainty in diagnosing the existence of a problem, but it doesn't necessarily help identify the origin of the problem. Examples: Morcrette, MWR, 1991; Chevallier et al., J. Climate, 2001.
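To make the "model-to-obs" direction concrete, here is a deliberately over-simplified Python sketch of the idea: a single grey cloud layer over a black surface in an IR window channel, with an assumed mass absorption coefficient. The studies cited above use full radiative transfer through every model level; everything below (function name, coefficient value) is illustrative only.

```python
import numpy as np

def toy_ir_brightness_temperature(t_surf, t_cloud, cwp, k_abs=0.08):
    """Toy window-channel brightness temperature (K) for one model column.

    t_surf  : surface skin temperature (K)
    t_cloud : cloud-top temperature (K)
    cwp     : condensed water path of the cloud layer (g m-2)
    k_abs   : assumed grey mass absorption coefficient (m2 g-1)

    Single grey cloud over a black surface, no gaseous absorption:
    nothing like a real radiative transfer model, just the principle.
    """
    emissivity = 1.0 - np.exp(-k_abs * np.asarray(cwp))
    # Grey-body radiance expressed via T**4, then converted back to a
    # brightness temperature.
    radiance = emissivity * t_cloud**4 + (1.0 - emissivity) * t_surf**4
    return radiance**0.25

# A thick cloud returns ~t_cloud; a thin cloud gives something between
# the cloud-top and surface temperatures, mimicking the ambiguity that
# plagues the retrieval direction.
print(toy_ir_brightness_temperature(290.0, 220.0, cwp=500.0))  # ~220 K
print(toy_ir_brightness_temperature(290.0, 220.0, cwp=5.0))    # ~272 K
```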

Comparison with satellite data: a more complicated analysis is possible, e.g. the diurnal cycle of variability over tropical land. Observations show a late-afternoon peak in convection; the model has a morning peak (a common problem).

NWP forecast evaluation. Differences in longer simulations may not be the direct result of the cloud scheme: interactions with radiation, dynamics etc. can dominate (e.g. the poor stratocumulus regions). Using short-term NWP forecasts or the analysis restricts this and allows one to concentrate on the cloud scheme. Example figure: a time series of cloud cover bias, with the introduction of the Tiedtke scheme marked.

Example over Europe: maps of cloud cover bias, binned into categories (-8:-3, -3:-1, -1:1, 1:3, 3:8).

NWP forecast evaluation: Meteosat and simulated IR. From the ECMWF daily report, 11th April 2005: "Going more into the details of the cyclone, it can be seen that the model was able to reproduce the very peculiar spiral structure in the cloud bands. However, large differences can be noticed further east, in the warm sector of the frontal system attached to the cyclone, where the model largely underpredicts the typical high-cloud shield. Look for example at the two maps above, where a clear deficiency of cloud cover is evident in the model-generated satellite images north of the Black Sea. In this case this was systematic over different forecasts."

NWP forecast evaluation: Meteosat and simulated water vapour channel (blue: moist; red: dry). The 30-hour forecast is too dry in the frontal region. This is not a forecast drift; does that mean the cloud scheme is at fault?

Case studies. We can concentrate on a particular location and/or time period in more detail, for which specific observational data are collected: a CASE STUDY. Examples: GATE, CEPEX, TOGA-COARE, ARM...

Evaluation of vertical cloud structure: Mace et al., 1998, GRL. Examined the frequency of occurrence of ice cloud at the ARM site on the American Great Plains; a reasonable match to the data was found.

Evaluation of vertical cloud structure: Hogan et al., 2001, JAM. Analysis using the radar and lidar at Chilbolton, UK; a reasonable match.

Hogan et al. (2001): more detailed comparisons are possible.

Hogan et al. (2001): the comparison improved when (a) snow was included, and (b) cloud below the sensitivity of the instruments was removed.

Issues raised. WHAT ARE WE COMPARING? Is the model statistic really equivalent to what the instrument measures? E.g. the radar sees snow, but the model may not include snow in its definition of cloud fraction; small ice amounts may be invisible to the instrument but included in the model statistic (a minimal sketch of a like-for-like mask follows). HOW STRINGENT IS OUR TEST? Perhaps the variable is easy to reproduce: mid-latitude frontal clouds are strongly dynamically forced, and cloud fraction is often zero or one, so cloud fraction statistics may be easy to reproduce in short-term forecasts.
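A minimal Python sketch of the "like with like" fix, along the lines of the Hogan et al. (2001) adjustments: add the model's snow to its cloud condensate before building a cloud mask, and discard amounts an instrument of assumed sensitivity could not see. Field names and the threshold value are hypothetical.

```python
import numpy as np

def radar_comparable_cloud_mask(iwc, lwc, snow, min_detectable=1e-6):
    """Model cloud mask comparable with what a cloud radar observes.

    iwc, lwc, snow : model ice, liquid and snow water contents (kg m-3)
    min_detectable : assumed instrument sensitivity (kg m-3)

    (a) include precipitating snow, which the radar cannot separate
        from cloud ice;
    (b) remove condensate below the assumed instrument sensitivity.
    """
    total = np.asarray(iwc) + np.asarray(lwc) + np.asarray(snow)
    return total > min_detectable

# Model "cloud fraction" for the comparison is then the fraction of
# True points, computed on the same time/height grid as the radar
# statistics.
```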

Case studies can also be used to validate "components" of the cloud scheme. EXAMPLE: cloud overlap assumptions (Hogan and Illingworth, 2000, QJRMS). Issue raised: HOW REPRESENTATIVE IS OUR CASE STUDY LOCATION? E.g. wind shear and dynamics are very different between southern England and the tropics!
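Hogan and Illingworth (2000) used the Chilbolton radar to test exactly this kind of assumption. As an illustration of what is being tested, here is a minimal Python sketch of the standard maximum-random overlap rule (Geleyn and Hollingsworth, 1979) for total cloud cover from a profile of layer cloud fractions; this is the textbook diagnostic, not their analysis code.

```python
def total_cloud_cover_max_random(cloud_fractions, eps=1e-9):
    """Total cloud cover under maximum-random overlap.

    Vertically contiguous cloudy layers overlap maximally; layers
    separated by clear sky overlap randomly.

    cloud_fractions : layer cloud fractions in [0, 1], top to bottom.
    """
    clear_sky = 1.0
    cf_above = 0.0
    for cf in cloud_fractions:
        clear_sky *= (1.0 - max(cf, cf_above)) / (1.0 - min(cf_above, 1.0 - eps))
        cf_above = cf
    return 1.0 - clear_sky

# Two adjacent 0.5 layers -> 0.5 (maximum overlap); two 0.5 layers
# separated by a clear layer -> 0.75 (random overlap).
print(total_cloud_cover_max_random([0.5, 0.5]))       # 0.5
print(total_cloud_cover_max_random([0.5, 0.0, 0.5]))  # 0.75
```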

Composites. We often want to look at a certain kind of model system: stratocumulus regions, extra-tropical cyclones. An individual case may not be conclusive: is it typical? On the other hand, general statistics may swamp this kind of system. A compositing technique can be used instead.

Composites: a cloud survey (Tselioudis et al., 2000, J. Climate). From satellite, attempt to derive cloud-top pressure and cloud optical thickness for each pixel (using the ISCCP simulator for the model), then divide the data into regimes according to the sign of the sea-level-pressure anomaly (+ve SLP vs -ve SLP). Joint histograms of cloud-top pressure (120 to 900 hPa) against optical depth, shown for data, model and model minus data, reveal that high clouds are too thin and low clouds are too thick. (An illustrative sketch of this kind of joint histogram follows.)
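A Python sketch of the compositing step: accumulate joint (cloud-top pressure, optical depth) histograms separately for positive and negative SLP anomalies, once for the observations and once for the ISCCP-simulator output. Bin edges roughly follow the standard ISCCP classes; the input arrays are hypothetical.

```python
import numpy as np

CTP_EDGES = [50, 180, 310, 440, 560, 680, 800, 1000]  # cloud-top pressure, hPa
TAU_EDGES = [0.3, 1.3, 3.6, 9.4, 23.0, 60.0, 380.0]   # optical depth

def regime_joint_histograms(ctp, tau, slp_anomaly):
    """Joint CTP-tau frequency histograms, split by SLP-anomaly sign.

    ctp, tau, slp_anomaly : 1-D arrays, one entry per satellite pixel
    (or per ISCCP-simulator sub-column for the model).
    """
    ctp, tau, slp_anomaly = map(np.asarray, (ctp, tau, slp_anomaly))
    histograms = {}
    for name, select in (("+ve SLP", slp_anomaly > 0),
                         ("-ve SLP", slp_anomaly <= 0)):
        counts, _, _ = np.histogram2d(ctp[select], tau[select],
                                      bins=[CTP_EDGES, TAU_EDGES])
        histograms[name] = counts / max(select.sum(), 1)  # frequency
    return histograms

# The model-minus-observation difference of these histograms is what
# reveals the "high clouds too thin, low clouds too thick" signature.
```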

Composites: extra-tropical cyclones (Klein and Jakob, 1999, MWR). Overlay about 1000 cyclones, defined about a location of maximum optical thickness, and plot the predominant cloud types by looking at anomalies from the 5-day average (high tops = red, mid tops = yellow, low tops = blue). Again: high clouds too thin, low clouds too thick.

A strategy for cloud parametrization evaluation (after Jakob).

Recap: the problems.
- All observations: are we comparing "like with like"? What assumptions are contained in the retrievals/variational approaches?
- Long-term climatologies: which physics is responsible for the errors? Dynamical regimes can diverge.
- NWP, reanalysis, column models: don't allow the full interaction between the physics to be represented.
- Case studies: are they representative? Do changes translate into global skill?
- Composites: as for case studies.
And one more problem, specific to NWP...

NWP cloud scheme development: the timescale of the validation exercise. Many of the above validation exercises are complex and involved; often the results become available O(years) after the project starts, for a single version of the model. The ECMWF operational model is updated roughly 2 to 4 times a year, so the validation results are often no longer relevant once they become available. Requirement: a quick and easy test bench.

Example: LWP, ERA-40 and recent cycles. Panels: model 23r4 (June 2001), model 26r1 (April 2003), SSM/I observations, and the difference.

Example: LWP, ERA-40 and recent cycles. Panels: model 23r4 (June 2001), model 26r3 (Oct 2003), SSM/I observations, and the difference.

Example: LWP, ERA-40 and recent cycles. Panels: model 23r4 (June 2001), model 28r1 (Mar 2004), SSM/I observations, and the difference.

Example: LWP, ERA-40 and recent cycles. Panels: model 23r4 (June 2001), model 28r3 (Sept 2004), SSM/I observations, and the difference.

Example: LWP, ERA-40 and recent cycles. Panels: model 23r4 (June 2001), model 29r1 (Apr 2005), SSM/I observations, and the difference.

Example: LWP, ERA-40 and recent cycles. Panels: model 23r4 (June 2001), model 33r1 (Nov 2008), SSM/I observations, and the difference. Do ERA-40 cloud studies still have relevance for the operational model?

So what do we use at ECMWF?
T799 L91:
- standard "scores" (RMS, anomaly correlation of U, T, Z)
- "operational" validation of clouds against SYNOP observations
- simulated radiances against Meteosat-7
T159 L91 "climate" runs (4 ensemble members of 13 months), which automatically produce comparisons to:
- ERBE, NOAA-x, CERES TOA fluxes
- QuikSCAT and SSM/I 10 m winds
- ISCCP and MODIS cloud cover
- SSM/I and TRMM liquid water path (soon MLS ice water content)
- GPCP, TRMM, SSM/I and Xie-Arkin precipitation
- the da Silva climatology of surface fluxes
- ERA-40 analysis winds

Ease of use allows a catalogue of the model climate to be built up. Issues: observation errors, variance, resolution sensitivity.

Top-of-atmosphere net LW radiation: model (T159 L91) vs CERES, on a scale from -300 to -150 W m-2; the difference map shows where the model is too high or too low.

Top-of-atmosphere net SW radiation: model (T159 L91) vs CERES, on a scale from 100 to 350 W m-2; the difference map shows where the model albedo is too high or too low.

Total cloud cover (TCC): model (T159 L91) vs ISCCP, on a scale from 10 to 80%; the difference map shows where TCC is too high or too low.

Total column liquid water (TCLW): model (T159 L91) vs SSM/I, on a scale from 25 to 250; the difference map shows where TCLW is too high or too low.

Model validation: look at higher-order statistics. Example: the PDF of cloud cover, which highlights a problem with one particular model version! (A minimal sketch of this diagnostic follows.)
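A minimal Python sketch of the diagnostic: cloud-cover PDFs are typically U-shaped (gridboxes are mostly clear or overcast), so a model whose PDF piles up at intermediate fractions can be flagged even when its mean cloud cover verifies well.

```python
import numpy as np

def cloud_cover_pdf(cloud_cover, nbins=10):
    """Normalised PDF of cloud cover values in [0, 1].

    cloud_cover : array of (total or layer) cloud fractions, pooled
    over time and space, from either the model or the observations.
    """
    density, edges = np.histogram(cloud_cover, bins=nbins,
                                  range=(0.0, 1.0), density=True)
    return density, edges

# Comparing model and observed PDFs bin by bin is a far stricter test
# than comparing the two means.
```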

Model validation: look at relationships between variables. Example: liquid water path versus the probability of precipitation, for different precipitation thresholds. This can be compared with satellite observations of LWP and precipitation rate, probing the autoconversion parametrization (see the sketch below).
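A Python sketch of this relationship diagnostic: bin columns by liquid water path and, within each bin, compute the fraction of columns whose precipitation rate exceeds a threshold. All field names, units and threshold values here are illustrative.

```python
import numpy as np

def precip_probability_vs_lwp(lwp, precip_rate, lwp_bin_edges,
                              thresholds=(0.1, 1.0)):
    """Probability of precipitation as a function of LWP.

    lwp           : liquid water path per column (e.g. g m-2)
    precip_rate   : surface precipitation rate per column (e.g. mm h-1)
    lwp_bin_edges : monotonically increasing LWP bin edges
    thresholds    : precipitation-rate thresholds to test
    """
    bin_index = np.digitize(np.asarray(lwp), lwp_bin_edges)
    curves = {}
    for threshold in thresholds:
        is_raining = np.asarray(precip_rate) > threshold
        curves[threshold] = np.array(
            [is_raining[bin_index == i].mean() if np.any(bin_index == i)
             else np.nan
             for i in range(1, len(lwp_bin_edges))])
    return curves

# If the model rains too readily at low LWP compared with the satellite
# curves, the autoconversion threshold/rate is a natural suspect.
```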

Model validation: making the most of instrument synergy. Each observational instrument measures one aspect of the atmosphere; combining information from different instruments can often provide complementary information (particularly for remote sensing), for example radars at different wavelengths, lidar and radiometers. Radar, lidar and radiometer instruments are co-located at Chilbolton, UK (www.chilbolton.rl.ac.uk).

Long-term ground-based observations: ARM / Cloudnet. A network of stations processed over a multi-year period using identical algorithms, first in Europe, now also at the ARM sites. Several European centres provide operational forecasts so that direct comparisons are made in quasi-realtime, with direct involvement of the met services to provide up-to-date information on model cycles.

Long-term ground-based observations: ARM observational sites. "Permanent" ARM sites plus the movable "ARM mobile facility" for observational campaigns.

Cloudnet example. In addition to standard quicklooks, longer-term statistics are available; this example is for ECMWF cloud cover during June 2005. The processing includes corrections for radar attenuation and snow. See www.cloud-net.org for more details and examples (but note it is not funded at the current time!).

Space-borne active remote sensing: the A-Train. CloudSat and CALIPSO (launched 28th April 2006) carry active radar and lidar to provide information on the vertical profile of clouds and precipitation. Two approaches to model validation: model → obs parameters, or obs → model parameters.

Simulating observations: the CFMIP COSP radar/lidar simulator. Model data (T, p, q, IWC, LWC, ...) pass through a sub-grid cloud/precipitation pre-processor and physical assumptions (particle size distributions, Mie tables, ...) to produce simulated radar reflectivity (CloudSat simulator, Haynes et al., 2007) and simulated lidar attenuated backscatter (CALIPSO simulator, Chiriaco et al., 2006). A toy sketch of the reflectivity step follows.
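As a toy illustration of the reflectivity step only (COSP/QuickBeam does full Mie scattering over assumed particle size distributions, plus sub-grid sampling and attenuation), one can invert an assumed IWC-Z power law. The a and b coefficients below are purely illustrative placeholders, not a published fit.

```python
import numpy as np

def toy_radar_reflectivity_dbz(iwc, a=0.1, b=0.6):
    """Toy forward operator: ice water content to radar reflectivity.

    Assumes a power law IWC = a * Z**b (IWC in g m-3, Z in mm6 m-3),
    inverted to Z = (IWC / a)**(1/b) and converted to dBZ.
    """
    z_linear = (np.asarray(iwc, dtype=float) / a) ** (1.0 / b)   # mm6 m-3
    return 10.0 * np.log10(np.maximum(z_linear, 1e-10))          # dBZ

# Simulated dBZ on model levels can then be histogrammed against
# CloudSat dBZ, e.g. as joint height-reflectivity frequency plots.
print(toy_radar_reflectivity_dbz([0.001, 0.01, 0.1]))
```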

Example cross-section through a front: model vs CloudSat radar reflectivity.

Example CloudSat orbit "quicklook" (http://cloudsat.cira.colostate.edu).

Example section of a CloudSat orbit, 26th February 2007, 15 UTC: a mid-latitude cyclone, high tropical cirrus, then another mid-latitude cyclone.

Compare the model with observed parameters: radar reflectivity, 26/02/2007 15Z, along the orbit from 82°N through the tropics to 82°S (0°C isotherm marked). Simulated radar reflectivity from the model (ice only) vs observed radar reflectivity from CloudSat (ice + rain).

Compare model parameters with the equivalent derived from observations: ice amount, 26/02/2007 15Z. Model ice water content (excluding precipitating snow) vs ice water content derived from a 1DVAR retrieval of CloudSat/CALIPSO/Aqua (Delanoë and Hogan, 2007, University of Reading, UK); both shown as log10 of kg m-3 along the orbit, with the equator crossings, Antarctica and Greenland marked.

Summary. Different approaches to verification (climate statistics, case studies, composites), different techniques (model-to-obs, obs-to-model) and a range of observations are required to validate and improve cloud parametrizations. We need to understand the limitations of the observational data (e.g. what lies beyond the sensitivity limits of the radar, or how accurate a satellite-derived liquid water path is). The model developer needs to understand physical processes to improve the model; usually this requires observations, theory and modelling together, and observations can be used to test the model's physical relationships between variables.

The path to improved cloud parametrization: climatological comparison, NWP validation, case studies, composite studies, cloud validation... leading, we hope, to parametrization improvement.

The path to improved cloud parametrization: many mountains to climb! Climatological comparison, NWP validation, case studies, composite studies, cloud validation... and, eventually, parametrization improvement.