
1 Numerical Weather Prediction Parametrization of diabatic processes Clouds (4) Cloud Scheme Validation Richard Forbes and Adrian Tompkins

2 Cloud Validation: The issues
AIM: To simulate one aspect of nature as faithfully as possible: CLOUDS. APPROACH: Validate the model-generated clouds against observations, and use the information about the apparent errors to improve the model physics and, subsequently, the cloud simulation. Cloud observations → cloud simulation error → parametrization improvements. Sounds easy?

3 Cloud Validation: The problems
How much of the apparent 'error' derives from the observations? The cloud observations carry their own error (e1) and the cloud simulation carries an error (e2); the difference used to drive parametrization improvements mixes the two.
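To make the point explicit (notation assumed here, not on the slide): if c is the true cloud state, the observation is o = c + e_1 and the simulation is m = c + e_2, then

    o - m = (c + e_1) - (c + e_2) = e_1 - e_2

so the diagnosed 'model error' unavoidably mixes the observational error e_1 with the model error e_2 that the parametrization improvements are meant to reduce.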

4 Cloud Validation: The problems
Which physics is responsible for the error? The simulated clouds reflect not only the cloud physics but also radiation, convection, dynamics and turbulence, so an apparent cloud error may originate in any of these schemes.

5 The path to improved cloud parameterisation…
Parametrization improvement draws on several complementary approaches: climatological comparison, NWP validation, case studies and composite studies, all feeding into cloud validation.

6 ECMWF Model Configurations
ECMWF Global Atmospheric Model (IFS). Many different configurations at different resolutions:
TL159 (125 km) L62: Seasonal Forecast System (→ 6 months)
TL255 (80 km) L62: Monthly Forecast System (→ 32 days)
TL399 (50 km) L62: Ensemble Prediction System (EPS) (→ 15 days)
TL799 (25 km) L91: Current deterministic global NWP (→ 10 days)
TL1279 (16 km) L91: NWP from 2009 (→ 10 days)
Future: 150 levels, 10 km horizontal resolution?
Need verification across a range of spatial scales and timescales → a model with a "climate" that is robust to resolution.

7 Model climate: Broadband radiative fluxes
Can compare Top-of-Atmosphere (TOA) radiative fluxes with satellite observations, e.g. TOA shortwave radiation (TSR). Figure: JJA 1987 TSR, old model version (CY18R6) minus observations (ERBE). Stratocumulus regions are poorly represented, as is North Africa (an old model cycle!).

8 Model climate: Cloud radiative “forcing”
Problem: can we associate these "errors" with clouds? Another approach is to examine the "cloud radiative forcing". Figure: JJA 1987 shortwave CRF, old model version (CY18R6) minus observations (ERBE). Cloud problems: stratocumulus YES, North Africa NO! Note that CRF is sometimes defined with the opposite sign (Fclr - F), and there are also differences in how the model calculates it.
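As a reminder of the quantity plotted here, a minimal sketch of the CRF diagnostic from all-sky and clear-sky TOA fluxes (variable names are placeholders; the slide's caveat about sign convention and model-specific calculation still applies):

    import numpy as np

    def sw_cloud_radiative_forcing(sw_net_allsky, sw_net_clear):
        # SWCRF = all-sky minus clear-sky net TOA shortwave flux (W m-2).
        # Some authors use the opposite sign convention, Fclr - F.
        return np.asarray(sw_net_allsky) - np.asarray(sw_net_clear)

    # crf_bias = sw_cloud_radiative_forcing(model_allsky, model_clear) \
    #          - sw_cloud_radiative_forcing(obs_allsky, obs_clear)
    # Isolating CRF attributes the stratocumulus TSR errors to clouds, while the
    # North Africa signal largely disappears (a clear-sky/surface issue instead).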

9 Model climate “Cloud fraction” or “Total cloud cover”
Other variables can also be compared with derived products, e.g. cloud cover. Figure: JJA 1987 total cloud cover (TCC), old model version (CY18R6) minus observations (ISCCP). References: ISCCP, Rossow and Schiffer (1991), Bull. Am. Met. Soc.; ERBE, Ramanathan et al. (1989), Science.

10 Comparison with Satellite Data: A problem
If more complicated cloud parameters are desired (e.g. vertical structure), the satellite retrieval can be ambiguous: the same set of channel radiances can be consistent with different combinations of liquid cloud, ice cloud and water vapour at different heights. Another approach is to simulate the radiances directly from the model fields with a radiative transfer model, which requires assumptions about the sub-grid vertical structure.

11 Comparison with Satellite Data: Simulating Satellite Radiances
This gives more certainty in diagnosing the existence of a problem, but it does not necessarily help identify the origin of the problem. Examples: Morcrette (1991), Mon. Wea. Rev.; Chevallier et al. (2001), J. Climate.

12 Comparison with Satellite Data: the diurnal cycle
A more complicated analysis is also possible, e.g. the variability and diurnal cycle of convection over tropical land. Observations: late-afternoon peak in convection. Model: morning peak (a common problem).

13 NWP Forecast Evaluation
Differences in longer simulations may not be the direct result of the cloud scheme, because of interaction with radiation, dynamics, etc. (e.g. the poorly represented stratocumulus regions). Using short-term NWP forecasts or analyses restricts this feedback and allows one to concentrate on the cloud scheme. Figure: time series of cloud cover bias, marking the introduction of the Tiedtke scheme.

14 Example over Europe. Figure: cloud cover bias map with colour classes -8:-3, -3:-1, -1:1, 1:3 and 3:8.

15 NWP Forecast Evaluation Meteosat and simulated IR
From the ECMWF daily report, 11th April 2005: "Going more into details of the cyclone, it can be seen that the model was able to reproduce the very peculiar spiral structure in the clouds bands. However large differences can be noticed further east, in the warm sector of the frontal system attached to the cyclone, where the model largely underpredicts the typical high-cloud shield. Look for example in the two maps above where a clear deficiency of cloud cover is evident in the model generated satellite images north of the Black Sea. In this case this was systematic over different forecasts."

16 NWP Forecast Evaluation Meteosat and simulated w.v. channel
Blue: moist; red: dry. The 30-hour forecast is too dry in the frontal region. This is not a forecast drift, so does this mean the cloud scheme is at fault?

17 Case Studies. One can concentrate on a particular location and/or time period in more detail, for which specific observational data are collected. Examples: GATE, CEPEX, TOGA-COARE, ARM...

18 Evaluation of vertical cloud structure
Mace et al. (1998), GRL, examined the frequency of occurrence of ice cloud at the ARM site on the US Great Plains; a reasonable match to the data was found.

19 Evaluation of vertical cloud structure
Hogan et al. (2001), J. Appl. Meteor.: analysis using the radar and lidar at Chilbolton, UK. A reasonable match was found.

20 Hogan et al. (2001): more detail is possible

21 Hogan et al. (2001): the comparison improved when (a) snow was included, and (b) cloud below the sensitivity of the instruments was removed.

22 Issues Raised
WHAT ARE WE COMPARING? Is the model statistic really equivalent to what the instrument measures? E.g. the radar sees snow, but the model may not include this in its definition of cloud fraction; small ice amounts may be invisible to the instrument but included in the model statistic.
HOW STRINGENT IS OUR TEST? Perhaps the variable is easy to reproduce. E.g. mid-latitude frontal clouds are strongly dynamically forced, so cloud fraction is often zero or one; perhaps cloud fraction statistics are easy to reproduce in short-term forecasts.

23 Can also use to validate “components” of cloud scheme
EXAMPLE: cloud overlap assumptions (Hogan and Illingworth, 2000, QJRMS); a sketch of the overlap calculation being tested is given below. Issue raised: HOW REPRESENTATIVE IS OUR CASE STUDY LOCATION? E.g. wind shear and dynamics are very different between southern England and the tropics!
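To make concrete what such a "component" test looks like, here is a minimal sketch (not the IFS code) of the maximum-random overlap rule used to build total cloud cover from a profile of layer cloud fractions; radar/lidar observations then test whether this overlap behaviour matches reality:

    import numpy as np

    def total_cloud_cover_max_random(cf, eps=1e-9):
        # Total cloud cover from a profile of layer cloud fractions (ordered
        # top to bottom) under the maximum-random overlap assumption:
        # vertically adjacent cloudy layers overlap maximally, layers
        # separated by clear air overlap randomly.
        cf = np.clip(np.asarray(cf, dtype=float), 0.0, 1.0)
        clear = 1.0 - cf[0]
        for k in range(1, len(cf)):
            clear *= (1.0 - max(cf[k - 1], cf[k])) / max(1.0 - cf[k - 1], eps)
        return 1.0 - clear

    # Two adjacent layers of 0.5 and 0.4 give TCC = 0.5 (maximum overlap);
    # separating them with a clear layer gives TCC = 0.7 (random overlap).
    print(total_cloud_cover_max_random([0.5, 0.4]))       # 0.5
    print(total_cloud_cover_max_random([0.5, 0.0, 0.4]))  # 0.7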

24 Composites We want to look at a certain kind of model system:
We want to look at a certain kind of system in the model, e.g. stratocumulus regions or extra-tropical cyclones. An individual case may not be conclusive (is it typical?), while general statistics may swamp this kind of system. A compositing technique can be used instead.

25 Composites - a cloud survey
From satellite data, attempt to derive cloud-top pressure and cloud optical thickness for each pixel; the data are then divided into regimes according to the sea-level pressure anomaly, and the ISCCP simulator is applied to the model (Tselioudis et al., 2000, J. Climate). Figure: joint histograms of cloud-top pressure (120 to 900 hPa) versus optical depth for data, model and model minus data, separately for negative and positive SLP anomalies. Result: high clouds are too thin, low clouds too thick. A sketch of such a regime-composited histogram is given below.
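The compositing step itself is simple; a hedged sketch (variable names and bin edges are illustrative, not the actual Tselioudis et al. analysis code):

    import numpy as np

    def regime_joint_histograms(ctp, tau, slp_anom,
                                ctp_bins=(120, 250, 350, 500, 620, 750, 900),
                                tau_bins=(0.0, 1.3, 3.6, 9.4, 23.0, 60.0, 380.0)):
        # Joint histograms of cloud-top pressure (hPa) versus optical depth,
        # composited separately for negative and positive SLP anomalies.
        # Bin edges here are ISCCP-style placeholders.
        ctp, tau, slp_anom = map(np.asarray, (ctp, tau, slp_anom))
        hists = {}
        for name, mask in (("-ve SLP", slp_anom < 0), ("+ve SLP", slp_anom >= 0)):
            h, _, _ = np.histogram2d(ctp[mask], tau[mask],
                                     bins=[list(ctp_bins), list(tau_bins)])
            hists[name] = h / max(mask.sum(), 1)   # frequency of occurrence
        return hists

    # Building the same histograms from ISCCP-simulator output of the model and
    # differencing them against the observed ones gives the model-minus-data panels.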

26 Composites – Extra-tropical cyclones
Overlay about 1000 cyclones, each centred on the location of maximum optical thickness, and plot the predominant cloud types by looking at anomalies from a 5-day average (Klein and Jakob, 1999, MWR). High tops = red, mid tops = yellow, low tops = blue. Again: high clouds are too thin, low clouds too thick. A sketch of the overlaying step follows below.
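A minimal sketch of that cyclone-centred compositing, assuming the cyclone centre indices have already been identified (illustrative code, not Klein and Jakob's analysis):

    import numpy as np

    def cyclone_composite(field, mean_5day, centers, half_width):
        # Composite cyclone-centred anomaly maps.
        # field, mean_5day: arrays of shape (time, y, x); centers: list of
        # (t, j, i) indices, e.g. the point of maximum optical thickness.
        anom = np.asarray(field) - np.asarray(mean_5day)
        boxes = []
        for t, j, i in centers:
            box = anom[t,
                       j - half_width:j + half_width + 1,
                       i - half_width:i + half_width + 1]
            if box.shape == (2 * half_width + 1, 2 * half_width + 1):
                boxes.append(box)      # skip boxes cut off at the domain edge
        return np.mean(boxes, axis=0)  # average over the ~1000 cyclones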

27 A strategy for cloud parametrization evaluation
Schematic after Jakob.

28 Recap: The problems
All observations: Are we comparing 'like with like'? What assumptions are contained in retrievals/variational approaches?
Long-term climatologies: Which physics is responsible for the errors? Dynamical regimes can diverge.
NWP, reanalysis, column models: Doesn't allow the interaction between the physics to be represented.
Case studies: Are they representative? Do changes translate into global skill?
Composites: As for case studies.
And one more problem, specific to NWP…

29 NWP cloud scheme development
Timescale of the validation exercise: many of the above validation exercises are complex and involved, and the results are often available O(years) after the project starts, for a single version of the model. The ECMWF operational model is updated roughly 2 to 4 times a year, so the validation results are often no longer relevant by the time they become available. Requirement: a quick and easy test bench.

30 Example: LWP ERA-40 and recent cycles
Panels: ERA-40 model cycle 23r4 (June 2001), cycle 26r1 (April 2003), SSM/I observations, and difference.

31 Example: LWP ERA-40 and recent cycles
Panels: ERA-40 model cycle 23r4 (June 2001), cycle 26r3 (Oct 2003), SSM/I observations, and difference.

32 Example: LWP ERA-40 and recent cycles
Panels: ERA-40 model cycle 23r4 (June 2001), cycle 28r1 (Mar 2004), SSM/I observations, and difference.

33 Example: LWP ERA-40 and recent cycles
Panels: ERA-40 model cycle 23r4 (June 2001), cycle 28r3 (Sept 2004), SSM/I observations, and difference.

34 Example: LWP ERA-40 and recent cycles
Panels: ERA-40 model cycle 23r4 (June 2001), cycle 29r1 (Apr 2005), SSM/I observations, and difference.

35 Example: LWP ERA-40 and recent cycles
Panels: ERA-40 model cycle 23r4 (June 2001), cycle 33r1 (Nov 2008), SSM/I observations, and difference. Do ERA-40 cloud studies still have relevance for the operational model?

36 So what do we use at ECMWF?
T799 L91:
Standard "scores" (RMS error, anomaly correlation of U, T, Z)
"Operational" validation of clouds against SYNOP observations
Simulated radiances against Meteosat-7
T159 L91 "climate" runs (4 ensemble members of 13 months), automatically producing comparisons with:
ERBE, NOAA-x, CERES TOA fluxes
QuikSCAT and SSM/I 10 m winds
ISCCP and MODIS cloud cover
SSM/I and TRMM liquid water path (soon MLS ice water content)
GPCP, TRMM, SSM/I and Xie-Arkin precipitation
Da Silva climatology of surface fluxes
ERA-40 analysis winds

37 Ease of use allows catalogue of climate to be built up
Issues: observation errors, variance, resolution sensitivity.

38 Top-of-atmos net LW radiation
Figure: TOA net LW radiation, model (T159 L91) and CERES, colour scale -300 to -150, with a difference map showing where the outgoing LW is too high or too low.

39 Top-of-atmos net SW radiation
Figure: TOA net SW radiation, model (T159 L91) and CERES, colour scale 100 to 350, with a difference map indicating where the albedo is too high or too low.

40 Total Cloud Cover (TCC)
Figure: total cloud cover, model (T159 L91) and ISCCP, colour scale 10 to 80, with a difference map showing where TCC is too high or too low.

41 Total Column Liquid Water (TCLW)
Figure: total column liquid water, model (T159 L91) and SSM/I, colour scale 25 to 250, with a difference map showing where TCLW is too high or too low.

42 Model validation Look at higher order statistics
Example: the PDF of cloud cover. This highlights a problem with one particular model version! A minimal sketch of the comparison is given below.
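In practice this amounts to comparing frequency distributions rather than means; a minimal sketch (variable names are placeholders):

    import numpy as np

    def cloud_cover_pdf(cc, nbins=10):
        # Normalised frequency distribution of grid-box cloud fraction.
        hist, edges = np.histogram(np.asarray(cc).ravel(),
                                   bins=nbins, range=(0.0, 1.0))
        return hist / hist.sum(), edges

    # pdf_model, edges = cloud_cover_pdf(model_cloud_fraction)
    # pdf_obs, _       = cloud_cover_pdf(observed_cloud_fraction)
    # Two models with the same mean cloud cover can have very different PDFs,
    # e.g. too much weight at exactly 0 and 1, which this comparison exposes.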

43 Model validation Look at relationships between variables
Example: liquid water path versus the probability of precipitation for different precipitation thresholds. This can be compared with satellite observations of LWP and precipitation rate, providing a test of the autoconversion parametrization. A sketch of the diagnostic is shown below.
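A hedged sketch of that diagnostic (names, bins and thresholds are illustrative, not the actual analysis code):

    import numpy as np

    def precip_probability_vs_lwp(lwp, precip_rate, lwp_bins,
                                  thresholds=(0.1, 1.0)):
        # For each LWP bin, the fraction of samples whose precipitation rate
        # exceeds each threshold (threshold values here are placeholders).
        lwp, precip_rate = np.asarray(lwp), np.asarray(precip_rate)
        idx = np.digitize(lwp, lwp_bins)
        prob = {}
        for thr in thresholds:
            curve = []
            for k in range(1, len(lwp_bins)):
                in_bin = precip_rate[idx == k]
                curve.append(np.mean(in_bin > thr) if in_bin.size else np.nan)
            prob[thr] = curve
        return prob

    # Computing these curves from model output and from satellite retrievals of
    # LWP and rain rate tests how quickly cloud water is converted to rain,
    # i.e. the behaviour of the autoconversion parametrization.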

44 Model validation Making the most of instrument synergy
Observational instruments each measure one aspect of the atmosphere. Combining information from different instruments can often provide complementary information (particularly for remote sensing), for example radars at different wavelengths, lidar and radiometers. Photo: radar, lidar and radiometer instruments at Chilbolton, UK.

45 Long term ground-based observations ARM / CLOUDNET
A network of stations processed over a multi-year period using identical algorithms, first in Europe and now also at the ARM sites. Some European centres provide operational forecasts so that direct comparisons can be made quasi-realtime, with direct involvement of the met services to provide up-to-date information on model cycles.

46 Long term ground-based observations ARM Observational Sites
"Permanent" ARM sites plus a movable "ARM mobile facility" for observational campaigns.

47 Cloudnet Example. In addition to standard quicklooks, longer-term statistics are available; this example is for ECMWF cloud cover during June 2005. The processing includes a step to account for radar attenuation and snow. See the Cloudnet website for more details and examples (although the project is not funded at the current time!).

48 Space-borne active remote sensing A-Train
CloudSat and CALIPSO carry an active radar and lidar respectively, providing information on the vertical profile of clouds and precipitation (launched 28th April 2006). Approaches to model validation: model → observed parameters, or observations → model parameters.

49 Simulating Observations CFMIP COSP radar/lidar simulator
Model data (T, p, q, IWC, LWC, ...) are passed through a sub-grid cloud/precipitation pre-processor and, with physical assumptions (particle size distributions, Mie tables, ...), converted into radar reflectivity by the CloudSat simulator (Haynes et al. 2007) and into lidar attenuated backscatter by the CALIPSO simulator (Chiriaco et al. 2006). A deliberately simplified sketch of the reflectivity step is shown below.
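To convey the model-to-observation idea without the full machinery, here is a deliberately oversimplified sketch of the reflectivity step, using an assumed bulk IWC-Z power law with placeholder coefficients; this is not the Haynes et al. (2007) simulator:

    import numpy as np

    def toy_reflectivity_from_iwc(iwc_gm3, a=0.1, b=0.6):
        # Very simplified forward operator: convert ice water content (g m-3)
        # to radar reflectivity (dBZ) through an assumed power law
        # IWC = a * Z^b. The coefficients are placeholders; the real CloudSat
        # simulator uses particle size distributions and Mie tables rather
        # than a single bulk relation, and also handles attenuation.
        iwc = np.maximum(np.asarray(iwc_gm3, dtype=float), 1e-12)
        z_linear = (iwc / a) ** (1.0 / b)    # mm^6 m^-3
        return 10.0 * np.log10(z_linear)     # dBZ

    # Applying an operator like this level by level to the (sub-grid sampled)
    # model IWC profile gives a field directly comparable with the CloudSat
    # curtain, rather than comparing retrieved quantities.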

50 Example cross-section through a front Model vs CloudSat radar reflectivity

51 Example CloudSat orbit "quicklook" (http://www.cloudsat.cira…)

52 Example section of a CloudSat orbit 26th February 2006 15 UTC
Figure annotations: mid-latitude cyclone, high tropical cirrus, mid-latitude cyclone.

53 Compare model with observed parameters: Radar reflectivity
Figure: orbit cross-section from 82°N through the tropics to 82°S, 26 February 2006, with the 0°C isotherm marked: simulated radar reflectivity from the model (ice only) and observed radar reflectivity from CloudSat (ice + rain).

54 Compare model parameters with equivalent derived from observations: Ice Amount
Figure: the same orbit (26 February 2006), from Antarctica to Greenland, in log10 kg m-3: model ice water content (excluding precipitating snow) versus ice water content derived from a 1D-Var retrieval of CloudSat/CALIPSO/Aqua data (Delanoë and Hogan, 2007, Univ. of Reading, UK).

55 Summary. Different approaches to verification (climate statistics, case studies, composites), different techniques (model-to-obs, obs-to-model) and a range of observations are required to validate and improve cloud parametrizations. We need to understand the limitations of observational data (e.g. what lies below the sensitivity limits of the radar, or how accurate the liquid water path derived from satellite is). The model developer needs to understand the physical processes in order to improve the model; this usually requires observations, theory and modelling, and observations can be used to test the model's physical relationships between variables.

56 The path to improved cloud parameterisation…
Parametrization improvement draws on all of these: climatological comparison, NWP validation, case studies and composite studies, feeding into cloud validation.

57 The path to improved cloud parameterisation…
Parametrization improvement: many mountains to climb! Climatological comparison, NWP validation, case studies and composite studies all feed into cloud validation.

