Numerical Weather Prediction: Parametrization of diabatic processes. Clouds (4): Cloud Scheme Validation. Richard Forbes and Adrian Tompkins.


1 Numerical Weather Prediction: Parametrization of diabatic processes. Clouds (4): Cloud Scheme Validation. Richard Forbes and Adrian Tompkins, ECMWF.

2 Cloud Validation: The issues. AIM: to simulate one aspect of nature, clouds, as faithfully as possible. APPROACH: validate the model-generated clouds against observations, and use the information about apparent errors to improve the model physics and hence the cloud simulation. [Diagram: cloud observations vs cloud simulation -> error -> parametrization improvements.] Sounds easy?

3 Cloud Validation: The problems. How much of the apparent error derives from the observations themselves? [Diagram: both the cloud observations (error 1) and the cloud simulation (error 2) contribute to the error that drives parametrization improvements.]

4 Cloud Validation: The problems. Which physics is responsible for the error? The error that feeds back into parametrization improvements may originate in radiation, convection, cloud physics, dynamics or turbulence, not only in the cloud scheme itself.

5 The path to improved cloud parametrization… Cloud validation feeds parametrization improvement via climatological comparison, case studies, composite studies and NWP validation. But how?

6 ECMWF Model Configurations. The ECMWF global atmospheric model (IFS) runs in many configurations at different resolutions:
– TL159 (125 km), L62: Seasonal Forecast System (6 months)
– TL255 (80 km), L62: Monthly Forecast System (32 days)
– TL399 (50 km), L62: Ensemble Prediction System (EPS) (15 days)
– TL799 (25 km), L91: current deterministic global NWP (10 days)
– TL1279 (16 km), L91: NWP from 2009 (10 days)
Future: 150 levels and 10 km horizontal resolution? We need verification across a range of spatial scales and timescales, and a model whose climate is robust to resolution.

7 Model climate: Broadband radiative fluxes. We can compare top-of-atmosphere (TOA) radiative fluxes with satellite observations, e.g. TOA shortwave radiation (TSR). [Map: JJA 1987 TSR, old model version (CY18R6) minus ERBE observations.] The stratocumulus regions are bad, and so is North Africa (an old model cycle!).

8 Model climate: Cloud radiative forcing. Problem: can we associate these errors with clouds? Another approach is to examine the cloud radiative forcing (note that CRF is sometimes defined with the opposite sign, F_clr - F, and there are also differences in how models calculate it). [Map: JJA 1987 SWCRF, old model version (CY18R6) minus ERBE observations.] Cloud problems: stratocumulus YES, North Africa NO!
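For reference, one common sign convention (the slide notes that the opposite sign is also used) writes the shortwave cloud radiative forcing at the top of the atmosphere as

$$\mathrm{SWCRF} = F_{\mathrm{SW}} - F_{\mathrm{SW,clr}},$$

where $F_{\mathrm{SW}}$ is the all-sky and $F_{\mathrm{SW,clr}}$ the clear-sky net downward shortwave flux, so that clouds give a negative SWCRF; some authors instead use $F_{\mathrm{clr}} - F$.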

9 Model climate: Cloud fraction / total cloud cover. We can also compare other variables to derived products, e.g. total cloud cover (TCC). [Map: JJA 1987 TCC, old model version (CY18R6) minus ISCCP observations.] References: ISCCP, Rossow and Schiffer, Bull. Am. Met. Soc., 1991; ERBE, Ramanathan et al., Science, 1989.

10 Comparison with satellite data: A problem. If more complicated cloud parameters are desired (e.g. vertical structure), the retrieval can be ambiguous: a given set of channel radiances is consistent with many combinations of liquid cloud, ice cloud and water vapour at different heights, so assumptions about the vertical structure are needed. Another approach is to go the other way and simulate the radiances from the model fields with a radiative transfer model.

11 Comparison with satellite data: Simulating satellite radiances. Examples: Morcrette, MWR, 1991; Chevallier et al., J. Climate. This gives more certainty in the diagnosis of the existence of a problem, but it doesn't necessarily help identify the origin of the problem.
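As a minimal sketch of the model-to-observation idea, the toy forward operator below maps model column fields to a broadband outgoing longwave flux for a single grey cloud layer over a black surface. It ignores gaseous absorption and multi-channel detail entirely; the real simulators cited on the slide perform a full radiative transfer calculation, and all numbers here are illustrative assumptions.

```python
import numpy as np

SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant [W m-2 K-4]

def simulated_olr(t_surf, t_cloud, cloud_fraction, cloud_emissivity):
    """Crude forward operator: broadband OLR from a single grey cloud layer
    over a black surface, weighted by cloud fraction (no gaseous absorption)."""
    clear = SIGMA * t_surf**4
    cloudy = (1.0 - cloud_emissivity) * SIGMA * t_surf**4 \
             + cloud_emissivity * SIGMA * t_cloud**4
    return (1.0 - cloud_fraction) * clear + cloud_fraction * cloudy

# Example: a column with a cold, semi-transparent high cloud (values illustrative)
print(simulated_olr(t_surf=300.0, t_cloud=220.0,
                    cloud_fraction=0.6, cloud_emissivity=0.7))
```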

12 Comparison with satellite data: more complicated analyses are possible, e.g. the diurnal cycle and variability over tropical land. Observations show a late-afternoon peak in convection; the model peaks in the morning (a common problem).

13 NWP Forecast Evaluation. Differences in longer simulations may not be the direct result of the cloud scheme: interactions with radiation, dynamics etc. also contribute (e.g. the poor stratocumulus regions). Using short-term NWP forecasts or analyses restricts this and allows one to concentrate on the cloud scheme. [Time series of cloud cover bias, marking the introduction of the Tiedtke scheme.]

14 Example over Europe. [Map of cloud cover bias, binned into the categories -8:-3, -3:-1, -1:1, 1:3 and 3:8.]

15 NWP Forecast Evaluation: Meteosat and simulated IR. "Going more into the details of the cyclone, it can be seen that the model was able to reproduce the very peculiar spiral structure in the cloud bands. However, large differences can be noticed further east, in the warm sector of the frontal system attached to the cyclone, where the model largely underpredicts the typical high-cloud shield. Look for example at the two maps above, where a clear deficiency of cloud cover is evident in the model-generated satellite images north of the Black Sea. In this case this was systematic over different forecasts." – Quote from the ECMWF daily report, 11th April 2005.

16 NWP Forecast Evaluation: Meteosat and simulated water vapour channel. [Images: blue = moist, red = dry.] The 30-hour forecast is too dry in the frontal region. This is not a forecast drift, so does it mean the cloud scheme is at fault?

17 Case Studies. A case study concentrates on a particular location and/or time period in more detail, for which specific observational data are collected. Examples: GATE, CEPEX, TOGA-COARE, ARM...

18 Evaluation of vertical cloud structure. Mace et al., 1998, GRL: examined the frequency of occurrence of ice cloud at the ARM site on the American Great Plains. A reasonable match to the data was found.

19 Evaluation of vertical cloud structure. Hogan et al., 2001, JAM: analysis using the radar and lidar at Chilbolton, UK. A reasonable match was found.

20 Hogan et al. (2001): more detailed comparisons are possible.

21 Hogan et al. (2001): the comparison improved when (a) snow was included, and (b) cloud below the sensitivity of the instruments was removed.
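A minimal sketch of step (b), assuming a simple detection limit on ice water content (the actual study works in the instrument's reflectivity/backscatter space, and the threshold value below is purely illustrative):

```python
import numpy as np

def mask_undetectable_cloud(cloud_fraction, ice_water_content,
                            detection_limit=1e-7):
    """Remove model cloud that the instrument could not have detected.

    cloud_fraction, ice_water_content : arrays (e.g. time x height)
    detection_limit : assumed minimum detectable IWC [kg m-3]; illustrative only.
    """
    visible = ice_water_content >= detection_limit
    return np.where(visible, cloud_fraction, 0.0)
```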

22 Issues Raised.
WHAT ARE WE COMPARING?
– Is the model statistic really equivalent to what the instrument measures?
– e.g. the radar sees snow, but the model may not include this in its definition of cloud fraction; small ice amounts may be invisible to the instrument but included in the model statistic.
HOW STRINGENT IS OUR TEST?
– Perhaps the variable is easy to reproduce.
– e.g. mid-latitude frontal clouds are strongly dynamically forced and the cloud fraction is often zero or one, so cloud-fraction statistics may be easy to reproduce in short-term forecasts.

23 Case studies can also be used to validate components of the cloud scheme. EXAMPLE: cloud overlap assumptions (Hogan and Illingworth, 2000, QJRMS); a sketch of the assumptions follows below. Issue raised: HOW REPRESENTATIVE IS OUR CASE-STUDY LOCATION? e.g. the wind shear and dynamics are very different between southern England and the tropics!
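A minimal sketch of two common overlap assumptions whose predicted total cloud cover can be tested against radar observations. The maximum-random formula follows the standard form (adjacent cloudy layers overlap maximally, layers separated by clear sky overlap randomly); the profile used is purely illustrative.

```python
import numpy as np

def tcc_random(cf):
    """Total cloud cover assuming random overlap of layer cloud fractions."""
    return 1.0 - np.prod(1.0 - np.asarray(cf))

def tcc_max_random(cf):
    """Total cloud cover assuming maximum-random overlap."""
    cf = np.asarray(cf, dtype=float)
    clear = 1.0
    prev = 0.0
    for c in cf:
        # Contiguous cloudy layers overlap maximally; clear gaps reset to random.
        clear *= (1.0 - max(c, prev)) / (1.0 - min(prev, 1.0 - 1e-9))
        prev = c
    return 1.0 - clear

profile = [0.0, 0.3, 0.5, 0.0, 0.2]   # illustrative layer cloud fractions, top to bottom
print(tcc_random(profile), tcc_max_random(profile))   # 0.72 vs 0.60
```

The two assumptions give different total cloud cover for the same profile, which is exactly the kind of difference radar observations of vertical structure can discriminate.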

24 Composites. We often want to look at a certain kind of system in the model, e.g. stratocumulus regions or extratropical cyclones. An individual case may not be conclusive (is it typical?), while general statistics may swamp this kind of system. A compositing technique can be used instead.

25 Composites: a cloud survey (Tselioudis et al., 2000, J. Climate). From satellite, cloud-top pressure and cloud optical thickness are derived for each pixel; the data are then divided into regimes according to the sea-level-pressure anomaly (positive and negative SLP), and the ISCCP simulator is applied to the model. [Panels: optical depth vs cloud-top pressure histograms for data, model and model minus data.] Findings: (1) high clouds are too thin; (2) low clouds are too thick.
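A minimal sketch of the regime-splitting step, assuming hypothetical arrays of per-pixel diagnostics and co-located SLP anomalies (the variable names, distributions and thresholds are illustrative, not those of the study):

```python
import numpy as np

def composite_by_regime(optical_depth, cloud_top_pressure, slp_anomaly):
    """Split pixel-level cloud diagnostics into positive / negative
    SLP-anomaly regimes and return mean (tau, CTP) per regime."""
    regimes = {}
    for name, mask in {"+ve SLP": slp_anomaly > 0,
                       "-ve SLP": slp_anomaly <= 0}.items():
        regimes[name] = (optical_depth[mask].mean(),
                         cloud_top_pressure[mask].mean())
    return regimes

# Illustrative random "pixels"
rng = np.random.default_rng(0)
tau = rng.gamma(2.0, 3.0, size=1000)        # optical depth
ctp = rng.uniform(200.0, 900.0, size=1000)  # cloud-top pressure [hPa]
slp = rng.normal(0.0, 5.0, size=1000)       # SLP anomaly [hPa]
print(composite_by_regime(tau, ctp, slp))
```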

26 Composites: extratropical cyclones (Klein and Jakob, 1999, MWR). Overlay about 1000 cyclones, each defined about a location of maximum optical thickness, and plot the predominant cloud types by looking at anomalies from a 5-day average. [Composite maps: high tops red, mid tops yellow, low tops blue.] Again, high clouds are too thin and low clouds too thick.

27 A strategy for cloud parametrization evaluation (Jakob).

28 Recap: The problems.
All observations:
– Are we comparing like with like? What assumptions are contained in retrievals / variational approaches?
Long-term climatologies:
– Which physics is responsible for the errors?
– Dynamical regimes can diverge.
NWP, reanalysis, column models:
– Don't allow the interaction between the physics to be fully represented.
Case studies:
– Are they representative? Do changes translate into global skill?
Composites:
– As for case studies.
And one more problem specific to NWP…

29 NWP cloud scheme development: the timescale of the validation exercise.
– Many of the above validation exercises are complex and involved.
– Often the results are available O(years) after the project starts, for a single version of the model.
– The ECMWF operational model is updated roughly 2 to 4 times a year, so the validation results are often no longer relevant once they become available.
Requirement: a quick and easy test bench.

30 Example: LWP in ERA-40 and recent cycles. [Maps: model liquid water path, SSM/I observations, and their difference; ERA-40 cycle 23r4 (June …) vs cycle …r1 (April 2003).]

31 Example: LWP in ERA-40 and recent cycles. [As above, ERA-40 cycle 23r4 (June …) vs cycle …r3 (October 2003).]

32 Example: LWP in ERA-40 and recent cycles. [As above, ERA-40 cycle 23r4 (June …) vs cycle …r1 (March 2004).]

33 Example: LWP in ERA-40 and recent cycles. [As above, ERA-40 cycle 23r4 (June …) vs cycle …r3 (September 2004).]

34 Example: LWP in ERA-40 and recent cycles. [As above, ERA-40 cycle 23r4 (June …) vs cycle …r1 (April 2005).]

35 Example: LWP in ERA-40 and recent cycles. [As above, ERA-40 cycle 23r4 (June …) vs cycle …r1 (November 2008).] Do ERA-40 cloud studies still have relevance for the operational model?

36 So what do we use at ECMWF?
T799 L91:
– Standard scores (RMS error, anomaly correlation of U, T, Z).
– Operational validation of clouds against SYNOP observations.
– Simulated radiances against Meteosat-7.
T159 L91 climate runs:
– 4 ensemble members of 13 months.
– Automatically produce comparisons to: ERBE, NOAA-x and CERES TOA fluxes; QuikSCAT and SSM/I 10 m winds; ISCCP and MODIS cloud cover; SSM/I and TRMM liquid water path (soon MLS ice water content); GPCP, TRMM, SSM/I and Xie-Arkin precipitation; the da Silva climatology of surface fluxes; ERA-40 analysis winds.

37 Ease of use allows a catalogue of the model climate to be built up. Issues: (1) observation errors; (2) variance; (3) resolution sensitivity.

38 Top-of-atmosphere net LW radiation. [Maps: model (T159 L91), CERES observations, and the difference, highlighting where the flux is too high or too low.]

39 Top-of-atmosphere net SW radiation. [Maps: model (T159 L91), CERES observations, and the difference, highlighting where the albedo is too high or too low.]

40 Total Cloud Cover (TCC). [Maps: model (T159 L91), ISCCP observations, and the difference, highlighting where TCC is too high or too low.]

41 Total Column Liquid Water (TCLW). [Maps: model (T159 L91), SSM/I observations, and the difference, highlighting where TCLW is too high or too low.]

42 Model validation: look at higher-order statistics. Example: the PDF of cloud cover, which highlights a problem with one particular model version!
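A minimal sketch of this kind of check, assuming hypothetical arrays of model and observed grid-box cloud fraction (the beta distributions below stand in for real data and are illustrative only):

```python
import numpy as np

def cloud_cover_pdf(cloud_fraction, bins=10):
    """Empirical PDF of cloud fraction: normalised frequency of occurrence
    in equal-width bins between 0 and 1."""
    hist, edges = np.histogram(cloud_fraction, bins=bins,
                               range=(0.0, 1.0), density=True)
    return hist, edges

# Illustrative data: model vs "observed" cloud fractions
rng = np.random.default_rng(1)
model_cf = rng.beta(0.5, 0.5, size=5000)   # U-shaped PDF (mostly 0 or 1)
obs_cf = rng.beta(0.7, 0.7, size=5000)
print(cloud_cover_pdf(model_cf)[0])
print(cloud_cover_pdf(obs_cf)[0])
```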

43 Model validation: look at relationships between variables. Example: liquid water path versus the probability of precipitation for different precipitation thresholds. This can be compared with satellite observations of LWP and precipitation rate, and used to test the autoconversion parametrization.
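A minimal sketch of this diagnostic, assuming hypothetical co-located arrays of LWP and precipitation rate (the bin edges, thresholds and synthetic data are all illustrative assumptions):

```python
import numpy as np

def precip_probability_vs_lwp(lwp, precip_rate, lwp_bins, thresholds):
    """For each LWP bin, the fraction of samples whose precipitation rate
    exceeds each threshold: P(precip > threshold | LWP bin)."""
    idx = np.digitize(lwp, lwp_bins)          # assign each sample to an LWP bin
    prob = np.full((len(lwp_bins) - 1, len(thresholds)), np.nan)
    for i in range(1, len(lwp_bins)):
        in_bin = idx == i
        if in_bin.any():
            for j, thr in enumerate(thresholds):
                prob[i - 1, j] = np.mean(precip_rate[in_bin] > thr)
    return prob

lwp_bins = np.array([0.0, 0.05, 0.1, 0.2, 0.4, 0.8])   # kg m-2, illustrative
thresholds = [0.1, 1.0]                                  # mm h-1, illustrative

# Synthetic "samples" just to exercise the function
rng = np.random.default_rng(2)
lwp = rng.gamma(2.0, 0.08, size=2000)
precip_rate = np.where(lwp > 0.15, rng.exponential(0.5, 2000), 0.0)
print(precip_probability_vs_lwp(lwp, precip_rate, lwp_bins, thresholds))
```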

44 Model validation: making the most of instrument synergy. Observational instruments each measure one aspect of the atmosphere; combining information from different instruments can provide complementary information (particularly for remote sensing), for example radars at different wavelengths, lidar and radiometers. [Photo: radar, lidar and radiometer instruments at Chilbolton, UK.]

45 Long-term ground-based observations: ARM / Cloudnet. A network of stations processed for a multi-year period using identical algorithms, first in Europe and now also at the ARM sites. Several European centres provide operational forecasts so that direct comparisons are made quasi-real-time, with direct involvement of the met services to provide up-to-date information on model cycles.

46 Long-term ground-based observations: ARM observational sites. Permanent ARM sites plus the movable ARM mobile facility for observational campaigns.

47 Cloudnet Example. In addition to standard quicklooks, longer-term statistics are available; this example is for ECMWF cloud cover during June 2005. The processing includes preprocessing to account for radar attenuation and snow. See the Cloudnet website for more details and examples (though the project is not funded at the current time!).

48 Space-borne active remote sensing: the A-Train. CloudSat and CALIPSO carry active radar and lidar to provide information on the vertical profile of clouds and precipitation (launched 28th April 2006). Approaches to model validation: model -> observation parameters, or observation -> model parameters.

49 Simulating observations: the CFMIP COSP radar/lidar simulator. [Flow diagram: model data (T, p, q, IWC, LWC, …) -> sub-grid cloud/precipitation pre-processor -> CloudSat radar simulator (Haynes et al. 2007) and CALIPSO lidar simulator (Chiriaco et al. 2006), using physical assumptions (particle size distributions, Mie tables, …) -> radar reflectivity and lidar attenuated backscatter.]
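A highly simplified sketch of the radar branch of this model-to-observation step. It uses an assumed power-law relation between ice water content and reflectivity instead of the full size-distribution and Mie-scattering calculations that the real simulator performs; the coefficients and input profile below are illustrative only.

```python
import numpy as np

def simulated_reflectivity_dbz(iwc, a=0.097, b=0.59):
    """Map model ice water content [g m-3] to an approximate radar
    reflectivity via an assumed power law IWC = a * Z**b
    (so Z = (IWC / a)**(1/b), with Z in mm6 m-3), returned in dBZ.
    A real simulator integrates an assumed particle size distribution
    with Mie scattering tables instead of using a fixed power law."""
    iwc = np.asarray(iwc, dtype=float)
    z_linear = np.where(iwc > 0.0, (iwc / a) ** (1.0 / b), np.nan)
    return 10.0 * np.log10(z_linear)

# Illustrative ice water contents [g m-3]
print(simulated_reflectivity_dbz([1e-3, 1e-2, 0.1]))
```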

50 Example cross-section through a front: model vs CloudSat radar reflectivity.

51 Example CloudSat orbit quicklook.

52 Example section of a CloudSat orbit, 26th February (… UTC). [Along-track view: mid-latitude cyclone, high tropical cirrus, another mid-latitude cyclone.]

53 Compare the model with observed parameters: radar reflectivity. Simulated radar reflectivity from the model (ice only) vs observed radar reflectivity from CloudSat (ice + rain). [Cross-sections from 82°N through the tropics to 82°S, with the 0°C level marked; 26/02/…, … Z.]

54 Compare model parameters with the equivalent derived from observations: ice amount. Model ice water content (excluding precipitating snow) vs ice water content derived from a 1D-Var retrieval of CloudSat/CALIPSO/Aqua data (Delanoë and Hogan, 2007, Reading Univ., UK). [Cross-sections from Greenland through the equator to Antarctica; log10 of IWC in kg m-3; 26/02/…, … Z.]

55 Summary. Different approaches to verification (climate statistics, case studies, composites), different techniques (model-to-obs, obs-to-model) and a range of observations are required to validate and improve cloud parametrizations. We need to understand the limitations of observational data (e.g. what lies beyond the sensitivity limits of the radar, or what the accuracy of satellite-derived liquid water path is). The model developer needs to understand physical processes to improve the model; usually this requires observations, theory and modelling, and observations can be used to test the model's physical relationships between variables.

56 The path to improved cloud parametrization… Cloud validation feeds parametrization improvement via climatological comparison, case studies, composite studies and NWP validation. But how?

57 The path to improved cloud parametrization… Cloud validation feeds parametrization improvement via climatological comparison, case studies, composite studies and NWP validation. Many mountains to climb!

