15 October 2004 IPWG-2, Monterey Anke Thoss


The SEVIRI Precipitating Clouds Product of the Nowcasting SAF: First results 15 October 2004 IPWG-2, Monterey Anke Thoss Swedish Meteorological and Hydrological Institute Ralf Bennartz University of Wisconsin

Contents Introduction Algorithm Examples Performance Plans

Problem overview: Except for strong convection, VIS/IR features are not strongly correlated with precipitation. Likelihood estimates in intensity classes are therefore more appropriate than rain-rate retrieval.
NWCSAF approach: two complementary products for nowcasting purposes:
1. Precipitating Clouds (PC) product gives the likelihood of precipitation in coarse intensity classes
2. Convective Rain Rate (CRR) product estimates the rain rate for strongly convective situations

Three classes of precipitation intensity from the PC product, matched against three classes from co-located radar data:
Class 0: precipitation-free, 0.0 - 0.1 mm/h
Class 1: light/moderate precipitation, 0.1 - 5.0 mm/h
Class 2: intensive precipitation, above 5.0 mm/h
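The class boundaries above can be sketched as a hard classifier (a minimal illustration with a hypothetical function name; the operational PC product assigns a likelihood per class rather than a single hard class):

```python
def intensity_class(rate_mm_h: float) -> int:
    """Map a co-located radar rain rate (mm/h) to the three PC intensity classes."""
    if rate_mm_h < 0.1:
        return 0  # class 0: precipitation-free (0.0 - 0.1 mm/h)
    if rate_mm_h < 5.0:
        return 1  # class 1: light/moderate precipitation (0.1 - 5.0 mm/h)
    return 2      # class 2: intensive precipitation (above 5.0 mm/h)
```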

Data sets for algorithm development. Co-located sets of:
AVHRR
NWP surface temperature (HIRLAM)
radar reflectivities (dBZ), gauge-adjusted, from the BALTRAD Radar Data Centre BRDC (Michelsson et al. 2000)
No quantitative tuning to MSG was performed for version 1.0, which is presented here!

Input: NWCSAF Cloud Type product, NWP surface temperature (ECMWF), MSG channels: 0.6 µm, 1.6 µm, 3.9 µm, 11 µm and 12 µm

Algorithm development:
Based on Cloud Type output
Correlation of spectral features with precipitation investigated
Special attention to cloud microphysics (separate day/night algorithms)
Precipitation Index PI constructed as a linear combination of spectral features
Algorithms are cloud-type specific

Correlation of spectral features with rain class, all potentially raining cloud types:
T11: -0.24
Tsurf - T11: 0.26
T11 - T12: -0.16
R0.6: 0.18
R3.7: -0.18
ln(R0.6/R3.7): 0.26
R0.6/R1.6: 0.42
Combined indices: 3.7 µm day algorithm (all): 0.35; 1.6 µm day algorithm (all): 0.44; night algorithm (all): 0.30

[Figure: probability distributions, all raining cloud types: night algorithm, 3.7 µm day algorithm, 1.6 µm day algorithm]

Precipitation Index
Example, AVHRR 3.7 µm day algorithm, all cloud types:
PI = 35 + 0.644(Tsurf - T11) + 5.99 ln(R0.7/R3.7) - 3.93(T11 - T12)
Example, AVHRR 1.6 µm day algorithm, all cloud types:
PI = 65 - 15 |4.45 - R0.6/R1.6| + 0.495 R0.6 - 0.915(T11 - T12) + 0 Tsurf + 0 T11
MSG day algorithm: a blend of the 3.7 µm day algorithm (applied to the 3.9 µm channel) and the 1.6 µm algorithm with equal weight; some additional features are introduced for later use in quantitative tuning (a8-a10):
PI = a0 + a1 Tsurf + a2 T11 + a3 ln(R0.6/R3.9) + a4(T11 - T12) + a5 |a6 - R0.6/R1.6| + a7 R0.6 + a8 R1.6 + a9 R3.9 + a10(R1.6/R3.9)
The MSG night algorithm is still identical to PPS.
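The first formula can be transcribed directly (a sketch; the function and variable names are mine, and the slide gives no clipping or valid-range handling):

```python
import math

def pi_avhrr_37_day(tsurf, t11, t12, r07, r37):
    """AVHRR 3.7 um day Precipitation Index, all cloud types, as on the slide.

    tsurf, t11, t12 are brightness/surface temperatures in K;
    r07 and r37 are the 0.7 um and 3.7 um reflectances.
    """
    return (35
            + 0.644 * (tsurf - t11)
            + 5.99 * math.log(r07 / r37)
            - 3.93 * (t11 - t12))
```

For example, tsurf = 280 K, t11 = 260 K, t12 = 258 K, r07 = 0.5, r37 = 0.1 gives PI of about 49.7.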

Cloud type dependence (reported 30-min rain frequency at Hungarian gauges, March-June 2004):
Algorithm 0: all precipitating cloud types
Algorithm 1: medium-level clouds (14.9%, 5027 collocations)
Algorithm 2: high and very high opaque clouds (31.4%, 4126 collocations)
Algorithm 3: medium to thick cirrus (5.3%, 5999 collocations; thick cirrus most rain)
Algorithm 4: cirrus over lower cloud
No precipitation: all cloud-free classes, low and very low clouds, thin cirrus, fractional cloud (0.1% for cloud-free, of 9255; 0.9% for non-precipitating cloud types, of 11459)

[Figure: cloud type and total precipitation likelihood (day), March 2004, 12 UTC; likelihood scale 0-100%]

Night algorithm, courtesy of M. Putsay, Hungarian Meteorological Service

Day algorithm, 2003-10-14, 10:45 UTC

[Figure: upper row PC class 1, lower row PC class 2; 05:30, 06:30 and 07:30 UTC, 2003-10-14]

[Figure panels: 30 min. sampling | 10 min. sampling]

[Figure panels: high + very high opaque | medium level | cirrus moderate-thick | Ci over lower cloud]

Validation against Hungarian gauges, March-June 2004, day, 20% likelihood threshold (N = 36466; raining: 7.1% at 30-min sampling, 4.9% at 10-min sampling).
Contingency, percent of gauge class:
30-min sampling: no-rain gauges: 84.1% MSG no rain, 15.9% MSG rain; raining gauges: 24.0% MSG no rain, 76.0% MSG rain
10-min sampling: no-rain gauges: 82.9% / 17.1%; raining gauges: 21.5% / 78.5%
Scores:
30 min, 20% threshold: POD 0.76, FAR 0.73, PODF 0.16, HK 0.60, BIAS 2.85, ACC 0.84
30 min, 30% threshold: POD 0.58, FAR 0.65, PODF 0.08, HK 0.50, BIAS 1.66, ACC 0.89
10 min, 20% threshold: POD 0.78, FAR 0.81, PODF 0.17, HK 0.61, BIAS 4.13, ACC 0.83
10 min, 30% threshold: POD 0.62, FAR 0.74, PODF 0.09, HK 0.52, BIAS 2.42

The same contingency tables as percent of the total number of cases (day, 20% likelihood threshold, Hungarian gauges, March-June 2004, N = 36466):
30-min sampling: correct no-rain 78.2%, false alarms 14.7%, misses 1.7%, hits 5.4%
10-min sampling: correct no-rain 78.9%, false alarms 16.3%, misses 1.0%, hits 3.8%
As percent of gauge class: 30 min: 84.1% / 15.9% (no-rain gauges), 24.0% / 76.0% (raining gauges); 10 min: 82.9% / 17.1%, 21.5% / 78.5%
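The POD/FAR/PODF/HK/BIAS/ACC figures follow from the 2x2 rain / no-rain contingency table in the standard way. A sketch, fed with the day 30-min percent-of-total entries (since these are rounded percentages rather than the raw counts, the last digit can differ slightly from the slide):

```python
def verification_scores(hits, misses, false_alarms, correct_negs):
    """Standard categorical scores from a 2x2 rain / no-rain contingency table."""
    obs_yes = hits + misses                      # observed rain cases
    obs_no = false_alarms + correct_negs         # observed no-rain cases
    total = obs_yes + obs_no
    pod = hits / obs_yes                         # probability of detection
    far = false_alarms / (hits + false_alarms)   # false alarm ratio
    podf = false_alarms / obs_no                 # probability of false detection
    hk = pod - podf                              # Hanssen-Kuipers skill score
    bias = (hits + false_alarms) / obs_yes       # frequency bias
    acc = (hits + correct_negs) / total          # accuracy
    return pod, far, podf, hk, bias, acc

# Day, 30-min sampling, 20% threshold, percent-of-total values from the table:
pod, far, podf, hk, bias, acc = verification_scores(5.4, 1.7, 14.7, 78.2)
# pod ~ 0.76, far ~ 0.73, podf ~ 0.16, hk ~ 0.60, acc ~ 0.84
```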

MSG PC product validation with surface observations. Dataset: 15 May - 18 June 2004, 12:00 UT: MSG data and collocated surface observations of present weather (only ww classes clearly indicating rain or no rain considered). PC product without use of cloud type (only an NN-based cloud mask).

Validation of MSG PC product Day, 45 N – 55 N, Total data points : 12123 (4.6 % raining) Likelihood of precipitation agrees well with synop

Validation of MSG PC product Night, 45 N – 55 N, Total data points : 12123 (4.6 % raining) Likelihood of precipitation agrees well with synop

Validation of MSG PC product Day, 30 N – 45 N, Total data points : 7218 (2.5% raining) Likelihood of precipitation is over-estimated by the PC product

Validation of MSG PC product Night, 30 N – 45 N, Total data points : 7218 (2.5 % raining) Likelihood of rain is over-estimated by the PC product

Validation scores, 45N-55N, 20% likelihood threshold (N = 12123, 4.6% raining), against synop ww:
Contingency, percent of synop class: Day: no-rain synops 84.2% MSG no rain / 15.8% MSG rain; raining synops 9.8% / 90.2%. Night: 81.3% / 18.7%; 11.4% / 88.6%.
Scores: Day: POD 0.90, FAR 0.78, PODF 0.16, HK 0.74, BIAS 4.12, ACC 0.84. Night: POD 0.88, HK 0.72, BIAS 4.17.

Validation scores, 30N-45N, 20% likelihood threshold (N = 7218, 2.5% raining), against synop ww:
Contingency, percent of synop class: Day: no-rain synops 87.4% MSG no rain / 12.6% MSG rain; raining synops 7.8% / 92.2%. Night: 85.5% / 14.5%; 9.8% / 91.1%.
Scores: Day: POD 0.92, FAR 0.84, PODF 0.13, HK 0.79, BIAS 5.78, ACC 0.86. Night: POD 0.91, FAR 0.86, PODF 0.14, HK 0.77, BIAS 6.51.

Score summary for the PC product, hard clustering at 20% likelihood threshold:
Product | POD | FAR | HK | BIAS | Details
AMSU land | 0.89 | 0.83 | 0.47 | | against BALTRAD radar; AMSU skill to resolve intensity not considered here
AMSU sea | 0.88 | 0.75 | 0.57 | |
MSG day 45-55N | 0.90 | 0.78 | 0.74 | 4.12 | Alg. 0 (no cloud type), May/June, 45-55N against synop ww
MSG night 45-55N | | | 0.72 | 4.17 |
MSG day 30-45N | 0.92 | 0.84 | 0.79 | 5.78 | 30-45N against synop ww
MSG night 30-45N | 0.91 | 0.86 | 0.77 | 6.51 |
MSG day 30 min. | 0.76 | 0.73 | 0.60 | 2.85 | cloud-type dependent, March-June 2004, against Hungarian gauges
MSG day 10 min. | | 0.81 | 0.61 | 4.13 |

Open questions
Why does verification against synop ww look better (POD) than the gauge comparison? (parallax adjustment, Alg. 0 better than Alg. 1-4, May/June easier, all difficult ww excluded, ...)
Timescale / horizontal scale (real effect or convenient bias correction?)
How can false alarms be reduced further?

Algorithm Performance - Summary
Night algorithm seems OK for strong convection, but overestimates precipitation (extent and intensity) in frontal situations
Day algorithm better in general, but has no skill in classifying precipitation intensity; recommended to display total precipitation likelihood
Discontinuities between day and night algorithm
Precipitation likelihood fairly correct between 45N and 55N
South of 45N, precipitation likelihood is overestimated

What is next?
Tuning against European synop, covering a year's cycle (status: ongoing)
While tuning, try to decrease the discontinuity between the day and night algorithms, especially for PC class 2; more gauge data needed for PC2 tuning
Later: investigate the usefulness of additional channels