Station lists and bias corrections
Jemma Davie, Colin Parrett, Richard Renshaw, Peter Jermey
© Crown Copyright 2012, Met Office

Outline
- Use of the model background
- Station lists
  − Surface
  − Aircraft
  − Upper air
- Background and buddy checks

Use of the background
- Observations are compared with the background, which is a 6-hour forecast.
- The background carries useful information because it contains observational information from previous analyses.
- [Figure: rate at which 12-hour 500 hPa height forecast error grows in ECMWF's system when satellite data are removed (Fisher 2004).]
- It takes roughly 7 days before the influence of the satellite data is "forgotten".

Station Lists
- Contain information about which observations should or should not be rejected, bias corrections, probabilities of gross error, thinning and observation errors.
- Have static and updated sections.
- Station lists for different observation types (surface, aircraft, sonde etc.) are updated in different ways.
- Example entries (see the sketch below):
  − Reject TEMP temperature reports from station ID between 700 hPa and 400 hPa
  − Correct SYNOP pressure reports from station ID by 20 hPa
  − Reject marine reports over land
  − Do not reject PILOT wind reports
  − Reject aircraft reports above 90 hPa
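To make the structure of such entries concrete, the following is a minimal Python sketch (hypothetical field names and placeholder station IDs, not the actual OPS station-list format) of how reject and correct rules could be represented and applied to a report:

# Minimal sketch of station-list entries (hypothetical field names, not the
# actual OPS station-list format).
from dataclasses import dataclass
from typing import Optional

@dataclass
class StationListEntry:
    station_id: str          # station / aircraft / airline identifier
    report_type: str         # e.g. "TEMP", "SYNOP", "PILOT", "AMDAR"
    variable: str            # e.g. "temperature", "pressure", "wind"
    action: str              # "reject" or "correct"
    p_min_hpa: Optional[float] = None   # lower pressure bound of affected layer
    p_max_hpa: Optional[float] = None   # upper pressure bound of affected layer
    correction: float = 0.0             # additive correction if action == "correct"

# Example entries mirroring the slide text (station IDs are placeholders).
entries = [
    StationListEntry("12345", "TEMP", "temperature", "reject",
                     p_min_hpa=400.0, p_max_hpa=700.0),
    StationListEntry("67890", "SYNOP", "pressure", "correct", correction=20.0),
]

def apply_station_list(report, entries):
    """Reject or correct a report according to matching station-list entries."""
    for e in entries:
        if (report["station_id"], report["type"], report["variable"]) != \
           (e.station_id, e.report_type, e.variable):
            continue
        in_layer = True
        if e.p_min_hpa is not None and e.p_max_hpa is not None:
            in_layer = e.p_min_hpa <= report["pressure_hpa"] <= e.p_max_hpa
        if not in_layer:
            continue
        if e.action == "reject":
            report["rejected"] = True
        elif e.action == "correct":
            report["value"] += e.correction
    return report

In the operational system the static and updated sections would be read from files maintained separately; here everything is inlined for brevity.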

Surface
- Report types: land SYNOP, MOBSYN, METAR, ships and buoys.
- Variables: Pmsl, Pstn, wind, 2 m temperature, 2 m relative humidity, visibility.
- For each station:
  − Calculate O−B statistics (a sketch of this step follows).
  − Decide whether to reject reports from the station.
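As an illustration of the statistics step, here is a short Python sketch (not Met Office code) that accumulates monthly O−B departures per station and returns the count, mean and standard deviation used in the decisions described next:

# Illustrative sketch: monthly O-B statistics per station.
import math
from collections import defaultdict

def station_omb_stats(reports):
    """reports: iterable of dicts with 'station_id', 'observed', 'background'.
    Returns {station_id: (count, mean O-B, std O-B)}."""
    departures = defaultdict(list)
    for r in reports:
        departures[r["station_id"]].append(r["observed"] - r["background"])
    stats = {}
    for sid, d in departures.items():
        n = len(d)
        mean = sum(d) / n
        var = sum((x - mean) ** 2 for x in d) / n
        stats[sid] = (n, mean, math.sqrt(var))
    return stats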

Surface − Non-pressure variables
- Few observations: check the previous month.
- Enough observations:
  − Check if the statistics are good enough to accept the station.
  − Check if the statistics are bad enough to reject the station.
  − If neither, make the decision using further checks involving whether the station was rejected in the previous month (see the sketch below).
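A hedged sketch of that decision logic follows; the thresholds and the minimum-observation count are placeholders, not the operational values:

# Sketch of the accept/reject decision for non-pressure surface variables.
MIN_OBS = 30          # assumed: below this, fall back to last month's decision
GOOD_RMS = 1.0        # assumed "good enough" O-B spread (in variable units)
BAD_RMS = 3.0         # assumed "bad enough" O-B spread

def non_pressure_decision(n_obs, omb_rms, rejected_last_month):
    """Return True if the station's reports should be rejected."""
    if n_obs < MIN_OBS:
        # Too few observations this month: rely on the previous month's status.
        return rejected_last_month
    if omb_rms <= GOOD_RMS:
        return False                     # statistics clearly good: accept
    if omb_rms >= BAD_RMS:
        return True                      # statistics clearly bad: reject
    # Borderline: be stricter with stations that were already rejected.
    return rejected_last_month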

Surface − Pressure
- Create a histogram of the O−B values.
- Smooth the histogram.
- Find the peak and construct a 15-bin window centred on the peak.
- Calculate the mean and standard deviation of O−B within the window.
- Compare these to limits to make the decision.
- The negative of the bias is used as the correction (see the sketch below).
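The following Python sketch illustrates this histogram approach; the bin width, smoothing kernel and acceptance limits are illustrative assumptions, not the operational settings:

# Sketch of the histogram-based surface pressure bias estimate.
import numpy as np

def pressure_bias(omb, bin_width=0.5, window_bins=15,
                  max_abs_bias=4.0, max_std=3.0):
    """omb: array of O-B surface pressure departures (hPa).
    Returns (correction, reject_flag)."""
    edges = np.arange(omb.min(), omb.max() + bin_width, bin_width)
    counts, edges = np.histogram(omb, bins=edges)
    # Smooth the histogram with a simple 1-2-1 running mean.
    smooth = np.convolve(counts, [0.25, 0.5, 0.25], mode="same")
    peak = int(np.argmax(smooth))
    lo = max(peak - window_bins // 2, 0)
    hi = min(peak + window_bins // 2 + 1, len(counts))
    # Keep only departures falling inside the 15-bin window around the peak.
    in_window = (omb >= edges[lo]) & (omb < edges[hi])
    mean, std = omb[in_window].mean(), omb[in_window].std()
    reject = abs(mean) > max_abs_bias or std > max_std
    correction = -mean          # negative of the bias is used as the correction
    return correction, reject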

Aircraft
- Observation types: airlines, AIREP/ADS, AMDAR.
- Variables: temperature and wind.
- Rejections can be made for individual aircraft or for an airline.
- AMDAR levels: High (… hPa), Medium (… hPa) and Low (>700 hPa).
- Corrections for AMDAR temperature.

Upper Air
- Observation types: TEMP, PILOT, dropsondes, wind profilers.
- Variables: wind, temperature, relative humidity.
- Stations can be rejected for different cycles.
- The whole profile can be rejected, or just some layers.
- For each station:
  − Statistics
  − Check wind direction bias
  − Initial rejections and corrections
  − Check whether to increase PGE
  − Smoothing
  − Final decision

Upper Air − Initial rejections
- Compare statistics with limits.
- Correct for radiation errors in temperature (Hawson correction).

Upper Air − Initial rejections (continued)
- Map model-level decisions to decisions for pressure layers.

Upper Air − Updated rejections
- Take into account the previous month's rejections.
- Combine for winds.
- [Table: how Suspect/Reject flags, the previous month's rejections and the number of reports combine into updated rejections.]

Upper Air − Smoothing
- Check stations for increasing PGE.
- Layer rejections are smoothed.
- Final rejection decisions are made.
- [Figure: examples of all-layer, low-layer, mid-layer and upper-layer rejections.]

Background and buddy checks
- The probability of gross error (PGE) is updated by consistency checks, the background check and the buddy check.
- Background check: compares the observation with the model background, using a probability density with combined variance σ² = σ_O² + σ_B² (a sketch follows after this list).
- Buddy check: checks the observation against nearby observations.
- If the resulting PGE > 0.5, the final flag is set and the observation is not assimilated.
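As an illustration of how a background check can update the PGE, here is a Python sketch of a Bayesian update in the style of Lorenc and Hammon (1988); the flat gross-error density width and the numbers in the usage example are assumptions, not operational settings:

# Sketch of a Bayesian background-check PGE update.
import math

def background_check_pge(omb, sigma_o, sigma_b, pge_prior, gross_width=100.0):
    """omb: observation-minus-background departure.
    sigma_o, sigma_b: observation and background error standard deviations.
    pge_prior: prior probability of gross error.
    gross_width: assumed width of the flat distribution for gross errors."""
    var = sigma_o ** 2 + sigma_b ** 2          # sigma^2 = sigma_O^2 + sigma_B^2
    p_good = math.exp(-0.5 * omb ** 2 / var) / math.sqrt(2.0 * math.pi * var)
    p_gross = 1.0 / gross_width                # flat density for gross errors
    # Posterior probability that the observation is in gross error.
    return (pge_prior * p_gross) / (pge_prior * p_gross +
                                    (1.0 - pge_prior) * p_good)

# Usage: the report is not assimilated if the final PGE exceeds 0.5, e.g.
# reject = background_check_pge(omb=5.0, sigma_o=1.0, sigma_b=1.5,
#                               pge_prior=0.01) > 0.5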

Summary
- Station lists
  − Static part: default settings for observation types and variables overall, as well as a few checks.
  − Updated part: rejections and corrections for individual stations.
- Quality control in the Observation Processing System for individual reports
  − Consistency checks, the background check and the buddy check update the PGE.
  − If the PGE exceeds 0.5, the report is not assimilated.
- Observation feedbacks
  − Could be used to identify systematic errors and improve the observation record.
  − Archiving ODBs for conventional observations.

Thank you for listening. Any questions?
© Crown Copyright 2012, Met Office