Presentation on theme: "Tomsk. Modern computational technologies in weather forecasting" (original title: "Томск. Современные вычислительные технологии в задачах прогноза погоды"). Presentation transcript:

1 G.S. Rivin, Rivin@ict.nsc.ru, www.ict.nsc.ru
Tomsk. Modern computational technologies in weather forecasting. G.S. Rivin

2 ATMOSPHERIC PROCESSES in SPACE-ATMOSPHERE-SEA/LAND system

3 Submodels

4 WMO WEATHER FORECASTING RANGES
Nowcasting: a description of current weather parameters and a 0-2 hour description of forecasted weather parameters.
Very short-range: description of weather parameters up to 12 hours.
Short-range: description of weather parameters beyond 12 hours and up to 72 hours.
Medium-range: description of weather parameters beyond 72 hours and up to 240 hours.
Extended-range: description of weather parameters beyond 10 days and up to 30 days, usually averaged and expressed as a departure from climate values for that period.
Long-range (from 30 days up to two years):
Monthly outlook: description of averaged weather parameters expressed as a departure (deviation, variation, anomaly) from climate values for that month (not necessarily the coming month).
Three-month or 90-day outlook: description of averaged weather parameters expressed as a departure from climate values for that 90-day period (not necessarily the coming 90-day period).
Seasonal outlook: description of averaged weather parameters expressed as a departure from climate values for that season.
Climate forecasting (beyond two years):
Climate variability prediction: description of the expected climate parameters associated with the variation of inter-annual, decadal and multi-decadal climate anomalies.
Climate prediction: description of expected future climate including the effects of both natural and human influences.

5 COMPONENTS of FORECAST SYSTEM
1. Observing system 2. Telecommunication system 3. Computer system 4. Data assimilation 5. Model 6. Postprocessing

6 Observing System: SYNOP, AIRCRAFT and AEROLOGICAL DATA

7 Observing System: RADARS (r = 100 km)

8

9 The system of equations (conservation laws applied to individual parcels of air) (from E.Kalnay)
V. Bjerknes (1904) pointed out for the first time that there is a complete set of 7 equations with 7 unknowns that governs the evolution of the atmosphere: conservation of the 3-dimensional momentum (equations of motion), conservation of dry air mass (continuity equation), the equation of state for perfect gases, conservation of energy (first law of thermodynamics), equations for the conservation of moisture in all its phases. They include in their solution fast gravity and sound waves, and therefore in their space and time discretization they require the use of smaller time steps, or alternative techniques that slow them down. For models with a horizontal grid size larger than 10 km, it is customary to replace the vertical component of the equation of motion with its hydrostatic approximation, in which the vertical acceleration is neglected compared with gravitational acceleration (buoyancy). With this approximation, it is convenient to use atmospheric pressure, instead of height, as a vertical coordinate.
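For reference (standard textbook form, not quoted from the slide), the hydrostatic approximation mentioned above replaces the vertical momentum equation by a balance between the vertical pressure-gradient force and gravity,
\[ \frac{\partial p}{\partial z} = -\rho g , \]
which, together with the equation of state $p = \rho R T$, is what makes pressure a convenient vertical coordinate.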

10

11 MODELS of ATMOSPHERE (December 2003)
ECMWF: T511L60 – 40 km; EPS: T255L60 – 80 km;
DWD: GME (L41) – 40 km; LM (L35/50) – (2.8)/7 km;
France: ARPEGE (L41); ALADIN (L41) – 9 km;
HIRLAM (L16-31);
UK: UM (L30) – 60 km; (L38) – 12 km;
USA: AVN (T254L64) – 60 km; ETA (L60) – 12 km;
Japan: GSM (L40) – 60 km; MSM (L40) – 10 km;
Russian Federation: T85L31 – 150 km; (L31) – 75 km; Moscow region (300 km x 300 km).

12 FEATURES OF INFORMATION AND COMPUTATIONAL TECHNOLOGIES IN ATMOSPHERIC SCIENCES
Coordinate systems: p, sigma, z, eta, hybrid.
Models of the atmosphere. Grid steps: global … km, local 7-12 km. Methods: splitting, semi-Lagrangian schemes (23), ensembles, nonhydrostatic formulations, grids.
Data assimilation: (4)D-Var, Kalman filter.
Reanalyses: NCEP/NCAR (USA), … years (1948-…; T62L28, ~210 km); Reanalysis-2 (ETA RR, 32 km, 45 layers); ECMWF ERA-15 (TL106L31, ~150 km), ERA-40 (TL159L60, ~120 km, 3D-Var, mid-…).
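For readers unfamiliar with the coordinate names above (standard definitions, not from the slide), the terrain-following sigma coordinate, for example, is defined as
\[ \sigma = \frac{p}{p_s} , \]
so that $\sigma = 1$ at the surface and $\sigma \to 0$ at the model top; hybrid coordinates blend this terrain-following behaviour near the ground with pure pressure surfaces aloft.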

13 Modern computational technologies and their possible further development
Ensemble simulation

14 ECMWF: FORECASTING SYSTEM - DECEMBER 2003
Model: smallest half-wavelength resolved: 40 km (triangular spectral truncation 511). Vertical grid: 60 hybrid levels (top pressure: 10 Pa). Time-step: … minutes. Numerical scheme: semi-Lagrangian, semi-implicit time-stepping formulation. Number of grid points in the model: 20,911,680 upper-air, 1,394,112 in land surface and sub-surface layers. The grid for computation of physical processes is a reduced, linear Gaussian grid, on which single-level parameters are available; the grid spacing is close to 40 km.
Variables at each grid point (recalculated at each time-step): wind, temperature, humidity, cloud fraction and water/ice content, ozone content (also pressure at surface grid points).
Physics: orography (terrain height and sub-grid scale), drainage, precipitation, temperature, ground humidity, snowfall, snow cover and snow melt, radiation (incoming short-wave and outgoing long-wave), friction (at surface and in the free atmosphere), sub-grid-scale orographic drag (gravity waves and blocking effects), evaporation, sensible and latent heat flux, oceanic waves.
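As a quick arithmetic check (a standard estimate, not stated on the slide), the 40 km figure follows from the triangular truncation: the smallest resolved half-wavelength is roughly $\pi a / N$ for truncation $N$ and Earth radius $a \approx 6371$ km, so
\[ \frac{\pi \times 6371\ \text{km}}{511} \approx 39\ \text{km} \approx 40\ \text{km} . \]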

15 ECMWF: FORECASTING SYSTEM - DECEMBER 2003
Data Assimilation. Analysis: mass and wind (four-dimensional variational multivariate analysis on 60 model levels); humidity (four-dimensional variational analysis on model levels up to 250 hPa); surface parameters (sea surface temperature from the NCEP Washington analysis, sea ice from SSM/I satellite data), soil water content, snow depth, and screen-level temperature and humidity.
Data used: global satellite data (SATOB/AMV, (A)TOVS, Quikscat, SSM/I, SBUV, GOME, Meteosat7 WV radiance), global free-atmosphere data (AIREP, AMDAR, TEMP, PILOT, TEMP/DROP, PROFILERS), oceanic data (SYNOP/SHIP, PILOT/SHIP, TEMP/SHIP, DRIBU), land data (SYNOP). Data checking and validation is applied to each parameter used. Thinning procedures are applied when observations are redundant at the model scale.
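For orientation (standard formulation, not taken from the slide), four-dimensional variational analysis finds the initial state $\mathbf{x}$ that minimises a cost function of the form
\[ J(\mathbf{x}) = \tfrac{1}{2}(\mathbf{x}-\mathbf{x}_b)^{\mathrm T}\mathbf{B}^{-1}(\mathbf{x}-\mathbf{x}_b) + \tfrac{1}{2}\sum_{i}\big(H_i(\mathbf{x})-\mathbf{y}_i\big)^{\mathrm T}\mathbf{R}_i^{-1}\big(H_i(\mathbf{x})-\mathbf{y}_i\big) , \]
where $\mathbf{x}_b$ is the background, $\mathbf{B}$ and $\mathbf{R}_i$ are the background- and observation-error covariances, $\mathbf{y}_i$ are the observations in time slot $i$, and $H_i$ includes the forecast model that carries the state to the observation times.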

16 Nonhydrostatic models
The Penn State/NCAR Mesoscale Model (e.g., Dudhia, 1993);
the CAPS Advanced Regional Prediction System (Xue et al., 1995);
NCEP's Regional Spectral Model (Juang et al., 1997);
the Mesoscale Compressible Community (MCC) model (Laprise et al., 1997);
the CSU RAMS (Tripoli and Cotton, 1980);
the US Navy COAMPS (Hodur, 1997).

17

18 WRF Development Teams (courtesy NCAR): working groups WG1, WG3, WG4, WG5, WG6, WG7, WG9 and WG12

19 Model Physics in High-Resolution NWP
The physics "no man's land" at km-scale grid spacings: between cumulus parameterization and resolved convection, between PBL parameterization and LES, and between two-stream and 3-D radiation. From Joe Klemp, NCAR (Bad Orb, 23-27.10.2003).

20 Weather Research and Forecasting Model
Goals: develop an advanced mesoscale forecast and assimilation system, and accelerate research advances into operations.
Figure: 36-h WRF precipitation forecast vs. analyzed precipitation, 27 Sept. 2002.
Collaborative partnership, principally among NCAR, NOAA, DoD, OU/CAPS, FAA, and the university community.
Multi-agency WRF governance; development conducted by 15 WRF Working Groups.
Software framework provides portable, scalable code with plug-compatible modules.
Ongoing active testing and rapidly growing community use.
Over 1,400 registered community users; annual workshops and tutorials for the research community.
Daily experimental real-time forecasting at NCAR, NCEP, NSSL, FSL, AFWA, U. of Illinois.
Operational implementation at NCEP and AFWA in FY04.
From Joe Klemp, NCAR (Bad Orb, 23-27.10.2003).

21 Hurricane Isabel
NOAA-17 AVHRR image, 13 Sep 03, 14:48 GMT. From Joe Klemp, NCAR (Bad Orb, 23-27.10.2003).

22 Hurricane Isabel Track
Track as of 18/1700Z: 4 km WRF initialized 17/0000Z and 10 km WRF initialized 15/1200Z. From Joe Klemp, NCAR (Bad Orb, 23-27.10.2003).

23 Hurricane Isabel 3 h Precip Forecast
WRF Model, 10 km grid, 5-day forecast, initialized 12 UTC 15 Sep 03. From Joe Klemp, NCAR (Bad Orb, 23-27.10.2003).

24 48 h Hurricane Isabel Reflectivity Forecast
Initialized 00 UTC 17 Sep 03: radar composite vs. 4 km WRF forecast. From Joe Klemp, NCAR (Bad Orb, 23-27.10.2003).

25 Hurricane Isabel Reflectivity at Landfall
18 Sep, …Z: radar composite vs. 41 h forecast from 4 km WRF. From Joe Klemp, NCAR (Bad Orb, 23-27.10.2003).

26 Hurricane Isabel Surface-Wind Forecast
WRF Model, 4 km grid, 2-day forecast, initialized 00 UTC 17 Sep 03. From Joe Klemp, NCAR (Bad Orb, 23-27.10.2003).

27 WRF Mass Coordinate Core
Terrain-following hydrostatic-pressure vertical coordinate.
Arakawa C-grid, two-way interacting nested grids (soon).
3rd-order Runge-Kutta split-explicit time differencing (sketched below).
Conserves mass, momentum, dry entropy, and scalars using flux-form prognostic equations.
5th-order upwind or 6th-order centered differencing for advection.
Physics for CR applications: Lin microphysics, YSU PBL, OSU/MM5 LSM, Dudhia shortwave / RRTM longwave radiation, no cumulus parameterization.
From Joe Klemp, NCAR (Bad Orb, 23-27.10.2003).
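The third-order Runge-Kutta item above refers to the Wicker-Skamarock scheme used in this core. The following Fortran sketch shows only that three-stage update: it is not WRF code, the acoustic sub-stepping of the split-explicit method is omitted, and the module name, array shape and placeholder tendency function are invented for illustration.

module rk3_sketch
  implicit none
contains

  subroutine rk3_step (phi, dt, nx)
    ! One large (advective) time step of the three-stage Runge-Kutta
    ! scheme: each stage re-evaluates the tendency from the latest guess.
    integer, intent(in)    :: nx
    real,    intent(inout) :: phi(nx)   ! prognostic field at time level n, updated to n+1
    real,    intent(in)    :: dt        ! large time step
    real :: phi1(nx), phi2(nx)
    phi1 = phi + (dt/3.0) * tendency(phi,  nx)   ! stage 1: dt/3 from the old state
    phi2 = phi + (dt/2.0) * tendency(phi1, nx)   ! stage 2: dt/2 using the stage-1 guess
    phi  = phi +  dt      * tendency(phi2, nx)   ! stage 3: full dt using the stage-2 guess
  end subroutine rk3_step

  function tendency (phi, nx) result (r)
    ! Placeholder right-hand side; a real dynamical core evaluates
    ! advection, pressure gradient, Coriolis and physics terms here.
    integer, intent(in) :: nx
    real,    intent(in) :: phi(nx)
    real :: r(nx)
    r = -phi                            ! simple linear decay, for illustration only
  end function tendency

end module rk3_sketch

In the actual split-explicit core, the acoustic and gravity-wave terms are integrated with smaller sub-steps nested inside each of these three stages.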

28 Model Configuration for 4 km Grid
Domain: 2000 x 2000 km, 501 x 501 grid; 50 mb top, 35 levels; 24 s time step.
Initialization: interpolated from gridded analyses (BAMEX: 40 km Eta CONUS analysis; Isabel: 1° GFS global analysis, ~110 km).
Computing requirements: 128 processors on an IBM SP Power4 Regatta; run time 106 min per 24 h of forecast.
From Joe Klemp, NCAR (Bad Orb, 23-27.10.2003).
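The numbers on this slide are mutually consistent; as a small arithmetic check (not on the slide itself):
\[ (501-1)\times 4\ \text{km} = 2000\ \text{km}, \qquad \frac{24\ \text{h}\times 3600\ \text{s h}^{-1}}{24\ \text{s}} = 3600\ \text{time steps per forecast day} . \]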

29 North American Early Guidance System
9/30/2002: data assimilation 12 km, 3DVAR, radial velocity; prediction model 12 km Meso Eta.
2/28/2004: data assimilation 10 km, hourly update and improved background error covariance; prediction model 10 km Meso Eta, improved physics.
5/31/2005: data assimilation 9 km, AIRS, GOES imagery, move top to 2 mb; prediction model 9 km NMM, 2 mb, hourly output.
5/31/2006: data assimilation 8 km, WRF 4DDA; prediction model 8 km WRF.
5/31/2008: data assimilation 7 km, absorption and scattering in radiative transfer; prediction model 7 km WRF, improved physics.
5/31/2009: data assimilation 6 km, aerosols in radiative transfer and reflectivity; prediction model 6 km WRF, aerosols.
5/31/2010: data assimilation 5 km, NPP, advanced 4DDA, NPOESS, IASI and air quality; prediction model 5 km WRF, L100.

30 Global Forecast System (GFS)
9/30/2002: data assimilation 3D-Var, AMSU-B, Quikscat; prediction model T-254 / L64.
2/28/2004: data assimilation grid-point version, AIRS, GOES imagery; prediction model T-254 / L64, add 2 passive tracers.
5/31/2005: data assimilation 3-D background error covariance, cloud analysis, minimization; prediction model 45 km / L64.
5/31/2006: data assimilation absorption / scattering in radiative transfer; prediction model 45 km / L64 + improved microphysics.
5/31/2008: data assimilation aerosols in radiative transfer, GIFTS.
5/31/2009: data assimilation NPP, integrated SST analysis; prediction model 40 km / L80.
5/31/2010: data assimilation advanced 4DDA, NPOESS, IASI + air quality; prediction model 35 km / L100.

31 Timeline for WRF at NCEP
North American WRF: operational in FY05.
Hurricane WRF: operational in FY06.
Rapid Refresh (RUC) WRF (hourly): operational in FY07.
WRF SREF: operational in FY07.
Others? (Fire weather, Homeland Security, etc.), using the best WRF deterministic model.

32 http://www.metoffice.com/research/nwp/numerical/unified_model/index
The Unified Model. The Unified Model is the name given to the suite of atmospheric and oceanic numerical modelling software developed and used at the Met Office. The formulation of the model supports global and regional domains and is applicable to a wide range of temporal and spatial scales, which allows it to be used for both numerical weather prediction and climate modelling as well as a variety of related research activities. The Unified Model was introduced into operational service in 1992. Since then, both its formulation and capabilities have been substantially enhanced.
New Dynamics. A major upgrade to the Met Office Global Numerical Weather Prediction model was implemented on 7th August 2002.
Submodels. The Unified Model is made up of a number of numerical submodels representing different aspects of the Earth's environment that influence the weather and climate. Like all coupled models, the Unified Model can be split up in a number of different ways, with various submodel components switched on or off for a specific modelling application.
The Portable Unified Model (PUM). A portable version of the Unified Model has also been developed, suitable for running on workstations and other computer systems.

33 The Met Office Global Numerical Weather Prediction model
The "New Dynamics" package of changes, under trial for over a year, was implemented in the Met Office Global Numerical Weather Prediction model on 7th August 2002. This document details some of the key changes that are part of the New Dynamics package:
Non-hydrostatic model with height as the vertical coordinate.
Charney-Phillips grid staggering in the vertical, Arakawa C-grid staggering in the horizontal.
Two-time-level semi-Lagrangian advection and semi-implicit time stepping (illustrated below).
Edwards-Slingo radiation scheme with non-spherical ice spectral files.
Large-scale precipitation includes prognostic ice microphysics.
Vertical-gradient-area large-scale cloud scheme.
Convection with convective available potential energy (CAPE) closure, momentum transports and convective anvils.
A boundary-layer scheme which is non-local in unstable regimes.
Gravity-wave drag scheme which includes flow blocking.
GLOBE orography dataset.
The MOSES (Met Office Surface Exchange Scheme) surface hydrology and soil model scheme.
Predictor-corrector technique with no extraction of a basic-state profile.
Three-dimensional Helmholtz-type equation solved using the GCR technique.
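As background for the two-time-level semi-Lagrangian advection item above (textbook form, not taken from this document), pure advection, $d\phi/dt = 0$, is treated by evaluating the field at the departure point of the trajectory arriving at each grid point:
\[ \phi^{n+1}(\mathbf{x}) = \phi^{n}(\mathbf{x}_d), \qquad \mathbf{x}_d \approx \mathbf{x} - \mathbf{u}\,\Delta t , \]
with $\phi^{n}(\mathbf{x}_d)$ obtained by interpolation; this is what frees the time step from the advective CFL restriction and is combined with the semi-implicit treatment of the fast terms.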

34 Météo-France: 2003 status
The operational forecast system at Météo-France is based on:
1. two different numerical applications of the same code ARPEGE-IFS;
2. additional code to build the limited-area model ALADIN.
ARPEGE-IFS has been developed jointly by Météo-France and ECMWF (ARPEGE is the usual name in Toulouse, IFS in Reading):
the ECMWF model for medium-range forecasts (4-7 days);
a Toulouse variable-mesh version for short-range predictions (1-4 days).
The ALADIN library has been developed jointly by Météo-France and 14 national meteorological or hydrometeorological services: Austria, Belgium, Bulgaria, Croatia, Czech Republic, Hungary, Moldova, Morocco, Poland, Portugal, Romania, Slovakia, Slovenia, Tunisia.

35 DWD FORECAST SYSTEM
Global model (GME) and Local model (LM, 325 x 325 gridpoints, 35 (40) layers).

36 DWD FORECAST SYSTEM Global model (GME)

37 DWD FORECAST SYSTEM: Local model (LM)

38 DWD trajectory models

39 DWD trajectory models

40

41 Further Development of the Local Systems LME and LMK 2003 to 2006
LME: Local model LM for the whole of Europe; mesh size 7 km and 40 layers; …-h forecasts from 00, 12 and 18 UTC.
LMK: LM-"Kürzestfrist"; mesh size < 3 km and 50 layers; 18-h forecasts from 00, 03, 06, 09, 12, 15, 18, 21 UTC for Germany, with explicit prediction of deep convection.
1. Data assimilation:
Q2 2005: use satellite (GPS) and radar data (reflectivity, VAD winds).
Q1 2006: use European wind profiler and satellite data.

42 Further Development of the Local Systems LME and LMK 2003 to 2006
2. Local modelling:
Q2 2004: increase the model domain (7 km mesh) from 325 x 325 up to 753 x 641 gridpoints (covering the whole of Europe), 40 layers.
Q3 2005: new convection scheme (Kain-Fritsch?).

43 Europe

44 LMK: LM-Kürzestfrist Model-based system for nowcasting and very short range forecasting Goals: Prediction of severe weather on the mesoscale. Explicit simulation of deep convection. Method: 18-h predictions of LM initialised every three hours, mesh size < 3 km Usage of new observations: SYNOP: Every 60 min, METAR: Every 30 min, GPS: Every 30 min, VAD winds: Every 15 min, Reflectivity: Every 15 min, Wind profiler: Every 10 min, Aircraft data.

45 LMK: A new 18-h forecast every three hours
Forecast start times: 00, 03, 06, 09, 12, 15, 18, 21 UTC.

46 High-resolution Regional Model HRM
Operational NWP model at 13 services worldwide.
Hydrostatic, (rotated) latitude/longitude grid.
Operators of second-order accuracy.
7 to 28 km mesh size, various domain sizes.
20 to 35 layers (hybrid, sigma/pressure).
Prognostic variables: ps, u, v, T, qv, qc, qi.
Same physics package as GME.
Programming: Fortran 90, OpenMP/MPI for parallelization.
From 00 and 12 UTC: forecasts up to 78 hours.
Lateral boundary conditions from GME at 3-hourly intervals.

47 General structure of a regional NWP system
Components (schematic): initial data (analysis), lateral boundary data and topographical data feed the regional NWP model; its direct model output (DMO) is passed to postprocessing (MOS, Kalman filtering), graphics and visualization, verification and diagnostics, and applications such as a wave model and trajectories.

48 Short Description of the High-Resolution Regional Model (HRM)
Hydrostatic limited-area meso-α- and meso-β-scale numerical weather prediction model.
Prognostic variables: surface pressure ps, temperature T, water vapour qv, cloud water qc, cloud ice qi, horizontal wind u, v, several surface/soil parameters.
Diagnostic variables: vertical velocity ω, geopotential φ, cloud cover clc, diffusion coefficients tkvm/h.

49 Current operational users of the HRM
Brazil, Directorate of Hydrography & Navigation;
Brazil, Instituto Nacional de Meteorologia;
Bulgaria, National Meteorological & Hydrological Service;
China, Guangzhou Regional Meteorological Centre;
India, Space Physics Laboratory;
Israel, Israel Meteorological Service;
Italy, Italian Meteorological Service;
Kenya, National Meteorological Service;
Oman, National Meteorological Service (DGCAM);
Romania, National Meteorological & Hydrological Service;
Spain, National Meteorological Institute;
United Arab Emirates, National Meteorological Institute;
Vietnam, National Meteorological & Hydrological Service; Hanoi University.

50 Numerics of the HRM
Regular or rotated latitude/longitude grid.
Mesh sizes between 0.25° and 0.05° (~28 to 6 km).
Arakawa C-grid, second-order centered differencing (see the formulas below).
Hybrid vertical coordinate, 20 to 35 layers.
Split semi-implicit time stepping; Δt = 150 s at Δλ = 0.25°.
Lateral boundary formulation due to Davies.
Radiative upper boundary condition as an option.
Fourth-order horizontal diffusion, slope correction.
Adiabatic implicit nonlinear normal-mode initialization.
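For concreteness (standard textbook forms, not taken from the slide), the second-order centred difference and the fourth-order horizontal diffusion listed above can be written as
\[ \left(\frac{\partial \phi}{\partial x}\right)_i \approx \frac{\phi_{i+1}-\phi_{i-1}}{2\,\Delta x}, \qquad \left(\frac{\partial \phi}{\partial t}\right)_{\mathrm{diff}} = -K_4\,\nabla^4 \phi , \]
where $K_4$ is an assumed diffusion coefficient; on the Arakawa C-grid the velocity and mass points are staggered by half a grid length, so many of these differences in practice span only one $\Delta x$.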

51 Physical parameterizations of the HRM
δ-two-stream radiation scheme (Ritter and Geleyn, 1992) including long- and shortwave fluxes in the atmosphere and at the surface; full cloud-radiation feedback; diagnostic derivation of partial cloud cover (relative humidity and convection).
Grid-scale precipitation scheme including parameterized cloud microphysics (Doms and Schättler, 1997).
Mass-flux convection scheme (Tiedtke, 1989) differentiating between deep, shallow and mid-level convection.
Level-2 scheme of vertical diffusion in the atmosphere, with similarity theory (Louis, 1979) at the surface.
Two-layer soil model including snow and interception storage; three-layer version for soil moisture as an option.

52 Computational aspects of the HRM
Fortran 90 and C (C only for input/output: GRIB code).
Multi-tasking for shared-memory computers based on standard OpenMP.
Efficient dynamic memory allocation.
NAMELIST variables for control of the model.
Computational cost: ~3100 flop per grid point, layer and time step (see the estimate below).
Interface to data of the global model GME, providing initial and/or lateral boundary data.
Built-in diagnostics of physical processes.
Detailed print-out of meteograms.
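The cost figure above translates directly into a total operation count. As a rough worked example (the time step of about 75 s at 0.125° is an assumption scaled from the 150 s quoted earlier for 0.25°; the grid is the 93 x 73 x 31 configuration from the Bulgarian experiments later in this presentation):
\[ 3100 \times 93 \times 73 \times 31 \times \frac{48 \times 3600}{75} \approx 1.5\times 10^{12}\ \text{flop} , \]
that is, roughly 1.5 Tflop in total for a 48-hour forecast.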

53

54

55 Further Development of the HRM 2003 to 2006
An MPI version of HRM for Linux PC clusters, developed in Vietnam, has been available to all HRM users since July 2003.
A 3D-Var data assimilation scheme developed in Italy will be available to experienced HRM users in early 2004.
The physics packages in GME and HRM will remain exactly the same.
The interaction between the different HRM groups should be intensified.
A first HRM User's Meeting will take place in Rio de Janeiro (Brazil) in October 2004.

56 West Siberian region

57 West Siberian region: 20 km grid, 101 x 101 points; 10 km grid

58 EFFS Partners
SMHI, DWD, DMI, WL | Delft, RIZA, GRDC, Univ. Lancaster, Univ. Bristol, ECMWF, JRC Ispra, Univ. Bologna.

59 TASKS for the DWD in the EFFS
1) Run the complete assimilation-forecast system for GME and LM for the three historical flood events for a period of roughly 2 weeks for each flood event. 2) Perform for the three flood events high resolution analyses of 24h precipitation heights on the basis of surface observations. 3) Develop a prototype-scheme for near real-time 24h precipitation analysis on the basis of Radar-data and synoptic precipitation observations. In addition to these tasks the operational model results according to task 1) for the period of the Central European flood were retrieved from the archives and supplied to the project ftp-server.

60

61 Maps of the constant fields for GME and LM.

62

63 EFFS DWD DATA RESULTS
Rain gauge stations (high-resolution and climate networks), July 1997: Austria, Czech Republic, Germany, Poland, Switzerland, and altogether.

64 NWP Systems, now and planned, 2002-2006 (computers, global and local models)
ECMWF: 0.96 Tf, TL511 (40 km) L60; 10 Tf; 20 Tf, TL511 (40 km) L60, TL799 (25 km) L91.
DWD: 1.92 Tf, 60 km L31, 7 km L35; 2.88 Tf, 40 km L40, 7/2 km L35; 18-28 Tf, 30 km L45.
NCEP: 7.3 Tf, T170 (80 km) L42, 12 km L60; T254 (50 km) L64; 15.6 Tf, TL611 (40 km) L42, 8 km; 28 Tf; 2007: global 30 km (L…), … km.
JMA (Japan): 0.768 Tf, T106 (120 km) L40, 20/10 km L40; TL319 (60 km) L42; 6 Tf, 5 km L50; 20 Tf; 2007: TL959 (20 km) L60.
CMA (China): 0.384 Tf, T213 (60 km) L31, 25 km L20; 3.84 Tf; ?; 15 km; 2008: 5 km.
HMC (Russia): 35 Gf, T85 (150 km) L31, 75 km L30; … Tf (?); T169 (80 km) L31.

65 ECMWF: EQUIPMENT IN USE (end of 2003)
Computer equipment being readied for operational use

66 Central Computer System (CCS)
Current (2001): 200 TB tape storage, 30 TB disk space, 1216 MB memory, … processors at … MHz.
Phase I (9/2002): 1250 TB tape storage, 42 TB disk space, 1408 MB memory, … processors at … GHz.
Phase II (6/2004): 2500 TB tape storage, 84 TB disk space, 2752 MB memory, … processors at … GHz.
But what are we going to do if we do not have such a CCS?

67 Results of V. Galabov's (Bulgaria) experiments with different PCs
Linux (Red Hat 7.3), PGI Workstation 4.0 (Portland Group Fortran and C++); HRM of DWD (hydrostatic High-Resolution Model), 93 x 73 gridpoints, 31 layers, grid spacing … (14 km), forecast for 48 hours.
AMD Duron 1300 MHz, … MB PC133 SDRAM: … min.
AMD Athlon XP, … MHz, … MB DDR266 RAM: … min.
Pentium, … GHz, … MB DDR333 SDRAM: … min.
Intel Xeon Workstation, 1 processor, … GHz, … MB RDRAM PC…: … min.
Intel Xeon Workstation, 2 processors, 2.4 GHz, … MB RDRAM PC…: … min.

68 Program TestOMP
program TestOMP
  use omp_lib                          ! OpenMP runtime library interface
  implicit none
  integer :: k, n, tid, nthreads, max_threads, procs
  integer :: hrs1, mins1, secs1, hsecs1, hrs2, mins2, secs2, hsecs2
  integer :: year, month, day
  logical :: dynamic, nested
  double precision :: d(5000)
  ! gettim/getdat are Compaq/Intel Fortran portability-library routines;
  ! the required argument kinds (e.g. integer(2)) depend on the compiler.
  call gettim (hrs1, mins1, secs1, hsecs1)
  call getdat (year, month, day)
  max_threads = OMP_GET_MAX_THREADS()
  procs       = OMP_GET_NUM_PROCS()
  dynamic     = OMP_GET_DYNAMIC()
  nested      = OMP_GET_NESTED()
!$OMP PARALLEL PRIVATE (nthreads, tid, n, k)
  tid      = OMP_GET_THREAD_NUM()
  nthreads = OMP_GET_NUM_THREADS()
  ! Synthetic workload: d is shared, so its content is only used for timing.
!$OMP DO SCHEDULE (STATIC, 5000)
  do n = 1, 10000
    do k = 1, 5000
      d(k) = sin(dble(k+n))**2 + cos(dble(k+n))**2
    end do
  end do
!$OMP END DO
!$OMP END PARALLEL
  call gettim (hrs2, mins2, secs2, hsecs2)
  call getdat (year, month, day)
end program TestOMP
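As a usage note (an assumption, not part of the slide): with the Intel Fortran compiler of that period the test could be built with something like ifort -openmp testomp.f90 on Linux (the corresponding Windows switch was /Qopenmp; newer compiler versions spell the option differently). The gettim/getdat calls come from the Compaq/Intel portability library, so other compilers such as gfortran would need them replaced by standard timing routines like system_clock or omp_get_wtime.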

69 Pentium 4, 3.06 GHz; 2 GB DDR DIMM PC3200; 120 GB Seagate
OS: Windows XP; BIOS: Hyper-Threading disabled; compiler: Visual Fortran 6.5; OpenMP: no; time: 3.59 s.
OS: Windows XP; BIOS: Hyper-Threading enabled; compiler: Visual Fortran 6.5; OpenMP: no; time: 3.63 s.
OS: Linux (Mandrake 9.2); BIOS: Hyper-Threading enabled or disabled (+ & -); compiler: Intel Fortran 8.0; OpenMP: yes; time: 2.38 s.

70 The future (from E.Kalnay)
There has been an amazing improvement in the quality of forecasts based on NWP guidance. From the active research currently taking place, one can envision that the next decade will continue to bring improvements, especially in the following areas:
Detailed short-range forecasts, using storm-scale models able to provide skillful predictions of severe weather.
More sophisticated methods of data assimilation, able to extract the maximum possible information from observing systems, especially remote sensors such as satellites and radars.
Development of adaptive observing systems, where additional observations are placed where ensembles indicate that there is rapid error growth (low predictability).
Improvement in the usefulness of medium-range forecasts, especially through the use of ensemble forecasting.
Fully coupled atmospheric-hydrological systems, where the atmospheric model precipitation is appropriately downscaled and used to extend the length of river flow prediction.
More use of detailed atmosphere-ocean-land coupled models, where the effect of long-lasting coupled anomalies such as SST and soil moisture anomalies leads to more skillful predictions of anomalies in weather patterns beyond the limit of weather predictability (about two weeks).
More guidance to government and the public in areas such as air pollution, UV radiation and transport of contaminants, which affect health.
An explosive growth of systems with emphasis on commercial applications of NWP, from guidance on the state of highways to air pollution, flood prediction, and guidance to agriculture, construction, etc.

71 G.S. Rivin, Rivin@ict.nsc.ru, www.ict.nsc.ru
Tomsk. Modern computational technologies in weather forecasting. G.S. Rivin

72 COMPONENTS of FORECAST SYSTEM
Observing system, telecommunication system, computer system, data assimilation, model, postprocessing.

