Implementation and Plans for TIGGE at NCAR and ECMWF




1 Implementation and Plans for TIGGE at NCAR and ECMWF
Douglas Schuster, Steven Worley, Nathan Wilhelmi (NCAR)
Baudouin Raoult, Manuel Fuentes, Jorg Urban (ECMWF)

2 Outline
- Archive Center Data Collection
- Data Discovery, Access, and Distribution
- Analysis Tools
- Future User Services
- Summary

3 Archive Center Data Collection
Participating Archive Centers
- National Center for Atmospheric Research (NCAR)
- European Centre for Medium-Range Weather Forecasts (ECMWF)
- China Meteorological Administration (CMA)
Data Collection Mechanism
- Unidata's Internet Data Distribution / Local Data Manager system (IDD/LDM)
- History of providing similar functionality in delivering National Centers for Environmental Prediction (NCEP) model data to the university community

4 Archive Center Data Collection
Current Data Providers (center: start date)
- ECMWF: 10/1/2006
- United Kingdom (UKMO): 10/1/2006
- Japan (JMA): 10/1/2006
- NCEP: 11/1/2006
Future Data Providers: Australia, China, Canada, Brazil, Korea, France

5 Archive Center Data Collection
TIGGE Parameters: available on 4 level types
- Single Level (includes surface)
- Pressure Level (1000, 925, 850, 700, 500, 300, 250, 200, 50 hPa); the 50 hPa level only includes geopotential height
- Isentropic Level (320 K)
- Potential Vorticity Level (2 PVU)

6 Archive Center Data Collection
TIGGE Archive at NCAR
- Model output stored as forecast files for each data provider
- Each file contains all parameters and ensemble members for a level type and forecast time
- Complete archive maintained on the Mass Store System as part of the CISL Research Data Archive (RDA)
TIGGE Archive at ECMWF
- Data archived through ECMWF's MARS system, which stores individual GRIB messages

7 Archive Center Data Collection
Summary of Data Providers

8 Archive Center Data Collection
Pressure Level Parameters

9 Archive Center Data Collection
Isentropic Level Parameters
Potential Vorticity Level Parameters

10 Archive Center Data Collection
Single Level Parameters

11 Archive Center Data Collection
Single Level Parameters (continued)

12 Data Discovery, Access, and Distribution
NCAR's TIGGE Web Portal
- Users can search, discover, and download forecast files
- Select data by initialization date/time, data provider, parameter level, and forecast time
- Each file contains all parameters and ensemble members for a given level type (sl, pl, pv, pt)
- Contains the most recent 2-3 weeks of model output
- 48-hour data access delay from model initialization time

13 Data Discovery, Access, and Distribution

14 Data Discovery, Access, and Distribution

15 Data Discovery Access and Distribution
NCAR Research Data Archive
The RDA dataset enables:
- Access to the complete TIGGE archive
- Easy use of the TIGGE archive on NCAR CISL computers, with data coming directly from the Mass Store System (MSS)
- NCAR computing accounts available upon request
- Support staff to handle requests for offline data
- Forecast file structure identical to the online files

16 Data Discovery, Access, and Distribution

17 Data Discovery, Access, and Distribution

18

19 Analysis Tools
User Analysis and Basic Data Manipulation Tools
- WMO GRIB2 is relatively new, so tools are immature
- Forecasts with ensemble members add another dimension
- Improvements underway in:
  - NCAR Command Language (NCL) / Python
  - GEMPAK, GRIB-Java (Unidata)
  - NOAA tools (wgrib2, GRIB2 software libraries, etc.)
  - ECMWF GRIB API
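To make the GRIB2 handling concrete, here is a minimal sketch of scanning a TIGGE-style forecast file for one field across its ensemble members. It assumes the third-party pygrib library (not one of the packages listed above) and a hypothetical file name; any of the tools mentioned could be used for the same task.

```python
# Minimal sketch: list ensemble members for one field in a GRIB2 forecast file.
# Assumes the third-party pygrib library; the file name is hypothetical.
import pygrib

grbs = pygrib.open("tigge_ecmf_pl_2006100100.grib2")  # hypothetical file name

# Select the 500 hPa geopotential height messages across all ensemble members.
for grb in grbs.select(shortName="gh", level=500):
    member = grb["perturbationNumber"]   # GRIB2 ensemble member number
    print(member, grb.validDate, grb.values.mean())

grbs.close()
```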

20 Future User Services: TIGGE Portal 2007 Improvements
- Develop streaming download for multiple files
- Upgrade to handle subset data requests through a simple interface (e.g. parameter selection)
- Provide user-selected grid interpolation across multiple models
- Add spatial subsetting functionality
- Include subscription services for recurring requests
- Provide web services for automated requests
- Common interface for NCAR, ECMWF, and CMA

21 Summary
- ECMWF and NCAR are now archiving data from 4 providers (ECMWF, NCEP, UKMO, JMA)
- 6 more data providers plan to come online during 2007 (Australia, Brazil, Canada, China, Korea, France)
- The TIGGE archive and access system are designed to accommodate irregularity between providers
- Desired 2007 improvements include:
  - Streaming download for multiple forecast files
  - User-specified subset/grid interpolation requests across multiple models
  - A common web service interface for CMA, ECMWF, and NCAR for automated requests

22 TIGGE Portal / Web Services Architecture
- Java 5 based implementation, developed with the Spring application framework
Web Portal Interface
- Point-and-click interface allows the selection and downloading of forecast datasets
Web Services
- REST-based web services allow clients to discover and download data
- Provides an interface for automated clients to download data in large volumes and at regular intervals

23 REST Web Services
- REST = Representational State Transfer
- REST is not a standard for web services; it is a style of web service built on existing standards (HTTP/URL/XML)
- Requests are sent as valid HTTP requests using the appropriate verbs (GET, POST, PUT, DELETE)
- Responses are returned as service-specific XML documents
- Because REST is not a standard, it avoids some of the pitfalls and interoperability problems associated with SOAP-based web services
- In this particular application it allows returning complex data types that are not inherently bound to an implementation toolkit/language
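To illustrate this style, the sketch below issues a plain HTTP GET and parses an XML response using only the Python standard library. The endpoint URL and XML element/attribute names are hypothetical placeholders, not the actual TIGGE service definitions.

```python
# Sketch of a REST-style catalog query: plain HTTP GET, XML response.
# The endpoint URL and element/attribute names are hypothetical.
import urllib.request
import xml.etree.ElementTree as ET

CATALOG_URL = "http://example.edu/tigge/catalog?date=2006-10-01&center=ecmf"

with urllib.request.urlopen(CATALOG_URL) as resp:   # appropriate verb: GET
    doc = ET.fromstring(resp.read())                 # service-specific XML

for entry in doc.findall("forecastFile"):            # hypothetical element name
    print(entry.get("levelType"), entry.get("step"), entry.get("href"))
```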

24 TIGGE Data Selection/Download Architecture
Architecture components (from the diagram):
- Browser Client: HTTP request/response to the TIGGE Web Portal Interface
- Web Service Client: HTTP/XML REST request/response to the TIGGE Web Service Interface
- Tomcat engine hosting the TIGGE Service, Forecast Catalog, and forecast metadata database access
- MySQL TIGGE Forecast Metadata Database
- GRIB2 forecast data files
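For the web service path in this architecture, a client retrieving full forecast files would typically stream each GRIB2 file to disk rather than hold it in memory. A minimal sketch, again with a hypothetical file URL:

```python
# Stream a (hypothetical) GRIB2 forecast file to disk in 1 MB chunks,
# the pattern an automated client would use for large, scheduled downloads.
import shutil
import urllib.request

FILE_URL = "http://example.edu/tigge/data/ecmf/pl/2006100100_000.grib2"  # hypothetical

with urllib.request.urlopen(FILE_URL) as resp, open("forecast.grib2", "wb") as out:
    shutil.copyfileobj(resp, out, length=1024 * 1024)
```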

25 TIGGE Data Subsetting Architecture
Architecture components (from the diagram):
- Browser Client: HTTP request/response to the TIGGE Web Portal Interface
- Web Service Client: HTTP/XML REST request/response to the TIGGE Web Service Interface
- Tomcat engine hosting the TIGGE Service, Forecast Catalog, and forecast metadata database access
- MySQL TIGGE Subset Request Database
- Perl server with an embedded HTTP client and the data extraction engine
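A subset request in this design is essentially a set of constraints recorded in the subset request database and handed to the data extraction engine. One way to picture it is as a small structured payload; the field names below are hypothetical illustrations, not the actual TIGGE request schema.

```python
# Hypothetical illustration of the constraints a subset request might carry;
# these field names are not the actual TIGGE request schema.
subset_request = {
    "provider": "ecmf",
    "init_time": "2006-10-01T00:00Z",
    "level_type": "pl",
    "levels_hPa": [500, 850],
    "parameters": ["gh", "t"],
    "steps_h": list(range(0, 241, 6)),       # every 6 hours out to 240 h
    "bbox": {"north": 60, "south": 30, "west": -20, "east": 40},
    "members": "all",
}
```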

26 Archive Center Data Collection
Current TIGGE Data Ingest Status
Data is coming from three separate IDD/LDM systems:
- ECMWF (data providers ECMWF, UKMO, JMA): overwhelmingly the largest
- CONDUIT (NCEP), maintained by Unidata
- CPTEC (Brazil)
(Diagram: ECMWF (carrying UKMO and JMA), CPTEC, and NCEP via a Unidata server all feed into NCAR)
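For readers unfamiliar with the LDM, reception is driven by REQUEST lines in ldmd.conf and pattern-action entries in pqact.conf. The sketch below shows the general shape of such a configuration; the upstream host name and product-ID pattern are illustrative assumptions, not NCAR's actual settings.

```
# ldmd.conf: request TIGGE products from an upstream IDD relay
# (host name and patterns are illustrative, not NCAR's actual configuration)
REQUEST EXP     ".*"   idd.example.edu
REQUEST CONDUIT ".*"   idd.example.edu

# pqact.conf: file each received GRIB2 message by its product ID
# (fields are tab-separated; the pattern and path are illustrative)
EXP     ^(tigge_.*)     FILE    -close  data/tigge/\1
```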

27 Archive Center Data Collection
Hourly Volume Receipt from the IDD EXP Stream

28 Archive Center Data Collection
Hourly Volume Receipt from the IDD CONDUIT Stream

29 Archive Center Data Collection
Hourly Cumulative Volume Data Summary

30 Archive Center Data Collection
Highlights
- Reached the target data rate (10 GB/hr): October 2005
- Built up supporting systems and software around IDD/LDM
- Developed archiving procedures and implemented file management for the TIGGE portal
- Began realistic daily test data flow: April 2006
- Initiated operational data collection: October 1, 2006

31 Future User Services: Subsetting and Grid Interpolation
- Users will need more than file downloads at each model center's native resolution
- Specify parameter, temporal, and spatial subsetting across all models
- Requires verified software from each data provider
- Must build and run effectively on local hardware
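To make the grid-interpolation requirement concrete, the toy sketch below bilinearly interpolates a field from one regular latitude-longitude grid onto another with NumPy. It is purely illustrative of the operation involved; as noted above, an operational service would rely on verified software from each data provider.

```python
# Toy bilinear regridding between regular lat-lon grids (ascending coordinates).
# Illustrative only; operational regridding would use each provider's verified code.
import numpy as np

def regrid_bilinear(field, src_lats, src_lons, dst_lats, dst_lons):
    """field is a 2-D array (lat, lon) on the source grid."""
    li = np.clip(np.searchsorted(src_lats, dst_lats) - 1, 0, len(src_lats) - 2)
    lj = np.clip(np.searchsorted(src_lons, dst_lons) - 1, 0, len(src_lons) - 2)
    wy = (dst_lats - src_lats[li]) / (src_lats[li + 1] - src_lats[li])
    wx = (dst_lons - src_lons[lj]) / (src_lons[lj + 1] - src_lons[lj])
    wy, wx = wy[:, None], wx[None, :]
    li, lj = li[:, None], lj[None, :]
    return ((1 - wy) * (1 - wx) * field[li, lj]
            + (1 - wy) * wx * field[li, lj + 1]
            + wy * (1 - wx) * field[li + 1, lj]
            + wy * wx * field[li + 1, lj + 1])
```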

32 TIGGE - Who, What, Why?
WMO THORPEX Interactive Grand Global Ensemble (TIGGE)
- Foster multi-model ensemble studies
- Improve the accuracy of 1- to 14-day high-impact weather forecasts
- Up to 10 operational centers contributing ensemble data
Two Phases
1: Three central Archive Centers
  - China Meteorological Administration (CMA)
  - European Centre for Medium-Range Weather Forecasts (ECMWF)
  - National Center for Atmospheric Research (NCAR)
2: Widely Distributed Access (not discussed today)

33 Archive Center Data Collection
Cooperative Support Team for Data Ingest: Highlights
- ECMWF (Raoult, Fuentes): optimally tune ECMWF and NCAR systems to work together; develop protocols to ensure complete transfer (e.g. grid manifests)
- Unidata (Yoksas): expert advice on IDD/LDM
- VETS (Brown): installed, configured/re-configured, and monitored IDD/LDM
- DSG (Arnold): system administration
- NETS (Mitchell): TCP packet analysis and advice

34 Current and Future Challenges
- Transition from Dataportal to Ultra-zone was not seamless
- Machine differences required a number of adjustments:
  - Clock synchronization with ECMWF
  - TCP, cache, and file I/O settings
  - LDM configuration
- Things are running well now

35 TIGGE NCAR Data Received
Data transport by Unidata's IDD/LDM application
- Tuned to a maximum of about 10 GB/hour, the optimum rate between ECMWF and NCAR (no data loss and, in general, no need for resend requests)
- Transfer packets are individual GRIB2 messages (single 2D fields)
- Current receipt metrics: 172 GB/day, 809K fields
Data organization and storage
- Fields are combined into forecast files organized by initialization time, data provider, level type, and forecast time step (level type = single level, pressure level, potential vorticity level, and potential temperature level)
- All ensemble members are included in each forecast file
- The most current three-week period is kept online (4-6 TB)
- Long-term time series are archived on the NCAR Mass Storage System
- This file-based approach contrasts with the ECMWF MARS approach
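As an illustration of the organization described above, one hypothetical way to derive a forecast-file name from initialization time, data provider, level type, and forecast step is sketched below; the actual naming convention used in the archive may differ.

```python
# Hypothetical helper mirroring the stated file organization: one file per
# (init time, provider, level type, forecast step), all ensemble members inside.
from datetime import datetime

def forecast_filename(init: datetime, provider: str, level_type: str, step_h: int) -> str:
    return f"tigge_{provider}_{level_type}_{init:%Y%m%d%H}_f{step_h:03d}.grib2"

print(forecast_filename(datetime(2006, 10, 1, 0), "ecmf", "pl", 24))
# -> tigge_ecmf_pl_2006100100_f024.grib2
```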

36 TIGGE NCAR Data Providers
Current participation (Data Provider / Compliant Grids / Forecasts per Day (Resolution) / Ensemble Members):
- ECMWF: 70 / 2 (N200*) / 51
- UK Met Office: 62 / 2 (1.25˚ x 2/3˚) / 24
- JMA: 48 / 1 (1.25˚ x 1.25˚)
- NCEP: 41 / 4 (1.0˚ x 1.0˚) / 15
Notes:
- * N200 = reduced Gaussian grid
- There are 71 'standard' grids (26 surface or single level + 45 pressure level), with varying levels of compliance at present
- Compliant Grids: the Archive Centers are prepared for starts that are not fully compliant and for growing compliance over time
- There are a few supplementary grids, e.g. U, V, T on the PV surface and PV on the 320 K isentropic surface
Future contributors: Australia, Brazil, Canada, China, Korea, and France
The archive system is designed to handle varying numbers of forecasts per day, output resolutions, and numbers of ensemble members

37 TIGGE NCAR Next Steps
- Design and implement user registration
  - Simple online form; users agree to research and education use only
  - Default 48-hour delay (from forecast initialization time) before access
  - Real-time access granted by the International Program Office, e.g. for special field project support
- Open user access
  - Initially, only file download through a portal interface (to be activated early November)
- User software, a challenge because GRIB2 is new
  - Put examples and resource pointers into the portal (end of November)
  - Improvements will be posted as they become available
- User-defined subsets and regridding across multiple models
  - Development depends on securing additional funding
- Bring new data providers online as they become ready

38 Data Discovery, Access, and Distribution

39 Archive Center Data Collection
Summary of Data Provider Forecasts
Columns: Center-Model / pf / cf / fc Len. (h) / pl inc / pt inc / pv inc / sl inc / Daily Init Times
- ecmf-glob: pf 50, cf 1, length 240 h (246 to 360 also noted), 6 h increments, init times 00, 12
- egrr-glob: pf 23
- rjtd-glob: length 216 h, increments 12 h and 6 h*
- kwbc-glob: pf 14, length 384 h, init times 00, 06, 12, 18




