Ensemble Handling in GrADS


Ensemble Handling in GrADS
Jennifer M. Adams and Brian Doty, IGES/COLA

What is GrADS?
GrADS is an interactive tool that integrates data access, analysis, and visualization.
Handles many data formats: binary, NetCDF, HDF, GRIB1 & GRIB2, BUFR.
Two data models, for gridded and in situ data.
Expression handling is flexible, compact, and recursive.
Programmable interface for scripting.
Written in C; the code is open source (GPL).
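A minimal sketch of this workflow as a GrADS script, assuming a hypothetical descriptor file model.ctl with a geopotential height variable named hgt:
* open a dataset via its descriptor file and display a time-averaged expression
'open model.ctl'
'set lev 500'
'set lat 20 60'
'set lon -140 -60'
'display ave(hgt, t=1, t=4)'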

A GrADS Graphics Example
A nice graphic we trot out often for show and tell.
It illustrates several of the graphics output types: contours, colorized vectors, wind barbs, and colorized station marks.
The data come from a variety of sources: gridded satellite obs in HDF, gridded model output in GRIB, and station obs in BUFR.

What is the GrADS Data Server?
The GDS is a stable, secure OPeNDAP data server that provides subsetting and server-side analysis services over the internet.
The GDS can serve any GrADS-readable dataset and unifies all data formats into a NetCDF framework.
Open a data set with http://servername/filename instead of /disk/filename.
GrADS and the GDS are a coupled software system: where GrADS goes, the GDS must follow.
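As a sketch (with hypothetical server and dataset names), the only client-side difference is the name passed to the open command; an OPeNDAP URL served by a GDS can be opened with sdfopen just like a local self-describing file:
* local file vs. the same data served over the network by a GDS
'sdfopen /disk/data/gfs_ens.nc'
'sdfopen http://servername/dods/gfs_ens'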

News from the GrADS/GDS Team
GrADS has a 5th grid dimension for ensembles.
GrADS has a GRIB2 interface.
The GDS can serve any GrADS data set.
GrADS is a client for all OPeNDAP data sets.
GrADS will support GIS-compatible outputs.
GrADS 2.0 and GDS 2.0 were both released in 2008.

The New Ensemble Dimension in GrADS
A 5th grid dimension for ensemble members:
'set X, Y, Z, T, or E' or 'set lon, lat, lev, time, or ens'
A virtual dimension for forecast time offset:
'display temp(ft=2)' or 'display temp(ftime=24hr)'
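A brief sketch of the new dimension in use (the variable name hgt and a 21-member ensemble are assumptions, not part of the slide):
* fix lon/lat/lev, vary time, and select a single ensemble member
'set lon -80'
'set lat 35'
'set lev 500'
'set t 1 last'
'set e 5'
'display hgt'
* or keep E fixed and average an expression across the whole member range
'set e 1'
'display ave(hgt, e=1, e=21)'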

GrADS Metadata Requirements for Ensemble Members
Unique name / number
Initial time
Length
If GRIB2, some additional octet values
One time axis spans all members
All members must have common X, Y, Z axes
GrADS does not constrain ensemble members to have the same initial time -- or even the same length. The hard requirements are that one time axis must span all members and the lat/lon/lev axes must be the same for all members.

GrADS GRIB2 Descriptor File
Wesley's g2ctl works very well, but doesn't handle EDEF (yet).
DSET /gens/prod/gefs.%iy4%im2%id2/%ih2/pgrb2a/ge%e.t%ih2z.pgrb2af%f2
TDEF 17 linear 00z09oct2008 6hr
EDEF 23
avg 17 00z09oct2008 0
spr 17 00z09oct2008 2
c00 17 00z09oct2008 1,0
p01 17 00z09oct2008 3,1
p02 17 00z09oct2008 3,2
p03 17 00z09oct2008 3,3
p04 17 00z09oct2008 3,4
. . .
p19 17 00z09oct2008 3,19
p20 17 00z09oct2008 3,20
ENDEDEF
@ ens String avg Unweighted mean of all members
@ ens String spr Standard deviation with respect to ensemble mean
@ ens String c00 Control forecast
@ ens String p01 Positively perturbed forecast
The GRIB2 codes are octets 35 and 36 from Section 4 (PDT # 1, 2, 11, and 12).
The output from grib2scan and gribmap provides some guidance for creating the EDEF statement.
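Once the descriptor is complete, the usual workflow (sketched here with a hypothetical file name gens.ctl; the variable declarations and other required descriptor entries are omitted on the slide) is to build the index with gribmap and then open the dataset; the EDEF member names become valid arguments to 'set ens':
gribmap -i gens.ctl
(run from the shell to scan the GRIB2 files and write the index file)
Then, inside GrADS:
'open gens.ctl'
'set ens p01'
'set t 1 17'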

Examples of Ensemble Data Sets
NCEP GFS Ensembles (GENS)
NCEP Climate Forecast System (CFS)
NCEP Short Range Ensemble Forecasts (SREF)
ESRL MRF Reforecasting Experiment
WCRP CMIP3 Multi-Model Data (IPCC AR4)
TIGGE
Now that the ensemble handling is working for all data types including GRIB2, GrADS is ready to handle many important 5-D data sets.

Ensemble Data Sets Behind GDS
Data become more usable and accessible:
Subsets over all dimensions
Server-side analysis
File aggregation
Format translation
Ensemble metadata standards
Putting 5-D data sets behind the GDS makes the data even more usable and accessible for the reasons listed above. Developing standards for ensemble metadata has opened a can of worms. Our approach is to present the metadata in such a way that GrADS will understand it and that it doesn't interfere with other clients.

Ensemble Forecast Time Series (Longitude, Latitude, and Level are fixed)
Consider a forecast time series from 21 ensemble members, each drawn with a different color. The members all agree for the first 2-3 days, then begin to diverge in the 3-7 day period, and after that there is little coherence to the forecast.
[Figure; x-axis: forecast time]
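A sketch of how a spaghetti plot like this can be scripted by looping the display over the E dimension (the descriptor name, variable name, member count, and fixed coordinates are all hypothetical):
* one curve per ensemble member, overlaid on the same frame
'open gens.ctl'
'set lon -80'
'set lat 35'
'set lev 500'
'set t 1 last'
e = 1
while (e <= 21)
  'set e 'e
  'set cmark 0'
* cycle through the built-in colors 1-13; more distinct colors would need set rgb
  col = e
  if (col > 13)
    col = col - 13
  endif
  'set ccolor 'col
  'display hgt'
  e = e + 1
endwhile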

Ensemble Forecast Grid (Longitude, Latitude, and Level are fixed)
Here is the same data drawn as a 2-D Time vs. Ensemble plot. Each row in the grid represents one of the colored lines drawn in the previous plot, and the pixels are colored according to the data values. I can draw it this way because the ensembles are handled as the 5th dimension in my gridded forecast data set. The well-defined purple stripe is the event all members agree on at the beginning of the forecast, and the image gets noisier as the forecast evolves.
[Figure; x-axis: forecast time, y-axis: ensemble member]
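A sketch of the commands behind this kind of plot: with longitude, latitude, and level fixed and both the T and E dimensions set to varying ranges, a single display produces a time-by-ensemble grid (the variable name and ranges are hypothetical):
'set lon -80'
'set lat 35'
'set lev 500'
'set t 1 17'
'set e 1 21'
'set gxout grfill'
'display hgt'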

Ten Ensemble Forecasts (Longitude, Latitude, and Level are fixed)
Now we expand our ensemble set to include the same data from ten previous forecasts initialized at 12-hour intervals. Note that the time axis has expanded to accommodate the shift in the temporal coverage of the additional ensemble forecasts. You can see how the purple stripe feature has gradually coalesced over the five-day period, from something incoherent and noisy into a well-defined event.
[Figure; x-axis: forecast time, y-axis: ensemble member]

CFS Daily Hindcast (Longitude, Latitude, and Level are fixed)
This is the first two months of a daily CFS hindcast. Note the staggered start times and different lengths of the ensemble members.
[Figure; x-axis: time, y-axis: ensemble member]

Ensemble Forecast Time Series (Longitude, Latitude, and Level are fixed)
Back to a plot I showed earlier. If the data were behind a GDS, it could take a long time to draw this, because you would need to download 21 time series. This spaghetti-style drawing can be improved by exploiting GrADS analysis capabilities in the E dimension ...
[Figure; x-axis: forecast time]

Ensemble Mean = tloop(ave(Z,e=2,e=23))
Ensemble Min/Max = tloop(min(Z,ens=c00,ens=p20))
+/- StdDev of Ensemble Mean = tloop(sqrt(ave(pow(Z-Zave,2),e=1,e=21)))
You can get an even better display by doing four relatively simple calculations: the ensemble mean, the standard deviation with respect to the ensemble mean, and the min and max of all the members. For GDS data sets, these derived quantities are calculated on the server side. 17 Mb operated on, 1 Kb downloaded: the data requirements are reduced by more than four orders of magnitude.
[Figure; x-axis: forecast time]
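A sketch of how these expressions might be combined into a single plot (a minimal example assuming the variable z and the member ranges shown above, which should be adjusted to the layout of the actual EDEF; the defined names zave, zmin, zmax, and zstd are arbitrary):
* fix lon/lat/lev and E; vary time only
'set lon -80'
'set lat 35'
'set lev 500'
'set e 1'
'set t 1 last'
'define zave = tloop(ave(z, e=2, e=23))'
'define zmin = tloop(min(z, ens=c00, ens=p20))'
'define zmax = tloop(max(z, ens=c00, ens=p20))'
'define zstd = tloop(sqrt(ave(pow(z-zave,2), e=1, e=21)))'
* overlay the min/max envelope, the +/- one-standard-deviation band, and the mean
'display zmin'
'display zmax'
'display zave-zstd'
'display zave+zstd'
'display zave'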

TIGGE Data Behind GDS at NCAR
Perfect testbed for ensemble handling and the GRIB2 interface
Boost to usage of TIGGE data
Forecasts sorted by date and by provider
Time series of analyses
Nearly unbearable load on old hardware
48-hour data embargo
Int'l agreement requires password protection
At dataportal.ucar.edu: ~250 Gb/day, 5 Tb online, 2-3 week window

TIGGE Multi-Member Multi-Model Ensemble
500mb Geopotential Height valid August 30, 2008
[Figure: four panels at 7-day, 5-day, 3-day, and 1-day lead times]
Here's an example of what can be done with TIGGE data behind a GDS. Each color is an ensemble-mean forecast of 500mb height from one of 8 different TIGGE providers. The white contours show the multi-model ensemble average (198 members). The four panels show the forecasts with the same valid time but with lead times of 1, 3, 5, and 7 days. In this example, we operated on 6.0 Gb of GRIB2 data and downloaded 3.3 Mb.

TIGGE MME Forecast Error and Ensemble Spread
500mb Geopotential Height valid August 30, 2008
[Figure: four panels at 7-day, 5-day, 3-day, and 1-day lead times]
The gray-shaded background field is the standard deviation of the multi-model ensemble mean. The white blobs are areas where the models disagree, i.e., where the colored contours in the previous panel diverge. Note how these "clouds of variance" dissipate as the lead time shrinks. The colored contours show the absolute value of the forecast error, which is the multi-model ensemble mean minus the analysis (the mean of the 00hr forecasts from the control run of each model).

TIGGE Forecasts of Hurricane Ike
valid: 12z 9 Sep - 00z 13 Sep
[Figure: one row of panels per initialization time (00z 8 Sep, 12z 8 Sep, 00z 9 Sep, 12z 9 Sep), later dates towards the bottom]
These are predicted tracks for Hurricane Ike from four different TIGGE models (one color per model). The tracks are created by connecting the dots that mark the location of the sea level pressure minimum at each time step within the valid date range (a 3.5-day period). The multiple tracks in each panel are from individual ensemble members. Each row represents a different initialization date, with later dates towards the bottom.
Models: RED = China CMA, ORANGE = Canada CMC, GREEN = ECMWF, BLUE = NCEP