Site Report: Lawrence Berkeley National Laboratory, Joerg Meyer



LBNL Visualization Group Introduction
Wes Bethel (Vis Group Lead)
Recent hires (Computer Systems Engineers): Hari Krishnan, Burlen Loring, Joerg Meyer, Oliver Rübel

LBNL Visualization Base Program
Problem: scientific discovery is hindered by the sheer size and complexity of data.
Approach: R&D in parallel visualization and analysis, feature-based filtering and analysis, and query-driven visualization and analysis.
Impact: new understanding of technologies and architectures for running visualization and analysis algorithms at the highest-ever levels of concurrency on advanced platforms, plus many specific impacts on science through ongoing collaborations.
Images: (top) results of a 216K-way parallel run of the new hybrid-parallel technology on JaguarPF with a 3 TB dataset; (bottom) globe display in the LBNL booth at the SC conference.
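Query-driven visualization, mentioned above, restricts expensive visualization work to the subset of records that satisfy a compound range predicate. A minimal pure-Python sketch of the selection step (the field names, values, and thresholds are invented for illustration, not from the slides):

```python
# Minimal sketch of query-driven analysis: only records satisfying a
# compound range predicate are passed on to the (expensive) visualization
# stage. The sample values and thresholds are illustrative only.

def query_driven_select(records, predicate):
    """Return the subset of records that satisfy the predicate."""
    return [r for r in records if predicate(r)]

# Toy dataset: (temperature, pressure) samples.
samples = [(310.0, 0.8), (450.0, 0.2), (500.0, 0.9), (620.0, 0.1)]

# Compound query: "hot, low-pressure" regions of interest.
hot_low_p = query_driven_select(
    samples, lambda s: s[0] > 400.0 and s[1] < 0.5)

print(hot_low_p)  # -> [(450.0, 0.2), (620.0, 0.1)]
```

In practice the selection runs against an index (see the FastBit work later in this report) rather than a linear scan.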

SciDAC Visualization and Analytics Center for Enabling Technology (VACET)
Problem: need for production-quality, petascale-capable visualization software infrastructure to enable scientific knowledge discovery.
Approach: a blend of R&D, software engineering, and direct application interactions to produce high-quality visualization software and tools.
Impact: delivering high-quality, petascale-capable visualization software, with numerous direct impacts on science applications (knowledge discovery, cost savings, etc.).
Images: (top) VisIt-generated image of 2-trillion-zone data on 32K cores of Franklin; (bottom) VisIt image of a Type II supernova simulation.

Visualization and Analysis of Ultra-scale Climate Data
Problem: traditional climate visualization and analysis tools will not scale to increasingly large and complex climate simulation data.
Approach: a combination of leveraging/extending scalable visualization software and developing new analysis tools to study specific classes of climate science problems.
Impact: new software that directly meets the needs of the climate science community, especially with respect to the upcoming IPCC assessment.
Images: (top) our team has developed software for detecting and analyzing atmospheric rivers using supervised machine learning; (bottom) our team applies statistical models to predict seasonal, regional precipitation extremes.

DOE/BER: Climate Modeling
Visualization and analysis support for Global Cloud Resolving Models
Data models for the geodesic unstructured grid
Parallel plugin development in VisIt
—Profiling and benchmarking of I/O
VisIt Climate skin development for ease of use
—Custom plots and operators

SciDAC-e: Math and Geometric Tools for Energy-Related Gas Separations
Problem: a limiting factor in characterizing porous media for use in carbon capture is the lack of tools for structural analysis.
Approach: apply Fast Marching methods to find voids and isolated regions, which can then be quantitatively analyzed; develop highly scalable codes to facilitate real-time analysis.
Impact: help advance materials understanding to achieve better carbon capture and sequestration.
Images: these three images show the propagation of a front through a 3D structure, revealing channels (green) and inaccessible voids; the inaccessible voids are then eliminated from use in future simulations evaluating different carbon capture/sequestration strategies.
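The core idea of the front-propagation step can be illustrated in a few lines. The sketch below is a much-simplified stand-in for Fast Marching (a breadth-first front propagation on a binary pore grid; the real method solves the eikonal equation for arrival times). Pore cells the front reaches from the open boundary are channels; unreached pore cells are inaccessible voids. The grid is invented for illustration.

```python
from collections import deque

# Simplified stand-in for the Fast Marching idea: propagate a front from
# the open boundary through pore cells (1 = pore, 0 = solid). Pore cells
# the front reaches are accessible channels; unreached pore cells are
# inaccessible voids. Real Fast Marching solves the eikonal equation.

def classify_pores(grid):
    rows, cols = len(grid), len(grid[0])
    reached = [[False] * cols for _ in range(rows)]
    queue = deque()
    # Seed the front with every pore cell on the domain boundary.
    for r in range(rows):
        for c in range(cols):
            boundary = r in (0, rows - 1) or c in (0, cols - 1)
            if boundary and grid[r][c] == 1:
                reached[r][c] = True
                queue.append((r, c))
    while queue:
        r, c = queue.popleft()
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols \
                    and grid[nr][nc] == 1 and not reached[nr][nc]:
                reached[nr][nc] = True
                queue.append((nr, nc))
    channels = sum(reached[r][c] for r in range(rows) for c in range(cols))
    voids = sum(grid[r][c] == 1 and not reached[r][c]
                for r in range(rows) for c in range(cols))
    return channels, voids

# Toy 5x5 medium: a channel entering from the left edge, plus one
# sealed interior pore that the front can never reach.
medium = [
    [0, 0, 0, 0, 0],
    [1, 1, 1, 0, 0],
    [0, 0, 0, 0, 0],
    [0, 0, 1, 0, 0],  # isolated interior pore -> inaccessible void
    [0, 0, 0, 0, 0],
]
print(classify_pores(medium))  # -> (3, 1): 3 channel cells, 1 void cell
```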

SciDAC-e: Vis and Analysis for Nanoscale Control of CO2
Problem: evaluation of CO2 sequestration strategies needs better tools for analyzing experimental data.
Approach: develop new 3D image analysis and surface reconstruction software tools.
Impact: reconstruction and analysis from experimental data provide data critical for accurate CO2 sequestration modeling.
Images: (top) 2D slice of a micromodel (observed); (middle) 3D segmentation/surface reconstruction of the porous media; (bottom) CO2 flow rates computed by simulation.
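Segmentation of micromodel imagery of this kind typically begins by thresholding intensities into material classes before any surface is reconstructed. A toy sketch of that first step (the voxel intensities and threshold are invented for illustration; the actual pipeline is not described in the slides):

```python
# Toy illustration of the first step of 3D image segmentation:
# thresholding a grayscale volume into solid (0) and pore (1) voxels.
# Intensities and the threshold value are invented for illustration.

def segment(volume, threshold):
    """Label each voxel 1 if its intensity exceeds the threshold (pore)."""
    return [[[1 if v > threshold else 0 for v in row]
             for row in plane]
            for plane in volume]

# 2x2x2 grayscale "volume" of intensities in [0, 255].
vol = [[[10, 200], [30, 180]],
       [[220, 15], [40, 250]]]

labels = segment(vol, 128)
print(labels)  # -> [[[0, 1], [0, 1]], [[1, 0], [0, 1]]]
```

The resulting binary volume is what a surface reconstruction step (e.g. isosurfacing) would then operate on.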

NEAMS
Purpose: provide visualization support to the reactor campaign of NEAMS (Nuclear Engineering Advanced Modeling and Simulation), especially the Nek5000 code, which performs thermal hydraulics simulations.
Approach: deploying and customizing VisIt for the Nek5000 community.
Key aspects: flow analysis, large unstructured meshes, in situ processing.

DOE/ASCR Exascale SDM Project
ExaHDF5: an I/O platform for exascale data models, analysis, and performance
Taking HDF5 to the exascale
—Removing collective operations, fault tolerance, auto-tuning, asynchronous I/O
—Benefits all HDF5 and NetCDF-4 users
Taking FastBit to the exascale
—Index/query on distributed multi-core platforms
—Operates on generic NetCDF and HDF5 files
Designing easy-to-use data models for climate, groundwater, and accelerator modeling
—Hide complexity of the underlying file system
—H5hut, GCRM
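FastBit answers range queries by combining per-bin bitmaps with bitwise operations instead of scanning the data. A minimal uncompressed sketch of that idea (real FastBit uses compressed bitmaps and more sophisticated encodings; the bin edges and data below are invented for illustration):

```python
# Minimal sketch of the bitmap-index idea behind FastBit (uncompressed,
# equality-binned). Real FastBit uses compressed bitmaps and range
# encodings; the bin edges and data here are illustrative only.

def build_bitmap_index(values, bin_edges):
    """One bitmap per bin; bit i is set if values[i] falls in that bin."""
    bitmaps = []
    for lo, hi in zip(bin_edges[:-1], bin_edges[1:]):
        bitmaps.append([1 if lo <= v < hi else 0 for v in values])
    return bitmaps

def query_bins(bitmaps, bins):
    """OR together the bitmaps of the requested bins; return matching row ids."""
    hits = [0] * len(bitmaps[0])
    for b in bins:
        hits = [h | bit for h, bit in zip(hits, bitmaps[b])]
    return [i for i, h in enumerate(hits) if h]

temps = [12.0, 35.5, 27.1, 41.9, 8.3]
index = build_bitmap_index(temps, [0, 10, 20, 30, 40, 50])

# Rows with temperature in [30, 50): union of bins 3 and 4.
print(query_bins(index, [3, 4]))  # -> [1, 3]
```

Because the query is reduced to bitwise ORs/ANDs over precomputed bitmaps, it parallelizes naturally, which is what makes the approach attractive for the distributed multi-core setting mentioned above.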

NSF/RDAV
Recent accomplishments: visualization software deployment on Nautilus
—Free, interactive parallel visualization and graphical analysis tool for viewing scientific data
—Parallel, open-source, multi-platform data analysis and visualization application
New versions of both tools now available.

Joerg Meyer
Questions?
Additional information: LBNL Vis Group: