Current Progress on the CCA Groundwater Modeling Framework

Bruce Palmer, Yilin Fang, Vidhya Gurumoorthi
Computational Sciences and Mathematics Division
Pacific Northwest National Laboratory, Richland, WA 99352

Progress on Smoothed Particle Hydrodynamics (SPH) Framework

- Added functionality to model intragranular diffusion
- Created 2D modeling capability
- Added multiphase modeling capability
- Successfully incorporated the SPH framework into the current contractor build (cca-tools-contractor-acts09)
- Work is underway to incorporate a uranium transport model into the framework
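The capabilities above rest on the standard SPH interpolation idea: a field is estimated at a point as a kernel-weighted sum over nearby particles. The following is a minimal 1D sketch of that idea only; the cubic spline kernel shown is a common textbook choice, and the function names are illustrative, not taken from the PNNL framework.

```python
import math

def cubic_spline_kernel(r, h):
    """Standard 1D cubic spline smoothing kernel W(r, h),
    normalized so that its integral over all r equals 1."""
    q = abs(r) / h
    sigma = 2.0 / (3.0 * h)  # 1D normalization constant
    if q < 1.0:
        return sigma * (1.0 - 1.5 * q**2 + 0.75 * q**3)
    elif q < 2.0:
        return sigma * 0.25 * (2.0 - q)**3
    return 0.0  # kernel has compact support: W = 0 beyond 2h

def sph_estimate(x, positions, masses, densities, values, h):
    """Estimate a field at point x as the kernel-weighted sum
    sum_i (m_i / rho_i) * v_i * W(x - x_i, h) over all particles."""
    return sum(m / rho * v * cubic_spline_kernel(x - xi, h)
               for xi, m, rho, v in zip(positions, masses, densities, values))
```

For uniformly spaced particles carrying a constant field, the estimate reproduces the constant in the interior of the domain, which is the basic consistency check for an SPH summation.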

SPH Framework


2D SPH Simulation

Progress on Subsurface Transport Over Multiple Phases (STOMP) Framework

- Continued refining the grid component
- Increased the level of encapsulation between the grid component and the main physics component
- Developed a separate component that exports grid fields using an unstructured grid syntax
- Currently working to separate import functionality into a separate component
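The encapsulation described above follows the CCA provides/uses port pattern: the physics component sees the grid only through a narrow port interface, never the grid component's internals. A schematic Python sketch of that separation follows; all class and method names here are hypothetical, not the framework's actual interfaces.

```python
class GridPort:
    """Hypothetical 'provides' port: the only surface the physics
    component may use to query the grid (mirrors the CCA port style)."""
    def num_cells(self):
        raise NotImplementedError
    def cell_center(self, i):
        raise NotImplementedError

class StructuredGridComponent(GridPort):
    """Grid component: owns the mesh data, exports it only via GridPort."""
    def __init__(self, n, dx):
        self._n, self._dx = n, dx  # internal layout hidden from users
    def num_cells(self):
        return self._n
    def cell_center(self, i):
        return (i + 0.5) * self._dx

class PhysicsComponent:
    """Physics component: 'uses' a GridPort, so it has no compile-time
    dependency on any particular grid implementation."""
    def __init__(self, grid_port):
        self._grid = grid_port
    def initialize_field(self, fn):
        return [fn(self._grid.cell_center(i))
                for i in range(self._grid.num_cells())]
```

In a CCA framework the wiring of the port (handing the grid component's port to the physics component) is done by the framework at run time, which is what allows the grid implementation to be swapped without touching the physics code.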

STOMP Framework

[Diagram: the STOMP component (Input, Physics, Solvers, Time Integrator, Output, Chemistry, built on GA and MPI) connects through a Grid Port to the Grid component (Data Mapping, GA), which exposes a Mesh Port to the Mesh IO Component (GA).]

STOMP/CCA Simulations

Simulation of the transport of a contaminant through an array of monitoring wells at the Hanford IFRC site, using the component version of STOMP.
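Contaminant transport of the kind simulated here is governed, in its simplest form, by an advection-dispersion equation. The following is a minimal explicit 1D sketch of one time step, purely illustrative and not STOMP's actual discretization or boundary treatment.

```python
def transport_step(c, v, D, dx, dt):
    """One explicit step of 1D advection-dispersion,
    dc/dt = -v * dc/dx + D * d2c/dx2,
    with first-order upwind advection (assumes v >= 0), central
    dispersion, and zero-gradient boundaries. Stability requires
    roughly dt <= min(dx / v, dx**2 / (2 * D))."""
    n = len(c)
    new = c[:]
    for i in range(n):
        left = c[i - 1] if i > 0 else c[0]          # zero-gradient at x = 0
        right = c[i + 1] if i < n - 1 else c[n - 1]  # zero-gradient at x = L
        adv = -v * (c[i] - left) / dx
        disp = D * (right - 2.0 * c[i] + left) / dx**2
        new[i] = c[i] + dt * (adv + disp)
    return new
```

A constant concentration field is left unchanged by the step, and with pure advection a unit pulse is transported downstream, which are the basic sanity checks for a scheme of this form.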

Mesh IO Component

Current Status

- A complete framework for performing Smoothed Particle Hydrodynamics simulations exists
- New functionality for simulating intragranular diffusion and multiphase flow has been added to the code
- 2D simulation capability has been added to most of the code
- A model for simulating uranium transport has been developed and is scheduled for implementation
- Initial efforts to incorporate the H5Part parallel IO libraries (H5Part is built on top of HDF5) were only partially successful; the problems are associated with shared-library versions of H5Part
- The framework has been imported using the cca-tools-contractor-acts09 tarball

Current Status

- The STOMP framework is evolving into multiple components
- The grid component has been further encapsulated to eliminate dependencies between the grid and physics components
- An output component that supports the unstructured grid syntax used in the grid component has been created
- Work is currently underway to separate input handling out of the grid component
- The CCA version of the code is being used as the basis for an exascale version of STOMP being developed under another project, although the CCA tools themselves are not being used

Issues

- There are still some bugs in the contractor build associated with the header file block in C++ components
- The only large DOE platform we can run on at the moment is Chinook at PNNL. We cannot run on the Franklin or Hopper machines at NERSC, and we haven't tried a BlueGene machine. Hopper is supposed to support shared libraries, so presumably it might be able to support CCA as is
- Shared libraries are proving to be problematic in a number of ways: (1) many, probably most, large computers do not support them; (2) reliable builds of shared libraries are a problem (GA, HDF5)
- What is the status of static builds generated from CCA?

Issues (cont.)

- What are the current development plans for CCA going forward?
- Will there be sufficient resources available for maintaining, debugging, and porting the existing functionality of the CCA framework?

Acknowledgements

- Funding for this project was provided by DOE's Office of Advanced Scientific Computing Research under the Scientific Discovery through Advanced Computing (SciDAC) program
- Computer time on Chinook was supplied by the Environmental and Molecular Sciences Laboratory at Pacific Northwest National Laboratory through its Science Theme program
- Computer time on Franklin was supplied by the National Energy Research Scientific Computing Center