1 Mathematical, Information and Computational Sciences Computer Science PI Meeting June 26-27, 2002 Fred Johnson.
1 Mathematical, Information and Computational Sciences Computer Science PI Meeting June 26-27, 2002 Fred Johnson

2 Mathematical, Information and Computational Sciences
WELCOME!! To the first (annual?) MICS Computer Science PI Meeting
– Foster a sense of community in CS focused on high-end problems
– Get to know each other and what's going on
– Future directions roadmap
Program Managers
Folks who made it happen!
– Cheryl, Bonnie, Jim, Rusty

3 Mathematical, Information and Computational Sciences
Mission
– Discover, develop, and deploy the computational and networking tools that enable researchers in the scientific disciplines to analyze, model, simulate, and predict complex physical, chemical, and biological phenomena important to the Department of Energy (DOE).
– Foster and support fundamental research in advanced scientific computing – applied mathematics, computer science, and networking.
– Operate supercomputers, a high performance network, and related facilities.

4 Mathematical, Information and Computational Sciences
FY 2003 President's Request – Advanced Scientific Computing Research Program
Budget Authority ($ in thousands):
                                                         FY2001     FY2002     FY2003
  Mathematical, Information, and Computational Sciences  $151,647   $154,400   $166,625
  Laboratory Technology Research                         $9,649     $3,000     $3,000
  TOTAL ASCR                                             $161,296   $157,400   $169,625
NOTE: FY2001 excludes SBIR/STTR set-asides

5 Mathematical, Information and Computational Sciences
Program Strategy
Research to enable…
– …simulation of complex systems
– …distributed teams and remote access to facilities
Fundamental Research: Applied Mathematics, Computer Science, Networking
R&D for Applications: Advanced Computing Software Tools, Scientific Application Pilots, Collaboratory Tools, Collaboratory Pilots
Integrated Software Infrastructure Centers: teams of mathematicians, computer scientists, application scientists, and software engineers
Applications (with BES, BER, FES, HEP, NP): Materials, Chemical, Combustion, Accelerator, HEP, Nuclear, Fusion, Global Climate, Astrophysics
High Performance Computing and Network Facilities for Science: Energy Sciences Network (ESnet), Advanced Computing Research Facilities, National Energy Research Scientific Computing Center (NERSC), Testbeds; facility access for teams of researchers

6 Mathematical, Information and Computational Sciences
Budget Request FY2003: $166,625,000
[Chart: budget split among Base Research, Comp. Bio., SciDAC, Facilities, SBIR/STTR]
Enhancements over FY2002:
– Computational Biology +$5.6M
– SciDAC +$5.3M
– Facilities +$1.3M

7 Mathematical, Information and Computational Sciences
FY2001 MICS Research Budget by Institution – $ in thousands (# of projects)

                    AMS          CS           NC-ACST      NR          SAPP        SciDAC       Comp. Bio.
  Univ. (& Others)  8,236 (42)   9,336 (24)   5,597 (20)   2,583 (12)  1,105 (10)  17,548 (56)  1,703 (8)
  Laboratories      15,496 (31)  11,605 (44)  12,984 (74)  2,673 (16)  960 (7)     19,895 (65)  1,120 (3)
  Totals            23,782       20,941       18,581       5,256       2,065       37,443       2,823

Legend:
– AMS: Applied Mathematical Sciences
– CS: Computer Sciences
– NC-ACST: National Collaboratories – Advanced Computing Software Tools
– NR: Networking Research
– SAPP: Scientific Application Pilot Projects
– SciDAC: Scientific Discovery through Advanced Computing
– Comp. Bio.: Computational Biology
– Base Research: the AMS, CS, NC-ACST, NR, and SAPP categories

8 Mathematical, Information and Computational Sciences
Program Evolution
FY 2001
– Initiated software infrastructure component of SciDAC
– Initiated research efforts in computational biology
– Upgraded NERSC to 5 teraflops
– Acquired IBM Power 4 hardware for evaluation/scaling (limited SciDAC support)
FY 2002 Activities
– Ensure success of SciDAC
– Strengthen base research effort (Early Career PI)
– Initiate NERSC-4 acquisition process
FY 2003 Plans
– Launch computational component of Genomes to Life, in partnership with BER
– Initiate computational nanoscience partnership with BES as part of SciDAC
– Provide topical high performance computing resources to support SciDAC research

9 Mathematical, Information and Computational Sciences
Computer Science Research Challenge
Research Challenge – HPC for Science is (still, after fifteen years!)
– Hard to use
– Inefficient
– Fragile
– An unimportant vendor market
Vision
– A comprehensive, integrated software environment that enables the effective application of high performance systems to critical DOE problems
Goal – Radical improvement in
– Application Performance
– Ease of Use
– Time to Solution
[Diagram: HPC System Elements – scientific applications, system administration, and software development sit in user space above runtime support, the OS kernel (with OS bypass), and node/system hardware; software elements include checkpoint/restart, math libraries, debuggers, visualization/data tools, schedulers, PSEs, resource management, frameworks, compilers, performance tools, file systems, and runtime tools]
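Among the system elements listed above, checkpoint/restart has the simplest core pattern: periodically serialize application state so an interrupted run can resume rather than start over. A minimal illustrative sketch, not drawn from any MICS project (the file name, state layout, and `simulate` loop are all hypothetical):

```python
import json
import os

CHECKPOINT = "state.json"  # hypothetical checkpoint file name


def load_checkpoint():
    """Resume from the last saved state, or start fresh."""
    if os.path.exists(CHECKPOINT):
        with open(CHECKPOINT) as f:
            return json.load(f)
    return {"step": 0, "value": 0.0}


def save_checkpoint(state):
    """Write state to a temp file, then rename, so a crash mid-write
    cannot leave a corrupt checkpoint behind."""
    tmp = CHECKPOINT + ".tmp"
    with open(tmp, "w") as f:
        json.dump(state, f)
    os.replace(tmp, CHECKPOINT)


def simulate(total_steps=10, interval=3):
    """Toy time-stepping loop that checkpoints every `interval` steps."""
    state = load_checkpoint()
    while state["step"] < total_steps:
        state["value"] += state["step"] * 0.5  # stand-in for real physics
        state["step"] += 1
        if state["step"] % interval == 0:
            save_checkpoint(state)
    save_checkpoint(state)
    return state


final = simulate()
```

If the process dies between checkpoints, rerunning `simulate()` picks up from the last saved step instead of step 0; production checkpoint/restart systems apply the same idea to distributed memory images rather than a small JSON dictionary.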

10 Mathematical, Information and Computational Sciences
Program Components
Base Program
– Evolutionary and revolutionary software methodologies for future generations of HPC architectures
SciDAC Integrated Software Infrastructure Centers
– Enable effective application of current terascale architectures to SciDAC applications through focused research and partnerships

11 Mathematical, Information and Computational Sciences
Computer Science Technical Elements
[Pie chart dividing the program into five technical elements: 25%, 19%, 18%, 15%, and 23%; the element labels did not survive in this transcript]

12 Mathematical, Information and Computational Sciences
Opportunities for Program Growth
– Dynamic OS/runtime environments
– Operating systems for petascale systems
– Application-specific problem solving environments
– Intelligent program development environments
– Accelerate HW/SW for effective petascale computation
Application drivers:
– Life Science (such as those described in the Genomes to Life initiative)
– Nanoscience (such as those proposed in the NSF and DOE nanoscience initiatives)
– Computational Cosmology and Astrophysics
– Earth Science and Environmental Modeling
– Computational Physics
– Computational Chemistry
– Fusion modeling and simulation
– Multidisciplinary design problems

13 Mathematical, Information and Computational Sciences
Scientific Discovery through Advanced Computing (SciDAC)
An integrated program to:
(1) Create a new generation of Scientific Simulation Codes that take full advantage of the extraordinary computing capabilities of terascale computers.
(2) Create the Mathematical and Computing Systems Software to enable the Scientific Simulation Codes to effectively and efficiently use terascale computers.
(3) Create a Collaboratory Software Environment to enable geographically separated scientists to effectively work together as a team and to facilitate remote access to both facilities and data.

14 Mathematical, Information and Computational Sciences
SciDAC Program Elements
Scientific Simulation Codes
– Funding in ASCR, BES, BER, FES, and HENP to develop scientific codes that take full advantage of terascale computers
Mathematical Methods and Algorithms
– Funding in ASCR to develop mathematical methods and algorithms that perform well on cache-based microprocessors and scale to thousands and, eventually, tens of thousands of processors
Computing Systems Software
– Funding in ASCR to develop software to facilitate the development and use of scientific codes for terascale computers
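"Perform well on cache-based microprocessors" usually points at techniques like cache blocking (tiling), where loops visit data one small tile at a time so the working set stays in cache; the same tiles also make natural units for distributing work across processors. A minimal sketch of a blocked matrix multiply, offered purely as an illustration of the technique (not code from any SciDAC project):

```python
def blocked_matmul(A, B, n, bs=2):
    """Multiply two n-by-n matrices (lists of lists) in bs-by-bs tiles.

    The six-deep loop nest computes exactly the same sums as the naive
    triple loop, but each inner pass touches only a bs*bs tile of A, B,
    and C, which is what keeps the working set cache-resident on real
    hardware.
    """
    C = [[0.0] * n for _ in range(n)]
    for ii in range(0, n, bs):            # tile row of C
        for kk in range(0, n, bs):        # tile of A / tile row of B
            for jj in range(0, n, bs):    # tile column of C
                for i in range(ii, min(ii + bs, n)):
                    for k in range(kk, min(kk + bs, n)):
                        a = A[i][k]
                        for j in range(jj, min(jj + bs, n)):
                            C[i][j] += a * B[k][j]
    return C
```

In a compiled language the payoff is large for matrices that overflow cache; the block size `bs` is a tuning parameter chosen so three tiles fit in the target cache level.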

15 Mathematical, Information and Computational Sciences
Program Elements (cont'd)
– Funding in ASCR to develop software needed to manage and analyze massive data sets produced by simulations on terascale computers
Collaboratories and Data Grids
– Funding in ASCR to develop collaborative software to enable geographically-dispersed researchers to work as a team
– Funding in ASCR to develop computational and data grids to facilitate access to computers, facilities, and data

16 Mathematical, Information and Computational Sciences
Scientific Simulation Methods and Codes for Terascale Computers
BES ($1,931)
– Understand and predict the energetics and dynamics of chemical reactions and the interaction between chemistry and fluid dynamics – electronic structure and reacting flows
BER ($8,000)
– Understand and predict the earth's climate at both regional and global scales for decades to centuries, including levels of certainty and uncertainty
FES ($3,000)
– Understand and predict microscopic turbulence and macroscopic stability in magnetically confined plasmas, including their effect on core and edge confinement

17 Mathematical, Information and Computational Sciences
Scientific Simulation Methods and Codes for Terascale Computers (cont'd)
– Understand and predict the electromagnetic fields, beam dynamics, and other physical processes in heavy-ion accelerators for inertial fusion
– Understand basic plasma science processes, such as electromagnetic wave-particle interactions and magnetic reconnection
HENP ($7,000)
– Understand and predict electromagnetic field and beam dynamics in particle accelerators
– Understand and predict the physical phenomena encompassed in the Standard Model of Particle Physics
– Understand mechanisms of core collapse supernovae

18 Mathematical, Information and Computational Sciences
SciDAC Prototype: NWChem
NWChem – a major new modeling capability for molecular science (molecular electronic structure, molecular dynamics)
[Architecture diagram: Molecular Calculation Modules (Molecular Energy, Molecular Structure, Molecular Vibrations, …) built on a Molecular Modeling Toolkit (Geometry Object, Basis Set Object, Integrals API, Linear Algebra, Diagonalization) over a Computational Science Development Toolkit (Global Arrays, ParIO, Malloc, RTDB)]
– 600,000 lines of code and growing
– Runs on: Cray T3D/E, IBM SP2, SGI SMP, NOWs, Sun and other workstations, x86 PCs (Linux)
– Scales to: 2,000+ processors
– Developers: core group (15) plus a larger group (20) of world-wide collaborators; 100 person-years at PNNL alone…
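NWChem's Computational Science Development Toolkit rests on Global Arrays, which presents a physically distributed array to the programmer through one-sided get/compute/put operations. A toy single-process sketch of that programming model (this `GlobalArray` class is illustrative only, not the real Global Arrays API, which distributes data across processes):

```python
class GlobalArray:
    """Toy stand-in for a Global Arrays-style distributed array.

    In the real library the data is spread across processes and any
    process can read or update any patch without the owner's
    participation; here a flat list plays the role of the distributed
    storage so the access pattern is visible.
    """

    def __init__(self, n):
        self._data = [0.0] * n

    def get(self, lo, hi):
        """Copy the patch [lo, hi) into a local buffer (one-sided read)."""
        return self._data[lo:hi]

    def put(self, lo, patch):
        """Overwrite a patch starting at lo (one-sided write)."""
        self._data[lo:lo + len(patch)] = patch

    def accumulate(self, lo, patch):
        """Add a patch into the array (one-sided reduction)."""
        for i, v in enumerate(patch):
            self._data[lo + i] += v


# The canonical usage pattern: fetch a patch, compute locally, write back.
ga = GlobalArray(8)
ga.put(0, [1.0, 2.0, 3.0, 4.0])
local = ga.get(0, 4)             # bring a patch into local memory
local = [2 * x for x in local]   # purely local computation
ga.accumulate(0, local)          # merge the result back in
```

The appeal of this model for codes like NWChem is that the get/compute/put pattern hides data distribution from the chemistry modules while still exposing locality to the toolkit layer.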

19 Mathematical, Information and Computational Sciences
Emphasis on Team Building and Complete Software Life-Cycle
– Bring together teams of theoretical and computational scientists, computer scientists, and applied mathematicians, with the computing infrastructure, and establish close working relationships.
– Support and integrate research through development to deployment to ensure that the scientific community receives innovative, yet usable software capabilities in a timely fashion; seek industrial support wherever possible.
[Diagram: the software life-cycle (Research, Development, Deployment, Maintenance), contrasting the traditional hand-off model with the new integrated-team model]

20 Mathematical, Information and Computational Sciences
SciDAC Program Awards – Integrated Software Infrastructure Centers (ISIC)
ISIC Vision:
– Provide a comprehensive, portable, and fully integrated suite of systems software, libraries and tools for the effective management and utilization of terascale computers by SciDAC applications
– Provide maximum performance, robustness, portability and ease of use to application developers, end users and system administrators
Award Summary:
– 3 centers – mathematical algorithms/libraries, $8.6M/year
– 4 centers – computer science issues, $10.7M/year

21 Mathematical, Information and Computational Sciences
Computer Science Integrated Software Infrastructure Centers
Four activities focused on a comprehensive, portable, and fully integrated suite of systems software and tools for effective utilization of terascale computers – $10.7M:
– Scalable Systems Software: scalable tools for large clusters; resource management; system interfaces; system management tool framework. ANL, LBNL, Ames, PNNL, SNL, LANL, National Center for Supercomputing Applications (Al Geist, ORNL) – $2.2M/yr
– High-End Computer Systems Performance: Science & Engineering. ANL, LLNL, ORNL, U. of Illinois, UCSD, U. of Tennessee, U. of Maryland (David Bailey, LBNL) – $2.4M
– High Performance, Low Latency Parallel Software Component Architecture. ANL, LANL, LLNL, ORNL, PNNL, U. of Utah, Indiana U. (Rob Armstrong, SNL) – $3.1M
– Scientific Data Management Enabling Technology. ANL, LLNL, ORNL, Georgia Tech, UCSD, Northwestern U., North Carolina State (Arie Shoshani, LBNL) – $3M
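The component-architecture center above (the CCA work) organizes software as components that exchange functionality through "provides" and "uses" ports, wired together by a framework. A minimal single-process sketch of that pattern (the class names, port name, and registry here are illustrative, not the actual CCA specification):

```python
class Services:
    """Toy registry that connects 'provides' ports to 'uses' ports."""

    def __init__(self):
        self._ports = {}

    def add_provides_port(self, name, port):
        self._ports[name] = port

    def get_port(self, name):
        return self._ports[name]


class IntegratorComponent:
    """Provides an 'IntegratorPort' (midpoint-rule quadrature)."""

    def set_services(self, svc):
        svc.add_provides_port("IntegratorPort", self)

    def integrate(self, f, a, b, n=1000):
        h = (b - a) / n
        return sum(f(a + (i + 0.5) * h) for i in range(n)) * h


class DriverComponent:
    """Uses the IntegratorPort without knowing which component provides it."""

    def set_services(self, svc):
        self._svc = svc

    def run(self):
        integ = self._svc.get_port("IntegratorPort")
        return integ.integrate(lambda x: x * x, 0.0, 1.0)


# Framework role: instantiate components and connect their ports.
svc = Services()
provider, driver = IntegratorComponent(), DriverComponent()
provider.set_services(svc)
driver.set_services(svc)
result = driver.run()
```

The point of the indirection is that `DriverComponent` only depends on the port name, so a different integrator (or a parallel one) can be swapped in by the framework without touching the driver.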

22 Mathematical, Information and Computational Sciences
Open Source/IP
– Strong emphasis on Open Source
– One license doesn't fit all
– Code from many sources needed
– Independent IP holding organization
  – CCA subset – SNL, ANL, Indiana
  – Goal: code development without FEAR

23 Mathematical, Information and Computational Sciences
The Future of Computational Science at DOE
From Secretary Abraham's speech at BNL on Friday, June 17, 2002:
"The Department is also one of this nation's major sponsors of advanced computers for science. We did this in the first instance for obvious national security reasons. And we have gone on to establish the country's first supercomputer center for science. Now more than ever, however, virtually all science depends on teraflops. The computer is no longer simply a tool for science. Computation is science itself, and enables scientists to understand complex systems that would otherwise remain beyond our grasp. It's an indispensable contributor to our national security work, to nanotechnology, as well as to every other venture we undertake in science. And I intend that this Department maintains America's lead in this critical field."

24 Mathematical, Information and Computational Sciences
Today's Agenda
Morning
– Bob Lucas – Lessons Learned the Hard Way
– Tony Mezzacappa – Software Perspectives of a SciDAC Application Scientist
– Burton Smith – New Ideas, New Architectures
Afternoon
– ISIC Posters
– Base Program Posters

25 Mathematical, Information and Computational Sciences
Thursday's Agenda
Morning Breakout Sessions
– Programming Models and Runtime
– System Software Environment
– Interoperability and Portability Tools
– Visualization and Data Understanding
– If you don't like these …
Afternoon
– Feedback by session chairs
– General discussions, meeting feedback, …