Ensuring Our Nation’s Energy Security – NCSX
Computational Challenges and Directions in the Office of Science
Science for DOE and the Nation
Fred Johnson, Advanced Scientific Computing Research
SOS7, March 2003

2 Outline
Background – Computational science in the Office of Science
SciDAC
Ultrascale Scientific Computing Capability
FY04: Next Generation Computer Architecture
FY05: The future revisited

3 The Office of Science
Supports basic research that underpins DOE missions.
Constructs and operates large scientific facilities for the U.S. scientific community.
– Accelerators, synchrotron light sources, neutron sources, etc.
Five Offices
– Basic Energy Sciences
– Biological and Environmental Research
– Fusion Energy Sciences
– High Energy and Nuclear Physics
– Advanced Scientific Computing Research

4 Computational Science is Critical to the Office of Science Mission
Scientific problems of strategic importance typically:
– Involve physical scales that range over 5-50 orders of magnitude;
– Couple scientific disciplines, e.g., chemistry and fluid dynamics to understand combustion;
– Must be addressed by teams of mathematicians, computer scientists, and application scientists; and
– Utilize facilities that generate millions of gigabytes of data shared among scientists throughout the world.
The Scale of the Problem: Two layers of Fe-Mn-Co containing 2,176 atoms correspond to a wafer approximately fifty nanometers (50 x 10^-9 m) on a side and five nanometers (5 x 10^-9 m) thick. A simulation of the properties of this configuration was performed on the IBM SP at NERSC. The simulation ran for 100 hours at a calculation rate of 2.46 teraflops (a teraflop is one trillion floating point operations per second). To explore material imperfections, the simulation would need to be at least 10 times more compute intensive.
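A quick back-of-the-envelope check of those figures (our arithmetic, not part of the original slide): 100 hours at a sustained 2.46 teraflops amounts to roughly 9 x 10^17 floating-point operations, so a run ten times more compute intensive would need on the order of 9 x 10^18 operations.

    #include <stdio.h>

    int main(void) {
        double rate  = 2.46e12;            /* sustained rate quoted: 2.46 Tflop/s */
        double hours = 100.0;              /* quoted run time */
        double total = rate * hours * 3600.0;
        printf("total operations: %.2e\n", total);          /* ~8.9e17 */
        printf("10x more intensive: %.2e\n", 10.0 * total); /* ~8.9e18 */
        return 0;
    }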

5 Outline
Background – Computational science in the Office of Science
SciDAC
Ultrascale Scientific Computing Capability
FY04: Next Generation Computer Architecture
FY05: The future revisited

6 Scientific Discovery through Advanced Computing (SciDAC)
SciDAC brings the power of terascale computing and information technologies to several scientific areas -- breakthroughs through simulation.
SciDAC is building community simulation models through collaborations among application scientists, mathematicians and computer scientists -- research tools for plasma physics, climate prediction, combustion, etc.
State-of-the-art electronic collaboration tools facilitate access to these tools by the broader scientific community, to bring simulation to a level of parity with theory & observation in the scientific enterprise.

7 Introduction
SciDAC is a pilot program for a “new way of doing science”
– spans the entire Office of Science (ASCR, BES, BER, FES, HENP); a per-office budget breakdown ($37M, 2M+, 8M+, 3M, 7M) appeared as a graphic on the original slide
– involves all DOE labs and many universities
– builds on 50 years of DOE leadership in computation and mathematical software (EISPACK, LINPACK, LAPACK, BLAS, etc.)
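As a small illustration of that mathematical-software lineage (not from the presentation, and assuming the CBLAS C interface to a BLAS library is installed), dense linear algebra in a science code is still typically expressed through BLAS calls such as DGEMM:

    #include <stdio.h>
    #include <cblas.h>   /* C interface to the BLAS; link against a reference or vendor BLAS */

    int main(void) {
        /* C = alpha*A*B + beta*C for small 2x2 matrices stored row-major. */
        double A[4] = {1.0, 2.0, 3.0, 4.0};
        double B[4] = {5.0, 6.0, 7.0, 8.0};
        double C[4] = {0.0, 0.0, 0.0, 0.0};

        cblas_dgemm(CblasRowMajor, CblasNoTrans, CblasNoTrans,
                    2, 2, 2,        /* M, N, K          */
                    1.0, A, 2,      /* alpha, A, lda    */
                    B, 2,           /* B, ldb           */
                    0.0, C, 2);     /* beta, C, ldc     */

        printf("C = [%g %g; %g %g]\n", C[0], C[1], C[2], C[3]);  /* [19 22; 43 50] */
        return 0;
    }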

8 Addressing the Performance Gap through Software
(Chart: peak vs. real performance in teraflops from 1996 onward, showing a widening performance gap.)
Peak performance is skyrocketing
– In the 1990s, peak performance increased 100x; in the 2000s, it will increase 1000x
But...
– Efficiency for many science applications declined from 40-50% on the vector supercomputers of the 1990s to as little as 5-10% on the parallel supercomputers of today
Need research on...
– Mathematical methods and algorithms that achieve high performance on a single processor and scale to thousands of processors
– More efficient programming models for massively parallel supercomputers
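For concreteness, a minimal sketch (illustrative only, not from the slide) of the message-passing programming model used on these massively parallel machines: each MPI process owns a slice of the data, and all communication is explicit, as in this distributed dot product.

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv) {
        MPI_Init(&argc, &argv);
        int rank, size;
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &size);

        /* Each rank owns a slice of two (trivially initialized) vectors. */
        const long n_local = 1000000;
        double local = 0.0;
        for (long i = 0; i < n_local; i++)
            local += 1.0 * 1.0;                 /* local partial dot product */

        /* One collective combines the partial results across all processes. */
        double global = 0.0;
        MPI_Allreduce(&local, &global, 1, MPI_DOUBLE, MPI_SUM, MPI_COMM_WORLD);

        if (rank == 0)
            printf("dot = %.0f on %d ranks\n", global, size);
        MPI_Finalize();
        return 0;
    }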

9 It’s Not Only Hardware!
Updated version of a chart appearing in “Grand Challenges: High Performance Computing and Communications,” OSTP Committee on Physical, Mathematical, and Engineering Sciences, 1992.

10 SciDAC Goals
An INTEGRATED program to:
– (1) create a new generation of scientific simulation codes that take full advantage of the extraordinary capabilities of terascale computers
– (2) create the mathematical and computing systems software to enable scientific simulation codes to effectively and efficiently use terascale computers
– (3) create a collaboratory software environment to enable geographically distributed scientists to work effectively together as a TEAM and to facilitate remote access, through appropriate hardware and middleware infrastructure, to both facilities and data
with the ultimate goal of advancing fundamental research in science central to the DOE mission

11 CSE is Team-Oriented
Successful CSE usually requires teams with members and/or expertise from at least mathematics, computer science, and (several) application areas
– language and culture differences
– usual reward structures focus on the individual
– incompatible with traditional academia
SciDAC will help break down barriers and lead by example; DOE labs are a critical asset

12 The Computer Scientist’s View
Must have Fortran! Must study climate! Must have cycles! Must move data!

13 Applications Scientist View
(Cartoon contrasting the applications scientist and the computer scientist.)

14 Future SciDAC Issues
Additional computing and network resources
– initial SciDAC focus is on software, but new hardware will be needed within the next two years
– both capability and capacity computing needs are evolving rapidly
Limited architectural options available in the U.S. today
– topical computing may be a cost-effective way of providing extra computing resources
– math and CS research will play a key role
Expansion of SciDAC program
– many important SC research areas (e.g., materials/nanoscience, functional genomics/proteomics) are not yet included in SciDAC; NSRCs, GTL

15 Outline
Background – Computational science in the Office of Science
SciDAC
Ultrascale Scientific Computing Capability
FY04: Next Generation Computer Architecture
FY05: The future revisited

16 Motivation: UltraScale Simulation Computing Capability
Mission need: energy production, novel materials, climate science, biological systems
– Systems too complex for direct calculation; descriptive laws absent.
– Involve physical scales up to 50 orders of magnitude;
– Several scientific disciplines, e.g., combustion; materials science
– Experimental data may be costly to develop, insufficient, inadequate or unavailable; and
– Large data files (millions of gigabytes) shared among scientists throughout the world.
History of Accomplishments
– MPI, math libraries, first dedicated high-performance computing center, SciDAC

17 ASCAC Statement Without robust response to Earth Simulator, U.S. is open to losing its leadership in defining and advancing frontiers of computational science as new approach to science. This area is critical to both our national security and economic vitality. (Advanced Scientific Computing Advisory Committee – May 21, 2002).

18 Simulation Capability Needs, FY Timeframe
(Columns: Application / Simulation Need / Sustained Computational Capability Needed (Tflops) / Significance)
Climate Science: Calculate chemical balances in atmosphere, including clouds, rivers, and vegetation. (> 50 Tflops) Provides U.S. policymakers with leadership data to support policy decisions. Properly represent and predict extreme weather conditions in changing climate.
Magnetic Fusion Energy: Optimize balance between self-heating of plasma and heat leakage caused by electromagnetic turbulence. (> 50 Tflops) Underpins U.S. decisions about future international fusion collaborations. Integrated simulations of burning plasma crucial for quantifying prospects for commercial fusion.
Combustion Science: Understand interactions between combustion and turbulent fluctuations in burning fluid. (> 50 Tflops) Understand detonation dynamics (e.g., engine knock) in combustion systems. Solve the “soot” problem in diesel engines.
Environmental Molecular Science: Reliably predict chemical and physical properties of radioactive substances. (> 100 Tflops) Develop innovative technologies to remediate contaminated soils and groundwater.
Astrophysics: Realistically simulate the explosion of a supernova for the first time. (>> 100 Tflops) Measure size and age of Universe and rate of expansion of Universe. Gain insight into inertial fusion processes.

19 Key Ideas
– Deliver a full suite of leadership class computers for science with broad applicability.
– Establish a model for computational sciences (SciDAC and base programs) that couples applications scientists, mathematicians, and computational and computer scientists with computer designers, engineers, and semiconductor researchers.
– Develop partnerships with domestic computer vendors to ensure that leadership class computers are designed, developed, and produced with science needs as an explicit design criterion.
– Partner with other agencies.
– Partner with industry on applications.

20 FY 2004 Request to OMB: USSCC (UltraScale Scientific Computing Capability)
Supporting R&D – 30%
– Research with Domestic Vendors – Develop ultrascale hardware and software capabilities for advancing science, focusing on faster interconnects and switches.
o Continue 2 partnerships begun in FY2003
o Initiate additional partnerships (up to 3) in FY2004, based on competitive review
– Operating Systems, Software Environments, and Tools
o Address issues to ensure scalability of operating systems to meet science needs
o Develop enhanced numerical libraries for scientific simulations
o Develop tools to analyze application performance on ultrascale computer systems
– University-based Computer Architecture Research – Explore future generations of computer architectures for ultrascale science simulation.

21 FY 2004 Request to OMB: USSCC (UltraScale Scientific Computing Capability)
Computing and Network Facilities – 70%
– Computer architecture evaluation partnerships – Evaluate computer architectures at levels sufficient to ensure that computer hardware and systems software are balanced for science and likely to scale successfully
o Continue partnership established in FY2002 between ORNL and Cray, Inc.
o Initiate one new partnership, pairing scientists and engineers from a domestic computer vendor with computer scientists and applications scientists supported by the Office of Science.
o Award partnership from a competition among invited vendors
– Begin installation of first ultrascale computing system for science

22 Outline
Background – Computational science in the Office of Science
SciDAC
Ultrascale Scientific Computing Capability
FY04: Next Generation Computer Architecture
FY05: The future revisited

23 Next Generation Computer Architecture
Goal: Identify and address major hardware and software architectural bottlenecks to the performance of existing and planned DOE science applications
Main Activities
– Architecture impacts on application performance
– OS/runtime research
– Evaluation testbeds

24 Outline
Background – Computational science in the Office of Science
SciDAC
Ultrascale Scientific Computing Capability
FY04: Next Generation Computer Architecture
FY05: The future revisited

25 How full is the glass?
Support and enthusiasm within the Office of Science
– Office of Science Strategic Plan
Interagency cooperation/coordination
– NSA SV2
– DOD IHEC
– DARPA HPCS
– NNSA: program reviews, open source, NAS study, Red Storm, …
– DARPA/DOD/SC USSCC meeting
– OSTP/NNSA/DOD/SC NGCA meeting
OSTP support
International coordination
– Hawaii meeting
– ES benchmarking

26 Agency Coordination Overview Matrix
(Columns: Research Coordination / Development Coordination / Strategy Coordination)
NNSA: $17M research funded at NNSA laboratories (research); Red Storm development (development); formal coordination documents (strategy)
DOD – DUSD Science and Technology: IHEC study
DARPA: HPCS review team; HPCS evaluation system plan
NSA: UPC (research); Cray SV2/X1 development (development)
All Agencies: HECCWG

27 NNSA Details
Research Coordination: $17M research funded at NNSA laboratories; lightweight kernel, common component architecture, performance engineering, …
Development Coordination: Red Storm development; quarterly review meetings, ASCI Q review, ASCI PSE review, SciDAC reviews, …
Strategy Coordination: Formal coordination documents; jointly funded NAS study, open source software thrust, platform evaluation

28 NSA Details
Research Coordination: UPC (Lauren Smith), Programming Models (Bill Carlson), Benchmarking (Candy Culhane)
Development Coordination: Cray SV2/X1 development, Cray Black Widow development (quarterly review meetings)

29 DOD and DARPA Details
DOD – DUSD Science and Technology: IHEC study, agreement on SC role in IHEC
DARPA: HPCS review team – Phase I and Phase II; review of Cray, IBM, HP, Sun and SGI projects. HPCS evaluation system plan; agreement on SC role as HPCS early evaluator at scale

30 DARPA High Productivity Computing Systems (HPCS) Program
Goal: Provide a new generation of economically viable high productivity computing systems for the national security and industrial user community (2007 – 2010)
HPCS Program Focus Areas / Impact:
– Performance (efficiency): improve critical national security applications by a factor of 10X to 40X
– Productivity (time-to-solution)
– Portability (transparency): insulate research and operational application software from the system
– Robustness (reliability): apply all known techniques to protect against outside attacks, hardware faults, & programming errors
Fill the critical technology and capability gap: today (late 80’s HPC technology) ….. to ….. future (quantum/bio computing)
Applications: intelligence/surveillance, reconnaissance, cryptanalysis, weapons analysis, airborne contaminant modeling and biotechnology

31 Computing Metric Evolution
Early Computing Metrics
– Clock frequency
– Raw performance (flops)
– The GHz race
Current Computing Metrics
– Clock frequency
– Point performance
– Acquisition price
– The teraflop race (Top Ten HPC Centers)
HPCS “Value” Based Metrics
– System performance relative to application diversity
– Scalability (flops-to-petaflops)
– Idea-to-solution
– Time-to-solution
– Mean time-to-recovery
– Robustness (includes security)
– Evolvability
– Application life cycle costs
– Acquisition (facilities and equipment) costs
– Ownership (facilities, support staff, training) costs

32 Memory System Performance Limitations
Why applications with limited memory reuse perform inefficiently today
– STREAM ADD: computes A + B for long vectors A and B (historical data available)
– New microprocessor generations “reset” performance to at most 6% of peak
– Performance degrades to 1-3% of peak as clock speed increases within a generation
Goal: benchmarks that relate application performance to memory reuse and other factors
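For reference, the STREAM “Add” kernel is essentially the loop below (a minimal sketch, not the official benchmark code): it does one floating-point add per three memory accesses, so its speed is set by memory bandwidth rather than by peak flops, which is exactly the limitation the slide describes.

    #include <stdio.h>
    #include <stdlib.h>
    #include <time.h>

    #define N 10000000L   /* long vectors so the data cannot stay in cache */

    int main(void) {
        double *a = malloc(N * sizeof(double));
        double *b = malloc(N * sizeof(double));
        double *c = malloc(N * sizeof(double));
        if (!a || !b || !c) return 1;
        for (long j = 0; j < N; j++) { a[j] = 1.0; b[j] = 2.0; c[j] = 0.0; }

        clock_t t0 = clock();
        for (long j = 0; j < N; j++)          /* STREAM "Add": c = a + b */
            c[j] = a[j] + b[j];
        double secs = (double)(clock() - t0) / CLOCKS_PER_SEC;

        /* Three arrays of 8-byte elements move through memory per iteration. */
        printf("approx. bandwidth: %.2f GB/s\n",
               3.0 * N * sizeof(double) / secs / 1e9);
        free(a); free(b); free(c);
        return 0;
    }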

33 Phase I HPCS Industry Teams
– Cray, Incorporated
– International Business Machines Corporation (IBM)
– Silicon Graphics, Inc. (SGI)
– Sun Microsystems, Inc.
– Hewlett-Packard Company

34 The Future of Supercomputing
National Academy CSTB study, co-funded by ASCR and NNSA
18-month duration
Co-chairs: Susan Graham, Marc Snir
Kick-off meeting 3/6/03
The committee will assess the status of supercomputing in the United States, including the characteristics of relevant systems and architecture research in government, industry, and academia and the characteristics of the relevant market. …

35 High End Computing Revitalization Task Force
OSTP interagency thrust; HEC is an administration priority for FY05
Task Force to address:
– HEC core technology R&D
– Federal HEC capability, capacity and accessibility
– Issues related to Federal procurement of HEC systems
“It is expected that the Task Force recommendations will be considered in preparing the President’s budget for FY2005 and beyond.”
Kick-off meeting March 10, 2003; co-chairs: John Grosh, DOD and Alan Laub, DOE

36 Links
– SciDAC
– Genomes to Life
– Nanoscale Science, Engineering, and Technology Research
– UltraScale Simulation Planning