
Slide 1: Ensuring Our Nation’s Energy Security (NCSX)
Computational Challenges and Directions in the Office of Science
Fred Johnson, Advanced Scientific Computing Research
SOS7, March 2003
Science for DOE and the Nation: www.science.doe.gov

Slide 2: Outline
– Background: computational science in the Office of Science
– SciDAC
– Ultrascale Scientific Computing Capability
– FY04: Next Generation Computer Architecture
– FY05: The future revisited

Slide 3: The Office of Science
– Supports basic research that underpins DOE missions.
– Constructs and operates large scientific facilities for the U.S. scientific community: accelerators, synchrotron light sources, neutron sources, etc.
– Five offices: Basic Energy Sciences; Biological and Environmental Research; Fusion Energy Sciences; High Energy and Nuclear Physics; Advanced Scientific Computing Research.

Slide 4: Computational Science is Critical to the Office of Science Mission
Scientific problems of strategic importance typically:
– involve physical scales that range over 5-50 orders of magnitude;
– couple scientific disciplines, e.g., chemistry and fluid dynamics to understand combustion;
– must be addressed by teams of mathematicians, computer scientists, and application scientists; and
– utilize facilities that generate millions of gigabytes of data shared among scientists throughout the world.
The Scale of the Problem: two layers of Fe-Mn-Co containing 2,176 atoms correspond to a wafer approximately fifty nanometers (50 × 10⁻⁹ m) on a side and five nanometers (5 × 10⁻⁹ m) thick. A simulation of the properties of this configuration was performed on the IBM SP at NERSC, running for 100 hours at a calculation rate of 2.46 teraflops (2.46 trillion floating-point operations per second). To explore material imperfections, the simulation would need to be at least 10 times more compute intensive.
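A hedged back-of-envelope count of the arithmetic in that run, assuming the quoted rate was sustained for the full 100 hours:

2.46 × 10¹² flop/s × 100 h × 3,600 s/h ≈ 8.9 × 10¹⁷ floating-point operations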

Slide 5: Outline
– Background: computational science in the Office of Science
– SciDAC
– Ultrascale Scientific Computing Capability
– FY04: Next Generation Computer Architecture
– FY05: The future revisited

Slide 6: Scientific Discovery through Advanced Computing (SciDAC)
– SciDAC brings the power of terascale computing and information technologies to several scientific areas: breakthroughs through simulation.
– SciDAC is building community simulation models through collaborations among application scientists, mathematicians, and computer scientists: research tools for plasma physics, climate prediction, combustion, etc.
– State-of-the-art electronic collaboration tools facilitate access to these tools by the broader scientific community, bringing simulation to parity with theory and observation in the scientific enterprise.

Slide 7: Introduction
– SciDAC is a pilot program for a “new way of doing science.”
– Spans the entire Office of Science (ASCR, BES, BER, FES, HENP); $37M total budget. [Budget chart: the per-office figures (2M+, 8M+, 3M, 7M) survive only partially from the original slide.]
– Involves all DOE labs and many universities.
– Builds on 50 years of DOE leadership in computation and mathematical software (EISPACK, LINPACK, LAPACK, BLAS, etc.).
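To illustrate what those libraries provide, here is a minimal sketch in C (not from the slides) calling the classic BLAS level-1 routine daxpy, which computes y ← αx + y, through the standard CBLAS interface; the array sizes and values are purely illustrative:

```c
#include <stdio.h>
#include <cblas.h>   /* link with -lcblas or a vendor BLAS */

int main(void) {
    double x[4] = {1.0, 2.0, 3.0, 4.0};
    double y[4] = {10.0, 20.0, 30.0, 40.0};

    /* y <- 2.0 * x + y, with unit stride through both vectors */
    cblas_daxpy(4, 2.0, x, 1, y, 1);

    for (int i = 0; i < 4; i++)
        printf("y[%d] = %g\n", i, y[i]);   /* prints 12, 24, 36, 48 */
    return 0;
}
```

The point of the libraries listed on the slide is exactly this separation: the calling code stays the same while vendors tune the kernel underneath for each machine.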

Slide 8: Addressing the Performance Gap through Software
[Chart: peak vs. real performance, 0.1 to 1,000 teraflops on a log scale, 1996-2004; the widening area between the curves is labeled “Performance Gap.”]
– Peak performance is skyrocketing: in the 1990s, peak performance increased 100x; in the 2000s, it will increase 1000x.
– But efficiency for many science applications declined from 40-50% on the vector supercomputers of the 1990s to as little as 5-10% on the parallel supercomputers of today.
– Need research on mathematical methods and algorithms that achieve high performance on a single processor and scale to thousands of processors, and on more efficient programming models for massively parallel supercomputers.
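The programming model in question here is message passing. As a minimal sketch (not from the slides) of the style of code that must scale to thousands of processors, a distributed dot product in C using MPI, with an illustrative per-rank array size:

```c
#include <stdio.h>
#include <stdlib.h>
#include <mpi.h>

int main(int argc, char **argv) {
    MPI_Init(&argc, &argv);
    int rank, nprocs;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &nprocs);

    const long n_local = 1000000;   /* elements per rank; illustrative */
    double *x = malloc(n_local * sizeof *x);
    double *y = malloc(n_local * sizeof *y);
    for (long i = 0; i < n_local; i++) { x[i] = 1.0; y[i] = 2.0; }

    /* each rank computes its partial dot product independently... */
    double local = 0.0;
    for (long i = 0; i < n_local; i++)
        local += x[i] * y[i];

    /* ...then one collective combines the partial sums on all ranks */
    double global = 0.0;
    MPI_Allreduce(&local, &global, 1, MPI_DOUBLE, MPI_SUM, MPI_COMM_WORLD);

    if (rank == 0)
        printf("dot = %g on %d ranks\n", global, nprocs);
    free(x); free(y);
    MPI_Finalize();
    return 0;
}
```

Even in this toy, the efficiency question is visible: the local loop is memory-bound and the collective is latency-bound, so sustained performance sits far below the machine's peak flop rate.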

Slide 9: It’s Not Only Hardware!
[Updated version of a chart appearing in “Grand Challenges: High Performance Computing and Communications,” OSTP Committee on Physical, Mathematical, and Engineering Sciences, 1992.]

Slide 10: SciDAC Goals
An INTEGRATED program to:
– (1) create a new generation of scientific simulation codes that take full advantage of the extraordinary capabilities of terascale computers;
– (2) create the mathematical and computing systems software to enable scientific simulation codes to effectively and efficiently use terascale computers; and
– (3) create a collaboratory software environment to enable geographically distributed scientists to work together effectively as a TEAM and to facilitate remote access, through appropriate hardware and middleware infrastructure, to both facilities and data;
with the ultimate goal of advancing fundamental research in science central to the DOE mission.

Slide 11: CSE is Team-Oriented
– Successful CSE usually requires teams with members and/or expertise from at least mathematics, computer science, and (several) application areas.
– Language and culture differences separate the fields.
– The usual reward structures focus on the individual, incompatible with traditional academia.
– SciDAC will help break down barriers and lead by example; DOE labs are a critical asset.

Slide 12: The Computer Scientist’s View
[Cartoon slide. Captions: “Must have Fortran!” “Must study climate!” “Must have cycles!” “Must move data!”]

Slide 13: Applications Scientist View
[Cartoon slide showing the applications scientist and the computer scientist.]

Slide 14: Future SciDAC Issues
– Additional computing and network resources: the initial SciDAC focus is on software, but new hardware will be needed within the next two years; both capability and capacity computing needs are evolving rapidly.
– Limited architectural options available in the U.S. today: topical computing may be a cost-effective way of providing extra computing resources; math and CS research will play a key role.
– Expansion of the SciDAC program: many important SC research areas (e.g., materials/nanoscience, functional genomics/proteomics) are not yet included in SciDAC; NSRCs, GTL.

Slide 15: Outline
– Background: computational science in the Office of Science
– SciDAC
– Ultrascale Scientific Computing Capability
– FY04: Next Generation Computer Architecture
– FY05: The future revisited

Slide 16: Motivation: UltraScale Simulation Computing Capability
Mission need: energy production, novel materials, climate science, biological systems.
– Systems too complex for direct calculation; descriptive laws absent.
– Involve physical scales up to 50 orders of magnitude.
– Couple several scientific disciplines, e.g., combustion, materials science.
– Experimental data may be costly to develop, insufficient, inadequate, or unavailable.
– Large data files (millions of gigabytes) shared among scientists throughout the world.
History of accomplishments: MPI, math libraries, the first dedicated high-performance computing center, SciDAC.

Slide 17: ASCAC Statement
“Without robust response to Earth Simulator, U.S. is open to losing its leadership in defining and advancing frontiers of computational science as new approach to science. This area is critical to both our national security and economic vitality.” (Advanced Scientific Computing Advisory Committee, May 21, 2002)

Slide 18: Simulation Capability Needs, FY2004-05 Timeframe
(Each entry lists the simulation need, the sustained computational capability needed, and the significance.)
– Climate Science. Need: calculate chemical balances in the atmosphere, including clouds, rivers, and vegetation. Sustained capability: > 50 Tflops. Significance: provides U.S. policymakers with leadership data to support policy decisions; properly represent and predict extreme weather conditions in a changing climate.
– Magnetic Fusion Energy. Need: optimize the balance between self-heating of plasma and heat leakage caused by electromagnetic turbulence. Sustained capability: > 50 Tflops. Significance: underpins U.S. decisions about future international fusion collaborations; integrated simulations of burning plasma are crucial for quantifying the prospects for commercial fusion.
– Combustion Science. Need: understand interactions between combustion and turbulent fluctuations in burning fluid. Sustained capability: > 50 Tflops. Significance: understand detonation dynamics (e.g., engine knock) in combustion systems; solve the “soot” problem in diesel engines.
– Environmental Molecular Science. Need: reliably predict chemical and physical properties of radioactive substances. Sustained capability: > 100 Tflops. Significance: develop innovative technologies to remediate contaminated soils and groundwater.
– Astrophysics. Need: realistically simulate the explosion of a supernova for the first time. Sustained capability: >> 100 Tflops. Significance: measure the size, age, and expansion rate of the Universe; gain insight into inertial fusion processes.
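A hedged implication, combining this table with the 5-10% efficiency figure from slide 8: these are sustained requirements, so meeting the > 50 Tflops entries would require machines of roughly

50 Tflops sustained ÷ (0.05 to 0.10 efficiency) ≈ 500 to 1,000 Tflops peak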

Slide 19: Key Ideas
– Deliver a full suite of leadership-class computers for science with broad applicability.
– Establish a model for computational sciences (SciDAC and base programs) that couples applications scientists, mathematicians, and computational and computer scientists with computer designers, engineers, and semiconductor researchers.
– Develop partnerships with domestic computer vendors to ensure that leadership-class computers are designed, developed, and produced with science needs as an explicit design criterion.
– Partner with other agencies.
– Partner with industry on applications.

Slide 20: FY 2004 Request to OMB: UltraScale Scientific Computing Capability (USSCC)
Supporting R&D (30%):
– Research with domestic vendors: develop ultrascale hardware and software capabilities for advancing science, focusing on faster interconnects and switches.
  o Continue 2 partnerships begun in FY2003.
  o Initiate additional partnerships (up to 3) in FY2004, based on competitive review.
– Operating systems, software environments, and tools:
  o Address issues to ensure scalability of operating systems to meet science needs.
  o Develop enhanced numerical libraries for scientific simulations.
  o Develop tools to analyze application performance on ultrascale computer systems.
– University-based computer architecture research: explore future generations of computer architectures for ultrascale science simulation.

Slide 21: FY 2004 Request to OMB: UltraScale Scientific Computing Capability (USSCC), continued
Computing and Network Facilities (70%):
– Computer architecture evaluation partnerships: evaluate computer architectures at levels that ensure computer hardware and systems software are balanced for science and likely to scale successfully.
  o Continue the partnership established in FY2002 between ORNL and Cray, Inc.
  o Initiate one new partnership pairing scientists and engineers from a domestic computer vendor with computer scientists and applications scientists supported by the Office of Science.
  o Award the partnership from a competition among invited vendors.
– Begin installation of the first ultrascale computing system for science.

Slide 22: Outline
– Background: computational science in the Office of Science
– SciDAC
– Ultrascale Scientific Computing Capability
– FY04: Next Generation Computer Architecture
– FY05: The future revisited

Slide 23: Next Generation Computer Architecture
Goal: identify and address major hardware and software architectural bottlenecks to the performance of existing and planned DOE science applications.
Main activities:
– Architecture impacts on application performance
– OS/runtime research
– Evaluation testbeds

Slide 24: Outline
– Background: computational science in the Office of Science
– SciDAC
– Ultrascale Scientific Computing Capability
– FY04: Next Generation Computer Architecture
– FY05: The future revisited

Slide 25: How Full Is the Glass?
– Support and enthusiasm within the Office of Science: Office of Science Strategic Plan.
– Interagency cooperation/coordination: NSA SV2; DOD IHEC; DARPA HPCS; NNSA (program reviews, open source, NAS study, Red Storm, ...); DARPA/DOD/SC USSCC meeting; OSTP/NNSA/DOD/SC NGCA meeting.
– OSTP support.
– International coordination: Hawaii meeting; ES benchmarking.

Slide 26: Agency Coordination Overview Matrix
(Columns: research coordination, development coordination, strategy coordination.)
– NNSA: $17M research funded at NNSA laboratories (research); Red Storm development (development); formal coordination documents (strategy).
– DOD (DUSD Science and Technology): IHEC study.
– DARPA: HPCS review team (research); HPCS evaluation system plan (development).
– NSA: UPC (research); Cray SV2/X1 development (development).
– All agencies: HECCWG.

Slide 27: NNSA Details
– Research coordination: $17M of research funded at NNSA laboratories; lightweight kernel, common component architecture, performance engineering, ...
– Development coordination: Red Storm development; quarterly review meetings, ASCI Q review, ASCI PSE review, SciDAC reviews, ...
– Strategy coordination: formal coordination documents, jointly funded NAS study, open-source software thrust, platform evaluation.

Slide 28: NSA Details
– Research coordination: UPC (Lauren Smith), programming models (Bill Carlson), benchmarking (Candy Culhane).
– Development coordination: Cray SV2/X1 development; Cray Black Widow development (quarterly review meetings).

Slide 29: DOD and DARPA Details
– DOD (DUSD Science and Technology): IHEC study; agreement on SC role in IHEC.
– DARPA, research coordination: HPCS review team (Phase I, Phase II, and Phase III); review of the Cray, IBM, HP, Sun, and SGI projects.
– DARPA, development coordination: HPCS evaluation system plan; agreement on SC role as HPCS early evaluator at scale.

Slide 30: DARPA High Productivity Computing Systems (HPCS) Program
Goal: provide a new generation of economically viable, high-productivity computing systems for the national security and industrial user community (2007-2010).
HPCS program focus areas and impact:
– Performance (efficiency): speed up critical national security applications by a factor of 10X to 40X.
– Productivity (time-to-solution).
– Portability (transparency): insulate research and operational application software from the system.
– Robustness (reliability): apply all known techniques to protect against outside attacks, hardware faults, and programming errors.
Fill the critical technology and capability gap from today (late-80s HPC technology) to the future (quantum/bio computing).
Applications: intelligence/surveillance, reconnaissance, cryptanalysis, weapons analysis, airborne contaminant modeling, and biotechnology.

Slide 31: Computing Metric Evolution
– Early computing metrics: clock frequency; raw performance (flops). The "GHz race."
– Current computing metrics: clock frequency; point performance; acquisition price. The teraflop race (top ten HPC centers).
– HPCS "value"-based metrics: system performance relative to application diversity; scalability (flops to petaflops); idea-to-solution and time-to-solution; mean time-to-recovery; robustness (includes security); evolvability; application life-cycle costs, both acquisition (facilities and equipment) and ownership (facilities, support staff, training).

Slide 32: Memory System Performance Limitations
Why applications with limited memory reuse perform inefficiently today:
– STREAMS ADD computes A + B for long vectors A and B (historical data available).
– New microprocessor generations "reset" performance to at most 6% of peak.
– Performance degrades to 1-3% of peak as clock speed increases within a generation.
– Goal: benchmarks that relate application performance to memory reuse and other factors.
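For reference, a minimal sketch in C of the STREAM-style ADD kernel the slide refers to; the array length and timing harness are illustrative, not the official benchmark:

```c
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

#define N 20000000L  /* 20M doubles per array (~160MB each); must exceed cache */

int main(void) {
    double *a = malloc(N * sizeof *a);
    double *b = malloc(N * sizeof *b);
    double *c = malloc(N * sizeof *c);
    if (!a || !b || !c) return 1;
    for (long i = 0; i < N; i++) { a[i] = 1.0; b[i] = 2.0; }

    struct timespec t0, t1;
    clock_gettime(CLOCK_MONOTONIC, &t0);   /* POSIX timer */
    for (long i = 0; i < N; i++)
        c[i] = a[i] + b[i];                /* the ADD kernel: no memory reuse */
    clock_gettime(CLOCK_MONOTONIC, &t1);

    double sec = (t1.tv_sec - t0.tv_sec) + (t1.tv_nsec - t0.tv_nsec) * 1e-9;
    /* three arrays of 8-byte doubles stream through memory: 2 reads, 1 write */
    printf("bandwidth: %.2f GB/s, rate: %.2f Mflop/s\n",
           3.0 * 8.0 * N / sec / 1e9, (double)N / sec / 1e6);
    free(a); free(b); free(c);
    return 0;
}
```

Because each element is touched exactly once, the loop's flop rate is pinned to memory bandwidth rather than the processor's peak, which is exactly why the achieved fraction of peak on such codes is so small and shrinks as clock speeds rise.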

Slide 33: Phase I HPCS Industry Teams
– Cray, Inc.
– International Business Machines Corporation (IBM)
– Silicon Graphics, Inc. (SGI)
– Sun Microsystems, Inc.
– Hewlett-Packard Company

Slide 34: The Future of Supercomputing
– National Academy CSTB study, co-funded by ASCR and NNSA; 18-month duration.
– Co-chairs: Susan Graham and Marc Snir. Kick-off meeting 3/6/03.
– "The committee will assess the status of supercomputing in the United States, including the characteristics of relevant systems and architecture research in government, industry, and academia and the characteristics of the relevant market. ..."
– http://www.cstb.org/project_supercomputing.html

Slide 35: High End Computing Revitalization Task Force
– OSTP interagency thrust; HEC is an administration priority for FY05.
– Task Force to address: HEC core technology R&D; federal HEC capability, capacity, and accessibility; issues related to federal procurement of HEC systems.
– "It is expected that the Task Force recommendations will be considered in preparing the President's budget for FY2005 and beyond."
– Kick-off meeting March 10, 2003. Co-chairs: John Grosh, DOD, and Alan Laub, DOE.

Slide 36: Links
– SciDAC: http://www.osti.doe.gov/scidac
– Genomes to Life: http://www.doegenomestolife.org/
– Nanoscale Science, Engineering, and Technology Research: http://www.sc.doe.gov/production/bes/NNI.htm and http://www.science.doe.gov/bes/Theory_and_Modeling_in_Nanoscience.pdf
– UltraScale Simulation Planning: http://www.ultrasim.info/

