HPC in 2029: Will The March to ZettaFLOPS Succeed? William Gropp www.cs.uiuc.edu/~wgropp.



Presentation transcript:

1 HPC in 2029: Will The March to ZettaFLOPS Succeed? William Gropp www.cs.uiuc.edu/~wgropp

2 Extrapolation is Risky
1989 – T minus 20 years
 - Intel introduces the 486DX
 - Eugene Brooks writes “Attack of the Killer Micros”
 - 4 years before the TOP500 list
 - Top systems at about 2 GF peak
1999 – T minus 10 years
 - NVIDIA introduces the GPU (GeForce 256); programming GPUs is still a challenge
 - Top system – ASCI Red, 9632 cores, 3.2 TF peak
 - MPI is 7 years old

3 HPC Today
High(est)-end systems
 - 1 PF (10^15 ops/s) achieved on a few “peak friendly” applications
 - Much worry about scalability and how we’re going to get to an ExaFLOPS
 - Systems are all oversubscribed
   - DOE INCITE awarded almost 900M processor hours in 2009; many requests were turned away
   - NSF PRAC awards for Blue Waters similarly competitive
Widespread use of clusters, many with accelerators; cloud computing services
Laptops (far) more powerful than the supercomputers I used as a graduate student

4 NSF’s Strategy for High-end Computing
[Chart: science and engineering capability (logarithmic scale) vs. fiscal year, FY’07–FY’11]
 - Track 1 System: UIUC/NCSA (~1 PF sustained)
 - Track 2 Systems: TACC (500+ TF peak), UT/ORNL (~1 PF peak), Track 2d at PSC (?)
 - Track 3 Systems: leading university HPC centers (10–100 TF)

5 HPC in 2011
Sustained PF systems
 - NSF Track 1 “Blue Waters” at Illinois
 - “Sequoia” Blue Gene/Q at LLNL
 - Undoubtedly others
Still programmed with MPI and MPI+other (e.g., MPI+OpenMP; a minimal hybrid sketch follows this slide)
 - But in many cases using toolkits, libraries, and other approaches
 - And not so bad – applications will be able to run when the system is turned on
 - Replacing MPI will require some compromise – e.g., domain-specific (higher-level but less general) approaches
Still can’t compile single-threaded code to reliably get good performance – see the work on autotuners
 - Lesson – there’s a limit to what can be automated; pretending that there’s an automatic solution will stand in the way of a real solution
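The slide’s “MPI+other” refers to the common hybrid model in which MPI handles communication between nodes and OpenMP handles threading within a node. The following is a minimal sketch of that pattern, not code from the talk; the compile and run commands are assumptions that depend on the local toolchain.

/* Minimal hybrid MPI+OpenMP sketch: each MPI rank spawns OpenMP threads
 * and reports its rank/thread pair.
 * Compile (assumed toolchain): mpicc -fopenmp hybrid.c -o hybrid
 * Run:                         mpiexec -n 4 ./hybrid
 */
#include <mpi.h>
#include <omp.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    int provided, rank, nranks;

    /* Request threaded MPI; MPI_THREAD_FUNNELED suffices when only the
     * master thread makes MPI calls. */
    MPI_Init_thread(&argc, &argv, MPI_THREAD_FUNNELED, &provided);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &nranks);

    #pragma omp parallel
    {
        printf("rank %d of %d, thread %d of %d\n",
               rank, nranks, omp_get_thread_num(), omp_get_num_threads());
    }

    MPI_Finalize();
    return 0;
}

In practice the OpenMP region would wrap the node-local compute loops, with MPI calls kept outside of (or funneled through) the parallel region.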

6 HPC in 2019
Exascale (10^18 ops/s) systems arrive
 - Issues include power, concurrency, fault resilience, memory capacity
Likely features
 - Memory per core (or functional unit) smaller than on today’s systems
 - 10^8–10^9 threads
 - Heterogeneous processing elements
Software will be different
 - You can use MPI, but constraints will get in your way
 - Likely a combination of tools, with domain-specific solutions and some automated code generation
Algorithms need to change/evolve
 - Extreme scalability, reduced memory
 - Managed locality
 - Participate in fault tolerance (one possible form of this is sketched after this slide)
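One way an algorithm can “participate in fault tolerance” is application-level checkpointing: the code periodically writes its own essential state so a failed run can restart from the last checkpoint instead of from the beginning. The sketch below illustrates that idea and is not prescribed by the talk; the file name, data layout, and checkpoint interval are all assumptions.

/* Hypothetical application-level checkpoint with MPI-IO: every rank writes
 * its local slice of the solution array at its own offset in a shared file. */
#include <mpi.h>

#define NLOCAL 1000   /* local array size per rank (illustrative) */

static void write_checkpoint(const double *u, int n, MPI_Comm comm)
{
    int rank;
    MPI_File fh;
    MPI_Comm_rank(comm, &rank);

    MPI_File_open(comm, "ckpt.bin",
                  MPI_MODE_CREATE | MPI_MODE_WRONLY, MPI_INFO_NULL, &fh);
    /* Collective write: rank r owns the r-th block of n doubles. */
    MPI_File_write_at_all(fh, (MPI_Offset)rank * n * sizeof(double),
                          u, n, MPI_DOUBLE, MPI_STATUS_IGNORE);
    MPI_File_close(&fh);
}

int main(int argc, char **argv)
{
    MPI_Init(&argc, &argv);

    double u[NLOCAL] = {0};              /* local piece of the global state */
    for (int step = 0; step < 10000; step++) {
        /* ... advance u by one time step ... */
        if (step % 1000 == 0)            /* checkpoint interval is a tunable */
            write_checkpoint(u, NLOCAL, MPI_COMM_WORLD);
    }

    MPI_Finalize();
    return 0;
}

At exascale concurrency the cost of such global checkpoints grows, which is part of why the slide asks algorithms themselves to help with resilience (for example, by keeping the state that must be saved small).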

7 HPC in 2029
Will we even have Zettaflops (10^21 ops/s)?
 - Unlikely (but not impossible) in a single (even highly parallel) system
   - Power (again) – you need an extra 1000-fold improvement in results/Joule (the back-of-the-envelope arithmetic is sketched after this slide)
   - Concurrency – 10^11–10^12 threads (!)
   - See the Zettaflops workshops – www.zettaflops.org
 - Will require new device technology
Will the high-end have reached a limit after Exascale systems?
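The 1000-fold figure follows from holding the power budget fixed while multiplying the operation rate by 1000 (zettaFLOPS vs. exaFLOPS). The small program below works through that arithmetic; the 20 MW exascale power budget is an illustrative assumption, not a number from the slides.

/* Back-of-the-envelope power arithmetic behind the "extra 1000-fold
 * improvement in results/Joule" claim. The 20 MW budget is assumed. */
#include <stdio.h>

int main(void)
{
    const double exa_flops   = 1e18;   /* exascale operation rate (ops/s)   */
    const double zetta_flops = 1e21;   /* zettascale operation rate (ops/s) */
    const double exa_power_w = 20e6;   /* assumed exascale power budget (W) */

    /* Energy per operation at exascale efficiency: 20 MW / 1e18 ops/s. */
    double joules_per_op = exa_power_w / exa_flops;          /* 20 pJ/op */

    /* A zettascale machine at that same efficiency. */
    double zetta_power_w = zetta_flops * joules_per_op;      /* 20 GW */

    printf("exascale efficiency: %.0f pJ/op\n", joules_per_op * 1e12);
    printf("zettascale at same efficiency: %.0f GW\n", zetta_power_w / 1e9);
    printf("efficiency gain needed at constant power: %.0fx\n",
           zetta_flops / exa_flops);
    return 0;
}

Whatever budget one assumes, the ratio is the same: 1000 times more operations per second at constant power requires 1000 times more results per Joule.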

8 The HPC Pyramid in 1993
[Pyramid diagram, broad base to narrow top:]
 - High-performance workstations
 - Mid-range parallel processors and networked workstations
 - Center supercomputers (TeraFLOP class)

9 The HPC Pyramid in 2029 (?)
[Pyramid diagram, broad base to narrow top:]
 - Laptops, phones, wristwatches, eyeglasses…
 - Single-cabinet petascale systems (or attack of the killer GPU successors)
 - Center exascale supercomputers

10 Blue Waters Project Petascale Allocation Awards
1. Computational Chemistry at the Petascale
   Monica Lamm, Mark Gordon, Theresa Windus, Masha Sosonkina, Brett Bode, Iowa State University
2. Testing Hypotheses about Climate Prediction at Unprecedented Resolutions on the Blue Waters System
   David Randall, Ross Heikes, Colorado State University; William Large, Richard Loft, John Dennis, Mariana Vertenstein, National Center for Atmospheric Research; Cristiana Stan, James Kinter, Institute for Global Environment and Society; Benjamin Kirtman, University of Miami
3. Petascale Research in Earthquake System Science on Blue Waters
   Thomas Jordan, Jacobo Bielak, University of Southern California
4. Breakthrough Petascale Quantum Monte Carlo Calculations
   Shiwei Zhang, College of William and Mary
5. Electronic Properties of Strongly Correlated Systems Using Petascale Computing
   Sergey Savrasov, University of California, Davis; Kristjan Haule, Gabriel Kotliar, Rutgers University

11 Blue Waters Project Petascale Allocation Awards
6. Understanding Tornados and Their Parent Supercells Through Ultra-High Resolution Simulation/Analysis
   Robert Wilhelmson, Brian Jewett, Matthew Gilmore, University of Illinois at Urbana-Champaign
7. Petascale Simulation of Turbulent Stellar Hydrodynamics
   Paul Woodward, Pen-Chung Yew, University of Minnesota, Twin Cities
8. Petascale Simulations of Complex Biological Behavior in Fluctuating Environments
   Ilias Tagkopoulos, University of California, Davis
9. Computational Relativity and Gravitation at Petascale: Simulating and Visualizing Astrophysically Realistic Compact Binaries
   Manuela Campanelli, Carlos Lousto, Hans-Peter Bischof, Joshua Faber, Yosef Ziochower, Rochester Institute of Technology
10. Enabling Science at the Petascale: From Binary Systems and Stellar Core Collapse to Gamma-Ray Bursts
   Eric Schnetter, Gabrielle Allen, Mayank Tyagi, Peter Diener, Christian Ott, Louisiana State University

12 Blue Waters Project Petascale Allocation Awards
11. Petascale Computations for Complex Turbulent Flows
   Pui-Kuen Yeung, James Riley, Robert Moser, Amitava Majumdar, Georgia Institute of Technology
12. Computational Microscope
   Klaus Schulten, Laxmikant Kale, University of Illinois at Urbana-Champaign
13. Simulation of Contagion on Very Large Social Networks with Blue Waters
   Keith Bisset, Xizhou Feng, Virginia Polytechnic Institute and State University
14. Formation of the First Galaxies: Predictions for the Next Generation of Observatories
   Brian O’Shea, Michigan State University; Michael Norman, University of California at San Diego
15. Super Instruction Architecture for Petascale Computing
   Rodney Bartlett, Erik Duemens, Beverly Sanders, University of Florida; Ponnuswamy Sadayappan, Ohio State University
16. Peta-Cosmology: Galaxy Formation and Virtual Astronomy
   Kentaro Nagamine, University of Nevada at Las Vegas; Jeremiah Ostriker, Princeton University; Renyue Cen, Greg Bryan

