
National Center for Supercomputing Applications, University of Illinois at Urbana-Champaign
Blue Waters: An Extraordinary Computing Resource for Advancing Science and Engineering




1 National Center for Supercomputing Applications, University of Illinois at Urbana-Champaign
Blue Waters: An Extraordinary Computing Resource for Advancing Science and Engineering
Thom Dunning, Rob Pennington,* Bill Kramer, Marc Snir, Bill Gropp, Wen-mei Hwu and Ed Seidel*
(* currently at NSF)

2 Background: March to Petascale Computing
Performance of the #1 system on the Top 500 list:
- 1 GF (gigaflops): late 1980s
- 1 TF (teraflops): late 1990s
- 1 PF (petaflops): 2008
CI Days, 22 February 2010, University of Kentucky
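The milestones above imply a remarkably steady doubling cadence. A quick back-of-the-envelope check (the ~20-year span is an assumption, since the slide dates the gigaflops milestone only to "the late 1980s"):

```python
import math

# Top 500 #1 milestones from the slide: ~1 GF in the late 1980s, 1 PF in 2008.
growth = 1e15 / 1e9              # petaflops over gigaflops: a 1,000,000x increase
doublings = math.log2(growth)    # ~19.9 doublings
span_years = 20                  # assumed span from "late 1980s" to 2008
print(round(span_years / doublings, 2))  # -> 1.0 (years per doubling)
```

In other words, leading-edge supercomputer performance doubled roughly every year over that period, faster than the classic Moore's-law cadence for single chips, because core counts grew alongside clock speeds.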

3 Background: NSF's Strategy for High-End Computing
Three resource levels:
- Track 3: university owned and operated; 10s–100s TF
- Track 2: several NSF-funded supercomputer and specialized computing centers (TeraGrid); 500–1,000 TF, ~100+ TB of memory
- Track 1: NSF-funded leading-edge computer center; see following slides

4 Blue Waters Project: Fielding a Sustained Petascale Computing System

5 Blue Waters Project: Technology Selection, Who We Consulted
- D. Baker, University of Washington: protein structure refinement and determination
- M. Campanelli, RIT: computational relativity and gravitation
- D. Ceperley, UIUC: quantum Monte Carlo molecular dynamics
- J. P. Draayer, LSU: ab initio nuclear structure calculations
- P. Fussell, Boeing: aircraft design optimization
- C. C. Goodrich: space weather modeling
- M. Gordon and T. Windus, Iowa State University: electronic structure of molecules
- S. Gottlieb, Indiana University: lattice quantum chromodynamics
- V. Govindaraju: image processing and feature extraction
- M. L. Klein, University of Pennsylvania: biophysical and materials simulations
- J. B. Klemp et al., NCAR: weather forecasting/hurricane modeling
- R. Luettich, University of North Carolina: coastal circulation and storm surge modeling
- W. K. Liu, Northwestern University: multiscale materials simulations
- M. Maxey, Brown University: multiphase turbulent flow in channels
- S. McKee, University of Michigan: analysis of ATLAS data
- M. L. Norman, UCSD: simulations in astrophysics and cosmology
- J. P. Ostriker, Princeton University: virtual universe
- J. P. Schaefer, LSST Corporation: analysis of LSST datasets
- P. Spentzouris, Fermilab: design of new accelerators
- W. M. Tang, Princeton University: simulation of fine-scale plasma turbulence
- A. W. Thomas and D. Richards, Jefferson Lab: lattice QCD for hadronic and nuclear physics
- J. Tromp, Caltech/Princeton: global and regional seismic wave propagation
- P. R. Woodward, University of Minnesota: astrophysical fluid dynamics

6 Blue Waters Project: Attributes of a Sustained Petascale System
- Maximum core performance: to minimize the number of cores needed for a given performance level and lessen the impact of sections of code with limited scalability
- Low-latency, high-bandwidth interconnect: to enable science and engineering applications to scale to tens to hundreds of thousands of cores
- Large, fast memories: to solve the most memory-intensive problems
- Large, fast I/O system and data archive: to solve the most data-intensive problems
- Reliable operation: to enable the solution of Grand Challenge problems
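The first attribute is Amdahl's law in disguise: the serial fraction of a code, not the core count, bounds achievable speedup, which is why fewer, faster cores matter at this scale. A minimal sketch (the 0.1% serial fraction is an illustrative assumption, not a measured figure for any application):

```python
def amdahl_speedup(serial_fraction: float, cores: int) -> float:
    """Ideal speedup when `serial_fraction` of the work cannot be parallelized."""
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / cores)

# Even 0.1% serial work caps speedup near 1,000 on >300,000 cores.
print(round(amdahl_speedup(0.001, 300_000)))  # -> 997
```

Raising per-core performance shrinks the absolute cost of those serial sections, recovering performance that no amount of additional parallelism can.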

7 Blue Waters Project: Track 1 System, Blue Waters

  System attribute             Track 2 (Sun, TACC)   Track 1 (IBM, NCSA)   Ratio
  Vendor                       Sun                   IBM
  Processor                    AMD Barcelona         IBM Power7
  Peak performance (PF)        0.579
  Sustained performance (PF)   ~0.05                 ~1                    >20
  Cores per chip               4                     8                     2
  Number of processor cores    62,976                >300,000              >3
  Memory (TB)                  123                   >1,000                >6
  Disk storage (PB)            1.73                  >10                   >5
  Archival storage (PB)        2.5 (20)              >500                  >200
  External bandwidth (Gbps)                                                >10

8 Blue Waters Project: Building Blue Waters

Power7 chip:
- 8 cores, 32 threads
- L1, L2, L3 cache (32 MB)
- up to 256 GF (peak)
- 45 nm technology

Multi-chip module (MCM):
- 4 Power7 chips
- 128 GB memory
- 512 GB/s memory bandwidth
- 1 TF (peak)

Router:
- 1,128 GB/s bandwidth

IH server node:
- 8 MCMs (256 cores)
- 1 TB memory
- 8 TF (peak)
- fully water cooled

Blue Waters building block:
- 32 IH server nodes
- 32 TB memory
- 256 TF (peak)
- 4 storage systems
- 10 tape-drive connections

Blue Waters system:
- ~1 PF sustained
- >300,000 cores
- >1 PB of memory
- >10 PB of disk storage
- ~500 PB of archival storage
- >100 Gbps connectivity

Blue Waters is built from components that can also be used to build systems with a wide range of capabilities, from deskside machines to systems beyond Blue Waters. Blue Waters is expected to be the most powerful computer in the world for scientific research when it comes on line in the summer.
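The per-component figures on this slide compose multiplicatively up the hierarchy; a quick consistency check using only the slide's own numbers:

```python
# Figures taken directly from the slide above.
CHIP_PEAK_GF, CORES_PER_CHIP = 256, 8
CHIPS_PER_MCM, MCMS_PER_NODE, NODES_PER_BLOCK = 4, 8, 32

mcm_peak_tf = CHIPS_PER_MCM * CHIP_PEAK_GF / 1000        # 1.024, the "1 TF (peak)" MCM
node_peak_tf = MCMS_PER_NODE * mcm_peak_tf               # ~8 TF per IH server node
block_peak_tf = NODES_PER_BLOCK * node_peak_tf           # ~256 TF per building block
cores_per_node = CORES_PER_CHIP * CHIPS_PER_MCM * MCMS_PER_NODE  # 256 cores per node

print(round(node_peak_tf, 1), round(block_peak_tf), cores_per_node)  # -> 8.2 262 256
```

Each level lands on the rounded figure the slide quotes (1 TF, 8 TF, 256 TF, 256 cores), confirming the hierarchy is internally consistent.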

9 Blue Waters Project: Selected Unique Features of Blue Waters
Shared/distributed-memory computing system:
- powerful shared-memory multi-chip module (QCM)
- new high-performance fabric interconnecting all nodes
- hardware support for global shared memory
I/O and data archive systems:
- high-performance I/O subsystems
- on-line disks fully integrated with the archival storage system
Natural growth path:
- full range of systems, from servers to supercomputers
- facilitates software development
- addresses science and engineering problems at all scales

10 Blue Waters Project: National Petascale Computing Facility
Modern data center:
- 90,000+ ft² total
- 30,000 ft² raised floor
- 20,000 ft² machine room
Energy efficiency:
- LEED Gold certification (goal: Platinum)
- PUE < 1.2 (< 1.1 for much of the year)
Partners: EYP MCF/Gensler, IBM, Yahoo!
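PUE (power usage effectiveness) is total facility power divided by the power delivered to IT equipment, so a PUE below 1.2 means less than 20% overhead for cooling and power distribution. A small illustrative calculation (the wattage figures are made-up examples, not facility data):

```python
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power usage effectiveness: total facility power over IT equipment power."""
    return total_facility_kw / it_equipment_kw

# Hypothetical load: 10 MW of IT equipment drawing 11.5 MW at the utility meter.
print(round(pue(11_500, 10_000), 2))  # -> 1.15, under the facility's 1.2 target
```

For comparison, typical data centers of that era ran well above this, so a sustained PUE near 1.1 represents an aggressive efficiency target.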

11 Blue Waters Project: Great Lakes Consortium for Petascale Computation
Goal: facilitate the widespread and effective use of petascale computing to address frontier research questions in science, technology and engineering at research, educational and industrial organizations across the region and nation.
Charter members:
- Argonne National Laboratory
- Fermi National Accelerator Laboratory
- Illinois Mathematics and Science Academy
- Illinois Wesleyan University
- Indiana University*
- Iowa State University
- Krell Institute, Inc.
- Louisiana State University
- Michigan State University*
- Northwestern University*
- Parkland Community College
- Pennsylvania State University*
- Purdue University*
- Shiloh Community Unit School District #1
- Shodor Education Foundation, Inc.
- SURA (60+ universities)
- The Ohio State University*
- University of Chicago*
- University of Illinois at Chicago*
- University of Illinois at Urbana-Champaign*
- University of Iowa*
- University of Michigan*
- University of Minnesota*
- University of North Carolina–Chapel Hill
- University of Wisconsin–Madison*
- Wayne City High School
* CIC universities

12 Science & Engineering Research on Blue Waters

13 Blue Waters Project: Computational Science and Engineering
Petascale computing will enable advances in a broad range of science and engineering disciplines:
- molecular science
- weather and climate forecasting
- earth science
- astronomy
- health

14 Petascale Computing Resource Allocations
Solicitation: NSF PRAC
Selection criteria:
- a compelling science or engineering research question
- a question that can only be answered using a system of the scale of Blue Waters (cycles, memory, I/O bandwidth, etc.)
- evidence, or a convincing argument, that the application code can make effective use of Blue Waters
- a source (or sources) of funds to support the research work and any needed code development effort
Funding:
- allocation or provisional allocation of time on Blue Waters
- travel funds to enable teams to work closely with the Blue Waters project team
Next due date: March 17, 2010 (annually thereafter)

15 Blue Waters Engagement with Research Teams
Provide details on the Blue Waters system.
Provide assistance with Blue Waters software:
- numerical libraries
- MPI, OpenMP, ARMCI/Global Arrays, LAPI, OpenSHMEM, etc.
- compilers (Fortran, C, UPC, Co-array Fortran)
Provide assistance with Blue Waters hardware:
- chip and network simulators
- staged access to Power7 hardware
Provide training:
- on-line documentation
- webinars/tutorials/on-line courses (~8 per year)
- workshops (~2 per year)

16 For More Information
- Blue Waters website
- IBM Power Systems website
- PRAC solicitation

17 Questions?

