U.S. Department of Energy, Office of Science, Advanced Scientific Computing Research Program
CASC, May 3, 2007
Advanced Scientific Computing Research: An Overview


Advanced Scientific Computing Research: An Overview
Michael Strayer, Associate Director, Office of Science

Department of Energy Organizational Structure

Advanced Scientific Computing Research

ASCR Mission: Steward of DOE's computational science, applied mathematics, computer science, and high-performance computing and networking research for open science. Deploy and operate high-performance computing user facilities at LBNL, ANL, and ORNL.

ASCR Vision: Best in class at advancing science and technological innovation through modeling and simulation.

Program structure:
- Computational Science Research and Partnerships
- Facilities: NERSC, Oak Ridge LCF, Argonne LCF, ESnet, HPC & Network Facilities & Testbeds
- Small Business Research

ASCR High-Performance Computing Resources

High Performance Production Computing Facility (NERSC):
- Delivers high-end capacity computing to the entire DOE SC research community
- Large number of projects (200-300)
- Medium- to very-large-scale projects that occasionally need very high capability
- Annual allocations

Leadership Computing Facilities:
- Deliver the highest computational capability to national and international researchers through the peer-reviewed Innovative and Novel Computational Impact on Theory and Experiment (INCITE) program
- Small number of projects (10-20)
- Multiple-year allocations

Current Facilities

NERSC:
- 10 teraflop IBM SP 375 RS/6000 (Seaborg) with 6,080 processors and 7.2 terabytes aggregate memory
- 6.7 teraflop IBM Power 5 (Bassi) with 888 processors and 3.5 terabytes aggregate memory
- 3.1 teraflop LinuxNetworx Opteron cluster (Jacquard) with 712 processors and 2.1 terabytes aggregate memory

LCF at Oak Ridge:
- 119 teraflop Cray XT3/XT4 (Jaguar) with 11,708 dual-core AMD Opteron processor nodes and 46 terabytes aggregate memory
- 18.5 teraflop Cray X1E (Phoenix) with 1,024 multi-streaming vector processors

Argonne LCF:
- 5.7 teraflop IBM Blue Gene/L (BGL) with 2,048 PPC processors
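As a rough illustration (not from the slides), the aggregate peak figures above imply very different per-processor rates across the NERSC systems; a few lines of arithmetic make the comparison explicit. The dictionary keys and the notion of "implied per-processor peak" are ours, derived only from the teraflop and processor counts listed above.

```python
# Sketch: implied per-processor peak (gigaflops) from the aggregate
# figures on this slide. Illustrative arithmetic only.
systems = {
    # name: (aggregate peak in teraflops, processor count)
    "Seaborg (IBM SP, NERSC)": (10.0, 6080),
    "Bassi (IBM Power 5, NERSC)": (6.7, 888),
    "Jacquard (Opteron cluster, NERSC)": (3.1, 712),
}

for name, (peak_tf, procs) in systems.items():
    gf_per_proc = peak_tf * 1000 / procs  # 1 teraflop = 1000 gigaflops
    print(f"{name}: ~{gf_per_proc:.2f} GF per processor")
```

The spread (well under 2 GF/processor for Seaborg versus several GF/processor for the newer machines) reflects how much per-core peak grew between system generations.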

Future Facility Upgrades

ALCF:
- 100 teraflop IBM Blue Gene/P delivered by end of 2007
- Further teraflop upgrade to IBM Blue Gene/P in late 2008

LCF at Oak Ridge:
- Cray XT4 upgraded to 250 TF by end of 2007
- 1 petaflop Cray Baker system to be delivered by end of 2008

NERSC:
- 100+ teraflop Cray XT4 in operation by October 2007

Ensuring Hardware Productivity

- Scientific Discovery through Advanced Computing (SciDAC)
- Innovative and Novel Computational Impact on Theory and Experiment (INCITE)

Scientific Discovery through Advanced Computing (SciDAC)

- Create a comprehensive scientific computing software infrastructure to enable scientific discovery in the physical, biological, and environmental sciences at the petascale
- Develop a new generation of data management and knowledge discovery tools for large data sets (obtained from scientific user facilities and simulations)

SciDAC Accomplishments

SciDAC teams:
- Created the first laboratory-scale flame simulation in three dimensions to better understand combustion, which provides 80% of the energy used in the U.S.
- Simulated techniques for re-fueling fusion reactors
- Developed new methods for simulating improvements in future particle accelerators

SciDAC partnerships improved the effectiveness of scientific application codes by 275% to over 10,000%.
- Example: decreased time to solution for the Agile-Boltzmann baseline run from 4 weeks to 4 days (8 angles, 12 energy groups, spatial resolution of 100)

The SciDAC data mining tool, Sapphire, received a prestigious 2006 R&D 100 award.

[Figure: Hydroxyl radical in a turbulent jet flame]

SciDAC Review and Scientific Discovery document numerous SciDAC accomplishments.
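The slide quotes improvements both as percentages (275% to over 10,000%) and as wall-clock times (4 weeks to 4 days). A quick sketch, using the Agile-Boltzmann example above, shows how the two forms relate; the helper function name is ours, not SciDAC's.

```python
# Sketch: relating wall-clock speedup to "percent improvement".
def percent_improvement(old_time, new_time):
    """Speedup over the old time, expressed as a percentage gain."""
    return (old_time / new_time - 1) * 100

# Agile-Boltzmann: 4 weeks (28 days) down to 4 days is a 7x speedup,
# i.e. a 600% improvement -- within the 275%-10,000% range quoted.
print(f"{percent_improvement(28, 4):.0f}% improvement")
```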

SciDAC-2: Computational Collaborations to Drive Scientific Discovery

Total: $61.5M
- 31 SciDAC projects
- 9 Centers
- 4 Institutes
- 18 efforts in 11 application areas: astrophysics, climate, biology, fusion, petabyte data, materials & chemistry, nuclear physics, high energy physics, QCD, turbulence, groundwater

New performers? About 60% of the funds!

Institutes and Centers: Attributes

Institutes (university-led centers of excellence):
- Focus on major software issues
- Employ a range of collaborative research interactions
- Reach out to engage a broader community of scientists in scientific discovery through advanced computation and collaboration
- Conduct training and outreach in high-performance computing topics

Centers for Enabling Technology (work directly with applications):
- Develop software to enable scientific simulation codes to take full advantage of tera- to petascale systems
- Ensure critical computer science and applied mathematics issues are addressed in a timely and comprehensive fashion
- Address issues associated with the research software lifecycle

SciDAC Outreach Center: "Build Collaborations to Drive Scientific Discovery"

Innovative web and software services:
- Tools that make SciDAC researchers more effective at delivering their technologies (web hosting and authenticated wiki-like portals)
- Services to promote an easy interface between SciDAC and 'the outside computational world' (web, e-mail, and phone central point of contact for SciDAC inquiries)

Workshops and training sessions:
- Getting the right people together to forge collaborations

Conference

Innovative and Novel Computational Impact on Theory and Experiment (INCITE)

- Initiated in 2004
- Provides Office of Science computing resources to a small number of computationally intensive, large-scale research projects that can make high-impact scientific advances through a large allocation of computer time and data storage
- Open to national and international researchers, including industry
- No requirement of DOE Office of Science funding
- Peer-reviewed
- 2004 awards: 4.9 million processor hours at NERSC awarded to three projects
- 2005 awards: 6.5 million processor hours at NERSC awarded to three projects

INCITE Allocations by Discipline

95 million processor hours allocated to 45 projects.

INCITE: New 2008 Call for Proposals

The 2008 call for proposals, for over 0.25 billion processor hours of INCITE allocations, should be announced in mid-May.
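Taken together, the INCITE figures on these slides trace a steep growth curve; the sketch below tabulates them, assuming the 95-million-hour, 45-project figure refers to the 2007 awards (the slides do not label its year explicitly).

```python
# Sketch: growth in INCITE allocations, from the figures on these
# slides (million processor hours, number of projects).
incite = {
    2004: (4.9, 3),
    2005: (6.5, 3),
    2007: (95.0, 45),  # assumed to be the 2007 awards
}

for year, (mhours, projects) in sorted(incite.items()):
    print(f"{year}: {mhours:>5.1f}M hours across {projects:2d} projects "
          f"(~{mhours / projects:.1f}M hours/project)")

# The 2008 call, at over 250M hours, would be at least a further
# 2.6x increase over the 2007 total.
print(f"2008 call vs. 2007: {250 / 95:.1f}x or more")
```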

Road to Exascale: Simulation and Modeling at the Exascale for Energy, Ecological Sustainability and Global Security (E3SGS) Initiative

The planned petascale computer systems, and the progress toward exascale systems, provide an unprecedented opportunity for science: one that will make it possible to use computation not only as a critical tool, along with theory and experiment, for understanding the behavior of the fundamental components of nature, but also for fundamental discovery and exploration of the behavior of complex systems with billions of components, including those involving humans.

Critical Challenges

- Energy: Ensuring global sustainability requires reliable and affordable pathways to low-carbon energy production (e.g., bio-fuels, fusion, and fission) and distribution on a massive scale.
- Ecological Sustainability: The effort toward sustainability involves characterizing the conditions for balance in the climate system. The ability to fit energy production and industrial emissions within balanced global climate and chemical cycles is the major scientific and technical challenge for this century.
- Security: The internet, as well as the instrumentation and control systems for the energy infrastructure, is central to the well-being of our society.

Programmatic Themes

- Engage top scientists and engineers, computer scientists, and applied mathematicians to develop the science of complexity, as well as new science-driven computer architectures and algorithms tied to the needs of scientific computing at all scales. Correspondingly, recruit and develop the next generation of computational and mathematical scientists.
- Invest in pioneering large-scale science, modeling, and simulation that contribute to advancing energy, ecology, and global security.
- Develop the scalable analysis algorithms, data systems, and storage architectures needed to accelerate discovery from large-scale experiments and to enable verification and validation of the results of the pioneering applications. Additionally, develop visualization and data management systems to manage the output of large-scale computational science runs and to integrate data analysis with modeling and simulation in new ways.
- Accelerate the build-out and future development of the DOE open computing facilities to realize the large-scale, systems-level science required to advance the energy, ecology, and global security program.

E3SGS Town Hall Meetings

Three "town hall meetings" on the proposed E3SGS initiative:
- Lawrence Berkeley National Laboratory hosted the first meeting on April 17-18
- Oak Ridge National Laboratory: May 17-18
- Argonne National Laboratory: May 31-June 1