National Strategic Computing Initiative

1 National Strategic Computing Initiative
Irene Qualters, Division Director, ACI
Computer & Information Science & Engineering
October 16, 2015

2 National Strategic Computing Initiative Executive Order, Signed July 29, 2015
National:
"Whole of government" approach
Public/private partnership with industry and academia
Strategic:
Leverage beyond individual programs
Long time horizon (a decade or more)
Computing:
HPC as advanced, capable computing technology
Multiple styles of computing and all necessary infrastructure
Scope includes everything necessary for a fully integrated capability
Initiative:
Above-baseline effort
Link and lift efforts
Enhance U.S. strategic advantage in HPC for economic competitiveness and scientific discovery

3 NSCI Executive Order calls on NSF to play a leadership role
NSF is in a leadership position to advance the science and engineering foundations for HPC over a two-decade time horizon, and to reap the benefits of applying HPC advances to all science and engineering domains.
The NSCI calls on NSF to play a central role in:
Scientific discovery advances
The broader HPC ecosystem for scientific discovery
Workforce development
NSF co-leads the NSCI with DOD and DOE.

4 The Government's Role in NSCI
The NSCI is formulated as a "whole of government" activity:
DOD + DOE: capable exascale program; analytic computing to support missions in science and national security
NSF: scientific discovery; the broader HPC ecosystem; workforce development
IARPA + NIST: future computing technologies
NASA, FBI, NIH, DHS, NOAA: deployment within their mission contexts

5 NSCI Objectives
1. Accelerate delivery of a capable exascale computing system (hardware and software) that delivers approximately 100x the performance of current 10 PF systems across a range of applications reflecting government needs.
2. Increase coherence between the technology base used for modeling and simulation and that used for data analytic computing.
3. Establish, over the next 15 years, a viable path forward for future HPC systems in the post-Moore's-Law era.
4. Increase the capacity and capability of an enduring national HPC ecosystem, employing a holistic approach … networking, workflow, downward scaling, foundational algorithms and software, and workforce development.
5. Develop an enduring public-private partnership to assure that the benefits … are transferred to the U.S. commercial, government, and academic sectors.

6 Next Steps for NSCI
Executive Council established:
Co-chaired by the Directors of OSTP and OMB
Membership representing participating agencies as designated by the Director of OSTP
Implementation Plan:
To be established within 90 days (of July 29, 2015)
And updated annually thereafter for 5 years
NIH/DOE/NSF Request for Information (RFI), "Science Drivers Requiring Capable Exascale High Performance Computing": response deadline extended to November 13
White House National Strategic Computing Initiative Workshop (Oct. 20-21)

7 3. Establish, over the next 15 years, a viable path forward for future HPC systems in the post-Moore's-Law era
Happening now:
Multi-core and many-core processors
Domain-specific integrated circuits
Energy-aware computing
Hierarchical memories
High-speed interconnects
Longer term:
New materials (e.g., carbon nanotubes, graphene-based devices)
Usable parallelism, concurrency, and scalability
Non-charge-transfer devices (e.g., electron spin)
Resiliency at scale
Decreased power consumption
Bio, nano, and quantum devices
Architectures that reduce data movement
NSF role: support foundational research (leadership by ENG, CISE, MPS, and BIO)

8 2. Increase coherence between the technology base used for modeling and simulation and that used for data analytic computing
Modeling and Simulation:
Multi-scale
Multi-physics
Multi-resolution
Multidisciplinary
Coupled models
Data Science:
Data assimilation
Visualization
Image analysis
Data compression
Data analytics
Machine learning
NSF role: support foundational research and research infrastructure within and across all disciplines (across all NSF directorates)

9 Data Science: Emerging … Inherently Multidisciplinary
[Diagram: Drew Conway's data science Venn diagram (2010), overlapping Technical Skills (Software Engineering), Theoretical Foundation (Mathematical and CS Theory), Engineering Practice, and Domain Knowledge]
For better or worse, data is a commodity traded electronically; therefore, to be in this market you need to speak hacker. This does not, however, require a background in computer science; in fact, many of the most impressive hackers I have met never took a single CS course. Being able to manipulate text files at the command line, understanding vectorized operations, thinking algorithmically: these are the hacking skills that make for a successful data hacker.
Once you have acquired and cleaned the data, the next step is to actually extract insight from it. To do this, you need to apply appropriate math and statistics methods, which requires at least a baseline familiarity with these tools. This is not to say that a PhD in statistics is required to be a competent data scientist, but it does require knowing what an ordinary least squares regression is and how to interpret it.
The third critical piece, substance, is where my thoughts on data science diverge from most of what has already been written on the topic. To me, data plus math and statistics only gets you machine learning, which is great if that is what you are interested in, but not if you are doing data science. Science is about discovery and building knowledge, which requires some motivating questions about the world and hypotheses that can be brought to data and tested with statistical methods. On the flip side, substantive expertise plus math and statistics knowledge is where most traditional researchers fall. Doctoral-level researchers spend most of their time acquiring expertise in these areas, but very little time learning about technology. Part of this is the culture of academia, which does not reward researchers for understanding technology.
That said, I have met many young academics and graduate students who are eager to buck that tradition. Finally, a word on the hacking-skills-plus-substantive-expertise danger zone. This is where I place people who "know enough to be dangerous," and it is the most problematic area of the diagram. In this area are people who are perfectly capable of extracting and structuring data, likely related to a field they know quite a bit about, and who probably even know enough R to run a linear regression and report the coefficients, but who lack any understanding of what those coefficients mean. It is from this part of the diagram that the phrase "lies, damned lies, and statistics" emanates, because either through ignorance or malice this overlap of skills gives people the ability to create what appears to be a legitimate analysis without any understanding of how they got there or what they have created. Fortunately, it requires near-willful ignorance to acquire hacking skills and substantive expertise without also learning some math and statistics along the way. As such, the danger zone is sparsely populated; however, it does not take many to produce a lot of damage.

10 Aspirations for Convergence
[Chart: data intensity vs. computational intensity]
State of the art "Big Data" data analytics: high data intensity
State of the art high-performance modeling and simulation: high computational intensity
Goal of the initiative: large-scale data-driven modeling and simulation, high on both axes

11 2. Increase coherence between the technology base used for modeling and simulation and that used for data analytic computing
4. Increase the capacity and capability of an enduring national HPC ecosystem, employing a holistic approach … networking, workflow, downward scaling, foundational algorithms and software, and workforce development
NSF role:
Research priorities: computationally and data-intensive science and engineering frontiers
Discovery-motivated computational investments
Cyberinfrastructure: people, software, technology
Collaborations likely: interagency, public sector, industry, international
Re-use, agility, sustainability
Directorates: all

12 The conduct of science is changing
Revolution in the scientific workflow: many interfaces to shared CI services
Large facilities
Researcher
Software
Data
Shared data/software
Gateway resources
Collaboration networks
National computing resources
Cloud services

13 Scientific Discovery & Innovation Cyberinfrastructure Ecosystem
NSF embraces an expansive view of cyberinfrastructure, motivated by scientific priorities and the changing conduct of science, and informed by technology advances.
Scientific method cycle: observe, hypothesize, experiment, analyze, theorize
Cyberinfrastructure ecosystem components: data; software; people, organizations, and communities; scientific instruments; computational resources; networking and cybersecurity

14 NSF role in NSCI: Enduring Computational Ecosystem for Advancing Science and Engineering
Infrastructure platform pilots, development, and deployment
Computational and data literacy across all STEM disciplines
Computational and data-enabled science and engineering discovery
Fundamental research in HPC platform technologies, architectures, and approaches

15 NSF Implementation
Community input:
NSF advisory committee engagement: NSF-wide Cyberinfrastructure Advisory Committee (ACCI); directorate ACs: MPS, CISE, …
National Academies study completion
NSF-wide activity:
CI Council: ACI, heads of directorates
Cross-directorate working group: MPS & CISE/ACI co-chairs/leads

16 Discussion, Questions?

