NSF’s CyberInfrastructure Vision for 21st Century Discovery, Innovation, and Learning GridChem Workshop March 9, 2006 Austin, TX Miriam Heller, Ph.D.


Slide 1 – NSF’s CyberInfrastructure Vision for 21st Century Discovery, Innovation, and Learning. GridChem Workshop, March 9, 2006, Austin, TX. Miriam Heller, Ph.D., Program Director, Office of Cyberinfrastructure.

Slide 2 – Outline: CyberInfrastructure (CI) at NSF: Then and Now; Strategic Planning – Setting Directions; OCI Investments: Now and Later; Concluding Remarks.

Slide 3 – Historical NSF contributions (timeline):
- ’85: Supercomputer Centers program (PSC, NCSA, SDSC, JvNC, CTC); NSFNet; discipline-specific CI projects
- ’93: Hayes Report
- ’95: Branscomb Report
- ’97: Partnerships for Advanced Computational Infrastructure – Alliance (NCSA-led) and NPACI (SDSC-led)
- ’99: PITAC Report
- ’00: Terascale Computing Systems; ITR projects
- ’01: ANIR; NSF Middleware Initiative (NMI)
- ’03: ETF Management & Operations; SCI; Atkins Report
- ’05: OCI
- ’08: Core support for NCSA and SDSC

Slide 4 – Cyberinfrastructure vision: the “Atkins report” (blue-ribbon panel chaired by Daniel E. Atkins) called for a national-level, integrated system of hardware, software, and data resources and services – new infrastructure to enable new paradigms of science and engineering research and education with increased efficiency. www.nsf.gov/od/oci/reports/toc.jsp


Slide 6 – (timeline graphic: Fall 2003 – January 2006)

Slide 7 – OCI organization (org chart): Debra Crawford, Office Director (Acting); Dan Atkins, Office Director (from June); José Muñoz, Deputy Office Director. Program Directors: Guy Almes, Fillia Makedon, Doug Gatchell, Kevin Thompson, Miriam Heller, Steve Meacham, Frank Scioli, Vittal Rao; (vacancy – Software). Staff: Judy Hayden, Priscilla Bezdek, Mary Daley, Irene Lombardo, Allison Smith. Portfolio includes the ETF GIG and resource-provider awards (NCSA, SDSC, PSC, ANL, IU, Purdue, ORNL, TACC), HPC acquisition, NMI development and integration, Condor, CyberSecurity, CI-TEAM, EPSCoR, GriPhyN, DisUN, CCG, EIN, IRNC, OptIPuter, SBE CyberTools, MRI, REU Sites, and STI.

Slide 8 – Cyberinfrastructure (CI) governance:
- CyberInfrastructure Council (CIC): NSF ADs and ODs, chaired by Dr. Bement (NSF Director); responsible for shared stewardship and ownership of NSF’s cyberinfrastructure portfolio.
- SCI → OCI realignment: SCI moved from CISE to the Office of the Director as the Office of CyberInfrastructure (OCI); budget, ongoing projects, and personnel transferred.
- OCI focuses on provisioning “production-quality” CI to enable 21st-century research and education breakthroughs; CISE remains focused on its basic CS research and education mission.
- Advisory Committee for NSF’s CI activities and portfolio; Cyberinfrastructure User Advisory Committee (CUAC).

Slide 9 – Burgeoning number of CI systems (e.g., the LHC).

Slide 10 – CyberInfrastructure budgets: HPC hardware acquisitions, O&M, and user support as a fraction of NSF’s overall CI budget. NSF FY06 CI budget: 75% research directorates, 25% OCI. OCI budget: $127M (FY06), of which ETF + core = 56%. FY07: $ (request).

Slide 11 – NSF’s Cyberinfrastructure Vision (FY 2006–2010), completed in Summer 2006. Ch. 1: Call to Action; then visions for: Ch. 2: High Performance Computing; Ch. 3: Data, Data Analysis & Visualization; Ch. 4: Collaboratories, Observatories and Virtual Organizations; Ch. 5: Learning and Workforce Development.

Slide 12 – NSF states its intent to “play a leadership role”: “NSF will play a leadership role in the development and support of a comprehensive cyberinfrastructure essential to 21st century advances in science and engineering research and education. NSF is the only agency within the U.S. government that funds research and education across all disciplines of science and engineering.... Thus, it is strategically placed to leverage, coordinate and transition cyberinfrastructure advances in one field to all fields of research.” – from NSF’s Cyberinfrastructure Vision for 21st Century Discovery

Slide 13 – CI vision: four interrelated perspectives: High Performance Computing; Data, Data Analysis & Visualization; Collaboratories, Observatories & Virtual Organizations; Learning & Workforce Development.

Slide 14 – Enabling and motivating trends: digital convergence; structured, processable data; technology push and application pull. (Atkins, Symposium on KES: Past, Present and Future)

Slide 15 – Some computation: TeraGrid. Provides (1) a unified user environment to support high-capability, production-quality cyberinfrastructure services for science and engineering research, and (2) new S&E opportunities using new ways to distribute resources and services. Integrates grid services, including HPC, data collections, visualization servers, and portals, in a distributed, open architecture. The GIG is responsible for software integration (incl. CTSS), base infrastructure (security, networking, and operations), user support, and community engagement (e.g., Science Gateways). Eight RPs: PSC, TACC, NCSA, SDSC, ORNL, Indiana, Purdue, Chicago/ANL; other institutions participate as sub-awardees of the GIG.

Slide 16 – Content: digital everything; exponential growth; conversion and born-digital. S&E literature is digital; microfilm → digital for preservation; digital libraries are real and getting better. Observational data are distributed (global scale), multi-media, multi-disciplinary, and huge in volume. Need for large-scale, enduring, professionally managed/curated data repositories. Increasing demand for easier finding and reuse: data mining, interdisciplinary data federation. New modes of scholarly communication: what’s publishing? what’s a publication? IP, openness, ownership, privacy, and security issues. (Atkins, Symposium on KES: Past, Present and Future)

Slide 17 – Interactivity:
- Networking, machine to machine: IRNC program; Internet2
- Interfaces, human to machine
- Smart sensors, instruments, and arrays, machine to physical world: CEO:P program
- Organizational: interactive distributed systems; knowledge (work) environments; virtual communities – NSF Workshop on Cyberinfrastructure for the Social Sciences (2005); Next Generation CyberTools
(Atkins, Symposium on KES: Past, Present and Future)

Slide 18 – Comprehensive and synergistic view of IT and the future of the research university. (Atkins, Symposium on KES: Past, Present and Future)

Slide 19 – “Borromean ring*” teams needed for cyberinfrastructure success: disciplinary and multi-disciplinary research communities; people & society (social & behavioral sciences); computer & information science & engineering. Iterative, participatory design; collateral learning. (*Three symmetric, interlocking rings, no two of which are interlinked; removing one destroys the synergy.) (Atkins, Symposium on KES: Past, Present and Future)

Slide 20 – OCI investment highlights:
- Midrange HPC acquisition ($30M)
- Leadership-class high-performance computing system acquisition ($50M)
- Data- and collaboration-intensive software services ($25.7M): conduct applied research and development; perform scalability/reliability tests to explore tool viability; develop, harden, and maintain software tools and services; provide software interoperability
- CI Training, Education, Advancement and Mentoring ($10M)

Slide 21 – Acquisition strategy (chart): science and engineering capability (logarithmic scale) vs. time, FY06–FY10: typical university HPC systems, Track 2 systems, Track 1 system(s).

Slide 22 – HPC acquisition activities: HPC acquisition will be driven by the needs of the S&E community. An RFI was held for interested resource providers and HPC vendors on 9 Sep 2005. The first in a series of HPC S&E requirements workshops was held in Sep 2005; it generated an Application Benchmark Questionnaire and was attended by 77 scientists and engineers.

Slide 23 – Science-driven cyberinfrastructure: trading off interconnect fabric, processing power, memory, and I/O (schematic: processor/memory pairs connected by an interconnect).

Slide 24 – Computing: one size doesn’t fit all (chart courtesy SDSC). Applications span a two-dimensional space of compute capability (increasing FLOPS) vs. data capability (increasing I/O and storage): traditional HEC environments (QCD, protein folding, CPMD); the SDSC data science environment (NVO, EOL, CIPRes, SCEC); campus, departmental, and desktop computing; visualization, data storage/preservation, and extreme I/O (3D + time simulation, out-of-core ENZO visualization, CFD, climate, SCEC and ENZO simulation). Some workloads can’t be run on the Grid because their I/O exceeds WAN capacity; others are distributed-I/O capable.

Slide 25 – Benchmarking: broad inter-agency interest. Benchmarking for performance prediction is valuable when target systems are not readily available because they are inaccessible (e.g., secure), do not exist at sufficient scale, or are in various stages of design. Useful for “what-if” analysis: suppose I double the memory on my Red Storm? Nirvana (e.g., Snavely/SDSC): abstract away the application into a platform-independent application signature; abstract away the hardware into a platform signature; convolve the two signatures to produce an assessment.
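The signature-convolution idea above can be sketched in a few lines. This is a toy model, not the actual SDSC/PMaC framework: the operation categories, counts, and machine rates below are invented for illustration, and a real convolver accounts for overlap between computation, memory traffic, and communication.

```python
# Toy sketch of performance prediction by "convolving" an application
# signature with a platform signature. All numbers are illustrative.

def predict_runtime(app_signature, platform_signature):
    """Estimate runtime by dividing each operation count in the
    application signature by the platform's rate for that operation.
    (Naive: assumes no overlap between operation categories.)"""
    total_seconds = 0.0
    for op, count in app_signature.items():
        rate = platform_signature[op]  # operations (or bytes) per second
        total_seconds += count / rate
    return total_seconds

# Hypothetical application signature: counts gathered by tracing a run.
app = {
    "flops":     2.0e12,   # floating-point operations
    "mem_bytes": 8.0e11,   # bytes moved through the memory system
    "net_bytes": 1.0e10,   # bytes sent over the interconnect
}

# Hypothetical platform signatures: measured machine capabilities.
machine_a = {"flops": 1.0e11, "mem_bytes": 2.0e10, "net_bytes": 1.0e9}
# "What-if": double the compute and memory rates, keep the same network.
machine_b = {"flops": 2.0e11, "mem_bytes": 4.0e10, "net_bytes": 1.0e9}

print(predict_runtime(app, machine_a))  # 70.0 seconds on machine A
print(predict_runtime(app, machine_b))  # 40.0 seconds on the upgraded machine
```

Because the application signature is platform-independent, the same `app` dictionary can be convolved against any candidate platform signature, which is what makes the approach useful for machines that do not yet exist at scale.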

Slide 26 – HPC benchmarking: the HPC Challenge benchmarks:
1. HPL – the Linpack TPP benchmark, which measures the floating-point rate of execution for solving a linear system of equations.
2. DGEMM – measures the floating-point rate of execution of double-precision real matrix-matrix multiplication.
3. STREAM – a simple synthetic benchmark that measures sustainable memory bandwidth (in GB/s) and the corresponding computation rate for simple vector kernels.
4. PTRANS (parallel matrix transpose) – exercises communications where pairs of processors communicate with each other simultaneously; a useful test of the total communications capacity of the network.
5. RandomAccess – measures the rate of integer random updates of memory (GUPS).
6. FFTE – measures the floating-point rate of execution of double-precision complex one-dimensional Discrete Fourier Transforms (DFT).
7. Communication bandwidth and latency – a set of tests measuring the latency and bandwidth of a number of simultaneous communication patterns, based on b_eff (the effective bandwidth benchmark).
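To make the STREAM entry concrete, here is the shape of its “triad” kernel, a[i] = b[i] + scalar * c[i]. The real benchmark is written in C/Fortran and reports sustained memory bandwidth; this pure-Python sketch is dominated by interpreter overhead, so the GB/s figure it prints only demonstrates the counting method (three 8-byte accesses per element), not a machine’s actual bandwidth.

```python
# Illustrative sketch of the STREAM "triad" kernel and its bandwidth
# accounting. Not the real STREAM benchmark; numbers are nominal.
import time

N = 1_000_000          # vector length (illustrative)
scalar = 3.0
b = [1.0] * N
c = [2.0] * N

start = time.perf_counter()
a = [b[i] + scalar * c[i] for i in range(N)]   # triad: a = b + scalar*c
elapsed = time.perf_counter() - start

# Each iteration reads b[i] and c[i] and writes a[i]:
# three 8-byte double-precision accesses per element.
bytes_moved = 3 * 8 * N
bandwidth_gbs = bytes_moved / elapsed / 1e9
print(f"triad: {bandwidth_gbs:.3f} GB/s (nominal)")
```

The same accounting (bytes moved divided by elapsed time) underlies the other STREAM kernels (copy, scale, add), which differ only in how many arrays each touches.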

Slide 27 – HPC acquisition, Track 1: increased funding will support the first phase of a petascale system acquisition; over four years NSF anticipates investing $200M. The acquisition is critical to NSF’s multi-year plan to deploy and support a world-class HPC environment. NSF is collaborating with sister agencies with a stake in HPC: DARPA, HPCMOD, DOE/OS, DOE/NNSA, NIH.

Slide 28 – NSF Middleware Initiative: a program to design, develop, test, deploy, and sustain a set of reusable and expandable middleware functions that benefit many science and engineering applications in a networked environment, and to define open-source, open-architecture standards for online (international) collaboration and resource sharing that are sustainable, scalable, and securable. Examples include: community-wide access to experimental data on the Grid; authorized resource access across multiple campuses using common tools; web-based portals that provide a common interface to wide-ranging Grid-enabled computation resources; Grid access to instrumentation such as accelerators and telescopes.

Slide 29 – NMI-funded activities in S&E research: more than 40 development awards plus integration awards funded to date. Integration award highlights include the NMI GRIDS Center (e.g., Build and Test), Campus Middleware Services (e.g., Shibboleth), and the NanoHub.
- Condor – a mature distributed computing system installed on thousands of CPU “pools” and tens of thousands of CPUs.
- GridChem – an open-source Java application that launches and monitors computational chemistry calculations (Gaussian03, GAMESS, NWChem, Molpro, Q-Chem, ACES) remotely on CCG supercomputers.
- NanoHub – extends NSF Network for Computational Nanotechnology applications (e.g., NEMO3D, nanoMOS) to a distributed environment over TeraGrid, U. Wisconsin, and other grid assets using In-VIGO, Condor-G, etc.

Slide 30 – Other middleware funding: OCI made a major middleware award in November 2005 to Foster/Kesselman: “Community Driven Improvement of Globus Software,” $13.3M over 5 years. Ongoing funding to Virtual Data Toolkit (VDT) middleware via OCI and MPS OSG activities, including: DiSUN, a 5-year, $12M award for computational, storage, and middleware resources at four Tier-2 sites; and GriPhyN and iVDGL, which target VDT and VDS but are ending soon. Ongoing funding to VDT middleware via TeraGrid as part of the CTSS.

Slide 31 – CI-TEAM Demonstration Projects (Learning and Our 21st Century CI Workforce).
Input: 70 projects / 101 proposals / 17 (24%) collaborative projects.
Outcomes: invested $2.67M in awards for projects of up to $250K total over 1–2 years; 15.7% success rate: 11 demonstration projects (14 proposals) across the BIO, CISE, EHR, ENG, GEO, and MPS disciplines.
Broadening participation for CI workforce development:
- Alvarez (FIU) – CyberBridges
- Crasta (VaTech) – Project-Centric Bioinformatics
- Fortson (Adler) – CI-Enabled 21st-c. Astronomy Training for HS Science Teachers
- Fox (IU) – Bringing Minority Serving Institution Faculty into CI & e-Science Communities
- Gordon (OSU) – Leveraging CI to Scale-Up a Computational Science U/G Curriculum
- Panoff (Shodor) – Pathways to CyberInfrastructure: CI through Computational Science
- Takai (SUNY Stony Brook) – High School Distributed Search for Cosmic Rays (MARIACHI)
Developing and implementing CI resources for CI workforce development:
- DiGiano (SRI) – Cybercollaboration Between Scientists and Software Developers
- Figueiredo (UFl) – In-VIGO/Condor-G Middleware for Coastal & Estuarine Science CI Training
- Regli (Drexel) – CI for Creation and Use of Multi-Disciplinary Engineering Models
- Simpson (PSU) – CI-Based Engineering Repositories for Undergraduates (CIBER-U)

Slide 32 – Cyberinfrastructure Training, Education, Advancement, and Mentoring for Our 21st Century Workforce (CI-TEAM): aims to prepare a science and engineering workforce with the knowledge and skills needed to create, advance, and use cyberinfrastructure for discovery, learning, and innovation across and within all areas of science and engineering. Exploits the power of cyberinfrastructure to cross digital, disciplinary, institutional, and geographic divides and fosters the inclusion of diverse groups of people and organizations, with particular emphasis on traditionally underrepresented groups. Focus on workforce development activities; <50% tool development. The FY06 program funds ~$10M for two types of awards:
- Demonstration Projects (like the FY05 projects: exploratory in nature, may be limited in scope and scale, with the potential to expand into future at-scale implementation activities; ≤ $250,000)
- Implementation Projects (larger in scope or scale, drawing on prior experience with the proposed activities or teams, expected to deliver sustainable learning and workforce development activities that complement ongoing NSF investment in cyberinfrastructure; ≤ $1,000,000)
New CI-TEAM solicitation due June 5, 2006.

Slide 33 – Concluding thoughts: NSF has taken a leadership role in CI and is working to define the vision and future directions. Successful past investments position CI for the revolution. Achieving the goal of provisioning CI for 21st-century breakthrough science and engineering research and education depends on successful investment in the development and deployment of useful, appropriate, usable, used, and sustainable CI resources, tools, and services, complemented by investment in a cyber-savvy workforce to design, deploy, use, and support them. PIs are needed to: advise NSF on CI needs; track growing CI use; demonstrate breakthrough research and education.

Slide 34 – Thank you! Miriam Heller, Ph.D., Program Director, Office of Cyberinfrastructure, National Science Foundation. Tel:

Slide 35 – IRNC (International Research Network Connections) awards:
- TransPAC2 (U.S. – Japan and beyond)
- GLORIAD (U.S. – China – Russia – Korea)
- TransLight/PacificWave (U.S. – Australia)
- TransLight/StarLight (U.S. – Europe)
- WHREN (U.S. – Latin America)