2 CyberInfrastructure and the Office of CyberInfrastructure (OCI). Presented to the SURA Information Technology Committee Meeting by José Muñoz, Ph.D., Deputy Office Director and Senior Science Advisor, Office of CyberInfrastructure.

3 Outline
- NSF organizational changes
- CyberInfrastructure (CI) at NSF
- Strategic planning
- New HPC acquisition
- Related OCI activities
- Summary

4 CyberInfrastructure (CI) Governance
- Dr. Dan Atkins (University of Michigan) selected as new OCI Director
  - Served as chair of NSF's Blue-Ribbon Advisory Panel on Cyberinfrastructure, which produced Revolutionizing Science and Engineering Through Cyberinfrastructure (2003)
  - Tenure begins June 2006
- CyberInfrastructure Council (CIC) created
  - NSF Assistant Directors and Office Directors, chaired by Dr. Bement (NSF Director)
  - CIC responsible for shared stewardship and ownership of NSF's cyberinfrastructure portfolio
- Advisory Committee for NSF's CI activities and portfolio created
  - Candidate members have been identified
  - First meeting June 2006?

5 CI Governance (continued)
- NSF High-End Computing Coordinating Group
  - Representatives from each NSF directorate
  - Chaired by OCI
  - Meets every other week
- SCI-to-OCI realignment
  - SCI was moved from CISE to the Office of the Director and became the Office of CyberInfrastructure (OCI)
  - Budget, ongoing projects, and personnel transferred
  - OCI is focused on provision of "production-quality" CI for research and education
  - CISE remains focused on its basic CS research and education mission (CI software and hardware), collaborating with other NSF directorates
- CI strategic planning process underway
  - NSF CyberInfrastructure Vision document

6 [Organization chart: Office of CyberInfrastructure. Debra Crawford, Office Director (Interim); José Muñoz, Deputy Office Director. Program Directors: Guy Almes, Fillia Makedon, Doug Gatchell, Kevin Thompson, Miriam Heller, Steve Meacham, Frank Scioli, Vittal Rao; one vacancy (Software). Staff: Judy Hayden, Priscilla Bezdek, Mary Daley, Irene Lombardo, Allison Smith. Program portfolio includes: Resource Provider awards (ANL, IU, Purdue, ORNL, TACC, SDSC, NCSA, PSC), SDSC and NCSA core, ETF GIG, HPC acquisition, MRI, REU Sites, STI, NMI development and integration, Condor, CyberSecurity, CI-TEAM, EPSCoR, GriPhyN, DISUN, CCG, EIN, IRNC, OptIPuter, and SBE CyberTools/POC.]

7 CI Vision Document: 4 Interrelated Plans
- High Performance Computing
- Data, Data Analysis & Visualization
- Collaboratories, Observatories & Virtual Organizations
- Learning & Workforce Development

8 Strategic Plan (FY 2006 – 2010), completed in summer 2006. Strategic plans for:
- Ch. 1: Call to Action
- Ch. 2: High Performance Computing
- Ch. 3: Data, Data Analysis & Visualization
- Ch. 4: Collaboratories, Observatories and Virtual Organizations
- Ch. 5: Learning and Workforce Development

9 CyberInfrastructure Budgets. [Charts: HPC hardware acquisitions, O&M, and user support as a fraction of NSF's overall CI budget. Of the NSF 2006 CI budget, roughly 75% resides in the research directorates and 25% in OCI. OCI budget: $127M (FY06); $182.42M requested for FY07, spanning HPC acquisition, CORE, ETF, CI-TEAM, testbeds, NMI, and IRNC.]

10 Recent CI Activities
- HPC solicitation released September 27, 2005
  - Performance benchmarks identified November 9, 2005
  - Proposals were due February 10, 2006
- Continuing work on the CI Vision document
  - Four NSF-wide SWOT teams developing strategic and implementation plans covering various aspects of CI
  - http://www.nsf.gov/dir/index.jsp?org=OCI
- Ongoing interagency discussions
  - Committee on Science
  - Office of Science and Technology Policy
  - Agency-to-agency (DARPA, HPCMOD, DOE/OS, NNSA, NIH)

11 Acquisition Strategy. [Chart: science and engineering capability (logarithmic scale) versus fiscal year, FY06 through FY10, showing typical university HPC systems at the base, Track 2 systems above them, and Track 1 system(s) at the top.]

12 HPC Acquisition Activities
- HPC acquisition will be driven by the needs of the S&E community
- RFI held for interested Resource Providers and HPC vendors on September 9, 2005
- First in a series of HPC S&E requirements workshops held September 20-21, 2005
  - Generated the Application Benchmark Questionnaire
  - Attended by 77 scientists and engineers

13 TeraGrid: What is It?
TeraGrid:
(1) Provides a unified user environment to support high-capability, production-quality cyberinfrastructure services for science and engineering research.
(2) Provides new S&E opportunities by enabling new ways of using distributed resources and services.
Examples of services include: HPC, data collections, visualization servers, portals, and the integration of services provided by grid technologies.
Distributed, open architecture. The GIG is responsible for integration: software integration (including the common software stack, CTSS), base infrastructure (security, networking, and operations), user support, and community engagement (including the Science Gateways activities).
Eight Resource Providers (with separate awards): PSC, TACC, NCSA, SDSC, ORNL, Indiana, Purdue, Chicago/ANL. Several other institutions participate in TeraGrid as sub-awardees of the GIG. New sites may join as Resource Partners.

14 NSF Middleware Initiative (NMI)
- Program solicitations between 2001 and 2004 funded over 40 development awards and a series of integration awards
  - Integration award highlights include the NMI GRIDS Center (e.g., Build and Test), Campus Middleware Services (e.g., Shibboleth), and nanoHUB
- OCI made a major middleware award in November 2005 to Foster/Kesselman: "Community Driven Improvement of Globus Software", $13.3M over 5 years
  - Develop, deploy, and sustain a set of reusable and expandable middleware functions that benefit many science and engineering applications in a networked environment:
    - open standards
    - international collaboration
    - sustainable
    - scalable and securable

15 2005 IRNC (International Research Network Connections) Awards
- TransPAC2 (U.S. – Japan and beyond)
- GLORIAD (U.S. – China – Russia – Korea)
- TransLight/PacificWave (U.S. – Australia)
- TransLight/StarLight (U.S. – Europe)
- WHREN (U.S. – Latin America)

16 Learning and Our 21st Century CI Workforce: CI-TEAM Demonstration Projects
- Input: 70 projects / 101 proposals; 17 projects (24%) were collaborative
- Outcomes: 15.7% success rate; 11 projects (14 proposals) awarded up to $250,000 over 1-2 years, related to BIO, CISE, EHR, ENG, GEO, MPS
- Broadening participation for CI workforce development:
  - Alvarez (FIU) - CyberBridges
  - Crasta (Virginia Tech) - Project-Centric Bioinformatics
  - Fortson (Adler) - CI-Enabled 21st-Century Astronomy Training for High School Science Teachers
  - Fox (IU) - Bringing Minority Serving Institution Faculty into CI & e-Science Communities
  - Gordon (OSU) - Leveraging CI to Scale Up a Computational Science Undergraduate Curriculum
  - Panoff (Shodor) - Pathways to CyberInfrastructure: CI through Computational Science
  - Takai (SUNY Stony Brook) - High School Distributed Search for Cosmic Rays (MARIACHI)
- Developing and implementing CI resources for CI workforce development:
  - DiGiano (SRI) - Cybercollaboration Between Scientists and Software Developers
  - Figueiredo (UF) - In-VIGO/Condor-G Middleware for Coastal & Estuarine Science CI Training
  - Regli (Drexel) - CI for Creation and Use of Multi-Disciplinary Engineering Models
  - Simpson (PSU) - CI-Based Engineering Repositories for Undergraduates (CIBER-U)
- New CI-TEAM solicitation coming March 2006

17 How it all fits together… [Diagram relating the four CI plan areas (HPC; Data; LWD: Learning & Workforce Development; COVO: Collaboratories, Observatories & Virtual Organizations) to OCI programs: ETF, CORE, NMI, CI-TEAM, and IRNC.]

18 NSF HECURA 2004
- FY 2004 NSF/DARPA/DOE activity focused on research in:
  - Languages
  - Compilers
  - Libraries
- 100 proposals submitted in July 2005
  - 82 projects submitted by 57 US academic institutions and non-profit organizations
  - Includes no-cost national lab and industrial lab collaborators
- Nine projects were awarded:
  - Tools and libraries for high-end computing
  - Resource management
  - Reliability of high-end systems

19 NSF HECURA 2005/2006 Focus
- I/O, file, and storage systems design for efficient, high-throughput data storage, retrieval, and management in the HEC environment
- Hardware and software tools for design, simulation, benchmarking, performance measurement, and tuning of file and storage systems

20 HECURA 2005/2006 Scope (CISE)
- File systems research
- Future file-system-related protocols
- I/O middleware
- Quality of service
- Security
- Management, reliability, and availability at scale
- Archives/backups as extensions to file systems
- Novel storage devices for the I/O stack
- I/O architectures
- Hardware and software tools for design and simulation of I/O, file, and storage systems
- Efficient benchmarking, tracing, performance measurement, and tuning tools for I/O, file, and storage systems

21 Benchmarking
- Broad inter-agency interest
- Use of benchmarking for performance prediction
  - Valuable when target systems are not readily available because they are inaccessible (e.g., secure), do not exist at sufficient scale, or are in various stages of design
- Useful for "what-if" analysis
  - Suppose I double the memory on my Red Storm?
- Nirvana (e.g., Snavely/SDSC), sketched in code below:
  - Abstract away the application: application signatures (platform independent)
  - Abstract away the hardware: platform signatures
  - Convolve the signatures to provide an assessment
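
A minimal sketch in C of the convolution idea, under simplifying assumptions: treat the application signature as platform-independent counts of abstract operations, the platform signature as per-operation costs, and combine them into a runtime estimate. The operation categories and numbers below are hypothetical; real frameworks (e.g., the PMaC work at SDSC) model memory hierarchies and overlap in far more detail.

```c
#include <stdio.h>

/* Hypothetical operation categories for the sketch. */
enum { OP_FLOP, OP_MEM_LOAD, OP_NET_MSG, NUM_OPS };

/* "Convolve" the two signatures: predicted time is the sum over
 * operation types of (application count * platform cost). */
static double predict_runtime(const double counts[NUM_OPS],
                              const double secs_per_op[NUM_OPS]) {
    double t = 0.0;
    for (int i = 0; i < NUM_OPS; i++)
        t += counts[i] * secs_per_op[i];
    return t;
}

int main(void) {
    /* Application signature: platform-independent operation counts. */
    double app[NUM_OPS] = { 1e12, 4e11, 1e6 };
    /* Platform signatures: per-operation costs for two machines
     * (illustrative numbers only). */
    double machine_a[NUM_OPS] = { 2.5e-10, 1.0e-9, 5.0e-6 };
    double machine_b[NUM_OPS] = { 1.0e-10, 8.0e-10, 2.0e-6 };

    printf("predicted on A: %.1f s\n", predict_runtime(app, machine_a));
    printf("predicted on B: %.1f s\n", predict_runtime(app, machine_b));
    return 0;
}
```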

22 HPC Benchmarking
HPC Challenge Benchmarks (http://icl.cs.utk.edu/hpcc/):
1. HPL - the Linpack TPP benchmark, which measures the floating-point rate of execution for solving a linear system of equations.
2. DGEMM - measures the floating-point rate of execution of double-precision real matrix-matrix multiplication.
3. STREAM - a simple synthetic benchmark program that measures sustainable memory bandwidth (in GB/s) and the corresponding computation rate for simple vector kernels (a minimal sketch follows after this list).
4. PTRANS (parallel matrix transpose) - exercises communications where pairs of processors communicate with each other simultaneously. It is a useful test of the total communications capacity of the network.
5. RandomAccess - measures the rate of integer random updates of memory (GUPS).
6. FFTE - measures the floating-point rate of execution of double-precision complex one-dimensional Discrete Fourier Transform (DFT).
7. Communication bandwidth and latency - a set of tests to measure latency and bandwidth of a number of simultaneous communication patterns; based on b_eff (the effective bandwidth benchmark).
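
To illustrate what one of these kernels measures, here is a minimal, single-threaded sketch of the STREAM "triad" in C. It is not the official benchmark, which adds repeated trials, result validation, and strict timing rules; the array size here is illustrative.

```c
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

#define N (1 << 24)   /* 16M doubles per array, ~128 MB each */

int main(void) {
    double *a = malloc(N * sizeof *a);
    double *b = malloc(N * sizeof *b);
    double *c = malloc(N * sizeof *c);
    if (!a || !b || !c) return 1;
    for (long i = 0; i < N; i++) { b[i] = 1.0; c[i] = 2.0; }

    struct timespec t0, t1;
    clock_gettime(CLOCK_MONOTONIC, &t0);
    const double q = 3.0;
    for (long i = 0; i < N; i++)   /* the triad kernel itself */
        a[i] = b[i] + q * c[i];
    clock_gettime(CLOCK_MONOTONIC, &t1);

    double secs = (t1.tv_sec - t0.tv_sec) + (t1.tv_nsec - t0.tv_nsec) / 1e9;
    /* Triad touches 3 arrays: 2 reads + 1 write = 24 bytes/element. */
    printf("triad bandwidth: %.2f GB/s\n", 24.0 * N / secs / 1e9);
    free(a); free(b); free(c);
    return 0;
}
```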

23 DARPA HPCS
- Partitioned Global Address Space (PGAS) programming paradigm (see the sketch below)
  - Intended to support scaling to 1000s of processors
  - Co-Array Fortran
  - Unified Parallel C
  - Cray's Chapel
  - IBM's X10
  - Sun's Fortress
- DARPA HPCS productivity activities
  - HPC-specific programming environments?
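
To make the PGAS model concrete, here is a minimal sketch in Unified Parallel C (UPC, a PGAS extension of C listed above): all threads share one partitioned address space, shared arrays are distributed across threads (cyclically by default), and upc_forall runs each iteration on the thread that owns the element named in the affinity expression. Illustrative only; it requires a UPC compiler, and details such as distribution defaults vary with declarations.

```c
#include <upc.h>
#include <stdio.h>

#define N 100
/* Shared arrays live in the global address space, partitioned
 * across all threads (default cyclic distribution). */
shared double a[N * THREADS], b[N * THREADS], c[N * THREADS];

int main(void) {
    /* Each thread initializes the elements it owns. */
    upc_forall(int i = 0; i < N * THREADS; i++; &b[i]) {
        b[i] = i;
        c[i] = 2.0 * i;
    }
    upc_barrier;                 /* wait for all threads */

    /* Affinity expression &a[i]: iteration i runs where a[i] lives. */
    upc_forall(int i = 0; i < N * THREADS; i++; &a[i])
        a[i] = b[i] + c[i];
    upc_barrier;

    /* Any thread may read any shared element directly. */
    if (MYTHREAD == 0)
        printf("a[1] = %.1f (computed by %d threads)\n",
               (double)a[1], THREADS);
    return 0;
}
```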

24 OCI Investment Highlights
- Leadership-class high-performance computing system acquisition ($50M)
- Data- and collaboration-intensive software services ($25.7M)
  - Conduct applied research and development
  - Perform scalability/reliability tests to explore tool viability
  - Develop, harden, and maintain software tools and services
  - Provide software interoperability

25 HPC Acquisition - Track 1
- Increased funding will support the first phase of a petascale system acquisition
- Over four years, NSF anticipates investing $200M
- The acquisition is critical to NSF's multi-year plan to deploy and support a world-class HPC environment
- Collaborating with sister agencies with a stake in HPC
  - DARPA, HPCMOD, DOE/OS, DOE/NNSA, NIH

26 CI Summary: "The Tide that Raises All Boats"
CI impacts and enables the broad spectrum of science and engineering activities.
- NSF CI deployment/acquisition activities must be driven by the needs of the science, engineering, and education communities
  - CI is more than just "big iron"
- Many opportunities to work with other federal agencies that develop, acquire, and/or use various CI resources
- Work required in all aspects of CI software
  - Application and middleware for petascale systems
  - Systems software
- CI has been, and will continue to be, an effective mechanism for broadening participation

27 In the end… it's all about the SCIENCE.
