U.S. Department of Energy’s Office of Science Mary Anne Scott Program Manager Washington Update July 21, 2004 ESCC Meeting.


Office of Science U.S. Department of Energy “The times, they are a-changin’” – Bob Dylan, 1963

Office of Science U.S. Department of Energy ASCR and MICS Staff
 ASCR
 Ed Oliver, Associate Director for Advanced Scientific Computing Research
 Dan Hitchcock, Senior Technical Advisor
 Linda Twenty, Program Analyst
 MICS
 Michael Strayer, Acting Director, MICS
 David Goodwin, NERSC
 Fred Johnson, Computer Science, CS ISICs
 Gary Johnson, ACRT, SAPP, Applied Math (acting)
 Thomas Ndousse-Fetter, Network Research
 Mary Anne Scott, Collaboratories, ESnet (acting)
 George Seweryniak, HBCU
 John van Rosendale, Visualization and Data Management, Math ISICs
 Jane Hiegel, Secretary
 Beverly Foltz, Secretary (temporary)

Office of Science U.S. Department of Energy ASCR/MICS Mission
Discover, develop, and deploy the computational and networking tools that enable researchers in the scientific disciplines to analyze, model, simulate, and predict complex physical, chemical, and biological phenomena important to the Department of Energy (DOE).
 Research: Foster and support fundamental research in advanced scientific computing – applied mathematics, computer science, and networking
 Facilities: Operate supercomputers, a high-performance network, and related facilities

Office of Science U.S. Department of Energy ASCR in Relationship to Office of Science
[Organization chart: OneSC, Phase 1]

Office of Science U.S. Department of Energy ASCR in Relationship to Federal IT Research
[Organization chart of the federal IT R&D coordination structure:]
 U.S. Congress and White House OSTP/OMB; National Science & Technology Council (NSTC); Senior Principals’ Group for IT; President’s Information Technology Advisory Committee (PITAC)
 National Coordination Office for Computing, Information and Communications (NCO/CIC)
 Interagency Working Group on IT R&D (IWG/ITR&D); participating agencies: AHRQ, DARPA, DOE, EPA, NASA, NIST, NOAA, NSA, NSF, OSD/URI
 Coordinating groups: HECC (High End Computing and Communication); LSN (Large Scale Networking); HCI&IM (Human Computer Interface & Information Management); HCSS (High Confidence Systems & Software); SDP (Software Design & Productivity); SEW (Social, Economic & Workforce Implications of IT and IT Workforce Development)
 LSN teams with direct DOE involvement: Joint Engineering Team (JET); Network Research Team (NRT); Middleware and Grid Infrastructure Coordination (MaGIC)

Office of Science U.S. Department of Energy Planning Workshops
 High Performance Network Planning Workshop, August
 DOE Workshop on Ultra High-Speed Transport Protocols and Network Provisioning for Large-Scale Science Applications, April
 Science Case for Large Scale Simulation, June
 DOE Science Networking Roadmap Meeting, June
 Workshop on the Road Map for the Revitalization of High End Computing, June
 ASCR Strategic Planning Workshop, July

Office of Science U.S. Department of Energy Roadmap – Requirements/Business Case
 Over 40% of federal support for the physical sciences comes through the Office of Science.
 The Office supports over 15,000 PhDs, postdocs, and graduate students.
 A similar number of PhDs, postdocs, and graduate students funded by other federal, state, and private agencies and by international institutions are users of, and collaborators at, DOE facilities.
 Most of these users/collaborators, and many of the DOE-funded users, are at universities; many are at international locations.
 Effective end-to-end (E2E) networking and middleware that interfaces to university researchers and international collaborators is therefore critical to the success of the DOE science mission.

Office of Science U.S. Department of Energy Roadmap – Requirements/Business Case
 Achieving the DOE science mission over the next five years requires continuing advances in networking and middleware.
 THE #1 DRIVER – petabyte-scale experimental and simulation data systems will grow to exabyte-scale data systems. Examples: bioinformatics, climate, the LHC, etc.
 Computational systems that process or produce the data continue to advance with Moore’s Law, thereby driving network requirements.
 The combined growth in data and computational power is projected to at least continue the historical trend of network requirements doubling every year, as the sketch below illustrates.
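A minimal sketch of what that yearly doubling implies, assuming a hypothetical 10 Gbps starting requirement in 2004 (the baseline is illustrative, not an ESnet figure):

```python
# Back-of-the-envelope projection of yearly-doubling network requirements.
# BASE_GBPS is an assumed, illustrative starting point.
BASE_YEAR = 2004
BASE_GBPS = 10.0  # e.g., one fully utilized 10 Gbps wavelength

for year in range(BASE_YEAR, BASE_YEAR + 6):
    required_gbps = BASE_GBPS * 2 ** (year - BASE_YEAR)
    print(f"{year}: ~{required_gbps:g} Gbps required")
# 2004: ~10 Gbps ... 2009: ~320 Gbps -- doubling quickly outruns a
# single 10 Gbps wavelength (see the technology slide below).
```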

Office of Science U.S. Department of Energy Roadmap – Requirements/Business Case
 The sources of the data, the computational resources, and the scientist consumers of the data are often not collocated, for two main reasons:
 The experimental, data, and computational facilities are extremely expensive and consequently not replicated; they are also geographically distributed. Sharing them is often the only cost-effective solution.
 The scientists themselves are highly distributed, with many located at universities.
 Because the experimental facilities, data facilities, computational facilities, and scientists are all distributed, networking and middleware are essential to achieving the science.

Office of Science U.S. Department of Energy Network Issues – Technology
 A single wavelength in the optical fiber transport can currently carry only 10 Gbps, and this limit is not anticipated to change within the next five years; at least 40 Gbps will be required within that period.
 The current transmission protocol, TCP, cannot at present efficiently support speeds above a few Gbps per data stream (see the sketch below).
 The technologies to concurrently control multiple multi-Gbps data streams do not at present exist.
 The technologies to perform effective cybersecurity above several Gbps do not currently exist.
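The per-stream TCP ceiling follows from the standard bound throughput ≤ window / round-trip time. A rough sketch with assumed, illustrative window and RTT figures; the parallel-stream workaround shown is the general technique used by contemporary tools such as GridFTP, not something prescribed by the slides:

```python
# Single-stream TCP throughput is bounded by window_size / round_trip_time.
# Both figures below are illustrative assumptions, not measurements.
WINDOW_BYTES = 16 * 2**20  # assumed 16 MB TCP window
RTT_SECONDS = 0.080        # assumed 80 ms cross-country round trip

single_stream_gbps = WINDOW_BYTES * 8 / RTT_SECONDS / 1e9
print(f" 1 stream : ~{single_stream_gbps:.1f} Gbps")  # ~1.7 Gbps

# Striping one transfer across parallel streams raises the aggregate:
for n in (4, 8, 16):
    print(f"{n:2d} streams: ~{n * single_stream_gbps:.1f} Gbps (ideal)")
```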

Office of Science U.S. Department of Energy …what now???
VISION – A scalable, secure, integrated network environment for ultra-scale distributed science is being developed to make it possible to combine resources and expertise to address complex questions that no single institution could manage alone. It is creating the means for research teams to integrate unique and expensive DOE research facilities and resources for remote collaboration, experimentation, simulation, and analysis.
 Network Strategy (the reliability targets are quantified below)
 Production network: base TCP/IP services; >99.9% reliable
 High-impact network: increments of 10 Gbps; switched lambdas (or other solutions); 99% reliable
 Research network: interfaces with the production, high-impact, and other research networks; starts with electronic switching and advances toward optical switching; very flexible
 Revisit the governance model
 SC-wide coordination
 Advisory Committee involvement
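As a quick check on those reliability tiers, the annual downtime each availability target permits (simple arithmetic, assuming an 8,760-hour year):

```python
# Annual downtime budget implied by each availability target.
HOURS_PER_YEAR = 365 * 24  # 8760

for tier, availability in [("production (>99.9%)", 0.999),
                           ("high-impact (99%)", 0.99)]:
    downtime_hours = HOURS_PER_YEAR * (1 - availability)
    print(f"{tier}: ~{downtime_hours:.0f} hours/year of allowable downtime")
# production: ~9 hours/year; high-impact: ~88 hours/year
```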

Office of Science U.S. Department of Energy What is ESCC today?
 A standing committee whose members are appointed by ESnet site organizations
 Advisory to the ESSC, providing a forum for the consideration of a broad range of technical issues
 A forum for information interchange: ESnet-wide activities and plans; site-specific requirements and plans
 A forum for interactions with the ESnet manager and staff
 A forum for interactions with SC programs that use, or would like to use, ESnet facilities
Where to next?

Office of Science U.S. Department of Energy Leadership-Class Computing
 FY2004: $30M appropriation for “the Department [of Energy] to acquire additional advanced computing capability to support existing users in the near term and to initiate longer-term research and development on next generation computer architectures.”
 May 12, 2004 announcement: ORNL, partnered with Cray Inc., IBM Corp., and Silicon Graphics Inc., received $25M to begin building a 50-teraflop science research supercomputer
 “End station” concept proposed
 The FY2005 President’s budget requests an additional $25M to continue the effort

Office of Science U.S. Department of Energy Other Recent HQ Activities
 Committee of Visitors: validates the effectiveness of the way scientific research is managed in the DOE Office of Science
 March ’04 review considered CS, Math, NC
 March ’05 review will consider facilities and network research
 Advisory Committee: new members coming
 JET Roadmapping Workshop: April 13-15, JLAB (Newport News, VA)