Computing in Atmospheric Sciences Workshop: 2003 Challenges of Cyberinfrastructure Alan Blatecky Executive Director San Diego Supercomputer Center.


Driving Factor 1: Technology Pull
- Continuing exponential advances in sensor, computer, storage, and network capabilities will occur
- Sensor networks are creating new experimental facilities
- Petabyte databases are in place; exabyte databases will become feasible
- Increases in numerical and computer modeling capability and capacity will continue to broaden the base of science disciplines
- Increases in network speed make it feasible to connect distributed resources as never before

Technology Curves
- Optical fiber (bits per second): doubling time 9 months
- Silicon computer chips (number of transistors): doubling time 18 months
- Data storage (bits per square inch): doubling time 12 months
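These doubling times compound very differently over a multi-year horizon. As a rough illustration (the 9-, 18-, and 12-month figures are the slide's; the five-year horizon is an arbitrary choice for the example), the growth factor over t months is 2^(t/d) for a doubling time of d months:

```python
# Growth factor over a horizon of t months for a capability
# whose capacity doubles every d months: factor = 2 ** (t / d).
def growth_factor(doubling_months: float, horizon_months: float) -> float:
    return 2.0 ** (horizon_months / doubling_months)

horizon = 60  # five years, an illustrative horizon
curves = {
    "optical fiber (9-month doubling)": 9,
    "silicon chips (18-month doubling)": 18,
    "data storage (12-month doubling)": 12,
}
for name, d in curves.items():
    print(f"{name}: ~{growth_factor(d, horizon):.0f}x over {horizon} months")
```

Over five years, fiber capacity grows roughly 100-fold, storage 32-fold, and transistor counts about 10-fold, which is why network capacity can outpace compute and makes connecting distributed resources increasingly attractive.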

Driving Factor 2: Science Problem Push
- New classes of scientific problems are now being pursued
- More difficult research, more complex requirements
- Coupling of expertise, collaboration, and disciplines encourages the development of new science and research:
  – High-energy physicists are planning to harness tens of thousands of CPUs in a worldwide data grid
  – On-line digital sky surveys require mechanisms for data federation and effective navigation
  – Geoscientists are planning to interlink and share multidisciplinary data sets to understand the complex dynamics of Earth systems

Initial Observations
- Emerging technologies are enabling new types of collaboration and new possibilities in science
- Recent changes in the nature of science are creating an urgent requirement for a new class of distributed science infrastructure
- E-science has some unique requirements that will not be addressed by industry
- Current grid/middleware technology and software development is being done through a patchwork of diverse, short-term projects and programs around the globe
- Infrastructure now extends far beyond technologies and capabilities alone

NSF Model: An Integrated CI System Meeting the Needs of a Community of Communities
Layers, bottom to top:
- Hardware: distributed resources (computation, communication, storage, etc.)
- Grid services & middleware (shared cybertools / software)
- Development tools & libraries (shared cybertools / software)
- Applications: environmental science, high-energy physics, proteomics/genomics, … (domain-specific cybertools / software)
Cross-cutting concerns: education and training; discovery and innovation
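The layered model above can be made concrete as a small data sketch (the layer names come from the slide; the structure and function are illustrative only, not part of any NSF specification):

```python
# Bottom-up layers of the integrated CI system, as named on the slide.
CI_LAYERS = [
    "distributed resources (computation, communication, storage)",
    "grid services & middleware",
    "development tools & libraries",
    "applications",
]

def dependency_stack(domain: str) -> list:
    """Every domain application sits on the same shared lower layers;
    only the top layer is domain-specific."""
    return CI_LAYERS[:-1] + ["applications: " + domain]

for layer in dependency_stack("environmental science"):
    print(layer)
```

The point the sketch captures is the "community of communities" idea: domains differ only at the top of the stack, while the lower layers are shared infrastructure.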

Cyberinfrastructure
Components feeding e-science collaboration and knowledge applications:
- Data & storage
- Networks & middleware
- Instruments & labs
- Domain expertise & disciplines
- Institutions & communities
- Computers & grids

Cultural Challenges I
- The nature of science and engineering enabled by cyberinfrastructure is fundamentally different from traditional approaches
- Sharing becomes a fundamental tenet of science:
  – New traditions of competing and cooperating need to be developed
  – Remote participation and management of distributed control
- Who "owns" research that is shared and collaborative?
  – What about derivative rights and credit in terms of publication and exploration?

Cultural Challenges II
- Conceptual models and frameworks for understanding cyberinfrastructure are limited:
  – NSF is in the process of defining them
  – What are virtual organizations?
- Social conventions, folkways, and mores for science and research need to change
- What are the incentives?
- What does "key contributor" mean in truly distributed and open collaborations?

Technical Challenges
- Standard-setting is wishful thinking given the swiftness of technological change
- More players and more heterogeneity: the design must scale
- The system must be production-ready: software must be bullet-proof, useful, and usable
- The most interesting programming models (real-time, on-demand, adaptive, etc.) still require considerable research
- Useful cyber tools and useful domain tools are required

Logistical and Legal Challenges
- Who maintains the software? Who fixes the bugs? Who documents the code? Who answers the phone? Who wears the pager?
- How do we do accounting across multiple administrative and discipline domains?
- How do we allocate resources across multiple sites?
- How do we deal with varying organizational IP policies, open-source policies, licensing policies, etc.?
- What authority and responsibility will, or should, virtual organizations have?

Ideological Challenges
- What organizational framework promotes the development of stable, persistent infrastructure?
- How do we integrate different institutional approaches and cultures for the administration of resources, operations, software development and deployment, etc.?
- How do we develop metrics and incentives for meaningful cooperation, coordination, community collections, etc.?
- What is shared vs. private, free vs. charged for, centralized vs. distributed, etc.?

International Challenges
- How do you share resources across national boundaries?
- How do we do global allocations? Who makes decisions? Who enforces them?
- What mechanisms should be used to select and support applications?
- How do we ensure stability and interoperability?
- What about national security?

Some Support Issues
- Inadequate funding of grid and middleware efforts in general
- A recognition that long-term, sustained, and persistent efforts to develop and support grid and middleware must be established as soon as possible
- A recognition that international coordination, cooperation, and collaboration have to be enabled
- Inadequate funding of the pipelines that produce expertise for the future

End