U.S. Grid Projects and Involvement in EGEE Ian Foster Argonne National Laboratory University of Chicago EGEE-LHC Town Meeting, CERN, February 22, 2003

2. Overview
• U.S. Grid projects
  – Overview of current state in infrastructure, applications, and middleware
• Next steps
  – NSF "Cyberinfrastructure" report
  – U.S. MAGIC Committee
  – Planned "LHC" ITR proposal & GRIDS-2
  – Building a Grid middleware community
• U.S. involvement in EGEE
  – Integration of infrastructure
  – Collaboration on middleware

3. Current State of U.S. Grid Projects (1): Infrastructure + Applications
• Infrastructure deployment & operation
  – NSF TeraGrid, iVDGL, DOE Science Grid, NASA IPG, BIRN, various regional Grids
  – Good progress, but still far from critical mass as a national "cyberinfrastructure"
• Applications R&D and deployment
  – (HEP) GriPhyN, iVDGL, PPDG [next slide]
  – (Other) Earth System Grid, NEESgrid, NEON, GEON, etc.
  – Substantial engagement of large application communities; much more needed

4. "Trillium": U.S. Physics Grid Projects
• Particle Physics Data Grid
  – Data Grid tools for HENP experiments
  – DOE funded, $9.5M
• GriPhyN
  – Data Grid research
  – NSF funded, $11.9M
• iVDGL
  – Development of a global Grid lab
  – NSF funded, $14.1M
Common to all three:
  – Data-intensive experiments
  – Collaborations of physicists & computer scientists
  – Infrastructure development & deployment
  – Globus + VDT based

5. Current State of U.S. Grid Projects (2): Middleware
• Diverse mix of projects & funding sources
  – No one wants to fund middleware!
• Unified by two long-lived projects: Globus (since 1995) and Condor (since 1987)
  – Persistence, strategic direction, skilled staff
  – Foundation for essentially all Grid projects
  – Strong & close coordination between the two
• Much recent progress toward creation of professional, distributed support structures
  – With support from the NSF Middleware Initiative/GRIDS Center & GriPhyN (VDT)
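To make the middleware concrete: Condor (today HTCondor) schedules jobs described in plain-text submit files, and this interface is what the VDT packaged for the physics Grids. A minimal sketch of such a submit file, where `analyze.sh` and `input.dat` are hypothetical names used only for illustration:

```
# Minimal HTCondor submit description file (a sketch;
# analyze.sh and input.dat are hypothetical, not part of Condor)

# "vanilla" universe: run an ordinary, unmodified executable
universe   = vanilla
executable = analyze.sh
arguments  = input.dat

# where the job's stdout/stderr go, plus Condor's own event log
output     = analyze.out
error      = analyze.err
log        = analyze.log

# enqueue one instance of the job
queue
```

Submitted with `condor_submit`, the job is matched to an idle machine in the pool; Globus (via GRAM) played the analogous role for submitting work across administrative domains.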

6. Impact of NSF NMI/GRIDS Center: Evolution of GT Processes
• Before 2000
  – Email-based problem tracking, aka "req"
• 2000
  – Detailed documentation, release notes (Q1)
  – Legal framework for external contributions (Q1)
• 2001
  – Packaging; module & binary releases (Q4)
  – Substantial regression tests (Q4)
• 2002
  – Bugzilla problem reporting & tracking (Q2)
  – Processes for external contributions (Q2)
  – Distributed testing infrastructure (Q3)
• 2003 (in progress)
  – Distributed support infrastructure: GT "support centers"
  – Standardized Grid testing framework(s)
  – GT "contrib" components
  – Grid Technology Repository

7. Overview
• U.S. Grid projects
  – Overview of current state in infrastructure, applications, and middleware
• Next steps
  – NSF "Cyberinfrastructure" report
  – U.S. MAGIC Committee
  – Planned "LHC" ITR proposal & GRIDS-2
  – Building a Grid middleware community
• U.S. involvement in EGEE
  – Integration of infrastructure
  – Collaboration on middleware

8. Report of the NSF Blue Ribbon Panel on Cyberinfrastructure
• "A new age has dawned in scientific and engineering research, pushed by continuing progress in computing, information, and communication technology, and pulled by the expanding complexity, scope, and scale of today's challenges. The capacity of this technology has crossed thresholds that now make possible a comprehensive cyberinfrastructure on which to build new types of scientific and engineering knowledge environments and organizations and to pursue research in new ways and with increased efficacy."
• Recommends $1B/yr in new funding for cyberinfrastructure

9. Report of the NSF Blue Ribbon Panel on Cyberinfrastructure (cont.)
• "The National Science Foundation should establish and lead a large-scale, interagency, and internationally coordinated Advanced Cyberinfrastructure Program (ACP) to create, deploy, and apply cyberinfrastructure in ways that radically empower all scientific and engineering research and allied education. We estimate that sustained new NSF funding of $1 billion per year is needed."

10. NSF ITR: Global Analysis Communities

11. NSF ITR: Global Analysis Communities
• Global knowledge & resource management + collaboration tools
• Infrastructure to support "Community Grids"
• Infrastructure to manage dynamic workspace capabilities
• Decentralized multi-tiered schema evolution and synchronization

12. GRIDS Center 2 (Proposals to NSF Due March 7th)
• Transition to OGSA standards
• Expand range of functionality supported
• Put in place a distributed integration, testing, and support structure
• Facilitate exporting the NMI toolset to other middleware activities
• Expand the set of communities supported
• Establish international collaborations

13. GRIDS Center 2: An Open Grid Technology Community
• Success of the Grid concept demands effective community mechanisms for coordinated
  – R&D for core technologies
  – Testing, packaging, documentation
  – Support and training of user communities
• All three must become collaborative activities
  – Based on open standards (GGF)
  – Centered on a common code base
  – Supported by appropriate tools
  – United by appropriate processes & governance

14. Overview
• U.S. Grid projects
  – Overview of current state in infrastructure, applications, and middleware
• Next steps
  – NSF "Cyberinfrastructure" report
  – U.S. MAGIC Committee
  – Planned "LHC" ITR proposal & GRIDS-2
  – Building a Grid middleware community
• U.S. involvement in EGEE
  – Integration of infrastructure
  – Collaboration on middleware

15. U.S. Involvement in EGEE (1): Integration of Infrastructure
• EGEE (& U.S. equivalents) to serve science communities with international scope
  – Physics, astronomy, environment, bio, …
• We must design for international collaboration & coordination from the start
  – Need to learn more about application needs & their implications for Grid design and operation
  – But some things are clear, e.g., standards; open and productive coordination; applications
• Strong application community interest in establishing these connections

16. U.S. Involvement in EGEE (2): Collaboration on Middleware
• With OGSA, EGEE, evolving distributed support structures, etc., the stars are aligned for true international cooperation
  – Software: transform GT & Condor into an international collaboration (think Linux) on a common base for Grid/distributed computing
  – Testing & support: link EGEE staff & systems into an international testing framework: procedures, tools, infrastructure, bilateral agreements
• Not easy: many opportunities for not-invented-here and start-from-scratch perspectives, and for small local changes that lead to divergence!

17. U.S. Involvement in EGEE (3): Specific Activities
• Explicit collaborative efforts aimed at
  – Integrating U.S. and EGEE resources in support of international science
  – Creation & operation of a coordinated international testing and support structure
• Direct engagement of U.S. groups in EGEE work packages
  – Joint development in some cases, e.g., monitoring/operations
  – Establishment and operation of the structures above, to ensure common evaluation, testing, and support