The GRIDS Center, part of the NSF Middleware Initiative
The GRIDS Center: Defining and Deploying Grid Middleware
Presented by Tom Garritano, University of Chicago and Argonne National Laboratory

NSF Middleware Initiative (NMI)
GRIDS is one of two original teams, the other being EDIT
New NMI teams just announced (Grid portals and instrumentation)
GRIDS releases well-tested, deployed, and supported middleware based on common architectures that can be extended to Internet users around the world
NSF support of GRIDS leverages investment by DOE, NASA, DARPA, the UK e-Science Program, and private industry

GRIDS Center
GRIDS = Grid Research Integration Development & Support
Partnership of leading teams in Grid computing:
–University of Chicago and Argonne National Lab
–Information Sciences Institute at USC
–NCSA at the University of Illinois at Urbana-Champaign
–SDSC at the University of California at San Diego
–University of Wisconsin at Madison
–Plus other software contributors (to date: UC Santa Barbara, U. of Michigan)
GRIDS develops, tests, deploys, and supports standard tools for:
–Authentication, authorization, policy
–Resource discovery and directory services
–Remote access to computers, data, instruments

The Grid: What is it?
"Resource-sharing technology with software and services that let people access computing power, databases, and other tools securely online across corporate, institutional, and geographic boundaries without sacrificing local autonomy."
Three key Grid criteria:
–coordinates distributed resources
–using standard, open, general-purpose protocols and interfaces
–to deliver qualities of service not possible with pre-Grid technologies

Virtual Organizations
[diagram: resources "R" at multiple sites, linked across administrative domains and grouped into VO-A and VO-B]
–Distributed resources and people
–Linked by networks, crossing administrative domains
–Sharing resources, common goals
–Dynamic
–Fault tolerant

GRIDS Center Software Suite
Globus Toolkit®. The de facto standard for Grid computing, an open-source "bag of technologies" that simplifies collaboration across organizations. Includes tools for authentication, scheduling, file transfer, and resource description.
Condor-G. Enhanced version of the core Condor software, optimized to work with the Globus Toolkit for managing Grid jobs.
Network Weather Service (NWS). Periodically monitors and dynamically forecasts the performance of network and computational resources.
Grid Packaging Tools (GPT). XML-based packaging data format that defines complex dependencies between components.
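To illustrate how a user hands a job to Condor-G for execution on a Globus-managed resource, here is a minimal submit description file sketch. The gatekeeper contact string and file names are placeholders, not real hosts; exact keywords may vary by Condor version.

```
# Hypothetical Condor-G submit description file (sketch).
# "universe = globus" tells Condor-G to route the job through
# a remote Globus gatekeeper rather than a local Condor pool.
universe        = globus
globusscheduler = gatekeeper.example.edu/jobmanager-pbs
executable      = /bin/hostname
output          = job.out
error           = job.err
log             = job.log
queue
```

Submitted with condor_submit, Condor-G uses the user's GSI proxy credential to authenticate to the remote gatekeeper and then tracks the job through the Globus jobmanager, giving the user Condor's familiar queue-management view of a Grid job.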

GRIDS Center Software Suite (cont.)
GSI-OpenSSH. Modified version of OpenSSH that adds support for Grid Security Infrastructure (GSI) authentication and single sign-on capability.
MyProxy. Credential repository that lets users retrieve a proxy credential on demand, without managing private key and certificate files across sites and applications.
MPICH-G2. Grid-enabled implementation of the Message Passing Interface (MPI) standard, based on the popular MPICH library.
GridConfig. Manages the configuration of GRIDS components, letting users regenerate configuration files in native formats and ensure consistency.
KX.509 and KCA. Tools from EDIT that bridge Kerberos and PKI infrastructures.
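The single sign-on flow that MyProxy and GSI-OpenSSH enable can be sketched as a short command-line session. The hostnames and username below are hypothetical, and exact flags may vary across releases of these tools.

```
# Delegate a medium-lived credential into a MyProxy repository
# (run once, from a machine holding the long-term key pair)
myproxy-init -s myproxy.example.edu -l alice

# Later, from any machine: retrieve a short-lived proxy credential
# by passphrase, with no private-key file stored locally
myproxy-logon -s myproxy.example.edu -l alice

# Use the proxy for single sign-on to a GSI-enabled SSH server
gsissh grid-node.example.edu
```

The point of the design is that the long-term private key never travels with the user: only short-lived proxy credentials do, which is what makes on-demand retrieval safe across sites and applications.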

E-Science Benefits Substantially from GRIDS Components
Large-scale IT deployment projects rely on GRIDS components and architecture for core services:
–BIRN, the Biomedical Informatics Research Network
–GEON, the Geoscience Network
–GriPhyN, Particle Physics Data Grid, International Virtual Data Grid Laboratory
–NEESgrid, part of the Network for Earthquake Engineering Simulation
–International projects such as the UK e-Science Program and EU DataGrid
GRIDS standard tools let projects avoid building their own infrastructure:
–Increases interoperability, efficiency
–Prevents "balkanization" of applications
[image: BIRN MRI data for brain imaging]

Industrial and International Leaders Move to Grid Services
GRIDS leaders engage a worldwide community in defining specifications for Grid services:
–Very active working through the Global Grid Forum
–Over a dozen leading companies (IBM, HP, Platform) have committed to Globus-based Grid services for their products
NMI-R4 in December will include Globus Toolkit 3.0:
–GT3 is the first full-scale deployment of the new Open Grid Services Infrastructure (OGSI) spec
–Significant contributions from new international partners (University of Edinburgh and the Swedish Royal Institute of Technology) for database access and security
–UK Council for the Central Laboratory of the Research Councils (CCLRC) users rank deployment of GT3 as their #1 priority

Acclaim for GRIDS Components
On July 15, the New York Times noted the "far-sighted simplicity" of the Grid services architecture
The Globus Toolkit has earned:
–R&D 100 Award
–Federal Laboratory Consortium Award for Excellence in Technology Transfer
MIT Technology Review named Grid one of "Ten Technologies That Will Change the World"
InfoWorld's list of 2003's top 10 innovators includes two GRIDS PIs
GRIDS co-PI Ian Foster named "Innovator of the Year" for 2003 by R&D Magazine

Future GRIDS Plans
GRIDS is completing its second year in October:
–Original three-year award, through Fall 2004
–Very successful in establishing processes, meeting a twice-yearly release schedule, defining broadly accepted Grid middleware standards, and increasing public awareness of Grid computing
GRIDS Center 2 plans:
–Further develop and refine core NMI releases and processes
–Deploy tools based on the Open Grid Services Architecture
–Expand testing capability
–Create a federated bug-tracking facility
–Public databases: Grid Projects and Deployments System, and Grid Technology Repository
–Increase outreach to communities at all levels:
  Existing major Grid projects (e.g., TeraGrid, NEESgrid)
  Major projects that should use Grid more (e.g., SEEK, NEON)
  New communities not yet using Grid (e.g., computer-aided diagnosis)

Upcoming Tutorials
GRIDS is extremely well represented at SC03, the supercomputing conference:
–Tutorials, technical papers, BoFs, demonstrations
–Phoenix, AZ, November
GlobusWORLD 2004 conference:
–Co-sponsored by GRIDS
–San Francisco, CA, January
–Academia and industry both well represented

For more information
The GRIDS Center
NSF Middleware Initiative
The Globus Alliance