
1 Copyright Notice
Copyright James Kent Blackburn 2007. This work is the intellectual property of the author. Permission is granted for this material to be shared for non-commercial, educational purposes, provided that this copyright statement appears on the reproduced materials and notice is given that the copying is by permission of the author. To disseminate otherwise or to republish requires written permission from the author.

2 Open Science Grid
James "Kent" Blackburn, OSG Resources Manager, Senior Scientist, LIGO Laboratory, California Institute of Technology

3 OSG's Coverage of the CI "Bubble" Diagram
[Diagram: the cyberinfrastructure "bubble" chart (instrumentation; data generation; computation, analysis and simulation; security, authentication, authorization and access control; display and visualization; data sets, storage, schema, metadata and ontologies; collaboration tools and publishing; human support and help desk; policy and funding, resource providers, funding agencies and campuses; education and outreach; network; training), with the regions covered by the OSG and the OSG Consortium highlighted.]

4 The Open Science Grid
The Open Science Grid's mission is to help satisfy the ever-growing computing and data management requirements of researchers by enabling them to share a greater percentage of available computer cycles and software with less effort. The OSG is a distributed, common cyberinfrastructure spanning campus, regional, national and international boundaries. At over 50 provider sites, independently owned and managed resources make up the distributed facility:
• agreements between members provide the glue;
• their requirements drive the evolution;
• their effort helps make it happen.
The facility is dedicated to high-throughput computing and is open to researchers from all domains. OSG is a cyberinfrastructure for research: a framework for large-scale distributed resource sharing that addresses the technology, policy and social requirements of sharing.

5 OSG Consortium Partners
Academia Sinica
Argonne National Laboratory (ANL)
Boston University
Brookhaven National Laboratory (BNL)
California Institute of Technology
Center for Advanced Computing Research
Center for Computation & Technology at Louisiana State University
Center for Computational Research, The State University of New York at Buffalo
Center for High Performance Computing at the University of New Mexico
Columbia University
Computation Institute at the University of Chicago
Cornell University
DZero Collaboration
Dartmouth College
Fermi National Accelerator Laboratory (FNAL)
Florida International University
Georgetown University
Hampton University
Indiana University
Indiana University-Purdue University, Indianapolis
International Virtual Data Grid Laboratory (iVDGL)
Thomas Jefferson National Accelerator Facility
University of Arkansas
Universidade de São Paulo
Universidade do Estado do Rio de Janeiro
University of Birmingham
University of California, San Diego
University of Chicago
University of Florida
University of Illinois at Chicago
University of Iowa
University of Michigan
University of Nebraska - Lincoln
University of New Mexico
University of North Carolina/Renaissance Computing Institute
University of Northern Iowa
University of Oklahoma
University of South Florida
University of Texas at Arlington
University of Virginia
University of Wisconsin-Madison
University of Wisconsin-Milwaukee Center for Gravitation and Cosmology
Vanderbilt University
Wayne State University
Kyungpook National University
Laser Interferometer Gravitational Wave Observatory (LIGO)
Lawrence Berkeley National Laboratory (LBL)
Lehigh University
Massachusetts Institute of Technology
National Energy Research Scientific Computing Center (NERSC)
National Taiwan University
New York University
Northwest Indiana Computational Grid
Notre Dame University
Pennsylvania State University
Purdue University
Rice University
Rochester Institute of Technology
Sloan Digital Sky Survey (SDSS)
Southern Methodist University
Stanford Linear Accelerator Center (SLAC)
State University of New York at Albany
State University of New York at Binghamton
State University of New York at Buffalo
Syracuse University
T2 HEPGrid Brazil
Texas Advanced Computing Center
Texas Tech University

6 What The OSG Offers
• Low-threshold access to many distributed computing and storage resources
• A combination of dedicated, scheduled, and opportunistic computing
• The Virtual Data Toolkit software packaging and distributions
• Grid Operations, including facility-wide monitoring, validation, information services and system integration testing
• Operational security
• Troubleshooting of end-to-end problems
• Education and training

7 The OSG as a Community Alliance
The OSG is a grass-roots endeavor bringing together research institutions throughout the U.S. and the world.
• The OSG Consortium brings together the stakeholders.
• The OSG Facility brings together resources and users.
The OSG's growing alliance of universities, national laboratories, scientific collaborations and software developers
• contribute to the OSG,
• share ideas and technologies, and
• reap the benefits of the integrated resources through both agreements with fellow members and opportunistic use.
An active engagement effort adds new domains and resource providers to the OSG Consortium. Training is offered at semi-annual OSG Consortium meetings and through educational activities organized in collaboration with TeraGrid. One- to three-day hands-on training sessions are offered around the U.S. and abroad for users, administrators and developers.

8 OSG Community Structure: Virtual Organizations (VOs)
• The OSG community shares/trades in groups (VOs), not individuals
• VO management services allow registration, administration and control of members within VOs
• Facilities trust and authorize VOs
• Compute and storage services prioritize according to VO group membership (sketched below)
[Diagram: a VO management service linking VO management & applications to a set of available resources spanning the OSG and WAN, a campus grid, and an experimental project grid. Image courtesy: UNM]
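To make the trust model concrete, here is a minimal sketch in Python of the idea behind these bullets: a site admits whole VOs rather than individual users, and ranks work by VO share. The VO names, share weights, and function names are hypothetical illustrations, not OSG's actual middleware.

    # Minimal sketch: sites trust VOs, not individuals, and prioritize by VO
    # group membership. All VO names and share weights are hypothetical.
    from dataclasses import dataclass

    AUTHORIZED_VOS = {"ligo": 0.5, "cms": 0.3, "engage": 0.2}  # VO -> share

    @dataclass
    class Job:
        user: str
        vo: str  # VO membership asserted by the user's grid credential

    def authorize(job: Job) -> bool:
        """Admit any member of a VO this site has agreed to trust."""
        return job.vo in AUTHORIZED_VOS

    def priority(job: Job) -> float:
        """Rank jobs by the share of the VO they belong to."""
        return AUTHORIZED_VOS.get(job.vo, 0.0)

    jobs = [Job("alice", "ligo"), Job("bob", "cms"), Job("eve", "other")]
    admitted = sorted(filter(authorize, jobs), key=priority, reverse=True)
    print([j.user for j in admitted])  # ['alice', 'bob']; eve's VO is untrusted

The point of the model is that adding a thousand users costs a site almost nothing: it negotiates one agreement per VO, and the VO manages its own members.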

9 Campus Grids
• Campus grids are a fundamental building block of the OSG
• The multi-institutional, multi-disciplinary nature of the OSG is a macrocosm of many campus IT cyberinfrastructure coordination issues
• Currently OSG has three operational campus grids on board: Fermilab, Purdue, Wisconsin
• Working to add Clemson, Harvard, Lehigh
• Elevation of jobs from campus CI to OSG is transparent (see the sketch below)
• Campus scale brings value through
  • the richness of a common software stack with common interfaces
  • a higher common denominator that makes sharing easier
  • greater collective buying power with vendors
  • synergy through common goals and achievements
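A minimal sketch of what "transparent elevation" means in practice: the researcher submits locally, and a routing layer overflows work to OSG only when the campus pool is full. Everything here (pool names, slot counts, the route function) is a hypothetical illustration, not the actual campus-grid software.

    # Hypothetical illustration of transparent job elevation: prefer the
    # campus pool, overflow to OSG sites opportunistically.
    from typing import Optional

    def free_slots(pool: dict) -> int:
        return pool["slots"] - pool["running"]

    def route(campus: dict, osg_sites: list) -> Optional[str]:
        """Return the pool a new job lands on; None means it stays queued."""
        for pool in [campus] + osg_sites:  # local first, then opportunistic
            if free_slots(pool) > 0:
                pool["running"] += 1
                return pool["name"]
        return None

    campus = {"name": "campus-pool", "slots": 2, "running": 2}   # full
    osg = [{"name": "osg-site-a", "slots": 100, "running": 40}]
    print(route(campus, osg))  # 'osg-site-a': the job elevates to OSG

The submission interface never changes; only the routing decision does, which is why elevation is invisible to the researcher.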

10 Current OSG Resources
• OSG has more than 50 participating institutions, including self-operated research VOs, campus grids, regional grids and OSG-operated VOs
• Provides about 10,000 CPU-days per day of processing (see the back-of-the-envelope check below)
• Provides 10 Terabytes per day in data transport
• CPU usage averages about 75%
• OSG is starting to offer support for MPI
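As a back-of-the-envelope check (our arithmetic, not the slide's): delivering 10,000 CPU-days per wall-clock day means roughly 10,000 CPUs busy at any instant, and at 75% average utilization that implies on the order of 13,000 CPUs in the facility.

    # Back-of-the-envelope: 1 CPU-day per day == 1 continuously busy CPU.
    cpu_days_per_day = 10_000   # processing delivered per day (slide figure)
    utilization = 0.75          # average CPU usage (slide figure)

    busy_cpus = cpu_days_per_day          # CPUs busy at any instant
    total_cpus = busy_cpus / utilization  # implied installed capacity
    print(f"~{busy_cpus} CPUs busy, ~{total_cpus:.0f} CPUs total")
    # -> ~10000 CPUs busy, ~13333 CPUs total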

11 Weekly OSG Process Hours
[Chart: weekly process hours delivered across OSG sites.]

12 Facts and Figures from the First Year of Operations
• OSG contributed an average of over one thousand CPU-days per day for two months to the D0 physics experiment
• OSG provided the LHC collaboration with more than 30% of their processing cycles worldwide, during which up to 100 Terabytes per day were transferred across more than 7 storage sites
• LIGO has been running workflows of more than 10,000 jobs across more than 20 different OSG sites
• A climate modeling application has accumulated more than 10,000 CPU-days of processing on the OSG
• The Kuhlman Lab completed structure predictions for ten proteins, consuming more than 10,000 CPU-days on the OSG

13 Facing the CI Challenge Together
• OSG is looking for a few partners to help deploy campus-wide grid infrastructure that integrates with local enterprise infrastructure and the national CI
• OSG's Engagement Team is available to help scientists get their applications running on OSG
  • a low-impact starting point
  • helps your researchers gain significant compute cycles while exploring OSG as a framework for your own campus CI
Send your inquiries to osg@renci.org. Learn more about the OSG at http://www.opensciencegrid.org.

