1 Open Science Grid By Zoran Obradovic CSE 510 November 1, 2007

2 The OSG is a continuation of Grid3, a community grid built in 2003 through a joint project of the U.S. LHC software and computing programs, the National Science Foundation's GriPhyN and iVDGL projects, and the Department of Energy's PPDG project.

3 The goal of the Open Science Grid (OSG) is to meet the expanding computing and data-management needs of scientific researchers, especially collaborative science requiring high-throughput computing. It is an association of service and resource providers as well as researchers, including universities, national laboratories and computing centers across the U.S.

4 This association, also known as the Consortium, includes members from particle and nuclear physics, astrophysics, bioinformatics, gravitational-wave science and computer science collaborations.

5 Who are the Consortium Members?

6

7 And many more… DZero Collaboration; Dartmouth College; Deutsches Elektronen-Synchrotron (DESY); Fermi National Accelerator Laboratory (FNAL); Florida International University; Georgetown University; The Globus Alliance; Grid Physics Network (GriPhyN); Grid Resources for Advanced Science and Engineering (GRASE); Hampton University; Harvard University; Indiana University; Indiana University-Purdue University, Indianapolis; International Virtual Data Grid Laboratory (iVDGL); Kyungpook National University; Laser Interferometer Gravitational Wave Observatory (LIGO); Lawrence Berkeley National Laboratory (LBL); Lehigh University; Massachusetts Institute of Technology; National Energy Research Scientific Computing Center (NERSC); New York University; Northwest Indiana Computational Grid; Notre Dame University; Oak Ridge National Laboratory; OSG Grid Operations Center (GOC); Particle Physics Data Grid (PPDG) and PPDG Common Project; Pennsylvania State University; Purdue University; Renaissance Computing Institute; Rice University; Rochester Institute of Technology; São Paulo Regional Analysis Center (SPRACE); Sloan Digital Sky Survey (SDSS)

8 Consortium Members 2005

9 Consortium Members 2007

10 Partners: grid and network organizations as well as international, national, regional and campus grids. Some of the partners: APAC National Grid; Data Intensive Science University Network (DISUN); Enabling Grids for E-SciencE (EGEE); Grid Laboratory of Wisconsin (GLOW); Grid Operations Center at Indiana University; Grid Research and Education Group at Iowa (GROW); Nordic Data Grid Facility (NorduGrid); Northwest Indiana Computational Grid (NWICG); Oxford e-Research Centre (OxGrid)

11 Who Manages OSG?

12

13 Several sub-groups within the Consortium manage, advise, oversee and govern the OSG. These groups include the Executive Board, the Executive Team, the OSG Council, the Users Group, the Scientific Oversight Group and the Finance Board.

14 OSG is governed by the Council, which includes a representative from each Consortium member. The Users Group, for example, provides a venue for OSG user representatives to share requirements for, and experiences of, developing and running applications on the OSG. It ensures that all parts of the scientific mission and all applications in use on the OSG are represented.

15 Some of the council members: Brookhaven National Laboratory - Howard Gordon; Collider Detector at Fermilab (CDF) - TBD; Condor Project - Miron Livny; DZero Collaboration - Brad Abbott; DOSAR - Dick Greenwood; Fermi National Accelerator Laboratory - Vicky White; Globus Alliance - Ian Foster

16 The Scientific Oversight Group represents the scientific community and directs the Council and Executive Board. The Finance Board manages all matters related to OSG costs and resources.

17 The administration of the OSG is led by the Executive Director and the Executive Board. They direct the OSG program of work, write policy and represent the OSG Consortium in relations with other organizations and committees. Executive Director: Ruth Pordes (Fermi National Accelerator Laboratory); Council Chair: Bill Kramer (Lawrence Berkeley National Laboratory); Facility Coordinator: Miron Livny, Deputy: Todd Tannenbaum, interim (University of Wisconsin, Madison).

18 Who are the Virtual Organizations?

19 A Virtual Organization (VO) is a collection of people (VO members), and it encompasses the group's computing/storage resources and services. VOs are responsible for corresponding with each other individually to arrange guaranteed access to resources. In order to be authorized at another VO's site, a user's grid job must be able to present an identification token along with a token indicating the desired computing privileges.
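In practice those two tokens are usually carried by an X.509 proxy certificate with VOMS attribute extensions. A minimal sketch of how a user would create and inspect one, assuming a standard VOMS client installation; the VO name "myvo" is a placeholder, not a real OSG VO:

    # Create a proxy carrying the user's grid identity plus the VO
    # attributes a site uses to decide computing privileges.
    # "myvo" is a placeholder VO name.
    voms-proxy-init -voms myvo
    # Inspect the identity and attribute (FQAN) information in the proxy.
    voms-proxy-info -all

The proxy itself serves as the identification token, while the signed VOMS attributes play the role of the privilege token that sites map to local access rights.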

20 Some of the Virtual Organizations at OSG: (NYSGRID); (CIGI); Collider Detector at Fermilab (CDF); Compact Muon Solenoid (CMS); CompBioGrid (CompBioGrid); D0 Experiment at Fermilab (DZero); Dark Energy Survey (DES); Distributed Organization for Scientific and Academic Research (DOSAR); Engagement (Engage); Fermi National Accelerator Center (Fermilab); Functional Magnetic Resonance Imaging (fMRI); Geant4 Software Toolkit (geant4); Genome Analysis and Database Update (GADU)

21 How to Form a VO at OSG?

22 To form a VO, an organization needs: a Charter statement describing the purpose of the VO; a VO Membership Service which meets the requirements of an OSG Release, provided by deploying the VOMS package (a system that manages real-time user authorization information for a VO); a support organization (called a Support Center in OSG parlance) that will support the VO in OSG Operations; and completion of the registration form.
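Once the VO Membership Service is running, users and sites need to know where to find it. A hedged sketch of a client-side "vomses" entry pointing at the new VO's VOMS server, assuming a typical VOMS client layout (often /etc/vomses or a per-user vomses directory); the alias, host, port and certificate DN below are all placeholders, not a real OSG VO:

    # Format: "<alias>" "<voms host>" "<port>" "<server certificate DN>" "<vo name>"
    # All values below are placeholders for illustration only.
    "myvo" "voms.example.org" "15001" "/DC=org/DC=example/CN=voms.example.org" "myvo"

With an entry like this in place, voms-proxy-init -voms myvo can locate the VO's Membership Service.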

23 After submitting the registration form, the OSG Operations Activity will review the information. If there are no issues preventing acceptance, they will send a welcome email message and add support for your VO to the OSG software infrastructure.

24 Making resources available on OSG: you don't have to be a member of a VO to share resources on OSG, although it is recommended (in order to test a resource on OSG you must be a member). Resources are typically presented to OSG in one of two modes: resources controlled by a single VO and made available as part of the VO's commitment to OSG; or resources provided by a Facility (a collection of resources or sites under the same administrative domain, not necessarily affiliated with a VO) or by a group of VOs.

25 Software Stacks

26 The VDT provides the underlying software stack for the OSG, but it also provides software to other grids. The software is divided into two caches: the VDT software cache, whose goal is to be grid-agnostic, and the OSG software stack, a thin layer on top of the VDT that does two things: it selects the subset of the VDT that OSG uses, and it provides OSG-specific configuration. Pacman installs and configures it all.

27 What is Pacman? A packaging system that installs the Virtual Data Toolkit (VDT). Pacman itself is installed via a downloaded tarball. Its features include: upward compatibility with all existing caches; flexible command-line-based cache and package browsing; snapshots, cache hierarchies and installation caches; Globus, SSH and HTTP access to caches and downloads; updating, verification and repair of installations; and multi-version installations.
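A minimal sketch of what a Pacman-driven install looks like, assuming the VDT 1.8.1 cache URL listed later in these slides; the install directory and the package name "Globus-Client" are illustrative and vary by VDT release:

    # Install a VDT component from a remote Pacman cache into the
    # current directory, then load the environment it sets up.
    # "Globus-Client" is an illustrative package name; check the
    # cache contents page for the names in a given release.
    cd /opt/vdt
    pacman -get http://vdt.cs.wisc.edu/vdt_181_cache:Globus-Client
    source setup.sh

Pacman resolves the package's dependencies from the cache and writes its configuration under the install directory, which is why the same mechanism can install either the grid-agnostic VDT cache or the OSG-specific layer on top of it.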

28 VDT will run on the following systems: Debian 3.1 (Sarge); Fedora Core 3; Fedora Core 4; Fedora Core 4 (x86-64); Fedora Core 4 (x86 on x86-64); RedHat Enterprise Linux 3 AS; RedHat Enterprise Linux 3 AS (x86-64); RedHat Enterprise Linux 3 AS (x86 on x86-64); RedHat Enterprise Linux 3 AS (IA-64); RedHat Enterprise Linux 4 AS; RedHat Enterprise Linux 4 AS (x86-64); RedHat Enterprise Linux 4 AS (x86 on x86-64); ROCKS Linux 3.3; Scientific Linux Fermi 3; Scientific Linux Fermi 4; Scientific Linux Fermi 4 (x86-64); Scientific Linux Fermi 4 (x86 on x86-64); Scientific Linux 4 (IA-64); SUSE Linux 9 (IA-64)

29 The OSG is set up to enable a smooth transition from developing new services to providing them in a production environment. Each set of services and functionality that is used as a design basis for OSG applications is then turned into an OSG "Release."

30

31

32 VDT contains three kinds of middleware: Basic Grid Services (Condor-G and Globus); Virtual Data Systems; and Utilities (MonALISA, VOMS, …). What is supported on different platforms? See http://vdt.cs.wisc.edu/releases/1.8.1/contents.html
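To give a concrete sense of how the Basic Grid Services fit together, here is a hedged sketch of submitting a job through Condor-G to a Globus GRAM gatekeeper; the gatekeeper hostname and the executable name are placeholders, not real OSG resources:

    # Write a Condor-G submit description file for the grid universe,
    # then hand it to the local Condor-G scheduler and watch the queue.
    # gatekeeper.example.edu and analyze.sh are placeholders.
    cat > grid_job.sub <<'EOF'
    universe      = grid
    grid_resource = gt2 gatekeeper.example.edu/jobmanager-condor
    executable    = analyze.sh
    output        = job.out
    error         = job.err
    log           = job.log
    queue
    EOF
    condor_submit grid_job.sub
    condor_q

Condor-G queues and tracks the job locally, while Globus GRAM (the "gt2" resource type above) handles authentication and execution at the remote site.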

33 VDT is funded by the National Science Foundation. The National Science Foundation funds research and education in most fields of science and engineering. It does this through grants and cooperative agreements to more than 2,000 colleges, universities, K-12 school systems, businesses, informal science organizations and other research organizations throughout the country. The Foundation accounts for about one-fourth of federal support to academic institutions for basic research.

34 And… the Department of Energy. The Department of Energy's overarching mission is to advance the national, economic, and energy security of the United States; to promote scientific and technological innovation in support of that mission; and to ensure the environmental cleanup of the national nuclear weapons complex. The Department's strategic goals to achieve the mission are designed to deliver results along five strategic themes: Energy Security; Nuclear Security; Scientific Discovery and Innovation; Environmental Responsibility; and Management Excellence.

35 The OSG Production software cache is at: http://software.grid.iu.edu/pacman/ The OSG ITB software cache is at: http://software.grid.iu.edu/itb// The OSG VTB software cache is at: http://osg-vtb.uchicago.edu/vtb/ The VDT software cache for VDT 1.8.1 (used in this OSG software release) is at: http://vdt.cs.wisc.edu/vdt_181_cache Contents of the VDT software cache: http://vdt.cs.wisc.edu/releases/1.8.1/contents.html

36 When are meetings held at OSG?

37 Meetings held and scheduled for OSG http://indico.fnal.gov/categoryDisplay.py?categId=86

38 Here you can find international news regarding the Grid: http://www.isgtw.org/ Here you can find general news about the Grid: http://www.opensciencegrid.org/News_and_Events/News_Archive

39 Resources: http://www.opensciencegrid.org; OSG Facility PPT by Miron Livny; https://twiki.grid.iu.edu/twiki/bin/view/Documentation

