
Slide 1: An Edge Services Framework (ESF) for EGEE, LCG, OSG
The Open Science Grid Consortium (www.opensciencegrid.org)
The XVth International Conference on Computing in High Energy and Nuclear Physics (CHEP'06)
February 15, 2006, TIFR, Mumbai
Abhishek Singh Rana, UC San Diego
Frank Würthwein, UC San Diego

Slide 2: Authors (ESF mailing list)
MALON, David (Argonne National Laboratory, IL, USA); MAY, Ed (Argonne National Laboratory, IL, USA); POPESCU, Razvan (Brookhaven National Laboratory, Upton, NY, USA); SOTOMAYOR, Borja (University of Chicago, IL, USA); SHANK, Jim (Boston University, MA, USA); LAURE, Erwin (CERN, European Organization for Nuclear Research, Geneva, Switzerland); BIRD, Ian (CERN); SCHULZ, Markus (CERN); FIELD, Laurence (CERN); PORDES, Ruth (Fermi National Accelerator Laboratory, IL, USA); SKOW, Dane (Fermi National Accelerator Laboratory, IL, USA); LITMAATH, Maarten (CERN); CAMPANA, Simone (CERN); WENAUS, Torre (Brookhaven National Laboratory, Upton, NY, USA); SMITH, David (CERN); BLUMENFELD, Barry (Johns Hopkins University, Baltimore, MD, USA); MARTIN, Stuart (Argonne National Laboratory, IL, USA); DE, Kaushik (The University of Texas, Arlington, TX, USA); VRANICAR, Matthew (PIOCON, IL, USA); WEICHER, John (PIOCON, IL, USA); SMITH, Preston (Purdue University, IN, USA); WANG, Shaowen (University of Iowa); RANA, Abhishek Singh (University of California, San Diego, CA, USA); WUERTHWEIN, Frank (University of California, San Diego, CA, USA); GARDNER, Robert (University of Chicago, IL, USA); KEAHEY, Kate (Argonne National Laboratory, IL, USA); FREEMAN, Timothy (Argonne National Laboratory, IL, USA); VANIACHINE, Alexandre (Argonne National Laboratory, IL, USA); HOLZMAN, Burt (Fermi National Accelerator Laboratory, IL, USA)

Slide 3: Outline
– ESF Activity
– ESF Phase 1: Concepts and Design
– ESF future direction
– Xen overview
– Phase 1 Status
– Next Steps

Slide 4: Vision

Slide 5: Can there be a shared Services Framework that makes site admins happy?
– No login access for strangers.
– Isolation of services:
  – VOs cannot affect each other.
  – VOs receive a strictly controlled environment.
– Encapsulation of services:
  – Service instances can receive a security review by the site before they get installed.
– Explore a solution based on virtual machines.

Slide 6: OSG-ESF Activity
– Started in September.
– Physicists, Computer Scientists & Engineers, Software Architects.
– Chairs: Kate Keahey and Abhishek Singh Rana.
– Workspace Services Architecture and Design: Globus Alliance and UC San Diego.
– Edge Services Implementations:
  – USATLAS: teams at U Chicago and ANL.
  – USCMS: teams at UC San Diego and FNAL.
– Mailing List and Discussion Forum.
– Web collaborative area.

Slide 7: ESF - Phase 1

Slide 8: No ESF - Phase 0 (diagram: a Site with a CE and an SE).

Slide 9: No ESF - Phase 0 (diagram: a Site with a CE and an SE, plus CMS, ATLAS, and CDF services). Caption: Static Deployment of VO Services on a Site.

Slide 10: ESF? (diagram: a Site with a CE and an SE).

Slide 11: ESF - Phase 1 (diagram: a Site with a CE, an SE, and an ESF hosting wafers for CDF, CMS, ATLAS, and a Guest VO). Caption: Snapshot of ES Wafers implemented as Virtual Workspaces.

Slide 12: An attempt at ESF Terminology
– Edge Services Wafer (ES Wafer)
  – A specific instance of a dynamically created VM (workspace) is called an Edge Services Wafer.
  – An ES Wafer can have several Edge Services running.
  – A VO can have multiple ES Wafers up at a Site.
– Edge Services Slot (ES Slot)
  – An ES Slot has hardware characteristics specified by the Site Admin.
  – An ES Slot can be leased by a VO to host an ES Wafer.
– Edge Service (ES)
  – A VO-specific service instantiated by a VO in a Wafer.
– Workspace Service (WS)
  – A service at a Site that allows VOs to instantiate ES Wafers in ES Slots.
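As an aid to reading the diagrams that follow, here is a minimal, purely illustrative Python sketch of how these four terms relate; all class names, fields, and the lease() helper are assumptions made for this example and are not part of the GT4 Workspace Service interface.

    from dataclasses import dataclass, field
    from typing import List, Optional

    @dataclass
    class EdgeService:
        # A VO-specific service instantiated by a VO inside a Wafer,
        # e.g. a FroNTier/Squid cache or a MySQL database.
        name: str
        port: int

    @dataclass
    class ESWafer:
        # A specific instance of a dynamically created VM (workspace),
        # owned by one VO and able to run several Edge Services.
        vo: str
        image: str                                    # filesystem image it boots from
        services: List[EdgeService] = field(default_factory=list)

    @dataclass
    class ESSlot:
        # Hardware characteristics specified by the Site Admin; leased by a VO
        # to host an ES Wafer (one Wafer per Slot in this simplified sketch).
        slot_id: str
        cpus: int
        ram_mb: int
        disk_gb: int
        wafer: Optional[ESWafer] = None

    def lease(slot: ESSlot, wafer: ESWafer) -> None:
        # Toy stand-in for what the Site's Workspace Service mediates.
        if slot.wafer is not None:
            raise RuntimeError(f"slot {slot.slot_id} already hosts a {slot.wafer.vo} wafer")
        slot.wafer = wafer

    # Example: CMS leases a Slot and runs one Edge Service in its Wafer.
    cms_wafer = ESWafer(vo="CMS", image="cms-edge-services.img",
                        services=[EdgeService("frontier-squid", 3128)])
    lease(ESSlot("esf-slot-1", cpus=2, ram_mb=4096, disk_gb=40), cms_wafer)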

Slide 13: ESF - Phase 1 (diagram: a Site with a CE, an SE, compute nodes, and storage nodes; the ESF runs the GT4 Workspace Service and VMM; dynamically deployed ES Wafers for CDF, CMS, ATLAS, and a Guest VO; wafer images stored in the SE). Caption: Snapshot of ES Wafers implemented as Virtual Workspaces.

Slide 14: ESF - Phase 1 (diagram: user jobs at compute nodes using the ES Wafers for VO Edge Services; wafers for CDF, CMS, ATLAS, and a Guest VO).

Slide 15: VO Admin transporting/storing an ES image to a remote Site... deploying the ES using the image stored in the Site's local repository.

Slides 16-26: ESF - Phase 1 (animation sequence: a CMS member acting with Role=VO Admin stages the ES image into the Site's SE and requests deployment through the ESF; each request passes a PEP; the final frame shows the resulting ES Wafer, i.e. multiple VO Services at a Site's Edge).

Slide 27: A VO User using an ES...

Slides 28-35: ESF - Phase 1 (animation sequence: a CMS member acting with Role=VO User submits work through the CE, again via a PEP; the job runs in a Resource Slice, the user execution environment at a worker node, and makes use of the VO's Edge Services in the ES Wafer).
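Both sequences route requests through a PEP that decides based on the requester's VO role: Role=VO Admin may deploy Wafers, while Role=VO User may only use the deployed services. A minimal sketch of such a check, with invented role and action names (in the real deployment the role would come from a VOMS credential), could look like this:

    # Hypothetical role-based check of the kind a PEP could apply; the role
    # strings and action names are placeholders, not actual OSG policy.
    ALLOWED_ACTIONS = {
        "VO Admin": {"deploy_wafer", "shutdown_wafer", "use_edge_service"},
        "VO User": {"use_edge_service"},
    }

    def pep_authorize(role: str, action: str) -> bool:
        return action in ALLOWED_ACTIONS.get(role, set())

    assert pep_authorize("VO Admin", "deploy_wafer")
    assert not pep_authorize("VO User", "deploy_wafer")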

Slide 36: ESF - future direction

Slide 37: ESF - future direction
– Same concept.
– Deploy a cluster of ES Slots that are fully schedulable by any VO allowed at the grid site.

Slide 38: ESF - future direction (diagram: a Site with a cluster of ES Slots with different properties; brokering and scheduling by the Edge Services Framework; dynamically deployed ES Wafers for many VOs, e.g. ATLAS1, ATLAS2, CMS, CDF).
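Reusing the ESSlot model sketched after Slide 12, the brokering implied here can be pictured as simple matchmaking between Wafer requests and the properties of free Slots; the matching criteria below are assumptions for illustration, not a proposed policy.

    # Illustrative matchmaking: pick the first free Slot that satisfies a
    # Wafer request; the actual brokering policy is part of the future work.
    def find_slot(slots, needed_cpus, needed_ram_mb):
        for slot in slots:
            if slot.wafer is None and slot.cpus >= needed_cpus and slot.ram_mb >= needed_ram_mb:
                return slot
        return None  # no suitable Slot: the request waits or is rejected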

Slide 39: Xen overview (diagram: hardware, the Virtual Machine Monitor (VMM), and the public and private networks).

Slide 40: Phase 1 on OSG

Slide 41: Phase 1 on OSG
– ATLAS & CMS procure one piece of hardware at their Sites on OSG that runs ESF (called the ESF node).
  – Dual CPU recommended.
  – 4 GB RAM (Xen 2 has no PAE support; Xen 3 does).
– Site administrators install:
  – Xen (Xen 2.0.7, Xen 3.0.0).
  – GT4 Workspace Service.
– VO administrators use ESF to fire up Xen VMs that instantiate VO Services, i.e. Edge Services in an ES Wafer.
– A single ESF node hosts ES Wafers for both ATLAS & CMS.
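Since the 4 GB RAM recommendation hinges on PAE support, a site could sanity-check a candidate ESF node with a small Linux-only script like the one below; the /proc locations are standard, but the thresholds are only the recommendations quoted on this slide.

    # Rough check of the ESF-node recommendations on a Linux host:
    # dual CPU, 4 GB RAM, and a PAE-capable CPU (needed to address 4 GB
    # of RAM with a 32-bit Xen 3 hypervisor).
    import os

    def has_pae():
        # The 'pae' CPU flag appears on the 'flags' line of /proc/cpuinfo on x86.
        with open("/proc/cpuinfo") as f:
            for line in f:
                if line.startswith("flags"):
                    return "pae" in line.split()
        return False

    def mem_total_mb():
        with open("/proc/meminfo") as f:
            for line in f:
                if line.startswith("MemTotal:"):
                    return int(line.split()[1]) // 1024
        return 0

    if __name__ == "__main__":
        print("CPUs:", os.cpu_count(), "PAE:", has_pae(), "RAM (MB):", mem_total_mb())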

Slide 42: Site Administrator Responsibilities
– Deploy:
  – Xen.
  – Custom kernel for domain 0 (GRUB bootloader required).
  – Custom kernel for domain U.
  – Prepare a RAM disk image if needed.
  – GT4.
  – GT4 Workspace Service.
– Provision:
  – One public IP and one private IP per VM.
  – Host certificates per VM.
  – Disk space per VM.
– Declare the available ES Slots and their properties to ESF.
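As a rough illustration of the provisioning items above, a site admin might verify the per-VM IP pair, host credentials, and disk space before declaring a Slot to ESF; the paths in the example are hypothetical, and the real Slot declaration lives in the Workspace Service configuration.

    import os
    import shutil

    def check_slot(public_ip, private_ip, host_cert, host_key, vm_disk_dir, min_free_gb):
        # Checks the per-VM provisioning listed on this slide: an IP pair,
        # a host certificate/key, and enough disk space for the wafer image.
        problems = []
        if not (public_ip and private_ip):
            problems.append("missing IP pair")
        for path in (host_cert, host_key):
            if not os.path.isfile(path):
                problems.append("missing credential: " + path)
        free_gb = shutil.disk_usage(vm_disk_dir).free // 2**30
        if free_gb < min_free_gb:
            problems.append("only %d GB free in %s" % (free_gb, vm_disk_dir))
        return problems

    # Hypothetical example paths:
    print(check_slot("192.0.2.10", "10.0.0.10",
                     "/etc/grid-security/vm1/hostcert.pem",
                     "/etc/grid-security/vm1/hostkey.pem",
                     "/var/lib/xen/images", min_free_gb=20))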

Slide 43: VO Administrator Responsibilities
– Fetch a standard OS filesystem image from a central ESF repository.
– Deploy the desired service on the OS filesystem image, and thus prepare (freeze) the ES Wafer instance.
– Develop portable methods to dynamically configure all networking properties at a remote Site; package these.
– Prepare the image as a file for transport.
– srmcp the image to the remote Site's SE.
– Use ESF to fire up a Xen VM with the VO Services (the ES Wafer) at the remote Site, from the image file in the remote SE, using role-based authorization.
– Advertise the running Edge Services as needed.
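The transport and fire-up steps could be scripted roughly as follows. The srmcp call follows its usual source/destination form, but the SRM endpoint is a placeholder, and the workspace-deploy command is purely hypothetical, standing in for the actual GT4 workspace client invocation.

    import subprocess

    def ship_and_deploy(image_path, srm_destination, deploy_cmd):
        # 1. Copy the frozen Wafer image into the remote Site's SE via SRM.
        #    srmcp takes a source URL and a destination URL; image_path must be
        #    absolute so that "file://" + image_path is a valid file URL.
        subprocess.check_call(["srmcp", "file://" + image_path, srm_destination])
        # 2. Ask the Site's GT4 Workspace Service to boot a Xen VM from the
        #    staged image. deploy_cmd stands in for the real workspace client
        #    invocation, which is authorized against the VO Admin role.
        subprocess.check_call(deploy_cmd)

    if __name__ == "__main__":
        ship_and_deploy(
            "/tmp/cms-edge-services.img",
            "srm://se.example.org:8443/data/cms/esf/cms-edge-services.img",
            ["workspace-deploy", "--image", "cms-edge-services.img"],  # placeholder command
        )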

Slide 44: Status
– New features added to the GT4 Workspace Service.
– First prototype of ESF, with an Integration testbed (Xen 2.0.7) consisting of sites at ANL, FNAL, UCSD, and U Chicago, and a Production testbed (Xen 3.0.0) with a site at UCSD.
– Pure OS filesystem images: SL3.0.3, SL4, LTS 3, LTS 4, FC4, CentOS4.
– USCMS Edge Service: FroNTier (Squid db).
– USATLAS Edge Service: DASH (MySQL db).
– General Edge Service: a subset of an OSG 0.4 CE.
– Stress/throughput testing performed at ANL and UCSD.
– Based in part on the above results, a publication was submitted for peer review to IEEE HPDC-15.

Slide 45: Partial list of features added to GT4 WSS (WSS Release: VM Technology Preview 1.1)
– Support for a new "allocate" networking method that allows the workspace service administrator to specify pools of IP addresses (and DNS information) which are then assigned to virtual machines on deployment.
– The resource properties have been extended to publish deployment information about a workspace, such as its IP address.
– Workspace metadata validation has been extended to support requirement checking for specific architecture, Xen version, and CPU. The workspace factory advertises the supported qualities as a resource property; the requirement section of the workspace metadata is checked against the supported set.
– The workspace service can now accept and process VOMS credentials and GridShib SAML attributes.
– Support for Xen 3 has been added.
– The workspace client interface has been extended to enable subscribing for notifications and specifying the resource allocation information at the command line.
– Installation has been improved: the client now requires only a minimal installation (as opposed to the full service installation).
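The requirement-checking feature amounts to comparing what a workspace's metadata asks for against the qualities the factory advertises. The real check operates on the workspace XML metadata and resource properties; the dictionaries below are only a model of that logic.

    # Model of the metadata requirement check: the workspace factory advertises
    # supported qualities; a deployment request is rejected if any requirement
    # falls outside the advertised set.
    FACTORY_ADVERTISED = {
        "architecture": {"x86"},
        "xen_version": {"2.0.7", "3.0.0"},
    }

    def requirements_met(requirements, advertised=FACTORY_ADVERTISED):
        return all(value in advertised.get(key, set())
                   for key, value in requirements.items())

    print(requirements_met({"architecture": "x86", "xen_version": "3.0.0"}))  # True
    print(requirements_met({"architecture": "x86_64"}))                       # False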

Slide 46: Next Steps
– Verify performance, functionality, robustness.
– Gain production use experience.
  – CDF is capable of failover operations between multiple Squids, thus allowing production use experience without negative impact on users (a failover sketch follows this slide).
  – Example Squid use cases: DB cache (FroNTier); application tarball serving (see the glideCAF & OSG-CAF presentations); Parrot-based CDF software mounts.
– Further evolve the GT4 Workspace Service design.
– Widen deployment to more USCMS and USATLAS sites, using CMS & ATLAS services as use cases.
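The CDF failover point can be illustrated by a client that simply tries a list of Squid proxies in order; the proxy hostnames are placeholders, and the real FroNTier/CDF clients implement their own failover logic.

    import urllib.request

    SQUIDS = ["http://squid1.example.org:3128", "http://squid2.example.org:3128"]

    def fetch_via_squids(url, proxies=SQUIDS, timeout=10):
        # Try each Squid in turn; if one is down, fall back to the next, so a
        # single edge-service outage does not disrupt production use.
        last_error = None
        for proxy in proxies:
            opener = urllib.request.build_opener(
                urllib.request.ProxyHandler({"http": proxy}))
            try:
                return opener.open(url, timeout=timeout).read()
            except OSError as exc:
                last_error = exc
        raise RuntimeError("all squids failed: %s" % last_error)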


Slide 48: Thank You.