Rutherford Appleton Lab, UK. VOBox Considerations from GridPP. GridPP DTeam Meeting. Wed Sep 13th 2005.

Presentation transcript:

Rutherford Appleton Lab, UK. VOBox Considerations from GridPP. GridPP DTeam Meeting. Wed Sep 13th 2005.

Introduction
VOBoxes enable LHC VO software managers to use gsissh to log in to a node at a site and to install and enable a persistent process there. The following pages outline the constraints under which VOBoxes can be operated at GridPP sites and the consequences of their installation.
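To make "install and enable a persistent process" concrete, here is a minimal sketch of the kind of long-running agent a VO software manager might leave behind after logging in with gsissh. It is purely illustrative: the log file name, polling interval and the do_vo_work placeholder are assumptions, not part of any LCG release.

```python
#!/usr/bin/env python
"""Minimal sketch of a persistent VO agent of the kind a software manager
might start on a VOBox after logging in with gsissh (illustrative only)."""

import logging
import time

LOG_FILE = "vo-agent.log"   # assumed to live in the VO account's home area
POLL_INTERVAL = 300         # seconds between work cycles (assumed)


def do_vo_work():
    """Placeholder for the VO-specific task, e.g. refreshing a software-tag
    catalogue or polling a central task queue."""
    logging.info("work cycle completed")


def main():
    logging.basicConfig(filename=LOG_FILE, level=logging.INFO,
                        format="%(asctime)s %(levelname)s %(message)s")
    logging.info("VO agent starting")
    while True:
        try:
            do_vo_work()
        except Exception:
            # Keep the agent alive across transient failures, but record them
            # so the site can see what the service has been doing.
            logging.exception("work cycle failed")
        time.sleep(POLL_INTERVAL)


if __name__ == "__main__":
    main()
```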

VO Services
A VO service is a persistent service installed by a small, named set of users. VO services running inside VOBoxes must be well defined:
– A description of any such service is required by sites. Port numbers must be divulged in advance.
– Does the service use authentication? If not, why not? e.g. a read-only web server might be acceptable.
– What logging is provided? Can the site find out who accessed the service at any time in the last 6 months?
– For a VO service that collects information: what is being monitored, what is being stored, and what is being published and to whom?
– Access to a VO service should only be given to valid VO members.
– Ports are likely to be enforced via iptables (including restrictions by userid) and/or the site firewall.
– VO services will run only under the VO's user account at all times.
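One way a site could enforce the "declared ports" and "VO account only" constraints above is to generate its firewall rules directly from the service description the VO supplies. The sketch below assumes a hypothetical description format; the VO names, service names, ports and account names are invented, and a real site would fold the output into its own iptables policy.

```python
#!/usr/bin/env python
"""Sketch: turn declared VO service descriptions into iptables commands.
The service entries below are invented examples, not real VO requirements."""

# What a site might ask each VO to divulge in advance.
DECLARED_SERVICES = [
    {"vo": "alice", "name": "xfer-agent", "port": 8084, "account": "alicesgm"},
    {"vo": "cms",   "name": "monitor",    "port": 8443, "account": "cmssgm"},
]


def iptables_rules(services):
    """Yield shell commands that open only the declared inbound TCP ports
    and accept outbound traffic only when it comes from the VO's account
    (assuming the chains otherwise default to DROP)."""
    for svc in services:
        yield (f"iptables -A INPUT -p tcp --dport {svc['port']} "
               f"-m comment --comment '{svc['vo']} {svc['name']}' -j ACCEPT")
        # The iptables 'owner' match applies to locally generated packets,
        # so it can tie outbound connections to the VO user account.
        yield (f"iptables -A OUTPUT -p tcp -m owner "
               f"--uid-owner {svc['account']} -j ACCEPT")


if __name__ == "__main__":
    for rule in iptables_rules(DECLARED_SERVICES):
        print(rule)
```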

Access Rights
VO boxes are expected not to provide a back door to the fabric:
– Access to compute elements is still via the gatekeeper (GK).
– Access to storage is via SRM.
If they do, then this must be made clear:
– e.g. CMS may want to reorder jobs.
– VOs want access to the software area.

Multiple VOs
A VO box must be able to be shared by multiple VOs:
– Sites are not convinced that allocating 2 CPUs per VO is a good use of resources, though a site will add more if they are busy.
– In reality this just requires that port numbers can be changed for any VO service.
The deployed VO box already supports multiple VOs. The alternatives:
– 4 experiments × 20 T2 sites = 80 nodes for LHC in the UK is not tenable.
– Sites will drop VOs.
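The point that "port numbers can be changed for any VO service" amounts to avoiding collisions when several VOs share one box. A minimal sketch of a per-VO port allocation is shown below; the VO names, service names and port range are assumptions for illustration.

```python
#!/usr/bin/env python
"""Sketch: give each VO service on a shared VOBox its own port.
VO names, service names and the base port are illustrative assumptions."""

BASE_PORT = 20000   # assumed site-chosen range reserved for VO services

REQUESTS = [
    ("alice", "agent"),
    ("atlas", "agent"),
    ("cms",   "transfer-agent"),
    ("lhcb",  "dirac-agent"),
]


def allocate_ports(requests, base=BASE_PORT):
    """Return a {(vo, service): port} map with no clashes, so the same
    software can be run for every VO with only its port number changed."""
    return {key: base + offset for offset, key in enumerate(requests)}


if __name__ == "__main__":
    for (vo, service), port in sorted(allocate_ports(REQUESTS).items()):
        print(f"{vo:<6} {service:<16} port {port}")
```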

Service Security
Well-known VO services such as Apache or Tomcat must be well maintained. Sites may close these services if updates are not applied in a timely fashion. Sites will close the box if a service is impacting the site in a detrimental way.
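A site might also want an automated check that a VOBox is not falling behind on updates before deciding whether to close a service. The sketch below assumes an RPM/yum-based node and relies on "yum check-update" returning exit status 100 when updates are pending; the warning text and the script's own exit codes are arbitrary choices.

```python
#!/usr/bin/env python
"""Sketch: warn when a VOBox has pending package updates.
Assumes an RPM/yum-based node; 'yum check-update' exits 100 when updates
are available, 0 when there are none, and 1 on error."""

import subprocess
import sys


def pending_updates():
    """Return True if yum reports packages with updates available."""
    result = subprocess.run(["yum", "-q", "check-update"],
                            stdout=subprocess.PIPE, stderr=subprocess.PIPE)
    if result.returncode == 100:
        return True
    if result.returncode == 0:
        return False
    raise RuntimeError("yum check-update failed: "
                       + result.stderr.decode(errors="replace"))


if __name__ == "__main__":
    if pending_updates():
        print("WARNING: updates pending; apply them promptly or the site "
              "may close the service")
        sys.exit(1)
    print("VOBox packages appear up to date")
```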

Providing Service to VOs
What level of service is expected by the experiments?
– Disk backup?
A procedure must exist for node interventions:
– e.g. "In 24 hours this VOBox will be rebooted for a kernel upgrade." Who do I tell?

Overall Impressions
VOBoxes are not in the spirit of the grid:
– Sites plan to drop LHC VOs if they become difficult to support, requiring specialised resources rather than just sharing spare capacity.
– No UK site will support all 4 LHC VOs if a VO box is a requirement. Currently all LCG UK sites do support all 4 LHC VOs; for example, 19 of 20 sites are likely to drop ALICE. This will be a big shift in the vision of the GridPP grid deployment.
– We believe there will be a loss of scalability of the EGEE grid for non-LHC VOs within EGEE.
VOBoxes must be considered a short-term solution. Components and functionality should be provided by the middleware where possible.