UK GRID Activities
Glenn Patrick
LHCb Grid Meeting, Liverpool, 11.09.00


Not particularly knowledgeable; these comments are just based on attending three meetings:
- UK-HEP Globus meeting (RAL, 9th August)
- UK-Grid meeting (Cosener's House)
- UK-Grid Testbed meeting (RAL)
The Globus meeting on 9th August was the most useful, especially from the technical point of view.

Globus Activities: RAL
- Globus installed on RAL-CSF (RedHat, PBS) with 2 gatekeepers (Andrew will cover).
- Interfacing the GASS server to the RAL DataStore (T. Folkes). Backend code has been written, and various technical issues identified concerning the globus-cache, proxies and file opens. Users to access files using a pseudo path name?
- Gaining experience with GSI-ftp and GSI-ssh (B. Saunders). Now working on PPD Linux machines.
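As a reminder of what "gaining experience with GSI-ssh" involves in practice, a minimal session might look like the sketch below. The hostname is a placeholder, and exact command options varied between Globus releases; this assumes a working Globus/GSI installation with a valid user certificate.

```shell
# Create a short-lived proxy credential from the user's grid certificate
# (prompts for the private-key pass phrase).
grid-proxy-init

# Inspect the proxy that was created (subject, remaining lifetime).
grid-proxy-info

# Log in to a remote GSI-enabled host, authenticating with the proxy
# rather than a password (hostname is a placeholder).
gsissh csflnx01.rl.ac.uk

# Remove the proxy when finished.
grid-proxy-destroy
```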

Globus Activities II: Manchester (Andrew McNab)
- Globus packaged in RPM format (built on RH6.2 / Globus 1.1.3). An interim measure, as Globus is supposed to be adopting RPM in future.
- GSI-ssh, GSI-ftp and GSI-gdm also being packaged as RPMs.
- "Grid-aware ROOT": a first attempt at calling the GASS client API from ROOT by modifying the TFile class. However, ROOT already has mechanisms for remote files, and its next version will add Grid files to the list of recognised protocols. Need MDS/LDAP names rather than URLs.
- Standardising the GASS cache on Manchester machines: a spool area for each user.
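For site admins picking up the Manchester packages, deployment reduces to standard RPM operations, roughly as sketched here. The package file name is a placeholder for whatever the actual build is called.

```shell
# List the contents of the packaged Globus RPM before installing
# (file name is a placeholder for the Manchester RH6.2 build).
rpm -qpl globus-1.1.3-1.i386.rpm

# Install, or upgrade an existing installation, on a farm node.
rpm -Uvh globus-1.1.3-1.i386.rpm

# Later: verify the installed files against the RPM database.
rpm -V globus
```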

Globus Activities III: QMW (Alex Martin)
- Globus installed on several Linux boxes. PBS installed (packaged as an RPM).
- Developed a simple "quote" system using the LDAP protocol and Perl scripts. Currently it only works for a single job, and is based only on the CPU requirement. What is really required in the wider batch world?

Tasks
- A common kit of parts / Globus distribution (e.g. RPMs).
- A solution for handling certificates and grid-mapfiles.
- A common UKHEP GIIS service (gateway index)? Security implications.
Next meeting: 20th September (Manchester).
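The kind of LDAP query such a quote system would make against a site's MDS/GRIS service can be sketched as follows. The hostname, search base, filter and attribute names are all placeholders: the MDS schema and default port varied between Globus releases, so this is illustrative only.

```shell
# Ask a site's MDS/GRIS server (LDAP, conventionally port 2135) about
# its hosts, roughly what a "quote" system would do before deciding
# where a job fits. All names here are placeholders.
ldapsearch -h grid01.ph.qmw.ac.uk -p 2135 -b "o=Grid" \
    "(objectclass=GlobusHost)"
```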

LHC(UK) Proto-Centres / Testbeds?
- RAL Tier 1 (ALICE/ATLAS/CMS/LHCb): submitted to JIF in February for £5.9M. Outcome in November(?).
- Liverpool MAP/COMPASS (LHCb/ATLAS): funded by JREI; operational. Upgrade requested.
- Glasgow/Edinburgh Computing Centre (LHCb/ATLAS): submitted to JREI in May. Outcome known in ~December.
- Manchester-UCL-Sheffield (ATLAS WW scattering?): JREI bid. 32 CPUs + 5 TB disk.
- Bristol (BaBar/CMS/LHCb): 8-node Linux farm, later 32 nodes plus a storage server.
- Birmingham (ALICE): funded by JREI (1999). Farm and disk storage.
- Others?

UK GRID Organisation
Visible people seem to fall into two main camps:
- Potential "exploiters" for experiments and applications.
- System managers installing and developing Globus, etc.
Various people are then involved in the DataGrid work packages, but the organisation of UK middleware work is not clear (to me).
Significant funds are supposed to go into the Grid and e-science. Some (hierarchical) structures have been proposed, along the lines of:
PPARC Grid Steering Committee
  Particle Physics Grid Management Board
    Technical Work Groups
      Sub-groups

RAL GRID Organisation
Up until now there has been a "CLRC Grid Team", originally based around PPD+ITD, which has gradually pulled in other departments and sciences. It is now about to be split into:
- an e-science forum for all of CLRC;
- a Particle Physics Grid Team.
It is not yet clear how this maps onto existing structures, or how it affects the effort available for LHCb applications.