LHCb Grid Activities in UK
Grid WP8 Meeting, 16th November 2000
Glenn Patrick (RAL)

Slide 2 - Aims
In June, LHCb formed a Grid technical working group to:
- Initiate and co-ordinate "user" and production activity in this area.
- Provide practical experience with Grid tools.
- Lead into longer-term LHCb applications.
Initially based around the UK facilities at RAL and Liverpool, as well as CERN, but other countries/institutes are gradually joining in: IN2P3, NIKHEF, INFN, ...

Slide 3 - Initial LHCb-UK "Testbed"
[Diagram of the initial LHCb-UK "Testbed" institutes and the RAL DataGrid Testbed]
- Exists: RAL CSF (120 Linux CPUs, IBM 3494 tape robot); Liverpool MAP (300 Linux CPUs); CERN (pcrd25.cern.ch, lxplus009.cern.ch).
- Planned: RAL (PPD), Bristol, Imperial College, Oxford, Glasgow/Edinburgh "Proto-Tier 2".

Slide 4 - Initial Architecture
- Based around existing production facilities (separate DataGrid testbed facilities will eventually exist).
- Intel PCs running Linux Red Hat 6.1.
- Mixture of batch systems (LSF at CERN, PBS at RAL, FCS at MAP).
- Globus everywhere.
- Standard file transfer tools (e.g. globus-rcp, GSIFTP).
- GASS servers for secondary storage?
- Java tools for controlling production, bookkeeping, etc.
- MDS/LDAP for the bookkeeping database(s) - an illustrative query is sketched below.
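A minimal sketch of what such an MDS/LDAP lookup could look like from the command line, assuming a GRIS runs on the RAL front-end named later in these slides; the port (2135) and base DN are assumptions that varied between Globus releases, and the filter is purely illustrative:
    ldapsearch -x -h csflnx01.rl.ac.uk -p 2135 \
        -b "o=Grid" "(objectclass=*)"        # dump the information published by the site GRIS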

Slide 5 - LHCb Applications
Use existing LHCb code...
- Distributed Monte Carlo production:
    SICBMC - LHCb simulation program.
    SICBDST - LHCb digitisation/reconstruction programs, Brunel & Gaudi.
- Distributed analysis: with datasets stored at different centres - eventually.

Slide 6 - Interactive Tests
- Accounts, Globus certificates and gridmap entries set up for a small group of people at the RAL and Liverpool sites (problems at CERN for external people).
- Globus set up at all sites.
- Managed to remotely submit scripts and run the SICBMC executable between sites via globus-job-run (a command sketch follows below):
    from CERN to RAL and MAP
    from RAL to CERN and MAP
    from MAP to CERN and RAL
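A hedged sketch of one such interactive test; the CERN host is taken from the testbed slide, while grid-proxy-init and the wrapper script runsicbmc.sh are assumptions about how the SICBMC run was packaged:
    grid-proxy-init                                   # create a short-lived proxy from the user certificate
    globus-job-run pcrd25.cern.ch /bin/hostname       # sanity check: run a trivial command at CERN
    globus-job-run pcrd25.cern.ch /bin/sh -c "./runsicbmc.sh"   # hypothetical wrapper around the SICBMC executable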

Slide 7 - Batch Tests
- Mixture of batch systems - LSF at CERN, PBS at RAL, FCS at MAP.
- LHCb batch jobs remotely run on the RAL CSF farm (120 Linux nodes) using PBS (A. Sansum) via (round trip sketched below):
    globus-job-submit csflnx01.rl.ac.uk/jobmanager-pbs
    globus-job-get-output
- CERN - access to LSF?
- MAP - setting up LHCb software this week. Need to interface FCS/mapsub to Globus?
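The batch round trip written out as commands; the jobmanager contact is the one quoted above, while the wrapper script and the use of globus-job-status for polling are assumptions:
    globus-job-submit csflnx01.rl.ac.uk/jobmanager-pbs /bin/sh -c "./runsicbmc.sh"
                                            # prints a job contact string identifying the PBS job
    globus-job-status <job contact>         # poll until the job reports DONE
    globus-job-get-output <job contact>     # retrieve the job's stdout once complete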

Slide 8 - Data Transfer
- Data transferred back to the originating computer via globus-rcp:
    Unreliable for large files.
    Can use a lot of temporary disk space.
    Limited proxies break scripts when running commands in batch.
- At RAL, now using GSIFTP (uses Globus authentication; an example transfer is sketched below):
    Availability of the GSI toolkit at other UK sites & CERN?
    Consistency of user/security interface.
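A hedged sketch of the GSIFTP alternative using globus-url-copy; the RAL hostname is reused from the batch tests, but both file paths are illustrative assumptions:
    grid-proxy-init                                    # GSIFTP authenticates with the same Globus proxy
    globus-url-copy gsiftp://csflnx01.rl.ac.uk/home/lhcb/output.dat \
                    file:///scratch/lhcb/output.dat    # pull a result file back from RAL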

Slide 9 - Secondary/Tertiary Storage
- RAL DataStore (IBM 3494, 30 TB tape robot) interfaced to a prototype GASS server and accessible via Globus (Tim Folkes).
- Allows remote access to LHCb tapes (750 MB) via pseudo-URLs, e.g.:
    globus-rcp rs6ktest.cis.rl.ac.uk:/atlasdatastore/lhcb/L42426 myfile
    globus-url-copy ...
    globus-gass-cache-add ...
- Location of gass_cache can vary with command (server or client side) - a space issue for large-scale transfers.
- Also, the C API interface can be used from code.
(A GASS-server variant of the transfer is sketched below.)
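As a hedged illustration, the same tape-resident file could instead be pulled through the prototype GASS server with globus-url-copy; the https scheme, the <port> placeholder and the local destination path are all assumptions about how the server is exposed:
    globus-url-copy https://rs6ktest.cis.rl.ac.uk:<port>/atlasdatastore/lhcb/L42426 \
                    file:///scratch/lhcb/L42426     # <port> = whatever the prototype GASS server listens on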

Slide 10 - Networking Bottlenecks?
[Network schematic only: Univ. Dept -> Campus -> MAN -> SuperJANET backbone -> London -> TEN-155 -> CERN, with RAL attached to the backbone. Link speeds shown range from 34 and 100 Mbit/s up to 155 and 622 Mbit/s; SuperJANET III backbone 155 Mbit/s (SuperJANET IV 2.5 Gbit/s), with a 622 Mbit/s upgrade marked for March 2001.]
Need to study/measure for data transfer and replication within the UK and to CERN.

Slide 11 - Problems/Issues
- "Seamless" certification and authorisation of all people across all sites (certificates, gridmap files, etc.) with a common set of tools.
- Getting people set up on CERN facilities (need a defined technical contact) - central to any LHCb Grid work.
- Remote access into the various batch systems (testbeds to cure?).
- Role of wide-area filesystems like AFS.
- But at least we now have some practical user experience in a "physics" environment. This has helped to shake down the UK systems and provide valuable feedback.

Slide 12 - Next?
- Migrate LHCb production to Linux (mainly NT until now).
- Start to adapt the standard LHCb production scripts to use Globus tools.
- Modify the bookkeeping to make use of LDAP; MDS & access to metadata/services at different sites.
- Gradually adapt LHCb software to be Grid-aware.
- Try "production-style" runs end 2000/start 2001?
- Expand to include other centres/countries in the "LHCb Grid", using the available production systems and testbeds.