Slide 1: LHCb Grid Activities in UK
LHCb(UK) Meeting, Cambridge, 10th January 2001
Glenn Patrick (RAL)
http://hepwww.rl.ac.uk/lhcb/computing/lhcbuk100101.ppt
Slide 2: Aims
In June, LHCb formed a Grid technical working group to:
- Initiate and co-ordinate user and production activity in this area.
- Provide practical experience in Grid tools.
- Lead into longer-term LHCb applications.
Initially based around UK facilities at RAL and Liverpool, as well as CERN, but other countries/institutes (IN2P3, NIKHEF, INFN …) are gradually joining in.
Slide 3: Initial LHCb-UK Testbed
[Schematic of testbed sites, each marked as existing or planned]
- RAL CSF: 120 Linux CPUs, IBM 3494 tape robot (exists)
- Liverpool MAP: 300 Linux CPUs (exists)
- CERN: pcrd25.cern.ch, lxplus009.cern.ch (exists)
- RAL DataGrid Testbed (planned)
- Glasgow/Edinburgh: proto-Tier 2 (planned)
- Institutes: RAL (PPD), Bristol, Imperial College, Oxford, Cambridge
Slide 4: Initial Architecture
- Based around existing production facilities (separate DataGrid testbed facilities will eventually exist).
- Intel PCs running Linux RedHat 6.1.
- Mixture of batch systems (LSF at CERN, PBS at RAL, FCS at MAP).
- Globus 1.1.3 everywhere.
- Standard file transfer tools (e.g. globus-rcp, GSIFTP).
- GASS servers for secondary storage?
- Java tools for controlling production, bookkeeping, etc.
- MDS/LDAP for bookkeeping database(s).
Slide 5: LHCb Applications
Use existing LHCb code:
- Distributed Monte-Carlo production:
  - SICBMC: LHCb simulation program
  - SICBDST: LHCb digitisation/reconstruction programs
  - Brunel & Gaudi
- Distributed analysis: with datasets stored at different centres, eventually.
Slide 6: Interactive Tests
- Accounts, Globus certificates and gridmap entries set up for a small group of people at the RAL and Liverpool sites (problems at CERN for external people).
- Globus 1.1.3 set up at all sites.
- Managed to remotely submit scripts and run the SICBMC executable between site front-ends via grid-proxy-init and globus-job-run:
  - from CERN to RAL and MAP
  - from RAL to CERN and MAP
  - from MAP to CERN and RAL
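A minimal sketch of the interactive path described above, using the Globus command-line tools of that era; the target hosts are front-ends named elsewhere in these slides, and the script name is purely illustrative.

```shell
# Create a short-lived proxy credential from the user's Globus
# certificate (prompts for the certificate passphrase).
grid-proxy-init

# Run a simple command on a remote gatekeeper to check access
# (front-end host from the RAL slides; command is illustrative).
globus-job-run csflnx01.rl.ac.uk /bin/hostname

# Stage a local production script to the remote side and run it there
# (hypothetical script name; -s stages the executable from the
# submitting machine).
globus-job-run lxplus009.cern.ch -s ./run_sicbmc.sh
```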
Slide 7: Batch Tests
- Mixture of batch systems: LSF at CERN, PBS at RAL, FCS at MAP.
- LHCb batch jobs remotely run on the RAL-CSF farm (120 Linux nodes) using PBS (A. Sansum) via:
  globus-job-submit csflnx01.rl.ac.uk/jobmanager-pbs
  globus-job-get-output
- CERN: access to LSF?
- MAP: LHCb production software set up. Need to interface FCS/mapsub to Globus?
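The batch path above can be sketched as follows; the jobmanager contact string is the one quoted on the slide, while the job script name is a hypothetical stand-in for an LHCb production script.

```shell
# Submit through the PBS jobmanager on the RAL-CSF front-end; the
# command prints a job contact URL used to track the job afterwards.
JOB=$(globus-job-submit csflnx01.rl.ac.uk/jobmanager-pbs \
      /bin/sh sicbmc_job.sh)

# Check the job state, then fetch its stdout/stderr once finished.
globus-job-status "$JOB"
globus-job-get-output "$JOB"
```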
Slide 8: Data Transfer
- Data transferred back to the originating computer via globus-rcp:
  - Unreliable for large files.
  - Can use a lot of temporary disk space.
  - Limited proxies can break scripts when running commands in batch.
- At RAL, now using GSIFTP (uses Globus authentication):
  - Availability of the GSI toolkit at other UK sites and CERN?
  - Consistency of user/security interface (now improved).
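For comparison, the two transfer routes mentioned above in sketch form. The file paths are assumptions, and whether globus-url-copy handled gsiftp URLs in this toolkit version is also an assumption; the commands show the shape of each route, not a verified recipe.

```shell
# globus-rcp: remote-copy semantics, but unreliable for large files
# and heavy on temporary disk space (paths are illustrative).
globus-rcp csflnx01.rl.ac.uk:/scratch/lhcb/output.dst output.dst

# GSIFTP: FTP with Globus (GSI) authentication; the same file
# pulled back through a gsiftp URL instead.
globus-url-copy gsiftp://csflnx01.rl.ac.uk/scratch/lhcb/output.dst \
    file://$PWD/output.dst
```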
Slide 9: Secondary/Tertiary Storage
- RAL DataStore (IBM 3494, 30TB tape robot) interfaced to a prototype GASS server and accessible via Globus (Tim Folkes).
- Allows remote access to LHCb tapes (750MB) via pseudo URLs:
  globus-rcp rs6ktest.cis.rl.ac.uk:/atlasdatastore/lhcb/L42426 myfile
  globus-url-copy...
  globus-gass-cache-add...
- Location of gass_cache can vary with command (server or client side): a space issue for large-scale transfers!
- Also, can use the C API interface from code.
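The pseudo-URL access route can be sketched as below. The first command is the one given on the slide; the URL form used by the other two is an assumption about how the prototype GASS server exposed the same tape file.

```shell
# Pull a tape-resident file from the DataStore GASS server
# (verbatim from the slide).
globus-rcp rs6ktest.cis.rl.ac.uk:/atlasdatastore/lhcb/L42426 myfile

# Equivalent fetch via an explicit GASS URL (URL form is assumed).
globus-url-copy https://rs6ktest.cis.rl.ac.uk/atlasdatastore/lhcb/L42426 \
    file://$PWD/myfile

# Or stage it into the local GASS cache for later reuse; note the
# cache-location/space caveat on the slide for large transfers.
globus-gass-cache-add https://rs6ktest.cis.rl.ac.uk/atlasdatastore/lhcb/L42426
```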
Slide 10: Networking Bottlenecks?
[Schematic only: network path from Univ. Dept and campus through a MAN to the London SuperJANET backbone, RAL, and on to CERN via TEN-155, with link speeds ranging from 34 Mbit/s to 622 Mbit/s]
- SuperJANET III backbone: 155 Mbit/s (SuperJANET IV: 2.5 Gbit/s).
- 622 Mbit/s access from March 2001.
- Need to study/measure for data transfer and replication within the UK and to CERN.
Slide 11: Data and Code Replication?
- GDMP (WP2): asynchronous replication of Objectivity federations on top of Globus (Stockinger, Samar - CMS):
  gdmp_replicate_file_get
  gdmp_publish_catalogue
- General replication tool for Objectivity, Root & ZEBRA files, integrating Grid-FTP (end of January 2001). Available in the INFN Installation Toolkit.
- Kickstart & Update Kits (WP8): installation of experiment environments on CPU farms? (e.g. done for CMS production executables.)
- Experiments encouraged to provide these for the WP8 and WP6 testbeds...
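The GDMP publish/subscribe cycle described above, in outline. Both commands appear on the slide; any options they take are omitted here, and the producer/consumer split is the standard GDMP usage pattern rather than something the slide spells out.

```shell
# Producer site: register newly written files in the local export
# catalogue and notify subscribed sites that they are available.
gdmp_publish_catalogue

# Consumer site: pull in the files newly published by the sites
# it subscribes to.
gdmp_replicate_file_get
```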
Slide 12: Problems/Issues
- Seamless certification and authorisation of all people across all sites (certificates, gridmap files, etc.) with a common set of tools: UK Certificate Authority (RAL) and UK Globus RPM distribution (Manchester).
- Getting people set up on CERN facilities (need a defined technical contact). Central to any LHCb grid work.
- Remote access into the various batch systems (testbed0 systems to cure?).
- Role of AFS (e.g. Kerberos security at CERN)?
- But at least we now have some practical user experience in a physics environment. It has helped to shake down UK systems and provide initial feedback.
Slide 13: Next?
- Access to the meta-data catalogue and data at different sites. Role of LDAP in book-keeping? Storage model?
- Evaluate networking performance between LHCb sites and test replication tools like GSIFTP & GDMP.
- MDS/GRIS: menu of LHCb(UK) physics/computing services, quote systems for batch farms, Grid fabric.
- Standard Grid interfaces/LHCb code at all UK sites.
- Adapt LHCb software and production scripts to be Grid-aware.
- Try production-style runs using available systems and testbeds.
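Since MDS/GRIS is LDAP underneath, an LHCb(UK) "menu" of services could in principle be browsed with a plain LDAP query; the host, port and base DN below are assumptions (2135 was the conventional GRIS port, "o=Grid" the conventional MDS base), and the filter simply dumps everything published.

```shell
# Query a site GRIS for all entries it publishes under the Grid
# organisation (host/port/base are illustrative assumptions).
ldapsearch -h csflnx01.rl.ac.uk -p 2135 -x -b "o=Grid" "(objectclass=*)"
```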