
1 LHCb Grid Activities in UK
Grid WP8 Meeting, 16th November 2000
Glenn Patrick (RAL)

2 Aims
In June, LHCb formed a Grid technical working group to:
- Initiate and co-ordinate "user" and production activity in this area.
- Provide practical experience with Grid tools.
- Lead into longer-term LHCb applications.
Initially based around UK facilities at RAL and Liverpool, as well as CERN, but other countries/institutes are gradually joining in: IN2P3, NIKHEF, INFN, ...

3 Initial LHCb-UK "Testbed" Institutes
[Diagram of sites, each marked "exists" or "planned":]
- RAL CSF: 120 Linux CPUs, IBM 3494 tape robot
- Liverpool MAP: 300 Linux CPUs
- CERN: pcrd25.cern.ch, lxplus009.cern.ch
- RAL (PPD), Bristol, Imperial College, Oxford
- Glasgow/Edinburgh "Proto-Tier 2"
- RAL DataGrid Testbed

4 Initial Architecture
Based around existing production facilities (separate DataGrid testbed facilities will eventually exist).
- Intel PCs running Linux Red Hat 6.1.
- Mixture of batch systems (LSF at CERN, PBS at RAL, FCS at MAP).
- Globus 1.1.3 everywhere.
- Standard file transfer tools (e.g. globus-rcp, GSIFTP).
- GASS servers for secondary storage?
- Java tools for controlling production, bookkeeping, etc.
- MDS/LDAP for bookkeeping database(s).

5 LHCb Applications
Use existing LHCb code...
Distributed Monte Carlo production:
- SICBMC: LHCb simulation program.
- SICBDST: LHCb digitisation/reconstruction programs.
- Brunel & Gaudi.
Distributed analysis:
- With datasets stored at different centres (eventually).

6 Interactive Tests
Accounts, Globus certificates and gridmap entries set up for a small group of people at the RAL and Liverpool sites (problems at CERN for external people).
Globus 1.1.3 set up at all sites.
Managed to remotely submit scripts and run the SICBMC executable between sites via globus-job-run:
- from CERN to RAL and MAP
- from RAL to CERN and MAP
- from MAP to CERN and RAL
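A minimal dry-run sketch of the interactive test above, assuming a valid proxy from grid-proxy-init and a gridmap entry on the remote gatekeeper. The wrapper script path is hypothetical; the hostname is the RAL CSF front-end named on the batch-test slide. The command is assembled and printed rather than submitted, since no grid proxy exists here.

```shell
REMOTE_HOST="csflnx01.rl.ac.uk"            # RAL CSF front-end (from the slides)
RUN_SCRIPT="/lhcb/scripts/run_sicbmc.sh"   # hypothetical SICBMC wrapper script
CMD="globus-job-run $REMOTE_HOST $RUN_SCRIPT"
echo "$CMD"   # dry run: print the command instead of submitting it
```

The same pattern was exercised in every direction between CERN, RAL and MAP simply by changing REMOTE_HOST.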

7 Batch Tests
Mixture of batch systems: LSF at CERN, PBS at RAL, FCS at MAP.
LHCb batch jobs remotely run on the RAL-CSF farm (120 Linux nodes) using PBS (A. Sansum) via:
- globus-job-submit csflnx01.rl.ac.uk/jobmanager-pbs
- globus-job-get-output
CERN: access to LSF?
MAP: setting up LHCb software this week. Need to interface FCS/mapsub to Globus?
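A dry-run sketch of the batch pattern above: submit to the RAL PBS jobmanager, then retrieve output with the job contact that globus-job-submit prints. Only the resource contact string comes from the slide; the job script path is hypothetical.

```shell
CONTACT="csflnx01.rl.ac.uk/jobmanager-pbs"   # resource contact from the slide
JOB_SCRIPT="/lhcb/scripts/run_sicbmc.sh"     # hypothetical SICBMC wrapper
SUBMIT_CMD="globus-job-submit $CONTACT $JOB_SCRIPT"
echo "$SUBMIT_CMD"   # dry run: print rather than submit
# A real run would capture the job contact URL that globus-job-submit
# prints and pass it to globus-job-status / globus-job-get-output once
# the PBS job finishes.
```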

8 Data Transfer
Data transferred back to the originating computer via globus-rcp:
- Unreliable for large files.
- Can use a lot of temporary disk space.
- Limited proxies breaking scripts when running commands in batch.
At RAL, now using GSIFTP (uses Globus authentication):
- Availability of the GSI toolkit at other UK sites and CERN?
- Consistency of user/security interface.
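A dry-run sketch of the GSIFTP alternative adopted at RAL: globus-url-copy streams between gsiftp:// and file:// URLs under the same GSI proxy as the job tools, avoiding globus-rcp's temporary-copy behaviour. Both URLs here are invented for illustration.

```shell
SRC="gsiftp://csflnx01.rl.ac.uk/home/lhcb/mc/run001.dst"   # hypothetical remote output
DST="file:///data/lhcb/run001.dst"                         # hypothetical local destination
XFER_CMD="globus-url-copy $SRC $DST"
echo "$XFER_CMD"   # print only; a real transfer needs a valid GSI proxy
```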

9 Secondary/Tertiary Storage
RAL DataStore (IBM 3494, 30 TB tape robot) interfaced to a prototype GASS server and accessible via Globus (Tim Folkes).
Allows remote access to LHCb tapes (750 MB) via pseudo-URLs:
- globus-rcp rs6ktest.cis.rl.ac.uk:/atlasdatastore/lhcb/L42426 myfile
- globus-url-copy ...
- globus-gass-cache-add ...
Location of gass_cache can vary with the command (server or client side) - a space issue for large-scale transfers.
Can also use the C API interface from code.
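A dry-run sketch of staging a tape-resident file out of the RAL DataStore through the prototype GASS server. The host, path, tape label and local filename are taken verbatim from the slide; the command is only printed, since it needs the GASS server and a proxy.

```shell
STORE_PATH="rs6ktest.cis.rl.ac.uk:/atlasdatastore/lhcb/L42426"   # pseudo-URL from the slide
STAGE_CMD="globus-rcp $STORE_PATH myfile"
echo "$STAGE_CMD"   # dry run: the DataStore mounts the tape on demand
```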

10 Networking Bottlenecks?
[Schematic only: Univ. Dept -> Campus -> MAN -> SuperJANET backbone (London, RAL), with CERN reached via TEN-155; link speeds of 34, 100, 155 and 622 Mbit/s shown, the 622 Mbit/s link marked "March 2001".]
SuperJANET III backbone: 155 Mbit/s (SuperJANET IV: 2.5 Gbit/s).
Need to study/measure for data transfer and replication within the UK and to CERN.

11 Problems/Issues
- "Seamless" certification and authorisation of all people across all sites (certificates, gridmap files, etc.) with a common set of tools.
- Getting people set up on CERN facilities (need a defined technical contact). Central to any LHCb Grid work.
- Remote access into various batch systems (testbeds to cure?).
- Role of wide-area filesystems like AFS.
But at least we now have some practical user experience in a "physics" environment. Helped to shake down UK systems and provide valuable feedback.

12 Next?
- Migrate LHCb production to Linux (mainly NT until now).
- Start to adapt standard LHCb production scripts to use Globus tools.
- Modify bookkeeping to make use of LDAP. MDS and access to metadata/services at different sites.
- Gradually adapt LHCb software to be Grid-aware.
- Try "production-style" runs end 2000/start 2001?
- Expand to include other centres/countries in the "LHCb Grid". Use available production systems and testbeds.
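A hypothetical sketch of the LDAP bookkeeping query envisaged above, using OpenLDAP's ldapsearch against an MDS server. The host, base DN and object class are all invented for illustration; Globus 1.1.x MDS servers listened on port 2135 by default. Again assembled as a dry run.

```shell
MDS_HOST="mds.rl.ac.uk"   # hypothetical bookkeeping server
BASE_DN="o=Grid"          # hypothetical directory base
QUERY_CMD="ldapsearch -h $MDS_HOST -p 2135 -b $BASE_DN '(objectClass=LHCbDataset)'"
echo "$QUERY_CMD"   # print only; no MDS server exists here
```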

