08/06/00 LHCb(UK) Meeting
LHCb(UK) Computing/Grid: RAL Perspective
Glenn Patrick, 08.06.00

Central UK Computing (what is hoped for)
JIF bid - Prototype UK national computing centre (Tier 1) for all 4 LHC experiments - outcome known in ~November.

Integrated Resources       2001   2002   2003
Processors (PC99-450MHz)    830   1670   3100
Disk (TB)                    25     50    125
Tape (TB)                    67    130    330
What exists now? RAL-CSF: Main LHCb Platform
Currently 160*P450-equivalent processors; hope to expand to ~300*P450 in September.
Linux RedHat 6.1 being phased in on all machines (HPs being shut down) to give compatibility with CERN (e.g. lxplus).
PBS (Portable Batch System), not NQS.
1TB+ of robotic tape space for LHCb.
500GB+ of disk space for LHCb (needs to be requested).
Globus toolkit v1.1.1 installed on the front-end (with a testbed service on another machine).
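Since the farm uses PBS rather than NQS, jobs go in via PBS's qsub. A minimal sketch of a job script follows; the job name, CPU limit and wrapper script are illustrative assumptions, not the actual RAL-CSF configuration.

```shell
# Write a minimal PBS job script (directive values are assumptions, not RAL-CSF settings).
cat > lhcb_sicbmc.pbs <<'EOF'
#!/bin/sh
#PBS -N lhcb_sicbmc
#PBS -l cput=12:00:00
#PBS -j oe
# Run from the directory the job was submitted from
cd "$PBS_O_WORKDIR"
# Hypothetical wrapper around the LHCb Monte Carlo executable (SICBMC)
./run_sicbmc.sh
EOF
# Submit and monitor with the PBS tools (replacing the old NQS commands):
#   qsub lhcb_sicbmc.pbs
#   qstat -u $USER
```

The `-j oe` directive merges stdout and stderr into one log file, which keeps per-job output easy to collect on a shared farm.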
[Diagram: RAL Particle Physics Unix Services - HP, Linux and Sun services on a 100 Megabit switched network, with an n TB disk farm, scratch space, NIS /home userids, an FDDI link, AFS, and HP/Linux batch farms feeding the DataStore.]
LHCb Software
LHCb software stored in 4GB AFS project space: /afs/rl.ac.uk/lhcb
Updated just after midnight every night.
CMT/CVS installed (although no remote updating to the CERN repository).
Crude LHCb environment at the moment, but managed to process events through SICBMC with little knowledge of the LHCb software.
Available for LHCb to exploit for detector, physics & Grid(?) studies.
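For a user session, picking up the RAL AFS copy is just a matter of pointing the environment at the project space. A sketch, where only the AFS path comes from the slide and the variable names are illustrative rather than the official LHCb setup:

```shell
# Point a session at the RAL AFS copy of the LHCb software.
# LHCB_ROOT is an illustrative name; only the AFS path is from the slide.
LHCB_ROOT=/afs/rl.ac.uk/lhcb
# CMT locates packages along CMTPATH, so include the project space there.
CMTPATH=$LHCB_ROOT
export LHCB_ROOT CMTPATH
echo "Using LHCb software from $LHCB_ROOT"
```

Because the area is refreshed nightly from CERN, a session set up this way tracks the CERN release to within a day without any local CVS access.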
MC Production: RAL NT Farm
18*450MHz PII + 9*200MHz Pentium Pro; LHCb front-end in addition to the dual-CPU front-end.
Production capacity 100k-200k events/week.
500k bb events processed so far and stored in the RAL DataStore.
Events now transferred over the network to CERN using the RAL VTP protocol instead of DLTs. Thanks to Dave Salmon, Eric van H & Chris Brew.
Latest production code being installed (DS).
[Diagram: RAL NT Farm - new front-end and extra batch nodes. Batch nodes 1-17 (4+4GB and 18GB disks) with DAT peripherals, PDC/BDC domain controllers and a file server on a 100Mb/s switch; new systems add 14 CPUs, connected to the LAN & WAN.]
Grid Developments
There is now a "CLRC Team" for the particle physics grid, plus several work groups (GNP represents LHCb, with CAJB also a member). Important that this is beneficial for LHCb.
EU (DataGrid) application to distribute 10^7 events & 3TB using MAP/RAL/... does not start production until 2002.
Need to start now and acquire some practical experience and expertise to decide the way forward.
Grid Developments II
Meetings:
14th June (RAL) - Small technical group to discuss short-term LHCb aims, testbeds, etc. (CERN, RAL, Liverpool, Glasgow...)
21st June (RAL) - Globus Toolkit User Tutorial
22nd June (RAL) - Globus Toolkit Developer Tutorial
Open to all, register at http://www.globus.org/news/uk-registration.html
23rd June (RAL) - Globus "strategy" meeting (invitation/nomination).
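After the tutorials, a first hands-on check against the RAL Globus front-end might look like the sketch below. It needs a machine with the Globus 1.1.1 toolkit and a valid user certificate, so it is written out as a script to run there; the front-end hostname is left as an argument rather than guessed.

```shell
# Hedged sketch of a Globus 1.x smoke test (needs the toolkit and a user certificate).
cat > globus_smoke.sh <<'EOF'
#!/bin/sh
# Create a short-lived proxy credential from the user's certificate
grid-proxy-init
# Run a trivial remote job on the Globus front-end passed as the first argument
globus-job-run "$1" /bin/hostname
EOF
chmod +x globus_smoke.sh
# Usage (hostname is whatever the RAL Globus front-end is called):
#   ./globus_smoke.sh <globus-front-end>
```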
From UK Town Meeting 15.3.2000
Which Grid Topology for LHCb(UK)? Flexibility important.
[Diagram: Tier 0 - CERN; Tier 1 - RAL, INFN, IN2P3; Tier 2 - Liverpool, Glasgow, Edinburgh; then Department and Desktop users, etc.]
Grid Issues
Starting to be asked for estimates of LHCb resources (central storage, etc.) and Grid requirements for applications and testbeds.
Useful to have an LHCb(UK) forum for discussion & feedback to define a model for all UK institutes, not just RAL.
Any documentation (including this talk) on computing/software/Grid at:
http://hepwww.rl.ac.uk/lhcb/computing/comphome.html