LHCb Computing Activities in UK
- Current activities
- UK GRID activities
- RICH s/w activities
Architecture of LHCb Computing Model
- Based on a distributed multi-tier regional-centre model
- Processing of real data at CERN (production centre)
- Simulation production at the regional centres
Present Facilities
- Liverpool MAP: 300-node facility
- RAL NT farm: closed in February after LHCb MC production
- RAL CSF facility: 120-node Linux facility
- RAL datastore: IBM 3494 tape robot
- RAL NT delivered approx. 100k RAWH events/week
- MAP can deliver ~16k DST2 events/week if dedicated to LHCb
RAL DataGrid Testbed
[Diagram: proposed LHCb-UK "Testbed" topology, marking existing and planned institutes]
- Sites: RAL CSF (120 Linux CPUs) with IBM 3494 tape robot, Liverpool MAP (300 Linux CPUs), CERN (pcrd25.cern.ch, lxplus009.cern.ch), RAL (PPD), Bristol, Imperial College, Oxford, Cambridge, Glasgow/Edinburgh "Proto-Tier 2"
Prototype "testbed" Architecture
- Based around existing production facilities
- Intel PCs running Linux Red Hat 6.1
- Mixture of batch systems (LSF at CERN, PBS at RAL, FCS at MAP)
- Globus 1.1.3 everywhere
- Standard file transfer tools (e.g. globus-rcp, GSIFTP)
- GASS servers for secondary storage?
- Java tools for controlling production, bookkeeping, etc.
- MDS/LDAP for bookkeeping database(s)
GRID Activities
- Globus has been used to remotely submit scripts and run SICBMC (CERN/RAL/Liverpool)
- LHCb batch (PBS) jobs run on RAL CSF via Globus
- Ongoing investigations of the Globus toolkit: globus-rcp, GSIFTP, RAL datastore as a GASS server
- Work ongoing to "open up" MAP for general use, i.e. analysis-type activities
- Co-ordinating LHCb external computing; work ongoing on the baseline DataModel
- Work ongoing to harness University NT resources
- Major input to the GridPP proposal (the most detailed forward look of all UK LHC experiments); disappointment with the emphasis of the final document
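The remote-submission step above can be sketched as a thin wrapper around the Globus command-line tools; `globus-job-run` is one plausible entry point in the Globus 1.x toolkit, and the host name and script path below are purely hypothetical:

```python
import shutil
import subprocess

def build_globus_cmd(contact, script, *args):
    """Build a globus-job-run invocation for a remote script.

    `contact` is the gatekeeper contact string of the target machine
    and `script` the executable to run there. Names are illustrative.
    """
    return ["globus-job-run", contact, script, *args]

def submit(contact, script, *args):
    """Run the job if the Globus tools are installed, else just report."""
    cmd = build_globus_cmd(contact, script, *args)
    if shutil.which(cmd[0]) is None:
        print("Globus tools not found; would run:", " ".join(cmd))
        return None
    return subprocess.run(cmd, capture_output=True, text=True)

# Hypothetical example: run a SICBMC production script on a RAL CSF node.
submit("csfa.rl.ac.uk", "/home/lhcb/run_sicbmc.sh", "500")
```

A production setup would layer the bookkeeping tools mentioned above on top of this kind of call rather than invoking it by hand.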
Summary of current UK LHCb resources
- 2,775 SI95 shared for LHCb between RAL & Liverpool (the lion's share being MAP)
- Will increase to 3,400 SI95 with the RAL upgrade in March
- End-2001 "Scotch" facility adds a further 2,250 SI95
- Total of 5,650 SI95 by end of 2001
- 3.5 Tb of disk space distributed across RAL/Edinburgh/Liverpool
- An additional 13 Tb of robotic tape at RAL by year end
Estimation of needs by 2003
- Assume 10-15% of the 2006-2007 resources are needed in 2003, and a UK contribution of 20-25%
- 10,800-20,250 SI95 needed
- 7-14 Tb of disk
- 23-48 Tb of "robotic" storage
- Figures consistent with the "bottom-up" estimation performed at CERN
- Factors of 2-4 greater than the resources available by end of 2001
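The CPU range above follows from multiplying the two assumed fractions. The 2006-2007 total used here (~540,000 SI95) is back-derived from the quoted range, not stated on the slide, so treat it as an assumption:

```python
# Fraction of the full 2006-2007 LHCb resources the UK needs in 2003:
# 10-15% of the total is needed in 2003, of which the UK supplies 20-25%.
lo_frac = 0.10 * 0.20   # low-need, low-UK-share corner
hi_frac = 0.15 * 0.25   # high-need, high-UK-share corner

# Assumed LHCb 2006-2007 CPU total (SI95), back-derived from the
# quoted 10,800-20,250 SI95 range rather than taken from the slide.
TOTAL_CPU_SI95 = 540_000

need_lo = TOTAL_CPU_SI95 * lo_frac
need_hi = TOTAL_CPU_SI95 * hi_frac
print(f"UK CPU need in 2003: {need_lo:,.0f}-{need_hi:,.0f} SI95")

# Against the ~5,650 SI95 available by end of 2001 this reproduces
# the quoted shortfall of roughly a factor 2-4.
print(f"Shortfall factor: {need_lo / 5650:.1f}-{need_hi / 5650:.1f}")
```

The same two fractions applied to assumed disk and tape totals reproduce the 7-14 Tb and 23-48 Tb ranges.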
RICH software
- UK is co-ordinating the s/w effort
- Current FORTRAN simulation written by the UK
- FORTRAN simulation still used in current RICH studies (e.g. alignment studies & optimisation w.r.t. T11)
- Move towards OO since the TDR, with the UK prominent (e.g. GEANT4 studies, interfacing the current FORTRAN databanks to the OO framework…)
- Investigation of fast RICH reconstruction
RICH simulation with OO software (work just getting underway)
- Simulation of the testbeam setup: testing of Čerenkov radiation within GEANT4
- Interfacing current information from the FORTRAN simulation into the OO environment
- Interest in photodetector simulation in the OO environment
Fast Online RICH Particle Identification
- Developing a PID algorithm complementary to the offline method, for use in online applications
- Online PID searches for potential Čerenkov rings for a single track; much less computation is involved, so it is faster
- Initial results with the algorithm are encouraging, and studies into potential online applications are progressing
- Possible gains in global PID performance when online results are merged with the global algorithm, for example in background suppression
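A minimal sketch of the single-track ring search described above, assuming photon hits are already projected onto the detector plane and the ring centre is taken from the track extrapolation; the radius-histogram peak finding is an illustrative choice, not necessarily the LHCb algorithm:

```python
import math
from collections import Counter

def ring_radius(track_xy, hits_xy, bin_width=0.5):
    """Estimate a Cherenkov ring radius for one track.

    Histogram the hit-to-track distances and return the most populated
    bin; a clear peak suggests a ring centred on the track. Units and
    binning are arbitrary, illustrative choices.
    """
    tx, ty = track_xy
    radii = [math.hypot(x - tx, y - ty) for x, y in hits_xy]
    bins = Counter(round(r / bin_width) for r in radii)
    peak_bin, _ = bins.most_common(1)[0]
    return peak_bin * bin_width

# Toy example: eight hits on a ring of radius 3 around the track.
track = (0.0, 0.0)
hits = [(3 * math.cos(a), 3 * math.sin(a))
        for a in (math.pi * k / 4 for k in range(8))]
print(ring_radius(track, hits))  # -> 3.0
```

Because only one track and one histogram are involved, this kind of search is far cheaper than a full pattern-recognition fit, which is the point of the online approach.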
Summary
- UK is in a prominent position in LHCb, leading the Grid effort
- Vital that this momentum is maintained and built upon through the "GridPP" effort
- UK in a leading position in RICH s/w; the UK effort is now moving into OO s/w and use of the RICH "online"