User Board or User Bored? Glenn Patrick, GridPP19, 29 August 2007
2 Grids need users. [Diagram: GridPP, WLCG, EGEE, OSG and NGS all centred on the USER.]
3 Users need a Board? Not quite – more of a User Forum
4 Board needs a Chair. [Diagram: LHCb, BaBar, CMS and other experiments.]
5 Experiments are really the users
6 Tier 1 Fairshares – last 12 months. [Charts: CPU shares for ATLAS, BaBar, CMS and LHCb – "Requested" vs "Reality…".]
7 Tier 1 CPU – 2007 Profile. [Table: UK Tier 1 CPU requests (KSI2K), Jan–Dec 2007, against May 07 usage and allocation, for ALICE, ATLAS, CMS, LHCb, BaBar (Plan B), CDF (0.010), MICE, MINOS, SNO, ZEUS (8.020), Pheno (1.620), UKQCD, Dteam/Ops and others; LCG total and fabric CAPACITY shown with HEADROOM.] Also an underlying profile for LHC! [Image: Napes Needle; SuperLHC.]
8 Resources at the Tier 1. CPU: largely met demand through the first half of the year. Resources start to become over-allocated in September, when ATLAS and CMS activities are expected to peak. An increasing shortfall of CPU across Q4 is predicted, largely due to the LHC experiments ramping up in preparation for data taking. DISK: after the difficulties of 2006, the disk situation through the first six months of 2007 has been good, with sufficient headroom to provide experiments with the requested capacity, as well as extra resources to assist with the testing of Castor and the migration from dCache. There is still headroom through Q3, but Q4 will be challenging and experiments will probably have to wait for the new disk deployment in January. TAPE: as always with tape (and the vagaries of Castor repacking, etc.), it is difficult to be certain of the physical headroom. Estimates of the unwanted tape storage of the disk1tape0 storage class have had to be included for Castor (hopefully solved when v2.1.4 is deployed). Some allowance has also been made for the migration from dCache to Castor.
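The over-allocation reasoning above is simple arithmetic: headroom is whatever capacity remains once every experiment's allocation is subtracted. A minimal sketch, using made-up illustrative figures rather than the actual RAL allocations:

```python
def headroom(capacity_ksi2k, allocations_ksi2k):
    """Capacity left after all experiment allocations (KSI2K).

    A negative result means the farm is over-allocated for that period.
    """
    return capacity_ksi2k - sum(allocations_ksi2k.values())

# Illustrative Q3 vs Q4 numbers only -- not the real 2007 requests.
q3 = headroom(1500, {"ATLAS": 500, "CMS": 400, "LHCb": 300, "BaBar": 150})
q4 = headroom(1500, {"ATLAS": 800, "CMS": 600, "LHCb": 350, "BaBar": 150})

print(q3)  # 150: spare capacity
print(q4)  # -400: over-allocated, the Q4 shortfall pattern described above
```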
9 LHC Schedule – End/Start in Sight! Interconnection of the continuous cryostat; leak tests of the last sub-sectors; inner triplet repairs & interconnections; global pressure test & consolidation; flushing; cool-down; warm-up; powering tests. General schedule (baseline rev. 4.0): May 2008. We are here – 8 months to go!
10 Planning Underway. [Table: Tier 1 hardware under GridPP3 – CPU (KSI2K), disk (TB) and tape (TB), now vs 2008, for ALICE, ATLAS, CMS, LHCb, BaBar and others; totals grow by roughly x2.7 (CPU), x3.3 (disk) and x2.8 (tape).]
11 dCache – Castor2 Migration. [Chart: Castor data volume.] The migration to Castor continues to be a challenge! At the UB meeting on 20 June it was agreed that 6 months' notice be given for dCache termination. Experiments have to fund storage costs beyond March 2008 for the ADS/vtp tape service.
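The six-month notice agreed at the 20 June UB meeting can be pinned down with a small date calculation. A minimal sketch, assuming a whole-months notice convention (the function name is illustrative):

```python
from datetime import date

def notice_expiry(notice_given, months=6):
    """Earliest termination date after a fixed notice period.

    Counts whole calendar months; a day that does not exist in the
    target month (e.g. 31 January + 1 month) would raise ValueError,
    which is fine for a sketch.
    """
    month_index = notice_given.month - 1 + months
    year = notice_given.year + month_index // 12
    return notice_given.replace(year=year, month=month_index % 12 + 1)

# Notice given at the UB meeting of 20 June 2007:
print(notice_expiry(date(2007, 6, 20)))  # 2007-12-20
```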
12 Castor Progress at RAL. Separate instances for the LHC experiments: ATLAS instance in production; CMS instance in production; LHCb instance in testing. Strategy meetings and weekly experiment technical meetings have helped a lot with progress. Current issues: tape migration rates; monitoring at RAL; SRM development (v2.2 timescale); disk1tape0 capability; repack. But the upcoming Data Challenges of CMS (CSA07) and ATLAS (M4, FDR), plus LHCb and ALICE, will be the real test of Castor.
13 LHC Computing Model (2001). [Diagram: the LHC Computing Centre hierarchy – CERN at the centre; Tier 1 centres in Germany, USA, UK, France, Italy, …; Tier 2 labs and universities (Lab a, Uni a, …, Lab m, Uni y); Tier 3 physics departments and desktops; physics and regional groups.]
14 Grid Only Tier 1. After several discussions, non-Grid access to the Tier 1 is scheduled to finish at the end of 2007, with a few exceptions. Use cases for the exceptions are being identified: debugging production jobs; maintaining the experiment environment; start-up of new experiments; etc. A limited User Interface service will remain. Important to retain functionality and flexibility. There are implications for the RAL AFS service (cell): this is bad news for BaBar, who rely on AFS for software distribution.
15 SL4 and 64 Bit Migration. Migration to SL4 has been discussed several times. A new SL4 CE with 20% of the batch capacity was commissioned at the Tier 1 during the first week of August. Generally not an issue for the experiments, but there have been some teething problems (e.g. LHCb and CMS). Experiment attitudes towards true 64 bit applications (as opposed to 32 bit applications running in compatibility mode) were surveyed: ATLAS – not important at the moment. LHCb – can test, but what about middleware? BaBar/MINOS – no immediate interest. MICE plan to move to 64 bit computing as soon as the underlying computing resources become available.
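The distinction drawn above, a true 64 bit application versus a 32 bit one in compatibility mode on a 64 bit kernel, can be checked at run time. A minimal Python sketch (function names are illustrative):

```python
import platform
import struct

def word_size_bits():
    """Pointer width of the running process: 32 for a 32-bit build
    (even on a 64-bit kernel), 64 for a true 64-bit build."""
    return struct.calcsize("P") * 8

def is_true_64bit():
    """True only when both the kernel/hardware report a 64-bit
    architecture AND this process itself is a 64-bit build; a 32-bit
    process on an x86_64 kernel is compatibility mode, not true 64-bit."""
    return platform.machine() in ("x86_64", "amd64") and word_size_bits() == 64

print(word_size_bits(), is_true_64bit())
```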
16 Real LHC Data now on Horizon. Already here in the case of ATLAS cosmics! [Plot: CMS luminosity profile, cm⁻² s⁻¹.] Need to be prepared for increasing luminosity and surprises (good and bad): variable backgrounds, changing beam energies. Important to handle issues as they arise (and, if possible, anticipate them). More direct communication is needed in addition to the quarterly UB meetings.
17 User Board Future UB and experiment interaction needs to progress as experiments evolve. Suggestions on this are welcome.
18 The End (and The Start)