7 Tier 1 CPU – 2007 Profile
[Table: UK Tier 1 CPU requests for 2007 (KSI2K): May 2007 used and allocated figures plus a monthly Jan–Dec request profile per experiment (ALICE, ATLAS, CMS, LHCb, BaBar Plan B, CDF, D0, H1, MICE, MINOS, SNO, ZEUS, Pheno, UKQCD, Dteam/Ops, Other), with rows for the total, the LCG total, fabric, installed capacity and headroom. Headroom turns negative in September (-366 KSI2K) and falls to -781 KSI2K by December. Column layout not recoverable from the extracted text.]
[Chart: an underlying LHC resource profile for 2007–2011, climbing towards SuperLHC ("Napes Needle").]
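The headroom row in the table above is simply installed capacity minus the total CPU request for the month. A minimal sketch of that calculation, using the September and December figures read from the slide as illustrative values (all in KSI2K):

```python
# Headroom = installed capacity - total requested CPU, per month (KSI2K).
# The figures below are illustrative values taken from the slide's table,
# not an authoritative record of the 2007 allocations.
capacity = {"Sep": 1514, "Dec": 1525}
total_request = {"Sep": 1880, "Dec": 2306}

headroom = {m: capacity[m] - total_request[m] for m in capacity}
for month, h in headroom.items():
    status = "surplus" if h >= 0 else "shortfall"
    print(f"{month}: {h:+} KSI2K ({status})")
```

With these numbers the calculation reproduces the slide's September (-366) and December (-781) headroom values.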
8 2007 at the Tier 1
CPU: Demand was largely met through the first half of the year. Resources start to become over-allocated in September, when ATLAS and CMS activities are expected to peak, and an increasing CPU shortfall is predicted across Q4, largely due to the LHC experiments ramping up in preparation for 2008.
DISK: After the difficulties of 2006, the disk situation through the first six months of 2007 has been good, with sufficient headroom to provide experiments with the requested capacity as well as extra resources to assist with the testing of Castor and the migration from dCache. There is still headroom through Q3, but Q4 will be challenging and experiments will probably have to wait for the new disk deployment in January 2008.
TAPE: As always with tape (and the vagaries of Castor repacking, etc.), it is difficult to be certain of the physical headroom. Estimates of the unwanted tape usage of the disk1tape0 storage class have had to be included for Castor (hopefully solved when v2.1.4 is deployed). Some allowance has also been made for the migration from dCache to Castor.
9 LHC Schedule – End/Start in Sight!
[Schedule diagram (general schedule, baseline rev. 4.0): interconnection of the continuous cryostat; leak tests of the last sub-sectors; inner triplet repairs and interconnections; global pressure test and consolidation; flushing; cool-down; powering tests (the pressure test / cool-down / powering sequence repeats per sector). "We are here!" marker: roughly 8 months to go to May 2008.]
11 dCache – Castor2 Migration
The migration of data to Castor continues to be a challenge! At the UB meeting on 20 June it was agreed that six months' notice be given for dCache termination. Experiments will have to fund storage costs beyond March 2008 for the ADS/vtp tape service.
12 Castor Progress at RAL
Separate instances for the LHC experiments: ATLAS instance (version 2.1.3) in production; CMS instance (version 2.1.3) in production; LHCb instance (version 2.1.3) in testing.
Issues: strategy meetings and weekly experiment technical meetings have helped a lot with progress. Current issues: tape migration rates; monitoring at RAL; SRM development (v2.2 timescale); disk1tape0 capability; repack. But the upcoming data challenges of CMS (CSA07) and ATLAS (M4, FDR), plus LHCb and ALICE, will be the real test of Castor.
13 The LHC Computing Centre – LHC Computing Model (2001)
[Diagram: CERN (Tier 0 and CERN Tier 1) at the centre; national Tier 1 centres (France, Germany, Italy, UK, USA, …); regional Tier 2 groups of labs (Lab a, Lab b, Lab c, Lab m) and universities (Uni a, Uni b, Uni n, Uni x, Uni y); Tier 3 physics department resources; desktops.]
14 Grid Only Tier 1
After several discussions, non-Grid access to the Tier 1 is scheduled to end at the close of 2007, with a small number of exceptions. Use cases for the exceptions are being identified: debugging production jobs; maintaining the experiment environment; start-up of new experiments; etc. A limited User Interface service will remain; it is important to retain functionality and flexibility. There are implications for the RAL AFS service (cell): this is bad news for BaBar, who rely on AFS for software distribution.
15 SL4 and 64-Bit Migration
Migration to SL4 has been discussed several times. A new SL4 CE with 20% of the batch capacity was commissioned at the Tier 1 during the first week of August. Generally this is not an issue for the experiments, but there have been some teething problems (e.g. for LHCb and CMS). Experiment attitudes towards true 64-bit applications (as opposed to 32-bit applications running in compatibility mode) were surveyed: ATLAS - not important at the moment; LHCb - can test, but what about the middleware?; BaBar/MINOS - no immediate interest; MICE - plan to move to 64-bit computing as soon as the underlying computing resources become available.
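The native-64-bit versus 32-bit-in-compatibility-mode distinction above can be checked from inside a running job. A minimal Python sketch (an illustration only, not part of the Tier 1 tooling): the pointer size reported by the struct module tells you the bitness of the running build, independently of the hardware it sits on.

```python
# Report whether the running Python build is native 64-bit or a 32-bit
# build (e.g. running in compatibility mode on an x86_64 host).
# struct.calcsize("P") gives the size of a C pointer in bytes.
import platform
import struct

bits = struct.calcsize("P") * 8  # 32 or 64
print(f"Interpreter: {bits}-bit on {platform.machine()} hardware")
```

On a 64-bit host running a 32-bit build, `platform.machine()` still reports the 64-bit hardware while `bits` is 32, which is exactly the compatibility-mode case the survey asks about.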
16 Real LHC Data now on the Horizon
Already here in the case of ATLAS cosmics! [Plot: CMS luminosity profile rising from 10^32 through 10^33 to 10^34 cm^-2 s^-1.] We need to be prepared for increasing luminosity and for surprises (good and bad), including variable backgrounds and changing beam energies. It is important to handle issues as they arise (and, where possible, to anticipate them). More direct communication is needed in addition to the quarterly UB meetings.
17 User Board Future
Interaction between the UB and the experiments needs to evolve as the experiments themselves evolve. Suggestions on this are welcome.