
GridPP4 Project Management Pete Gronbech April 2012 GridPP28 Manchester.


1 GridPP4 Project Management Pete Gronbech April 2012 GridPP28 Manchester

2 Since the last meeting (17/4/12)
–LHC is still building back up to full running after the Christmas technical stop.
–Tier-1 running well, and also busy with infrastructure upgrades.
–Tier-2s busy installing new hardware and new networking equipment.
–GridPP4 1st tranche hardware money spent.
–Digital Research Infrastructure (DRI) grant equipment money spent.

3 Accelerator Update
This year the collision energy is 8 TeV (beam energy 4 TeV), slightly up from last year's 3.5 TeV beams. First beams started four weeks ago, mostly for testing of safety systems. First physics at 2 x 4 TeV was one week ago, starting with 2 x 3 bunches. At this very moment collisions are taking place with 2 x 1092 bunches, already giving more collisions than last year's 2 x 1380 bunches. In a few days the aim is to be at the nominal number of bunches for this year, 2 x 1380, with higher luminosities (more collisions) than last year because of the higher energy and smaller beta* at the interaction point. On Friday there will be 3 days of machine development, followed by the first Technical Stop of the year, then back to production data taking at the beginning of May for an 8-week period.

4 Tier-1
–CPU hardware delivered and commissioned in time to meet the WLCG pledge.
–Both tranches of disk have been delivered and deployed.
–Upgrade to CASTOR completed.
–Operations very stable following many upgrades in February.

5 Tier-2s
All grants for the 1st tranche of hardware issued and should have been spent:
–sites should have hardware to meet the 2012 pledge;
–all sites have been trying to spend the money this financial year.
Most sites made significant upgrades and, coupled with the DRI grants, have been able to enhance the infrastructure and networking both within the clusters and across campus to the JANET connections. Future MoUs showed shortfalls in storage capacity more than CPU, which meant an emphasis on disk purchases. Disk prices were inflated and deliveries extended due to the flooding in Thailand causing a worldwide shortage; however, prices for networking equipment came down substantially in January, which compensated in part at some sites.

6 DRI and GridPP4 Grants
–Instructions for JeS issued 9/11/11; GridPP4 grants issued very quickly, some in December.
–DRI bids solicited 8/11/11. The DRI project team reviewed responses very quickly, between 18th November and 8th December, and revised them to meet the £3M target once this was known. JeS instructions were sent out on 9th December; grants issued early January.
–All equipment on sites by end of March 2012.

7 UKI CPU contribution (LHC) [charts: CPU March 2012 from GStat 2.0; country stats since April 2011]

8 UKI VOs [charts: since March 2011; previous year]
Non-LHC VOs are getting squeezed.

9 [chart]

10 VO support across sites [chart]

11 UKI Tier-1 & Tier-2 contributions [charts: since March 2011; previous year]

12 Storage
[chart: storage from GStat 2.0 — August 2010, March 2011, April 2012]
[table: Quarterly Reported Resources — Q411 CPU (HS06) and Q411 disk (TB) for LondonGrid, NorthGrid, ScotGrid, SouthGrid, TOTAL]
The truth is somewhere in between; the Q112 report will help clarify the situation.

13 GridPP4 Project Map Q411 [chart]

14 Q411
Red metrics:
–Tier-1: staff; service availability for ATLAS due to CASTOR and network issues; ATLAS data availability (92%).
–CMS: red metrics are all due to Bristol.
–Data group: number of blog posts low, and NFSv4 study late.
–Security: delay in running the SSC.
–Execution: number of vacant posts, and review of service to experiments.
–Outreach: numbers of news items, press releases and KE meetings low.
Q112 reports due in at the end of this month, or preferably earlier!

15 Non-LHC Storage Stats so far
[table — some values not recoverable from the transcript, marked "–"]
Site              Total   % of disk used across T2   Site % non-LHC
EDFA-JET              0                      0.01%              31%
Birmingham            –                          –               1%
Bristol               –                          –               0%
Cambridge             –                          –               1%
Oxford                –                          –               3%
RALPPD                –                          –               1%
Total                 –                          –               2%
UKI-LT2-Brunel      430                        16%               1%
UKI-LT2-IC-HEP      744                        28%               4%
UKI-LT2-QMUL        867                        32%               5%
UKI-LT2-RHUL        498                        19%               1%
UKI-LT2-UCL-HEP     129                         5%               1%
Total                 –                          –               3%

16 Project map – statistics [charts: metrics; milestones]

17 Manpower
GridPP was running at reduced manpower for the latter part of 2011, ~2 FTE short at the T2s and ~4 FTE short at RAL. Both the T1 and the T2s have now filled the posts, so there should be capacity to do the development work that has been on hold due to the shortages.

18 Risk register
Highlighted risks:
–Recruitment and retention: still a concern, but currently more stable.
–Resilience of storage: problems with batches of storage hardware.
–CASTOR is critical and, although more stable now, has serious consequences when it fails.
–Insufficient funding for T2 hardware: increased equipment costs (especially disk) and increased experiment resource requests. Mitigated to a certain extent by the DRI investment.
–Contention for resources: anticipated to be more of an issue as LHC use increases and squeezes the minor VOs.

19 Timeline
[timeline chart: GridPP2 → GridPP2+ → GridPP3 → GridPP4]
–End of GridPP2: 31 August 2007
–Start of GridPP3: 1 April 2008
–Start of GridPP4: 1 April 2011
GridPP celebrated its 10th birthday in December 2011.

20 From the start of GridPP3 to the present time
At the start of GridPP4, ~27000 CPUs and ~7 PB of disk were reported. Now ~31000 CPUs and ~27 PB (if GStat is to be believed). The UK reported approximately 370 GSI2k-hours last year, just ahead of Germany and France, and is still the largest contributor in the EGI grid.

21 Reporting
–The main LHC experiments will continue to report on Tier-1 and Tier-2 performance as both analysis and production sites.
–Tier-2 site reporting continues as before, with reports going via the Production Manager; slight modifications to enable better tracking of non-LHC VO storage use.
–Storage, Security, NGI and Dissemination have separate reports.

22 Summary
–The first accounting period completed, and the 1st tranche of hardware funding was allocated.
–Last autumn and this spring were particularly busy with GridPP hardware and DRI grants: tendering, quotes, purchasing and now installations and upgrades.
–We should plan to be stable in time for the next data taking in May, although in some cases the load seen on Tier-2s is more aligned with physics conferences than with data taking.
–A reminder that we are in a continuous accounting period which started at the end of the last one, i.e. from 1st November through to a date to be determined, dependent on STFC capital spend profiling.
