RAL Tier1: 2001 to 2011. James Thorne, GridPP 19, 30th August 2007.


30/08/2007 2001 to 2007 Sorry GridPP, I'm afraid I can't do that!

30/08/2007 Result of GridPP3 for Tier1 Good result: –Effort increases from 16.5 to 20.4 FTE –£6.8M hardware budget (cf. £2.3M in GridPP2) –Extra fault management/hardware staff as the size of the farm increases A good result, but the team remains thinly stretched; the hardware is just sufficient to meet the experiments' requirements.

30/08/2007 Planned Tier1 Storage Capacity (TiB) [chart]

30/08/2007 Planned Tier1 CPU Capacity (kSI2k) [chart]
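A note on units (added context, not from the slide): kSI2k means thousands of SPECint2000 benchmark units, the standard WLCG CPU accounting currency of the period. As illustrative arithmetic only (the per-core figure is a made-up example, not a RAL number):

    1 kSI2k = 1000 SPECint2000 units
    8 cores × ~1.5 kSI2k/core ≈ 12 kSI2k per node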

30/08/2007 Estimated Rack Count [chart]

30/08/2007 Estimated number of Disk Servers [chart]

30/08/2007 Estimated number of Spinning Drives [chart]

30/08/2007 Approximate Hardware Value Allocated to Experiments in 2008 [chart]

30/08/2007 Hardware [table of CPU, disk and tape procurements] Further procurements in FY08, FY09 and FY10.

30/08/2007 New Machine Room Order placed and the contractor has started work. 800 m² can accommodate 300 racks + 5 robots. 2.3 MW power/cooling capacity (some UPS). Office accommodation for all e-Science staff. Scheduled to be available for September 2008.
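A rough worked check on the density those figures imply (illustrative only; the slide does not say how the 2.3 MW splits between IT load, cooling and UPS):

    2.3 MW / 300 racks ≈ 7.7 kW per rack on average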

30/08/2007 Staffing Lex Holt has left the Tier1. James Adams is moving from hardware support to a Fabric Team system admin post. Plan to recruit: –Replacement hardware repair position –Two experiment support posts: one ATLAS, one CMS –Raja Nandakumar as honorary team member from LHCb –Will also shortly commence GridPP3 recruitment

30/08/2007 CASTOR The operational issues mentioned at GridPP 18 were the tip of the iceberg: the CASTOR service was found to be inoperable. A massive amount of re-engineering has been carried out since March, with much effort from the CASTOR team. –Huge progress –Some areas of concern remain We are optimistic that CASTOR will be a success.

30/08/2007 SL4 20% of the batch farm is now running SL4. Negotiating with the LHC experiments to agree the move of their capacity from SL3 to SL4. Once the LHC migration is complete, the remaining capacity will follow within a few weeks. Timing depends on the experiments, but expect termination of the SL3 service in September.
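As background on how such a migration is steered in practice: the resource broker matches jobs against what sites publish in the Glue schema, so an experiment can pin its jobs to SL4 (or SL3) capacity via a JDL Requirements expression. A minimal sketch, assuming the site publishes a runtime-environment tag for SL4; the tag name VO-myexp-SL4 is a hypothetical placeholder, not something RAL actually publishes:

    // JDL fragment: match only worker nodes advertising the (hypothetical) SL4 tag.
    Executable   = "run_analysis.sh";
    Requirements = Member("VO-myexp-SL4",
                          other.GlueHostApplicationSoftwareRunTimeEnvironment);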

30/08/2007 Reliability In March we invested a lot of effort without much gain. We continue to prioritise reliability and are making progress. Recently exceeded the target; the challenge now is to maintain it. Sysadmin On Duty starts in September; on-call starts later this year.
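For reference, the target here is the standard WLCG availability/reliability metric pair, which in the usual WLCG definition (stated for clarity, not taken from the slide) is:

    availability = time passing SAM tests / total time
    reliability  = time passing SAM tests / (total time - scheduled downtime)

So a site up for 700 of 720 hours in a month, with 10 of the 20 down hours scheduled, scores 700/720 ≈ 97% availability but 700/710 ≈ 99% reliability.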

30/08/2007 RAL-LCG2 Availability/Reliability [chart]

30/08/2007 CPU Efficiencies CPU efficiency is much improved. The August fall is still being investigated. The March minimum coincides with the period when CASTOR was broken.
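For clarity, CPU efficiency here is the usual batch accounting ratio (a standard definition rather than one given on the slide):

    efficiency = CPU time consumed / wall-clock time occupied

A job that uses 8 CPU-hours over a 10-hour wall-clock slot, the remainder spent blocked on storage, runs at 80% efficiency, which is why a broken CASTOR shows up directly as a dip in this plot.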

30/08/2007 CPU Efficiencies [chart]

30/08/2007 Termination of GridPP use of ADS Service GridPP funding and use of the legacy Atlas Datastore (ADS) service is scheduled to end at the end of March 2008, the end of GridPP2. RAL will continue to operate the ADS service, and experiments are free to purchase capacity directly from the ADS team.

30/08/2007 dCache Closure dCache is still supported and working. We will give 6 months' notice before terminating the dCache service; no notice of termination has been given yet. Aiming to end the service by the end of GridPP2 (March 2008). Note also that the ADS service cannot be terminated until dCache ceases.

30/08/2007 Grid Only The move to Grid-only access is postponed until December 2007. No new local accounts. In January 2008: –Batch job submission through the RB/CE only (no qsub, some exceptions; see the sketch below) –No local login to UIs (some exceptions) –The AFS service will end
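For anyone moving off qsub, a minimal sketch of the Grid route, using generic LCG usage of the era rather than a RAL-specific recipe (file names are placeholders): describe the job in a JDL file, then hand it to the Resource Broker, which matches it to a Computing Element.

    // hello.jdl: minimal job description for submission via the Resource Broker.
    Executable    = "/bin/hostname";        // runs on whichever worker node is matched
    StdOutput     = "std.out";
    StdError      = "std.err";
    OutputSandbox = {"std.out", "std.err"}; // files shipped back to the UI

Submission and retrieval then go through the UI commands, along the lines of edg-job-submit -o jobids hello.jdl, edg-job-status -i jobids, and edg-job-get-output -i jobids.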

30/08/2007 Conclusions We are positioning ourselves for LHC production. A lot of good progress with CASTOR; we expect to meet the needs of the ATLAS M4 run and CMS's CSA07. Reliability has finally improved.