UK Tier 1 Centre Glenn Patrick LHCb Software Week, 28 April 2006.

UK Tier 1 at Rutherford Appleton Laboratory, alongside ISIS, Diamond and Particle Physics. [site overview slide]

UK Tier 1 Exploitation [usage charts]: dominated by BaBar, LHCb and ATLAS.

Largest GridPP users by VO for 2005: ATLAS, BaBar, LHCb, ZEUS, CMS, BIOMED, DZERO. (NB: excludes data from Cambridge – for Condor support in APEL see Dave Kant's talk.)
LHCb Tier 1 = 505,921 KSI2K hours.
LHCb Tier 2 = 983,050 KSI2K hours.
Now 23 approved VOs.
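The LHCb usage figures above can be totalled with a trivial script. A minimal sketch using only the numbers quoted on this slide (illustrative: real figures would come from APEL accounting records, not hard-coded values):

```python
# LHCb CPU usage for 2005 as quoted on the slide, in KSI2K-hours.
# Illustrative only: production accounting queries the APEL database.
lhcb_usage = {
    "Tier 1": 505_921,
    "Tier 2": 983_050,
}

total = sum(lhcb_usage.values())
print(f"LHCb total 2005 usage: {total:,} KSI2K-hours")  # 1,488,971
```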

UK Tier 1 CPU
Current capacity 830 KSI2K; some units now 4 years old.
Extra 266 KSI2K delivered 10 March; should be available at the start of May after a 4-week load test.
Twin dual-core Opteron 270s: 1 GB RAM/core, 250 GB HDD.
Total CPU capacity = 1,096 KSI2K.
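The quoted total is just the existing farm plus the new delivery; a one-line sanity check on the slide's figures:

```python
# CPU capacity figures from the slide, in KSI2K.
existing = 830        # current farm, some units 4 years old
new_delivery = 266    # delivered 10 March, in service ~start of May

total = existing + new_delivery
print(f"Total CPU capacity: {total:,} KSI2K")  # 1,096
```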

UK Tier 1 Disk
Current capacity 177 TB.
Extra 168 TB and 21 servers delivered on 10 March. Some teething problems; a fix has been generated and tested.
Hopefully, installation resumes this week, with a target of end May.
Total disk storage ~308 TB (after retirements).

New SL8500 tape robot
New STK SL8500 tape robot replaces the STK Powderhorn robot (single arm, 9940 drives). Funded by CCLRC.
Entered service 28th March 2006 with an initial complement of a few thousand slots; upgrade to 10,000 slots later in the year gives a capacity of 5 PBytes.
8 mini robots mounting tapes – faster, more resilient.
T10000 tape drives (Castor, not ADS at the moment).
318 TB (Feb) → 336 TB → 446 TB (extra T10K media).

New SL8500 tape robot. [photographs]

CASTOR2 Deployment
Mar–Apr: Testing – functionality, interoperability and database stressing.
May–Sep: Spec and deploy hardware for production database.
May: Internal throughput testing with Tier 1 disk servers.
Jun: CERN Service Challenge throughput testing.
Jul–Sep: Create full production infrastructure; full deployment on Tier 1.
Sep–Nov: Spec and deploy second-phase production hardware to provide full required capacity.
Apr 07: Startup of LHC using CASTOR at RAL.
dCache available until early 2007.

Tier 1 Services
LHCb VO Box available since January 2006: lcgvo0339.gridpp.rl.ac.uk. Configuration/service certified by Raja et al.
DC06 requires a database service supporting COOL and 3D in October. New hardware ordered: 4 servers – dual AMD Opteron 250 (ATLAS and LHCb) – plus a 3.5 TB storage array shared by both.
Ref. Database workshop, RAL, 23 March 2006.

UK CPU Allocations
LHCb(UK) Tier 2 CPU (KSI2K), Q2–Q4 2006. [allocation table; figures lost in transcription]
LHCb(UK) Tier 1 CPU (KSI2K), Apr–Dec 2006. [allocation table; figures lost in transcription]
One allocation is overallocated (scale by 0.81), the other underallocated.

T1 Experiment Shares [charts]: ATLAS, BaBar, LHCb, CMS.

UK Disk Allocations
LHCb(UK) Tier 1 Disk (TB), Apr–Dec 2006. [monthly figures lost in transcription]
Deployment of new disk? If this fails, GridPP encourages use of Tier 2 disk!
32.2 TB = 2.2 TB stripped DST (DC06) + 30 TB additional signal production.
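The 32.2 TB request on this slide is the sum of its two components; a sketch using the slide's figures:

```python
# LHCb(UK) 2006 Tier 1 disk request breakdown, in TB (from the slide).
request_tb = {
    "stripped DST (DC06)": 2.2,
    "additional signal production": 30.0,
}

total_tb = sum(request_tb.values())
print(f"Total disk request: {total_tb:.1f} TB")  # 32.2
```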

UK Tape Allocations
LHCb(UK) Tier 1 Tape (TB), Apr–Dec 2006. [monthly figures lost in transcription]
Sum (all experiments) = 372 TB.
Sum (all experiments) = 623 TB.

UK Tier 1 Status
Total available (April 2006): CPU = 830 KSI2K (500 dual-CPU nodes), Disk = 177 TB (60 servers), Tape = 318 TB.
Total available (later in 2006): CPU = 1,096 KSI2K, Disk = 308 TB, Tape = 446 TB.
LHCb Tier 1 (1/6 share): CPU = 222 KSI2K, Disk = 122 TB, Tape = 103 TB.
LHCb(UK) Tier 1 (April 2006): CPU = 150 KSI2K, Disk = 10 TB, Tape = 20 TB.
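The growth from April to later in 2006 can be tabulated from this summary; a sketch using the slide's figures:

```python
# UK Tier 1 capacity, April 2006 vs later in 2006 (from the slide).
capacity = {
    "cpu_ksi2k": (830, 1096),
    "disk_tb":   (177, 308),
    "tape_tb":   (318, 446),
}

for resource, (april, later) in capacity.items():
    print(f"{resource}: {april} -> {later} (+{later - april})")
```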

The Future!
GridPP1: Prototype Grid, £17M, complete. September 2001 – August 2004.
GridPP2: Production Grid, £16M, ongoing. September 2004 – August 2007.
GridPP3: "Exploitation" Grid? PPARC have just issued the call for a bid to cover the period to 2011, to be submitted by 13 July. LHCb input required now.