RAL Tier A
Tim Adye, Rutherford Appleton Laboratory
BaBar UK Physics Meeting
Queen Mary, University of London
25th October 2006

Outline
- CPU Usage
- CPU Allocations
- Disk Status
- The bleak future
- Summary

BaBar Batch CPU Use at RAL

BaBar Batch Users at RAL
(running at least one non-trivial job each week)

CPU Allocations
- CPU Allocation (MAUI fairshare target)
- CPU Usage (MAUI fairshare usage)
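These targets are enforced by the MAUI scheduler's fairshare mechanism on the RAL farm. As a minimal sketch of how such a target is expressed (the group name and percentage below are invented for illustration, not RAL's actual maui.cfg values):

    # Hypothetical maui.cfg fragment -- values for illustration only
    FSPOLICY        DEDICATEDPS     # charge fairshare by dedicated processor-seconds
    FSDEPTH         7               # keep seven fairshare windows of history
    FSINTERVAL      86400           # each window covers one day (in seconds)
    FSDECAY         0.80            # older windows count progressively less
    GROUPCFG[babar] FSTARGET=35.0   # aim to give the babar group 35% of the farm

MAUI compares each group's decayed historical usage against its FSTARGET and adjusts job priorities to steer usage toward the target, which is why allocation and usage can diverge from week to week.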

BaBar CPU Allocation and Usage
[chart: allocation and usage plotted against farm capacity]

Requests and Allocations
[Table of quarterly figures ("End of ..." rows): Disk (TB) and CPU (kSI2k) for the BaBar Jan06 request (MoU); Disk (TB), CPU (kSI2k), Tape (TB), and tape bandwidth (MB/s) for the BaBar Mar06 request (after Tau/QED -> SLAC); and Disk (TB) and CPU (kSI2k) for the GridPP allocation. The numeric values were not preserved in this transcript.]

Data and Storage
- Keeping up to date with new production.
- Using disk space freed up by:
  - Tau/QED skims removed in February
  - R18b pointer skims converted to R18c deep-copy
  - AllEvents removed in August
- All old files are still accessible from tape, except old SP5/SP6 generics, now deleted from tape.
- Current problems with a user data disk:
  - /stage/babar-user1 offline since 13 Oct
  - Data recovery is going slowly; we hope to be done by the end of the week.
- Three 1.9 TB AWG disks:
  - /stage/babar-awg1/Quasi2body (was TauQED)
  - /stage/babar-awg2/Quasi2body
  - /stage/babar-awg3/ThreeBody

Three bullets we try to dodge
1. BaBar disk, tape, and CPU requirements grow with luminosity, but GridPP allocations are unchanged from Jan06 to Dec08.
2. PPGP cuts reduce BaBar/RAL support staff from 1.5 FTE to 0.25 FTE in April 2007.
   - No effort identified to import data, install releases, help users, etc.
   - We identified 21 BaBar-specific tasks needed to keep the Tier A running.
3. GridPP proposes to remove non-Grid access by September 2007.
   - No RAL front-ends; no NFS access to user/AWG disks.
   - Continued BaBar user analysis would probably be impossible.
   - SP and/or skimming might still be possible via the Grid.
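On that last point: if front-end and NFS access disappear, the only remaining route is submission through the LCG workload management system. A minimal, hypothetical JDL job description (the script and file names are invented for illustration; a real SP or skimming job would be considerably more involved):

    // skim.jdl -- illustrative only
    VirtualOrganisation = "babar";
    Executable          = "run_skim.sh";
    Arguments           = "R18c";
    StdOutput           = "skim.out";
    StdError            = "skim.err";
    InputSandbox        = {"run_skim.sh"};
    OutputSandbox       = {"skim.out", "skim.err"};

Such a job would be submitted with edg-job-submit skim.jdl (or glite-job-submit under gLite), with the resource broker choosing a CE that supports the babar VO.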

Summary
- We are making good use of the resources we have, apart from an (understandable) lull over the summer.
- The service works well most of the time; the current disk problem is severe, but rare.
- We are fighting hard for:
  1. the resources we need
  2. the staff we need
  3. the non-Grid access we need