Tier1A Status Andrew Sansum 30 January 2003

Overview
–Systems
–Staff
–Projects

Lots of Services
–Disk farm
–CPU farm
–CDF
–BaBar
–Suns
–Testbeds
–Core services
–AFS
–Datastore
–Support systems

Lots of Operating Systems
Production Farm
–Red Hat 6.2 (close to end of life)
–Red Hat 7.2 (in production / BaBar)
–Red Hat 7.3 (close to trial service; for LHC)
CDF Service
–Red Hat 7.1 (Kerberised Fermi distribution)
–Red Hat 7.3 (possible future release)
Solaris Service
–Solaris 2.6 / Solaris 8
EDG Testbed(s)
–Red Hat 6.2 -> Red Hat 7.3

Lots of EDG Testbeds!
–Production testbed (CE, SE, 3*WN + NM)
–Development testbed (CE, SE, 1*WN)
–R-GMA testbed (CE, SE, WN and RB)
–WP5 SE
–WP3/WP5 development systems
–EDG UI
–CE for the Red Hat 7.2 service

Lots of Grid Testbeds!
–Tier1A
–BaBar

New Hardware
Disk
–Expect 40TB
–Continue with existing IDE technology, but a different manufacturer
CPU
–Expect 100 CPUs
–Move to Pentium 4 or possibly AMD

Some New Staff
–GridPP staff: Traylen, Radden, Bly
–ESC/PPD system staff: Wheeler, White, Sansum, Saunders, Ross, Folkes, Strong
–Management: Kelsey, Gordon, Sansum, ...
–BITD support: networking, operations, user registration, AFS
–Experiment support staff (RAL and elsewhere)
–Users

Lots of New Projects
–Basic fabric performance monitoring (Ganglia)
–Resource CPU accounting (based on PBS accounting records / MySQL; see the sketch after this list)
–New CA in production
–New batch scheduler (MAUI)
–Deploy new helpdesk (end of March)
–Network performance tests (CERN/Bristol; also maybe WP7)
–Get ready for LCG (February deployment?)
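The accounting project is only named on the slide; as an illustration of the kind of processing it involves, below is a minimal Python sketch that totals per-user CPU time from a PBS accounting log, assuming the usual `date;type;jobid;key=value ...` record layout. The field names and file path are assumptions for the example, not details taken from the slides; the resulting totals are what would be loaded into the MySQL accounting database.

```python
#!/usr/bin/env python
"""Toy parser for PBS accounting logs, summing CPU time per user.

Assumed (not taken from the slides): records look like
  date;type;jobid;key=value key=value ...
and 'E' (job end) records carry user= and resources_used.cput=HH:MM:SS.
"""
import sys
from collections import defaultdict

def hms_to_seconds(hms):
    """Convert an HH:MM:SS string to seconds."""
    h, m, s = (int(x) for x in hms.split(":"))
    return h * 3600 + m * 60 + s

def parse_log(path):
    """Return {user: total CPU seconds} from one PBS accounting file."""
    totals = defaultdict(int)
    with open(path) as fh:
        for line in fh:
            parts = line.strip().split(";")
            if len(parts) < 4 or parts[1] != "E":   # only completed jobs
                continue
            fields = dict(kv.split("=", 1) for kv in parts[3].split() if "=" in kv)
            user = fields.get("user")
            cput = fields.get("resources_used.cput")
            if user and cput:
                totals[user] += hms_to_seconds(cput)
    return totals

if __name__ == "__main__":
    for user, secs in sorted(parse_log(sys.argv[1]).items()):
        # These per-user rows are what would go into the MySQL accounting table.
        print("%-12s %10.1f CPU-hours" % (user, secs / 3600.0))
```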

Ganglia Monitoring
–Urgently needed live performance and utilisation monitoring
–RAL Ganglia monitoring (live)
–RAL Ganglia monitoring (static)
–Scalable solution based on multicast (a polling sketch follows below)
–Very rapidly deployable; reasonable support on all Tier1A hardware
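The slide only names Ganglia; as a hedged illustration of how the collected data can be read programmatically, here is a small Python sketch that polls a gmond daemon's XML report (gmond publishes cluster state over TCP, port 8649 by default) and prints each host's one-minute load. The host name is a placeholder, not a real Tier1A node.

```python
#!/usr/bin/env python
"""Minimal poll of a Ganglia gmond daemon.

gmond publishes cluster state as XML on TCP port 8649 by default;
the host name below is a placeholder.
"""
import socket
import xml.etree.ElementTree as ET

GMOND_HOST = "gmond.example.org"   # placeholder host
GMOND_PORT = 8649                  # gmond's default XML port

def fetch_gmond_xml(host, port):
    """Read the full XML dump that gmond writes on connect."""
    chunks = []
    sock = socket.create_connection((host, port), timeout=10)
    try:
        while True:
            data = sock.recv(65536)
            if not data:
                break
            chunks.append(data)
    finally:
        sock.close()
    return b"".join(chunks)

def report_load(xml_bytes):
    """Print the one-minute load average reported for each host."""
    tree = ET.fromstring(xml_bytes)
    for host in tree.iter("HOST"):
        for metric in host.iter("METRIC"):
            if metric.get("NAME") == "load_one":
                print("%-20s load_one=%s" % (host.get("NAME"), metric.get("VAL")))

if __name__ == "__main__":
    report_load(fetch_gmond_xml(GMOND_HOST, GMOND_PORT))
```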

New CA Deployed
–Now fully deployed by the e-Science Centre (Jens + Alastair Mills)
–In use in the UK core Grid
–Several PP sites have RAs defined
–Approved by EDG; not yet in the distribution
–Once in EDG, a termination date for the old CA will be set

New Scheduler (MAUI)
–With Red Hat 7.2, now using the MAUI scheduler on top of PBS
–Some problems with MAUI scheduling on wallclock time; now corrected
–Testing algorithms, but essentially we have a range of strategies we can apply
–Will make changes to the queue structure in due course

New Helpdesk Software
–Old helpdesk (Remedy) is mail based and unfriendly
–With additional staff, we urgently need to deploy a new solution
–Expect the new system to be based on free software (Bugzilla, Request Tracker, …)
–Hope the deployed system will also meet the needs of testbed and Tier 2 sites
–Expect deployment by the end of March

Network Performance Tests
–Simon Metson, Nick White, + …
–Preparing for CMS production; must be able to move data to CERN at Mbit/second
–Currently aggregate 350 Mbit/s to Bristol, but under 100 Mbit/s to CERN (see the probe sketch below)
–Main problem seems to be within the CMS infrastructure
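The slide gives throughput figures but no tooling detail; below is a minimal, hypothetical Python sketch of the sort of memory-to-memory TCP probe such tests rely on (real measurements would more likely use a dedicated tool such as iperf). The host name, port and transfer size are placeholders, not the actual RAL/CERN/Bristol setup.

```python
#!/usr/bin/env python
"""Crude memory-to-memory TCP throughput probe.

Run "serve" on the receiving end, then point the sender at it.
Port and transfer size are arbitrary example values.
"""
import socket
import sys
import time

PORT = 5001                 # arbitrary test port
CHUNK = 64 * 1024           # send/receive buffer size in bytes
TOTAL_MB = 256              # amount of data to push per test

def serve(port=PORT):
    """Accept one connection and discard everything it sends."""
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    srv.bind(("", port))
    srv.listen(1)
    conn, addr = srv.accept()
    received = 0
    while True:
        data = conn.recv(CHUNK)
        if not data:
            break
        received += len(data)
    conn.close()
    print("received %.1f MB from %s" % (received / 1e6, addr[0]))

def probe(host, port=PORT, total_mb=TOTAL_MB):
    """Stream total_mb megabytes to host and report the achieved rate."""
    payload = b"x" * CHUNK
    to_send = total_mb * 1024 * 1024
    sock = socket.create_connection((host, port))
    start = time.time()
    sent = 0
    while sent < to_send:
        sock.sendall(payload)
        sent += len(payload)
    sock.close()
    elapsed = time.time() - start
    print("%d MB in %.1f s -> %.1f Mbit/s" % (total_mb, elapsed, sent * 8 / elapsed / 1e6))

if __name__ == "__main__":
    if sys.argv[1:] and sys.argv[1] == "serve":
        serve()
    else:
        probe(sys.argv[1])   # e.g. python throughput.py receiver.example.org
```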

BaBar Batch CPU Use at RAL (chart; MOU level marked)

Successes (2002)
–Five additional staff online since January
–Fully engaged in the EDG testbed; making an impact in EDG (Steve)
–Tier1A installation went very well in March/April/May
–Tier A service ramp-up excellent: most successful of the Tier A services; SLAC seem pleased, so far

Challenges
–Complete the 2002/2003 tender/deployment
–Carry out major EU tenders for 2003/2004
–Expand use of the Tier 1; need to evolve a strategy to cope with the diversity of requirements
–Deploy the LCG testbed (what/when?)
–Enhance automation and out-of-hours cover
–Improve reporting to GridPP (accountability)