
RAL Site Report
Andrew Sansum, e-Science Centre, CCLRC-RAL
HEPiX, 24 May 2004

Slide 2: Overview
GRIDPP Tier1 Service
Particle Physics Department Tier2
Grid Operations Centre (GOC)
Other E-Science Systems

Slide 3: Tier1 in GRIDPP2
"The Tier-1 Centre will provide GRIDPP2 with a large computing resource of a scale and quality that can be categorised as an LCG Regional Computing Centre."
January 2004 – GRIDPP2 confirmed RAL to host the Tier1 Service
–GRIDPP2 to commence September 2004
Tier1 hardware budget:
–£2.3M over 3 years
Staff
–Increase from 12.1 to 16.5 by September

Slide 4: Current Tier1 Hardware
CPU
–350 dual-processor Intel PIII and Xeon servers, mainly rack-mounted
–About 400 KSI2K
Disk service – mainly "standard" configuration
–Dual-processor servers
–Dual-channel SCSI interconnect
–External IDE/SCSI RAID arrays (Accusys and Infortrend)
–ATA drives (mainly Maxtor)
–About 80TB of disk
–Cheap and (fairly) cheerful
Tape service
–STK Powderhorn 9310 silo with 9940B drives

Slide 5: Layout

Slide 6: New Hardware Arrives 7th June
CPU capacity (500 KSI2K)
–256 dual-processor 2.8GHz Xeons
–2/4GB memory
–120GB HDA
Disk capacity (140TB)
–Infortrend SATA/SCSI RAID arrays
–16 x 250GB Western Digital SATA per array
–Two arrays per server

Slide 7: Development Areas
Storage architecture
–70 disk servers by July; management becoming harder
–Wish to decouple storage hardware from middleware (which middleware?)
–Still using ext2/ext3 – consider alternatives … ClusterFS etc.
–Maybe (at last) need a SAN of some sort
–Becoming interested in iSCSI (or maybe Fibre Channel)
Fabric management
–About 800 systems by July – LCG nodes managed by LCFG, but most still managed using "simple" kickstart (see the sketch after this slide). This is getting harder.
New LAN backbone needed soon
–But too early for 10 Gigabit
–Maybe a new backbone router or switch stack for the LAN, depending on iSCSI plans
Simplify cluster configuration – clean up the spaghetti diagram of services and interfaces
Upgrade from Red Hat 7.3 to Red Hat Enterprise?
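For illustration of the "simple kickstart" approach referred to above, here is a minimal sketch of generating one kickstart file per node from a template. Everything in it (node names, addresses, partitioning, package list) is a hypothetical example, not the actual RAL configuration.

```python
# Minimal sketch of per-node kickstart generation (hypothetical node list,
# addresses and partitioning; not the actual RAL configuration).
import os

KICKSTART_TEMPLATE = """\
install
text
lang en_US
keyboard us
network --bootproto static --ip {ip} --netmask 255.255.255.0 --gateway 192.168.0.1 --nameserver 192.168.0.2 --hostname {host}
rootpw --iscrypted $1$examplehash$XXXXXXXXXXXXXXXXXXXXXX
timezone Europe/London
bootloader --location=mbr
clearpart --all --initlabel
part / --fstype ext3 --size 8192
part swap --size 1024
part /data --fstype ext3 --size 1 --grow
reboot

%packages
@ Base
openssh-server

%post
/usr/sbin/useradd batchuser   # placeholder post-install step
"""

# Hypothetical inventory: hostname -> IP address.
NODES = {
    "csfnode001": "192.168.0.101",
    "csfnode002": "192.168.0.102",
}

def write_kickstarts(outdir="kickstarts"):
    """Write one kickstart file per node into outdir."""
    os.makedirs(outdir, exist_ok=True)
    for host, ip in NODES.items():
        with open(os.path.join(outdir, f"{host}.cfg"), "w") as f:
            f.write(KICKSTART_TEMPLATE.format(host=host, ip=ip))

if __name__ == "__main__":
    write_kickstarts()
```

The point of the sketch is the scaling problem the slide describes: with ~800 systems, per-node templates like this become hard to keep consistent, which is one motivation for a fabric-management tool such as LCFG.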

Slide 8: RAL PP Tier2
Run by the Particle Physics Department; acts as a peer with the other UK university Tier2 systems
Currently 30 nodes running LCG2
Hardware upgrades expected each year
–Additional 24 systems and 8TB of disk in July
–50 CPUs and 5TB of disk each year

Slide 9: Grid Operations
CCLRC is involved in grid operations for:
–LCG
–GridPP
–NGS
–CCLRC
–EGEE
This means different things for different grids.

Slide 10: LCG GOC Monitoring
Within the scope of LCG we are responsible for:
–Monitoring how the grid is running – who is up, who is down, and why (a simple probe sketch follows this slide)
–Identifying problems, contacting the right people, suggesting actions
–Providing scalable solutions to allow other people to monitor resources
–Managing site information – the definitive source of information
–Accounting – aggregate job throughput (per site, per VO)
Established at CCLRC (RAL); the status of the LCG2 grid is published on the GOC pages.
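As a rough illustration of the "who is up, who is down" monitoring described above, the sketch below probes a site GIIS with an anonymous LDAP query. The host name, port 2135, base DN and GLUE attribute are assumptions typical of MDS-based LCG2 deployments, not the GOC's actual tooling.

```python
# Minimal sketch of a GOC-style "is the site up?" probe against a site GIIS.
# Port 2135 and the base DN are assumptions typical of MDS/LCG2 setups, not
# taken from the slides. Requires the python-ldap package.
import ldap

def site_is_up(giis_host, base_dn="mds-vo-name=local,o=grid", timeout=10):
    """Return True if the site GIIS answers an anonymous LDAP query."""
    try:
        conn = ldap.initialize(f"ldap://{giis_host}:2135")
        conn.set_option(ldap.OPT_NETWORK_TIMEOUT, timeout)
        conn.simple_bind_s()  # anonymous bind
        # Any successful search means the site information system is publishing.
        results = conn.search_s(base_dn, ldap.SCOPE_SUBTREE,
                                "(objectClass=GlueCE)", ["GlueCEUniqueID"])
        return len(results) > 0
    except ldap.LDAPError:
        return False

if __name__ == "__main__":
    # Hypothetical host name, used purely for illustration.
    print(site_is_up("giis.example-site.ac.uk"))
```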

Slide 11: LCG2 Core Sites – status as of 12th May: ~30 sites

Slide 12: LCG Accounting Overview
[Diagram: CE with PBS/LSF jobmanager logs and a GateKeeper (GRAM authentication, listening on port 2119), a GIIS LDAP information server, and a MON box hosting the R-GMA database]
We have an accounting solution. The accounting is provided by R-GMA: at each site, log-file data from the different sources is processed and published into a local database.
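To make the "processed and published into a local database" step concrete, here is a minimal sketch that parses PBS accounting end-of-job records and loads them into a local table. SQLite stands in for the R-GMA/MON database, and the table layout, field choices and log path are assumptions, not the real GOC implementation.

```python
# Rough sketch of the per-site accounting step: parse PBS accounting "E" (job end)
# records and publish them into a local database. SQLite stands in for the
# R-GMA/MON database; schema, field names and paths are illustrative assumptions.
import sqlite3

def parse_pbs_end_records(path):
    """Yield (jobid, user, group, queue, walltime) tuples from a PBS accounting log."""
    with open(path) as log:
        for line in log:
            parts = line.strip().split(";")
            if len(parts) < 4 or parts[1] != "E":   # keep only completed-job records
                continue
            jobid = parts[2]
            fields = dict(kv.split("=", 1) for kv in parts[3].split() if "=" in kv)
            yield (jobid,
                   fields.get("user", ""),
                   fields.get("group", ""),          # mapped to a VO in practice
                   fields.get("queue", ""),
                   fields.get("resources_used.walltime", ""))

def publish(records, dbfile="accounting.db"):
    """Insert parsed job records into a local accounting table."""
    con = sqlite3.connect(dbfile)
    con.execute("""CREATE TABLE IF NOT EXISTS jobs
                   (jobid TEXT PRIMARY KEY, user TEXT, grp TEXT,
                    queue TEXT, walltime TEXT)""")
    con.executemany("INSERT OR REPLACE INTO jobs VALUES (?,?,?,?,?)", records)
    con.commit()
    con.close()

if __name__ == "__main__":
    # Hypothetical log path (PBS writes one accounting file per day).
    publish(parse_pbs_end_records("/var/spool/pbs/server_priv/accounting/20040512"))
```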

Slide 13: LCG Accounting – How It Works
GOC provides an interface to produce accounting plots "on demand":
–Total number of jobs per VO per site (ok)
–Total number of jobs per VO aggregated over all sites (to be done)
–Tailor plots according to the requirements of the user community
[Example plot: Taipei statistics for Feb/Mar, ~1000 ALICE jobs]
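Continuing the SQLite stand-in from the previous sketch, a "jobs per VO per site" breakdown of the kind plotted on demand could be expressed as a simple GROUP BY; the central jobs(site, vo, jobid) schema assumed here is hypothetical.

```python
# Sketch of the "jobs per VO per site" aggregation behind the on-demand plots.
# Assumes a central table jobs(site TEXT, vo TEXT, jobid TEXT, ...) collected
# from the per-site databases; the schema is hypothetical.
import sqlite3

def jobs_per_vo_per_site(dbfile="goc_accounting.db"):
    con = sqlite3.connect(dbfile)
    rows = con.execute("""SELECT site, vo, COUNT(*) AS njobs
                            FROM jobs
                        GROUP BY site, vo
                        ORDER BY site, njobs DESC""").fetchall()
    con.close()
    return rows

if __name__ == "__main__":
    for site, vo, njobs in jobs_per_vo_per_site():
        print(f"{site:20s} {vo:10s} {njobs}")
```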

Slide 14: Other Major Developments
Major new scientific compute facilities:
NGS: UK National Grid Service storage node
–18TB SAN
–40 CPUs
–Myrinet and Oracle
SCARF: 128-node Opteron cluster (Myrinet)
–64-bit scientific compute service for CCLRC science
Two more large 64-bit computational chemistry services due by Christmas
About 500 kW of equipment by Christmas; power and cooling currently under review