RAL Site Report
John Gordon
IT Department, CLRC/RAL
HEPiX Meeting, JLAB, October 2000

Outline
– NT Farm
– Linux Farm and servers
– Data Storage
– Networking
– Other Facilities
– Grid

NT Farm
System
– LSF batch; CPUs: 18 (450MHz) + 9 (200MHz)
– MS Terminal Server front-end - dual 450MHz
Use - LHCb Simulation
– event generation & reconstruction
– 500k + assorted events already generated
– Feb 2001 targets: generation of 250k B-Bbar inclusive + 250k minimum-bias events; reconstruction with min-bias event overlay
Future LHCb work
– GRID/Globus emphasis
– use Linux facilities at CERN, RAL (CSF) and others

Linux Production Farm
60 dual 450MHz and 600MHz machines
– another 40 duals to be added before the end of 2000 (>2x capacity)
PBS batch system
Kickstart installation; autorpm
Hardware health monitoring
No network boot; disk and memory are an issue
No racked systems - boxes on shelves

Level of Operating System
All machines run RH6.1, but BaBar want RH6.2 and CDF want FNAL Linux
No problem yet, but it creates more work

Kickstart
Very good for installation, but results in many kickstart files
– a maintenance problem
Need a configuration tool that
– remembers changes and produces the kickstart file
– applies new changes to multiple kickstart files
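To illustrate the kind of configuration tool meant here, below is a minimal sketch under assumed conditions: per-host kickstart files are generated from one shared template, so a change made once in the template (or in a host's small override record) propagates to every generated file on the next run. The template fields, host names, and paths are hypothetical, not RAL's actual configuration.

```python
# Sketch of a minimal kickstart "configuration tool": one shared template,
# many generated kickstart files. All names and paths are illustrative.
from string import Template
from pathlib import Path

# Shared base profile; edit once, regenerate everywhere.
BASE = Template("""\
install
lang en_GB
rootpw --iscrypted $rootpw_hash
network --bootproto static --ip $ip --hostname $hostname
%packages
@base
$extra_packages
""")

# Per-host differences kept as small override records (hypothetical hosts).
HOSTS = {
    "farm001": {"ip": "192.168.1.1", "extra_packages": "pbs-mom"},
    "farm002": {"ip": "192.168.1.2", "extra_packages": "pbs-mom"},
    "web01":   {"ip": "192.168.2.1", "extra_packages": "httpd"},
}

def regenerate(outdir="kickstarts"):
    """Write one kickstart file per host from the shared template."""
    out = Path(outdir)
    out.mkdir(exist_ok=True)
    for hostname, overrides in HOSTS.items():
        ks = BASE.substitute(hostname=hostname,
                             rootpw_hash="PLACEHOLDER_HASH",
                             **overrides)
        (out / (hostname + ".ks")).write_text(ks)

if __name__ == "__main__":
    regenerate()
```

A tool along these lines removes the per-file maintenance problem: the generated kickstart files become throwaway output, and only the template plus the per-host overrides are maintained by hand.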

Memory & Disk Allocation
Memory (256MB per dual)
– starting to get jobs that use more than 50% of this
Local disk (10GB and 20GB scratch space)
– some jobs starting to need >5GB
Both of these are easily solved by new machines
– but we don’t scrap or upgrade the old machines
Need a batch environment that matches jobs to the capability of machines - PBS?
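A hedged sketch of how that matching could work with PBS: jobs declare their memory and file-size needs at submission time, so the scheduler only places them on machines whose resources suffice, and the small old nodes keep running small jobs. The qsub resource names are standard OpenPBS-style requests, but the script name and limits are invented for illustration.

```python
# Sketch: submit a job with explicit resource requests so PBS can route it
# to a node with enough memory and disk. Script name and limits are made up.
import subprocess

def submit(script, mem_mb, file_gb):
    """Submit `script` to PBS with OpenPBS-style resource limits (-l).

    Nodes that cannot satisfy the requests are never matched, so large
    jobs avoid the older 256MB/10GB machines automatically.
    """
    resources = "mem=%dmb,file=%dgb" % (mem_mb, file_gb)
    subprocess.run(["qsub", "-l", resources, script], check=True)

if __name__ == "__main__":
    # e.g. a simulation job needing 200MB of memory and 6GB output files:
    submit("lhcb_sim.sh", mem_mb=200, file_gb=6)
```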

Data Storage
Currently IBM 3590 tapes in a 3494 library
Looking at a capacity increase
– for new experiments (BaBar, CDF, SNO)
– for LHC and DataGrid
Have considered IBM, STK, and ADIC; no conclusions yet

Networking
WAN - SuperJANET 4: 2Gb backbone in 1Q2001
– 10Gb in 2003 or sooner
– 622Mb to RAL
LAN - upgrade from FDDI to support the above
– grid of interconnected switches with multiple Gb links
– Gb Ethernet for servers and switches
– fibre and copper

Special Services
BaBar
– 6TB of disk at RAL + more at 9 UK universities (1999)
– 4 x Sun 420R (16 CPUs) at RAL, farms at universities (2000)
CDF
– disk (RAL & universities) and farms at universities
– Ian McArthur’s talk on Friday

Grid Issues
Certification
– UK CA for HEP
Datastore access
– access to the Atlas Datastore via Globus tools
Network performance
– QoS and performance
Metadata - resources & data
– projects with BaBar, CDF, and others, working with many other areas of science
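To make the certification point concrete, here is a small sketch of the kind of check a site might perform on a user certificate issued by the CA before granting access: reading its subject, issuer, and expiry. It uses the modern Python cryptography package purely for illustration; the file name is hypothetical, and this is not the Globus tooling the slide refers to.

```python
# Illustrative only: inspect an X.509 user certificate (subject, issuer,
# expiry) as a site might before registering the holder as a grid user.
from cryptography import x509

with open("usercert.pem", "rb") as f:  # hypothetical path
    cert = x509.load_pem_x509_certificate(f.read())

print("Subject:", cert.subject.rfc4514_string())
print("Issuer: ", cert.issuer.rfc4514_string())  # expected: the UK HEP CA
print("Expires:", cert.not_valid_after)
```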

UK HEP Grid
4 or 5 sites will define themselves as a Grid today
– authentication via a common CA at RAL
– GIIS: an information service storing resource metadata from all sites
– a user account database for easy registration and group authorisation
Will form the basis of the UK testbed for DataGrid
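Since the GIIS in Globus MDS is an LDAP directory, the resource metadata it holds can be queried with any LDAP client. Below is a hedged Python sketch using the python-ldap package; the host name, port 2135 (commonly used by MDS), and base DN are assumptions for illustration, not the actual UK GIIS details.

```python
# Sketch: query a GIIS (an LDAP server in Globus MDS) for resource
# metadata. Host, port, and base DN are illustrative assumptions.
import ldap  # python-ldap package

GIIS_URL = "ldap://giis.example.ac.uk:2135"  # 2135: port commonly used by MDS
BASE_DN = "o=Grid"                           # assumed base DN

conn = ldap.initialize(GIIS_URL)
conn.simple_bind_s()  # MDS is typically queried anonymously

# Dump the DN and attributes of every entry the sites have registered.
for dn, attrs in conn.search_s(BASE_DN, ldap.SCOPE_SUBTREE, "(objectclass=*)"):
    print(dn, attrs)
```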

Finally
A year ago the problem seemed to be managing large numbers of roughly homogeneous systems for CPU and storage
– tools like Kickstart seemed to offer hope
In the last six months, demand has risen for a number of administratively distinct systems
– DataGrid testbed, grid development systems, LDAP servers, data portals, user database servers, network monitors, QoS testbeds
– they may all have Linux in common, but they have many different software and management requirements
All of this increases the demands on staff