Imperial College Site Report
HEP Sysman meeting, 28 April 2003

Outline
Equipment – what we have
Software – what we use
Activities – what we do
Problems – all of the above

Equipment
Unix servers & desktops
– 2 x Sun E450 servers (Solaris 8, ~1 TByte each): one JREI-funded BaBar resource, one general. Looking old and expensive now compared to…
– Linux PC-based servers and desktops: 2 PC-based RAID servers, ~1 TByte each; 4 dual-processor 2 GHz rack-mounted machines running RH 7.3 for general interactive/batch use; various individual Linux desktops (some Grid).

Equipment
Linux PC farms
– BaBar Linux PC farm, from JREI. Main analysis facility for the IC BaBar group. 2 dual-CPU masters, 40 dual-CPU workers. PBS batch system (example submission sketch below), ~300 GBytes per master.
– CMS/Grid PC farm: 5 master nodes with 440 GBytes disk each, 40 worker nodes. All are 1 GHz dual PIII with 1 GB of RAM per CPU.
– LeSC grid resources
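As an illustration of how work reaches a PBS-managed farm like the BaBar one, here is a minimal sketch of writing and submitting a batch job. The queue name, resource limits and executable path are hypothetical placeholders, not the farm's actual configuration.

```python
#!/usr/bin/env python
# Minimal sketch: write a PBS job script and submit it with qsub.
# Queue name, walltime and paths are illustrative placeholders only.
import subprocess

job_script = """#!/bin/sh
#PBS -N babar-analysis
#PBS -q babar            # hypothetical queue name
#PBS -l nodes=1:ppn=2    # one dual-CPU worker
#PBS -l walltime=12:00:00
cd $PBS_O_WORKDIR
./run_analysis           # placeholder for the real analysis executable
"""

with open("analysis.pbs", "w") as f:
    f.write(job_script)

# qsub prints the job identifier on success
subprocess.run(["qsub", "analysis.pbs"], check=True)
```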

Equipment
Windows servers and desktops
– Windows 2000 server & backup server: group W2K domain; accounts, profiles, home directories, experiment areas; domain printer queues.
– W2K and XP desktop PCs (~70 machines): current default desktop environment; MS Office, Windows SSH, Exceed, …; some PCs with specialist software, e.g. CAD.

Solaris Group Server
Sun Enterprise E450 running Solaris 2.8
Three 400 MHz processors
Two network interfaces
– 100 Mbit/s to original subnet
– 1 Gbit/s to farm subnet
~1 TByte disk, of which 800 GBytes RAID
Web, mail, user accounts, …

Software
Unix: Solaris 2.8, Red Hat Linux 7.3
– No user software supported on Solaris.
– The usual Linux s/w plus whatever experiment-specific software we need.
– Linux version is tied to the experiments.
Windows server and desktops
– College deal provides standard MS Office products for licensed Windows PCs.

Activities
HEP programme
– BaBar, CMS, DØ, LHCb, Zeus, dark matter, neutrino factory, detector development
– Considerable MC production for the experimental programmes
Grid developments
– see separate slides…
Desktop Office applications

LHCb MC production

DØ MC production

Grid Developments
We are a testbed node
– with a CE, 8 WNs (dual 1 GHz PIII) and 1 SE with ~440 GB
We run a resource broker (RB)
– used as one of the 4 production RBs (the others are at CERN, Lyon and CNAF)
– it is also the GridPP and BaBar RB (job-submission sketch below)
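To illustrate how jobs arrive at a resource broker like ours, here is a minimal sketch of an EDG-style submission: a small JDL file handed to the submission client from a user interface node. The JDL attributes follow the EDG Job Description Language, but the client command name varied between EDG releases (dg-job-submit vs. edg-job-submit), so treat the invocation as an assumption rather than a record of our setup.

```python
#!/usr/bin/env python
# Sketch of an EDG-style job submission to a resource broker.
# The client command name differed between EDG releases; adjust as needed.
import subprocess

jdl = """Executable    = "/bin/hostname";
Arguments     = "-f";
StdOutput     = "hostname.out";
StdError      = "hostname.err";
OutputSandbox = {"hostname.out", "hostname.err"};
"""

with open("hostname.jdl", "w") as f:
    f.write(jdl)

# Submitted from a UI node; the RB used is whichever one the UI is
# configured to contact (e.g. the Imperial production RB).
subprocess.run(["edg-job-submit", "hostname.jdl"], check=True)
```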

Grid Developments
We took part in the CMS Grid stress test before Christmas.
We run a production-quality(?) SAM station which automatically delivers the data required by our DØ members.
Have gridified (part of) the BaBar farm (… MHz PIII).

e-Science (not Grid)
We are now making heavy use of Viking at LeSC
– (132 dual 2 GHz Xeon nodes currently… a new procurement is currently underway, and another in ~6 months)
We also use the HSM hosted by Saturn (24-processor E…, … TB of disk, 24 TB tape space)
– Have found issues with timeouts as data is transferred from tape (retry sketch below).
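Tape-staging timeouts are the kind of thing a client-side retry loop can ride out while the HSM brings a file back online. The sketch below is a generic illustration of that idea, not our actual workaround; the copy command, delays, retry count and paths are all assumptions.

```python
#!/usr/bin/env python
# Generic retry-with-backoff wrapper for copies that may time out while
# the HSM stages a file back from tape. Command and delays are illustrative.
import subprocess
import time

def copy_with_retries(src, dst, attempts=5, initial_delay=60):
    """Retry a copy, doubling the wait each time, to ride out tape staging."""
    delay = initial_delay
    for attempt in range(1, attempts + 1):
        result = subprocess.run(["cp", src, dst])
        if result.returncode == 0:
            return True
        print(f"attempt {attempt} failed, retrying in {delay}s")
        time.sleep(delay)
        delay *= 2
    return False

# Example (hypothetical paths on the HSM mount):
# copy_with_retries("/hsm/babar/run123.root", "/data/local/run123.root")
```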

Current Issues: Servers
The Suns are getting old and are no longer cost-effective.
– The BaBar Sun is out of warranty; the group server will be next year.
– Maintenance on the RAIDs for the Suns is too expensive, and the disks are expensive.
– CPU maintenance only; assign some part of the RAID as spares.

Current Issues: Desktops
Do we stick with Windows for the standard desktop?
– College policy is for a Windows desktop; they also want complete control over all aspects of installed software, PC purchase and networking: a “standardised desktop”.
– Increasingly, users want a Linux desktop, especially Grid developers; many have only an infrequent need for Windows.

Current Issues: Desktops
Dual booting is unattractive
– experimenting with Terminal Server software for “occasional” Windows users. Seems to work well, but we need to clarify the licensing situation. Probably OK for us.
Have considered providing laptops
– many people already use laptops as their default desktop machine; there are advantages when travelling, e.g. on LTA.

Current Issues: Security
College firewall
– moving to a default “deny all” policy this year; maybe even ssh will be blocked unless registered.
– already causing some problems with recent blocks on all ports > 1026: ftp call-backs etc.; a problem for Kerberised ftp to FNAL, which needs PASV mode (sketch below). Tough for emacs. Some problems for Grid applications.
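The PASV issue is easy to illustrate with plain FTP: in passive mode the client opens the data connection outbound, so an inbound default-deny firewall (ports > 1026 blocked) does not break transfers. The sketch below uses Python's ftplib against a placeholder host; the Kerberised FNAL client is a different program, but the active/passive distinction it needs is the same.

```python
#!/usr/bin/env python
# Passive-mode FTP sketch: with PASV the client opens the data connection
# outbound, so an inbound default-deny firewall does not block the transfer.
# Host, credentials and path are placeholders.
from ftplib import FTP

ftp = FTP("ftp.example.org")      # hypothetical server
ftp.login()                       # anonymous login for illustration
ftp.set_pasv(True)                # PASV: no inbound data connection needed

with open("file.dat", "wb") as out:
    ftp.retrbinary("RETR pub/file.dat", out.write)

ftp.quit()
```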

Current Issues: Security
Some Grid developments clash with the need for secure systems.
– EDG software still needs the obsolete RH 6.2.
– Most of our Grid developers are really ex-HEP RAs, not software professionals. We need to make sure they are not cutting corners on security and compromising the rest of our systems for expediency. Of course, they all want root access.
