Brunel University, School of Engineering and Design, Uxbridge, UB8 3PH, UK
Henry Nebrensky (not a systems manager)
SIRE Group

HEPSYSMAN, 27 April 2005 – SIRE Group facilities
[Network diagram, labels only: "big wide world", Grid cluster, site firewall, WebSense, campus network, SIRE users, other schools, other groups, residences. Note: the Grid cluster sits outside the campus.]

HEPSYSMAN, 27 April 2005 – SIRE Grid facilities
- Two RH7.3 "green-spot" nodes working with LCG 2.3.0, but not yet the production cluster
- Production cluster (SRIF 1)
  – Simple job submission works (a minimal submission sketch follows this slide)
  – 64 dual-Xeon nodes
  – 2 GB memory per node
  – Dual 1-Gbit network connectivity between nodes
  – Part of the London Distributed Tier-2 centre
  – 128 more nodes to come (SRIF 2)
- All cluster computing resources are shared with a number of other non-HEP users.
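
For context, a minimal sketch of what "simple job submission" might look like on an LCG 2.3.0 user interface. It assumes only the standard EDG/LCG command-line tool edg-job-submit and an existing Grid proxy; the executable and file names are illustrative, not taken from the talk.

# Minimal sketch: submit a trivial test job through an LCG-2 user interface.
# Assumes the standard EDG/LCG UI command `edg-job-submit` is on the PATH and
# a valid Grid proxy already exists; all file names here are illustrative.
import subprocess

JDL = """\
Executable    = "/bin/hostname";
StdOutput     = "test.out";
StdError      = "test.err";
OutputSandbox = {"test.out", "test.err"};
"""

def submit_test_job(jdl_path="hostname.jdl", jobid_file="jobids.txt"):
    # Write the job description, then hand it to the workload management system.
    with open(jdl_path, "w") as f:
        f.write(JDL)
    subprocess.run(["edg-job-submit", "-o", jobid_file, jdl_path], check=True)

if __name__ == "__main__":
    submit_test_job()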

HEPSYSMAN, 27 April 2005 – SIRE user computing
- HEP computing plus image processing and digital holography
- About 15 local users
- Windows 2000 or Linux (SLC) on the individual's desktop/laptop
- E-mail, networking and modest file storage provided centrally (separate Win2K and Unix servers at present)
- No funded system management effort (yet)

HEPSYSMAN, 27 April 2005 – SIRE group facilities
We have:
- A Sun RAID array (BaBar)
- One Athlon XP (RH 7.3) – ROOT and LCG UI (partly blocked by the firewall; see the connectivity check after this slide)
- An old SGI Indy, and a dedicated VRVS PC (Win2K)
We don't have:
- Local mass storage for backups
- Our own e-mail, web, firewall etc.
- Our own subnet
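
Since the LCG UI above is described as partly blocked by the firewall, a simple TCP connect test shows which outbound Grid services actually get through. This is only an illustrative sketch: the host names and port numbers below are hypothetical placeholders, not the group's real configuration.

# Illustrative sketch: test whether outbound TCP connections that a Grid UI
# needs can get through the campus firewall. Hosts and ports below are
# hypothetical placeholders, not the group's actual services.
import socket

CHECKS = [
    ("rb.example.ac.uk", 7772),   # hypothetical resource broker service
    ("lb.example.ac.uk", 9002),   # hypothetical logging & bookkeeping service
]

def port_open(host, port, timeout=5.0):
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    for host, port in CHECKS:
        state = "reachable" if port_open(host, port) else "blocked or unreachable"
        print(host, port, state)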

HEPSYSMAN, 27 April 2005 – Local challenges
All networking is controlled centrally:
– Presents a real challenge for Grid computing, videoconferencing etc.
– Centrally managed firewall
– Central responsibility for certificate authentication?
– How does a WebSense box work – and how will it affect the performance of Web Services?

HEPSYSMAN, 27 April 2005 – Local challenges
We share our subnets (physically and logically) with other groups and teaching labs; the campus network is shared with other departments, admin and even student residences.
– Currently no easy way to apply firewall rules to just our group – hence issues for Grid users
– Very little internal control: constant stream of Windows and Apache probes…
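
One way to put a number on that "constant stream" is to tally well-known worm and scanner signatures in the web server's access log. The sketch below is an assumption-laden illustration: the log path and the signature list are placeholders, not taken from the talk.

# Illustrative sketch: count common worm/scanner probes in an Apache access log.
# The log path and the signature list are assumptions for illustration only.
from collections import Counter

LOG_PATH = "/var/log/httpd/access_log"   # assumed default location
SIGNATURES = ["cmd.exe", "root.exe", "default.ida", "xmlrpc.php", "awstats.pl"]

def count_probes(path=LOG_PATH):
    hits = Counter()
    with open(path, errors="replace") as log:
        for line in log:
            for sig in SIGNATURES:
                if sig in line:
                    hits[sig] += 1
    return hits

if __name__ == "__main__":
    for sig, n in count_probes().most_common():
        print(n, sig)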