Brunel University, Department of Electronic and Computer Engineering, Uxbridge, UB8 3PH, UK
Dr Peter R Hobson CPhys MInstP, SIRE Group

HEPSYSMAN, 28 April 2003

SIRE Group computing
HEP computing plus image processing and digital holography; about 15 local users
GRID computing for particle physics and other computationally intensive applications
Windows 2000 or RH 7 or 8 on the desktop; networking provided centrally (Unix servers at present, but migrating to full W2000)
No PPARC-funded system management effort

SIRE Group facilities
Small amount of desktop power
New cluster being installed (SRIF 1) in our new “BITLab”
– 64 dual hyper-threading Xeon nodes
– 2 GB memory per node
– Dual 1-Gbit network connectivity between nodes
– Part of the London Distributed Tier-2 centre
Recent success with SRIF 2 funds
– 128 more nodes to come
All cluster computing resources shared with a number of other non-HEP users
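
For a rough sense of scale, here is a minimal Python sketch of the aggregate capacity implied by the node counts above. The per-node thread count is an assumption (dual-socket, single-core Xeons with Hyper-Threading, i.e. 4 logical threads per node), and clock speed is not given on the slide, so raw FLOPS are left out.

```python
# Rough capacity estimate for the BITLab cluster described above.
# Assumption (not on the slide): each "dual hyper-threading Xeon" node has
# 2 physical CPUs, each single-core with Hyper-Threading, giving 4 logical
# threads per node.

def cluster_totals(nodes, sockets_per_node=2, threads_per_socket=2,
                   mem_gb_per_node=2, nics_gbit_per_node=2):
    """Aggregate logical threads, memory (GB) and NIC capacity (Gbit/s)."""
    return {
        "nodes": nodes,
        "logical_threads": nodes * sockets_per_node * threads_per_socket,
        "memory_gb": nodes * mem_gb_per_node,
        "network_gbit": nodes * nics_gbit_per_node,
    }

if __name__ == "__main__":
    for label, n in (("SRIF 1", 64), ("SRIF 1 + SRIF 2", 64 + 128)):
        t = cluster_totals(n)
        print(f"{label}: {t['nodes']} nodes, {t['logical_threads']} logical "
              f"threads, {t['memory_gb']} GB RAM, "
              f"{t['network_gbit']} Gbit/s aggregate NIC capacity")
```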

Local challenges
All networking controlled centrally
– Presents a real challenge for GRID computing, videoconferencing, etc.
– Centrally managed firewall
– Central responsibility for certificate authentication
Efficient and flexible management of the Tier-2
– Mixed user community with special needs: HEP computing, 3D visualisation, video walls, ray-tracing engines, etc.
– Coordination of new resources
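
To make the firewall point concrete, below is a minimal connectivity-check sketch (Python standard library only) of the kind a site might run when negotiating Grid service ports with a central networking team. The host name and data-port range are hypothetical placeholders, not the actual Brunel configuration; ports 2119 and 2811 are the classic Globus gatekeeper and GridFTP ports of that era.

```python
# Sketch: probe whether TCP ports needed by Grid services are reachable
# through a centrally managed firewall. Host name and data-port range are
# hypothetical placeholders, not the real Brunel setup.

import socket

GRID_HOST = "gridgate.example.ac.uk"   # hypothetical gatekeeper host
PORTS_TO_CHECK = [2119, 2811] + list(range(20000, 20005))  # gatekeeper, GridFTP,
                                                           # sample data ports

def port_open(host, port, timeout=3.0):
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    for port in PORTS_TO_CHECK:
        state = "open" if port_open(GRID_HOST, port) else "blocked/unreachable"
        print(f"{GRID_HOST}:{port} -> {state}")
```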

Summary
Major changes taking place
Huge (for us) increase in computing power and management complexity
No increase in funded personnel
New mixed user environment
Changes in the central computing provision
Our “BITLab” is a University flagship project, so it had better work well this autumn!