Oxford Particle Physics Site Report
HEPSYSMAN, 22nd March 2000
Pete Gronbech, Systems Manager

Goals and Objectives

- Flexibility on the desktop (low-cost seats)
  - Access to networked services (X, HTTP, IMAP)
  - Access to PC applications
  - Reduce management overheads
  - Reduce costs
- Servers provide compute power
  - Central servers for UNIX, VMS, mail and web
  - Compatibility with CERN / DESY / Fermilab
  - Provide code development environments

The Server / Desktop Divide

[Diagram: desktops (NT PC, Unix workstation) on one side; servers (general-purpose Unix server, VMS server, mail server, web server, NT server) on the other.]

Status

- Aim to standardise and centralise as much of the IT infrastructure as provides real benefit across the whole department.
- HEP-specific computing sits on top of the basic infrastructure.
- Campus backbone is now Gigabit Ethernet, with 100 Mb/s to departments. Physics is a mixture of 10/100 Mb/s switched Ethernet behind a 'drawbridge' firewall.
- Desktop strategy: Windows NT4 Workstation (100 in particle physics, 200 across physics) with MS Office, eXceed, web browsers, mail clients etc. Systems are generated by cloning with GHOST (see the sketch after this slide).
- Server strategy, HEP computation: heavy use of remote compute farms, plus a local compute server (a 3-CPU Digital UNIX machine) for interactive work, especially code development. Linux for the CDF group, a RAL CSF porting machine, and a MINOS dual-boot DAQ machine.
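To make the cloning step concrete: Ghost's command line can be driven from a script so that every new desktop is imaged identically. The sketch below is a minimal Python wrapper; the image path, hostname and the exact -clone switch syntax are illustrative assumptions from memory of Symantec Ghost's CLI, not details from the talk, and should be checked against the Ghost version in use.

```python
# Minimal sketch of scripted cloning, assuming Symantec Ghost's
# "-clone" command line (mode=load restores a disk image onto a
# target disk). Image path and switch syntax are assumptions.
import subprocess
import sys

IMAGE = r"\\server\images\nt4-desktop.gho"  # hypothetical image location

def clone_disk(dst_disk=1):
    """Restore the standard NT4 desktop image onto the given disk."""
    cmd = [
        "ghost.exe",
        "-clone,mode=load,src=%s,dst=%d" % (IMAGE, dst_disk),
        "-sure",  # suppress the confirmation prompt
        "-rb",    # reboot once the restore finishes
    ]
    return subprocess.call(cmd)

if __name__ == "__main__":
    sys.exit(clone_disk())
```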

Status 2

- Additional CPU is provided by a Digital Alpha 500au; new CPU capacity will be provided by Linux. Just ordered a dual 800 MHz, 2 GB RAM system for SNO analysis.
- Oxford is the 'lead site' for the successful CDF JIF bid: a multi-CPU server plus a 1 TB store at each CDF institute, a larger 2 TB store at RAL, and larger still at Fermilab.
- Server strategy, IT in general: six NT servers for desktop file/print, Exchange 5.5 for e-mail, IIS 4 for web serving, and MS Terminal Server to give remote access to NT4.
- VMS: DAQ systems are still important, but the general-purpose service is running down (mail, word processing etc. moving to NT).
- Data acquisition: LabVIEW on NT for most laboratory DAQ and control (used by a wide range of research groups).
- Videoconferencing: PC-based Intel system plus access to an ISDN-6 Tandberg. MS NetMeeting is used frequently to DESY.

Linux

Treat Linux as just another Unix, and hence a server OS to be managed centrally; we wish to avoid badly managed desktop PCs running Linux (see the sketch after this slide).

[Diagram: desktops (NT PC, Unix workstation) and servers, with the Linux systems: CDF Linux (dual 400 MHz PII), RAL Linux farm porting machine, MINOS Linux/NT DAQ, pplx1 (RH5.0), pplx2 (RH5.2), ppnt109 (RH6.1).]
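One routine chore behind "managed centrally" is auditing which Red Hat release each box runs, since RH5.0, RH5.2 and RH6.1 coexist above. The sketch below polls each host remotely; it is a minimal illustration, assuming passwordless ssh from a management node to the hostnames listed on the slide, not a tool described in the talk.

```python
# Minimal sketch: report the Red Hat release on each centrally
# managed Linux host. Hostnames are those on the slide;
# passwordless ssh from the management node is assumed.
import subprocess

HOSTS = ["pplx1", "pplx2", "ppnt109"]

for host in HOSTS:
    try:
        release = subprocess.check_output(
            ["ssh", host, "cat", "/etc/redhat-release"],
            stderr=subprocess.STDOUT, timeout=30,
        ).decode().strip()
    except subprocess.SubprocessError as exc:
        release = "unreachable: %s" % exc
    print("%-10s %s" % (host, release))
```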

Plans and Concerns

- Look to replace the local compute server with a RAID disk server and Linux CPU servers. The disk server could be Intel-based running NT or Linux, but if performance is not sufficient a proprietary solution may be used.
- Experience gained with the CDF distributed data store will help plan LHC requirements.
- Choice of OS/platforms for computation: it is clear that this will be Red Hat Linux.
- NT4 provides all the desktop functionality we need. We will look at Windows 2000, but there is no rush (at least for the desktop).
- Cost of software licensing.
- Manpower.