Southgrid Technical Meeting Pete Gronbech: May 2005 Birmingham

Present
Pete Gronbech – Oxford
Chris Brew – RAL PPD
Santanu Das – Cambridge
Lawrie Lowe – Birmingham
Yves Coppens – Birmingham

Southgrid Member Institutions
Oxford
RAL PPD
Cambridge
Birmingham
Bristol
HP-Bristol
Warwick

Monitoring
http://lcg-testzone-reports.web.cern.ch/lcg-testzone-reports/cgi-bin/lastreport.cgi
Configure the view for UKI. Dave Kant's helpful doc is linked from the minutes of a tbsupport meeting.

GOC Accounting – GridPP view

Status at RAL PPD
SL3 cluster on CPUs: GHz, GHz
–100% dedicated to LCG
0.7 TB storage
–100% dedicated to LCG
Configuring 6.4 TB of SATA RAID disks for use by dCache (rough sketch below)
Recent power trip on April 27th.
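
As a rough illustration of the disk-preparation step mentioned above, here is a minimal software-RAID sketch in shell. The slide does not say how RAL PPD actually builds the array (hardware vs. software RAID, device names, filesystem, pool path), so everything below is an assumption for illustration only.

# Hypothetical example: assemble four SATA disks into a RAID-5 set and
# make a filesystem to hold a dCache pool area (devices and paths assumed).
mdadm --create /dev/md0 --level=5 --raid-devices=4 /dev/sd[bcde]
mkfs.ext3 /dev/md0                                  # ext3 was typical on SL3
mkdir -p /pool1
mount /dev/md0 /pool1
echo "/dev/md0  /pool1  ext3  defaults  0 2" >> /etc/fstab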

Status at Cambridge
Currently LCG on SL3
CPUs: GHz
–100% dedicated to LCG
2 TB storage
–100% dedicated to LCG
Condor batch system
Recent job submission problems (a simple local test is sketched below)
Lack of Condor support from the LCG teams
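
Because Cambridge uses Condor rather than the more common PBS/Torque setup, a quick local test job helps separate problems in Condor itself from problems in the LCG job-manager layer. The sketch below is a generic, minimal Condor submission (the file names and the vanilla universe are my choices, not taken from the slide).

# Hypothetical minimal Condor test job, bypassing the grid layer entirely.
cat > test.sub <<'EOF'
universe   = vanilla
executable = /bin/hostname
output     = test.out
error      = test.err
log        = test.log
queue
EOF
condor_submit test.sub
condor_q            # watch the job run; condor_status lists available slots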

Status at Bristol
Status
–LCG involvement limited (“black dot”) for the previous six months due to lack of manpower
–New resources and posts now on the horizon!
Existing resources
–80-CPU BaBar farm to be switched to LCG
–~2 TB of storage resources to be made LCG-accessible
–LCG head nodes installed by the SouthGrid support team
New resources
–Funding now confirmed for a large University investment in hardware
–Includes CPU, high-quality and scratch disk resources
Humans
–New system manager post (RG) being filled
–New SouthGrid support / development post (GridPP / HP) being filled
–HP keen to expand industrial collaboration – suggestions?

Status at Birmingham
Currently SL3 with LCG-2_4_0
CPUs: GHz Xeon (+48 soon)
–100% LCG
200 GB storage, 2 TB shortly
–100% LCG
Air conditioning problems 21st May (3 days)
BaBar farm moving to SL3

Status at Oxford
Currently LCG on SL304
Only 48 CPUs running due to power limitations (power trip on 10th May)
CPUs: GHz
–100% LCG
1.5 TB storage – upgrade to 3 TB planned
–100% LCG

Apel Accounting
Get the latest rpm from
This should fix the cron job problem, or do it by hand: add RGMA_HOME=/opt/edg
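
A hedged sketch of the by-hand fix follows. The slide does not name the cron file installed by the apel rpm, so the commands below only locate it; the actual fix is to make sure RGMA_HOME=/opt/edg is set in the environment the cron job runs with.

# Find the accounting cron entry (file name is site/rpm dependent).
grep -ril apel /etc/cron.d /etc/crontab 2>/dev/null
# cron.d files accept VAR=value lines, so adding the following line at the
# top of that file gives the job the environment it needs:
#   RGMA_HOME=/opt/edg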

Security
Best practices link / Wiki entry
iptables?? – Birmingham to share their setup on the SouthGrid web pages
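
As a starting point for the iptables discussion, here is a minimal host-firewall sketch. The ports and source range are placeholders, not Birmingham's actual setup (which they have been asked to publish on the SouthGrid pages).

# Hypothetical default-deny firewall for a node: allow loopback, established
# traffic and SSH from a placeholder site range, drop everything else inbound.
iptables -F INPUT
iptables -A INPUT -i lo -j ACCEPT
iptables -A INPUT -m state --state ESTABLISHED,RELATED -j ACCEPT
iptables -A INPUT -p tcp -s 192.168.0.0/16 --dport 22 -j ACCEPT   # placeholder range
iptables -A INPUT -j DROP
service iptables save          # persist the rules on SL3-style systems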

Kickstart Packages
What is the minimum for a worker node?
glibc-headers is required by ATLAS; the graphics group pulls that in (other options may also), but development-tools may be more logical.
ATLAS requires various glibc headers, which come from glibc-devel, and also zlib-devel.

kickstart for worker nodes
%packages
office
base-x
# graphics is needed as it pulls in glibc-headers
openafs-client
graphical-internet
kernel
kernel-smp
pine
grub
gv
gnupg
-xchat
-redhat-config-samba
-samba
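
A quick post-install check along these lines confirms the devel packages discussed above actually ended up on the worker node (the exact list a VO needs may differ; these names are just the ones mentioned on the slide).

# Verify the headers/devel packages are installed on a worker node.
rpm -q glibc-headers glibc-devel zlib-devel || echo "devel packages missing"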

GridIce
Need to open port 2136
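
For example, assuming an iptables-based firewall like the one sketched earlier (rule ordering and persistence will vary by site):

# Open the GridICE port on the relevant node; insert ahead of any final DROP rule.
iptables -I INPUT -p tcp --dport 2136 -j ACCEPT
service iptables save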

Action Plan for Bristol
Plan to visit on June 9th to install an installation server:
–dhcp server
–NFS copies of SL (local mirror)
–PXE boot setup etc. (a rough dhcpd/PXE sketch follows below)
Second visit to reinstall the head nodes with SL304 and LCG-2_4_0, and some worker nodes.
BaBar cluster to go to Birmingham
–Fergus, Chris, Yves to liaise.
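
The PXE side of such an installation server might look roughly like the fragment below; the subnet, server address and boot filename are placeholders rather than Bristol's real values.

# Hypothetical dhcpd.conf fragment for PXE-booting install clients.
cat >> /etc/dhcpd.conf <<'EOF'
subnet 10.0.0.0 netmask 255.255.255.0 {
    range 10.0.0.100 10.0.0.200;
    next-server 10.0.0.1;        # the installation server (TFTP)
    filename "pxelinux.0";       # PXE bootloader served over TFTP
}
EOF
service dhcpd restart
# The local SL mirror would then be exported over NFS for the kickstart installs.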

Chris Brew
Gave an overview of the HEPiX meeting, including talks about SL versions.
Gave advice on using Maui fairshare and put config files on the SouthGrid web site (an illustrative fragment follows below).
Gave advice on customising Ganglia and put files on the SouthGrid web site.
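
The kind of maui.cfg fairshare settings being referred to look roughly like this; the path, targets and weights below are made-up illustrative values, and the real files are the ones on the SouthGrid web site.

# Hypothetical Maui fairshare fragment (values for illustration only).
cat >> /var/spool/maui/maui.cfg <<'EOF'
FSPOLICY        DEDICATEDPS     # measure fairshare in dedicated processor-seconds
FSDEPTH         7               # number of fairshare windows kept
FSINTERVAL      24:00:00        # length of each window
FSDECAY         0.80            # older windows count for less
FSWEIGHT        100             # influence of fairshare on job priority
GROUPCFG[atlas] FSTARGET=40     # example target share for a group/VO
GROUPCFG[lhcb]  FSTARGET=20
EOF
service maui restart            # or restart via the site's own init mechanism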