Southgrid Technical Meeting – Pete Gronbech, 16th March 2006, Birmingham

Present
Pete Gronbech – Oxford
Rosario Esposito – Oxford
Chris Brew – RAL PPD
Yves Coppens – Birmingham
Winnie Lacesso – Bristol

Agenda
10:30 Start – Pete + others
12:00 Lunch, then interactive workshop!
3:15 Coffee
4:20 Finish

Agenda Topics

LCG-2_7_0
– Experiences from Yves & Chris
– Plans for Oxford and Cambridge

Monitoring
– Network monitoring box
– Ganglia at Bristol once the webserver is ready
– Ganglia mods for VOs – help again from Chris?
– Nagios, anyone?
– Pakiti
– aide?
– Tripwire?
– swatch / ranger

SC4
– T2 workshop – who is going?
– Throughput tests – Bristol and Cambridge to repeat tests at the end of the month
– Network connectivity at Bristol / Cambridge at 1 Gb/s? Next week Bristol, Cambridge?
– UI FTS client works out of the box
– Storage security challenge – do we know which logs to look at, or even whether the SRMs are doing enough logging?
– Re the security challenge, a best-practice how-to should now be made available on the wiki

Agenda Topics

– ALICE paying for a machine to act as a VO box at Birmingham; possibly also at other SouthGrid sites. Security: root access policy? Yves will test first.
– Unified naming scheme? UKI-SOUTHGRID-OXF? Oxford to try it!
– Cambridge progress with APEL / Condor?
– Future upgrades
– VO support – can all SouthGrid sites support the same VOs? With LCG 2_7_0 a VO tool is available. Clusters with varying memory allocations have to advertise memory available per job slot, not per machine!
– Backups:
  – DPM database on the SE
  – CE
  – SE
  – MON
  – LFC
– Central logging machine – also useful as a secondary backup of logs
– VRVS demo
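Of the backups listed, the DPM database on the SE is the one that needs a database dump rather than a plain file copy. A minimal nightly-dump sketch – the helper name and output layout are ours, and `dpm_db` / `cns_db` are the usual DPM and DPNS database names, assumed here rather than taken from the meeting:

```shell
#!/bin/sh
# Hedged sketch of a nightly DPM database backup. Assumes the standard
# MySQL database names dpm_db (DPM) and cns_db (the DPNS name server),
# with credentials supplied via ~/.my.cnf. Not taken from the meeting notes.
backup_dpm_db() {
    outdir="$1"
    stamp=$(date +%F)
    # --opt gives locked, consistent dumps suitable for a later restore
    mysqldump --opt dpm_db > "$outdir/dpm_db_$stamp.sql" &&
    mysqldump --opt cns_db > "$outdir/cns_db_$stamp.sql"
}
```

Run from cron on the SE, writing to the central logging machine mentioned above would double as the secondary copy.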

LCG 2_7_0

Birmingham: SL 3.0.4 has problems with mktemp, so upgrading to SL 3.0.5 first helps. Check the SL mirror is still OK.
Other Birmingham problems:
– Info provider on the CE (the extra info from Maui only worked for the default Maui setup, and we have a customised config).
The recent (March 7th) Bristol upgrade was much smoother; many bugs are fixed. SouthGrid is now using more modules, e.g. for Ganglia.
For new nodes, use pbsnodes -o fqdn (mark offline); then, when happy, use pbsnodes -c fqdn (clear the offline flag).
Plan: Oxford next week, Cambridge shortly after.
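The node-commissioning step above can be wrapped in a pair of helpers. The function names are illustrative; only the pbsnodes flags (standard Torque/PBS usage: -o marks a node offline, -c clears the offline flag) come from the notes:

```shell
#!/bin/sh
# Sketch of the worker-node commissioning flow described above.
# Helper names are ours; pbsnodes -o / -c are standard Torque flags.
drain_node() {
    # Keep the scheduler from placing jobs on a freshly built node
    pbsnodes -o "$1"
}
enable_node() {
    # Once the node checks out, return it to service
    pbsnodes -c "$1"
}
```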

SL 3 – 4 – 5

Summary of a talk given at CERN on SL versions: SL5 will be too late for the LHC, so the push is to certify SL4 by the end of March and migrate in autumn 2006. No OS upgrade is planned for 2007!

Future Upgrades

RAL PPD (£250K)
– 52 dual-core, dual-Opteron WNs (late March 2006)
– 7 × 8 TB SATA disk servers (2 system, 1 parity, 1 hot spare, 20 data)
– Network: separate 1 Gb/s fibre, separate from the rest of PPD. Nortel switches (8), 10 Gb-capable, so we should be able to have a 10 Gb link to the RAL Tier 1.
– CPU over the MoU can be used for a local PPD T3 VO.

Future Upgrades

Oxford
– Still waiting for our new computer room… now autumn 2006.
– Short-term air-con upgrade to allow us to stay as we are!

Bristol
– Uni cluster to go on the 5th floor of Physics.
– New room? June? Seems unrealistic.
– Need to know how to send LCG jobs to external clusters – we should ask LT2 how they did the e-Science centre.

Future Upgrades

Birmingham
– ATLAS farm may get integrated more, but…
– BaBar farm: increased nodes, but hardware reliability problems. Some new disks purchased, and we may buy some replacement PSUs.
– BaBar CE to be migrated to the GridPP UI box.
– Lawrie applying for eScience clusters…