SL5 Site Status GDB, September 2009 John Gordon

LCG SL5 Site Status

ASGC T1 - will be finished before mid-September. The OS migration process is under way: more than 1.2K cores have been taken offline and are being prepared for the first batch of OS updates. T2s: ??
BNL - processing resources were migrated to SL5 in late July. US ATLAS T2s are prepared to migrate as soon as the upcoming ATLAS reprocessing exercise is completed; we expect all of them to be running SL5 by mid-October.
CNAF T1 - a test queue (with several WNs) is ready and we are waiting for feedback from the experiments before migrating. All T2s expect to migrate before the end of September.

LCG SL5 Site Status (2)

FNAL T1 - rollout for worker nodes is complete. dCache servers have also been moved to SL5. We still have interactive analysis nodes at SL4 (request from CMS).
The SL5 upgrade at the Tier-2s is proceeding. Only 1 of the 7 has indicated a problem in finishing before the end of September.

LCG SL5 Site Status (3)

IN2P3 T1 - ALICE has already validated our SL5 deployment; the other experiments are doing so. We are actively and progressively increasing the SL5-based fraction of the batch farm. The pace of this migration will depend on how quickly the other LHC experiments validate that their software performs OK on our installation.
IN2P3 T2:
- IN2P3-GRIF: currently validating with the experiments and deploying SL5. One of the sites composing GRIF has already migrated all of its worker nodes to SL5. It is difficult to support, at the same site, a batch farm with WNs running both SL4 and SL5, in particular if an experiment requires one software area for SL4 and another for SL5 (see the sketch after this slide). Goal: deployment of SL5 in week 38 or …
- IN2P3-SUBATECH: partially migrated to SL5. Waiting for ALICE to confirm that they can stop using worker nodes running SL4.
- IN2P3-LAPP: SL5 validated; being deployed.
- IN2P3-LPC: ongoing validation before wide deployment.
- TOKYO: plan to deploy SL5 in early October. New hardware is expected to be delivered in November or December.
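The per-OS software area problem mentioned for GRIF is typically handled by a small wrapper on the worker node. The sketch below is purely illustrative and not taken from the slides: the paths, the choice of ALICE as the VO, and the release-file parsing are assumptions; only the VO_<EXP>_SW_DIR convention comes from the standard LCG worker-node environment.

```python
# Illustrative sketch only: select a per-OS experiment software area on a
# worker node that may be running either SL4 or SL5. The paths and the use
# of ALICE are assumptions, not the actual IN2P3 configuration.
import os
import re

def os_major_release(release_file="/etc/redhat-release"):
    """Return the major OS version parsed from an SL/RHEL-style release file."""
    with open(release_file) as f:
        match = re.search(r"release (\d+)", f.read())
    return int(match.group(1)) if match else None

# One software area per OS flavour, as needed by sites that keep SL4 and
# SL5 worker nodes side by side (paths are placeholders).
SW_AREAS = {
    4: "/opt/exp_soft/sl4/alice",
    5: "/opt/exp_soft/sl5/alice",
}

major = os_major_release()
if major not in SW_AREAS:
    raise RuntimeError("unexpected OS major release: %r" % major)
os.environ["VO_ALICE_SW_DIR"] = SW_AREAS[major]
```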

LCG SL5 Site Status (4)

KIT T1 - has had SL4 and SL5 subclusters since April. Migration of the remaining SL4 WNs to SL5 will finish this week. All CEs (and CREAM CEs) usable by LHC VOs have pointed to SL5 since August 24.
Tier-2s:
- LRZ-LMU Munich (ATLAS): SuSE Linux (SLES10) on WNs, no SL(C) planned.
- MPPMU Munich (ATLAS): SuSE Linux (SLES10) on WNs, no SL(C) planned.
- DESY (ATLAS, CMS, LHCb): Hamburg site: migration of all WNs to SL5 started this week. Zeuthen site: will have an SL5 subcluster in September.
- Aachen (CMS): running SL5 on all WNs since approximately mid-July.
- Göttingen (ATLAS): will probably start the migration to SL5 at the end of September.
- GSI Darmstadt (ALICE): Debian Etch64 on WNs, no plan to go to SL(C).
- Prague (ATLAS): currently SL4 and SL5 subclusters; all new hardware has been on SL5 since this year.
- Wuppertal (ATLAS): all WNs still on CentOS4. Plan to migrate (to CentOS5) in mid-September.
- Freiburg (ATLAS): WNs still on SL4. Plan to migrate the first batch of WNs to SL5 soon; aim for complete migration by the end of …
- CSCS (ATLAS, CMS, LHCb): all WNs on SL4. Will start the migration to SL5 as soon as possible, once the remaining LHCb problems are confirmed to be solved.
- Polish sites:
  - Cyfronet Cracow (ATLAS, LHCb): no SL(C)5 yet and no clear plan when to migrate.
  - PSNC Poznan (ATLAS, LHCb): an SL5 test cluster exists.
All sites running a mix of SL4 and SL5 provide separate CEs pointing to the respective subclusters (see the sketch below).
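Because mixed sites expose their SL4 and SL5 subclusters through separate CEs, users and experiment frameworks can pick out the SL5 entry points from the information system. The sketch below is a hedged illustration, not part of the presentation: it queries a top-level BDII (the hostname is a placeholder) for GLUE 1.3 subclusters advertising an OS release 5.x, assuming the sites publish GlueHostOperatingSystemRelease in the usual "5.x" form.

```python
# Illustrative only: list subclusters in the GLUE 1.3 information system
# that advertise an OS release 5.x, i.e. the SL5 halves of mixed sites.
# The BDII endpoint is a placeholder.
import subprocess

BDII = "ldap://lcg-bdii.cern.ch:2170"   # placeholder top-level BDII
FILTER = "(&(objectClass=GlueSubCluster)(GlueHostOperatingSystemRelease=5*))"
ATTRS = ["GlueSubClusterUniqueID",
         "GlueHostOperatingSystemName",
         "GlueHostOperatingSystemRelease"]

# Plain anonymous LDAP search against the BDII; each matching subcluster
# corresponds to a CE that SL5-only work can be steered to.
cmd = ["ldapsearch", "-x", "-LLL", "-H", BDII, "-b", "o=grid", FILTER] + ATTRS
print(subprocess.check_output(cmd).decode())
```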

LCG SL5 Site Status (5)

NL-T1 - all parts of the NL-T1, and all sites in the Netherlands coordinated by BiG Grid, will carry out a coordinated migration to EL5/x86_64. The migration will not happen before October 1st, 2009, and we aim to complete the entire migration before November 1st. At any point in time no more than two (2) sites will be down for this migration.
NL-T2 - none.

LCG SL5 Site Status (6)

PIC T1:
- CMS has already validated the SL5 test WN at PIC: the CMS capacity (approx. 500 cores) will be migrated to SL5 queues today/tomorrow.
- ATLAS and LHCb are still validating the test WN at PIC. 100% of the T1 capacity will be migrated once they are ready. Plan to complete the migration in September.
ATLAS Tier-2s:
- IFAE: plan to migrate to SL5 before the end of September.
- IFIC, UAM: no info.
- LIP-Coimbra (ATLAS): currently being reinstalled with SL5.
- LIP-Lisbon (ATLAS and CMS): approx. 25% of 570 slots already on SL5.
- NCG-INGRID-PT (ATLAS and CMS): 100% of resources already on SL5.
CMS Tier-2s:
- CIEMAT: already migrated ~50% of the WNs to SL5 (364 out of 768). Plan to complete the migration during the next two weeks.
- IFCA: currently testing SL5 WNs. Plan to migrate the production farm before the end of September.
LHCb Tier-2s:
- UB: will migrate to SL5 after PIC has done so.
- USC: no plans yet.

LCG SL5 Site Status

TRIUMF T1 - the production system is still on SL4. We deployed a tiny SL5 cluster (with a separate CE and ATLAS software area) in order to do some testing and validation. New hardware coming online in the next 1-2 months will be on SL5. We will likely run two large separate clusters and then do a full move to SL5.
Canadian T2s:
- Four T2 sites are currently running SL4 exclusively (UVic, UofA, WestGrid-UofA, Toronto). Timing of the migration is to be scheduled pending further discussion with site management, as these are parts of shared facilities.
- One T2 has both SL4 and SL5 (SFU).
- New facilities coming online at SciNet-Toronto will be SL5* (CentOS 5).
UK T1 - test cluster/CE now. Substantial resources will be moved in mid-September.
UK T2 - most sites will migrate in September. A few will take as long as November, and one little-used site until January.