LHCb report to LHCC and C-RSG
Philippe Charpentier, CERN, on behalf of LHCb

Activities in 2009-Q3/Q4
- Core Software
  - Stable versions of Gaudi and LCG-AA
- Applications
  - Stable as of September for real data
  - Fast minor releases to cope with the reality of life…
- Monte-Carlo
  - Intensive MC09 simulation (5 TeV)
    - Minimum bias
    - b- and c-inclusive
    - b signal channels
  - Few events in the foreseen 2009 configuration (450 GeV)
  - MC09 stripping (2 passes)
    - Trigger stripping
    - Physics stripping
- Real data reconstruction and stripping
  - As of November 20th…

Resource usage

139 sites hit, 4.2 million jobs
- Start in June: start of MC09

Job failure: 15% (17% at Tier1s)

Failure breakdown

Production and user jobs

Jobs at Tier1s

Job types at Tier1s

CPU used (not normalised)
- Average job duration
  - 5.6 hours for all jobs
  - 20 min for user jobs (20%)
  - 6.6 hours for production jobs
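A rough cross-check (taking the average duration as CPU time per job): the job count times the average duration,

\[
4.2\times10^{6}\ \text{jobs} \times 5.6\ \text{h} \approx 2.4\times10^{7}\ \text{h} \approx 1.0\ \text{Mdays},
\]

is of the same order as the ~1.1 Mdays of unnormalised CPU reported by the DIRAC accounting a few slides below.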

CPU usage (not normalised)

WLCG vs LHCb accounting (unnormalised)
- 13% more in WLCG than in DIRAC (unnormalised)
  - 1.26 Mdays vs 1.1 Mdays
  - Overhead of non-reporting jobs + pilot/LCG/batch frameworks
- Average CPU power: 1.5 kSI2k (from WLCG accounting)
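For reference, the quoted 13% follows if the difference is taken relative to the WLCG total:

\[
\frac{1.26 - 1.10}{1.26} \approx 0.13 .
\]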

Normalised CPU usage in 2009
- Ramping up of the pilot role in summer
- Resource usage decreased since LHC restarted
  - Concentrate on the (few) real data
  - Wait for data analysis before continuing MC simulation
- Group 1: production
- Group 2: pilot
- Groups 3 & 4: user
- Group 5: lcgadmin

Resource usage

Site      Used (kHS06.years)    Requested (kHS06.years)
CERN
Tier1s
Tier2s
Total

- Note: CERN above does not include non-Grid usage
  - From WLCG accounting: 32% is non-Grid at CERN
  - The CERN number should then read: 2.18 kHS06.years
- CPU usage within 10% of requests
- Distribution not exactly as expected
  - More non-Tier1 resources available
    - Less MC run at CERN + Tier1s
  - Almost no real data: fewer resources used at CERN
    - CAF not used as much as expected

Storage usage

Site            Requested    Allocated    Used
CERN *) TxD
CERN *) T1D                               irrelevant
CERN **)
Tier1s **)      1740 ***)

- *) From Castor queries today
- **) From WLCG accounting end December
- ***) Including 420 TB for T1D0 cache
- Sites provided slightly more than the pledges
  - Thanks!
  - At CERN, some disk pools (default, T1D0) were not included in the requests but are in the accounting

Experience with real data

First experience with real data
- Very low crossing rate
  - Maximum 8 bunches colliding (88 kHz crossing)
  - Very low luminosity
  - Minimum bias trigger rate: from 0.1 to 10 Hz
  - Data taken with single beam and with collisions
- No zero-suppression in VELO; otherwise ~25 GB only!
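For reference, the quoted crossing rate is what one expects from the LHC revolution frequency of roughly 11.2 kHz:

\[
8\ \text{bunches} \times 11.2\ \text{kHz} \approx 90\ \text{kHz},
\]

of the same order as the ~88 kHz quoted on the slide.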

Real data processing
- Iterative process
  - Small changes in the reconstruction application
  - Improved alignment
  - In total 7 sets of processing conditions
    - Only the last files were all processed 4 times so far (twice in 2010)
- Processing submission (a minimal sketch of this retry logic follows after this list)
  - Automatic job creation and submission after:
    - File is successfully migrated in Castor
    - File is successfully replicated at a Tier1
  - If a job fails for a reason other than an application crash:
    - The file is reset as "to be processed"
    - A new job is created / submitted (automatic)
  - Processing more efficient at CERN (see later)
    - Eventually, after a few trials at a Tier1, the file is processed at CERN
  - No stripping ;-)
    - DST files distributed to all Tier1s for analysis
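A minimal sketch of the retry logic above, assuming hypothetical helper callables (is_migrated_in_castor, is_replicated_at_tier1, submit_job, wait_for_result) rather than the real DIRAC/LHCb production machinery:

```python
# Hypothetical sketch of the automatic reconstruction-job flow described on the slide.
# None of the helper names below come from the real LHCb/DIRAC code; they are placeholders.

MAX_TIER1_ATTEMPTS = 3  # assumed value: "after a few trials at a Tier1"

def process_file(file_record, is_migrated_in_castor, is_replicated_at_tier1,
                 submit_job, wait_for_result):
    """Create and resubmit reconstruction jobs for one raw-data file."""
    # A job is only created once the file is safely on tape and replicated at a Tier1.
    if not (is_migrated_in_castor(file_record) and is_replicated_at_tier1(file_record)):
        return "waiting"

    attempts = 0
    site = file_record["tier1"]          # first try at the Tier1 holding the replica
    while True:
        job = submit_job(file_record, site=site)
        status, reason = wait_for_result(job)
        if status == "done":
            return "processed"
        if reason == "application crash":
            return "failed"              # application crashes are not retried automatically
        # Any other failure: the file is reset and a new job is submitted.
        attempts += 1
        if attempts >= MAX_TIER1_ATTEMPTS and site != "CERN":
            site = "CERN"                # fall back to CERN, where processing is more efficient
```

In the real system this flow is driven by the DIRAC production machinery; the sketch only mirrors the decision points listed on the slide.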

Reconstruction jobs

Issues with real data
- Castor migration
  - Very low rate: had to change the migration algorithm for more frequent migration (1 hour instead of 8 hours)
- Issue with large files (above 2 GB)
  - Real data files are not ROOT files but are opened by ROOT
  - There was an issue with a compatibility library for slc4 32-bit on slc5 nodes
    - Fixed within a day
- Wrong magnetic field sign
  - Due to different coordinate systems for LHCb and LHC ;-)
  - Fixed within hours
- Data access problem (by protocol, directly from the server)
  - Still a dCache issue at IN2P3 and NIKHEF
    - dCache experts working on it
  - Moved to the copy-mode paradigm for reconstruction
  - Still a problem for user jobs: a pain!
    - Sites are regularly banned for analysis

Transfers and job latency
- No problem observed during file transfers
  - Files randomly distributed to the Tier1s
  - Will move to distribution by runs (a few 100's of files); a toy illustration follows after this list
  - For 2009, runs were never longer than 4-5 files!
  - Max file size set to 3 GB
- Very good Grid latency
  - Time between submission and jobs starting to run
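A toy illustration of the planned change, with made-up helpers (the replicate callable and the file dictionaries are placeholders, not the DIRAC API): instead of each file going to a randomly chosen Tier1, one Tier1 is chosen per run so a whole run stays together.

```python
import random

# LHCb Tier1 sites of the period; listed here purely for illustration.
TIER1S = ["CNAF", "GridKa", "IN2P3", "NIKHEF/SARA", "PIC", "RAL"]

def distribute_randomly(files, replicate):
    """2009 behaviour: each file goes to a randomly chosen Tier1."""
    for f in files:
        replicate(f, random.choice(TIER1S))

def distribute_by_run(files, replicate):
    """Planned behaviour: pick one Tier1 per run so all files of a run end up together."""
    run_destination = {}
    for f in files:                      # each file is assumed to carry its run number
        run = f["run"]
        if run not in run_destination:
            run_destination[run] = random.choice(TIER1S)
        replicate(f, run_destination[run])
```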

Resource requests

Resource requests for 2010-2012
- 2010 running
  - The requests were made in April-June 2009
    - No additional resources expected
    - Try to fit within those requests
  - Running scenario for LHCb (a rough conversion into live seconds is sketched after this list)
    - March: 35% LHC, 100 Hz
    - April-May-June: 50% LHC, 1 kHz on average
    - July-August-September-half of October: 2 kHz
    - No Heavy Ion run for LHCb
    - This corresponds to … kHz
    - The request happened, by chance, to account for precisely … seconds
    - Therefore we use … seconds for 2010 at a 2 kHz trigger rate
- 2011 running
  - Use the recommendation of the MB
    - March: 35% LHC, 2 kHz
    - April to mid-October: 50% LHC, 2 kHz
    - Total running time: … seconds
- 2012: no run
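A back-of-the-envelope sketch of how such a scenario converts into live time and events. The calendar month lengths, the reading of "35% LHC" as a duty factor, and the duty factor assumed for the summer period are my assumptions, so the output does not reproduce the totals quoted on the slide.

```python
# Back-of-the-envelope conversion of a running scenario into live time and events.
# Assumptions (not from the slide): calendar month lengths, "35% LHC" read as a
# duty factor, and a 50% duty factor for July to mid-October.

SECONDS_PER_DAY = 86400

# (days, LHC duty factor, LHCb trigger rate in Hz) for the 2010 scenario
scenario_2010 = [
    (31, 0.35, 100),    # March
    (91, 0.50, 1000),   # April-June, 1 kHz on average
    (107, 0.50, 2000),  # July to mid-October
]

live_seconds = sum(d * SECONDS_PER_DAY * duty for d, duty, _ in scenario_2010)
events = sum(d * SECONDS_PER_DAY * duty * rate for d, duty, rate in scenario_2010)

print(f"live time ~ {live_seconds/1e6:.1f} Ms, events ~ {events/1e9:.1f} x 10^9")
```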

Resource requirements for 2010-2012
Columns: 2010 (old), 2010 (confirmed), 2011 (prelim.), 2012 (very prelim.); CPU columns split into Integrated and Power.

CPU (kHEP06·years)
- CERN T…
- CERN CAF - Analysis/Calib/Alignment
- CERN T0 + T…
- Tier1s
- Tier2s
- Total

Disk (TB)
- CERN T0 + T…
- Tier1s
- Tier2s: 20
- Total

Tape (TB)
- CERN T0 + T…
- Tier1s
- Total

Comments on resources
- Very uncertain and fluctuating running plans!
- Depending on LHC running, MC requests may be different
  - Minimum bias, charm physics, b physics…
- Only after (at least) one year of experience can we see how running analysis on the Grid works
  - Analysis at CERN?
  - Analysis at Tier3s?
  - Reliability for analysis?
- 2012 is still very uncertain
  - No LHC running
  - Will the MC requests be the same as in previous years?
  - How many reprocessings?
    - Currently assume 1 full reprocessing of 2010 and 2 of 2011

Conclusions
- Real data in 2009
  - So few that they didn't impact resource usage
  - Were extremely valuable for:
    - Setting up procedures
    - Starting to understand the detector
      - Already very promising performance after a few days: π0 peak, Λ and K0 reconstruction…
    - Exercising automatic processes
- 2010
  - Still expect somewhat chaotic running
    - Frequent changes in LHC settings, LHCb trigger commissioning
  - No change in LHCb resource requests w.r.t. June 2009
- 2011
  - More precise requests with experience from 2010
- 2012
  - Still very preliminary, but only a small increase compared to 2011