
Slide 1: LHCb Processing Requirements
Focus on the first year of data-taking
Report to Management and Resources Panel, 24 March 2000
J. Harvey / CERN

Slide 2: Talk Outline
- Dataflow model
- Data processing and storage requirements
- Baseline computing model
- Data processing at CERN
- The first 6 months - calibration needs
- Impact on processing requirements
- Manpower
- Conclusions

Slide 3: General Comments
- LHCb Technical Notes in preparation:
  - LHCb answers to the SPP questions
  - Baseline Model of LHCb’s distributed computing facilities
- The baseline model reflects current thinking:
  - based on what seems most appropriate technically
  - discussions are just starting
- Open meeting of the collaboration April 5-7:
  - feedback and changes can be expected

Slide 4: Dataflow Model
[Diagram: Detector -> DAQ system -> L2/L3 trigger (L3-YES events kept, a sample of L2/L3-NO events retained) -> RAW data + RAW tags -> Reconstruction (using calibration data) -> Event Summary Data (ESD) + reconstruction tags -> First pass analysis -> Analysis Object Data (AOD) + physics tags -> Physics analysis on the analysis workstation, with access back to ESD and RAW -> private data and physics results]
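The chain of data tiers on slide 4 can be summarised in a minimal sketch (Python used purely as pseudocode; the class and function names are illustrative, not LHCb software):

```python
# Illustrative model of the slide 4 dataflow: RAW -> ESD (+ reco tags) -> AOD (+ physics tags).
# All names are hypothetical; placeholders stand in for real reconstruction/analysis output.
from dataclasses import dataclass, field

@dataclass
class Event:
    raw: bytes                                  # RAW data written by the DAQ after the L2/L3 trigger
    esd: dict = field(default_factory=dict)     # Event Summary Data produced by reconstruction
    aod: dict = field(default_factory=dict)     # Analysis Object Data produced by first pass analysis
    tags: dict = field(default_factory=dict)    # reconstruction / physics tags for fast event selection

def reconstruct(event: Event, calibration: dict) -> Event:
    """RAW + calibration data -> ESD + reconstruction tags."""
    event.esd = {"tracks": [], "clusters": []}                      # placeholder content
    event.tags["reco"] = {"n_tracks": len(event.esd["tracks"])}
    return event

def first_pass_analysis(event: Event) -> Event:
    """ESD -> AOD + physics tags, the input shipped to regional centres for physics analysis."""
    event.aod = {"candidates": []}                                  # placeholder content
    event.tags["physics"] = {"n_candidates": len(event.aod["candidates"])}
    return event
```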

Slide 5: Real Data Processing Requirements

Slide 6: Simulation Requirements

Slide 7
[Diagram of the distributed computing model:
- Production Centre: CPU for production; mass storage for RAW, ESD, AOD and TAG; generates raw data, reconstruction, first pass analysis, user analysis
- Regional Centres: CPU for analysis; mass storage for AOD and TAG; user analysis; receive AOD and TAG from the production centre (real: 80 TB/yr, sim: 120 TB/yr)
- Institutes: CPU and data servers; selected user analyses; receive AOD and TAG (8-12 TB/yr)]

Slide 8
Real data:
- Production Centre (x1): CERN. Data collection, triggering, reconstruction, final state reconstruction, user analysis. Output to each RC: AOD and TAG datasets, 20 TB x 4 times/yr = 80 TB/yr.
- Regional Centre (~x5): e.g. RAL. User analysis. Output to each institute: AOD and TAG for samples, 1 TB x 10 times/yr = 10 TB/yr.
- Institute (~x50): Selected user analyses.
Simulated data (sites: RAL, Lyon, ...; CERN, Lyon, ...):
- Production Centre (x1): Event generation, GEANT tracking, reconstruction, final state reconstruction, user analysis. Output to each RC: AOD, generator and TAG datasets, 30 TB x 4 times/yr = 120 TB/yr.
- Regional Centre (~x5): User analysis. Output to each institute: AOD and TAG for samples, 3 TB x 10 times/yr = 30 TB/yr.
- Institute (~x50): Selected user analysis.
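A quick check of the per-centre export volumes quoted on slides 7 and 8; every input in the sketch is a number taken from the slides, nothing is re-estimated:

```python
# Export volumes from slides 7 and 8 (all inputs quoted on the slides).
real_to_each_rc        = 20 * 4    # AOD + TAG, 20 TB shipped 4 times/yr      ->  80 TB/yr
sim_to_each_rc         = 30 * 4    # AOD + generator + TAG, 30 TB x 4/yr      -> 120 TB/yr
real_to_each_institute = 1 * 10    # sample AOD + TAG, 1 TB shipped 10x/yr    ->  10 TB/yr
sim_to_each_institute  = 3 * 10    # sample AOD + TAG, 3 TB shipped 10x/yr    ->  30 TB/yr

print(real_to_each_rc, sim_to_each_rc)                # 80 120  (TB/yr per regional centre)
print(real_to_each_institute, sim_to_each_institute)  # 10 30   (TB/yr per institute)
```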

Slide 9: Compute Facilities at CERN
[Diagram:
- Experiment, LHC Pit 8: readout network and CPU farm for DAQ / calibration, L2/L3 trigger processing, reconstruction and re-reconstruction (2-3 times per year); temporary store (10 TB) for raw data and ESD; calibration data (5 TB); CDR link at 80 Gb/s to the computer centre
- CERN Computer Centre: data storage and analysis, physics analysis; MSS, disk and data/CPU servers holding RAW, ESD, AOD and TAG; AOD and TAG exported to Regional Centres
- Data taking: DAQ @ 200 Hz, raw data 20 MB/s, ESD data 20 MB/s
- Shutdown: reprocessing @ 400 Hz, raw data 60 MB/s, ESD data 60 MB/s]
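Dividing the quoted byte rates by the quoted event rates gives the per-event sizes implied by slide 9; the sizes themselves are not stated on the slide, so treat them as derived figures:

```python
# Per-event sizes implied by the rates on slide 9 (derived, not quoted).
daq_rate_hz, raw_mb_s, esd_mb_s = 200, 20.0, 20.0   # data taking
repro_rate_hz, repro_mb_s       = 400, 60.0         # shutdown reprocessing

raw_kb_per_event   = raw_mb_s * 1000 / daq_rate_hz      # ~100 kB RAW per event
esd_kb_per_event   = esd_mb_s * 1000 / daq_rate_hz      # ~100 kB ESD per event
repro_kb_per_event = repro_mb_s * 1000 / repro_rate_hz  # ~150 kB served per event while reprocessing

print(raw_kb_per_event, esd_kb_per_event, repro_kb_per_event)  # 100.0 100.0 150.0
```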

Slide 10: Facility at Pit - Requirements

Slide 11: CERN Computer Centre Requirements

Slide 12: LHCb - the first 6 months
- Assumptions
  - we will get our full nominal luminosity of 2 x 10^32 cm^-2 s^-1 (x1)
  - the LHC duty cycle will be lower - assume 25% (x0.5)
  - datataking will start end June - assume 60 days in 2005 (x0.5)
- Factors
  - commission high level triggers
  - commission reconstruction software
  - understand calibration and alignment
- Consequences
  - stream of “pass-all” events to study trigger candidates (x1)
  - as the trigger improves the b sample will get richer
  - lots of re-reconstruction (x2)

Slide 13: Calibration
- Decide what a run is...
  - 10^7 events per day
  - 2 x 10-hour fills per day
  - 5 x 10^6 events per fill
  - 5 runs per fill (~2 hrs - data quality)
  - 10^6 events per run, or 100 GB
- Calibration
  - VELO - 1 short run (5 mins) at SOR (start of run) for alignment
  - RICH - P, T changes can be detected and corrected in real time
  - Pass 0 - correct for defects
  - Quality - identify all ‘dead wires’ etc. and input at the start of the next run
[Diagram: calibration cycle - SOR -> Pass 0 -> raw data + calibration data -> reconstruction -> ESD -> quality checks -> EOR (end of run)]
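The run bookkeeping on slide 13 is simple arithmetic; the sketch below reproduces it, taking the ~100 kB raw event size implied by slide 9 as an assumption for the 100 GB run size:

```python
# Run definition arithmetic from slide 13.
events_per_day   = 1e7
fills_per_day    = 2        # two 10-hour fills
runs_per_fill    = 5        # ~2 hours each, chosen for data-quality granularity
raw_kb_per_event = 100      # assumption: implied by 20 MB/s at 200 Hz on slide 9

events_per_fill = events_per_day / fills_per_day            # 5e6 events per fill
events_per_run  = events_per_fill / runs_per_fill           # 1e6 events per run
run_size_gb     = events_per_run * raw_kb_per_event / 1e6   # ~100 GB of raw data per run

print(events_per_fill, events_per_run, run_size_gb)         # 5000000.0 1000000.0 100.0
```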

Slide 14: Calibration - Alignment
1. Start from survey (the following steps use real data)
2. Alignment and calibration of trackers: vertex, inner, outer, muon
3. Cross alignment between the different trackers: vertex, inner, outer, muon
4. Alignment and calibration of the other detectors with well-measured tracks: calorimeters and RICH detectors
- Within 1 month of the start of datataking we expect to have a good understanding of the alignment and calibration of the detector

Slide 15: Calibration - Alignment
- Will need intensive effort to understand strange alignment effects
- Run after reconstruction - use reconstructed track information (“second order calibration”)
- Not CPU intensive - ~25 SI95-sec / event
- Estimate we require 10^6 events (2 hours of datataking)
- Repeat alignment after interventions on the detector: 2-3 times per year
- Re-process the whole data sample after each new alignment is completed: estimate ~5-6 complete re-processings in the first year instead of 2-3 (x2 increase)
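Plugging the slide 15 numbers together shows why this is cheap; the 25,000 SI95 farm size used to set the scale is borrowed from the reconstruction capacity on slide 17, not stated here:

```python
# Cost of one alignment pass, using the slide 15 figures.
events             = 1e6       # required event sample (~2 hours of datataking)
si95_sec_per_event = 25        # ~25 SI95-sec per event
farm_si95          = 25_000    # assumption: first-year reconstruction capacity from slide 17

total_si95_sec = events * si95_sec_per_event      # 2.5e7 SI95-sec per alignment pass
wall_time_min  = total_si95_sec / farm_si95 / 60  # ~17 minutes if the whole farm is used

print(total_si95_sec, round(wall_time_min, 1))    # 25000000.0 16.7
```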

Slide 16: Simple Analyses
- Must be ready to do interesting physics measurements since we run at full luminosity
- Bd -> J/ψ Ks is straightforward
  - can be studied “online” as the data are recorded
  - results compared with measurements from BaBar and BELLE
- Other channels need a very good understanding of the detector
  - Bs -> Ds K needs an excellent understanding of the vertex detector
  - results may not come promptly with the data

Slide 17: Impact on CPU needs
- High level triggers
  - full luminosity implies full L2/L3 capacity needed from day 1
  - corresponds to 10,000 SI95 (L2) and 25,000 SI95 (L3)
- Reconstruction
  - 0.5 (duty cycle) x 0.5 (days) x 2 (reprocessings) = 0.5
  - implement half the reconstruction capacity in the first year
  - corresponds to 25,000 SI95
  - install full capacity for the second year
  - benefit from improvements in performance
  - NB the farm will be a heterogeneous system
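The first-year reconstruction scaling is just the product of the slide 12 factors; the 50,000 SI95 full capacity in the sketch is implied by "half capacity = 25,000 SI95" rather than quoted directly:

```python
# First-year reconstruction scaling (factors from slide 12, capacity from slide 17).
duty_cycle_factor   = 0.5     # 25% LHC duty cycle
days_factor         = 0.5     # ~60 days of datataking in 2005
reprocessing_factor = 2       # ~5-6 re-processings instead of 2-3

scale          = duty_cycle_factor * days_factor * reprocessing_factor  # 0.5
full_reco_si95 = 50_000       # implied: "half capacity" corresponds to 25,000 SI95

print(scale, scale * full_reco_si95)   # 0.5 25000.0
```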

Slide 18: Impact on CPU needs
- Analysis
  - reduced event sample (x0.25)
  - intensive development of analysis algorithms
  - extra load at CERN in the first months before the distributed computing model is fully operational
  - incentive to turn around the full analysis quickly (~2 days)
  - assume the full analysis power is needed from day 1: corresponds to 20,000 SI95
  - repeat the analysis every week - 10 reprocessings
    - 25 TB of data if all are kept
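For scale, turning over the reduced first-year sample (0.25 x 10^9 events, the figure on slide 19) in ~2 days with 20,000 SI95 corresponds to roughly 14 SI95-sec per event; this per-event figure is derived here, not quoted on the slide:

```python
# Implied analysis throughput: slide 18 turnaround and capacity, slide 19 event count.
events       = 0.25e9         # reduced first-year sample (slide 19)
farm_si95    = 20_000         # full analysis power assumed from day 1
turnaround_s = 2 * 24 * 3600  # ~2-day turnaround of the full analysis

si95_sec_per_event = farm_si95 * turnaround_s / events
print(round(si95_sec_per_event, 1))   # ~13.8 SI95-sec available per event per pass
```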

Slide 19: Costs
- CPU power
  - 80,000 SI95
  - 12 SFr / SI95
  - ~1 million SFr
- Data storage
  - 0.25 x 10^9 events
  - 25 TB RAW + 25 TB ESD + 25 TB AOD = 75 TB
  - tape at 0.5 SFr/GB: ~40,000 SFr
  - disk at ~3 SFr/GB for 40 TB (active): ~120,000 SFr
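The cost arithmetic works out as follows. Unit prices and volumes are from the slide; the slide's "40 GB" of active disk is read as 40 TB, since that is what reproduces the ~120,000 SFr figure, and 25 TB of RAW is consistent with 0.25 x 10^9 events at the ~100 kB/event implied by slide 9:

```python
# Cost arithmetic behind slide 19.
cpu_cost = 80_000 * 12                 # 80,000 SI95 at 12 SFr/SI95 -> 960,000 SFr (~1 MSFr)

raw_tb, esd_tb, aod_tb = 25, 25, 25
total_tb  = raw_tb + esd_tb + aod_tb   # 75 TB total for the 0.25e9-event sample
tape_cost = total_tb * 1000 * 0.5      # 0.5 SFr/GB  -> 37,500 SFr (~40 kSFr on the slide)
disk_cost = 40 * 1000 * 3              # 40 TB active disk at ~3 SFr/GB -> 120,000 SFr

print(cpu_cost, tape_cost, disk_cost)  # 960000 37500 120000
```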

Slide 20: Assignment of responsibility
- It is understood in LHCb that the institutes building a subdetector also take responsibility for the development and maintenance of its software
- The detector TDRs are in preparation now
- MOU after the TDRs

Slide 21: TDR Schedule
- Magnet: Dec 1999
- Vertex: Apr 2001
- ITR: Sep 2001
- OTR: Mar 2001
- RICH: Jun 2000
- Muon: Jan 2001
- Calorimeter: Jul 2000
- Trigger L0/L1: Jan 2002
- DAQ: Jan 2002
- Computing: Jul 2002

Slide 22: Manpower for Software

Activity                              Need  Have  Miss  Type
Software frameworks                    12     7     5   E/P
Software support                        5     2     3   E
Muon software                           6     6     0   P/E
Muon trigger                            2     2     0   P
Tracking                                6     6     0   P
Vertex                                  6     -     -   P/E
Trigger (L0,L1,L2,L3)                   7     3     4   E/P
Calorimeter (ECAL,HCAL,PREShower)       8     8     0   P
RICH                                    -     -     -   -
Total                                  52+   34+   12+

Flat profile in time. The missing manpower needs to be acquired now.

Slide 23: Manpower for facilities and operations
- Project leader: 1 FTE
- Compute farm: 2 FTE
- Network: 0.5 FTE
- Storage, media, bookkeeping: 0.5 FTE
- Servers, desktop, OS: 2 FTE (using the services of the outsource contract)
- Control room infrastructure: 1 FTE
- Utilities for day-to-day operation: 0.5 FTE
- Training shift crews, documentation: 0.5 FTE
- Link person for the LHC machine: 1 FTE
- Total: 9 FTEs
Time profile: 3 FTEs by 2002, the remainder by 2004.

