Computing Resources for ILD
Akiya Miyamoto, KEK, with help from Vincent, Mark, Junping, Frank
9 September 2014, ILD Meeting, Oshu City
A report on work in progress

Introduction
Computing design and cost are not included in the ILC TDR, because:
- a reliable estimate is difficult to make now
- the development of computing technology over the next 10+ years will be enormous
But there are requests to evaluate the cost and the human power needed for ILC computing:
- "HL-LHC needs a huge computing resource. How about ILC?"
- Funding agencies would like to know the total cost.
Steps so far:
- First discussion at the AWLC14 Sim/Rec session; several suggestions for improvements
- Request from the LCC infrastructure group, via Yasuhiro
- Ad hoc meeting within ILD in August with Vincent, Frank, Junping, Mark, Akiya

A model of ILD data processing
Online processing farm (fed by the ILD front-end), online reconstruction chain:
1. build a train of data
2. sub-detector based preliminary reconstruction
3. identify bunches of interest
4. calibration and alignment
5. background hit rejection
6. full event reconstruction
7. event classification
GRID-based offline computing:
- JOB-A: re-processing with better constants -> Offline Reconstructed Data (ORD)
- JOB-B: produce condensed data samples -> DST
- JOB-C: MC production -> MC data
Types of data:
- Raw Data (RD): train-based data, O(1) GB/sec; 3 copies, at the ILC site, US and EU
- Calibration Data (CD): data for calibration and alignment
- Online Processed Data (OPD): event-based data after filtering; consists of raw data objects and reconstructed objects; replicated to GRID sites for subsequent processing
- Fast Physics Data (FPD): DST of important physics events
- Offline Reconstructed Data (ORD): re-processed OPD, produced a few months later with new constants; same events as OPD
- DST: produced from OPD or ORD; several DSTs may be produced depending on the physics
- MC data: Monte Carlo data

Basis of the estimate: ILD raw data size in the TDR (500 GeV)
Raw data size per train:
- VXD: ~100 MB
- BeamCal: 126 MB, reduced to 5% = 6 MB
- Others: < 40 MB
Total data size: < 150 MB/train = 750 MB/sec ~ 6 Gbps (bits per second)
=> ~7.5 PB per year (10^7 sec) for ILD at 500 GeV
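The arithmetic behind these numbers can be checked with a short script. This is a sketch; the 5 Hz train repetition rate is the nominal ILC value and is an assumption here (the slide only quotes the per-train size and the resulting rates):

```python
# Back-of-the-envelope check of the ILD raw-data estimate at 500 GeV.
# From the slide: < 150 MB/train, 10^7 s of running per year.
# The 5 Hz train repetition rate is the nominal ILC value (assumption).

MB = 1e6                     # bytes in a megabyte (decimal)
TRAIN_SIZE = 150 * MB        # upper bound on raw data per train
TRAIN_RATE = 5               # trains per second (nominal ILC 5 Hz)
SECONDS_PER_YEAR = 1e7       # canonical "accelerator year"

rate_bytes = TRAIN_SIZE * TRAIN_RATE              # bytes/s -> 750 MB/s
rate_gbps = rate_bytes * 8 / 1e9                  # -> ~6 Gbit/s
yearly_pb = rate_bytes * SECONDS_PER_YEAR / 1e15  # -> ~7.5 PB/year

print(f"{rate_bytes / MB:.0f} MB/s, {rate_gbps:.0f} Gbps, {yearly_pb:.1f} PB/year")
```

The quoted 750 MB/s, ~6 Gbps and ~7.5 PB/year are mutually consistent under these inputs.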

Energy dependence of pair-background hits
The numbers of SimTracker/SimCalorimeter hits per BX were obtained by reading 100 BX of pair-background simulation files. The large number of Muon-detector hits seems inconsistent with the data-size estimate in the TDR. The ratio relative to 500 GeV was used to estimate the data size at other beam energies.

Energy dependence of pair hits
[Figure: data size per train and data size per year vs. beam energy (GeV); beam-induced hits are scaled from 500 GeV.]

Energy dependence of MC events
Procedure: 50~100 events of each type of Stdhep event were simulated with Mokka, and the CPU time and data size were estimated (the slide quotes 2380 kEvents per 1 fb^-1).
[Summary table: kEvents/fb, CPU days/fb and data size (GB)/fb for 250 GeV, 350 GeV and 500 GeV; values not legible in this transcript.]
Notes:
- High-cross-section events, such as bhabha and ee..., are not included.
- Marlin reconstruction is not taken into account yet.
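The per-luminosity procedure above can be expressed as a small scaling helper: measure CPU time and output size for a fixed sample, then scale linearly to a target integrated luminosity. The numeric inputs below are placeholders, not the (partially garbled) table values from the slide:

```python
# Sketch of the per-fb^-1 scaling used on this slide: simulate a fixed event
# sample, measure CPU and output size per fb^-1, then scale to a target
# integrated luminosity.  Example inputs are ILLUSTRATIVE, not slide values
# (only 2380 kEvents/fb^-1 is legible in the original).

def scale_to_lumi(kevents_per_fb, cpu_days_per_fb, gb_per_fb, lumi_fb):
    """Scale per-fb^-1 MC production figures to an integrated luminosity."""
    return {
        "kevents": kevents_per_fb * lumi_fb,
        "cpu_days": cpu_days_per_fb * lumi_fb,
        "size_TB": gb_per_fb * lumi_fb / 1e3,  # GB -> TB
    }

# Hypothetical example: the slide's 2380 kEvents/fb^-1 scaled to 250 fb^-1,
# with made-up CPU-time and size figures.
result = scale_to_lumi(kevents_per_fb=2380, cpu_days_per_fb=10.0,
                       gb_per_fb=50.0, lumi_fb=250)
print(result)
```

High-cross-section processes (bhabha, etc.) are excluded on the slide, so any real total would need an extra factor on top of this linear scaling.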

Total storage and CPU: assumptions for the estimate
Storage:
- Online Processed Data (OPD) = RD x 0.02 x 2.03
  - 0.02: signal and ECAL bhabha events, < 1% of BXs
  - 2.03: from MC, (rec+dst)/sim ~ 2.03
- Offline Reconstructed Data (ORD) = OPD x 1.5 (~ RD x 0.02 x 3)
- MC data:
  - x10 real-data statistics
  - sim+rec+dst ~ 2.03 x sim
  - x2 for bhabha, eemumu, ...
Data replication:
- Raw data: 3 copies, at the ILC site, EU and USA
- OPD, ORD and MC DSTs at 10 major GRID sites
CPU time:
- MC data: CPU time for 10 times more statistics than raw data; another factor 2 for bhabha, eemumu, ...; Marlin CPU time = 0.2 x Mokka CPU time
- Online processed data: CPU time/event = 0.2 x Mokka CPU time; number of events to be processed = 2 x number of signal events (factor 2 for bhabha, etc.)
- Offline processed data: same as OPD
- Computing efficiency: 90%
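A literal application of the storage multipliers listed above can be sketched as follows. This is only an illustration of how the factors combine: the slide leaves the MC baseline somewhat ambiguous (whether the 2.03 ratio multiplies the sim sample or is the total), so the MC line here is one possible reading, flagged in the comments:

```python
# Sketch applying the storage assumptions above to a raw-data volume.
# Factors are those quoted on the slide: 0.02 signal fraction,
# 2.03 = (rec+dst)/sim, x1.5 for reprocessing, x10 statistics and
# x2 (bhabha etc.) for MC.  The MC formula is one possible reading
# of the slide, not a confirmed recipe.

def storage_breakdown(raw_pb):
    """Return per-category storage in PB for one copy of each dataset."""
    opd = raw_pb * 0.02 * 2.03            # Online Processed Data
    ord_ = opd * 1.5                      # Offline Reconstructed Data
    mc = raw_pb * 0.02 * 10 * 2 * 2.03    # MC: x10 stats, x2 bhabha etc.
    return {"raw": raw_pb, "OPD": opd, "ORD": ord_, "MC": mc}

# 7.5 PB/year of raw data at 500 GeV (from the earlier slide).
breakdown = storage_breakdown(7.5)
print({k: round(v, 3) for k, v in breakdown.items()})
```

Replication (3 raw copies, DSTs at 10 GRID sites) would then multiply the relevant categories; those factors are deliberately left out of the single-copy numbers above.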

Summary of data size and CPU for ~1 year (10^7 sec) of running (preliminary)
[Table not reproduced in this transcript.]

Evolution of storage capacity based on a sample running scenario
[Figure: integrated luminosity (1/fb) and data size (PB) vs. year from T0, with markers at 3, 4.5 and 5 years.]
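A plot like this is just a cumulative sum of the per-year data volumes implied by the running scenario. The scenario below (years and PB/year values) is illustrative only; the actual slide used a specific ILC running scenario not reproduced in this transcript:

```python
# Sketch of how the storage-evolution curve is built: cumulative raw-data
# volume vs. year from T0, given per-year data volumes from a running
# scenario.  The yearly volumes below are HYPOTHETICAL placeholders.

yearly_raw_pb = [2.0, 2.0, 2.0, 4.0, 4.0, 7.5, 7.5]  # PB/year, made up

cumulative = []
total = 0.0
for pb in yearly_raw_pb:
    total += pb
    cumulative.append(total)

print(cumulative)  # running total of stored raw data, in PB
```

Replication factors (3 raw copies, replicated OPD/ORD/DST) would scale such a curve category by category.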

Summary
A preliminary result on the computing resources necessary for ILD has been presented. The numbers used for the estimate are very preliminary and subject to change; they need to be improved or confirmed. For example:
- efficiency of non-signal event filtering
- CPU time of online data processing, incl. calibration, alignment, filtering, ...
- consistency of the raw data size in the TDR with the pair-background simulation
- how many MC events do we produce?
- how many reprocessings do we need?
- size of disk storage?
- resources during the construction phase?
- ... more
An independent estimate by SiD would help.
Comments/suggestions/thoughts are highly welcome.