Computing Resources for ILD
Akiya Miyamoto, KEK, with help from Vincent, Mark, Junping, and Frank
9 September 2014, ILD Meeting, Oshu City
A report on work in progress

Introduction

Computing design and cost are not included in the ILC TDR, because
- it is difficult to make a reliable estimate now
- computing technology will develop enormously over the next 10+ years

But there are requests to evaluate the cost and the human power needed for ILC computing:
- "HL-LHC needs a huge computing resource. How about ILC?"
- Funding agencies would like to know the total cost.

First discussion at the AWLC14 Sim/Rec session -> several suggestions for improvements
- LCC infrastructure group's request from Yasuhiro
- Ad hoc meeting within ILD in August with Vincent, Frank, Junping, Mark, Akiya

A model of ILD data processing

Online reconstruction chain (ILD F.E. -> Online Processing Farm, O(1) GB/sec):
1. build train data
2. sub-detector based preliminary reconstruction
3. identify bunches of interest
4. calibration and alignment
5. background hit rejection
6. full event reconstruction
7. event classification

GRID-based offline computing:
- JOB-A: re-processing with better constants -> Offline Reconstructed Data (ORD)
- JOB-B: produce condensed data samples -> DST
- JOB-C: MC production -> MC data

Types of data:
- Raw Data (RD): train-based data; 3 copies, at the ILC site, US, and EU
- Calibration Data (CD): data for calibration & alignment
- Online Processed Data (OPD): event-based data after filtering; consists of raw data objects and reconstructed objects; replicated to GRID sites for subsequent processing
- Fast Physics Data (FPD): DST of important physics events
- Offline Reconstructed Data (ORD): re-processed OPD, produced a few months later with new constants; same events as OPD
- DST: DST from OPD or ORD; several DSTs may be produced depending on physics
- MC data: Monte Carlo data
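As a bookkeeping aid only, the tier structure above can be written down directly; a minimal sketch (the class itself and the CD replica count are my assumptions; the other replica counts come from the assumptions slide later in the talk):

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class DataTier:
    name: str                    # tier label as used in the talk
    derived_from: Optional[str]  # parent tier; None for detector/generator output
    replicas: int                # copies kept worldwide

# Tier layout and replica counts as stated in the talk (3 raw-data copies;
# OPD/ORD/MC/DST replicated to 10 major GRID sites). CD count is assumed.
TIERS = [
    DataTier("RD",  None,  3),   # raw, train-based data
    DataTier("CD",  None,  1),   # calibration & alignment data
    DataTier("OPD", "RD",  10),  # event-based data after online filtering
    DataTier("FPD", "OPD", 10),  # DST of important physics events
    DataTier("ORD", "OPD", 10),  # re-processed OPD with new constants
    DataTier("DST", "ORD", 10),  # physics-dependent DSTs
    DataTier("MC",  None,  10),  # Monte Carlo data
]
```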

Basis of estimation: ILD raw data size in the TDR (500 GeV)

Raw data size per train:
- VXD: ~ 100 MB
- BeamCal: 126 MB, reduced to 5% = 6 MB
- Others: < 40 MB

Total data size: < 150 MB/train = 750 MB/sec ~ 6 Gbps (bits per sec)
-> at 500 GeV, ~ 7.5 PB per year (10^7 sec) for ILD
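The totals follow from the ILC train rate; a quick check of the arithmetic (the 5 Hz train repetition rate is the nominal ILC value, not stated on the slide):

```python
# Back-of-the-envelope check of the raw-data numbers above.
MB_PER_TRAIN = 150       # < 150 MB/train (VXD ~100 + BeamCal 6 + others < 40)
TRAINS_PER_SEC = 5       # nominal ILC train repetition rate (assumption: 5 Hz)
SECONDS_PER_YEAR = 1e7   # one "accelerator year" as used in the talk

rate_MB_s = MB_PER_TRAIN * TRAINS_PER_SEC        # 750 MB/s
rate_Gbps = rate_MB_s * 8 / 1000                 # ~6 Gbit/s
yearly_PB = rate_MB_s * SECONDS_PER_YEAR / 1e9   # ~7.5 PB/year

print(f"{rate_MB_s:.0f} MB/s ~ {rate_Gbps:.0f} Gbps -> {yearly_PB:.1f} PB/year")
```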

Energy dependence of pair hits

The number of SimTracker/SimCalorimeter hits per BX was obtained by reading 100 BX of pair-background simulation files. The large number of Muon hits seems inconsistent with the data size estimation in the TDR. The ratio relative to 500 GeV was used to estimate the data size at different beam energies.
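A minimal sketch of the hit-counting step, assuming pyLCIO is available and the pair-background files are in LCIO format; the file name is a placeholder, not the one actually used for the talk:

```python
from pyLCIO import IOIMPL

reader = IOIMPL.LCFactory.getInstance().createLCReader()
reader.open("pairs_500GeV.slcio")   # hypothetical file name

counts, nBX = {}, 0
for event in reader:                 # one LCIO event per bunch crossing
    nBX += 1
    for name in event.getCollectionNames():
        col = event.getCollection(name)
        counts[name] = counts.get(name, 0) + col.getNumberOfElements()
reader.close()

for name, n in sorted(counts.items()):
    print(f"{name}: {n / nBX:.1f} hits/BX (averaged over {nBX} BX)")
```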

Energy dependence of pair hits

[Table: data size per train and data size per year vs beam energy (GeV), with the beam-induced hits scaled by the pair-hit ratios above.]
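A sketch of the scaling behind that table; the ratio values and the split of the train size are placeholders standing in for the real numbers from the pair-simulation study:

```python
# Beam-induced contribution to the train size is scaled by the pair-hit
# ratio relative to 500 GeV. All values below are hypothetical.
PAIR_HIT_RATIO = {250: 0.4, 350: 0.6, 500: 1.0, 1000: 2.5}  # placeholder ratios

BEAM_INDUCED_MB = 140   # beam-induced part of the 500 GeV train (assumption)
OTHER_MB = 10           # energy-independent remainder (assumption)

for energy, ratio in sorted(PAIR_HIT_RATIO.items()):
    train_MB = OTHER_MB + BEAM_INDUCED_MB * ratio
    year_PB = train_MB * 5 * 1e7 / 1e9      # 5 trains/s, 1e7 s/year
    print(f"{energy:5d} GeV: {train_MB:6.1f} MB/train, {year_PB:5.2f} PB/year")
```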

Energy dependence of MC events

Procedure: 50-100 events of each type of stdhep event were simulated with Mokka, and the CPU time and data size were estimated.

[Summary table: energy (250/350/500 GeV) vs kEvents/fb, CPU days/fb, and data size (GB)/fb; one recoverable entry reads 2380 kEvents per fb^-1.]

Notes:
- High cross-section events, such as Bhabha and ee -> ..., are not included.
- Marlin reconstruction is not taken into account yet.
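A sketch of how the per-fb figures turn into a production estimate; the structure of the calculation follows the talk (x10 real-data statistics, from the assumptions slide below), but the per-fb values here are placeholders for the table entries:

```python
# Turn per-fb MC costs into totals for a given integrated luminosity.
PER_FB = {  # energy (GeV) -> (kEvents/fb, CPU days/fb, GB/fb) -- hypothetical
    250: (2380, 40, 500),
    350: (1500, 35, 450),
    500: (1000, 30, 400),
}
MC_OVER_DATA = 10   # produce 10x the real-data statistics (from the talk)

def mc_campaign(energy_GeV, int_lumi_fb):
    kev, cpu_days, gb = PER_FB[energy_GeV]
    scale = int_lumi_fb * MC_OVER_DATA
    return kev * scale, cpu_days * scale, gb * scale / 1e3  # kEvents, days, TB

kev, days, tb = mc_campaign(250, 250)   # e.g. 250 fb^-1 at 250 GeV
print(f"{kev:.3g} kEvents, {days:.3g} CPU days, {tb:.3g} TB")
```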

Total storage and CPU: assumptions for the estimation

Storage:
- Online Processed Data (OPD) = RD x 0.02 x 2.03
  - 0.02: # signal and ECAL Bhabha events < 1% of BX
  - 2.03: from MC, (rec+dst)/sim ~ 2.03
- Offline Reconstructed Data (ORD) = OPD x 1.5 (~ RD x 0.02 x 3)
- MC data:
  - x10 real data statistics
  - sim+rec+dst ~ 2.03 x sim
  - x2 for bhabha + eemumu + ...

Data replication:
- Raw data: 3 copies, at the ILC site, EU, and USA
- OPD, ORD, MC, DST at 10 major GRID sites

CPU time:
- MC data: CPU time for 10 times the Raw Data statistics; another factor 2 for bhabha, eemumu, ...; Marlin CPU time = 0.2 x Mokka CPU time
- Online Processed Data: CPU time/event = 0.2 x Mokka CPU time; number of events to be processed = 2 x number of signal events (2 for Bhabha, etc.)
- Offline Reconstructed Data: same as OPD
- Computing efficiency: 90%
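These assumptions translate directly into a small calculator. A sketch: the raw-data volume per year is taken from the TDR slide earlier; the base used for the MC volume is my assumption and may differ from the talk's exact bookkeeping:

```python
# Storage estimate per year of running, from the factors on this slide.
RD_PB = 7.5            # raw data per 10^7 s year at 500 GeV (TDR slide)
FILTER = 0.02          # signal + ECAL Bhabha events kept: 2% of BX
REC_DST_OVER_SIM = 2.03
MC_STATS = 10          # 10x real-data statistics
MC_EXTRA = 2           # bhabha, eemumu, ...

opd_PB = RD_PB * FILTER * REC_DST_OVER_SIM   # online processed data
ord_PB = opd_PB * 1.5                        # offline re-processing
# Assumption: MC volume scaled from the filtered data volume.
mc_PB = RD_PB * FILTER * MC_STATS * MC_EXTRA * REC_DST_OVER_SIM

# Replication: 3 raw-data copies; derived data at 10 major GRID sites.
total_PB = RD_PB * 3 + (opd_PB + ord_PB + mc_PB) * 10
print(f"OPD {opd_PB:.2f} PB, ORD {ord_PB:.2f} PB, MC {mc_PB:.2f} PB, "
      f"total with replicas ~ {total_PB:.0f} PB/year")
```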

Summary of data size and CPU for ~1 year (10^7 sec) of running (preliminary)

[Table of data sizes and CPU requirements per year; marked preliminary.]

Evolution of storage capacity based on a sample running scenario

[Plot: integrated luminosity (fb^-1) and data size (PB) vs year from T0, with milestones at roughly 3, 4.5, and 5 years.]
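A sketch of how such a curve is produced from a running scenario; the luminosity profile and the storage coefficient are invented for illustration, only the raw-data rate comes from the earlier slide:

```python
# Cumulative storage vs year from T0 under a sample running scenario.
LUMI_PER_YEAR_FB = [0, 50, 100, 150, 200, 250]  # hypothetical ramp-up
RAW_PB_PER_YEAR = 7.5   # time-based raw data (TDR slide), simplified
PB_PER_FB = 0.05        # derived data added per fb^-1 (placeholder)

total_fb = total_PB = 0.0
for year, lumi in enumerate(LUMI_PER_YEAR_FB, start=1):
    total_fb += lumi
    total_PB += RAW_PB_PER_YEAR + lumi * PB_PER_FB
    print(f"year {year} from T0: {total_fb:6.0f} fb^-1, {total_PB:6.1f} PB")
```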

Summary

A preliminary result on the computing resources necessary for ILD has been presented. The numbers used for the estimation are very preliminary, subject to change, and need to be improved or confirmed. For example:
- efficiency of non-signal event filtering
- CPU time of online data processing, incl. calibration, alignment, filtering, ...
- consistency of the raw data size in the TDR with the pair-background simulation
- how many MC events do we produce?
- how many reprocessing passes do we need?
- size of disk storage?
- resources during construction?
- ... more

An independent estimation by SiD will help. Comments/suggestions/thoughts are highly welcome.

SiD's task(s) (Jan Strube -- summary of Akiya's slides)

(From what I understand) the main purpose of this effort is to understand the spending profile for (all aspects of) the ILC campus.

ILD has also prepared manpower estimates:
- SiD should come up with its own estimates, then we compare.

Some things will be shared with ILD; others will be taken care of by the detector collaborations:
- We have some catching up to do before we can compare numbers with ILD.
- There are a few things we can/should discuss together, but we also need to put in some work to understand the details, e.g. event filtering.

Akiya did basically all of this by himself:
- We should try to split up tasks to catch up faster.