1 First Considerations on LNF Tier2 Activity E. Vilucchi January 2006

2 Hypothesis: LNF Tier2 Computing Resources [1]
Resource use:
30% of resources for SIMULATION;
30% of resources for ANALYSIS;
30% of resources for CALIBRATION.
[Table: Year, CPU (kSI2K), Worker Nodes, Disk (TB), Electrical Power (kW), Number of Racks]
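Since three 30% slices leave 10% of the capacity uncommitted, a minimal Python sketch of the split may help; the 400 kSI2K total is a hypothetical placeholder, not a number from the slide:

```python
# Hypothetical split of the LNF Tier2 CPU across the three activities,
# following the 30%/30%/30% shares on this slide. TOTAL_CPU_KSI2K is an
# invented placeholder, not a figure from the slide's table.
TOTAL_CPU_KSI2K = 400  # hypothetical total

shares = {"simulation": 0.30, "analysis": 0.30, "calibration": 0.30}

for activity, share in shares.items():
    print(f"{activity:>12}: {share * TOTAL_CPU_KSI2K:6.1f} kSI2K")

# Three 30% slices leave 10% uncommitted (services, headroom, etc.).
unallocated = 1.0 - sum(shares.values())
print(f"{'unallocated':>12}: {unallocated * TOTAL_CPU_KSI2K:6.1f} kSI2K")
```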

3 CPU Increment

4 Disk Increment

5 Work Proposal [2]
[Table: Year, Simulation (10^6 events), Analysis (users), Residual CPU for calibration (kSI2K), Residual disk space for calibration (TB)]

6 Simulated events: keeping the data in the Tier2 for one year

7 User analysis supported

8 Calibration [3]
Proposal for the calibration and alignment of the muon spectrometer by the muon groups of Michigan, Munich, Rome (La Sapienza, Roma Tre) and Saclay, October 2005.
Work subdivision (see the sketch below):
CERN: splits the data into sub-streams (not stored): barrel, end-caps, alignment;
Tier1: stores the data on tape and sends them to the Tier2s;
University of Michigan: spectrometer end-caps;
Munich: spectrometer barrel and alignment tasks;
Rome (La Sapienza and Roma Tre): spectrometer barrel.
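The subdivision is essentially a routing table from calibration sub-streams to the centres that process them. A minimal Python sketch of that mapping; the dictionary and the route helper are illustrative, not part of the proposal:

```python
# Illustrative mapping of calibration sub-streams to responsible centres,
# as described in [3]. CERN splits the stream without storing it; the
# Tier1 archives it to tape and forwards each sub-stream downstream.
SUBSTREAM_CENTRES = {
    "end-caps": ["University of Michigan"],
    "barrel": ["Munich", "Rome (La Sapienza and Roma Tre)"],
    "alignment": ["Munich"],
}

def route(substream):
    """Return the centres that process a given calibration sub-stream."""
    if substream not in SUBSTREAM_CENTRES:
        raise ValueError("unknown sub-stream: %r" % substream)
    return SUBSTREAM_CENTRES[substream]

for s in SUBSTREAM_CENTRES:
    print("%-9s -> %s" % (s, ", ".join(route(s))))
```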

9 Data requirements for each calibration centre:
t0 computed weekly: ~20000 muon hits per tube, hence ~10^8 muon tracks per day; since one muon track is ≈1 kB (while one full event is ≈1.6 MB), this amounts to ≈100 GB per day;
r-t relation computed daily: uses the same data collected for the t0 computation;
alignment parameters updated every 2-4 hours: 2×10^6 muon tracks, negligible w.r.t. the 10^8 calibration tracks;
calibration and alignment validation: internal consistency of the t0 and r-t functions (still under study) and physics checks on a small fraction of the ATLAS data stream (~10^5 events/day: 100 CPU hours/day, 200 GB/day).
Computing requirements for each calibration centre:
CPU for calibration: 0.2 s/track × 10^8 tracks ≈ 2×10^7 s per day ≈ 6000 CPU hours per day ≈ 300 processors ≈ 300 kSI2K;
CPU for alignment: 1 s/track × 2×10^6 tracks = 2×10^6 s per day, negligible w.r.t. the calibration CPU;
disk: ≈300 GB per day; if the data are kept for one month, 9 TB of disk space.
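These figures are simple products of the per-track costs; a minimal Python sketch reproducing them (the 1 processor ≈ 1 kSI2K equivalence is the slide's implicit assumption, made explicit here):

```python
# Back-of-envelope reproduction of the slide's estimates.
TRACKS_PER_DAY = 1e8   # ~20000 hits/tube collected weekly -> ~10^8 tracks/day
KB_PER_TRACK = 1.0     # one muon calibration track is about 1 kB

# Data volume: calibration stream plus the ~200 GB/day validation sample.
calib_gb_per_day = TRACKS_PER_DAY * KB_PER_TRACK / 1e6  # kB -> GB: ~100 GB
total_gb_per_day = calib_gb_per_day + 200               # ~300 GB/day
disk_tb_per_month = total_gb_per_day * 30 / 1e3         # kept one month: ~9 TB

# CPU: 0.2 s/track gives 2x10^7 CPU seconds per day; spread over 24 h that
# is ~230 concurrent processors, which the slide rounds up to ~300
# (~300 kSI2K, taking 1 processor ~ 1 kSI2K).
cpu_s_per_day = 0.2 * TRACKS_PER_DAY
cpu_h_per_day = cpu_s_per_day / 3600    # ~5600 h/day, quoted as ~6000
processors = cpu_s_per_day / 86400      # ~230, quoted as ~300

print(f"calibration data: {calib_gb_per_day:.0f} GB/day")
print(f"total data      : {total_gb_per_day:.0f} GB/day -> {disk_tb_per_month:.0f} TB/month")
print(f"CPU             : {cpu_h_per_day:.0f} CPU h/day, ~{processors:.0f} processors")
```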

10 Human resources per calibration centre:
supervision of the transfer process, of the transmission links, and reaction to errors: 0.5 FTE;
running and debugging of the calibration software: 0.25 FTE (0.45 FTE in centres also responsible for alignment tasks);
maintenance and update of the calibration software: 0.5 FTE (an additional 0.5 FTE for the alignment task);
calibration, alignment and validation: 2-3 FTE (detector or analysis experts).
Once some experience is gained, part of this manpower can be replaced by automated systems and these estimates can be relaxed.
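Summing these figures gives the total staffing implied per centre; a minimal Python sketch, where the split between "calibration-only" and "with alignment" centres follows the slide's wording and the helper itself is illustrative:

```python
# Total manpower per centre implied by the FTE figures above.
def total_fte(alignment_centre):
    """Return (min, max) FTE for one calibration centre."""
    transfer = 0.50                               # supervision of transfers
    running = 0.45 if alignment_centre else 0.25  # running and debugging
    maintenance = 0.50 + (0.50 if alignment_centre else 0.0)  # sw upkeep
    experts = (2.0, 3.0)                          # calibration/validation
    base = transfer + running + maintenance
    return base + experts[0], base + experts[1]

for kind, flag in (("calibration only", False), ("with alignment", True)):
    lo, hi = total_fte(flag)
    print(f"{kind:>16}: {lo:.2f}-{hi:.2f} FTE")
```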

11 References
1. "Progetto di massima Centro Tier2 ATLAS LNF", M. L. Ferrer, M. Pistoni, 2005.
2. "ATLAS Computing Technical Design Report", ATLAS Computing Group, ATLAS TDR-017, CERN-LHCC-2005-022, July 2005.
3. "A proposal for the calibration and alignment of the ATLAS muon spectrometer", P. Bagnaia, O. Biebel, C. Guyot, H. Kroha, D. Orestano, B. Zhou, October 30, version 2.