Computing for LHCb-Italy
Domenico Galli and Umberto Marconi
Dipartimento di Fisica and INFN Bologna
Genève, July 5, 2000

Milestones
- June 15: the LHCb-Italy computing plan was presented to INFN "Commissione I".
- July 4: the detailed LHCb-Italy request was submitted to the INFN referees.
- September 7: we will meet the INFN referees.
- September 15: the response of "Commissione I" on the 2001 investments is expected.

Plans for Monte Carlo Production from 2001 to 2005
- In 2001 we will produce ~2×10^8 simulated events for detector and trigger optimisation (detector TDRs expected in 2001 and early 2002).
- From 2002 onwards we will start to produce very large samples of simulated events, in particular background, for physics studies (2.4×10^6 evt in 2002, 7×10^6 in 2003, 1.5×10^7 in 2004, …).
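As a back-of-the-envelope cross-check, the CPU demand implied by such a production target is simply the number of events times the per-event simulation cost. A minimal sketch in Python, where the per-event cost in SI95 seconds is an assumed illustrative figure, not the official LHCb estimate:

```python
# Back-of-the-envelope CPU demand for a Monte Carlo campaign.
# si95_per_event is an assumed illustrative cost, not the official figure.
def cpu_demand_si95_s(n_events: float, si95_per_event: float) -> float:
    """Total CPU time (SI95 seconds) needed to simulate n_events."""
    return n_events * si95_per_event

# ~2e8 events for detector and trigger optimisation in 2001:
total = cpu_demand_si95_s(2e8, 1500.0)  # assume ~1500 SI95 s/event end to end
print(f"total: {total:.1e} SI95 s")
```

Dividing such a total by the SI95 rating of one CPU and by the seconds in a production period gives the number of processors to buy, which is how the yearly acquisition figures below are derived.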

Statistics needed for trigger optimisation

             Signal                      Minimum bias
Trigger      Events [#]  CPU [SI95 s]    Events [#]  CPU [SI95 s]
L0           1.5×10^8    …               …           …
L1           3×10^…      …               …           …
L2/3         5.5×10^11   …               …           …

CPU time needed for L0 trigger optimisation

                 Events [#]       CPU time/evt [SI95 s/evt]   Total CPU time [SI95 s]
                 Signal   M.B.    Signal   M.B.
Generation       …        …       …        …                  1.5×10^7
Tracking         …        …       …        …                  1.5×10^8
Digitisation     …        …       …        …                  1.5×10^7
Reconstruction   …        …       …        …                  1.5×10^7
Total            …        …       1550     …                  5×10^8

CPU time needed for L1 trigger optimisation

                 Events [#]       CPU time/evt [SI95 s/evt]   Total CPU time [SI95 s]
                 Signal   M.B.    Signal   M.B.
Generation       …        …       …        …                  3×10^9
Tracking         …        …       …        …                  3×10^9
Digitisation     …        …       …        …                  3×10^8
Reconstruction   …        …       …        …                  3×10^9
Total            …        …       …        …                  …

CPU time needed for L2 trigger optimisation

                 Events [#]       CPU time/evt [SI95 s/evt]   Total CPU time [SI95 s]
                 Signal   M.B.    Signal   M.B.
Generation       …        …       …        …                  5.5×10^…
Tracking         …        …       …        …                  5.5×10^…
Digitisation     …        …       …        …                  5.5×10^…
Trigger L0/L1    …        …       …        …                  5.5×10^…
Reconstruction   …        …       …        …                  1.5×10^8
Total            …        …       1650     …                  2×10^11
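The stage breakdown in the tables above follows the simple cost model total CPU = Σ over stages of N_events × t_stage. A sketch of that accounting, with placeholder per-stage costs in SI95 s/evt (illustrative assumptions, not the figures from the tables):

```python
# Total CPU as a sum over processing stages: sum(N_events * t_stage).
# Stage costs (SI95 s/evt) are placeholder assumptions for illustration.
STAGE_COST = {
    "generation": 100.0,
    "tracking": 1000.0,       # tracking dominates the per-event cost
    "digitisation": 100.0,
    "reconstruction": 350.0,
}

def total_cpu_si95_s(n_events: float) -> float:
    """CPU time (SI95 s) to push n_events through every stage."""
    return sum(n_events * cost for cost in STAGE_COST.values())

print(f"{total_cpu_si95_s(1.5e6):.3e} SI95 s")
```

The same loop, run separately over signal and minimum-bias samples, reproduces the two CPU columns of the tables.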

Total Requirements for LHCb Tier-1

Signal events [a⁻¹]:          …
Background events [a⁻¹]:      2×10^…
CPU for signal [SI95]:        …
CPU for background [SI95]:    …×10^4
CPU for analysis [SI95]:      …×10^4
RAWmc disc [TB]:              …
RAWmc tape [TB]:              …
ESDmc disc [TB]:              …
AOD disc [TB]:                …
TAG disc [TB]:                …

Total Requirements for LHCb Tier-1 (II)
[chart: requirements by year, in SI95 and TB]

Incremental Acquisition of Equipment for LHCb Tier-1
- Resources split among 5 regional centres.
- In steady state (starting from 2006), 30% of the CPU and 20% of the disc are replaced every year.

CPU [SI95]:   …
Disc [TB]:    …
Tape [TB]:    …
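The steady-state policy above (renew 30% of CPU and 20% of disc each year) translates into a simple yearly purchasing rule. A sketch, where the installed capacities are assumed placeholder values, not the actual Tier-1 figures:

```python
# Yearly hardware renewal under a rolling-replacement policy:
# buy back a fixed fraction of the installed capacity every year.
CPU_FRACTION = 0.30   # fraction of CPU replaced per year (from the plan)
DISC_FRACTION = 0.20  # fraction of disc replaced per year (from the plan)

def yearly_purchase(installed: float, fraction: float) -> float:
    """Capacity to buy this year to renew `fraction` of what is installed."""
    return installed * fraction

cpu_si95 = 5.0e4  # assumed installed CPU capacity [SI95]
disc_tb = 50.0    # assumed installed disc capacity [TB]
print(yearly_purchase(cpu_si95, CPU_FRACTION), "SI95 of CPU per year")
print(yearly_purchase(disc_tb, DISC_FRACTION), "TB of disc per year")
```

A rolling replacement keeps the farm's average hardware age bounded (about 3 years for CPU, 5 for disc at these fractions) while spreading the cost evenly across budget years.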

The INFN Tier-1 for LHCb
- LHCb-Italy plans to concentrate its Tier-1 Regional Centre at a single site (unlike the other Italian LHC groups), housed in a "consorzio di calcolo" chosen on grounds of economic convenience.
- Only one installation, with the same architecture as the other European Tier-1 centres.
- No problems with WAN usage (routing, data-transfer optimisation to economise on bandwidth, etc.).
- No organisation and synchronisation requirements.
- Remote control of concentrated computing resources requires less bandwidth than geographically distributed data and CPUs.
- GRID middleware will allow the institutes transparent remote control of the concentrated resources.

The INFN Tier-1 for LHCb (II)
The "consorzio" is responsible for housing the computing resources:
- links to the electricity grid and to the computing network;
- air conditioning;
- uninterruptible power supply;
- site guarding.
We plan to outsource system administration to the "consorzio":
- participation in system installation;
- administration and monitoring;
- intervention in case of hangs;
- intervention to restore LAN or WAN connectivity;
- backup procedures;
- operating-system updates and patching.

Investment Plan for LHCb-Italy

CPU [k€]:                           …
Switch [k€]:                        126
Rack [k€]:                          10515
Hard disc [k€]:                     …
Tape [k€]:                          …
Tape driver [k€]:                   400
Total hardware [k€]:                …
CPU number (integral) [#]:          …
Electric power [k€]:                …
Housing [k€]:                       …
System administration staff [k€]:   …
Total operating costs [k€]:         …
Grand total [k€]:                   …

Computing Group of LHCb-Italy

Name                     Site   Test-bed   HEP applications   DATAGRID   Total
Maurizio Bonesini        MI     10%        -                  -          10%
Walter Bonivento         CA     20%        -                  -          20%
Domenico Galli           BO     40%        -                  30%        70%
Alberto Gianoli          FE     50%        -                  -          50%
Giacomo Graziani         FI     20%        20%                -          40%
Umberto Marconi          BO     40%        -                  30%        70%
Marco Paganoni           MI     20%        -                  -          20%
Giovanni Passaleva       FI     20%        -                  -          20%
Roberta Santacesaria     RM1    -          20%                -          20%
Nicola Semprini Cesari   BO     -          30%                -          30%
Vincenzo Vagnoni         BO     30%        20%                -          50%
Stefania Vecchi          BO     -          40%                -          40%
Total                           250%       130%               60%        440%
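The totals in the manpower table are plain row and column sums. A small cross-check in Python, using three rows whose shares and totals are given in the table:

```python
# Cross-check of the manpower table: a person's total is the sum of
# their activity shares. Values are taken from the table (percent FTE).
people = {
    "Domenico Galli":   {"test_bed": 40, "hep_app": 0,  "datagrid": 30},
    "Umberto Marconi":  {"test_bed": 40, "hep_app": 0,  "datagrid": 30},
    "Vincenzo Vagnoni": {"test_bed": 30, "hep_app": 20, "datagrid": 0},
}
row_totals = {name: sum(shares.values()) for name, shares in people.items()}
print(row_totals)  # expect 70, 70 and 50, matching the Total column
```

Summing each activity over all twelve people in the same way reproduces the column totals (250%, 130%, 60%) and the grand total of 440% FTE.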