LHCb thinking on Regional Centres and Related activities (GRIDs)

Presentation transcript:

LHCb thinking on Regional Centres and Related activities (GRIDs)
F Harris, LHCb Software Workshop, 5 April 2000

Overview of presentation
- Country situations and a possible LHCb model for Regional Centres
- Status of the EU Grid proposal and LHCb involvement
- Comments on the LHCb attitude to tapes vs. disks (and some related points)

Overview of current situation
DISCLAIMER: nothing is 'agreed' in the MoU sense (that requires negotiations within the collaboration and with the funding agencies), but we have the following viewpoint:
- We are trying to apply the (1/3, 2/3) rule overall.
- Good candidates for regional centres:
  - Tier-1: Lyon, INFN, RAL, NIKHEF
  - Tier-2: Liverpool, Glasgow/Edinburgh
- Discussions going on: Russia (Tier-1 for all experiments? networking?), Switzerland (Tier-2 centre for LHCb?), Germany (LHCb use of a national centre?)
- Discussions just beginning: Spain, Poland, Brazil

Strategy for LHCb country computing planning
- Make the case to funding agencies based on:
  - detector (etc.) studies in 2001-2002
  - physics and trigger studies up to startup
- By startup, have facilities in place to match the pro-rata requirement for the whole experiment (see the experiment model).
- Each country has its own constraints (financial, existing infrastructure, etc.), leading to different possibilities for Tier-1/2 centres.
- Get involved in Grid-related activities as appropriate (manpower?).

For example: planning in the UK
Computing requirements in 2001-2003 for UK/LHCb are dominated by detector (RICH + VELO) construction plus some trigger optimisation (physics background studies in general start late 2003, though some are under way now):

  Year   CPU (PC99)   Storage (TB)
  2001   200-400      5-10
  2002   200-400      5-10
  2003   400-600      10-20

Satisfied(?) by MAP (Liverpool) + JIF (all 4 LHC experiments). The JIF proposal (result known late 2000) covers all 4 experiments, plus a networking enhancement:

  Year   CPU (PC99)   Storage (TB)
  2001   830          25
  2002   1670         50
  2003   3100         125

A possible LHCb model for Regional Centres

CERN (Tier 0):
- Real data: generates RAW (100 kB/event); reconstructs ESD (100 kB/event), AOD (20 kB/event), TAG (~100+ B/event); stores RAW + ESD + AOD + TAG.
- Monte Carlo: imports samples of RAW + ESD; imports all AOD + TAG.
- Analysis for the 'CERN' community.
- But we want a Grid, not a hierarchy; see the next slide.

Regional Centres, Tier 1 (INFN, RAL, IN2P3, ...):
- Real data: import samples of RAW + ESD; import all AOD + TAG.
- Monte Carlo: generate RAW (200 kB/event); reconstruct ESD (100 kB/event), AOD (30 kB/event), TAG (~100+ B/event); import AOD + TAG from other centres.
- Analysis according to the scale of the centre (national, regional, university).

Tier 2 (Liverpool, Glasgow, Edinburgh, University n, ...), down to department and desktop:
- Analysis with ntuples + AOD + ESD + RAW (10^5 events take ~100 GB).
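For orientation, a minimal sketch tallying the per-event footprints quoted above, assuming decimal kB/GB and taking "~100+ B" for TAG as 0.1 kB. The ntuple size is not specified on the slide, so the ~100 GB figure for a 10^5-event department/desktop sample is not reproduced by the RAW + ESD + AOD + TAG tiers alone:

```python
# Per-event sizes (kB) as quoted on the slide ("~100+ B" for TAG taken as 0.1 kB).
real_kb = {"RAW": 100, "ESD": 100, "AOD": 20, "TAG": 0.1}
mc_kb   = {"RAW": 200, "ESD": 100, "AOD": 30, "TAG": 0.1}

def dataset_gb(sizes_kb, n_events):
    """Total size in GB (decimal) of n_events events summed over the listed tiers."""
    return sum(sizes_kb.values()) * 1e3 * n_events / 1e9

N = 10**5  # the analysis sample size quoted at the bottom of the slide
print(f"real data, RAW+ESD+AOD+TAG, 1e5 events: {dataset_gb(real_kb, N):.0f} GB")  # ~22 GB
print(f"MC data,   RAW+ESD+AOD+TAG, 1e5 events: {dataset_gb(mc_kb, N):.0f} GB")    # ~33 GB
# The slide's "~100 GB" for 10^5 events also includes ntuples,
# whose per-event size is not given here.
```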

More realistically: a Grid topology
- Tier 0: CERN
- Tier 1: INFN, IN2P3, RAL, etc.
- Tier 2: Liverpool, Glasgow, Edinburgh, etc.
- Department and desktop users

EU Grid proposal status (http://grid.web.cern.ch/grid/)
- EU reaction to the pre-proposal of 30 M Euro: come back with a proposal of at most 10 M Euro!
- A scaled-down proposal is being worked on, to be submitted in early May.
- Main signatories (CERN, France, Italy, UK, Netherlands, ESA) plus associate signatories (Spain, Czech Republic, Hungary, Portugal, Scandinavia, ...).
- The project is composed of Work Packages, to which the countries provide effort.
- LHCb involvement depends on the country; it comes essentially via the 'Testbeds' and 'HEP applications' work packages.

EU Grid Work Packages

Middleware:
- Grid work scheduling: C. Vistoli (INFN)
- Grid data management: B. Segal (CERN/IT)
- Grid application monitoring: R. Middleton (RAL)
- Fabric management: T. Smith (CERN/IT)
- Mass storage management: O. Barring (CERN/IT)

Infrastructure:
- Testbed and demonstrators: F. Etienne (Marseille)
- Network services: C. Michau (CNRS)

Applications:
- HEP (LHCb involved): H. Hoffmann (CERN)
- Earth observation: L. Fusco (ESA)
- Biology: C. Michau (CNRS)

Management:
- Project management: F. Gagliardi (CERN/IT)

Grid LHCb WP: physics study (DRAFT)
- The total sample of B → J/ψ KS simulated events needed is ~10 times the number produced in the real data.
- In one year of data taking we expect to collect and fully reconstruct 10^5 events, and therefore need 10^6 simulated events.
- The number of events that have to be generated, stored and reconstructed to produce this sample is 10^7.
- 10% of the ESD data is copied for systematic studies (~100 GB).
- The total amount of data generated in this production would be:
  - RAW data:       200 kB/event × 10^7 = 2.0 TB
  - Generator data:  12 kB/event × 10^7 = 0.12 TB
  - ESD data:       100 kB/event × 10^7 = 1.0 TB
  - AOD data:        20 kB/event × 10^7 = 0.2 TB
  - TAG data:         1 kB/event × 10^7 = 0.01 TB
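The volumes above follow directly from the per-event sizes; a minimal cross-check sketch, assuming decimal units (1 kB = 10^3 bytes, 1 TB = 10^12 bytes):

```python
# Back-of-envelope check of the production volumes quoted above.
# Per-event sizes (kB) and the 10^7 event count are taken from the slide.

N_EVENTS = 10**7

sizes_kb = {"RAW": 200, "Generator": 12, "ESD": 100, "AOD": 20, "TAG": 1}

for tier, kb in sizes_kb.items():
    tb = kb * 1e3 * N_EVENTS / 1e12          # kB/event * events -> TB
    print(f"{tier:9s}: {tb:5.2f} TB")

esd_copy_gb = 0.10 * N_EVENTS * sizes_kb["ESD"] * 1e3 / 1e9
print(f"10% ESD copy for systematics: {esd_copy_gb:.0f} GB")   # ~100 GB

total_tb = sum(kb * 1e3 * N_EVENTS / 1e12 for kb in sizes_kb.values())
print(f"Total    : {total_tb:.2f} TB")   # ~3.3 TB, consistent with the ~3 TB on the next slide
```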

Grid LHCb WP: Grid testbed (DRAFT)
- The MAP farm at Liverpool has 300 processors; it would take 4 months to generate the full sample of events.
- All data generated (~3 TB) would be transferred to RAL for archive (the UK regional facility).
- All AOD and TAG datasets are dispatched from RAL to other regional centres, such as Lyon and CERN.
- Physicists run jobs at the regional centre, or ship AOD and TAG data to their local institute and run jobs there.
- The ESD for a fraction (~10%) of events is also copied for systematic studies (~100 GB).
- The resulting data volumes to be shipped between facilities over 4 months would be as follows:
  - Liverpool to RAL: 3 TB (RAW, ESD, AOD and TAG)
  - RAL to Lyon/CERN/…: 0.3 TB (AOD and TAG)
  - Lyon to LHCb institute: 0.3 TB (AOD and TAG)
  - RAL to LHCb institute: 100 GB (ESD for systematic studies)
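To give a feel for the network load implied by this schedule, a minimal sketch, assuming the volumes are spread evenly over the 4-month production (taken as ~120 days) and decimal units (1 TB = 10^12 bytes):

```python
# Rough sustained-bandwidth estimate for the testbed transfers listed above.

SECONDS = 120 * 24 * 3600   # ~4 months

transfers_tb = {
    "Liverpool -> RAL": 3.0,
    "RAL -> Lyon/CERN": 0.3,
    "Lyon -> LHCb institute": 0.3,
    "RAL -> LHCb institute": 0.1,   # the 100 GB of ESD for systematics
}

for route, tb in transfers_tb.items():
    mbit_s = tb * 1e12 * 8 / SECONDS / 1e6   # average Mbit/s
    print(f"{route:24s}: {mbit_s:4.2f} Mbit/s sustained")
# Even the largest flow (Liverpool -> RAL) averages only ~2.3 Mbit/s;
# the instantaneous requirement would of course be burstier.
```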

Thoughts on mass storage usage (see our note)
- We would like as much active data online on disk as possible.
- Use tape for archiving 'old' data (some have suggested all-disk systems, but then how do you decide when and what to throw away?).
- R&D: try the strategy of moving the job to the data (Liverpool COMPASS?).
- If 2.5 Gb/s networks prove not to be affordable, then we may need to move data by tape. We don't want to do that if possible!
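To put the last bullet in perspective, a minimal sketch: even a heavily shared 2.5 Gb/s link would move the ~3 TB testbed sample in a matter of hours (the 30% effective-utilisation figure is purely an assumed allowance for protocol overhead and link sharing, not a number from the slides):

```python
# Time to move the ~3 TB testbed sample over a 2.5 Gb/s link.

def transfer_hours(volume_tb, link_gbps, efficiency=1.0):
    """Hours to move volume_tb terabytes over a link_gbps link."""
    bits = volume_tb * 1e12 * 8
    return bits / (link_gbps * 1e9 * efficiency) / 3600

print(f"ideal 2.5 Gb/s link : {transfer_hours(3.0, 2.5):.1f} h")        # ~2.7 h
print(f"at 30% utilisation  : {transfer_hours(3.0, 2.5, 0.3):.1f} h")   # ~8.9 h
```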