What is the CMS(UK) Data Model? Glenn Patrick, CMS(UK), 31/03/00

Presentation transcript:

What is the CMS(UK) Data Model?
Assume that CMS software is available at every UK institute, connected by some infrastructure (i.e. the Grid). The problem then reduces to:
- What datasets are required?
- Where are they required?
- Why are they required?
- Who is going to generate and distribute them?
- What are the formats, sizes and access patterns?

Diagram: event data hierarchy. Raw Data -> Reconstructed Data -> Physics Objects -> Event Tag Data.
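
As a rough illustration of that layered hierarchy, the short Python sketch below lists the four tiers with assumed order-of-magnitude per-event sizes (the sizes are illustrative assumptions for this sketch, not official CMS numbers):

# Sketch of the layered event-data hierarchy in the diagram above.
# Per-event sizes are illustrative order-of-magnitude assumptions only.
from dataclasses import dataclass
from typing import Optional

@dataclass
class DataTier:
    name: str
    kb_per_event: float          # assumed typical size, kB/event
    derived_from: Optional[str]  # the tier it is produced from

HIERARCHY = [
    DataTier("Raw Data",           1000.0, None),
    DataTier("Reconstructed Data",  100.0, "Raw Data"),
    DataTier("Physics Objects",      10.0, "Reconstructed Data"),
    DataTier("Event Tag Data",        0.1, "Physics Objects"),
]

if __name__ == "__main__":
    for tier in HIERARCHY:
        source = tier.derived_from or "detector readout"
        print(f"{tier.name:<20} ~{tier.kb_per_event:7.1f} kB/event (from {source})")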

Diagram: regional centre architecture.
- Data import/export: network from CERN; network from Tier 2 and simulation centres; mass storage & disk servers, database servers, tapes.
- Production Reconstruction (Raw/Sim --> ESD): scheduled, predictable; experiment/physics groups.
- Production Analysis (ESD --> AOD, AOD --> DPD): scheduled; physics groups.
- Individual Analysis (AOD --> DPD and plots): chaotic; individual physicists on desktops.
- Support services: info servers, code servers, web servers, telepresence servers, training, consulting, help desk; physics software development, R&D systems and testbeds.
- Connected to CERN, Tier 2 centres and local institutes.

Diagram: offline data and computation for physics analysis.
Detector --> event filter (selection & reconstruction) --> raw data --> event reconstruction --> event summary data (ESD) --> batch physics analysis --> analysis objects (extracted by physics topic). Event simulation feeds simulated data into the same reconstruction and analysis chain.
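
The chain in that figure can be read as a simple pipeline. The placeholder functions in the Python sketch below (invented names, trivial bodies) just make the tier-to-tier ordering explicit:

# Sketch of the offline processing chain described above. Each stage is a
# placeholder mapping one data tier to the next; names and bodies are
# illustrative only, not CMS code.

def event_filter(detector_stream):
    """Online selection & reconstruction: detector readout -> raw data."""
    return [evt for evt in detector_stream if evt["triggered"]]

def event_reconstruction(raw_events):
    """Raw data -> event summary data (ESD)."""
    return [dict(evt, esd=True) for evt in raw_events]

def batch_physics_analysis(esd_events, topic):
    """ESD -> analysis objects, extracted by physics topic."""
    return [dict(evt, topic=topic) for evt in esd_events]

if __name__ == "__main__":
    detector = [{"id": i, "triggered": (i % 2 == 0)} for i in range(8)]
    raw = event_filter(detector)                    # raw data
    esd = event_reconstruction(raw)                 # event summary data
    aod = batch_physics_analysis(esd, "example")    # analysis objects
    print(len(detector), "->", len(raw), "->", len(esd), "->", len(aod))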

Diagram (LHCb): hierarchy of centres.
- Production Centre: generate raw data, reconstruction, production analysis, user analysis; CPU for production, mass storage for RAW, ESD, AOD and TAG.
- Regional Centres: user analysis; CPU for analysis, mass storage for AOD and TAG; AOD and TAG delivered at 80 TB/yr (real) and 120 TB/yr (simulated).
- Institutes: selected user analyses; CPU and data servers; AOD and TAG delivered at 8-12 TB/yr.

LHCb data flow between tiers.
Real data:
- Production Centre (x1, CERN): data collection, triggering, reconstruction, final state reconstruction. WAN output to each Regional Centre: AOD and TAG datasets, 20 TB x 4 times/yr = 80 TB/yr.
- Regional Centre (~x5, e.g. RAL, Lyon, ...): user analysis. WAN output to each Institute: AOD and TAG for samples, 1 TB x 10 times/yr = 10 TB/yr.
- Institute (~x50): selected user analysis.
Simulated data:
- Regional Centre: event generation, GEANT tracking, reconstruction, final state reconstruction, user analysis. WAN output to each Regional Centre: AOD, generator and TAG datasets, 30 TB x 4 times/yr = 120 TB/yr. WAN output to each Institute: AOD and TAG for samples, 3 TB x 10 times/yr = 30 TB/yr.
- Institute: selected user analysis.
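
To put those WAN volumes in perspective, the Python sketch below converts them into the sustained network rates they imply. The TB/yr figures come from the slide above; the assumption that transfers are spread evenly over the whole year is mine:

# Back-of-the-envelope conversion of the quoted WAN export volumes into
# sustained network rates, assuming transfers are spread evenly over a year.

SECONDS_PER_YEAR = 365 * 24 * 3600

def sustained_mbit_per_s(tb_per_year):
    """Average rate needed to ship tb_per_year terabytes in one year."""
    bits = tb_per_year * 1e12 * 8           # TB -> bits
    return bits / SECONDS_PER_YEAR / 1e6    # -> Mbit/s

exports = {
    "CERN -> each RC, real AOD+TAG (4 x 20 TB)":        80.0,
    "Sim production -> each RC, sim data (4 x 30 TB)":  120.0,
    "RC -> each institute, real samples (10 x 1 TB)":    10.0,
    "RC -> each institute, sim samples (10 x 3 TB)":     30.0,
}

for path, tb in exports.items():
    print(f"{path:<50} {tb:6.0f} TB/yr  ~{sustained_mbit_per_s(tb):5.1f} Mbit/s")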

Dataflow Model (diagram).
- Detector --> DAQ system --> L2/L3 trigger --> RAW data and RAW tags (L3-YES events kept, plus a sample of L2/L3-NO events).
- RAW data + calibration data --> reconstruction --> event summary data (ESD) and reconstruction tags.
- ESD + reconstruction tags --> first pass analysis --> analysis object data (AOD) and physics tags.
- AOD + physics tags (with access back to ESD and RAW) --> physics analysis on the analysis workstation --> private data and physics results.

Need to answer questions like: how will a physicist in Bristol/Brunel/IC/RAL
- select events for a given physics channel from a year's worth of data taking?
- transfer/replicate the selection for further analysis?
- generate and process a large sample of simulated events?
- run his/her batch job on existing samples of Monte Carlo events (e.g. at Tier 1/Tier 2)?
Where do you want the data? What sort of data do you need: TAG, AOD, ESD or RAW?
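
As a purely illustrative answer to the first question, the Python sketch below shows a TAG-driven selection: scan the compact TAG records for a channel, then fetch only the matching AOD events for further analysis. The dataset layout, field names and cuts are invented for the sketch and are not part of any CMS design:

# Illustrative TAG-driven event selection: scan small TAG records first,
# then retrieve only the matching AOD events. All fields and cuts invented.

def select_event_ids(tag_records, min_pt_gev=20.0, min_muons=2):
    """Pick candidate events for a dimuon channel from TAG-level summaries."""
    return [t["event_id"] for t in tag_records
            if t["n_muons"] >= min_muons and t["max_muon_pt"] >= min_pt_gev]

def fetch_aod(aod_store, event_ids):
    """Pull the full AOD records only for the selected events."""
    return [aod_store[eid] for eid in event_ids if eid in aod_store]

if __name__ == "__main__":
    tags = [{"event_id": i, "n_muons": i % 3, "max_muon_pt": 5.0 * i}
            for i in range(10)]
    aod_store = {i: {"event_id": i, "muons": [], "tracks": []} for i in range(10)}
    selected = select_event_ids(tags)
    sample = fetch_aod(aod_store, selected)
    print(f"selected {len(sample)} of {len(tags)} events for replication/analysis")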

How to Go Forward?
- Need to identify a critical mass of people, formed from all of the institutes, who will start to study, develop and exploit CMS(UK) facilities now.
- Require expert(ise) in OO databases, specifically Objectivity (BaBar estimate: 1 FTE).
- Each institute needs to start to identify its data requirements for simulation/physics/trigger studies.
- Need to understand how best to distribute, replicate and centralise databases and associated resources.
- Need good organisation, with regular meetings, etc.