ATLAS Data Challenges US ATLAS Physics & Computing ANL October 30th 2001 Gilbert Poulard CERN EP-ATC.


Outline
- ATLAS Data Challenges & "LHC Computing Grid Project"
- Goals
- Scenarios
- Organization

From CERN Computing Review
CERN Computing Review (December 1999 - February 2001). Recommendations:
- Organize the computing for the LHC era
  - LHC Grid project
    - Phase 1: development & prototyping
    - Phase 2: installation of the 1st production system
  - Software & Computing Committee (SC2)
  - Project accepted by the CERN Council (20 September)
- Ask the experiments to validate their computing model by iterating on a set of Data Challenges of increasing complexity
  - however, DCs were in our plans anyway

LHC Computing GRID project, Phase 1:
- Prototype construction
  - develop Grid middleware
  - acquire experience with high-speed wide-area networks
  - develop a model for distributed analysis
  - adapt LHC applications
  - deploy a prototype (CERN + Tier1 + Tier2)
- Software
  - complete the development of the 1st version of the physics application software and enable it for the distributed Grid model
  - develop & support common libraries, tools & frameworks, including simulation, analysis, data management, ...
- In parallel, the LHC collaborations must develop and deploy the first version of their core software

ATLAS Data Challenges
- Goal: understand and validate
  - our computing model, our data model and our software
  - our technology choices
- How? By iterating on a set of DCs of increasing complexity:
  - start with data which look like real data
  - run the filtering and reconstruction chain
  - store the output data into our database
  - run the analysis
  - produce physics results
- To study performance issues, database technologies, analysis scenarios, ...
- To identify weaknesses, bottlenecks, etc. (but also good points)

ATLAS Data Challenges
- But: today we don't have 'real data', so we need to produce 'simulated data' first; thus
  - physics event generation
  - simulation
  - pile-up
  - detector response
  plus reconstruction and analysis will be part of the first Data Challenges
- We also need to "satisfy" the ATLAS communities: HLT, physics groups, ...

ATLAS Data Challenges: DC0
- DC0: November-December 2001
  - 'continuity' test through the software chain
  - the aim is primarily to check the state of readiness for DC1
  - we plan ~100k Z+jet events, or similar
  - to validate the software; issues to be checked include:
    - G3 simulation running with the 'latest' version of the geometry
    - reconstruction running
  - re-analyze part of the Physics TDR data, "reading from & writing to Objectivity"
    - would test the "Objy database infrastructure"
    - complementary to the "continuity test"

ATLAS Data Challenges: DC1
- DC1: February-July 2002
  - reconstruction & analysis on a large scale
    - learn about the data model; I/O performance; identify bottlenecks ...
  - use of GRID as and when possible and appropriate
  - data management
    - use (evaluate) more than one database technology (Objectivity and ROOT I/O)
    - learn about distributed analysis
  - should involve CERN & outside-CERN sites
    - site planning is going on; an incomplete list already includes sites from Canada, France, Italy, Japan, UK, US, Russia
  - scale: 10^7 events in days, O(1000) PCs
  - data needed by HLT & physics groups (others?)
    - study performance of Athena and of algorithms for use in HLT
  - simulation & pile-up will play an important role
  - checking of Geant4 versus Geant3

ATLAS Data Challenges: DC1
- DC1 will have two distinct phases:
  - first, production of events for the HLT TDR, where the primary concern is delivery of events to the HLT community
  - second, testing of software (G4, databases, detector description, etc.) with delivery of events for physics studies
  - software will change between these two phases
- Simulation & pile-up will be of great importance
  - strategy to be defined (I/O rate, number of "event" servers?)
- As we want to do it 'world-wide', we will 'port' our software to the GRID environment and use the GRID middleware as much as possible (ATLAS kit to be prepared)

ATLAS Data Challenges: DC2
- DC2: Spring-Autumn 2003
  - scope will depend on what has and has not been achieved in DC0 & DC1
  - at this stage the goals include:
    - use of the 'TestBed' which will be built in the context of Phase 1 of the "LHC Computing Grid Project"
    - scale: a sample of 10^8 events
    - system complexity at ~50% of the 2006-2007 system
    - extensive use of the GRID middleware
    - Geant4 should play a major role
    - physics samples could (should) have 'hidden' new physics
    - calibration and alignment procedures should be tested
  - may need to be synchronized with "Grid" developments

DC scenario
- Production chain:
  - event generation
  - detector simulation
  - pile-up
  - detector response
  - reconstruction
  - analysis
- These steps should be as independent as possible (see the sketch below)
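The chain can be kept loosely coupled by having each step read its input from, and write its output to, persistent storage rather than passing objects in memory. The Python sketch below only illustrates that idea; the file names, JSON format and function names are invented for illustration and are not the ATLAS production tools.

```python
# Minimal sketch, not ATLAS code: every step reads its input from disk and
# writes its output back, so the steps stay independent and can be re-run
# (or run at different sites) separately.
import json
from pathlib import Path

WORKDIR = Path("dc_work")
WORKDIR.mkdir(exist_ok=True)

def generate(n_events: int) -> None:
    events = [{"id": i, "particles": []} for i in range(n_events)]
    (WORKDIR / "generated.json").write_text(json.dumps(events))

def simulate() -> None:
    events = json.loads((WORKDIR / "generated.json").read_text())
    for ev in events:
        ev["hits"] = []          # placeholder for detector hits
    (WORKDIR / "simulated.json").write_text(json.dumps(events))

def reconstruct() -> None:
    events = json.loads((WORKDIR / "simulated.json").read_text())
    for ev in events:
        ev["tracks"] = []        # placeholder for reconstructed objects
    (WORKDIR / "reconstructed.json").write_text(json.dumps(events))

if __name__ == "__main__":
    generate(100)
    simulate()       # could equally be run later, elsewhere, from the file
    reconstruct()
```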

Production stream
- "OO-db" stands for "OO database"; it could be Objectivity, ROOT I/O, ...
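Keeping the choice of OO database open suggests hiding it behind a thin persistence interface, so the production code is written once and the backend (Objectivity, ROOT I/O, ...) can be swapped. The sketch below is purely illustrative; the EventStore interface and its method names are invented, not an ATLAS or LCG API.

```python
# Illustrative sketch only: a thin persistence interface so production code
# does not depend on which "OO-db" sits behind it.
from abc import ABC, abstractmethod

class EventStore(ABC):
    @abstractmethod
    def write(self, key: str, event: dict) -> None: ...

    @abstractmethod
    def read(self, key: str) -> dict: ...

class InMemoryStore(EventStore):
    """Stand-in backend; a real one would wrap Objectivity or ROOT I/O."""
    def __init__(self) -> None:
        self._data: dict = {}

    def write(self, key: str, event: dict) -> None:
        self._data[key] = event

    def read(self, key: str) -> dict:
        return self._data[key]

def run_production(store: EventStore) -> dict:
    store.write("evt-0001", {"run": 1, "hits": []})
    return store.read("evt-0001")

if __name__ == "__main__":
    print(run_production(InMemoryStore()))
```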

DC1 data-flow diagram: event generators (Pythia, Isajet, Herwig, MyGeneratorModule) producing HepMC (Obj., Root); ATLFAST producing OO Ntuples (Obj., Root); GENZ/ZEBRA ntuple-like output from G3/DICE; G4 output to Obj.; ATHENA reconstruction producing combined Ntuples (Obj., Root); "RD event?" and "OO-DB?" shown as open questions.
Missing:
- filter, trigger
- detector description
- HepMC in Root
- digitisation
- ATLFAST output in Root (TObjects)
- pile-up
- link MC truth - ATLFAST
- reconstruction output in Obj., Root
- EDM (e.g. G3/DICE, G4 input to ATHENA)

Detector Simulation
- Geant3 and Geant4
  - for HLT & physics studies we will use Geant3
    - continuity with past studies
    - ATLAS validation of Geant4 is proceeding well but is not completed
    - detector simulation in Atlsim (ZEBRA output)
  - some production with Geant4 too
    - goals to be defined with the G4 and physics groups
    - it is important to get experience with 'large production' as part of G4 validation
    - it is important to use the same geometry input
      - in the early stage we could decide to use only part of the detector
      - it would also be good to use the same sample of generated events (see the comparison sketch below)
    - for detector simulation we propose to use the FADS/Goofy framework
    - output will be 'hits collections' in the OO-db
- Detector response (& pile-up) has to be worked on in the new framework
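Running the two simulations on the same generated events and the same geometry makes a direct comparison of simple observables possible. The sketch below only illustrates that kind of check on toy data; the Gaussian "energies" are placeholders, not real Geant3 or Geant4 output.

```python
# Toy illustration of a "Geant4 versus Geant3" check: compare a simple
# observable produced by the two simulations on the same input sample.
import random
import statistics

random.seed(42)
energies_g3 = [random.gauss(50.0, 5.0) for _ in range(10_000)]
energies_g4 = [random.gauss(50.2, 5.1) for _ in range(10_000)]

mean_shift = statistics.mean(energies_g4) - statistics.mean(energies_g3)
width_ratio = statistics.stdev(energies_g4) / statistics.stdev(energies_g3)

print(f"mean shift  : {mean_shift:+.2f}")
print(f"width ratio : {width_ratio:.3f}")
```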

Reconstruction
- We want to use the 'new reconstruction' code, run in the Athena framework
- Input should be from the OO-db
- Output in the OO-db:
  - ESD (event summary data)
  - AOD (analysis object data)
  - TAG (event tag)
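The three output tiers differ mainly in how much detail they retain per event. A rough Python sketch of that layering follows; the field names are invented for illustration and are not the ATLAS event data model.

```python
# Sketch of the three output tiers; each tier is a smaller summary of the
# one before it.  Field names are invented.
from dataclasses import dataclass, field
from typing import List

@dataclass
class ESD:                      # Event Summary Data: detailed reco output
    event_id: int
    tracks: List[dict] = field(default_factory=list)
    clusters: List[dict] = field(default_factory=list)

@dataclass
class AOD:                      # Analysis Object Data: physics objects
    event_id: int
    electrons: List[dict] = field(default_factory=list)
    jets: List[dict] = field(default_factory=list)

@dataclass
class TAG:                      # Event tag: a few selection variables
    event_id: int
    n_jets: int
    missing_et: float

def make_aod(esd: ESD) -> AOD:
    # toy reduction; the real step runs identification algorithms
    return AOD(event_id=esd.event_id, jets=esd.clusters[:5])

def make_tag(aod: AOD) -> TAG:
    return TAG(event_id=aod.event_id, n_jets=len(aod.jets), missing_et=0.0)
```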

Analysis
- We are just starting to work on this, but:
  - analysis tools evaluation should be part of the DC
  - it will be a good test of the Event Data Model
  - performance issues should be evaluated
- Analysis scenario
  - it is important to know the number of analysis groups, the number of physicists per group, and the number of people who want to access the data at the same time
  - it is of 'first' importance to 'design' the analysis environment
    - to measure the response time
    - to identify the bottlenecks (see the timing sketch below)
  - for that, users' input is needed
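One simple way to quantify response time and spot bottlenecks is to time a representative analysis read while increasing the number of concurrent users. The sketch below is a toy illustration of such a measurement; the "AOD read" is simulated CPU work, not real data access.

```python
# Toy measurement of analysis response time versus the number of concurrent
# users; the workload is a stand-in for real I/O and unpacking.
import time
from concurrent.futures import ThreadPoolExecutor

def read_aod(n_events: int) -> float:
    start = time.perf_counter()
    total = 0.0
    for i in range(n_events):
        total += (i % 97) * 1e-6     # stand-in for I/O + unpacking work
    return time.perf_counter() - start

for n_users in (1, 4, 16):
    with ThreadPoolExecutor(max_workers=n_users) as pool:
        times = list(pool.map(read_aod, [200_000] * n_users))
    print(f"{n_users:2d} concurrent users: "
          f"mean response time {sum(times) / len(times):.3f} s")
```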

Data management
- It is a key issue
- Evaluation of more than one technology is part of DC1
  - infrastructure has to be put in place for Objectivity & ROOT I/O
  - software, hardware, tools to manage the data: creation, replication, distribution
  - discussed in the database workshop
- Tools are needed to run the production (a bookkeeping sketch follows below)
  - "bookkeeping", "cataloguing", ...
    - run number, random number allocation, ...
    - working group now set up
  - job submission
    - close collaboration with ATLAS Grid (validation of Release 1)
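As a rough illustration of the bookkeeping and cataloguing needs (unique run numbers, non-overlapping random-number ranges, a record of what was produced where), here is a minimal Python sketch; the class and field names are invented and the real DC tools are of course far more complete.

```python
# Minimal sketch of production bookkeeping: hand out unique run numbers and
# non-overlapping random-seed blocks, and record what each job produced.
import itertools

class Bookkeeper:
    def __init__(self, first_run: int = 1, seeds_per_job: int = 1000):
        self._runs = itertools.count(first_run)
        self._seeds_per_job = seeds_per_job
        self._next_seed = 1
        self.catalogue = []              # one record per produced dataset

    def new_job(self, site: str, generator: str):
        run = next(self._runs)
        seeds = (self._next_seed, self._next_seed + self._seeds_per_job - 1)
        self._next_seed += self._seeds_per_job
        self.catalogue.append({"run": run, "site": site,
                               "generator": generator, "seeds": seeds})
        return run, seeds

bk = Bookkeeper()
print(bk.new_job("CERN", "Pythia"))      # (1, (1, 1000))
print(bk.new_job("BNL", "Herwig"))       # (2, (1001, 2000))
```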

DC1-HLT - CPU

                   Number of events   Time per event (SI95 x s)   Total time (SI95 x s)   Total time (SI95 x hours)
  simulation       ...                ...                         ...                     ...
  reconstruction   ...                ...                         ...                     ...
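For orientation, the totals in such a table follow directly from events multiplied by per-event cost. The per-event SI95 costs in the sketch below are assumptions chosen only to show the arithmetic; they are not the DC1 figures.

```python
# Back-of-the-envelope version of the table: total CPU = events x per-event
# cost.  The per-event SI95 costs below are assumptions for illustration.
N_EVENTS = 1e7                     # DC1-HLT scale quoted earlier
COST_SI95_S_PER_EVENT = {
    "simulation": 3000,            # assumed
    "reconstruction": 600,         # assumed
}

for step, per_event in COST_SI95_S_PER_EVENT.items():
    total_s = N_EVENTS * per_event
    print(f"{step:15s}: {total_s:.1e} SI95*s = {total_s / 3600:.1e} SI95*hours")
```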

DC1-HLT - data

                   Number of events   Event size (MB)   Total size (GB)   Total size (TB)
  simulation       ...                ...               ...               ...
  reconstruction   ...                ...               ...               ...

DC1-HLT data with pile-up

  L           Number of events   Event size (MB)       Total size (GB)   Total size (TB)
  2 x ...     ... x 10^6         (1) 2.6   (2) ...     ...               ...
  ...         ... x 10^6         (1) 6.5   (2) ...     ...               ...

In addition to 'simulated' data, assuming 'filtering' after simulation (~14% of the events kept).
(1) keeping only 'digits'
(2) keeping 'digits' and 'hits'
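The volume columns are simple products of event count and event size. A small sketch of that arithmetic, using the 2.6 MB and 6.5 MB per-event sizes from the table above (digits only) and an assumed sample of 10^7 events matching the DC1-HLT scale quoted earlier:

```python
# total size = number of events x event size; the 1e7 event count is an
# assumption, the per-event sizes are taken from the table above.
N_EVENTS = 1e7
for size_mb in (2.6, 6.5):
    total_tb = N_EVENTS * size_mb / 1e6      # MB -> TB
    print(f"{size_mb} MB/event x {N_EVENTS:.0e} events = {total_tb:.0f} TB")
```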

Ramp-up: CERN capacity versus week in 2002 (figure).

What next
- This week:
  - have an updated list of goals & requirements, prepared with
    - the HLT and physics communities
    - the simulation, reconstruction and database communities
    - people working on 'infrastructure' activities (bookkeeping, cataloguing, ...)
  - have a list of tasks
    - some physics oriented
    - but also tasks like testing code, running production, ...
    - with 'established' responsibilities and priorities
  - and working groups in place
    - you can join

What next
- In parallel:
  - define the ATLAS validation plan for EU-DataGrid Release 1
  - ATLAS software for DC0 and DC1
  - understand the involvement of Tier centers
    - ensure that we have the necessary resources at CERN and outside CERN
    - we already have some input and a 'table' is being prepared
- "And turn the key"