1 ATLAS Italia Computing, status and plans: the role of LCG
No funding requested now (a request will come in September)
23-06-2003, L. Perini, CSN1

2 Layout
- Status and plans of the software: slides selected from D. Quarrie's presentation at the GDB meeting of 10 June
- Past and future of the Data Challenges: slides selected from G. Poulard's presentation at the GDB meeting of 10 June
- Role of LCG and fallbacks: slides produced by the Computing Coordinator for this meeting, agreed within the group of ATLAS representatives in LCG

3 ATLAS Offline Software
GDB Meeting, 10 June 2003
David R. Quarrie, Lawrence Berkeley National Laboratory, DRQuarrie@LBL.Gov

4 Software Overview
- ATLAS is in the closing stages of the transition from FORTRAN-based to C++-based software.
- For DC-0 and DC-1 the simulation was based on Geant3; for DC-2 it will be based on Geant4 (ATLAS has been very active in validating Geant4).
- Common framework (Athena) based on a collaboration with LHCb.
- First version of the C++ reconstruction in place: used in Level 2 and the Event Filter as well as offline; first major design iteration underway; functionality and robustness are already good (>10^6 events in DC-1); performance in some areas needs work.

5 Athena
- Based on the concepts of components: Services, Algorithms and Tools (see the toy sketch below).
- Highly modular and flexible; good mapping to GRID services.
- Based on abstract interfaces: no direct coupling with algorithms.
- Compatible with non-GRID environments (e.g. a laptop).
- Integration with an interactive scripting language (Python).
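To illustrate the component idea, here is a minimal toy sketch in Python. It is not Athena or Gaudi code: every name in it (IAlgorithm, TrackCounter, AppManager) is invented for illustration only. Algorithms see only an abstract interface, and an application manager schedules them over the event loop.

# Toy illustration of a component-based event framework (NOT Athena code).

class IAlgorithm:
    """Abstract interface every algorithm implements."""
    def initialize(self): ...
    def execute(self, event): ...
    def finalize(self): ...

class TrackCounter(IAlgorithm):
    """Example concrete algorithm: counts tracks over all events."""
    def initialize(self):
        self.total = 0
    def execute(self, event):
        self.total += len(event.get("tracks", []))
    def finalize(self):
        print(f"TrackCounter saw {self.total} tracks")

class AppManager:
    """Minimal 'framework': owns the configured algorithms and runs the event loop."""
    def __init__(self, algorithms):
        self.algorithms = algorithms
    def run(self, events):
        for alg in self.algorithms:
            alg.initialize()
        for event in events:
            for alg in self.algorithms:
                alg.execute(event)
        for alg in self.algorithms:
            alg.finalize()

if __name__ == "__main__":
    events = [{"tracks": [1, 2]}, {"tracks": [3]}]
    AppManager([TrackCounter()]).run(events)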

6 Software Distribution (2/2)
- A recurring problem is compatibility with external software: e.g. POOL/SEAL need their own set of external software (e.g. Boost).
- Still grappling with obvious problems (e.g. incompatible versions) and not-so-obvious ones (e.g. compilation/configuration flags).
- This is still an area requiring more work to minimize ATLAS-specific external packages and take advantage of, e.g., the LCG common installations.

7 Release 7.0.0 (31 July 2003)
- LCG component integration: POOL/SEAL.
- Geant4 integration.
- Pile-up: infrastructure in place, all detectors supported.
- Detector Description integration: reconstruction and G4 simulation from a common geometry; calibration/alignment infrastructure in place.
- Begin to incorporate feedback from the Reco Task Force: new Reco EDM.

8 Release 8.0.0 (February 2004)
- Dual targets: DC-2 (Q2-Q3 2004) and combined test beams (Q2-Q3 2004).
- A major focus is consolidation from 7.0.0: robustness, house-cleaning, performance.
- G4 validated for production.
- Full integration of the RecoTaskForce designs/recommendations.
- Interactive as well as batch use: replacement of jobOptions files by Python scripts (see the sketch below).
- GRID integration.
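As a rough picture of what replacing jobOptions files by Python scripts buys, here is a toy, self-contained sketch; the names (theApp, TopAlg, EvtMax, MaxIterations) are invented stand-ins, not the actual Athena configuration API. The point is that configuration becomes ordinary Python code that can be looped over, tested and driven interactively.

# Toy sketch only (NOT the real Athena API): job configuration as a Python script.

class Configurable(dict):
    """Stand-in for a configurable framework component."""
    def __setattr__(self, key, value):
        self[key] = value
    def __getattr__(self, key):
        return self[key]

theApp = Configurable()                     # stand-in for the application manager
theApp.EvtMax = 100                         # number of events to process
theApp.TopAlg = ["TrackFitter/MyFitter"]    # algorithms to schedule, in order

my_fitter = Configurable()                  # properties of one algorithm instance
my_fitter.MaxIterations = 5

print("Scheduled:", theApp.TopAlg, "for", theApp.EvtMax, "events")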

9 GRID Projects
- Multiple prototypes developed in conjunction with the data challenges, both European and US: Magda, AMI, Grappa, etc.
- Some overlapping functionality, but necessary to explore.
- Distributed physics analysis projects in development: GANGA, DIAL, Chimera.
- Goal is to bring these under a coherent umbrella by the end of Q3 2003, ready for DC-2.

10 Towards ATLAS Data Challenges 2
LCG-GDB, 10 June 2003
Gilbert Poulard, ATLAS Data Challenges Co-ordinator, CERN EP-ATC

11 Outline
- DC1, a starting point for DC2: what has been achieved
- DC2: main goals, planning, resources

12 ATLAS DC1 (July 2002 - April 2003)
- Primary concern was the delivery of events to the High Level Trigger (HLT) and to the physics communities: HLT TDR due by June 2003; Athens physics workshop in May 2003.
- Put in place the full software chain from event generation to reconstruction (see the toy sketch below): switch to AthenaRoot I/O (for event generation); updated geometry; new Event Data Model and Detector Description; reconstruction (mostly OO) moved to Athena.
- Put in place the distributed production: "ATLAS kit" (rpm) for software distribution; scripts and tools (monitoring, bookkeeping): AMI database, Magda replica catalogue, VDC; job production (AtCom); quality control and validation of the full chain.
- Use as much as possible Grid tools.
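As a purely schematic illustration of the chain just described (the stage names, file names and the bookkeeping record below are invented for illustration; they are not the actual DC1 tools or formats), the flow from generation to reconstruction with bookkeeping can be pictured as:

# Toy sketch of a DC1-style production chain: each stage consumes the output
# of the previous one, and every step is logged in a bookkeeping catalogue
# (in DC1, tools such as AMI, Magda and AtCom covered bookkeeping, replica
# cataloguing and job production).

bookkeeping = []   # stand-in for the production/bookkeeping database

def run_stage(name, inputs):
    """Pretend to run one production stage and log it."""
    outputs = [f"{name}.out.{i}" for i, _ in enumerate(inputs)]
    bookkeeping.append({"stage": name, "inputs": inputs, "outputs": outputs})
    return outputs

generated = run_stage("event_generation", ["generator.config"])
simulated = run_stage("geant3_simulation", generated)
piled_up = run_stage("pile_up", simulated)
reconstructed = run_stage("reconstruction", piled_up)

for record in bookkeeping:
    print(record["stage"], "->", record["outputs"])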

13 Tools in DC1
[Diagram of the DC1 production tools: AMI (physics metadata), Magda (replica catalog), VDC (recipe catalog), permanent and transient production logs, with AtCom as the interactive production framework and GRAT as the automatic production framework.]

14 DC1 in numbers

Process                      | No. of events | CPU time (kSI2k.months) | CPU-days (400 SI2k) | Data volume (TB)
Simulation, physics events   | 10^7          | 415                     | 30000               | 23
Simulation, single particles | 3x10^7        | 125                     | 9600                | 2
Lumi02 pile-up               | 4x10^6        | 22                      | 1650                | 14
Lumi10 pile-up               | 2.8x10^6      | 78                      | 6000                | 21
Reconstruction               | 4x10^6        | 50                      | 3750                |
Reconstruction + Lvl1/2      | 2.5x10^6      | (84)                    | (6300)              |
Total                        |               | 690 (+84)               | 51000 (+6300)       | 60
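The units in the table can be cross-checked with a couple of lines of Python (my own arithmetic, assuming a 30-day month; this check is not part of the original slide): the CPU time and CPU-days columns agree to within rounding.

# Check that "CPU time (kSI2k.months)" and "CPU-days (400 SI2k)" are consistent.

def ksi2k_months_to_cpu_days(ksi2k_months, cpu_power_si2k=400, days_per_month=30):
    # kSI2k.months -> SI2k.months -> months on a 400 SI2k CPU -> days
    return ksi2k_months * 1000.0 / cpu_power_si2k * days_per_month

print(ksi2k_months_to_cpu_days(22))    # 1650.0  (Lumi02 pile-up row: 1650)
print(ksi2k_months_to_cpu_days(50))    # 3750.0  (reconstruction row: 3750)
print(ksi2k_months_to_cpu_days(415))   # 31125.0 (physics simulation row: ~30000)
print(ksi2k_months_to_cpu_days(690))   # 51750.0 (total row: ~51000)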

15 ATLAS DC1 Phase 1 (July-August 2002)
- 3200 CPUs, 110 kSI95, 71000 CPU-days
- 5x10^7 events generated, 10^7 events simulated, 3x10^7 single particles
- 30 TB, 35000 files
- 39 institutes in 18 countries: Australia, Austria, Canada, CERN, Czech Republic, France, Germany, Israel, Italy, Japan, Nordic, Russia, Spain, Taiwan, UK, USA
- Grid tools used at 11 sites

16 Primary data (in 8 sites)
Data volume:
- Simulation: 23.7 TB (40%)
- Pile-up: 35.4 TB (60%), of which Lumi02: 14.5 TB and Lumi10: 20.9 TB
Pile-up:
- Low luminosity: ~4x10^6 events (~4x10^3 NCU-days)
- High luminosity: ~3x10^6 events (~12x10^3 NCU-days)
Data replication using Grid tools (Magda)
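The quoted volumes and percentages are internally consistent; a tiny check (my own arithmetic, not part of the slide):

# Check the primary-data volumes quoted above.
simulation = 23.7           # TB
pile_up = 14.5 + 20.9       # TB: Lumi02 + Lumi10 = 35.4
total = simulation + pile_up

print(pile_up)                          # 35.4 TB, as quoted
print(round(simulation / total, 2))     # ~0.40 -> "40%"
print(round(pile_up / total, 2))        # ~0.60 -> "60%"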

17 Grid in ATLAS DC1
[Table of the Grid flavours used in DC1 (US-ATLAS with GRAT & Chimera, the EDG Production Testbed, and NorduGrid) and the activities run on each: part of the simulation, reproduction of part of phase 1, full phase 1 & 2, pile-up, data production, several tests, and reconstruction.]

18 ATLAS Data Challenges: DC2 (July 2003 - July 2004)
At this stage the goals include:
- Full detector simulation with Geant4
- Pile-up and digitization in Athena
- Deployment of the complete Event Data Model and the Detector Description
- Use as much as possible of the LCG Applications software (e.g. POOL)
- Test the calibration and alignment procedures
- Perform large-scale physics analysis
- Wide use of the GRID middleware
- Use of more and more GRID tools
- Run as much as possible of the production on LCG-1

19 DC2 and LCG-1 (LP summary)
LCG-1:
- We intend to use, and contribute to validating, the LCG-1 components when they become available (R-GMA, RLS, ...)
- ATLAS-EDG is becoming the ATLAS-LCG task force
Scale of DC2:
- About 10^7 events simulated, as in DC1 (but with Geant4)
- All of them put through pile-up
- All of them reconstructed
- Analysis...

20 DC2: time scale
Milestones:
- End of July: Release 7
- Mid-November: pre-production release
- 1 February: "production" release
- 1 April
- 1 June: "DC2"
- 15 July
Tasks, in sequence:
- Put in place, understand and validate: Geant4; POOL persistency and the LCG applications; the Event Data Model; digitization, pile-up and byte-stream; conversion of DC1 data to POOL and reconstruction on it; testing and validation
- Run test-production
- Start final validation
- Start simulation; pile-up and digitization; transfer data to CERN
- Start reconstruction on the "Tier0"; distribution of ESD and AOD; calibration and alignment
- Start physics analysis; reprocessing

21 ATLAS Data Challenges: DC2
- We are building an ATLAS Grid production and analysis system.
- We intend to put in place a "permanent" Monte Carlo production system: if we continue to produce simulated data during summer 2004, we want to keep open the possibility to run another "DC" later (November 2004?) with more statistics.

22 ATLAS Software & Computing and LCG products in 2003-2004
INFN, 23 June 2003
Dario Barberis, CERN & Genoa University/INFN

23 ATLAS Computing Timeline (2003 to 2007; "NOW" = June 2003)
- POOL/SEAL release
- ATLAS release 7 (with POOL persistency)
- LCG-1 deployment
- ATLAS complete Geant4 validation
- ATLAS release 8
- DC2 Phase 1: simulation production
- DC2 Phase 2: intensive reconstruction (the real challenge!)
- Combined test beams (barrel wedge)
- Computing Model paper
- ATLAS Computing TDR and LCG TDR
- DC3: produce data for PRR and test LCG-n
- Computing Memorandum of Understanding
- Physics Readiness Report
- Start commissioning run
- GO!

24 How to get there: 1) Software
Software developments in progress:
- Geant4 simulation validation for production
- GeoModel (Detector Description) integration in simulation and reconstruction
- Full implementation of the new Event Data Model
- Restructuring of the trigger selection, reconstruction and analysis environment
- POOL persistency
- Interval-of-Validity service and Conditions DataBase
- Detector response simulation in Athena
- Pile-up in Athena (was in atlsim/G3)

25 LCG Applications Components
SEAL:
- Plug-in manager: internal use by POOL now; full integration into Athena in Q3 2003
- Data Dictionary: integrated into Athena now; includes Python support
POOL:
- Integration underway
- Goal is to have demonstrated support for POOL by 31 July: ability to read and write components of the ATLAS EDM
- Complete support by October 2003
SEAL Maths Library:
- Integrate in time for DC-2
PI:
- Integrate the ROOT implementation of the AIDA API in Q3 2003

26 LCG Applications: fall-back solutions
The main product we need urgently is POOL persistency:
- Right now there are many integration problems
- Several ATLAS and LCG people are actively working on them
- We assume the major problems will be sorted out by the end of July (ATLAS release 7), with full deployment in October
What if...?
- If problems of principle that cannot be overcome are discovered during the summer: go back to AthenaROOT (a home-made direct coupling of Athena/StoreGate to ROOT I/O, already prototyped); write converters by hand; accept delays, as more work is needed; not nice.
- Decision in October 2003, to be ready anyway for DC2.

27 How to get there: 2) Data Challenges
DC1 (2002-2003), completed in April 2003:
- 2nd pass of reconstruction with trigger L1 and L2 algorithms for the HLT TDR in progress
- Zebra/Geant3 files will be converted to POOL format and used for large-scale persistency tests
- they will also be used as input for validation of the new reconstruction environment
DC2 (1st half of 2004):
- provide data for the Computing Model document (end 2004)
- full use of Geant4, POOL and the Conditions DB
- simulation of the full ATLAS detector and of the 2004 combined test beam
- prompt reconstruction of the 2004 combined test beam
DC3 (2nd half of 2005):
- scale up the computing infrastructure and complexity
- provide data for the Physics Readiness Report
Commissioning run (from 2nd half of 2006):
- real operation!

28 LCG-1 Deployment
We plan to test (and use) the LCG-1 infrastructure as soon as it is deployed and functional:
- First tests will start in the 2nd half of July, as soon as the CERN installation is open to the experiments; the ATLAS-EDG test group will become the ATLAS-LCG test group; it can run jobs of varying complexity (CPU and I/O): simulation, pile-up, reconstruction (analysis later)
- In parallel, we continue developing our production tools: we have to live with several Grid flavours for a long time to come, and we have for the time being to continue productions in non-Grid environments
- Effort on a distributed analysis tool is underway within several national Grid projects: the new RTAG-11 should help to bring some coherence to these developments; internal ATLAS coordination has also started in this area

29 LCG-1 Deployment: fall-back solutions
What if...?
- We assume there will always be several flavours of Grids and other production sites we have to cope with; a typical analogy is electrical grids: we can move electrical appliances all over the world, but we need different connectors and transformers
- For large-scale productions we know how to cope in the "old" way: in DC1 we produced >10^7 fully-simulated events using >50 different sites, some linked in Grid systems
- The real need is for the "added value" of Grids, mainly useful for end-user data analysis: user certification; data and CPU management (submit jobs to a single interface)
Conclusion: we can cope with delays in the availability of a fully performant system until Q2 2004 (DC2); if it is still problematic at that point, we will have to re-think our computing model.

