
1 Reports from ATLAS SW Week, March 2004. ATLAS SW CZ Seminar, April 2004. Jiří Chudoba, FzÚ AV CR

2 Dario Barberis: Introduction & News (ATLAS Software & Computing Week, 1 Mar. 2004)
ATLAS Computing Timeline (timeline spans 2003-2007, "NOW" at early 2004):
- POOL/SEAL release (done)
- ATLAS release 7 (with POOL persistency) (done)
- LCG-1 deployment (in progress)
- ATLAS complete Geant4 validation (done)
- ATLAS release 8
- DC2 Phase 1: simulation production
- DC2 Phase 2: intensive reconstruction (the real challenge!)
- Combined test beams (barrel wedge)
- Computing Model paper
- Computing Memorandum of Understanding (moved to end 2004)
- ATLAS Computing TDR and LCG TDR
- DC3: produce data for PRR and test LCG-n
- Physics Readiness Report
- Start commissioning run
- GO!

3 Dario Barberis: Introduction & News (ATLAS Software & Computing Week, 1 Mar. 2004)
Near-term Software Release Plan:
- 7.5.0: 14th Jan 2004
- 7.6.0: 4th Feb
- 7.7.0: 25th Feb (SPMB decision 3rd Feb)
- 8.0.0: 17th Mar (DC2 & CTB Simulation Release)
- 8.1.0: 7th Apr
- 8.2.0: 28th Apr
- 8.3.0: 19th May
- 9.0.0: 9th Jun (DC2 & CTB Reconstruction Release)
- 9.1.0: 30th Jun
- 9.2.0: 21st Jul
- 9.3.0: 11th Aug
Key external dates:
- 15th Feb: LAr technical run starts
- 1st May: baseline DC-2 simulation starts
- 10th May: testbeam starts
- 1st Jul: baseline DC-2 reconstruction starts
- 14th Jul: complete testbeam
- 15th Jul: baseline DC-2 physics analysis starts
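The release plan above runs on an exact three-week cadence; a small sanity check of the listed dates, sketched in Python:

```python
from datetime import date, timedelta

# Releases as listed on the slide, in order, with their published dates.
plan = [
    ("7.5.0", date(2004, 1, 14)),
    ("7.6.0", date(2004, 2, 4)),
    ("7.7.0", date(2004, 2, 25)),
    ("8.0.0", date(2004, 3, 17)),   # DC2 & CTB Simulation Release
    ("8.1.0", date(2004, 4, 7)),
    ("8.2.0", date(2004, 4, 28)),
    ("8.3.0", date(2004, 5, 19)),
    ("9.0.0", date(2004, 6, 9)),    # DC2 & CTB Reconstruction Release
    ("9.1.0", date(2004, 6, 30)),
    ("9.2.0", date(2004, 7, 21)),
    ("9.3.0", date(2004, 8, 11)),
]

# Every release sits exactly three weeks after the previous one.
for (_, prev), (_, cur) in zip(plan, plan[1:]):
    assert cur - prev == timedelta(weeks=3)
```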

4 Expected Major Milestones (Release 8):
- GEANT4 simulation: DC2 (in validation); test beam (underway); pile-up and digitization in Athena (debugging)
- GeoModel: Inner Detector, Muon Spectrometer
- Conversion to CLHEP units (mm, MeV, [-pi, pi])
- POOL/SEAL persistency
- Bytestream converters
- Preliminary conditions capabilities
Other stepping stones:
- Move to InstallArea
- jobOption.txt --> jobOption.py
- Distribution kits
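One of the stepping stones above is the switch of job configuration from jobOption.txt to Python. A minimal illustrative fragment of the Python style is sketched below; the algorithm and library names are invented for illustration, and `theApp`, `MessageSvc`, and `INFO` are objects provided by the Athena/Gaudi bootstrap, so this is a configuration fragment rather than a standalone script:

```python
# Hypothetical jobOptions.py fragment (names are illustrative only,
# not taken from the slides). In the old .txt style these were
# semicolon-terminated option assignments; in Python they become
# ordinary attribute and list operations on pre-defined objects.
theApp.Dlls   += ["MySimLib"]        # load a component library (hypothetical name)
theApp.TopAlg += ["MySimAlg"]        # schedule an algorithm (hypothetical name)
theApp.EvtMax  = 10                  # process 10 events
MessageSvc.OutputLevel = INFO        # Gaudi message verbosity
```

Being plain Python, such files can use loops, conditionals, and includes, which is the main motivation for the migration.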

5 ATLAS DC2. ATLAS Software Workshop, 2 March 2004. Gilbert Poulard, CERN PH-ATC

6 DC2: goals. At this stage the goal includes:
- Full use of Geant4, POOL, and the LCG applications
- Pile-up and digitization in Athena
- Deployment of the complete Event Data Model and the Detector Description
- Simulation of full ATLAS and the 2004 combined testbeam
- Testing the calibration and alignment procedures
- Wide use of the GRID middleware and tools
- Large-scale physics analysis
- Computing model studies (document due end 2004)
- Running as much of the production as possible on LCG-2

7 DC2 operation. Consider DC2 as a three-part operation:
- Part I: production of simulated data (May-June 2004)
  - needs Geant4, digitization and pile-up in Athena, POOL persistency
  - "minimal" reconstruction, just to validate the simulation suite
  - will run on any computing facilities we can get access to around the world
- Part II: test of Tier-0 operation (July 2004)
  - needs full reconstruction software following the RTF report design, plus definitions of AODs and TAGs
  - (calibration/alignment and) reconstruction will run on the Tier-0 prototype as if data were coming from the online system (at 10% of the rate)
  - output (ESD+AOD) will be distributed to Tier-1s in real time for analysis
- Part III: test of distributed analysis on the Grid (August-October 2004)
  - access to event and non-event data from anywhere in the world, both in organized and chaotic ways
- In parallel: run distributed reconstruction on simulated data

8 DC2: Scenario & Time scale (flattened timeline diagram)
Milestones: September 03: Release 7; March 17th 04: Release 8 (production); May 3rd 04; July 1st 04: "DC2"; August 1st 04.
Activities, in rough chronological order:
- Put in place, understand & validate: Geant4; POOL; LCG applications; Event Data Model; digitization; pile-up; byte-stream; conversion of DC1 data to POOL; large-scale persistency tests and reconstruction
- Testing and validation; run test-production
- Start final validation
- Start simulation; pile-up & digitization; event mixing; transfer data to CERN
- Intensive reconstruction on "Tier-0"; distribution of ESD & AOD
- Calibration; alignment
- Start physics analysis; reprocessing

9 DC2 resources

| Process | No. of events | Time (months) | CPU power (kSI2k) | Data volume (TB) | At CERN (TB) | Off site (TB) |
|---|---|---|---|---|---|---|
| Phase I (May-June): Simulation | 10^7 | 2 | 600 | 25 | 5 | 20 |
| Phase I: Pile-up (*) / Digitization | 10^7 | 2 | 400 | 75 | 15 | 60 |
| Phase I: Byte-stream | 10^7 | 2 | (small) | 20 | 20 | 16 |
| Total Phase I | 10^7 | 2 | 1000 | 120 | 40 | 96 |
| Phase II (July): Reconstruction Tier-0 | 10^7 | 0.5 | 600 | 5 | 5 | 10 |
| Phase II: Reconstruction Tier-1 | 10^7 | 2 | 600 | 5 | 0 | 5 |
| Total | 10^7 | | | 130 | 45 | 111 |
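The per-process rows in the resource table are internally consistent with the quoted totals; a quick check of the data-volume columns (values as read from the table, in TB):

```python
# Phase-I per-process data volumes from the DC2 resource table (TB).
volume  = {"simulation": 25, "pileup_digi": 75, "bytestream": 20}
at_cern = {"simulation": 5,  "pileup_digi": 15, "bytestream": 20}
offsite = {"simulation": 20, "pileup_digi": 60, "bytestream": 16}

# The rows sum to the quoted "Total Phase I" line.
assert sum(volume.values())  == 120
assert sum(at_cern.values()) == 40
assert sum(offsite.values()) == 96

# And Phase I plus the two reconstruction rows give the grand total.
assert 120 + 5 + 5 == 130   # total volume
assert 40 + 5 + 0  == 45    # total at CERN
assert 96 + 10 + 5 == 111   # total off site
```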

10 Atlas Production System schema (flattened diagram)
- A Task (Dataset) is defined by a transformation definition plus a physics signature; a Job (Partition) carries the executable name and release version. Task = [job]*, Dataset = [partition]*.
- Job descriptions live in a database (AMI); tasks enter the system by human intervention.
- Supervisors 1-4 hand jobs to executors: US Grid executer, LCG executer, NG executer, and LSF executer (local batch), via the US Grid, LCG, and NG resource brokers (RB, Chimera).
- Executors report job run info; the Data Management System provides location hints at task and job level.
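The job-description relations on the slide (Task = [job]*, Dataset = [partition]*) can be sketched as a minimal data model; class and field names here are illustrative, not the actual production-system schema:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Job:
    """One job, producing one partition of the task's dataset."""
    transformation: str   # executable name + release version signature
    partition: int        # output partition this job produces

@dataclass
class Task:
    """A task is a list of jobs; its dataset is the list of partitions."""
    dataset: str
    jobs: List[Job] = field(default_factory=list)

    def partitions(self) -> List[int]:
        # Dataset = [partition]*: one partition per job in the task.
        return [j.partition for j in self.jobs]

# A task of three simulation jobs yields a three-partition dataset.
task = Task("dc2.simul.example", [Job("g4sim-8.0.0", i) for i in range(3)])
assert task.partitions() == [0, 1, 2]
```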

11 Tiers in DC2. Tier-0:
- 20% of the simulation will be done at CERN
- All data in ByteStream format (~16 TB) will be copied to CERN
- Reconstruction will be done at CERN (in ~10 days)
- Reconstruction output (ESD) will be exported in 2 copies from Tier-0 (2 x ~5 TB)
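A back-of-envelope consequence of the numbers above: shipping two copies of ~5 TB of ESD off Tier-0 during the ~10-day reconstruction implies a modest sustained export rate (spreading the export evenly over the ten days is an assumption, not something the slide states):

```python
# Sustained export rate needed from Tier-0, assuming the 2 x 5 TB
# of ESD leaves evenly over the ~10 days of reconstruction.
export_tb = 2 * 5                       # TB to move off Tier-0
seconds   = 10 * 24 * 3600              # ~10 days in seconds
rate_mb_s = export_tb * 1e6 / seconds   # decimal units: 1 TB = 1e6 MB

# Roughly 11.6 MB/s sustained, i.e. well under 100 Mbit/s.
assert abs(rate_mb_s - 11.6) < 0.05
```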

12 Tiers in DC2. Tier-1s will have to:
- Host simulated data produced by them or coming from Tier-2s, plus ESD (& AOD) coming from Tier-0
- Run reconstruction in parallel to the Tier-0 exercise (~2 months); this will include links to MCTruth; produce and host ESD and AOD
- Provide access to the ATLAS V.O. members
Tier-2s:
- Run simulation (and other components if they wish to)
- Copy (replicate) their data to a Tier-1
ATLAS is committed to LCG: all information should be entered into the relevant database and catalog.

13 Core sites and commitments

| Site | Immediate | Later |
|---|---|---|
| CERN | 200 | 1200 |
| CNAF | 200 | 500 |
| FNAL | 10 | ? |
| FZK | 100 | ? |
| Nikhef | 124 | 180 |
| PIC | 100 | 300 |
| RAL | 70 | 250 |
| Taipei | 60 | ? |
| Russia | 30 | 50 |
| Prague | 17 | 40 |
| Budapest | 100 | ? |
| Totals | 864 (+147) | >2600 (+>90) |

The list mixes the initial LCG-2 core sites with other firm commitments; the other 20 LCG-1 sites will be brought in as quickly as possible.

14 Comments on schedule. The change of schedule has been driven by:
- the ATLAS side:
  - the readiness of the software (the combined test beam has the highest priority)
  - the availability of the production tools (integration with the Grid is not always easy)
- the Grid side:
  - the readiness of LCG (we would prefer to run on the Grid only!)
  - priorities are not defined by ATLAS alone
For the Tier-0 exercise, it will be difficult to fix a starting date before we have a better idea of how the "pile-up" and "event-mixing" processes work.

15 Armin Nairz: Status of the Software for Data Challenge 2, Conclusions (ATLAS Software Workshop, CERN, March 1-5, 2004)
- Generation and GEANT4 simulation (as in Rel. 7.5.0+) are already in a production-like state: stable and robust, with CPU times per event and event sizes within specifications
- Digitisation (as in Rel. 7.6.0+) could not be tested for all sub-detectors (missing or not working); for the tested ones it is working and stable, giving confidence in a working digitisation procedure for the whole detector in/after Rel. 7.7.0
- Pile-up is not yet fully functional
- Documentation on pre-production activities is available from the DC webpage, http://atlas.web.cern.ch/Atlas/GROUPS/SOFTWARE/DC/DC2/preprod, which also contains how-to's (running event generation, simulation, digitisation)

16 ATLAS SW CZ Seminar, 2.4.2004, chudoba@fzu.cz. Further meetings: GRID, Analysis Tools, Distributed Analysis..., Atlantis Tutorial, Athena Tutorial
