2 ATLAS Simulation/Reconstruction Software
Reported by Jim Shank; work done by most US institutes.
DOE/NSF Review of LHC Software and Computing Projects, Fermilab, 27-30 November 2001

3 Outline
- Activities in all systems (mostly by physicists): Pixels, TRT, EM Cal, Tile Cal, Muons, Trigger
- Well integrated into the overall ATLAS computing effort, in particular the US core efforts on Athena and the database
- Review of recent activity, by sub-system
- Future work: Data Challenges

4 ATLAS Subsystem/Task Matrix

                           | Offline Coordinator | Reconstruction | Simulation   | Database
Chair                      | N. McCubbin         | D. Rousseau    | K. Amako     | D. Malon
Inner Detector             | D. Barberis         | D. Rousseau    | F. Luehring  | S. Bentvelsen
Liquid Argon               | J. Collot           | S. Rajagopalan | M. Leltchouk | R. Sobie
Tile Calorimeter           | A. Solodkov         | F. Merritt     | A. Solodkov  | T. LeCompte
Muon                       | To be named         | J.F. Laporte   | A. Rimoldi   | S. Goldfarb
LVL2 Trigger / Trigger DAQ | S. George           | S. Tapprogge   | M. Wielers   | H.P. Beck
Event Filter               | V. Vercesi          | F. Touchard    |              |

Other US roles: D. Quarrie (LBNL), Chief Architect; P. Nevski (BNL), Geant3 simulation coordinator; H. Ma (BNL), raw data coordinator; C. Tull (LBNL), Eurogrid WP8 liaison

5 Subdetector SW Activities Summary
- Performance/design studies
- G3-based simulation
- Test beam
- Athena integration
- Reconstruction development in C++
- G4-based simulation development
- G4 physics validation
- XML-based detector description
- Database
- Conditions DB
- Trigger/DAQ

6 Pixels - Conditions DB (Berkeley)
- Goal: develop a general mechanism for retrieving time-dependent alignment constants from a database and using them in reconstruction
  - Requires additions to the Athena infrastructure
  - Requires extension of the existing detector description interface
  - Will prototype using the silicon and pixel detectors as the use case
- Misalignments are calculated from numbers stored in the Conditions Database
- Delivered through a general Time-Dependent Conditions Service (TCS) in Athena (a sketch of such an interface follows this slide)
- In addition to the transient event store (TES):
  - Need a transient detector store (TDeS)
  - Need an interface to the conditions DB (TCS)
- A prototype TDeS has been coded by C. Leggett and P. Calafiura (a second instance of the StoreGate service, without object deletion at the end of each event)
- Work in progress
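As an illustration of the kind of interface such a service could expose to reconstruction code, here is a minimal sketch. It is hypothetical, not the actual Athena TCS/StoreGate API: names, the cache layout, and the dummy database access are all invented for this example.

```cpp
// Hypothetical time-dependent conditions service: reconstruction asks for the
// alignment constants valid at the current event time; the service reloads
// from the conditions database only when the cached interval of validity expires.
#include <map>
#include <memory>
#include <string>

struct AlignmentConstants {          // example payload: silicon/pixel misalignments
    double dx, dy, dz;               // translations
    double alpha, beta, gamma;       // rotations
};

struct IntervalOfValidity {          // payload is valid for [since, until)
    unsigned long long since;
    unsigned long long until;
    bool contains(unsigned long long t) const { return t >= since && t < until; }
};

class TimeDependentConditionsSvc {
public:
    // Return constants valid at event time 't' for a given DB folder.
    const AlignmentConstants& get(const std::string& folder, unsigned long long t) {
        auto& entry = m_cache[folder];
        if (!entry.payload || !entry.iov.contains(t)) {
            entry = readFromConditionsDb(folder, t);   // refresh expired payload
        }
        return *entry.payload;
    }

private:
    struct CacheEntry {
        std::shared_ptr<AlignmentConstants> payload;
        IntervalOfValidity iov{0, 0};
    };

    CacheEntry readFromConditionsDb(const std::string& /*folder*/, unsigned long long t) {
        // Stand-in for real conditions-database access: return dummy constants
        // valid for a fixed-length interval starting at the requested time.
        CacheEntry e;
        e.payload = std::make_shared<AlignmentConstants>(AlignmentConstants{0, 0, 0, 0, 0, 0});
        e.iov = {t, t + 3600};
        return e;
    }

    std::map<std::string, CacheEntry> m_cache;
};
```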

7 TRT (F. Luehring/Indiana et al.)
- Athena pile-up requirements documentation (ATL-SOFT-2001)
- GEANT4 code writing
  - TRT hit and digitization definitions
- TRT GEANT3 code
  - Current beampipe + geometry updates
  - TRT material budget
  - TRT atlsim example on the grid
- GEANT3 vs. GEANT4 comparisons

8 LAr Simulation (M. Leltchouk/Nevis et al.)
- LAr simulation coordination: M. Leltchouk/Nevis
- Participation in G4 EM barrel development
- LAr EM calorimeter hits (LArEMHit) were implemented in GEANT4 by B. Seligman (a sensitive-detector sketch follows this slide)
- The ROOT I/O scheme is used for hit persistency (see http://www.usatlas.bnl.gov/computing/software/db/LArRoot2.html and http://www.usatlas.bnl.gov/computing/software/db/rootio.html)
- "Comparisons of GEANT4 Simulations with Testbeam Data and GEANT3 for the ATLAS Liquid Argon Calorimeter" was presented at CHEP2001
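For readers unfamiliar with how calorimeter hits are produced in GEANT4, the sketch below shows a minimal hit class and sensitive detector in the spirit of LArEMHit. It uses the standard Geant4 base classes (G4VHit, G4VSensitiveDetector), but the class names, members, and cell-identification logic are illustrative, not the actual ATLAS LAr code.

```cpp
// Minimal Geant4 calorimeter hit and sensitive detector (illustrative only).
#include "G4VHit.hh"
#include "G4THitsCollection.hh"
#include "G4VSensitiveDetector.hh"
#include "G4HCofThisEvent.hh"
#include "G4SDManager.hh"
#include "G4Step.hh"
#include "G4TouchableHistory.hh"
#include "G4VTouchable.hh"
#include "G4ThreeVector.hh"

class EmCaloHit : public G4VHit {
public:
    EmCaloHit(G4int cellId, G4double edep, const G4ThreeVector& pos)
        : fCellId(cellId), fEdep(edep), fPos(pos) {}
    G4int CellId() const { return fCellId; }
    G4double Edep() const { return fEdep; }
private:
    G4int fCellId;          // readout cell identifier
    G4double fEdep;         // energy deposit in this step
    G4ThreeVector fPos;     // position of the deposit
};

using EmCaloHitsCollection = G4THitsCollection<EmCaloHit>;

class EmCaloSD : public G4VSensitiveDetector {
public:
    explicit EmCaloSD(const G4String& name) : G4VSensitiveDetector(name) {
        collectionName.insert("EmCaloHits");
    }
    void Initialize(G4HCofThisEvent* hce) override {
        fHits = new EmCaloHitsCollection(SensitiveDetectorName, collectionName[0]);
        G4int id = G4SDManager::GetSDMpointer()->GetCollectionID(collectionName[0]);
        hce->AddHitsCollection(id, fHits);
    }
    G4bool ProcessHits(G4Step* step, G4TouchableHistory*) override {
        G4double edep = step->GetTotalEnergyDeposit();
        if (edep <= 0.) return false;
        // A real implementation maps the touchable volume to a LAr cell id;
        // here the volume copy number is used as a stand-in.
        G4int cell = step->GetPreStepPoint()->GetTouchable()->GetCopyNumber();
        fHits->insert(new EmCaloHit(cell, edep, step->GetPreStepPoint()->GetPosition()));
        return true;
    }
private:
    EmCaloHitsCollection* fHits = nullptr;
};
```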

9 GEANT4 LAr Simulation

10 FCal1 Testbeam Setup in GEANT4
[Figure: beamline elements around the cryostat - μ-counter, tail catcher, cryostat with FCal1 and FCal2 Module 0, iron shield, hole veto, MWPC, argon excluder, veto wall; only the setup around the cryostat is modeled. Inset: FCal1 electrode pattern.]

11 FCal1: GEANT3/GEANT4 Comparisons of Energy Resolution
[Figure: relative energy resolution [%] vs. beam energy [GeV] for GEANT4 and GEANT3, with and without a noise cut, compared to a fit to experimental data. A possible GEANT4 resolution problem at high energy is flagged.]

12 LAr Reconstruction -- Major Milestones Met
- Early design in PASO: Jan. 2000
- Migrate to Athena: May 2000
  - LAr reconstruction used as a test bed for early Athena
  - First application software to successfully migrate to Athena (3 working days at LBL)
- First common calorimeter interfaces: Oct. 2000
- QA review of then-available components: Dec. 2000 (S. Albrand, Grenoble)
- Combined reconstruction (egamma): Jan. 2001
- Process GEANT4 LAr hits (ROOT objects): Mar. 2001
- Lund release: June 2001
  - Established most of the reconstruction chain, from G3/G4 hits to particle identification

13 LAr Data Classes
- Data objects proposed/implemented in March 2001 (J. Collot et al.)

14 Comparison to ATRECON

15 Recent Plots Using the LAr Reconstruction Program

16 LAr Reconstruction Conclusion
- A central framework that is evolving to provide robust support
- The reconstruction design has been built on this framework
- Much of the Fortran code has been migrated
- Validation is ongoing, but results are promising
- This paves the way for work on:
  - Optimizing and developing new algorithms
  - Physics and detector performance studies

17 Tile Calorimeter
- Tile Calorimeter DB coordination: T. LeCompte/ANL
- Tile Cal reconstruction coordination: F. Merritt/Chicago
- Tile Cal XML detector description has been improved
  - Extended barrel completed
  - Non-uniform plate spacing included
  - Extended barrel can be easily repositioned w.r.t. the barrel, allowing study of the effect of the recently introduced gap
- Geant4 models have been built both from XML and "by hand"
  - G4 vs. test beam comparisons just beginning
- TileCal reconstruction per se is largely an issue of calibration, i.e. converting ADC counts to energy (a sketch of such a conversion is shown after this slide)
  - Calibration DB access is a goal for late FY2002
- TileCal classes have been changed to be more in line with the LAr classes
- Jet reconstruction classes have been streamlined
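The sketch below illustrates the kind of ADC-to-energy conversion meant here, with per-channel constants that would eventually come from the calibration database. It is an illustration only, not ATLAS TileCal code; the structure and constant names are hypothetical.

```cpp
// Illustrative per-channel calibration: E = (ADC - pedestal) * gain * charge-to-energy.
#include <cstdint>
#include <unordered_map>

struct TileChannelCalib {
    double pedestal;   // ADC counts
    double adcToPC;    // pC per ADC count (electronics gain)
    double pcToGeV;    // GeV per pC (from charge-injection / source calibration)
};

class TileCalibrator {
public:
    void setCalib(std::uint32_t channelId, const TileChannelCalib& c) {
        m_calib[channelId] = c;
    }
    double energyGeV(std::uint32_t channelId, double adcCounts) const {
        auto it = m_calib.find(channelId);
        if (it == m_calib.end()) return 0.0;   // unknown channel: no energy assigned
        const TileChannelCalib& c = it->second;
        return (adcCounts - c.pedestal) * c.adcToPC * c.pcToGeV;
    }
private:
    std::unordered_map<std::uint32_t, TileChannelCalib> m_calib;
};
```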

18 Tucson JetRec Working Group and Supporters (Tucson, Arizona, August 20-22, 2001)
- Argonne National Lab: Tom LeCompte
- Brookhaven National Laboratory (*): Hong Ma, Srini Rajagopalan
- TRIUMF: Monika Wielers
- University of Arizona: Peter Loch
- University of Chicago: Ed Frank, Ambreesh Gupta, Frank Merritt
(*) by phone

19 Task List for the Workshop
- Come up with an improved JetRec software design in view of recent suggestions for changes:
  - Definition of basic classes -> review of use cases
  - Establish the algorithm flow
  - First look at the "navigation problem"
- First attempt to set up a working group structure within the Jet/Etmiss performance group:
  - Work plans, deliverables and commitments
  - Reporting to the Jet/Etmiss and Software groups
  - Bi-weekly phone conferences, Tuesdays 17:00 (Geneva time) -> next October 2, 2001

20 Algorithmic Flow
Example only: there is NO restriction to calorimeter reconstruction objects or any other specific input type in general. A sketch of what such a type-agnostic jet finder could look like follows.
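To make the design point concrete, here is a minimal sketch of a jet finder whose input is any object exposing a four-momentum, so calorimeter cells, clusters, tracks, or truth particles could all be clustered. This is not the actual ATLAS JetRec design; the fixed-seed cone algorithm, class names, and parameters are invented for illustration.

```cpp
// Type-agnostic toy jet finder: cluster any four-momentum-like inputs into cones.
#include <cmath>
#include <cstddef>
#include <vector>

struct FourMom { double px, py, pz, e; };        // illustrative input interface

struct ProtoJet {
    FourMom p{0, 0, 0, 0};
    std::vector<std::size_t> constituents;       // indices into the input list
};

inline double eta(const FourMom& v) {
    double pt = std::hypot(v.px, v.py);
    return std::asinh(v.pz / (pt > 0 ? pt : 1e-12));
}
inline double phi(const FourMom& v) { return std::atan2(v.py, v.px); }

// Very simple fixed-seed cone clustering: every unused input above the seed
// threshold starts a jet; all unused inputs within the cone are attached.
std::vector<ProtoJet> simpleConeJets(const std::vector<FourMom>& inputs,
                                     double coneR = 0.7, double seedEt = 2.0) {
    const double twoPi = 2.0 * std::acos(-1.0);
    std::vector<ProtoJet> jets;
    std::vector<bool> used(inputs.size(), false);
    for (std::size_t i = 0; i < inputs.size(); ++i) {
        if (used[i] || std::hypot(inputs[i].px, inputs[i].py) < seedEt) continue;
        ProtoJet jet;
        for (std::size_t j = 0; j < inputs.size(); ++j) {
            if (used[j]) continue;
            double dEta = eta(inputs[j]) - eta(inputs[i]);
            double dPhi = std::remainder(phi(inputs[j]) - phi(inputs[i]), twoPi);
            if (std::hypot(dEta, dPhi) < coneR) {
                jet.p.px += inputs[j].px; jet.p.py += inputs[j].py;
                jet.p.pz += inputs[j].pz; jet.p.e  += inputs[j].e;
                jet.constituents.push_back(j);
                used[j] = true;
            }
        }
        jets.push_back(jet);
    }
    return jets;
}
```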

21 Muon Spectrometer
- Boston U (J. Shank), U Michigan, Harvard, BNL + CERN, Italy, ...
- Current activity:
  - Muon database and detector description
    - Muon DB coordination: S. Goldfarb/UM
    - XML detector description: MDTs, RPCs, TGCs implemented; full chain to Geant4 implemented
  - Geometry ID scheme for all subsystems defined and documented (a toy encoding is sketched after this slide)
  - OO muon reconstruction (Moore) development
    - Integrated into Athena; in the repository; in early development
    - Limited reconstruction in the barrel
  - Simulation for detector layout optimization
- Near-term goals:
  - Extend Moore to barrel, update to the emerging reconstruction data model
  - Trigger TDR studies: L1->L2 rejection, efficiencies
  - Calibration DB, trigger DB, ongoing detector description work
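For illustration of what a geometry identifier scheme does, the toy sketch below packs MDT tube indices into a single integer key. This is NOT the documented ATLAS identifier scheme; the field names, ranges, and bit widths are invented purely to show the idea.

```cpp
// Toy packing of muon MDT tube indices into a 32-bit key (illustrative only).
#include <cstdint>

struct MdtTubeId {
    unsigned station;      // station type index
    int      stationEta;   // station eta index (may be negative)
    unsigned stationPhi;   // phi sector
    unsigned multilayer;   // 1..2
    unsigned tubeLayer;    // 1..4
    unsigned tube;         // tube number within the layer
};

inline std::uint32_t pack(const MdtTubeId& id) {
    std::uint32_t etaBits = static_cast<std::uint32_t>(id.stationEta + 8);  // offset to keep it non-negative
    return (id.station     & 0x3Fu) << 26 |
           (etaBits        & 0x1Fu) << 21 |
           (id.stationPhi  & 0x0Fu) << 17 |
           (id.multilayer  & 0x03u) << 15 |
           (id.tubeLayer   & 0x07u) << 12 |
           (id.tube        & 0xFFFu);
}

inline MdtTubeId unpack(std::uint32_t key) {
    MdtTubeId id;
    id.station    =  key >> 26 & 0x3Fu;
    id.stationEta = static_cast<int>(key >> 21 & 0x1Fu) - 8;
    id.stationPhi =  key >> 17 & 0x0Fu;
    id.multilayer =  key >> 15 & 0x03u;
    id.tubeLayer  =  key >> 12 & 0x07u;
    id.tube       =  key       & 0xFFFu;
    return id;
}
```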

22 Physics Performance Comparison: CSC Doublets vs. Singlets
Endcap Muon System Staging Study (B. Zhou & D. Levin, U. of Michigan)

Final state                    | Recon. efficiency degradation | Resolution degradation                    | Non-Gaussian tail degradation
1 muon                         | < 3% for pT range 20-500 GeV  | < 2% dP/P degradation (20-500 GeV)        | < 10% for 500 GeV muons
2 muons (300 GeV A -> 2 muons) | ~ 4% more event loss          | mass resolution changed from 3.6% to 4.0% | < 2% non-Gaussian tail increase
4 muons (150 GeV H -> 4 muons) | ~ 5% more event loss          | no significant change in mass resolution  | no significant non-Gaussian tail increase

The US DoE/NSF Lehmann Review recommended that the US ATLAS Muon Team build 50% of the CSC chambers in the initial phase of the LHC. Physics studies used single muons, and two- and four-muon final states from low-mass Higgs decays. Conclusion: the US CSC muon staging plan shows no significant impact on low-mass Higgs detection at Day 1 of the LHC physics run. June 2001: ATLAS management approved the US staging plan.

23 Investigation of Alignment Criteria in Endcap MDTs (Daniel Levin, University of Michigan)
- Impact on efficiency and resolution due to uncertainties in chamber surveying, placement, and orientation
- Criterion: alignment tolerance should be < 0.3 mrad
[Figure: efficiency vs. axis misalignment (mrad); the green curve shows rotation about the beam (Z) axis.]

24 ATLAS Muon Database Contributions (S. Goldfarb)
- Overall coordination
  - Management of MuonSpectrometer packages for event and detector description
    - Reduction of cross-package software dependencies, porting to CMT
    - New packages for Objectivity DDL
  - Planning document for detector description development (ATL-COM-MUON-2001-021)
- Event model development
  - MuonEvent
    - Completion of transient G3 hit and container classes for MDT, RPC, TGC
    - Completion of persistent (Objectivity) digit and container classes, schema for MDT, RPC, TGC
  - New muon event model
    - Commencement of discussions with BNL defining a project for the Muon Spectrometer
    - Coordination with the SCT/TRT community
- Detector description development
  - MuonDetDescr
    - Completion of transient detector description classes for TGC
    - Completion of persistent (Objectivity) detector description classes and schema for MDT
  - MuonAGDD
    - Evaluations of MDT, RPC, TGC descriptions for GEANT4 simulation
    - Development of "compact" syntax definitions for MDT, RPC, TGC and Barrel Toroid
    - Completion of XML description and expansion interface for MDT, Barrel Toroid
- HEPDD (http://doc.cern.ch/age?a01380)
  - Hosted and chaired the second annual workshop on Detector Description for HEP at CERN

25 ATLAS Muon Database Contributions
[Figures: descriptions of the Barrel Toroid and the H8 test beam geometry. Both geometries were generated using the compact AGDD syntax and were developed by REU summer students under the supervision of S. Goldfarb.]

26 ATLAS Muon Database Planning
- Data Challenge 0
  - Persistent (Objectivity) detector description classes and schema for RPC, TGC
- Data Challenge 1
  - Access to Geometry Version O in Athena from AMDB (Naples + SG)
- General development of the event model
  - MuonEvent
    - New packages for technology-dependent software
    - Modifications necessary for the new geometry implementation
  - New muon event model
    - Initial implementation of muon digit container and identifier classes (BNL)
    - Implementation of the new identifier scheme (BNL + SG)
- General development of the detector description (plans detailed in ATL-COM-MUON-2001-021)
  - MuonDetDescr
    - Completion and testing of Objectivity persistency
    - New AGDD_DetDescrSource classes to interface MuonDetDescr with AGDD
  - MuonAGDD
    - Completion of syntax, XML descriptions and interfaces for RPC, TGC, CSC, inert material
    - Extensive testing and evaluation of AGDD with G4 simulation, Moore, Muonbox
- HEPDD: plan to host/chair the third annual workshop on Detector Description for HEP

27 Offline Muon Reconstruction (Moore)
- Muon Object Oriented Reconstruction (Moore)
- Runs in the Athena framework, built with the ATLAS CMT configuration management tool (an Athena algorithm skeleton is sketched after this slide)
- Strategy
  - Base algorithms on the trigger simulation: make roads from the trigger chambers
  - MDT pattern recognition added (see next slides)
  - Fitting from iPat
- Graphics currently using GraXML and ATLANTIS
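For orientation, the sketch below shows the Gaudi/Athena Algorithm pattern that framework-integrated code such as Moore follows: the framework calls initialize(), then execute() once per event, then finalize(). The class and member names here are illustrative, and the per-event reconstruction work and StoreGate data access are only indicated in comments; this is not the real Moore code.

```cpp
// Schematic Athena/Gaudi algorithm skeleton (illustrative, not actual Moore code).
#include <string>
#include "GaudiKernel/Algorithm.h"
#include "GaudiKernel/MsgStream.h"

class MooreLikeRecoAlg : public Algorithm {
public:
    MooreLikeRecoAlg(const std::string& name, ISvcLocator* svcLoc)
        : Algorithm(name, svcLoc) {}

    StatusCode initialize() override {
        MsgStream log(msgSvc(), name());
        log << MSG::INFO << "retrieving services, preparing geometry" << endreq;
        return StatusCode::SUCCESS;
    }

    StatusCode execute() override {
        MsgStream log(msgSvc(), name());
        // Per-event work: build roads from trigger-chamber hits, run MDT
        // pattern recognition inside the roads, then fit the candidates.
        // (Data access through StoreGate is omitted in this sketch.)
        log << MSG::DEBUG << "reconstructing muon tracks for this event" << endreq;
        return StatusCode::SUCCESS;
    }

    StatusCode finalize() override {
        return StatusCode::SUCCESS;
    }
};
```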

28 Pattern Recognition: Track Finding
[Figure: hit patterns in the x-y and z-y planes, showing hits in the inner, middle, and outer muon stations.]

29 Efficiency
[Figure: efficiency (%) vs. pT (GeV) for Muonbox and MOORE at pT = 6, 20, 100, 300, 1000 GeV.]
A muon track consists of hits from at least 2 stations and is successfully fitted. The efficiency is normalized to all events with the generated muon within |eta| < 1 at the event vertex.

30 Resolution
[Figure: pT resolution (%) vs. pT (GeV) at 6, 20, 100, 300, 1000 GeV.]
The resolution is defined as the sigma of the Gaussian fit to the pT(rec)/pT(gen) distribution; a sketch of such a fit follows this slide.
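The snippet below illustrates how a resolution number of this kind can be extracted with ROOT: histogram pT(rec)/pT(gen) and take the sigma of a Gaussian fit. It is an illustrative macro, not the actual MOORE validation code; the function and histogram names are invented.

```cpp
// Fill the pT(rec)/pT(gen) ratio and extract the Gaussian width with ROOT.
#include <cstdio>
#include <vector>
#include "TH1F.h"
#include "TF1.h"

double ptResolution(const std::vector<double>& ptRec,
                    const std::vector<double>& ptGen) {
    TH1F h("hRatio", "pT(rec)/pT(gen);ratio;entries", 100, 0.5, 1.5);
    for (std::size_t i = 0; i < ptRec.size() && i < ptGen.size(); ++i) {
        if (ptGen[i] > 0.) h.Fill(ptRec[i] / ptGen[i]);
    }
    h.Fit("gaus", "Q");                               // quiet Gaussian fit
    TF1* fit = h.GetFunction("gaus");
    double sigma = fit ? fit->GetParameter(2) : -1.;  // parameter 2 = sigma
    std::printf("pT resolution = %.2f%%\n", 100. * sigma);
    return sigma;
}
```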

31 Pull of 1/pT
[Figure: pull distributions of 1/pT for pT = 1000 GeV (sigma = 1.1941) and pT = 6 GeV (sigma = 6.176).]
The error on the 1/pT pull is due to the material. The pull is the difference between the reconstructed and true values, normalised to the error on the reconstructed value.
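For reference, the pull definition used above can be written as a one-line helper (illustrative only, not MOORE code; the variable names are invented):

```cpp
// pull = (reconstructed - true) / error(reconstructed), here applied to 1/pT;
// a well-modelled fit yields a unit-width Gaussian pull distribution.
inline double pull(double recInvPt, double trueInvPt, double errRecInvPt) {
    return (recInvPt - trueInvPt) / errRecInvPt;
}
```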

32 Moore Plans
- Release the code, documented
- Extend Moore to the end-cap regions
- Look into using the Level-1 simulation code directly
- Need to get the material description
- Plan to use Cobra fitting
- Explore graphics with Atlantis and continue with GraXML
- Implement the current O-layout
- Participate in Data Challenge 1

33 Trigger/DAQ Offline Software
- The ATLAS High Level Trigger (HLT) is mostly a software trigger
  - LVL2: optimized algorithms and simple trigger menus
  - Event Filter: offline-like algorithms, full event, and latest calibrations
- The LVL1 trigger is a hardware trigger and needs special simulation offline
- TDAQ software is similar to other detector software in terms of offline requirements and applications
  - Full simulation is used in the design and optimization of the TDAQ system
  - Offline software is used to monitor performance (rates, efficiency, single-component performance)
  - However, very stringent QC is needed: "mission criticality"

34 T/DAQ Offline Software: Status
- LVL1 simulation exists in Athena for the e/gamma/muon trigger
- Recently, most effort has been in the design of the HLT framework. Main requirement in the design:
  - Use the same software in online and offline environments
- Also plan to have a similar framework for LVL2 and EF
  - Possibly sharing of (some) services and algorithms
- Presently evaluating Athena for use as the EF framework
  - If OK for the EF, then consider use at LVL2
- First cycle of design recently finished; now implementing the first prototype
  - Aim for a vertical-slice prototype for Spring 2002
  - Exploitation for the HLT/DAQ/DCS TDR in late 2002

35 HLT Offline Software: Design
- High-level design stage is finished
- Aim is to use the same design for LVL2 and EF
- System factorized into work areas:
  - Steering
  - Algorithms
  - Data Manager
  - Event Data Model
- Interactions needed (and ongoing) with the offline and architecture groups

36 Validation of Athena for HLT Use
- The ATLAS EF will use selection and classification algorithms derived from the offline suite
- Offline software performance therefore has a direct impact on EF farm size and cost
- The HLT community has started "validation studies" (detailed benchmarking) of Athena, offline algorithms, and the event model
- The aim is to set metrics for monitoring trends in software performance (a toy per-event timing sketch follows this slide)
- It is clear that the software is presently far from adequate
  - Not fair to judge during the development phase
  - But benchmarking can help (and has helped) spur improvements
- Feedback during monthly meetings with the A-team and regular interactions with developers
- Software performance is also important offline; the hope is that the offline community will continue this work
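As a toy illustration of the kind of per-event timing metric such benchmarking collects (not the actual HLT validation suite; the function names and reporting format are invented):

```cpp
// Wrap the per-event work in a timer and report mean and worst-case latency.
#include <chrono>
#include <cstdio>
#include <functional>
#include <vector>

struct TimingStats {
    double meanMs = 0.;
    double maxMs  = 0.;
};

TimingStats benchmarkPerEvent(const std::function<void(int)>& processEvent, int nEvents) {
    std::vector<double> times;
    times.reserve(nEvents);
    for (int i = 0; i < nEvents; ++i) {
        auto t0 = std::chrono::steady_clock::now();
        processEvent(i);                                   // the algorithm under test
        auto t1 = std::chrono::steady_clock::now();
        times.push_back(std::chrono::duration<double, std::milli>(t1 - t0).count());
    }
    TimingStats s;
    for (double t : times) { s.meanMs += t; if (t > s.maxMs) s.maxMs = t; }
    if (!times.empty()) s.meanMs /= times.size();
    std::printf("events: %d  mean: %.2f ms  max: %.2f ms\n", nEvents, s.meanMs, s.maxMs);
    return s;
}
```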

37 Summary
- The new ATLAS framework, Athena, has been enthusiastically embraced by a broad spectrum of the sub-system community
- Many US physicists are active in C++ code development
- Well integrated into the overall ATLAS software effort
- Schedule:
  - DC0, 12/01: should have full OO software ready
    - Still some Fortran wrapping (muons)
  - DC1, 02/02: large-scale simulation/reconstruction
    - Some with GEANT4
    - Objectivity and ROOT I/O

