ATLAS Simulation/Reconstruction Software Reported by Jim Shank, work done by most US Institutes. DOE/NSF review of LHC Software and Computing Projects.


ATLAS Simulation/Reconstruction Software. Reported by Jim Shank; work done by most US institutes. DOE/NSF review of LHC Software and Computing Projects, Fermilab, November 2001.

28 Nov. 2001, J. Shank, ATLAS Simulation/Recon. SW.

Slide 2: Outline
- Activities in all systems (mostly by physicists): Pixels, TRT, EM Cal, Tile Cal, Muons, Trigger
- Well integrated into the overall ATLAS computing effort; in particular, the US core efforts on Athena and databases
- Review of recent activity, by sub-system
- Future work: Data Challenges

Slide 3: ATLAS Subsystem/Task Matrix

                      Offline Coord.   Reconstruction   Simulation     Database
  Chair               N. McCubbin      D. Rousseau      K. Amako       D. Malon
  Inner Detector      D. Barberis      D. Rousseau      F. Luehring    S. Bentvelsen
  Liquid Argon        J. Collot        S. Rajagopalan   M. Leltchouk   R. Sobie
  Tile Calorimeter    A. Solodkov      F. Merritt       A. Solodkov    T. LeCompte
  Muon                To be named      J.F. Laporte     A. Rimoldi     S. Goldfarb
  LVL2 Trigger/DAQ    S. George        S. Tapprogge     M. Wielers     H.P. Beck
  Event Filter        V. Vercesi       F. Touchard

Other US roles: D. Quarrie (LBNL), Chief Architect; P. Nevski (BNL), Geant3 simulation coordinator; H. Ma (BNL), raw data coordinator; C. Tull (LBNL), Eurogrid WP8 liaison.

Slide 4: Subdetector SW Activities Summary
- Performance/design studies
- G3-based simulation
- Test beam
- Athena integration
- Reconstruction development in C++
- G4-based simulation development
- G4 physics validation
- XML-based detector description
- Database; Conditions DB
- Trigger/DAQ

Slide 5: Pixels — Conditions DB (Berkeley)
- Goal: develop a general mechanism for retrieving time-dependent alignment constants from a database and using them in reconstruction
  - Requires additions to the Athena infrastructure
  - Requires extension of the existing detector description interface
  - Will prototype using the silicon and pixel detectors as the use case
- Misalignments are calculated from numbers stored in the Conditions Database
- Delivered through a general Time-Dependent Conditions Service (TCS) in Athena
- In addition to the event store (TES), need a detector store (TDeS) and an interface to the conditions DB (TCS)
- A prototype TDeS has been coded by C. Leggett and P. Calafiura (a second instance of the StoreGate service, without object deletion at the end of each event)
- Work in progress…
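The core idea of the Time-Dependent Conditions Service — look up the set of constants whose interval of validity covers the current event — can be sketched as follows. This is a minimal illustration, not the actual Athena TCS interface; the class name, timestamps, and constants are invented.

```python
# Hedged sketch (not the Athena TCS API): retrieving a time-dependent
# alignment constant by interval of validity.
from bisect import bisect_right

class ConditionsStore:
    """Toy conditions store: each constant set is valid from a start time onward."""
    def __init__(self):
        self._starts = []   # interval-of-validity start times (added in order)
        self._values = []

    def add(self, valid_from, constants):
        self._starts.append(valid_from)
        self._values.append(constants)

    def get(self, event_time):
        # Latest set of constants whose validity started at or before this event.
        i = bisect_right(self._starts, event_time) - 1
        if i < 0:
            raise KeyError("no conditions valid at this time")
        return self._values[i]

store = ConditionsStore()
store.add(0,    {"pixel_dx_mm": 0.00})
store.add(1000, {"pixel_dx_mm": 0.05})  # hypothetical realignment after run 1000

assert store.get(500)["pixel_dx_mm"] == 0.00
assert store.get(1500)["pixel_dx_mm"] == 0.05
```

Reconstruction code then asks the service for constants keyed by the event timestamp, rather than reading a fixed geometry, which is exactly why the extra detector store (TDeS) is needed alongside the per-event store.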

Slide 6: TRT (F. Luehring/Indiana et al.)
- Athena pile-up requirements documentation (ATL-SOFT-2001)
- GEANT4 code writing: TRT hit and digitization definitions
- TRT GEANT3 code: current beampipe + geometry updates
- TRT material budget
- TRT atlsim example on the grid
- GEANT3 vs. GEANT4 comparisons

Slide 7: LAr Simulation (M. Leltchouk/Nevis et al.)
- LAr simulation coordination: M. Leltchouk/Nevis
- Participation in G4 EM barrel development
- LAr EM calorimeter hits (LArEMHit) were implemented in GEANT4 by B. Seligman
- The ROOT I/O scheme is used for hit persistency
- Comparisons of GEANT4 simulations with test-beam data and GEANT3 for the ATLAS Liquid Argon Calorimeter were presented at CHEP 2001

Slide 8: GEANT4 LAr Simulation

Slide 9: FCal1 Testbeam Setup in GEANT4
[Figure: test-beam setup around the cryostat only, showing the μ-counter, TailCatcher, cryostat, FCal1 and FCal2 Module 0, iron shield, hole veto, MWPC, argon excluder, and veto wall; inset shows the FCal1 electrode pattern.]

Slide 10: FCal1 — GEANT3/4 Comparisons of Energy Resolution
[Plots: relative energy resolution [%] vs. beam energy [GeV] for GEANT4 and GEANT3, with and without a noise cut, compared to a fit to experimental data. A possible GEANT4 problem with the high-energy resolution is flagged.]

Slide 11: LAr Reconstruction — Major Milestones Met
- Early design in PASO: Jan
- Migration to Athena: May 2000
  - LAr reconstruction used as a test bed for early Athena
  - First application software to successfully migrate to Athena (3 working days at LBL)
- First common calorimeter interfaces: Oct
- QA review of then-available components: Dec (S. Albrand, Grenoble)
- Combined reconstruction (egamma): Jan
- Processing of GEANT4 LAr hits (ROOT objects): Mar
- Lund release: June 2001
  - Establishes most of the reconstruction chain, from G3/G4 hits to particle identification

Slide 12: LAr Data Classes
- Data objects proposed/implemented in March 2001 (J. Collot et al.)

Slide 13: Comparison to ATRECON

Slide 14: Recent Plots Using the LAr Reconstruction Program

Slide 15: LAr Reconstruction — Conclusion
- A central framework that is evolving to provide robust support
- The reconstruction design has been built on this framework
- Much of the Fortran code has been migrated
- Validation is ongoing, but results are promising
- This paves the way for optimizing and developing new algorithms, and for physics and detector performance studies

Slide 16: Tile Calorimeter
- Tile Calorimeter DB coordination: T. LeCompte/ANL; reconstruction coordination: F. Merritt/Chicago
- TileCal XML detector description has been improved:
  - Extended barrel completed; non-uniform plate spacing included
  - Extended barrel can easily be repositioned w.r.t. the barrel, allowing study of the effect of the recently introduced gap
- Geant4 models have been built both from XML and "by hand"; G4 vs. test-beam comparisons are just beginning
- TileCal reconstruction per se is largely a calibration issue (converting ADC counts to energy); calibration DB access is a goal for late FY2002
- TileCal classes have been changed to be more in line with the LAr classes
- Jet reconstruction classes have been streamlined
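The calibration step described above — converting raw ADC counts to energy with per-channel constants fetched from a calibration database — can be sketched minimally. The channel IDs, pedestals, and gains below are invented for illustration; they are not TileCal values.

```python
# Hedged sketch of ADC-to-energy calibration with per-channel constants
# (pedestal and gain) that would come from a calibration DB.
# All numbers are invented for illustration.

calib_db = {
    # channel_id: (pedestal_counts, GeV_per_count)
    101: (50.0, 0.012),
    102: (48.5, 0.011),
}

def adc_to_energy(channel_id, adc_counts):
    pedestal, gain = calib_db[channel_id]
    # Pedestal-subtract, apply gain, clamp negative fluctuations to zero.
    return max(0.0, (adc_counts - pedestal) * gain)

assert abs(adc_to_energy(101, 250) - 2.4) < 1e-9   # (250 - 50) * 0.012
assert adc_to_energy(102, 40) == 0.0               # below pedestal
```

The point of the database indirection is that the same reconstruction code works as constants are re-measured; only the stored (pedestal, gain) pairs change.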

Slide 17: Tucson JetRec Working Group and Supporters (Tucson, Arizona, August 20-22, 2001)
- Argonne National Lab: Tom LeCompte
- Brookhaven National Laboratory (by phone): Hong Ma, Srini Rajagopalan
- TRIUMF: Monika Wielers
- University of Arizona: Peter Loch
- University of Chicago: Ed Frank, Ambreesh Gupta, Frank Merritt

Slide 18: Task List for the Workshop
- Come up with an improved JetRec software design in view of recent suggestions for changes: definition of basic classes (review of use cases); establish the algorithm flow; first look at the "navigation problem"
- First attempt to set up a working-group structure within the Jet/Etmiss performance group: work plans, deliverables and commitments; reporting to the Jet/Etmiss and Software groups; bi-weekly phone conferences on Tuesdays at 17:00 (Geneva time), next on October 2, 2001

Slide 19: Algorithmic Flow
- Example only: there is NO restriction to calorimeter reconstruction objects or any other specific type in general.
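A jet-finding flow of the kind JetRec organizes can be sketched with a simple seeded-cone algorithm. This is a generic illustration, not the JetRec design itself: the inputs need only supply eta/phi/Et, which matches the slide's point that the flow is not restricted to calorimeter objects. Parameters (seed threshold, cone size) are invented.

```python
# Hedged sketch of a seeded-cone jet finder over generic (eta, phi, Et) objects.
import math

def delta_r(a, b):
    """Angular distance in eta-phi, with phi wrap-around."""
    dphi = abs(a["phi"] - b["phi"])
    if dphi > math.pi:
        dphi = 2 * math.pi - dphi
    return math.hypot(a["eta"] - b["eta"], dphi)

def cone_jets(objects, seed_et=2.0, cone=0.7):
    jets = []
    used = set()
    # Take seeds in decreasing Et; each unused object inside the cone
    # around a seed is assigned to that jet.
    for i, seed in sorted(enumerate(objects), key=lambda t: -t[1]["et"]):
        if i in used or seed["et"] < seed_et:
            continue
        members = [j for j, o in enumerate(objects)
                   if j not in used and delta_r(seed, o) < cone]
        used.update(members)
        jets.append(sum(objects[j]["et"] for j in members))
    return jets

objs = [
    {"eta": 0.0, "phi": 0.0, "et": 40.0},
    {"eta": 0.1, "phi": 0.1, "et": 10.0},   # inside the first cone
    {"eta": 2.0, "phi": 3.0, "et": 25.0},   # a separate jet
]
assert cone_jets(objs) == [50.0, 25.0]
```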

Slide 20: Muon Spectrometer
- Boston U (J. Shank), U Michigan, Harvard, BNL + CERN, Italy, …
- Current activity:
  - Muon database and detector description (Muon DB coordination: S. Goldfarb/UM); XML detector description: MDTs, RPCs, TGCs implemented; full chain to Geant4 implemented
  - Geometry ID scheme for all subsystems defined and documented
  - OO muon reconstruction (Moore) development: integrated into Athena; in the repository; in early development; limited reconstruction in the barrel
  - Simulation for detector layout optimization
- Near-term goals:
  - Extend Moore to the barrel; update to the emerging reconstruction data model
  - Trigger TDR studies: L1 -> L2 rejection, efficiencies
  - Calibration DB, trigger DB, ongoing detector description work

Slide 21: Physics Performance Comparison — CSC Doublets vs. Singlets, Endcap Muon System Staging Study (B. Zhou & D. Levin, U. of Michigan)

  Final state                   Recon. efficiency degradation   Resolution degradation                      Non-Gaussian tail degradation
  1 muon                        < 3% over pT 20-500 GeV         < 2% dP/P degradation (20-500 GeV)          < 10% for 500 GeV muons
  2 muons (300 GeV A -> 2 mu)   ~ 4% more events lost           mass resolution changed from 3.6% to 4.0%   < 2% non-Gaussian tail increase
  4 muons (150 GeV H -> 4 mu)   ~ 5% more events lost           no significant change in mass resolution    no significant non-Gaussian tail increase

- The US DoE/NSF Lehmann Review recommended that the US ATLAS Muon Team build 50% of the CSC chambers in the initial phase of the LHC.
- Physics studies used single muons, and two- and four-muon final states from low-mass Higgs decays.
- Conclusion: the US CSC muon staging plan shows no significant impact on low-mass Higgs detection on Day 1 of LHC physics running.
- June 2001: ATLAS management approved the US staging plan.

Slide 22: Investigation of Alignment Criteria in Endcap MDTs (Daniel Levin, University of Michigan)
- Impact on efficiency and resolution due to uncertainties in chamber surveying, placement and orientation
- [Plot: efficiency vs. axis misalignment (mrad); green curve: rotation about the beam axis.]
- Criterion: alignment tolerance should be < 0.3 mrad

Slide 23: ATLAS Muon Database Contributions (S. Goldfarb)
- Overall coordination:
  - Management of MuonSpectrometer packages for event and detector description: reduction of cross-package software dependencies, porting to CMT, new packages for Objectivity DDL
  - Planning document for detector description development (ATL-COM-MUON)
- Event model development:
  - MuonEvent: completion of transient G3 hit and container classes for MDT, RPC, TGC; completion of persistent (Objectivity) digit and container classes and schema for MDT, RPC, TGC
  - New muon event model: commencement of discussions with BNL defining the project for the Muon Spectrometer; coordination with the SCT/TRT community
- Detector description development:
  - MuonDetDescr: completion of transient detector description classes for TGC; completion of persistent (Objectivity) detector description classes and schema for MDT
  - MuonAGDD: evaluations of MDT, RPC, TGC descriptions for GEANT4 simulation; development of "compact" syntax definitions for MDT, RPC, TGC and the Barrel Toroid; completion of XML description and expansion interface for MDT and the Barrel Toroid
- HEPDD: hosted and chaired the second annual workshop on Detector Description for HEP at CERN

Slide 24: ATLAS Muon Database Contributions
- Descriptions of the Barrel Toroid and the H8 test-beam geometry. Both geometries were generated using the compact AGDD syntax and were developed by REU summer students under the supervision of S. Goldfarb.

Slide 25: ATLAS Muon Database Planning
- Data Challenge 0:
  - Persistent (Objectivity) detector description classes and schema for RPC, TGC
- Data Challenge 1:
  - Access to geometry version O in Athena from AMDB (Naples + SG)
- General development of the event model:
  - MuonEvent: new packages for technology-dependent software; modifications necessary for the new geometry implementation
  - New muon event model: initial implementation of Muon Digit Container and Identifier classes (BNL); implementation of the new identifier scheme (BNL + SG)
- General development of the detector description (plans detailed in ATL-COM-MUON):
  - MuonDetDescr: completion and testing of Objectivity persistency; new AGDD_DetDescrSource classes to interface MuonDetDescr with AGDD
  - MuonAGDD: completion of syntax, XML descriptions and interfaces for RPC, TGC, CSC and inert material; extensive testing and evaluation of AGDD with G4 simulation, Moore, Muonbox
- HEPDD: plan to host/chair the third annual workshop on Detector Description for HEP

Slide 26: Offline Muon Reconstruction (Moore)
- Muon Object Oriented REconstruction (Moore)
- Runs in the Athena framework, using the ATLAS CMT build system
- Strategy: base algorithms on the trigger simulation
  - Make roads from the trigger chambers
  - MDT pattern recognition added (see next slides)
  - Fitting from iPat
- Graphics currently using GraXML and ATLANTIS
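The road-building strategy above can be sketched in miniature: a trigger-chamber seed defines a direction, a "road" of fixed width is opened around it, and only precision-chamber (MDT) hits inside the road are passed to pattern recognition. This is a toy illustration, not Moore code; the one-coordinate geometry and road width are invented.

```python
# Hedged sketch of trigger-seeded road building for MDT hit selection.
# Real roads are 3D regions with per-station widths; here a single phi
# coordinate stands in for the full geometry.

def hits_in_road(seed_phi, mdt_hits, half_width=0.05):
    """Keep MDT hits whose phi lies within the road around the trigger seed."""
    return [h for h in mdt_hits if abs(h["phi"] - seed_phi) < half_width]

mdt_hits = [
    {"station": "inner",  "phi": 1.02},
    {"station": "middle", "phi": 1.04},
    {"station": "outer",  "phi": 0.70},  # outside the road: dropped
]
road = hits_in_road(seed_phi=1.00, mdt_hits=mdt_hits)
assert [h["station"] for h in road] == ["inner", "middle"]
```

Transferring only the hits inside the road is what keeps the data volume small, which is the same motivation behind the Region-of-Interest mechanism in the trigger.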

Slide 27: Pattern Recognition — Track Finding
[Figure: track finding in the x-y and z-y planes, showing hits in the inner, middle and outer stations.]

Slide 28: Efficiency
[Plot: efficiency (%) vs. pT (GeV) for Muonbox and MOORE.]
- A muon track consists of hits from at least 2 stations and is successfully fitted.
- The efficiency is normalized to all events with the generated muon within |eta| < 1 at the event vertex.
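The efficiency definition on this slide has two separate pieces: the track criteria (at least 2 stations, successful fit) in the numerator, and the generator-level acceptance cut (|eta| < 1) in the denominator. A minimal sketch, with invented event records:

```python
# Hedged sketch of the efficiency definition: numerator = tracks passing
# the reconstruction criteria, denominator = all generated muons in acceptance.

def reco_efficiency(events):
    denom = [e for e in events if abs(e["gen_eta"]) < 1.0]
    num = [e for e in denom if e["n_stations"] >= 2 and e["fit_ok"]]
    return len(num) / len(denom)

events = [
    {"gen_eta": 0.2,  "n_stations": 3, "fit_ok": True},
    {"gen_eta": -0.5, "n_stations": 1, "fit_ok": True},   # too few stations
    {"gen_eta": 0.9,  "n_stations": 2, "fit_ok": False},  # fit failed
    {"gen_eta": 1.5,  "n_stations": 3, "fit_ok": True},   # outside |eta| < 1
]
assert reco_efficiency(events) == 1 / 3
```

Note that the out-of-acceptance muon is removed from the denominator entirely, so it neither helps nor hurts the quoted efficiency.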

Slide 29: Resolution
[Plot: pT resolution (%) vs. pT (GeV).]
- The resolution is defined as the sigma of the Gaussian fit to the pT(rec)/pT(gen) distribution.

Slide 30: Pull of 1/pT
[Plots: 1/pT pull distributions at pT = 1000 GeV and pT = 6 GeV, with fitted sigma values.]
- The error on the 1/pT pull is due to the material.
- Pull = the difference between the reconstructed and true values, normalized to the error on the reconstructed value.
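The pull definition quoted above is a one-liner worth making explicit: for an unbiased fit with correctly estimated errors, the pull distribution should be Gaussian with mean 0 and sigma 1, which is why the fitted sigmas on this slide are the diagnostic. The numbers below are invented, not taken from the plots.

```python
# Hedged sketch of the pull definition: (reconstructed - true) / error(reconstructed).
# A sigma near 1 means the fit's error estimate is honest.

def pull(reco, true, sigma_reco):
    return (reco - true) / sigma_reco

# 1/pT pull for a hypothetical 6 GeV muon (all numbers invented):
q_over_pt_true = 1 / 6.0
q_over_pt_reco = 0.18
sigma = 0.01
p = pull(q_over_pt_reco, q_over_pt_true, sigma)
assert abs(p - 4 / 3) < 1e-9   # (0.18 - 0.1666...) / 0.01
```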

Slide 31: Moore Plans
- Release the code with documentation
- Extend Moore to the end-cap regions
- Look into using the Level-1 simulation code directly
- Need to get the material description
- Plan to use Cobra fitting
- Explore graphics with Atlantis and continue with GraXML
- Implement the current O-layout
- Participate in Data Challenge 1

Slide 32: Trigger/DAQ Offline Software
- The ATLAS High Level Trigger (HLT) is mostly a software trigger:
  - LVL2: optimized algorithms and simple trigger menus
  - Event Filter: offline-like algorithms, full event, and latest calibrations
- The LVL1 trigger is a hardware trigger and needs special simulation offline
- TDAQ software is similar to other detector software in terms of offline requirements and applications:
  - Full simulation is used in the design and optimization of the TDAQ system
  - Offline software is used to monitor performance (rates, efficiency, single-component performance)
- However, very stringent QC is needed: "mission criticality"

Slide 33: T/DAQ Offline Software — Status
- LVL1 simulation exists in Athena for the e/gamma/tau trigger
- Recently, most effort has been in the design of the HLT framework; the main design requirement is to use the same software in online and offline environments
- Also plan to have a similar framework for LVL2 and the EF, possibly sharing (some) services and algorithms
- Presently evaluating Athena for use as the EF framework; if OK for the EF, then consider use at LVL2
- First cycle of design recently finished; now implementing the first prototype
- Aim for a vertical-slice prototype for spring 2002
- Exploitation for the HLT/DAQ/DCS TDR in late 2002

Slide 34: HLT Offline Software — Design
- The high-level design stage is finished
- The aim is to use the same design for LVL2 and the EF
- The system is factorized into work areas: steering, algorithms, data manager, event data model
- Interactions needed (and ongoing) with the offline and architecture groups

Slide 35: Validation of Athena for HLT Use
- The ATLAS EF will use selection and classification algorithms derived from the offline suite
- Offline software performance therefore has a direct impact on EF farm size and cost
- The HLT community has started "validation studies" (detailed benchmarking) of Athena, the offline algorithms, and the event model
- The aim is to set metrics for monitoring trends in software performance
- It is clear that the software is presently far from adequate, though it is not fair to judge during the development phase
- Benchmarking can help (and has helped) spur improvements: feedback during monthly meetings with the A-team and regular interactions with developers
- Software performance is also important offline; we hope the offline community will continue this work

Slide 36: Summary
- The new ATLAS framework, Athena, has been enthusiastically embraced by a broad spectrum of the sub-system community
- Many US physicists are active in C++ code development, well integrated into the overall ATLAS software effort
- Schedule:
  - DC0 (12/01): should have the full OO software ready; still some Fortran wrapping (muons)
  - DC1 (02/02): large-scale simulation/reconstruction; some with GEANT4; Objectivity and ROOT I/O