Sander Klous, on behalf of the ATLAS Collaboration. Real-Time 2010 Conference, 28/5/2010.


- Introduction to the ATLAS DataFlow system
- Modeling DataFlow and resource utilization
- Cost monitoring explained
- Example of performance data analysis
- Conclusions

[Diagram of the ATLAS DataFlow system, showing the Event Builders, the Local Event Storage, and the path to the muon calibration centers.]
Acronyms: Frontend Electronics (FE), Read Out Driver (ROD), Region of Interest (RoI), Read Out Buffer (ROB), Read Out System (ROS), Trigger Level 2 (L2), Event Filter (EF).

- Historically, these studies have been done at different levels of detail:
  - Paper model (static model): back-of-the-envelope calculations based on average data volumes and data-fragmentation information (a minimal sketch follows below)
  - Dynamic model (computer simulation): a discrete-event model of the DataFlow system, cross-checked against the paper-model results and giving additional information on queuing in the system
- How well do these studies match reality?
- What predictions can be made for the future?
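To make the paper-model idea concrete, here is a minimal back-of-the-envelope sketch in Python. Every number below (L1 rate, fragment size, RoI request fraction, L2 accept fraction) is an illustrative assumption, not an actual ATLAS parameter.

```python
# Paper-model sketch: estimate the data-request load on one Read Out System
# (ROS) node. All numbers are illustrative assumptions.

L1_RATE_HZ = 75_000          # assumed Level-1 accept rate
FRAGMENT_SIZE_KB = 1.0       # assumed average fragment size served by one ROS
ROI_REQUEST_FRACTION = 0.02  # assumed chance an L2 RoI request touches this ROS
L2_ACCEPT_FRACTION = 0.05    # assumed fraction of events passed to event building

# L2 pulls only RoI data; the Event Builders pull everything for accepted events.
l2_load_kb_s = L1_RATE_HZ * ROI_REQUEST_FRACTION * FRAGMENT_SIZE_KB
eb_load_kb_s = L1_RATE_HZ * L2_ACCEPT_FRACTION * FRAGMENT_SIZE_KB

print(f"L2 RoI load per ROS:          {l2_load_kb_s / 1024:.2f} MB/s")
print(f"Event-building load per ROS:  {eb_load_kb_s / 1024:.2f} MB/s")
print(f"Total load per ROS:           {(l2_load_kb_s + eb_load_kb_s) / 1024:.2f} MB/s")
```

The dynamic model replaces these averages with a discrete-event simulation, which is what exposes the queuing effects that static numbers cannot show.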

- Introduce a mechanism in the running DAQ system to:
  - Collect performance information (i.e. resource utilization) on the fly, on an event-by-event basis
  - Group related performance information together
- Use this information to validate the model:
  - Trigger rates and processing times
  - Access to information fragments

- Data driven
- An event contains multiple parts (see the sketch below):
  - Header
  - Metadata
  - Payload
- Metadata is added by:
  - L2 (the L2 result)
  - EF (the EF result)
[Diagram of the event layout: event header, L2 result, EF result, and the event payload with fragments from detector A, detector B, etc.]
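A minimal sketch of this event layout as a Python dataclass; the field and detector names are illustrative, not the actual ATLAS event-format definitions.

```python
from dataclasses import dataclass, field

@dataclass
class Event:
    header: bytes                 # run/event identifiers and status words
    l2_result: bytes = b""        # metadata appended by L2
    ef_result: bytes = b""        # metadata appended by the EF
    payload: dict = field(default_factory=dict)  # per-detector data fragments

# The payload holds one fragment per detector; the trigger levels append
# their results as metadata without modifying the payload itself.
evt = Event(header=b"\x00\x01", payload={"DetectorA": b"...", "DetectorB": b"..."})
evt.l2_result = b"l2-decision-and-cost-data"
```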

- Reduced event payload for calibration events:
  - Not all detector data is needed, so the events are smaller
  - Partially built at LVL2
  - Stripped before being stored (by the EF or the SFO)
- Improved efficiency (a stripping sketch follows below):
  - Disk: less storage capacity needed
  - Network: reduced bandwidth
  - CPU: bypass L2/EF processing where possible
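A sketch of the payload stripping, using a plain dict so the example stands alone; which fragments a given calibration stream needs is an assumption here.

```python
def strip_for_calibration(payload, needed):
    """Keep only the detector fragments the calibration stream needs."""
    return {det: frag for det, frag in payload.items() if det in needed}

payload = {"DetectorA": b"A" * 1000, "DetectorB": b"B" * 1000, "DetectorC": b"C" * 1000}
small = strip_for_calibration(payload, needed={"DetectorA"})
print(sum(map(len, payload.values())), "->", sum(map(len, small.values())), "bytes")
# 3000 -> 1000 bytes: less disk space and less network bandwidth per event.
```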

- Performance data stored in the L2/EF result (see the record sketch below):
  - For each event:
    - L1 accept time and HLT host local time
    - HLT application ID
    - L1 and HLT trigger counters
    - L1 and HLT trigger decision bits
  - For every 10th event:
    - Start/stop times of the HLT algorithms
    - The HLT trigger requesting each HLT algorithm
    - RoI information, ROB IDs, ROB request time and ROB size
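The two levels of detail could be captured in records like the following sketch; the field names and types are assumptions, not the real ATLAS record layout.

```python
from dataclasses import dataclass, field

@dataclass
class PerEventCost:                # recorded for every event
    l1_accept_time: float          # L1 accept timestamp
    host_local_time: float         # local clock of the HLT node
    hlt_app_id: int                # which HLT application handled the event
    trigger_counters: list         # L1 and HLT trigger counters
    decision_bits: int             # packed L1 and HLT trigger decision bits

@dataclass
class DetailedCost(PerEventCost):  # recorded for every 10th event only
    alg_times: list = field(default_factory=list)     # (algorithm, start, stop)
    requesting_trigger: str = ""                      # HLT trigger that ran it
    rob_requests: list = field(default_factory=list)  # (ROB ID, request time, size)
```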

- Transport the information by piggybacking on rejected events, which can be built partially:
  - Without the event payload (only the L2/EF result)
  - This avoids mixing the performance data with other data
- Collection of the buffered information (sketched below):
  - On every Nth rejected event (N = 100), the cost algorithm fires and the buffer is serialized
  - Typically less than 1 MB/s is collected
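A sketch of that collection logic: cost records accumulate in a local buffer, and every Nth rejected event the buffer is serialized into that event's L2 result. The hook name, the event interface, and the use of pickle are all assumptions.

```python
import pickle

N = 100                        # sampling period over rejected events
cost_buffer, rejected_seen = [], 0

def on_event_processed(cost_record, rejected, event):
    """Buffer a cost record; piggyback the buffer on every Nth rejected event."""
    global rejected_seen
    cost_buffer.append(cost_record)
    if not rejected:
        return
    rejected_seen += 1
    if rejected_seen % N == 0:
        # Rejected events are built partially (L2/EF result only, no payload),
        # so the serialized buffer travels through event building at low cost.
        event.l2_result = pickle.dumps(cost_buffer)
        cost_buffer.clear()
```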

[DataFlow diagram annotated with the cost-monitoring path: performance data is buffered in the HLT nodes and flows through the Event Builders into Local Event Storage, alongside the existing path to the muon calibration centers.]

- A separate stream carries the performance data
- Automatic NTuple production and analysis
- Results are listed on HTML pages (a summary sketch follows below):
  - Trigger rates
  - Trigger sequences
  - Processing times
- Feedback information for:
  - Operations and menu coordination
  - Performance studies, modeling and extrapolation
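What such an automatic analysis step might look like: aggregate the collected records into per-algorithm rates and mean processing times and emit a plain HTML table. The record layout (dicts carrying an alg_times list) follows the sketches above and is an assumption.

```python
from collections import defaultdict

def summarize_to_html(records, wall_time_s):
    """Aggregate cost records into per-algorithm rate and mean-time rows."""
    times = defaultdict(list)
    for rec in records:
        for alg, start, stop in rec["alg_times"]:
            times[alg].append(stop - start)
    rows = "".join(
        f"<tr><td>{alg}</td>"
        f"<td>{len(ts) / wall_time_s:.1f} Hz</td>"
        f"<td>{1e3 * sum(ts) / len(ts):.2f} ms</td></tr>"
        for alg, ts in sorted(times.items()))
    return ("<table><tr><th>Algorithm</th><th>Rate</th><th>Mean time</th></tr>"
            + rows + "</table>")
```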

The online L2 monitoring shows a long tail in the event processing time (wall-clock time):
[Plot: L2 event processing time for a run with L1 BPTX seeding at 5 kHz, showing a long tail.]

With our new tool, we identify the dominant algorithm responsible for the long tail:
[Plot: per-algorithm processing times, singling out the L2 Minimum Bias algorithm.]

With our tool we can investigate the different aspects of the algorithm (a breakdown sketch follows below):
- CPU consumption is healthy
- The typical retrieval time is about 1 ms
- The problem is in ROB retrieval (congestion? a ROS problem?)
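The kind of breakdown behind those observations can be sketched by splitting each algorithm call into CPU time and ROB-retrieval time and comparing the tails; the field names here are illustrative assumptions.

```python
import statistics

def tail_report(calls):
    """calls: list of dicts with 'total_ms' and 'rob_ms' per algorithm call."""
    cpu = sorted(c["total_ms"] - c["rob_ms"] for c in calls)
    rob = sorted(c["rob_ms"] for c in calls)
    pct = lambda xs, p: xs[min(len(xs) - 1, int(p * len(xs)))]
    print(f"CPU: median {statistics.median(cpu):.2f} ms, 99% {pct(cpu, 0.99):.2f} ms")
    print(f"ROB: median {statistics.median(rob):.2f} ms, 99% {pct(rob, 0.99):.2f} ms")

# A healthy CPU distribution combined with a long ROB tail points at data
# retrieval (ROS congestion) rather than at the algorithm's own computation.
```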

- Cost monitoring is a valuable new tool for performance measurements
- The tool makes intelligent use of existing features of the ATLAS TDAQ system
- The tool is operational and works well, as demonstrated by the example above
- Next steps:
  - Validate the Monte Carlo performance model against real data
  - Model higher-luminosity Monte Carlo events to extrapolate to future conditions
  - Make cost monitoring available online