Slide 1 – Distributed Trigger System for the LHC experiments
Krzysztof Korcyl (ATLAS experiment), H. Niewodniczanski Institute of Nuclear Physics, Cracow

Slide 2 – Contents
- LHC physics program, detectors (ATLAS, LHCb)
- LHC T/DAQ system challenges
- T/DAQ system overview: ATLAS, LHCb
- T/DAQ trigger and data collection scheme: ATLAS, LHCb

Slide 3 – CERN and the Large Hadron Collider (LHC)
- The LHC is being constructed underground inside a 27 km tunnel.
- Head-on collisions of very high energy protons.
- ALICE, ATLAS, CMS and LHCb are the approved experiments.

Slide 4 – The ATLAS LHC Experiment

Slide 5 – The LHCb LHC Experiment

Slide 6 – The LHCb LHC Experiment: an event signature

Slide 7 – Challenges for the Trigger/DAQ system
The challenges:
- unprecedented LHC rate of 10^9 interactions per second
- large and complex detectors with O(10^8) channels to be read out
- bunch crossing rate of 40 MHz requires a decision every 25 ns
- event storage rate limited to O(100) MB/s
The big challenge: to select rare physics signatures with high efficiency while rejecting common (background) events, e.g. H → γγ (m_H ≈ 100 GeV), whose rate is only a tiny fraction of the LHC interaction rate.
Approach: a three-level trigger system.
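As a quick illustration of the required selectivity, here is a minimal back-of-the-envelope sketch in Python, using the nominal rates quoted on the next slide, of the rejection factor each trigger level has to provide (purely illustrative, not a specification):

# Back-of-the-envelope rejection factors for the three-level trigger chain,
# using the nominal ATLAS rates quoted on the next slide.

stages = [
    ("bunch crossings", 40e6),   # Hz
    ("LVL1 accepts",    100e3),  # Hz
    ("LVL2 accepts",    1e3),    # Hz
    ("EF accepts",      100.0),  # Hz, written to mass storage
]

for (prev_name, prev_rate), (name, rate) in zip(stages, stages[1:]):
    print(f"{prev_name:>15} -> {name:<15}: rejection factor ~{prev_rate / rate:,.0f}")

print(f"overall reduction: ~{stages[0][1] / stages[-1][1]:,.0f} (40 MHz down to ~100 Hz)")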

Slide 8 – ATLAS Trigger/DAQ: system overview
[Block diagram: CALO, MUON and TRACKING detectors feed LVL1 (pipeline memories); Regions of Interest are passed to LVL2; data flow through the Readout Drivers and Readout Buffers to the Event Builder, the EF and Data Recording.]
Interaction rate ~1 GHz; bunch crossing rate 40 MHz.
- after LVL1: rate 100 kHz, latency < 2.5 μs, throughput 200 GB/s
- after LVL2: rate 1 kHz, throughput 4 GB/s
- after EF: rate 100 Hz, throughput 200 MB/s
The LVL1 decision is based on coarse-granularity calorimeter data and the muon trigger stations. LVL2 can access data at full granularity and can combine information from all detectors; the emphasis is on fast rejection. In most cases the Regions of Interest (RoIs) from LVL1 are used to reduce the data requested to a few % of the whole event. The EF refines the selection according to the LVL2 classification, performing a fuller reconstruction; more detailed alignment and calibration data can be used.
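The RoI-guided data request can be illustrated with a toy sketch; the helper names, the eta-phi granularity and the RoI window size below are assumptions for illustration, not the ATLAS values:

# Toy illustration of RoI-guided data requests at LVL2. Each readout buffer
# (ROB) covers a small eta-phi region, and LVL2 requests only the ROBs
# overlapping the LVL1 Regions of Interest, not the full event.

from dataclasses import dataclass

@dataclass
class RoI:
    eta: float
    phi: float
    half_width: float = 0.2   # assumed half-size of the RoI window

def robs_for_roi(roi, rob_map):
    """Return the ROB ids whose region centre falls inside the RoI window."""
    return [rob_id for rob_id, (eta, phi) in rob_map.items()
            if abs(eta - roi.eta) < roi.half_width and abs(phi - roi.phi) < roi.half_width]

# Hypothetical detector map: 16 eta slices x 32 phi slices, one ROB each.
rob_map = {}
for i in range(16):
    for j in range(32):
        rob_map[len(rob_map)] = (-2.5 + (i + 0.5) * (5.0 / 16), (j + 0.5) * (6.283 / 32))

lvl1_rois = [RoI(eta=0.3, phi=1.2), RoI(eta=-1.1, phi=2.8)]
requested = {rob for roi in lvl1_rois for rob in robs_for_roi(roi, rob_map)}
print(f"requesting {len(requested)} of {len(rob_map)} ROBs "
      f"(~{100 * len(requested) / len(rob_map):.1f}% of the event)")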

Slide 9 – ATLAS overall data collection scheme
[Block diagram: LVL1 and the RoI Builder (RoIB) feed the LVL2 supervisor (L2SV) and the LVL2 farm of L2PUs; Readout Drivers (RODs) feed the Readout Subsystems (ROS); the DataFlow Manager (DFM) and the SFIs build events over a large switched data collection network; built events are passed to the EF subfarms (CPUs behind a switch).]

Slide 10 – Why trigger on the GRID?
First code benchmarking shows that local CPU power may not be sufficient (budget and manageable cluster size) → distribute the work over remote clusters. Why not? GRID technology will provide platform-independent tools which match well the needs to run, monitor and control the remote trigger algorithms. Development of dedicated tools (based on GRID technology) ensuring quasi real-time response of the order of a few seconds might be necessary → a task for CROSSGRID.

Slide 11 – Data flow diagram
[Diagram: the experiment feeds both the local high-level trigger and an event buffer for remote processing; the Event dispatcher, through the CROSSGRID interface, sends buffered events to remote high-level trigger farms and collects their decisions; accepted events are written to tape.]

Slide 12 – Operational features
- The Event dispatcher is a separate module, easy to activate and deactivate.
- Its implementation is independent of the specific trigger solutions of a given experiment.
- Dynamic resource assignment keeps the system running within the assumed performance limits (event buffer occupancy, link bandwidth, number of remote centres, timeout rate...).
- Fault tolerance and timeout management (no decision within the allowed time limit); see the sketch below.
- User interface for monitoring and control by a shifter.
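The dispatcher behaviour listed above can be illustrated with a toy Python sketch; everything here, the in-memory queue, the timeout value, the accept rate and the health criterion, is an assumption for illustration, not the CROSSGRID implementation:

# Toy event dispatcher with timeout management and a crude form of dynamic
# resource assignment. The real system would use GRID middleware and real
# network transport instead of in-process calls.

import queue
import random

EVENT_TIMEOUT_S = 5.0          # assumed allowed time limit for a remote decision
MAX_TIMEOUT_FRACTION = 0.2     # above this, a remote centre is taken out of use

class RemoteCentre:
    def __init__(self, name):
        self.name, self.sent, self.timed_out = name, 0, 0

    def process(self, event):
        """Pretend to run the high-level trigger remotely; may be too slow."""
        latency = random.uniform(0.0, 6.0)      # stand-in for network + CPU time
        if latency > EVENT_TIMEOUT_S:
            self.timed_out += 1
            return None                         # no decision within the time limit
        return random.random() < 0.01           # illustrative ~1% accept rate

    def healthy(self):
        return self.sent == 0 or self.timed_out / self.sent <= MAX_TIMEOUT_FRACTION

def dispatch(event_buffer, centres, local_fallback):
    while not event_buffer.empty():
        event = event_buffer.get()
        usable = [c for c in centres if c.healthy()]   # dynamic resource assignment
        if not usable:
            local_fallback.append(event)               # no healthy remote centre left
            continue
        centre = random.choice(usable)
        centre.sent += 1
        decision = centre.process(event)
        if decision is None:
            local_fallback.append(event)               # timeout: reprocess locally
        elif decision:
            print(f"event {event} accepted by {centre.name}")

event_buffer = queue.Queue()
for i in range(200):
    event_buffer.put(i)
local_fallback = []
dispatch(event_buffer, [RemoteCentre("Poland"), RemoteCentre("Spain")], local_fallback)
print(f"{len(local_fallback)} events fell back to the local high-level trigger")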

Slide 13 – Testbed for the distributed trigger
Easy to test by substituting real experiment data with a PC sending Monte Carlo data.
[Diagram: a PC at CERN sends Monte Carlo data into the Event Buffer; the Event Dispatcher, with its monitoring and control tool, distributes events to remote centres in Poland, Spain and Germany and collects the decisions.]
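A minimal sketch of how the testbed's event source could be emulated: a generator standing in for the real experiment fills the event buffer with Monte Carlo-like records (the record contents and format are purely assumptions):

# Toy stand-in for the "PC at CERN": generates Monte Carlo-like event records
# and pushes them into the event buffer read by the Event Dispatcher.
# The event content is a made-up placeholder, not a real data format.

import json
import queue
import random

def generate_mc_event(event_id):
    """Produce a fake event record; a real testbed would read simulated raw data."""
    return {
        "event_id": event_id,
        "n_tracks": random.randint(5, 50),
        "calo_energy_gev": round(random.expovariate(1 / 50.0), 1),
    }

def fill_buffer(event_buffer, n_events=1000):
    for event_id in range(n_events):
        event_buffer.put(json.dumps(generate_mc_event(event_id)))

event_buffer = queue.Queue()
fill_buffer(event_buffer)
print(f"event buffer holds {event_buffer.qsize()} Monte Carlo events")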

Slide 14 – Summary
- Trigger systems for the LHC experiments are challenging.
- GRID technology may help to compensate for the lack of local CPU power.
- The proposed distributed trigger structure, with a separate Event dispatcher module, offers a cross-experiment platform independent of the specific local trigger solutions.
- Implementation on a testbed is feasible even without running experiments.
- Dedicated tools are to be developed within the CROSSGRID project to ensure interactivity, monitoring and control.