The Power of Data Driven Triggering
Fermilab Scientific Computing Division, Fermi National Accelerator Laboratory, Batavia, Illinois, USA


DAQ Topology

The NOνA DAQ system is designed to support continuous readout of over 368,000 detector channels. NOνA accomplishes this with a hierarchical readout structure that transmits raw data to aggregation nodes, which perform multiple stages of event building and organization before retransmitting the data in real time to the next stage of the DAQ chain. In this manner data moves from Front End digitizer Boards (FEBs) to Data Concentrator Modules (DCMs), then to event building & buffering nodes, and finally into a data logging station.

The flow of data out of the buffer nodes is controlled by a Global Trigger process that issues extraction directives for time windows identified to be of interest. These windows can represent beam spills, calibration triggers, or data driven trigger decisions.

The buffer nodes used for DAQ processing are a farm of high performance commodity computers. Each node has a minimum of 16 compute cores and 32 GB of memory available for data processing and buffering. This permits each node to run both the event builder process and the full multi-threaded data driven triggering suite while buffering a minimum of 20 seconds of full, continuous, zero-bias data readout.

The NOνA detector utilizes a simple XY range stack geometry. This allows very efficient and powerful global tracking algorithms to be applied to identify event topologies that are of interest to physics analyses.
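The 20-second zero-bias buffering requirement can be sanity-checked with back-of-the-envelope arithmetic. The sketch below uses figures quoted elsewhere on this poster (a ~0.75 GB/s minimum bias stream and 200 buffer nodes with 32 GB each); the assumption that the stream divides evenly across nodes is illustrative, not a statement of the actual load balancing:

```python
# Back-of-the-envelope check of the zero-bias buffer depth requirement.
# Figures taken from this poster; even division across nodes is an assumption.
TOTAL_RATE_GB_S = 0.75    # aggregate minimum bias data rate
NUM_BUFFER_NODES = 200    # event builder / buffer node farm size
NODE_MEMORY_GB = 32       # memory available per buffer node
BUFFER_GOAL_S = 20        # required seconds of buffered readout

per_node_rate = TOTAL_RATE_GB_S / NUM_BUFFER_NODES  # GB/s landing on each node
buffer_needed_gb = per_node_rate * BUFFER_GOAL_S    # memory needed for 20 s

print(f"per-node rate:  {per_node_rate * 1000:.2f} MB/s")
print(f"20 s buffer:    {buffer_needed_gb * 1000:.1f} MB per node")
print(f"fits in memory: {buffer_needed_gb < NODE_MEMORY_GB}")
```

At these rates each node only needs on the order of 100 MB for the 20 s window, which is why a 32 GB node can also host the event builder and the full DDT suite.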
The NOνA online physics program needs the ability to produce data driven triggers based on the identification and global reconstruction characteristics of four broad classes of event topology: linear tracks, multi-track vertices, EM shower objects, and correlated hit clusters. Being able to generate DDTs based on these topologies allows NOνA to:

- Improve detector calibrations using "horizontal" muons
- Verify detector/beam timing with contained neutrino events
- Search for exotic phenomena in non-beam data (magnetic monopole and WIMP annihilation signatures)
- Search for directional cosmic neutrino sources
- Detect nearby supernovae

[Figure: a νµ CC candidate and a multi-prong neutrino candidate (cosmic ray). Neutrino event topologies of interest observed in the NOνA prototype near detector during the run; these topologies exhibit the core features for identification with the NOνA-DDT.]

Realtime Event Filtering

ARTDAQ is a software product, under active development at Fermilab, that provides tools to build programs combining realtime event building and event filtering as part of an integrated DAQ system. The NOνA Data Driven Trigger (NOνA-DDT) demonstrator module uses the high speed event feeding and event filtering components of ARTDAQ to provide a real-world proof-of-concept application that meets the NOνA trigger requirements.

NOνA-DDT takes already-built 5 ms wide raw data windows from shared memory and injects them into the art framework, where trigger decision modules are run and a decision message is broadcast to the NOνA Global Trigger system. The art framework allows standard NOνA reconstruction techniques to be applied to the online trigger decision process, and additionally allows online reconstruction & filtering algorithms to be run in the offline environment. The identical code used in the DAQ can therefore be used in offline simulation and testing to determine selection efficiencies, reconstruction biases, and overall trigger performance.

Data Driven Trigger Buffering

As timeslices of detector data are collected and built, they are written into two distinct buffer pools:

- The Global Trigger pool contains timeslices waiting (for up to 20 s) to be extracted by a global trigger and sent to the data logger.
- The DDT pool is a separate set of SysV shared memory buffers designed to be the input queues for the data driven trigger analysis processes.

The DDT shared memory interface is optimized as a one-writer-many-readers design:

- The shared memory buffer has N "large" (fixed size) segments which the writer rotates through to reduce collisions with reader processes.
- The writing process sets guard indicators before and after the detector data. These indicators allow reader processes to tell whether the data segment being read may have been overwritten during readout.

DAQ Event Buffering & Building

Because the DAQ system performs a 200-to-1 synchronized burst transmission of data between the data concentrator modules (DCMs) and the event builder buffer nodes, a large amount of network & data transmission buffering is required to handle the data rate. The buffering has been tuned by adjusting:

- the size of data packets to match a single complete time window
- the receive window on the buffer nodes, to exploit the buffering capability of the high speed switched fabric (a Cisco 4948 network switch array) between the DCMs and buffer nodes.

[Figure: DAQ topology diagram. 11,160 Front End Boards (~368,000 detector channels, zero suppressed at 6-8 MeV/cell) feed 180 Data Concentrator Modules over COTS 1 Gb/s Ethernet. The ~0.75 GB/s minimum bias stream flows as 5 ms data blocks into 200 buffer nodes (3200 compute cores) hosting the event builder, data slice pointer table, and shared memory DDT event stack. The Global Trigger processor forms a grand trigger OR of beam spill indicators (asynchronous from the trigger broadcast), a 50-91 Hz calibration pulser, and data driven trigger decisions, and directs triggered data to the Data Logger.]
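The guard-indicator scheme described above is essentially a sequence lock: the writer bumps a counter before and after writing, and a reader trusts a segment only if the counter was even and unchanged across its read. Below is a minimal single-process sketch of the idea; the class and field names are illustrative assumptions, not NOνA's actual shared memory layout:

```python
# Sketch of the one-writer-many-readers guard-indicator (seqlock) idea.
# Illustrative only: the real DDT buffers are SysV shared memory segments.

class GuardedSegment:
    def __init__(self):
        self.seq = 0          # guard counter: odd while a write is in progress
        self.payload = None   # the detector data block

    def write(self, data):
        self.seq += 1         # guard "before": mark segment as being written
        self.payload = data
        self.seq += 1         # guard "after": even value = consistent data

    def read(self):
        """Return (data, ok); ok is False if a write may have raced us."""
        start = self.seq
        if start % 2:         # writer mid-update: segment not consistent
            return None, False
        data = self.payload
        return data, self.seq == start  # re-check the guard after reading

seg = GuardedSegment()
seg.write(b"5ms timeslice")
data, ok = seg.read()
print(data, ok)
```

A reader that sees ok == False simply skips the segment and moves on; combined with the writer rotating through N segments, this lets many trigger processes read without ever blocking the writer.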
Read the full paper at:

[Figure: Network (switch) buffering optimization as a function of far detector average DCM data rates and acquisition window time.]

NOνA-DDT Performance

The NOνA-DDT prototype system implements a real-world trigger algorithm designed to identify tracks in the NOνA detector. The algorithm calculates the pairwise intersections of the Hough trajectories for the detector hits and identifies the resulting track candidate peaks in Hough space. This is a real-world example of an N² complexity algorithm, and it is of particular interest due to its ability to identify candidate slow particle tracks (i.e. magnetic monopoles with β > 10⁻⁵) in the far detector.

The NOνA-DDT Hough algorithm, without parallelization, was run on the NOνA near detector DAQ trigger cluster. It reached decisions in 98 ± 14 ms on average, compared to the target goal of 60 ms, while using only 1/16th of the available CPU resources of each cluster node. This initial test shows that the system, after parallelization and optimization, will be able to meet and exceed the NOνA trigger requirements.

[Figure: The ARTDAQ framework overhead for processing 5 ms time windows of NOνA near detector readout; the overhead includes unpacking and formatting of the DAQHit data objects.]

[Figure: Hough transform execution time for 5 ms time windows of NOνA near detector readout; no parallelization was applied to the algorithm.]
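The pairwise Hough idea above can be sketched compactly: every pair of hits defines a line, the (θ, ρ) parameters of that line are histogrammed, and a track shows up as a peak bin with many votes. This is a hedged illustration of the general technique, not NOνA's production code; the bin widths and hit coordinates are made-up values:

```python
import math
from collections import Counter

# Sketch of the N^2 pairwise Hough approach: each pair of hits votes for the
# (theta, rho) line through both hits; a real track gives a peak in the votes.
def hough_peak(hits, theta_bin=0.05, rho_bin=2.0):
    votes = Counter()
    for i in range(len(hits)):              # N^2 loop over hit pairs
        for j in range(i + 1, len(hits)):
            (x1, y1), (x2, y2) = hits[i], hits[j]
            theta = math.atan2(x1 - x2, y2 - y1)  # normal angle of the line
            rho = x1 * math.cos(theta) + y1 * math.sin(theta)
            votes[(round(theta / theta_bin), round(rho / rho_bin))] += 1
    return votes.most_common(1)[0]          # (peak bin, vote count)

# Ten collinear hits (a "track" along y = 2x + 1) plus two noise hits:
track = [(x, 2 * x + 1) for x in range(10)]
noise = [(3, 40), (7, -15)]
peak_bin, n_votes = hough_peak(track + noise)
print(n_votes)   # 45 = C(10,2): all track pairs vote for the same bin
```

The quadratic pair loop is exactly why the algorithm is a useful stress test for the DDT: its cost grows rapidly with hit multiplicity, yet it remains sensitive to arbitrarily slow tracks because it uses only hit positions, not timing.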