
HLT - data compression vs event rejection

Assumptions
– Need for a rudimentary online event reconstruction for monitoring
– Detector readout rate (i.e. TPC) >> DAQ bandwidth ≈ mass storage bandwidth
– Some physics observables require running detectors at maximum rate (e.g. quarkonium spectroscopy: TPC/TRD dielectrons; jets in p+p: TPC tracking)
– Online combination of different detectors can increase the selectivity of triggers (e.g. jet quenching: PHOS/TPC high-pT γ-jet events)

Data volume and event rate

TPC detector:
– data volume = 300 Mbyte/event
– data rate = 200 Hz

Readout chain (diagram): front-end electronics, DAQ – event building, Level-3 system, permanent storage system; quoted bandwidths: 60 Gbyte/sec, 15 Gbyte/sec, < 1.2 Gbyte/sec, < 2 Gbyte/sec
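(Derived from the numbers above rather than quoted on the slide: 300 Mbyte/event × 200 Hz = 60 Gbyte/sec leaving the front-end electronics, i.e. well over an order of magnitude above the Gbyte/sec-level event-building and mass-storage bandwidths; closing this gap is the job of the Level-3/HLT data reduction.)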

HLT tasks

Online (sub)-event reconstruction
– optimization and monitoring of detector performance
– monitoring of trigger selectivity
– fast check of the physics program

Data rate reduction
– data volume reduction: regions-of-interest and partial readout, data compression
– event rate reduction: (sub)-event reconstruction and event rejection

p+p program
– pile-up removal
– charged-particle jet trigger, etc.

Data rate reduction

Volume reduction
– regions-of-interest and partial readout
– data compression: entropy coder, vector quantization, TPC-data modeling

Rate reduction
– (sub)-event reconstruction and event rejection before event building

TPC event (only about 1% is shown)

Regions-of-interest and partial readout

Example: selection of a TPC sector and φ-slice based on a TRD track candidate
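As a rough illustration of the partial-readout idea (not ALICE code): the sketch below maps the azimuth of a hypothetical TRD track candidate onto one of the 18 TPC sectors per side (20° each) and a finer φ-slice inside it. The slice granularity and the function names are assumptions made for this example.

```cpp
// Illustrative region-of-interest selection: pick the TPC sector and a phi-slice
// within it from the azimuth of a TRD track candidate. The slice granularity
// (slicesPerSector) is an assumption, not an HLT parameter.
#include <cmath>
#include <cstdio>

struct RegionOfInterest { int sector; int phiSlice; };

RegionOfInterest selectROI(double phiDeg, int slicesPerSector = 10) {
    phiDeg = std::fmod(phiDeg + 360.0, 360.0);        // wrap into [0, 360)
    int sector   = static_cast<int>(phiDeg / 20.0);   // 18 sectors of 20 degrees
    double local = phiDeg - sector * 20.0;            // azimuth inside the sector
    int slice    = static_cast<int>(local / (20.0 / slicesPerSector));
    return {sector, slice};
}

int main() {
    RegionOfInterest roi = selectROI(137.3);          // phi of a TRD track candidate
    std::printf("read out TPC sector %d, phi-slice %d\n", roi.sector, roi.phiSlice);
    return 0;
}
```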

Data compression: entropy coder

Variable-length coding: short codes for frequent values, long codes for infrequent values.

[Figure: probability distribution of 8-bit TPC data]

Results (Arne Wiebalck, diploma thesis, Heidelberg):
– NA49: compressed event size = 72%
– ALICE: compressed event size = 65%
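The variable-length coding step can be illustrated with a standard Huffman coder over the 8-bit ADC spectrum. This is a generic sketch, not the coder evaluated in the thesis quoted above, and the toy ADC values are invented.

```cpp
// Minimal Huffman-style variable-length coder for 8-bit ADC values (sketch).
#include <cstdint>
#include <cstdio>
#include <queue>
#include <string>
#include <vector>

struct Node { uint64_t freq; int symbol; Node* left; Node* right; };
struct Cmp  { bool operator()(const Node* a, const Node* b) const { return a->freq > b->freq; } };

// Frequent values get short codes, infrequent values get long codes.
static void assignCodes(const Node* n, const std::string& prefix,
                        std::vector<std::string>& codes) {
    if (!n) return;
    if (n->symbol >= 0) { codes[n->symbol] = prefix.empty() ? "0" : prefix; return; }
    assignCodes(n->left,  prefix + "0", codes);
    assignCodes(n->right, prefix + "1", codes);
}

std::vector<std::string> buildCodes(const std::vector<uint64_t>& freq) {
    std::priority_queue<Node*, std::vector<Node*>, Cmp> pq;
    for (int s = 0; s < 256; ++s)
        if (freq[s]) pq.push(new Node{freq[s], s, nullptr, nullptr});
    while (pq.size() > 1) {
        Node* a = pq.top(); pq.pop();
        Node* b = pq.top(); pq.pop();
        pq.push(new Node{a->freq + b->freq, -1, a, b});
    }
    std::vector<std::string> codes(256);
    if (!pq.empty()) assignCodes(pq.top(), "", codes);
    return codes;   // (tree nodes intentionally leaked in this sketch)
}

int main() {
    // Toy ADC spectrum: small amplitudes dominate, as in zero-suppressed TPC data.
    std::vector<uint8_t> adc = {0, 1, 0, 2, 1, 0, 5, 1, 0, 0, 3, 1, 0, 0, 2, 120};
    std::vector<uint64_t> freq(256, 0);
    for (uint8_t v : adc) ++freq[v];

    auto codes = buildCodes(freq);
    size_t bits = 0;
    for (uint8_t v : adc) bits += codes[v].size();
    std::printf("compressed: %zu bits vs %zu bits raw (%.0f%%)\n",
                bits, adc.size() * 8, 100.0 * bits / (adc.size() * 8));
    return 0;
}
```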

Data compression: vector quantization

– Sequence of ADC values on a pad = vector
– Vector quantization = transformation of vectors into codebook entries (each vector is compared against the codebook and replaced by the index of the closest entry)
– Quantization error: distance between the original vector and the selected codebook entry

Results (Arne Wiebalck, diploma thesis, Heidelberg):
– NA49: compressed event size = 29%
– ALICE: compressed event size = 48%–64%
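A minimal sketch of the quantization step, assuming a pad is represented by a fixed-length vector of ADC samples. The codebook, the vector length and the pulse shapes below are invented; a real codebook would be trained (e.g. with an LBG/k-means procedure) on measured data. The quantization error is taken here as the Euclidean distance between the data vector and the chosen codebook entry.

```cpp
// Vector-quantization sketch: replace a pad's ADC-sample vector by the index
// of the nearest codebook entry (all values illustrative).
#include <array>
#include <cmath>
#include <cstdio>
#include <utility>
#include <vector>

constexpr int kLen = 8;                       // samples per pad (assumption)
using Vec = std::array<float, kLen>;

// Squared Euclidean distance between a data vector and a codebook entry.
float dist2(const Vec& a, const Vec& b) {
    float d = 0.f;
    for (int i = 0; i < kLen; ++i) d += (a[i] - b[i]) * (a[i] - b[i]);
    return d;
}

// Returns the index of the best codebook entry and the quantization error
// ||v - c_best|| for this vector.
std::pair<int, float> quantize(const Vec& v, const std::vector<Vec>& codebook) {
    int best = 0;
    float bestD = dist2(v, codebook[0]);
    for (size_t k = 1; k < codebook.size(); ++k) {
        float d = dist2(v, codebook[k]);
        if (d < bestD) { bestD = d; best = static_cast<int>(k); }
    }
    return {best, std::sqrt(bestD)};
}

int main() {
    // Toy codebook of typical pulse shapes (invented).
    std::vector<Vec> codebook = {
        {0, 0, 0, 0, 0, 0, 0, 0},
        {2, 8, 20, 35, 30, 15, 6, 2},
        {1, 4, 10, 18, 16, 8, 3, 1},
    };
    Vec pad = {1, 5, 12, 20, 17, 9, 3, 1};    // measured ADC sequence on one pad
    auto [index, err] = quantize(pad, codebook);
    std::printf("codebook entry %d, quantization error %.2f\n", index, err);
    return 0;
}
```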

Data compression: TPC-data modeling

Fast local pattern recognition: simple local track model (e.g. helix) → local track parameters

Track and cluster modeling: analytical cluster model, comparison to raw data, quantization of the deviations from the track and cluster model

Result: NA49 compressed event size = 7%
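The data-modeling compression can be pictured as follows: fit a simple local track model, predict each cluster position, and store only coarsely quantized deviations from the prediction. The sketch below uses a straight-line model and an invented quantization step purely for illustration; the scheme described above uses a helix track model and an analytical cluster shape.

```cpp
// Sketch of the data-modeling idea: store local track parameters plus coarsely
// quantized deviations of each cluster from the model prediction, instead of
// the raw data. All numbers and step sizes here are illustrative.
#include <cmath>
#include <cstdint>
#include <cstdio>
#include <vector>

struct Cluster { float pad; float time; };     // measured cluster centroid

// Simplified local track model: straight line in (padrow, pad).
struct TrackModel {
    float pad0, slope;
    float predictPad(int row) const { return pad0 + slope * row; }
};

// Quantize a residual with a fixed step; only this code is stored/transmitted.
int8_t quantizeResidual(float residual, float step) {
    return static_cast<int8_t>(std::lround(residual / step));
}

int main() {
    const float step = 0.05f;                  // quantization step in pad units (assumption)
    TrackModel model{10.0f, 0.30f};
    std::vector<Cluster> clusters = {
        {10.02f, 0}, {10.31f, 0}, {10.63f, 0}, {10.89f, 0}};

    for (int row = 0; row < static_cast<int>(clusters.size()); ++row) {
        float residual = clusters[row].pad - model.predictPad(row);
        int8_t code    = quantizeResidual(residual, step);
        float decoded  = model.predictPad(row) + code * step;
        std::printf("row %d: residual %+.3f -> code %+d -> decoded pad %.3f\n",
                    row, residual, code, decoded);
    }
    return 0;
}
```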

Fast pattern recognition

Essential part of the Level-3 system:
– crude complete event reconstruction → monitoring
– redundant local tracklet finder for cluster evaluation → efficient data compression
– selection of (η, φ, pT)-slices → ROI
– high-precision tracking for selected track candidates → jets, dielectrons, ...

Fast pattern recognition

Sequential approach
– cluster finder, vertex finder and track follower (STAR code adapted to the ALICE TPC)
– reconstruction efficiency
– timing results

Iterative feature extraction
– tracklet finder on raw data and cluster evaluation

Hough transform

Fast cluster finder (1): timing 5 ms per padrow
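For illustration, a minimal one-dimensional centre-of-gravity cluster finder of the kind timed above might look like the sketch below; the real HLT cluster finder works on two-dimensional pad-time data and handles cluster deconvolution, which this sketch omits.

```cpp
// Minimal centre-of-gravity cluster finder for one padrow (illustrative only).
#include <cstdio>
#include <vector>

struct RawDigit { int pad; int time; int charge; };   // zero-suppressed ADC sample
struct Cluster  { float pad; float time; int charge; };

// Group consecutive above-threshold digits and compute the charge-weighted
// centroid; a gap (below-threshold digit) or the end of the row closes a cluster.
std::vector<Cluster> findClusters(const std::vector<RawDigit>& digits, int threshold) {
    std::vector<Cluster> clusters;
    float sumQ = 0, sumPad = 0, sumTime = 0;
    for (size_t i = 0; i < digits.size(); ++i) {
        const RawDigit& d = digits[i];
        if (d.charge > threshold) {
            sumQ    += d.charge;
            sumPad  += d.charge * d.pad;
            sumTime += d.charge * d.time;
        }
        bool endOfCluster = (d.charge <= threshold) || (i + 1 == digits.size());
        if (endOfCluster && sumQ > 0) {
            clusters.push_back({sumPad / sumQ, sumTime / sumQ, static_cast<int>(sumQ)});
            sumQ = sumPad = sumTime = 0;
        }
    }
    return clusters;
}

int main() {
    std::vector<RawDigit> row = {
        {10, 100, 4}, {10, 101, 20}, {10, 102, 35}, {10, 103, 12}, {10, 104, 2},
        {42, 200, 6}, {42, 201, 18}, {42, 202, 9}};
    for (const Cluster& c : findClusters(row, 3))
        std::printf("cluster: pad %.2f  time %.2f  charge %d\n", c.pad, c.time, c.charge);
    return 0;
}
```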

Fast cluster finder (2)

Fast cluster finder (3): efficiency, compared to offline efficiency

Fast vertex finder: resolution; timing result: 19 ms on ALPHA (667 MHz)

Fast track finder: tracking efficiency

Fast track finder: timing results

Hough transform (1): data flow

Hough transform (2): η-slices

Hough transform (3): transformation and maxima search
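A compact illustration of the transformation and maxima search for one η-slice, assuming tracks from the primary vertex: each space point (r, φ) is mapped onto a curve in the (φ0, κ) parameter plane via κ = 2 sin(φ − φ0)/r, and peaks in the accumulator are track candidates. The bin counts, ranges and the toy track below are assumptions, not HLT parameters.

```cpp
// Minimal Hough-transform sketch for one eta-slice (illustrative parameters).
// A circle through the origin with curvature kappa and emission angle phi0
// satisfies kappa = 2 * sin(phi - phi0) / r, so each space point becomes a
// curve in (phi0, kappa) space; accumulator maxima are track candidates.
#include <cmath>
#include <cstdio>
#include <vector>

constexpr int   kNPhi0    = 90;           // phi0 bins (assumption)
constexpr int   kNKappa   = 100;          // curvature bins (assumption)
constexpr float kKappaMax = 0.02f;        // 1/cm, illustrative
constexpr float kTwoPi    = 6.2831853f;

struct Point { float r; float phi; };     // space point in one eta-slice

int main() {
    // Toy data: points on one circle through the origin
    // (kappa = 0.005/cm, phi0 = 0.3491 rad, chosen near a phi0 bin centre).
    std::vector<Point> points;
    for (float r = 90.f; r < 250.f; r += 20.f)
        points.push_back({r, 0.3491f + std::asin(0.5f * 0.005f * r)});

    // Fill the (phi0, kappa) accumulator: one curve per space point.
    std::vector<int> acc(kNPhi0 * kNKappa, 0);
    for (const Point& p : points) {
        for (int i = 0; i < kNPhi0; ++i) {
            float phi0  = i * (kTwoPi / kNPhi0);
            float kappa = 2.f * std::sin(p.phi - phi0) / p.r;
            int j = static_cast<int>((kappa + kKappaMax) / (2.f * kKappaMax) * kNKappa);
            if (j >= 0 && j < kNKappa) ++acc[i * kNKappa + j];
        }
    }

    // Maxima search: the most populated bin is the best track candidate.
    int best = 0;
    for (int b = 1; b < kNPhi0 * kNKappa; ++b)
        if (acc[b] > acc[best]) best = b;
    std::printf("best candidate: phi0 bin %d, kappa bin %d, %d entries\n",
                best / kNKappa, best % kNKappa, acc[best]);
    return 0;
}
```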

Level-3 system architecture (diagram)
– inputs: TPC sector #1 ... TPC sector #36, TRD, ITS, XYZ (other detectors)
– local processing: subsector/sector level
– global processing I (2×18 sectors)
– global processing II (detector merging)
– global processing III (event reconstruction)
– outputs/functions: ROI, data compression, event rejection, monitoring, Level-3 trigger, momentum filter

TPC on-line tracking

Assumptions:
– Bergen fast tracker
– DEC Alpha, 667 MHz
– fast cluster finder, excluding cluster deconvolution

Note: this cluster finder is sub-optimal for the inner sectors, and additional work is required there. To obtain an estimate, the computation requirements were based on the outer padrows; the deconvolution that may be necessary in the inner padrows could require considerably more CPU cycles.

TPC Level-3 tracking estimate (assuming one ideal processor; the data may not include realistic noise, and tracking is to first order linear in the number of tracks, provided there are few overlaps):
– cluster finder on one padrow of the outer sector: 5 ms
– tracking of all (Monte Carlo) space points for one TPC sector: 600 ms
– cluster finder on one sector (145 padrows): 725 ms
– processing of one complete sector: 1.325 s
– processing of the complete TPC: 47.7 s

Scaling to the full system:
– running at the maximum TPC rate (200 Hz) sets the required number of CPUs
– roughly 20% overhead is added for parallel computation, network transfer, additional inner-sector work, sector merging, etc.
– Moore's law (60%/a), with commissioning in 2006 minus one year, gives about a factor ×10 in CPU performance, reducing the required CPU count accordingly
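For reference, the arithmetic of this estimate can be reproduced directly from the quoted timings; the sketch below does exactly that. The absolute CPU counts it prints are derived here from those numbers, not taken from the slide, and the 20% overhead factor follows the text above.

```cpp
// Back-of-the-envelope reproduction of the CPU estimate from the quoted timings.
#include <cstdio>

int main() {
    const double clusterFinderPerPadrow = 0.005;   // s (5 ms, outer-sector padrow)
    const int    padrowsPerSector       = 145;
    const double trackingPerSector      = 0.600;   // s
    const double eventRate              = 200.0;   // Hz (maximum TPC rate)

    double clusterPerSector = clusterFinderPerPadrow * padrowsPerSector;   // 0.725 s
    double sectorTotal      = clusterPerSector + trackingPerSector;        // 1.325 s
    double tpcTotal         = 36.0 * sectorTotal;                          // 47.7 s

    double cpus         = tpcTotal * eventRate;    // CPU-seconds needed per second
    double withOverhead = cpus * 1.20;             // +20% overhead, as in the text

    std::printf("per sector: %.3f s, full TPC: %.1f s\n", sectorTotal, tpcTotal);
    std::printf("CPUs at 200 Hz: %.0f (%.0f with 20%% overhead)\n", cpus, withOverhead);
    return 0;
}
```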