Status and Performance of the ALICE Trigger Electronics


Status and Performance of the ALICE Trigger Electronics
David Evans, The University of Birmingham, UK

School of Physics and Astronomy, The University of Birmingham, Birmingham, UK:
L. Barnby, M. Bombara, D. Evans, G.T. Jones, P.G. Jones, P. Jovanovic, A. Jusko, R. Kour, M. Krivda, C. Lazzeroni, R. Lietava, Z. Matthews, S. Navin, A. Palaha, P. Petrov, O. Villalobos-Baillie

The Institute of Experimental Physics of the Slovak Academy of Sciences, Košice, Slovakia:
I. Králik, L. Šándor

P.J. Šafárik University, Faculty of Science, Košice, Slovakia:
J. Urbán

Layout of Talk
- Introduction
  - ALICE physics
  - Running conditions
- Overview of the Central Trigger Processor (CTP)
  - Trigger classes
  - Sub-detector clusters
  - Past-future protection
- The Local Trigger Unit
  - Emulation of the CTP
  - Error generation
- Current status of the project

ALICE Physics
- Study of the strong interaction (QCD) at LHC energies, in particular the quark-gluon plasma (QGP) and its properties
- Collide Pb-Pb at LHC energies to form the QGP
- Also collide p-p as a reference; in addition, ALICE has an extensive p-p physics programme

Running Conditions
- 1 month per year: lead-lead collisions at 5.5 TeV per nucleon pair (1.2 PeV per Pb-Pb collision)
  - L ~ 10^27 cm^-2 s^-1
  - Interaction rate ~ 8 kHz
  - Event size ~ 86 MB
- 7 months per year: proton-proton collisions at 7 TeV
  - L ~ 2x10^30 cm^-2 s^-1
  - Interaction rate ~ 200 kHz
  - Event size ~ 2.5 MB
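
As a rough arithmetic cross-check of the rates quoted above, the interaction rate is luminosity times the inelastic cross-section. The sketch below is illustrative only; the cross-section values (~8 b for Pb-Pb, ~70 mb for p-p) are approximate assumptions, not figures from the slide.

```python
# Rough cross-check: rate = luminosity x inelastic cross-section.
# The cross-sections below are assumed approximate values for illustration.

BARN_CM2 = 1e-24  # 1 barn in cm^2

def interaction_rate_hz(luminosity_cm2_s: float, sigma_barn: float) -> float:
    """Interaction rate (Hz) from luminosity (cm^-2 s^-1) and cross-section (barn)."""
    return luminosity_cm2_s * sigma_barn * BARN_CM2

# Pb-Pb: L ~ 1e27 cm^-2 s^-1 with sigma_inel ~ 8 b gives ~8 kHz, as quoted.
print(interaction_rate_hz(1e27, 8.0))    # 8000.0

# p-p: with sigma_inel ~ 70 mb, a luminosity of a few 1e30 cm^-2 s^-1
# reproduces the ~200 kHz interaction rate quoted above.
print(interaction_rate_hz(3e30, 0.07))   # 210000.0
```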

The ALICE Experiment
- Size: 16 x 26 metres
- Weight: 10,000 tonnes
- Detectors: 18
[Figure: view of the ALICE detector with the location of the Central Trigger Processor indicated]

Highlights of the ALICE CTP
- Three levels of hierarchical hardware triggers: L0 (1.2 µs after the beam interaction) → L1 (6.5 µs) → L2 (88 µs)
- At any time, the 24 ALICE sub-detectors are dynamically partitioned into up to 6 clusters
  - Cluster configuration is arbitrary and fully programmable: clusters could be exclusive, but are more likely to overlap
- Past-future protection logic selects events with either no pile-up, or a number of pile-up interactions up to a programmable limit
  - Independent protection for each cluster; operates on all three trigger levels
- High data traffic over the TTC system
  - Channel A: L1 signal
  - Channel B: Orbit, Pre-pulse; in addition, for each trigger: L1 Data Message (5 16-bit words), RoI Message (4 words), L2a Message (8 words)
- Upgrades:
  - L0 can be transmitted over the TTC (as well as over the LVDS cable)
  - The status of the trigger inputs is now transmitted to the DAQ at the time of the L0 trigger
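
A minimal sketch (not ALICE code) summarising the trigger-level latencies and the per-trigger TTC channel-B traffic listed above; the assumption that the RoI and L2a messages also use 16-bit words follows the stated L1 Data Message format.

```python
# Trigger levels and per-trigger TTC channel-B payload (illustrative summary).

LEVEL_LATENCY_US = {"L0": 1.2, "L1": 6.5, "L2": 88.0}   # after the beam interaction

MESSAGE_WORDS = {"L1 Data Message": 5, "RoI Message": 4, "L2a Message": 8}
WORD_BITS = 16  # assumed common word width, taken from the L1 Data Message format

bits_per_trigger = sum(MESSAGE_WORDS.values()) * WORD_BITS
print(f"Channel-B message payload per accepted trigger: {bits_per_trigger} bits")  # 272
```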

Position of the TTC crate in ALICE
- The racks are located below the di-muon magnet in the cavern
  - short latency for L0 (1.2 µs), but...
  - stray magnetic field
  - radiation
  - no access when beam is on
- The TTC crate supplies the LHC clock directly to the CTP and to the Time-Of-Flight detector
- The CTP sends the L0, L1 and L2 signals and messages to the sub-detectors via individual TTC partitions

Context diagram of the Central Trigger Processor (CTP)
- CTP inputs
  - LHC timing: BC, Orbit
  - 60 trigger inputs: 24 at L0, 24 at L1, 12 at L2
  - 24 BUSY inputs
- CTP outputs
  - 24 independent sets, 7 outputs per sub-detector, 168 signals in total
- CTP readout
  - Trigger data for events accepted at the L2 level
  - Interaction Record
- CTP interfaces: ECS, DAQ, RoIP
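
A quick consistency check of the input/output counts above (illustrative only, not ALICE software):

```python
# CTP input/output bookkeeping from the context diagram.

trigger_inputs = {"L0": 24, "L1": 24, "L2": 12}
assert sum(trigger_inputs.values()) == 60     # 60 trigger inputs in total

busy_inputs = 24                              # one BUSY input per sub-detector

output_sets, outputs_per_set = 24, 7
assert output_sets * outputs_per_set == 168   # 168 CTP output signals in total
```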

Block diagram of the CTP
- Synchronous, pipelined processor driven by the 40.08 MHz bunch-crossing clock (BC)
- Modularity and scalability: logic blocks designed as individual VME boards
- 6U form factor, 8 PCB layers, moderate density
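
Since the processor is pipelined at the bunch-crossing frequency, the latencies quoted earlier can be expressed in clock cycles. The sketch below derives the cycle counts for illustration only; they are not taken from the CTP design documents.

```python
# Convert trigger latencies into cycles of the 40.08 MHz bunch-crossing clock.

BC_CLOCK_HZ = 40.08e6
BC_PERIOD_NS = 1e9 / BC_CLOCK_HZ                 # ~24.95 ns per bunch crossing

for level, latency_us in {"L0": 1.2, "L1": 6.5, "L2": 88.0}.items():
    cycles = latency_us * 1000.0 / BC_PERIOD_NS
    print(f"{level}: {latency_us} us ~ {cycles:.0f} BC clock cycles")
```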

CTP boards in a VME crate
- Front panel connections: timing inputs, trigger inputs, BUSY inputs, CTP outputs, interface links
- Internal connections: custom backplane

L0 Processor board (layout example)
- VME interface, VME controller FPGA, flash memory
- Trigger input LVDS receivers
- L0 logic FPGA
- ADC for phase measurement
- Snap-shot memory (1M x 32)
- Backplane LVDS transceivers
- ADC for supply monitoring (I2C)

CTP board gallery
- L0 processor
- BUSY processor
- FAN-OUT board
- ... in a VME crate

Trigger (Physics) Classes
- The trigger class is the basic processing structure throughout the CTP logic
- There are 50 independently programmable "physics" classes
- An additional test class is software-triggered and configured "on the fly" (applications: calibration trigger, etc.)
- "Rules of engagement" (see the sketch below):
  - A cluster can be associated with an arbitrary number of trigger classes
  - A trigger class, on the other hand, affects only a single cluster
  - The associations are programmable
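
A minimal sketch (not ALICE software) of the rules of engagement above: every trigger class is tied to exactly one cluster, while a cluster may collect any number of classes. The class and cluster names are hypothetical.

```python
# Class-to-cluster association: one cluster per class, many classes per cluster.

class_to_cluster = {
    "CLASS_MB":   "cluster_central",   # hypothetical names, for illustration only
    "CLASS_MUON": "cluster_muon",
    "CLASS_PHOS": "cluster_central",
}

def classes_for_cluster(cluster: str) -> list[str]:
    """All trigger classes associated with the given cluster."""
    return [cls for cls, clu in class_to_cluster.items() if clu == cluster]

# Each class appears once as a key, so it can only ever drive a single cluster,
# whereas a cluster can be shared by many classes:
print(classes_for_cluster("cluster_central"))   # ['CLASS_MB', 'CLASS_PHOS']
```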

Generation of the Class L0 Trigger
- Trigger input selection: fully programmable
- Shared resources:
  - 2 scaled-down BCs (scaling factor 1 – 10^9)
  - 2 random triggers (1 – 10^9)
  - 4 BC masks (1 bit per bunch)
- Reduction of the class-trigger rate: trigger pre-scalers (1 – 10^6)
- Cluster selection: cluster BUSY (1 out of 6)
- Mandatory global vetoes:
  - DAQ BUSY (trigger enable)
  - CTP BUSY (CTP readout)
  - CTP dead time (1.5 µs)*
- All/Rare: boosts the acquisition of rare events
(*) In practice the CTP dead time has no effect on trigger efficiency.
A simplified sketch of the per-class L0 decision is given below.
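
The sketch below is highly simplified pseudologic, not the CTP firmware: it combines the programmable input selection, BC mask, pre-scaler, cluster BUSY and the mandatory global vetoes into one class decision. The scaled-down BCs, random triggers and All/Rare logic are omitted for brevity, and all names and widths are illustrative.

```python
class ClassL0:
    """Illustrative per-class L0 decision (not the real CTP logic)."""

    def __init__(self, input_mask, bc_mask, prescaler, cluster):
        self.input_mask = input_mask   # required subset of the 24 L0 trigger inputs
        self.bc_mask = bc_mask         # 1 bit per bunch: selected bunch crossings
        self.prescaler = prescaler     # keep 1 out of N candidates (N = 1 .. 10^6)
        self.cluster = cluster         # the single cluster this class belongs to
        self._candidates = 0

    def decide(self, l0_inputs, bunch, cluster_busy, daq_busy, ctp_busy, dead_time):
        if (l0_inputs & self.input_mask) != self.input_mask:
            return False                        # programmed input condition not met
        if not self.bc_mask[bunch]:
            return False                        # bunch not selected by the BC mask
        self._candidates += 1
        if self._candidates % self.prescaler:   # pre-scaler: keep every N-th candidate
            return False
        if cluster_busy[self.cluster]:          # cluster BUSY veto
            return False
        return not (daq_busy or ctp_busy or dead_time)   # mandatory global vetoes

# Example: a class requiring L0 input 0, on all bunches, no pre-scaling, cluster 0.
cls = ClassL0(input_mask=0b1, bc_mask=[True] * 3564, prescaler=1, cluster=0)
print(cls.decide(l0_inputs=0b1, bunch=100, cluster_busy=[False] * 6,
                 daq_busy=False, ctp_busy=False, dead_time=False))   # True
```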

Past-future Protection circuit
- 4 independently programmable circuits at each trigger level (+1 for the test class)
- 2 identical blocks, based on dual-port memory
- Sliding time window during which the interaction signal (INTa/b) is counted
- Programmable parameters:
  - protection interval (ΔTa/b)
  - 2 thresholds (THa1/2, THb1/2)
  - output delay (a/b)
  - output logic function
- Delay and alignment of the output signals
A minimal sketch of the sliding-window counting follows below.
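
The sketch below shows only the counting-against-threshold idea, assuming a window measured in bunch crossings; the delayed output that lets the window cover both past and future interactions, and the dual-memory (a/b) structure, are omitted. Parameter names are illustrative.

```python
from collections import deque

class PastFutureProtection:
    """Illustrative sliding-window pile-up counter (not the real circuit)."""

    def __init__(self, window_bcs: int, threshold: int):
        self.threshold = threshold                  # max. allowed interactions in the window
        self.history = deque(maxlen=window_bcs)     # one entry per bunch crossing

    def clock(self, interaction: bool) -> None:
        """Call once per bunch crossing with the interaction signal (INTa/b)."""
        self.history.append(1 if interaction else 0)

    def ok(self) -> bool:
        """True if the pile-up count inside the protection interval is acceptable."""
        return sum(self.history) <= self.threshold

# Example: a window of 10 bunch crossings, allowing at most 1 interaction.
pfp = PastFutureProtection(window_bcs=10, threshold=1)
for bc in range(10):
    pfp.clock(interaction=(bc == 3))
print(pfp.ok())   # True: a single interaction in the window
```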

Local Trigger Unit (LTU)
- Uniform interface between the CTP and the sub-detectors:
  - easier control
  - easier modifications/upgrades
- Unique features:
  - full CTP emulation (stand-alone mode)
  - error emulation (front-end tests)
- VME, 6U form factor; similar to the other CTP boards

Context diagram of the LTU
- Front panel connections:
  - inputs from the CTP (LVDS)
  - outputs to TTCvi, TTCex
  - L0, BUSY – sub-detector
- The TTCvi functionality is now incorporated in the upgraded LTU firmware (LTUvi), so the TTCvi is no longer used

Block diagram of the LTU
- LTU modes:
  - Global (run) mode: propagates the CTP signals
  - Stand-alone mode: provides full CTP emulation

Switchboard
- The CTP was designed to handle up to 24 simultaneous L0 inputs; this covered all known L0 inputs and left 6 spares at the time of construction.
- There has been an explosion of possible L0 inputs in the last 2 years: 18 → more than 40 (although not all are needed in a single run).
- Solution (recent hardware upgrade): build a Switchboard from programmable fan-in/fan-out boards.
  - 25 L0 inputs can be chosen from 50 at any time.
  - All 50 switchboard inputs can be monitored, even if not included in the trigger.
A simplified sketch of the input selection and monitoring is given below.
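
An illustrative sketch (not the switchboard firmware) of the behaviour described above: all 50 candidate L0 inputs are counted for monitoring, but only a programmable selection of 25 is forwarded to the CTP.

```python
N_CANDIDATES = 50
N_FORWARDED = 25

scalers = [0] * N_CANDIDATES                  # every candidate input is monitored
selection = list(range(N_FORWARDED))          # programmable choice of 25 input indices

def switchboard(inputs: list[bool]) -> list[bool]:
    """inputs: the 50 candidate L0 signals for one bunch crossing."""
    for i, fired in enumerate(inputs):
        scalers[i] += fired                   # count all inputs, selected or not
    return [inputs[i] for i in selection]     # forward only the selected 25 to the CTP

# Example: only candidate input 3 fires in this bunch crossing.
out = switchboard([i == 3 for i in range(N_CANDIDATES)])
print(sum(out), scalers[3])   # 1 1
```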

CTP - September 2008
- CTP in stable operation with up to 12 sub-detectors in a single cluster and with multiple clusters (i.e. several hours at a time of stable running)
- During the brief period of circulating beams:
  - Timing measurements for the trigger inputs were made and corrected in software
  - Triggers on the pixel detector (SPD) and on the trigger detectors; also triggers on bunch crossings
[Figure: trigger timing (before alignment) versus bunch number, single shot, for the SPD, V0, beam-pickup (BPTX) and T0 triggers]

Summary
- The ALICE trigger electronics have been installed and successfully commissioned with all major sub-detectors and control systems.
- Several upgrades to the hardware, firmware and software have been made since installation.
- The ALICE trigger system provided stable operation during several months of cosmic-ray running and during the first beam from the LHC in September 2008.
- We are ready and looking forward to first collisions later this year.
Many thanks to the RT2009 organisers, and thank you for listening.