
1 High Level Triggering Fred Wickens

2 High Level Triggering (HLT)
Introduction to triggering and HLT systems
–What is triggering
–What is high level triggering
–Why do we need it
Case study of the ATLAS HLT (+ some comparisons with other experiments)
Summary

3 Simple trigger for spark chamber set-up

4 Dead time
The experiment is frozen from the trigger until the end of readout:
–Trigger rate with no dead time = R per sec
–Dead time per trigger = τ sec
–1 second of live time takes 1 + Rτ seconds
–Live time fraction = 1/(1 + Rτ)
–Real trigger rate = R/(1 + Rτ) per sec
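
A quick numerical check of these formulas; a minimal sketch in which the 10 kHz raw rate and 100 μs dead time are illustrative values, not numbers from the slides:

```python
# Dead-time arithmetic from the slide: live fraction = 1/(1 + R*tau).
def live_fraction(rate_hz, dead_time_s):
    """Fraction of wall-clock time the experiment is live."""
    return 1.0 / (1.0 + rate_hz * dead_time_s)

def real_trigger_rate(rate_hz, dead_time_s):
    """Recorded trigger rate once dead time is accounted for."""
    return rate_hz * live_fraction(rate_hz, dead_time_s)

R, tau = 10_000.0, 100e-6  # illustrative: 10 kHz raw rate, 100 us per trigger
print(live_fraction(R, tau))      # 0.5 -> half the beam time is lost
print(real_trigger_rate(R, tau))  # 5000.0 Hz actually recorded
```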

5 Trigger systems: 1980s and 90s
Bigger experiments → more data per event; higher luminosities → more triggers per second
–both led to increased fractional dead time
Use multi-level triggers to reduce dead time:
–first level: fast detectors, fast algorithms
–higher levels can use data from slower detectors, and more complex algorithms, to obtain better event selection/background rejection

6 Trigger systems: 1990s and 2000s
Dead time was not the only problem. Experiments focussed on rarer processes:
–Need large statistics of these rare events
–But increasingly difficult to select the interesting events
–DAQ system (and off-line analysis capability) under increasing strain, limiting the useful event statistics
This is a major issue at hadron colliders, but will also be significant at the ILC
Use the High Level Trigger to reduce the requirements for:
–The DAQ system
–Off-line data storage and off-line analysis

7 Summary of ATLAS data flow rates
From detectors: > 10^14 bytes/sec
After Level-1 accept: ~10^11 bytes/sec
Into event builder: ~10^9 bytes/sec
Onto permanent storage: ~10^8 bytes/sec → ~10^15 bytes/year
(~10^8 bytes/sec sustained over the ~10^7 seconds of running in a typical accelerator year gives ~10^15 bytes/year)

8 TDAQ Comparisons

9 The evolution of DAQ systems

10 Typical architecture 2000+

11 Level 1 (sometimes called Level-0, e.g. at LHCb)
Time: one to a very few microseconds
Standard electronics modules for small systems; dedicated logic for larger systems:
–ASIC: Application Specific Integrated Circuits
–FPGA: Field Programmable Gate Arrays
Reduced granularity and precision:
–calorimeter energy sums
–tracking by masks
Event data stored in front-end electronics (at the LHC a pipeline is used, as the time between collisions is shorter than the Level-1 decision time)
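
The front-end pipeline can be pictured as a fixed-depth FIFO clocked at the bunch-crossing rate; a minimal sketch, in which the class name and depth are illustrative (100 slots × 25 ns ≈ the 2.5 μs ATLAS latency):

```python
from collections import deque

class FrontEndPipeline:
    """Fixed-depth FIFO clocked at the bunch-crossing rate.

    Data shifts one slot per crossing and is lost off the far end
    unless a Level-1 accept arrives within the pipeline latency.
    """
    def __init__(self, depth=100):
        self.buf = deque(maxlen=depth)  # oldest entry falls out when full

    def clock(self, crossing_data):
        dropped = self.buf[0] if len(self.buf) == self.buf.maxlen else None
        self.buf.append(crossing_data)
        return dropped  # data lost for ever if not accepted by now

    def level1_accept(self, crossings_ago):
        """Retrieve data for an accepted crossing still inside the pipeline."""
        if crossings_ago >= len(self.buf):
            raise LookupError("L1 decision too late: data has dropped out")
        return self.buf[-1 - crossings_ago]
```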

12 Level 2
1) Tens of microseconds (10-100 μs)
–hardwired, fixed algorithm, adjustable parameters
2) Few milliseconds (1-100 ms)
–dedicated microprocessors, adjustable algorithms: 3-D, fine-grain calorimetry; tracking; matching; topology
–different sub-detectors handled in parallel
Primitives from each detector may be combined in a global trigger processor or passed to the next level

13 Level 2 (cont'd)
3) Tens of milliseconds (10-100 ms), c. 2008
–Processor farm with Linux PCs
–Partial events received over a high-speed network
–Specialised algorithms
–Each event allocated to a single processor, with a large farm of processors to handle the rate
–If there is a separate Level 2, data from each event is stored in many parallel buffers (each dedicated to a small part of the detector)

14 Level 3
Milliseconds to seconds
Processor farm:
–microprocessors/emulators/workstations
–now standard server PCs
Full or partial event reconstruction:
–after event building (collection of all data from all detectors)
Each event allocated to a single processor, with a large farm of processors to handle the rate

15 Summary of introduction
For many physics analyses the aim is to obtain as high statistics as possible for a given process
–We cannot afford to handle or store all of the data a detector can produce!
What does the trigger do?
–selects the most interesting events from the myriad of events seen, i.e. obtains better use of the limited output bandwidth
Throw away less interesting events; keep all of the good events (or as many as possible)
–But note: must get it right - any good events thrown away are lost for ever!
The high level trigger allows much more complex selection algorithms

16 Case study of the ATLAS HLT system
Concentrate on issues relevant for ATLAS (CMS has very similar issues), but try to address some more general points

17 Starting points for any HLT system
Physics programme for the experiment
–what are you trying to measure
Accelerator parameters
–what rates and bunch structures
Detector and trigger performance
–what data is available
–what trigger resources do we have to use it

18 Physics at the LHC
Interesting events are buried in a sea of soft interactions
[Cross-section plot: high-energy QCD jet production dominates over B physics, top physics and Higgs production]

19 The LHC and ATLAS/CMS
The LHC has:
–design luminosity 10^34 cm^-2 s^-1 (in 2008 from 10^30 to 10^32?)
–bunch separation 25 ns (bunch length ~1 ns)
This results in:
–~23 interactions / bunch crossing
–~80 charged particles (mainly soft pions) / interaction
–~2000 charged particles / bunch crossing
Total interaction rate: 10^9 per sec
–b-physics: fraction ~10^-3 → 10^6 per sec
–t-physics: fraction ~10^-8 → 10 per sec
–Higgs: fraction ~10^-11 → 10^-2 per sec
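
The per-process rates are simply the total interaction rate multiplied by each production fraction; a one-line check using the numbers on the slide:

```python
total_rate_hz = 1e9  # total interaction rate at design luminosity
fractions = {"b-physics": 1e-3, "t-physics": 1e-8, "Higgs": 1e-11}
for process, frac in fractions.items():
    print(f"{process}: {total_rate_hz * frac:g} per sec")
# b-physics: 1e+06 per sec, t-physics: 10 per sec, Higgs: 0.01 per sec
```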

20 Physics programme
Higgs signal extraction is important but very difficult. There is also lots of other interesting physics:
–B physics and CP violation
–quarks, gluons and QCD
–top quarks
–SUSY
–'new' physics
The programme will evolve with luminosity, HLT capacity and understanding of the detector:
–low luminosity (2008-2009): high-pT programme (Higgs etc.); b-physics programme (CP measurements)
–high luminosity (2010?): high-pT programme (Higgs etc.); searches for new physics

21 Trigger strategy at the LHC
To avoid being overwhelmed, use signatures with small backgrounds:
–Leptons
–High mass resonances
–Heavy quarks
The trigger selection looks for events with:
–Isolated leptons and photons
–τ-, central- and forward-jets
–High ET
–Missing ET

22 Example physics signatures
–Electron, 1e > 25 GeV or 2e > 15 GeV: Higgs (SM, MSSM), new gauge bosons, extra dimensions, SUSY, W, top
–Photon, 1γ > 60 GeV or 2γ > 20 GeV: Higgs (SM, MSSM), extra dimensions, SUSY
–Muon, 1μ > 20 GeV or 2μ > 10 GeV: Higgs (SM, MSSM), new gauge bosons, extra dimensions, SUSY, W, top
–Jet, 1j > 360 GeV, 3j > 150 GeV or 4j > 100 GeV: SUSY, compositeness, resonances
–Jet > 60 GeV + missing ET > 60 GeV: SUSY, exotics
–Tau > 30 GeV + missing ET > 40 GeV: extended Higgs models, SUSY

23 Architecture
[Trigger/DAQ overview: 40 MHz collisions, ~1 PB/s (equivalent) in; ~200 Hz, ~300 MB/s out to physics storage]
Three logical levels, with a hierarchical data flow:
–LVL1 (~2.5 μs): fastest; only calo and muon; hardwired; on-detector electronics with pipelines
–LVL2 (~40 ms): local; LVL1 refinement + track association; event fragments buffered in parallel
–LVL3 (~4 sec): full event; 'off-line' analysis; full event in a processor farm

24 Selected (inclusive) signatures

25 Trigger design - Level-1
Level-1:
–sets the context for the HLT
–reduces triggers to ~75 kHz
–has a very short time budget: a few μs (ATLAS/CMS ~2.5 μs, much of it used in cable delays!)
Detectors used must provide data very promptly and must be simple to analyse:
–Coarse-grain data from calorimeters
–Fast parts of the muon spectrometer (i.e. not the precision chambers)
–NOT precision trackers: too slow, too complex
–(LHCb does use some simple tracking data from their VELO detector to veto events with more than 1 primary vertex)
–(CMS plans a track trigger for the sLHC: L1 time => ~6 μs)
–The proposed FP420 detectors provide data too late

26 ATLAS Level-1 trigger system
Calorimeter and muon:
–trigger on inclusive signatures: muons; em/tau/jet calo clusters; missing and sum ET
Hardware trigger:
–Programmable thresholds
–Selection based on multiplicities and thresholds

27 ATLAS em cluster trigger algorithm
"Sliding window" algorithm, repeated for each of ~4000 cells
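
A schematic of the sliding-window idea; a minimal 2D sketch in which the window size, threshold and grid are illustrative, not the real ~4000-cell firmware:

```python
import numpy as np

def sliding_window_clusters(et_grid, window=2, threshold=10.0):
    """Find local-maximum windows whose summed ET passes a threshold.

    et_grid: 2D array of trigger-tower ET values (eta x phi).
    Returns (eta_index, phi_index, window_sum) candidates. Requiring
    the window sum to be a local maximum avoids counting the same
    cluster at several overlapping window positions.
    """
    n_eta, n_phi = et_grid.shape
    sums = np.array([[et_grid[i:i + window, j:j + window].sum()
                      for j in range(n_phi - window + 1)]
                     for i in range(n_eta - window + 1)])
    candidates = []
    for i in range(sums.shape[0]):
        for j in range(sums.shape[1]):
            neighbours = sums[max(i - 1, 0):i + 2, max(j - 1, 0):j + 2]
            if sums[i, j] >= threshold and sums[i, j] == neighbours.max():
                candidates.append((i, j, float(sums[i, j])))
    return candidates

rng = np.random.default_rng(0)
grid = rng.exponential(0.5, size=(16, 16))  # soft background
grid[6:8, 9:11] += 12.0                     # injected em cluster
print(sliding_window_clusters(grid))        # finds the window at (6, 9)
```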

28 ATLAS Level-1 muon trigger
RPC: Resistive Plate Chambers; TGC: Thin Gap Chambers; MDT: Monitored Drift Tubes
The RPCs and TGCs are the trigger chambers
Measure the muon momentum with very simple tracking in a few planes of trigger chambers

29 Level-1 selection
The Level-1 trigger is an "or" of a large number of inclusive signals, set to match the current physics priorities and beam conditions
The precision of cuts at Level-1 is generally limited
Adjust the overall Level-1 accept rate (and the relative frequency of different triggers) by:
–Adjusting thresholds
–Pre-scaling higher-rate triggers (e.g. only accept every 10th trigger of a particular type)
Pre-scaling can also be used to include a low rate of calibration events
The menu can be changed at the start of a run
–Pre-scale factors may change during the course of a run
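
A pre-scale is just a counter per trigger item; a minimal sketch (the item names and factors are illustrative):

```python
class Prescaler:
    """Accept every Nth candidate of a given trigger item."""
    def __init__(self, factor):
        self.factor = factor  # factor 1 = accept everything
        self.count = 0

    def fires(self):
        self.count += 1
        if self.count >= self.factor:
            self.count = 0
            return True
        return False

# e.g. keep every 10th minimum-bias candidate but every EM25i candidate
prescales = {"MBTS": Prescaler(10), "EM25i": Prescaler(1)}
accepted = [item for item in ["MBTS"] * 20 if prescales[item].fires()]
print(len(accepted))  # 2 of the 20 minimum-bias candidates survive
```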

30 Example Level-1 menu for L = 2x10^33
Level-1 signature: output rate (Hz)
–EM25i: 12000
–2EM15i: 4000
–MU20: 800
–2MU6: 200
–J200: 200
–3J90: 200
–4J65: 200
–J60 + XE60: 400
–TAU25i + XE30: 2000
–MU10 + EM15i: 100
–Others (pre-scaled, exclusive, monitor, calibration): 5000
Total: ~25000

31 Trigger design - Level-2
Level-2 reduces triggers to ~2 kHz
–Note CMS does not have a physically separate Level-2 trigger, but the HLT processors include a first stage of Level-2 algorithms
The Level-2 trigger has a short time budget
–ATLAS: ~40 ms average
Note that for Level-1 the time budget is a hard limit for every event; for the High Level Trigger it is the average that matters, so some events can take several times the average, provided they are a minority
Full detector data is available, but to minimise the resources needed:
–Limit the data accessed
–Only unpack detector data when it is needed
–Use information from Level-1 to guide the process
–Analysis proceeds in steps, with the possibility to reject the event after each step
–Use custom algorithms

32 Regions of Interest
The Level-1 selection is dominated by local signatures (i.e. within a Region of Interest, RoI)
–Based on coarse-granularity data from calo and muon only
Typically there are 1-2 RoIs per event
ATLAS uses RoIs to reduce the network bandwidth and processing power required

33 Trigger design - Level-2 (cont'd)
Processing scheme:
–extract features from sub-detector data in each RoI
–combine features from one RoI into an object
–combine objects to test the event topology
Precision of Level-2 cuts:
–Emphasis is on very fast algorithms with reasonable accuracy; they do not include many corrections which may be applied off-line
–Calibrations and alignment available for the trigger are not as precise as those available off-line

34 Architecture
[Trigger/DAQ dataflow diagram: 40 MHz crossings (~1 PB/s) from the calo, muon and other detectors → FE pipelines; LVL1 (2.5 μs, calorimeter and muon triggers) → 75 kHz; LVL1 accept → Read-Out Drivers (RODs) → Read-Out Links (120 GB/s) → Read-Out Buffers (ROBs) in the Read-Out Sub-systems (ROSs); LVL2 (~10 ms; RoI builder ROIB, supervisors L2SV, network L2N, processors L2P; RoI data = 1-2% of the event, ~2 GB/s) → ~2 kHz; LVL2 accept → Event Builder (EB, ~3 GB/s) → Event Filter (processors EFP, network EFN, ~1 sec, ~3 GB/s) → ~200 Hz, ~300 MB/s to storage]

35 CMS event building
CMS performs event building directly after Level-1
This simplifies the architecture, but places much higher demands on the technology:
–Network traffic ~100 GB/s
–Myrinet is used instead of GbE for the EB network
–A number of independent slices are planned, with a barrel shifter to switch to a new slice at each event
Time will tell which philosophy is better

36 Example: two-electron trigger
HLT strategy: validate step-by-step; check intermediate signatures; reject as early as possible. The sequential/modular approach facilitates early rejection
LVL1 triggers on two isolated e.m. clusters with pT > 20 GeV (possible signature: Z → ee)
[Diagram, steps in time order from the Level-1 seed 2EM20i: step 1 cluster shape → signature 2 x ecand; step 2 track finding → signature 2 x e; step 3 pT > 30 GeV → signature 2 x e30; step 4 isolation → signature 2 x e30i]
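
The step-by-step validation amounts to a chain of hypothesis tests applied per RoI, aborting at the first failure; a minimal sketch in which the step names follow the slide but the data model and cut values are invented:

```python
# Each step refines the previous signature and can reject early. The
# predicates stand in for the real feature-extraction algorithms.
STEPS = [
    ("cluster_shape", lambda roi: roi["shower_shape"] < 0.1),
    ("track_finding", lambda roi: roi["has_matched_track"]),
    ("pt_cut",        lambda roi: roi["pt_gev"] > 30.0),
    ("isolation",     lambda roi: roi["iso_et_gev"] < 5.0),
]

def validate_roi(roi):
    """Run the step sequence, rejecting as early as possible."""
    for name, passed in STEPS:
        if not passed(roi):
            return False, name  # early rejection: later steps never run
    return True, "e30i"

def two_electron_trigger(rois):
    """Event passes if at least two RoIs survive the full chain."""
    return sum(validate_roi(roi)[0] for roi in rois) >= 2
```

Because most background RoIs fail at the first (cheapest) step, expensive steps such as track finding run on only a small fraction of the input.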

37 Trigger design - Event Filter / Level-3
The Event Filter reduces triggers to ~200 Hz
Event Filter budget: ~4 sec average
Full event data is available, but to minimise the resources needed:
–Only unpack detector data when it is needed
–Use information from Level-2 to guide the process
–Analysis proceeds in steps, with the possibility to reject the event after each step
–Use optimised off-line algorithms

38 Electron slice at the EF
[Diagram of the EF electron chain: TrigCaloRec (wrapper of CaloRec) with EFCaloHypo; EF tracking (wrapper of newTracking) with EFTrackHypo; TrigEgammaRec (wrapper of EgammaRec) with EFEgammaHypo, which matches electromagnetic clusters with tracks and builds egamma objects]

39 HLT Processing at LHCb

40 Trigger design - HLT strategy
Level 2:
–confirm Level 1; some inclusive, some semi-inclusive and some simple topology triggers; vertex reconstruction (e.g. two-particle mass cuts to select Zs)
Level 3:
–confirm Level 2; more refined topology selection; near off-line code

41 Example HLT menu for L = 2x10^33
HLT signature: output rate (Hz)
–e25i: 40
–2e15i: <1
–gamma60i: 25
–2gamma20i: 2
–mu20i: 40
–2mu10: 10
–j400: 10
–3j165: 10
–4j110: 10
–j70 + xE70: 20
–tau35i + xE45: 5
–2mu6 with vertex, decay-length and mass cuts (J/psi, psi', B): 10
–Others (pre-scaled, exclusive, monitor, calibration): 20
Total: ~200

42 Example B-physics menu for L = 10^33
LVL1: MU6, rate 24 kHz (note there are large uncertainties in the cross-section); in case of larger rates use MU8 (=> ~1/2 x rate) or 2MU6
LVL2:
–Run muFast in the LVL1 RoI: ~9 kHz
–Run ID reconstruction in the muFast RoI; mu6 (combined muon & ID): ~5 kHz
–Run TrigDiMuon seeded by the mu6 RoI (or MU6); make exclusive and semi-inclusive selections using loose cuts: B(mumu), B(mumu)X, J/psi(mumu)
–Run IDSCAN in the Jet RoI; make a selection for Ds(PhiPi)
EF:
–Redo the muon reconstruction in the LVL2 (LVL1) RoI
–Redo the track reconstruction in the Jet RoI
–Selections for B(mumu), B(mumuK*), B(mumuPhi), Bs→DsPhiPi etc.

43 LHCb Trigger Menu

44 Matching problem
[Diagram: phase-space regions for the physics channel, the off-line selection and the on-line selection, against background]

45 Matching problem (cont'd)
Ideally:
–off-line algorithms select a phase space which shrink-wraps the physics channel
–trigger algorithms shrink-wrap the off-line selection
In practice this doesn't happen, so the trigger selection must be matched to the off-line algorithm selection
For this reason many trigger studies quote the trigger efficiency w.r.t. events which pass the off-line selection
–BUT off-line can change the algorithm, re-process and recalibrate at a later stage
–SO, make sure the on-line algorithm selection is well known, controlled and monitored
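
Quoting the efficiency with respect to the off-line selection is a simple counting exercise; a minimal sketch (the event records and flag names are invented for illustration):

```python
def trigger_efficiency_wrt_offline(events):
    """Fraction of offline-selected events that also fired the trigger.

    events: iterable of dicts with boolean 'passed_offline' and
    'passed_trigger' flags, e.g. from a pass-through (tagging-mode)
    sample where the trigger decision is recorded but not enforced.
    """
    offline = [e for e in events if e["passed_offline"]]
    if not offline:
        return float("nan")
    return sum(e["passed_trigger"] for e in offline) / len(offline)

sample = ([{"passed_offline": True, "passed_trigger": True}] * 95
          + [{"passed_offline": True, "passed_trigger": False}] * 5)
print(trigger_efficiency_wrt_offline(sample))  # 0.95
```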

46 Selection and rejection
As the selection criteria are tightened:
–background rejection improves
–BUT event selection efficiency decreases

47 Selection and rejection
Example of an ATLAS Event Filter (i.e. Level-3) study of the effectiveness of various discriminants used to select 25 GeV electrons from a background of dijets

48 Other issues for the trigger
Efficiency and monitoring:
–In general need a high trigger efficiency
–For many analyses also need a well-known efficiency
–Monitor the efficiency by various means: overlapping triggers; pre-scaled samples of triggers in tagging mode (pass-through)
Final detector calibration and alignment constants are not available immediately: keep them as up-to-date as possible, and allow for the lower precision of the trigger cuts when defining trigger menus and in subsequent analyses
Code used in the trigger needs to be very robust: low memory leaks, low crash rate, fast
Beam conditions and HLT resources will evolve over several years (for both ATLAS and CMS)
–In 2008 the luminosity is low, but the HLT capacity will also be < 50% of the full system (funding constraints)

49 Summary
High-level triggers allow complex selection procedures to be applied as the data is taken
–They thus allow large numbers of events to be accumulated, even in the presence of very large backgrounds
–This is especially important at the LHC, but significant at most accelerators
The trigger stages, in the ATLAS example:
–Level 1 uses inclusive signatures: muons; em/tau/jet calo clusters; missing and sum ET
–Level 2 refines the Level 1 selection and adds simple topology triggers, vertex reconstruction, etc.
–Level 3 refines Level 2 and adds more refined topology selection
Trigger menus need to be defined taking into account physics priorities, beam conditions and HLT resources
–Include items for monitoring trigger efficiency and calibration
Must get it right: any events thrown away are lost for ever!

50 Additional Foils

52 The evolution of DAQ systems

53 ATLAS Detector

54 ATLAS event - tracker end-view

55 Trigger functional design
Level 1: input 40 MHz, accept 75 kHz, latency 2.5 μs
–Inclusive triggers based on fast detectors
–Muon, electron/photon, jet, sum and missing ET triggers
–Coarse(r) granularity, low(er) resolution data
–Special-purpose hardware (FPGAs, ASICs)
Level 2: input 75 (100) kHz, accept O(1) kHz, latency ~10 ms
–Confirm Level 1 and add track information
–Mainly inclusive, but some simple event topology triggers
–Full granularity and resolution available
–Farm of commercial processors with special algorithms
Event Filter: input O(1) kHz, accept O(100) Hz, latency ~secs
–Full event reconstruction
–Confirm Level 2; topology triggers
–Farm of commercial processors using near off-line code

56 ATLAS Trigger / DAQ data flow
[Dataflow diagram spanning UX15, USA15 and SDX1: ATLAS detector → first-level trigger and Read-Out Drivers (RODs), with Timing, Trigger and Control (TTC) over dedicated links → 1600 Read-Out Links → Read-Out Subsystems (ROSs, ~150 PCs); data of events accepted by the first-level trigger pushed at ≤ 100 kHz, 1600 fragments of ~1 kByte each. RoI Builder → LVL2 supervisors → LVL2 farm (~500 dual-socket server PCs), with the pROS storing the LVL2 output. DataFlow Manager, Event Builder with SubFarm Inputs (SFIs, ~100) and Event Filter (EF, ~1600 dual-socket server PCs) connected by network switches; event data pulled: partial events at ≤ 100 kHz, full events at ~3 kHz; event rate ~200 Hz to local storage via the SubFarm Outputs (SFOs), then to the CERN computer centre]

57 Event's eye view - step 1
At each beam crossing, latch the data into the detector front-end
After processing, the data is put into many parallel pipelines; it moves along the pipeline at every bunch crossing and falls out the far end after 2.5 μs
Calo + muon trigger data is also sent to Level-1

58 Event's eye view - step 2
The Level-1 Central Trigger Processor combines the information from the muon and calo triggers and, when appropriate, generates the Level-1 Accept (L1A)
The L1A is distributed in real time via the TTC system to the detector front-ends, telling them to send the data from the accepted event to the detector RODs (Read-Out Drivers)
–Note it must arrive before the data has dropped out of the pipeline, hence the hard deadline of 2.5 μs
–The TTC system (Timing, Trigger and Control) is a CERN system used by all of the LHC experiments; it allows very precise real-time distribution of small data packets
The detector RODs receive the data, process and reformat it as necessary, and send it via fibre links to the TDAQ ROSs

59 Event's eye view - step 3
On an L1A, the different parts of LVL1 also send RoI data to the RoI Builder (RoIB), which combines the information and sends it as a single packet to a Level-2 supervisor PC
–The RoIB is implemented as a number of VME boards with FPGAs, to identify and combine the fragments coming from the same event from the different parts of Level-1

60 ATLAS Level-2 trigger (step 4)
[Same dataflow diagram, highlighting the LVL2 path: event data for Level-2 pulled as partial events at ≤ 100 kHz]
The Region of Interest Builder (RoIB) passes the formatted information to one of the LVL2 supervisors
The LVL2 supervisor selects one of the processors in the LVL2 farm and sends it the RoI information
The LVL2 processor requests data from the ROSs as needed (possibly in several steps), produces an accept or reject, and informs the LVL2 supervisor
For an accept, the result of the processing is stored in the pseudo-ROS (pROS)
This reduces network traffic to ~2 GB/s, c.f. ~150 GB/s for a full event build
The LVL2 supervisor passes the decision to the DataFlow Manager (which controls event building)

61 ATLAS event building (step 5)
[Same dataflow diagram, highlighting the Event Builder path: full events pulled at ~3 kHz]
For each accepted event the DataFlow Manager selects a SubFarm Input (SFI) and sends it a request to take care of building the complete event
The SFI sends requests to all ROSs for the data of the event to be built; completion of building is reported to the DataFlow Manager
For rejected events, and for events for which building has completed, the DataFlow Manager sends "clears" to the ROSs (for 100-300 events together)
Network traffic for event building is ~5 GB/s
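
The assign/request/clear handshake can be modelled as a small message-passing loop; a toy sketch under invented names and interfaces, not the real DataFlow protocol:

```python
class ROS:
    """Read-Out Subsystem stub: buffers event fragments until cleared."""
    def __init__(self):
        self.buffers = {}
    def store(self, event_id, fragment):
        self.buffers[event_id] = fragment
    def fetch(self, event_id):
        return self.buffers[event_id]
    def clear(self, event_ids):
        for eid in event_ids:       # buffers freed in one batched message
            self.buffers.pop(eid, None)

class DataFlowManager:
    """Toy DFM: assign an SFI, let it pull all fragments, batch the clears."""
    def __init__(self, sfis, roses, clear_batch=200):
        self.sfis, self.roses = sfis, roses
        self.clear_batch = clear_batch  # clears cover 100-300 events together
        self.pending_clears = []
        self.next_sfi = 0

    def build_event(self, event_id):
        sfi = self.sfis[self.next_sfi]  # round-robin SFI assignment
        self.next_sfi = (self.next_sfi + 1) % len(self.sfis)
        event = b"".join(ros.fetch(event_id) for ros in self.roses)
        self.queue_clear(event_id)      # building complete: buffers can go
        return sfi, event

    def queue_clear(self, event_id):
        self.pending_clears.append(event_id)
        if len(self.pending_clears) >= self.clear_batch:
            for ros in self.roses:
                ros.clear(self.pending_clears)
            self.pending_clears = []

roses = [ROS() for _ in range(3)]
for ros in roses:
    ros.store(1, b"frag")
dfm = DataFlowManager(sfis=["SFI-0", "SFI-1"], roses=roses, clear_batch=2)
print(dfm.build_event(1))  # ('SFI-0', b'fragfragfrag')
```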

62 ATLAS Event Filter (step 6)
[Same dataflow diagram, highlighting the EF farm]
A process (EFD) running in each Event Filter farm node collects each complete event from the SFI and assigns it to one of a number of Processing Tasks in that node
The Event Filter uses more sophisticated algorithms (near or adapted off-line) and more detailed calibration data to select events, based on the complete event data
Accepted events are sent to an SFO (SubFarm Output) node to be written to disk

63 ATLAS data output (step 7)
[Same dataflow diagram, highlighting the SFOs and local storage]
The SFO nodes receive the final accepted events and write them to disk
The events include 'stream tags' to support multiple simultaneous files (e.g. express stream, calibration, b-physics stream, etc.)
Files are closed when they reach 2 GB or at the end of a run
Closed files are finally transmitted via GbE to the CERN Tier-0 for off-line analysis
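
A minimal sketch of stream-tagged output with a 2 GB rollover; the file naming and the event format are invented for illustration:

```python
import os

class StreamWriter:
    """Write events into per-stream files, rolling over at a size limit."""
    MAX_BYTES = 2 * 1024**3  # files closed at 2 GB (or at end of run)

    def __init__(self, outdir="."):
        self.outdir = outdir
        self.files = {}      # stream tag -> (open file handle, sequence no.)

    def write(self, stream_tag, event_bytes):
        fh, seq = self.files.get(stream_tag, (None, 0))
        if fh is None or fh.tell() + len(event_bytes) > self.MAX_BYTES:
            if fh:
                fh.close()   # a closed file is ready to ship to Tier-0
            seq += 1
            path = os.path.join(self.outdir, f"{stream_tag}.{seq:04d}.data")
            fh = open(path, "wb")
        fh.write(event_bytes)
        self.files[stream_tag] = (fh, seq)

    def end_of_run(self):
        for fh, _ in self.files.values():
            fh.close()

w = StreamWriter()
w.write("express", b"\x00" * 1024)   # express stream and b-physics stream
w.write("bphysics", b"\x00" * 1024)  # go to separate simultaneous files
w.end_of_run()
```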

64 ATLAS HLT hardware
First 4 racks of HLT processors; each rack contains:
–~30 HLT PCs (very similar to Tier-0/1 compute nodes)
–2 Gigabit Ethernet switches
–a dedicated local file server

65 ATLAS TDAQ barracks rack layout

66 Naming convention
First Level Trigger (LVL1) signatures in capitals, HLT signatures in lower case
LVL1 / HLT type codes:
–EM / e, g: electron, photon
–MU / mu: muon
–HA / tau: tau
–FJ / fj: forward jet
–JE / je: jet energy
–JT / jt: jet
–TM / xe: missing energy
Name format: type, threshold, 'i'/'I' if isolated, optional modifier; e.g. LVL1 MU20I; HLT mu20i_passEF (EF in tagging mode)
New in 13.0.30: the threshold is the cut value applied; previously it was the ~95% efficiency point
More details: https://twiki.cern.ch/twiki/bin/view/Atlas/TriggerPhysicsMenu
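
The convention is regular enough to parse mechanically; a minimal sketch in which the regex and the function name are invented, with the type codes taken from the slide:

```python
import re

# optional multiplicity, type code, threshold, isolation flag, modifiers
TRIGGER_NAME = re.compile(
    r"^(?P<count>\d*)(?P<type>[A-Za-z]+?)(?P<thresh>\d+)"
    r"(?P<iso>[iI]?)(?P<mods>(?:_\w+)*)$")

def parse_trigger_name(name):
    """Split a trigger item name into its conventional pieces.

    Capitals => LVL1 signature, lower case => HLT signature.
    """
    m = TRIGGER_NAME.match(name)
    if not m:
        raise ValueError(f"not a trigger item name: {name}")
    return {
        "level": "LVL1" if m.group("type").isupper() else "HLT",
        "multiplicity": int(m.group("count") or 1),
        "type": m.group("type"),          # e.g. EM, MU, e, g, mu, tau
        "threshold_gev": int(m.group("thresh")),
        "isolated": bool(m.group("iso")),
        "modifiers": [s for s in m.group("mods").split("_") if s],
    }

print(parse_trigger_name("mu20i_passEF"))  # HLT muon, 20 GeV, isolated, tagging mode
print(parse_trigger_name("2EM15i"))        # LVL1, two isolated EM clusters above 15 GeV
```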

67 Min bias triggers
Based on space-point (SP) counting. Trigger if > 40 SCT SPs or > 900 pixel clusters
To be done: add the MBTS trigger
A min bias trigger is available for the first time in 13.0.30.3
MBTS: scintillators on the inside of the endcap calorimeter giving LVL1 information
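
The selection itself is just a threshold on two counters; a trivial sketch (the argument names are invented):

```python
def min_bias_trigger(n_sct_space_points, n_pixel_clusters):
    """Min-bias decision from the slide: >40 SCT SPs or >900 pixel clusters."""
    return n_sct_space_points > 40 or n_pixel_clusters > 900

print(min_bias_trigger(55, 120))   # True: the SCT condition fires
print(min_bias_trigger(10, 1500))  # True: the pixel condition fires
print(min_bias_trigger(10, 120))   # False: quiet event
```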

68 Electron menu coverage for L = 10^31 cm^-2 s^-1
–Single electron triggers and pre-scaled triggers with HLT pass-through for commissioning needs: selections with isolated/non-isolated LVL1 thresholds; triggers with L2 and/or EF pass-through, e.g. e15, e15i, e15_passHLT, e15i_passHLT, e20_passL2, e20_passEF (5 Hz)
–Low mass pairs: J/ψ→ee, Υ→ee, DY are sources of isolated electrons with large statistics, useful for calibration and efficiency extraction at low pT, e.g. 2e5, 2e10, e5+e7, e5+e10 (6 Hz)
–Low-medium-pT double/triple e-triggers: Z→ee, SUSY, new phenomena, e.g. 2e10, 2e15, 3e15 (5 Hz)
–High-pT single e-trigger (LVL1 pT ~18 GeV): W→eν, Z→ee, top, SUSY, Higgs, exotics etc.; loose selections and lots of redundancy, e.g. e20, e20i, e25i, e15_xE20, e10_xE30, e105 (5 Hz)
–Very high-pT e-trigger: exotics, new phenomena, e.g. em105_passHLT (1.5 Hz)
–Low-pT single e-trigger (LVL1 pT ~7 GeV): electrons from b, c decays (typically not well isolated); useful for E/p studies; need tighter cuts to limit the rate, e.g. e12 (17 Hz)
16 LVL1 thresholds for EM (electron, photon) & HA (tau); EM: EM3, EM7, EM13, EM13I, EM18, EM18I, EM23I, EM100

69 Photon menus for L = 10^31
–Low-pT items, HLT pre-scale 10 or 100: g10, g15, g15i; hadronic calibration, inclusive and di-photon cross-sections (4 Hz)
–High-pT items, no pre-scale: g20, g20i, g25, g25i; direct photons, hadronic calibration. Very high-pT items, non-isolated: g105, g120; exotics, SUSY, the unknown, hadronic calibration (7 Hz)
–Multi-photon, no isolation, no HLT prescale: 2g10, 2g15, 2g20, 3g10; di-photon cross-section, exotics, SUSY, calibration (5 Hz)
–Triggers for commissioning, with LVL1 prescale and HLT in tagging mode: em15_passHLT, em15i_passHLT, g10_passL2, g10_passEF; selections with/without L1 isolation, triggers with L2/EF pass-through
Total rate (including overlaps) ~10 Hz

70 Muon triggers
–Prescaled low-pT single μ (mu4, mu6) and unprescaled low-pT dimuon (2mu4, mu4+mu6, 2mu6): B-physics, J/ψ, Υ, DY (4 Hz; 2.5 Hz)
–Prescaled triggers with HLT pass-through: mu20i with isolation calculated but not applied; mu20_passHLT; commissioning (0.5 Hz)
–High-pT triggers with/without isolation: mu10, mu15, mu20, mu20i, mu40, 2mu10, 2mu20; high-pT physics: Z(→μμ), SUSY, Higgs, exotics etc. (20 Hz)
Six LVL1 thresholds: MU4, MU6, MU10, MU15, MU20, MU40
Isolation can be applied at the HLT

71 B-physics
LVL1 + muon at HLT: 2mu4 (2.5 Hz); mu4 & mu6 pre-scaled (4 Hz)
LVL1 + ID & MU at HLT: mu4_DsPhiPi_FS, MU4_Jpsimumu_FS, MU4_Upsimumu_FS, MU4_Bmumu_FS, MU4_BmumuX_FS
Loose selections: ~10 Hz

72 Tau triggers
–Single tau, prescaled (tau45, tau45i) and unprescaled (tau60, tau100): exotics and heavy Higgs (15 Hz)
–Tau + missing ET: tau20i+xe30; W→τν at low luminosity; H→ττ, SUSY, etc. at high luminosity (5 Hz)
–Tau-tau: 2tau25i, 2tau35i; H→ττ (3 Hz)
–Tau + e, mu or jet: tau20i_e10, tau20i_mu10, tau20i_j70, tau20i_4j50, tau20i_bj18; Z→ττ, preparation for 10^33, SUSY, charged Higgs (5 Hz)
16 LVL1 thresholds for EM (electron, photon) & HA (tau); HA: HA5, HA6, HA9I, HA11I, HA16I, HA25, HA25I, HA40

73 Single jet triggers
Strategy: initially use the LVL1 selection with no active HLT selection, and the b-jet trigger in tagging mode
8 LVL1 jet thresholds:
–Highest threshold un-prescaled; value determined by rate considerations (aim for ~20 Hz)
–Other thresholds set to equalize bandwidth across the ET spectrum
–Lowest threshold used to provide RoIs for the B-physics trigger

74 Jet triggers (cont'd)
–Single jet: j5, j10, j18, j23, j35, j42, j70, j120, j200, j400; QCD, exotics
–Multi-jet: 3J10, 4J10, 3J18, 3J23, 4J18, 4J23, 4J35; searches pp→XX, X→jj, top, SUSY
–Forward jets: FJ10, FJ18, FJ26, FJ65, 2FJ10, 2FJ26, 2FJ65, FJ65_FJ26; VBF
–Jet energy sum: JE280, JE340; SUSY
[Plots: trigger rates for multi-jets and for forward jets]

75 B-jet triggers
Jets are tagged as b-jets at the HLT based on track information; this will allow lower LVL1 jet thresholds to be used
For initial running the b-jet triggers will be in tagging mode; active selection will be switched on once the detector & trigger are understood

76 Missing ET, total sum ET
8 LVL1 missing-ET thresholds

77 Combined triggers
The menu contains a large number of combined signatures. Total rate 46 Hz
–tau+e, tau+mu, e+mu: tau15i_e10, tau25i_mu6, tau20i_mu10, e10_mu6; tt, SUSY
–tau + missing ET: tau45_xe40, tau45i_xe20; W, tt, SUSY, exotics
–tau + jet: tau25i_j70; W, tt, SUSY, exotics
–mu + jet: mu4_j10; exotics
–jet + missing ET: j70_xe30; SUSY, exotics

78 Total rates
–LVL1: 47,000 Hz
–LVL2: 865 Hz
–EF: 200 Hz

