Presentation transcript: "CMS Trigger Electronics - CMS HCAL TriDAS" (UMD DOE Review, 30-Aug-05, Drew Baden)

1 CMS Trigger Electronics
CMS HCAL TriDAS is a joint effort between:
–University of Maryland – Drew Baden
–Boston University – Jim Rohlf
–Princeton University – Chris Tully and Dan Marlow
–University of Minnesota – Jeremy Mans
–University of Virginia – CMS application pending…
Maryland Personnel
–Physicists: Baden – Level 3 WBS manager for HCAL Trigger/DAQ; Jeremy Mans – with us for FY 2004, just left for a new Assistant Professor position at the University of Minnesota
–Engineers: Dr. Tullio Grassi (CMS project funds) – integration and firmware; Bard (HEP) – Department-subsidized electronics instrumentation group as needed
–Students: 1 new RA and 3 undergraduates helped with production this summer

2 Trigger/DAQ

3 Scope
HCAL Trigger/DAQ
–Data flow: Front End → Receiver/Trigger → Concentrator → Farms
–"Receiver/Trigger" onward is our responsibility, aka TriDAS (Trigger/Data Acquisition System in CMS parlance)
–Ongoing effort for the past 6 years
Luminosity
–Produce a signal for the accelerator to tune on for CMS; need something that is not dependent upon the CMS trigger running; use the forward hadron calorimeter (HF)
–Will also be using this to keep track of CMS sensitivity ("luminosity database")
–Now working with Dan Marlow (Princeton) and Jeremy Mans (UMN)
HF Trigger upgrade
–Investigate scheme for running at SLHC (10^35!)
–Optimize for weak boson fusion Higgs production using forward tagged jets

4 TriDAS Overview
CMS Trigger: emphasis is on bandwidth and commercial processors
Level 1
–3 μs latency inside the L1 trigger
–100 kHz average L1 accept rate (1 in 400)
–100 GByte/sec into Level 2 (see the check below)
[Diagram: data flow from the CMS HCAL "Front End" (QIE, fibers) through the Level 1 Trigger and Trigger/DAQ to the Level 2/3 DAQ]
Baden is HCAL "Level 3 WBS Manager" for TriDAS
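A quick back-of-envelope check of the Level 1 numbers above (a minimal Python sketch; the ~1 MB average event size is my assumption, not a number from the slide):

```python
# Back-of-envelope check of the Level 1 numbers quoted above.
# The ~1 MB average CMS event size is an assumption, not stated on the slide.
bunch_crossing_rate_hz = 40e6      # 25 ns between collisions -> 40 MHz
l1_accept_fraction = 1.0 / 400     # "1 in 400" average L1 accept
event_size_bytes = 1e6             # assumed ~1 MB per built event

l1_accept_rate_hz = bunch_crossing_rate_hz * l1_accept_fraction
bandwidth_into_l2 = l1_accept_rate_hz * event_size_bytes

print(f"L1 accept rate: {l1_accept_rate_hz/1e3:.0f} kHz")           # ~100 kHz
print(f"Bandwidth into Level 2: {bandwidth_into_l2/1e9:.0f} GB/s")  # ~100 GB/s
```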

5 CMS: Synchronous Pipeline
25 ns between collisions
–Similar to HERA (H1 and ZEUS) at 96 ns
–Different from HERA since integration times can be over many buckets → real-time filtering to associate energy (from >1 bucket) with the "event" concept
Pipeline will not stop (except for a full system reset)
–Every 25 ns, always another entry
–Errors must be dealt with in the data flow
–Can inhibit L1 accepts if needed, but not L1 queries
Level 1 latency is fixed (see the sketch below)
–Every ~100 buckets, an L1 decision arrives on the event collected ~100 buckets ago
–All subsystems must have enough buffering to hold this many events
–All subsystems must be able to find the event in their pipeline
Level 2 is a processor farm
–Full readout; this is the DAQ path
–Event building implemented via a massively parallel commercial structure
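A minimal software sketch of the fixed-latency pipeline idea described above (illustrative Python, not the HTR firmware; the depth of ~100 crossings is taken from the slide):

```python
# Minimal sketch of a fixed-latency synchronous pipeline: every 25 ns a new
# sample is written, and an L1 accept asks for the sample that entered the
# pipeline ~L1_LATENCY bunch crossings earlier.
from collections import deque

L1_LATENCY = 100          # assumed pipeline depth in bunch crossings

class PipelineBuffer:
    def __init__(self, depth=L1_LATENCY):
        self.buf = deque(maxlen=depth)   # circular buffer: oldest entry drops off

    def clock(self, sample):
        """Called every bunch crossing; the pipeline never stops."""
        self.buf.append(sample)

    def on_l1_accept(self):
        """Return the oldest buffered sample (from ~L1_LATENCY crossings earlier)."""
        if len(self.buf) < self.buf.maxlen:
            raise RuntimeError("pipeline not yet full")
        return self.buf[0]

# Usage: fill for 150 crossings, then an L1A picks out the matching old sample.
pipe = PipelineBuffer()
for bx in range(150):
    pipe.clock({"bx": bx, "adc": bx % 7})
print(pipe.on_l1_accept())   # the sample written ~100 crossings before "now"
```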

6 HCAL Electronics Overview
[Block diagram: analog optical signals from HCAL enter the on-detector front end (Readout Box/RBX with HPD, QIE, CCA, GOL transmitter), cross the shield wall on fibers (40 bits @ 40 MHz, serialized as 20 bits @ 80 MHz = 1.6 Gbps), and arrive at the readout crate (12 HTRs, 2 DCCs, 1 PC interface, 1 clock board, rack CPU); the HTRs send trigger primitives to the Level 1 trigger at CERN and event data to the DCCs, which go to DAQ over S-Link (64 bits @ 25 MHz)]

7 HCAL VME Crate
VME Bridge module (CAEN)
–Configuration and monitoring over VME
Fanout module
–Receives TTC stream
–Clones and fans out timing signals
–Global HCAL synchronization with the RCT
HCAL Receiver & Trigger (HTR) module
–FE-fiber input, linearizers, filters…
–Maintains pipeline
–TP output via SLBs to RCT
–DAQ output of raw/TP data to DCC
–Spy over VME for monitoring
Data Concentrator Card (DCC)
–Inputs from HTRs
–Output to DAQ
–Generates busy if needed
–Spy output via VME
[Diagram: VME crate with bridge, fanout, HTRs, and DCC; 1.6 Gb/s fibers from the front-end electronics, TTC fiber into the fanout, 10 m copper at 1.2 Gb/s to the Calorimeter Regional Trigger, DCC output to DAQ]

8 HTR Principal Functions
1. Receive HCAL data from front-ends
 Synchronize optical links
 Data validation and linearization
 Form "trigger primitives" and transmit to Level 1 at 40 MHz
 Pipeline data, wait for Level 1 accept
 –Upon receiving L1A:
  »Zero suppress, format, & transmit raw data to the concentrator (no filtering)
  »Transmit all trigger primitives along with raw data
  »Handle DAQ synchronization issues (if any)
2. Calibration processing and buffering of:
 Radioactive source calibration data
 Laser/LED calibration data
3. Support a VME data spy for monitoring

9 HTR Board Description
Digital fiber data from front-end (FE)
–16 fibers per HTR, 3 channels/fiber, 1.6 Gbaud link
–Bit Error Rate (BER) requirement < 1 in 10^12 bits
 We are shooting for < 1 in 10^15 bits, with optimism
 We have had to become experts in synchronous high-speed serial data transmission
–Total data rate 160 MByte/s per link = 2.56 GByte/s per board
–220 HTR boards running; total bandwidth into our system is 563 GByte/s (see the check below)
Data is stored in a circular buffer (pipeline)
Data is simultaneously sent to Level 1 for consideration
–Trigger primitives are prepared, associated with a crossing, and transmitted via daughterboards
–Level 1 latency ~100 clock ticks, sets the size of the pipeline
Level 1 accept 1 in 400 (on average)
–Data from trigger ±3 time slots is transmitted to DCC for DAQ after zero suppression
All logic and arithmetic implemented in FPGAs
–Field Programmable Gate Array; firmware written by Tullio Grassi
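A quick check of the bandwidth arithmetic quoted above (a sketch; the 8b/10b encoding overhead is my assumption about how 1.6 Gbaud maps to 160 MByte/s, not something stated on the slide):

```python
# Back-of-envelope check of the HTR bandwidth figures quoted above.
line_rate_baud = 1.6e9                            # 1.6 Gbaud serial link
payload_bytes_per_s = line_rate_baud * 8/10 / 8   # assumed 8b/10b encoding -> 160 MB/s
links_per_board = 16
boards = 220

per_board = payload_bytes_per_s * links_per_board
total = per_board * boards
print(f"{payload_bytes_per_s/1e6:.0f} MB/s per link")    # 160 MB/s
print(f"{per_board/1e9:.2f} GB/s per board")             # 2.56 GB/s
print(f"{total/1e9:.0f} GB/s into the system")           # ~563 GB/s
```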

10 HCAL Trigger/Readout (HTR) Board
All I/O on front panel
–Fiber digital data in; copper output to L1 and DCC
FPGA logic
–Fully programmable
[Block diagram: serial optical data from LC fibers into deserializers (8), recovered clock and data into the Xilinx FPGAs (TPG path, async FIFO); TTC/TTCrx broadcast and clocks (CLK80 crystal, SYS40/SYS80 clocks, PLL, TTC 40 clock x2) distributed from the Princeton fanout card (1 per VME crate); SLB outputs with RX_CLK40 and RX_BC0]

11 HTR Card Production Version (Rev 4)
[Photo with callouts: dual-LC O-to-E receivers, deserializers, Xilinx XC2V3000-4 FPGAs, VME interface, stiffeners, 6 SLBs, TTC mezzanine]

12 HTR Status
Production goal
–270 Rev 4 HTRs total
–Be "Ready for Crates" this fall/winter; CMS has postponed this until Spring 06
Current status: http://www.physics.umd.edu/hep/HTR/Rev4/checkout_HTRev4.html
–PCB manufacture complete, assembly underway; vendor produces about 20/week; some parts procurement issues, nothing major
–Checkout at Maryland, shipping to CERN; currently about 200 boards have passed checkout and been shipped to CERN
–Will finish the remainder by Nov 1 this year
–We'll have plenty of HTRs to meet near-term work needs for HCAL

13 HTR Firmware
Tullio Grassi is the sole author
–Very mature, well simulated, battle-tested in numerous HCAL test stands and test beams
Items left to be done:
–Summing of the HB/HE overlap and HF 2x3 (η × φ) regions; we need the mapping to start this
–Zero suppression for the DAQ path (to get to 15% occupancy), see the sketch below
 Current scheme: put a threshold on the TPG associated with the crossing
 Need MC input on how to do this right
Histogram firmware for HCAL sourcing done
–New scope introduced in 2003, written and supported by Baden
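A minimal sketch of the current zero-suppression scheme (illustrative Python, not the FPGA firmware; the threshold value and data layout are hypothetical):

```python
# Sketch of the zero-suppression scheme described above: a readout channel is
# kept only if the trigger primitive (TPG) associated with the triggered
# crossing is above a threshold. Threshold and data layout are made up.
TPG_THRESHOLD = 4   # hypothetical threshold in TPG counts

def zero_suppress(channels, threshold=TPG_THRESHOLD):
    """channels: list of dicts with the channel's TPG at the triggered
    crossing and its raw time samples (trigger crossing +/- 3 slots)."""
    return [ch for ch in channels if ch["tpg"] >= threshold]

# Example: only channels whose TPG passes the cut are sent on to the DCC.
event = [
    {"id": 0, "tpg": 0, "samples": [1, 1, 2, 1, 1, 1, 1]},
    {"id": 1, "tpg": 9, "samples": [1, 2, 30, 55, 20, 5, 2]},
]
print(zero_suppress(event))   # keeps only channel 1
```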

14 Firmware
DAQ format evolving
–Maryland/Boston/Princeton collaboration
Top-level view: see http://www.physics.umd.edu/hep/HTR/preprod/PreProdMainFPGA.pdf

15 LHC Clocking
The LEP ring is sensitive to:
–Distortions in the large (27 km) circumference
 Tidal distortions
 Pressure from Lake Geneva
–Return currents from DC trains running nearby
LHC RF clock keeps 3564 buckets of protons circulating
–CMS must remain synchronous with this clock
–LEP was concerned about ΔE ~ a few MeV; LHC will be concerned with Δf ~ 25 ppm
We have learned to handle this…
[Plots: tidal effects, Lake Geneva effects, and the effect of the train to Bellegarde]

16 Timing Signal Distribution
Timing is critical in a synchronous pipeline experiment!
[Diagram: the Trigger Timing Control (TTC) stream ("RX_CLK") is fanned out to the HCAL VME crates (and to ECAL) through a tree of fanout boards, with rack-to-rack distribution over CAT 7 cable to the HTRs and DCC in each crate]

17 Fanout board
2 operating modes: Global or Crate
[Block diagram: TTC fiber into a TTCrx, QPLL (EXT 80 MHz, can run stand-alone), and an FPGA with programmable delay; drives 18 outputs (RX_CLK = 40 MHz, RX_BC0, TTC broadcast), with INT_BC0/EXT_BC0 selection and a Clk80 input from the GLOBAL fanout for Crate mode]

18 Luminosity

19 HF Luminosity
Clients:
–LHC accelerator needs something to tune on that is independent of whether CMS is running triggers
–We need something to put up on a screen in the control room to tell us our luminosity
–We need to keep track of our sensitivity in order to be able to measure cross sections
Schemes (there are several…), see the sketch below
–Use the HF detector, count min-bias overlaps
–Optimize for linearity with respect to luminosity
[Figure: HF, an iron/fiber calorimeter covering 3 < η < 5]
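One common way to turn min-bias overlap counting into a luminosity estimate is zero counting: the fraction of empty crossings gives the mean number of interactions per crossing via Poisson statistics. A minimal sketch of that scheme; the revolution frequency and bunch count are nominal LHC values, and the min-bias cross section and counts are assumed placeholders, not numbers from the talk:

```python
import math

# Zero-counting sketch: the probability of an empty crossing is P(0) = exp(-mu),
# so mu = -ln(P0). All numbers below are illustrative placeholders.
def mean_interactions(n_empty, n_crossings):
    p0 = n_empty / n_crossings
    return -math.log(p0)

f_rev = 11245.0          # LHC revolution frequency (Hz)
n_bunches = 2808         # nominal number of colliding bunch pairs
sigma_minbias_mb = 80.0  # assumed effective min-bias cross section (mb)

mu = mean_interactions(n_empty=120_000, n_crossings=1_000_000)
lumi = mu * f_rev * n_bunches / (sigma_minbias_mb * 1e-27)   # cm^-2 s^-1
print(f"mu = {mu:.2f}, L ~ {lumi:.2e} cm^-2 s^-1")
```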

20 HF Luminosity Readout Path
9 HTRs each for HF+ and HF-
Each HTR has 1 output with luminosity info
–100 Mbps raw ethernet packets sent to a router
Router to computer over Gigabit ethernet
Dead time, throttle, etc. info from the GCT sent to the CPU
This computer will feed the LHC, the luminosity DB, etc.
[Diagram: HF+ and HF- crates (9 HTRs per VME crate) feeding a router, then a CPU that also receives Global Trigger information and serves the luminosity consumers; Maryland/Princeton/Virginia]

21 HF Luminosity R&D
Overall responsibility of Dan Marlow…
Additional mezzanine card to sit on the SLB site
–Built and tested at Maryland
Firmware makes occupancy and sumET histograms in real time (see the sketch below)
–Joint Maryland (Baden) and Minnesota (Mans)
–Send histograms via ethernet to CPU – WORKING!!!
[Plots: example histograms labeled TOP and BOT]
Crunch data on CPU, fan out to LHC, database, etc.
–Virginia (Hirosky)
Combine with live time/trigger scalers for the luminosity database
–Princeton (Marlow)
Goal
–Have all of the hardware/firmware working in SX5 by summer 06
–Spend the remainder of the time integrating into the LHC (easy) and CMS (not easy)
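A minimal software model of the real-time histogramming described above (not the FPGA firmware; the threshold and bin edges are illustrative assumptions):

```python
# Model of the real-time histogramming: for every bunch crossing, count towers
# over threshold (occupancy) and histogram the scalar ET sum (sumET).
OCC_THRESHOLD = 2.0      # hypothetical tower ET threshold (GeV)
SUMET_BINS = [0, 5, 10, 20, 40, 80, 160, 1e9]   # hypothetical bin edges (GeV)

occupancy_hist = {}                       # occupancy value -> count
sumet_hist = [0] * (len(SUMET_BINS) - 1)  # counts per sumET bin

def fill(tower_et):
    """tower_et: list of per-tower ET values for one bunch crossing."""
    occ = sum(1 for et in tower_et if et > OCC_THRESHOLD)
    occupancy_hist[occ] = occupancy_hist.get(occ, 0) + 1
    s = sum(tower_et)
    for i in range(len(SUMET_BINS) - 1):
        if SUMET_BINS[i] <= s < SUMET_BINS[i + 1]:
            sumet_hist[i] += 1
            break

# The accumulated histograms would periodically be shipped over ethernet
# to the luminosity CPU and then reset.
fill([0.1, 3.2, 0.0, 5.5])
print(occupancy_hist, sumet_hist)
```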

22 Self-Triggering / 2006 Slice Tests

23 2006 Tests
Magnetic field mapping
Cosmic tests with HCAL and MUON operating synchronously
Operation of a "vertical slice" of CMS
–HCAL+ECAL+MUON+"mini tracker"+Solenoid
–Add trigger and DAQ and other central systems
HCAL
–We will begin preparations for this by implementing a "self-triggering" capability this fall
 Joint effort by Jeremy Mans/Drew Baden/Tullio Grassi
 Allows us to understand muons in our detector NOW
–Uses the same "luminosity" mezzanine card plus a majority logic board

24 "Vertical Slice"
[Diagram: EB supermodule(s); TK (elements in dummy tube to be defined); TK dummy tube with alignment disk & cables; sectors 10 and 11; DTs in sector 10 of YB+2, YB+1 and CSCs of the YE+1 lower 60 deg sector provide the principal triggers; HB+ active sectors]

25 DAQ Software

26 DAQ Software
In the last year, major work has been completed on the HCAL DAQ and controls software to prepare for operations in 2007:
–Updated to the new revision of the framework (XDAQ)
–Connections to the online database
–Firmware management and validation
–Improved debugging using a web interface
HCAL DAQ software is very well advanced and ready for the cosmic challenge
Jeremy Mans (now at UMN) is responsible for 99% of this!

27 LHC Upgrade / HF Jet Trigger

28 LHC Upgrade
Add functionality to HCAL
–W Boson Fusion (WBF) is the dominant experimentally accessible rate
 Forward jets + central Higgs decay
 Tag jets are in HF+HE, so HE will need to be included
–Higgs ID without a tag is very hard
 Gluon fusion backgrounds are too high, especially at 10^35
–Current trigger at high luminosity will be difficult
 Depends on the scheme for increasing luminosity, of course…
[Plot: η of tagged "forward" jets – C. Tully & H. Pi, JetMet PRS, Aug 2004]

29 HF Jet Trigger
Topology:
–Each HF HTR receives 12 x 4 (η x φ) towers
–Need 9 HTRs per side for the long fibers
–Current trigger gangs 3 x 2 (η x φ) towers for jet clustering
New design (see the sketch below):
–Implement a 4x4 sliding window at the tower level
–Add jet isolation for triggering on the "tagged" jet in WBF
–Implement in one 9U VME board using FPGAs for the DSP'ing
[Diagram: HTR cards]
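A minimal sketch of the proposed 4x4 sliding-window clustering with an isolation cut (illustrative Python, not the FPGA design; the grid size, local-maximum handling, threshold, and isolation ratio are assumptions):

```python
# Sketch of a 4x4 sliding-window jet finder over a tower grid.
N_ETA, N_PHI = 12, 36          # assumed HF tower grid for one side

def window_sum(et, ieta, iphi, size=4):
    """Sum ET in a size x size window; phi wraps around, eta does not."""
    total = 0.0
    for de in range(size):
        for dp in range(size):
            e, p = ieta + de, (iphi + dp) % N_PHI
            if 0 <= e < N_ETA:
                total += et[e][p]
    return total

def find_jets(et, threshold=20.0, iso_ratio=0.1):
    """Return (ieta, iphi, ET) for 4x4 windows above threshold whose
    surrounding 8x8 ring carries less than iso_ratio of the window ET."""
    jets = []
    for ieta in range(N_ETA - 3):
        for iphi in range(N_PHI):
            core = window_sum(et, ieta, iphi, 4)
            if core < threshold:
                continue
            ring = window_sum(et, ieta - 2, iphi - 2, 8) - core
            if ring < iso_ratio * core:     # isolation cut for a "tagged" jet
                jets.append((ieta, iphi, core))
    return sorted(jets, key=lambda j: -j[2])

# Usage with a toy grid: one isolated hot 4x4 region starting at (eta, phi) = (5, 10).
grid = [[0.0] * N_PHI for _ in range(N_ETA)]
for e in range(5, 9):
    for p in range(10, 14):
        grid[e][p] = 2.0            # 16 towers x 2 GeV = 32 GeV core
print(find_jets(grid))              # -> [(5, 10, 32.0)]
```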

30 Who, What, When…
HTR mezzanine card (Maryland)
–Transmits data to the jet clustering card
Jet clustering card (Minnesota)
–Forms jets, sorts, transmits to the calorimeter trigger boards
Simulation and design and…
–Will eventually include Virginia, Princeton, Wisconsin… and other CMS collaborators
Goal
–Have something that is working parasitically (at least) during the maiden CMS run
–Show proof of concept of a design scalable to the rest of HCAL and all of ECAL
–This should be the prototype of the upgrade to the CMS calorimeter trigger

31 Conclusions

32 Schedule of Activities at UMD
2006 Testbeam
–Run with real ECAL in front – this is our chance for a realistic calibration
–Continuing tests of HTR firmware and links
Spring 2006 "Slice tests"
–"Cosmic Challenge"
 Test the Level 1 trigger path with a "slice" of the trigger using cosmics
 Integrate with MUON, ECAL, Tracker
 First CMS synchronization
 First tests of a "slice" of the DAQ
–All HCAL TriDAS production cards involved
Finish production
Spring 06 beneficial occupancy of USC
–Installation of all racks, crates, and cards
–Firmware / software / timing / troubleshooting
Continue HF Luminosity development
Continue HF WBF Trigger project with Jeremy Mans

33 Overall TriDAS Project Cost
Item                        Cost
Effort:   Engineering       $802,669
          Technician        $138,684
          Total             $941,353
M&S:      R&D               $218,100
          Production        $1,929,374
          Total             $2,147,474
Misc:                       $45,000
Grand Total                 $3,133,827
Contingency:
–Effort: 50%
–M&S: 75%
–Based on the uncertainty in the requirements, which will certainly change over time.

34 HCAL TriDAS Summary
Project
–~6% of the total HCAL project
–Finish up the construction phase now, move to the M&O phase
 Production finished by Nov, then installation/integration next
TriDAS progress at UMD
–HTR card design successful, including fab/assembly challenges
–Current manpower has proved adequate to the task
–Integration engineer (Tullio Grassi) has proved invaluable to the USCMS/HCAL effort
 Hopefully we can convince him to continue on M&O at CERN, but no longer as a UMD member
–Synchronization and clock issue progress
 Successful tests at CERN in testbeams, integration tests with the calorimeter trigger, etc.
–HCAL is far ahead of most subsystems; we will be contributing a lot of expertise now
–FPGA programming successful (thanks to Tullio – big responsibility, well done)
 Firmware coded/simulated long before the hardware implementation was ready; this allowed us to be ready as soon as the hardware arrived!
Conclusion: HCAL TriDAS project in very good shape all around
–We have worked like dogs for 5 years!!!! (and loved [almost] every minute of it!)
–HF Luminosity and Trigger projects will keep us busy

