Slide 1: Electronics Issues
Drew Baden, University of Maryland, USCMS HCAL
CMS Electronics Week, November 2002

Slide 2: FE/DAQ Electronics
Block diagram of the front-end and DAQ electronics. On the detector, behind the shield wall: the RBX readout box with the HPD and an FE module carrying the QIE, CCA, and GOL. Fibers at 1.6 Gb/s, 3 QIE channels per fiber, cross the shield wall to the readout crate. Readout crate: 12 HTRs and 2 DCCs per crate, TTC, and a rack CPU (SBS). The HTRs send trigger primitives to the calorimeter regional trigger and data to the DCCs; the DCCs drive the DAQ over S-Link (64 bits @ 25 MHz). Other rates labeled on the diagram: 32 bits @ 40 MHz and 16 bits @ 80 MHz.
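As a quick consistency check of the labeled data rates (a sketch only; the slide does not spell out which link each width/clock pair belongs to):

```python
# Rough bandwidth arithmetic for the data-path labels on this slide (sketch).
rates_gbps = {
    "serial fiber link":        1.6,              # 1.6 Gb/s, 3 QIE channels per fiber
    "32 bits @ 40 MHz":         32 * 40e6 / 1e9,  # 1.28 Gb/s
    "16 bits @ 80 MHz":         16 * 80e6 / 1e9,  # 1.28 Gb/s (same payload, half width, double clock)
    "S-Link, 64 bits @ 25 MHz": 64 * 25e6 / 1e9,  # 1.60 Gb/s
}
for name, gbps in rates_gbps.items():
    print(f"{name:26s} {gbps:.2f} Gb/s")
```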

Slide 3: Readout VME Crate
– "BIT3" board: slow monitoring over VME; commercial VME/PCI interface to the CPU
– FanOut board: takes the TTC stream in; clones and fans out the timing signals
– HTR (HCAL Trigger and Readout) board: FE fiber input; TPG output (via SLBs) to the calorimeter regional trigger; DAQ/TP data output to the DCC; spy output over VME
– DCC (Data Concentrator Card) board: input from the HTRs; spy output; output to DAQ
Diagram labels: VME crate holding the BIT3, FanOut, HTRs, and DCCs; TTC fiber in; 1.6 Gb/s fibers from the front-end electronics; 20 m copper at 1.2 Gb/s to the calorimeter regional trigger; DCC output to DAQ.

Slide 4: HCAL Racks
– HCAL will need 8 racks, 2 crates/rack
– ~200 HTR cards, ~3000 fibers, and ~525 SLBs with TPG cables
– All I/O via front panels
– Doors on the front of the rack required: sufficient standoff to satisfy the fiber curvature requirement, and keeps people from pulling out the fibers
– Two 6U panels for cable/fiber support
– Computer access in front of the rack for fiber/TPG installation: wireless in the counting room? laptop/monitor/keyboard mounted somewhere close?
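A rough consistency check of these counts (a sketch; the per-card fiber count is taken from the HTR schematic slide later in the deck and the HTR-per-crate number from the power slide, so treat the result as order-of-magnitude only):

```python
# Order-of-magnitude check of the rack/fiber/SLB counts (sketch; per-card numbers assumed).
racks, crates_per_rack = 8, 2
htrs_per_crate = 13                  # "13 or fewer HTR/crate" (power slide)
fibers_per_htr = 16                  # two 8-way fiber connectors (HTR schematic slide)

htrs   = racks * crates_per_rack * htrs_per_crate   # 208, quoted as ~200
fibers = htrs * fibers_per_htr                      # ~3300, quoted as ~3000 (not every slot full)
slbs   = 525                                        # quoted; ~2.6 per HTR on average, below the max of 6
print(htrs, fibers, slbs)
```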

Slide 5: TPG Cable Issues
– Amphenol skew-clear cables work OK at 20 m: skew spec ~125 ps at 20 m running at 1.2 Gbaud; the eye pattern survives, BER = 10^-15
– Each cable carries 2 pairs, so 2 cables are needed per SLB connector: $100/cable + ~$150 for assembly/testing (custom connector molding)
– Electrically these are very nice cables, but the mechanical challenges are formidable: 6 of these beasts per HTR!
– We are investigating quad cable, which is much thinner: a single cable, ~$180 for 20 m; it would not require custom molding, so assembly is much cheaper (~$30)
– However, skew is almost 2x worse for 20 m (230 ps): the Amphenol spec says this gives 10^-15 BER for 15 m at 1.6 Gbaud; they were not clear about 1.2 Gbaud, so we will measure
– If at all possible, a 15 m spec will: save money (~$100k), give breathing room on BER, save 1 clock tick in L1 latency, and decrease mechanical risks on all boards
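A back-of-envelope version of the cost comparison (a sketch; it assumes one quad cable replaces the two skew-clear cables at each of the ~525 SLB connectors and ignores spares and any length savings, so it only indicates the scale behind the quoted ~$100k):

```python
# Back-of-envelope TPG cable cost comparison (sketch; spares and length savings ignored).
n_slb_connectors = 525                        # ~525 SLBs with TPG cables (rack slide)

skew_clear_per_connector = 2 * (100 + 150)    # 2 cables needed, $100/cable + ~$150 assembly/testing
quad_per_connector       = 1 * (180 + 30)     # single quad cable, ~$180 + ~$30 assembly

saving = n_slb_connectors * (skew_clear_per_connector - quad_per_connector)
print(f"${skew_clear_per_connector} vs ${quad_per_connector} per connector, "
      f"difference ~${saving/1e3:.0f}k in total")
# Comes out near $150k at face value; the ~$100k quoted on the slide presumably
# reflects more conservative assumptions.
```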

Slide 6: VME Rack Layout
– 56U total rack height, 55U used (note: 3U can be recovered by using 1U heat exchangers)
– Rack computer (3U): air circulation has to be front-to-back, as in the DAQ crates
– Recirculation/monitoring (4U), plus an extra heat exchanger
– 2 VME crate zones, each with: cable support (6U), since all I/O is front-panel only and the fibers and TPG cables are formidable; the VME crate itself (9U); an air/water heat exchanger (2U)
– Fan tray (2U)
– Power supply zone (6U): cheaper, robust, safe (D0/CDF experience); air transport issue here; will have to build a wire harness; put the AC circuit breakers here?
– Return air guide (2U)
Rack drawing, top to bottom: return air guide (2U); SBS crate CPU; air/water heat exchanger (2U); 9U VME crate (HTRs, 2 DCCs, TTC fanout); cable strain relief (6U); air/water heat exchanger (2U); second 9U VME crate; cable strain relief (6U); air/water heat exchanger (2U); recirculation fan and rack protection (4U); rack computer (dual Intel/Linux, 3U); fan tray (2U); power supply zone (6U).

Slide 7: Power Consumption Estimates
– VME crate ~1.2 kW (2 crates/rack only)
– HTR: ~70 W/slot (7 A @ 5 V = 35 W; 11 A @ 3.3 V ≈ 36 W); this includes 6 SLBs, but many cards will have fewer; 13 or fewer HTRs/crate = 910 W
– Fanout card: ~20 W/slot (0.5 A @ 5 V = 2.5 W; 4.5 A @ 3.3 V ≈ 15 W)
– DCC: ~60 W/double slot (5 A @ 5 V = 25 W; 10 A @ 3.3 V = 33 W); the S-Link64 current draw is a wild guess; 2 DCCs/crate = 120 W
– Add power for the rack computer, monitor, and fans to get the rack power: 1 kW max
– Total power dissipated by the entire rack ~3.5 kW
– Note: the current CMS power spec is ~2 kW/crate, >6 kW/rack
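The per-slot numbers above are just current times voltage; a minimal sketch reproducing the crate and rack totals from the HTR, Fanout, and DCC cards alone (the quoted totals sit a little higher, presumably rounding up and allowing margin):

```python
# Power budget from the per-card currents quoted above (sketch; slide values are rounded).
def watts(amps_5v=0.0, amps_3v3=0.0):
    return amps_5v * 5.0 + amps_3v3 * 3.3

htr    = watts(7, 11)      # ~71 W, quoted as ~70 W/slot
fanout = watts(0.5, 4.5)   # ~17 W, quoted as ~20 W/slot
dcc    = watts(5, 10)      # ~58 W, quoted as ~60 W/double slot (S-Link64 draw is a guess)

crate = 13 * htr + fanout + 2 * dcc   # ~1.06 kW from the cards alone (quoted ~1.2 kW/crate)
rack  = 2 * crate + 1000              # 2 crates/rack plus up to 1 kW for computer/monitor/fans
print(f"crate ~{crate:.0f} W, rack ~{rack/1e3:.1f} kW")   # ~3.1 kW, quoted as ~3.5 kW
```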

Slide 8: Production Schedule
– Front-End: CCM Jun-Sep 03; FE cards Jul-Oct 03; RBX Aug 03-May 04; HPD deliveries from now until Apr 04
– HTR: pre-production Jan 03; production Apr 03-Sep 03
– DCC: motherboards nearly complete, logic cards by Aug 03; awaiting final specs on S-Link64
– Fanout card: complete soon after the QPLL, Q2/03

Slide 9: HCAL TriDAS Integration Status
– First integration completed, summer 02: FE → HTR → DCC → S-Link → CPU
– All links well established; no obvious clocking problems
– Work needed on synch monitoring and reporting
– Improvements expected using a crystal for the TI refclk; we will always have the TTC/QPLL clock as a backup
– HTR firmware fairly mature: the switch to Virtex2 is all but complete
– TPG and BCID ready but not tested: to commence when the next HTR version and the Wisconsin TPG boards are delivered (testing estimated to commence Q1 2003); this will be the main effort when the next HTR version arrives in Jan 2003

Slide 10: HTR Production Schedule Issues
– Parts availability for the HTR (Stratos LCs, FPGAs, etc.): so far so good; should make the schedule OK
– QPLL not needed for the HTR, since we have a Fanout card per crate
– Firmware requirements: will learn a lot in 2003; very far along now

Slide 11: Integration Goals 2003
– Continued development of HTR and DCC firmware
– Commission the TPG path: firmware requirements/logic, LUTs, synchronization, SLB output; monitoring, error reporting, etc.
– Preliminary US-based integration at FNAL, Q1/03: full system as in the previous testbeam, except TPG, which will be done initially at UMD and moved to FNAL if appropriate
– Testbeam in the summer (to begin in spring): same goals as summer 02; support the calibration effort and continue commissioning the system
– Operate a "vertical slice" for an extended period, Q4/03: fully pipelined, monitoring, TPG, DAQ, synchronization, clocking
– Develop software to support DAQ activities: testbeam software improvements; software for commissioning the HTR is needed, to let us verify fiber mapping, download LUTs, check the firmware version, etc. (a sketch of such a check follows this slide)
– By the end of 2003 we will have most of the HCAL TriDAS functionality
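As an illustration of the kind of commissioning software meant here, a minimal sketch; the VME access object, register addresses, and ID words are invented for illustration only and are not the real HTR register map:

```python
# Hypothetical HTR commissioning check (sketch; vme.read16, the addresses and the
# ID words below are placeholders, not the actual interface).
EXPECTED_FW = 0x0310                          # expected firmware version (placeholder)
EXPECTED_FIBER_ID = {0: 0x0A01, 1: 0x0A02}    # fiber -> ID word sent by the FE emulator (placeholder)

def commission_htr(vme, slot):
    fw = vme.read16(slot, 0x0000)                   # firmware-version register (assumed address)
    if fw != EXPECTED_FW:
        print(f"slot {slot}: firmware {fw:#06x}, expected {EXPECTED_FW:#06x}")
    for fiber, expected in EXPECTED_FIBER_ID.items():
        seen = vme.read16(slot, 0x0100 + fiber)     # captured per-fiber ID word (assumed address)
        if seen != expected:
            print(f"slot {slot} fiber {fiber}: got {seen:#06x}, expected {expected:#06x} "
                  "-> check fiber mapping")
```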

Slide 12: Installation Schedule
Timeline chart spanning the quarters of 2003 through 2006. Milestones shown include: vertical slice; testbeam; production; racks; install crates in USC; HCAL alcove tests (HB/E/O?); first integration with L1; cabling of HB/F, HE, and HF; HB in UXC; HE+ and HE-; HF in UXC; HF mated; HO in UXC; TriDAS integration. With no detector available for much of this period, we rely on the emulator for further commissioning/debugging.

Slide 13: Installation Requirements
– Production cards will be available for all systems
– The front-end emulator will be important: there is no other way to light up the fibers during installation; the design is very close to the actual front-end card (GOL, not TI); built by FNAL with close interaction with UMD on the board; UMD firmware
– The HCAL mapping nightmare will have to be implemented very carefully
– We will need to connect to the rack CPU from inside the shield wall as we plug the fibers in one at a time
– We will need audio communication between operators inside the shield wall and at the VME racks

Slide 14: HCAL Installation
– We have a modest amount of stuff to be installed in USC: 8 VME racks, 16 crates; ~3000 fibers into the front of the HTR cards (fibers laid by CERN personnel?); ~525 TPG cables from the HTRs to the RCT (we will provide a technician for installation)
– We will have 3 senior physicists at CERN: Pawel de Barbaro, Laza Dragoslav, Dick Kellogg
– Other personnel: post-doc(s), student(s), US-based physicist(s)
– In USC: need a place for someone to sit in front of the HTRs while the fibers are being plugged in; access via the HCAL rack computer for VME access to the cards; wireless in the counting house? a mounted monitor/keyboard to interface with the rack computer? both might be good; will there be cabinets, work benches, etc.?

Slide 15: Installation Manpower Estimates
– Drawing on D0 Level 2 experience for the current Tevatron Run 2a
– Each significant card requires on-site expertise: probably 1-2 at postdoc level (or above) and 1 engineer; maybe the same engineer for both DCC and HTR
– HCAL will have an electronics setup at CERN
– Total personnel estimate: Front End 1, HTR 2, DCC 2, miscellaneous (grad students, transients, etc.) maybe 4? Very difficult to say with any accuracy

Slide 16: HCAL Clocking System
Goals:
– FE fiber physical-layer synchronization and locking
– FPGA clock phase-locked with the LHC clock
– Be able to achieve TPG alignment
– Keep track of and handle BC0/EVN
– Correct tagging of the L1A bucket inside the Level 1 pipeline
Known issues:
– Random 4-5 clock latency within the TI deserializer
– Quality of the TTC/QPLL clock jitter
– Whether we can use crystals for the TI refclk
Unknown issues: good point!

Slide 17: FE Clocking
– TTCex fiber input to the CCM: Agilent fiber receiver + TTCrx chip + QPLL
– 40 MHz clean clock converted to PECL
– 40 MHz clean PECL clock driven by a 1-to-9 clock driver onto the backplane to the FE modules

Slide 18: FE Link
– Issue: the FE uses a GOL Tx and a TI SerDes Rx (TLK2501)
– The TLK2501 requires refclk jitter < 40 ps pk-pk, equivalent to a 6.5 kHz bandwidth on the PLL
– Frequency offset < ±100 ppm, equivalent to ±4 kHz on f_LHC
– NB: commercial applications always use crystals
– Solutions: use a crystal for the refclk, or the QPLL (jitter spec < 50 ps): http://proj-qpll.web.cern.ch/proj-qpll/qpllHome.htm
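The ±4 kHz figure is just the ±100 ppm tolerance applied to the LHC bunch-crossing frequency; a one-line check:

```python
# +/-100 ppm frequency-offset tolerance expressed in absolute terms at the LHC clock (sketch).
f_lhc_hz = 40.0789e6                     # LHC bunch-crossing frequency
offset_hz = f_lhc_hz * 100e-6            # 100 ppm
print(f"+/-{offset_hz/1e3:.1f} kHz")     # ~4.0 kHz, matching the slide
```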

Slide 19: HTR Schematic
Block diagram of the HTR. Labeled elements: fibers into Stratos LC receivers and TI deserializers feeding two Xilinx XC2V FPGAs (8 fibers each, via 8-way connectors); SLBs with LVDS output to the Level 1 calorimeter trigger; output to the DCC; a VME FPGA; backplane connectors P1 and P2 only (no P3!).

Slide 20: Clocking Schematic
– Start with the Fanout card: TTCrx on a Maryland mezzanine card or the CERN TTCrm daughterboard, plus the QPLL; fanout on Cat6/7 quad twisted pair of TTC, BC0, 40 MHz, and 80 MHz
– In the HTR: send the TTC signal to the TTCrx mezzanine board, giving access to all the TTC signals; send the 80 MHz clean clock (cleaned by the QPLL) to a mux; select either the 80 MHz clean clock or a crystal for the TI deserializers
Diagram labels: single-width VME TTC Fanout board with TTCrx, QPLL, crystal, and 1-to-8 fanouts of the 80 MHz (LVPECL) and 40 MHz clocks and BC0; Cat6/7 quad cable (allows LVDS/PECL); on the HTR, the TTC mezzanine, the TTC broadcast bus, the 16 TI deserializers, the two FPGAs, and the SLBs receiving BC0 and the 40/80 MHz clocks.

Slide 21: HCAL TriDAS Clock Scheme
Block diagram ("CC" means clean clock): the Fanout card (TTCrx + QPLL) drives a Cat6/7 cable, RJ45 at both ends, carrying 4 twisted pairs: TTC, BC0, CC40, and CC80. On the HTR board, the TTC mezzanine distributes the TTC broadcast, L1A, BCR, EVR, and CLK40 to the SLBs and Xilinx FPGAs, along with CC40, CC80, and BC0.

Slide 22: Fanout – HTR Scheme (Tullio Grassi)
Block diagram of the clock/TTC distribution between the Fanout board and the HTR over Cat6E or Cat7 cable (~15 RJ45 connectors on the Fanout board; cables and connectors TBD; 15 connectors on the bottom layer; 9U front-panel space = 325 mm, so space per connector ~21.5 mm).
Fanout-board labels: O/E receiver for the TTC fiber into a TTCrx (or daughter card) plus QPLL; a low-jitter fanout (x15) of CLK40 and CLK80 (LVDS or differential PECL, per AN1568/D Fig. 11, onsemi.com; PECL fanout e.g. DS90LV110; NB100LVEP221 is LVDS compatible); a fanout buffer for BC0; a TTC FPGA fanning out (x15) Brdcst, BrcstStr, BCntRes, and L1A (CMOS); an 80.0789 MHz 3.3 V crystal with an MC100LVEL37 (CK, CK/2) as an alternative clock source; 2 test points for CLK40 and BC0.
HTR-side labels: TTC (LVDS) into the TTC daughter card (IN/IN_b), providing Brdcst, BrcstStr, L1A, and BCntRes to the Xilinx FPGAs and SLBs; RX_BC0 (LVDS); a DS90LV001 buffer; CLK80 (3.3 V PECL) through PCK953 LVPECL-to-LVTTL fanouts (top layer), giving 8 clocks to the TLKs plus test points on each half; CLK40 (3.3 V PECL) through an MC100LVE310 fanout (Q1-Q8): differential to the 2 Xilinx FPGAs with termination, differential to the 6 SLBs, single-ended to the 2 Xilinx.
Notes:
– SLBs require fanout of CLK40 and BC0; the FE link possibly requires CLK80
– The PECL fanout was tested in TB2002
– One Cat6E cable (low crosstalk) replaces the 2 Cat5 cables used in TB2002
– TTC and BC0 remain LVDS, as in Weiming's board
– The HTR needs the broadcast bus, BCntRes, and L1A: from the TTCrx if we get it to work; otherwise we have to fan them out

Slide 23: TTCrx Mezzanine Card
– Very simple card: 2 PMC connectors and a TTCrx chip; the TTC signal receiver and driver are on the motherboard
– Used by the HTR, DCC, and Fanout cards

Slide 24: TTC Distribution – Fanout Card
– Currently HCAL has 6 TTC partitions; each partition requires a TTCvi and a TTCex
– Each HCAL VME crate will have a single TTCrx receiving data directly from the TTCex, in a single VME card (the Fanout card)
– Fan out the TTC signal to the HTR mezzanine cards carrying the TTCrx chip, using quad twisted-pair Cat6/7 cable: TTC and BC0 fanout using LVDS; also fanout of the 40 and 80 MHz clean clocks over LVPECL
– Cost savings and simplification
– TTC monitoring by the Fanout card over VME

Slide 25: TTC Monitoring
Chris Tully has built a very nice TTC monitoring board:
– 6U VME form-factor board
– Needs only 5 V power, so it could be used as a standalone monitor with an appropriate battery
– Hosts a TTCrm module
– Front-panel LEDs display TTC system activity, a history of broadcasts, and the event-counter/bunch-counter values
– Useful for debugging and monitoring

Slide 26: TTC Display Board

Slide 27: Random Latency Issue
– Texas Instruments TLK2501 SerDes, run with an 80 MHz frame clock: 20 bits/frame, 1.6 GHz bit clock, 625 ps bit time
– Latency from the data sheet: "The serial-to-parallel data receive latency…fixed once the link is well established. However…variations due to… The minimum…is 76 bit times…the maximum is 107 bit times…"
– So the latency is 47.5 to 66.875 ns, a spread of 19.4 ns: it could cross a 40 MHz bucket boundary!
– How to fix? Two ways:
  – The SLB "knows" this latency; we will read it out after each reset
  – The HCAL LED has a fast rise time: we can pulse during the abort gap and align channels; this requires LED pulsing alignment
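The arithmetic behind the warning, as a quick check:

```python
# TLK2501 receive-latency spread compared to the 25 ns bunch spacing (sketch).
bit_time_ns = 1 / 1.6              # 1.6 GHz bit clock -> 0.625 ns per bit
lat_min = 76 * bit_time_ns         # 47.5 ns
lat_max = 107 * bit_time_ns        # 66.875 ns
print(f"{lat_min:.1f}-{lat_max:.3f} ns, spread {lat_max - lat_min:.1f} ns vs. 25 ns bucket")
# The ~19.4 ns spread is most of a bucket, so two resets of the same link can place
# its data in different 40 MHz buckets; hence the latency is read back from the SLB
# or re-measured with abort-gap LED pulses after every reset.
```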

Slide 28: TPG Alignment
– TPG alignment is performed in the SLB
– Necessary: all HTRs will send a common BC0 to the SLBs within each of the 16 VME crates
– A calibration procedure has to be performed for crate-to-crate alignment: initial alignment with LEDs, laser, etc.; final alignment with the first LHC beam data
– CMS should consider pushing for an initial beam with only 1 bucket populated: this will ensure successful alignment

Slide 29: DAQ Alignment
– DAQ data must also be aligned: we must know the L1A bucket for zero suppression
– Solution (as discussed on the previous slide): read the latency from the SLB, and have the FE send a known ID at a fixed offset relative to BC0 during the abort gap; compare the two for error checking (a small cross-check sketch follows)
– DAQ check on BC0 in the DCC for alignment: BC0 and the EVN will be sent with the data to DAQ
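To make the error-checking idea concrete, a small sketch of the cross-check; the function and its arguments are invented for illustration and are not an existing interface:

```python
# Cross-check of the two latency measurements described above (sketch; names invented).
def latencies_agree(slb_latency_bx, fe_id_seen_bucket, fe_id_nominal_bucket):
    """slb_latency_bx: link latency (in bunch crossings) read back from the SLB after a reset.
    fe_id_seen_bucket: bucket in which the known FE ID, sent at a fixed offset from BC0
    during the abort gap, was actually observed.
    fe_id_nominal_bucket: bucket where that ID would land with zero latency.
    The two measurements should give the same latency."""
    return (fe_id_seen_bucket - fe_id_nominal_bucket) == slb_latency_bx

print(latencies_agree(2, 3565, 3563))   # example: both methods say 2 BX -> True
```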

Slide 30: Miscellaneous Errors
– What happens if the DCC finds a mismatch in the EVN? The DCC will then issue a reset request to the aTTS system; the details are not yet defined, but it is fully programmable
– Fiber link/synchronization errors (GOL/TI): work out protocols to inform the DCC; reset requests to aTTS as well
– FE clock/GOL PLL link errors: if the GOL loses synch, the transmitter will send out IDLE characters; IDLE characters are illegal in a pipelined system, so the HTR will trap on IDLE as a signal that the FE/GOL is having trouble (a small sketch of this check follows)
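A behavioral sketch of the IDLE trap, in Python for illustration only; the real check lives in the HTR firmware, and the data-valid/error flags are shown the way a TLK-style deserializer presents them (names illustrative):

```python
# Behavioral model of "trap on IDLE" (sketch; frame flags are illustrative).
def scan_frames(frames):
    """frames: one (rx_dv, rx_er) pair per 80 MHz frame from the deserializer.
    In normal pipelined running every frame carries data (rx_dv=1); an IDLE frame
    (rx_dv=0) means the transmitter has dropped to sending IDLEs, i.e. the FE/GOL
    link is in trouble, so flag it."""
    for n, (rx_dv, rx_er) in enumerate(frames):
        if rx_er:
            yield n, "code/disparity error"
        elif not rx_dv:
            yield n, "IDLE received: FE/GOL link out of synch"

# Example: a healthy stream with one IDLE frame injected at position 5.
stream = [(1, 0)] * 5 + [(0, 0)] + [(1, 0)] * 3
for n, msg in scan_frames(stream):
    print(f"frame {n}: {msg}")
```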

