
Slide 1: Data Acquisition for an LC Detector
G. Eckerlin
Report from the extended ECFA/DESY LC Workshops
ALCPG Workshop, Cornell, July 15th, 2003
Outline:
- Requirements and operation mode
- DAQ concept and basic design
- Feasibility and examples
- Current R&D efforts
- Summary

Slide 2
This report is based on the TESLA TDR and contributions at the ECFA/DESY LC Workshops.
For details see the last workshops of the series:
- 1st workshop: Krakow, Poland, Sep. 2001
- 2nd workshop: St. Malo, France, Apr. 2002
- 3rd workshop: Prague, Czech Rep., Nov. 2002
- 4th workshop: Amsterdam, Netherlands, Apr. 2003

Slide 3: Data Rates at the LC
Physics (example: 500 GeV TESLA):
- e+e- -> X: 0.0002/BX
- e+e- -> e+e-X: 0.7/BX
Beam-induced background:
- neutrons: 2000 n/BX
- e+e- pairs:
  - VXD inner layer: 1000 hits/BX
  - TPC: 15 tracks/BX
  - ECAL: 3000 cells/BX
-> Background dominates the rates!
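
The per-bunch-crossing numbers above already make the point; purely as a back-of-envelope sketch, the comparison below uses only the rates quoted on this slide plus the TESLA figure of 2820 bunch crossings per train from slide 5.

```python
# Back-of-envelope comparison of physics vs. background rates, per BX and per
# bunch train, using the numbers on this slide and 2820 BX/train (slide 5).
BX_PER_TRAIN = 2820

rates_per_bx = {
    "e+e- -> X (physics)":   0.0002,   # events/BX
    "e+e- -> e+e-X":         0.7,      # events/BX
    "VXD inner layer hits":  1000,     # hits/BX (pair background)
    "TPC background tracks": 15,       # tracks/BX
    "ECAL background cells": 3000,     # cells/BX
}

for name, rate in rates_per_bx.items():
    print(f"{name:24s} {rate:>8g}/BX  ->  {rate * BX_PER_TRAIN:>10g}/train")
# The annihilation physics rate (~0.6 events per train) is dwarfed by the
# beam-induced background occupancy, which is why background drives the DAQ design.
```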

Slide 4: DAQ Requirements
- The LC will have high luminosity: this gives high data rates and requires large event-building bandwidth.
- We want to do precision physics: this needs large event samples and good (controlled!) efficiencies, i.e. very low deadtime at high rates and high selection efficiency.
- We want to exploit new physics potential: we may be interested in as yet unknown signatures, so the event selection must be very flexible.
- The LC will have high-intensity beams: large variations of the background are possible, which again requires all of the above.

Slide 5: LC Bunch Structure
Values for 500 GeV (800 GeV) TESLA and [NLC 500 GeV]:
- repetition rate:   5 Hz    (3 Hz)     [120 Hz]
- bunches per train: 2820    (4500)     [192]
- bunch separation:  337 ns  (189 ns)   [1.4 ns]
- train length:      950 μs  (850 μs)   [260 ns]
- train separation:  199 ms  (332 ms)   [8.3 ms]
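
As a cross-check of what these numbers imply for the DAQ, the short sketch below derives the instantaneous and average bunch-crossing rates and the beam duty cycle for the TESLA 500 GeV column; all inputs are the values on this slide.

```python
# Derived quantities for the TESLA 500 GeV bunch structure (values from this slide).
rep_rate_hz       = 5          # trains per second
bunches_per_train = 2820
bunch_sep_s       = 337e-9     # 337 ns
train_length_s    = 950e-6     # 950 us
train_sep_s       = 199e-3     # 199 ms beam-free gap between trains

inst_bx_rate = 1.0 / bunch_sep_s                 # ~3 MHz while the train passes
avg_bx_rate  = bunches_per_train * rep_rate_hz   # ~14 kHz averaged over a second
duty_cycle   = train_length_s * rep_rate_hz      # fraction of time with beam

print(f"instantaneous BX rate : {inst_bx_rate/1e6:.1f} MHz")
print(f"average BX rate       : {avg_bx_rate/1e3:.1f} kHz")
print(f"duty cycle            : {duty_cycle*100:.2f} %")
# The ~200 ms beam-free gap is what allows the pipeline to be read out and the
# event selection to run entirely in software, with no hardware trigger.
```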

Slide 6: Software Trigger Concept
- Read out and store the complete bunch train in a pipeline -> no hardware trigger; up to 1 ms active front-end pipeline for TESLA (~300 ns for NLC)
- Zero suppression and data compression in the front end -> reduced bandwidth needs
- Software selection on the full data of a complete train -> full event information available to the software filter; store classified events according to (physics) needs
- Modular system (PC farms, switched network, ...) -> scalability
- Use commercially available hardware components and industry standards -> maintainability
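
To make the front-end side of this concept concrete, here is a minimal, purely illustrative sketch of zero suppression on a block of ADC samples; the pedestal, threshold and data layout are assumptions for the example, not anything specified in the talk.

```python
# Minimal illustration of front-end zero suppression: keep only samples above a
# pedestal-subtracted threshold, together with their channel/time coordinates.
# Pedestal, threshold and data layout are illustrative assumptions only.
from typing import List, Tuple

def zero_suppress(samples: List[List[int]], pedestal: int = 400,
                  threshold: int = 20) -> List[Tuple[int, int, int]]:
    """samples[channel][time_bin] -> list of (channel, time_bin, amplitude)."""
    hits = []
    for ch, waveform in enumerate(samples):
        for t, adc in enumerate(waveform):
            amp = adc - pedestal
            if amp > threshold:
                hits.append((ch, t, amp))
    return hits

# Example: 4 channels x 8 time bins, mostly pedestal, one real pulse.
raw = [[400] * 8 for _ in range(4)]
raw[2][3], raw[2][4] = 470, 455          # a hit spanning two time bins
print(zero_suppress(raw))                 # -> [(2, 3, 70), (2, 4, 55)]
```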

Slide 7: Data Acquisition Levels
(Diagram; its labels are listed below.)
- 1 ms pipeline, no trigger interrupt, sparsification
- Software event selection using the full information of a complete train
- Readout between trains (8-200 ms)
- Detector specific, common functionality
- Uniform interface, standardized
- Commodity hardware, high-level programming language

Slide 8: Impact on the Detector Readout
- 1 ms active pipeline
  - TPC: no gating -> reduce ion feedback with GEMs or Micromegas
  - VTX: high occupancy in the 1st silicon layer -> need column-parallel readout during the train
- Readout of the pipeline in between trains (8-200 ms)
  - Need sparsification at the detector level
  - If the readout time is larger than the train separation -> need parallel readout and buffering in the front end
- No active pipeline outside the train periods
  - Can the front-end electronics be switched off/on many times per second?
  - How to collect calibration data (cosmics)?
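
The "readout time vs. train separation" point can be made quantitative with a small sketch: if draining one train's pipeline takes longer than the gap to the next train, several trains must be draining in parallel, each in its own front-end buffer. The train separations are from slide 5; the per-train readout times below are purely hypothetical values for illustration.

```python
# How many parallel readout streams/buffers does a front end need if draining
# one train takes longer than the inter-train gap? Gaps are from slide 5;
# the per-train readout times are hypothetical, illustrative values.
import math

def parallel_streams_needed(readout_time_s: float, train_separation_s: float) -> int:
    """Trains that must be draining in parallel for readout to keep up with the beam."""
    return math.ceil(readout_time_s / train_separation_s)

for label, gap in (("TESLA", 199e-3), ("NLC/JLC", 8.3e-3)):
    for readout in (5e-3, 50e-3, 500e-3):        # hypothetical per-train readout times
        n = parallel_streams_needed(readout, gap)
        print(f"{label:8s} gap {gap*1e3:5.1f} ms, readout {readout*1e3:5.0f} ms"
              f" -> {n} parallel buffer(s)")
```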

Slide 9: CCD Vertex Detector Readout (LCFI Collaboration)
From SLD experience:
- 300 MPixel, 5 MHz readout, 8-bit ADC
- 960 Mbit/s, 30-100 kBytes/event
LC requirements:
- Main background from pair production, falling rapidly with radius
- L1/50 MHz: 4.3 hits/mm², L2/25 MHz: 2.4 hits/mm² (2500 pixels/mm²)
LC DAQ requirements:
- Maximum rate off the CCDs 3.3 TPixel/s -> sparsification needed
- 5-bit resolution sufficient, 10 MByte stored on the readout chip
- Readout per side: 5 LVDS links (1/layer), converted into 1 optical fiber
- DAQ can store 20% of the CCDs in raw mode (for diagnostics)

Slide 10: TPC Readout (Mike Ronan)
TPC electronics:
- 1-3 * 10^6 readout channels, need zero suppression / cluster finding
- 10 MHz digitizing, 10 bits (track point = 3 pads * 5 time bins)
Simulation:
- Physics event: 500 kword (incl. time stamp)
- Pair background: small compared to the physics event size
- For safety, 50 K random space points were used -> 0.1% occupancy
-> Occupancy less than 0.1% if you believe the background simulations (1% of the 1.5 * 10^9 voxels is assumed for the DAQ studies, to be safe)
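
A quick back-of-envelope check of the occupancy figure, using only the numbers on this slide (3 pads x 5 time bins per space point, 50 K space points, 1.5 * 10^9 voxels), lands comfortably below the quoted 0.1% level.

```python
# Back-of-envelope TPC occupancy check using the numbers quoted on this slide.
space_points  = 50_000        # conservative number of random space points
voxels_per_pt = 3 * 5         # 3 pads x 5 time bins per track point
total_voxels  = 1.5e9

occupied  = space_points * voxels_per_pt
occupancy = occupied / total_voxels
print(f"occupied voxels : {occupied:.0f}")
print(f"occupancy       : {occupancy:.2%}")   # well below the 0.1% conclusion
# Even the 1% figure assumed for the DAQ studies leaves a large safety margin
# relative to this simulated background level.
```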

Slide 11: Data Volume (from TESLA TDR)
[Table of per-detector data volumes from the TESLA TDR, not reproduced here]
-> 1.1 GBytes/s

Slide 12: Common DAQ for TESLA/NLC?
Common features of TESLA and NLC/JLC:
- Short time between bunch crossings inside a bunch train
- Long time between trains with nothing at all
- High luminosity within a few ms
Cannot afford deadtime inside a train:
- Need a pipeline active during the full train length: TESLA 1 ms, JLC/NLC 260 ns
- Readout before the next train (or in parallel to the active pipeline): TESLA 199 ms, JLC/NLC 8.3 ms
Can we do this?
- TESLA: 220 MB/train in 200 ms -> 1.1 GB/s bandwidth
- JLC/NLC: ~10 MB/train* in 8.3 ms -> 1.2 GB/s bandwidth
- (compare with LHC/CMS: 50-500 Gbit/s)
-> No problem
* JLC/NLC numbers are scaled TESLA numbers assuming identical background rates and detector setup
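
The bandwidth figures are simple arithmetic on the per-train data volumes and inter-train gaps quoted above; a short sketch:

```python
# Sustained event-building bandwidth from per-train data volume and inter-train
# readout time, using the numbers quoted on this slide.
cases = {
    "TESLA":   (220e6, 200e-3),   # 220 MB/train, read out in ~200 ms
    "JLC/NLC": (10e6,  8.3e-3),   # ~10 MB/train (scaled), 8.3 ms gap
}
for name, (bytes_per_train, readout_s) in cases.items():
    print(f"{name:8s}: {bytes_per_train / readout_s / 1e9:.1f} GB/s sustained")
# Both options land at roughly 1 GB/s, well below the LHC/CMS event-building
# figures quoted for comparison, hence the 'no problem' conclusion.
```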

Slide 13: Current R&D Activities
Detector readout R&D:
- Using existing (modified) electronics (TPC, HCAL, ...)
- Development of new front-end readout chips (LCFI, SiLC, CALICE, ...)
Test systems and tools:
- Existing setups for TPCs in several labs (based on STAR, ALEPH, ...)
- A new test system for ECAL/HCAL beam tests is being built (CMS based)
General remarks:
- First common tools for different detectors are being prepared
- Discussion of a general data handling model has started

Slide 14: An LC Data Handling Model
Data handling is logically divided into 4 layers:
- Front-end readout
- Central data collection
- Event building and processing
- Permanent storage (not covered here)
Between these layers, interfaces have to be defined to:
- Enable parallel development of components in different R&D groups
- Reduce dependencies and allow technology changes at a later stage
Control and monitoring will be an essential part:
- Clock distribution, bunch tagging
- Run control, process control, calibration, monitoring
Keep the design Global Detector Network aware!
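
Purely as an illustration of what "defining interfaces between the layers" could look like in practice, here is a minimal sketch; none of these class or method names come from the talk, they are invented placeholders under the stated four-layer split.

```python
# Illustrative only: one possible way to express the four-layer split as
# interfaces, so that front-end, data-collection and event-building components
# can be developed in parallel. All names here are invented placeholders.
from abc import ABC, abstractmethod
from typing import List

class FrontEndReadout(ABC):
    @abstractmethod
    def read_train(self, train_id: int) -> bytes:
        """Return zero-suppressed data for one bunch train."""

class CentralDataCollection(ABC):
    @abstractmethod
    def collect(self, train_id: int, fragments: List[bytes]) -> bytes:
        """Buffer and format the detector fragments of one train."""

class EventBuilding(ABC):
    @abstractmethod
    def select(self, train_record: bytes) -> List[int]:
        """Run software finders, return bunch crossings tagged for storage."""
```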

Slide 15: Functionality of the Layers
Front-end readout:
- Hardware: preamplifier, shaper, ADC, memory, ...
- Functions: zero suppression, hit/cluster finding, digital buffer, multiplexing
Central data collection:
- One unique intelligent interface card for all detectors
- Input buffer, data formatting, data reduction (data compression?)
- Output buffer and standardized interface to event building
Event building and processing:
- Commodity hardware: network switches, processor farm (PC?)
- Full event data processing, all bunch crossings of a complete train
- Software finders will tag 'bunches of interest' for permanent storage
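
To illustrate the last point, here is a toy sketch of how a software finder might tag 'bunches of interest' within a fully read-out train; the per-BX record format and the selection criterion (a simple energy and track-count cut) are invented for the example.

```python
# Toy illustration of software selection on a complete train: every bunch
# crossing of the train is available, and finders tag the interesting ones.
# The record format and the selection cuts are invented for this sketch.
from dataclasses import dataclass
from typing import List

@dataclass
class BunchCrossing:
    bx_id: int
    ecal_energy_gev: float   # e.g. total calorimetric energy in this BX
    n_tracks: int

def tag_bunches_of_interest(train: List[BunchCrossing],
                            min_energy_gev: float = 10.0,
                            min_tracks: int = 2) -> List[int]:
    """Return the BX ids within the train that should go to permanent storage."""
    return [bx.bx_id for bx in train
            if bx.ecal_energy_gev > min_energy_gev and bx.n_tracks >= min_tracks]

# A mostly empty 2820-BX train with two 'physics-like' crossings:
train = [BunchCrossing(i, 0.5, 0) for i in range(2820)]
train[137] = BunchCrossing(137, 92.0, 11)
train[2041] = BunchCrossing(2041, 350.0, 24)
print(tag_bunches_of_interest(train))   # -> [137, 2041]
```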

Slide 16: Coordinate and Foster DAQ R&D
To coordinate the DAQ R&D:
- Define building blocks and their functionalities
- Specify interfaces between these blocks
- Identify uncovered readout R&D areas
To foster R&D efforts:
- Encourage common test systems for different R&D groups
- Develop tools to ease beam tests
- Collect and provide information on existing R&D efforts
The increasing number of activities needs to be coordinated and fostered. A DAQ working group within the ECFA study will address this in Europe -> we should start to do this on a worldwide basis!

Slide 17: Summary
The software trigger concept is adequate for both LC options:
- Requirements look similar (operation mode, event rates, bandwidth)
- The impact on the detector front-end readout seems to be OK
Many R&D efforts have started worldwide on detector prototypes:
- Some already need R&D for the readout, others use existing electronics
- Common tools and setups are needed to ease beam tests
- Detector R&D efforts need to be followed to keep them compatible
Make this a worldwide effort:
- Contribute to the DAQ R&D groups and the DAQ page of the World Wide Study R&D: www.desy.de/~eckerlin/wwslc

