Peter Göttlicher, DESY-FEB, XFEL-DAQ-2007/03/05
XFEL-PIXEL Readout and Control


Slide 1 — XFEL-PIXEL Readout and Control
Outline:
- Detector requirements/proposal
- Basic concepts for experiments
- Control electronics
- Selecting/rejecting bunches
- Backend systems
- Time schedule
- Conclusion

Slide 2 — Detector requirements
Call for experiments:
- large pixel detectors: > 1 Mega-pixel
- large number of frames: the XFEL delivers 30 000 bunches/sec
- large dynamic range in photons/pixel/frame: distinguish 0 and 1 photon, the rest at the Poisson limit
With 2 bytes/pixel and 1 Mpixel: 60 GByte/sec
- but experiments will not handle the full frame rate
- data reduction by compression is not effective: statements range from nearly nothing to a factor 2-4
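The 60 GByte/sec on this slide follows directly from the stated numbers; a quick check, assuming the 30 000 bunches/sec XFEL rate mentioned on the schedule slide:

```python
# Back-of-envelope raw data rate. Assumed inputs are the slide's
# 1 Mpixel, 2 bytes/pixel, and the XFEL's 30 000 bunches per second.
pixels = 1_000_000
bytes_per_pixel = 2
bunches_per_sec = 30_000

raw_rate = pixels * bytes_per_pixel * bunches_per_sec  # bytes/s
print(f"raw rate: {raw_rate / 1e9:.0f} GByte/s")  # -> raw rate: 60 GByte/s
```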

Slide 3 — Three proposals
AP-HPAD: analogue pipeline Hybrid Array Pixel Detector
- Si-pixel: the ASIC stores signals into capacitors during the bunch train; digitization/data transfer in the pause between trains
- 400 frames/train
LSDD: Linear Silicon Drift Detector
- one dimension is coded into the drift time: 200 MS/s during the train; data transfer in the pause between trains
- 600 frames/train
LPD: Large Pixel Detector
- analogue pipeline during the train; digitization between trains
- 512 (256) frames/train
⇒ All extendable beyond 1 Mpixel
⇒ All store around every 5th-10th bunch
⇒ All might run in parallel

Slide 4 — Idea: Basic setup of all experiments
The setup splits into parts specific to the experiment (here: AP-HPAD) and parts generic for all experiments.

Slide 5 — Idea: Concept for the control electronics
General tasks:
- Boot
- Collect information
- Synchronize to XFEL clocks and time
- Synchronize interface electronics, backend and experimental area
- Generate/distribute clocks and data
- User interface
- Write monitoring/status to the backend
- Usage at other laboratories

Slide 6 — Selecting/rejecting bunches
Aim:
- best use of the limited pipeline in the detector head
- limit resources for the backend
Staged concept:
- store predefined bunches into the pipeline of the experiment
- fast reject (before the next bunch): hardware at the detector
- slow reject (after the bunch train): handles slower information, including information from the XFEL
- transfer to the backend system: more fancy selection in the CPU farm
Rates: hard to guess, input from science needed
- gas stream through the X-ray beam
- solid targets in the beam (just the first bunch)
- that is not all: who needs what, and how much?
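As an illustration only (the real selection runs in hardware and firmware), the staged concept can be sketched in a few lines; the predicate names and example numbers here are hypothetical stand-ins for the hardware veto and the slower XFEL information:

```python
# Sketch of the staged bunch selection: predefine -> fast reject -> slow reject.
def select_bunches(train, predefined, fast_veto, slow_veto):
    """Return bunch IDs surviving all three stages for one train."""
    # Stage 1: only predefined bunches enter the detector pipeline.
    stored = [b for b in train if b in predefined]
    # Stage 2: fast reject, decided before the next bunch arrives (hardware).
    kept = [b for b in stored if not fast_veto(b)]
    # Stage 3: slow reject after the train, using slower XFEL information.
    return [b for b in kept if not slow_veto(b)]

ids = select_bunches(range(10),
                     predefined={0, 2, 4, 6, 8},
                     fast_veto=lambda b: b % 4 == 0,  # e.g. a beam-loss flag
                     slow_veto=lambda b: b > 6)       # e.g. post-train info
print(ids)  # -> [2, 6]
```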

Slide 7 — Idea: Data stream to and concept of the backend system
Aim:
- collect complete frames into one CPU
- data in mass storage sorted for "offline" analysis
Allowed:
- calculations on frames before sending them to mass storage
- decisions to store/reject a frame inside the backend system
- rejecting frames on more fancy information from external signals
- writing data frame-wise to mass storage, which makes offline analysis easier
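A minimal sketch of such a backend decision; the signal sum, the external flag, and the threshold value are invented for illustration only:

```python
def backend_decision(frame, external_ok, min_signal=10.0):
    """Compute on one complete frame in one CPU, then decide store/reject."""
    total = sum(sum(row) for row in frame)      # example pre-storage calculation
    # Reject on an external signal or on the computed quantity.
    store = bool(external_ok and total >= min_signal)
    return store, total

frame = [[5.0, 5.0], [5.0, 5.0]]                # toy 2x2 "frame"
print(backend_decision(frame, external_ok=True))   # -> (True, 20.0)
print(backend_decision(frame, external_ok=False))  # -> (False, 20.0)
```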

Slide 8 — Interface: experiments to the backend system
Roughly: data from the pixels is written to RAM and read out in a different sequence. By that, one group of frames is transferred over one link to the backend. Still, 8 or more such modules transfer to the same backend system.
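One way to picture the reordering (a toy sketch, not the actual firmware; the frame and link counts are invented): frames land in RAM in arrival order, and reading them back in contiguous groups sends each group of frames over a single link:

```python
# 8 frames arrive in bunch order and are written to "RAM"; reading them
# back in two groups maps frames 0-3 to link 0 and frames 4-7 to link 1.
n_frames, n_links = 8, 2
ram = list(range(n_frames))                      # frame IDs in arrival order

group_size = n_frames // n_links
links = [ram[i * group_size:(i + 1) * group_size] for i in range(n_links)]
print(links)  # -> [[0, 1, 2, 3], [4, 5, 6, 7]]
```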

Slide 9 — Needed: a controlled time sequence, so that each CPU gets data from all interface modules without time conflicts.
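One such conflict-free sequence is a simple round-robin; this is a sketch under the assumption that the numbers of modules and CPUs are equal (the counts are illustrative): in time slot t, module m sends to CPU (m + t) mod n, so no CPU receives two streams at once, and every CPU sees every module after n slots:

```python
# Round-robin transfer schedule: slot t maps module m to CPU (m + t) % n_cpus.
def schedule(n_modules, n_cpus):
    return [{m: (m + t) % n_cpus for m in range(n_modules)}
            for t in range(n_cpus)]

for t, slot in enumerate(schedule(n_modules=4, n_cpus=4)):
    assert len(set(slot.values())) == len(slot)  # no CPU hit twice in one slot
    print(t, slot)
```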

Slide 10 — Idea: Concept of the backend system
Interface electronics → switch → CPU farm → mass storage
Stage 1: ≈ 100 links at 1 Gbit/s
Control electronics; feedback of monitoring data to the XFEL accelerator(?)

Slide 11 — Rates
- Not well defined what each experiment does
- AP-HPAD: 2 bytes/pixel/frame is reasonable
- LSDD: starts with oversampling; where to reduce to what has to be a common discussion
- LPD: rate expected to be similar to AP-HPAD, but I haven't seen their proposal
- First tests of compression were not effective; a recent statement for a single case was a factor 2-4
Estimate to storage:
1 Mpixel × 500 bunches/train × 2 bytes/pixel × 10 trains/s = 10 GBytes/sec
Discussion with inputs from backend/science needed to settle rates/uptime/technical effort/offline effort
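Checking the storage estimate against the slide's own factors:

```python
# Storage-side data rate from the slide's numbers.
storage_rate = (1_000_000       # pixels
                * 500           # bunches (frames) per train
                * 2             # bytes per pixel
                * 10)           # trains per second
print(f"{storage_rate / 1e9:.0f} GByte/s")  # -> 10 GByte/s
```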

Slide 12 — Time schedule
2007: experiments write proposals (15 April 2007)
end of 2007 on: designing, constructing, but also laboratory tests and usage at other beam lines
2009(?): DAQ needed for tests - no large CPU farm (LCLS has 120 Hz, not 30 000 bunches/sec)
2013: 1 Mpixel detectors at the XFEL: 3 × (10-20 GB)/sec
20xx: upgrades beyond 1 Mpixel (e.g. 4 Mpixel)

Slide 13 — Conclusion
- The concept is there, open to change and to doing it another way
- Two aspects: control electronics and backend system
- Ideas about the numbers exist, but a lot of questions are open
- Discussions needed with input on: effort in the backend, science requirements, offline possibilities, feasibility of the interface electronics
- A common effort for all experiments is a strong wish