The APV Emulator (APVE)
Trigger Meeting: Greg Iles, 5 March 2002

What does the APVE do & why?

Task 1: APVE protects against buffer overflow
– The APV25 has a 10-event buffer in deconvolution mode.
– Readout of an event = 7 µs.
– Triggers arrive with a Poisson distribution, mean period = 10 µs.
– Finite buffer + random triggers => possibility of buffer overflow (see the rate comparison below).
– Overflow => dead Tracker (APV reset required).

Task 2: APVE detects loss of sync in a Tracker partition
– The FED provides the median APV pipeline address of all its channels and compares it against a "golden" pipeline address provided by the APVE.
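As a rough illustration of why a throttle is needed even though readout is, on average, faster than the triggers (a back-of-the-envelope comparison using only the numbers quoted above):

\[
R_{\mathrm{trigger}} = \frac{1}{10\,\mu\mathrm{s}} = 100\ \mathrm{kHz},
\qquad
R_{\mathrm{readout}} = \frac{1}{7\,\mu\mathrm{s}} \approx 143\ \mathrm{kHz}.
\]

On average the 10-deep buffer drains faster than it fills, but Poisson-distributed triggers cluster: a burst of L1As arriving more often than one per 7 µs fills buffers faster than they can be read out, and once all 10 are occupied the next trigger overflows the buffer and the APV must be reset.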

What does the APVE do & why? (Tracker APV in deconvolution mode)

Primary task: preventing buffer overflow in APVs
– It takes too long to send a 'buffers full' signal from the APVs in the Tracker to the Trigger Control System (TCS).
– We therefore require an APV close to the TCS.

Secondary task: synchronisation check
– All APV data frames include the memory cell (pipeline) address used to store the L1A data in the APV.
– The pipeline address is sent to all FEDs to ensure that all APVs are in sync.

[Slide diagram: APVE buffer states (1: Full, 2: Full, 3: Empty, ..., 10: Empty); the TCS decides whether to inhibit L1As and sends Reset and L1A (minimum L1A period = 75 ns); the APVE returns Busy and outputs data frames (frame period = 7000 ns); the FED checks its data against the pipeline address.]

How does the APVE work?

L1A throttle
– A counter keeps track of the number of filled APV buffers: an L1A INCREMENTS the counter, an output frame DECREMENTS it, and a Reset CLEARS it.
– The APVE must receive the same L1As and Resets as the APVs within the Tracker, or the system fails.
– When the counter reaches preset values it asserts Warn, followed by Busy (a firmware sketch of this logic is given below).

Synchronisation check
– The header of the APV data frame provides the pipeline address.

[Slide diagram: a real APV25 inside the APVE; L1A increments the buffer counter, the frame-output signal decrements it and Reset clears it; the counter value decides whether to assert Busy; header recognition on the APV data frame extracts the pipeline address sent to the FEDs.]
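The counter logic described above maps onto a few lines of firmware. The following is a minimal VHDL sketch, not the actual APVE code: the entity and port names are invented for illustration, and the Warn/Busy thresholds are taken from the measurements shown later (Warn after 5 triggers, Busy after 8), although they are programmable in the real design.

```vhdl
library ieee;
use ieee.std_logic_1164.all;
use ieee.numeric_std.all;

-- Hypothetical sketch of the APVE buffer-occupancy counter and throttle.
entity apve_throttle is
  generic (
    WARN_LEVEL : natural := 5;   -- assumed: Warn when 5 of the 10 buffers are filled
    BUSY_LEVEL : natural := 8    -- assumed: Busy when 8 of the 10 buffers are filled
  );
  port (
    clk       : in  std_logic;   -- 40 MHz LHC clock
    reset     : in  std_logic;   -- APV reset: CLEARS the counter
    l1a       : in  std_logic;   -- Level-1 Accept: INCREMENTS the counter
    frame_out : in  std_logic;   -- APV frame-output signal: DECREMENTS the counter
    warn      : out std_logic;
    busy      : out std_logic
  );
end entity apve_throttle;

architecture rtl of apve_throttle is
  signal occupancy : unsigned(3 downto 0) := (others => '0');  -- 0 .. 10 filled buffers
begin
  process (clk)
  begin
    if rising_edge(clk) then
      if reset = '1' then
        occupancy <= (others => '0');
      elsif l1a = '1' and frame_out = '0' then
        occupancy <= occupancy + 1;
      elsif frame_out = '1' and l1a = '0' then
        occupancy <= occupancy - 1;
      end if;
    end if;
  end process;

  -- Warn first, then Busy, as the buffers fill up.
  warn <= '1' when occupancy >= WARN_LEVEL else '0';
  busy <= '1' when occupancy >= BUSY_LEVEL else '0';
end architecture rtl;
```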

Task 1: L1A throttle

Timing critical
– Set by the control loop from the L1A inhibit gate within the Global or Local TCS to the APVE and back again.
– Want to assert Busy in < 75 ns; otherwise we lose an event buffer location in the APV for every 75 ns of delay, and loss of buffers increases dead time (see the headroom relation below).

Require fast access to the GTCS/LTCS to receive L1A/Reset and to send Fast Control signals.
– The TCS prefers a single set of Fast Control signals from the Tracker, so Fast Merge Module (FMM) signals should go via the APVE.

[Slide diagram: control loop in which the inhibit gate (inside the TCS) sends L1A & Reset to the APVE, which checks whether the buffers are full and returns Busy.]
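One way to quantify the '75 ns per lost buffer' statement, assuming the minimum L1A spacing of 75 ns quoted on the earlier slide: every 75 ns of loop latency is another trigger that can still arrive after the APVE has decided to assert Busy, so the Busy threshold must reserve roughly

\[
N_{\mathrm{reserved}} \;\approx\; \left\lceil \frac{t_{\mathrm{loop}}}{75\ \mathrm{ns}} \right\rceil
\]

buffers of headroom; for example a 150 ns round trip would leave only 8 of the 10 APV buffers usable before Busy must be raised.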

Deadtime

Control structure

WARNING
– The timing structure of the L1As and Resets received by the APVE and by the APVs within the Tracker must be identical.
– How are Resets issued by the TCS transformed into '101' resets for the APV? Also relevant for '11' calibrates.
– The APVE will NOT operate if the TTCvi is used as a source of Resets & L1As.

Applying timing constraints to the control structure
– At present we envisage that the APVE receives L1A and Reset from both the Global and Local TCS.

[Slide diagram: GTCS/LTCS send Reset & L1As and Fast Control to the APVE; TTCvi, TTCex/TTCtx, the FEC and the CCU ring distribute them to the APVs; the FMM (Fast Merge Module) collects fast feedback; the pipeline address goes to the FEDs.]

Current progress

Development system built and tested.
– Fast Control functions (i.e. Busy, Warn and Out-of-Sync).
– Programmable (e.g. Busy asserted when 'X' many APV buffers remain empty).
– History of signals recorded: Busy, Warn, Out-of-Sync (i.e. when asserted, when negated) for 'X' many occasions; total time asserted for Busy, Warn and Out-of-Sync.
– Interfaces such as VME, Wishbone and I2C.
– TCS system implemented for testing.

[Slide diagram: test FPGA receiving the APV Trigger & Reset and the Clock.]

Reset

[Waveform capture. Traces: Test Signal; Post-TCS T1 to APV; APV Output; Busy; Warn; Reset; GTCS Trigger before GTCS inhibit. Annotations mark the '101' reset, a '1' trigger, the inhibit blocking a trigger, the first tick after the APV reset, and the points where Busy and Warn are asserted.]

A time penalty is incurred in the FPGA to distinguish triggers ('1') from resets ('101') and calibrates ('11'), unless we receive the trigger and reset signals separately (a decoder sketch illustrating this is given below).
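To make the penalty concrete, here is a minimal, hypothetical VHDL sketch of a decoder for the code words quoted on this slide ('1' = trigger, '11' = calibrate, '101' = reset), assuming one bit per clock on a single T1 line; it is not the actual APVE firmware. A leading '1' can only be classified after the two following bits have been seen, which is exactly the delay that disappears if trigger and reset arrive on separate lines.

```vhdl
library ieee;
use ieee.std_logic_1164.all;

-- Hypothetical decoder for the T1 code words '1' (trigger), '11' (calibrate)
-- and '101' (reset). A lone '1' can only be declared a trigger after the two
-- following bits have been seen, which is the time penalty discussed above.
entity t1_decoder is
  port (
    clk       : in  std_logic;
    t1        : in  std_logic;   -- serial T1 line from the TCS
    trigger   : out std_logic;
    calibrate : out std_logic;
    apv_reset : out std_logic
  );
end entity t1_decoder;

architecture rtl of t1_decoder is
  type state_t is (IDLE, BIT2, BIT3);
  signal state : state_t := IDLE;
begin
  process (clk)
  begin
    if rising_edge(clk) then
      -- defaults: outputs are one-clock pulses
      trigger   <= '0';
      calibrate <= '0';
      apv_reset <= '0';

      case state is
        when IDLE =>
          if t1 = '1' then
            state <= BIT2;             -- leading '1': not yet classified
          end if;

        when BIT2 =>
          if t1 = '1' then
            calibrate <= '1';          -- '11' -> calibrate
            state     <= IDLE;
          else
            state <= BIT3;             -- '10': trigger or start of a reset
          end if;

        when BIT3 =>
          if t1 = '1' then
            apv_reset <= '1';          -- '101' -> reset
          else
            trigger <= '1';            -- '1' followed by '00' -> trigger
          end if;
          state <= IDLE;
      end case;
    end if;
  end process;
end architecture rtl;
```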

End of reset period

[Waveform capture. Same traces as the previous slide: Test Signal; Post-TCS T1 to APV; APV Output; Busy; Warn; Reset; GTCS Trigger before GTCS inhibit. Annotations mark the point where the APV is ready to receive a trigger and a '1' trigger.]

Busy asserted after 8 triggers. Warn asserted after 5 triggers.

Buffer empties

[Waveform capture. Same traces as the previous slides. Annotations mark a '1' trigger and the digital header of an APV frame.]

Busy is negated when an APV buffer empties, providing space for another event to be stored. It is asserted once more after a further trigger is sent to the APV.

Future....

Simulate the APV in real time
– VHDL code to simulate the APV in real time has been written and tested in ModelSim (a VHDL logic simulation package), but not yet downloaded into an FPGA.
– An FPGA is sufficiently fast: the APV pipeline logic, a possible obstacle to FPGA implementation, has been tested in a Xilinx Spartan.
– ... and sufficiently large: at 200k gates the design is too big for our Spartan-2 (100k gates), but it will fit in a Virtex-2.

Task 2: Sync (global)
APVE detects loss of sync in a Tracker partition.

How important is getting the pipeline address to the FED?
– The FED detects individual APVs losing sync.
– If more than 65 (?) APVs are out of sync the FED can generate the wrong pipeline address, sending incorrect data to the DAQ.
– This should happen very rarely.
(A sketch of the FED-side comparison is given below.)
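As an illustration of the check described above, the following hypothetical VHDL sketch compares the median pipeline address the FED derives from its own channels with the golden address delivered by the APVE and flags loss of synchronisation on a mismatch. The entity, port names and the 8-bit address width are assumptions for illustration, not the real FED firmware.

```vhdl
library ieee;
use ieee.std_logic_1164.all;

-- Hypothetical FED-side synchronisation check: compare the median pipeline
-- address of the FED's own channels with the golden address from the APVE.
entity sync_check is
  port (
    clk         : in  std_logic;
    addr_valid  : in  std_logic;                     -- both addresses valid for this event
    median_addr : in  std_logic_vector(7 downto 0);  -- median of the FED's APV headers
    golden_addr : in  std_logic_vector(7 downto 0);  -- golden address from the APVE
    out_of_sync : out std_logic                      -- fast feedback flag
  );
end entity sync_check;

architecture rtl of sync_check is
begin
  process (clk)
  begin
    if rising_edge(clk) then
      if addr_valid = '1' then
        if median_addr /= golden_addr then
          out_of_sync <= '1';   -- partition has lost sync
        else
          out_of_sync <= '0';
        end if;
      end if;
    end if;
  end process;
end architecture rtl;
```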

Pipeline address via network

At present
– Check the data after it has been sent to the DAQ, in software, at a frequency of every few seconds.

Cons...
– Requires lots of software, and the pipeline address travels a complex route to the FED.

[Slide diagram: APVE crate (CPU plus APVEs) connected over the network to the FED crates (CPU plus FEDs); multiplicities marked ×10 (ish) and ×20.]

Pipeline address via serial link

Possibly...
– Direct serial link (optical) to each FED crate.
– Pipeline address distributed to the FEDs via a single trace on the VME backplane.

Cons...
– Additional hardware which needs to be designed, built and tested.
– Reliability & maintenance for the duration of the LHC.

[Slide diagram: APVE crate with a direct link to each FED crate, where an APVP card sits alongside the CPU and FEDs; multiplicity marked ×10 (ish).]

Outstanding issues

Where do we get L1A and Reset from, if not the LTCS and GTCS?
– If not from the LTCS & GTCS, what is the time penalty of obtaining these signals further down the command chain?

Where does the merge of fast feedback signals take place, if at all?
– The APVE needs to be the last stage in this process, or very near it, because it is timing critical.
– What is the time penalty imposed by going through the FMM?
– At present the APVE design assumes it sits after the FMM module and has 4 inputs (Ready/Error/Busy, Warn and Out-of-Sync) that are OR'ed with the APVE fast feedback signals.

How do we get the pipeline address to the FEDs?
– At present we intend to check the data after it has been sent to the DAQ, in software, at a frequency of every few seconds.
– A serial link to each FED crate VME bus is possibly a simpler, more robust option, but more awkward to implement.

Conclusions
– Need to finalise the location of the APVE in the command (Reset/L1A) and fast feedback (Busy/Warn etc.) control structure.
– Once the details are finalised we will be able to finish the board schematics, then manufacture and test the APVE.
– At the beginning we envisage 4 VME boards, one for each partition, located in the Global Trigger rack.

APVE IO

[Slide diagram: FPGA at the centre, with the following connections]
– Global TCS: Reset/L1A
– Global TCS: Fast Control
– FMM: Fast Control
– Local TCS: Reset/L1A
– Local TCS: Fast Control
– LHC Clock
– VME, A24/D16
– Perhaps also fibre-optic serial links to each FED crate to deliver the pipeline address (approx. 10 per APVE).