Data Acquisition for the 12 GeV Upgrade CODA 3

The good news…
• There is a group dedicated to development and support of data acquisition at Jefferson Lab. This includes Hall D.
• Much of what Hall D needs is generally useful to the whole JLAB experimental program.
• We are not waiting on the 12 GeV upgrade.

In the short term…
• Hall D requirements drive development
• Replace aging technologies
  - Run Control
  - Tcl-based DAQ components
  - mSQL
• Maintain cross-platform compatibility
  - Linux, Solaris, OS X, vxWorks
• Support new commercial hardware advances

HALL D / Existing Halls (diagram slide)

GlueX - Requirements
• Pipelined electronics (ADC, TDC)
• Dead-timeless system
• 200 kHz L1 trigger
• Parallel/staged event building
  - Up to 100 front-end crates
  - 1 GB/s aggregate data throughput
• L3 online farm
  - 200+ nodes
  - x10 reduction in data to disk
• Storage management
  - Ordering/sorting of built events (at kHz)
  - 100 MB/s --> 8 TB/day --> 1 PB/year (see the rate check below)
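As a quick sanity check on the storage figures (a back-of-the-envelope sketch; the 1 PB/year number evidently assumes well under 365 live days per year):

```python
# Back-of-the-envelope check of the GlueX storage figures.
disk_rate_MBps = 100                                 # to disk after the x10 L3 cut
per_day_TB = disk_rate_MBps * 86400 / 1e6            # 86400 s/day, MB -> TB
print(f"{per_day_TB:.2f} TB/day")                    # ~8.64, matching the slide

live_days_per_PB = 1e9 / (disk_rate_MBps * 86400)    # 1 PB = 1e9 MB
print(f"~{live_days_per_PB:.0f} live days per PB")   # ~116 days
```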

Front-End Issues
• Trigger rate: 200 kHz
• Block up events (200-event block -> 2 kHz; see the sketch below)
• Move some ROL runtime code to modules (FPGAs)
• ADCs provide L1 trigger data (hence we need a distributed high-speed clock, 250 MHz??)
• New Trigger Supervisor
  - Perhaps 100+ crates
  - Support pipelining, event blocking
  - Manage flow control into the DAQ system back end
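The blocking idea in a few lines (illustrative only; `block_events` and `BLOCK_SIZE` are hypothetical names, not actual ROC code):

```python
# Illustrative sketch of event blocking (hypothetical names, not actual
# ROC code): grouping events into blocks turns the per-event trigger
# rate into a much lower per-block transfer rate.
BLOCK_SIZE = 200     # events per block, as on the slide

def block_events(event_stream):
    """Group single events into fixed-size blocks, one transfer each."""
    block = []
    for event in event_stream:
        block.append(event)
        if len(block) == BLOCK_SIZE:
            yield block              # one transfer per BLOCK_SIZE triggers
            block = []
    if block:                        # flush a partial block at end of run
        yield block

blocks = list(block_events(range(1000)))
print(len(blocks))                   # 5 transfers instead of 1000
```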

Level 1 Trigger
• Distributed high-speed clock
• A subset of ROCs collect sampled ADC data and send it to the L1 trigger in sync
• 12-bit sums/crate x 250 MHz --> 3 Gbit/s links
• Trigger decision goes to the Trigger Supervisor
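The quoted link rate is just the sum width times the distributed clock:

```python
bits_per_sum = 12        # energy-sum width per crate
clock_Hz = 250e6         # distributed system clock
print(bits_per_sum * clock_Hz / 1e9, "Gbit/s")   # -> 3.0, as on the slide
```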

Front-End Issues cont…
• Form factor for electronics
  - VME64x
  - New commercial bridge (Tsi148) supports 300 MB/s on existing VME backplanes
  - Supports other halls' DAQ applications
• High-speed switched serial interconnect (4 Gbit/s links) needed for the GlueX L1 trigger
  - Commercial solutions - VXS, ATCA?
• DAQ - Trigger - Modules: all must be designed to work together.
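A rough consistency check (my assumption, not from the slides: the 1 GB/s aggregate is spread roughly evenly over the ~100 crates):

```python
# Average per-crate bandwidth if 1 GB/s is shared by ~100 crates.
crates = 100
aggregate_MBps = 1000                       # 1 GB/s system-wide
per_crate_MBps = aggregate_MBps / crates    # ~10 MB/s average
print(f"{per_crate_MBps:.0f} MB/s per crate vs ~300 MB/s Tsi148 ceiling")
```

Even allowing for hot spots, a 300 MB/s bridge leaves considerable headroom on the existing VME backplane.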

Staged/Parallel Event Building
• EMU built around the ET system for customizable processing/distribution of event streams.
• Examples:
  - Data concentrator for ROCs
  - Sub-event builder
  - Farm distribution point
  - Event recorder
• User processes can attach to any EMU in the system (a data-flow sketch follows below)
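A generic sketch of a staged pipeline of this kind, using plain queues; it illustrates the data flow only and is not the ET/EMU API (all names here are hypothetical):

```python
# Sketch of staged event building with queues. Illustrates the data
# flow only; this is NOT the ET/EMU API (all names are hypothetical).
import queue
import threading

def stage(infifo, outfifo, work):
    """Generic EMU-like stage: pull an item, process it, push it on."""
    while True:
        item = infifo.get()
        if item is None:            # poison pill shuts the stage down
            outfifo.put(None)
            return
        outfifo.put(work(item))

roc_out, built = queue.Queue(), queue.Queue()
builder = threading.Thread(
    target=stage,
    args=(roc_out, built, lambda frags: b"".join(frags)),  # sub-event build
)
builder.start()

roc_out.put([b"crate01|", b"crate02|"])   # fragments from two ROCs
roc_out.put(None)
print(built.get())                        # b'crate01|crate02|'
builder.join()
```

Stages wired this way can be rearranged into concentrators, builders, or recorders by swapping the `work` function, which is the customizability the slide describes.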

L3 Farm
• Can be used for analysis or filtering
• Supports 100s of nodes
• Nodes can come and go during event taking
• Do other experimental halls need this (Hall B)?
• Do filtered events need to be time ordered?
• Data reduction: 1 GB/s in --> 100 MB/s out
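A toy version of the x10 filter (illustrative only; a real L3 decision would apply physics criteria, not a counter):

```python
# Toy L3 filter illustrating the x10 reduction (1 GB/s -> 100 MB/s).
# A real filter would apply physics criteria, not a simple counter.
def l3_filter(events, keep_every=10):
    for i, event in enumerate(events):
        if i % keep_every == 0:
            yield event

kept = list(l3_filter(range(1000)))
print(len(kept))    # 100 of 1000 events survive
```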

RunControl / Monitoring / Slow Controls
• First-generation Java Agent RunControl is here
  - Robust fault tolerance
  - Process abstraction through the COOL language
  - Integration of foreign processes
    - DP, vxServer, shells
    - EPICS, CAEN OPC coming
• Move toward full integration of slow controls
• Web interface for remote monitoring
• Extended and customizable graphing and DAQ system monitoring capabilities
• Basis for cMsg - the CODA messaging system currently under development (see the pub/sub sketch below)
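cMsg follows the publish/subscribe pattern; here is a minimal generic sketch of that pattern (not the cMsg API; `Broker` and its methods are hypothetical):

```python
# Generic publish/subscribe sketch in the spirit of cMsg. This is an
# illustration of the pattern, not the cMsg API; Broker is hypothetical.
from collections import defaultdict

class Broker:
    def __init__(self):
        self.subscribers = defaultdict(list)     # subject -> callbacks

    def subscribe(self, subject, callback):
        self.subscribers[subject].append(callback)

    def publish(self, subject, message):
        for callback in self.subscribers[subject]:
            callback(message)

bus = Broker()
bus.subscribe("runcontrol/state", lambda m: print("monitor saw:", m))
bus.publish("runcontrol/state", "PRESTARTED")    # monitor saw: PRESTARTED
```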

Other Issues
• Integrate existing hall requirements into a single supportable distribution.
• Transition toward Hall D requirements.
• Maintain cross-platform compatibility
  - SUN, Linux, VxWorks
  - 64-bit architectures - Opteron, G5 (Mac OS X)
  - Embedded Linux (on single-board computers)
• Move to database independence
  - Proxy server (JDBC) to support the user's database choice
• User hooks into the DAQ system
  - Java
  - Updated Tcl support
  - Others…??

Summary
• CODA version 3 is now being molded - nothing is irreversible.
• Our plan is to phase in new tools to provide a smooth transition from CODA 2 --> CODA 3.
• Much of the software support for Hall D requirements is on a short-term timeline (2-3 years).
• Front-end (hardware) support is longer term and may go through a "revision 1" iteration for use in existing experiments.

Extra slides

Pipelines (Dead-timeless DAQ)
• At 250 MHz, a 10 µs "snapshot" can be stored in memory (5 KB/FADC)
• A trigger generates a look-back and extraction of the sampled ADC data
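Those numbers are self-consistent at 2 bytes per sample: 250 MHz x 10 µs = 2500 samples, about 5 KB. A minimal look-back sketch (hypothetical names, not the FADC firmware):

```python
# Circular-buffer look-back, as in a pipelined FADC (hypothetical
# illustration, not firmware). 250 MSPS x 10 us = 2500 samples; at
# 2 bytes/sample that is the ~5 KB/FADC quoted on the slide.
from collections import deque

DEPTH = 2500                        # 10 us of samples at 250 MHz
pipeline = deque(maxlen=DEPTH)      # oldest samples fall off automatically

def clock_in(sample):
    pipeline.append(sample)         # sampling never stops: no dead time

def trigger(lookback, window):
    """Extract `window` samples ending `lookback` samples in the past."""
    buf = list(pipeline)
    start = len(buf) - lookback
    return buf[start:start + window]

for t in range(10000):              # free-running sampling
    clock_in(t)
print(trigger(lookback=500, window=4))   # [9500, 9501, 9502, 9503]
```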

VME64X - VXS Interconnect
• J0 connector: 45 differential pairs
• 6 GHz bandwidth
• 18 VME payload slots
• 2 switching slots

What is CODA