MINERvA DAQ Software
D. Casper, University of California, Irvine



Hardware Schematic
- VME-based DAQ
- Readout & slow control
- 2-3 crates total
- ~30,000 channels
- Max 4 hits per channel per spill
- Time + two charge measurements for each hit
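The per-channel payload above can be checked against the 48 bytes/channel figure used for the data-rate estimate on the next slide. The 32-bit word size is an assumption for illustration; the slides only fix the hit count and the measurement list.

```python
# Back-of-envelope check of the per-channel data volume.
# BYTES_PER_WORD = 4 is an assumed 32-bit word, not taken from the slides.
HITS_PER_CHANNEL = 4   # max hits per channel per spill
WORDS_PER_HIT = 3      # one time + two charge measurements
BYTES_PER_WORD = 4     # assumed 32-bit words

bytes_per_channel = HITS_PER_CHANNEL * WORDS_PER_HIT * BYTES_PER_WORD
print(bytes_per_channel)  # 48, matching the 48 bytes/channel on the next slide
```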

Data Rate
Assuming 100% occupancy each spill (in real life, expect closer to a few percent):
- 30,000 channels × 48 bytes/channel maximum = 1.5 MByte per trigger
- NuMI spill trigger rate: 0.5 Hz
- Cosmics rate: ~10 Hz
- Calibration data: ?
- Total (maximum) raw data rate: 15 MB/sec
- Expected raw data rate (with zero-suppression): < 0.5 MB/sec
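The arithmetic behind these numbers is straightforward and can be reproduced directly from the figures quoted above (the event size comes out to 1.44 MB, rounded to 1.5 MB on the slide):

```python
# Reproduce the maximum raw data rate from the quantities on the slide.
channels = 30_000
bytes_per_channel = 48                 # maximum, per spill
event_size = channels * bytes_per_channel      # bytes per trigger

spill_rate = 0.5                       # Hz, NuMI spill trigger
cosmic_rate = 10.0                     # Hz, cosmics
raw_rate = event_size * (spill_rate + cosmic_rate)  # bytes/sec

print(event_size / 1e6)   # 1.44 MB per trigger (quoted as 1.5 MB)
print(raw_rate / 1e6)     # ~15.1 MB/sec maximum (quoted as 15 MB/sec)
```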

DAQ Computer
As proposed:
- Dual CPU (2 × 3.2 GHz Xeon)
- PCI/VME interface
- Two dual-port Gigabit Ethernet interfaces
- 408 GB on-board disk array (RAID-5)
- Storage for ~1 week of data at the expected rate
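The "~1 week" buffering claim follows from the 408 GB array and the expected zero-suppressed rate of 0.5 MB/sec quoted on the previous slide (using decimal units, 1 GB = 1000 MB):

```python
# How long does 408 GB last at the expected zero-suppressed rate?
disk_mb = 408 * 1_000        # 408 GB in MB (decimal units)
rate_mb_per_s = 0.5          # expected rate with zero-suppression

seconds = disk_mb / rate_mb_per_s
days = seconds / 86_400
print(round(days, 1))        # 9.4 days, i.e. somewhat over one week
```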

DAQ Software
Requirements not very demanding:
- Low rate
- Monolithic readout
- Most complexity is in firmware, not software
- Database, monitoring, and maintenance utilities are probably a good area for an FNAL contribution
Will we reconstruct data online?
- Certainly at some level, for data-quality monitoring
- Expect reprocessing offline
Safety and hazard monitoring:
- A separate (but important) issue
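Because the rate is low, "monolithic readout" here means a single process can poll all 2-3 crates, build the event, and write it out, with no pipelining or event-builder farm. The sketch below is purely illustrative: every name in it (`read_crate`, `CRATES`, the fragment format) is hypothetical, not MINERvA code.

```python
# Hypothetical sketch of a monolithic readout loop: one process reads
# each VME crate in turn, concatenates the fragments behind a trigger
# number, and appends the event to an output buffer/file.
CRATES = [0, 1, 2]          # 2-3 VME crates total

def read_crate(crate_id):
    """Stand-in for a VME block transfer; returns that crate's raw data."""
    return bytes(8)          # dummy 8-byte payload per crate

def build_event(trigger_number):
    """Assemble one event: 4-byte trigger number + all crate fragments."""
    fragments = [read_crate(c) for c in CRATES]
    return trigger_number.to_bytes(4, "little") + b"".join(fragments)

def run(n_triggers, output):
    """At ~10 Hz total, a simple sequential loop is sufficient."""
    for trig in range(n_triggers):
        output.append(build_event(trig))
```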