CM26 March 2010 - Jean-Sebastien Graulich, Geneva

Slide 1: Online Summary
o The heplnw17 case
o DAQ
o CAM
o Online Reconstruction
o Data Base
o Data Storage

Software was also discussed in the same session; it is not reported here.

Slide 2: Online Activities

Main issue: breakdown of heplnw17
o Not discussed in the Online session but in the Collaboration Forum
o Revealed a lack of robustness: a single point of failure
o Original misunderstanding: private network vs. protected subnet
o Need for formally agreed support (from PPD or ISIS?)

Consequences:
o 3 months of relative chaos (bad)
o Work started on a general computing and network requirements document (good)

Slide 3: DAQ Achievements

o DAQ system upgrade is ready
o Luminosity monitors integrated
o Trigger system cabling optimized
o DAQ and trigger system consolidation
o Cabling documentation in progress
o Progress in EMR front-end electronics

Slide 4: Event Building

o The synchronization problem between the two crates persists
  - We suspect the PCI/VME interface
  - It could not be replaced because all the spares are in use in the mirror DAQ system
  - Massive failure of boards: 4 out of 10 boards had to be sent for repair
o In the meantime
  - The online monitoring histograms allow the problem to be spotted
  - A VME and PC power cycle solves it temporarily
  - The shifter's attention is required
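The power-cycle workaround above is triggered when the shifter spots a mismatch on the online-monitoring histograms. A minimal sketch of that check, comparing per-spill event counts from the two crates; all names and numbers are hypothetical (the real DAQ is based on DATE and exposes this information differently):

```python
# Hypothetical desynchronisation check between the two VME crates.
# Input: per-spill event counts read off the online-monitoring histograms.

def crates_in_sync(events_crate1, events_crate2, tolerance=0):
    """Return the spill indices where the two crates' event counts
    disagree by more than `tolerance`; an empty list means in sync."""
    bad_spills = []
    for spill, (n1, n2) in enumerate(zip(events_crate1, events_crate2)):
        if abs(n1 - n2) > tolerance:
            bad_spills.append(spill)
    return bad_spills

# Spill 2 lost an event in crate 2 -> shifter power-cycles VME and PC.
print(crates_in_sync([120, 118, 121], [120, 118, 120]))  # [2]
```

In practice the shifter performs this comparison by eye; automating it would remove the "shifter's attention is required" step.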

Slide 5: Schedule Milestones

From CM25:
o CAM data in Online Data Stream: Nov 09 -> May 10
o Tracker integrated in DAQ and OLM: Jan 10 -> July 10
o TOF TDC clock synchronization: March 10 -> Aug 10
  - More complicated than first thought; needs a dedicated board
o Burst Gate Signal in the Trigger System
  - Need support here
o The priority has been set to the DAQ and trigger system upgrade and consolidation
o DAQ system upgrade: May 10
o Production of SW/EMR front-end electronics: Jan 10 -> Started

Slide 6: Control and Monitoring

o Outstanding progress: "Control is under control"
o Computer management ---> OK
o Software management ---> OK
o Data management ---> OK
o Documentation ---> OK
o All this is sustained mainly by two individuals: James Leaver and Pierrick Hanlet

(Slides 7-12: screenshots only, no transcribed text)

Slide 13: CAM

o Decay solenoid included in the alarm handler
o Linde control panel mirrored into EPICS
  - Very useful for expert remote monitoring
o Next: remote gateway and remote archive viewer; new IOCs for new equipment
o The "long, hard road to ramp up CAM infrastructure and knowledge base" has led us to a point where we no longer foresee difficult hurdles to regularly adding new IOCs, monitoring, alarm handling, and archiving
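The alarm handler mentioned above follows the standard EPICS convention of two-level limits (MINOR and MAJOR severities). A toy illustration of that classification; the pure-Python form and the limit values are invented, since the real system uses EPICS records and the ALH:

```python
# Toy version of the EPICS two-level alarm classification
# (LOLO/LOW/HIGH/HIHI limits -> MAJOR/MINOR/NO_ALARM severity).

def alarm_severity(value, low, high, lolo, hihi):
    """Classify a monitored value, e.g. a decay-solenoid reading."""
    if value <= lolo or value >= hihi:
        return "MAJOR"   # outer limits crossed
    if value <= low or value >= high:
        return "MINOR"   # inner (warning) limits crossed
    return "NO_ALARM"

print(alarm_severity(4.6, low=4.3, high=4.8, lolo=4.1, hihi=5.0))  # NO_ALARM
```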

(Slides 14-16: screenshots only, no transcribed text)

Slide 17: Data Base

What it does:
o Stores configuration: set values, not read values
o Documents hardware status
  - Geometry (G4MICE)
  - Cabling
  - Alarm handler settings, etc.
o Automatically records the magnet settings, ISIS settings, target information and DAQ information
  - A superset of what is currently entered manually into the run configuration spreadsheet on the MICO page
o Allows retrieving these settings at the start of a run
o Also allows saving settings not attached to a run, e.g. "Pion at 300 MeV/c"
o An EPICS client was developed by James Leaver for this
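The two access patterns described above (settings keyed by run number, and named settings not attached to any run) can be sketched as follows. This is an illustrative model only, not the real MICE configuration database API; all class and parameter names are invented:

```python
# Illustrative sketch of the two lookup paths of the configuration DB:
# by run number (recorded automatically at run start) and by free tag
# (settings saved independently of any run, e.g. "Pion at 300 MeV/c").

class ConfigDB:
    def __init__(self):
        self._by_run = {}   # run number -> set values (magnets, target, DAQ...)
        self._by_tag = {}   # free name  -> set values

    def record_run(self, run, settings):
        """Recorded automatically at the start of a run."""
        self._by_run[run] = dict(settings)

    def save_tagged(self, tag, settings):
        """Save settings not attached to any run."""
        self._by_tag[tag] = dict(settings)

    def retrieve(self, run=None, tag=None):
        """Retrieve set values either by run number or by tag."""
        return self._by_run[run] if run is not None else self._by_tag[tag]

db = ConfigDB()
db.save_tagged("Pion at 300 MeV/c", {"D1": 1.23, "D2": 0.98})
db.record_run(1234, {"D1": 1.20, "target_depth_mm": 45.6})
print(db.retrieve(tag="Pion at 300 MeV/c")["D1"])  # 1.23
```

The point of the split is exactly the one made on the slide: a run always gets its settings snapshot, while reusable beamline configurations can be named and recalled later.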

Slide 18: Data Base Status

o Progress was suspended in January due to the failure of heplnw17
o A local copy of the DB system is under development in Glasgow; progress has resumed
o The main server functionality requested has now been implemented (except cabling)
o Proper migration to Rutherford Lab is scheduled and represents the bulk of the outstanding work
o See David Forrest's talk for details

Slide 19: Data Storage

o The only formally agreed route for access to data (DAQ output) is via the Grid
o The Grid Transfer Box (miceacq05) is located in the MLCR
  - It will eventually run an autonomous agent that reads the data from the RAID system in the MLCR and uploads it to the Grid, in particular to the CASTOR tape system at RAL
o In the meantime, data IS being uploaded to the Grid, but manually, on a next-day timescale

Slide 20: Data Access

o Henry Nebrensky presented a tutorial on how to access the data using the Grid
o Open issues:
  - Permanent storage is on tape at RAL
  - Long access time (robot loading the tape)
  - We should foresee a place where actively used data is stored on disk
  - Files on tape must be at least 200 MB
  - Eventually, someone on duty (MOM or shifter) will need to have a Grid certificate
  - Once again we have a single (human) point of failure here
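One practical consequence of the 200 MB minimum tape-file size is that small DAQ output files have to be grouped before archiving. A hedged sketch of a greedy bundling strategy; the slide does not say how MICE actually handles this, so the approach and names are purely illustrative:

```python
# Illustrative greedy bundling of small DAQ output files so that each
# archive written to tape meets a minimum size (200 MB per the slide).

MIN_TAPE_FILE = 200 * 1024 * 1024  # 200 MB in bytes

def bundle_for_tape(file_sizes, minimum=MIN_TAPE_FILE):
    """Group file sizes (bytes) into bundles, closing a bundle as soon
    as it reaches `minimum`. The final bundle may be undersized and
    would need padding or merging with the next run before upload."""
    bundles, current, total = [], [], 0
    for size in file_sizes:
        current.append(size)
        total += size
        if total >= minimum:
            bundles.append(current)
            current, total = [], 0
    if current:
        bundles.append(current)  # undersized remainder
    return bundles

mb = 1024 * 1024
# Two bundles: 150+80 = 230 MB, then a single 220 MB file.
print([sum(b) // mb for b in bundle_for_tape([150 * mb, 80 * mb, 220 * mb])])
```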

Slide 21: General Comment

o The MOG still suffers from loose leadership
o Compensated by the enthusiasm and commitment of the individuals inside the group
o Most members are on short-term contracts
  - Linda and Pierrick depend on an NSF grant
  - David has to write his Ph.D.
  - James will leave in January 2011
  - I'll leave in June 2011