Clara Gaspar, October 2011 The LHCb Experiment Control System: On the path to full automation.

The Experiment Control System

[Diagram: the ECS sits above the DAQ chain (Detector Channels, Front End Electronics, L0, Readout Network, HLT Farm, Monitoring Farm, Storage, TFC), the DCS devices (HV, LV, GAS, Cooling, etc.) and External Systems (LHC, Technical Services, Safety, etc.)]

❚ The ECS is in charge of the Control and Monitoring of all areas of the experiment.

Homogeneity

❚ Same architecture and same tools used throughout the Control System.
❚ Generic Architecture:

[Diagram: a control tree with the ECS node on top of the INFR., TFC, LHC and HLT branches and the per-sub-detector DCS and DAQ branches; each SubDet DCS node controls LV, TEMP, GAS, … sub-systems down to devices (LV Dev1 … LV DevN), and each SubDet DAQ node controls FEE and RO sub-systems down to devices (FEE Dev1 … FEE DevN). Commands flow down the tree; Status & Alarms flow up.]

❚ Building blocks: Control Units and Device Units.

The Control Framework

❚ The JCOP* Framework is based on:
❙ SCADA System – PVSS II for:
❘ Device Description (Run-time Database)
❘ Device Access (OPC, Profibus, drivers) + DIM
❘ Alarm Handling (Generation, Filtering, Masking, etc.)
❘ Archiving, Logging, Scripting, Trending
❘ User Interface Builder
❘ Alarm Display, Access Control, etc.
❙ SMI++ providing:
❘ Abstract behaviour modelling (Finite State Machines)
❘ Automation & Error Recovery (Rule-based system)

* JCOP – the Joint COntrols Project (between the 4 LHC experiments and the CERN Controls Group)

Device Units

❚ Provide access to “real” devices:
❙ The FW provides interfaces to all necessary types of devices:
❘ LHCb devices: HV channels, Read Out boards, Trigger processes running in the HLT farm, Monitoring tasks for data quality, etc.
❘ External devices: the LHC, a gas system, etc.
❙ Each device is modeled as a Finite State Machine:
❘ Its main interface to the outside world is a “State” and a (small) set of “Actions”.
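The idea of a Device Unit exposing only a State and a small set of Actions can be sketched as follows. This is an illustrative Python sketch, not LHCb code (the real system describes such behaviour in SMI++); the class name, states and transition table are assumptions.

```python
# Hypothetical sketch of a Device Unit modeled as a Finite State Machine.
# States, actions and the HV-channel example are illustrative only.

class HVChannelDevice:
    """A device unit: its outside interface is a State plus a small set of Actions."""

    # (current state, action) -> next state
    TRANSITIONS = {
        ("OFF", "Switch_ON"): "READY",
        ("READY", "Switch_OFF"): "OFF",
        ("ERROR", "Recover"): "OFF",
    }

    def __init__(self):
        self.state = "OFF"

    def actions(self):
        # The (small) set of actions valid in the current state
        return [a for (s, a) in self.TRANSITIONS if s == self.state]

    def send(self, action):
        key = (self.state, action)
        if key not in self.TRANSITIONS:
            raise ValueError(f"action {action!r} not allowed in state {self.state}")
        self.state = self.TRANSITIONS[key]
        return self.state
```

A parent Control Unit only ever sees the `state` attribute and the `actions()` list, never the device internals, which is the point of the FSM abstraction.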

Hierarchical control

❚ Each Control Unit:
❙ Is defined as one or more Finite State Machines
❘ Its interface to the outside is also a state and actions
❙ Can implement rules based on its children’s states
❙ In general it is able to:
❘ Include/Exclude children (Partitioning)
〡 Excluded nodes can run in stand-alone mode
❘ Implement specific behaviour & take local decisions
〡 Sequence & automate operations
〡 Recover errors
❘ Interface to users
〡 Present information and receive commands

[Diagram: a DCS Control Unit with Muon DCS, Tracker DCS, … children; the Muon DCS in turn controls Muon LV and Muon GAS.]
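A minimal sketch of a Control Unit that derives its own state from its children’s states and supports include/exclude (partitioning), in the spirit of the rules described above. This is illustrative Python, not the SMI++ implementation; the state-summarising rule and class names are assumptions.

```python
# Illustrative sketch: a Control Unit whose state is a rule over its
# (included) children's states, with Include/Exclude for partitioning.

class Leaf:
    """Stand-in for a device unit with a fixed state (for illustration)."""
    def __init__(self, name, state):
        self.name, self._state = name, state
    def state(self):
        return self._state

class ControlUnit:
    def __init__(self, name, children):
        self.name = name
        self.children = list(children)
        self.excluded = set()        # excluded children can run stand-alone

    def exclude(self, child):
        self.excluded.add(child.name)

    def include(self, child):
        self.excluded.discard(child.name)

    def state(self):
        # Example rule (an assumption): ERROR dominates; READY only if
        # every included child is READY; otherwise NOT_READY.
        states = [c.state() for c in self.children
                  if c.name not in self.excluded]
        if "ERROR" in states:
            return "ERROR"
        if states and all(s == "READY" for s in states):
            return "READY"
        return "NOT_READY"
```

Excluding a misbehaving child immediately changes the summary state the parent reports upward, which is how partitioning lets the rest of the tree keep running.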

FW – Graphical Editor

❚ SMI++ Objects: States & Actions
❚ Parallelism, Synchronization
❚ Asynchronous Rules

Operation Domains

❚ DCS Domain – equipment operation related to a running period (Ex: GAS, Cooling). States: OFF, NOT_READY, READY, ERROR; actions: Switch_ON, Switch_OFF, Recover.
❚ HV Domain – equipment operation related to the LHC state (Ex: High Voltages). States: OFF, STANDBY1, STANDBY2, READY, NOT_READY, ERROR, plus the intermediate RAMPING_STANDBY1, RAMPING_STANDBY2, RAMPING_READY; actions: Go_STANDBY1, Go_STANDBY2, Go_READY, Recover.
❚ DAQ Domain – equipment operation related to a “RUN” (Ex: RO board, HLT process). States: NOT_READY, CONFIGURING, READY, RUNNING, ERROR, UNKNOWN; actions: Configure, Start, Stop, Reset, Recover.
❚ FSM templates distributed to all Sub-detectors.
❚ All Devices and Sub-Systems have been implemented using one of these templates.
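The three domain templates can be written out as plain transition tables. A sketch for illustration: the state and action names follow the slide, but which state each action leads to (in particular the target of Recover, and the commanded subset of HV transitions) is my assumption, and the intermediate RAMPING_* states would end autonomously when the voltage ramp completes rather than on an operator action.

```python
# Illustrative transition tables for the three FSM templates
# (commanded transitions only; targets partly assumed).

FSM_TEMPLATES = {
    "DCS": {
        ("OFF", "Switch_ON"): "READY",
        ("NOT_READY", "Switch_ON"): "READY",
        ("READY", "Switch_OFF"): "OFF",
        ("ERROR", "Recover"): "NOT_READY",
        ("ERROR", "Switch_OFF"): "OFF",
    },
    "HV": {
        ("OFF", "Go_STANDBY1"): "RAMPING_STANDBY1",
        ("STANDBY1", "Go_STANDBY2"): "RAMPING_STANDBY2",
        ("STANDBY2", "Go_READY"): "RAMPING_READY",
        ("ERROR", "Recover"): "NOT_READY",
    },
    "DAQ": {
        ("NOT_READY", "Configure"): "CONFIGURING",
        ("READY", "Start"): "RUNNING",
        ("RUNNING", "Stop"): "READY",
        ("READY", "Reset"): "NOT_READY",
        ("ERROR", "Recover"): "NOT_READY",
    },
}

def allowed_actions(domain, state):
    """Actions a template offers in a given state."""
    return sorted(a for (s, a) in FSM_TEMPLATES[domain] if s == state)
```

Distributing a handful of such templates, rather than letting every sub-detector invent its own states, is what keeps the whole tree operable through one uniform interface.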

ECS – Automation

❚ Some Examples:
❙ HLT Control (~1500 PCs)
❘ Automatically excludes misbehaving PCs (within limits)
❘ Can (re)include PCs at run-time (they get automatically configured and started)
❙ RunControl
❘ Automatically detects and recovers Sub-Detector desynchronizations
❘ Can reset SDs when problems are detected by monitoring
❙ AutoPilot
❘ Knows how to start and keep a run going from any state
❙ BigBrother, based on the LHC state, controls:
❘ SD Voltages
❘ VELO Closure
❘ RunControl
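The AutoPilot idea, “knows how to start and keep a run going from any state”, amounts to a small rule table: for each Run Control state, which action moves closer to RUNNING. A toy sketch, assuming the DAQ-domain state names from the previous slide; the policy itself is illustrative, not the LHCb rules.

```python
# Toy "AutoPilot" step: map the current Run Control state to the next
# action to send (None = wait, e.g. a transition is in progress or the
# goal is reached). States assumed from the DAQ-domain FSM template.

NEXT_ACTION = {
    "ERROR": "Recover",
    "UNKNOWN": "Recover",
    "NOT_READY": "Configure",
    "CONFIGURING": None,   # transition in progress: wait
    "READY": "Start",
    "RUNNING": None,       # goal reached: keep it going
}

def autopilot_step(state):
    """Return the action to send for the current state (or None to wait)."""
    # Unknown states are treated as recoverable (an assumption).
    return NEXT_ACTION.get(state, "Recover")
```

Running this rule in a loop against the state reported by the control tree is what reduces the operator’s task to confirmations.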

Run Control

❚ Matrix: Activity = Domain × Sub-Detector
❚ Used for Configuring all Sub-Systems

LHCb Operations

❚ Two operators on shift:
❙ Data Manager
❙ Shift Leader, who has 2 views of the system:
❘ Run Control
❘ Big Brother
❚ Big Brother manages LHC dependencies:
❘ Sub-Detector Voltages
❘ VELO Closing
❘ Run Control

ECS: Some numbers

[Diagram: the ECS control tree over the HV, TFC, LHC and HLT branches and the per-sub-detector DCS and DAQ branches.]

❚ Size of the Control Tree:
❙ Distributed over ~150 PCs
❘ ~100 Linux (50 for the HLT)
❘ ~50 Windows
❙ >2000 Control Units
❙ >50000 Device Units
❚ Run Control Timing:
❙ Cold Start to Running: 4 minutes
❘ Configure all Sub-detectors, Start & Configure ~40000 HLT processes (always done well before PHYSICS)
❙ Stop/Start Run: 6 seconds

Conclusions

❚ LHCb has designed and implemented a coherent and homogeneous control system.
❚ The Experiment Control System makes it possible to:
❙ Configure, Monitor and Operate the full experiment
❙ Run any combination of sub-detectors in parallel, in standalone mode
❚ Some of its main features:
❙ Partitioning, Sequencing, Error recovery, Automation
➨ These come from the usage of SMI++ (integrated with PVSS)
❚ LHCb operations are now almost completely automated:
❙ The operator’s task is easier (basically only confirmations)
❙ DAQ efficiency improved to ~98%