DETECTOR CONTROL SYSTEM FOR THE ALICE EXPERIMENT AT CERN-LHC
Anik Gupta, ICTDHEP Jammu, 11th September 2013
Special thanks to the ALICE Controls Coordination. Based on slides by Peter Chochula, André Augustinus and the ALICE DCS training slides.

The ALICE experiment
– 18 sub-detectors, 2 magnets
– 1200 members, 90 institutes, 30 countries
– Completely installed and fully commissioned
ALICE Controls
– Started 12 years ago
– Small (but very important) central team
– Detector groups & LHC experiments (JCOP)
– In total ~100 people involved
[Map: ATLAS, ALICE, CMS and LHCb around the LHC ring near the centre of Geneva. ALICE detector view with HMPID, TOF, TRD, PMD, SPD, SDD, SSD, TPC, MUON SPECTROMETER, PHOS; not shown: FMD, V0, T0, ZDC, EMC, CPV, ACO.]

The ALICE Detector

ALICE DCS
The two main tasks of the ALICE Detector Control System (DCS) are
– To assure maximum protection, such that detector equipment cannot be damaged by adverse beam conditions
– To assure that, whenever beam conditions allow, the detector is in optimal condition to take physics data

ALICE DCS
To achieve this, the ALICE DCS
– controls all relevant detector equipment,
– maintains the synchronization with the LHC machine operation,
– monitors and controls the experiment infrastructure and services,
– provides reliable communication with the LHC and the other online systems,
– has adopted the PVSSII SCADA system as its main tool (a CERN-wide standard for control systems). PVSSII has recently been re-branded as “WinCC Open Architecture” after its purchase by Siemens.

The DCS Context

ALICE DCS
Detector systems are the responsibility of the detector projects
– Detector experts implement the local detector controls
The DCS central team (6 people):
– provides standards, guidelines, infrastructure and support
– supervises the implementation of the detector systems
– implements the overall control of all local control systems
– implements the interfaces with external systems and the other ALICE online systems
– assures smooth operation during data-taking campaigns

PVSSII Architecture
– A PVSSII system is composed of specialized program modules (managers)
– Managers communicate via TCP/IP
– The ALICE DCS is built from 100 PVSS systems composed of 900 managers
– PVSSII is extended by the JCOP and ALICE frameworks, on top of which user applications are built
[Diagram: manager types UI, CTL, API, DM, EM, DRV; layer stack PVSSII → JCOP Framework → ALICE Framework → User Application]

Distributed PVSS System
Several PVSSII systems can be connected into one distributed system using a specialized manager (DIST)
The ALICE distributed system consists of over 100 individual systems
[Diagram: two PVSSII systems (each with UI, CTL, API, DM, EM, DRV managers) connected through their DIST managers]

ALICE DCS Distributed System
Each detector provides an autonomous distributed system
Central servers connect to all detectors, providing one large distributed system

DCS Dataflow
– values/s read by software
– values/s injected into PVSSII
– 1000 values/s written to ORACLE after smoothing in PVSSII
– >200 values/s sent to consumers
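The reduction from the injected rate to the archived rate comes from the smoothing applied inside PVSSII before values reach ORACLE. Below is a minimal sketch of value-based (deadband) smoothing, assuming a hypothetical archiveValue() sink and a configurable deadband; it illustrates the principle only and is not ALICE code.

```cpp
#include <cmath>
#include <initializer_list>
#include <iostream>

// Archive a value only when it differs from the last archived value by more than
// the deadband - the principle behind value smoothing before database archiving.
class DeadbandSmoother {
public:
    explicit DeadbandSmoother(double deadband) : deadband_(deadband) {}

    void process(double value) {
        if (!hasLast_ || std::fabs(value - last_) > deadband_) {
            archiveValue(value);          // hypothetical sink standing in for the ORACLE archive
            last_ = value;
            hasLast_ = true;
        }                                 // otherwise the value is dropped (smoothed away)
    }

private:
    void archiveValue(double v) { std::cout << "archived: " << v << "\n"; }

    double deadband_;
    double last_ = 0.0;
    bool hasLast_ = false;
};

int main() {
    DeadbandSmoother smoother(0.5);       // archive only changes larger than 0.5 units
    for (double v : {20.0, 20.1, 20.2, 21.0, 21.1, 19.9}) smoother.process(v);
    // archived: 20, 21, 19.9 - small fluctuations never reach the database
}
```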

DCS Computing
– 170 servers
– 700 embedded computers
– Oracle DB with 144 TB raw storage
– 1200 network attached devices

Challenges
The DCS project in ALICE was launched relatively late, when many of the detector developments were at an advanced stage and technology choices had already been made
The ALICE experiment consists of 18 different detectors
– Compared to ~5 for ATLAS/CMS
In order to guarantee a smooth integration, a huge effort went into hiding complexity and diversity, and into standardization on all levels

Challenges - standardization
A high level of standardization was achieved for power supplies and services (cooling, PLC based applications…)
– Limited number of device models, supported centrally at CERN
– Accessed by standard interfaces: CANbus, Profibus or Ethernet
– OPC technology used as software interface (industry standard, supported in PVSSII)
Large diversity in the front-end part
– Different architectures and requirements
– Variety of control buses (JTAG, CANbus, Profibus, RS232, Ethernet, custom buses – Easynet, DDL...)
– DIM technology used to hide diversity

Distributed Information Management (DIM)
CERN DIM is used for command and data transfer
– Implemented and supported in PVSSII and C++ (by CERN)
– Client/server architecture
– Robust and stable
– Proven technology, already used in the LEP era
[Diagram: a DIM server registers its services with the DIM name server; a DIM client searches for a service by name, then subscribes to values and sends commands directly to the server, receiving data/responses in return.]
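As an illustration of the publish/subscribe pattern described above, here is a minimal DIM server sketch in C++ using the standard DIM classes (DimService, DimServer). The service name, server name and the readSensor() stub are hypothetical examples, not actual ALICE services.

```cpp
#include <dis.hxx>      // DIM server API: DimService, DimServer
#include <unistd.h>

// Stand-in for a real hardware read (hypothetical)
static float readSensor() { return 42.0f; }

int main() {
    float temperature = 0.0f;

    // Publish a named service; the DIM name server lets any client find it by name
    DimService tempService("DEMO_FED/TEMPERATURE", temperature);
    DimServer::start("DEMO_FED_SERVER");    // register this server with the DIM name server

    for (;;) {
        temperature = readSensor();
        tempService.updateService();        // push the new value to all subscribed clients
        sleep(1);                           // ~1 Hz, the typical DCS monitoring rate
    }
}
```

On the client side (for example the DIM client manager used by PVSSII), a subscriber would create a DimInfo for "DEMO_FED/TEMPERATURE" and receive updates in its infoHandler().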

Building blocks of ALICE DCS
– 18 detectors with different requirements
– Effort towards device standardization
– Still large diversity, mainly in the FEE part
– Large number of busses (CANbus, JTAG, Profibus, RS232, Ethernet, custom links…)
– 1200 network-attached devices
– 270 crates (VME and power supplies)
– controlled voltage channels

OPC items, Front-End (FED) services, parameters supervised by the DCS
– Monitored at a typical rate of 1 Hz
Hardware diversity is managed through standard interfaces
– OPC servers for commercial devices
– FED servers for custom hardware: provide hardware abstraction, use the CERN DIM (TCP/IP based) protocol for communication

The core of the DCS is based on the commercial SCADA system PVSSII
– 110 detector computers
– 60 backend servers
– DCS Oracle RAC (able to process up to inserts/s)

The PVSSII distributed system is not a natural system representation for the operator
The ALICE DCS is modeled as a FSM using the CERN SMI++ tools
– Hide experiment complexity
– Focus on the operational aspect

Hierarchical approach:
– The ALICE DCS is represented as a tree composed of detector systems
– Each detector system is composed of subsystems
– Subsystems are structured into devices (crates, boards) and channels
[Diagram: DCS → DET 1 → HV, LV, FEE, … → channels (CH); the HV subsystem further splits into VHV, HV, LV branches down to individual channels.]

DCS devices are described as FSMs
State diagrams are standardized for channels and devices of the same type
The top-level DCS takes the status of all leaves into account
[Diagram – single channel FSM (simplified): states OFF, STB, CFG, RAMP UP, ON, RAMP DWN, ERR; commands GO_STB, CONFIGURE, GO_ON, GO_OFF, RESET. Top-level FSM: states STB, Moving BT, BEAM Tuning, Moving Ready, READY, ERR; commands GO_BT, GO_RDY, GO_STB, RESET.]
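For illustration, the simplified single-channel state diagram above can be expressed as a small transition table. This is a hedged sketch in plain C++ (the ALICE DCS implements this with SMI++), with state and command names taken from the slide:

```cpp
#include <iostream>
#include <map>
#include <utility>

// Simplified single-channel FSM from the slide (illustrative only; not SMI++ code).
enum class State { OFF, STB, CFG, RAMP_UP, ON, RAMP_DWN, ERR };
enum class Command { GO_STB, CONFIGURE, GO_ON, GO_OFF, RESET };

class HvChannelFsm {
public:
    State state() const { return state_; }

    // Apply a command; only transitions present in the table are taken.
    bool handle(Command cmd) {
        auto it = table_.find({state_, cmd});
        if (it == table_.end()) return false;   // command not allowed in the current state
        state_ = it->second;
        return true;
    }

private:
    State state_ = State::OFF;
    // (current state, command) -> next state
    std::map<std::pair<State, Command>, State> table_ = {
        {{State::OFF, Command::GO_STB},    State::STB},
        {{State::STB, Command::CONFIGURE}, State::CFG},
        {{State::CFG, Command::GO_ON},     State::RAMP_UP},   // hardware ramps, then reports ON
        {{State::ON,  Command::GO_OFF},    State::RAMP_DWN},
        {{State::ERR, Command::RESET},     State::OFF},
    };
};

int main() {
    HvChannelFsm channel;
    channel.handle(Command::GO_STB);
    channel.handle(Command::CONFIGURE);
    bool ok = channel.handle(Command::GO_OFF);    // rejected: GO_OFF is not allowed in CFG
    std::cout << std::boolalpha << ok << "\n";    // false
}
```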

ALICE Central FSM Hierarchy
Scale:
– 1 top DCS node
– 19 detector nodes
– 100 subsystems
– logical nodes
– devices (leaves)
– Each leaf can represent a complex unit containing many channels

Hierarchy Partitioning
Partitioning allows for concurrent operation
A single channel error can put the overall DCS into error:
– The operator excludes the affected part
– A remote expert cures the problem
– The operator includes the repaired part
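The effect of excluding a faulty part can be pictured as removing it from the state calculation of its parent node. A minimal sketch, assuming a simple worst-of-children rule (the real SMI++ hierarchy is richer than this):

```cpp
#include <algorithm>
#include <iostream>
#include <string>
#include <vector>

enum class State { READY = 0, STANDBY = 1, ERROR = 2 };   // ordered by "severity"

struct Node {
    std::string name;
    State state;
    bool included = true;   // excluded nodes are ignored by the parent
};

// Parent state derived only from the children currently included in the hierarchy.
State parentState(const std::vector<Node>& children) {
    State worst = State::READY;
    for (const auto& c : children)
        if (c.included) worst = std::max(worst, c.state);   // propagate the worst included state
    return worst;
}

int main() {
    std::vector<Node> detector = {{"HV", State::READY}, {"LV", State::ERROR}, {"FEE", State::READY}};
    std::cout << int(parentState(detector)) << "\n";   // 2 (ERROR): the LV error propagates up
    detector[1].included = false;                      // operator excludes the faulty LV subsystem
    std::cout << int(parentState(detector)) << "\n";   // 0 (READY): the rest can keep operating
}
```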

User Interface
High level user interfaces allow the operator to issue ‘meta commands’ to ease operator tasks
The operator has access to all detector specific panels

Putting pieces together

DCS UI
– ALICE component used by all detectors
– Standardized operation representation
– Detector specific panels
– The central operator can browse and operate detector panels
[Screenshot: central panel with FSM operation, DCS tree and a detector panel]

High Level Panels
Critical actions use high level panels
– All tools needed for the task are available on the panel
– An expert system guides the operator
[Screenshot: guided procedure panel, e.g. step “2. Inform SL and ECS”]

PMD CONTROLS USER INTERFACE

[Alarm panel: status of the alarm (Warning, Error or Fatal), the alarm text associated with the alarm, and the value of the datapoint.]

Right-click and select ALERT HELP

Getting help on Alerts
A window with specific instructions opens

ALICE-LHC handshake procedure
This procedure is always executed before the LHC changes the operational mode
The goal is to bring ALICE to a safe mode and confirm this to the LHC
The handshake is initiated on LHC request, and the DCS operator follows the instructions given on the operational panels (see following slides)

Handshake I: Injection WARNING

II: Injection Imminent

III: Injection READY

IV: Injection Completed

Handshake I: Dump WARNING

II: Dump Imminent

III: Dump READY

IV: Dump Completed

Handshake I: Adjust WARNING

II: Adjust Imminent

III: Adjust READY

IV: Adjust Completed

ALICE Safety

Safety
The DCS receives safety related information from:
– DSS
– Sniffer
– CSAM

CSAM
CSAM (CERN Safety Alarm Monitoring)
– Collects, transmits and presents all safety alarms
– The UI in ALICE presents all level 3 alarms relevant to Point 2: experiment, neighboring tunnel sections, surface

Sniffer
The sniffer system monitors gas samples taken in the solenoid volume and around/below the muon arm
– Detection of smoke, flammable gas and ODH (oxygen deficiency hazard)
– Alarm list view (default) and synoptic views

DSS
DSS (Detector Safety System)
– Robust, redundant part of the DCS, PLC based
– Shared by experiment safety, environment monitoring and detector interlocks
– Can take pre-programmed actions
Basic concept:
– A logical (AND, OR) combination of (triggered) inputs can raise an alarm. An alarm can trigger one or more outputs (actions).
[Diagram: inputs → alarm → outputs]
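The input–alarm–action chain can be illustrated with a small sketch. The alarm name, inputs and actions below are hypothetical examples, and the real DSS runs this logic on redundant PLCs rather than in software like this:

```cpp
#include <functional>
#include <iostream>
#include <string>
#include <vector>

// A DSS-style alarm: a boolean combination of inputs that, when satisfied,
// triggers a list of pre-programmed output actions.
struct DssAlarm {
    std::string name;
    std::function<bool(const std::vector<bool>&)> condition;  // AND/OR combination of inputs
    std::vector<std::string> actions;                         // outputs triggered by the alarm
};

int main() {
    // Hypothetical inputs: [0] smoke detected in rack, [1] cooling water flow OK
    std::vector<bool> inputs = {false, false};

    DssAlarm rackAlarm{
        "RACK_SMOKE_OR_COOLING_LOSS",
        [](const std::vector<bool>& in) { return in[0] || !in[1]; },  // smoke OR cooling lost
        {"CUT_RACK_POWER", "NOTIFY_OPERATOR"}
    };

    if (rackAlarm.condition(inputs)) {
        std::cout << "Alarm " << rackAlarm.name << " raised, actions:\n";
        for (const auto& action : rackAlarm.actions) std::cout << "  " << action << "\n";
    }
}
```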

DSS
DSS inputs and alarms; DSS outputs (actions)
Alarms are acknowledged, and actions are reset, by a single click on the name (first the alarm, then the action)

Services and Environment
To run the experiment, services are needed:
– Power (electricity)
– Cooling (water)
– Magnets (solenoid, dipole)
– Gas
The DCS monitors the status of these services
The DCS monitors the environment of the experiment:
– Temperatures, atmospheric pressure, humidity
– Radiation
– Magnetic field

Services and Environment
[Screenshots: rack power distribution status; cooling water temperatures]

Services and Environment
[Screenshots: magnet status; gas systems status]

Services and Environment
[Screenshots: temperature, pressure and humidity; RAMSES radiation monitors; B-field probes]

Summary
ALICE DCS is based on the commercial SCADA system WinCC Open Architecture, previously known as PVSSII
The DCS has to assure efficient, yet safe operation
– Standardize where possible; hide complexity and diversity
– Modelling through finite state machines
– Continuously develop user interfaces to ease operator tasks
Future
– Exploit the experience of the last years of operation
– Review state machines, simplify where possible
– Improve the user interface
– Aim for further automation in the operation
– Improve the robustness of the communication with external systems

BACKUP SLIDES

OPC Technology
– OPC is an industrial standard
– Implemented in PVSS as a manager providing a generic interface
– In ALICE most commercial devices are controlled via OPC
– ~ OPC items used in DCS
[Diagram: standardized device → device driver → OPC server → (DCOM: commands / status) → OPC client → PVSS]

Controlling Custom Hardware
Custom modules typically come with software packages developed by engineers
– Incompatible with PVSS
Missing link to PVSS
– Transport protocol
– Communication standard
The main objective of the ALICE DCS:
– Hide the device complexity
– Provide a generic communication protocol covering all custom architectures
[Diagram: custom device → device driver → ??? → PVSS; commands and status paths still undefined]

The tools supplied with the hardware were designed for a laboratory environment (C, C++, Python, Perl, Ruby, LabView, LabWindows, Java, Tcl/Tk…)
First step:
– PVSS and C++ imposed as the only platforms for the production system

The Choice of the Transport Protocol
CERN DIM is used for command and data transfer
– Implemented and supported in PVSS and C++ (by CERN)
– Client/server architecture
– Robust and stable
[Diagram: the DIM server registers its services with the DIM name server; the client searches for a service by name, subscribes to values and sends commands, receiving data/responses from the server.]

Filling the Gap Between FEE and PVSS
– Transport based on DIM
– Code embedded in the FED framework
[Diagram: custom device → device driver → FED (DIM) server → DIM commands/status → FED (DIM) client → PVSS]

Generic Front-end Device (FED) Architecture
– Communication interface with standardized commands and services (DIM server)
– Device-specific layer providing high-level functionality (i.e. configure, reset...)
– Low-level device interface (i.e. JTAG driver and commands)
– Generic client implemented as a PVSS manager (DIM client)
~ data channels serviced by FEDs in ALICE
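To complement the publishing example shown earlier, the command path of such a FED server (PVSS client → DIM command → device-specific layer) could look roughly like the sketch below. The command name, format string and the configuration logic are hypothetical, not the actual ALICE FED framework code:

```cpp
#include <dis.hxx>      // DIM server API: DimCommand, DimServer
#include <iostream>
#include <unistd.h>

// A FED-style command channel: PVSS (acting as DIM client) sends a configuration string,
// and the device-specific layer translates it into low-level actions on the front-end.
class ConfigureCommand : public DimCommand {
public:
    ConfigureCommand() : DimCommand("DEMO_FED/CONFIGURE", "C") {}  // "C" = string payload

    void commandHandler() override {
        const char* cfg = getString();
        std::cout << "Loading front-end configuration: " << cfg << "\n";
        // ... the device-specific layer would drive JTAG/custom buses here ...
    }
};

int main() {
    ConfigureCommand configure;              // register the command service
    DimServer::start("DEMO_FED_SERVER");     // announce the server to the DIM name server

    for (;;) pause();                        // commandHandler() runs when a client sends a command
}
```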