Slide 1: ALICE Detector Control Status Report
A. Augustinus, P. Chochula, G. De Cataldo, L. Jirdén, S. Popescu and the DCS team, ALICE collaboration, CERN, Geneva, Switzerland
André Augustinus, ICALEPCS 2005, Geneva, 10 October 2005

Slide 2: Outline
- The ALICE experiment at CERN
- Organization of the controls activities in ALICE
- Design goals and strategy
- DCS architecture
- Key concepts
- DCS infrastructure
- Summary and conclusion

Slide 3: Introduction (A Large Ion Collider Experiment)
- ALICE is one of the four LHC experiments
- Located at Point 2 of the LHC at CERN
- 18 different sub-detectors, 2 magnets
- Dedicated to heavy-ion physics; also participates in pp running
- About 1000 members, 86 institutes, 29 countries

Slide 4: Introduction
- Many sub-detector teams have limited expertise in controls, especially for large-scale experiments
- The ALICE Controls Coordination (ACC) team puts strong emphasis on coordination and support
- The Joint COntrols Project (JCOP) is a collaboration between CERN and all LHC experiments to exploit commonalities in the control systems
[Diagram: JCOP connects ATLAS, CMS, LHCb, ALICE (ACC), CERN (IT/CO) and the sub-detectors]

Slide 5: Design goals
- The DCS shall ensure safe and efficient operation: intuitive, user-friendly, automated
- Many parallel and distributed developments: modular, yet coherent and homogeneous
- Changing environment (hardware and operation): expandable, flexible
- Operational outside data taking, to safeguard the equipment: available, reliable
- Large world-wide user community: efficient and secure remote access
- Data collected by the DCS shall be available for offline analysis of the physics data

Slide 6: Strategy and methods
- Common tools, components and solutions
  - Strong coordination within the experiment (ACC)
  - Close collaboration with the other experiments (JCOP)
- In ALICE there are many similar sub-systems
- Identify commonalities through User Requirements
  - Collected in lightweight URDs and Overview Drawings
  - Through meetings and workshops

Slide 7: Hardware architecture
- 3 layers: supervisory, control and field
  - Supervisory: operator nodes, server nodes
  - Control: worker nodes connecting to the devices
  - Field: devices, sensors and actuators
- Reduce sharing of equipment between sub-detectors
- Standard hardware for computers
- Limit the diversity of devices in the field layer
  - Dependent on the sub-detector hardware
  - Use common hardware for similar tasks (e.g. the General Purpose Monitoring System)
- Interlocks and the DSS protect the equipment
  - The DSS (Detector Safety System) is the safe and reliable part of the DCS

Slide 8: Software architecture
- A tree-like structure representing sub-detectors, sub-systems and devices
  - Leaves (Device Units) 'drive' the devices
  - Nodes (Control Units) model and control the sub-tree below them
  - Commands flow down the tree, states flow up (see the sketch below)
- Operation is done from the root node
  - Any sub-tree can be removed from the tree and operated independently and concurrently: partitioning
- The behaviour and functionality of a control unit is modelled as a Finite State Machine (FSM)
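The command and state flow described on this slide can be illustrated with a short sketch. This is not the actual PVSS/SMI++ implementation used in ALICE; it is a minimal Python model with invented class, command and state names, showing how a Control Unit propagates commands down to Device Units, summarizes their states on the way up, and how a sub-tree can be excluded for independent operation (partitioning).

```python
# Minimal, illustrative model of the DCS control hierarchy.
# Not the real PVSS/SMI++ code; class, command and state names are invented.

class DeviceUnit:
    """Leaf of the tree: 'drives' one device."""
    def __init__(self, name):
        self.name = name
        self.state = "OFF"

    def send_command(self, command):
        # A real DU would talk to hardware; here we just map commands to states.
        self.state = {"GO_ON": "ON", "GO_OFF": "OFF"}.get(command, self.state)

    def get_state(self):
        return self.state


class ControlUnit:
    """Node of the tree: models and controls the sub-tree below it."""
    def __init__(self, name, children):
        self.name = name
        self.children = list(children)
        self.excluded = set()          # partitioned-out sub-trees

    def send_command(self, command):
        # Commands flow down to all included children.
        for child in self.children:
            if child not in self.excluded:
                child.send_command(command)

    def get_state(self):
        # States flow up: the CU logically combines its children's states.
        states = {c.get_state() for c in self.children if c not in self.excluded}
        if states == {"ON"}:
            return "ON"
        if states == {"OFF"}:
            return "OFF"
        return "MIXED"

    def exclude(self, child):
        # Partitioning: the excluded sub-tree can be operated independently.
        self.excluded.add(child)


if __name__ == "__main__":
    hv = ControlUnit("HV", [DeviceUnit("HV_ch0"), DeviceUnit("HV_ch1")])
    detector = ControlUnit("SubDetector", [hv])
    detector.send_command("GO_ON")   # command flows down to the leaves
    print(detector.get_state())      # states flow up -> "ON"
```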

Slide 9: Software architecture
[Diagram: commands flow down the hierarchy; states and alarms flow up]
- Each CU logically combines states and distributes commands

Slide 10: DCS key concepts
- The FSM concept is fundamental to the DCS
  - An intuitive and generic method to model the behaviour of a system or a device
  - An object has a well-defined collection of states
  - It moves between states by executing actions, triggered by an operator or by an external event
- The DCS will interface to a variety of front-end electronics
  - Front End Device (FED) concept: hides the implementation details behind a common client-server interface, based on DIM (see the sketch below)
- Use common software tools: PVSS II, JCOP framework
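The FED concept can be sketched as a common interface that every front-end server implements. The real ALICE FEDs are built on DIM; the snippet below deliberately does not use the DIM API (the interface, command and service names are invented for illustration) and only mimics the idea in plain Python: the DCS client sees a generic command/service interface, while the detector-specific details stay inside the server.

```python
# Illustration of the FED (Front End Device) idea: a common client-server
# interface that hides sub-detector specific front-end electronics details.
# The real ALICE FEDs are based on DIM; this sketch does not use the DIM API.

from abc import ABC, abstractmethod


class FedServer(ABC):
    """Common interface a FED server offers to the DCS (assumed names)."""

    @abstractmethod
    def execute(self, command: str, payload: dict) -> None:
        """Commands sent by the DCS client (e.g. configure, reset)."""

    @abstractmethod
    def publish(self) -> dict:
        """Services published back to the DCS (status, monitored values)."""


class ExampleFed(FedServer):
    """A hypothetical sub-detector implementation behind the common interface."""

    def __init__(self):
        self._configured = False

    def execute(self, command, payload):
        if command == "CONFIGURE":
            # Detector-specific register access would happen here.
            self._configured = True
        elif command == "RESET":
            self._configured = False

    def publish(self):
        return {"fedState": "READY" if self._configured else "IDLE"}


if __name__ == "__main__":
    fed = ExampleFed()
    fed.execute("CONFIGURE", {"runType": "PHYSICS"})
    print(fed.publish())   # -> {'fedState': 'READY'}
```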

Slide 11: Common solutions
- It does not stop with the selection of common tools and standard hardware
- Define a standard behaviour for the same class of devices (e.g. HV power supplies)
  - Provide the sub-detectors with a standard state diagram (illustrated below)
- Define standard states, actions and operational sequences (automation) that can be used when defining the behaviour of a sub-detector
- Guidelines for development, naming, numbering, look and feel of user interfaces, etc.
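As an illustration of such a standard behaviour, a possible state diagram for a high-voltage channel can be written down as a simple transition table. The state and action names below are examples, not the official ALICE definitions; the point is that one common diagram can be handed to every sub-detector and exercised mechanically.

```python
# Illustrative standard state diagram for an HV power-supply channel.
# State and action names are examples, not the official ALICE definitions.

HV_TRANSITIONS = {
    # (current state, action) -> next state
    ("OFF",     "GO_STANDBY"): "STANDBY",
    ("STANDBY", "GO_READY"):   "READY",     # ramp up to nominal voltage
    ("READY",   "GO_STANDBY"): "STANDBY",   # ramp down to standby voltage
    ("STANDBY", "GO_OFF"):     "OFF",
    # a hardware fault would force any state into ERROR (handled elsewhere)
}


def next_state(state, action):
    """Apply an action; illegal actions leave the state unchanged."""
    return HV_TRANSITIONS.get((state, action), state)


if __name__ == "__main__":
    s = "OFF"
    for a in ("GO_STANDBY", "GO_READY"):
        s = next_state(s, a)
    print(s)   # -> READY
```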

Slide 12: DCS infrastructure
- The DCS needs adequate infrastructure (computers, network, ...)
- Installation and maintenance of the network will be done by the CERN networking group (IT/CS)
- All computers installed for the DCS will be procured, installed and maintained by a central team
  - Highly standardized hardware
- Operation of the network and computers will follow rules and guidelines and use tools from the "Computing and Network Infrastructure for Controls" (CNIC) working group

Slide 13: Network
- The controls network will be a separate, well-protected network
  - Without direct access from outside the experimental area
  - With remote access only through application gateways
  - With all equipment on secure power
- A first estimate shows the need for around 350 network connections, 2/3 of them in the experimental cavern
  - Not including the ~50 switches connecting ~800 embedded processors on the detector
- Current installations use the CERN campus network
- The controls network will be operational starting in the 2nd quarter of 2006

Slide 14: Remote access
- With the large world-wide user community, remote access is an important aspect
- Remote users will access the PVSS II projects through a remote user interface via a Terminal Server
- By default a remote user has only observer rights; higher privileges can be granted to experts for specific, well-defined tasks (see the sketch below)
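A minimal sketch of this privilege model, assuming invented user, task and role names: every remote session starts with observer rights, and a higher privilege is granted only to a named expert for a specific, well-defined task.

```python
# Sketch of the remote-access privilege idea: observer rights by default,
# higher privileges granted per expert and per well-defined task.
# User, task and role names are invented for illustration.

DEFAULT_ROLE = "OBSERVER"

# Privileges explicitly granted by the central team: (user, task) -> role
GRANTED = {
    ("hv_expert", "hv_calibration"): "EXPERT",
}


def role_for(user, task=None):
    """Return the role a remote session gets for a given task."""
    if task is not None and (user, task) in GRANTED:
        return GRANTED[(user, task)]
    return DEFAULT_ROLE


if __name__ == "__main__":
    print(role_for("anybody"))                      # -> OBSERVER
    print(role_for("hv_expert", "hv_calibration"))  # -> EXPERT
```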

Slide 15: Remote access
- This strategy has been tested with 60 remote users simultaneously running a user interface
  - No degradation of performance, neither of the project nor of the Terminal Server
- Tested successfully from several places around the world

Slide 16: Computers
- 2U rack-mounted PCs, in specially equipped racks
  - Cooling doors, power control, on secure power
- The baseline operating system is Windows; Linux is used in specific cases
- The DCS will be run as a large distributed PVSS II system
  - Based on several performance tests on large distributed systems
  - More detailed performance tests on several components are being performed

Slide 17: Computers
- A first distribution of tasks across computing nodes has led to an estimate of the number of nodes needed
  - Including servers and system management nodes
  - Combining tasks with low resource demands
  - Maintaining the separation between sub-detectors
- A core DCS system has been installed this summer
  - 5 machines, to be used by the sub-detectors for equipment tests at the first installations
  - More worker nodes and devices are to be installed soon: 50% installed by the 1st quarter of 2006, the rest by the 3rd quarter of 2006

Slide 18: Further activities at the site
- The DSS is being commissioned
  - Experimental area surveillance
- Interfaces to the first gas systems and to the site infrastructure (CERN safety system, power control, environment monitoring, ...) are installed and made available to the users
  - They will be extended gradually as the installation of the services (cooling, electricity, etc.) progresses
- Coordinated operation of the online systems (DAQ, Trigger, DCS) will start early 2006
  - Cosmic runs with the TPC and other detectors

Slide 19: Summary
- Many sub-detectors have implemented parts of their control system and have used them in laboratory and beam tests
  - They could profit from the coordination and collaboration
  - Their very valuable feedback allowed us to optimise and improve the DCS design
  - The chosen architecture proved to be well adapted to the sub-detector needs
- The process will continue with the first installations; this, together with extensive performance tests, will help us to further optimise and refine the system

Slide 20: Conclusion
- The results so far make us confident that the ALICE Detector Control System will be fully operational well in time to allow safe and efficient operation of the experiment and to record the first collisions at the LHC