
ALICE

The ALICE experiment: size 16 x 26 metres (some detectors are placed more than 100 m away from the interaction point), weight 10,000 tons, 18 detectors, 2 magnets. Dedicated to the study of ultra-relativistic heavy-ion collisions at the LHC. Collaboration: 1000 members, 90 institutes, 30 countries.

ALICE DCS project: started 8 years ago; a small central team collaborating with the detector groups (~100 people involved) and the other LHC experiments.

ALICE DCS: responsible for safe 24/7 operation of ALICE; based on the commercial SCADA system PVSS II; controls 18 sub-detectors and about 150 subsystems (HV, LV, FEE, …); provides infrastructure and services for the experiment.

ALICE TPC – the largest ever built
Very high voltage (100 kV), 25 µm HV membrane
High voltage (3 kV)
Low voltage (60 kW)
Front-end electronics mounted on the chambers
Cooling with 0.1 °C stability
Gas volume of 88 m³
Dimensions: length 5.1 m, diameter 5.6 m
570 k readout channels

ALICE Inner Tracking System (ITS)
~10 m² of silicon detectors, 6 layers: pixels, drift, double-sided strips
3.9 cm < r < 43 cm (pixel, drift and strip layers)
Pb-Sn solder bumps: ~30 µm diameter
Readout chips: 725 µm native thickness, thinned to 150 µm after bump deposition
Pixels configured via DCS

Installation of the ITS in ALICE

Photon Spectrometer
~8 m² of PbWO4 crystals, r ~ 5 m
Operated at −20 °C
18 k channels controlled and configured via DCS

Muon Spectrometer
10 CSC tracking planes, 90 m², 1.1 M FEE
4 RPC trigger planes, 120 m², 20 k FEE

Front-end and readout electronics – a new type of challenge for the DCS
Unlike in the past, the DCS is now involved in the configuration and control of a large number of FEE modules
Examples (TPC readout partition): ALICE TPC with 4500 front-end cards, > channels; ALICE TRD with > readout channels and tracking CPUs configured via DCS
FEE cards are often mounted directly on the detectors
A large number of Ethernet ports is required directly on the detectors (more than 700 SBCs in ALICE)
DCS operation in magnetic field

Controls context (architecture diagram)
The Detector Control System connects to:
Detectors: SPD, PHO, FMD, T0, V0, PMD, MTR, MTK, ZDC, ACO, SDD, SSD, TPC, TRD, TOF, HMP
External services and systems: electricity, ventilation, cooling, gas, magnets, safety, access control, LHC
Databases: configuration database, archival database (SCADA, 1000 ins/s, up to 6 GB), conditions database
Infrastructure: B-field, space frame, beam pipe, radiation environment
ALICE systems: DAQ, TRIGGER, HLT, ECS, OFFLINE

Dataflow in ALICE DCS
6 GB of data are needed to fully configure ALICE for operation
Several stages of filtering are applied to the acquired data:
values/s read by the software
values/s injected into PVSS
1000 values/s written to ORACLE after smoothing in PVSS
>200 values/s sent to consumers
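The "smoothing in PVSS" step above is essentially a deadband filter: a value is only archived when it changes by more than a configured threshold. The following C++ fragment is a minimal sketch of that idea under this assumption; it is not the actual PVSS archive smoothing code, and the class name and threshold handling are invented for illustration.

    #include <cmath>
    #include <optional>

    // Illustrative value deadband filter: forward a reading to the archive only
    // when it differs from the last archived value by more than a threshold.
    // Hypothetical helper, not the actual PVSS smoothing implementation.
    class DeadbandFilter {
    public:
        explicit DeadbandFilter(double threshold) : threshold_(threshold) {}

        // Returns the value to archive, or std::nullopt if the change is too small.
        std::optional<double> process(double reading) {
            if (!last_ || std::fabs(reading - *last_) > threshold_) {
                last_ = reading;     // remember what was actually archived
                return reading;      // write to the archival database
            }
            return std::nullopt;     // suppressed: archive keeps the previous value
        }

    private:
        double threshold_;
        std::optional<double> last_; // last archived value, empty before the first one
    };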

18 detectors with different requirements
Effort towards device standardization, but still a large diversity, mainly in the FEE part
Large number of busses (CANbus, JTAG, Profibus, RS232, Ethernet, custom links…)
1200 network-attached devices
270 crates (VME and power supplies)
controlled voltage channels

OPC items, Front-End (FED) services and parameters supervised by the DCS, monitored at a typical rate of 1 Hz
Hardware diversity is managed through standard interfaces:
OPC servers for commercial devices
FED servers for custom hardware – they provide hardware abstraction and use the CERN DIM (TCP/IP based) protocol for communication
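To give an idea of what a FED server's DIM publication can look like, here is a minimal sketch assuming the standard DIM C++ server interface (dis.hxx with DimService and DimServer); the service name, server name and the readHardware() helper are invented for the example and do not correspond to any real ALICE FED server.

    #include <dis.hxx>    // DIM server classes: DimService, DimServer
    #include <cstdlib>
    #include <unistd.h>

    // Stand-in for real front-end hardware access (hypothetical).
    static float readHardware() { return 25.0f + (std::rand() % 100) / 100.0f; }

    int main() {
        float temperature = 0.0f;

        // Publish the variable as a named DIM service; clients subscribe by name.
        DimService tempService("DEMO_FED/CHAMBER1/TEMPERATURE", temperature);
        DimServer::start("DEMO_FED_SERVER");

        while (true) {
            temperature = readHardware();   // read the monitored parameter
            tempService.updateService();    // push the new value to subscribers
            sleep(1);                       // typical monitoring rate of ~1 Hz
        }
        return 0;
    }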

The core of the DCS is based on the commercial SCADA system PVSS II
110 detector computers
60 backend servers
DCS Oracle RAC (able to process up to inserts/s)

 A PVSS II system is composed of specialized program modules (managers): user interface (UI), control (CTL), API, data manager (DM), event manager (EM) and drivers (DRV)
 Managers communicate via TCP/IP
 ALICE DCS is built from 100 PVSS systems composed of 900 managers
 PVSS II is extended by the JCOP and ALICE frameworks, on top of which the user applications are built (layers: PVSS II → JCOP Framework → ALICE Framework → User Application)

In a simple system, all managers (user interface, control, API, event and data managers, drivers) run on the same machine
In a scattered system, the managers can run on dedicated machines

In a distributed system, several PVSS II systems (simple or scattered) are interconnected through distribution managers

Each detector DCS is built as a distributed PVSS II system
Mesh, no hierarchical topology
Detector specific

ALICE DCS is built as a distributed system of detector systems
Central servers connect to ALL detector systems for global data exchange, synchronization, monitoring…

A PVSS II distributed system is not a natural representation of the system for the operator
The ALICE DCS is therefore modeled as a FSM using the CERN SMI++ tools:
hide the experiment complexity
focus on the operational aspect

Hierarchical approach (tree: DCS → detectors → subsystems such as VHV, HV, LV, FEE → channels):
ALICE DCS is represented as a tree composed of detector systems
Each detector system is composed of subsystems
Subsystems are structured into devices (crates, boards) and channels

DCS devices are described as FSMs
State diagrams are standardized for channels and devices of the same type
Single-channel FSM (simplified): states OFF, STB, CFG, RAMP UP, ON, RAMP DWN, ERR; commands GO_STB, CONFIGURE, GO_ON, GO_OFF, RESET
Top-level FSM (simplified): states STB, BEAM TUNING, READY, MOVING BT, MOVING READY, ERR; commands GO_BT, GO_RDY, GO_STB, RESET
The top-level DCS takes into account the status of all leaves
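As an illustration of how such a standardized channel machine behaves, here is a minimal C++ sketch of the simplified single-channel FSM above. In the real system these machines are described with the CERN SMI++ tools, so the class, command strings and transition table below are illustrative assumptions only, not the actual DCS code.

    #include <map>
    #include <string>
    #include <utility>

    // Simplified single-channel FSM: states OFF, STB, CFG, RAMP_UP, ON, RAMP_DWN, ERR
    // and the commands from the slide. Plain C++ sketch of the idea only.
    enum class State { OFF, STB, CFG, RAMP_UP, ON, RAMP_DWN, ERR };

    class Channel {
    public:
        State state() const { return state_; }

        // Apply an operator command; commands illegal in the current state are ignored.
        void command(const std::string& cmd) {
            static const std::map<std::pair<State, std::string>, State> table = {
                {{State::OFF, "GO_STB"},    State::STB},
                {{State::STB, "CONFIGURE"}, State::CFG},
                {{State::CFG, "GO_ON"},     State::RAMP_UP},  // ramping is an intermediate state
                {{State::ON,  "GO_OFF"},    State::RAMP_DWN},
                {{State::ERR, "RESET"},     State::OFF},
            };
            auto it = table.find({state_, cmd});
            if (it != table.end()) state_ = it->second;
        }

        // Called when the hardware reports that a ramp has finished or failed.
        void rampDone(bool ok) {
            if (state_ == State::RAMP_UP)  state_ = ok ? State::ON  : State::ERR;
            if (state_ == State::RAMP_DWN) state_ = ok ? State::OFF : State::ERR;
        }

    private:
        State state_ = State::OFF;
    };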

The top-level DCS state is computed as a combination of all FSM states
A single channel can affect the top-level state (and hence the readiness of the whole DCS)

The DCS operator can exclude any part of the DCS tree and release its control to other users
The top-level DCS does not take the excluded parts of the hierarchy into account
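A minimal sketch of how a parent state can be summarized from its children while skipping excluded nodes is shown below, assuming a simple recursive tree walk where the least ready state dominates. The node structure, state names and severity ranking are illustrative assumptions; in the real DCS this behaviour is defined by SMI++ object rules, not by code like this.

    #include <string>
    #include <vector>

    // Illustrative "worst state wins" summary over a DCS-like tree,
    // ignoring children that the operator has excluded from the hierarchy.
    struct Node {
        std::string state;          // e.g. "READY", "STANDBY", "ERROR"
        bool excluded = false;      // excluded parts are not taken into account
        std::vector<Node> children;
    };

    // Rank states so that the least ready state dominates the summary.
    static int severity(const std::string& s) {
        if (s == "ERROR")   return 3;
        if (s == "STANDBY") return 2;
        if (s == "READY")   return 1;
        return 0;
    }

    std::string summarize(const Node& node) {
        if (node.children.empty()) return node.state;
        std::string worst = "READY";           // default if every child is excluded
        for (const auto& child : node.children) {
            if (child.excluded) continue;      // released to a subsystem expert
            std::string s = summarize(child);
            if (severity(s) > severity(worst)) worst = s;
        }
        return worst;
    }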

A subsystem expert can then take control and take care of the problem
The excluded parts of the hierarchy can be returned to the central operator on the fly

DCS UI
An ALICE component used by all detectors
Standardized representation of the operation, plus detector-specific panels
The central operator can browse and operate the detector panels (central panel, FSM operation, detector panel, DCS tree)

Detector systems are developed by the detector teams outside CERN
More than 100 integration sessions were organized with individual detectors
Verification of: conformity to rules (naming conventions, interfaces, …), stability, performance, safety (interlocks, alerts…), dataflow (configuration, archival), security (access control), common operation with ALICE
Common sessions with all ALICE detectors

June 15, 2008: first tracks seen at the LHC, recorded by the ALICE SPD during the injection line tests
First beams injected into the LHC and steered 3 km around the ring, passing ALICE around 8 pm (Beijing time) … which triggered worldwide celebrations (Beijing, around 8 pm)

September 10, 2008: first beams circulating in the LHC … seen by the luminosity monitor (V0) and by the ALICE DCS … which was READY

 ALICE DCS is built on commercial software (PVSS, OPC) with CERN extensions (Framework, SMI++, DIM, FED servers…)
 The large distributed PVSS system operates according to specifications
 ALICE DCS was READY for the LHC beams in 2008
 Looking forward to physics in 2009