Configuration and Validation of the LHC Beam Loss Monitoring System


Configuration of BLETC Module System Parameters Validation Configuration and Validation of the LHC Beam Loss Monitoring System. C. Zamantzas, B. Dehning, J. Emery, J. Fitzek, F. Follin, S. Jackson, V. Kain, G. Kruk, M. Misiowiec, C. Roderick, M. Sapinski Abstract: The LHC Beam Loss Monitoring (BLM) system is one of the most complex instrumentation systems deployed in the LHC. As well as protecting the machine, the system is also used as a means of diagnosing machine faults, and providing feedback of losses to the control room and several systems such as the Collimation, the Beam Dump and the Post-Mortem. The system has to transmit and process signals from over 4’000 monitors, and has approaching 3 million configurable parameters. This paper describes the types of configuration data needed, the means used to store and deploy all the parameters in such a distributed system and how operators are able to alter the operating parameters of the system, particularly with regard to the loss threshold values. The various security mechanisms put in place, both at the hardware and software level, to avoid accidental or malicious modification of these BLM parameters are also shown for each case. Configuration of BLETC Module Storage of Parameters The FPGA firmware can be loaded either: Initially the data are imported in MTF and are copied and linked together after scrutinous verification to the Layout DB. The latter provides hierarchical views of the system as well as information on the position in the tunnel and with respect to all other elements of the LHC. Remotely via the Front-End Computer and the VME bus, Locally via the JTAG connection in the front panel Set as auto-configurable by utilising the on-board memory. MTF DB LAYOUT DB LSA DB The complete dataset can be one-way synchronised to LSA where it is split into tables to hold information for each monitor, crate or sector. Parameters are stored in the configuration memory. 
After the initialisation of the module, the FPGA makes a local copy of the parameters in its embedded memory. The stored parameters can be updated remotely, but, where safety requires it, remote updates can be disabled by an on-board switch so that only local updates are possible.

System Parameters Validation

The TRIM application GUI, triggered by the operators, reads the MASTER table, applies a unique threshold coefficient per monitor (always < 1), and saves the new table as the APPLIED table. The APPLIED table is sent to the system and its correctness is verified: the Management of Critical Settings (MCS) Online Check requests the FEC to read back all the currently used parameters (by stepping through the energy levels and recording the threshold values in use), and the FEC then transmits them to the MCS Online Check for comparison. Both the FEC and the Software Interlock System (SIS) receive the PASS/FAIL result. In this way, operators and domain experts can, where necessary, scale down the individual threshold values of each monitor without impairing the protection characteristics of the system. Driving of settings can be performed only when the accelerator status permits, i.e. offline.

Presented at the 9th European Workshop on Beam Diagnostics and Instrumentation for Particle Accelerators (DIPAC09), Basel, 25-27 May 2009.
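The read-back comparison at the heart of the MCS Online Check can be sketched as follows: the values the FEC reports as actually in use, one per monitor and energy level, are compared entry by entry against the APPLIED table, and a single PASS/FAIL verdict is produced for the FEC and the SIS. This is a minimal sketch under assumed data shapes; the real check and its interfaces are more elaborate.

```python
def online_check(applied_table, read_back):
    """Compare read-back thresholds against the APPLIED table.

    applied_table / read_back: dicts keyed by (monitor, energy_level),
    mapping to the threshold value. Returns 'PASS' only if every
    applied entry is reproduced exactly in the read-back."""
    for key, expected in applied_table.items():
        if read_back.get(key) != expected:
            return "FAIL"
    return "PASS"

# Hypothetical APPLIED table: thresholds per monitor and energy level.
applied = {("BLM.A", 450): 0.8, ("BLM.A", 3500): 0.4}

# Matching read-back from the FEC -> PASS.
result_ok = online_check(applied, dict(applied))

# A single discrepancy at one energy level -> FAIL.
result_bad = online_check(applied, {("BLM.A", 450): 0.8, ("BLM.A", 3500): 0.5})
```

An exact-match comparison is the natural choice here: any deviation between what was driven and what the hardware uses is a protection issue, so there is no tolerance band.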