1 An overview of FPGA use in the LHC accelerator and the CERN experiments.
Dr. Salvatore Danzeca, EN-STI-ECE
SEFUW: SpacE FPGA Users Workshop, 3rd Edition
Thanks a lot to: F. Anghinolfi, K. Wyllie, E. Chesta, A. Masi, M. Brugger, S. Gilardoni, T. Grassi, E. Gousiou, M. Barros Marin, N. Trikoupis, S. Uznanski, P. Peronnard

2 Outline
- Radiation environment
- The challenge of using COTS FPGAs at CERN
- Accelerator equipment: cryogenics; power control, quench protection, beam instrumentation
- Experiments equipment: ALICE TPC; CMS HCAL
- The on-going research field and future solutions: FPGA system-level testing at the CERN CHARM facility
- Conclusions

3 The CERN accelerator complex

4 Accelerators: Radiation Sources
- Direct beam losses: collimators and collimator-like objects (injection, extraction, dump); levels usually scale with beam intensity and energy
- Beam/beam and beam/target collisions: around the experimental areas; scale with luminosity / protons on target (p.o.t.) and energy
- Beam-residual-gas interactions: in circular machines, all areas along the ring; scale with intensity, residual gas density and energy
- Synchrotron radiation (lepton machines)
- RF (e.g. during conditioning)
- Radiation sources (irradiators, calibration, tomography, etc.)

5 Radiation Levels in the CERN accelerators
[Figure: radiation levels from sea level through avionics and the ISS to space and deep space, marking the R2E single-event-effect and cumulative-damage regimes, detector regions (PIXEL, TRT, ATLAS MUON), and the electronics classes used: COTS systems, custom boards with COTS, and hardened electronics.]

6 The challenge
- Reliability is a main concern for CERN equipment, and the criticality of the equipment can be very high:
  - 2 proton beams at 6.5 TeV with ~3×10^14 p+ each, a total stored energy of 0.7 GJ, sufficient to melt 1 ton of Cu
  - Tiny fractions of the stored beam suffice to quench a superconducting LHC magnet or even to destroy parts of the accelerator
- Radiation effects on the FPGAs can cause the beam to be lost (beam dump): time lost for physics
- Radiation effects on the FPGAs can cause a failure of the LHC safety systems: parts of the machine can be destroyed
- In the experiments, radiation effects must be minimized so that the data remain meaningful for physics analysis
- The number of deployed devices N is high, and the expected number of failures scales with it (see the worked example below):

  N_failure = N × σ × Φ, with σ the per-device SEE cross section and Φ the fluence
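To make the scaling concrete, a minimal numeric sketch in Python. The device count and annual fluence below are illustrative assumptions, not CERN figures; only the ~10^-10 cm² cross section echoes the FGC2 number quoted later in this talk.

```python
# Expected number of radiation-induced failures: N_failure = N * sigma * fluence.
# All values below are illustrative assumptions, not measured CERN figures.
n_devices = 1000         # deployed devices of one type (N)
sigma_cm2 = 1e-10        # per-device SEE cross section [cm^2]
fluence_per_year = 1e7   # assumed high-energy hadron fluence [cm^-2 / year]

failures_per_year = n_devices * sigma_cm2 * fluence_per_year
print(f"expected failures per year: {failures_per_year:.2f}")  # -> 1.00
```

Even a per-device cross section that looks negligible in isolation becomes an operational concern once a thousand units sit in the tunnel.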

7 FPGA vs ASIC use at CERN
- Accelerator sector: ASIC use is very limited because of the long development time and the resources needed, so COTS FPGAs are the primary choice.
  - Benefits of COTS: low cost (driven by high volumes), low development costs, re-programmability, and well-developed, affordable tool platforms
- Experiments: close to the interaction points, the high radiation levels and high performance requirements call for custom ASIC developments. In environments with low/medium radiation, where power and integration constraints are less critical, FPGAs are now an attractive alternative.

8 Accelerator equipment using COTS FPGAs in radiation areas
- Cryogenics
- Power converters
- Quench protection system
- Beam instrumentation
All of these share a common architecture deployed in the radiation environment.

9 Cryogenics FPGA use
- LHC: installed below the dipoles in all arcs from cell 8 and in shielded areas
- Card architecture includes anti-fuse FPGAs and other critical components (regulators, ADCs, DACs)
- Overall design: the electronic cards use binary-data interfaces and perform simple math; complex calculations are performed at a higher level by a PC (no radiation)
- FPGA/PCB design mitigations: TMR (flip-flops), safe state machines, watchdogs, memory data refreshing, resets and power cycles (a conceptual sketch of TMR voting follows below)
- Scale: 800 WorldFIP electronic crates; active channels: 6500 temperature, 800 pressure, 500 liquid-He level gauges, 1500 cold-mass heaters, 500 beam-screen heaters, 1100 mechanical switches (I/O)
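As a conceptual illustration of the TMR mitigation listed above (not the actual HDL used in these cards, where the voting is done in the FPGA fabric on triplicated flip-flops), a bitwise 2-of-3 majority vote can be sketched as:

```python
def tmr_vote(a: int, b: int, c: int) -> int:
    """Bitwise 2-of-3 majority vote over three register copies.

    If a single copy is corrupted by an SEU, the other two outvote it,
    so the voted value is still correct.
    """
    return (a & b) | (b & c) | (a & c)

# Example: copy 'b' suffers a single bit flip; the vote still returns the good value.
good = 0b1011
assert tmr_vote(good, good ^ 0b0100, good) == good
```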

10 Why choose an antifuse FPGA?
Reliability:
- No SEEs after applying all the mitigation techniques; a card malfunction can result in a beam dump and might cause damage to the accelerator (e.g. via the heaters)
- Long lifetime: can withstand more than 800 Gy
Drawbacks:
- Only very small designs fit (all the resources are used by the project)
- Non-reprogrammability can force a replacement of all the cards if an upgrade is needed (this has never happened)
- Higher cost
Currently searching for a new FPGA candidate: reprogrammable, latch-up free, tolerant to >1 kGy (100 krad).

11 Power converters, quench protection system and beam instrumentation
- Power converter radiation-tolerant controller, FGClite:
  - Radiation-tolerant solution based on the ProASIC3 FPGA
  - Will replace the older FGC2, which caused 10 and 3 beam dumps during the 2012 and later operation periods, respectively (cross section of ~10^-10 cm²)
  - 1600 units to be produced and deployed
- Quench protection system:
  - The DQHSU is a rad-tol fast measurement board based on the ProASIC3 that supervises quench-heater discharges; 1232 cards installed under the LHC dipoles
  - Main features: communication handling, ADC and DAC control, digital notch filter design and processing (a sketch of such a filter follows after this list)
- Beam instrumentation:
  - Point-to-point architecture (different from the common fieldbus)
  - Uses the ProASIC3 with a rad-hard high-speed link (GBT/Versatile Link)
  - 350 units to be produced
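One of the DQHSU features listed above is a digital notch filter. As a hedged illustration of the general technique only (the coefficients and sampling parameters below are arbitrary, not the DQHSU design), a second-order IIR notch can be derived from the standard biquad "audio EQ cookbook" formulas:

```python
import math

def notch_coefficients(f0: float, fs: float, q: float):
    """Biquad notch centred at f0 Hz for sample rate fs Hz (RBJ cookbook)."""
    w0 = 2.0 * math.pi * f0 / fs
    alpha = math.sin(w0) / (2.0 * q)
    a0 = 1.0 + alpha
    b = [1.0 / a0, -2.0 * math.cos(w0) / a0, 1.0 / a0]
    a = [1.0, -2.0 * math.cos(w0) / a0, (1.0 - alpha) / a0]
    return b, a

def filter_sample(b, a, x, state):
    """Direct-form I step: y = b0*x + b1*x1 + b2*x2 - a1*y1 - a2*y2."""
    x1, x2, y1, y2 = state
    y = b[0] * x + b[1] * x1 + b[2] * x2 - a[1] * y1 - a[2] * y2
    return y, (x, x1, y, y1)

# Example: notch out 50 Hz mains pickup from a signal sampled at 10 kHz.
b, a = notch_coefficients(f0=50.0, fs=10_000.0, q=30.0)
state = (0.0, 0.0, 0.0, 0.0)
y, state = filter_sample(b, a, 1.0, state)  # feed samples one by one
```

In an FPGA this structure maps to a handful of multipliers and delay registers per channel, which is why the limited DSP resources of the ProASIC3 (noted on the next slide) matter.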

12 Why choose a flash-based FPGA (ProASIC3)?
Reliability:
- No SEEs after applying all the mitigation techniques; a card malfunction can result in a beam dump
Reprogrammability:
- Several features have already been added on the go during operation
Drawbacks:
- The TID limit is lower than for the antifuse: ~500 Gy
- The reprogrammability feature no longer works above 200 Gy
- Limited resources in terms of logic gates (e.g. DSP math)
Currently searching for a new FPGA candidate: reprogrammable, latch-up free, tolerant up to >500 Gy (50 krad), with DSP capability.

13 Experiment architectures
- Many subsystems use or will use FPGAs in radiation areas (though not in extreme radiation areas such as the central trackers):
  - ATLAS: TileCal, muons
  - CMS: HCAL, muons, GEM, CT-PPS
  - LHCb: RICH, calorimeters, muons
  - ALICE: TPC
- In the HEP front-end electronics installed around 2005, FPGAs were mostly used for control and readout logic, with external high-speed links.
- Future users of SRAM FPGAs in the radiation zones will be ATLAS TileCal, ATLAS LAr and LHCb RICH. They favour the Kintex7 because it can auto-correct SEUs in the configuration bits (using an IP provided by Xilinx).
- Several groups (CMS HCAL, CMS muon detectors, ALICE) plan to use the flash-based Igloo2 and SmartFusion2 for future upgrades.
- A more detailed list of experiment FPGA use is in the backup slides.

14 ALICE TPC Readout Electronics
- 216 Readout Control Units (RCUs): each reads data from the Front-End Cards and sends it over fiber for analysis and permanent storage
- 2 SRAM-based FPGAs and 2 flash-based FPGAs per system; TMR is not possible due to resource limitations
- The reconfiguration network consists of a radiation-tolerant flash memory, a radiation-tolerant flash-based FPGA, and the DCS board, an embedded PC running Linux; it corrects SEUs in the configuration memory of the Xilinx Virtex-II Pro
- Why it works: active partial reconfiguration
- How it works: the RCU support FPGA reads one frame at a time from the flash memory and from the Xilinx configuration memory; the frames are compared bit by bit and, if a difference is found, the faulty frame is overwritten (see the sketch below)
- Upgrade to the RCU2: a single FPGA based on the SmartFusion2 (Linux on ARM + FPGA fabric)
References: J. Alme, TWEPP 2013; J. Alme, CERN FPGA Workshop 2014
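A minimal sketch of that frame-scrubbing loop, written in Python for readability. The frame size and the read/write interfaces are hypothetical stand-ins; in the real system this logic runs in the flash-based support FPGA against the Virtex-II Pro configuration port.

```python
FRAME_BYTES = 128  # hypothetical frame size; the real value is device-specific

def scrub_once(golden_flash, config_mem, n_frames: int) -> int:
    """One scrubbing pass: compare each configuration frame against the
    golden image in flash and rewrite any frame that differs.

    Assumed (hypothetical) interfaces:
      golden_flash.read_frame(i) / config_mem.read_frame(i) -> bytes
      config_mem.write_frame(i, data) rewrites one frame via
      active partial reconfiguration.
    Returns the number of corrected frames.
    """
    corrected = 0
    for i in range(n_frames):
        golden = golden_flash.read_frame(i)
        actual = config_mem.read_frame(i)
        if actual != golden:                   # bit-by-bit frame comparison
            config_mem.write_frame(i, golden)  # overwrite only the faulty frame
            corrected += 1
    return corrected
```

The key property is that only the faulty frame is rewritten, so the FPGA keeps operating while its configuration is repaired in place.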

15 CMS HCAL
- FPGAs sit both in the front-end electronics (FEE) mounted on the detector and in the back-end electronics (BEE) located in the counting rooms
- Existing system: produced in 2004. Upgraded system: deployed in steps between 2014 and 2019
- The new FPGA system uses SERDES links at about 5 Gbps; 450 boards produced and 1600 FPGAs to be used before 2018

|  | Existing system | Upgraded system |
|---|---|---|
| Expected TID on FEE | 2 Gy | 45 Gy (4.5 krad) |
| Expected 1 MeV-equivalent neutron fluence on FEE | 1×10^11 /cm² | 7×10^11 /cm² |
| Front-end FPGAs | Actel antifuse (control only) | Microsemi flash-based Igloo2 (control and data) |
| Back-end FPGAs | Xilinx and Altera | Xilinx |
| Number of FPGA types | 2 (FEE) + ~8 (BEE) | ~5 (FEE) + 5 (BEE) |
| Number of developers | 1 (FEE) + ~4 (BEE) | ~6 (FEE) + 4 (BEE) |

Reference: T. Grassi, ECFA 2014

16 Current research from different CERN groups
- Igloo2/SmartFusion2 as future FPGAs for new developments:
  - Several test campaigns carried out by the EN-STI-ECE group for the full qualification of the Igloo2 device family: fabric logic SEU sensitivity, embedded SRAM sensitivity, reprogrammability, TID lifetime, PLL (ongoing), ARM processor (ongoing)
  - System-level testing by the CMS HCAL group: test of the SERDES block (high-speed communication link)
- Xilinx 7-series FPGAs (Zynq / Kintex7):
  - The ATLAS Liquid Argon (LAr) calorimeter and LHCb RICH groups are testing the suitability of Kintex7 devices
  - The EN-STI-ECE group is evaluating the possibility of using SRAM-based FPGAs (Zynq) in CERN accelerator equipment

17 Testing FPGAs in the accelerator environment
- The typical quality-assurance process for COTS-based electronics also applies to FPGAs (CERN guidelines)
- Radiation testing of FPGAs is usually carried out in proton/ion beams
- Once the design is ready, the full system can be tested: but where?

18 CHARM: a radiation facility for system testing
- CHARM = CERN High Energy AcceleRator Mixed-field facility
- Main purpose: radiation tests of electronic equipment and components in a radiation environment similar to that of the accelerator
- Large irradiation room: large-volume electronic equipment, high numbers of single components, full systems
- Numerous representative radiation fields: mixed particle and energy spectra (tunnel and shielded areas, atmospheric and space environments) and direct beam exposure (24 GeV proton beam)

19 Conclusions
- CERN is a harsh radiation environment with specific challenges
- Criticality and reliability are key aspects of FPGA use at CERN, with large unit counts (~1000 per system)
- Accelerator equipment shares a common architecture based on antifuse and flash-based FPGAs
- Experiment architectures build on high-speed links
- Research into new solutions for new developments is ongoing
- Collaboration and sharing are essential (and welcome)
- The CHARM radiation test facility makes system-level testing possible

20 Thank you

21 BACKUP

22 Spectra vs. position
CHARM can reproduce different scenarios of radiation environments: LHC UJ zone, LHC shielded zone, ISS orbit, atmospheric (h = 200 m).
[Figure: particle spectra at different positions in CHARM compared with the LHC UJ zone, atmospheric (200 m) and ISS-orbit reference environments.]

23 FEE systems of CMS

| Sub-system | Approx. radiation | FPGAs in 2008-2012 | FPGAs after 2012 (radiation ~6x higher) |
|---|---|---|---|
| Tracker [2] | 200 kGy, 10^14 n/cm² | No FPGAs (ASICs only) |  |
| ECAL [3, 4] | 25 kGy |  |  |
| HCAL [5] | 3 Gy, 10^11 n/cm² | Actel antifuse FPGA, control only | Igloo2 (flash) for control, data processing, TDC and transmission, from 2016 |
| Muon detectors | 0.4 Gy, 5×10^10 n/cm² | SRAM FPGAs [6, 7] | Igloo2 (flash) for control, data processing, TDC and transmission, from 2014 [15] |
| CT-PPS | 200 Gy, 2×10^12 n/cm² per 100 fb^-1 integrated luminosity | Did not exist | Igloo2 (flash) for control, data processing, TDC and transmission, from 2018(?) |

24 FEE systems of ATLAS

| Sub-system | Approx. radiation | FPGAs in 2008-2023 | FPGAs after 2023 (radiation ~6x higher) |
|---|---|---|---|
| Tracker | xx kGy, 10^14 n/cm² | No FPGAs (ASICs only) |  |
| Liquid Argon Calorimeter [16] |  |  | On chamber (3.4 kGy): probably no FPGAs; on sTGC (90 Gy): investigating with Xilinx for processing and 10 Gbps readout links |
| Tile calorimeter [17] | 15 Gy, 10^11 n/cm² | 640 Mb/s links; severe errors in data transmission, loss of configurations | Demo project with Xilinx for processing and 10 Gbps readout links |
| Muon detectors | xx Gy, 5×10^10 n/cm² |  |  |

25 FEE systems of LHCb

| Sub-system [9, 10] | Approx. radiation | FPGAs before 2018 | FPGAs after 2018 (radiation ~6x higher) |
|---|---|---|---|
| Inner Tracker | 60 kGy, 10^14 n/cm² | No FPGAs in the hot zone (ASICs only) | Under study (probably not required) |
| RICH | 240 Gy, 10^12 n/cm² | Actel AX (antifuse) + Actel ProAsicPlus (flash) for controls | Xilinx Kintex7 |
| Outer Tracker | 70 Gy |  | SciFi Tracker: under study (Microsemi Igloo2) |
| Calorimeters | 50 Gy | Actel AX (antifuse) for 80 MHz processing; Actel ProAsicPlus (flash) for 40 MHz processing and control [9] |  |
| Muons [11] | 80 Gy | Actel ProAsicPlus, for the calibration system | Under study (flash device) |

26 FEE systems of ALICE

| Sub-system | Approx. radiation | FPGAs in 2008-2012 | FPGAs after 2012 (radiation higher) |
|---|---|---|---|
| TPC [20] | 16 Gy, 10^11 n/cm² [21] | Virtex-II Pro (SRAM) for the datapath, with its configuration verified and refreshed by an Actel ProASIC+ (flash) | Microsemi SmartFusion2 (flash), links up to 5 Gbps |
| DDL (Detector Data Link, common to all subsystems) |  | Actel ProASIC+ (flash), 200 MB/s links |  |
| All others | ~0 at the FPGA location |  |  |

Problems observed: FPGA PLL loss of lock → use TTCrx instead [10]. ProASIC3 (flash) for radmon.

