
Trigger/DAQ/DCS

LVL1 Trigger
[Block diagram of the LVL1 trigger system:]
 Calorimeter trigger (Germany, Sweden, UK)
 – Pre-Processor (analogue → ET)
 – Cluster Processor (e/γ, τ/h)
 – Jet/Energy-Sum Processor
 – ~7200 calorimeter trigger towers
 Muon trigger
 – Muon Barrel Trigger (Italy)
 – Muon End-cap Trigger (Japan, Israel)
 – Muon-CTP Interface (MUCTPI)
 – O(1M) RPC/TGC channels
 Central Trigger Processor (CTP) (CERN)
 Timing, Trigger, Control (TTC)

Calorimeter trigger
Cluster Processor Module (CPM) for the e/γ/τ/h trigger
 New version fixes timing problems in fanned-out data
 Fabrication problems solved using firms with better QA
Jet/Energy Module (JEM)
 Full-specification version recently made; tests so far look good
Common Merger Module (CMM)
 Tested extensively; very close to final version
[Photos: CPM and JEM modules]

Calorimeter trigger
PreProcessor Module (PPM): later than planned, but...
 Final ASIC prototype is OK
 MCM now OK: substrate problem fixed by a change of material
 PPM stand-alone tests now nearly completed
System tests
 Many subsystem tests done without the PPM, e.g. 5 DSSs → 3 CPMs → CMM (crate) → CMM (system)
 Full system tests with the PPM starting very soon
 Will participate in the test beam in August/September, including the 25 ns run
 – Aim to integrate with calorimeters and receivers, the Central Trigger Processor, the RoI Builder, and the ATLAS DAQ / run-control environment
 – Produce simple triggers based on calorimeter signals
[Photo: 3 CPMs, 1 JEM, 2 CMMs, TCM and CPU in a crate]

Tile Calorimeter – PPM test
[Plot: test pulse recorded in H8]

Barrel muon trigger
Preproduction of "Splitter" boxes completed
 Main production under way
Prototype of the final-design "Pad" boards evaluated in the lab and (last week) at the 25 ns test beam
 Seem to work well, but the test-beam data still have to be analysed
Design completed for the revised version of the CM ASIC
 Interaction in progress with IMEC on placement/routing, plus simulation to check the design
 Very urgent!
More Pad boxes being prepared for chamber integration tests
 Number limited by the availability of prototype ASICs (old version)
[Plot: correlation in measurements between two BML doublets]

Endcap muon trigger
System operated successfully in last week's 25 ns test beam with the MUCTPI and the CTP demonstrator
 Even better efficiency than last year
 Many improvements to the software
Will test the new PS boards with the revised version of the SLB ASIC in the August/September test beam
The revised version of the SLB ASIC is being evaluated in lab tests
 Trigger part passes all tests
 Problem detected in the readout part for certain sequences of L1A signals
 – Probably very localized and hopefully requiring only a very minor revision to the design, but still under investigation
All other endcap ASICs are already final
[Plot: trigger efficiency for the PT=6, PT=5 and PT=4 thresholds]

Central trigger
CTP
 Final prototypes either available or coming soon (layout, production)
 Tight schedule to test, commission and integrate for the test beam later in summer
LTP
 Prototypes just received
MUCTPI
 Work queued behind CTP
 Only one kind of module needs to be upgraded to achieve full functionality
 Existing "demonstrator" adequate in the short term
[Photo: LTP prototype]

LVL1 Schedule
Re-baselined in line with current status and plans
 Schedule for the calorimeter and central-trigger electronics matches the availability of the detector systems
Production of the on-detector muon trigger electronics is later than we would like for integration with the detectors
 Very tight schedule to have the barrel electronics available in time to equip chambers before installation
 – Late submission of the revised version of the CM ASIC
 – Try to advance the ASIC schedule if at all possible
 – Need to prepare for very fast completion and testing of the electronics once production ASICs become available
 – Need to prepare for efficient integration of the electronics with the chamber assemblies prior to installation
 End-cap electronics schedule also tight for integration with the detectors in early 2005
 – Detector installation is later than for the barrel, so not as critical

Installation Schedule
According to the present schedule, the final availability of all LVL1 subsystems is still driven by the detector installation schedule
 The latest ATLAS working installation schedule (v. 6.19) shows the last TGC chambers (with on-detector trigger electronics) installed in January 2007
 – This leaves little time for commissioning of the on-detector electronics before we lose access prior to first beams
 Action defined for discussion with the TC (and the Muon PL) to see if there is scope to optimize the installation planning

HLT/DAQ
Major activity in the present phase is the test beam
 Support for detectors and for LVL1 trigger tests
 – Organization in "support teams" who are the first point of contact and call on experts when necessary
 – Dedicated training sessions were organized for the team members
 – Team members participate with the experts in problem solving: a good way to spread expertise
 – Electronic log book very useful; its use could be extended to the detector systems
 HLT/DAQ studies
 – Preparation and planning for a dedicated period in August
 – Aim to operate the HLT/DAQ system to gain experience in a "real-life" environment
 » Will need support from the detector systems
 Generally, experience at the test beam is very positive for T/DAQ, although the work takes a lot of effort
In parallel, continue development and system-evaluation work within the constraints of the available effort, e.g. dataflow measurements and modelling

Detector integration with DAQ at H8
Muon detectors
 TGCs and MDTs fully integrated
 – Extended running in combined mode during last week's 25 ns run, together with the MUCTPI (sometimes triggered by the CTPD)
 RPCs almost fully integrated
 – Data were successfully taken in stand-alone mode
Calorimeters
 Tiles fully integrated in data-taking mode
 LAr integration well advanced
Inner detectors
 Started for the TRT and pixels; plan to integrate the SCT later
The exercise of joining detectors together has proven to be "easy" if the detector segment has been properly done according to the TDAQ prescriptions

HLT integration for the test beam
The infrastructure for the EF is prepared
 An EF cluster has been divided and pre-assigned to the different detectors
 – The configuration nevertheless allows more CPUs to be assigned dynamically to a partition that requests them
The main work now is to get ATHENA integrated
 The scheme of a "rolling" unique version of the offline software (8.2.x), specially maintained for the test beam, is working well
 We are now trying to put in place the automatic procedure that sets ~80 environment variables!
The Gatherer is integrated
 Allows aggregation of histograms across multiple processors
LVL2 commissioning is progressing well
 A LVL1 result has been successfully read out by the L2PU
 Progress is being made in integrating algorithms
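The Gatherer's job, summing monitoring histograms produced independently on many processors into a single one, is conceptually simple. The sketch below is a minimal illustration using an invented plain-Python histogram representation, not the Online Software Gatherer API:

```python
# Minimal sketch of histogram aggregation a la the Gatherer: sum the bin
# contents of identically binned histograms produced on several nodes.
# Invented representation -- not the ATLAS Online Software Gatherer API.

def merge_histograms(histograms):
    """Merge histograms that share the same binning by summing bin contents."""
    if not histograms:
        raise ValueError("nothing to merge")
    edges = histograms[0]["edges"]
    if any(h["edges"] != edges for h in histograms[1:]):
        raise ValueError("histograms have different binning")
    merged = [sum(bins) for bins in zip(*(h["bins"] for h in histograms))]
    return {"edges": edges, "bins": merged}

# Partial histograms from three (hypothetical) Event Filter nodes:
node_histos = [
    {"edges": [0, 10, 20, 30], "bins": [4, 7, 1]},
    {"edges": [0, 10, 20, 30], "bins": [2, 5, 3]},
    {"edges": [0, 10, 20, 30], "bins": [6, 0, 2]},
]
print(merge_histograms(node_histos))
# -> {'edges': [0, 10, 20, 30], 'bins': [12, 12, 6]}
```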

Example of HLT algorithm work
2e15i at 2×10³³ cm⁻²s⁻¹: new result since the TDR; rates consistent with TDR assumptions.

  Step         Eff. for e pair (wrt LVL1)   Rate
  LVL1         100 %                        3.5 kHz
  EF Calo      84.5 %                       6.2 Hz
  EF ID        71.6 %                       1.5 Hz
  EF ID-Calo   55.5 %                       1.5 Hz

H → 4e, mH = 130 GeV, L = 2×10³³ cm⁻²s⁻¹: 4 reconstructed electrons in |η| < 2.5, at least 2e with pT > 20 GeV. Efficiency includes both single- and double-object triggers → good trigger acceptance of Higgs events. A 2e2μ study is also being done.

  Trigger selection step   Efficiency wrt LVL1   Overall efficiency
  LVL1                     100 %                 99.6 %
  L2Calo                   99.7 %                99.4 %
  EFCalo                   98.9 %                98.5 %
  EFID                     98.1 %                97.7 %
  EFIDCalo                 97.1 %                96.7 %
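As a cross-check of the H → 4e table: the "overall efficiency" column is, to within the rounding of the quoted numbers, the product of each step's efficiency wrt LVL1 and the 99.6 % overall LVL1 efficiency:

```python
# The "overall efficiency" column of the H -> 4e table is the product of
# each step's efficiency wrt LVL1 and the 99.6% overall LVL1 efficiency
# (agreement is exact except for L2Calo, where the inputs were rounded).
lvl1_overall = 0.996
eff_wrt_lvl1 = {"L2Calo": 0.997, "EFCalo": 0.989,
                "EFID": 0.981, "EFIDCalo": 0.971}
for step, eff in eff_wrt_lvl1.items():
    print(f"{step:9s} overall = {eff * lvl1_overall:.1%}")
# L2Calo 99.3% (table: 99.4%), EFCalo 98.5%, EFID 97.7%, EFIDCalo 96.7%
```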

Continuous evolution of the Online Software
 Control
 Databases
 Monitoring

Example of ongoing work: large-scale tests
Verified the operational scalability and performance of the Online System on a very large scale, close to the size of final ATLAS
 Partitions of up to 1000 run-controller processes running on 340 PCs
 Individual tests of the CORBA communication components and the configuration-database components were successful
 4th iteration of the Online Software large-scale tests
 – 340 PCs (800 MHz to 2.4 GHz) of the CERN LXSHARE cluster, running Linux RH 7.3
 – Partitions and configuration trees exercised under varying conditions
 – Run-control operations: boot DAQ, start run, stop run, shutdown DAQ
 Information Service performance: each provider publishes one information object, then updates it as fast as possible; measured as the number of requests per second versus the number of simultaneous providers
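The Information Service test pattern above can be mimicked with a toy benchmark: N provider threads each publish one object, then update it as fast as possible while the aggregate request rate is counted. This is a generic single-process sketch with invented names; the real IS is a CORBA-based service, and this toy says nothing about its absolute performance:

```python
# Toy version of the Information Service scaling test: N provider threads
# each publish one information object, then update it as fast as possible;
# we count the aggregate number of update requests served per second.
# Generic single-process sketch only -- the real IS is a CORBA service.
import threading
import time

class ToyInfoService:
    def __init__(self):
        self._lock = threading.Lock()
        self._store = {}
        self.requests = 0

    def update(self, name, value):
        with self._lock:          # one server serializing all requests
            self._store[name] = value
            self.requests += 1

def run_test(n_providers, duration_s=1.0):
    service = ToyInfoService()
    stop = time.monotonic() + duration_s

    def provider(pid):
        value = 0
        service.update(f"provider-{pid}", value)      # initial publication
        while time.monotonic() < stop:
            value += 1
            service.update(f"provider-{pid}", value)  # update as fast as possible

    threads = [threading.Thread(target=provider, args=(i,))
               for i in range(n_providers)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return service.requests / duration_s

for n in (1, 4, 16, 64):
    print(f"{n:3d} providers: {run_test(n):10,.0f} requests/s")
```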

HLT/DAQ procurement plans
S-link source card
 FDR/PRR successfully concluded in February
 Preproduction run before the end of the year; mass production in 2005
ROB-in
 FDR successfully concluded in May
 Production of 10 prototype boards of the final design due in July
Switches
 Plan for switch evaluation exists; measurements on some "pizza box" switches in progress
 Technical specification document under review
 Market survey later this year
ROS PCs
 Technical specification document under review
 Market survey later this year
Other PCs
 Will be addressed later

HLT/DAQ pre-series system in preparation
Approximately a 10% slice of the full HLT/DAQ system
 Validate functionality of the final system
 – 1 full ROS rack (11 PCs equipped with ROBins)
 – Gbit Ethernet switch
 – 1 LVL2 processor rack
 – 1 EF processor rack (partially equipped)
 – 1 RoIB (50% equipped)
 – 1 EFIO rack (DFM, SFI, SFO, ...)
 – 1 Online rack
 – DCS equipment
 Practical experience
 – Racks, power distribution, cooling, etc.
 Considering installation in USA15/SDX (as for the final system)
 – Check of the infrastructure ~6 months before the main installation starts
 – Subject to feasibility checks (schedule, working environment, safety issues and regulations)
 – Will be discussed in the July TMB

DCS
Front-end system
 ELMB: mass production ongoing (LHCC 31/8/04)
 CAN branch supervisor: prototype being tested
 Rack control system: defined, HW prototype ordered
Back-end system
 Distributed PVSS system running (SR1)
 Hierarchical system with 3 levels set up
 Logging to the (present) conditions database (H8)
 Prototype finite state machine running
 Connection to DAQ fully operational
 Data retrieval from the accelerator being worked on by JCOP (LHCC 31/7/04)
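A minimal sketch of the three-level hierarchy mentioned above, in which each node derives its state from its children so that a front-end fault propagates to the top. The state set and propagation rule here are assumptions for illustration; the actual Back-End is built on PVSS and the JCOP FSM toolkit:

```python
# Minimal sketch of a three-level hierarchical state machine for detector
# control: each node reports the "worst" state among its children, so a
# front-end fault propagates upward to the subdetector and detector level.
# Illustration only -- the real Back-End uses PVSS and the JCOP FSM toolkit.

SEVERITY = {"OK": 0, "WARNING": 1, "ERROR": 2}

class Node:
    def __init__(self, name, children=None):
        self.name = name
        self.children = children or []
        self.own_state = "OK"   # only meaningful for leaf (front-end) nodes

    def state(self):
        if not self.children:
            return self.own_state
        # A parent's state is the worst state of any of its children.
        return max((c.state() for c in self.children), key=SEVERITY.get)

# Level 3: front-end devices; level 2: subdetectors; level 1: detector.
hv_channel = Node("HV-channel-3")
pixel = Node("Pixel", [hv_channel, Node("cooling-loop-1")])
atlas = Node("ATLAS", [pixel, Node("Tile")])

print(atlas.state())          # OK
hv_channel.own_state = "ERROR"
print(atlas.state())          # ERROR -- the fault reaches the top level
```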