Software for the CMS Cosmic Challenge
Giacomo Bruno, UCL, Louvain-la-Neuve, Belgium
On behalf of the CMS Collaboration
CHEP06, Mumbai, India, February 16, 2006

Slide 2: Outline
- CMS and the Magnet Test Cosmic Challenge (MTCC): detector configuration, running conditions
- Software architecture
- Reconstruction and calibrations at the MTCC
- Online operation: event selection and monitoring
- Offline operation

Slide 3: The CMS detector
[Figure: the CMS detector and its trigger/DAQ chain]
Trigger and DAQ: two event selection systems. The Level-1 trigger reduces the 40 MHz bunch-crossing rate to 100 kHz; the HLT is a farm of commercial processors performing event selection based on offline software, using full event data.

Slide 4: CMS Magnet Test Cosmic Challenge (MTCC)
Goals:
- Commissioning, training and field mapping of the solenoid magnet
- Sub-detector installation (~5% of the final detector), performance, triggering and alignment, in the presence of a magnetic field
- Software (completely new offline SW design; project started at the end of 2004):
  - Read out all detectors with the global DAQ
  - Decipher and debug the raw data
  - Local reconstruction in all detectors (clusters, muon track segments)
  - Calibration and alignment using the database infrastructure
  - Monitoring and visualization (based on reconstruction)
  - Data transfer to CERN and other CMS GRID sites
  - Global event reconstruction (tracks)
Schedule:
- Insertion and cabling of sub-detectors by April 10
- One month of magnet testing after April 16, with combined data taking using the global DAQ starting immediately
- One week of combined data taking at constant field, mid-to-end May
- Field mapping to follow (June), with HCAL and muon detectors only

Slide 5: Muon detectors at the MTCC
[Figure: instrumented muon stations: DT sectors 10 and 11 (DT-10, DT-11) and CSC stations ME1-ME4]
Drift Tubes (DT):
- 14 stations (5% of the final system)
- Tracks can cross up to 4 stations
- One η-view and two φ-view track segments per station
- ~1 kB/ev
CSC:
- 36 stations (8% of the final system), 36k channels
- Tracks can cross up to 4 stations
- One track segment per station
- 10-20 kB/ev
RPC:
- 5 measuring planes
- Participation is still uncertain
Hardware alignment system:
- Laser based
- Measures the positions of the muon support structures, also relative to the Tracker support structure, with O(100 μm) accuracy
- 500 MB of data per day, taken through the DCS

Slide 6: Silicon Tracker at the MTCC
Silicon Strip:
- 133 modules (1% of the final Tracker), 80k channels
- <1 kB/ev with zero suppression, 200 kB/ev without (a zero-suppression sketch follows below)
- Tracks can cross up to 6 detectors
- Custom support structures, requiring a special geometry description (ready, including survey measurements for initial alignment)
Pixel:
- Not participating
[Figure: IGUANA visualization of the Tracker MTCC geometry]
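Zero suppression is what shrinks the strip payload from ~200 kB/ev to <1 kB/ev: only channels significantly above their pedestal are kept. A minimal illustrative sketch in C++ (not the actual CMS FED unpacking code; the struct layout and the 3-sigma threshold are assumptions):

```cpp
#include <cstddef>
#include <cstdint>
#include <vector>

// Hypothetical per-channel conditions, as loaded from the conditions DB.
struct ChannelConditions {
  float pedestal;  // baseline ADC level
  float noise;     // RMS noise in ADC counts
  bool  bad;       // masked channel
};

// A zero-suppressed digi: channel number and pedestal-subtracted amplitude.
struct ZSDigi {
  uint16_t channel;
  uint16_t adc;
};

// Keep only channels whose pedestal-subtracted amplitude exceeds
// nSigma times the channel noise; bad channels are dropped entirely.
std::vector<ZSDigi> zeroSuppress(const std::vector<uint16_t>& rawAdc,
                                 const std::vector<ChannelConditions>& cond,
                                 float nSigma = 3.0f) {
  std::vector<ZSDigi> out;
  for (std::size_t ch = 0; ch < rawAdc.size(); ++ch) {
    if (cond[ch].bad) continue;
    float amp = rawAdc[ch] - cond[ch].pedestal;
    if (amp > nSigma * cond[ch].noise)
      out.push_back({static_cast<uint16_t>(ch),
                     static_cast<uint16_t>(amp + 0.5f)});
  }
  return out;
}
```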

Slide 7: Calorimeters at the MTCC
ECAL:
- Two supermodules, 3.6k channels (~5% of the final system)
- <5 kB/ev with zero suppression, 80 kB/ev without
HCAL:
- 11 barrel + 4 endcap wedges
- ~1k channels (10% of the final system)
- 5 kB/ev with zero suppression, 30 kB/ev without
- Trigger electronics
[Figure: IGUANA visualization of real cosmics seen at test beams]

Slide 8: Trigger and DAQ at the MTCC
L1 trigger:
- DT or CSC track-segment coincidence in one or more stations; expected rate up to 2-3 kHz
- Angular or pointing (to Tracker) cuts can be required, reducing the rate to about O(100 Hz)
- HCAL MIP trigger (possibly), with the rate adjustable through thresholds
Data sizes:
- <50 kB/ev if the Tracker and ECAL run in zero-suppression mode
DAQ:
- Two-stage event building as in the final system: super-fragment building followed by RU building (single slice)
- Filter Farm: 16 dual-Xeon 2.8 GHz nodes, 2 GB RAM / 80 GB disk each
- 50 MB/s maximum data-to-disk rate, i.e. up to 1 kHz of events to disk (see the consistency check below)
[Figure: CMS DAQ vs. MTCC DAQ architecture]
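The quoted figures are self-consistent: at <50 kB/ev, a 50 MB/s disk budget sustains up to 1 kHz. As a trivial check (values taken from this slide):

```cpp
#include <cstdio>

int main() {
  const double bandwidth = 50e6;  // max data-to-disk rate, bytes/s
  const double eventSize = 50e3;  // zero-suppressed event size, bytes
  // prints: max event rate to disk: 1000 Hz
  std::printf("max event rate to disk: %.0f Hz\n", bandwidth / eventSize);
  return 0;
}
```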

Slide 9: Events at the MTCC
Acceptance:
- 2.2 kHz: cosmics crossing the instrumented muon chambers (R_0)
- 9 Hz: cosmics that also cross at least one Tracker module (R_T0)
- O(100 Hz): average expected event-to-disk rate (R_D)
Online event selection:
- Scenario 1: only a generic L1 trigger. The "Tracker" event-to-disk rate is R_T = (R_D / R_0) x R_T0, i.e. only ~0.4 Hz for R_D = 100 Hz.
- Scenario 2: muon pointing trigger (R_0 ~ 100 Hz). R_T = 9 Hz if R_D >= 100 Hz, but this is inconvenient for the calorimeters and muon detectors.
- Scenario 3: generic L1 trigger plus HLT selection. Maximize the L1 rate (up to the physical 2.2 kHz), do the remaining selection in the HLT using a simple ADC-counts method, and get two streams out of the DAQ: a Tracker stream (ADC-count method, ~10 Hz) and a main stream (e.g. selected on L1 information, compatible with R_D - 10 Hz). This requires Tracker local reconstruction running online (see the rate sketch below).
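Plugging in the slide's numbers makes the trade-off concrete; a short sketch of the rate arithmetic (scenario labels as above):

```cpp
#include <cstdio>

int main() {
  const double R0  = 2200.0;  // Hz, cosmics crossing instrumented muon chambers
  const double RT0 = 9.0;     // Hz, of which also cross a Tracker module
  const double RD  = 100.0;   // Hz, average event-to-disk budget

  // Scenario 1: generic L1 trigger; a random RD/R0 fraction reaches disk,
  // so Tracker events arrive at only RT = (RD / R0) * RT0.
  std::printf("Scenario 1: Tracker events to disk = %.2f Hz\n",
              RD / R0 * RT0);  // ~0.41 Hz

  // Scenario 3: a dedicated HLT Tracker stream keeps essentially all RT0,
  // leaving RD - RT0 of disk bandwidth for the main stream.
  std::printf("Scenario 3: Tracker stream = %.0f Hz, main stream budget = %.0f Hz\n",
              RT0, RD - RT0);
  return 0;
}
```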

Slide 10: Event processing in the new CMS software (online/offline)
[Figure: processing chain from the Event Builder / raw-data source through the Raw2Digi, Clusterizer and Monitor/Filter modules to the output module and the POOL event DB]
- Event data: raw data, digis, clusters, ...
- Non-event data: cabling, calibrations, ..., read from the offline conditions DB
Details in the talks by C. Jones: "The new CMS Event Data model" and "Access to Non-Event data for CMS". A sketch of the module pattern follows below.
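The key design point is that each module transforms event data while consulting non-event data (conditions) delivered through a separate path. A minimal illustrative skeleton of that pattern in plain C++ (not the actual CMSSW framework API; all names are assumptions):

```cpp
#include <map>
#include <utility>
#include <vector>

// Non-event data: conditions valid for an interval of runs.
struct Conditions {
  std::map<int, int>   cabling;  // FED channel -> detector channel
  std::map<int, float> noise;    // per-channel noise
};

// Event data products at successive processing stages.
struct RawData  { std::vector<unsigned char> fedBuffers; };
struct Digis    { std::vector<std::pair<int, int>> adc; };  // (channel, counts)
struct Clusters { std::vector<std::vector<int>> groups; };

// Raw2Digi: unpack FED buffers into per-channel ADC counts,
// using the cabling map from the conditions DB.
Digis rawToDigi(const RawData& raw, const Conditions& cond) {
  Digis d;
  // ... unpack raw.fedBuffers via cond.cabling ...
  return d;
}

// Clusterizer: group neighbouring channels above their noise thresholds.
Clusters clusterize(const Digis& digis, const Conditions& cond) {
  Clusters c;
  // ... group digis.adc entries using cond.noise ...
  return c;
}

int main() {
  Conditions cond;  // filled from the conditions DB at run start
  RawData raw;      // delivered by the raw-data source / event builder
  Digis digis = rawToDigi(raw, cond);
  Clusters clusters = clusterize(digis, cond);
  // ... monitoring / filtering / output module would follow ...
  (void)clusters;
  return 0;
}
```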

Slide 11: Non-event data: database model
[Figure: data flow between the online DB, the HLT conditions DB and the offline DB]
- OMDS (Online DB): sub-detector configuration and conditions
- ORCON (HLT Conditions DB): non-event data used by the offline software in the HLT
- ORCOFF (Offline Conditions DB): non-event data used by the offline software at Tier 0/1/2
Data transfers:
- Configuration and conditions flow from OMDS to ORCON and ORCOFF
- Offline-computed calibrations flow to ORCOFF and ORCON

Slide 12: Reconstruction example: Tracker local reconstruction (to be run online for monitoring and filtering)
Event data:
- RAW data: FED buffers
- Digis: channel ADC counts
- Clusters: groups of calibrated channel amplitudes (a clustering sketch follows below)
Non-event data:
- Cabling
- Per-channel bad flag, gain and noise
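To illustrate how the Clusterizer uses the per-channel conditions, here is a simple seed-plus-neighbour strip clustering sketch (the thresholds and data structures are illustrative assumptions, not the CMSSW algorithm):

```cpp
#include <vector>

// A digi after gain calibration: strip number and amplitude.
struct Digi { int strip; float amp; };

// A cluster: a contiguous group of calibrated strip amplitudes.
struct Cluster { int firstStrip; std::vector<float> amplitudes; };

// Group adjacent strips whose amplitude exceeds neighbourThr * noise,
// keeping only clusters that contain a seed above seedThr * noise.
// Bad strips are assumed to have been dropped at the digi stage.
std::vector<Cluster> clusterize(const std::vector<Digi>& digis,
                                const std::vector<float>& noise,
                                float seedThr = 4.0f,
                                float neighbourThr = 2.0f) {
  std::vector<Cluster> clusters;
  Cluster cur{-1, {}};
  bool hasSeed = false;
  auto flush = [&]() {
    if (hasSeed && !cur.amplitudes.empty()) clusters.push_back(cur);
    cur = Cluster{-1, {}};
    hasSeed = false;
  };
  for (const Digi& d : digis) {  // digis assumed ordered by strip number
    bool above = d.amp >= neighbourThr * noise[d.strip];
    bool adjacent = !cur.amplitudes.empty() &&
                    d.strip == cur.firstStrip + (int)cur.amplitudes.size();
    if (!above || (!cur.amplitudes.empty() && !adjacent)) flush();
    if (!above) continue;
    if (cur.amplitudes.empty()) cur.firstStrip = d.strip;
    cur.amplitudes.push_back(d.amp);
    if (d.amp >= seedThr * noise[d.strip]) hasSeed = true;
  }
  flush();
  return clusters;
}
```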

Slide 13: Tracker local reco: non-event data flow
- Carry out the commissioning procedures (see R. Bainbridge's talk, "Commissioning Procedures and Software for the CMS Silicon Strip Tracker"):
  - Electronics configured (ready for data taking)
  - The online DB then contains all detected hardware components and connections, plus per-channel pedestals, noise and bad strips
- Transfer the data from the online to the offline DB with a program using the OCCI libraries for Oracle access (a sketch follows below)
- Conditions objects available in the offline DB:
  - Readout connections (needed by Raw2Digi)
  - Control connections (useful for monitoring)
  - Noise/bad channels (needed by the Clusterizer)
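A minimal sketch of what such an OCCI transfer program could look like; the connect string, credentials and the strip_conditions table are hypothetical placeholders, since the slides do not describe the actual schema:

```cpp
#include <occi.h>  // Oracle C++ Call Interface
#include <cstdio>

using namespace oracle::occi;

int main() {
  Environment* env = Environment::createEnvironment(Environment::DEFAULT);
  // User, password and connect string are illustrative placeholders.
  Connection* conn =
      env->createConnection("reader", "secret", "//omds-host:1521/OMDS");
  Statement* stmt = conn->createStatement(
      "SELECT channel_id, pedestal, noise, bad_flag FROM strip_conditions");
  ResultSet* rs = stmt->executeQuery();
  while (rs->next()) {
    int    ch    = rs->getInt(1);
    double ped   = rs->getDouble(2);
    double noise = rs->getDouble(3);
    bool   bad   = rs->getInt(4) != 0;
    // ... here the values would be packed into conditions objects and
    //     written to the offline DB (ORCOFF) ...
    std::printf("channel %d: ped=%.1f noise=%.2f bad=%d\n", ch, ped, noise, bad);
  }
  stmt->closeResultSet(rs);
  conn->terminateStatement(stmt);
  env->terminateConnection(conn);
  Environment::terminateEnvironment(env);
  return 0;
}
```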

Slide 14: Filter Farm architecture
[Figure: Filter Farm architecture]

Slide 15: Online filtering and streaming
Filter Unit application:
- Fully integrated with the event-building and run-control suite
- Fully integrated with the offline software (including DQM) and the DB infrastructure
- Can execute an arbitrary set of reconstruction, analysis (monitoring) and filtering (event selection) modules
- Picks up new calibrations at the beginning of every run
- Data storage: local output using POOL, or events forwarded to the Storage Manager
Storage Manager:
- N-to-1 streaming functionality (ROOT based) already available
- MTCC goals: multi-streaming and event-server functionality (in development), plus a file catalog suitable for export

Slide 16: Online monitoring
DQM software architecture (see C. Leonidopoulos' talk, "Physics and Data Quality Monitoring at CMS"):
- Monitorable producers: executed in the event loop, with access to event and non-event data
- Monitorable consumers: receive and analyze updated monitorables (comparison with references, alarms, ...); have access only to non-event data
Modes of operation (a producer/consumer sketch follows below):
- Online, MonitorConsumer: FU nodes book and fill the monitorables. Can in principle work at the Filter Farm input rate with the shortest delay, but adds load on the FU nodes.
- Online, EventConsumer: decoupled from the Filter Farm; the Storage Manager forwards selected events to dedicated monitoring processors.
- Quasi-online, EventConsumer: monitoring applications access event data from "hot buffers".
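An illustrative producer/consumer split in plain C++ (not the actual CMS DQM classes; the histogram type, the transport and the comparison rule are assumptions): the producer fills a monitorable inside the event loop, while the consumer compares it against a reference without ever touching event data.

```cpp
#include <cmath>
#include <cstddef>
#include <cstdio>
#include <vector>

// A toy monitorable: a fixed-range 1D histogram.
struct Histo1D {
  double lo, hi;
  std::vector<long> bins;
  Histo1D(int n, double l, double h) : lo(l), hi(h), bins(n, 0) {}
  void fill(double x) {
    if (x < lo || x >= hi) return;
    bins[static_cast<std::size_t>((x - lo) / (hi - lo) * bins.size())]++;
  }
};

// Producer side: runs in the event loop on a Filter Unit node, with
// access to event data (here reduced to one cluster-charge value).
void produce(Histo1D& monitorable, double clusterCharge) {
  monitorable.fill(clusterCharge);
}

// Consumer side: receives updated monitorables and compares them with a
// reference, raising an alarm on deviation; it never sees the events.
bool consume(const Histo1D& updated, const Histo1D& reference,
             double maxRelDiff) {
  long nu = 0, nr = 0;
  for (long b : updated.bins) nu += b;
  for (long b : reference.bins) nr += b;
  for (std::size_t i = 0; i < updated.bins.size(); ++i) {
    double fu = nu ? double(updated.bins[i]) / nu : 0.0;
    double fr = nr ? double(reference.bins[i]) / nr : 0.0;
    if (fr > 0.0 && std::abs(fu - fr) / fr > maxRelDiff) {
      std::printf("alarm: bin %zu deviates from reference\n", i);
      return false;
    }
  }
  return true;
}
```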

Slide 17: Offline operation
Data distribution and access:
- A disk server (a few TB) is available at the MTCC site for immediate analysis
- Data are stored at CERN (CASTOR) and shipped to a number of remote sites using LCG tools
Offline analysis:
- Local reconstruction: understanding and commissioning of all sub-detectors
- Alignment corrections: computed from the hardware alignment system data. MTCC goal: have the software that computes the corrections and makes them available to reconstruction.
- Track reconstruction: validation and cross-check of the muon hardware alignment data. Track reconstruction may not be ready for the MTCC, so data quality must be ensured to the best possible extent with local-reconstruction monitoring.

Slide 18: Conclusions
The MTCC is a fundamental milestone for the new CMS software:
- Global data acquisition with (almost) all sub-detectors
- Data quality monitoring, event selection and streaming based on local reconstruction
- Non-event data DB infrastructure
- Offline global reconstruction using alignment data
CMS is on schedule to achieve these goals.