
1 Muon EF Data Quality
M. Primavera - I.N.F.N. Lecce, on behalf of the Muon EF working group
Atlas Data Quality Workshop, CERN, April 23-25, 2007

2 Preliminary considerations and first ideas on how to perform Muon EF DQM/DQA were presented at the TDAQ Data Quality Workshop in Zeuthen:
- Online vs. offline: using physics signals, comparing trigger and reconstruction
- For the moment, efforts are focused on online DQM/DQA

3 Since the Zeuthen workshop until now:
- Monitoring experience collected during the last technical run (March 19-23)
- First approaches with DQMF
- Offline studies with the data/SW validation tools of the muon slice, to produce a preliminary list of histograms/parameters to be monitored for DQ and of the checks to be performed on them

4 Muon EF
(Diagram of the muon trigger chain: LVL1, LVL2 (muFast, muComb), LVL2 ID, TrigMoore seeding algs, Moore algs, MuIdStandAlone algs, MuIdCombined algs using the offline ID, Hypo alg)
FEX algorithms:
- Moore algs (reconstruction in the MS)
- MuIdStandalone algs (muon extrapolation to the interaction region)
- MuIdCombined algs (track combination with the ID)
(Trigger/TrigAlgorithms/TrigMoore + "Moore" + "MuId")
Hypo algorithms:
- MooreHypo, MuIdStandaloneHypo, MuIdCombinedHypo
(Trigger/TrigHypothesis/TrigMuonHypo/TrigMooreHypo)
Cut variable: pt

5 Existing monitoring: TrigMooreHisto
Monitored: muons in the MS reconstructed by the FEX Moore (Trigger/TrigAlgorithms/TrigMoore)
Variables monitored:
- number of muon candidates
- pt, 1/pt, a0, phi, cot(theta), (x, y, z)
- eta, phi of the LVL2 muon ROI
Histogramming based on THistSvc.
Other histograms now added in a private version: e.g. number of hits per track per technology, etc.
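
As a purely illustrative sketch (not the actual TrigMooreHisto code), the following PyROOT snippet books histograms for the variables listed above; the histogram names and binnings are invented for the example, and in Athena the booking would go through THistSvc rather than standalone ROOT.

```python
import ROOT

# Hypothetical booking of the TrigMoore monitoring histograms listed above.
# In Athena these would be registered through THistSvc (e.g. under EXPERT/);
# names and binning here are placeholders for illustration only.
histos = {
    "n_muons":  ROOT.TH1F("n_muons",  "Muon candidates per event;N_{#mu};events", 10, -0.5, 9.5),
    "pt":       ROOT.TH1F("pt",       "Track p_{T};p_{T} [GeV];tracks", 100, 0.0, 100.0),
    "inv_pt":   ROOT.TH1F("inv_pt",   "Track 1/p_{T};1/p_{T} [GeV^{-1}];tracks", 100, 0.0, 0.5),
    "a0":       ROOT.TH1F("a0",       "Transverse impact parameter;a_{0} [mm];tracks", 100, -1.0, 1.0),
    "phi":      ROOT.TH1F("phi",      "Track #phi;#phi [rad];tracks", 64, -3.2, 3.2),
    "cottheta": ROOT.TH1F("cottheta", "Track cot#theta;cot#theta;tracks", 100, -10.0, 10.0),
    "roi_eta":  ROOT.TH1F("roi_eta",  "LVL2 muon ROI #eta;#eta;ROIs", 50, -2.5, 2.5),
    "roi_phi":  ROOT.TH1F("roi_phi",  "LVL2 muon ROI #phi;#phi [rad];ROIs", 64, -3.2, 3.2),
}
```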

6 However, we are now starting to migrate to the common framework for HLT monitoring (https://twiki.cern.ch/twiki/bin/view/Atlas/TriggerValidationHistograms) in the next days, with a "pilot" version of the muon EF monitoring.

7 Muon EF monitoring in the March technical run (D. Scannicchio)
SW release: 12.0.5-HLT with some patches
Sample: ~6000 events of mixed physics processes, pre-selected at LVL1 with low thresholds (di-jets, Ws, Zs, etc.)
Trigger menus: MU06 at LVL1, mu6 signature in the HLT; the EF finds the track in the MS
e.g. muon_slice_run2346-withSFO.root/Gathered-PTs/EXPERT/TrigMoore

8 A new ATLOG entry has been submitted:
Author: Diana and Alina
System: TDAQ
Type: Information
Status: closed
System affected: TDAQ | Monitoring
TDAQ_Logbook: Technical Runs
TDAQ_Component: Integration
TDAQ_Case: Combined
Logbook URL: http://pcatdsrv01.cern.ch:8100/ATLAS/754
"More histogram checks have been added to the DataQuality Monitoring utility. These are 3 histograms produced by the TrigMoore algorithm running in the EF. For now they are checked with a very simple algorithm that verifies whether the histograms are filled or not, to be replaced with more sophisticated analysis when Diana finds out more about this."
First use of the online DQMF for the Muon EF: basic check, are the histograms filled?
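
A minimal sketch of the kind of "is it filled?" test described in the log entry, written here as standalone PyROOT rather than the actual DQMF algorithm interface; the function name and threshold are assumptions for illustration.

```python
import ROOT

def is_filled(hist, min_entries=1):
    """Very basic data-quality check: flag a histogram 'green' if it has at
    least min_entries entries, 'red' otherwise.  This mirrors the simple
    'filled or not' test described above; it is not the actual DQMF API."""
    return "green" if hist.GetEntries() >= min_entries else "red"

# Example with a dummy histogram standing in for a TrigMoore EXPERT histogram.
h = ROOT.TH1F("pt", "EF muon p_{T}", 100, 0.0, 100.0)
print(is_filled(h))   # nothing filled yet -> "red"
h.Fill(25.0)
print(is_filled(h))   # -> "green"
```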

9 Muon EF DQM/DQA
FEX algorithms: variables to test the quality of the seeded reconstruction; acceptance and rejection wrt LVL2 or the previous step; comparison of some variables (if available in the sequence step) with LVL2 or the previous step.
Hypo algorithms: the cut variables on which the rejection/acceptance decision is taken; acceptance and rejection.
The significant histograms and plots shown here are produced by the Muon EF offline validation code (assuming a good calibration, real detector/MC ...). However, to be ready to flag real data with meaningful checks:
- Data samples as realistic as possible should be used (e.g. the pt distribution of muons from the technical-run sample looks very strange), including pile-up and cavern background effects.
- For the moment, the exercises were done with monochromatic single muons, muons from pi/K decays (dominating the EF rate at low threshold) and top events; they are not complete, since other identified histograms/checks (shown in grey) have not been studied yet.

10 Samples used:
- single muons: 6 GeV and 17 GeV
- muons from pi/K (single-particle decays, properly weighted), official production:
  http://gridui02.usatlas.bnl.gov:25880/server/pandamon/query/?mode=taskquery&qDSInput=%25slice%25&qsubmit=QuerySubmit
  https://twiki.cern.ch/twiki/bin/view/Atlas/MuonsFromPiK
- top: mc11.004100.T1_McAtNLO_top.AANT.v11000201
All processed with release 12.0.6.
No final decision yet on the granularity (barrel, endcaps, ...) of the proposed histograms/checks, but all studies were performed looking at the different regions.

11 Seeded (by the LVL2 ROI) reconstruction in the MS (Moore in the EF)
Track quality (fit chi2/n.d.f.):
- top events: <pt> = 35 GeV, mean = 0.92
- muons from pi/K: <pt> = 8 GeV, mean = 0.91
Checks: monitor the mean and r.m.s.
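
A possible implementation of the mean/r.m.s. stability check, sketched with standalone PyROOT; the tolerance (3 sigma on the statistical errors) and the toy filling are assumptions, not the thresholds used in the real checks.

```python
import ROOT

def check_mean_rms(hist, ref_mean, ref_rms, n_sigma=3.0):
    """Flag a shift of the histogram mean or RMS with respect to reference
    values (e.g. <chi2/ndf> from a good reference run).  The tolerance is
    n_sigma times the statistical error on the mean/RMS (illustrative)."""
    ok_mean = abs(hist.GetMean() - ref_mean) < n_sigma * hist.GetMeanError()
    ok_rms = abs(hist.GetRMS() - ref_rms) < n_sigma * hist.GetRMSError()
    return ok_mean and ok_rms

# Toy example: fill a chi2/ndf-like histogram around the quoted mean of 0.92.
h = ROOT.TH1F("chi2ndf", "Track #chi^{2}/ndf;#chi^{2}/ndf;tracks", 60, 0.0, 6.0)
rnd = ROOT.TRandom3(1)
for _ in range(2000):
    h.Fill(rnd.Gaus(0.92, 0.4))
print(check_mean_rms(h, ref_mean=0.92, ref_rms=0.4))
```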

12 Seeded (by the LVL2 ROI) reconstruction in the MS (Moore in the EF), top events
Track chi2 probability. Checks: compare the distribution with reference histograms.
Final track normalized residuals (divided by the errors) per technology: good sensitivity to calibration and alignment. Checks: if Gaussian, fitted mean/sigma compatible with 0/1.
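
The two checks above could look roughly like the following PyROOT sketch: a Kolmogorov shape comparison against a reference histogram, and a Gaussian fit of the pull (normalized residual) distribution with the fitted mean/sigma compared to 0/1. Thresholds and function names are illustrative assumptions.

```python
import ROOT

def compare_to_reference(hist, ref, ks_threshold=0.05):
    """Shape comparison with a reference histogram via a Kolmogorov test
    (the threshold on the KS probability is illustrative)."""
    return hist.KolmogorovTest(ref) > ks_threshold

def check_pull_distribution(hist, tol_mean=0.1, tol_sigma=0.1):
    """If the normalized residuals are Gaussian, fit them and check that the
    fitted mean/sigma are compatible with 0/1 within illustrative tolerances."""
    hist.Fit("gaus", "Q")               # quiet Gaussian fit
    fit = hist.GetFunction("gaus")
    mean, sigma = fit.GetParameter(1), fit.GetParameter(2)
    return abs(mean) < tol_mean and abs(sigma - 1.0) < tol_sigma
```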

13 Seeded (by the LVL2 ROI) reconstruction in the MS (Moore in the EF): track pt, eta, phi (top events).

14 Checks: compare the distributions with a reference (in realistic conditions); all checks should be calibrated with real data.
(Plots: 6 GeV muons, all eta; 17 GeV muons, barrel; 17 GeV muons, endcaps; 17 GeV muons, all eta)

15 Seeded (by the LVL2 ROI) reconstruction in the MS (Moore in the EF)
N_hits(tec) = number of track hits per technology (MS) vs eta.
Checks: stability of the mean and r.m.s. of the number of hits per technology in defined eta regions.
- Top events (|eta| < 2.4): MDT hits mean = 20.34, r.m.s. = 6.5; phi hits mean = 5.7, r.m.s. = 2.1
- Muons from pi/K: MDT hits mean = 19.25, r.m.s. = 5.6

16 Seeded (by the LVL2 ROI) reconstruction in the MS (Moore in the EF)
R_hits = N_hits(tec) / (total number of hits per technology in the ROI): good sensitivity to background conditions.
R = N(mu+)/N(mu-) (plot: 17 GeV muons, all eta, mu+ and mu-):
            MOORE         MuiCB
  6 GeV     0.99 ± 0.01
  17 GeV    1.01 ± 0.02   1.00 ± 0.02
  Top ev.   1.16 ± 0.14   (all eta)
Checks: R stability within errors.
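
A sketch of how the charge-ratio stability check might be computed from simple counters, with naive Poisson error propagation; the counts in the example are toy values, not the numbers in the table above.

```python
import math

def charge_ratio(n_plus, n_minus):
    """R = N(mu+)/N(mu-) with simple Poisson error propagation; the counts
    would come from the mu+ and mu- entries of the monitored histograms."""
    r = float(n_plus) / float(n_minus)
    err = r * math.sqrt(1.0 / n_plus + 1.0 / n_minus)
    return r, err

def ratio_is_stable(r, err, r_ref, err_ref, n_sigma=3.0):
    """Check that R is compatible with a reference value within errors."""
    return abs(r - r_ref) < n_sigma * math.hypot(err, err_ref)

# Toy counters in the ballpark of the 17 GeV single-muon sample.
r, err = charge_ratio(5050, 5000)
print(ratio_is_stable(r, err, r_ref=1.00, err_ref=0.02))
```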

17 Track extrapolation to the interaction region (MuId Standalone in the EF)
- Monitor the match parameters of the MuId Standalone track with the closest ID track
- Monitor the parameterization of the energy loss in the calorimeters: (pt_MuIdSA - pt_Moore) wrt pt
- (pt, eta, phi) after extrapolation: checks as already described
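
For the energy-loss check, a profile of (pt_MuIdSA - pt_Moore) versus pt is one natural way to monitor the parameterization; the sketch below uses a standalone ROOT TProfile with invented names and binning, not the actual monitoring code.

```python
import ROOT

# Hypothetical monitoring of the calorimeter energy-loss parameterization:
# profile of (pT_MuIdSA - pT_Moore) versus pT.  Names and binning are
# placeholders; the real values would come from the EF track collections.
eloss_vs_pt = ROOT.TProfile(
    "eloss_vs_pt",
    "Energy loss check;p_{T} [GeV];<p_{T}^{MuIdSA} - p_{T}^{Moore}> [GeV]",
    50, 0.0, 100.0)

def fill_energy_loss(pt_muid_sa, pt_moore):
    """Fill the profile once per matched MuIdSA/Moore track pair."""
    eloss_vs_pt.Fill(pt_moore, pt_muid_sa - pt_moore)
```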

18 Combined reconstruction with the ID (MuId Combined in the EF)
- Final track chi2 and (pt, eta, phi) at the interaction point: checks as already described
- R = N(mu+)/N(mu-): check as already described
- PCA (a0, z0) at the interaction region: for prompt muons the distributions are also related to the dimensions of the luminous region
Top events: sigma(a0) ~ 50 um, sigma(z0) ~ 5.4 cm.
Checks: monitor the mean, sigma and chi2 of Gaussian fits of the MuIdCB z0 and a0; distributions with non-Gaussian tails could require a fit with the sum of two Gaussians, and the ratio of the integrals of the two Gaussians can be used to monitor the relative weight of core and tails.
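
The double-Gaussian treatment of distributions with non-Gaussian tails could be implemented along these lines; this is a standalone PyROOT sketch where the starting values and the core/tail convention are assumptions, not the actual DQ algorithm.

```python
import ROOT

def core_tail_weight(hist, xmin, xmax):
    """Fit a distribution with non-Gaussian tails with the sum of two
    Gaussians and return the ratio of their integrals, used to monitor the
    relative weight of core and tails (illustrative implementation)."""
    f = ROOT.TF1("double_gaus", "gaus(0) + gaus(3)", xmin, xmax)
    # Rough starting values: a narrow core plus a wide tail component.
    f.SetParameters(hist.GetMaximum(), hist.GetMean(), 0.5 * hist.GetRMS(),
                    0.1 * hist.GetMaximum(), hist.GetMean(), 3.0 * hist.GetRMS())
    hist.Fit(f, "QR")
    core = ROOT.TF1("core", "gaus", xmin, xmax)
    tail = ROOT.TF1("tail", "gaus", xmin, xmax)
    core.SetParameters(f.GetParameter(0), f.GetParameter(1), f.GetParameter(2))
    tail.SetParameters(f.GetParameter(3), f.GetParameter(4), f.GetParameter(5))
    tail_integral = tail.Integral(xmin, xmax)
    if tail_integral == 0.0:
        return float("inf")   # no tail component found by the fit
    return core.Integral(xmin, xmax) / tail_integral
```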

19 TrigMOORE wrt LVL2
- Comparison with the LVL2 ROI: pt(?), eta, phi
- Track multiplicity per ROI: sensitivity to background, ROI overlapping, etc.
Top events, MuId combined pt vs LVL2 pt: mean ~ -44 MeV, sigma ~ 318 MeV; MuId combined phi vs LVL2 phi: mean ~ 0, sigma ~ 0.3 mrad.
Checks: monitor the mean, sigma and chi2 of the fit of the differences (e.g. MuId CB FEX variable - LVL2); same as the previous page for non-Gaussian tails.

20 "Muon EF trigger monitoring"
Acceptances (|eta| < 2.4) wrt LVL2 or wrt the previous step:
  Acceptance          Muons from top, ~35 GeV (%)   6 GeV single muons (%)   17 GeV single muons (%)
  MOORE wrt LVL2      96.8 ± 0.5                    95.9 ± 0.1               96.1 ± 0.2
  MuidSA wrt LVL2     95.3 ± 0.6                    90.4 ± 0.1               95.0 ± 0.2
  MuidCB wrt LVL2     93.8 ± 0.7                    89.0 ± 0.1               93.3 ± 0.2
  MuidSA wrt MOORE    98.4 ± 0.4                    94.3 ± 0.1               98.8 ± 0.1
  MuidCB wrt MuidSA   96.9 ± 0.5                    98.4 ± 0.1               98.3 ± 0.1
Checks: acceptance stability by checking counters.
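
The acceptance-stability check from counters reduces to an efficiency with a binomial error; a minimal sketch, using toy counter values rather than the numbers in the table above.

```python
import math

def acceptance(n_passed, n_total):
    """Acceptance from simple counters (e.g. MOORE tracks found / LVL2 ROIs)
    with a binomial error, as used for the stability check of the table."""
    eff = float(n_passed) / float(n_total)
    err = math.sqrt(eff * (1.0 - eff) / n_total)
    return eff, err

# Toy counters for illustration only.
eff, err = acceptance(968, 1000)
print(f"acceptance = {100 * eff:.1f} +- {100 * err:.1f} %")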

21 Hypo algorithms
For the moment implemented at the end of the muon EF sequence (MuIdCombined) and not applied in validation. Presently the cut variable is pt only, so the histograms/parameters to be checked for monitoring are the pt spectrum before and after the cut, and the acceptance (= accepted number / total number of triggers).
Total number of histograms to be checked for the Muon EF: reasonably < 50 (considering all sequence steps and eta, phi slices); as experience is collected, the number of really meaningful histograms will probably decrease.

22 Plans (short term)
- Implement the histograms inside the code by using the common monitoring tools for the trigger
- Continue the studies on realistic data samples in order to:
  - identify other meaningful variables for DQ
  - evaluate the reliability of the proposed checks
  - understand the sensitivity to different run conditions / data deteriorations
- Get familiar with the DQMF algorithms, in the Workbench and online
Plans (middle term)
- Start thinking about the implementation of a higher-level muon-slice DQ in "quasi-online"/offline environments, for deeper DQM/DQA

23 Backup slides

24 New HLT sequence, hypo test (diagram): LVL2::muFast, LVL2::muComb -> L2_muX; EF::TrigMoore, EF::TrigMuidSA -> EF_muX; EF::ID -> EF_muX; EF::TrigMuidCB -> EF_muX

