PWG3 Analysis: status, experience, requests
Andrea Dainese, on behalf of PWG3
ALICE Offline Week, CERN, 13.07.11



Outline
Analysis groups in PWG3 (11 papers in progress, using 2010 data):
– D2H: vertexing
– HFE: single electrons
– JPSI2E: dielectrons
– MUON: muons and dimuons
Analysis modes
The special case of muon analyses
User feedback
Central trains

Analyses & Their Input
D2H (vertexing): entirely based on AOD; about 15 analyses.
Electrons: mostly based on ESD; tests with AOD ongoing; about 10 analyses.
– Main reason to stay with ESD for a while: electron ID is still being understood/optimized and involves many detectors (TPC, TOF, TRD, EMCAL).
MUON: mostly based on muon-AOD (1% of std AOD), but can also use ESD; about 10 analyses; some analyses on AAFs.
In total: >25 analyses on AODs + ~10 on ESDs.

Muon Analyses: prompt data availability issues
The key concern lately is “availability” (we realize offline is not responsible for all the delays).
The data needed for the muon analyses are accumulating very quickly since the increase of the luminosity (the equivalent of the full 2010 statistics every day of data taking).
Prompt availability is crucial to keep pace with the other experiments (e.g. LHCb). Typical example: J/ψ polarization.
All prerequisites are there:
– MUON cluster calibration is ready before reconstruction
– MUON cluster reconstruction is faster than the whole ALICE reconstruction
– the output ESD is small
– muon ESD-filtering is fast and muon-AOD is ~1% of std-AOD (0.1% for PbPb), e.g. 18 GB for the full PbPb 2010 sample
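The size fractions quoted above can be cross-checked with a one-line estimate. A minimal sketch, assuming the quoted filtering fractions (~1% of std-AOD in pp, ~0.1% in PbPb); the ~18 TB std-AOD figure is an assumption implied by the 18 GB muon-AOD quoted for PbPb 2010, not a number from the slides:

```python
# Back-of-envelope check of the muon-AOD filtering fractions quoted above.
# The std-AOD size below is an assumed illustration value.

def muon_aod_size_gb(std_aod_gb, fraction):
    """Estimated muon-AOD size given the std-AOD size and filtering fraction."""
    return std_aod_gb * fraction

# PbPb 2010: an assumed ~18 TB of std-AOD at the quoted ~0.1% fraction
# gives the ~18 GB muon-AOD mentioned on the slide.
print(muon_aod_size_gb(18e3, 1e-3))
```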

Example: LHC11c
(logbook: runs with MCH+MTR, >10K events, >10 minutes; temporary list, pending e.g. final QA checks)
Runs per sub-period, given as X (Y): X from the logbook (LB), Y = not bad in the RCT:
– period 1: (58) runs
– period 2: 103 (46) runs
– period 3: 47 (8) runs (80% w/o TPC)
Statistics per trigger class, with the expected J/ψ yield where estimated:
– CINT1or7: 212 M; 180 M x 1e-5 = 2K J/ψ
– CINT7: 51 M; 39 M
– CINT7: 2 M; 0.3 M
– CMUS1or7 (~0.5 GeV/c): 39 M; 33 M x 2e-4 = 7K J/ψ
– CMUS7 (~1 GeV/c): 23 M; 0
– CMUSH7 (~1.7 GeV/c): 4.8 M; 1.9 M
– CMUSH7 (~4.2 GeV/c): 7 M; 1 M
– CMUL7 (~1 GeV/c): 3.3 M; 1.2 M
– CMUL7 (~1 GeV/c): 2 M; 0.3 M
– CMUU7-X (~1 GeV/c): 1.4 M; 0.5 M x 2e-2 = 10K J/ψ
– CMUU7 (~1 GeV/c): 1 M; 0.1 M x 2e-2 = 2K J/ψ
Legend: S = single µ low pT; SH = single µ high pT; L = like-sign µµ low pT; U = unlike-sign µµ low pT
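The expected J/ψ yields in the trigger statistics above are simple products of the good-run event counts and a per-trigger J/ψ-per-event factor. A minimal sketch reproducing the slide's arithmetic (the factors 1e-5, 2e-4, 2e-2 are the ones quoted on the slide):

```python
# Reproduce the yield estimates from the LHC11c trigger table:
# expected J/psi = (events surviving QA) x (J/psi per event for that trigger).

def expected_jpsi(n_events, jpsi_per_event):
    """Expected number of reconstructed J/psi for a trigger sample."""
    return n_events * jpsi_per_event

print(round(expected_jpsi(180e6, 1e-5)))  # CINT1or7:  ~2K on the slide
print(round(expected_jpsi(33e6, 2e-4)))   # CMUS1or7:  ~7K on the slide
print(round(expected_jpsi(0.5e6, 2e-2)))  # CMUU7-X:  ~10K on the slide
```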

Yet, the ϒ is at hand...
First quick look at LHC11c data: 51 runs, from pass1 ESDs & AOD055.
[Figure: dimuon invariant-mass distributions from ESDs and from AODs (matched + Rabs tracks)]

Muon-related requests
All targeted at the goal of improving the availability of data to analyze.
– Request: muon-cluster-only reco (savannah task #20946). Reason: increase ESD availability during the current period (LHC11c at that moment). Outcome: refused once by PB.
– Request: produce AODs from pass1 (savannah task #20980). Reason: get the smallest possible objects to analyze. Outcome: agreed (but to be redone with the latest tag).
– Request: write all events in AODs (i.e. flag the non-physics-selected ones, but do not discard them). Reason: increase usability of the produced AODs. Outcome: pending PB discussion?

User feedback
Analysis (on AODs) on the Grid runs mainly as single-user jobs.
Two different “modes”:
– pp analysis: strategy and cuts established; run when new sets are available; few iterations → could go to a central train
– PbPb analysis: still experimenting/exploring (tuning cuts and PID strategies); chaotic analysis; continuous changes in the task code; frequent iterations → more difficult to move to a central train
Users lately very happy with Grid performance. Few problems:
– merging: sometimes very painful
– limited quota: limited #subjobs (problematic in PbPb, where long CPU times require finer splitting)
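When the final merge of many subjob outputs is the painful step, a common workaround is staged merging: combine outputs in groups of k, then merge the intermediate results, and so on. A minimal sketch of the idea; the group size and the toy merge function are illustrative stand-ins, not the actual AliEn/ROOT merging machinery:

```python
# Staged (tree-wise) merging: instead of one heavy final merge over all
# subjob outputs, merge in groups of at most k until one output remains.

def staged_merge(outputs, k=10, merge=sum):
    """Merge `outputs` in stages of at most `k` items each."""
    while len(outputs) > 1:
        outputs = [merge(outputs[i:i + k]) for i in range(0, len(outputs), k)]
    return outputs[0]

# Toy example: 100 subjob "outputs" (here just numbers, merged by summing),
# combined in two stages of 10-way merges.
print(staged_merge(list(range(100))))
```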

PWG3 Central Trains
Central train macro kindly prepared and tested by Mihaela; committed to PWG3/centraltrain.
Can be configured for AOD or ESD, pp or PbPb.
Contains ~10-15 tasks that run on AODs + 2 tasks on ESDs.
Natural splitting in 3 trains:
1. ESD (electrons)
2. AOD min. bias (vertexing)
3. AOD muon (much smaller set of interesting events in muon-AOD, different runs, …)
We started to identify and train a team of “conductors”; the first cluster is from D2H, coordinated by Zaida.

PWG3-D2H Train: first steps
7 analysis tasks (2 x D0, D+, D*, Ds, Lc, CF).
The configuration is already quite complex:
– 2 systems (pp, PbPb) x 2 data types (real, MC)
– data sample (one train = one period, for now): it would be useful to be able to run the same train on a few periods (e.g. LHC10b,c,d,e) and merge the output per period a posteriori
– task configuration: the D2H tasks are configured using a “cuts” object → at the moment, single users add their task many times in the same job to study different cuts and analysis modes → the central train should provide this “functionality”: each task attached a few times with a different config (passed to the AddTask, maybe via a path to a root file on alien); the OADB doesn’t seem to be a good option to store task configs and analysis cuts
– for MC, we run the CF (THnSparse): also many instances per analysis → issue with memory for merging
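The "each task attached a few times with a different config" idea can be sketched as follows. The class, task names, and cut values are all invented for illustration; in the real train this would be repeated AddTask calls with a cuts object, e.g. read from a ROOT file on alien:

```python
# Hypothetical sketch of attaching one analysis task several times,
# each wagon with its own cuts configuration (a plain dict stands in
# for the ROOT cuts object). All names and values are illustrative.

class Train:
    def __init__(self):
        self.wagons = []

    def add_task(self, name, cuts):
        # unique wagon names keep the output containers from colliding
        self.wagons.append((name, cuts))

train = Train()
cut_sets = {
    "loose": {"pt_min_gev": 1.0, "dca_max_cm": 0.10},
    "tight": {"pt_min_gev": 2.0, "dca_max_cm": 0.05},
}
for label, cuts in cut_sets.items():
    train.add_task(f"D0toKpi_{label}", cuts)

print([name for name, _ in train.wagons])  # one wagon per cut set
```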

PWG3-D2H Train: first steps (cont.)
Frequency of train runs:
– in order to be useful for the users, the trains should run quite frequently (weekly?), possibly with a subset of the tasks
– testing and configuring should become semi-automatic / fast
– however, there are also several datasets being analyzed in parallel → e.g. in D2H, currently: LHC10bcde, LHC11a, LHC10h, + 3 MCs → 9 datasets
First experience in train setup:
– run in test mode locally to optimize the splitting (number of input files) → recommendation is to stay < 5h; however, in test mode the time is dominated by file access… not really the time it takes on the WNs
– check the memory profiles of all tasks → OK for pp and Pb-Pb
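The splitting optimization mentioned above amounts to choosing the number of input files per subjob so that the expected wall time stays below the ~5 h recommendation. A minimal sketch, with assumed per-file processing times (the real per-file time would come from the local test run):

```python
import math

# Choose the number of input files per subjob so that the expected time
# on a worker node stays under the recommended limit (~5 h).

def files_per_subjob(minutes_per_file, max_hours=5.0):
    """Largest file count whose total expected time fits in `max_hours`."""
    return max(1, math.floor(max_hours * 60.0 / minutes_per_file))

print(files_per_subjob(12.0))   # 25 files at 12 min/file fits in 5 h
print(files_per_subjob(400.0))  # very slow files: at least 1 per subjob
```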