5th Workshop on ALICE Installation and Commissioning, April 1st & 3rd, CERN. Muon Tracking (MCH): Conclusions of the February/March 08 ALICE COSMIC run.

Status of Hardware during Run
Dimuon Forward Spectrometer (DFS) tracking chambers: 5 stations; each station has 2 detection planes, or chambers; 1 chamber/plane gives an (x,y) coordinate.
[Schematic: mu+ and mu- tracks from the primary vertex through the stations; axes X, Y, Z; Station 1 indicated.]
Stations 1 and 2 read out during the cosmic run [+ Chamber 5] and, for the first time, triggered by the muon trigger.

Status of Hardware during Run
Dimuon Forward Spectrometer (DFS) tracking chambers: 5 stations; each station has 2 detection planes, or chambers; 1 chamber/plane gives an (x,y) coordinate.
During the Feb/Mar cosmic run: everything was installed (except chamber 6, inside the magnet) and local commissioning (St 4 & 5) was being done in parallel. 2/5 of the DFS was powered: Stations 1 and 2, and part of Station 3 (chamber 5) was read out as well (max configuration: 3 LDCs, 11 DDLs). Air cooling (St 1 and St 2) is very stable; temperatures look fine. Still some HV trips! Never seen during previous tests → how sure are we of the gas quality?

Data Taking
Two periods of data taking:
ACORDE trigger (~80 Hz): a very large number of events; should be enough to check the clustering algorithms. Tracks have a (too) large angle and are of limited interest.
Muon trigger (~80 mHz): the total number of events is around 8840 (summed over the runs requested to be reconstructed). A first analysis shows that shower events dominate, but there are a few horizontal tracks → very interesting to look at.
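(As a quick cross-check, an addition to the slide that simply takes the quoted figures at face value: at ~80 mHz, 8840 events correspond to 8840 / 0.08 Hz ≈ 1.1 × 10^5 s, i.e. roughly 31 hours of effective muon-trigger data taking.)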

DAQ
No real DAQ issues: compliance with DAQ achieved. Really stable: 1 run of 10^7 events at 40 MHz and 2 further runs at 40 MHz.
Readout times (in zero-suppressed mode, after taking pedestal runs, with the threshold calculation done by a DA included in our rpm, etc. = conditions of real data taking): the readout rate is slower than expected (lab measurements gave a busy time of ~220 μs). We are well aware of the problem; it needs time to be investigated since other tasks have higher priority. The goal is to decrease the deadtime (well) below 200 μs.
Entry in the logbook: [table: Run | Busy | L2a | Strobe Time | Deadtime (μs) | Busy (%) | DAQ rate (Hz) | Check | Pulser rate (Hz)]
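As a hedged aside (not from the slides themselves): the logbook quantities above are tied together by the standard non-paralyzable dead-time model. The minimal C++ sketch below illustrates the relation, using the ~220 μs lab figure from the slide and hypothetical input trigger rates; it is only a model, not the CROCUS measurement.

    #include <cstdio>

    // Non-paralyzable dead-time model: with input trigger rate r and
    // per-event deadtime tau, the accepted DAQ rate is
    //   r_out = r / (1 + r * tau)
    // and the busy fraction is r_out * tau.
    int main() {
        const double tau = 220e-6;  // lab-measured busy time ~220 us (from the slide)
        for (double r : {100., 500., 1000., 5000.}) {  // hypothetical input rates (Hz)
            double rOut = r / (1.0 + r * tau);         // accepted rate
            double busy = rOut * tau;                  // fraction of time readout is busy
            std::printf("in=%6.0f Hz  out=%7.1f Hz  busy=%5.1f%%\n",
                        r, rOut, 100.0 * busy);
        }
        return 0;
    }

At high input rates the busy fraction saturates near 100% and the accepted rate approaches 1/tau, which is why lowering the deadtime (well) below 200 μs directly raises the achievable DAQ rate.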

DCS/ECS/Trigger/HLT
DCS: the finite state machine for low & high voltage works; temperatures and gas flow are monitored (still some improvements to be done: alarm handling and interlocks).
ECS: scripts are ready and working, no issues. + Bridge tested and OK.
Trigger: some problems, but is it really the trigger? Clearly some busy problem when a run stops "unexpectedly" at the DAQ level. It happened a lot during the first week (w9) but not a single time during w10 (in the ALICE_multi partition with MTR & V0). (We didn't forget the trigger questionnaire.)
HLT: we have all seen some results! Haven't we?

HLT Monitoring/Analysis
HLT readout on the LDC:
1- subtract the pedestal (on the 0-suppressed data)
2- apply a rough cut (keep all digits > 10 ADC)
3- scan interesting events
4- display the results on the ACR big screen
All plots available at:
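Steps 1 and 2 of this chain amount to a per-pad pedestal subtraction followed by a sharp threshold. A minimal, self-contained C++ sketch of that logic follows; the structure and function names are illustrative assumptions, not the actual HLT code.

    #include <cstdio>
    #include <vector>

    // Hypothetical pad reading: pad id and raw ADC value.
    struct PadHit { int padId; int adc; };

    // Subtract the per-pad pedestal from the zero-suppressed data,
    // then keep only digits above a sharp cut (10 ADC on the slide).
    std::vector<PadHit> selectFiredPads(const std::vector<PadHit>& raw,
                                        const std::vector<int>& pedestal,
                                        int cut = 10) {
        std::vector<PadHit> fired;
        for (const PadHit& h : raw) {
            int signal = h.adc - pedestal[h.padId];  // pedestal-subtracted charge
            if (signal > cut) fired.push_back({h.padId, signal});
        }
        return fired;
    }

    int main() {
        std::vector<int> pedestal = {50, 52, 48, 51};  // toy pedestals (ADC)
        std::vector<PadHit> raw = {{0, 90}, {1, 55}, {2, 49}, {3, 75}};
        for (const PadHit& h : selectFiredPads(raw, pedestal))
            std::printf("pad %d fired with %d ADC above pedestal\n", h.padId, h.adc);
        return 0;
    }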

HLT Monitoring/Analysis
Fired pads isolated with a sharp 10 ADC cut on 0-suppressed and pedestal-subtracted data.

HLT Monitoring/Analysis
A straight line passing through the fired pads points clearly to ACORDE.

HLT Monitoring/Analysis
A straight line passing through the fired pads/strips points towards the center of ALICE (muon trigger decision algorithm).
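Both pointing checks reduce to fitting a straight line through the fired-pad positions and extrapolating it. Below is a minimal sketch of that idea, assuming a least-squares fit of x(z) and y(z); all coordinates and chamber z positions are made-up toy values, not the HLT implementation.

    #include <cstdio>
    #include <vector>

    struct Point { double x, y, z; };

    // Least-squares fit of u = intercept + slope * z, with u = x or y.
    static void fitLine(const std::vector<Point>& pts, bool useY,
                        double& intercept, double& slope) {
        double n = pts.size(), Sz = 0, Szz = 0, Su = 0, Szu = 0;
        for (const Point& p : pts) {
            double u = useY ? p.y : p.x;
            Sz += p.z; Szz += p.z * p.z; Su += u; Szu += p.z * u;
        }
        double det = n * Szz - Sz * Sz;
        slope = (n * Szu - Sz * Su) / det;
        intercept = (Szz * Su - Sz * Szu) / det;
    }

    int main() {
        // Toy clusters on four tracking chambers (cm), roughly collinear.
        std::vector<Point> hits = {{12.1, 30.5, -533}, {12.9, 32.2, -550},
                                   {14.0, 34.1, -573}, {15.2, 36.0, -592}};
        double ax, bx, ay, by;
        fitLine(hits, false, ax, bx);
        fitLine(hits, true,  ay, by);
        // Extrapolate to z = 0: a line aimed at the center of ALICE passes
        // near (0,0) there; a cosmic pointing to ACORDE does not.
        std::printf("at z=0: x=%.1f cm, y=%.1f cm\n", ax, ay);
        return 0;
    }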

HLT Monitoring/Analysis

Online Monitoring/Offline
Online monitoring: all with MOOD: pedestal, noise and gain visualization; some global histograms (pad occupancy, hit map, errors). Data from the GDC have been monitored during the run in global.
New mchview tool: not online, but can read all the information at the OCDB level and histogram all quantities at any granularity (from a single pad to a full chamber).
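To make the "any granularity" point concrete, here is a small sketch under assumed data structures (not mchview's actual code): once every pad value carries its geographic address, switching between pad-, detection-element- and chamber-level views is just a change of grouping key.

    #include <cstdio>
    #include <map>
    #include <vector>

    // Illustrative pad record: chamber, detection element, pad, and a value.
    struct PadValue { int chamber, detElem, pad; double noise; };

    int main() {
        std::vector<PadValue> pads = {{1, 100, 0, 1.1}, {1, 100, 1, 1.2},
                                      {1, 101, 0, 1.0}, {2, 200, 0, 1.3}};
        // Group by chamber; grouping by p.detElem instead gives the DE view.
        std::map<int, std::pair<double, int>> perChamber;  // chamber -> (sum, n)
        for (const PadValue& p : pads) {
            perChamber[p.chamber].first += p.noise;
            perChamber[p.chamber].second += 1;
        }
        for (const auto& kv : perChamber)
            std::printf("chamber %d: mean noise %.2f ADC over %d pads\n",
                        kv.first, kv.second.first / kv.second.second,
                        kv.second.second);
        return 0;
    }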

Online Monitoring/Offline
mchview: analysis of the calibration run for St. 1 and 2, using the file from the OCDB (copied from alien) → the full chain is working, from the ECS calibration script up to the analysis.

Online Monitoring/Offline MOOD: online display of the run. Fired pads on chamber 4.

Online Monitoring/Offline MOOD: online display of the run. Fired pads on chamber 3.

Online Monitoring/Offline MOOD: online display of the run. Fired pads on chamber 2.

Online Monitoring/Offline MOOD: online display of the run. Fired pads on chamber 1. Mapping problem now fixed.

Online Monitoring/Offline
ALIEVE: offline display of the same event (126) from reconstructed digits. From the offline muon clustering, 5 hits are reconstructed:

    Row  | ch | charge | size | absPosX | absPosY | absPosZ
    1742 |  0 |        |    5 |         |         |
    1743 |  0 |        |    6 |         |         |
    1744 |  1 |        |   10 |         |         |
    1745 |  2 |        |    3 |         |         |    -673
    1746 |  2 |        |    3 |         |         |    -673
    1752 |  3 |        |    9 |         |         |    -692

Online Monitoring/Offline
Data storage: data are on CASTOR. AliRoot reconstruction has been run on a sample of data (by muon "offline shifters"); 13 runs → MUON.digits available on alien.
Data quality checks: QA/AMORE is a work in progress. To-do list for QA (Mar 08), to complete the QA classes with:
From raw data:
- hit distribution per chamber (TH1 with the chamber number on the horizontal axis) (see the filling sketch after this list);
- hit distribution per detection element (TH1 with the DE number on the horizontal axis);
- hit distribution per bus patch (10 TH2, one per chamber, with the bus-patch geographical distribution, as in mchview).
From recpoints:
- trigger efficiency plots (Diego) - but now in an analysis task;
- check of the trigger code (who?);
- trigger scaler values (Diego);
- strip hit distribution (4 TH2, one per chamber, with the strip distribution) (Diego);
- local board hit distribution (1 TH2 with the local response of each local board) (Diego).
From ESD:
- p, pt, y distributions - already in place;
- track multiplicity = number of tracks per event (somehow related to tracking efficiency);
- track cluster multiplicity distribution = number of clusters per track (somehow related to clustering efficiency);
- multiplicity of clusters per chamber (10 TH1);
- multiplicity of clusters per detection element (144 TH1);
- charge of the cluster per chamber (10 TH1);
- charge of the cluster per detection element (144 TH1);
- normalized chi2 of the track (somehow related to tracking quality);
- distance between clusters and track (somehow related to clustering quality and alignment accuracy);
- number of tracks matching the trigger (somehow related to tracker/trigger efficiency). NEW
Christian, Philippe P., Diego
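To illustrate the first two raw-data items, here is a hedged ROOT/C++ filling sketch with toy hits and illustrative histogram names; the real implementation belongs in the AliRoot QA classes, not in a standalone program like this.

    #include "TH1F.h"
    #include "TFile.h"

    int main() {
        // One TH1 with the chamber number on the horizontal axis, one with
        // the detection-element number, filled once per hit.
        TH1F hChamber("hHitsPerChamber", "Hits per chamber;chamber;hits",
                      10, 0.5, 10.5);
        TH1F hDE("hHitsPerDE", "Hits per detection element;DE;hits",
                 1100, -0.5, 1099.5);

        // Toy hits: (chamber, detection element). Real code would loop
        // over the raw-data decoder output.
        int hits[][2] = {{1, 100}, {1, 101}, {2, 200}, {5, 505}};
        for (auto& h : hits) { hChamber.Fill(h[0]); hDE.Fill(h[1]); }

        TFile out("muonQA.root", "RECREATE");
        hChamber.Write();
        hDE.Write();
        out.Close();
        return 0;
    }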

COSMIC Data, Analysis Results
AliEve displays the MUON digits from reconstructed data. For the moment, clusters are reconstructed afterwards: see the clustering algorithm at work on real data, implement the trigger selection, look for horizontal tracks.
All pedestal runs (so far) show nominal/expected noise. Magnet (dipole + L3) effect on the measured noise, from a comparison of runs: relative noise difference with mean = 0.04 ADC and σ = 0.07 ADC (the nominal noise is 1.1 ADC), compatible with single-channel noise fluctuations.
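The magnet check quoted above reduces to a per-channel difference distribution between two pedestal runs. A toy sketch of the computation, with made-up noise values rather than the real pedestal data:

    #include <cmath>
    #include <cstdio>
    #include <vector>

    int main() {
        // Per-channel noise (ADC) from two pedestal runs: magnets off vs on.
        std::vector<double> noiseOff = {1.10, 1.08, 1.12, 1.09, 1.11};
        std::vector<double> noiseOn  = {1.12, 1.07, 1.18, 1.10, 1.13};
        double sum = 0, sum2 = 0;
        for (size_t i = 0; i < noiseOff.size(); ++i) {
            double d = noiseOn[i] - noiseOff[i];  // per-channel difference (ADC)
            sum += d; sum2 += d * d;
        }
        double n = noiseOff.size();
        double mean = sum / n;
        double sigma = std::sqrt(sum2 / n - mean * mean);
        // A mean and sigma small compared to the ~1.1 ADC nominal noise is
        // compatible with ordinary single-channel fluctuations.
        std::printf("difference: mean=%.3f ADC, sigma=%.3f ADC\n", mean, sigma);
        return 0;
    }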

Hardware/Software Status for the next cosmic run + Goals
To do for the next run:
Some electronics to be fixed on Stations 1 & 2.
Some software to be installed (related to the trigger/busy) → already scheduled for w15 (along with the DAQ/Trigger test).
Some work on the interlocks needs to be done (DCS).
Goals:
Pedestal/noise and calibration procedures are ready and tested.
Clustering and hit reconstruction seem to work fine.
Alignment and tracks: need tracks! → the tracking algorithm has been modified to reconstruct tracks without all the chambers (to be tried on the Feb/Mar cosmic data).
During (or before) the next run, St 3, 4 and 5 have to be included in the readout. As soon as a chamber is validated at the "local commissioning" level, it can easily be included in the global DAQ.

Action to Completion
Station 1: ready for beam (dead channels? but <<1%; all HV ok ≡ 1650 V).
Station 2: ready for beam (dead channels? but <<1%; all HV ok except 1 group/24).
Station 3: finishing the local commissioning. Ch5 is almost done (readout ok, some very localized pedestal issues). Ch6 is installed (in open position to finish ch5); cabling just started.
Station 4: finishing the local commissioning. Ch7 outside is used to test the noise; the Wiener LV PS was modified → nice improvements (noise went down by between 1.5 and 3.5 ADC). Ch7 inside has some problems and needs to be retested. Ch8 not tested; readout cabling to be done.
Station 5: finishing the local commissioning. Ch9 readout is ok (a few slats have problems); HV to be tested. Ch10 cabled; readout tests starting next week.
Still a lot of work to do before the next cosmic run. Dimuon tracking meeting tomorrow afternoon to define goals and strategy.
1- St 3, 4, 5 local commissioning implies no dipole field.
2- Probably muon trigger only during the next run.

Conclusion
Very good behavior of St1+St2 was achieved during data taking: the readout (CROCUS) is compliant with DATE, and trigger pattern handling is fine (L1/L2 rejects are implemented and tested in the lab). 2 LDCs ok; 3 LDCs (when adding chamber 5) was straightforward → very confident that the remaining chambers can easily be included.
The software for data taking (DA, ECS scripts, MOOD) is in very good shape. AMORE/QA are being developed. Offline reconstruction has been running fine (2 runs had problems in the data transfer → fixed).
Dimuon "offline shifters" have developed tools and run the reconstruction → first results with clusters/single muon triggers → working on getting tracks.

Muon_trk questions
One issue relates to the debugging of the DA (and maybe other debugging): the online cluster is completely isolated from the outside world. To be more precise, we need a machine where we can play with the data and with the DA program (sources, compiling, debugging) under the same conditions as on an LDC. We have some machines at CERN (dimlab04, 05, ...) where the code can be installed, but we don't have the data (in the DATE format, as we have it online). During the last cosmic run we failed to reproduce, on one of our CERN machines, a segmentation fault observed online. Since we couldn't make the test with the same data, how can we compare?