
1 Study of the electron identification algorithms in TRD Andrey Lebedev 1,3, Semen Lebedev 2,3, Gennady Ososkov 3 1 Frankfurt University, 2 Giessen University and 3 LIT JINR

2 Introduction
 Due to the heavy Landau tails of the dE/dx distributions, a direct cut at 90% electron identification efficiency gives 50% pion misidentification for a single layer.
 As previous studies showed, achieving higher pion suppression requires the following:
1. Apply a transformation to reduce the Landau tails of dE/dx;
2. Increase the number of TRD layers;
3. Try different radiator types;
4. Apply one of the electron identification algorithms.
(GSI 2013, G. Ososkov)
[Figure: energy-loss spectra at 90% electron identification efficiency; the e⁻ contribution = dE/dx + TR]

3 Electron identification methods (1)
 Methods:
 Threshold on the mean value
 Threshold on the median value
 Likelihood
 Median:
 Sort the energy losses
 Take the middle measurement; its distribution function is F_med(x) = P(med < x) = [F_Landau(x)(1 − F_Landau(x))]^(n/2)
 Likelihood:
 Prepare PDFs for the electron and pion energy-loss spectra
 Calculate the likelihood ratio
 In the case of many layers, the per-layer likelihoods are multiplied
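The likelihood and median methods can be sketched as follows. This is a minimal illustration with toy Gaussian stand-ins for the measured energy-loss spectra (not the real PDFs); all names are hypothetical:

```python
import math
import statistics

def likelihood_ratio(losses, pdf_e, pdf_pi):
    # Multiply the per-layer PDF values under each hypothesis,
    # then form the ratio L_e / (L_e + L_pi).
    l_e = l_pi = 1.0
    for x in losses:
        l_e *= pdf_e(x)    # electron hypothesis
        l_pi *= pdf_pi(x)  # pion hypothesis
    return l_e / (l_e + l_pi)

def median_eloss(losses):
    # Median method: sort the losses and take the middle measurement.
    return statistics.median(losses)

# Toy Gaussian stand-ins for the measured spectra: electrons deposit
# more energy on average because of the transition radiation (TR).
pdf_e = lambda x: math.exp(-((x - 6.0) ** 2) / 2.0)
pdf_pi = lambda x: math.exp(-((x - 2.0) ** 2) / 2.0)
```

With these toy PDFs, a track with large losses gives a ratio near 1 (electron-like) and a track with small losses a ratio near 0 (pion-like).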

4 Electron identification methods (2)
 Energy-loss transformations
 Transformation 1 (V. Ivanov):
 Transform: (Eloss − Landau_FWHM) / Landau_MPV − 0.225
 Sort the transformed Eloss values
 Calculate the value of the Landau distribution function, mapping onto (0, 1)
 Transformation 2 (G. Ososkov, S. Lebedev):
 Sort the energy losses
 Prepare PDFs for the ordered energy losses
 Calculate the likelihood ratio for each energy loss
 Energy-loss transformation + classifier:
 Artificial Neural Network (ANN) + transformation 1
 Boosted Decision Tree (BDT) + transformation 2
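Transformation 1 can be sketched as below. Since the standard Landau distribution function has no closed form, this sketch substitutes the Moyal distribution's CDF, a common closed-form approximation; that substitution and all names are assumptions, not the authors' implementation:

```python
import math

def moyal_cdf(x):
    # Closed-form CDF of the Moyal distribution, used here as a
    # stand-in for the standard Landau distribution function.
    return math.erfc(math.exp(-x / 2.0) / math.sqrt(2.0))

def transform1(losses, mpv, fwhm):
    # Transformation 1: (Eloss - Landau_FWHM) / Landau_MPV - 0.225,
    # sort, then map through the distribution function onto (0, 1).
    t = sorted((x - fwhm) / mpv - 0.225 for x in losses)
    return [moyal_cdf(v) for v in t]
```

The result is a sorted list of values in (0, 1), which can then be fed to a classifier such as the ANN.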

5 CBM TRD CERN beam test 2011 setup
TRD prototypes from the Frankfurt and Münster groups were tested at the CERN PS/T9 beam line in October 2011. Radiators in our study:
 From Frankfurt (CERN 2011):
 4mm_foam → 4+4 mm geometry with a foam radiator
 5mm_fibre → 5+5 mm geometry with an ALICE-type fibre radiator
 4mm_f350 → 350 layers of foil
 From Münster (CERN 2011):
 B → Pokalon (250 foils / 700 foil distance / 24 foil thickness)
 F → PE (220/250/20)
 G (30) → 30 fibre mats
 H++ → 36 cm foam

6 Beam test data from the Frankfurt and Münster groups (CERN 2011, 3 GeV/c)
Thanks to Andreas Arend and Cyrano Bergmann for the beam test data!
[Figure: energy-loss spectra, three examples for various radiators: 4mm_foam (Fra), H++ (Mun), 4mm_f350 (Fra)]

7 Testing procedure
Energy losses are simulated by taking random values from the measured spectra histograms (TH1::GetRandom; C. Bergmann).
 e⁻ ID algorithm training stage:
 Simulate two samples, pions and electrons, each with many thousands of energy losses
 For every energy loss, calculate the output of the corresponding e⁻ ID algorithm and histogram it
 On this histogram, find the cut giving 90% electron efficiency and obtain β, the probability of pion mis-ID; the pion suppression index is 1/β
 e⁻ ID algorithm testing stage:
 Simulate a representative sequence of pions and electrons and identify each particle by applying the known cut
 Calculate the pion suppression index for the given algorithm

8 Results for the H++ 36 cm foam (Mun) radiator and 10 layers, 3 GeV/c
[Plot shown, with a zoomed view]

9 Results for the 4mm_foam (Fra) radiator and 10 layers, 3 GeV/c

10 Pion suppression vs number of TRD layers (4mm_foam (Fra), H++ 36 cm foam (Mun)), 3 GeV/c

11 Pion suppression for different radiator types (CERN 2011), 3 GeV/c, 10 layers
The best results were achieved with the regular foil-type radiator. However, such radiators usually require a significant external support frame to hold the foils, while reasonable pion suppression can also be achieved with an irregular foam radiator. Foam radiators are self-supporting and much cheaper than regular ones.

12 CERN 2012. Pion suppression for different radiator types (Fra), 3 GeV/c, 10 layers
 ALICE → ALICE-type radiator (5 mm fibre)
 f250 → 250 foil layers with 0.7 mm spacing
 NO rad → without any radiator
 R002 → PE foam, ~260 transitions with 1 mm bubble size; the current best compromise

13 CERN 2012. Pion suppression vs number of TRD layers, 3 GeV/c
 R002 looks competitive with the regular foil radiator for the given pion suppression requirements.
[Curves shown: ALICE, f250, R002]

14 Test with TRD geometries for SIS300 and SIS100
 C. Bergmann has implemented an energy-loss simulation based on beam-time data in CBMROOT.
 We tested the v13c geometry with the H++ radiator.
 Electrons and pions were generated with momenta from 1 GeV/c to 8 GeV/c.

15 Pion suppression results (90% electron identification efficiency, ANN-based algorithm)
 The ANN-based electron identification algorithm was adapted for the new TRD geometries and the H++ radiator and can be used in simulations.
 The algorithm can handle TRD tracks with 1 to 10 hits.

16 Summary
 TRD beam-time data from CERN 2011 and 2012 were analyzed with algorithms implemented in CBMROOT.
 The regular foil-type radiator (4mm_f350) showed excellent results but is extremely difficult to build at large scale.
 The required pion suppression can also be achieved with irregular foam radiators (H++, 4mm_foam, R002).
 The new TRD geometries for SIS300 and SIS100 were checked and showed good performance.

17 Thanks for your attention!

18 Decision trees in particle identification
 Go through all PID variables, sort them, find the variable that best separates signal from background, and cut on it.
 For each of the two resulting subsets, repeat the process.
 This forking decision pattern is called a tree.
 Split points are called nodes.
 Ending nodes are called leaves.
A single decision tree thus applies multiple cuts on the variables (e.g. X and Y), from the root node through branches down to the leaf nodes.
However, a danger exists: demanding perfect separation on the training sample degrades classifier performance, which is called "overtraining".

19 How to boost decision trees
 Given a training sample, boosting increases the weights of misclassified events (background which is classified as signal, or vice versa), so that they have a higher chance of being correctly classified in subsequent trees.
 Trees with more misclassified events are also weighted, receiving a lower weight than trees with fewer misclassified events.
 Build many trees (~1000) and form a weighted sum of the event scores from all trees (the score is +1 for a signal leaf, −1 for a background leaf). The renormalized, possibly weighted, sum of all scores is the final score of the event: high scores mean the event is most likely signal, low scores that it is most likely background.
Boosted Decision Trees (BDT) keep all the advantages of single decision trees and are less susceptible to overtraining: the boosting algorithm combines many weak trees (single-cut trees), e.g. ~500 weak trees together.
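The boosting scheme described above can be sketched as a toy AdaBoost over single-cut trees (decision stumps) on one variable. This is a minimal illustration with hypothetical names, not the actual implementation used in the study:

```python
import math

def stump(t, pol):
    # A single-cut tree: predict +1 (signal) on one side of the
    # threshold t and -1 (background) on the other.
    return lambda x: pol if x > t else -pol

def train_bdt(xs, ys, n_trees=5):
    # ys are +1 for signal and -1 for background.
    n = len(xs)
    ws = [1.0 / n] * n                      # per-event weights
    ensemble = []                           # (tree weight, tree) pairs
    for _ in range(n_trees):
        # Choose the cut and polarity with the smallest weighted error.
        best_err, best_h = float("inf"), None
        for t in sorted(set(xs)):
            for pol in (1, -1):
                h = stump(t, pol)
                err = sum(w for x, y, w in zip(xs, ys, ws) if h(x) != y)
                if err < best_err:
                    best_err, best_h = err, h
        err = max(best_err, 1e-10)          # guard against a perfect tree
        alpha = 0.5 * math.log((1.0 - err) / err)  # tree weight
        ensemble.append((alpha, best_h))
        # Boost: raise the relative weight of misclassified events.
        ws = [w * math.exp(-alpha * y * best_h(x))
              for x, y, w in zip(xs, ys, ws)]
        norm = sum(ws)
        ws = [w / norm for w in ws]
    return ensemble

def bdt_score(ensemble, x):
    # Weighted sum of per-tree scores; high means signal-like.
    return sum(alpha * h(x) for alpha, h in ensemble)
```

A cut on `bdt_score` then plays the same role as the cut on the classifier output in the testing procedure above.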

20 e⁻/π separation with a boosted decision tree
Result for the BDT classifier with the 4mm_foam (Fra) radiator and 10 layers, 3 GeV/c: the pion suppression is 474 for 90% electron efficiency (cut on the BDT output = 0.77).

