Slide 1: Summary of Core “Performance Review” for TkrRecon — How do we know the Tracking is working?
Tracker Reconstruction Software Performance Review
GLAST Science Analysis Software, Wednesday, Oct 16, 2002
The TkrRecon Group

Slide 2: Is the Tracking working? The Simple Answer
YES!

Slide 3: How well is the Tracking working? The real question
 We know the tracking is working to a certain level:
– The event display tells us we are not that far off most of the time
– Basic distributions of TkrRecon output quantities look reasonable
– Output from merit is not complete garbage
– Bill Atwood’s presentation from last week outlined some studies he has done: studies of track fits with muons, studies of 100 MeV gammas, and some initial PSF plots
– Gut feeling is that the base performance is already better than at PDR
 Detailed studies of TkrRecon performance are in the early stages:
– Yesterday’s report outlined the event display, initial studies of simulation/digitization/clustering, some basic TkrRecon output distributions, some very preliminary comparisons of MC predictions versus TkrRecon output, and algorithm timing
– Other tasks in progress: TkrReconTestSuite (to help compare the performance of different algorithms), vertexing algorithm studies, etc.

Slide 4: The Event Display — What does it tell us?
 Pattern recognition
– Problem: associate clusters to form tracks
– Single biggest problem for TkrRecon (a toy sketch of the problem follows below)
 Event display
– The human eye is very good at solving the pattern-recognition problem
– Can very quickly “see” major flaws in the pattern recognition
 Example: 1 GeV muon, point source (z = 1000), -1.0 < cos(theta) < -0.8
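To make the pattern-recognition problem concrete, here is a minimal, purely illustrative sketch of one naive approach: seed a candidate from two clusters in adjacent layers, then extend it layer by layer to the nearest cluster inside a road. The types and names (Cluster, followTrack, the window parameter) are our own inventions for illustration; this is not the actual TkrRecon algorithm.

```cpp
// Toy nearest-neighbor cluster linker: associate clusters into a
// straight-line track candidate. Illustrative only.
#include <cmath>
#include <vector>

struct Cluster { double x; int layer; };
using Track = std::vector<Cluster>;

// Grow a candidate downward from a two-cluster seed, picking in each
// layer the cluster closest to the extrapolated position within a road.
Track followTrack(const std::vector<std::vector<Cluster>>& byLayer,
                  const Cluster& seedTop, const Cluster& seedNext,
                  double window)
{
    Track trk{seedTop, seedNext};
    double slope = seedNext.x - seedTop.x;        // per-layer slope
    for (int lyr = seedNext.layer + 1; lyr < (int)byLayer.size(); ++lyr) {
        double xPred = trk.back().x + slope;      // extrapolate one layer down
        const Cluster* best = nullptr;
        for (const Cluster& c : byLayer[lyr])
            if (std::fabs(c.x - xPred) < window &&
                (!best || std::fabs(c.x - xPred) < std::fabs(best->x - xPred)))
                best = &c;
        if (!best) break;                         // no cluster in the road: stop
        slope = best->x - trk.back().x;           // update local slope
        trk.push_back(*best);
    }
    return trk;
}
```

Even this toy version shows why the event display is so useful: a human immediately sees when a candidate has grabbed a cluster from a neighboring track or a noise hit, failures that are hard to spot in summary distributions.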

Slide 5: Event Display — Gamma Example
 Example: 100 MeV gamma, point source (z = 1000), -1.0 < cos(theta) < -0.8

Slide 6: McHits, Fit to Landau Distribution
[Plot: hit energy deposits fitted to a Landau distribution; “expect” 0.155 MeV]
We need to investigate the Landau distribution in thin layers (a quick cross-check of the expected value follows below).
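As a minimal cross-check of where the quoted 0.155 MeV comes from (assuming 400 µm thick silicon strip detectors — the thickness is our assumption, not stated on the slide), the PDG “wallet card” minimum-ionizing loss in silicon gives

\[
\Delta E \;\approx\; \left\langle \frac{dE}{dx} \right\rangle_{\min} \rho_{\mathrm{Si}}\, t
\;=\; 1.66\,\frac{\mathrm{MeV\,cm^2}}{\mathrm{g}} \times 2.33\,\frac{\mathrm{g}}{\mathrm{cm^3}} \times 0.040\,\mathrm{cm}
\;\approx\; 0.155\ \mathrm{MeV},
\]

consistent with the expectation quoted above. In a layer this thin the energy-loss distribution has a long Landau tail, so the peak and the mean differ noticeably; that is the thin-layer behavior flagged here for further study.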

Slide 7: McHits, Energy Deposit
[Plots: energy deposit for “long” hits and “short” hits]
Energy from photons comes from Compton scatters below the Geant range cutoff, and from stopping photons. We need to fix the digitization to dump all this energy into one or two strips!

Slide 8: Summary of Simulation and Digitization
 Energy loss of charged particles
– For both electrons and muons, the energy loss in the silicon is less than predicted by the naïve wallet-card formula, even without considering the relativistic rise: ⟨ΔE⟩ = 0.140 MeV (peak at 0.166 MeV), compared to the expectation of 0.155 MeV
– Need to understand the effect of the thin radiator; perhaps do a better count of the secondary energy deposit
 Energy loss of neutral particles!
– Because of the way energy below the cutoff is counted, photons appear to lose energy in the silicon when they undergo a low-energy Compton scatter, or the like. With a range cutoff of 0.7 mm, there are photon energy losses up to about 0.15 MeV
– These losses are not handled correctly in digitization. The loss is spread evenly among all of the strips crossed by the photon, whereas it almost certainly occurs in one place. This leads to an underestimation of this source of noise
– The fix (sketched below): if the photon stops in the silicon, add all the energy to the strip in which it stops; if it exits, add the energy to one of the strips that it crosses, chosen at random
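A minimal sketch of the proposed fix, in the spirit of the last bullet. The types and names here (PhotonHit, StripId, depositPhotonEnergy) are hypothetical stand-ins, not the actual GLAST digitization interfaces:

```cpp
// Deposit a photon's sub-cutoff energy into a single strip instead of
// smearing it evenly over every strip it crosses. Illustrative sketch.
#include <map>
#include <random>
#include <vector>

using StripId = int;

struct PhotonHit {
    std::vector<StripId> crossedStrips; // strips traversed in this layer
    bool stopsInSilicon;                // photon absorbed in the silicon?
    StripId stopStrip;                  // strip containing the endpoint
    double energyMeV;                   // energy deposited below the cutoff
};

void depositPhotonEnergy(const PhotonHit& hit,
                         std::map<StripId, double>& stripEnergy,
                         std::mt19937& rng)
{
    if (hit.stopsInSilicon) {
        // Photon stops: all the energy goes to the strip where it stops.
        stripEnergy[hit.stopStrip] += hit.energyMeV;
    } else if (!hit.crossedStrips.empty()) {
        // Photon exits: assign the energy to one crossed strip at random.
        std::uniform_int_distribution<std::size_t>
            pick(0, hit.crossedStrips.size() - 1);
        stripEnergy[hit.crossedStrips[pick(rng)]] += hit.energyMeV;
    }
}
```

Concentrating the deposit in one strip matches the physical picture of a localized interaction, and so should restore the noise contribution that the even-smearing scheme underestimates.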

Slide 9: TkrRecon — Basic Recon Plots
[Distributions of basic recon quantities for 100 MeV gammas, front and back tracker sections plotted separately]

Slide 10: TkrRecon / Monte Carlo Comparisons
Monte Carlo prediction (from McPositionHits, etc.) vs. Recon output
[Overlaid Recon and MC distributions]
100 MeV gammas generated into the cone -1.0 < cos(theta) < -0.8

Slide 11: TkrRecon / Monte Carlo Comparisons — Reconstructed Gamma Pointing Resolution
Monte Carlo prediction (from McPositionHits, etc.) vs. Recon output
[Overlaid distributions, front and back sections; blue is MC, red is Recon]
100 MeV gammas generated into the cone -1.0 < cos(theta) < -0.8

Slide 12: TkrRecon Algorithm Timing (~10 ms resolution)
  Clustering      ~ 2 ms
  Track Fit       ~ 7 ms
  Track Finding   ~ 400 ms
  Vertexing       ~ 200 us
  Total TkrRecon  ~ 400 ms
Reconstruction time is dominated by track finding (no surprise). A note on the quoted timer resolution follows below.
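On the “~10 ms resolution”: a minimal sketch of clock-based stage timing, assuming a std::clock()-style timer (the actual TkrRecon timing machinery may differ). On typical systems of this era the process clock advances in scheduler ticks of roughly 10 ms, which matches the quoted resolution; runStage() is a hypothetical stand-in for one reconstruction stage:

```cpp
// Time one reconstruction stage with the C library process clock.
#include <cstdio>
#include <ctime>

void runStage() { /* ... one reconstruction stage, e.g. track finding ... */ }

int main() {
    std::clock_t t0 = std::clock();
    runStage();
    std::clock_t t1 = std::clock();
    double ms = 1000.0 * double(t1 - t0) / CLOCKS_PER_SEC;
    std::printf("stage time: %.1f ms\n", ms);
    // Intervals below the tick granularity (~10 ms on many systems) read
    // as 0, so sub-tick stages like clustering (~2 ms) or vertexing
    // (~200 us) must be averaged over many events to be meaningful.
    return 0;
}
```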

