
Slide 1: Performance of Statistical Learning Methods
Jens Zimmermann, Max-Planck-Institut für Physik, München / Forschungszentrum Jülich GmbH (ACAT 2005, Zeuthen)

Outline:
- Performance: examples from astrophysics
- Performance vs. control: the H1 neural network trigger
- Controlling statistical learning methods: overtraining, efficiencies, uncertainties
- Comparison of learning methods
- Artificial intelligence
- Higgs parity measurement at the ILC

Slide 2: Performance of Statistical Learning Methods: MAGIC
The significance and the number of excess events scale the uncertainties in the flux calculation.

Slide 3: Performance of Statistical Learning Methods: XEUS
Pileup vs. single photon: pileups not recognised by the classical algorithm "XMM" are recognised by the NN.

Slide 4: Control of Statistical Learning Methods
There are many successful applications of statistical learning methods, with great performance improvements compared to classical methods. This does not impress people who fear that statistical learning methods are not well under control. The first talk covered understanding and interpretation; this one covers control and correct evaluation.

Slide 5: The Neural Network Trigger in the H1 Experiment
Trigger scheme of H1 at the HERA ep collider (DESY): L1 (2.3 µs) reduces the 10 MHz input rate to 500 Hz, L2 (20 µs) to 50 Hz, and L4 (100 ms) to 10 Hz. "L2NN": each neural network on L2 verifies a specific L1 sub-trigger.

Slide 6: Triggering Deeply Virtual Compton Scattering
L1 sub-trigger 41 triggers DVCS by requiring a significant energy deposition in the SpaCal within a time window. The L2 neural network uses additional information: liquid-argon energies, SpaCal centre energies and z-vertex information. The 4 Hz trigger rate must be reduced to 0.8 Hz. Signal: DVCS (theory); background: upstream beam-gas interactions.

Slide 7: Determine the Correct Efficiency
Split the data into a 50% training set, a 25% test set and a 25% selection set. The signal output should peak at 1, the background output at 0. Tune the training parameters to avoid overtraining and to optimise performance.
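The 50/25/25 split described above can be sketched as follows; the function name and the reproducible seed are illustrative assumptions, not part of the original analysis:

```python
import random

def split_dataset(events, seed=0):
    """Split events into 50% training, 25% test, 25% selection sets.

    The fractions follow the slide; names and seed are illustrative.
    """
    events = list(events)
    random.Random(seed).shuffle(events)      # reproducible shuffle
    n = len(events)
    n_train = n // 2                         # 50% for training
    n_test = n // 4                          # 25% for testing
    train = events[:n_train]
    test = events[n_train:n_train + n_test]
    selection = events[n_train + n_test:]    # remaining 25% for selection
    return train, test, selection

train, test, selection = split_dataset(range(1000))
```

Keeping the selection set untouched during tuning is what makes the efficiency measured on it unbiased.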

Slide 8: Determine the Correct Efficiency [%]
Figure: efficiency on the training set vs. on the test set.

Slide 9: Check Statistical Uncertainties
Propagation of uncertainties yields the statistical uncertainty of the efficiency, e.g. 80% ± 4% for 80 of 100 events.
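The quoted 80% ± 4% follows from the standard binomial error on an efficiency; a minimal sketch (the function name is invented for illustration):

```python
import math

def efficiency_with_error(n_pass, n_total):
    """Binomial efficiency and its statistical uncertainty.

    sigma = sqrt(eps * (1 - eps) / N), the standard binomial error,
    reproduces the slide's example of 80% +- 4% for 80 of 100 events.
    """
    eps = n_pass / n_total
    sigma = math.sqrt(eps * (1.0 - eps) / n_total)
    return eps, sigma

eps, sigma = efficiency_with_error(80, 100)
# eps = 0.8, sigma ~ 0.04, i.e. 80% +- 4%
```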

Slide 10: Check Systematic Uncertainties
Only the systematic uncertainties of the inputs propagate. Assume x1 with an absolute error σ1, x2 with a relative error σ2 = 5%, and x3 with a relative error σ3 = 10%.
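One common way to propagate input systematics through a trained model is to shift each input by its error and add the resulting output shifts in quadrature; a sketch under that assumption, with an invented linear toy model standing in for the trained network:

```python
import math

def propagate_systematics(f, x, abs_errors):
    """Propagate input systematics through a trained model f.

    Each input x[i] is shifted up and down by its absolute error and the
    symmetrised output shifts are added in quadrature. f is any callable;
    the toy model below is an illustrative stand-in, not H1 code.
    """
    nominal = f(x)
    total_sq = 0.0
    for i, err in enumerate(abs_errors):
        up = list(x); up[i] += err
        down = list(x); down[i] -= err
        delta = 0.5 * abs(f(up) - f(down))   # symmetrised output shift
        total_sq += delta ** 2
    return nominal, math.sqrt(total_sq)

# invented inputs: x2 carries a 5% relative error, x3 a 10% one
x = [1.0, 2.0, 3.0]
errors = [0.1, 0.05 * x[1], 0.10 * x[2]]     # absolute, 5% rel., 10% rel.
toy = lambda v: 0.2 * v[0] + 0.3 * v[1] + 0.1 * v[2]
out, sys_err = propagate_systematics(toy, x, errors)
```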

Slide 11: Check Systematic Uncertainties
Example: the DVCS dataset.

Slide 12: Comparison of Hypotheses
Efficiencies at a fixed rejection of 80%: NN 96.5% vs. SVM 95.7%. Is the difference statistically significant? Build a 95% confidence interval! The uncertainty σ is the variation over different parts of the test set.
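A sketch of the comparison, assuming the interval is built as a Gaussian 95% band (±1.96 standard errors) around the mean per-part efficiency difference; the helper names and the example numbers are invented:

```python
import math

def efficiency(outputs, labels, cut):
    """Signal efficiency: fraction of signal events above the cut."""
    sig = [o for o, l in zip(outputs, labels) if l == 1]
    return sum(o > cut for o in sig) / len(sig)

def ci95_of_difference(diffs):
    """95% confidence interval for the mean efficiency difference.

    diffs are per-part differences (e.g. NN minus SVM efficiency on k
    disjoint parts of the test set); their spread gives the sigma from
    the slide. A Gaussian interval of +-1.96 standard errors is assumed.
    """
    k = len(diffs)
    mean = sum(diffs) / k
    var = sum((d - mean) ** 2 for d in diffs) / (k - 1)
    half = 1.96 * math.sqrt(var / k)
    return mean - half, mean + half

lo, hi = ci95_of_difference([0.010, 0.006, 0.009, 0.007])
significant = lo > 0.0   # interval excludes zero -> significant difference
```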

Slide 13: Comparison of Learning Methods
Cross-validation: divide the dataset into k parts and train k classifiers, using each part once as the test set. The uncertainty σ is the variation over the different trainings. Compare performances over different training sets! Efficiencies at a fixed rejection of 60%.
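The k-fold procedure can be sketched as below; `train_and_evaluate` is a hypothetical stand-in for the actual training and efficiency measurement:

```python
def k_fold_indices(n, k):
    """Yield (train, test) index lists for k-fold cross-validation:
    each of the k parts serves exactly once as the test set."""
    fold = n // k
    for i in range(k):
        test = list(range(i * fold, (i + 1) * fold if i < k - 1 else n))
        held_out = set(test)
        train = [j for j in range(n) if j not in held_out]
        yield train, test

def cross_validate(data, k, train_and_evaluate):
    """Train k classifiers and report mean efficiency and its spread;
    sigma is the variation over the different trainings."""
    effs = []
    for train_idx, test_idx in k_fold_indices(len(data), k):
        effs.append(train_and_evaluate([data[i] for i in train_idx],
                                       [data[i] for i in test_idx]))
    mean = sum(effs) / k
    sigma = (sum((e - mean) ** 2 for e in effs) / (k - 1)) ** 0.5
    return mean, sigma
```

Because every event is tested exactly once, the k efficiencies are measured on disjoint samples, which is what makes their spread a meaningful uncertainty.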

Slide 14: Artificial Intelligence
H1-L2NN, triggering charged current: two events with low NN output, a CC event with a cosmic overlay and a cosmic.

Slide 15: Artificial Intelligence
H1-L2NN, triggering J/ψ: background found in the J/ψ selection.

Slide 16: Higgs Parity Measurement at the ILC
Parity induces a preferred spin configuration in H/A → τ+τ−: anti-parallel for H, parallel for A. Classical approach: fit the angular distribution. The significance (the amplitude divided by its uncertainty) is 5.09, measured for 500 events and averaged over 600 pseudo-experiments.

Slide 17: Higgs Parity Measurement at the ILC
Statistical learning approach: direct discrimination, with one hypothesis trained towards 0 and the other towards 1. The significance (the difference of the measured means divided by its uncertainty) is 6.26, measured for 500 events and averaged over 600 pseudo-experiments.
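The pseudo-experiment averaging can be illustrated with a toy: the Gaussian discriminant shapes below are invented, and only the procedure (difference of the measured means divided by its uncertainty, averaged over many pseudo-experiments) follows the slide:

```python
import random
import statistics

def pseudo_experiment_significance(n_events=500, n_pseudo=600, seed=1):
    """Toy version of the slide's procedure: per pseudo-experiment, draw
    n_events classifier outputs for each parity hypothesis, take the
    difference of the measured means, and divide by its uncertainty.
    The Gaussian shapes (means 0.45 vs 0.55, width 0.2) are invented;
    the real analysis uses the trained discriminant outputs.
    """
    rng = random.Random(seed)
    significances = []
    for _ in range(n_pseudo):
        a = [rng.gauss(0.45, 0.2) for _ in range(n_events)]
        b = [rng.gauss(0.55, 0.2) for _ in range(n_events)]
        diff = statistics.mean(b) - statistics.mean(a)
        err = (statistics.variance(a) / n_events
               + statistics.variance(b) / n_events) ** 0.5
        significances.append(diff / err)
    # average significance over the pseudo-experiments
    return statistics.mean(significances)
```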

Slide 18: Conclusion
Statistical learning methods are successful in many applications in high-energy physics and astrophysics, with significant performance improvements compared to classical algorithms. Statistical learning methods are well under control:
- efficiencies can be determined
- uncertainties can be calculated
Comparison of learning methods reveals statistically significant differences. Statistical learning methods sometimes show more artificial intelligence than expected.
