Multivariate Analysis, TMVA, and Artificial Neural Networks
Matt Jachowski
Michigan REU Final Presentations, August 10, 2006


Slide 1: Multivariate Analysis, TMVA, and Artificial Neural Networks (Matt Jachowski)

Slide 2: Multivariate Analysis
- Techniques dedicated to the analysis of data with multiple variables
- An active field: many recently developed techniques rely on the computational power of modern computers

Slide 3: Multivariate Analysis and HEP
- Goal: classify events as signal or background
- A single event is described by several variables (energy, transverse momentum, etc.)
- Using all of the variables to classify the event is multivariate analysis!

Slide 4: Multivariate Analysis and HEP
- Rectangular cut optimization is common: keep an event only if each variable falls inside an allowed window
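The rectangular-cut idea can be sketched in a few lines. The variable names ("pt", "eta") and the cut windows below are illustrative assumptions, not values from the talk.

```python
# A minimal sketch of rectangular-cut classification; variable names and
# cut windows are made up for illustration.
def passes_cuts(event, cuts):
    """Keep an event only if every variable lies inside its allowed window."""
    return all(lo <= event[var] <= hi for var, (lo, hi) in cuts.items())

cuts = {"pt": (20.0, 1e9), "eta": (-2.5, 2.5)}
print(passes_cuts({"pt": 35.0, "eta": 1.1}, cuts))  # inside both windows
print(passes_cuts({"pt": 12.0, "eta": 0.3}, cuts))  # fails the pt cut
```

Optimizing the cuts then amounts to choosing the windows that best separate signal from background.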

Slide 5: Multivariate Analysis and HEP
- Likelihood estimator analysis is also common
- More sophisticated methods (neural networks, boosted decision trees) are less common, though their use is growing. Why?
  - They are difficult to implement
  - Physicists are skeptical of new methods

Slide 6: Toolkit for Multivariate Analysis (TMVA)
- ROOT-integrated software package with several MVA techniques
- Automatic training, testing, and evaluation of MVA methods
- Guidelines and documentation describe the methods for users: this isn't a black box!

Slide 7: Toolkit for Multivariate Analysis (TMVA)
- Easy to configure methods
- Easy to "plug in" HEP data
- Easy to compare different MVA methods

Slide 8: TMVA in Action

Slide 9: TMVA and Me
- TMVA started in October 2005
  - Still young
  - Very active group of developers
- My involvement
  - Decorrelation for the cuts method (mini project)
  - New artificial neural network implementation (main project)

Slide 10: Decorrelated Cuts Method
- Some MVA methods suffer if the data has linear correlations (e.g. likelihood estimator, cuts)
- Linear correlations can easily be transformed away
- I implemented this for the cuts method

Slide 11: Decorrelated Cuts Method
1. Find the square root C' of the covariance matrix (C = C'C')
2. Decorrelate the data by applying C'^-1
3. Apply cuts to the decorrelated data
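The three steps can be sketched on toy data as below. The data itself, and the choice of an eigendecomposition to form the symmetric matrix square root, are assumptions for illustration; after the transformation the sample covariance is the identity, so simple rectangular cuts act on uncorrelated variables.

```python
import numpy as np

# Toy sample with a strong linear correlation between its two variables.
rng = np.random.default_rng(0)
x = rng.normal(size=2000)
data = np.column_stack([x, x + 0.5 * rng.normal(size=2000)])

C = np.cov(data, rowvar=False)                   # covariance matrix C
vals, vecs = np.linalg.eigh(C)                   # C = U D U^T (C is symmetric)
C_sqrt = vecs @ np.diag(np.sqrt(vals)) @ vecs.T  # square root C', with C = C'C'
decorrelated = data @ np.linalg.inv(C_sqrt)      # x -> C'^-1 x, applied row-wise

# The decorrelated sample now has (approximately) unit covariance.
print(np.round(np.cov(decorrelated, rowvar=False), 2))
```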

Slide 12: Artificial Neural Networks (ANNs)
- A robust, non-linear MVA technique


Slide 14: Training an ANN
- The challenge is training the network
- Like a human brain, the network learns from seeing the data over and over again
- Technical details: ask me if you're really interested

Slide 15: MLP
- MLP (Multi-Layer Perceptron): my ANN implementation for TMVA
- MLP is TMVA's main ANN
- MLP serves as the base for any future ANN developments in TMVA

Slide 16: MLP Information and Statistics
- Implemented in C++
- Object-oriented
- 4,000+ lines of code
- 16 classes

Slide 17: Acknowledgements
- Joerg Stelzer
- Andreas Hoecker
- CERN
- University of Michigan
- Ford
- NSF

Slide 18: Questions?
(I have lots of technical slides in reserve that I would be glad to talk about)


Slide 20: Synapses and Neurons
[Diagram: inputs y_0, y_1, ..., y_n are weighted by synapse weights w_0j, w_1j, ..., w_nj and summed into the net input v_j of neuron j, which produces the output y_j]

Slide 21: Synapses and Neurons
[Diagram: the neuron's net input v_j is passed through an activation function to produce its output y_j]
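A single neuron from these diagrams can be sketched as below. Using tanh as the activation is an assumption for illustration; the slides only require some non-linear function.

```python
import math

def neuron_output(inputs, weights, activation=math.tanh):
    """Net input v_j = sum_i w_ij * y_i, then output y_j = activation(v_j)."""
    v = sum(w * y for w, y in zip(weights, inputs))
    return activation(v)

out = neuron_output([1.0, 0.5, -0.2], [0.3, -0.1, 0.8])
print(out)  # lies in (-1, 1) because tanh squashes the net input
```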

Slide 22: Universal Approximation Theorem
Every continuous function that maps intervals of real numbers to some output interval of real numbers can be approximated arbitrarily closely by a multi-layer perceptron with just one hidden layer (with non-linear activation functions):

y(x) = sum_j w_j * phi( sum_i w_ij * x_i + b_j )

where the x_i are the inputs, the w_ij are the weights between the input and hidden layer, the b_j are the biases, phi is the non-linear activation function, the w_j are the weights between the hidden and output layer, and y is the output.
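A minimal sketch of the one-hidden-layer network in the theorem; all weights, biases, and the choice of tanh as the activation are illustrative assumptions.

```python
import math

# y(x) = sum_j w_j * phi( sum_i w_ij * x_i + b_j )
def mlp_forward(x, hidden_weights, hidden_biases, output_weights,
                phi=math.tanh):
    hidden = [phi(sum(w_i * x_i for w_i, x_i in zip(w_row, x)) + b)
              for w_row, b in zip(hidden_weights, hidden_biases)]
    return sum(w_j * h_j for w_j, h_j in zip(output_weights, hidden))

y = mlp_forward(x=[0.5, -1.0],
                hidden_weights=[[1.0, -0.5], [0.3, 0.8]],
                hidden_biases=[0.1, -0.2],
                output_weights=[0.7, -0.4])
print(y)
```

Widening the hidden layer (more rows in `hidden_weights`) is what lets the approximation become arbitrarily close.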

Slide 23: Training an MLP
[Diagram: a training event supplies the inputs x_0, x_1, x_2, x_3 and a desired output; the network computes its own output y]

Slide 24: Training an MLP
- Adjust the weights to minimize the error (or an estimator that is some function of the error)
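One way to make "adjust weights to minimize error" concrete: gradient descent on the squared error for a single tanh neuron and a single training event. The inputs, target, and learning rate below are made-up numbers for illustration.

```python
import math

x, t, eta = [1.0, -0.5], 0.8, 0.5  # inputs, target, learning rate (made up)
w = [0.1, 0.1]                     # initial weights

def forward(w):
    return math.tanh(sum(wi * xi for wi, xi in zip(w, x)))

errors = []
for _ in range(20):
    y = forward(w)
    errors.append(0.5 * (y - t) ** 2)        # squared-error estimator
    # dE/dw_i = (y - t) * (1 - y^2) * x_i    (chain rule through tanh)
    grad = [(y - t) * (1 - y * y) * xi for xi in x]
    w = [wi - eta * gi for wi, gi in zip(w, grad)]

print(errors[0], errors[-1])  # the error shrinks as the weights are adjusted
```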

Slide 25: Back-Propagation Algorithm
- Make each correction in the direction of steepest descent
- Corrections are made to the output layer first, then propagated backwards
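A sketch of one back-propagation step for a tiny 2-2-1 tanh network: the output-layer correction is computed first, then propagated backwards to the hidden layer. Every number in it (weights, input, target, learning rate) is an illustrative assumption.

```python
import math

x, t, eta = [0.5, -1.0], 0.3, 0.1   # input, target, learning rate (made up)
W_hid = [[0.2, -0.3], [0.4, 0.1]]   # hidden-layer weights, one row per neuron
w_out = [0.5, -0.2]                 # hidden -> output weights

def forward():
    h = [math.tanh(sum(w * xi for w, xi in zip(row, x))) for row in W_hid]
    y = math.tanh(sum(w * hi for w, hi in zip(w_out, h)))
    return h, y

h, y = forward()
err_before = 0.5 * (y - t) ** 2

# 1. Output layer first: delta_out = dE/dv_out = (y - t) * (1 - y^2)
delta_out = (y - t) * (1 - y * y)
# 2. Propagate backwards: delta_j = delta_out * w_j * (1 - h_j^2)
delta_hid = [delta_out * w * (1 - hj * hj) for w, hj in zip(w_out, h)]
# 3. Steepest-descent corrections from the deltas
w_out = [w - eta * delta_out * hj for w, hj in zip(w_out, h)]
W_hid = [[w - eta * dj * xi for w, xi in zip(row, x)]
         for row, dj in zip(W_hid, delta_hid)]

_, y_new = forward()
err_after = 0.5 * (y_new - t) ** 2
print(err_before, err_after)  # a single step already lowers the error
```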