Exploring Artificial Neural Networks to Discover the Higgs at the LHC: Using Neural Networks for b-tagging. By Rohan Adur

Exploring Artificial Neural Networks to Discover the Higgs at the LHC. Outline: What are Neural Networks and how do they work? How can Neural Networks be used in b-jet tagging to discover the Higgs boson? What results have I obtained using Neural Networks to find b-jets?

Neural Networks - Introduction. Neural Networks simulate the behaviour of neurons in biological systems. They are made up of neurons connected by synapses, and they can solve non-linear problems by learning from experience rather than being explicitly programmed for a particular problem.

The Simple Perceptron. The Simple Perceptron is the simplest form of a Neural Network. It consists of one layer of input units and one layer of output units, connected by weighted synapses. [Diagram: input layer connected to the output layer by weighted synapses.]

The Simple Perceptron (continued). Training requires a training set for which the desired output is known. The synapse weights start at random values; a learning algorithm then adjusts the weights until the network gives the correct output, at which point the weights are frozen. The trained network can then be used on data it has never seen before.
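To make the training procedure concrete, below is a minimal C++ sketch of a simple perceptron and its learning rule; the AND problem, the learning rate and all names are illustrative choices, not taken from the original study.

#include <cstdio>
#include <cstdlib>

// Forward pass of a two-input perceptron: bias plus weighted sum of the
// inputs, followed by a step activation.
int predict(const double w[3], const double x[2]) {
    double sum = w[0] + w[1] * x[0] + w[2] * x[1];
    return sum > 0.0 ? 1 : 0;
}

int main() {
    // Training set for logical AND, a linearly separable problem.
    const double inputs[4][2] = {{0, 0}, {0, 1}, {1, 0}, {1, 1}};
    const int    targets[4]   = { 0,      0,      0,      1    };

    // Synapse weights start at small random values.
    double w[3];
    for (double& wi : w) wi = std::rand() / (double)RAND_MAX - 0.5;

    const double eta = 0.1;  // learning rate
    for (int epoch = 0; epoch < 100; ++epoch) {
        for (int i = 0; i < 4; ++i) {
            int error = targets[i] - predict(w, inputs[i]);
            // Perceptron rule: nudge each weight towards the desired output.
            w[0] += eta * error;
            w[1] += eta * error * inputs[i][0];
            w[2] += eta * error * inputs[i][1];
        }
    }

    // With the weights now frozen, the network classifies the patterns.
    for (int i = 0; i < 4; ++i)
        std::printf("%g AND %g -> %d\n", inputs[i][0], inputs[i][1],
                    predict(w, inputs[i]));
    return 0;
}

Because the initial weights are random, repeated training runs can end at different but equally valid weight sets.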

The Multilayer Perceptron. The main drawback of the simple perceptron is that it can only solve linearly-separable problems. Introducing a hidden layer between the input and output layers produces the Multilayer Perceptron, which is able to solve non-linear problems. [Diagram: input layer, hidden layer and output layer connected by synapses.]
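To illustrate the point about linear separability, the sketch below hard-codes a tiny multilayer network (two inputs, two hidden units, one output) that computes XOR, a problem no single-layer perceptron can solve; the weights are set by hand purely for demonstration.

#include <cstdio>

// Step activation, as in the simple perceptron above.
int step(double sum) { return sum > 0.0 ? 1 : 0; }

// A 2-2-1 network with hand-picked weights: hidden unit 1 acts as OR,
// hidden unit 2 as AND, and the output fires when OR is true but AND is not.
int xorNet(double x1, double x2) {
    int h1 = step(x1 + x2 - 0.5);  // OR
    int h2 = step(x1 + x2 - 1.5);  // AND
    return step(h1 - h2 - 0.5);    // OR and not AND = XOR
}

int main() {
    for (int a = 0; a <= 1; ++a)
        for (int b = 0; b <= 1; ++b)
            std::printf("%d XOR %d -> %d\n", a, b, xorNet(a, b));
    return 0;
}

In practice the hidden-layer weights are of course learned (for example by back-propagation) rather than set by hand.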

Finding the Higgs. The Higgs boson is expected to decay to b-quarks, which produce b-jets, so b-jet detection at the LHC is important for detecting the Higgs. With around 40 million events per second, b-taggers must reject light-quark jets efficiently.

b-tagging. B mesons travel a short distance (of order 1 mm) before decaying, so the tracks in b-jets originate from a secondary vertex displaced from the primary vertex. Several b-taggers exist: the IP3D tagger uses the Impact Parameter (IP) of the tracks in b-jets, while the SecVtx tagger reconstructs the secondary vertex and rejects jets which have a low probability of coming from such a vertex. [Diagram: primary vertex, displaced secondary vertex (B flight distance ~ 1 mm), and the impact parameter of a b-jet track.]
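As a rough cross-check of the millimetre scale quoted above, the typical flight distance of a B hadron follows from its proper lifetime (the lifetime value below is a standard figure, not taken from the slides):

    L = \beta \gamma \, c\tau_B,  with  c\tau_B \approx 0.5 mm,

so for a boost factor \beta\gamma of a few, L is of order a millimetre, and the more energetic B hadrons inside LHC jets can travel several millimetres, giving a measurably displaced secondary vertex.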

IP3D Tagger. The IP3D tag weight shows a good amount of separation between b-jets and light jets.

b-tagger performance

Neural Network for b-tagging. The current best tagger is a combination of the IP3D and SV1 tag weights. Using a Neural Network, can this tagger be combined with others to provide better separation?

The Multilayer Perceptron and b-tagging. The TMultiLayerPerceptron class is an implementation of a Neural Network built into the ROOT framework. It offers several learning methods; the best was found to be the default BFGS method. The network is trained with output = 1 for signal and output = 0 for background. The b-tagging weights were obtained using the ATHENA release, with data taken from Rome ttbar AOD files. Once extracted, the weights were used to train the Neural Network.
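The ROOT macro below is a minimal sketch of how such a network could be set up with TMultiLayerPerceptron; the input file, tree and branch names (btag_weights.root, taggerTree, pt, ip3d, sv1, secvtx, mass, isB) are hypothetical placeholders for the extracted tagger weights, whereas the class and the methods called are standard ROOT.

#include "TFile.h"
#include "TTree.h"
#include "TMultiLayerPerceptron.h"

void trainTagger() {
    TFile f("btag_weights.root");                 // hypothetical input file
    TTree* tree = (TTree*)f.Get("taggerTree");    // hypothetical tree name

    // Layout string: five input branches, 12 hidden units, one output.
    // The branch isB holds the desired output: 1 for b-jets, 0 for light jets.
    TMultiLayerPerceptron mlp("pt,ip3d,sv1,secvtx,mass:12:isB",
                              tree,
                              "Entry$%2==0",      // training sample
                              "Entry$%2==1");     // test sample

    // BFGS was found to perform best among the available learning methods.
    mlp.SetLearningMethod(TMultiLayerPerceptron::kBFGS);
    mlp.Train(200, "text,update=10");             // e.g. 200 epochs

    // The trained network can then be evaluated on jets it has never seen:
    //   Double_t params[5] = { /* pt, ip3d, sv1, secvtx, mass */ };
    //   Double_t nnOutput  = mlp.Evaluate(0, params);
}

The layout string reflects the architecture quoted in the results below: five inputs, 12 hidden units and one output unit.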

Results. Five inputs were used: transverse momentum, the IP3D tag, the SV1 tag, the SecVtx tag and the mass, feeding 12 hidden units and 1 output unit.

Results Contd.

Rejection rates / mistagging efficiency at fixed b-tagging efficiency:

Efficiency    IP3D+SV1    Neural Network
60%           1.14%       0.73%
50%           0.57%       0.26%

b-tagging efficiency at fixed light-jet rejection:

Rejection     IP3D+SV1    Neural Network
100           57%         62%

Discussion of Results. Using a Neural Network, b-taggers can be combined to provide up to double the purity at fixed efficiency. At fixed rejection rate, the Neural Network provides about 5% more signal than the IP3D+SV1 tagger alone. Neural Network performance is not exactly reproducible: because the synapse weights are initialised randomly, each training run produces a slightly different network.

Conclusions. Neural Networks are a powerful tool for b-jet classification. They can be used to significantly improve the b-tagging efficiency/rejection trade-off and could be useful in the search for the Higgs. Training a Neural Network on real data will be the next hurdle.