Naïve Bayes Classifier


Naïve Bayes Classifier Ke Chen

Outline
- Background
- Probability Basics
- Probabilistic Classification
- Naïve Bayes: Principle and Algorithms
- Example: Play Tennis
- Relevant Issues
- Summary

Background
There are three methods to establish a classifier:
a) Model a classification rule directly. Examples: k-NN, decision trees, perceptron, SVM.
b) Model the probability of class membership given the input data. Example: perceptron with the cross-entropy cost.
c) Build a probabilistic model of the data within each class. Examples: naïve Bayes, model-based classifiers.
Methods a) and b) are examples of discriminative classification; c) is an example of generative classification. Both b) and c) are examples of probabilistic classification.

Probability Basics
Prior, conditional and joint probability for random variables:
- Prior probability: $P(X)$
- Conditional probability: $P(X_1 \mid X_2)$, $P(X_2 \mid X_1)$
- Joint probability: $P(X_1, X_2)$
- Relationship: $P(X_1, X_2) = P(X_1 \mid X_2)\,P(X_2) = P(X_2 \mid X_1)\,P(X_1)$
- Independence: $P(X_1 \mid X_2) = P(X_1)$, $P(X_2 \mid X_1) = P(X_2)$, hence $P(X_1, X_2) = P(X_1)\,P(X_2)$
- Bayes rule: $P(C \mid X) = \dfrac{P(X \mid C)\,P(C)}{P(X)}$, i.e. posterior = likelihood × prior / evidence. The prior and likelihood must be learnt (estimated from data); the evidence is a normalization constant.

Probability Basics
Quiz: We have two six-sided dice. When they are rolled, the following events can occur: (A) die 1 lands on side "3", (B) die 2 lands on side "1", and (C) the two dice sum to eight. Answer the following questions:
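The quiz questions themselves were on the slide image and are not transcribed here; assuming they ask for quantities such as P(A), P(B), P(C), P(A|C) and the independence of A and B, a brute-force enumeration confirms the answers (a sketch; the helper names are illustrative):

```python
from itertools import product

# All 36 equally likely outcomes of rolling two six-sided dice
outcomes = list(product(range(1, 7), repeat=2))

A = {o for o in outcomes if o[0] == 3}      # die 1 lands on "3"
B = {o for o in outcomes if o[1] == 1}      # die 2 lands on "1"
C = {o for o in outcomes if sum(o) == 8}    # the two dice sum to eight

p = lambda event: len(event) / len(outcomes)
print(p(A), p(B), p(C))          # 1/6, 1/6, 5/36
print(len(A & C) / len(C))       # P(A|C) = 1/5: only (3,5) is in both A and C
print(p(A & B), p(A) * p(B))     # both 1/36, so A and B are independent
```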

Probabilistic Classification
Establishing a probabilistic model for classification
- Discriminative model: model the posterior $P(C \mid X)$ directly, where $C = c_1, \ldots, c_L$ and $X = (X_1, \ldots, X_n)$. Given an input $x$, a discriminative probabilistic classifier outputs the $L$ posterior probabilities $P(c_1 \mid x), \ldots, P(c_L \mid x)$ and predicts the most probable class.

Probabilistic Classification
Establishing a probabilistic model for classification (cont.)
- Generative model: model the class-conditional likelihood $P(X \mid C)$, one generative probabilistic model per class: $P(X \mid c_1)$ for class 1, $P(X \mid c_2)$ for class 2, ..., $P(X \mid c_L)$ for class L. Together with the priors $P(C)$, these are converted into posteriors via Bayes rule.

Probabilistic Classification
MAP classification rule (MAP: Maximum A Posteriori)
- Assign $x$ to $c^*$ if $P(C = c^* \mid X = x) > P(C = c \mid X = x)$ for all $c \ne c^*$, $c = c_1, \ldots, c_L$.
Generative classification with the MAP rule
- Apply Bayes rule to convert the likelihoods into posterior probabilities: $P(C \mid X = x) = \dfrac{P(X = x \mid C)\,P(C)}{P(X = x)} \propto P(X = x \mid C)\,P(C)$
- Then apply the MAP rule.

Naïve Bayes
Bayes classification:
  $P(C \mid X) \propto P(X \mid C)\,P(C) = P(X_1, \ldots, X_n \mid C)\,P(C)$
  Difficulty: learning the joint probability $P(X_1, \ldots, X_n \mid C)$ is infeasible in general.
Naïve Bayes classification:
  Assumption: all input features are conditionally independent given the class!
  $P(X_1, \ldots, X_n \mid C) = P(X_1 \mid C)\,P(X_2 \mid C) \cdots P(X_n \mid C)$
  MAP classification rule: assign $x = (x_1, \ldots, x_n)$ to $c^*$ if
  $[P(x_1 \mid c^*) \cdots P(x_n \mid c^*)]\,P(c^*) > [P(x_1 \mid c) \cdots P(x_n \mid c)]\,P(c)$ for all $c \ne c^*$, $c = c_1, \ldots, c_L$.
  For each class, the previous generative model thus decomposes into $n$ generative models, one per single input feature.

Naïve Bayes
Algorithm: Discrete-Valued Features
Learning Phase: Given a training set S,
  for each target value $c_i$ $(i = 1, \ldots, L)$, estimate $\hat{P}(C = c_i)$ from the examples in S;
  for every feature value $x_{jk}$ of each feature $X_j$ $(j = 1, \ldots, n;\ k = 1, \ldots, N_j)$, estimate $\hat{P}(X_j = x_{jk} \mid C = c_i)$ from the examples in S.
  Output: conditional probability tables; the table for $X_j$ has $N_j \times L$ elements.
Test Phase: Given an unknown instance $x' = (a'_1, \ldots, a'_n)$, look up the tables to assign the label $c^*$ to $x'$ if
  $[\hat{P}(a'_1 \mid c^*) \cdots \hat{P}(a'_n \mid c^*)]\,\hat{P}(c^*) > [\hat{P}(a'_1 \mid c) \cdots \hat{P}(a'_n \mid c)]\,\hat{P}(c)$ for all $c \ne c^*$, $c = c_1, \ldots, c_L$.
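The two phases above translate almost line for line into code. Below is a minimal sketch of a discrete naïve Bayes classifier, not the slides' implementation; the class and method names are illustrative:

```python
from collections import Counter, defaultdict

class DiscreteNaiveBayes:
    """Minimal naive Bayes for discrete features (no smoothing; see Relevant Issues)."""

    def fit(self, X, y):
        n = len(y)
        class_counts = Counter(y)
        # Learning phase, step 1: estimate the priors P(C = c)
        self.prior = {c: cnt / n for c, cnt in class_counts.items()}
        # Learning phase, step 2: conditional probability tables,
        # cpt[c][j][v] holds P(X_j = v | C = c)
        self.cpt = {c: defaultdict(Counter) for c in class_counts}
        for xs, c in zip(X, y):
            for j, v in enumerate(xs):
                self.cpt[c][j][v] += 1
        for c, n_c in class_counts.items():
            for j in self.cpt[c]:
                for v in self.cpt[c][j]:
                    self.cpt[c][j][v] /= n_c

    def predict(self, xs):
        # Test phase: MAP rule, maximize P(c) * prod_j P(x_j | c)
        def score(c):
            s = self.prior[c]
            for j, v in enumerate(xs):
                s *= self.cpt[c][j].get(v, 0.0)  # unseen value gives probability 0
            return s
        return max(self.prior, key=score)
```

In practice the product is computed as a sum of log-probabilities to avoid floating-point underflow when there are many features.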

Example
Example: Play Tennis (14 training examples with features Outlook, Temperature, Humidity and Wind, and class Play: 9 Yes, 5 No)

Example
Learning Phase

  Outlook      Play=Yes  Play=No
  Sunny          2/9       3/5
  Overcast       4/9       0/5
  Rain           3/9       2/5

  Temperature  Play=Yes  Play=No
  Hot            2/9       2/5
  Mild           4/9       2/5
  Cool           3/9       1/5

  Humidity     Play=Yes  Play=No
  High           3/9       4/5
  Normal         6/9       1/5

  Wind         Play=Yes  Play=No
  Strong         3/9       3/5
  Weak           6/9       2/5

  P(Play=Yes) = 9/14    P(Play=No) = 5/14

Example
Test Phase
Given a new instance, predict its label:
  x' = (Outlook=Sunny, Temperature=Cool, Humidity=High, Wind=Strong)
Look up the tables obtained in the learning phase:
  P(Outlook=Sunny|Play=Yes) = 2/9        P(Outlook=Sunny|Play=No) = 3/5
  P(Temperature=Cool|Play=Yes) = 3/9     P(Temperature=Cool|Play=No) = 1/5
  P(Humidity=High|Play=Yes) = 3/9        P(Humidity=High|Play=No) = 4/5
  P(Wind=Strong|Play=Yes) = 3/9          P(Wind=Strong|Play=No) = 3/5
  P(Play=Yes) = 9/14                     P(Play=No) = 5/14
Decision making with the MAP rule (unnormalized posteriors):
  P(Yes|x') ∝ [P(Sunny|Yes) P(Cool|Yes) P(High|Yes) P(Strong|Yes)] P(Play=Yes) = 0.0053
  P(No|x')  ∝ [P(Sunny|No) P(Cool|No) P(High|No) P(Strong|No)] P(Play=No) = 0.0206
Since P(Yes|x') < P(No|x'), we label x' as "No".

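The arithmetic of this decision is easy to check; these are exactly the products of the looked-up values:

```python
# Unnormalized posteriors for x' = (Sunny, Cool, High, Strong)
p_yes = (2/9) * (3/9) * (3/9) * (3/9) * (9/14)
p_no  = (3/5) * (1/5) * (4/5) * (3/5) * (5/14)
print(round(p_yes, 4), round(p_no, 4))   # 0.0053 0.0206 -> predict "No"
```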

Naïve Bayes
Algorithm: Continuous-Valued Features
- A continuous feature takes on infinitely many values, so probability tables no longer apply. The conditional probability is often modeled with the normal (Gaussian) distribution:
  $\hat{P}(X_j \mid C = c_i) = \dfrac{1}{\sqrt{2\pi}\,\sigma_{ji}} \exp\left(-\dfrac{(X_j - \mu_{ji})^2}{2\sigma_{ji}^2}\right)$
  where $\mu_{ji}$ is the mean and $\sigma_{ji}$ the standard deviation of feature $X_j$ over the training examples of class $c_i$.
- Learning Phase: for each class and each feature, estimate $\mu_{ji}$ and $\sigma_{ji}$. Output: $n \times L$ normal distributions and the priors $\hat{P}(C = c_i)$, $i = 1, \ldots, L$.
- Test Phase: Given an unknown instance $x' = (a'_1, \ldots, a'_n)$, instead of looking up tables, calculate the conditional probabilities with the normal distributions obtained in the learning phase, then apply the MAP rule to make a decision.
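A minimal sketch of the continuous test phase (function and parameter names are illustrative, not from the slides):

```python
import math

def gaussian_pdf(x, mu, sigma):
    """Normal density N(x; mu, sigma^2), used as P-hat(X_j = x | C = c_i)."""
    return math.exp(-(x - mu) ** 2 / (2 * sigma ** 2)) / (math.sqrt(2 * math.pi) * sigma)

def map_predict(x, priors, gaussians):
    """MAP rule for continuous features.

    priors    -- {class: P(C = c)}
    gaussians -- {class: [(mu_1, sigma_1), ..., (mu_n, sigma_n)]}, one per feature
    """
    def score(c):
        s = priors[c]
        for xj, (mu, sigma) in zip(x, gaussians[c]):
            s *= gaussian_pdf(xj, mu, sigma)
        return s
    return max(priors, key=score)
```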

Naïve Bayes
Example: Continuous-Valued Features
Temperature is naturally a continuous value.
  Yes: 25.2, 19.3, 18.5, 21.7, 20.1, 24.3, 22.8, 23.1, 19.8
  No: 27.3, 30.1, 17.4, 29.5, 15.1
Estimate the mean and variance for each class, e.g. the maximum-likelihood estimates:
  $\hat\mu = \dfrac{1}{N}\sum_{n=1}^{N} x_n$,  $\hat\sigma^2 = \dfrac{1}{N}\sum_{n=1}^{N} (x_n - \hat\mu)^2$
Learning Phase: output two Gaussian models for P(temp|C), one per class.
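Estimating the two Gaussians from the readings above; this sketch assumes the 1/N (maximum-likelihood) estimator, since the slide does not show which variance estimator it uses:

```python
from statistics import NormalDist, mean, pstdev

temp_yes = [25.2, 19.3, 18.5, 21.7, 20.1, 24.3, 22.8, 23.1, 19.8]
temp_no  = [27.3, 30.1, 17.4, 29.5, 15.1]

# Maximum-likelihood (1/N) estimates of mean and standard deviation
mu_yes, sigma_yes = mean(temp_yes), pstdev(temp_yes)   # ~21.64, ~2.22
mu_no,  sigma_no  = mean(temp_no),  pstdev(temp_no)    # ~23.88, ~6.34

# Class-conditional densities of a new reading, e.g. temp = 22.0
print(NormalDist(mu_yes, sigma_yes).pdf(22.0))
print(NormalDist(mu_no,  sigma_no).pdf(22.0))
```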

Relevant Issues
Violation of the independence assumption
- For many real-world tasks, $P(X_1, \ldots, X_n \mid C) \ne P(X_1 \mid C) \cdots P(X_n \mid C)$. Nevertheless, naïve Bayes works surprisingly well anyway!
Zero conditional probability problem
- If no training example of class $c_i$ contains the attribute value $X_j = a_{jk}$, then $\hat{P}(X_j = a_{jk} \mid C = c_i) = 0$. In this circumstance, during test the whole product $\hat{P}(x_1 \mid c_i) \cdots \hat{P}(a_{jk} \mid c_i) \cdots \hat{P}(x_n \mid c_i) = 0$, regardless of how probable the other feature values are.
- As a remedy, conditional probabilities are estimated with the m-estimate:
  $\hat{P}(X_j = a_{jk} \mid C = c_i) = \dfrac{n_c + m p}{n + m}$
  where $n$ is the number of training examples of class $c_i$, $n_c$ the number of those with $X_j = a_{jk}$, $p$ a prior estimate (usually uniform, $p = 1/N_j$ for $N_j$ possible values of $X_j$), and $m$ a weight (the "equivalent sample size").
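The remedy is a one-line change to the earlier learning-phase sketch; a minimal standalone version of the m-estimate follows (with the uniform prior p = 1/N_j, choosing m = N_j reduces it to Laplace add-one smoothing):

```python
def smoothed_conditional(n_c, n, n_values, m=1.0):
    """m-estimate of P(X_j = a_jk | C = c_i).

    n_c      -- count of class-c_i examples with X_j = a_jk
    n        -- total count of class-c_i examples
    n_values -- number of distinct values N_j of feature X_j
    m        -- equivalent sample size (m = N_j gives Laplace smoothing)
    """
    p = 1.0 / n_values          # uniform prior over the feature's values
    return (n_c + m * p) / (n + m)

print(smoothed_conditional(0, 9, 3))  # > 0 even for a value never seen in class c_i
```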

Summary
- Naïve Bayes rests on the conditional independence assumption.
- Training is very easy and fast: it only requires considering each attribute in each class separately.
- Testing is straightforward: just look up tables, or calculate conditional probabilities with the estimated distributions.
- Naïve Bayes is a popular generative model:
  - Performance competitive with most state-of-the-art classifiers, even when the independence assumption is violated
  - Many successful applications, e.g., spam mail filtering
  - A good candidate as a base learner in ensemble learning
- Apart from classification, naïve Bayes can do more...