Instance Based Learning (Adapted from various sources)

CS478 - Machine Learning, Fall 2004

Introduction
Instance-based learning is often termed lazy learning, as there is typically no "transformation" of training instances into more general "statements". Instead, the training data is simply stored; when a new query instance is encountered, a set of similar, related instances is retrieved from memory and used to classify it. Hence, instance-based learners never form an explicit general hypothesis of the target function: they simply compute the classification of each new query instance as needed.

k-NN Approach
The simplest and most widely used instance-based learning algorithm is the k-nearest-neighbor (k-NN) algorithm. k-NN assumes that all instances are points in some n-dimensional space and defines neighbors in terms of a distance measure (usually Euclidean distance in R^n); k is the number of neighbors considered.

Graphic Depiction (Voronoi diagram; the figure is not included in this transcript)
Properties:
- Every point inside a sample's Voronoi cell has that sample as its nearest training sample.
- For any sample, its nearest neighboring sample is the one across the closest Voronoi cell edge.

Basic Idea
Using the second property above, the k-NN classification rule assigns to a test sample the majority class label of its k nearest training samples. In practice, k is usually chosen to be odd so as to avoid ties. The k = 1 rule is generally called the nearest-neighbor classification rule.

k-NN Algorithm
Training: for each training instance t = (x, f(x)), add t to the set Tr_instances.
Classification: given a query instance q to be classified, let x1, …, xk be the k training instances in Tr_instances nearest to q, and return
$\hat{f}(q) \leftarrow \arg\max_{v \in V} \sum_{i=1}^{k} \delta(v, f(x_i))$
where V is the finite set of target class values and $\delta(a,b) = 1$ if a = b and 0 otherwise (Kronecker delta). Intuitively, the k-NN algorithm assigns to each new query instance the majority class among its k nearest neighbors.
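A minimal runnable sketch of this procedure, assuming NumPy is available (the data, function name, and choice of k are illustrative, not from the original slides):

```python
import numpy as np
from collections import Counter

def knn_classify(query, X_train, y_train, k=3):
    """Classify `query` by majority vote among its k nearest training instances."""
    # Euclidean distance from the query to every stored training instance
    dists = np.linalg.norm(X_train - query, axis=1)
    # Indices of the k closest instances
    nearest = np.argsort(dists)[:k]
    # Majority class among the neighbors (the sum of Kronecker deltas per class)
    votes = Counter(y_train[i] for i in nearest)
    return votes.most_common(1)[0][0]

# Illustrative data: two + instances near (1, 1), two - instances near (4, 4)
X = np.array([[1.0, 1.0], [1.2, 0.8], [4.0, 4.2], [3.8, 4.0]])
y = np.array(['+', '+', '-', '-'])
print(knn_classify(np.array([1.1, 1.0]), X, y, k=3))   # -> '+'
```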

Simple Illustration (figure omitted in this transcript)
In the illustration, the query q is classified + under 1-NN but – under 5-NN.

Distance-weighted k-NN
Replace the unweighted vote by a distance-weighted one:
$\hat{f}(q) \leftarrow \arg\max_{v \in V} \sum_{i=1}^{k} w_i \, \delta(v, f(x_i))$, with, e.g., $w_i = \frac{1}{d(q, x_i)^{2}}$
so that closer neighbors contribute more to the vote.
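Extending the sketch above, a hedged version of the distance-weighted rule (the 1/d^2 weighting is an assumption, and an exact match is short-circuited to its own class):

```python
import numpy as np

def weighted_knn_classify(query, X_train, y_train, k=3):
    """Distance-weighted k-NN: each neighbor votes with weight 1 / d(q, x_i)^2."""
    dists = np.linalg.norm(X_train - query, axis=1)
    nearest = np.argsort(dists)[:k]
    votes = {}
    for i in nearest:
        if dists[i] == 0.0:                      # exact match: return its class directly
            return y_train[i]
        votes[y_train[i]] = votes.get(y_train[i], 0.0) + 1.0 / dists[i] ** 2
    return max(votes, key=votes.get)
```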

Scale Effects
Different features may have very different measurement scales, e.g., patient weight in kg (range [50, 200]) vs. blood protein values in ng/dL (range [-3, 3]). Consequence: patient weight will have a much greater influence on the distance between samples, which may bias the performance of the classifier.

Standardization
Transform raw feature values into z-scores:
$z_{ij} = \frac{x_{ij} - \mu_j}{\sigma_j}$
where $x_{ij}$ is the value of the jth feature for the ith sample, $\mu_j$ is the average of feature j over all input samples, and $\sigma_j$ is its standard deviation. The range and scale of the z-scores should then be similar across features (provided the distributions of the raw feature values are alike).
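A small sketch of this transformation with NumPy (function and variable names are assumed for illustration); note that the query is standardized with the training set's statistics:

```python
import numpy as np

def standardize(X_train, X_query):
    """Map raw feature values to z-scores using the training set's statistics."""
    mu = X_train.mean(axis=0)       # average of feature j over all training samples
    sigma = X_train.std(axis=0)     # standard deviation of feature j
    sigma[sigma == 0] = 1.0         # guard against constant features
    return (X_train - mu) / sigma, (X_query - mu) / sigma
```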

Distance Metrics
The general Minkowski ($\ell_p$-norm) distance is
$d(x, y) = \left( \sum_{j=1}^{n} |x_j - y_j|^{p} \right)^{1/p}$
with p = 1 giving the Manhattan (city-block) distance and p = 2 the Euclidean distance.
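A one-function sketch of this family of metrics, again assuming NumPy (the function name is illustrative):

```python
import numpy as np

def minkowski(x, y, p=2):
    """Minkowski (l_p) distance: p=1 is Manhattan, p=2 is Euclidean."""
    return float(np.sum(np.abs(x - y) ** p) ** (1.0 / p))
```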

Issues with Distance Metrics
Most distance measures were designed for linear/real-valued attributes. Two important questions arise in the context of machine learning: how best to handle nominal attributes, and what to do when attribute types are mixed.

Distance for Nominal Attributes
A single worked example (one plausible reading of the slide's fragment): for the attribute Shape with values Round and Square, suppose 6 of 10 Round instances and 3 of 5 Square instances belong to the positive class. Then P(+|Round) = P(+|Square) = 0.6, so a measure such as the Value Difference Metric (VDM), which treats two values as close when they induce similar class distributions, would place Round and Square at distance 0 (see the sketch below).
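A hedged sketch of the VDM for a single nominal attribute, reproducing the Round/Square reading above (function name and the q exponent are assumptions):

```python
from collections import Counter, defaultdict

def vdm(values, labels, v1, v2, q=1):
    """Value Difference Metric for one nominal attribute: two values are close
    when they induce similar class distributions over the training data."""
    counts = defaultdict(Counter)                # counts[value][class] = frequency
    for v, c in zip(values, labels):
        counts[v][c] += 1
    n1, n2 = sum(counts[v1].values()), sum(counts[v2].values())
    return sum(abs(counts[v1][c] / n1 - counts[v2][c] / n2) ** q
               for c in set(labels))

# The Round/Square reading: 6 of 10 Round and 3 of 5 Square instances are positive
shapes = ['Round'] * 10 + ['Square'] * 5
cls = ['+'] * 6 + ['-'] * 4 + ['+'] * 3 + ['-'] * 2
print(vdm(shapes, cls, 'Round', 'Square'))       # -> 0.0 (identical class distributions)
```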

Distance for Heterogeneous Data
Wilson, D. R. and Martinez, T. R., "Improved Heterogeneous Distance Functions", Journal of Artificial Intelligence Research, vol. 6, pp. 1-34, 1997. Their HVDM metric combines a normalized VDM for nominal attributes with a normalized numeric difference for continuous attributes.

Some Remarks
- k-NN works well on many practical problems and is fairly noise tolerant (depending on the value of k).
- k-NN is subject to the curse of dimensionality (i.e., it is hurt by the presence of many irrelevant attributes).
- k-NN needs an adequate distance measure.
- k-NN relies on efficient indexing of the stored instances, since all work is deferred to query time.
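As an illustration of the last point, a library implementation such as scikit-learn's KNeighborsClassifier can combine distance weighting with a k-d tree index for fast neighbor lookup (the toy data below is assumed, not from the slides):

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

# Toy training data (illustrative only)
X_train = np.array([[1.0, 1.0], [1.2, 0.8], [4.0, 4.2], [3.8, 4.0], [0.9, 1.1]])
y_train = np.array(['+', '+', '-', '-', '+'])

# Distance-weighted 5-NN backed by a k-d tree index; fit() merely stores and
# indexes the training data, reflecting the lazy-learning character of k-NN.
clf = KNeighborsClassifier(n_neighbors=5, weights='distance', algorithm='kd_tree')
clf.fit(X_train, y_train)
print(clf.predict(np.array([[1.1, 1.0]])))       # -> ['+']
```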