The blue and green colors are actually the same.

Machine Learning: Overview Computer Vision James Hays, Brown 09/19/11 Slides: Isabelle Guyon, Erik Sudderth, Mark Johnson, Derek Hoiem Photo: CMU Machine Learning Department protests G20

Machine Learning: Overview. The core of ML: making predictions or decisions from data. This overview will not go into depth about the statistical underpinnings of learning methods; we're looking at ML as a tool. Take CSCI 1950-F: Introduction to Machine Learning to learn more about ML.

Impact of Machine Learning Machine Learning is arguably the greatest export from computing to other scientific fields.

Machine Learning Applications Slide: Isabelle Guyon

Image Categorization (training): Training Images + Training Labels → Image Features → Classifier Training → Trained Classifier. Slide: Derek Hoiem

Image Categorization (training and testing): Training Images + Training Labels → Image Features → Classifier Training → Trained Classifier. Testing: Test Image → Image Features → Trained Classifier → Prediction ("Outdoor"). Slide: Derek Hoiem
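The two-phase pipeline on this slide can be sketched in a few lines of numpy. This is a toy stand-in, not the slides' code: the features, labels, and the nearest-centroid rule are all illustrative choices, and a real system would extract the features from images.

```python
import numpy as np

def train(features, labels):
    # "Classifier Training": store one mean feature vector (centroid) per class.
    classes = np.unique(labels)
    return {c: features[labels == c].mean(axis=0) for c in classes}

def predict(centroids, feature):
    # "Testing": assign the test image's features to the nearest class centroid.
    return min(centroids, key=lambda c: np.linalg.norm(feature - centroids[c]))

# Hypothetical 3-D features for "outdoor" vs. "indoor" training images.
X = np.array([[0.9, 0.8, 0.1], [0.8, 0.9, 0.2],
              [0.1, 0.2, 0.9], [0.2, 0.1, 0.8]])
y = np.array(["outdoor", "outdoor", "indoor", "indoor"])

model = train(X, y)                                   # the "Trained Classifier"
print(predict(model, np.array([0.85, 0.75, 0.15])))   # → outdoor
```

The point of the sketch is the separation of phases: `train` only ever sees training data, and `predict` only ever sees the trained model plus a test feature.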

Claim: The decision to use machine learning is more important than the choice of a particular learning method. If you hear somebody talking up a specific learning mechanism, be wary (e.g. the YouTube comment "Oooh, we could plug this into a neural network and blah blah blah").

Example: Boundary Detection Is this a boundary?

Image features: Training Images + Training Labels → Image Features → Classifier Training → Trained Classifier. Slide: Derek Hoiem

General Principles of Representation. Coverage – ensure that all relevant information is captured. Concision – minimize the number of features without sacrificing coverage. Directness – ideal features are independently useful for prediction. [Figure: image intensity] Slide: Derek Hoiem

Image representations Templates – Intensity, gradients, etc. Histograms – Color, texture, SIFT descriptors, etc. Slide: Derek Hoiem
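As a concrete instance of the histogram representations this slide lists, here is a minimal numpy sketch of an intensity histogram; the 8-bin choice and the synthetic "images" are illustrative assumptions, not from the slides.

```python
import numpy as np

def intensity_histogram(image, bins=8):
    # Count grayscale values in [0, 256) into fixed bins, then normalize
    # so images of different sizes yield comparable descriptors.
    hist, _ = np.histogram(image, bins=bins, range=(0, 256))
    return hist / hist.sum()

rng = np.random.default_rng(0)
dark = rng.integers(0, 64, size=(16, 16))      # mostly dark pixels
light = rng.integers(192, 256, size=(16, 16))  # mostly bright pixels

h_dark, h_light = intensity_histogram(dark), intensity_histogram(light)
# Mass lands in the low bins for the dark image and the high bins for
# the light one, so the histograms separate the two image types.
print(h_dark.argmax(), h_light.argmax())
```

Color, texture, or SIFT-descriptor histograms follow the same pattern: quantize local measurements into bins and count.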

Classifiers: Training Images + Training Labels → Image Features → Classifier Training → Trained Classifier. Slide: Derek Hoiem

Learning a classifier: Given some set of features with corresponding labels, learn a function to predict the labels from the features. [Figure: "x" and "o" training examples scattered in a 2-D feature space (x1, x2)] Slide: Derek Hoiem
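The figure's setup can be reproduced with a tiny learner. The sketch below uses a perceptron on made-up 2-D points, one illustrative choice among the many classifiers covered later; the coordinates and labels are hypothetical.

```python
import numpy as np

# Toy 2-D feature space (x1, x2): "x" examples labeled +1, "o" labeled -1.
X = np.array([[2.0, 3.0], [3.0, 3.5], [2.5, 4.0],
              [1.0, 1.0], [1.5, 0.5], [0.5, 1.2]])
y = np.array([1, 1, 1, -1, -1, -1])

# Perceptron: repeatedly nudge a linear boundary toward misclassified points.
w, b = np.zeros(2), 0.0
for _ in range(100):
    for xi, yi in zip(X, y):
        if yi * (w @ xi + b) <= 0:      # misclassified (or on the boundary)
            w, b = w + yi * xi, b + yi  # move the boundary toward xi

preds = np.sign(X @ w + b)
print((preds == y).all())  # → True: the toy data is linearly separable
```

The learned `(w, b)` is exactly the "function to predict the labels from the features" the slide asks for, here restricted to linear decision boundaries.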

One way to think about it… Training labels dictate that two examples are the same or different, in some sense. Features and distance measures define visual similarity. Classifiers try to learn weights or parameters for features and distance measures so that visual similarity predicts label similarity. Slide: Derek Hoiem

Slide: Erik Sudderth

Dimensionality Reduction: PCA, ICA, LLE, Isomap. PCA is the most important technique to know. It takes advantage of correlations among data dimensions to produce the best possible lower-dimensional representation, as measured by reconstruction error. PCA should be used for dimensionality reduction, not for discovering patterns or making predictions; don't try to assign semantic meaning to the bases.
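The reconstruction-error view of PCA can be checked directly. A minimal numpy sketch (the synthetic data and noise level are assumptions for illustration): generate correlated 3-D points that lie near a line, keep only the first principal component, and measure how little is lost.

```python
import numpy as np

rng = np.random.default_rng(0)
t = rng.normal(size=(200, 1))
# 3-D data whose dimensions are strongly correlated (all multiples of t),
# plus a little noise, so it lies near a 1-D subspace.
X = np.hstack([t, 2 * t, -t]) + 0.01 * rng.normal(size=(200, 3))

Xc = X - X.mean(axis=0)                  # center the data
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
pc1 = Vt[0]                              # first principal direction

Z = Xc @ pc1[:, None]                    # 1-D codes (the reduced representation)
X_hat = Z @ pc1[None, :]                 # reconstruct 3-D points from 1 number each
err = np.mean((Xc - X_hat) ** 2)
print(err < 1e-3)  # → True: one component captures almost everything
```

Note that `pc1` here is just the direction of maximum variance; per the slide's warning, nothing about it carries semantic meaning.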

Many classifiers to choose from: SVM, neural networks, naïve Bayes, Bayesian networks, logistic regression, randomized forests, boosted decision trees, k-nearest neighbor, RBMs, etc. Which is the best one? Slide: Derek Hoiem

Next Two Lectures: Friday we'll talk about clustering methods (k-means, mean shift) and their common usage in computer vision – building "bag of words" representations inspired by the NLP community. We'll be using these models for projects 2 and 3. Monday we'll focus specifically on classification methods, e.g. nearest neighbor, naïve Bayes, decision trees, linear SVM, kernel methods. We'll be using these for projects 3 and 4 (and optionally 2).
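The k-means-based bag-of-words idea previewed here can be sketched end to end in numpy. Everything below is illustrative: the 2-D "descriptors" stand in for real local descriptors (e.g. SIFT), and the tiny vocabulary size is an assumption.

```python
import numpy as np

rng = np.random.default_rng(1)

def kmeans(X, k, iters=20):
    # Plain k-means: alternate nearest-center assignment and mean update.
    centers = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        labels = np.argmin(((X[:, None] - centers) ** 2).sum(-1), axis=1)
        centers = np.array([X[labels == j].mean(0) if (labels == j).any()
                            else centers[j]        # keep a center if its cluster empties
                            for j in range(k)])
    return centers

def bag_of_words(descriptors, vocab):
    # Quantize each descriptor to its nearest "visual word", then count.
    words = np.argmin(((descriptors[:, None] - vocab) ** 2).sum(-1), axis=1)
    return np.bincount(words, minlength=len(vocab)) / len(words)

# Three well-separated descriptor clusters stand in for visual words.
D = np.vstack([rng.normal(m, 0.1, (50, 2)) for m in ([0, 0], [5, 5], [0, 5])])
vocab = kmeans(D, k=3)            # the learned visual vocabulary
hist = bag_of_words(D, vocab)     # the image's bag-of-words descriptor
print(hist.sum())                 # histogram is normalized to sum to 1
```

The resulting fixed-length histogram is what gets fed to the classifiers from Monday's lecture, which is why the two lectures pair up for projects 2–4.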