Olivier Duchenne , Armand Joulin , Jean Ponce Willow Lab , ICCV2011.

Many applications:
1. Object recognition
2. Text categorization
3. Time-series prediction
4. Gene expression profile analysis

Given a set of data (x_1, y_1), (x_2, y_2), ..., (x_n, y_n), the kernel method maps it into a potentially much higher-dimensional feature space F.

For a given learning problem, one now considers the same algorithm in F instead of R^N; one works with the mapped sample S = {(φ(x_1), y_1), ..., (φ(x_n), y_n)}. The kernel method seeks a pattern among the data in the feature space.

Idea: a nonlinear problem in a lower-dimensional space can be solved by a linear method in a higher-dimensional space. Example: points inside vs. outside a circle in R^2 cannot be separated by a line, but become linearly separable after a quadratic feature map.
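As an illustrative sketch (my own example, not from the slides): with the quadratic feature map φ(x) = (x_1², x_2², √2·x_1·x_2), points inside and outside the unit circle are separated by the linear rule w·φ(x) + b, with w = (1, 1, 0) and b = −1.

```python
import numpy as np

# Points inside (class +) vs. outside (class -) the unit circle are not
# linearly separable in R^2. After the quadratic feature map
# phi(x) = (x1^2, x2^2, sqrt(2)*x1*x2), the hyperplane w = (1, 1, 0),
# b = -1 separates them, since w . phi(x) + b = x1^2 + x2^2 - 1.
def phi(x):
    x1, x2 = x
    return np.array([x1**2, x2**2, np.sqrt(2) * x1 * x2])

inside = [np.array(p) for p in [(0.1, 0.2), (-0.3, 0.4), (0.5, -0.5)]]
outside = [np.array(p) for p in [(1.5, 0.0), (-1.0, 1.0), (0.0, -2.0)]]

w, b = np.array([1.0, 1.0, 0.0]), -1.0
assert all(w @ phi(p) + b < 0 for p in inside)   # inside: negative side
assert all(w @ phi(p) + b > 0 for p in outside)  # outside: positive side
```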

【Kernel function】 A kernel function is a function k that for all x, z ∈ X satisfies k(x, z) = ⟨φ(x), φ(z)⟩, where φ is a mapping from X to an (inner product) feature space F.

The computation of a scalar product between two feature-space vectors, ⟨φ(x), φ(z)⟩, can be readily reformulated in terms of a kernel function k: ⟨φ(x), φ(z)⟩ = k(x, z).
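A minimal sketch of this reformulation (the quadratic map is my example choice): for φ(x) = (x_1², x_2², √2·x_1·x_2), the feature-space inner product equals the kernel k(x, z) = (x·z)², so it can be evaluated without ever constructing the feature vectors.

```python
import numpy as np

def phi(x):
    """Explicit quadratic feature map R^2 -> R^3."""
    x1, x2 = x
    return np.array([x1**2, x2**2, np.sqrt(2) * x1 * x2])

def k(x, z):
    """Quadratic kernel: the same inner product, computed in input space."""
    return np.dot(x, z) ** 2

x = np.array([1.0, 2.0])
z = np.array([3.0, -1.0])

# <phi(x), phi(z)> and k(x, z) agree, but k never builds the feature vectors.
assert np.isclose(np.dot(phi(x), phi(z)), k(x, z))
```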

Is the mapping φ necessary? No: it need not be computed explicitly.
What kind of k can be used? Any symmetric, positive semi-definite kernel (its kernel matrix is positive semi-definite).
Given a feature mapping φ, can we compute the inner product in feature space? Yes.
Given a kernel function k, does a corresponding feature mapping φ exist? Yes [Mercer's theorem].
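The positive semi-definiteness condition can be checked numerically; the sketch below (my own illustration) verifies that a Gaussian kernel matrix is symmetric with non-negative eigenvalues.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(20, 5))          # 20 training points in R^5

# Gaussian (RBF) kernel matrix: K_ij = exp(-||x_i - x_j||^2 / (2*sigma^2))
sigma = 1.0
sq_dists = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
K = np.exp(-sq_dists / (2 * sigma**2))

assert np.allclose(K, K.T)                    # symmetric
assert np.linalg.eigvalsh(K).min() >= -1e-10  # positive semi-definite
```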

Linear kernel: k(x, z) = ⟨x, z⟩
Polynomial kernel: k(x, z) = (⟨x, z⟩ + c)^d
RBF (Gaussian) kernel: k(x, z) = exp(−‖x − z‖² / (2σ²))
Inverse multiquadric kernel: k(x, z) = 1 / √(‖x − z‖² + c²)
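These four kernels can be written out directly (a sketch; the parameter values c, d, σ are arbitrary example choices):

```python
import numpy as np

def linear(x, z):
    return np.dot(x, z)

def polynomial(x, z, c=1.0, d=2):
    return (np.dot(x, z) + c) ** d

def rbf(x, z, sigma=1.0):
    return np.exp(-np.sum((x - z) ** 2) / (2 * sigma**2))

def inverse_multiquadric(x, z, c=1.0):
    return 1.0 / np.sqrt(np.sum((x - z) ** 2) + c**2)

x, z = np.array([1.0, 2.0]), np.array([3.0, 4.0])
assert linear(x, z) == 11.0               # 1*3 + 2*4
assert polynomial(x, z) == 144.0          # (11 + 1)^2
assert rbf(x, x) == 1.0                   # zero distance -> exp(0)
assert inverse_multiquadric(x, x) == 1.0  # 1/sqrt(0 + 1)
```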

Kernel matrix: consider the problem of finding a real-valued linear function g(x) = ⟨w, x⟩ that best interpolates a given training set S = {(x_1, y_1), (x_2, y_2), ..., (x_l, y_l)} (least squares).

Dual form: g(x) = Σ_{i=1..l} α_i k(x_i, x) with α = (K + λI_l)^{-1} y, where K is the kernel matrix with entries K_ij = k(x_i, x_j) and λ ≥ 0 is a regularization (ridge) parameter (λ = 0 gives plain least squares).
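A compact sketch of the dual least-squares solution (the RBF kernel and the value of λ are my example choices): solve for α from the kernel matrix, then predict with g(x) = Σ α_i k(x_i, x).

```python
import numpy as np

def rbf_kernel_matrix(X, sigma=1.0):
    """K_ij = exp(-||x_i - x_j||^2 / (2*sigma^2))."""
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    return np.exp(-sq / (2 * sigma**2))

X = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])  # training inputs
y = np.array([1.0, -1.0, 0.5])                      # training targets

lam = 1e-8                                          # tiny ridge term
K = rbf_kernel_matrix(X)
alpha = np.linalg.solve(K + lam * np.eye(len(X)), y)  # dual coefficients

# On the training points, g(x_j) = sum_i alpha_i K_ij = (K @ alpha)_j,
# which nearly interpolates y when lambda is tiny and K is invertible.
assert np.allclose(K @ alpha, y, atol=1e-5)
```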

[Example image classes: CAT, DINOSAUR, PANDA]

Feature correspondences can be used to construct an image comparison kernel that is appropriate for SVM-based classification, and often outperforms bags of features (BOFs).
Image representations that enforce some degree of spatial consistency usually perform better in image classification tasks than pure bags of features that discard all spatial information.

We need to design a good image similarity measure: when should two images be considered similar (≈)?

Graph-matching: sparse features, NN classifier, slow, uses pair-wise information, lower performance.
Method in this paper: dense features, SVM classifier, fast enough, uses pair-wise information, state-of-the-art performance.

An image I = a graph G = nodes + edges. A node n at position (x_n, y_n) represents a region of I. Each region is represented by an image feature vector F_n, e.g. SIFT.
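A hypothetical sketch of this representation (the Node/Graph names and fields are my own, not from the paper): each node stores a region's position and its descriptor, and edges link neighboring regions.

```python
from dataclasses import dataclass, field
import numpy as np

@dataclass
class Node:
    x: int                 # region position in the image
    y: int
    feature: np.ndarray    # appearance descriptor, e.g. a 128-d SIFT vector

@dataclass
class Graph:
    nodes: list
    edges: list = field(default_factory=list)  # pairs of node indices

# Toy image graph: a 2x2 grid of regions connected to their neighbors.
nodes = [Node(x, y, np.zeros(128)) for y in (0, 1) for x in (0, 1)]
edges = [(0, 1), (0, 2), (1, 3), (2, 3)]       # grid adjacency
g = Graph(nodes, edges)

assert len(g.nodes) == 4 and len(g.edges) == 4
```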

Matching two images is realized by maximizing an energy function that sums node terms, scoring the appearance similarity of matched regions, and edge terms, scoring the geometric consistency of matched edges.