Generative Models for Image Analysis Stuart Geman (with E. Borenstein, L.-B. Chang, W. Zhang)


I. Bayesian (generative) image models
II. Feature distributions and data distributions
III. Conditional modeling
IV. Sampling and the choice of null distribution
V. Other applications of conditional modeling

I. Bayesian (generative) image models. Prior, conditional likelihood, posterior; the focus here is on the conditional likelihood.
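In symbols (my notation, not the slides'), with x a scene interpretation and y the observed image, the decomposition on this slide is the usual Bayesian one:

    \[
      \underbrace{p(x \mid y)}_{\text{posterior}}
      \;\propto\;
      \underbrace{p(y \mid x)}_{\text{conditional likelihood}}
      \,\underbrace{p(x)}_{\text{prior}} .
    \]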

II. Feature distributions and data distributions: model an image patch through a feature model.

Example: detection and recognition of eyes, modeled at the level of an image patch.
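As a concrete illustration of a patch feature, the sketch below computes the normalized correlation between a patch and a template. The function name and the choice of normalized correlation are mine, offered as a plausible stand-in for the feature on the slide rather than its exact definition.

    import numpy as np

    def correlation_feature(patch, template):
        """Normalized correlation s(y) between a patch y and a template T.

        Both are mean-subtracted and scaled to unit norm, so s(y) lies in [-1, 1].
        """
        y = patch.astype(float).ravel()
        t = template.astype(float).ravel()
        y = y - y.mean()
        t = t - t.mean()
        denom = np.linalg.norm(y) * np.linalg.norm(t)
        return float(y @ t / denom) if denom > 0 else 0.0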

The first model is fine for estimating λ, but not for estimating the template T. We would like to use maximum likelihood, but what is the likelihood?

III. Conditional modeling

Conditional modeling: a perturbation of the null distribution
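One way to write such a perturbation (my notation; this is a sketch consistent with the slide's description rather than its literal content): let p_0 be the null, or background, distribution of a patch y, let s(y) be a scalar feature such as the template correlation above, let f_0 be the density of s(y) when y is drawn from p_0, and let f_\lambda be a parametric density for the feature under the object hypothesis. Then

    \[
      p(y) \;=\; p_0(y)\,\frac{f_\lambda\!\big(s(y)\big)}{f_0\!\big(s(y)\big)} ,
    \]

which leaves the conditional distribution of the patch given its feature value unchanged from the null, and only moves the marginal distribution of the feature from f_0 to f_\lambda.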

Estimation becomes much easier.
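Under the perturbed-null form sketched above, the log-likelihood of training patches y_1, ..., y_n splits into a sum of log p_0(y_i), which involves neither the template nor the feature parameters, plus a sum of one-dimensional log ratios. A minimal sketch of the part that matters, assuming Gaussian feature densities purely for illustration (all names are mine):

    import numpy as np

    def gauss_logpdf(x, mu, sigma):
        """Log density of N(mu, sigma^2) evaluated elementwise at x."""
        return -0.5 * np.log(2.0 * np.pi * sigma ** 2) - (x - mu) ** 2 / (2.0 * sigma ** 2)

    def feature_log_likelihood_ratio(s, mu, sigma, mu0, sigma0):
        """Sum over patches of log f_lambda(s_i) - log f_0(s_i).

        s: feature values s(y_i) computed with the current template.
        Maximizing this over the feature parameters (and, with the null feature
        density re-estimated for each candidate template, over the template) is
        equivalent to maximizing the full data likelihood, because the term
        sum_i log p_0(y_i) does not depend on any of them.
        """
        s = np.asarray(s, dtype=float)
        return float(np.sum(gauss_logpdf(s, mu, sigma) - gauss_logpdf(s, mu0, sigma0)))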

Example: learning eye templates from image patches.

Example: learning eye templates

Maximize the data likelihood over the mixing probabilities, the feature parameters, and the templates themselves…
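The slides do this with EM. The loop below is a deliberately simplified, self-contained sketch in which each mixture component is just "template plus isotropic Gaussian noise", so that the template update reduces to a responsibility-weighted average; the talk's actual components are the perturbed-null models above, and all names here are mine.

    import numpy as np

    def em_templates(patches, K, n_iter=50, seed=0):
        """Schematic EM for a mixture of K templates (template + Gaussian noise model)."""
        rng = np.random.default_rng(seed)
        Y = np.stack([p.astype(float).ravel() for p in patches])      # n x d data matrix
        n, d = Y.shape
        pi = np.full(K, 1.0 / K)                                      # mixing probabilities
        T = Y[rng.choice(n, size=K, replace=False)].copy()            # initial templates
        var = float(Y.var())                                          # shared noise variance
        for _ in range(n_iter):
            # E-step: responsibilities r[i, k] proportional to pi_k * N(y_i; T_k, var * I)
            sq = ((Y[:, None, :] - T[None, :, :]) ** 2).sum(axis=2)   # squared distances, n x K
            log_r = np.log(pi)[None, :] - 0.5 * sq / var
            log_r -= log_r.max(axis=1, keepdims=True)
            r = np.exp(log_r)
            r /= r.sum(axis=1, keepdims=True)
            # M-step: update mixing probabilities, templates, and noise variance
            Nk = r.sum(axis=0) + 1e-12
            pi = Nk / n
            T = (r.T @ Y) / Nk[:, None]
            sq = ((Y[:, None, :] - T[None, :, :]) ** 2).sum(axis=2)
            var = float((r * sq).sum() / (n * d))
        return pi, T.reshape(K, *patches[0].shape), var

With eye patches as input, a call such as em_templates(patches, K=8) would return 8 templates, their mixing probabilities, and the fitted noise variance.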

Example: learning (right) eye templates

How good are the templates? A classification experiment…

Classify East Asian vs. South Asian faces (mixing over 4 scales and 8 templates).
East Asian: (L) examples of training images, (M) progression of EM, (R) trained templates.
South Asian: (L) examples of training images, (M) progression of EM, (R) trained templates.
Classification rate: 97%.
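The classification itself can be read as a likelihood-ratio test between two trained models, one per group. A hypothetical sketch, reusing the simplified Gaussian mixture from the EM sketch above (model_a and model_b are (pi, T, var) tuples as returned by em_templates):

    import numpy as np

    def mixture_loglik(patch, pi, T, var):
        """Log-likelihood of one patch under a mixture of templates with isotropic noise."""
        y = patch.astype(float).ravel()
        d = y.size
        sq = ((T.reshape(len(pi), -1) - y) ** 2).sum(axis=1)
        log_comp = np.log(pi) - 0.5 * d * np.log(2.0 * np.pi * var) - 0.5 * sq / var
        m = log_comp.max()
        return float(m + np.log(np.exp(log_comp - m).sum()))          # log-sum-exp

    def classify(patch, model_a, model_b):
        """Assign the patch to whichever trained model gives it the higher likelihood."""
        return "A" if mixture_loglik(patch, *model_a) >= mixture_loglik(patch, *model_b) else "B"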

Other examples: noses. 16 templates; multiple scales, shifts, and rotations. (Figure: samples from the training set and the learned templates.)

Other examples: a mixture of noses and mouths. Samples from the training set (1/2 noses, 1/2 mouths); 32 learned templates.

Other examples: train on 58 faces, half with glasses and half without. (Figure: samples from the training set; 32 learned templates; 8 learned templates.)

Other examples: train on 58 faces, half with glasses and half without. 8 learned templates. (Figure: a random eight of the 58 faces; rows 2 to 4, top to bottom: templates ordered by posterior likelihood.)

Other examples: train on random patches (“sparse representation”). 500 random 15x15 training patches from random internet images; 24 10x10 templates.

Other examples: coarse representation. Training of 8 low-resolution (10x10) templates; samples from the training set (down-converted images).

IV. Sampling and the choice of null distribution

(approximate) sampling…
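One plausible way to approximate a draw from the perturbed model, under the correlation-feature assumption used above (this construction is mine and may differ in detail from the talk's): draw a target feature value s* from the fitted density f_\lambda, draw a patch from the null distribution (for example, a random natural-image patch), and then mix the template into that patch so that the result has correlation exactly s* with the template.

    import numpy as np

    def sample_patch(template, null_patch, s_target):
        """Combine a null patch with the template so the result has normalized
        correlation s_target with the (mean-subtracted, unit-norm) template."""
        t = template.astype(float).ravel()
        t = t - t.mean()
        t = t / np.linalg.norm(t)
        y0 = null_patch.astype(float).ravel()
        y0 = y0 - y0.mean()
        # component of the null patch orthogonal to the template, scaled to unit norm
        y_perp = y0 - (y0 @ t) * t
        y_perp = y_perp / np.linalg.norm(y_perp)
        s = float(np.clip(s_target, -1.0, 1.0))
        y = s * t + np.sqrt(1.0 - s ** 2) * y_perp                     # unit norm, correlation s with t
        return y.reshape(template.shape)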

V. Other applications of conditional modeling

Markov model: the Markov property, and its consequences for estimation, computation, and representation.

Markov model

Hierarchical models and the Markov dilemma. (Figure: a license-plate hierarchy whose levels include license plates; license numbers (3 digits + 3 letters, 4 digits + 2 letters); plate boundaries and strings (2 letters, 3 digits, 3 letters, 4 digits); characters and plate sides; generic letters, generic numbers, and L-junctions of sides; parts of characters and parts of plate sides.)

Hierarchical models and the Markov dilemma. (Figure panels: original image; zoomed license region; top object under the Markov distribution; top object under the perturbed, “content-sensitive” distribution.)

PATTERN SYNTHESIS = PATTERN ANALYSIS (Ulf Grenander)