Learning Spatially Localized, Parts-Based Representation

Abstract  In this paper, we propose a novel method, called local non-negative matrix factorization (LNMF).  This yields a set of bases which not only allows a non-subtractive representation of images but also manifests localized features.

Introduction  The case of N*M image pixels, each taking a value in {0,1, …,255};there is a huge number of possible configurations:  Subspace analysis helps to reveal dimensional structures if patterns observed in high dimensional spaces.

Introduction (PCA) Principal Component Analysis (PCA)  Dimension reduction is achieved by discarding the least significant components.  However, PCA is unable to extract basis components manifesting localized features: each component is a dense, holistic combination of all pixels.
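As context for the localized bases sought below, here is a minimal NumPy sketch of PCA on an image matrix (the names X and n_components, and the eigen-decomposition route, are illustrative choices, not from the slides); the resulting eigenvectors are the dense, global "eigenfaces" that LNMF is contrasted against:

```python
import numpy as np

def pca_basis(X, n_components):
    """PCA on an (n_pixels, n_images) matrix X, one image per column.

    Returns the mean image and the n_components eigenvectors of the
    sample covariance with the largest eigenvalues.
    """
    mean = X.mean(axis=1, keepdims=True)
    Xc = X - mean                                # center the data
    cov = (Xc @ Xc.T) / Xc.shape[1]              # (n_pixels, n_pixels)
    eigvals, eigvecs = np.linalg.eigh(cov)       # eigenvalues ascending
    order = np.argsort(eigvals)[::-1]            # most significant first
    return mean, eigvecs[:, order[:n_components]]
```

For realistic image sizes one would work with the smaller (n_images x n_images) Gram matrix or an SVD instead of the full pixel covariance, but the dense, global character of the components is the point here.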

Introduction (NMF) Non-negative matrix factorization (NMF)  What makes NMF special is the non-negativity constraint it imposes on the factorization. This constraint tends to yield bases that better reflect the local features of the original data.

Method (NMF) NMF: Constrained Non-Negative Matrix Factorization  Let a set of N training images be given as an n*N matrix X, with one image per column.  B is the n*m matrix of basis images.  H is the m*N matrix of coefficient weights, so that X ≈ BH with B, H ≥ 0.  Dimension reduction is achieved when m < n. NMF learns B and H by minimizing the Kullback–Leibler divergence D(X || BH) = Σ_ij [ x_ij log( x_ij / (BH)_ij ) − x_ij + (BH)_ij ].
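A minimal sketch of the standard multiplicative updates that minimize this divergence (Lee and Seung's rules; the shapes follow the slide's notation, and all variable names are illustrative):

```python
import numpy as np

def nmf_kl(X, m, n_iter=200, eps=1e-9):
    """NMF: factor non-negative X (n x N) as B (n x m) times H (m x N),
    minimizing the Kullback-Leibler divergence D(X || BH)."""
    n, N = X.shape
    rng = np.random.default_rng(0)
    B = rng.random((n, m)) + eps
    H = rng.random((m, N)) + eps
    for _ in range(n_iter):
        R = X / (B @ H + eps)                      # element-wise ratio X / BH
        B *= (R @ H.T) / (H.sum(axis=1) + eps)     # update basis images
        R = X / (B @ H + eps)
        H *= (B.T @ R) / (B.sum(axis=0)[:, None] + eps)  # update coefficients
    return B, H
```

Both updates preserve non-negativity because they only ever multiply non-negative entries by non-negative factors.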

Method (LNMF) LNMF: Define U = BᵀB and V = HHᵀ. On top of the NMF constraints, three requirements are imposed:  A basis component should not be decomposed further: for all i, u_ii should be as small as possible. Imposed by Σ_i u_ii = min.  Different bases should be as orthogonal as possible, so as to minimize redundancy. Imposed by Σ_{i≠j} u_ij = min.  Only components giving the most important information should be retained. Imposed by Σ_i v_ii = max. These terms are folded into the objective as D(X || BH) + α Σ_ij u_ij − β Σ_i v_ii, with constants α, β > 0.
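A sketch of the resulting LNMF updates as I read them from the paper (note they do not explicitly contain α and β; the constraints reshape the update rules themselves — the square root drives H toward sparsity, and normalizing each column of B keeps the bases comparable):

```python
import numpy as np

def lnmf(X, m, n_iter=200, eps=1e-9):
    """LNMF: X (n x N) ~ B (n x m) @ H (m x N) with localization constraints."""
    n, N = X.shape
    rng = np.random.default_rng(0)
    B = rng.random((n, m)) + eps
    H = rng.random((m, N)) + eps
    for _ in range(n_iter):
        R = X / (B @ H + eps)
        H = np.sqrt(H * (B.T @ R))            # square-root update sparsifies H
        R = X / (B @ H + eps)
        B *= (R @ H.T) / (H.sum(axis=1) + eps)
        B /= B.sum(axis=0, keepdims=True)     # normalize each basis column
    return B, H
```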

Experiments Data Preparation The set of 10 images for each person is randomly partitioned into a training subset of 5 images and a test set of the other 5. The training set is then used to learn basis components, and the test set to evaluate recognition performance.
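A minimal sketch of this split, assuming the data arrives as an array of shape (n_persons, 10, n_pixels) (the shape and names are assumptions for illustration):

```python
import numpy as np

def split_five_five(images, seed=0):
    """Randomly split each person's 10 images into 5 train / 5 test."""
    rng = np.random.default_rng(seed)
    train, test = [], []
    for person in images:                 # person has shape (10, n_pixels)
        idx = rng.permutation(10)
        train.append(person[idx[:5]])
        test.append(person[idx[5:]])
    return np.stack(train), np.stack(test)
```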

Experiments Learning Basis Components

Experiments Reconstruction
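Reconstruction of a test image x works by holding the learned B fixed and iterating the coefficient update alone, then forming the reconstruction B @ h. A hedged sketch (this is the standard projection procedure for divergence-based NMF, not something spelled out in the slides):

```python
import numpy as np

def reconstruct(x, B, n_iter=100, eps=1e-9):
    """Project one image x (length n) onto fixed bases B and rebuild it."""
    h = np.full(B.shape[1], 1.0 / B.shape[1])    # uniform initial coefficients
    for _ in range(n_iter):
        r = x / (B @ h + eps)
        h *= (B.T @ r) / (B.sum(axis=0) + eps)   # H-update with B held fixed
    return B @ h, h                              # reconstruction, coefficients
```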

Experiments Face Recognition
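Recognition is then nearest-neighbor matching in the learned coefficient space: project the test image (as in the reconstruction sketch above) and return the label of the closest training coefficient vector. A hedged sketch with illustrative names:

```python
import numpy as np

def recognize(h_test, H_train, labels):
    """Nearest-neighbor classification in coefficient space.

    h_test:  (m,) coefficients of the test image.
    H_train: (m, N) coefficients of the N training images.
    labels:  (N,) person identity for each training image.
    """
    dists = np.linalg.norm(H_train - h_test[:, None], axis=0)
    return labels[np.argmin(dists)]
```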