
1 Half-Year Work Summary. Presenter: Lü Xiaohui. August 25, 2011

2 Outline: 1. Studied convergence proofs for Non-negative Matrix Factorization. 2. Studied Sparse Non-negative Matrix Factorization algorithms. 3. Studied basic linear algebra on subspaces. 4. Studied the Generalized PCA theory proposed by Yi Ma et al.

3 GPCA paper information: Generalized Principal Component Analysis (GPCA). René Vidal, Member, IEEE; Yi Ma, Member, IEEE; Shankar Sastry, Fellow, IEEE. IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. 27, No. 12, pp. 1945-1959, December 2005.

4 GPCA: finding basis vectors from the data.

5 GPCA Abstract This paper presents an algebro-geometric solution to the problem of segmenting an unknown number of subspaces of unknown and varying dimensions from sample data points.

6 GPCA Abstract We represent the subspaces with a set of homogeneous polynomials whose degree is the number of subspaces and whose derivatives at a data point give normal vectors to the subspace passing through the point.
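A minimal numeric sketch of this idea, assuming two planes through the origin in R^3 with made-up normals b1 and b2 (names and values are illustrative, not from the paper): the degree-2 polynomial p(x) = (b1·x)(b2·x) vanishes on the union of the planes, and its gradient at a point of one plane is proportional to that plane's normal.

```python
import numpy as np

# Two hypothetical planes through the origin in R^3, given by their normals.
b1 = np.array([1.0, 0.0, 0.0])
b2 = np.array([0.0, 1.0, 1.0])

# p(x) = (b1.x)(b2.x): homogeneous of degree 2 (= number of subspaces),
# zero exactly on the union of the two planes.
p = lambda x: (b1 @ x) * (b2 @ x)

# Gradient by the product rule: (b2.x) b1 + (b1.x) b2.
grad_p = lambda x: (b2 @ x) * b1 + (b1 @ x) * b2

x = np.array([0.0, 2.0, -3.0])   # a point on plane 1 (b1.x = 0)
print(p(x))                      # 0.0: x lies on the union
print(grad_p(x))                 # [-1, 0, 0]: proportional to b1, the plane's normal
```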

7 GPCA Abstract When the number of subspaces is known, we show that these polynomials can be estimated linearly from data, hence subspace segmentation is reduced to classifying one point per subspace.
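The estimation is linear because any homogeneous polynomial of degree n can be written as c^T ν_n(x), where ν_n stacks all degree-n monomials of x, so the coefficient vector c must lie in the null space of the matrix of embedded samples. A minimal sketch of this fit, assuming two planes in R^3 and synthetic data (all names here are illustrative):

```python
import numpy as np
from itertools import combinations_with_replacement

def veronese(X, n):
    """Embed each row of X into the vector of all its degree-n monomials."""
    D = X.shape[1]
    monos = list(combinations_with_replacement(range(D), n))
    V = np.column_stack([np.prod(X[:, list(m)], axis=1) for m in monos])
    return V, monos

# Synthetic data (assumed): points sampled from two planes through the origin
# in R^3 with normals b1 and b2.
rng = np.random.default_rng(0)
b1, b2 = np.array([1.0, 0.0, 0.0]), np.array([0.0, 1.0, 1.0])

def sample_plane(b, k):
    Q = np.linalg.svd(b.reshape(1, -1))[2][1:]   # orthonormal basis of the plane
    return rng.standard_normal((k, 2)) @ Q

X = np.vstack([sample_plane(b1, 100), sample_plane(b2, 100)])

# The polynomial is linear in its coefficients c, so V c = 0 for all samples;
# take c as the right singular vector with the smallest singular value.
V, monos = veronese(X, n=2)
c = np.linalg.svd(V)[2][-1]
print(np.abs(V @ c).max())    # ~0: the estimated polynomial vanishes on the data
```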

8 GPCA Abstract We select these points optimally from the data set by minimizing a certain distance function, thus dealing automatically with moderate noise in the data. A basis for the complement of each subspace is then recovered by applying standard PCA to the collection of derivatives (normal vectors).
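Continuing the sketch above (it reuses X, V, c, and monos from the previous snippet), one plausible choice of distance function is the first-order ratio |p(x)| / ||∇p(x)||, which is small for points close to the zero set; in this codimension-1 example the "PCA on the derivatives" step reduces to normalizing a single gradient.

```python
# Continues the previous sketch (uses X, V, c, monos from the Veronese fit).
import numpy as np

def grad_poly(x, c, monos, D=3):
    """Gradient of p(x) = sum_k c[k] * prod(x[monos[k]]), by the product rule."""
    g = np.zeros(D)
    for ck, m in zip(c, monos):
        for i, idx in enumerate(m):
            g[idx] += ck * np.prod(x[list(m[:i] + m[i + 1:])])
    return g

p_vals = V @ c                                    # polynomial values at each sample
grads = np.array([grad_poly(x, c, monos) for x in X])

# First-order "distance to the zero set": small near the union of subspaces,
# so its minimizer is a safe representative point under moderate noise.
dist = np.abs(p_vals) / (np.linalg.norm(grads, axis=1) + 1e-12)
x_star = X[np.argmin(dist)]

# The derivative at the selected point spans the orthogonal complement of the
# subspace containing it; with several polynomials one would stack the gradients
# and apply PCA, here one normalized gradient suffices (codimension 1).
normal = grad_poly(x_star, c, monos)
print(normal / np.linalg.norm(normal))            # ~ unit normal of one plane
```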

9 GPCA Abstract Extensions of GPCA that deal with data in a high-dimensional space and with an unknown number of subspaces are also presented. Our experiments on low-dimensional data show that GPCA outperforms existing algebraic algorithms based on polynomial factorization and provides a good initialization to iterative techniques such as K-subspace and Expectation Maximization.

10 GPCA Abstract We also present applications of GPCA to computer vision problems such as face clustering, temporal video segmentation, and 3-D motion segmentation from point correspondences in multiple affine views.

11 Subspace Segmentation We consider the following alternative extension of PCA to the case of data lying in a union of subspaces, as illustrated in Figure 1 for two subspaces in R^3.
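A small mock-up of the Figure 1 setting, assuming a 2-D plane and a 1-D line through the origin in R^3 (subspaces of different dimensions); the data layout below is illustrative only, not taken from the paper.

```python
import numpy as np

# Union of a 2-D plane and a 1-D line through the origin in R^3,
# sampled and stacked into one unlabeled data set.
rng = np.random.default_rng(1)
plane_basis = np.array([[1.0, 0.0, 0.0],
                        [0.0, 1.0, 0.0]])               # spans the xy-plane
line_dir = np.array([1.0, 1.0, 1.0]) / np.sqrt(3.0)     # direction of the line

X_plane = rng.standard_normal((150, 2)) @ plane_basis   # 150 points on the plane
X_line = rng.standard_normal((50, 1)) * line_dir        # 50 points on the line
X = np.vstack([X_plane, X_line])                        # the union, unlabeled
print(X.shape)                                          # (200, 3)
```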

12 Subspace Segmentation

13 GPCA Theorem 1 (Generalized Principal Component Analysis): A union of n subspaces of R^D can be represented with a set of homogeneous polynomials of degree n in D variables. These polynomials can be estimated linearly given enough sample points in general position in the subspaces.
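In the simplest case where every subspace is a hyperplane with normal b_i, the representation behind this theorem can be written as follows (my notation, a sketch of why the estimation from samples is linear):

```latex
% One vanishing polynomial of degree n is the product of the n linear forms;
% it is linear in its coefficient vector c via the Veronese map nu_n, so c is
% found from the null space of the embedded sample matrix V_n.
p_n(x) \;=\; \prod_{i=1}^{n} \bigl(b_i^{\top} x\bigr) \;=\; c^{\top}\,\nu_n(x),
\qquad
V_n\,c = 0,
\quad
V_n \;=\; \bigl[\nu_n(x^{(1)}),\,\dots,\,\nu_n(x^{(N)})\bigr]^{\top}.
```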

14 GPCA A basis for the complement of each subspace can be obtained from the derivatives of these polynomials at a point in each of the subspaces. Such points can be recursively selected via polynomial division. Therefore, the subspace segmentation problem is mathematically equivalent to fitting, differentiating, and dividing a set of homogeneous polynomials.
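A compact symbolic sketch of the fit, differentiate, divide loop, again assuming two planes in R^3 with hypothetical normals (1,0,0) and (0,1,1); the polynomial is hard-coded rather than fitted so that the division step stays visible.

```python
import sympy as sp

x1, x2, x3 = sp.symbols('x1 x2 x3')

# Hypothetical fitted polynomial for the two planes; degree 2 = number of subspaces.
p = x1 * (x2 + x3)

# Differentiate: the gradient at a point lying on the first plane (x1 = 0)
# is proportional to that plane's normal.
grad = sp.Matrix([sp.diff(p, v) for v in (x1, x2, x3)])
print(grad.subs({x1: 0, x2: 2, x3: -3}).T)   # Matrix([[-1, 0, 0]]) ~ normal (1,0,0)

# Divide: factoring out the recovered linear form leaves a polynomial whose
# zero set is the remaining subspace, and the procedure repeats on it.
quotient, remainder = sp.div(p, x1)
print(quotient, remainder)                   # x2 + x3, 0
```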

15 Thank you! Presenter: Lü Xiaohui. August 25, 2011

