Adaptive nonlinear manifolds and their applications to pattern recognition
Hujun Yin, Weilin Huang. Information Sciences, Vol. 180, 2010, pp. 2649–2662.
Presenter: Wei-Shen Tai, 2010/5/26
Intelligent Database Systems Lab, 國立雲林科技大學 National Yunlin University of Science and Technology
Outline
- Introduction
- Nonlinear PCA approaches
- Multidimensional scaling approaches
- SOM approaches
- Adaptive nonlinear manifolds for face recognition
- Conclusions
- Comments
Motivation
- Data processing and projection techniques play an increasingly important role in pattern recognition.
- SOM: its topology-preserving property is used to extract and visualize the mutual relationships among the data.
Objective
A growing variant of the metric-preserving ViSOM that:
- builds the connection between SOMs and MDS;
- shows the advantages of SOM-based methods in dimensionality reduction.
Nonlinear PCA approaches
- Kernel PCA
- Local PCA: the entire input space is partitioned into non-overlapping regions, and a local PCA is then formed in each region.
- Autoassociative networks: the first three layers of the network project the original data onto a curve or surface; the last three layers define that curve or surface.
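To make the kernel PCA idea concrete, here is a minimal NumPy sketch (an illustration only, not the paper's implementation): it builds an RBF kernel matrix, centres it in feature space, and projects the data onto the leading eigenvectors. The function name and the parameters `gamma` and `n_components` are assumptions for illustration.

```python
import numpy as np

def kernel_pca(X, n_components=2, gamma=1.0):
    """Illustrative kernel PCA: RBF kernel, feature-space centring, top eigenvectors."""
    # Pairwise squared Euclidean distances
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2 * X @ X.T
    K = np.exp(-gamma * d2)                      # RBF (Gaussian) kernel matrix
    # Centre the kernel matrix in feature space
    n = K.shape[0]
    one = np.ones((n, n)) / n
    Kc = K - one @ K - K @ one + one @ K @ one
    # Eigendecompose; eigh returns eigenvalues in ascending order for symmetric input
    vals, vecs = np.linalg.eigh(Kc)
    idx = np.argsort(vals)[::-1][:n_components]
    vals, vecs = vals[idx], vecs[:, idx]
    # Scale eigenvectors by sqrt(eigenvalue) to obtain the embedded coordinates
    return vecs * np.sqrt(np.maximum(vals, 1e-12))

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))
Y = kernel_pca(X, n_components=2)
```

A local-PCA variant would instead partition the input space first and run plain PCA per region.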
Nonlinear PCA approaches (cont.)
- Local linear embedding (LLE)
  - Local linearity is defined on a local neighborhood, via the ε-ball or k nearest neighbors.
  - The data are then projected onto the subspace spanned by the principal eigenfunctions.
- Principal curve
  - Defined as a smooth and self-consistent curve, fitted iteratively:
    1. Initialization: choose the first linear principal component as the initial curve.
    2. Projection: project the data points onto the current curve and calculate the projection index.
    3. Expectation: for each index, take the mean of the data points projected onto it as the new curve point.
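The three principal-curve steps above can be sketched in NumPy. This is a toy illustration under simplifying assumptions (not the original algorithm): the curve is a set of ordered nodes, the projection index is the nearest node, and a light moving-average pass stands in for the smoothing step.

```python
import numpy as np

def principal_curve(X, n_nodes=20, n_iter=10):
    """Toy principal-curve fit: init with PC1, then alternate projection / averaging."""
    # Step 1 (Initialization): place nodes along the first linear principal component
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    t = Xc @ Vt[0]                                   # scores on PC1
    grid = np.linspace(t.min(), t.max(), n_nodes)
    curve = X.mean(axis=0) + grid[:, None] * Vt[0]   # initial straight curve
    for _ in range(n_iter):
        # Step 2 (Projection): index of the nearest curve node for each sample
        d2 = ((X[:, None, :] - curve[None, :, :]) ** 2).sum(axis=2)
        idx = d2.argmin(axis=1)
        # Step 3 (Expectation): each node moves to the mean of its projected points
        for k in range(n_nodes):
            pts = X[idx == k]
            if len(pts):
                curve[k] = pts.mean(axis=0)
        # Light smoothing keeps the curve smooth and self-consistent
        curve[1:-1] = 0.5 * curve[1:-1] + 0.25 * (curve[:-2] + curve[2:])
    return curve
```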
Multidimensional scaling approaches
- MDS: a traditional subject in dimension reduction and data visualization. It aims to embed high-dimensional data points in a low-dimensional (often 2- or 3-D) space while preserving the inter-point metrics as closely as possible (the raw stress criterion).
- Sammon mapping: uses an intermediate normalization to preserve good local distributions while at the same time maintaining the global structure.
- Adaptive MDS: a limitation of MDS is the lack of an explicit projection function; it is usually a point-to-point mapping and cannot naturally accommodate new data points. Implementing or parameterizing MDS with neural networks overcomes this limitation.
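Sammon mapping minimises a stress that weights each squared distance error by the inverse of the input-space distance, which is what gives local distances extra emphasis. A minimal NumPy sketch of that stress (the function name and `eps` guard are assumptions for illustration):

```python
import numpy as np

def sammon_stress(X, Y, eps=1e-12):
    """Sammon stress between input configuration X and embedding Y."""
    def pdist(A):
        sq = np.sum(A**2, axis=1)
        d2 = np.maximum(sq[:, None] + sq[None, :] - 2 * A @ A.T, 0.0)
        return np.sqrt(d2)
    D = pdist(X)                          # input-space distances
    d = pdist(Y)                          # embedding-space distances
    iu = np.triu_indices(len(X), k=1)     # each pair counted once
    D, d = D[iu], d[iu]
    # Errors weighted by 1/D_ij, normalised by the total input distance
    return np.sum((D - d) ** 2 / (D + eps)) / np.sum(D)

# Identical configurations give zero stress
X = np.array([[0., 0.], [1., 0.], [0., 2.]])
assert abs(sammon_stress(X, X.copy())) < 1e-9
```

An actual Sammon mapping would minimise this quantity over Y, e.g. by gradient descent.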
SOM as nonmetric-scaling manifold
Drawbacks:
- For data visualization, the inter-neuron distances in the data space have to be crudely or qualitatively marked by colors or grey levels on the trained map.
- The coordinates of the neurons are fixed on the lower-dimensional grid and do not resemble the distances in the data space.
ViSOM as metric-scaling manifold
- ViSOM: for metric scaling and visualization; a direct and faithful display of data structure and distribution.
- Growing ViSOM (gViSOM):
  1. Start with a small initial map (e.g. 5×5), either rectangular or hexagonal. Set the desired resolution and the neighborhood size (locality).
  2. Randomly draw a sample from the data space and find the winning neuron with the shortest distance.
  3. If the sample falls within the neighborhood, update the weights of the neighborhood neurons using the ViSOM algorithm; otherwise go back to step 2.
Growing ViSOM (cont.)
  4. At regular intervals (e.g. every 2000 iterations), if the growing condition is met (i.e. the data is under-represented by the existing map), grow the map by adding a column or row to the side with the highest activity (measured by the winning frequencies). The added column or row is a linear extrapolation of the existing map.
  5. As in the ViSOM, at regular intervals (every certain number of iterations), refresh the map (neurons) probabilistically.
  6. Check whether the map has converged. If not, go back to step 2; if so, go to the next step.
  7. Project the data samples onto the map, either to the neurons or via the LLP resolution enhancement.
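The update in step 3 and the growth in step 4 can be sketched as follows. This is a simplified illustration, not the authors' code: `visom_step` applies a SOM-style pull toward the sample plus a term that regularises inter-neuron distances toward a target map resolution `lam`, and `grow_row` appends one row by linear extrapolation of the existing map. All names and parameter values are assumptions.

```python
import numpy as np

def visom_step(W, grid, x, alpha=0.1, sigma=2.0, lam=0.5):
    """One simplified ViSOM update for sample x on weights W with grid coordinates."""
    v = np.argmin(((W - x) ** 2).sum(axis=1))          # winning neuron (step 2)
    gd = np.linalg.norm(grid - grid[v], axis=1)        # distances on the map grid
    h = np.exp(-gd**2 / (2 * sigma**2))                # neighborhood function
    dvk = np.linalg.norm(W - W[v], axis=1)             # input-space distances to winner
    delta = np.maximum(gd * lam, 1e-12)                # target distances at resolution lam
    # Metric-preserving correction: contract/expand toward uniform inter-neuron spacing
    reg = (dvk / delta - 1.0)[:, None] * (W[v] - W)
    reg[v] = 0.0
    W += alpha * h[:, None] * ((x - W[v]) + reg)       # step 3 update
    return W

def grow_row(W, shape):
    """Step 4 growth: append one row by linear extrapolation of the last two rows."""
    rows, cols = shape
    Wg = W.reshape(rows, cols, -1)
    new_row = 2 * Wg[-1] - Wg[-2]
    Wg = np.concatenate([Wg, new_row[None]], axis=0)
    return Wg.reshape((rows + 1) * cols, -1), (rows + 1, cols)
```

After growing, the grid coordinates must be extended to match the new map shape before further training.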
Adaptive nonlinear manifolds for face recognition
Dimensionality reduction for face recognition
Conclusion
- PCA approaches: they still require setting a number of kernel or geometrical parameters, and they are usually not adaptive learning methods.
- gViSOM: an adaptive, metric-preserving manifold based on ViSOM; for metric scaling and visualization, it gives a direct and faithful display of data structure and distribution.
Comments
- Advantage: a ViSOM variant that provides an adaptive and flexible map structure, overcoming the fixed-size-map limitation of ViSOM.
- Drawback: the description of the parameter settings seems incomplete and rough; "These parameters were chosen on the best results obtained" is the only justification given in this study.
- Application: pattern recognition and other applications related to dimensionality reduction.