Presentation on theme: "Random Projections of Signal Manifolds Michael Wakin and Richard Baraniuk Random Projections for Manifold Learning Chinmay Hegde, Michael Wakin and Richard Baraniuk" — Presentation transcript:

1 Random Projections of Signal Manifolds (Michael Wakin and Richard Baraniuk)
Random Projections for Manifold Learning (Chinmay Hegde, Michael Wakin and Richard Baraniuk)
Random Projections of Smooth Manifolds (Richard Baraniuk and Michael Wakin)
Presented by: John Paisley, Duke University

2 Overview/Motivation Random projections allow for linear, nonadaptive dimensionality reduction. If we can ensure that the manifold structure is preserved under these projections, we can use all manifold learning techniques in the compressed space and know the results will be (essentially) the same. Therefore we can sense compressively, meaning we bypass the overhead of acquiring the full signal and directly sense the compressed (dimensionality-reduced) signal.

3 Random Projections of Signal Manifolds (ICASSP 2006) This paper: if we have manifold information, we can perform compressive sensing using significantly fewer measurements. Whitney's Embedding Theorem: for a noiseless manifold with intrinsic dimensionality K, this theorem implies that a signal x in R^N, projected into R^M by the M x N orthonormal matrix P (y = Px), can be recovered with high probability if M > 2K. Note that K is the intrinsic dimensionality, which is different from (and less than) the level of sparsity.
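As a concrete illustration (not from the paper), here is a minimal NumPy sketch of the measurement step y = Px, with a random orthonormal P obtained by orthogonalizing a Gaussian matrix; the dimensions N and M are arbitrary placeholders.

```python
import numpy as np

def random_orthoprojector(M, N, seed=0):
    """Return an M x N matrix P with orthonormal rows (M < N)."""
    rng = np.random.default_rng(seed)
    G = rng.standard_normal((N, M))
    Q, _ = np.linalg.qr(G)        # columns of Q are orthonormal vectors in R^N
    return Q.T                    # rows of P are orthonormal

N, M = 1000, 20                   # ambient and projected dimensions (illustrative)
P = random_orthoprojector(M, N)
x = np.random.default_rng(1).standard_normal(N)   # a signal in R^N
y = P @ x                         # compressed measurements y = Px
print(y.shape)                    # (20,)
```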

4 Random Projections of Signal Manifolds (ICASSP 2006) The recovery algorithm considered here is a simple search through the projected manifold for the nearest neighbor. Consider the case where the data is noisy, so it lies slightly off the manifold; the recovered signal is then defined as the point on the manifold whose projection lies nearest to the projected measurement.
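A minimal sketch of this nearest-neighbor recovery, assuming the manifold can be sampled densely; the example manifold (circular shifts of a Gaussian bump) and all variable names are illustrative, not taken from the paper.

```python
import numpy as np

def recover_nearest(y, P, manifold_samples):
    """Return the manifold sample whose projection is nearest to the measurement y."""
    proj = manifold_samples @ P.T                 # project every sample: row i is P x_i
    dists = np.linalg.norm(proj - y, axis=1)
    return manifold_samples[np.argmin(dists)]

# Illustrative 1-D manifold: circular shifts of a Gaussian bump in R^N.
N, M = 256, 10
t = np.arange(N)
bump = np.exp(-0.5 * ((t - N // 2) / 5.0) ** 2)
manifold = np.stack([np.roll(bump, s) for s in range(N)])   # sampled points on the manifold

rng = np.random.default_rng(0)
P = np.linalg.qr(rng.standard_normal((N, M)))[0].T          # random orthoprojector (M x N)

x_true = np.roll(bump, 40)                                  # unknown signal on the manifold
y = P @ (x_true + 0.01 * rng.standard_normal(N))            # noisy compressive measurements
x_hat = recover_nearest(y, P, manifold)
print(np.linalg.norm(x_hat - x_true))                       # small if recovery succeeded
```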

5 Random Projections of Signal Manifolds (ICASSP 2006)

6 Random Projections for Manifold Learning (NIPS 2007) How does a random projection of a manifold impact the ability to estimate the intrinsic dimensionality of the manifold and to embed that manifold into a Euclidean space that preserves geodesic distances (e.g. via the Isomap algorithm)? How many projections are needed? Grassberger-Procaccia (GP) algorithm: a common algorithm for estimating the intrinsic dimensionality of a manifold. Also written as C(r_1)/C(r_2) = (r_1/r_2)^K, where K is the intrinsic dimensionality. This method uses the fact that the volume of the intersection of a K-dimensional object and a hypersphere of radius r is proportional to r^K.
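A minimal sketch of the GP estimate implied by that formula: compute the correlation sum C(r) at two radii and solve C(r_1)/C(r_2) = (r_1/r_2)^K for K. The test data (a circle embedded in R^50) and the radii are illustrative choices, not from the paper.

```python
import numpy as np
from scipy.spatial.distance import pdist

def correlation_sum(X, r):
    """GP correlation sum C(r): fraction of point pairs within distance r."""
    d = pdist(X)                          # all pairwise Euclidean distances
    return np.mean(d < r)

def gp_dimension(X, r1, r2):
    """Estimate intrinsic dimension K from C(r1)/C(r2) = (r1/r2)^K."""
    c1, c2 = correlation_sum(X, r1), correlation_sum(X, r2)
    return np.log(c1 / c2) / np.log(r1 / r2)

# Illustrative data: a 1-D manifold (a circle) embedded in R^50.
rng = np.random.default_rng(0)
theta = rng.uniform(0, 2 * np.pi, 2000)
X = np.zeros((2000, 50))
X[:, 0], X[:, 1] = np.cos(theta), np.sin(theta)   # isometric embedding of the circle
print(gp_dimension(X, r1=0.2, r2=0.4))            # should be close to 1
```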

7 Random Projections for Manifold Learning (NIPS 2007) Isomap algorithm: produces a low-dimensional mapping in which Euclidean distances in the mapped space match, as closely as possible, the geodesic distances in the original space.
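To illustrate the question being asked, the sketch below runs Isomap (scikit-learn's implementation, used here as an assumption, not the authors' code) on randomly projected data alongside the original data; the swiss-roll data and all parameters are illustrative.

```python
import numpy as np
from sklearn.manifold import Isomap

rng = np.random.default_rng(0)

# Illustrative data: a swiss-roll-like 2-D manifold embedded in R^100.
n, N, M = 1000, 100, 15
t = 3 * np.pi * (1 + 2 * rng.uniform(size=n))
h = 20 * rng.uniform(size=n)
X = np.zeros((n, N))
X[:, 0], X[:, 1], X[:, 2] = t * np.cos(t), h, t * np.sin(t)

P = np.linalg.qr(rng.standard_normal((N, M)))[0].T   # M x N random orthoprojector
Y = X @ P.T                                          # data seen through M random projections

emb_full = Isomap(n_neighbors=10, n_components=2).fit_transform(X)
emb_proj = Isomap(n_neighbors=10, n_components=2).fit_transform(Y)
# With enough projections M, emb_proj should recover the same 2-D structure as emb_full.
```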

8 Random Projections for Manifold Learning (NIPS 2007) A lower bound on M is stated for the GP algorithm; the proof is in the cited reference.

9 Random Projections for Manifold Learning (NIPS 2007) A lower bound on M is stated for the Isomap algorithm; the proof is in the cited reference.

10 Random Projections for Manifold Learning (NIPS 2007) ML-RP algorithm (Manifold Learning using Random Projections), developed in the paper to adaptively find a sufficient number of projections M; a sketch follows below.
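A minimal sketch of the iterative idea: add one random projection at a time, run Isomap on the projected data, and stop once the Isomap residual variance falls below a tolerance. The paper's ML-RP also re-estimates the intrinsic dimension (via GP) at each step; here the target dimension K is fixed for brevity, and all function names and thresholds are illustrative.

```python
import numpy as np
from scipy.spatial.distance import pdist, squareform
from sklearn.manifold import Isomap

def residual_variance(geo_dists, embedding):
    """Isomap residual variance: 1 - R^2 between geodesic and embedded distances."""
    emb_dists = squareform(pdist(embedding))
    r = np.corrcoef(geo_dists.ravel(), emb_dists.ravel())[0, 1]
    return 1.0 - r ** 2

def ml_rp(X, K=2, tol=0.05, max_M=50, n_neighbors=10, seed=0):
    """Grow the number of random projections M until the Isomap embedding stabilizes."""
    rng = np.random.default_rng(seed)
    n, N = X.shape
    P = np.empty((0, N))
    for M in range(1, max_M + 1):
        P = np.vstack([P, rng.standard_normal((1, N)) / np.sqrt(N)])  # add one projection vector
        Y = X @ P.T                                                   # data through M projections
        iso = Isomap(n_neighbors=n_neighbors, n_components=K).fit(Y)
        if residual_variance(iso.dist_matrix_, iso.embedding_) < tol:
            return M, P                                               # M projections suffice
    return max_M, P
```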

11 Random Projections for Manifold Learning (NIPS 2007)

12

13 Random Projections of Smooth Manifolds (in Foundations of Computational Mathematics)

14 Sketch of proof Sample points from the manifold such that the geodesic distance from any point on the manifold to the nearest sampled point is less than some value. Also sample points from the tangent spaces of the manifold, ensuring the distance of every point to the nearest sample is below some threshold. Then use the Johnson-Lindenstrauss (JL) lemma to ensure that the embedding of all of these sampled points preserves relative distances, and use further theorems, together with how the points were sampled, to extend this distance preservation to all points on the manifold.
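The JL step can be checked empirically: for a finite set of points, a scaled Gaussian projection preserves pairwise distances up to roughly 1 ± epsilon with high probability. The sketch below (dimensions and point counts are arbitrary) just verifies that concentration numerically; it is not the paper's proof.

```python
import numpy as np
from itertools import combinations

rng = np.random.default_rng(0)
N, M, n = 2000, 200, 50                        # ambient dim, projected dim, number of points
X = rng.standard_normal((n, N))                # a finite set of sampled points
P = rng.standard_normal((M, N)) / np.sqrt(M)   # scaled Gaussian projection (E||Pv||^2 = ||v||^2)

ratios = []
for i, j in combinations(range(n), 2):
    d_orig = np.linalg.norm(X[i] - X[j])
    d_proj = np.linalg.norm(P @ (X[i] - X[j]))
    ratios.append(d_proj / d_orig)             # should concentrate around 1

print(min(ratios), max(ratios))                # all within roughly 1 ± epsilon for large enough M
```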

