Image Manifolds. 16-721: Learning-based Methods in Vision. Alexei Efros, CMU, Spring 2007. © A.A. Efros. With slides by Dave Thompson.

Images as Vectors: an m x n image can be unrolled, row by row, into a single n*m-dimensional vector.

Importance of Alignment: two m x n images can only be compared as n*m-dimensional vectors if corresponding pixels line up; without alignment, are the two vectors even comparable?

Text Synthesis
[Shannon, '48] proposed a way to generate English-looking text using N-grams:
- Assume a generalized Markov model
- Use a large text to compute probability distributions of each letter given the N-1 previous letters
- Starting from a seed, repeatedly sample this Markov chain to generate new letters
- Also works for whole words: WE NEED TO EAT CAKE

Mark V. Shaney (Bell Labs)
Results (using the alt.singles corpus):
- "As I've commented before, really relating to someone involves standing next to impossible."
- "One morning I shot an elephant in my arms and kissed him."
- "I spent an interesting evening recently with a grain of salt."

Video Textures. Arno Schödl, Richard Szeliski, David Salesin, Irfan Essa. Microsoft Research, Georgia Tech.

Video textures

Our approach: How do we find good transitions?

Finding good transitions: compute the L2 distance D_{i,j} between all pairs of frames. Similar frames make good transitions. (Figure: the distance matrix plotted as frame i vs. frame j.)

Markov chain representation Similar frames make good transitions

Transition costs: transition from i to j if the successor of i is similar to j. Cost function: C_{i→j} = D_{i+1, j}

Transition probabilities: the probability of a transition P_{i→j} is inversely related to its cost: P_{i→j} ∝ exp(−C_{i→j} / σ²). High σ gives more random transitions; low σ keeps only the best ones.

Preserving dynamics

Cost for transition i→j: C_{i→j} = Σ_k w_k D_{i+k+1, j+k}, a weighted sum of frame distances over a small window around the transition.

Preserving dynamics – effect. Cost for transition i→j: C_{i→j} = Σ_k w_k D_{i+k+1, j+k}

Video sprite extraction

Video sprite control Augmented transition cost:

Interactive fish

Advanced Perception
David R. Thompson
manifold learning with applications to object recognition

manifolds in vision: the plenoptic function

manifolds in vision: appearance variation (images from Hormel Corp.)

manifolds in vision: deformation

manifold learning: find a low-D basis for describing high-D data, X ≈ X' s.t. dim(X') << dim(X). This uncovers the intrinsic dimensionality.

If we knew all pairwise distances…

         Chicago  Raleigh  Boston  Seattle  S.F.  Austin  Orlando
Chicago     0
Raleigh    641       0
Boston      …        …       0
Seattle     …        …       …       0
S.F.        …        …       …       …      0
Austin      …        …       …       …      …      0
Orlando     …        …       …       …      …      …       0

Distances calculated with geobytes.com/CityDistanceTool

Multidimensional Scaling (MDS): for n data points with distance matrix D, where D_{ij} is the distance between points i and j, we can construct an m-dimensional embedding that preserves inter-point distances by taking the top m eigenvectors of the double-centered squared-distance matrix, scaled by the square roots of their eigenvalues.

MDS result in 2D

Actual plot of cities

Don’t know distances

Don’t know distances

why do manifold learning?
1. data compression
2. “curse of dimensionality”
3. de-noising
4. visualization
5. reasonable distance metrics

reasonable distance metrics ?

linear interpolation ?

reasonable distance metrics ? manifold interpolation

Isomap for images
Build a data graph G:
- Vertices: images
- (u, v) is an edge iff SSD(u, v) is small
For any two images, we approximate the distance between them by the “shortest path” on G.

Isomap, step 1: build a sparse graph with K-nearest neighbors. The resulting distance matrix D_G is sparse (entries only for neighboring pairs).

Isomap, step 2: infer the remaining interpoint distances by finding shortest paths on the graph (Dijkstra’s algorithm). D_G now approximates the geodesic distance between every pair of points.

Isomap: shortest distance on a graph is easy to compute

Isomap results: hands

Isomap: pros and cons
+ preserves global structure
+ few free parameters
- sensitive to noise and noisy edges
- computationally expensive (dense matrix eigen-reduction)

Leakage problem

Locally Linear Embedding: find a mapping that preserves local linear relationships between neighbors.

Locally Linear Embedding

LLE: two key steps
1. Find the weight matrix W of linear coefficients that best reconstructs each point from its neighbors: minimize E(W) = Σ_i ‖x_i − Σ_j W_{ij} x_j‖², enforcing the sum-to-one constraint Σ_j W_{ij} = 1.

LLE: two key steps
2. Find the projected vectors Y that minimize the reconstruction error Φ(Y) = Σ_i ‖y_i − Σ_j W_{ij} y_j‖². This must be solved for the whole dataset simultaneously.

LLE result: preserves local topology. (Figure: the same data embedded by PCA vs. LLE.)

LLE: pros and cons
+ no local minima, one free parameter
+ incremental and fast
+ simple linear algebra operations
- can distort global structure