 In the previous parts we have seen several segmentation methods.  In this lecture we will see graph cut, another segmentation method, based on a powerful mathematical tool.

I am famous for my motivational skills. Everyone always says that they have to work a lot harder when I’m around.

Global information – unlike most of the other segmentation methods we have seen, graph cut uses global information instead of local information.

Motivation! (application) [Figure: original image vs. edited image.] More interesting examples at the end of the lecture…

 Definitions and mathematical reminders.  The normalized cut method as a general tool (this will be the main part).  How to use graph cut on an image for segmentation.  Some examples of graph-cut-based segmentation.

 Graph-Cut (From Wikipedia): In graph theory, a cut is a partition of the vertices of a graph into two disjoint subsets.

 Graph-cut cost: for a weighted undirected graph, the cost of a cut is the sum of the weights of all the edges connecting the two parts.

Example (figure): the cut shown has cost 9, the sum of the weights of the crossed edges.

 Min cut: if for two groups of vertices X and Y we define the cut as cut(X,Y) = Σ_{u∈X, v∈Y} w(u,v), then the min cut is a partition of the graph that minimizes the cut.
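For concreteness, here is a minimal NumPy sketch of this cost (the 4-node graph and its unit weights are hypothetical, and cut_cost is a name chosen for illustration):

    import numpy as np

    def cut_cost(W, mask):
        # Sum of the weights of the edges crossing from X (mask True)
        # to Y (mask False); W is a symmetric (n, n) weight matrix.
        return W[mask][:, ~mask].sum()

    # Toy example with unit weights: X = {0, 1, 2}, Y = {3}.
    W = np.array([[0, 1, 1, 0],
                  [1, 0, 1, 0],
                  [1, 1, 0, 1],
                  [0, 0, 1, 0]], dtype=float)
    print(cut_cost(W, np.array([True, True, True, False])))   # -> 1.0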

 Clustering: dividing data into groups according to some division criterion.

 Given an undirected weighted graph, we would like to find a partition of the graph into two or more clusters of reasonable sizes.

In this example every node is connected to every other node (all weights are equal to 1), so the regular min cut will simply separate a single node:

Separating one node: the cut cost is 5 (one unit-weight edge to each of the other five nodes in this 6-node example).

Separating two nodes: the cut cost is 8. And so on…

 So we saw that min cut favors partitions in which one of the groups contains only a single node.  That's not what we want:

 We now introduce a new definition, the association: assoc(A,V) = Σ_{u∈A, t∈V} w(u,t), the total connection from the nodes in A to all the nodes in the graph, where V is the set of the graph's vertices.

 Now that we have this definition, we can measure the cost of the cut as a fraction of the total edge connections in the graph.

 So, we will define Ncut as: Ncut(A,B) = cut(A,B)/assoc(A,V) + cut(A,B)/assoc(B,V), and we will want to minimize it.
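A minimal sketch of this quantity, assuming the same dense NumPy weight-matrix convention as the cut-cost sketch above (ncut_cost is an illustrative name):

    def ncut_cost(W, mask):
        # Normalized cut for the bipartition A (mask True) / B (mask False).
        cut = W[mask][:, ~mask].sum()    # cut(A, B)
        assoc_a = W[mask].sum()          # assoc(A, V): weight from A to all nodes
        assoc_b = W[~mask].sum()         # assoc(B, V)
        return cut / assoc_a + cut / assoc_b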

 To minimize Ncut we need to: ◦ Minimize cut(A,B), which will guarantee that strongly associated nodes stick together. ◦ Maximize assoc(A,V) and assoc(B,V), which will guarantee that every cluster has a reasonable number of nodes.

 We can also see it by defining the normalized association: Nassoc(A,B) = assoc(A,A)/assoc(A,V) + assoc(B,B)/assoc(B,V), where assoc(A,A) is the total weight of the edges within A (and similarly for B).

 This expression gets bigger as assoc(A,A) and assoc(B,B) get bigger, as well as when cut(A,B) gets smaller.

 As it turns out, by some mathematical manipulation: Ncut(A,B) = 2 − Nassoc(A,B).  This implies that minimizing Ncut is equivalent to maximizing Nassoc(A,B): by maximizing assoc(A,A) and assoc(B,B) and minimizing the edges between A and B, we make Ncut minimal.
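The identity follows in two lines from assoc(A,V) = assoc(A,A) + cut(A,B) (and likewise for B):

    Ncut(A,B) = cut(A,B)/assoc(A,V) + cut(A,B)/assoc(B,V)
              = [assoc(A,V) − assoc(A,A)]/assoc(A,V) + [assoc(B,V) − assoc(B,B)]/assoc(B,V)
              = 2 − Nassoc(A,B).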

 So, what is the problem?  Finding the minimum Ncut is NP-complete!  Solution: instead of a discrete indicator saying which node is in which group (1 or 0), we can work with real numbers, where an approximation can be found efficiently.

 For an undirected weighted graph, let W be a matrix where W_{i,j} denotes the weight of the edge e(i,j).

 For an undirected weighted graph, let D be a diagonal matrix where D_{i,i} = Σ_j W_{i,j} denotes the sum of the weights of the edges e(i,j) over all other nodes j.

 For an undirected weighted graph, let L (the Laplacian) be the matrix L = D – W.
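A minimal NumPy sketch of these three matrices for a hypothetical 4-node graph over nodes a, b, c, d:

    import numpy as np

    # Hypothetical symmetric weights (a, b strongly linked; c, d strongly linked).
    W = np.array([[0.0, 0.8, 0.1, 0.0],
                  [0.8, 0.0, 0.1, 0.0],
                  [0.1, 0.1, 0.0, 0.9],
                  [0.0, 0.0, 0.9, 0.0]])

    D = np.diag(W.sum(axis=1))   # D[i, i] = sum of the edge weights at node i
    L = D - W                    # the graph Laplacian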

 O.K., so we have the Laplacian matrix… what does it give us?

 The Laplacian matrix L is a real symmetric matrix, and so all of its eigenvectors are perpendicular to each other.

 L is positive semi-definite: x^T L x ≥ 0 for every non-zero column vector x (indeed x^T L x = ½ Σ_{i,j} w_{i,j} (x_i − x_j)²), and thus its eigenvalues are non-negative.

 L has n real, non-negative eigenvalues, where the smallest one is 0, corresponding to the all-ones eigenvector.

 The number of times 0 appears as an eigenvalue of L equals the number of connected components in the graph (in our case only one).  The smallest non-zero eigenvalue of L is called the spectral gap (we will use this eigenvalue).

 From Wikipedia: in the clustering of data, spectral clustering techniques make use of the eigenvalues of the similarity matrix of the data to perform dimensionality reduction before clustering in fewer dimensions.  Ncut is an example of spectral clustering; let's see how…

 For a graph partitioned into two groups A and B, an indicator vector y is a 1×n boolean vector that tells us, for every node i, whether node i is in group A or not (1 or 0).

 As Jianbo Shi and Jitendra Malik proved in their paper from 1997, using the properties of the Laplacian matrix L = D − W we get: min Ncut = min_y (y^T (D − W) y) / (y^T D y), where y is an indicator vector, but instead of entries in {1, 0} its entries are in {1, −b}.  What is −b…?

 If we take the diagonal matrix D, then b is the sum of all d_i = D_{i,i} with i ∈ A, divided by the sum of all d_i with i ∈ B.

(Derivation outline: substitute L = D − W, use the fact that D is diagonal to decompose the d_{i,i} terms, and multiply through by 2.)

 We get: y^T L y = ½ Σ_{i,j} w_{i,j} (y_i − y_j)².  And we can see that for minimization, when w_{i,j} is big we would like y_i and y_j to be as close as possible, while when w_{i,j} is small we care a little less.

 We said that min Ncut = min_y (y^T L y)/(y^T D y), and we will want to minimize it.  But keeping y discrete (y_i ∈ {1, −b}) makes the problem NP-complete.

 So, instead of forcing y_i ∈ {1, −b}, we can perform a relaxation by allowing real values in y, and then it becomes an eigenvector/eigenvalue problem: (D − W) y = λ D y, where the y's are the eigenvectors and the eigenvalues represent the cut costs.

 Now, we will want to find the smallest eigenvalue (corresponding to the smallest cut) of (D − W) y = λ D y.  But we saw that its eigenvector is the vector of ones.  That is not what we want, because we would get a partition with all nodes in the same group.

 Solution – we will take the second smallest eigenvalue – the spectral gap.

 Eigenvalues correspond to cut costs.  The smallest eigenvalue is 0, but its eigenvector is all 1's, so it is not good for us.  We choose the second smallest eigenvalue, and its eigenvector is our partition of the graph.
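A minimal sketch of this step using SciPy's generalized symmetric eigensolver; it assumes the graph is connected, so the degree matrix D is positive definite (second_smallest_eigvec is an illustrative name):

    import numpy as np
    from scipy.linalg import eigh

    def second_smallest_eigvec(W):
        # Solve (D - W) y = lambda * D y; eigh returns eigenvalues in
        # ascending order, so column 1 pairs with the second smallest.
        D = np.diag(W.sum(axis=1))
        L = D - W
        eigvals, eigvecs = eigh(L, D)
        return eigvecs[:, 1]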

 We saw that nodes with strong weights between them get similar values in their eigenvector coordinates, so we can take a threshold T on the eigenvector entries and return to the discrete world!  In the slide's example, the entries (a: −1.6, b: −0.9, c: 0.9, d: 2) are thresholded so that a, b land in one group and c, d in the other.
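Continuing the sketch above, thresholding at T = 0 is one common choice (Shi and Malik also consider the median entry, or the value that minimizes Ncut):

    y = second_smallest_eigvec(W)
    labels = (y > 0).astype(int)   # T = 0: negative entries form one group, positive the other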

 Graph cut divides our graph into only two parts. But what if we want to divide further?  One option is to use the 3rd eigenvector to get the next division, but each partition contains some error, which is why this is not recommended; instead, we can recursively re-partition each part.

 Given G=(V,E), compute the matrices D and W.  Compute the eigenvector with the second smallest eigenvalue and perform a bipartition of the graph; repeat on each part as needed (see the sketch below).  Stopping conditions: ◦ Check the stability of the eigenvector values – see whether the values vary continuously rather than splitting cleanly. ◦ Continue only while Ncut < T, where T is a predefined value above which the partitioning should stop.
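A sketch of the recursion, reusing the illustrative helpers ncut_cost and second_smallest_eigvec from above; the stopping thresholds are hypothetical:

    import numpy as np

    def recursive_ncut(W, ids, max_ncut=0.2, min_size=5):
        # Recursively bipartition the node set `ids`; stop when a part is too
        # small, the split is degenerate, or its Ncut exceeds max_ncut.
        if len(ids) < min_size:
            return [ids]
        sub = W[np.ix_(ids, ids)]
        y = second_smallest_eigvec(sub)
        mask = y > 0
        if mask.all() or (~mask).all() or ncut_cost(sub, mask) > max_ncut:
            return [ids]
        return (recursive_ncut(W, ids[mask], max_ncut, min_size) +
                recursive_ncut(W, ids[~mask], max_ncut, min_size))

    segments = recursive_ncut(W, np.arange(W.shape[0]))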

 Given an image, we will construct a graph G=(V,E), where we assign a node to each pixel.  Also, for pixels i and j we denote by w_{i,j} the similarity between the pixels.

 The similarity between two pixels i and j can be defined as: w_{i,j} = exp(−||F(i) − F(j)||² / σ_I²) · exp(−||X(i) − X(j)||² / σ_X²) for pixels within radius r (and w_{i,j} = 0 when ||X(i) − X(j)|| ≥ r), where F(i) is a feature vector based on intensities, colors, etc., and X(i) is the spatial location.

 We can see that for two pixels i, j the value w_{i,j} corresponds to the similarity of their feature vectors and also to their spatial distance.
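A minimal dense sketch of this affinity for a small grayscale image with intensities in [0, 1]; the radius and sigma values are illustrative, and a real implementation would use a sparse matrix, since W has one row and one column per pixel:

    import numpy as np

    def pixel_affinities(img, r=5, sigma_i=0.1, sigma_x=4.0):
        # w_ij = exp(-|F_i - F_j|^2 / sigma_i^2) * exp(-|X_i - X_j|^2 / sigma_x^2)
        # for pixels within radius r of each other, and 0 otherwise.
        h, w = img.shape
        n = h * w
        ys, xs = np.mgrid[0:h, 0:w]
        X = np.column_stack([ys.ravel(), xs.ravel()]).astype(float)   # locations
        F = img.ravel().astype(float)                                 # intensities
        W = np.zeros((n, n))
        for i in range(n):
            d2 = ((X - X[i]) ** 2).sum(axis=1)     # squared spatial distances
            near = d2 < r * r
            W[i, near] = (np.exp(-(F[near] - F[i]) ** 2 / sigma_i ** 2)
                          * np.exp(-d2[near] / sigma_x ** 2))
        np.fill_diagonal(W, 0.0)                   # no self-loops
        return W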

 An intensity-based graph cut example:

Movie time

 Graph cut is one kind of spectral clustering.  We use a threshold to divide the graph after the dimensionality reduction.  Although the math behind it is a little complicated at first, the method itself is simple to implement.

 J. Shi and J. Malik, “Normalized Cuts and Image Segmentation,” Proc. CVPR 1997; also IEEE Transactions on Pattern Analysis and Machine Intelligence, 22(8), pp. 888–905, August 2000.  Slides courtesy: Eric Xing, M. Hein & U. von Luxburg.  Richard Szeliski, Computer Vision: Algorithms and Applications, September 3, 2010 draft.