Normalized Cuts and Image Segmentation




1 Normalized Cuts and Image Segmentation
Advanced Topics in Computer Vision
Amir Lev-Tov, IDC Herzliya

2 Main References
[1] Shi and Malik, Normalized Cuts and Image Segmentation, IEEE Conf. on Computer Vision and Pattern Recognition, 1997.
[2] Shi and Malik, Normalized Cuts and Image Segmentation, IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. 22, No. 8, 2000.

3 More References
[3] Weiss Y., Segmentation Using Eigenvectors: A Unifying View, Proceedings IEEE International Conference on Computer Vision, 1999.
[4] Ng A.Y., Jordan M.I., and Weiss Y., On Spectral Clustering: Analysis and an Algorithm, NIPS 2001.
[5] Rayleigh's Quotient, Nail Gumerov, 2003.
[6] Wu and Leahy, An Optimal Graph Theoretic Approach to Data Clustering, PAMI, 1993.

4 Mathematical Introduction
Definition: λ is an eigenvalue of an n x n matrix A if there exists a non-trivial vector v such that Av = λv. Such a vector is called an eigenvector of A corresponding to the eigenvalue λ. Eigenvectors corresponding to distinct eigenvalues are linearly independent; for a symmetric (Hermitian) matrix they are moreover orthogonal.

5 Mathematical Introduction
A matrix A is called Hermitian if A = A*, where A* is the conjugate transpose of A: (A*)_ij = conj(A_ji). A real matrix is Hermitian iff it is symmetric. Let A be a Hermitian matrix. Then R(A, v) = (v* A v) / (v* v) is called the Rayleigh quotient of A.

6 Mathematical Introduction
For real matrices, where A is just symmetric, the definition becomes R(A, v) = (vᵀ A v) / (vᵀ v).

7 Mathematical Introduction
Theorem: the Rayleigh quotient attains its minimum value at A's minimal eigenvalue, and the corresponding eigenvector achieves this minimum. Moreover, if A has n eigenvalues then R(A, v) has n stationary points, attained at the corresponding eigenvectors. (That is, they are found by solving the system Av = λv.)
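This property is easy to check numerically. A minimal sketch (the matrix and the number of random samples are arbitrary choices for illustration): the Rayleigh quotient of a symmetric matrix equals the smallest eigenvalue at the corresponding eigenvector, and never falls below it in any other direction.

```python
import numpy as np

# Symmetric (hence Hermitian) example matrix, chosen for illustration only.
A = np.array([[ 2.0, -1.0,  0.0],
              [-1.0,  2.0, -1.0],
              [ 0.0, -1.0,  2.0]])

def rayleigh(A, v):
    """Rayleigh quotient R(A, v) = v.T A v / v.T v."""
    return (v @ A @ v) / (v @ v)

eigvals, eigvecs = np.linalg.eigh(A)   # eigenvalues in ascending order
v_min = eigvecs[:, 0]                  # eigenvector of the smallest eigenvalue

# The quotient at v_min equals the smallest eigenvalue...
r_at_min = rayleigh(A, v_min)
# ...and random directions never dip below it.
rng = np.random.default_rng(0)
samples = [rayleigh(A, rng.standard_normal(3)) for _ in range(1000)]
```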

8 Mathematical Introduction
The generalized Rayleigh quotient is R(A, B; v) = (vᵀ A v) / (vᵀ B v), where B is a Hermitian (for real matrices, symmetric) positive definite matrix (all its eigenvalues are strictly positive). The minimum is achieved by solving the generalized eigensystem Av = λBv. (Note: the denominator is vᵀBv instead of just vᵀv.)
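The generalized quotient can be minimized the same way. A minimal sketch, assuming scipy is available (the matrices are illustrative): scipy.linalg.eigh solves Av = λBv directly, and the minimizing eigenvector attains the smallest generalized eigenvalue.

```python
import numpy as np
from scipy.linalg import eigh

A = np.array([[ 2.0, -1.0],
              [-1.0,  2.0]])            # symmetric
B = np.array([[ 3.0,  0.0],
              [ 0.0,  1.0]])            # symmetric positive definite

eigvals, eigvecs = eigh(A, B)           # solves A v = lambda B v, ascending
v0 = eigvecs[:, 0]                      # minimizer of the generalized quotient
quotient = (v0 @ A @ v0) / (v0 @ B @ v0)
```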

9 Segmentation Introduction

10 Segmentation Introduction
Problem: divide an image into subsets of pixels (segments).
Some methods:
Thresholding
Region growing
K-means
Mean-Shift
Use of changes in color, texture, etc.
Contours

11 Segmentation Introduction
The problem is not very well defined. For example, how many groups are in the picture? 4? Maybe 3? 2? Or even every X

12 Segmentation Introduction
In order to get good segmentation we need:
Low-level cues such as color, texture, etc.
High-level knowledge, such as a global impression of the picture (top-down).
A good similarity function.
The number of segments is not known in advance.

13 The Graph partitioning method
Main idea:
Model the image by a graph G=(V,E).
Assign similarity values as edge weights.
Find a cut in G of minimal value, which yields a partition of V into two subsets.
Use a matrix representation of the computations.
Use linear algebra tools and spectral analysis to solve the new minimization problem.
Recursively repartition the subpartitions.
(Speaker note: explain the meaning of spectral analysis.)

14 Graph Modeling
The graph G=(V,E):
Nodes: pixels (and possibly other higher-level features).
Edges: between every pair of nodes in V.
Weights: the weight w(i,j) is a function of the similarity between nodes i and j.
(Speaker note: on the edge weights in the drawing: high within a group, low between groups.)

15 Graph Modeling Objective
Partition the set of vertices V into disjoint sets V1, ..., Vm. The number of segments m is not known.
Cut: in the case m=2, a bi-partition of V into A and B (A ∪ B = V, A ∩ B = ∅). The cut value is:
cut(A,B) = Σ_{u∈A, v∈B} w(u,v)
The optimal cut is the one that minimizes this value.
(Speaker note: the idea: edge weights within the same segment should be high relative to edge weights between different segments.)
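As a small illustration of the cut value (the 4-node affinity matrix is invented for this example):

```python
import numpy as np

# Symmetric affinity matrix of a 4-node graph (illustrative weights):
# nodes 0,1 are strongly tied, nodes 2,3 are strongly tied, weak ties across.
W = np.array([[0.0, 0.9, 0.1, 0.0],
              [0.9, 0.0, 0.2, 0.1],
              [0.1, 0.2, 0.0, 0.8],
              [0.0, 0.1, 0.8, 0.0]])

def cut_value(W, in_A):
    """cut(A, B): sum of weights w(u, v) with u in A and v in B = V \\ A."""
    in_A = np.asarray(in_A, dtype=bool)
    return W[np.ix_(in_A, ~in_A)].sum()

c = cut_value(W, [True, True, False, False])   # A = {0, 1}, B = {2, 3}
```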

16 Minimum Cut
Wu and Leahy [1993]:
Use the cut criterion mentioned above.
Partition G into k subgraphs recursively.
Minimize the maximum cut value.
Produces good segmentation on some of the images.

17 Min Cut - The Problem
Min cut favors cutting small sets of isolated nodes, so it is not necessarily the best cut!

18 Normalized Cut [Shi, Malik, 1997]
Normalize the cut value by the volume of each side of the partition:
Ncut(A,B) = cut(A,B) / assoc(A,V) + cut(A,B) / assoc(B,V)
where assoc(A,V) = Σ_{u∈A, t∈V} w(u,t) is the total connection from nodes in A to all nodes in the graph.

19 Normalized Cut
Properties:
Sets with weak connections between them => low Ncut value.
High association within each set => low Ncut value.
But: small sets are penalized with a high Ncut value.
(Speaker note: explain, especially the third point.)
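The penalty on small sets can be seen numerically. A minimal sketch with the same kind of invented 4-node affinity matrix: the balanced split gets a lower Ncut value than cutting off a single well-connected node.

```python
import numpy as np

# Illustrative affinity matrix: two tight pairs {0,1} and {2,3}.
W = np.array([[0.0, 0.9, 0.1, 0.0],
              [0.9, 0.0, 0.2, 0.1],
              [0.1, 0.2, 0.0, 0.8],
              [0.0, 0.1, 0.8, 0.0]])

def ncut(W, in_A):
    """Ncut(A,B) = cut/assoc(A,V) + cut/assoc(B,V)."""
    in_A = np.asarray(in_A, dtype=bool)
    cut = W[np.ix_(in_A, ~in_A)].sum()
    assoc_A = W[in_A, :].sum()    # total connection from A to all nodes
    assoc_B = W[~in_A, :].sum()
    return cut / assoc_A + cut / assoc_B

balanced = ncut(W, [True, True, False, False])   # the natural split
lone     = ncut(W, [True, False, False, False])  # isolates one node
```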

20 Normalized Association
A naturally related criterion:
Nassoc(A,B) = assoc(A,A) / assoc(A,V) + assoc(B,B) / assoc(B,V)
and Ncut(A,B) = 2 − Nassoc(A,B), so minimizing the normalized cut and maximizing the normalized association are equivalent.

21 Computing the Optimal Cut
Given a partition of the nodes of V into A and B:
Let x be an N = |V| dimensional indicator vector for A, i.e. x_i = 1 if node i is in A and x_i = −1 otherwise.
Let d(i) = Σ_j w(i,j) be the total connection from node i to all other nodes.
Rewrite:
Ncut(A,B) = (Σ_{x_i>0, x_j<0} −w_ij x_i x_j) / (Σ_{x_i>0} d_i) + (Σ_{x_i<0, x_j>0} −w_ij x_i x_j) / (Σ_{x_i<0} d_i)

22 Computing the Optimal Cut
Objective: transform the Ncut expression into a Rayleigh-quotient-like form, which spectral methods can minimize.

23 Matrix Representation
Let D be an N x N diagonal matrix with d on its diagonal: D_ii = d(i).
Let W be an N x N symmetric affinity matrix with W_ij = w(i,j).
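Constructing D from W is a one-liner. A minimal sketch (the weights are illustrative):

```python
import numpy as np

# Illustrative symmetric affinity matrix W.
W = np.array([[0.0, 0.5, 0.2],
              [0.5, 0.0, 0.3],
              [0.2, 0.3, 0.0]])

d = W.sum(axis=1)   # d(i) = total connection from node i to all other nodes
D = np.diag(d)      # diagonal degree matrix with d on its diagonal
```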

24 Matrix Representation
Let k be the ratio between the degree of A and the degree of V: k = (Σ_{x_i>0} d_i) / (Σ_i d_i).
Let 1 be an N x 1 vector of all ones.
Note: (1+x)/2 and (1−x)/2 are indicator vectors for x_i > 0 and x_i < 0 respectively.

25 Matrix Representation
We can rewrite our expression as:
4·Ncut(A,B) = 4·Ncut(x) = ((1+x)ᵀ(D−W)(1+x)) / (k·1ᵀD1) + ((1−x)ᵀ(D−W)(1−x)) / ((1−k)·1ᵀD1)
Here 1+x and 1−x are used instead of the indicators (1+x)/2 and (1−x)/2; this only rescales the expression and does not change our minimization problem or goal.
Note that 1ᵀD1 = Σ_i d_i.

26 Matrix Representation
It can be shown that the previous expression is equal to:
4·Ncut(x) = ((1+x) − b(1−x))ᵀ (D−W) ((1+x) − b(1−x)) / (b·1ᵀD1)
where b = k/(1−k) is the ratio between the degree of A and the degree of B.

27 Matrix Representation
Setting the new indicator y = (1+x) − b(1−x), we get the constraint: yᵀD1 = 0.
(yᵀD1 is in fact the sum of the degrees, weighted by the indicator values.)

28 Matrix Representation
Denominator: yᵀDy = 4b·1ᵀD1, which together with the previous expression gives Ncut(x) = yᵀ(D−W)y / (yᵀDy).

29 Finding the Minimum
Putting the last two expressions together, we get the Rayleigh quotient:
min_x Ncut(x) = min_y yᵀ(D−W)y / (yᵀDy)
with the conditions y_i ∈ {1, −b} (up to a harmless rescaling of y) and yᵀD1 = 0.
Relaxing y to take on real values, the minimum is achieved by finding the minimal eigenvalue of the generalized eigensystem (1): (D−W)y = λDy.
The corresponding eigenvector will in fact be an indicator vector for the nodes in the segment (A).
(Speaker notes: explain that this Rayleigh form was the goal of the derivation; preconditions for Rayleigh: symmetry.)
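The relaxed problem can be solved directly. A minimal sketch on an invented toy graph (two triangles joined by one weak edge): solving (D−W)y = λDy with scipy and thresholding the second smallest eigenvector at zero recovers the two clusters.

```python
import numpy as np
from scipy.linalg import eigh

# Toy graph: two dense 3-node clusters joined by one weak bridge edge.
W = np.zeros((6, 6))
for i, j in [(0, 1), (0, 2), (1, 2), (3, 4), (3, 5), (4, 5)]:
    W[i, j] = W[j, i] = 1.0          # strong within-cluster edges
W[2, 3] = W[3, 2] = 0.05             # weak bridge between the clusters
D = np.diag(W.sum(axis=1))

# Generalized symmetric eigenproblem (D - W) y = lambda D y.
eigvals, eigvecs = eigh(D - W, D)    # eigenvalues in ascending order
y1 = eigvecs[:, 1]                   # second smallest eigenvector
labels = y1 > 0                      # split at zero (one candidate threshold)
```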

30 Finding the Minimum
But we have two constraints: yᵀD1 = 0 and y_i ∈ {1, −b}.
We'll see that the first one is satisfied automatically. Replacing y by z = D^(1/2) y, we get the standard eigensystem (2): D^(−1/2)(D−W)D^(−1/2) z = λz.
z0 = D^(1/2)·1 is an eigenvector of it with an eigenvalue of 0, since the Laplacian matrix (D−W) is symmetric and positive semi-definite, and so is the matrix of the new system.

31 Finding the Minimum
Thus, z0 is the smallest eigenvector of (2).
It is also known that all the eigenvectors of equation (2) are mutually orthogonal (its matrix is symmetric).
In particular z1, the second smallest eigenvector, is orthogonal to z0.

32 Finding the Minimum
In terms of our original system (1): y0 = 1 is the smallest eigenvector, with eigenvalue 0, and y1 = D^(−1/2) z1 is the 2nd smallest eigenvector of (1).
The 1st constraint is automatically satisfied: y1ᵀD1 = (D^(−1/2) z1)ᵀ D 1 = z1ᵀ D^(1/2) 1 = z1ᵀ z0 = 0.

33 Finding the Minimum
In a Rayleigh quotient, under the constraint that z is orthogonal to the j−1 smallest eigenvectors, the quotient is minimized by the next smallest eigenvector z_j, and its minimum is the corresponding eigenvalue λ_j.
We get: z1 = argmin_{zᵀz0 = 0} zᵀ D^(−1/2)(D−W)D^(−1/2) z / (zᵀz), with minimum λ1.
Consequently: y1 = argmin_{yᵀD1 = 0} yᵀ(D−W)y / (yᵀDy).

34 Finding the Minimum
Conclusion: the 2nd smallest eigenvector of (1) is the real-valued solution to our Normalized Cut problem.
What about the 2nd constraint, that y takes on discrete values?
Solving the discrete problem is NP-complete.
Solution: approximate the continuous solution by splitting the vector coordinates at different thresholds, choosing the one that gives the best Ncut value.
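The splitting-point search can be sketched as follows (same invented two-triangle toy graph as before; each coordinate value of the eigenvector is tried as a threshold and the partition with minimal Ncut is kept):

```python
import numpy as np
from scipy.linalg import eigh

# Toy graph: two triangles joined by one weak bridge (illustrative weights).
W = np.zeros((6, 6))
for i, j in [(0, 1), (0, 2), (1, 2), (3, 4), (3, 5), (4, 5)]:
    W[i, j] = W[j, i] = 1.0
W[2, 3] = W[3, 2] = 0.05
D = np.diag(W.sum(axis=1))
y1 = eigh(D - W, D)[1][:, 1]         # second smallest generalized eigenvector

def ncut(W, in_A):
    in_A = np.asarray(in_A, dtype=bool)
    c = W[np.ix_(in_A, ~in_A)].sum()
    return c / W[in_A].sum() + c / W[~in_A].sum()

# Try each coordinate value as a splitting threshold; keep the best Ncut.
best = None
for t in np.unique(y1)[:-1]:         # exclude the max so both sides non-empty
    part = y1 > t
    val = ncut(W, part)
    if best is None or val < best[0]:
        best = (val, part)
best_ncut, best_part = best
```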

35 Complexity
Wait! What about the original graph problem?
MinCut: has a polynomial-time algorithm via the MaxFlow algorithm, but it is impractical for images.
Normalized Cut: NP-complete, so we need fast approximations.

36 Complexity
Solving a standard eigenvalue problem on n nodes takes O(n³) in general, impractical for segmenting a large number of pixels (n = number of nodes in the graph = number of pixels).
Special properties of our problem:
The graph is often only locally connected => sparse matrix.
Only the top few eigenvectors are needed.
Low precision requirements.
These properties are exploited by using the Lanczos eigensolver.
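A minimal sketch of the sparse route, assuming scipy (a chain graph stands in for a locally connected image graph, and this particular transformation is one convenient variant, not the paper's exact implementation): instead of asking the Lanczos solver for the smallest eigenvalues directly, take the largest eigenvalues of the normalized affinity D^(−1/2) W D^(−1/2); they correspond to the smallest eigenvalues of the normalized Laplacian and converge quickly.

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import eigsh

# Sparse chain graph: each node connected only to its neighbor, so W never
# needs to be densified (stand-in for a locally connected pixel graph).
n = 60
rows = np.arange(n - 1)
W = sp.coo_matrix((np.ones(n - 1), (rows, rows + 1)), shape=(n, n))
W = (W + W.T).tocsr()
d = np.asarray(W.sum(axis=1)).ravel()

# Normalized affinity D^(-1/2) W D^(-1/2), still sparse.
Dinv_sqrt = sp.diags(1.0 / np.sqrt(d))
Wn = (Dinv_sqrt @ W @ Dinv_sqrt).tocsr()

# Lanczos for the k largest eigenvalues ('LA' = largest algebraic); these map
# to the k smallest eigenvalues of the normalized Laplacian I - Wn.
vals, vecs = eigsh(Wn, k=3, which='LA')
lap_vals = np.sort(1.0 - vals)       # smallest Laplacian eigenvalues, ascending
```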

37 Repartitioning
Recursively apply the above method to each of the partitions, subject to a "stability" criterion: create sub-partitions by varying the splitting point around the optimal value and check whether the Ncut value changes much.
Repeat until a certain Ncut threshold is exceeded.
Another approach: use higher-order eigenvectors.
Pros: more discriminative information.
Cons: according to Shi & Malik, approximation error accumulates with every eigenvector taken.
(Speaker note: explain the stability criterion.)

38 Summary of the Algorithm
Given the features, construct the graph.
Solve for the eigenvectors with the smallest eigenvalues.
Use the eigenvector with the 2nd smallest eigenvalue to bipartition the graph.
Find the splitting point that minimizes Ncut.
Check stability and the Ncut threshold to decide whether to divide the current partition.
Recursively repartition the segmented parts if necessary.
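The steps above can be sketched end-to-end on a tiny 1-D "image" of brightness values (all values and the σ below are invented for illustration, and the similarity here uses intensity only):

```python
import numpy as np
from scipy.linalg import eigh

# Tiny 1-D "image": two brightness regions (illustrative values).
I = np.array([0.10, 0.15, 0.12, 0.90, 0.95, 0.88])

# Step 1: construct the graph from the features (Gaussian intensity affinity).
sigma = 0.1
W = np.exp(-((I[:, None] - I[None, :]) ** 2) / sigma ** 2)
np.fill_diagonal(W, 0.0)
D = np.diag(W.sum(axis=1))

# Step 2-3: second smallest generalized eigenvector of (D - W) y = lambda D y.
y1 = eigh(D - W, D)[1][:, 1]

def ncut(in_A):
    in_A = np.asarray(in_A, dtype=bool)
    c = W[np.ix_(in_A, ~in_A)].sum()
    return c / W[in_A].sum() + c / W[~in_A].sum()

# Step 4: find the splitting point that minimizes Ncut.
parts = [(ncut(y1 > t), y1 > t) for t in np.unique(y1)[:-1]]
best_ncut, segment = min(parts, key=lambda p: p[0])
```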

39 Experiments
Pixels as graph nodes.
Weight function:
w(i,j) = exp(−||F(i)−F(j)||² / σ_F²) · exp(−||X(i)−X(j)||² / σ_X²) if ||X(i)−X(j)|| < r, and 0 otherwise.
X(i): spatial location of node i.
F(i): feature vector based on intensity, color, or texture information at node i.
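A minimal sketch of this weight function (the σ_X, σ_F, and r values are arbitrary placeholders, not the paper's settings):

```python
import numpy as np

def weight(Xi, Xj, Fi, Fj, sigma_X=4.0, sigma_F=0.1, r=5.0):
    """w(i,j): product of feature and spatial Gaussians, zero beyond radius r."""
    dX2 = np.sum((np.asarray(Xi, float) - np.asarray(Xj, float)) ** 2)
    if np.sqrt(dX2) >= r:
        return 0.0                       # only nearby pixels are connected
    dF2 = np.sum((np.asarray(Fi, float) - np.asarray(Fj, float)) ** 2)
    return np.exp(-dF2 / sigma_F ** 2) * np.exp(-dX2 / sigma_X ** 2)

w_near = weight((0, 0), (1, 0), 0.5, 0.52)   # close and similar -> high weight
w_far  = weight((0, 0), (9, 0), 0.5, 0.52)   # beyond radius r -> zero
```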

40 Experiments
Examples of F(i):
F(i) = 1, for point set segmentation.
F(i) = I(i), the intensity value, for segmenting brightness images.

41 Experiments Point set Taken from [1]

42 Experiments Synthetic image of corner Taken from [1]

43 Experiments Taken from [1]

44 Experiments “Color” image Taken from [1]

45 Experiments Without well defined boundaries: Taken from [1]

46 Experiments Texture segmentation Different orientation stripes
Taken from [1]

47 A little bit more.. Taken from [2]

48 A little bit more.. High order eigenvectors Taken from [2]

49 High order eigenvectors
(Speaker note: in the last images it is harder to segment based on the continuous values of the higher-order eigenvectors.) Taken from [2]

50 1st Vs 2nd Eigenvectors Taken from [3]

51 Summary
Treat the problem as graph partitioning.
The new idea: Normalized Cut instead of a regular cut.
The NCut criterion measures both:
Dissimilarity between the groups.
Similarity within each group.
Global impression extraction of the image.
Spectral analysis in the service of segmenting images.
The generalized eigenvalue system gives a real-valued solution => "segmenting" this data provides a clustering of the original image.

52 Thanks!

