Region Segmentation

Readings: Chapter 10: 10.1

Additional Materials Provided:
– K-means Clustering (text)
– EM Clustering (paper)
– Graph Partitioning (text)
– Mean-Shift Clustering (paper)

Image Segmentation

Image segmentation is the operation of partitioning an image into a collection of connected sets of pixels:
1. into regions, which usually cover the image
2. into linear structures, such as line segments and curve segments
3. into 2D shapes, such as circles, ellipses, and ribbons (long, symmetric regions)

Example: Regions

Main Methods of Region Segmentation
1. Region Growing
2. Split and Merge
3. Clustering

Clustering

There are K clusters C_1, …, C_K with means m_1, …, m_K. The least-squares error is defined as

  D = Σ_{k=1}^{K} Σ_{x_i ∈ C_k} ||x_i − m_k||²

Out of all possible partitions into K clusters, choose the one that minimizes D. Why don't we just do this? If we could, would we get meaningful objects?
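The least-squares error D can be evaluated directly for any given partition. A minimal sketch, assuming numpy; the names least_squares_error, points, labels, and means are illustrative, not from the slides:

```python
import numpy as np

def least_squares_error(points, labels, means):
    # D = sum over clusters k of sum over x_i in C_k of ||x_i - m_k||^2
    diffs = points - means[labels]          # each point minus its cluster's mean
    return float(np.sum(diffs ** 2))

# Two 1-D clusters: {1, 3} with mean 2, and {10, 12} with mean 11.
points = np.array([[1.0], [3.0], [10.0], [12.0]])
labels = np.array([0, 0, 1, 1])
means = np.array([[2.0], [11.0]])
print(least_squares_error(points, labels, means))  # (1+1) + (1+1) = 4.0
```

Minimizing D over all possible partitions is intractable, which is why K-means settles for iterative improvement.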

K-Means Clustering

Form K-means clusters from a set of n-dimensional vectors:
1. Set ic (iteration count) to 1.
2. Choose randomly a set of K means m_1(1), …, m_K(1).
3. For each vector x_i compute D(x_i, m_k(ic)) for k = 1, …, K and assign x_i to the cluster C_j with the nearest mean.
4. Increment ic by 1 and update the means to get m_1(ic), …, m_K(ic).
5. Repeat steps 3 and 4 until C_k(ic) = C_k(ic+1) for all k.
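The five steps above can be sketched as follows, assuming numpy; the function name kmeans and the toy data are my own:

```python
import numpy as np

def kmeans(X, K, rng=None, max_iter=100):
    # Step 2: choose K of the data vectors at random as the initial means.
    if rng is None:
        rng = np.random.default_rng(0)
    means = X[rng.choice(len(X), size=K, replace=False)].astype(float)
    for _ in range(max_iter):
        # Step 3: assign each vector x_i to the cluster with the nearest mean.
        dists = np.linalg.norm(X[:, None, :] - means[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # Step 4: update each mean to the centroid of its current cluster.
        new_means = np.array([X[labels == k].mean(axis=0) if np.any(labels == k)
                              else means[k] for k in range(K)])
        # Step 5: stop once the means (hence the clusters) no longer change.
        if np.allclose(new_means, means):
            break
        means = new_means
    return means, labels

# Two well-separated 2-D blobs.
X = np.array([[0.0, 0.0], [0.1, 0.0], [5.0, 5.0], [5.1, 5.0]])
means, labels = kmeans(X, K=2)
print(sorted(round(float(m), 2) for m in means[:, 0]))  # [0.05, 5.05]
```

Checking the means for equality, rather than the assignments, is an equivalent stopping criterion: the means change only if some assignment changed.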

K-Means Example 1

K-Means Example 2

K-Means Example 3

K-means Variants
– Different ways to initialize the means
– Different stopping criteria
– Dynamic methods for determining the right number of clusters (K) for a given image
– The EM algorithm: a probabilistic formulation of K-means

K-Means

Boot step:
– Initialize K clusters: C_1, …, C_K. Each cluster is represented by its mean m_j.

Iteration step:
– Estimate the cluster C(x_i) for each data point x_i.
– Re-estimate the cluster parameters.

K-Means Example

K-Means Example: Where do the red points belong?

K-means vs. EM

                          K-means                             EM
  Cluster representation  mean                                mean, variance, and weight
  Cluster initialization  randomly select K means             initialize K Gaussian distributions
  Expectation             assign each point to closest mean   soft-assign each point to each distribution
  Maximization            compute means of current clusters   compute new parameters of each distribution

Notation

N(μ, σ) is a 1D normal (Gaussian) distribution with mean μ and standard deviation σ (so the variance is σ²).

N(μ, Σ) is a multivariate Gaussian distribution with mean vector μ and covariance matrix Σ.

What is a covariance matrix? For (R, G, B) data:

          R       G       B
    R   σ_R²    σ_RG    σ_RB
    G   σ_GR    σ_G²    σ_GB
    B   σ_BR    σ_BG    σ_B²

  variance(X):  σ_X² = (1/N) Σ (x_i − μ)²
  cov(X, Y):   σ_XY = (1/N) Σ (x_i − μ_X)(y_i − μ_Y)
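The 3 × 3 RGB covariance matrix above can be computed from pixel data with exactly these (1/N) formulas. A sketch assuming numpy; the toy pixel values are illustrative:

```python
import numpy as np

# Toy "image": N pixels, one (R, G, B) row each.
pixels = np.array([[200.0,  30.0, 30.0],
                   [220.0,  40.0, 35.0],
                   [ 60.0, 180.0, 70.0],
                   [ 50.0, 170.0, 60.0]])

mean = pixels.mean(axis=0)                  # per-channel mean vector mu
centered = pixels - mean
cov = centered.T @ centered / len(pixels)   # (1/N) sum (x_i - mu)(x_i - mu)^T
print(cov.shape)                            # (3, 3)
# Diagonal entries are the variances sigma_R^2, sigma_G^2, sigma_B^2;
# off-diagonal entries are the covariances sigma_RG, sigma_RB, sigma_GB.
print(np.allclose(cov, np.cov(pixels, rowvar=False, bias=True)))  # True
```

Note the matrix is symmetric (σ_RG = σ_GR), which is why only six distinct values appear.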

1. Suppose we have a set of clusters C_1, C_2, …, C_K over a set of data points X = {x_1, x_2, …, x_N}.
   – P(C_j) is the probability or weight of cluster C_j.
   – P(C_j | x_i) is the probability of cluster C_j given point x_i.
   – P(x_i | C_j) is the probability of point x_i belonging to cluster C_j.

2. Suppose that a cluster C_j is represented by a Gaussian distribution N(μ_j, Σ_j). Then for any point x_i, P(x_i | C_j) is given by the Gaussian density with parameters (μ_j, Σ_j).

EM: Expectation-Maximization

Boot step:
– Initialize K clusters: C_1, …, C_K.

Iteration step:
– Expectation: estimate the cluster of each data point.
– Maximization: re-estimate the cluster parameters (μ_j, Σ_j) and P(C_j) for each cluster j.

1-D EM with Gaussian Distributions

Each cluster C_j is represented by a Gaussian distribution N(μ_j, σ_j).

Initialization: For each cluster C_j initialize its mean μ_j, variance σ_j², and weight π_j:
  N(μ_1, σ_1), π_1 = P(C_1)
  N(μ_2, σ_2), π_2 = P(C_2)
  N(μ_3, σ_3), π_3 = P(C_3)

Expectation

For each point x_i and each cluster C_j compute P(C_j | x_i):

  P(C_j | x_i) = P(x_i | C_j) P(C_j) / P(x_i)

  P(x_i) = Σ_j P(x_i | C_j) P(C_j)

Where do we get P(x_i | C_j) and P(C_j)?

1. Use the pdf for a normal distribution:

  P(x_i | C_j) = (1 / (√(2π) σ_j)) e^{−(x_i − μ_j)² / (2σ_j²)}

2. Use π_j = P(C_j) from the current parameters of cluster C_j.

Maximization

Having computed P(C_j | x_i) for each point x_i and each cluster C_j, use them to compute a new mean, variance, and weight for each cluster:

  μ_j = Σ_i P(C_j | x_i) x_i / Σ_i P(C_j | x_i)

  σ_j² = Σ_i P(C_j | x_i) (x_i − μ_j)² / Σ_i P(C_j | x_i)

  π_j = P(C_j) = (1/N) Σ_i P(C_j | x_i)
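The 1-D expectation and maximization steps above can be sketched together, assuming numpy; the function names and the two-cluster toy data are my own:

```python
import numpy as np

def gauss_pdf(x, mu, sigma):
    # P(x | C_j) for a 1-D Gaussian N(mu, sigma)
    return np.exp(-(x - mu) ** 2 / (2 * sigma ** 2)) / (np.sqrt(2 * np.pi) * sigma)

def em_1d(x, mus, sigmas, weights, n_iter=50):
    for _ in range(n_iter):
        # Expectation: P(C_j | x_i) proportional to P(x_i | C_j) P(C_j)
        resp = np.array([w * gauss_pdf(x, m, s)
                         for m, s, w in zip(mus, sigmas, weights)])
        resp /= resp.sum(axis=0)                       # normalize by P(x_i)
        # Maximization: re-estimate mean, variance, and weight per cluster
        totals = resp.sum(axis=1)
        mus = (resp @ x) / totals
        sigmas = np.sqrt((resp * (x - mus[:, None]) ** 2).sum(axis=1) / totals)
        weights = totals / len(x)
    return mus, sigmas, weights

x = np.array([0.0, 0.2, -0.2, 10.0, 10.2, 9.8])        # two obvious 1-D clusters
mus, sigmas, weights = em_1d(x, mus=np.array([1.0, 9.0]),
                             sigmas=np.array([1.0, 1.0]),
                             weights=np.array([0.5, 0.5]))
print(sorted(round(float(m), 1) for m in mus))         # means near 0.0 and 10.0
```

Unlike K-means, each point contributes to every cluster, weighted by its responsibility P(C_j | x_i).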

Multi-Dimensional Expectation Step for Color Image Segmentation

Input (known): data points
  x_1 = {r_1, g_1, b_1}, x_2 = {r_2, g_2, b_2}, …, x_i = {r_i, g_i, b_i}, …

Input (estimation): cluster parameters
  (μ_1, Σ_1), p(C_1) for C_1
  (μ_2, Σ_2), p(C_2) for C_2
  …
  (μ_k, Σ_k), p(C_k) for C_k

Output: classification results
  p(C_1 | x_1), p(C_j | x_2), …, p(C_j | x_i), …

Multi-Dimensional Maximization Step for Color Image Segmentation

Input (known): data points
  x_1 = {r_1, g_1, b_1}, x_2 = {r_2, g_2, b_2}, …, x_i = {r_i, g_i, b_i}, …

Input (estimation): classification results
  p(C_1 | x_1), p(C_j | x_2), …, p(C_j | x_i), …

Output: cluster parameters
  (μ_1, Σ_1), p(C_1) for C_1
  (μ_2, Σ_2), p(C_2) for C_2
  …
  (μ_k, Σ_k), p(C_k) for C_k

Full EM Algorithm, Multi-Dimensional

Boot step:
– Initialize K clusters: C_1, …, C_K.

Iteration step:
– Expectation step
– Maximization step: re-estimate (μ_j, Σ_j) and P(C_j) for each cluster j.

Visualizing EM Clusters: the mean ellipses show one, two, and three standard deviations.

EM Demo
– Demo
– Example

EM Applications
– Blobworld: image segmentation using Expectation-Maximization and its application to image querying
– Yi's generative/discriminative learning of object classes in color images

Blobworld: Sample Results

Jianbo Shi's Graph-Partitioning

An image is represented by a graph whose nodes are pixels or small groups of pixels. The goal is to partition the vertices into disjoint sets so that the similarity within each set is high and the similarity across different sets is low.

Minimal Cuts

Let G = (V, E) be a graph. Each edge (u, v) has a weight w(u, v) that represents the similarity between u and v. G can be broken into two disjoint graphs with node sets A and B by removing the edges that connect these sets. Let

  cut(A, B) = Σ_{u ∈ A, v ∈ B} w(u, v)

One way to segment G is to find the minimal cut.

Cut(A, B)

  cut(A, B) = Σ_{u ∈ A, v ∈ B} w(u, v)

(Figure: a graph split into node sets A and B, with cut edges of weights w1 and w2.)

Normalized Cut

Minimal cut favors cutting off small node groups, so Shi proposed the normalized cut:

  Ncut(A, B) = cut(A, B) / asso(A, V) + cut(A, B) / asso(B, V)

where

  asso(A, V) = Σ_{u ∈ A, t ∈ V} w(u, t)

measures how much A is connected to the graph as a whole.
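The definition above is easy to evaluate on a small graph. A sketch assuming numpy and a hypothetical 4-node weight matrix (two tight pairs joined by one weak edge):

```python
import numpy as np

def ncut(W, A):
    # Ncut(A, B) = cut(A,B)/asso(A,V) + cut(A,B)/asso(B,V) for symmetric weights W
    n = W.shape[0]
    A = np.asarray(A)
    B = np.setdiff1d(np.arange(n), A)        # B = V \ A
    cut = W[np.ix_(A, B)].sum()              # total weight crossing the cut
    asso_A = W[A, :].sum()                   # connection of A to the whole graph V
    asso_B = W[B, :].sum()
    return cut / asso_A + cut / asso_B

# Pairs {0,1} and {2,3} are strongly linked; edge (0,2) is weak.
W = np.array([[0.0, 1.0, 0.1, 0.0],
              [1.0, 0.0, 0.0, 0.0],
              [0.1, 0.0, 0.0, 1.0],
              [0.0, 0.0, 1.0, 0.0]])
print(round(ncut(W, [0, 1]), 3))   # cutting only the weak edge: small Ncut
print(round(ncut(W, [0, 2]), 3))   # cutting both strong edges: large Ncut
```

The good bipartition {0,1} vs {2,3} scores far lower than the bad one, which is exactly the behavior the normalization is designed to reward.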

Example Normalized Cut

(Figure: computing Ncut(A, B) for a small graph with node sets A and B.)

Shi turned graph cuts into an eigenvector/eigenvalue problem.

Set up a weighted graph G = (V, E):
– V is the set of N pixels
– E is a set of weighted edges (weight w_ij gives the similarity between nodes i and j)
– Length-N vector d: d_i is the sum of the weights from node i to all other nodes
– N x N matrix D: the diagonal matrix with d on its diagonal
– N x N symmetric matrix W: W_ij = w_ij

– Let x be a characteristic vector of a set A of nodes: x_i = 1 if node i is in A, x_i = −1 otherwise.
– Let y be a continuous approximation to x.
– Solve the system (D − W) y = λ D y for the eigenvectors y and eigenvalues λ.
– Use the eigenvector y with the second smallest eigenvalue to bipartition the graph (y => x => A).
– If further subdivision is merited, repeat recursively.
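A sketch of the eigenvector step, assuming numpy and the same kind of toy 4-node graph (two tight pairs joined by a weak edge). Since D is diagonal and positive, the generalized problem (D − W) y = λ D y can be rewritten as the standard symmetric problem D^(−1/2) (D − W) D^(−1/2) z = λ z with y = D^(−1/2) z, which numpy's eigh can solve:

```python
import numpy as np

# Hypothetical 4-node graph: pairs {0,1} and {2,3} joined by a weak edge.
W = np.array([[0.0, 1.0, 0.1, 0.0],
              [1.0, 0.0, 0.0, 0.0],
              [0.1, 0.0, 0.0, 1.0],
              [0.0, 0.0, 1.0, 0.0]])
d = W.sum(axis=1)                          # d_i: total weight from node i
D_inv_sqrt = np.diag(1.0 / np.sqrt(d))

# Standard symmetric form of (D - W) y = lambda D y.
L_sym = D_inv_sqrt @ (np.diag(d) - W) @ D_inv_sqrt
eigvals, eigvecs = np.linalg.eigh(L_sym)   # eigenvalues in ascending order
y = D_inv_sqrt @ eigvecs[:, 1]             # eigenvector for 2nd smallest eigenvalue

A = np.where(y > 0)[0]                     # threshold y at 0: y => x => A
print(sorted(int(i) for i in A))           # one side of the bipartition
```

Thresholding y at zero recovers the discrete indicator x; here it separates the two tight pairs, cutting only the weak edge.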

How Shi used the procedure

Shi defined the edge weights w(i, j) by

  w(i, j) = e^{−||F(i) − F(j)||² / σ_I} · e^{−||X(i) − X(j)||² / σ_X}   if ||X(i) − X(j)|| < r
  w(i, j) = 0                                                           otherwise

where X(i) is the spatial location of node i and F(i) is the feature vector for node i, which can be intensity, color, texture, motion, … The formula is set up so that w(i, j) is 0 for nodes that are too far apart.

Examples of Shi Clustering: see Shi's web page.

Problems with EM
– Local minima
– Need to know the number of segments
– Need to choose a generative model

Problems with Graph Cuts
– Need to know when to stop
– Very slooooow

Mean-Shift Clustering
– Simple, like K-means
– But you don't have to select K
– Statistical method
– Guaranteed to converge to a fixed number of clusters

Finding Modes in a Histogram

How many modes are there?
– Easy to see, hard to compute

Mean Shift [Comaniciu & Meer]

Iterative mode search:
1. Initialize a random seed and window W.
2. Calculate the center of gravity (the "mean") of W: Σ_{x ∈ W} x H(x) / Σ_{x ∈ W} H(x)
3. Translate the search window to the mean.
4. Repeat steps 2 and 3 until convergence.

Numeric Example

Must use the normalized histogram!

  x:      10    11    12    13    14
  H(x):    5     4     3     2     1
  N(x):  5/15  4/15  3/15  2/15  1/15

With the mean shift window W centered at 12 (covering x = 10, …, 14):

  Σ x N(x) = 10(5/15) + 11(4/15) + 12(3/15) + 13(2/15) + 14(1/15) = 170/15 ≈ 11.33
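Iterating the mode search on this histogram can be sketched as follows, assuming a window of radius 2 around the center (so the initial window at 12 covers x = 10, …, 14 as in the example); the names H and window_mean are my own:

```python
# Histogram from the example: counts at x = 10..14, summing to 15.
H = {10: 5, 11: 4, 12: 3, 13: 2, 14: 1}

def window_mean(center, radius=2):
    # Center of gravity of the bins inside the window; dividing weighted
    # sum by total count is equivalent to using the normalized N(x).
    xs = [x for x in H if abs(x - center) <= radius]
    return sum(x * H[x] for x in xs) / sum(H[x] for x in xs)

center = 12.0                        # start with the window W centered at 12
for _ in range(20):
    new_center = window_mean(center)
    if abs(new_center - center) < 1e-6:
        break                        # converged: window mean = window center
    center = new_center
print(round(center, 2))              # first shift is 170/15 = 11.33, then settles
```

The first iteration reproduces the 170/15 ≈ 11.33 computed on the slide; the shifted window then drops bin 14 and the search settles where the window mean stops moving.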

Mean Shift Approach
– Initialize a window around each point.
– See where it shifts; this determines which segment the point is in.
– Multiple points will shift to the same segment.

Segmentation Algorithm
1. First run the mean shift procedure for each data point x and store its convergence point z.
2. Link together all the z's that are closer than 0.5 from each other to form clusters.
3. Assign each point to its cluster.
4. Eliminate small regions.
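Steps 1-3 can be sketched on 1-D data, assuming numpy, a flat window of radius 1.0, and the 0.5 linking threshold from step 2; the names and toy data are my own:

```python
import numpy as np

def mean_shift_point(x0, data, radius=1.0, n_iter=100):
    # Step 1: shift x0 to the mean of the points in its window until it stops.
    x = x0
    for _ in range(n_iter):
        window = data[np.abs(data - x) <= radius]
        new_x = window.mean()
        if abs(new_x - x) < 1e-6:
            break
        x = new_x
    return x

data = np.array([0.0, 0.1, 0.2, 5.0, 5.1, 5.2])            # two 1-D clusters
z = np.array([mean_shift_point(x, data) for x in data])    # convergence points

# Step 2-3: link convergence points closer than 0.5 into the same cluster.
labels = np.zeros(len(z), dtype=int)
reps = []                     # one representative convergence point per cluster
for i, zi in enumerate(z):
    for k, rep in enumerate(reps):
        if abs(zi - rep) < 0.5:
            labels[i] = k
            break
    else:
        labels[i] = len(reps)
        reps.append(zi)
print(labels)                 # points 0-2 in one cluster, points 3-5 in the other
```

Step 4 (eliminating small regions) would simply drop clusters whose member count falls below a threshold.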

Mean-Shift for Image Segmentation

It is useful to take spatial information into account: instead of clustering in (R, G, B), run in (R, G, B, x, y) space.

References
– Shi and Malik, "Normalized Cuts and Image Segmentation," Proc. CVPR 1997.
– Carson, Belongie, Greenspan, and Malik, "Blobworld: Image Segmentation Using Expectation-Maximization and its Application to Image Querying," IEEE PAMI, Vol. 24, No. 8, Aug. 2002.
– Comaniciu and Meer, "Mean Shift Analysis and Applications," Proc. ICCV 1999.