CS654: Digital Image Analysis
Lecture 26: Image segmentation
Recap of Lecture 25
Line detection: least squares, RANSAC
Hough space / parametric space: voting, circle detection, GHT
Outline of Lecture 26
Thresholding: Otsu's method
Region-based segmentation: region growing, split and merge
Introduction
So far we detected boundaries between regions; the distribution of pixel intensities was not exploited. Can the regions be segmented directly? How can the top-down/bottom-up cues be exploited? Two approaches: thresholding and region-based segmentation.
Thresholding
g(x, y) = 1 if f(x, y) > T; 0 if f(x, y) ≤ T
Variants: global thresholding, local thresholding, adaptive thresholding.
Multiple thresholding:
g(x, y) = a if f(x, y) > T2; b if T1 < f(x, y) ≤ T2; c if f(x, y) ≤ T1
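In code, both rules are a couple of NumPy comparisons; a minimal sketch (the function names and the output labels 0, 1, 2 standing in for c, b, a are illustrative):

```python
import numpy as np

def global_threshold(f, T):
    """g(x, y) = 1 where f(x, y) > T, else 0."""
    return (f > T).astype(np.uint8)

def multiple_threshold(f, T1, T2):
    """Three-level thresholding with T1 < T2: labels 0, 1, 2 stand in for c, b, a."""
    g = np.zeros(f.shape, dtype=np.uint8)      # f <= T1        -> 0
    g[(f > T1) & (f <= T2)] = 1                # T1 < f <= T2   -> 1
    g[f > T2] = 2                              # f > T2         -> 2
    return g
```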
Key factors for automatic threshold selection:
Separation between histogram peaks
The noise content in the image
The relative size of the object and the background
Uniformity of the illumination source
Uniformity of the reflectance properties of the image
Iterative threshold selection
1. Select an initial estimate of the threshold T; a good initial value is the average intensity of the image.
2. Partition the image into two groups, R1 and R2, using the threshold T.
3. Calculate the mean grey values m1 and m2 of the partitions R1 and R2.
4. Select a new threshold: T = (m1 + m2) / 2.
5. Repeat steps 2-4 until the mean values m1 and m2 in successive iterations do not change.
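The iteration above can be sketched directly; a minimal version (the stopping tolerance `eps` is an assumed convergence test for "the means do not change"):

```python
import numpy as np

def iterative_threshold(f, eps=0.5):
    """Iterative global threshold selection (steps 1-5 above)."""
    T = f.mean()                              # step 1: average intensity
    while True:
        g1, g2 = f[f > T], f[f <= T]          # step 2: partition into R1, R2
        m1 = g1.mean() if g1.size else T      # step 3: class means
        m2 = g2.mean() if g2.size else T
        T_new = 0.5 * (m1 + m2)               # step 4: new threshold
        if abs(T_new - T) < eps:              # step 5: means stopped changing
            return T_new
        T = T_new
```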
Optimal thresholding Definition: Methods based on approximation of the histogram of an image using a weighted sum of two or more probability densities with normal distribution.
Example Input image Histogram Segmented image
Otsu's Image Segmentation Algorithm
Find the threshold that minimizes the weighted within-class variance; this is equivalent to maximizing the between-class variance.
Operates directly on the gray-level histogram (e.g., 256 numbers, P(i)).
It is fast (once the histogram is computed).
Otsu: Assumptions
The histogram (and the image) is bimodal.
No use of spatial coherence, nor any other notion of object structure.
Assumes stationary statistics, but can be modified to be locally adaptive.
Assumes uniform illumination (implicitly), so the bimodal brightness behaviour arises from object appearance differences only.
Otsu's method: Formulation
The weighted within-class variance is
σ_w²(t) = q1(t) σ1²(t) + q2(t) σ2²(t)
where the class probabilities are estimated as
q1(t) = Σ_{i=1..t} P(i),   q2(t) = Σ_{i=t+1..I} P(i)
and the class means are given by
μ1(t) = Σ_{i=1..t} i P(i) / q1(t),   μ2(t) = Σ_{i=t+1..I} i P(i) / q2(t)
Otsu's method: Formulation
Finally, the individual class variances are
σ1²(t) = Σ_{i=1..t} [i − μ1(t)]² P(i) / q1(t),   σ2²(t) = Σ_{i=t+1..I} [i − μ2(t)]² P(i) / q2(t)
Run through the full range of t values and pick the one that minimizes σ_w²(t). Is this algorithm fast enough?
Between / Within / Total Variance
The total variance does not depend on the threshold (obviously). For any given threshold, the total variance is the sum of the within-class variance (the weighted sum of the two class variances) and the between-class variance, which is the sum of weighted squared distances between the class means and the global mean.
Total variance
The total variance can be expressed as
σ² = σ_w²(t) + σ_B²(t)
with the within-class term σ_w²(t) from before and the between-class term
σ_B²(t) = q1(t) [1 − q1(t)] [μ1(t) − μ2(t)]²
Since σ² is constant, minimizing the within-class variance is the same as maximizing the between-class variance, and the quantities in σ_B²(t) can be computed recursively as we run through the range of t values.
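Because minimizing σ_w²(t) is equivalent to maximizing σ_B²(t), an implementation only needs q1(t), μ1(t), and μ2(t), all obtainable from cumulative sums of the histogram. A minimal sketch (the function name is illustrative, and NumPy cumulative sums stand in for the explicit recursion):

```python
import numpy as np

def otsu_threshold(hist):
    """Return the t maximizing sigma_B^2(t) = q1(t)(1 - q1(t))(mu1(t) - mu2(t))^2.
    hist holds the counts for gray levels 0..L-1."""
    p = hist / hist.sum()                     # P(i), the normalized histogram
    i = np.arange(len(p))
    q1 = np.cumsum(p)                         # q1(t), class-1 probability
    m = np.cumsum(i * p)                      # running sum of i * P(i)
    mG = m[-1]                                # global mean
    q2 = 1.0 - q1
    valid = (q1 > 0) & (q2 > 0)               # both classes must be non-empty
    mu1 = np.divide(m, q1, out=np.zeros_like(m), where=valid)
    mu2 = np.divide(mG - m, q2, out=np.zeros_like(m), where=valid)
    sigma_b2 = np.where(valid, q1 * q2 * (mu1 - mu2) ** 2, 0.0)
    return int(np.argmax(sigma_b2))
```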
Recursive algorithm Initialization... ; Recursion...
Example Input image Histogram Global thresholding Otsu's method
Example (in presence of noise)
Input image Histogram Global threshold Smoothed image Histogram Otsu's method
Image partitioning Input image Histogram Global threshold
Global Otsu's method Image partitioning Local Otsu's method
Thresholding (non-uniform background)
Input image Global thresholding using Otsu's method Local thresholding with moving average
Region-based segmentation
Goal: find coherent (homogeneous) regions in the image. Coherent regions contain pixels which share some similar property.
Advantages: better for noisy images.
Disadvantages: may oversegment (too many regions) or undersegment (too few regions); can't find objects that span multiple disconnected regions.
Types of segmentations
Input Oversegmentation Undersegmentation Multiple Segmentations
Region Segmentation: Criteria
A segmentation is a partition of an image I into a set of regions S satisfying:
∪ Si = S — the partition covers the whole image.
Si ∩ Sj = ∅, i ≠ j — no two regions intersect.
∀ Si, P(Si) = true — the homogeneity predicate is satisfied by each region.
P(Si ∪ Sj) = false for i ≠ j, Si adjacent to Sj — the union of adjacent regions does not satisfy it.
The problem: define and implement the similarity predicate.
Methods of Region Segmentation
Region growing Split and merge Clustering
Region Growing
Start with one pixel of a potential region and try to grow it by adding adjacent pixels until the pixels being compared are too dissimilar.
The first pixel selected can be the first unlabelled pixel in the image, or a set of seed pixels can be chosen from the image.
Usually a statistical test is used to decide which pixels can be added to a region.
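A minimal sketch of this idea, using a fixed intensity tolerance against the seed value as the similarity test (the tolerance-based test and 4-connectivity are simplifying assumptions; a statistical test would replace them):

```python
import numpy as np
from collections import deque

def region_grow(f, seed, tol=10):
    """Grow a region from `seed` (row, col): add 4-neighbours whose intensity
    differs from the seed value by at most `tol`."""
    h, w = f.shape
    mask = np.zeros((h, w), dtype=bool)
    mask[seed] = True
    ref = float(f[seed])
    q = deque([seed])
    while q:                                  # breadth-first growth
        r, c = q.popleft()
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            rr, cc = r + dr, c + dc
            if 0 <= rr < h and 0 <= cc < w and not mask[rr, cc] \
                    and abs(float(f[rr, cc]) - ref) <= tol:
                mask[rr, cc] = True
                q.append((rr, cc))
    return mask
```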
Example Histogram Initial Seed image Input image Final seeds
Threshold 1 Threshold 2 Region growing Output image
Split and Merge
Split into four disjoint quadrants any region Ri for which P(Ri) = false.
When no further splitting is possible, merge adjacent regions Ri and Rj for which P(Ri ∪ Rj) = true.
Stop when no further merging is possible.
Example Input image R1 R2 R1 R3 R4 R4 Split Split Merge R21 R22 R23
Quadtree representation
Clustering
The task of grouping a set of objects so that objects in the same group (called a cluster) are more similar (in some sense or another) to each other than to objects in other clusters.
Model families: connectivity, centroid, distribution, density, graph-based; hard clustering, soft clustering, ...
Feature Space Source: K. Grauman
Centroid model
Computation time is short, but the user has to decide the number of clusters before classifying the data. Built around the concept of a centroid. One of the most famous methods: the K-means method.
Partitional Clustering
K-means algorithm: decide the number of clusters, N, for the final classified result. Number of clusters: N (we now assume N = 3).
Partitional Clustering
K-means algorithm: randomly choose N points as the centroids of the clusters (N = 3).
Partitional Clustering
K-means algorithm: find the nearest centroid for every point and assign the point to that cluster. Notice the definition of the nearest!
Partitional Clustering
K-mean algorithm : Calculate the new centroid of every cluster. Notice the definition of the centroid!
Partitional Clustering
K-means algorithm: repeat the assignment and centroid-update steps until the cluster assignments no longer change.
Partitional Clustering
K-means algorithm: data clustering completed (for N = 3).
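The whole loop fits in a few lines of NumPy; a minimal sketch (the fixed iteration count instead of an explicit convergence test is a simplification):

```python
import numpy as np

def kmeans(points, n, iters=20, seed=None):
    """Plain k-means: choose N random centroids, assign every point to its
    nearest centroid, recompute each centroid as its cluster mean, repeat."""
    points = np.asarray(points, dtype=float)
    rng = np.random.default_rng(seed)
    centroids = points[rng.choice(len(points), n, replace=False)]
    for _ in range(iters):
        d = np.linalg.norm(points[:, None] - centroids[None], axis=2)
        labels = d.argmin(axis=1)                  # nearest-centroid assignment
        for k in range(n):
            if np.any(labels == k):                # leave empty clusters fixed
                centroids[k] = points[labels == k].mean(axis=0)
    return labels, centroids
```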
Example Image Clusters on intensity Clusters on color
Slides by D.A. Forsyth
Example Input image Segmentation using K-means
Thank you Next Lecture: Image segmentation