CS654: Digital Image Analysis


1 CS654: Digital Image Analysis
Lecture 26: Image segmentation

2 Recap of Lecture 25: line detection; least squares, RANSAC;
Hough space / parametric space; voting; circle detection; GHT

3 Outline of Lecture 26: thresholding; Otsu's method;
region-based segmentation; region growing; split and merge

4 Introduction So far we detected boundaries between regions;
the pixel intensity distribution itself was not exploited. Can the regions be segmented directly? How can top-down/bottom-up cues be exploited? Two approaches follow: thresholding and region-based segmentation.

5 Thresholding
Global thresholding, local thresholding, adaptive thresholding:
$$g(x,y) = \begin{cases} 1 & \text{if } f(x,y) > T \\ 0 & \text{if } f(x,y) \le T \end{cases}$$
Multiple thresholding:
$$g(x,y) = \begin{cases} a & \text{if } f(x,y) > T_2 \\ b & \text{if } T_1 < f(x,y) \le T_2 \\ c & \text{if } f(x,y) \le T_1 \end{cases}$$
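A minimal NumPy sketch of the two rules above; the image f, the thresholds T, T1, T2, and the output labels a, b, c are assumed inputs, not values from the lecture:

```python
import numpy as np

def global_threshold(f, T):
    """g(x, y) = 1 if f(x, y) > T, else 0."""
    return (f > T).astype(np.uint8)

def multi_threshold(f, T1, T2, a=2, b=1, c=0):
    """g(x, y) = a if f > T2, b if T1 < f <= T2, c if f <= T1."""
    g = np.full(f.shape, c, dtype=np.uint8)
    g[(f > T1) & (f <= T2)] = b
    g[f > T2] = a
    return g
```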

6 Key factors The separation between the histogram peaks; the noise content in the image; the relative size of the object and the background; the uniformity of the illumination source; and the uniformity of the reflectance property of the image. Together these determine how well a threshold can be selected automatically.

7 Iterative threshold selection
1. Select an initial estimate of the threshold T; a good initial value is the average intensity of the image.
2. Partition the image into two groups, R1 and R2, using the threshold T.
3. Calculate the mean grey values μ1 and μ2 of the partitions R1 and R2.
4. Select a new threshold: T = (μ1 + μ2) / 2.
5. Repeat steps 2-4 until the mean values μ1 and μ2 in successive iterations do not change.
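A minimal sketch of the iterative procedure above, assuming a grayscale NumPy array f; the stopping tolerance eps is an assumed parameter, not part of the lecture:

```python
import numpy as np

def iterative_threshold(f, eps=0.5):
    T = f.mean()                              # step 1: initial estimate = average intensity
    while True:
        R1, R2 = f[f > T], f[f <= T]          # step 2: partition with the current threshold
        mu1 = R1.mean() if R1.size else 0.0   # step 3: class means
        mu2 = R2.mean() if R2.size else 0.0
        T_new = 0.5 * (mu1 + mu2)             # step 4: new threshold
        if abs(T_new - T) < eps:              # step 5: stop when the means stabilize
            return T_new
        T = T_new
```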

8 Optimal thresholding Definition: Methods based on approximation of the histogram of an image using a weighted sum of two or more probability densities with normal distribution.

9 Example: input image, its histogram, and the segmented image.

10 Otsu's Image Segmentation Algorithm
Find the threshold that minimizes the weighted within-class variance; this is equivalent to maximizing the between-class variance. It operates directly on the gray-level histogram (e.g., the bin probabilities P(i)), so it is fast once the histogram is computed.

11 Otsu: Assumptions The histogram (and the image) is bimodal.
No use of spatial coherence, nor any other notion of object structure. Assumes stationary statistics, but can be modified to be locally adaptive. Assumes uniform illumination (implicitly), so the bimodal brightness behavior arises from object appearance differences only.

12 Otsu's method: Formulation
The weighted within-class variance is
$$\sigma_w^2(t) = q_1(t)\,\sigma_1^2(t) + q_2(t)\,\sigma_2^2(t)$$
where the class probabilities are estimated as
$$q_1(t) = \sum_{i=1}^{t} P(i), \qquad q_2(t) = \sum_{i=t+1}^{I} P(i)$$
and the class means are given by
$$\mu_1(t) = \sum_{i=1}^{t} \frac{i\,P(i)}{q_1(t)}, \qquad \mu_2(t) = \sum_{i=t+1}^{I} \frac{i\,P(i)}{q_2(t)}$$

13 Otsu's method: Formulation
Finally, the individual class variances are
$$\sigma_1^2(t) = \sum_{i=1}^{t} \bigl[i - \mu_1(t)\bigr]^2 \frac{P(i)}{q_1(t)}, \qquad \sigma_2^2(t) = \sum_{i=t+1}^{I} \bigl[i - \mu_2(t)\bigr]^2 \frac{P(i)}{q_2(t)}$$
Run through the full range of t values and pick the value that minimizes $\sigma_w^2(t)$. Is this algorithm fast enough?

14 Between/ Within/ Total Variance
The total variance does not depend on the threshold (obviously). For any given threshold, the total variance is the sum of the within-class variance (the weighted sum of the two class variances) and the between-class variance (the weighted sum of squared distances between the class means and the global mean).

15 Total variance The total variance can be expressed as
$$\sigma^2 = \underbrace{q_1(t)\,\sigma_1^2(t) + q_2(t)\,\sigma_2^2(t)}_{\text{within-class, from before}} + \underbrace{q_1(t)\,\bigl[1 - q_1(t)\bigr]\bigl[\mu_1(t) - \mu_2(t)\bigr]^2}_{\text{between-class}}$$
Since the total variance is constant, minimizing the within-class variance is the same as maximizing the between-class variance. The required quantities can be computed recursively as we run through the range of t values.

16 Recursive algorithm Initialization: $q_1(0) = 0$, $\mu_1(0) = 0$. Recursion:
$$q_1(t+1) = q_1(t) + P(t+1)$$
$$\mu_1(t+1) = \frac{q_1(t)\,\mu_1(t) + (t+1)\,P(t+1)}{q_1(t+1)}$$
$$\mu_2(t+1) = \frac{\mu - q_1(t+1)\,\mu_1(t+1)}{1 - q_1(t+1)}$$
where $\mu$ is the global mean.
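A minimal sketch of Otsu's method on a 256-bin histogram, using cumulative sums as a vectorized form of the recursion above; the uint8 input image f is an assumption:

```python
import numpy as np

def otsu_threshold(f):
    hist, _ = np.histogram(f, bins=256, range=(0, 256))
    P = hist / hist.sum()                       # P(i)
    i = np.arange(256)
    q1 = np.cumsum(P)                           # q1(t): class-1 probability
    m1 = np.cumsum(i * P)                       # running first moment
    mu = m1[-1]                                 # global mean
    q2 = 1.0 - q1                               # q2(t)
    with np.errstate(divide="ignore", invalid="ignore"):
        mu1 = m1 / q1                           # mu1(t)
        mu2 = (mu - m1) / q2                    # mu2(t)
        sigma_b = q1 * q2 * (mu1 - mu2) ** 2    # between-class variance
    sigma_b = np.nan_to_num(sigma_b)
    return int(np.argmax(sigma_b))              # t that maximizes the between-class variance
```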

17 Example: input image, its histogram, global thresholding, and Otsu's method.

18 Example (in presence of noise)
Input image, histogram, global threshold; smoothed image, histogram, Otsu's method.

19 Image partitioning: input image, histogram, global threshold;
global Otsu's method; image partitioning with a local Otsu's method.

20 Thresholding (non-uniform background)
Input image; global thresholding using Otsu's method; local thresholding with a moving average (a sketch follows).
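A minimal sketch of local thresholding against a moving average, assuming SciPy is available; the window size n and the factor c are assumed parameters, not values from the lecture:

```python
import numpy as np
from scipy.ndimage import uniform_filter

def moving_average_threshold(f, n=21, c=0.9):
    m = uniform_filter(f.astype(float), size=n)  # local mean over an n x n window
    return (f > c * m).astype(np.uint8)          # foreground where the pixel exceeds c times the local mean
```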

21 Region based segmentation
Goal: find coherent (homogeneous) regions in the image; coherent regions contain pixels that share some similar property. Advantages: better for noisy images. Disadvantages: the result may be oversegmented (too many regions) or undersegmented (too few regions), and objects that span multiple disconnected regions cannot be found.

22 Types of segmentations
Input, oversegmentation, undersegmentation, multiple segmentations.

23 Region Segmentation: Criteria
A segmentation is a partition of an image I into a set of regions {S_i} satisfying:
1. ⋃_i S_i = I (the partition covers the whole image).
2. S_i ∩ S_j = ∅ for i ≠ j (no regions intersect).
3. ∀ S_i: P(S_i) = true (the homogeneity predicate is satisfied by each region).
4. P(S_i ∪ S_j) = false for i ≠ j with S_i adjacent to S_j (the union of adjacent regions does not satisfy it).
The main design task is to define and implement the similarity (homogeneity) predicate.

24 Methods of Region Segmentation
Region growing Split and merge Clustering

25 Region Growing Start with one pixel of a potential region
and try to grow it by adding adjacent pixels until the pixels being compared are too dissimilar. The first pixel selected can be the first unlabelled pixel in the image, or a set of seed pixels can be chosen from the image. Usually a statistical test is used to decide which pixels can be added to a region; a minimal sketch follows.
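A minimal sketch of single-seed region growing with 4-connectivity; comparing each candidate pixel to the seed intensity within a tolerance tol is an assumed similarity test, not the statistical test mentioned above:

```python
import numpy as np
from collections import deque

def region_grow(f, seed, tol=10):
    h, w = f.shape
    region = np.zeros((h, w), dtype=bool)
    seed_val = float(f[seed])
    region[seed] = True
    queue = deque([seed])
    while queue:
        y, x = queue.popleft()
        for dy, dx in ((-1, 0), (1, 0), (0, -1), (0, 1)):       # 4-neighbours
            ny, nx = y + dy, x + dx
            if 0 <= ny < h and 0 <= nx < w and not region[ny, nx]:
                if abs(float(f[ny, nx]) - seed_val) <= tol:     # similarity test
                    region[ny, nx] = True
                    queue.append((ny, nx))
    return region
```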

26 Example: input image, its histogram with thresholds 1 and 2, initial seed image, final seeds, region growing, output image.

27 Split and Merge Split any region R_i for which Q(R_i) = false into four disjoint quadrants. When no further splitting is possible, merge adjacent regions R_i and R_j for which Q(R_i ∪ R_j) = true. Stop when no further merging is possible. A sketch of the split phase follows.
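A minimal sketch of the split phase only, assuming a square power-of-two grayscale image; the homogeneity predicate Q (intensity range within tol) is an assumed choice, and the merge pass over adjacent leaves is omitted:

```python
import numpy as np

def Q(block, tol=15):
    """Assumed homogeneity predicate: true when the intensity range is small."""
    return block.max() - block.min() <= tol

def split(f, y=0, x=0, size=None, leaves=None):
    if size is None:
        size, leaves = f.shape[0], []
    block = f[y:y + size, x:x + size]
    if Q(block) or size == 1:
        leaves.append((y, x, size))                      # homogeneous leaf of the quadtree
    else:
        half = size // 2
        for dy, dx in ((0, 0), (0, half), (half, 0), (half, half)):
            split(f, y + dy, x + dx, half, leaves)       # split into four quadrants
    return leaves
```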

28 Example: the input image is split into quadrants R1-R4, quadrants that are not homogeneous are split again (e.g., R21, R22, R23), and adjacent homogeneous regions are then merged.

29 Quadtree representation

30 Clustering The task of grouping a set of objects
so that objects in the same group (called a cluster) are more similar to each other (in some sense) than to objects in other clusters. Variants include connectivity models, centroid models, distribution models, density models, graph-based models, and hard vs. soft clustering.

31 Feature Space Source: K. Grauman

32 Centroid model Computational time is short.
The user has to decide the number of clusters before classifying the data. Built around the concept of a centroid. One of the best-known methods: the K-means method.

33 Partitional Clustering
K-means algorithm, step 1: decide the number of clusters, N. Here we assume N = 3.

34 Partitional Clustering
K-means algorithm, step 2: randomly choose N points as the initial cluster centroids (N = 3).

35 Partitional Clustering
K-means algorithm, step 3: for every point, find the nearest centroid and assign the point to that cluster. Note how "nearest" is defined (the distance measure)!

36 Partitional Clustering
K-means algorithm, step 4: calculate the new centroid of every cluster. Note how the centroid is defined!

37 Partitional Clustering
K-means algorithm: repeat steps 3-4 (assignment and centroid update) until the cluster assignments no longer change.

38 Partitional Clustering
K-means algorithm: repeat steps 3-4 (assignment and centroid update) until the cluster assignments no longer change.

39 Partitional Clustering
K-means algorithm: repeat steps 3-4 (assignment and centroid update) until the cluster assignments no longer change.

40 Partitional Clustering
K-means algorithm: repeat steps 3-4 (assignment and centroid update) until the cluster assignments no longer change.

41 Partitional Clustering
K-means algorithm: repeat steps 3-4 (assignment and centroid update) until the cluster assignments no longer change.

42 Partitional Clustering
K-means algorithm: data clustering is complete for N = 3. A minimal sketch is given below.
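A minimal sketch of K-means on 1-D pixel intensities with N = 3, as in the slides; stopping when the centroids stop moving is an assumed convergence test:

```python
import numpy as np

def kmeans(data, N=3, iters=100, seed=0):
    rng = np.random.default_rng(seed)
    centroids = data[rng.choice(len(data), N, replace=False)]   # step 2: random initial centroids
    for _ in range(iters):
        d = np.abs(data[:, None] - centroids[None, :])          # step 3: distance of every point
        labels = d.argmin(axis=1)                                #         to each centroid; assign nearest
        new = np.array([data[labels == k].mean() if np.any(labels == k)
                        else centroids[k] for k in range(N)])    # step 4: recompute centroids
        if np.allclose(new, centroids):                          # stop when centroids are stable
            break
        centroids = new
    return labels, centroids

# Usage on a grayscale image: labels, c = kmeans(img.reshape(-1).astype(float))
```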

43 Example: image, clusters on intensity, clusters on color.
Slides by D.A. Forsyth

44 Example: input image and its segmentation using K-means.

45 Thank you Next Lecture: Image segmentation

