Unsupervised Classification

CHAPTER 14: CLASSIFICATION
Clustering and Unsupervised Classification
A. Dermanis

Clustering = dividing N pixels into K classes ω1, ω2, …, ωK

mean of class ωi:              m_i = (1/n_i) Σ_{x∈ωi} x
scatter matrix of class ωi:    S_i = Σ_{x∈ωi} (x − m_i)(x − m_i)^T
covariance matrix of class ωi: C_i = (1/n_i) S_i

global mean:              m = (1/N) Σ_x x
total scatter matrix:     S_T = Σ_i Σ_{x∈ωi} (x − m)(x − m)^T
total covariance matrix:  C_T = (1/N) S_T
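The per-class statistics above can be sketched in plain Python for 2-D pixels (helper names and the toy cluster are my own, not from the slides):

```python
# Class mean, scatter, and covariance matrices for one cluster,
# following the slide's formulas:
#   m_i = (1/n_i) * sum(x),  S_i = sum((x - m_i)(x - m_i)^T),  C_i = S_i / n_i

def class_mean(pixels):
    n = len(pixels)
    return [sum(p[d] for p in pixels) / n for d in range(len(pixels[0]))]

def scatter_matrix(pixels, m):
    d = len(m)
    S = [[0.0] * d for _ in range(d)]
    for p in pixels:
        diff = [p[k] - m[k] for k in range(d)]
        for r in range(d):
            for c in range(d):
                S[r][c] += diff[r] * diff[c]   # accumulate (x - m)(x - m)^T
    return S

def covariance_matrix(pixels):
    m = class_mean(pixels)
    S = scatter_matrix(pixels, m)
    n = len(pixels)
    return [[S[r][c] / n for c in range(len(m))] for r in range(len(m))]

cluster = [(1.0, 2.0), (3.0, 2.0), (2.0, 5.0)]
print(class_mean(cluster))         # → [2.0, 3.0]
print(covariance_matrix(cluster))  # C_i = S_i / n_i
```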

Clustering criteria

Overall compactness of the clusters → internal scatter matrix:
S_in = Σ_i S_i = Σ_i Σ_{x∈ωi} (x − m_i)(x − m_i)^T

Degree of distinction between the clusters → external scatter matrix:
S_ex = Σ_i n_i (m_i − m)(m_i − m)^T

S_T = S_in + S_ex = constant

Optimal algorithm: S_in = min and S_ex = max (simultaneously)

Problem: how many clusters? (K = ?)

Extreme choice K = N (each pixel a different class):
ω_k = {x_k}, m_k = x_k, S_k = 0  ⇒  S_in = Σ_k S_k = 0 = min, S_ex = S_T = max

Extreme choice K = 1 (all pixels in a single class):
S_in = S_T, S_ex = 0
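A quick numeric check of the decomposition S_T = S_in + S_ex, using one-dimensional features so the scatter matrices reduce to scalars (the data and names are illustrative, not from the slides):

```python
# Verify S_T = S_in + S_ex for a toy 1-D clustering.

def mean(xs):
    return sum(xs) / len(xs)

def scatter(xs, m):
    # scalar scatter: sum of squared deviations from m
    return sum((x - m) ** 2 for x in xs)

clusters = [[1.0, 2.0, 3.0], [10.0, 11.0]]
all_pixels = [x for c in clusters for x in c]
m = mean(all_pixels)                                        # global mean

S_in = sum(scatter(c, mean(c)) for c in clusters)           # internal scatter
S_ex = sum(len(c) * (mean(c) - m) ** 2 for c in clusters)   # external scatter
S_T = scatter(all_pixels, m)                                # total scatter

print(S_in, S_ex, S_T)   # S_in + S_ex equals S_T
```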

Hierarchical Clustering

Agglomerative clustering: at each step, unify the two closest clusters.
Divisive clustering: at each step, divide the most disperse cluster into two new clusters.

[Figure: dendrogram over clusters A–F; agglomerative merging proceeds bottom-up, divisive splitting top-down.]

Needed: unification criteria; division criteria and procedures.
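A minimal agglomerative sketch of the idea above, merging the two closest clusters until K clusters remain. The single-linkage (minimum) distance and all names are my own choices for illustration:

```python
import math

def cluster_dist(A, B):
    # minimum (single-linkage) distance between two clusters
    return min(math.dist(a, b) for a in A for b in B)

def agglomerate(points, K):
    clusters = [[p] for p in points]      # start: one cluster per pixel
    while len(clusters) > K:
        # find and unify the closest pair of clusters
        pairs = [(cluster_dist(clusters[i], clusters[j]), i, j)
                 for i in range(len(clusters))
                 for j in range(i + 1, len(clusters))]
        _, i, j = min(pairs)
        clusters[i] = clusters[i] + clusters[j]
        del clusters[j]
    return clusters

pts = [(0, 0), (0, 1), (5, 5), (5, 6), (10, 0)]
print(agglomerate(pts, 3))
```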

Distance between two clusters (alternatives):

mean distance:    the average of ||x − y|| over all pairs x ∈ ωi, y ∈ ωj
minimum distance: the smallest ||x − y|| over all such pairs
maximum distance: the largest ||x − y|| over all such pairs

Used in agglomerative and divisive clustering.
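The three alternatives can be sketched as follows (function names are my own):

```python
import math

def pairwise(A, B):
    # all cross-pair Euclidean distances between clusters A and B
    return [math.dist(a, b) for a in A for b in B]

def mean_distance(A, B):
    d = pairwise(A, B)
    return sum(d) / len(d)

def min_distance(A, B):
    return min(pairwise(A, B))

def max_distance(A, B):
    return max(pairwise(A, B))

A = [(0.0, 0.0), (1.0, 0.0)]
B = [(4.0, 0.0), (6.0, 0.0)]
print(min_distance(A, B), max_distance(A, B), mean_distance(A, B))  # → 3.0 6.0 4.5
```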

The K-means or migrating-means algorithm

[Figure: 16 sample pixels, numbered 1–16, used in the following steps.]

Step 0: Select K = 3 pixels as the initial positions of the means.

Step 1: Assign each pixel to the cluster of its closest mean; calculate the new mean of each cluster.

Step 2: Reassign each pixel to the cluster of its closest mean; recalculate the means.

Step 3: Reassign each pixel to the cluster of its closest mean; recalculate the means.

Step 4: Assign each pixel to the cluster of its closest mean. Every pixel remains in the same cluster and the means do not change: the algorithm terminates.
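The assign/recompute loop of the steps above can be sketched compactly; the toy data and initial means are my own:

```python
import math

def kmeans(points, means):
    assignment = None
    while True:
        # assign each pixel to the cluster of its closest mean
        new_assignment = [min(range(len(means)),
                              key=lambda k: math.dist(p, means[k]))
                          for p in points]
        if new_assignment == assignment:   # nothing moved: terminate
            return means, assignment
        assignment = new_assignment
        # recalculate the mean of each cluster
        for k in range(len(means)):
            members = [p for p, a in zip(points, assignment) if a == k]
            if members:
                means[k] = tuple(sum(c) / len(members) for c in zip(*members))

pts = [(1, 1), (1, 2), (8, 8), (9, 8)]
means, labels = kmeans(pts, [(0.0, 0.0), (10.0, 10.0)])
print(means, labels)   # → [(1.0, 1.5), (8.5, 8.0)] [0, 0, 1, 1]
```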

The Isodata Algorithm

A variant of the K-means algorithm. In each step, one of 3 additional procedures can be used:
1. Cluster ELIMINATION: eliminate clusters with very few pixels.
2. Cluster UNIFICATION: unify pairs of clusters that are very close to each other.
3. Cluster DIVISION: divide large, elongated clusters into two clusters.

1. Cluster ELIMINATION: eliminate clusters with very few pixels.
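A one-line sketch of the elimination step (the threshold name `n_min` is my own assumption):

```python
def eliminate_small(clusters, n_min):
    # drop clusters whose pixel count falls below the minimum size
    return [c for c in clusters if len(c) >= n_min]

clusters = [[(0, 0), (0, 1), (1, 0)], [(9, 9)]]
print(eliminate_small(clusters, n_min=2))   # the singleton cluster is dropped
```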

2. Cluster UNIFICATION: unify pairs of clusters that are very close to each other.
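A sketch of the unification step; the threshold `d_min` and helper names are assumptions, and "very close" is taken to mean the distance between cluster means:

```python
import math

def mean(c):
    return tuple(sum(v) / len(c) for v in zip(*c))

def unify_close(clusters, d_min):
    merged = True
    while merged:                       # repeat until no pair is close enough
        merged = False
        for i in range(len(clusters)):
            for j in range(i + 1, len(clusters)):
                if math.dist(mean(clusters[i]), mean(clusters[j])) < d_min:
                    clusters[i] = clusters[i] + clusters[j]   # unify the pair
                    del clusters[j]
                    merged = True
                    break
            if merged:
                break
    return clusters

cs = [[(0, 0), (1, 0)], [(1, 1)], [(10, 10)]]
print(unify_close(cs, d_min=2.0))
```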

3. Cluster DIVISION: divide large, elongated clusters into two clusters.

The unification process: two clusters whose means m1 and m2 lie very close together are merged into a single cluster.

The division process: an elongated cluster with mean m2 and standard deviation σ2 along its longest axis is split into two new clusters seeded at m2 − kσ2 and m2 + kσ2.
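A 1-D sketch of the division process described above, seeding the two new clusters at m − kσ and m + kσ. This is a simplification of my own: real ISODATA splits along the axis of maximum variance and uses several control parameters.

```python
import math

def mean_and_std(xs):
    m = sum(xs) / len(xs)
    var = sum((x - m) ** 2 for x in xs) / len(xs)
    return m, math.sqrt(var)

def divide_if_elongated(xs, sigma_max, k=1.0):
    m, s = mean_and_std(xs)
    if s <= sigma_max:                  # not elongated enough: keep as one cluster
        return [xs]
    lo, hi = m - k * s, m + k * s       # the two new seed means
    left = [x for x in xs if abs(x - lo) <= abs(x - hi)]
    right = [x for x in xs if abs(x - lo) > abs(x - hi)]
    return [left, right]

elongated = [0.0, 1.0, 2.0, 8.0, 9.0, 10.0]
print(divide_if_elongated(elongated, sigma_max=2.0))
```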

Examples of classification using the K-means algorithm
[Figures: K-means results with 3, 5, 7, and 9 classes.]

Examples of classification using the ISODATA algorithm
[Figures: ISODATA results with 3, 5, 7, and 9 classes.]