
1 Image Classification: Introduction Lecture Notes 6 prepared by R. Lathrop 11/99 updated 3/04 Readings: ERDAS Field Guide 6th Ed. CH. 6

2 Image Classification
One of the major applications of remotely sensed imagery is to provide information on the amount and spatial distribution of various types of land use and land cover.
Land cover - the biophysical material covering the land surface.
Land use - the use to which land is put by humans.
The trend is toward more automated procedures that use digital image processing to map land use/land cover.

3 Computer-assisted classification of remotely sensed images
Automatically categorizes all pixels in an image into land cover classes or themes.
Converts image data into information.
Normally uses multispectral data and spectral pattern recognition techniques (as opposed to spatial or temporal pattern recognition) to aid identification.

4 Objective: Image to Thematic Map

5 Remotely Sensed Image Classification
1st step: identify the classification scheme to be applied.
Hierarchical approach of increasing specificity: Level I is most general; Level II is more specific.
The level of classification depends on the spatial, spectral, temporal and radiometric resolution of the image data.

6 National Land Cover Dataset Classification System: 21 classes
Water: 11 Open Water; 12 Perennial Ice/Snow
Developed: 21 Low Intensity Residential; 22 High Intensity Residential; 23 Commercial/Industrial/Transportation
Barren: 31 Bare Rock/Sand/Clay; 32 Quarries/Strip Mines/Gravel Pits; 33 Transitional
Forested Upland: 41 Deciduous Forest; 42 Evergreen Forest; 43 Mixed Forest
Shrubland: 51 Shrubland
Non-Natural Woody: 61 Orchards/Vineyards/Other
Herbaceous Upland Natural/Semi-natural Vegetation: 71 Grasslands/Herbaceous
Herbaceous Planted/Cultivated: 81 Pasture/Hay; 82 Row Crops; 83 Small Grains; 84 Fallow; 85 Urban/Recreational Grasses
Wetlands: 91 Woody Wetlands; 92 Emergent Herbaceous Wetlands
http://landcover.usgs.gov/prodescription.asp
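The class codes above are hierarchical: the tens digit of a Level II code identifies its Level I group. A minimal Python sketch of generalizing Level II codes to Level I; the dictionary and helper name are illustrative, and the group names are taken straight from the list above:

```python
# Generalize NLCD Level II codes (e.g., 41 Deciduous Forest) to their Level I group.
NLCD_LEVEL_I = {
    1: "Water",
    2: "Developed",
    3: "Barren",
    4: "Forested Upland",
    5: "Shrubland",
    6: "Non-Natural Woody",
    7: "Herbaceous Upland Natural/Semi-natural Vegetation",
    8: "Herbaceous Planted/Cultivated",
    9: "Wetlands",
}

def level_one(level_two_code: int) -> str:
    """Return the Level I class name for an NLCD Level II code (tens digit)."""
    return NLCD_LEVEL_I[level_two_code // 10]

print(level_one(42))  # "Forested Upland"
print(level_one(82))  # "Herbaceous Planted/Cultivated"
```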

7 Feature Space Image
Visualization of two bands of image data simultaneously through a two-band scatterplot: the graph of the data file values of one band against the values of another band.
Feature space - an abstract space defined by spectral units.

8 Each dot represents a pixel; the warmer the colors, the higher the frequency of pixels in that portion of the feature space

9 Spectral Pattern Recognition
A numerical process whereby elements of multispectral image data sets are categorized into a limited number of spectrally separable, discrete classes:
1) show (train) the computer the multiple spectral band data associated with each land cover type of interest
2) the computer decides, using some form of classification decision rule, which land cover type each pixel most resembles

10 Classification can be thought of as trying to relate spectral classes or locations in the feature space with the appropriate information class

11 Spectral vs. Information Class
Spectral class - a group (cluster) of spectrally "like" pixels.
Information class - a land use/land cover class of interest.
It may take many spectral classes to describe one information class; one spectral class may represent more than one information class.

12 Spectral vs. Information Classes
It may take many spectral classes to describe one information class; one spectral class may represent more than one information class.
Spectral classes: Sunlit conifer; Hillside shadowed conifers; Deciduous broadleaf
Information classes: Upland Conifer; Upland Deciduous; Lowland Deciduous

13 Spectral Classes: pixels of one land cover type tend to cluster together
[Figure: NIR vs. red reflectance scatterplot showing separate soil, water and vegetation clusters. Adapted from J.A. Richards, 1986]

14 Spectral vs. Information Classes
[Figure: the same NIR vs. red reflectance scatterplot, with the several soil clusters grouped into a single "Soil" information class. Adapted from J.A. Richards, 1986]

15 Spectral & information classes do not always have a 1-to-1 match
[Figure: NIR vs. red reflectance scatterplot with soil, water, vegetation and developed clusters; the same spectral class may belong to more than one information class. Adapted from J.A. Richards, 1986]

16 Classification Process
1) Training/Clustering Stage - the process of defining criteria by which spectral patterns are recognized, developing a numerical description of each spectral class
2) Classification Stage - each pixel in the image data set is categorized into the spectral class it most closely resembles, based on a mathematical decision rule
3) Output Stage - results are presented in a variety of forms (tables, graphics, etc.)

17 Multispectral classification
Multispectral image classification using spectral pattern recognition often relies on measuring the "likelihood" that a pixel belongs to one class vs. another. This likelihood generally relies on some measure of distance between a pixel and the various spectral class clusters. For example, if a pixel is "closest" to Spectral Class 1 vs. Spectral Class 2, then the pixel is classified into Spectral Class 1. Spectral distance can be measured in several ways:
- as simple Euclidean distance in multispectral space
- as a statistical distance or probability

18 Spectral distance
Spectral distance - the Euclidean distance in n-dimensional spectral space:
D = SQRT[ sum over k = 1 to n of (d_k - e_k)^2 ]
where d_k = BV of pixel d in band k and e_k = BV of pixel e in band k, summed across the k = 1 to n bands.

19 What is the spectral distance between Pixel A and Cluster 1?
[Figure: two-band (X, Y) feature space plot with Pixel A and Cluster 1 plotted at (180, 85) and (92, 153)]

20 Spectral Distance example
Distance between [x_1, y_1] and [x_2, y_2] = [180, 85] and [92, 153]:
D = SQRT[ sum (d_k - e_k)^2 ]
D = SQRT[ (180 - 92)^2 + (85 - 153)^2 ]
  = SQRT[ (88)^2 + (-68)^2 ]
  = SQRT[ 7744 + 4624 ]
  = SQRT[ 12,368 ]
  = 111.2
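A small NumPy sketch of the same Euclidean spectral distance calculation; the function and variable names are illustrative, and it reproduces the two-band example above:

```python
import numpy as np

def spectral_distance(d, e):
    """Euclidean distance between two pixels' brightness values across n bands."""
    d = np.asarray(d, dtype=float)
    e = np.asarray(e, dtype=float)
    return float(np.sqrt(np.sum((d - e) ** 2)))

pixel_a = [180, 85]    # (band X, band Y) brightness values
cluster_1 = [92, 153]
print(spectral_distance(pixel_a, cluster_1))  # ≈ 111.2
```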

21 Spectral Distance example
[Figure: the distance components shown in (X, Y) feature space: X_d = 180 - 92 and Y_d = 85 - 153]

22 Supervised vs. Unsupervised Approaches
Supervised - the image analyst "supervises" the selection of spectral classes that represent patterns or land cover features the analyst can recognize (a prior decision).
Unsupervised - statistical "clustering" algorithms are used to select spectral classes inherent in the data; more computer-automated (a posterior decision).

23 Supervised vs. Unsupervised
Supervised (prior decision): from information classes in the image to spectral classes in feature space.
Unsupervised (posterior decision): from spectral classes in feature space to information classes in the image.
[Figure: NIR vs. red feature space plots illustrating the two directions]

24 Supervised vs. Unsupervised workflows
Supervised: select training fields, edit/evaluate signatures, classify image, evaluate classification.
Unsupervised: run clustering algorithm, edit/evaluate signatures, identify classes, evaluate classification.

25 ISODATA (Iterative Self-Organizing Data Analysis Technique) Clustering Algorithm
User-specified inputs:
- maximum number of clusters
- maximum % of pixels whose class values are allowed to be unchanged between iterations
- maximum number of iterations
- minimum number of members in a cluster; if a cluster falls below this threshold, it is eliminated
- maximum standard deviation; if a cluster's std dev exceeds this threshold, the cluster is split into two
- minimum distance between cluster means

26 Initial Cluster Allocation
Clusters are allocated along the mean n-dimensional vector, spaced a standard deviation distance away from the central mean.
[Figure: initial cluster means spaced along the diagonal of NIR vs. red feature space]

27 Algorithm Iteration
Each pixel is compared to each cluster mean and assigned to the cluster whose mean is closest in Euclidean distance:
D = SQRT[ (DN_b1,i - DN_b1,m)^2 + ... + (DN_bx,i - DN_bx,m)^2 ]
where DN_bk,i is the pixel's value in band k and DN_bk,m is the cluster mean in band k. A new cluster center is then computed by averaging the locations of all the pixels assigned to that cluster.
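As a rough illustration, the assign-and-update pass described above can be sketched as follows (essentially a k-means-style loop; the real ISODATA algorithm also merges, splits and deletes clusters using the slide-25 thresholds). The function name, array shapes and default parameter values are assumptions:

```python
import numpy as np

def cluster_iterate(pixels, means, max_iterations=20, max_unchanged_pct=98.0):
    """pixels: (n_pixels, n_bands) array; means: (n_clusters, n_bands) initial cluster means."""
    pixels = np.asarray(pixels, dtype=float)
    means = np.array(means, dtype=float)
    labels = np.full(len(pixels), -1)
    for _ in range(max_iterations):
        # Assign each pixel to the cluster whose mean is closest in Euclidean distance.
        dists = np.linalg.norm(pixels[:, None, :] - means[None, :, :], axis=2)
        new_labels = np.argmin(dists, axis=1)
        # Recompute each cluster mean as the average of the pixels assigned to it.
        for k in range(len(means)):
            members = pixels[new_labels == k]
            if len(members) > 0:
                means[k] = members.mean(axis=0)
        # Stop once enough pixels kept the same cluster label as in the previous pass.
        unchanged_pct = 100.0 * np.mean(new_labels == labels)
        labels = new_labels
        if unchanged_pct >= max_unchanged_pct:
            break
    return labels, means
```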

28 ISODATA: multiple iterations from initial allocation to final assignment
[Figure: NIR vs. red feature space plots showing the initial clusters and the final clusters. Adapted from Jensen, 2nd ed., 1996]

29 Example of Naturally Clustered Data
[Figure: NIR vs. red scatterplot with distinct green vegetation and senesced vegetation clusters. Adapted from Swain]

30 [Figure: four NIR vs. red feature space panels showing the initial cluster centers, the centers after the 1st iteration, after the 2nd iteration, and the final cluster centers]

31 Final Cluster Centers
[Figure: NIR vs. red scatterplot with the final cluster centers marked on the green vegetation and senesced vegetation clusters. Adapted from Swain]

32 In spectral feature space there are generally no distinct, isolated clusters, but rather a continuous gradient. Classification can be thought of as trying to subdivide the feature space into appropriate spectral regions.

33 Algorithm Iteration
The Sum of Squared Errors (SSE) computes the cumulative squared difference (across the various bands) of each pixel from its cluster center for each cluster individually, and then sums these measures over all the clusters.
The algorithm stops either when the iteration threshold is reached or when the maximum % of unchanged pixels threshold is reached.
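A short sketch of the SSE measure as described above; the function and variable names are illustrative:

```python
import numpy as np

def sum_squared_errors(pixels, labels, means):
    """Sum, over all clusters, of the squared spectral differences of each member
    pixel from its cluster mean; a decreasing SSE indicates the clustering is converging."""
    sse = 0.0
    for k, mean in enumerate(means):
        members = pixels[labels == k]
        sse += np.sum((members - mean) ** 2)
    return float(sse)
```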

34 Example: ISODATA clustering
1. Assigning unclassified pixels to cluster means. Initial clusters: 1 (10, 10), 2 (20, 20), 3 (30, 20). Unclassified pixel: (30, 10).
2. Calculating new cluster means. Clusters 1 & 2 are unchanged; cluster 3 migrates to a Band 4 mean, Band 5 mean of (30, 15).
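The numbers in this example can be checked directly. A minimal sketch using the cluster coordinates given on the slide; the cluster-3 update here simply averages the old mean with the newly assigned pixel, which reproduces the slide's result of (30, 15):

```python
import numpy as np

clusters = {1: np.array([10.0, 10.0]),
            2: np.array([20.0, 20.0]),
            3: np.array([30.0, 20.0])}
pixel = np.array([30.0, 10.0])   # the unclassified pixel

# Step 1: assign the unclassified pixel to the nearest cluster mean.
nearest = min(clusters, key=lambda k: np.linalg.norm(pixel - clusters[k]))
print(nearest)  # 3 (distance 10, vs. ~14.1 to cluster 2 and 20 to cluster 1)

# Step 2: recompute the cluster 3 mean from its old mean and the newly assigned pixel.
new_mean = np.mean([clusters[nearest], pixel], axis=0)
print(new_mean)  # [30. 15.]
```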

35 Post-clustering Assignment
The analyst must then assign each spectral cluster to an information class based on available ancillary information (e.g., field reference data, maps, aerial photos, analyst experience): a posterior decision process.
If a cluster cannot be unequivocally assigned to one and only one information class, assign the cluster to a "mixed" class.
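A minimal sketch of this posterior assignment step as a recode: an analyst-built lookup table maps each spectral cluster ID to an information class. The cluster IDs, labels and image values below are purely illustrative:

```python
import numpy as np

# Analyst-built lookup: spectral cluster ID -> information class.
cluster_to_info = {
    1: "Water",
    2: "Deciduous Forest",
    3: "Deciduous Forest",   # two spectral clusters can map to one information class
    4: "Row Crops",
    5: "Mixed",              # cluster that could not be assigned unequivocally
}

cluster_map = np.array([[1, 2, 2],
                        [3, 4, 5]])   # clustered image: one cluster ID per pixel

# Recode each pixel's spectral cluster ID to its information class label.
info_map = np.array([[cluster_to_info[c] for c in row] for row in cluster_map])
print(info_map)
```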

36 Post-Clustering Assignment: what information class can be assigned to each spectral cluster?
[Figure: NIR vs. red feature space clusters labeled with NLCD classes: 11 water; 21 low intensity residential; 22 high intensity residential; 23 commercial; 31 bare rock; 32 quarries; 33 transitional; 41 deciduous forest; 42 evergreen forest; 71 grass; 84 fallow; 91 wetlands. Adapted from Jensen, 2nd ed., 1996]

37 ISODATA Clustering: Pros
- clustering is not geographically biased toward any particular portion of the image
- highly successful at finding inherent spectral clusters
- results are similar to a minimum-distance-to-means classifier

38 ISODATA Clustering: Cons
- the analyst doesn't know a priori the number of spectral classes
- the number of iterations needed is unknown; can be time consuming
- does not account for pixel spatial homogeneity
- insensitive to variance/covariance

39 Cluster-busting
A technique to iteratively "bust up" spectrally "mixed" classes:
- separate "good" vs. "bad" classified pixels into a binary mask
- mask out the "good" image, extract the "bad" image data and re-run the unsupervised process
- re-evaluate the new clusters, keep the good, toss out the "bad", and cluster-bust again
- create the final cluster map by using a GIS overlay with a maximum dominate function

40 Cluster-busting: in feature space
[Figure: NIR vs. red feature space view of the cluster-busting process]

41 Cluster-busting: in geographic space
- separate "good" vs. "bad" classified pixels into a binary mask
- mask out the "good" (green) image
- extract the "bad" (red) image data
- re-run the unsupervised process

42 Cluster-busting
- re-evaluate the new clusters, keep the good, toss out the "bad", and cluster-bust again (if needed)
- create the final cluster map by using a GIS overlay with a maximum dominate function

43 Cluster busting
Recode: "good" class(es) = 0; "bad" class(es) >= 1.
Mask the original image file.

44 Cluster busting
New clusters = the holes; old clusters minus the "bad" = the "swiss cheese". Overlay the two to produce the final map.
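A sketch of the recode/mask/overlay bookkeeping from slides 39-44 using NumPy boolean masking; the function name is illustrative, and 'recluster' stands in for re-running the unsupervised process on the extracted "bad" pixels:

```python
import numpy as np

def cluster_bust(image, cluster_map, bad_clusters, recluster):
    """image: (rows, cols, bands) array; cluster_map: (rows, cols) cluster IDs.
    'recluster' is a callable that clusters the extracted 'bad' pixels and
    returns one new cluster label per bad pixel."""
    bad_mask = np.isin(cluster_map, list(bad_clusters))   # recode: good = 0, bad = 1
    bad_pixels = image[bad_mask]                          # extract only the "bad" image data
    new_labels = recluster(bad_pixels)                    # re-run the unsupervised process
    # Overlay: keep the original labels where they were good; fill the holes with new clusters.
    final_map = cluster_map.copy()
    final_map[bad_mask] = new_labels + cluster_map.max() + 1   # offset avoids ID collisions
    return final_map
```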

