Presentation transcript: "Multispectral Remote Sensing. MultiSpec program from Purdue. On Information Extraction Principles for Hyperspectral Data" (David Landgrebe, School of Electrical and Computer Engineering, Purdue University)

1 Multispectral Remote Sensing. MultiSpec program from Purdue. On Information Extraction Principles for Hyperspectral Data, David Landgrebe, School of Electrical and Computer Engineering, Purdue University.

2 Spaces
Image space – a rendering of the data from (usually) up to three of the many sensor bands
Spectral space – a function space whose elements are spectra or linear combinations of spectra
Feature space – representation, representation, representation

3 Image Space: http://www.nasm.edu/ceps/RPIF/LANDSAT/

4 http://edcwww.cr.usgs.gov/Webglis/glisbin/guide.pl/glis/hyper/guide/napp

5 Spectral Space (Theory) [Figure: idealized reflectance vs. wavelength curves for road, vegetation, and water]

6 Spectral Space (Reality) [Figure: measured reflectance vs. wavelength curves for road, vegetation, water, and an unknown material, with two wavelengths λ1 and λ2 marked]

7 Feature Space [Figure: scatter of sensor response at λ1 versus sensor response at λ2, with clusters for road, vegetation, and water and a point from the unknown material]

8 Feature Space
Choosing features can be hard. Start with the native data representation, for example (see the sketch after this list):
– sensor responses at band centers
– mean sensor response in each band
– other weighted averages
– sensor response at specified wavelengths
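As a rough illustration of these starting points, the sketch below builds band-mean and band-center features from a hyperspectral cube. The cube shape, wavelength grid, and band limits are hypothetical placeholders (plain NumPy, not MultiSpec's own feature-extraction routines).

import numpy as np

# Hypothetical cube: rows x cols x spectral samples, plus the wavelength
# (micrometers) of each spectral sample. Both are placeholder values.
cube = np.random.rand(100, 100, 220)
wavelengths = np.linspace(0.4, 2.5, 220)

# Assumed band limits (low, high) in micrometers, for illustration only.
bands = [(0.45, 0.52), (0.52, 0.60), (0.63, 0.69), (0.76, 0.90)]

# "Mean sensor response in band": average the cube over each band's samples.
band_means = []
for lo, hi in bands:
    in_band = (wavelengths >= lo) & (wavelengths < hi)
    band_means.append(cube[:, :, in_band].mean(axis=2))
feature_image = np.dstack(band_means)          # rows x cols x num_features

# "Sensor response at band centers": take the sample nearest each band center.
centers = [int(np.argmin(np.abs(wavelengths - (lo + hi) / 2))) for lo, hi in bands]
center_features = cube[:, :, centers]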

9 Feature Space [Figure: sensor response at λ1 versus sensor response at λ2, showing the road, vegetation, and water clusters]

10 Classifying
[Figure: road and vegetation clusters in the (sensor response at λ1, sensor response at λ2) plane with candidate decision boundaries]
The nearest-mean rule may assign a pixel to the wrong class; a second-order rule (Gaussian maximum-likelihood decision boundary) does better.
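For contrast with the quadratic classifier on the following slides, a minimal nearest-mean rule looks like the sketch below (NumPy, with hypothetical two-feature class means). It uses only first-order statistics, which is why an elongated or widely spread class can capture pixels that lie closer to another class's mean.

import numpy as np

def nearest_mean_classify(x, class_means):
    # Assign feature vector x to the class whose mean is closest (Euclidean distance).
    names = list(class_means)
    dists = [np.linalg.norm(x - class_means[name]) for name in names]
    return names[int(np.argmin(dists))]

# Hypothetical means in the (response at λ1, response at λ2) feature space.
means = {"road": np.array([0.30, 0.28]), "vegetation": np.array([0.10, 0.55])}
print(nearest_mean_classify(np.array([0.25, 0.40]), means))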

11 Statistical Moments
Nf = number of features, Ns = number of samples.
Samples x(i,j), i = 1…Ns, j = 1…Nf, from a class C.
μ_C(j) = mean of x(:,j)
Σ_C(r,s) = Cov_C(r,s) = mean{ (x(:,r) - μ_C(r)) (x(:,s) - μ_C(s)) }
These are the sample moments. Training = finding enough samples that these are good estimates of the true moments.
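A direct sketch of these sample moments, assuming the training pixels for one class are stored as an Ns x Nf NumPy array (the array layout is my assumption, not the slide's):

import numpy as np

def class_moments(samples):
    # samples: Ns x Nf array of training pixels for one class C.
    # Returns the sample mean mu_C (Nf,) and sample covariance Sigma_C (Nf, Nf).
    mu = samples.mean(axis=0)
    centered = samples - mu
    # Divide by Ns to match the slide's mean{...}; np.cov would use Ns - 1 by default.
    sigma = centered.T @ centered / samples.shape[0]
    return mu, sigma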

12 Quadratic (Gaussian) Classifier
x is an element of an unknown class (e.g., a pixel of unknown classification); it is a column vector of size Nf. What class is it in?
g_C(x) = -(1/2) ln|Σ_C| - (1/2)(x - μ_C)^t Σ_C^(-1) (x - μ_C)
Choose C if g_C(x) >= g_D(x) for all other classes D.
If all the classes are Gaussian, this is a maximum-likelihood classifier; among all possible classifiers it minimizes a reasonable error measure.
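A minimal sketch of the discriminant g_C(x) and the argmax rule, reusing class_moments from the previous sketch. Using slogdet and solve instead of an explicit determinant and inverse is my choice for numerical stability, not something the slide specifies.

import numpy as np

def gaussian_discriminant(x, mu, sigma):
    # g_C(x) = -0.5 * ln|Sigma_C| - 0.5 * (x - mu_C)^t Sigma_C^-1 (x - mu_C)
    _, logdet = np.linalg.slogdet(sigma)
    diff = x - mu
    return -0.5 * logdet - 0.5 * diff @ np.linalg.solve(sigma, diff)

def classify(x, moments):
    # moments: dict mapping class name -> (mu_C, Sigma_C); pick the class with largest g_C(x).
    return max(moments, key=lambda c: gaussian_discriminant(x, *moments[c]))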

13 Improving the feature space
Want to:
– minimize dimension
– find a basis of uncorrelated features
– simplify calculations
Tools: eigenvalue analysis, ANOVA, KL (Karhunen-Loève) decomposition, ... (a sketch follows this list).
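One common way to get an uncorrelated, lower-dimensional basis is a principal-component (Karhunen-Loève) transform via eigenanalysis of the sample covariance. The sketch below is plain NumPy under that interpretation, not MultiSpec's own feature-extraction code.

import numpy as np

def principal_components(samples, n_components):
    # samples: Ns x Nf array. Project onto the eigenvectors of the sample
    # covariance with the largest eigenvalues, giving uncorrelated features
    # of lower dimension.
    mu = samples.mean(axis=0)
    centered = samples - mu
    cov = centered.T @ centered / samples.shape[0]
    eigvals, eigvecs = np.linalg.eigh(cov)            # eigenvalues in ascending order
    order = np.argsort(eigvals)[::-1][:n_components]  # keep the top n_components
    return centered @ eigvecs[:, order]               # Ns x n_components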

