
H. Lexie Yang1, Dr. Melba M. Crawford2


1 Manifold Alignment for Multitemporal Hyperspectral Image Classification
H. Lexie Yang1, Dr. Melba M. Crawford2, School of Civil Engineering, Purdue University, and Laboratory for Applications of Remote Sensing. July 29, 2011, IEEE International Geoscience and Remote Sensing Symposium.

2 Outline
Introduction
Research Motivation: effective exploitation of information for multitemporal classification in nonstationary environments. Goal: learn a "representative" data manifold.
Proposed Approach: manifold alignment via given features, manifold alignment via correspondences, and manifold alignment with spectral and spatial information.
Experimental Results
Summary and Future Directions

3 Introduction
Challenges for classification of hyperspectral data: temporally nonstationary spectra and high dimensionality.
[Figure: multitemporal images acquired in May, June, and July across 2001 to 2006, each with N narrow spectral bands.]
Landscape changes are revealed in these multitemporal images, enabling applications such as environmental monitoring and change detection. Spectral characteristics can vary with spatial or temporal changes.

4 Research Motivation
Nonstationarities in a sequence of images: spectra of the same class may evolve or drift over time.
Potential approaches:
Semi-supervised methods, which require an assumption of smooth changes that may not hold for multitemporal data sets.
Adaptive schemes, which redefine decision boundaries: class means and variances change from scene to scene, so boundaries must be adjusted with samples from the new scene, and good initial conditions are required.
Geometric methods, which exploit similar data geometries and explore data manifolds: assuming the two data sets are structurally similar, find a mapping between the two similar structures.

5 Manifold Learning for Hyperspectral Data
Characterize the data geometry with manifold learning: capture nonlinear structures, recover the intrinsic space (preserving spectral neighbors), and reduce the data dimensionality. Classification is then performed in the low dimensional space.
[Figure: spectra in the original space of N spectral bands mapped into a 3-dimensional manifold space.]
To address the high dimensionality problem, manifold learning has been proposed as a successful dimensionality reduction technique: it recovers an intrinsic space where critical spectral neighborhoods are preserved, captures the nonlinear nature of the data with fewer dimensions, and reveals class clusters in the low dimensional space.

6 Challenges: Modeling Multitemporal Data
Unfaithful joint manifold due to spectral shift; the inter-image correspondences are often difficult to model.
[Figures: data manifold at T1, data manifold at T2, and the two manifolds shown jointly.]

7 Proposed Approach: Exploit Local Structure
Assumption: local geometric structures are similar across images. Approach: extract the local geometry and optimally align it to minimize the overall differences. Given class information, the local geometric structures (localities) in the spectral spaces at T1 and T2 can be assumed similar.

8 Proposed Approach: Conceptual Idea
(Ham, 2005)

9 Proposed Approach: Manifold Alignment
Exploit labeled data for the classification of multitemporal data sets: samples with class labels and samples with no class labels are embedded together. Aligning similar underlying manifolds benefits classification when at least one image contains label information, and the resulting joint manifold characterizes the geometric structures of both data sets.

10 Manifold Alignment: Introduction
X1 and X2 are two multitemporal hyperspectral images; the goal is to predict the labels of X2 using the labeled X1. Local geometries are explored using the graph Laplacian together with some form of prior information. The graph Laplacian L = D - W represents local neighborhood information, where W is the connectivity (weight) matrix and D is the degree matrix. Two potential forms of prior information are available: given features and pairwise correspondences [Ham et al., 2005].
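The graph Laplacian construction described above can be sketched in code. This is a minimal illustration assuming a kNN graph with heat kernel weights; the function and parameter names are not from the paper:

```python
import numpy as np

def graph_laplacian(X, k=5, sigma=1.0):
    """Unnormalized Laplacian L = D - W of a kNN graph over the rows of X.

    X: (n_samples, n_bands) array of spectra.
    """
    n = X.shape[0]
    # pairwise squared Euclidean distances between all spectra
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=2)
    W = np.zeros((n, n))
    for i in range(n):
        nbrs = np.argsort(sq[i])[1:k + 1]   # k nearest neighbors, self excluded
        W[i, nbrs] = np.exp(-sq[i, nbrs] / (2 * sigma ** 2))
    W = np.maximum(W, W.T)                  # symmetrize the kNN graph
    D = np.diag(W.sum(axis=1))              # degree matrix
    return D - W
```

Each row of the Laplacian sums to zero, which is what makes the quadratic form f^T L f a measure of how much an embedding f varies across graph edges.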

11 Manifold Alignment via Given Features
Minimize a two-term cost over the joint manifold: the first term preserves the given features, and the second term imposes clustering conditions on the local properties; μ tunes the relative weights of the two terms in the cost function.
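A hedged sketch of how such a two-term cost can be minimized in closed form, in the spirit of Ham et al. (2005): for a single given feature s on the labeled samples, setting the gradient of mu * sum_i (f_i - s_i)^2 + f^T L f to zero yields a linear system. All names below are illustrative assumptions, not the authors' notation:

```python
import numpy as np

def align_given_features(L, labeled_idx, s, mu=1.0):
    """Minimize mu * sum_{i in labeled} (f_i - s_i)^2 + f^T L f.

    L: (n, n) graph Laplacian; labeled_idx: indices of samples whose
    given feature values are s. The stationarity condition is
    (mu*J + L) f = mu * J s, with J the diagonal indicator of labels.
    """
    n = L.shape[0]
    J = np.zeros((n, n))
    J[labeled_idx, labeled_idx] = 1.0   # diagonal indicator of labeled samples
    s_full = np.zeros(n)
    s_full[labeled_idx] = s             # J @ s_full == s_full by construction
    return np.linalg.solve(mu * J + L, mu * s_full)
```

A large mu pins the embedding to the given features on the labeled samples, while the Laplacian term smoothly interpolates the unlabeled ones.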

12 Manifold Alignment via Pairwise Correspondences
Minimize a cost over the joint manifold: the first term enforces the pairwise alignment constraints (correspondences between the two images), and the second and third terms preserve the local properties of each image; μ tunes the relative weights of the terms in the cost function.
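One common way to realize this correspondence-based cost (a sketch, not necessarily the paper's exact implementation) is to stack the two Laplacians into a joint matrix, add an edge of weight mu between each corresponding pair, and take the bottom nontrivial eigenvectors as the joint embedding:

```python
import numpy as np

def align_with_correspondences(L1, L2, pairs, mu=1.0, dim=2):
    """Joint embedding of two graphs tied together by correspondences.

    L1, L2: graph Laplacians of the two images; pairs: list of (i, j)
    with i indexing image 1 and j indexing image 2.
    """
    n1, n2 = L1.shape[0], L2.shape[0]
    Z = np.zeros((n1 + n2, n1 + n2))
    Z[:n1, :n1] = L1
    Z[n1:, n1:] = L2
    for i, j in pairs:
        # an extra edge of weight mu pulls corresponding samples together
        Z[i, i] += mu
        Z[n1 + j, n1 + j] += mu
        Z[i, n1 + j] -= mu
        Z[n1 + j, i] -= mu
    vals, vecs = np.linalg.eigh(Z)
    # skip the trivial constant eigenvector, keep the next `dim`
    return vecs[:, 1:dim + 1]
```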

13 Manifold Alignment with Spectral and Spatial Information
Combine spatial locations with spectral signatures to improve the quality of the (spectral) local geometries. Idea: increase the similarity measure when two samples are spatially close together. The weight matrix for the graph Laplacian is modified accordingly, with the spatial location of each pixel represented as a coordinate pair.
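The "increase similarity when spatially close" idea can be sketched as a product of a spectral and a spatial Gaussian kernel; the product form and the bandwidth names below are illustrative assumptions, not necessarily the exact formula on the slide:

```python
import numpy as np

def spectral_spatial_weight(x_i, x_j, p_i, p_j, sigma_spec=1.0, sigma_spat=5.0):
    """Graph weight combining spectral similarity and spatial proximity.

    x_i, x_j: spectra of two pixels; p_i, p_j: their (row, col) locations.
    """
    d_spec = np.sum((np.asarray(x_i) - np.asarray(x_j)) ** 2)
    d_spat = np.sum((np.asarray(p_i, float) - np.asarray(p_j, float)) ** 2)
    spectral = np.exp(-d_spec / (2 * sigma_spec ** 2))
    spatial = np.exp(-d_spat / (2 * sigma_spat ** 2))
    return spectral * spatial
```

Two pixels with identical spectra get full weight only when colocated; the weight decays as their spatial distance grows, which stabilizes the local geometry.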

14 Experimental Results: Data
Three Hyperion images collected in May, June, and July 2001. The May, June pair covers adjacent geographical areas, while the June, July pair targets the same area. The May, June pair is a challenging classification scenario because the two scenes are not colocated.

15 Experimental Results: Framework
Pipeline: compute the graph Laplacian L for images I1 and I2, incorporate the prior information (given features or correspondences) to develop the joint manifold, and classify with KNN.
Baseline: develop a data manifold from the pooled data, demonstrating how pooled data can fail to produce a proper joint manifold.
Data sets and labels: Pair 1 is (May, June) and Pair 2 is (June, July); in each pair, the first image provides training data for the KNN classifier and the second provides testing data for the overall accuracy evaluation.
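The final step of the framework, KNN classification in the joint manifold space, can be sketched with plain NumPy; the array shapes and the choice of k below are illustrative assumptions:

```python
import numpy as np

def knn_predict(train_Y, train_labels, test_Y, k=1):
    """Majority-vote kNN in the (low dimensional) joint manifold space."""
    preds = []
    for y in test_Y:
        d = np.sum((train_Y - y) ** 2, axis=1)   # squared distances to training samples
        nbrs = np.argsort(d)[:k]                 # k closest training samples
        vals, counts = np.unique(train_labels[nbrs], return_counts=True)
        preds.append(vals[np.argmax(counts)])
    return np.array(preds)

def overall_accuracy(pred, truth):
    """Overall accuracy used to evaluate the classification results."""
    return float(np.mean(pred == truth))
```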

16 Manifold Learning for Feature Extraction
Global methods consider geodesic distance: Isometric Feature Mapping (ISOMAP) (Tenenbaum, 2000). Local methods consider pairwise Euclidean distance: Locally Linear Embedding (LLE) (Saul and Roweis, 2000), Local Tangent Space Alignment (LTSA) (Zhang and Zha, 2004), and Laplacian Eigenmaps (LE) (Belkin and Niyogi, 2004).
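For reference, the listed methods have off-the-shelf implementations; a sketch using scikit-learn (assuming it is installed; the data and parameter values are illustrative, and Laplacian Eigenmaps corresponds to SpectralEmbedding):

```python
import numpy as np
from sklearn.manifold import Isomap, LocallyLinearEmbedding, SpectralEmbedding

X = np.random.RandomState(0).rand(50, 10)   # 50 samples, 10 "bands"

methods = {
    "ISOMAP (global, geodesic)": Isomap(n_neighbors=5, n_components=2),
    "LLE (local)": LocallyLinearEmbedding(n_neighbors=5, n_components=2),
    "LE (local, Laplacian Eigenmaps)": SpectralEmbedding(n_components=2, n_neighbors=5),
}
# each method reduces the 10-band data to a 2-dimensional embedding
results = {name: est.fit_transform(X) for name, est in methods.items()}
```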

17 MA with Given Features
Baseline: joint manifold developed from pooled data.
[Figure: joint manifold embeddings for the May, June pair, annotated with overall accuracies 79.21, 77.29, 77.88, and 76.31.]

18 MA Results – Classification Accuracy
Evaluate results by overall accuracy.

Overall accuracy (May, June / June, July):
Manifold learning from pooled data: 62.38% / 83.00%
Manifold alignment (MA), given features (LE): 79.21% / 86.16%
Manifold alignment (MA), correspondences: 81.22% / 84.27%

With spatial information, overall accuracy (May, June / June, July):
Given features (LE), spectral: 79.21% / 86.16%
Given features (LE), spectral + spatial: 84.21% / 90.30%
Correspondences, spectral: 81.22% / 84.27%
Correspondences, spectral + spatial: 84.74% / 90.11%

19 Results – Class Accuracy
May, June pair; bold indicates class accuracy.
[Figure: class maps for a typical class (Island Interior) and two critical classes (Riparian and Woodlands).]

20 Summary and Future Directions
Multitemporal spectral changes cause pooled data to fail to provide a faithful data manifold. The manifold alignment framework demonstrates potential for nonstationary environments by utilizing similar local geometries and prior information. Spatial proximity contributes to stabilizing the local geometries for the manifold alignment approaches.
Future directions: investigate alternative spatial and spectral integration strategies; exploit the spatial and contextual information contained in the images to improve the prior information; and study manifold alignment on longer sequences of images.

21 Thank you. Questions?

22 References J. Ham, D. D. Lee, and L. K. Saul, “Semisupervised alignment of manifolds,” in International Workshop on Artificial Intelligence and Statistics, August 2005.

23 Backup Slides

24 Local Manifold Learning for Feature Extraction
Local geometry is preserved via various strategies for embedding. Popular local manifold learning methods: Locally Linear Embedding (LLE) (Saul and Roweis, 2000), Local Tangent Space Alignment (LTSA) (Zhang and Zha, 2004), and Laplacian Eigenmaps (LE) (Belkin and Niyogi, 2004). Pairwise distances between neighbors are computed using a Gaussian kernel function, an O(pN^2) computation, and the embedding is computed to minimize the total distance between neighbors.

25 LE: Impact of Parameter Values
Parameter values for the local embedding: σ obtained via grid search; k and p obtained empirically.
[Figure: embeddings for BOT classes 3 and 6, and for BOT classes 1 through 9.]

26 Alignment Results: Typical Class
Typical class: Island Interior. Class accuracy: pooled data 24.9%, MA given features 67.8%, MA correspondences 96.25%.

27 Alignment Results: Critical Class
Critical class: Riparian. Class accuracy: pooled data 56.1%, MA given features 71.6%, MA correspondences 59.4%.

28 Alignment Results: Critical Class
Critical class: Woodlands. Class accuracy: pooled data 45.1%, MA given features 35.8%, MA correspondences 60.5%.

29 MA Results – Classification Accuracy
Evaluate results by overall accuracy.

Overall accuracy (May, June / June, July):
Manifold learning from pooled data: 62.38% / 83.00%
Manifold alignment (MA), given features: 77.46% / 86.16%
Manifold alignment (MA), correspondences: 81.22% / 84.27%

[Figure: May, June pair; labeled classes (subset data) classified via pooled data, via given features, and via correspondences.]

30 MA Results – Classification Accuracy
Overall accuracy (May, June / June, July):
Given features using LE, spectral: 79.21% / 86.16%
Given features using LE, spectral + spatial: 84.21% / 90.30%
Correspondences, spectral: 81.22% / 84.27%
Correspondences, spectral + spatial: 84.74% / 90.11%

[Figure: May, June pair; labeled classes (subset data) classified via given features and via correspondences, spectral versus spectral + spatial.]

