
1 Particle Swarm Optimization-based Dimensionality Reduction for Hyperspectral Image Classification
He Yang, Jenny Q. Du
Department of Electrical and Computer Engineering, Mississippi State University, MS 39762, USA

2 Outline
- Motivation
- Existing band selection approaches
  - Unsupervised band selection
  - Supervised band selection
- Particle swarm optimization (PSO)
- PSO for hyperspectral band selection
- Experimental results
- Conclusion

3 Motivation
- The vast data volume of hyperspectral imagery brings about problems in data transmission and storage. In particular, the very high data dimensionality challenges many traditional image analysis algorithms.
- One approach to reducing data dimensionality is to transform the data into a low-dimensional space using certain criteria (e.g., PCA, LDA). However, these methods usually change the physical meaning of the original data, since the channels in the low-dimensional space correspond not to individual original bands but to linear combinations of them.
- Another dimensionality reduction approach is band selection, which selects a subset of the original bands without losing their physical meaning.

4 Motivation (Cont’d)
- In terms of object information availability, band selection techniques fall into two categories: supervised and unsupervised. Supervised methods preserve desired object information that is known a priori, while unsupervised methods do not assume any object information.
- Supervised techniques clearly aim at selecting the bands that carry important object information, and the selected bands can provide better detection or classification than those from unsupervised techniques. When prior knowledge is unavailable, an unsupervised method must be applied that generally offers good performance regardless of the objects to be detected or classified in the following step.

5 Motivation (Cont’d)
- In this research, dimensionality reduction is achieved by supervised band selection, and we propose to use particle swarm optimization (PSO) in conjunction with simple but effective objective functions for optimal band searching.
- We will demonstrate that, using data dimensionality reduction as a pre-processing step, support vector machine (SVM)-based classification accuracy (either before or after decision fusion) can be greatly improved.

6 Unsupervised Band Selection
- The basic idea of unsupervised band selection is to select distinctive and informative bands, using criteria such as:
  - Information entropy
  - First spectral derivative
  - Second spectral derivative
  - Spectral angle
  - Spectral correlation
  - Uniform spectral spacing
- Unsupervised band selection can also be achieved by evaluating band similarity.
- Q. Du and H. Yang, “Similarity-based unsupervised band selection for hyperspectral image analysis,” IEEE Geoscience and Remote Sensing Letters, vol. 5, no. 4, pp. 564-568, Oct. 2008.
- H. Yang, Q. Du, and G. Chen, “Unsupervised hyperspectral band selection using graphics processing units,” IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing, vol. 4, no. 3, July 2011.
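The spectral-angle criterion listed above can be sketched as follows. This is a minimal illustration (the function name `spectral_angle` is ours, not from the cited papers): each band image is flattened to a vector and the angle between two band vectors measures their similarity, with smaller angles meaning more redundant bands.

```python
import numpy as np

def spectral_angle(band_i, band_j):
    """Angle (radians) between two band images flattened to vectors.
    A smaller angle means the two bands are more similar (redundant)."""
    bi = band_i.ravel().astype(float)
    bj = band_j.ravel().astype(float)
    cos_t = np.dot(bi, bj) / (np.linalg.norm(bi) * np.linalg.norm(bj))
    # Clip to guard against floating-point values slightly outside [-1, 1].
    return np.arccos(np.clip(cos_t, -1.0, 1.0))
```

In a similarity-based scheme, bands whose pairwise angles are large would be kept as distinctive.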

7 Supervised Band Selection
- When class information is known, supervised band selection is applied to preserve the desired object information.
- A supervised band selection algorithm maximizes class separability when a subset of bands is selected.
- Class separability may be measured with:
  - Divergence
  - Transformed divergence
  - Bhattacharyya distance
  - Jeffries-Matusita (JM) distance
- Recently, we proposed a new metric based on minimum endmember abundance covariance (MEAC).
- H. Yang, Q. Du, H. Su, and Y. Sheng, “An efficient method for supervised hyperspectral band selection,” IEEE Geoscience and Remote Sensing Letters, vol. 8, no. 1, pp. 138-142, Jan. 2011.
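As a concrete example of the separability measures above, the JM distance between two classes modeled as Gaussians can be computed from their means and covariances via the Bhattacharyya distance B, with JM = 2(1 − e^(−B)). The sketch below is a standard textbook formulation, not code from the cited work:

```python
import numpy as np

def jm_distance(m1, S1, m2, S2):
    """Jeffries-Matusita distance between two Gaussian classes
    with means m1, m2 and covariances S1, S2. Ranges from 0 to 2."""
    S = (S1 + S2) / 2.0
    d = m1 - m2
    # Bhattacharyya distance between the two Gaussians.
    B = 0.125 * d @ np.linalg.solve(S, d) \
        + 0.5 * np.log(np.linalg.det(S)
                       / np.sqrt(np.linalg.det(S1) * np.linalg.det(S2)))
    return 2.0 * (1.0 - np.exp(-B))
```

JM saturates at 2 as classes become fully separable, which makes it better behaved than raw divergence for ranking band subsets.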

8 Band Searching
- To avoid testing all possible band combinations, subset searching strategies can be used:
  - Sequential forward selection (SFS)
  - Sequential forward floating selection (SFFS)
  - Branch and bound
- An advanced but simple searching strategy is particle swarm optimization (PSO).
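Of the strategies above, SFS is the simplest: grow the band subset one band at a time, always adding the band that most improves the objective. A minimal sketch (the `score` callback stands in for any criterion such as MEAC or JM distance):

```python
def sfs(num_bands, p, score):
    """Sequential forward selection: greedily add to the subset the band
    that maximizes score(subset), until p bands are selected."""
    selected = []
    remaining = list(range(num_bands))
    while len(selected) < p:
        best = max(remaining, key=lambda b: score(selected + [b]))
        selected.append(best)
        remaining.remove(best)
    return selected
```

SFS never revisits earlier choices, which is exactly the weakness that floating search (SFFS) and global strategies like PSO try to overcome.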

9 Particle Swarm Optimization
- PSO is a computational optimization technique developed by Kennedy and Eberhart in 1995. It uses a simple mechanism that mimics swarm behavior in bird flocking and fish schooling to guide the particles toward globally optimal solutions.
- PSO has proven to be a very efficient optimization algorithm, capable of searching an entire high-dimensional problem space.
- PSO does not use the gradient of the problem being optimized, so it does not require the objective to be differentiable, as classic optimization methods do. This makes PSO useful for optimizing irregular problems.

10 [Figure: particle positions at iterations 1, 25, 50, and 75 of a PSO run on an example problem]
- PSO is used to search for the solution of an example objective function.
- In iteration 1, the initial particles are spread sparsely over the whole problem space.
- From iteration 25 to iteration 75, the update procedure pulls the particles toward the optimal regions.
- Eventually, the updating procedure gathers all the particles at the optimum point.

11 PSO for Band Selection
- Assume p bands are to be selected. Let a particle x_id (of size p×1) denote the selected band indices, and v_id the velocity that updates the selected band indices. The historically best local solution of a particle is p_id, and the historically best global solution among all particles is p_gd.
- Particle update: the new velocity of each particle is computed from its previous velocity v_id, the best location p_id that particle has reached so far under the objective function, and the best location p_gd found so far by the whole swarm. Particles are updated as:
  v_id ← w·v_id + c1·r1·(p_id − x_id) + c2·r2·(p_gd − x_id)
  x_id ← x_id + v_id
- c1 and c2 control the contributions from the local and global solutions, respectively; r1 and r2 are independent random variables; and w is the inertia weight scaling the previous velocity v_id.

12 PSO for Band Selection
Algorithm:
1. Assume p bands are to be selected. Randomly initialize M particles x_id; each particle holds p indices of the bands to be selected.
2. Evaluate the objective function for each particle, and determine the local and global optimal solutions p_id and p_gd, respectively.
3. Update all the particles.
4. If the algorithm has converged, stop; otherwise, go to step 2.
5. The particle yielding the global optimum solution p_gd is the final result.
- MEAC: [equation not captured in transcript]
- JM distance: [equation not captured in transcript]
- Objective function: [equation not captured in transcript]
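The five steps above can be sketched as follows. This is a generic illustration, not the authors' implementation: positions are kept continuous and rounded to band indices when the objective is evaluated, which is one common way of adapting PSO to a discrete band-selection problem (duplicate indices are not prevented in this sketch). The `objective` callback stands in for MEAC or JM distance.

```python
import numpy as np

rng = np.random.default_rng(0)

def pso_band_selection(num_bands, p, objective, M=20, iters=100,
                       w=0.7, c1=1.5, c2=1.5):
    """PSO search for p band indices maximizing objective(indices)."""
    x = rng.uniform(0, num_bands - 1, size=(M, p))  # particle positions
    v = np.zeros((M, p))                            # particle velocities

    def fit(xi):
        # Round continuous positions to valid band indices before scoring.
        return objective(np.clip(np.round(xi), 0, num_bands - 1).astype(int))

    p_best = x.copy()                               # best local solutions p_id
    p_best_f = np.array([fit(xi) for xi in x])
    g_best = p_best[p_best_f.argmax()].copy()       # best global solution p_gd
    g_best_f = p_best_f.max()
    for _ in range(iters):
        r1, r2 = rng.random((M, p)), rng.random((M, p))
        # Velocity and position updates from slide 11.
        v = w * v + c1 * r1 * (p_best - x) + c2 * r2 * (g_best - x)
        x = np.clip(x + v, 0, num_bands - 1)
        f = np.array([fit(xi) for xi in x])
        improved = f > p_best_f
        p_best[improved], p_best_f[improved] = x[improved], f[improved]
        if f.max() > g_best_f:
            g_best, g_best_f = x[f.argmax()].copy(), f.max()
    return np.clip(np.round(g_best), 0, num_bands - 1).astype(int)
```

Because every particle evaluates a full band subset per iteration, the objective function dominates the cost; the cheap MEAC criterion is what makes this search practical.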

13 Illustration of PSO-based band selection (Selecting 6 bands from L bands)

14 Convergence curves of PSO-based band selection (MEAC)

15 Convergence curve of PSO-based band selection (JM distance)

16 Decision Fusion
[Flow diagram] Hyperspectral image data is fed to both a supervised classifier (SVM) and an unsupervised classifier (K-means, Mean-Shift); the unsupervised result is used to segment the supervised result; (weighted) majority voting yields the final decision.
- H. Yang, Q. Du, and B. Ma, “Decision fusion on supervised and unsupervised classifiers for hyperspectral imagery,” IEEE Geoscience and Remote Sensing Letters, vol. 7, no. 4, pp. 875-879, Oct. 2010.
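The unweighted variant of this fusion can be sketched as follows (a simplified illustration, not the cited paper's exact algorithm): within each segment produced by the unsupervised classifier, every pixel is reassigned the majority SVM label found in that segment.

```python
import numpy as np

def majority_vote_fusion(sup_labels, unsup_segments):
    """Fuse a supervised label map with an unsupervised segmentation:
    each segment votes, and all its pixels take the majority SVM label."""
    fused = sup_labels.copy()
    for seg in np.unique(unsup_segments):
        mask = unsup_segments == seg
        labels, counts = np.unique(sup_labels[mask], return_counts=True)
        fused[mask] = labels[counts.argmax()]
    return fused
```

The weighted version would scale each pixel's vote, e.g. by classifier confidence, before taking the argmax.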

17 Experiments
- The hyperspectral data used in the experiments were taken by the airborne Hyperspectral Digital Imagery Collection Experiment (HYDICE) sensor over the Mall in Washington, DC, with 210 bands covering the 0.4-2.4 µm spectral region. After the water-absorption bands were deleted, 191 bands remained. The original data has 1280×307 pixels.
- The other hyperspectral dataset used in the experiments was the 126-band HyMap data over a residential area near the campus of Purdue University. The image size is 377×512.

18 HYDICE Experiment six classes: road, grass, shadow, trail, tree, roof

19 HYDICE Experiment

Class  | Training | Test
Road   | 55 | 892
Grass  | 57 | 910
Shadow | 50 | 567
Trail  | 46 | 624
Tree   | 49 | 656
Roof   | 52 | 1123

20 SVM classification accuracy using MEAC-selected bands in the HYDICE experiment

21 SVM classification accuracy using JM-selected bands in the HYDICE experiment

22 [Figure: SVM and Mean-Shift classification maps with the majority-voting fused result; classes: Road, Grass, Shadow, Trail, Tree, Roof]

23 Classification accuracy (%) from different methods in the HYDICE experiment (with 6 bands or 6 PCs); cells not recoverable from the transcript are marked —

Method          | Road  | Grass | Shadow | Trail | Tree | Roof | OA   | AA   | Kappa
svm(pca)        | 99.0  | 98.6  | 82.0   | 92.3  | 98.8 | 84.8 | 92.6 | —    | 91.1
svm(pso)        | 98.1  | 98.9  | 94.7   | 92.5  | 99.4 | 95.4 | 96.6 | 96.5 | 95.9
svm(pca)+ms     | 100.0 | 99.0  | 81.3   | 94.9  | 98.9 | 89.3 | 94.3 | 93.9 | 93.0
svm(pso)+ms     | 90.7  | 99.0  | 100.0  | 100.0 | 98.9 | —    | 97.5 | 97.9 | 97.0
svm(pca)+kmeans | 99.9  | 96.9  | 75.7   | 96.6  | 98.8 | 95.3 | 94.8 | 93.9 | 93.6
svm(pso)+kmeans | 94.8  | 99.2  | 98.9   | 99.7  | 95.9 | 99.3 | 97.9 | 98.0 | 97.5

24 HyMap Experiment six classes: road, grass, shadow, soil, tree, roof

25 HyMap Experiment

Class  | Training | Test
Road   | 73 | 1231
Grass  | 72 | 1072
Shadow | 49 | 215
Soil   | 69 | 380
Tree   | 67 | 1321
Roof   | 74 | 1244

26 SVM classification accuracy using MEAC-selected bands in the HyMap experiment

27 SVM classification accuracy using JM-selected bands in the HyMap experiment

28 [Figure: SVM and Mean-Shift classification maps with the majority-voting fused result; classes: Road, Grass, Shadow, Soil, Tree, Roof]

29 Classification accuracy (%) from different methods in the HyMap experiment (with 6 bands or 6 PCs)

Method          | Road | Grass | Shadow | Soil  | Tree | Roof  | OA   | AA   | Kappa
svm(pca)        | 92.4 | 98.9  | 97.2   | 90.8  | 96.4 | 81.4  | 92.2 | 92.8 | 90.3
svm(pso)        | 94.9 | 98.3  | 98.1   | 85.2  | 93.9 | 89.6  | 93.6 | 93.3 | 91.9
svm(pca)+ms     | 96.3 | 100.0 | 98.1   | 100.0 | 97.7 | 85.5  | 95.2 | 96.3 | 94.0
svm(pso)+ms     | 97.3 | 96.0  | 100.0  | 98.7  | 98.9 | 100.0 | 98.2 | 98.5 | 97.7
svm(pca)+kmeans | 99.2 | 99.7  | 86.9   | 71.7  | 98.7 | 81.8  | 92.9 | 89.7 | 91.0
svm(pso)+kmeans | 96.1 | 96.6  | 95.9   | 98.1  | 99.5 | 98.2  | 97.6 | 97.4 | 96.9

30 Conclusion
- The experimental results demonstrate that PSO can greatly improve band selection performance in terms of SVM classification accuracy, compared to the frequently used SFS and SFFS searching strategies. The classification improvement can be magnified through decision fusion.
- The MEAC searching criterion, which does not require training samples, is considered more advanced than the JM distance. With SFS searching, JM performs much worse than MEAC; after switching to PSO searching, however, its performance can be as good as MEAC's. This means the employed searching strategy does play an important role in band selection performance.

