X.3 Linear Discriminant Analysis: C-Class


1 X.3 Linear Discriminant Analysis: C-Class
Generalization of S_W to an arbitrary number of classes.
Generalization of S_B to an arbitrary number of classes.
Generalization of w to W for an arbitrary number of classes.
Eigenvalue / eigenvector relationships for identifying the optimal dimensions for class resolution.
Error analysis in LDA.

2 LDA: C-Classes Starting with our J(w) function from the 2-class case, first let us generalize the denominator.
In the 2-class case, the denominator is the sum of the variances in each of the two classes. Generalizing this design is simple: sum the within-class variances of all C classes.
Two classes:  S_W = S_1 + S_2
C classes:    S_W = S_1 + S_2 + ... + S_C   (the sum of the within-class scatter matrices S_j over all C classes)
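As an illustration (not from the slides), a minimal NumPy sketch of the C-class within-class scatter; the names within_class_scatter, X (an n_tot-by-n_channels data matrix), and labels are assumptions for the example.

```python
import numpy as np

def within_class_scatter(X, labels):
    """S_W = sum over classes j of the scatter of class j about its own mean."""
    labels = np.asarray(labels)
    n_channels = X.shape[1]
    S_W = np.zeros((n_channels, n_channels))
    for j in np.unique(labels):
        X_j = X[labels == j]            # measurements belonging to class j
        mu_j = X_j.mean(axis=0)         # mean spectrum of class j
        centered = X_j - mu_j
        S_W += centered.T @ centered    # scatter matrix S_j of class j
    return S_W
```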

3 LDA: C-Classes Next, generalize the numerator, the between-class scatter S_B.
Two classes:  S_B = (μ_1 - μ_2)(μ_1 - μ_2)^T
C classes:    S_B = sum over j = 1 ... C of (n_j / n_tot) (μ_j - μ)(μ_j - μ)^T,  where μ is the overall mean of all measurements
Note – we have included a weighting n_j, the number of measurements in class j, with the total number of measurements in all classes given by n_tot.
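A companion sketch (again with hypothetical names) for the weighted between-class scatter. It uses the n_j / n_tot weighting written above; the overall 1/n_tot scale factor does not change the discriminant directions found later.

```python
import numpy as np

def between_class_scatter(X, labels):
    """S_B = sum_j (n_j / n_tot) (mu_j - mu)(mu_j - mu)^T, with mu the overall mean."""
    labels = np.asarray(labels)
    n_tot, n_channels = X.shape
    mu = X.mean(axis=0)                      # overall mean over all n_tot measurements
    S_B = np.zeros((n_channels, n_channels))
    for j in np.unique(labels):
        X_j = X[labels == j]
        n_j = X_j.shape[0]                   # weighting: number of measurements in class j
        d = (X_j.mean(axis=0) - mu).reshape(-1, 1)
        S_B += (n_j / n_tot) * (d @ d.T)     # constant scaling leaves eigenvectors unchanged
    return S_B
```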

4 LDA: C-Classes Finally, we can generalize the vector w.
In the general form, we will replace the vector w with a matrix W created by augmenting a set of w_j column vectors (at most C - 1 of these directions are informative, since S_B has rank of at most C - 1). The tricky part is identifying the particular directions in W that maximize the separation criterion J between the different classes.

5 LDA: C-Classes Linear algebra to the rescue!
If we consider a particular value of J and its corresponding vector w, the expression can be re-arranged into the eigenvalue form:
S_W^{-1} S_B w = J w
The eigenvalues of (S_W^{-1} S_B) will recover the maximum values of J, with the eigenvectors yielding the corresponding directions.
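A minimal sketch of solving this eigenvalue problem with NumPy and assembling W from the leading eigenvectors; the function name and the n_components argument are illustrative assumptions, not part of the slides.

```python
import numpy as np

def lda_directions(S_W, S_B, n_components):
    """Eigen-decompose S_W^{-1} S_B; the columns of W are the eigenvectors with the largest J."""
    eigvals, eigvecs = np.linalg.eig(np.linalg.inv(S_W) @ S_B)
    order = np.argsort(eigvals.real)[::-1]   # sort by decreasing J
    keep = order[:n_components]              # at most C - 1 directions carry class information
    W = eigvecs[:, keep].real
    return eigvals.real[keep], W

# Projecting the measurements onto the discriminant directions: Y = X @ W
```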

6 LDA: C-Classes Avoiding Singularities
Caution: the preceding math only works when S_W is invertible! To guarantee that S_W is nonsingular, the number of measurements should exceed the number of wavelength channels in the spectra. For high-resolution spectra with many wavelength channels, this criterion can be challenging to meet. Selecting key windows of the spectra is one strategy. Initial dimension reduction using a different approach (e.g., PCA) is another, as sketched below.
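As a sketch of the PCA-first strategy (assuming a plain SVD-based PCA; the function name and n_components are illustrative), the spectra are projected onto their leading principal components before S_W and S_B are computed:

```python
import numpy as np

def pca_reduce(X, n_components):
    """Project the spectra onto their leading principal components so that the
    reduced within-class scatter can be inverted; n_components should be
    smaller than the number of measurements."""
    X_centered = X - X.mean(axis=0)
    # SVD of the centered data; the rows of Vt are the principal directions
    U, s, Vt = np.linalg.svd(X_centered, full_matrices=False)
    return X_centered @ Vt[:n_components].T
```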

