
Pattern Recognition: Statistical and Neural. Lonnie C. Ludeman. Lecture 13, Oct 14, 2005. Nanjing University of Science & Technology.

Presentation transcript:

1 Pattern Recognition: Statistical and Neural. Lonnie C. Ludeman. Lecture 13, Oct 14, 2005. Nanjing University of Science & Technology

2 Lecture 13 Topics
1. Multiple-observation, multiple-class example (review): sufficient statistic space and likelihood ratio space
2. Calculation of P(error) for the 2-class case: several special cases
3. P(error) calculation examples for the special cases (2-class case)

3 Example 1: Multiple observations, multiple classes. Given: the pattern vector x is composed of N independent observations of a Gaussian random variable X, with the class conditional densities as given for each component; a zero-one cost function is also given. Find: (a) the Bayes decision rule in a sufficient statistic space; (b) the Bayes decision rule in a space of likelihood ratios.

4 Solution: (a) Since the observations are independent, the joint conditional density is the product of the marginal densities, given by the expression below for i = 1, 2, 3, with means m_i = i. The Bayes decision rule is determined from a set of y_i(x) defined for M = 3 by
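The product-of-marginals step can be sketched in code. A minimal example, assuming (hypothetically) unit variance for each component, which the slide does not state:

```python
import math

def gaussian_pdf(x, mean, var):
    """Univariate Gaussian density N(mean, var) evaluated at x."""
    return math.exp(-(x - mean) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

def joint_density(x_vec, mean, var):
    """Joint conditional density of N independent observations:
    the product of the marginal densities p(x_k | C_i)."""
    p = 1.0
    for x in x_vec:
        p *= gaussian_pdf(x, mean, var)
    return p

# Example: N = 4 observations evaluated under class C_2 (mean m_2 = 2),
# unit variance assumed
x = [1.8, 2.2, 2.1, 1.9]
print(joint_density(x, mean=2.0, var=1.0))
```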

5 Substituting the given properties gives the result below. The region in which to decide C_1 is found by setting up the following inequalities; therefore the region R_1, where we decide C_1, reduces to the x that satisfy

6 Similarly, the regions R_2 and R_3 become the sets below. Substituting the conditional densities, taking the ln of both sides, and simplifying, the decision rule reduces to regions in a sufficient statistic space s as follows

7 This is shown below in the sufficient statistic space s. An intuitively pleasing result!
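The threshold structure of the sufficient statistic space can be sketched as follows. The midpoint thresholds 1.5 and 2.5 are an assumption that follows from equal variances, equal priors, zero-one costs, and means m_i = i; treating s as the sample mean is likewise an assumption, since the slide's definition of s was not transcribed:

```python
def decide(s):
    """Decision regions in the sufficient statistic space.
    Assumes (hypothetically) s is the sample mean, means m_i = i,
    equal variances, equal priors, and zero-one costs, so the
    thresholds fall midway between adjacent means."""
    if s < 1.5:
        return 1   # region R_1 -> decide C_1
    elif s < 2.5:
        return 2   # region R_2 -> decide C_2
    else:
        return 3   # region R_3 -> decide C_3

print(decide(0.9), decide(2.0), decide(3.3))  # prints 1 2 3
```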

8 (b) Bayes decision rule in the likelihood ratio space: M-class case derivation. We know that the Bayes decision rule for the M-class case is: decide x is from C_i if y_i(x) < y_j(x) for all j ≠ i, where y_i(x) = Σ_{j=1}^{M} C_ij p(x | C_j) P(C_j)
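The M-class rule can be sketched directly. A minimal example assuming (hypothetically) three univariate Gaussian classes with unit variance and means m_i = i, as in the earlier example:

```python
import math

def gaussian_pdf(x, mean, var):
    """Univariate Gaussian density N(mean, var)."""
    return math.exp(-(x - mean) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

def bayes_decide(x, means, priors, cost):
    """M-class Bayes rule: compute y_i(x) = sum_j C_ij p(x|C_j) P(C_j)
    and decide the class with the smallest y_i(x)."""
    M = len(means)
    densities = [gaussian_pdf(x, means[j], 1.0) for j in range(M)]  # unit variance assumed
    y = [sum(cost[i][j] * densities[j] * priors[j] for j in range(M))
         for i in range(M)]
    return min(range(M), key=lambda i: y[i]) + 1  # classes numbered from 1

# Zero-one costs, equal priors, means m_i = i (per the example)
zero_one = [[0 if i == j else 1 for j in range(3)] for i in range(3)]
print(bayes_decide(1.1, means=[1, 2, 3], priors=[1/3, 1/3, 1/3], cost=zero_one))  # prints 1
```

With zero-one costs, minimizing y_i(x) is the same as picking the class with the largest posterior, which is why the observation 1.1 lands in class 1.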

9 Dividing through by p(x | C_M) gives sufficient statistics v_i(x) built from the likelihood ratios, where L_M(x) = p(x | C_M) / p(x | C_M) = 1. Therefore the decision rule becomes

10 Bayes Decision Rule in the Likelihood Ratio Space. The dimension of the likelihood ratio space is always one less than the number of classes (M - 1).
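The mapping into the (M-1)-dimensional likelihood ratio space can be sketched as follows, again assuming univariate unit-variance Gaussians with means m_i = i (assumptions carried over from the example):

```python
import math

def gaussian_pdf(x, mean, var):
    """Univariate Gaussian density N(mean, var)."""
    return math.exp(-(x - mean) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

def likelihood_ratios(x, means, var=1.0):
    """Map an observation into the (M-1)-dimensional likelihood ratio
    space: L_i(x) = p(x|C_i) / p(x|C_M) for i = 1..M-1 (L_M = 1 always)."""
    p = [gaussian_pdf(x, m, var) for m in means]
    return [p_i / p[-1] for p_i in p[:-1]]

# M = 3 classes -> a 2-dimensional likelihood ratio space
L1, L2 = likelihood_ratios(2.0, means=[1, 2, 3])
print(L1, L2)
```

At x = 2 the densities under C_1 and C_3 are equal, so L_1 = 1, while L_2 = exp(0.5) since x sits exactly on the mean of C_2.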

11 Back to the example: we have already determined the region in which to decide C_1. Define the likelihood ratios as shown; dividing both sides of the inequalities by p(x | C_3) gives the following equations in the likelihood ratio space for determining C_1.

12 The other regions are determined in the same fashion, giving the decision regions in the likelihood ratio space.

13 Calculation of the probability of error for the 2-class Gaussian case. We know the optimum Bayes decision rule is given by the rule below. Special Case 1:

14 The sufficient statistic Z conditioned on C_1 has the following mean and variance

15 Thus under C_1 we have Z ~ N(a_1, v_1), with the mean a_1 and variance v_1 given below. The conditional variance becomes

16 Similarly, the conditional mean and variance under class C_2 are a_2 and v_2, and the statistic Z under class C_2 is Gaussian; thus under C_2 we have Z ~ N(a_2, v_2).
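The claim that Z is Gaussian under each class can be checked numerically. A hedged sketch for a 1-D, equal-variance instance, with hypothetical parameters m1 = 0, m2 = 2, sigma = 1 (not values from the slides): the log-likelihood ratio Z is linear in x, hence Gaussian, and under C_1 its mean and variance come out to d^2/2 and d^2 for this instance:

```python
import random

random.seed(0)
m1, m2, sigma = 0.0, 2.0, 1.0          # hypothetical example parameters
d2 = (m1 - m2) ** 2 / sigma ** 2       # squared distance between means, in sigma units

def z_stat(x):
    """Log-likelihood ratio Z = ln[p(x|C_1)/p(x|C_2)] for equal-variance
    univariate Gaussians; linear in x, hence Gaussian when x is Gaussian."""
    return ((m1 - m2) * x + (m2 ** 2 - m1 ** 2) / 2) / sigma ** 2

# Monte Carlo check of the conditional mean and variance under C_1
samples = [z_stat(random.gauss(m1, sigma)) for _ in range(200_000)]
mean = sum(samples) / len(samples)
var = sum((z - mean) ** 2 for z in samples) / len(samples)
print(mean, var)   # theory for this instance: a_1 = d2/2 = 2.0, v_1 = d2 = 4.0
```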

17 Determination of the P(error). The total probability theorem states P(error) = P(error | C_1) P(C_1) + P(error | C_2) P(C_2), where

18 Since the scalar Z is Gaussian, the error conditioned on C_1 becomes:

19 Similarly, the error conditioned on C_2 becomes the expression below. Finally, for Special Case 1 the total P(error) becomes
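The total P(error) can be computed numerically once a_1, v_1, a_2, v_2 and the threshold are known. A minimal sketch, assuming the rule "decide C_1 when Z exceeds a threshold T" and hypothetical numeric values (the slide's actual threshold expression was not transcribed):

```python
import math

def Phi(u):
    """Standard normal CDF, via the error function."""
    return 0.5 * (1.0 + math.erf(u / math.sqrt(2.0)))

def p_error(a1, v1, a2, v2, T, P1, P2):
    """Total probability of error by the total probability theorem,
    for Z|C_1 ~ N(a1, v1), Z|C_2 ~ N(a2, v2), and the assumed rule
    'decide C_1 if Z > T'."""
    pe1 = Phi((T - a1) / math.sqrt(v1))        # P(error | C_1): Z falls below T
    pe2 = 1.0 - Phi((T - a2) / math.sqrt(v2))  # P(error | C_2): Z falls above T
    return pe1 * P1 + pe2 * P2

# Hypothetical numbers: a_1 = 2, a_2 = -2, v_1 = v_2 = 4, T = 0, equal priors
print(p_error(2.0, 4.0, -2.0, 4.0, 0.0, 0.5, 0.5))  # ~ 0.1587 (= Phi(-1))
```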

20 Special Case 2: equal scaled-identity covariance matrices. Using the previous formula, the P(error) reduces to the expression below, where d is the Euclidean distance between the means.

21 Special Case 3: zero-one Bayes costs and equal a priori probabilities. Using the previous formula for P(error) gives:
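For Special Case 3 the error collapses to a single Gaussian tail probability. A sketch assuming P(error) = Q(d/2), with d the Euclidean distance between the means measured in units of sigma (this normalization is an assumption, since the slide's formula was not transcribed):

```python
import math

def Q(u):
    """Gaussian tail probability Q(u) = 1 - Phi(u)."""
    return 0.5 * math.erfc(u / math.sqrt(2.0))

def p_error_equal_priors(m1, m2, sigma):
    """P(error) for two Gaussian classes with equal scaled-identity
    covariances sigma^2 I, zero-one costs, and equal priors, under the
    assumed form P(error) = Q(d/2) with d = ||m1 - m2|| / sigma."""
    d = math.dist(m1, m2) / sigma
    return Q(d / 2.0)

# Means two sigmas apart -> P(error) = Q(1) ~ 0.1587
print(p_error_equal_priors([0.0, 0.0], [2.0, 0.0], sigma=1.0))
```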

22 Special Case 4: Then

23 Example: calculation of the probability of error. Given the quantities shown, find P(error) under each of the following assumptions.

24 (a) Solution:

25 (b) Solution: Substituting the above into the P(error) formula gives:

26 (c) Solution: Substituting the above into the P(error) formula gives:

27 (d) Solution: Substituting the above into the P(error) formula, for the case of equal covariance matrices, gives:

28 (d) Solution, continued:

29 Lecture 13 Summary
1. Multiple-observation, multiple-class example (review): sufficient statistic space and likelihood ratio space
2. Calculation of P(error) for the two-class case: special cases
3. P(error) calculation examples for the special cases (2-class case)

30 End of Lecture 13


