Final Exam Review CS479/679 Pattern Recognition Dr. George Bebis


1 Final Exam Review CS479/679 Pattern Recognition Dr. George Bebis

2 Final Exam Material
– Midterm Exam Material
– Linear Discriminant Functions
– Support Vector Machines
– Expectation-Maximization Algorithm
Case studies are also included in the final exam.

3 Linear Discriminant Functions General form of a linear discriminant: g(x) = w^T x + w0. What is the form of the decision boundary? What is the meaning of w and w0? – The decision boundary is a hyperplane; its orientation is determined by w and its location by w0.

4 Linear Discriminant Functions What does g(x) measure? – The signed distance of x from the decision boundary (hyperplane): r = g(x) / ||w||.
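The discriminant and its distance interpretation can be sketched in a few lines of NumPy. The hyperplane parameters w and w0 below are hypothetical values chosen for illustration, not taken from the slides:

```python
import numpy as np

# Hypothetical hyperplane parameters (illustrative, not from the slides):
w = np.array([3.0, 4.0])   # orientation of the hyperplane
w0 = -5.0                  # location (bias) term

def g(x):
    """Linear discriminant g(x) = w^T x + w0."""
    return w @ x + w0

def signed_distance(x):
    """Signed distance of x from the hyperplane: r = g(x) / ||w||."""
    return g(x) / np.linalg.norm(w)

x = np.array([2.0, 1.0])
print(g(x))                # 3*2 + 4*1 - 5 = 5.0
print(signed_distance(x))  # 5.0 / ||(3,4)|| = 5.0 / 5.0 = 1.0
```

The sign of g(x) indicates which side of the hyperplane x falls on, which is what makes g usable as a two-class classifier.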

5 Linear Discriminant Functions How do we find w and w0? – Apply learning using a set of labeled training examples. What is the effect of each training example? – It places a constraint on the solution. [Figure: feature space (y1, y2) and solution space (α1, α2)]

6 Linear Discriminant Functions Iterative optimization – what is the main idea? – Minimize some error function J(α) iteratively: α(k+1) = α(k) + η(k) p(k), where p(k) is the search direction and η(k) is the learning rate.

7 Linear Discriminant Functions
– Gradient descent method: α(k+1) = α(k) − η(k) ∇J(α(k))
– Newton's method: α(k+1) = α(k) − H⁻¹ ∇J(α(k)), where H is the Hessian of J
– Perceptron rule: α(k+1) = α(k) + η(k) Σ y, summing over the currently misclassified samples y
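The perceptron rule can be sketched on a toy, hypothetical 2-D dataset. Following the usual convention, samples are augmented with a leading 1 (so w0 is absorbed into the weight vector a) and "normalized" by negating one class, so a correct solution satisfies a^T y > 0 for every sample:

```python
import numpy as np

# Toy linearly separable data (hypothetical, not from the slides).
X = np.array([[1.0, 2.0, 2.0],
              [1.0, 3.0, 3.0],
              [1.0, 0.0, 0.5],
              [1.0, 1.0, 0.0]])
labels = np.array([1, 1, -1, -1])

# "Normalize": multiply class-2 samples by -1, so a correct solution
# satisfies a^T y > 0 for every row y.
Y = X * labels[:, None]

a = np.zeros(3)          # initial weight vector a(0)
eta = 1.0                # learning rate
for k in range(100):
    mis = Y[Y @ a <= 0]            # currently misclassified samples
    if len(mis) == 0:
        break                      # converged: a^T y > 0 for all y
    a = a + eta * mis.sum(axis=0)  # batch perceptron update

print(Y @ a > 0)  # all True once converged
```

On linearly separable data the batch perceptron is guaranteed to converge in a finite number of updates; here it does so after a handful of iterations.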

8 Support Vector Machines What is the capacity of a classifier? What is the VC dimension of a classifier? What is structural risk minimization? – Find solutions that (1) minimize the empirical risk and (2) have low VC dimension. – It can be shown that, with probability 1−δ: R(α) ≤ R_emp(α) + sqrt( (h(ln(2n/h) + 1) + ln(4/δ)) / n ), where h is the VC dimension and n the number of training samples.
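The confidence term of the bound can be evaluated numerically. This sketch assumes the standard form of the VC confidence term, with hypothetical values of h, n, and δ:

```python
import math

def vc_confidence(h, n, delta):
    """VC confidence term: sqrt((h*(ln(2n/h) + 1) + ln(4/delta)) / n).
    With probability 1 - delta, true risk <= empirical risk + this term."""
    return math.sqrt((h * (math.log(2 * n / h) + 1) + math.log(4 / delta)) / n)

# The bound loosens as the VC dimension h grows (hypothetical values):
print(vc_confidence(h=10, n=1000, delta=0.05))   # tighter bound
print(vc_confidence(h=100, n=1000, delta=0.05))  # looser bound
```

This illustrates the structural-risk-minimization trade-off: for fixed n, a classifier family with larger VC dimension pays a larger confidence penalty on top of its empirical risk.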

9 Support Vector Machines What is the margin of separation? How is it defined? What is the relationship between VC dimension and margin of separation? – The VC dimension is minimized by maximizing the margin of separation. [Figure: separating hyperplane with the support vectors lying on the margin]

10 Support Vector Machines What is the criterion being optimized by SVMs? – Maximize the margin 2/||w||, i.e., minimize ||w||²/2 subject to y_i(w^T x_i + w0) ≥ 1 for all training samples.
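The constraints and the margin can be checked numerically for a candidate hyperplane. The data and the pair (w, w0) below are hypothetical, illustrative values, not a solution from the slides:

```python
import numpy as np

# Hypothetical separable data and a candidate (w, w0):
X = np.array([[2.0, 2.0], [3.0, 3.0], [0.0, 0.0], [1.0, 0.0]])
y = np.array([1, 1, -1, -1])

w = np.array([1.0, 1.0])
w0 = -3.0

# SVM constraints: y_i (w^T x_i + w0) >= 1 for every training sample
margins = y * (X @ w + w0)
print(margins)                 # [1. 3. 3. 2.] -- all >= 1, so feasible
print(2 / np.linalg.norm(w))   # margin of separation 2/||w|| = sqrt(2)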
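placeholder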

11 Support Vector Machines The SVM solution depends only on the support vectors: w = Σ α_i y_i x_i, where the sum runs over the support vectors (α_i > 0). Soft margin classifier – tolerate “outliers” by introducing slack variables ξ_i.

12 Support Vector Machines Non-linear SVM – what is the main idea? – Map the data to a high-dimensional space h via a non-linear transformation Φ(x).

13 Support Vector Machines What is the kernel trick? – Compute dot products in the transformed space using a kernel function, e.g., the polynomial kernel: K(x, y) = (x · y)^d.
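The kernel trick can be verified numerically for the degree-2 polynomial kernel: the kernel value equals an ordinary dot product under an explicit feature map Φ. The sketch below assumes 2-D inputs, for which Φ(x) = (x1², √2·x1·x2, x2²):

```python
import numpy as np

def phi(x):
    """Explicit degree-2 feature map for 2-D input, so that
    (x . y)^2 = phi(x) . phi(y)."""
    return np.array([x[0]**2, np.sqrt(2) * x[0] * x[1], x[1]**2])

def K(x, y, d=2):
    """Polynomial kernel K(x, y) = (x . y)^d."""
    return (x @ y) ** d

x = np.array([1.0, 2.0])
y = np.array([3.0, 4.0])

print(K(x, y))          # (1*3 + 2*4)^2 = 121.0
print(phi(x) @ phi(y))  # 121.0 -- same value, via the explicit map
```

The point of the trick is the right-hand computation never needs to be done: K evaluates the dot product in the transformed space at the cost of a dot product in the original space, which matters when Φ maps into a very high (or infinite) dimensional space.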

14 Support Vector Machines Important comments about SVMs – SVM is based on exact optimization (no local optima). – Its complexity depends on the number of support vectors, not on the dimensionality of the transformed space. – Performance depends on the choice of the kernel and its parameters.

15 Expectation-Maximization (EM) What is the EM algorithm? – An iterative method to perform ML estimation, i.e., maximize p(D|θ). When is EM useful? – Most useful for problems where the data is incomplete or can be thought of as being incomplete.

16 Expectation-Maximization (EM) What are the steps of the EM algorithm?
– Initialization: pick an initial estimate θ(0)
– Expectation step: compute Q(θ; θ(t)), the expected complete-data log-likelihood given the observed data and θ(t)
– Maximization step: θ(t+1) = argmax_θ Q(θ; θ(t))
– Test for convergence: stop when ||θ(t+1) − θ(t)|| < ε
Convergence properties of EM?
– The solution depends on the initial estimate θ(0)
– No guarantee of finding the global maximum, but convergence is stable (the likelihood never decreases)

17 Expectation-Maximization (EM) What is a mixture of Gaussians? How are the parameters of MoGs estimated? – Using the EM algorithm. What is the main idea behind using EM for estimating the MoG parameters? – Introduce “hidden” variables indicating which component generated each sample.

18 Expectation-Maximization (EM) Explain the EM steps for MoGs – E-step: compute the responsibility of each component j for each sample x_i: γ_ij = π_j p(x_i | θ_j) / Σ_k π_k p(x_i | θ_k)

19 Expectation-Maximization (EM) Explain the EM steps for MoGs – M-step: re-estimate the parameters from the responsibilities: π_j = (1/n) Σ_i γ_ij, μ_j = Σ_i γ_ij x_i / Σ_i γ_ij, and similarly the covariances Σ_j
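The two EM steps for a MoG can be sketched end-to-end. This is a minimal NumPy implementation for a 1-D, two-component mixture on synthetic data; all values (true means, initial estimates, sample sizes) are illustrative, not from the slides:

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic 1-D data drawn from two Gaussians (illustrative values)
data = np.concatenate([rng.normal(-2.0, 1.0, 200),
                       rng.normal(3.0, 1.0, 200)])

# Initial parameter estimates theta(0)
pi = np.array([0.5, 0.5])      # mixing weights
mu = np.array([-1.0, 1.0])     # component means
var = np.array([1.0, 1.0])     # component variances

for step in range(50):
    # E-step: responsibilities gamma_ij = P(component j | x_i)
    dens = (pi / np.sqrt(2 * np.pi * var) *
            np.exp(-(data[:, None] - mu) ** 2 / (2 * var)))
    gamma = dens / dens.sum(axis=1, keepdims=True)

    # M-step: re-estimate pi, mu, var from the responsibilities
    Nj = gamma.sum(axis=0)
    pi = Nj / len(data)
    mu = (gamma * data[:, None]).sum(axis=0) / Nj
    var = (gamma * (data[:, None] - mu) ** 2).sum(axis=0) / Nj

print(mu)   # estimated means, close to the true -2 and 3
```

Each iteration alternates the soft assignment (E-step) with weighted ML re-estimation (M-step); with these initial values the estimated means settle near the true component means.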

