
1 Daphne’s Approximate Group of Students

2 Outline
Linear Regression: unregularized, L2 regularized
What is a GP?
Prediction with a GP
Relationship to SVM
Implications: what does this mean?

3 Linear Regression
Predicting Y given X: Y = w^T x + n, with Gaussian noise n.
Maximum-likelihood weights: w_ML = argmax_w P(y[1]...y[m] | x[1]...x[m], w).
Prediction: y[m+1] = w_ML^T x[m+1].
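A minimal sketch of this maximum-likelihood fit, assuming a design matrix X of shape (m, d) and Gaussian noise; the data and variable names below are illustrative, not from the slides:

import numpy as np

def fit_w_ml(X, y):
    # Maximum-likelihood weights for Y = w^T x + Gaussian noise.
    # With Gaussian noise this is ordinary least squares.
    w_ml, *_ = np.linalg.lstsq(X, y, rcond=None)
    return w_ml

# Toy data: m = 50 points in d = 3 dimensions with a known weight vector.
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))
w_true = np.array([1.0, -2.0, 0.5])
y = X @ w_true + 0.1 * rng.normal(size=50)

w_ml = fit_w_ml(X, y)
x_new = rng.normal(size=3)
y_pred = w_ml @ x_new  # prediction y[m+1] = w_ML^T x[m+1]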

4 L2 Regularized Linear Regression
L2 regularization = Gaussian prior on w: Y = w^T x + n, with w ~ N(0, S).
MAP weights: w_MAP = argmax_w P(y | x, w) P(w) = argmin_w [ sum_i (y[i] - w^T x[i])^2 / sigma^2 + w^T S^{-1} w ], which reduces to the familiar ||w||^2 penalty when S is the identity.
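A minimal sketch of this MAP estimate under the Gaussian prior w ~ N(0, S); the choice S = I, the noise variance, and the variable names are illustrative assumptions:

import numpy as np

def fit_w_map(X, y, S, sigma2):
    # MAP weights for Y = w^T x + noise, noise variance sigma2, prior w ~ N(0, S).
    # Minimizes ||y - X w||^2 / sigma2 + w^T S^{-1} w (ridge regression when S = I).
    A = X.T @ X / sigma2 + np.linalg.inv(S)
    b = X.T @ y / sigma2
    return np.linalg.solve(A, b)

# Example: isotropic prior S = I recovers standard L2 (ridge) regularization.
rng = np.random.default_rng(1)
X = rng.normal(size=(50, 3))
y = X @ np.array([1.0, -2.0, 0.5]) + 0.1 * rng.normal(size=50)
w_map = fit_w_map(X, y, S=np.eye(3), sigma2=0.01)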

5 What is a random process? It’s a prior over functions

6 What is a Gaussian Process? It's a prior over functions that generalizes a Gaussian random vector: a prior over the function values Y(x), e.g. Y ~ N(0, I) jointly over any finite set of inputs.
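A minimal sketch of what "prior over functions" means in practice: draw function values jointly from a multivariate Gaussian over a grid of inputs. The squared-exponential covariance used here is an illustrative choice, not something specified on the slide:

import numpy as np

def sq_exp_cov(xs, length=1.0):
    # Covariance matrix C[i, j] = exp(-(x_i - x_j)^2 / (2 * length^2)).
    d = xs[:, None] - xs[None, :]
    return np.exp(-0.5 * (d / length) ** 2)

# A "function" sampled from the GP prior is just one joint Gaussian draw
# of its values at a finite set of inputs.
xs = np.linspace(0.0, 5.0, 100)
C = sq_exp_cov(xs) + 1e-8 * np.eye(len(xs))  # small jitter for numerical stability
rng = np.random.default_rng(2)
f_sample = rng.multivariate_normal(np.zeros(len(xs)), C)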

7 Alternate Definition The thing with Euler’s equation

8 This is weird We're not used to thinking of a prior over the Ys. Or are we? We ARE used to thinking about a prior over w. What prior over Y does that induce?

9 Math P(w) -> P(Y): pushing the Gaussian prior on w through Y = w^T x + n gives a joint Gaussian prior over the Ys at any set of inputs. Wow! This is a Gaussian process!
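A small numerical check of this step, under the model from the earlier slides (w ~ N(0, S), Y = w^T x + noise): the induced covariance between Y(x_i) and Y(x_j) is x_i^T S x_j, plus noise variance on the diagonal. The specific S, noise level, and sample-based comparison below are just an illustration:

import numpy as np

rng = np.random.default_rng(3)
d, m = 3, 4
S = np.diag([1.0, 0.5, 2.0])   # prior covariance of w (assumed)
sigma2 = 0.01                  # noise variance (assumed)
X = rng.normal(size=(m, d))    # m fixed inputs

# Analytic covariance of Y induced by integrating out w:
# Cov[Y(x_i), Y(x_j)] = x_i^T S x_j + sigma2 * 1{i == j}
C_analytic = X @ S @ X.T + sigma2 * np.eye(m)

# Monte Carlo check: sample many w's, form Y = X w + noise, estimate the covariance.
n = 200_000
W = rng.multivariate_normal(np.zeros(d), S, size=n)        # shape (n, d)
Y = W @ X.T + np.sqrt(sigma2) * rng.normal(size=(n, m))    # shape (n, m)
C_empirical = np.cov(Y, rowvar=False)

print(np.max(np.abs(C_analytic - C_empirical)))  # small: the Ys are jointly Gaussian with this covariance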

10 Prediction with a GP Predict y*[m+1] given y[1]...y[m]. We also get a posterior covariance = error bars. Wow! The mean prediction is the same as with w_MAP, but now we get error bars!
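A minimal sketch of GP prediction via Gaussian conditioning, reusing the linear-model covariance C(x1, x2) = x1^T S x2 plus noise from above; all specific values are illustrative:

import numpy as np

def gp_predict(X_train, y_train, x_star, S, sigma2):
    # Posterior mean and variance of y* at x_star given training data,
    # using the linear-model kernel C(x1, x2) = x1^T S x2.
    K = X_train @ S @ X_train.T + sigma2 * np.eye(len(y_train))  # train covariance
    k_star = X_train @ S @ x_star                                # train/test covariance
    k_ss = x_star @ S @ x_star + sigma2                          # test variance (incl. noise)
    alpha = np.linalg.solve(K, y_train)
    mean = k_star @ alpha                                  # same as the w_MAP prediction
    var = k_ss - k_star @ np.linalg.solve(K, k_star)       # this gives the error bars
    return mean, var

rng = np.random.default_rng(4)
X = rng.normal(size=(50, 3))
y = X @ np.array([1.0, -2.0, 0.5]) + 0.1 * rng.normal(size=50)
mean, var = gp_predict(X, y, rng.normal(size=3), S=np.eye(3), sigma2=0.01)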

11 Generalizing: Covariance Functions
Note that the prediction is defined entirely through C(x1, x2), which can be kernelized.
C(x1, x2) has to be positive semidefinite; it is a kernel function.
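A minimal sketch of swapping in a different covariance (kernel) function; the RBF kernel here is one standard choice, and the eigenvalue check just illustrates the positive-semidefinite requirement:

import numpy as np

def rbf_kernel(X1, X2, length=1.0):
    # C(x1, x2) = exp(-||x1 - x2||^2 / (2 * length^2)) -- a valid (PSD) kernel.
    sq_dists = np.sum((X1[:, None, :] - X2[None, :, :]) ** 2, axis=-1)
    return np.exp(-0.5 * sq_dists / length ** 2)

# Any covariance matrix built from a valid kernel must be positive semidefinite:
rng = np.random.default_rng(5)
X = rng.normal(size=(20, 3))
K = rbf_kernel(X, X)
eigvals = np.linalg.eigvalsh(K)
print(eigvals.min() >= -1e-10)  # True (up to numerical round-off)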

12 Relationship to SVM

13 Example

14 How do we reconcile these views? Does this change anything?

