
Slide 1: Expectation Maximization (EM)
Northwestern University EECS 395/495 Special Topics in Machine Learning
Most slides from http://www.autonlab.org/tutorials/

Slide 2: Outline
– Objective
– Simple example
– Complex example

Slide 3: Objective
Learning with missing/unobservable data. [Diagram: variables J, B, E, A]
Training data:
E B A J
1 1 1 0
1 1 0 0
…
Maximum likelihood

Slide 4: Objective
Learning with missing/unobservable data (A unobserved). [Diagram: variables J, B, E, A]
Training data:
E B A J
1 1 ? 1
1 0 ? 1
0 0 ? 0
…
Optimize what?

Slide 5: Outline
– Objective
– Simple example
– Complex example

Slide 6: Simple example

Slide 7: Maximize likelihood

Slide 8: Same Problem with Hidden Information
Grade: hidden; Score: observable

Slide 9: Same Problem with Hidden Information
[Diagram: variables S and G]

Slide 10: Same Problem with Hidden Information
[Diagram: variables S and G]

Slide 11: EM for our example

Slide 12: EM Convergence

Slide 13: Generalization
– X: observable data (score = {h, c, d})
– z: missing data (grade = {a, b, c, d})
– θ: model parameters to estimate
– E-step: given θ, compute the expectation of z
– M-step: using the z obtained in the E-step, maximize the likelihood with respect to θ
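The grade/score counts from the original slides are not recoverable from this transcript, so as a stand-in with the same E/M alternation, here is EM for a mixture of two biased coins. The data and initial parameters are made up for illustration; the hidden z is which coin produced each observation.

```python
import numpy as np

# Hypothetical data: number of heads in 10 flips per trial; which of the
# two coins was flipped (z) is the missing datum.
flips = 10
heads = np.array([9, 8, 7, 2, 1, 2, 8, 1])

theta = np.array([0.6, 0.4])  # initial guesses for each coin's P(heads)
for _ in range(50):
    # E-step: responsibility of each coin for each trial, given current theta
    like = np.array([
        theta[k] ** heads * (1 - theta[k]) ** (flips - heads)
        for k in range(2)
    ])                              # shape (2, n_trials)
    resp = like / like.sum(axis=0)  # normalize over coins
    # M-step: re-estimate each theta from the expected head counts
    theta = (resp * heads).sum(axis=1) / (resp.sum(axis=1) * flips)

print(theta)  # one estimate settles near 0.8, the other near 0.15
```

Each pass re-weights the data (E) and re-fits the parameters (M), exactly the alternation the slide describes in the abstract.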

Slide 14: Outline
– Objective
– Simple example
– Complex example

Slide 15: Gaussian Mixtures

Slide 16: Gaussian Mixtures
– Know: the data
– Don't know: the data labels
– Objective: …

Slide 17: The GMM assumption

Slide 18: The GMM assumption

Slide 19: The GMM assumption

Slide 20: The GMM assumption

Slide 21: The data generated
– Coordinates (observed)
– Label (hidden)
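Generating data of exactly this form (observed coordinates plus a hidden component label) can be sketched as follows; the weights, means, and standard deviations are made-up illustrative values, not taken from the slides.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative 2-D mixture of three spherical Gaussians (values invented)
weights = np.array([0.5, 0.3, 0.2])
means = np.array([[0.0, 0.0], [4.0, 4.0], [-4.0, 3.0]])
stds = np.array([1.0, 0.8, 1.2])

n = 500
labels = rng.choice(3, size=n, p=weights)   # hidden label: which Gaussian
coords = means[labels] + rng.normal(size=(n, 2)) * stds[labels, None]

print(coords.shape)  # (500, 2): only coords are observed; labels are not
```

At training time only `coords` would be available, which is what makes the labels the missing data z.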

Slide 22: Computing the likelihood

Slide 23: EM for GMMs

Slide 24: EM for GMMs

Slide 25: EM for GMMs
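The EM updates for a GMM that these slides walk through can be sketched in a few lines. This is a 1-D, two-component version on synthetic data (all values illustrative): the E-step computes responsibilities, the M-step refits weights, means, and variances by weighted maximum likelihood.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic 1-D data from two Gaussians with true means -2 and 3
x = np.concatenate([rng.normal(-2.0, 1.0, 150), rng.normal(3.0, 1.0, 150)])

# Initial guesses: mixing weights w, means mu, variances var
w = np.array([0.5, 0.5])
mu = np.array([-1.0, 1.0])
var = np.array([1.0, 1.0])

for _ in range(100):
    # E-step: r[k, i] = P(z_i = k | x_i, current parameters)
    dens = (w[:, None]
            * np.exp(-(x - mu[:, None]) ** 2 / (2 * var[:, None]))
            / np.sqrt(2 * np.pi * var[:, None]))
    r = dens / dens.sum(axis=0)
    # M-step: weighted maximum-likelihood estimates
    nk = r.sum(axis=1)              # effective count per component
    w = nk / len(x)
    mu = (r * x).sum(axis=1) / nk
    var = (r * (x - mu[:, None]) ** 2).sum(axis=1) / nk

print(mu)  # the two estimated means approach -2 and 3
```

With more components or dimensions the structure is identical; only the density evaluation and the covariance update grow.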


Slide 34: Generalization
– X: observable data
– z: unobservable data
– θ: model parameters to estimate
– E-step: given θ, compute the “expectation” of z
– M-step: using the z obtained in the E-step, maximize the likelihood with respect to θ

Slide 35: For distributions in the exponential family
– Exponential family
– Yes: normal, exponential, beta, Bernoulli, binomial, multinomial, Poisson, …
– No: Cauchy and uniform
– EM using sufficient statistics
– S1: compute the expectation of the sufficient statistics
– S2: set the parameters to their maximum-likelihood values given those expected statistics

Slide 36: What EM really is
Maximize the expected log likelihood.
– X: observable data; z: missing data
– E-step: determine the expectation
– M-step: maximize that expectation with respect to θ
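The expected log likelihood this slide refers to is conventionally written as the Q-function; in the slide's notation (X observed, z missing, θ parameters, θ^(t) the current estimate):

```latex
Q\bigl(\theta \mid \theta^{(t)}\bigr)
  = \mathbb{E}_{z \sim p(z \mid X,\, \theta^{(t)})}
    \bigl[\, \log p(X, z \mid \theta) \,\bigr]
```

The E-step computes Q(θ | θ^(t)); the M-step sets θ^(t+1) = argmax over θ of Q(θ | θ^(t)). Each iteration is guaranteed not to decrease the observed-data likelihood, which is why the procedure converges.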

Slide 37: Final comments
– EM deals with missing data / latent variables
– It maximizes the expected log likelihood
– It can converge to a local optimum
