Presentation transcript:

1 Tea-Time-Talks: Every Friday, 3:30 pm, ICS 432

2 We Need Speakers (you)! Please volunteer. Philosophy: a TTT (tea-time-talk) should take approximately 15 minutes (extract the essence only). Email: welling@ics

3 Embedded HMMs. Radford Neal, Matt Beal, Sam Roweis (University of Toronto)

4 Question: Can we efficiently sample in non-linear state-space models with hidden variables (e.g., a non-linear Kalman filter)?

5 Graphical Model. [Diagram: a chain of hidden states over a continuous domain, each with an observed output.]
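The slides only show the diagram, so here is a minimal generative sketch of the kind of non-linear state-space model the talk has in mind; the particular transition function and Gaussian noise levels are illustrative assumptions, not taken from the slides.

```python
import numpy as np

# Assumed illustrative model (not from the slides):
#   x_t = f(x_{t-1}) + process noise     (hidden, continuous domain)
#   y_t = x_t + observation noise        (observed)
def f(x):
    return 0.9 * x + 5.0 * np.sin(x)     # some non-linear transition

def simulate(T=100, q=0.5, r=1.0, seed=0):
    rng = np.random.default_rng(seed)
    x = np.zeros(T)
    y = np.zeros(T)
    x[0] = rng.normal(0.0, 1.0)          # initial hidden state
    y[0] = x[0] + rng.normal(0.0, r)
    for t in range(1, T):
        x[t] = f(x[t - 1]) + rng.normal(0.0, q)   # hidden state
        y[t] = x[t] + rng.normal(0.0, r)          # observation
    return x, y
```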

6 Inference One option is Gibbs sampling. However, if the random variables are tightly coupled, the Markov chain mixes very slowly, because a large number of variables must be changed simultaneously.
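For contrast, here is a hedged sketch of the single-site (Metropolis-within-Gibbs) baseline the slide alludes to: each hidden state x_t is perturbed on its own while its neighbours stay fixed, which is exactly what makes tightly coupled chains mix slowly. The functions trans_logpdf and obs_logpdf are assumed to be supplied by the model; they are not defined in the slides.

```python
import numpy as np

def gibbs_sweep(x, y, trans_logpdf, obs_logpdf, step=0.3, rng=np.random.default_rng()):
    """One Metropolis-within-Gibbs sweep over the hidden states.

    trans_logpdf(x_next, x_prev) -> log p(x_next | x_prev)
    obs_logpdf(y_t, x_t)         -> log p(y_t | x_t)
    (The prior on x_0 is ignored here for brevity.)
    """
    T = len(x)
    for t in range(T):
        prop = x[t] + step * rng.normal()          # small local proposal

        def local_logp(v):                         # terms of log p that involve x_t
            lp = obs_logpdf(y[t], v)
            if t > 0:
                lp += trans_logpdf(v, x[t - 1])    # coupling to the past
            if t < T - 1:
                lp += trans_logpdf(x[t + 1], v)    # coupling to the future
            return lp

        if np.log(rng.uniform()) < local_logp(prop) - local_logp(x[t]):
            x[t] = prop                            # accept; otherwise keep x[t]
    return x
```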

7 Idea: Embed an HMM! 1. Choose a distribution ρ_t at every time slice t. i) Define a forward kernel F_t(x'|x) and a backward kernel B_t(x|x') such that: ρ_t(x) F_t(x'|x) = ρ_t(x') B_t(x|x') (note: not necessarily detailed balance). The kernels will be used to sample K states embedded in the continuous domain of x_t.
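To make the kernel condition concrete, here is a small numerical check under an assumed setup: ρ_t is a zero-mean Gaussian and the forward kernel is a Gaussian AR(1) kernel that leaves ρ_t invariant; that kernel happens to be reversible, so the backward kernel can simply be taken equal to the forward one. None of these specific choices come from the slides.

```python
import numpy as np
from scipy.stats import norm

a, sigma = 0.7, 1.0                       # assumed AR(1) coefficient and pool std. dev.

def rho(x):                               # pool distribution rho_t = N(0, sigma^2)
    return norm.pdf(x, 0.0, sigma)

def F(x_new, x_old):                      # forward kernel: N(a * x_old, (1 - a^2) * sigma^2)
    return norm.pdf(x_new, a * x_old, sigma * np.sqrt(1.0 - a ** 2))

B = F                                     # this kernel is reversible w.r.t. rho, so B = F

# Check the balance condition rho(x) F(x'|x) = rho(x') B(x|x') at a few points.
for x, xp in [(0.3, -1.2), (1.5, 0.1), (-0.7, 2.0)]:
    assert np.isclose(rho(x) * F(xp, x), rho(xp) * B(x, xp))
```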

8 Idea: Embed an HMM! 1. Choose a distribution ρ_t at every time slice t. 2. Sample K states from ρ_t as follows: i) Define a forward kernel F_t(x'|x) and a backward kernel B_t(x|x') such that ρ_t(x) F_t(x'|x) = ρ_t(x') B_t(x|x'), and take the current state sequence x_1, ..., x_T. ii) Pick a number J uniformly at random between 0 and K-1. iii) Apply the forward kernel J times, starting at the current state x_t, and apply the backward kernel K-J-1 times, also starting at x_t; together with x_t this gives K pool states. 3. Sample from the `embedded HMM' using "forward-backward".
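A hedged sketch of how the pool of K candidate states could be built at one time slice: the current state is kept, the forward kernel is applied J times after it and the backward kernel K-1-J times before it. The sampler functions F_sample and B_sample (which draw from the forward and backward kernels) are assumed names, not defined in the slides.

```python
import numpy as np

def build_pool(x_current, K, F_sample, B_sample, rng=np.random.default_rng()):
    """Embed K states around the current state at a single time slice."""
    J = int(rng.integers(K))              # number of forward steps; K-1-J backward steps
    fwd, x = [], x_current
    for _ in range(J):                    # apply the forward kernel J times
        x = F_sample(x)
        fwd.append(x)
    bwd, x = [], x_current
    for _ in range(K - 1 - J):            # apply the backward kernel K-1-J times
        x = B_sample(x)
        bwd.append(x)
    return bwd[::-1] + [x_current] + fwd  # K pool states, current state included
```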

9 Sampling from the eHMM REPEAT: 1. Starting at the current state sequence, sample K states at each time slice by applying the forward kernel J times (J chosen uniformly at random) and the backward kernel K-J-1 times. This defines the embedded state space. 2. Sample a state sequence using the forward-backward algorithm from the distribution with per-time-slice weights proportional to p(x_t | x_{t-1}) p(y_t | x_t) / ρ_t(x_t), restricted to the pool states: x & y are now discrete! Proof of detailed balance: see paper.
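Step 2 is ordinary discrete forward-backward, just run over the K pool states at each time slice. Below is a hedged sketch of forward-filtering / backward-sampling with the unnormalized weights suggested by the slide, p(x_t | x_{t-1}) p(y_t | x_t) / ρ_t(x_t); the argument names (pools, trans, emit, rho) are assumptions for the sake of the example, and the initial-state factor is folded into trans for brevity.

```python
import numpy as np

def sample_embedded_hmm(pools, y, trans, emit, rho, rng=np.random.default_rng()):
    """Forward-filtering, backward-sampling over the embedded (discrete) HMM.

    pools[t]            : list of K candidate states at time t
    trans(x_new, x_old) : p(x_t | x_{t-1}); for t = 0, x_old is None -> prior p(x_0)
    emit(y_t, x)        : p(y_t | x_t)
    rho(t, x)           : pool density rho_t(x) used to generate pools[t]
    """
    T, K = len(pools), len(pools[0])
    alpha = np.zeros((T, K))
    # Forward pass with the unnormalized weights p(x | x_prev) * p(y | x) / rho(x).
    for k in range(K):
        alpha[0, k] = trans(pools[0][k], None) * emit(y[0], pools[0][k]) / rho(0, pools[0][k])
    alpha[0] /= alpha[0].sum()                       # rescale: the eHMM is unnormalized
    for t in range(1, T):
        for k in range(K):
            x = pools[t][k]
            msg = sum(alpha[t - 1, j] * trans(x, pools[t - 1][j]) for j in range(K))
            alpha[t, k] = msg * emit(y[t], x) / rho(t, x)
        alpha[t] /= alpha[t].sum()
    # Backward pass: sample one pool index per time slice.
    idx = np.zeros(T, dtype=int)
    idx[-1] = rng.choice(K, p=alpha[-1])
    for t in range(T - 2, -1, -1):
        w = alpha[t] * np.array([trans(pools[t + 1][idx[t + 1]], pools[t][j]) for j in range(K)])
        idx[t] = rng.choice(K, p=w / w.sum())
    return [pools[t][i] for t, i in enumerate(idx)]
```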

10 Note: The probabilities of the embedded HMM are not normalized, so it should be treated as an undirected graphical model.

