Stochastic approximate inference. Kay H. Brodersen, Computational Neuroeconomics Group, Department of Economics, University of Zurich; Machine Learning and Pattern Recognition Group, Department of Computer Science, ETH Zurich. http://people.inf.ethz.ch/bkay/
2 When do we need approximate inference? Whenever we need to sample from an arbitrary distribution, or to compute an expectation under it.
3 Which type of approximate inference? Deterministic approximations (through structural assumptions): computationally efficient; efficient representation; learning rules may give additional insight; but application often requires mathematical derivations (hard work) and introduces a systematic error. Stochastic approximations (through sampling): asymptotically exact; easily applicable general-purpose algorithms; but computationally expensive and storage intensive.
4 Themes in stochastic approximate inference [figure: overview of the sampling themes covered in this talk]
Part 1: Transformation method
6 Transformation
7 Transformation method: algorithm
8 Transformation method: example
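As an illustration of the transformation method (a sketch, not taken from the slides), the exponential distribution has an invertible CDF, so uniform samples can be passed through the inverse CDF to produce exponential samples. The function name and the choice of target distribution are mine:

```python
import random
import math

def sample_exponential(lam, n, seed=0):
    """Sample from Exp(lam) via the transformation (inverse-CDF) method.

    The CDF is F(x) = 1 - exp(-lam * x), so F^{-1}(u) = -ln(1 - u) / lam.
    Passing u ~ Uniform(0, 1) through F^{-1} yields a sample from Exp(lam).
    """
    rng = random.Random(seed)
    return [-math.log(1.0 - rng.random()) / lam for _ in range(n)]

samples = sample_exponential(lam=2.0, n=100_000)
print(sum(samples) / len(samples))  # close to the true mean 1/lam = 0.5
```

This works whenever the inverse CDF is available in closed form; as the summary slide notes, obtaining it is the hard part for most distributions.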
9 Transformation method: summary. Yields high-quality samples; easy to implement; computationally efficient; but obtaining the inverse CDF can be difficult.
Part 2: Rejection sampling and importance sampling
11 Rejection sampling
12 Rejection sampling: algorithm
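A minimal sketch of the rejection sampling algorithm (my own example, not from the slides): propose from a simple distribution q, and accept each candidate with probability p(z) / (k·q(z)), where k·q envelopes the target p. Here the target is the Beta(2,2) density with a uniform proposal:

```python
import random

def rejection_sample(p, q_sample, q_pdf, k, n, seed=0):
    """Rejection sampling: propose z ~ q, accept with prob p(z) / (k * q(z)).

    Requires the envelope condition k * q(z) >= p(z) for all z.
    """
    rng = random.Random(seed)
    samples = []
    while len(samples) < n:
        z = q_sample(rng)
        u = rng.random() * k * q_pdf(z)  # uniform height under the envelope
        if u <= p(z):
            samples.append(z)
    return samples

# Target: Beta(2,2) density p(z) = 6 z (1 - z) on [0, 1].
# Proposal: Uniform(0, 1); the density peaks at 1.5, so k = 1.5 suffices.
p = lambda z: 6.0 * z * (1.0 - z)
samples = rejection_sample(p, lambda rng: rng.random(), lambda z: 1.0, k=1.5, n=50_000)
print(sum(samples) / len(samples))  # close to the Beta(2,2) mean, 0.5
```

The acceptance rate is 1/k here; in high dimensions k typically grows very fast, which is the weakness MCMC later addresses.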
13 Importance sampling
14 Importance sampling
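Importance sampling computes an expectation directly rather than producing samples from p. A sketch under my own choice of target and proposal (not from the slides): estimate E_p[z²] under a standard normal using a uniform proposal, reweighting each draw by p(z)/q(z):

```python
import random
import math

def importance_estimate(f, p_pdf, q_pdf, q_sample, n, seed=0):
    """Importance sampling: E_p[f(z)] ~= (1/n) * sum f(z_i) p(z_i)/q(z_i),
    with the z_i drawn from the proposal q rather than the target p."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        z = q_sample(rng)
        total += f(z) * p_pdf(z) / q_pdf(z)  # importance weight p/q
    return total / n

# Target p = N(0, 1); proposal q = Uniform(-5, 5); estimate E_p[z^2] = 1.
p_pdf = lambda z: math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)
q_pdf = lambda z: 0.1  # uniform density on [-5, 5]
q_sample = lambda rng: rng.uniform(-5.0, 5.0)
est = importance_estimate(lambda z: z * z, p_pdf, q_pdf, q_sample, n=200_000)
print(est)  # close to 1.0
```

The estimator is unbiased, but its variance depends on how well q matches p; a poorly matched proposal makes a few huge weights dominate the sum.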
Part 3: Markov Chain Monte Carlo
16 Markov Chain Monte Carlo (MCMC). Idea: using a framework called Markov Chain Monte Carlo, we can sample from a large class of distributions and overcome the problems that previous methods face in high dimensions.
17 Background on Markov chains
18 Background on Markov chains [figure: state diagram of a three-state Markov chain and its equilibrium distribution]
19 Background on Markov chains [figure: state diagram of a three-state Markov chain and its equilibrium distribution, continued]
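The key property behind MCMC is that an ergodic Markov chain converges to a unique equilibrium distribution regardless of where it starts. A small sketch (the three-state transition matrix is illustrative, not the one from the slides): repeatedly applying the transition matrix to any initial distribution converges to the equilibrium.

```python
def equilibrium(P, n_steps=200):
    """Approximate the equilibrium distribution of a Markov chain by
    repeatedly applying the transition matrix P (rows sum to 1)."""
    k = len(P)
    pi = [1.0] + [0.0] * (k - 1)  # start concentrated in state 1
    for _ in range(n_steps):
        pi = [sum(pi[i] * P[i][j] for i in range(k)) for j in range(k)]
    return pi

# A three-state chain; P is doubly stochastic, so the equilibrium is uniform.
P = [[0.50, 0.25, 0.25],
     [0.25, 0.50, 0.25],
     [0.25, 0.25, 0.50]]
print(equilibrium(P))  # each entry close to 1/3
```

MCMC turns this around: it constructs a chain whose equilibrium distribution is the target p, and then uses the chain's states as (correlated) samples from p.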
20 The idea behind MCMC [figure: empirical distribution of samples approximating the desired target distribution]
21 Metropolis algorithm
22 Metropolis-Hastings algorithm
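A minimal sketch of the Metropolis algorithm (the symmetric-proposal special case of Metropolis-Hastings; target and tuning choices are mine, not from the slides). Note that only an unnormalized density is needed, since the normalizing constant cancels in the acceptance ratio:

```python
import random
import math

def metropolis(log_p, z0, step, n, seed=0):
    """Metropolis algorithm with a symmetric Gaussian random-walk proposal.

    A candidate z* = z + eps is accepted with probability
    min(1, p(z*) / p(z_tau)); working in log space avoids overflow.
    """
    rng = random.Random(seed)
    z = z0
    samples = []
    for _ in range(n):
        z_star = z + rng.gauss(0.0, step)
        log_ratio = log_p(z_star) - log_p(z)
        if log_ratio >= 0.0 or rng.random() < math.exp(log_ratio):
            z = z_star  # accept; otherwise keep the current state
        samples.append(z)
    return samples

# Target: standard normal known only up to a constant, log p(z) = -z^2 / 2.
samples = metropolis(lambda z: -0.5 * z * z, z0=0.0, step=1.0, n=50_000)
print(sum(samples) / len(samples))  # close to 0, the target mean
```

Rejected candidates still contribute a (repeated) sample of the current state; dropping them instead would bias the empirical distribution.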
23 Metropolis: accept or reject? Increase in density (p(z*) ≥ p(z^τ)): the candidate z* is always accepted. Decrease in density: z* is accepted with probability p(z*)/p(z^τ).
24 Gibbs sampling
25 Gibbs sampling
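Gibbs sampling updates one variable at a time by drawing from its full conditional given the current values of the others. A sketch for a case where the conditionals are available exactly (the bivariate-normal example is mine, not from the slides):

```python
import random
import math

def gibbs_bivariate_normal(rho, n, seed=0):
    """Gibbs sampling for a bivariate standard normal with correlation rho.

    Alternates exact draws from the two full conditionals:
      z1 | z2 ~ N(rho * z2, 1 - rho^2),  z2 | z1 ~ N(rho * z1, 1 - rho^2).
    """
    rng = random.Random(seed)
    sd = math.sqrt(1.0 - rho * rho)
    z1, z2 = 0.0, 0.0
    samples = []
    for _ in range(n):
        z1 = rng.gauss(rho * z2, sd)  # update z1 given the current z2
        z2 = rng.gauss(rho * z1, sd)  # update z2 given the new z1
        samples.append((z1, z2))
    return samples

samples = gibbs_bivariate_normal(rho=0.8, n=100_000)
corr = sum(a * b for a, b in samples) / len(samples)
print(corr)  # close to rho = 0.8
```

Every proposal is accepted (Gibbs is Metropolis-Hastings with acceptance probability 1), but strongly correlated variables make the chain mix slowly, since each coordinate can only move a little given the others.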
26 Summary. Throughout Bayesian statistics, we encounter intractable problems. Most of these problems are: (i) evaluating a distribution; or (ii) computing an expectation under a distribution. Sampling methods provide a stochastic alternative to deterministic methods. They are usually computationally less efficient, but are asymptotically correct, broadly applicable, and easy to implement. We looked at three main approaches. Transformation method: efficient sampling from simple distributions. Rejection sampling and importance sampling: sampling from arbitrary distributions; direct computation of an expected value. Markov Chain Monte Carlo (MCMC): efficient sampling from high-dimensional distributions through the Metropolis-Hastings algorithm or Gibbs sampling.
27 Supplementary slides
28 Transformation method: proof
29 Background on Markov chains