Introduction to Sampling Methods
Qi Zhao, Oct. 27, 2004
Outline: background, seven sampling methods, conclusion.
Monte Carlo Methods. Aim: to solve one or both of the following problems. Problem 1: to generate samples {x^(r)} from a given probability distribution P(x). Problem 2: to estimate expectations of functions under this distribution, for example Φ = E[φ(x)] = ∫ φ(x) P(x) dx.
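Problem 2 can be sketched in a few lines: if we can already draw samples from P (here, assuming a standard normal as an illustrative target), the expectation is estimated by a plain sample average.

```python
import random

def mc_expectation(phi, sampler, n=100_000):
    """Estimate E[phi(x)] under P by averaging phi over samples drawn from P."""
    return sum(phi(sampler()) for _ in range(n)) / n

random.seed(0)
# Illustrative target: E[x^2] under a standard normal is 1.
est = mc_expectation(lambda x: x * x, lambda: random.gauss(0.0, 1.0))
```

The rest of the talk is about what to do when drawing from P directly, as `sampler` does here, is not possible.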
Monte Carlo Methods. Write P(x) in the following form: P(x) = P*(x) / Z, where P*(x) is known (we can evaluate it for any x) but the normalising constant Z is unknown. P(x) is hard to sample from, and Z is a huge sum over the state space. What is the cost to evaluate P*(x)?
Monte Carlo Methods. To compute Z = Σ_x P*(x), every point in the state space would have to be visited, which is infeasible in high-dimensional spaces.
Monte Carlo Methods. It is also difficult to estimate the expectation of φ(x) by drawing random samples uniformly from the state space and evaluating P*(x): in high dimensions, almost all uniform samples land where P(x) is negligible.
Sampling Methods: importance sampling, rejection sampling, Metropolis sampling, Gibbs sampling, factored sampling, Condensation sampling, ICondensation sampling. References.
Importance Sampling. Not a method for generating samples from P(x), just a method for estimating the expectation of a function φ(x). P(x) is complex, while the sampler Q(x) is of simpler density. Draw samples x^(r) from Q, weight each by w_r = P*(x^(r)) / Q*(x^(r)), and estimate E[φ] ≈ Σ_r w_r φ(x^(r)) / Σ_r w_r.
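A minimal sketch of the self-normalised estimator above. The target (an unnormalised Gaussian centred at 2) and the broader proposal N(0, 3) are illustrative choices, not anything fixed by the method.

```python
import math
import random

def importance_estimate(phi, p_star, q_star, q_sampler, n=200_000):
    """Estimate E_P[phi] using samples from the simpler proposal Q.

    p_star and q_star are unnormalised densities; the self-normalised
    weights w_r = p_star(x_r) / q_star(x_r) correct for the mismatch.
    """
    xs = [q_sampler() for _ in range(n)]
    ws = [p_star(x) / q_star(x) for x in xs]
    return sum(w * phi(x) for w, x in zip(ws, xs)) / sum(ws)

random.seed(1)
# Target: unnormalised N(2, 1); proposal: N(0, 3) (broad and easy to sample).
p_star = lambda x: math.exp(-0.5 * (x - 2.0) ** 2)
q_star = lambda x: math.exp(-0.5 * (x / 3.0) ** 2)
mean_est = importance_estimate(lambda x: x, p_star, q_star,
                               lambda: random.gauss(0.0, 3.0))
```

Note that only the unnormalised densities P* and Q* appear; neither normalising constant is needed.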
Rejection Sampling. A further assumption beyond importance sampling: we know a constant c such that c Q*(x) ≥ P*(x) for all x. Generate x from Q and u uniformly from [0, c Q*(x)]; accept x if u ≤ P*(x). Evaluation of the method: the probability density of the x-coordinates of the accepted points must be proportional to P*(x), so the accepted points are samples from P(x).
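The accept/reject loop can be sketched as follows. The particular target, proposal, and envelope constant c = 2 are illustrative assumptions chosen so that c Q*(x) ≥ P*(x) holds everywhere.

```python
import math
import random

def rejection_sample(p_star, q_star, q_sampler, c, n_accept=20_000):
    """Draw samples from P by accepting x ~ Q with probability
    p_star(x) / (c * q_star(x)). Requires c * q_star(x) >= p_star(x) for all x."""
    out = []
    while len(out) < n_accept:
        x = q_sampler()
        u = random.uniform(0.0, c * q_star(x))
        if u <= p_star(x):
            out.append(x)
    return out

random.seed(2)
# Target: unnormalised N(0, 1); proposal: N(0, 2).  With c = 2,
# c * q_star(x) = 2 exp(-x^2/8) >= exp(-x^2/2) = p_star(x) for all x.
p_star = lambda x: math.exp(-0.5 * x * x)
q_star = lambda x: math.exp(-x * x / 8.0)
samples = rejection_sample(p_star, q_star, lambda: random.gauss(0.0, 2.0), c=2.0)
mean = sum(samples) / len(samples)
```

The acceptance rate falls quickly as c grows, which is why rejection sampling degrades in high dimensions.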
Metropolis Sampling. Propose a new state x' from a proposal density Q(x'; x) and compute a = P*(x') / P*(x) (for a symmetric Q). If a ≥ 1, then the new state is accepted. Otherwise, the new state is accepted with probability a. Q(x'; x) does not necessarily look similar to P(x) at all, and it has a shape that changes as x changes.
Metropolis Sampling. Proof sketch: the acceptance rule satisfies detailed balance, P(x) T(x'; x) = P(x') T(x; x'), where T is the chain's transition probability; hence P(x) is invariant under the Markov chain and is its equilibrium distribution.
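The acceptance rule can be sketched with a symmetric Gaussian random-walk proposal. The target (unnormalised N(1, 1)), step size, chain length, and burn-in below are illustrative choices.

```python
import math
import random

def metropolis(p_star, x0, step=1.0, n=50_000, burn=5_000):
    """Metropolis sampler with a symmetric Gaussian proposal Q(x'; x).

    A move to x' is always accepted when p_star(x') >= p_star(x);
    otherwise it is accepted with probability p_star(x') / p_star(x)."""
    x, chain = x0, []
    for i in range(n):
        x_new = x + random.gauss(0.0, step)
        a = p_star(x_new) / p_star(x)
        if a >= 1.0 or random.random() < a:
            x = x_new          # accept; otherwise keep the current state
        if i >= burn:
            chain.append(x)
    return chain

random.seed(3)
chain = metropolis(lambda x: math.exp(-0.5 * (x - 1.0) ** 2), x0=0.0)
mean = sum(chain) / len(chain)
```

Unlike rejection sampling, successive states are correlated, so the effective number of independent samples is smaller than the chain length.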
Gibbs Sampling. A special case of Metropolis sampling in which the proposal Q is defined in terms of the conditional distributions of the joint distribution P(x), and every proposal is accepted. It samples from distributions over at least two dimensions.
Gibbs Sampling. An example with two variables: alternately draw x1 from P(x1 | x2) and x2 from P(x2 | x1); the pairs (x1, x2) converge to samples from the joint distribution P(x1, x2).
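The two-variable example can be sketched for a bivariate normal, where both full conditionals are themselves Gaussian (the correlation ρ = 0.6 is an illustrative choice).

```python
import random

def gibbs_bivariate_normal(rho, n=50_000, burn=5_000):
    """Gibbs sampling for a unit-variance bivariate normal with correlation rho.

    Each full conditional is Gaussian:
        x1 | x2 ~ N(rho * x2, 1 - rho^2), and symmetrically for x2 | x1."""
    x1 = x2 = 0.0
    sd = (1.0 - rho * rho) ** 0.5
    chain = []
    for i in range(n):
        x1 = random.gauss(rho * x2, sd)   # sample x1 from P(x1 | x2)
        x2 = random.gauss(rho * x1, sd)   # sample x2 from P(x2 | x1)
        if i >= burn:
            chain.append((x1, x2))
    return chain

random.seed(4)
chain = gibbs_bivariate_normal(rho=0.6)
# Empirical E[x1 * x2] should approach rho.
corr = sum(a * b for a, b in chain) / len(chain)
```

No acceptance test appears anywhere: every conditional draw is kept, which is what distinguishes Gibbs from general Metropolis sampling.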
Markov Chain Monte Carlo. Comparison with rejection sampling: in rejection sampling, accepted points are independent samples from the desired distribution; Markov chain Monte Carlo methods instead involve a Markov process in which a sequence of states is generated, each sample having a probability distribution that depends on the previous value.
Factored Sampling. Deals with non-Gaussian observations in a single image. The essential idea is to transform the uniform distribution into a weighted distribution, so that non-Gaussian forms can also use uniform random numbers (random bits) to generate samples.
Factored Sampling. I. A sample set {s^(1), ..., s^(N)} is generated from the prior density p(x). II. An index n is chosen with probability π_n proportional to the observation density p(z | x = s^(n)); the value chosen in this fashion has a distribution that tends to the posterior p(x | z) as N → ∞.
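Steps I and II can be sketched as follows. The Gaussian prior, the observation z = 1, and the unit-noise likelihood are illustrative assumptions (chosen conjugate so the true posterior mean, 0.8, is known for checking).

```python
import math
import random

def factored_sample(prior_sampler, likelihood, n=5_000):
    """Factored sampling: draw s^(1..N) from the prior, weight each by the
    observation density, then choose indices with probability proportional
    to the weights. The chosen values approximate the posterior as N grows."""
    samples = [prior_sampler() for _ in range(n)]        # step I
    weights = [likelihood(s) for s in samples]           # step II: weights
    total = sum(weights)
    probs = [w / total for w in weights]
    return random.choices(samples, weights=probs, k=n)   # step II: selection

random.seed(5)
# Prior N(0, 2); likelihood of observation z = 1 with unit noise.
# The conjugate posterior mean is (1 / (1/4 + 1)) * z = 0.8.
posterior = factored_sample(lambda: random.gauss(0.0, 2.0),
                            lambda s: math.exp(-0.5 * (s - 1.0) ** 2))
mean = sum(posterior) / len(posterior)
```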
Condensation Sampling. Based on factored sampling, extended to apply iteratively to successive images in a sequence.
Condensation Sampling
I. Select a sample:
 a. Generate a random number r, uniformly distributed on [0, 1].
 b. Find the smallest j for which c_{t-1}^(j) ≥ r and set s'_t^(n) = s_{t-1}^(j).
II. Predict using the dynamical model, e.g. s_t^(n) = A s'_t^(n) + B w_t^(n), where w_t^(n) is a vector of standard normal variates.
III. Measure and weight:
 a. Calculate the weight of the new position in terms of the measured features z_t: π_t^(n) = p(z_t | x_t = s_t^(n)).
 b. Normalise so that Σ_n π_t^(n) = 1.
 c. Store (s_t^(n), π_t^(n), c_t^(n)) together, where c_t^(0) = 0 and c_t^(n) = c_t^(n-1) + π_t^(n).
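The three steps above can be sketched as one iteration of a particle filter. The 1-D toy tracker (state drifting by +1 per frame, unit-noise observations z_t = 1, 2, 3) is entirely hypothetical, invented only to exercise the loop.

```python
import bisect
import math
import random

def condensation_step(samples, weights, dynamics, likelihood):
    """One Condensation iteration: select by cumulative weight, predict with
    the dynamical model, then re-weight against the new measurement."""
    # I. Select: build cumulative weights c^(n); for each new sample, find the
    #    smallest n with c^(n) >= a uniform random r.
    cum, c = [], 0.0
    for w in weights:
        c += w
        cum.append(c)
    new_samples = []
    for _ in range(len(samples)):
        r = random.uniform(0.0, cum[-1])
        n = bisect.bisect_left(cum, r)
        # II. Predict: push the selected sample through the noisy dynamics.
        new_samples.append(dynamics(samples[n]))
    # III. Measure: weight by the observation density and normalise.
    new_weights = [likelihood(s) for s in new_samples]
    total = sum(new_weights)
    return new_samples, [w / total for w in new_weights]

random.seed(6)
samples = [random.gauss(0.0, 1.0) for _ in range(2_000)]
weights = [1.0 / len(samples)] * len(samples)
for z in [1.0, 2.0, 3.0]:   # three frames of the toy sequence
    samples, weights = condensation_step(
        samples, weights,
        dynamics=lambda s: s + 1.0 + random.gauss(0.0, 0.3),
        likelihood=lambda s, z=z: math.exp(-0.5 * (s - z) ** 2))
estimate = sum(w * s for w, s in zip(weights, samples))
```

After three frames the weighted mean should sit near the last observation, z = 3.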
ICondensation Sampling. A technique developed to improve the efficiency of factored sampling and Condensation sampling. Premise: auxiliary knowledge is available in the form of an importance function that describes which areas of state space contain most information about the posterior. Idea: concentrate samples in those areas of state space by generating new samples from the importance function instead of from the prior, correcting the weights accordingly.
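The weight correction can be sketched as below: samples come from an importance function g instead of the prior f, and each weight is multiplied by f(s)/g(s). The broad prior N(0, 3), the importance function N(1, 1), and the likelihood centred on z = 1 are all illustrative assumptions (chosen so the true posterior mean, 0.9, is known).

```python
import math
import random

def icondensation_weights(importance_sampler, prior_density,
                          importance_density, likelihood, n=5_000):
    """ICondensation-style reweighting (sketch): draw samples from an
    importance function g concentrated where measurements are informative,
    then correct each weight by f(s) / g(s) so the weighted set still
    represents the posterior."""
    samples = [importance_sampler() for _ in range(n)]
    weights = [likelihood(s) * prior_density(s) / importance_density(s)
               for s in samples]
    total = sum(weights)
    return samples, [w / total for w in weights]

random.seed(7)
samples, weights = icondensation_weights(
    lambda: random.gauss(1.0, 1.0),                  # sample from g = N(1, 1)
    lambda s: math.exp(-0.5 * (s / 3.0) ** 2),       # prior f = N(0, 3), unnormalised
    lambda s: math.exp(-0.5 * (s - 1.0) ** 2),       # g, unnormalised
    lambda s: math.exp(-0.5 * (s - 1.0) ** 2))       # likelihood for z = 1
post_mean = sum(w * s for w, s in zip(weights, samples))
```

When g equals the prior, the correction factor is 1 and this reduces to plain factored sampling.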
Summary.
- Importance sampling: for when only the expectation of a function is needed and P*(x) is easy to evaluate.
- Rejection sampling: additionally requires that a constant c with c Q*(x) ≥ P*(x) is easy to get.
- Metropolis sampling: a Markov chain Monte Carlo (MCMC) method.
- Gibbs sampling: a special case of Metropolis sampling.
- Factored sampling: uses uniform random numbers to sample non-Gaussian distributions.
- Condensation sampling: extends factored sampling to successive images.
- ICondensation sampling: an improvement of Condensation sampling via an importance function.
References
- D. J. C. MacKay, Introduction to Monte Carlo Methods.
- M. Isard and A. Blake, Condensation: conditional density propagation for visual tracking.
- M. Isard and A. Blake, ICondensation: unifying low-level and high-level tracking in a stochastic framework.