
1 Computer vision: models, learning and inference. Chapter 19: Temporal models. ©2011 Simon J.D. Prince

2 Goal: to track the object state from frame to frame in a video. Difficulties: clutter (data association); a single image may not be enough to fully define the state; the relationship between frames may be complicated.

3 Structure: Temporal models; Kalman filter; Extended Kalman filter; Unscented Kalman filter; Particle filters; Applications.

4 Temporal models. Consider an evolving system represented by an unknown vector w; this is termed the state. Examples: the 2D position of a tracked object in an image; the 3D pose of a tracked object in the world; the joint positions of an articulated model. Our goal: to compute the marginal posterior distribution over w at time t.

5 Estimating state. Two contributions to estimating the state: 1. A set of measurements x_t, which provide information about the state w_t at time t. This is a generative model: the measurements are derived from the state using a known probability relation Pr(x_t | w_1...w_T). 2. A time series model, which says something about the expected way the system will evolve, e.g. Pr(w_t | w_1...w_{t-1}, w_{t+1}...w_T).

6 Temporal models.

7 Assumptions. Only the immediate past matters (Markov): the probability of the state at time t is conditionally independent of the states at times 1...t-2 given the state at time t-1. Measurements depend only on the current state: the likelihood of the measurements at time t is conditionally independent of all the other measurements and of the states at times 1...t-1, t+1...T given the state at time t.

8 Graphical model: world states and measurements.

9 Recursive estimation: the posterior is built up one time step at a time (time 1, time 2, ..., time t); at each step the prior comes from the temporal model and Bayes' rule (Pr(w|x) ∝ Pr(x|w) Pr(w)) incorporates the new measurement.

10 Computing the prior (time evolution). At each time step the prior is given by the Chapman-Kolmogorov relation, which combines the temporal model with the posterior at time t-1: Pr(w_t | x_1...x_{t-1}) = ∫ Pr(w_t | w_{t-1}) Pr(w_{t-1} | x_1...x_{t-1}) dw_{t-1}.
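The chapter works with continuous states, where this is an integral, but the structure of the Chapman-Kolmogorov step is easiest to see in a discrete toy version, where the integral becomes a sum. A minimal sketch; the 3-state transition matrix and the previous posterior are invented purely for illustration.

```python
import numpy as np

# Toy discrete Chapman-Kolmogorov prediction step:
# Pr(w_t | x_1..x_{t-1}) = sum_{w_{t-1}} Pr(w_t | w_{t-1}) Pr(w_{t-1} | x_1..x_{t-1})

# Hypothetical 3-state temporal model: rows index w_{t-1}, columns index w_t.
transition = np.array([[0.8, 0.2, 0.0],
                       [0.1, 0.8, 0.1],
                       [0.0, 0.2, 0.8]])

# Posterior over w_{t-1} after seeing measurements x_1..x_{t-1}.
posterior_prev = np.array([0.6, 0.3, 0.1])

# Prior over w_t before the measurement at time t arrives.
prior_t = posterior_prev @ transition
print(prior_t)    # [0.51 0.38 0.11]
```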

11 Summary: inference alternates between the temporal model (time evolution / prediction) and the measurement model (measurement update).

12 Structure: Temporal models; Kalman filter; Extended Kalman filter; Unscented Kalman filter; Particle filters; Applications.

13 Kalman filter. The Kalman filter is a special case of this recursive estimation procedure: the temporal model and measurement model are carefully chosen so that if the posterior at time t-1 was Gaussian, then the prior at time t will be Gaussian and the posterior at time t will be Gaussian. The Kalman filter equations are rules for updating the means and covariances of these Gaussians.

14 The Kalman filter (schematic): the estimate from the previous time step is passed through the prediction step, combined with the measurement likelihood, and the combination gives the new posterior.

15 Kalman filter definition. Time evolution equation: w_t = μ_p + Ψ w_{t-1} + ε_p, where Ψ is the state transition matrix and ε_p is additive Gaussian noise with covariance Σ_p. Measurement equation: x_t = μ_m + Φ w_t + ε_m, where Φ relates the state to the measurement and ε_m is additive Gaussian noise with covariance Σ_m.
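To make the two defining equations concrete, here is a minimal simulation sketch in numpy. The 2D set-up, the particular matrices, and the noise levels are invented for illustration and are not taken from the book; Psi, Phi, mu_p, mu_m, Sigma_p and Sigma_m stand for Ψ, Φ, μ_p, μ_m, Σ_p and Σ_m above.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 2D tracking set-up: state = 2D position, measurement = noisy position.
Psi = np.eye(2)                    # state transition matrix (time evolution)
mu_p = np.array([1.0, 0.5])        # constant drift per frame
Sigma_p = 0.1 * np.eye(2)          # additive Gaussian process noise
Phi = np.eye(2)                    # relates state to measurement
mu_m = np.zeros(2)
Sigma_m = 0.5 * np.eye(2)          # additive Gaussian measurement noise

w = np.zeros(2)                    # initial state
states, measurements = [], []
for t in range(20):
    # Time evolution equation: w_t = mu_p + Psi w_{t-1} + eps_p
    w = mu_p + Psi @ w + rng.multivariate_normal(np.zeros(2), Sigma_p)
    # Measurement equation:    x_t = mu_m + Phi w_t + eps_m
    x = mu_m + Phi @ w + rng.multivariate_normal(np.zeros(2), Sigma_m)
    states.append(w)
    measurements.append(x)
```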

17 Temporal evolution: the prediction step maps the Gaussian posterior at time t-1 to a Gaussian prior at time t with mean μ_+ = μ_p + Ψ μ_{t-1} and covariance Σ_+ = Σ_p + Ψ Σ_{t-1} Ψ^T.

18 Measurement incorporation: applying Bayes' rule to combine the prediction with the measurement likelihood gives a Gaussian posterior with covariance Σ_t = (Σ_+^{-1} + Φ^T Σ_m^{-1} Φ)^{-1} and mean μ_t = Σ_t (Φ^T Σ_m^{-1} (x_t - μ_m) + Σ_+^{-1} μ_+).

19 Kalman filter. This is not the usual way these equations are presented. Part of the reason for this is the size of the matrix inverses: Φ is usually landscape, and so working with Φ^T Σ_m^{-1} Φ is inefficient. Define the Kalman gain: K = Σ_+ Φ^T (Σ_m + Φ Σ_+ Φ^T)^{-1}.

20 Mean term: using the matrix inversion relations, the posterior mean can be rewritten in terms of the Kalman gain as μ_t = μ_+ + K (x_t - μ_m - Φ μ_+).

21 Covariance term: using the matrix inversion relations again, the posterior covariance becomes Σ_t = (I - K Φ) Σ_+.

22 Final Kalman filter equations: μ_t = μ_+ + K (x_t - μ_m - Φ μ_+), where the bracketed term is the innovation (the difference between the actual and predicted measurements), and Σ_t = Σ_+ - K Φ Σ_+, i.e. the prior variance minus a term due to the information from the measurement.
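As a sanity check on the matrix inversion relations used on the last three slides, the sketch below verifies numerically that the direct Bayes-rule form and the Kalman-gain form of the posterior agree; the dimensions and the random covariances are arbitrary and chosen only for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

Dw, Dx = 4, 2
Phi = rng.standard_normal((Dx, Dw))            # "landscape" measurement matrix
A = rng.standard_normal((Dw, Dw)); Sigma_plus = A @ A.T + Dw * np.eye(Dw)
B = rng.standard_normal((Dx, Dx)); Sigma_m    = B @ B.T + Dx * np.eye(Dx)
mu_plus = rng.standard_normal(Dw)
mu_m = rng.standard_normal(Dx)
x = rng.standard_normal(Dx)

# Direct (Bayes' rule) form: requires a Dw x Dw inverse.
Sigma_direct = np.linalg.inv(np.linalg.inv(Sigma_plus)
                             + Phi.T @ np.linalg.inv(Sigma_m) @ Phi)
mu_direct = Sigma_direct @ (Phi.T @ np.linalg.inv(Sigma_m) @ (x - mu_m)
                            + np.linalg.inv(Sigma_plus) @ mu_plus)

# Kalman-gain form: only a Dx x Dx inverse.
K = Sigma_plus @ Phi.T @ np.linalg.inv(Phi @ Sigma_plus @ Phi.T + Sigma_m)
Sigma_gain = (np.eye(Dw) - K @ Phi) @ Sigma_plus
mu_gain = mu_plus + K @ (x - mu_m - Phi @ mu_plus)

assert np.allclose(Sigma_direct, Sigma_gain)
assert np.allclose(mu_direct, mu_gain)
```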

23 Kalman filter summary. Time evolution equation: w_t = μ_p + Ψ w_{t-1} + ε_p. Measurement equation: x_t = μ_m + Φ w_t + ε_m. Inference alternates between the prediction step (giving μ_+, Σ_+) and measurement incorporation via the Kalman gain (giving μ_t, Σ_t).
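The whole recursion fits in a few lines of numpy. This is a minimal sketch of the filter summarized above, using the same symbols (Psi, Phi, mu_p, mu_m, Sigma_p, Sigma_m); the 1D constant-velocity demo at the end, and all its numbers, are invented for illustration rather than taken from the book's examples.

```python
import numpy as np

def kalman_step(mu, Sigma, x, Psi, mu_p, Sigma_p, Phi, mu_m, Sigma_m):
    """One recursion of the Kalman filter: predict, then incorporate measurement x."""
    # Prediction (temporal evolution): Norm[mu_plus, Sigma_plus]
    mu_plus = mu_p + Psi @ mu
    Sigma_plus = Sigma_p + Psi @ Sigma @ Psi.T

    # Measurement incorporation via the Kalman gain
    K = Sigma_plus @ Phi.T @ np.linalg.inv(Phi @ Sigma_plus @ Phi.T + Sigma_m)
    innovation = x - mu_m - Phi @ mu_plus       # actual minus predicted measurement
    mu_new = mu_plus + K @ innovation
    Sigma_new = (np.eye(len(mu)) - K @ Phi) @ Sigma_plus
    return mu_new, Sigma_new

# Hypothetical 1D constant-velocity example: state = [position, velocity],
# measurement = noisy position.  Numbers are made up for illustration.
dt = 1.0
Psi = np.array([[1.0, dt], [0.0, 1.0]])
Phi = np.array([[1.0, 0.0]])
mu_p, mu_m = np.zeros(2), np.zeros(1)
Sigma_p = 0.01 * np.eye(2)
Sigma_m = np.array([[0.25]])

rng = np.random.default_rng(0)
mu, Sigma = np.zeros(2), np.eye(2)
w = np.array([0.0, 1.0])
for t in range(30):
    w = mu_p + Psi @ w + rng.multivariate_normal(np.zeros(2), Sigma_p)
    x = mu_m + Phi @ w + rng.multivariate_normal(np.zeros(1), Sigma_m)
    mu, Sigma = kalman_step(mu, Sigma, x, Psi, mu_p, Sigma_p, Phi, mu_m, Sigma_m)
    # mu now approximates the posterior mean of [position, velocity] at time t
```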

24 Kalman filter example 1.

25 Kalman filter example 2: the filter alternates between prediction and measurement incorporation.

26 Smoothing. The estimates so far depend only on measurements up to the current point in time; sometimes we want to estimate the state based on future measurements as well. Fixed-lag smoother: an on-line scheme in which the optimal estimate for the state at time t-τ is calculated from the measurements up to time t, where τ is the time lag; i.e., we wish to calculate Pr(w_{t-τ} | x_1...x_t). Fixed-interval smoother: we have a fixed time interval of measurements and want to calculate the optimal state estimate based on all of them; in other words, instead of calculating Pr(w_t | x_1...x_t) we now estimate Pr(w_t | x_1...x_T), where T is the total length of the interval.

27 Fixed-lag smoother: the state evolution and measurement equations are rewritten for a stacked state vector, so that running a standard Kalman filter yields an estimate delayed by τ.

28 Fixed-lag Kalman smoothing.

29 Fixed-interval smoothing: a backward set of recursions that refines each filtered estimate using the smoothed estimate from the next time step; this is equivalent to belief propagation / the forward-backward algorithm.
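A common concrete form of these backward recursions is the Rauch-Tung-Striebel smoother; the sketch below assumes the forward Kalman pass has stored, for every time step, the filtered mean/covariance and the predicted quantities (μ_+, Σ_+), and the variable names are mine rather than the book's.

```python
import numpy as np

def rts_smooth(filtered_mu, filtered_Sigma, pred_mu, pred_Sigma, Psi):
    """Backward recursion for fixed-interval smoothing.

    filtered_mu[t], filtered_Sigma[t] : posterior at t given x_1..x_t
    pred_mu[t], pred_Sigma[t]         : prediction for t given x_1..x_{t-1}
                                        (mu_plus, Sigma_plus from the forward pass)
    """
    T = len(filtered_mu)
    smooth_mu, smooth_Sigma = [None] * T, [None] * T
    smooth_mu[-1], smooth_Sigma[-1] = filtered_mu[-1], filtered_Sigma[-1]
    for t in range(T - 2, -1, -1):
        # Gain relating the state at t to the prediction for t+1
        C = filtered_Sigma[t] @ Psi.T @ np.linalg.inv(pred_Sigma[t + 1])
        smooth_mu[t] = filtered_mu[t] + C @ (smooth_mu[t + 1] - pred_mu[t + 1])
        smooth_Sigma[t] = (filtered_Sigma[t]
                           + C @ (smooth_Sigma[t + 1] - pred_Sigma[t + 1]) @ C.T)
    return smooth_mu, smooth_Sigma
```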

30 Temporal models.

31 Problems with the Kalman filter: it requires linear temporal and measurement equations, and it represents the result as a normal distribution. What if the posterior is genuinely multi-modal?

32 Structure: Temporal models; Kalman filter; Extended Kalman filter; Unscented Kalman filter; Particle filters; Applications.

33 Roadmap.

34 Extended Kalman filter: allows non-linear measurement and temporal equations. Key idea: take a Taylor expansion and treat the equations as locally linear.

35 Jacobians: the local linearization is based on the Jacobian matrices of derivatives of the temporal and measurement functions.

36 Extended Kalman filter equations.
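A minimal sketch of how the EKF equations are typically applied, assuming linear dynamics and a non-linear range-only measurement; the measurement function, its Jacobian, and every number here are invented for illustration, not taken from the book.

```python
import numpy as np

def g(w):                       # hypothetical non-linear measurement: distance from origin
    return np.array([np.hypot(w[0], w[1])])

def G_jacobian(w):              # 1x2 Jacobian of g evaluated at w
    r = np.hypot(w[0], w[1])
    return np.array([[w[0] / r, w[1] / r]])

def ekf_step(mu, Sigma, x, Psi, Sigma_p, Sigma_m):
    # Prediction (temporal model kept linear here, so its Jacobian is just Psi)
    mu_plus = Psi @ mu
    Sigma_plus = Sigma_p + Psi @ Sigma @ Psi.T

    # Measurement incorporation using the local linearization G
    G = G_jacobian(mu_plus)
    K = Sigma_plus @ G.T @ np.linalg.inv(G @ Sigma_plus @ G.T + Sigma_m)
    mu_new = mu_plus + K @ (x - g(mu_plus))
    Sigma_new = (np.eye(len(mu)) - K @ G) @ Sigma_plus
    return mu_new, Sigma_new

# One step on made-up numbers: 2D position state, noisy range measurement.
mu, Sigma = np.array([3.0, 4.0]), np.eye(2)
x = np.array([5.2])
mu, Sigma = ekf_step(mu, Sigma, x, np.eye(2), 0.1 * np.eye(2), np.array([[0.09]]))
```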

37 Extended Kalman filter.

38 Problems with the EKF.

39 Structure: Temporal models; Kalman filter; Extended Kalman filter; Unscented Kalman filter; Particle filters; Applications.

40 Unscented Kalman filter. Key ideas: approximate the distribution as a sum of weighted particles with the correct mean and covariance; pass the particles through the non-linear function; compute the mean and covariance of the transformed variables.

41 Unscented Kalman filter: approximate the Gaussian with a set of weighted particles, choosing their positions and weights so that the particle set has the same mean and covariance as the original distribution.

42 One possible scheme: place one particle at the mean and the others symmetrically at offsets given by the columns of a scaled matrix square root of the covariance, with weights chosen so that the sample mean and covariance match.
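The sketch below implements one standard version of such a scheme (the κ-parameterized symmetric sigma points of Julier and Uhlmann), which may differ in detail from the exact weights on the slide; the polar-to-Cartesian example at the end is invented for illustration.

```python
import numpy as np

def sigma_points(mu, Sigma, kappa=1.0):
    """Generate 2D+1 weighted particles with the given mean and covariance."""
    D = len(mu)
    L = np.linalg.cholesky((D + kappa) * Sigma)        # scaled matrix square root
    points = [mu] + [mu + L[:, j] for j in range(D)] + [mu - L[:, j] for j in range(D)]
    weights = np.r_[kappa / (D + kappa), np.full(2 * D, 0.5 / (D + kappa))]
    return np.array(points), weights

def unscented_transform(points, weights, f):
    """Pass particles through a non-linear function and recompute mean/covariance."""
    transformed = np.array([f(p) for p in points])
    mean = weights @ transformed
    diffs = transformed - mean
    cov = (weights[:, None] * diffs).T @ diffs
    return mean, cov

# Example: propagate a 2D Gaussian through a non-linear polar-to-Cartesian map.
mu, Sigma = np.array([1.0, np.pi / 4]), np.diag([0.01, 0.05])
f = lambda w: np.array([w[0] * np.cos(w[1]), w[0] * np.sin(w[1])])
pts, wts = sigma_points(mu, Sigma)
mean_y, cov_y = unscented_transform(pts, wts, f)
```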

43 Reconstitution.

44 Unscented Kalman filter.

45 Measurement incorporation: this works in a similar way; approximate the predicted distribution by a set of particles, chosen so that the mean and covariance are the same.

46 Measurement incorporation: pass the particles through the measurement equation and recompute the mean and variance; the Kalman gain in the measurement update equations is now computed from the particles.
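Continuing the sigma-point sketch above, here is one common way the particle-based measurement update is written; g is a hypothetical non-linear measurement function, and the cross-covariance and gain formulas are the standard unscented-filter ones, which should match the slide's intent but not necessarily its exact notation.

```python
import numpy as np

def ukf_measurement_update(points, weights, mu_plus, Sigma_plus, x, g, Sigma_m):
    """Incorporate measurement x using particles of the predicted distribution."""
    # points, weights = sigma_points(mu_plus, Sigma_plus)  -- from the earlier sketch

    # Pass the particles through the measurement equation
    X = np.array([g(p) for p in points])
    x_pred = weights @ X                                  # predicted measurement

    dX = X - x_pred
    dW = points - mu_plus
    S = (weights[:, None] * dX).T @ dX + Sigma_m          # innovation covariance
    C = (weights[:, None] * dW).T @ dX                    # state-measurement cross-covariance

    K = C @ np.linalg.inv(S)                              # gain computed from particles
    mu_new = mu_plus + K @ (x - x_pred)
    Sigma_new = Sigma_plus - K @ S @ K.T
    return mu_new, Sigma_new
```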

47 Problems with the UKF.

48 Structure: Temporal models; Kalman filter; Extended Kalman filter; Unscented Kalman filter; Particle filters; Applications.

49 Particle filters. Key idea: represent the probability distribution as a set of weighted particles. Advantages and disadvantages: + can represent non-Gaussian, multimodal densities; + no need for data association; - expensive.

50 Condensation algorithm, stage 1: resample from the weighted particles according to their weights to get unweighted particles.

51 Condensation algorithm, stage 2: pass the unweighted samples through the temporal model and add noise.

52 Condensation algorithm, stage 3: weight the samples by the measurement density.
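Putting the three stages together, here is a minimal Condensation-style sketch; the 1D random-walk dynamics, the Gaussian measurement density, the particle count, and the short measurement stream are all invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def condensation_step(particles, weights, x, temporal_sample, measurement_density):
    """One pass of the three Condensation stages over a set of weighted particles."""
    N = len(particles)

    # Stage 1: resample according to the weights to get unweighted particles
    resampled = particles[rng.choice(N, size=N, p=weights)]

    # Stage 2: pass the unweighted samples through the temporal model and add noise
    predicted = np.array([temporal_sample(w) for w in resampled])

    # Stage 3: weight the samples by the measurement density Pr(x | w)
    new_weights = np.array([measurement_density(x, w) for w in predicted])
    return predicted, new_weights / new_weights.sum()

# Hypothetical 1D example: random-walk dynamics and a Gaussian measurement density.
temporal_sample = lambda w: w + rng.normal(0.0, 0.5)
measurement_density = lambda x, w: np.exp(-0.5 * (x - w) ** 2 / 0.25)

particles = rng.normal(0.0, 1.0, size=200)
weights = np.full(200, 1.0 / 200)
for x in [0.2, 0.5, 0.9]:                     # a short made-up measurement stream
    particles, weights = condensation_step(particles, weights, x,
                                           temporal_sample, measurement_density)
```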

53 Structure: Temporal models; Kalman filter; Extended Kalman filter; Unscented Kalman filter; Particle filters; Applications.

54 Tracking pedestrians.

55 Tracking a contour in clutter.

56 Simultaneous localization and mapping.

