1 Chapter 14 (continued)

2 Hybrid (discrete+continuous) networks

3 Continuous-child Variables

4

5 Discrete variables with continuous parents
E.g. the node "Buys": the customer will buy if the cost is low, will not buy if it is high, and the probability of buying varies smoothly in some intermediate region. → The conditional distribution is like a "soft" threshold function.
Two ways to make a soft threshold: the integral of the standard normal distribution (probit), or the logistic function (logit distribution).
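As a rough sketch (the threshold location mu, the width sigma, the cost values, and the exact logistic scaling below are illustrative choices, not values from the slides), both soft-threshold models can be evaluated directly:

```python
# Sketch of the two soft-threshold models for P(buys = true | cost).
# mu and sigma are illustrative "soft threshold" parameters, not from the slides.
from math import erf, exp, sqrt

mu, sigma = 6.0, 1.0   # assumed threshold location and width

def probit_buys(cost):
    # Probit: integral of the standard normal up to (mu - cost) / sigma
    return 0.5 * (1.0 + erf((mu - cost) / (sigma * sqrt(2.0))))

def logit_buys(cost):
    # Logit: logistic function of the same standardized distance
    return 1.0 / (1.0 + exp(-(mu - cost) / sigma))

for cost in (4.0, 6.0, 8.0):
    print(cost, round(probit_buys(cost), 3), round(logit_buys(cost), 3))
```

Both curves fall smoothly from near 1 for low cost to near 0 for high cost; the probit version falls off more sharply in the tails than the logit version.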

6 Probit distribution (= hard threshold with random Gaussian noise)
Logit distribution

7 Naïve Bayes model (classifier)
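In the usual formulation, the class variable is the single root and the features are conditionally independent given it, so the posterior over the class is:

$$P(\mathit{Class} \mid f_1, \ldots, f_n) = \alpha\, P(\mathit{Class}) \prod_{i=1}^{n} P(f_i \mid \mathit{Class})$$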

8 Chapter 15 Probabilistic Reasoning over Time

9 Outline
Time and uncertainty
Inference: filtering, prediction, smoothing
Hidden Markov models
Brief introduction to Kalman filters
Dynamic Bayesian networks
Particle filtering

10 Time and uncertainty
The world changes; we need to track and predict it.
Example: diabetes management vs. vehicle diagnosis.
Basic idea: copy state and evidence variables for each time step.

11 Markov processes (Markov chains)
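In the standard first-order form, the Markov assumption says the current state depends only on the previous state, and stationarity says the transition model is the same at every step:

$$P(\mathbf{X}_t \mid \mathbf{X}_{0:t-1}) = P(\mathbf{X}_t \mid \mathbf{X}_{t-1})$$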

12 Example

13 Inference tasks

14 Filtering
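In the standard recursive formulation, each new filtered estimate is obtained from the previous one by projecting forward through the transition model and then conditioning on the new evidence:

$$P(\mathbf{X}_{t+1} \mid \mathbf{e}_{1:t+1}) = \alpha\, P(\mathbf{e}_{t+1} \mid \mathbf{X}_{t+1}) \sum_{\mathbf{x}_t} P(\mathbf{X}_{t+1} \mid \mathbf{x}_t)\, P(\mathbf{x}_t \mid \mathbf{e}_{1:t})$$

often abbreviated as $\mathbf{f}_{1:t+1} = \alpha\,\mathrm{FORWARD}(\mathbf{f}_{1:t}, \mathbf{e}_{t+1})$.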

15 Filtering example
Transition model P(Rt | Rt-1):  Rt-1 = t: 0.7,  Rt-1 = f: 0.3
Sensor model P(Ut | Rt):  Rt = t: 0.9,  Rt = f: 0.2

16 Filtering example
Sensor model P(Ut | Rt):  Rt = t: 0.9,  Rt = f: 0.2
Transition model P(Rt | Rt-1):  Rt-1 = t: 0.7,  Rt-1 = f: 0.3
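A minimal sketch of this example in code, using only the two tables above; the uniform prior P(R0) = <0.5, 0.5> and the two consecutive umbrella observations are assumptions chosen to match the usual textbook walk-through:

```python
# Forward filtering on the umbrella example, using the CPTs from the slides.
# The prior P(R0) and the observation sequence are assumptions.
P_R = {True: 0.7, False: 0.3}     # P(R_t = true | R_{t-1})
P_U = {True: 0.9, False: 0.2}     # P(U_t = true | R_t)

def normalize(d):
    z = sum(d.values())
    return {k: v / z for k, v in d.items()}

def forward(f, umbrella):
    # One filtering step: predict through the transition model,
    # then condition on the new umbrella observation.
    predicted = {r: sum(f[r0] * (P_R[r0] if r else 1 - P_R[r0]) for r0 in f)
                 for r in (True, False)}
    likelihood = {r: (P_U[r] if umbrella else 1 - P_U[r]) for r in (True, False)}
    return normalize({r: likelihood[r] * predicted[r] for r in (True, False)})

f = {True: 0.5, False: 0.5}        # assumed prior P(R0)
for u in (True, True):             # umbrella observed on days 1 and 2
    f = forward(f, u)
    print({k: round(v, 3) for k, v in f.items()})
```

With these inputs the filtered probability of rain comes out to about 0.818 after day 1 and 0.883 after day 2.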

17 Prediction
Prediction can be seen simply as filtering without the addition of new evidence.
Recursive computation for predicting the state at t+k+1 from a prediction for t+k; the standard form of this recursion is given below.
Example: P(X4 | u1, u2)
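In the standard formulation, the recursion referenced above is:

$$P(\mathbf{X}_{t+k+1} \mid \mathbf{e}_{1:t}) = \sum_{\mathbf{x}_{t+k}} P(\mathbf{X}_{t+k+1} \mid \mathbf{x}_{t+k})\, P(\mathbf{x}_{t+k} \mid \mathbf{e}_{1:t})$$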

18 Likelihood
We can use a forward recursion to compute the likelihood of the evidence sequence, P(e1:t).
For this recursion, we use a likelihood message ℓ1:t(Xt) = P(Xt, e1:t), with the same message calculation as in filtering (given below).
Having computed ℓ1:t, we obtain the actual likelihood by summing out Xt.
Example: P(u1, u2)
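In the standard formulation, the message update is the same FORWARD step used for filtering, and the likelihood follows by summing out the final state:

$$\ell_{1:t+1} = \mathrm{FORWARD}(\ell_{1:t}, \mathbf{e}_{t+1}), \qquad L_{1:t} = P(\mathbf{e}_{1:t}) = \sum_{\mathbf{x}_t} \ell_{1:t}(\mathbf{x}_t)$$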

19 Smoothing
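In the standard forward-backward formulation, the smoothed estimate for a past state Xk combines a forward message f1:k with a backward message bk+1:t computed by a recursion that runs from t back down to k:

$$P(\mathbf{X}_k \mid \mathbf{e}_{1:t}) = \alpha\, P(\mathbf{X}_k \mid \mathbf{e}_{1:k})\, P(\mathbf{e}_{k+1:t} \mid \mathbf{X}_k) = \alpha\, \mathbf{f}_{1:k} \times \mathbf{b}_{k+1:t}$$

$$\mathbf{b}_{k+1:t} = \sum_{\mathbf{x}_{k+1}} P(\mathbf{e}_{k+1} \mid \mathbf{x}_{k+1})\, P(\mathbf{e}_{k+2:t} \mid \mathbf{x}_{k+1})\, P(\mathbf{x}_{k+1} \mid \mathbf{X}_k)$$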

20 Smoothing example
Transition model P(Rt | Rt-1):  Rt-1 = t: 0.7,  Rt-1 = f: 0.3
Sensor model P(Ut | Rt):  Rt = t: 0.9,  Rt = f: 0.2

21 Most likely explanation

22 Viterbi example
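A minimal Viterbi sketch on the same umbrella model (the CPTs are the ones shown on the earlier slides; the prior P(R0) and the five-day observation sequence are assumptions chosen to match the usual textbook walk-through). It keeps, for each state, the probability of the best path ending there, plus back-pointers for recovering the most likely explanation:

```python
# Viterbi on the umbrella model: most likely rain sequence given the umbrellas.
# CPTs are from the slides; the prior and the observations are assumptions.
P_R = {True: 0.7, False: 0.3}                # P(R_t = true | R_{t-1})
P_U = {True: 0.9, False: 0.2}                # P(U_t = true | R_t)
prior = {True: 0.5, False: 0.5}              # assumed P(R0)
evidence = [True, True, False, True, True]   # assumed umbrella observations

def viterbi(evidence):
    states = (True, False)
    m = dict(prior)        # m[r]: probability of the best path ending in R_t = r
    backptrs = []
    for u in evidence:
        new_m, ptr = {}, {}
        for r in states:
            sensor = P_U[r] if u else 1 - P_U[r]
            trans = lambda r0: P_R[r0] if r else 1 - P_R[r0]
            best = max(states, key=lambda r0: m[r0] * trans(r0))
            new_m[r] = sensor * m[best] * trans(best)
            ptr[r] = best
        m = new_m
        backptrs.append(ptr)
    # Trace the back-pointers from the best final state.
    path = [max(states, key=lambda r: m[r])]
    for ptr in reversed(backptrs):
        path.append(ptr[path[-1]])
    return list(reversed(path))[1:]          # drop R0

print(viterbi(evidence))   # -> [True, True, False, True, True]
```

The only change from filtering is that the summation over previous states is replaced by a maximization, plus the back-pointers needed to recover the sequence itself.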

23 Hidden Markov models
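With a single discrete state variable taking values 1..S, the transition and sensor models become matrices and the forward and backward messages become column vectors, giving the usual matrix form:

$$\mathbf{f}_{1:t+1} = \alpha\, \mathbf{O}_{t+1} \mathbf{T}^{\top} \mathbf{f}_{1:t}, \qquad \mathbf{b}_{k+1:t} = \mathbf{T}\, \mathbf{O}_{k+1}\, \mathbf{b}_{k+2:t}$$

where $\mathbf{T}_{ij} = P(X_t = j \mid X_{t-1} = i)$ and $\mathbf{O}_t$ is the diagonal matrix whose entries are $P(e_t \mid X_t = i)$.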

24

25

26

27

28 Kalman Filters

29 Updating Gaussian distributions

30 Simple 1-D example
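A minimal sketch of the 1-D update for a Gaussian random walk observed with Gaussian noise; the update formulas are the standard ones for this model, while the prior, the noise variances, and the observation values below are illustrative assumptions:

```python
# One-dimensional Kalman update for a Gaussian random walk:
#   transition: x_{t+1} = x_t + w,  w ~ N(0, sx2)
#   sensor:     z_t = x_t + v,      v ~ N(0, sz2)
# The numeric values (prior, variances, observations) are illustrative.
def kalman_1d_step(mu, s2, z, sx2, sz2):
    # Predict (variance grows by sx2), then weight prediction vs. observation.
    p2 = s2 + sx2                       # predicted variance
    mu_new = (p2 * z + sz2 * mu) / (p2 + sz2)
    s2_new = p2 * sz2 / (p2 + sz2)
    return mu_new, s2_new

mu, s2 = 0.0, 1.0                       # assumed prior N(0, 1)
for z in (2.5, 2.1, 2.4):               # assumed observations
    mu, s2 = kalman_1d_step(mu, s2, z, sx2=2.0, sz2=1.0)
    print(round(mu, 3), round(s2, 3))
```

Note that the updated variance depends only on the prior variance and the noise variances, not on the observed values.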

31

32 General Kalman update
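In the standard multivariate form, with transition model $\mathbf{x}_{t+1} = \mathbf{F}\mathbf{x}_t + \mathcal{N}(\mathbf{0}, \boldsymbol{\Sigma}_x)$ and sensor model $\mathbf{z}_t = \mathbf{H}\mathbf{x}_t + \mathcal{N}(\mathbf{0}, \boldsymbol{\Sigma}_z)$, the update is:

$$\boldsymbol{\mu}_{t+1} = \mathbf{F}\boldsymbol{\mu}_t + \mathbf{K}_{t+1}\left(\mathbf{z}_{t+1} - \mathbf{H}\mathbf{F}\boldsymbol{\mu}_t\right)$$

$$\boldsymbol{\Sigma}_{t+1} = (\mathbf{I} - \mathbf{K}_{t+1}\mathbf{H})(\mathbf{F}\boldsymbol{\Sigma}_t\mathbf{F}^{\top} + \boldsymbol{\Sigma}_x)$$

$$\mathbf{K}_{t+1} = (\mathbf{F}\boldsymbol{\Sigma}_t\mathbf{F}^{\top} + \boldsymbol{\Sigma}_x)\mathbf{H}^{\top}\left(\mathbf{H}(\mathbf{F}\boldsymbol{\Sigma}_t\mathbf{F}^{\top} + \boldsymbol{\Sigma}_x)\mathbf{H}^{\top} + \boldsymbol{\Sigma}_z\right)^{-1}$$

where $\mathbf{K}_{t+1}$ is the Kalman gain matrix.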

33

34

35 2-D tracking example: Filtering

36 2-D tracking example: Smoothing

37 Where it breaks

38 Dynamic Bayesian networks

39 DBNs vs. HMMs

40 DBNs vs Kalman Filters

41 Exact inference in DBNs

42 Likelihood weighting for DBNs

43 Particle Filtering

44 Particle Filtering contd.
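A minimal particle-filter sketch on the umbrella model from the earlier slides (transition 0.7/0.3, sensor 0.9/0.2); the sample count N, the prior, and the observation sequence are assumptions:

```python
# Particle filtering on the umbrella model: propagate each sample through the
# transition model, weight it by the evidence likelihood, then resample.
# N, the prior, and the observation sequence are illustrative assumptions.
import random

P_R = {True: 0.7, False: 0.3}    # P(R_t = true | R_{t-1})
P_U = {True: 0.9, False: 0.2}    # P(U_t = true | R_t)

def particle_filter_step(samples, umbrella):
    # 1. Propagate each sample through the transition model.
    propagated = [random.random() < P_R[s] for s in samples]
    # 2. Weight each sample by the likelihood of the observed evidence.
    weights = [P_U[s] if umbrella else 1 - P_U[s] for s in propagated]
    # 3. Resample N new samples in proportion to the weights.
    return random.choices(propagated, weights=weights, k=len(samples))

N = 1000
samples = [random.random() < 0.5 for _ in range(N)]   # assumed prior P(R0)
for u in (True, True):                                # assumed observations
    samples = particle_filter_step(samples, u)
    print(sum(samples) / N)    # approximates P(R_t = true | u_{1:t})
```

With enough samples the printed fractions fluctuate around the exact filtered values from the earlier example (about 0.818 and 0.883).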

45 Particle filtering performance

46 Chapter 15, Sections 1-5 Summary

