Slide 1: Bayesian Approaches. Institute of Systems and Robotics (ISR) – Coimbra, Mobile Robotics Lab, jrett

Slide 3: Retrospective – Bayesian Multimodal Perception by J. F. Ferreira
- Bayes' theorem (Bayes rule)
- Knowledge of past behavior and state forms a prediction of the current state
- Non-Gaussian likelihood functions
- Multimodal sensing in human perception

Slide 4: Retrospective
- The distribution of the object position is unknown => flat prior
- The noise in each modality is independent
- The bimodal posterior distribution is the product of the unimodal distributions
- Simplification: the probability distributions are Gaussian
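Under the Gaussian simplification, that product has a closed form. Below is a minimal Python sketch of fusing two modality estimates with a flat prior and independent Gaussian noise; the cue values, variances, and the function name fuse_gaussian_cues are invented for illustration.

```python
# Sketch: fusing two sensory cues under the Gaussian simplification.
# With a flat prior and independent noise, the posterior over object
# position is proportional to the product of the per-modality likelihoods;
# for Gaussians, that product is again Gaussian.

def fuse_gaussian_cues(mu1, var1, mu2, var2):
    """Product of two Gaussian likelihoods (flat prior, independent noise)."""
    var_post = 1.0 / (1.0 / var1 + 1.0 / var2)       # combined precision
    mu_post = var_post * (mu1 / var1 + mu2 / var2)   # precision-weighted mean
    return mu_post, var_post

# Example: a visual cue at 10.0 (var 1.0) and an auditory cue at 12.0 (var 4.0)
mu, var = fuse_gaussian_cues(10.0, 1.0, 12.0, 4.0)
print(mu, var)   # the posterior mean is pulled toward the more reliable (visual) cue
```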

Slide 5: 1. Introduction to Pattern Recognition
Example: sorting incoming fish on a conveyor according to species using optical sensing.

Slide 6: 1. Introduction to Pattern Recognition – Example: Fish Classifier
- Selecting the length feature
- Selecting the lightness feature

Slide 7: 1. Introduction to Pattern Recognition – Example: Fish Classifier
- Selecting two features and defining a simple straight line as the decision boundary
- A complicated classifier gives the best performance on the training set but will not perform well on novel patterns
- Search for the optimal tradeoff between performance on the training set and simplicity

Slide 8: 1. Introduction to Pattern Recognition – Pattern Recognition System
- Pipeline: input -> sensing -> segmentation -> feature extraction -> classification -> post-processing -> decision
- Design issues: invariant features (translation, rotation, scale, occlusion, projective distortion, rate, deformation), feature selection, noise, missing features, error rate, risk, context, multiple classifiers

Slide 9: 1. Introduction to Pattern Recognition – Design Cycle
- start -> collect data -> choose features -> choose model -> train classifier -> evaluate classifier -> end
- Related issues: prior knowledge, overfitting

Slide 10: 2. Continuous Features
- State of nature ω: a finite set of c states of nature ('categories') {ω1, …, ωc}
- Prior P(ωj); if the set of states of nature is finite, the priors sum to one: Σj P(ωj) = 1
- Decision rule using only the priors (for c = 2): decide ω1 if P(ω1) > P(ω2); otherwise decide ω2

Slide 11: 2. Continuous Features
- Feature vector x: x ∈ R^d, the feature space; for d = 1, x is a continuous random variable
- Class- (state-) conditional probability density function p(x|ωj): expresses the distribution of x depending on the state of nature

Slide 12: 2. Continuous Features
- Bayes formula (posterior): P(ωj|x) = p(x|ωj) P(ωj) / p(x)
- Evidence: p(x) = Σj p(x|ωj) P(ωj)
- Bayes decision rule (for c = 2): decide ω1 if P(ω1|x) > P(ω2|x); otherwise decide ω2
- Bayes decision rule (expressed in terms of likelihoods and priors): decide ω1 if p(x|ω1) P(ω1) > p(x|ω2) P(ω2); otherwise decide ω2
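A minimal Python sketch of this two-category rule, assuming Gaussian class-conditional densities; the labels w1, w2 and all numeric parameters are placeholders, not values from the slides.

```python
from scipy.stats import norm

# Two-category Bayes decision rule with made-up Gaussian class-conditionals.
priors = {"w1": 0.6, "w2": 0.4}                             # P(w1), P(w2)
likelihoods = {"w1": norm(0.0, 1.0), "w2": norm(2.0, 1.5)}  # p(x|w1), p(x|w2)

def posterior(x):
    """P(wj|x) = p(x|wj) P(wj) / p(x), with p(x) as the normalizing evidence."""
    joint = {w: likelihoods[w].pdf(x) * priors[w] for w in priors}
    evidence = sum(joint.values())
    return {w: joint[w] / evidence for w in joint}

def decide(x):
    post = posterior(x)
    return "w1" if post["w1"] > post["w2"] else "w2"

print(posterior(1.0), decide(1.0))
```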

Slide 13: 2. Continuous Features – Conditional Risk
- Conditional risk: R(αi|x) = Σj λ(αi|ωj) P(ωj|x)
- We can minimize the expected loss by selecting the action that minimizes the conditional risk; this Bayes decision procedure provides the optimal performance (the Bayes risk)
- Two-category classification: decide ω1 if (λ21 - λ11) P(ω1|x) > (λ12 - λ22) P(ω2|x); otherwise decide ω2
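A sketch of that loss-weighted rule with a hypothetical loss matrix; the λ values below are arbitrary and only illustrate the mechanics.

```python
# lam[i][j] = loss for taking action i when the true state is j (made-up values).
lam = [[0.0, 2.0],   # action a1: no loss if w1, loss 2 if w2
       [1.0, 0.0]]   # action a2: loss 1 if w1, no loss if w2

def conditional_risk(post):
    """R(ai|x) = sum_j lam[i][j] * P(wj|x) for each action i."""
    return [sum(lam[i][j] * p for j, p in enumerate(post)) for i in range(len(lam))]

def decide_with_risk(post):
    """Pick the action with minimum conditional risk."""
    risks = conditional_risk(post)
    return min(range(len(risks)), key=lambda i: risks[i])

post = [0.7, 0.3]   # P(w1|x), P(w2|x), e.g. from the previous sketch
print(conditional_risk(post), decide_with_risk(post))
```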

Slide 14: Discrete Features
- The feature vector x can assume one of m discrete values, so we use probabilities P(x|ωj) rather than probability densities
- Posterior: P(ωj|x) = P(x|ωj) P(ωj) / P(x); evidence: P(x) = Σj P(x|ωj) P(ωj)
- Risk and Bayes decision rule: to minimize the overall risk, select the action αi for which R(αi|x) is minimum

Slide 15: Discrete Features – Example: Independent Binary Features
- Two-category problem
- Feature vector x = (x1, …, xd)^T where xi ∈ {0, 1}
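A sketch of this case: with p_i = P(x_i = 1|ω1), q_i = P(x_i = 1|ω2), and conditionally independent features, the class-conditional probabilities factor into Bernoulli terms. The probabilities and priors below are invented.

```python
import numpy as np

p = np.array([0.8, 0.6, 0.3])   # P(x_i = 1 | w1), made-up
q = np.array([0.2, 0.5, 0.7])   # P(x_i = 1 | w2), made-up
prior_w1, prior_w2 = 0.5, 0.5

def log_likelihood(x, theta):
    """log P(x|w) = sum_i [ x_i log theta_i + (1 - x_i) log (1 - theta_i) ]."""
    x = np.asarray(x)
    return np.sum(x * np.log(theta) + (1 - x) * np.log(1 - theta))

def decide(x):
    g1 = log_likelihood(x, p) + np.log(prior_w1)
    g2 = log_likelihood(x, q) + np.log(prior_w2)
    return "w1" if g1 > g2 else "w2"

print(decide([1, 1, 0]))   # a feature vector whose pattern favours w1
```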

Slide 16: Bayesian Belief Networks
- Represent knowledge about a distribution
- Knowledge: statistical dependencies and causal relations among the component variables, e.g. knowledge from structural information
- Graphical representation (Bayesian belief net): each node is a variable with an associated probability, e.g. P(a); links connect nodes to their parents (e.g. the parents of C) and their children (e.g. the children of E)

Slide 17: Bayesian Belief Networks – Discrete Case
- A discrete number of possible values per variable (e.g. 2: a = {a1, a2}) and continuous-valued probabilities
- Apply Bayes' rule to determine the probability of any configuration of variables in the joint distribution
- Example priors: P(a1) = 0.739, P(a2) = 0.261
- Conditional probability table (each row sums to 1):
        P(c1|ak)   P(c2|ak)
  a1:   0.3        0.7
  a2:   0.6        0.4
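A minimal sketch of storing these tables and reading off the probability of one configuration; the dictionary layout is just one possible representation, and the variable names a, c follow the slide.

```python
P_a = {"a1": 0.739, "a2": 0.261}   # priors P(a)
P_c_given_a = {                    # conditional probability table: rows sum to 1
    "a1": {"c1": 0.3, "c2": 0.7},
    "a2": {"c1": 0.6, "c2": 0.4},
}

def joint(a, c):
    """P(a, c) = P(a) * P(c|a) for one configuration of the two variables."""
    return P_a[a] * P_c_given_a[a][c]

print(joint("a1", "c2"))   # 0.739 * 0.7
```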

Slide 18: Bayesian Belief Networks – Determining the probabilities of the variables
- Chain A -> B -> C -> D with conditional independence: P(a, b, c, d) = P(a) P(b|a) P(c|b) P(d|c)
- E.g. the probability distribution over d1, d2, … at D: sum the full joint distribution P(a, b, c, d) over all variables other than d
- Simple interpretation: the probability of a particular value of D; the sum splits into the successive marginals P(b), P(c), P(d)
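A sketch of that marginalization over the chain A -> B -> C -> D; all conditional tables below are invented placeholders.

```python
import itertools

# P(d) = sum over a, b, c of P(a) P(b|a) P(c|b) P(d|c); each table row sums to 1.
P_a = {"a1": 0.6, "a2": 0.4}
P_b_a = {"a1": {"b1": 0.7, "b2": 0.3}, "a2": {"b1": 0.2, "b2": 0.8}}
P_c_b = {"b1": {"c1": 0.5, "c2": 0.5}, "b2": {"c1": 0.1, "c2": 0.9}}
P_d_c = {"c1": {"d1": 0.9, "d2": 0.1}, "c2": {"d1": 0.3, "d2": 0.7}}

def marginal_d():
    P_d = {"d1": 0.0, "d2": 0.0}
    for a, b, c, d in itertools.product(P_a, ["b1", "b2"], ["c1", "c2"], ["d1", "d2"]):
        P_d[d] += P_a[a] * P_b_a[a][b] * P_c_b[b][c] * P_d_c[c][d]
    return P_d

print(marginal_d())   # the values sum to 1 over d1, d2
```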

Slide 19: Bayesian Belief Networks
- Given the values of some variables (the evidence e) …
- … search to determine some particular configuration of the other variables x, i.e. the posterior P(x|e)

Slide 20: Bayesian Belief Networks – Example: Belief Network for Fish
- As usual: compute P(x1 = salmon|e) and P(x2 = sea bass|e) and decide for the minimum expected classification error
- Example 2, classify the fish. Known: the fish is light (c1) and was caught in the south Atlantic (b2). Unknown: time of year (a) and thickness (d)
- In this case D does not affect our results
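A sketch of this query, assuming the usual structure in which season A and locale B are parents of the fish class X, and lightness C and thickness D are its children. All conditional probability values below are placeholders, not the numbers from the slide's figure; D is simply left out because summing over it contributes a factor of 1.

```python
P_a = {"a1": 0.25, "a2": 0.25, "a3": 0.25, "a4": 0.25}   # season prior (placeholder)
P_b = {"b1": 0.6, "b2": 0.4}                             # locale prior (placeholder)
P_x_ab = {                                               # P(x | a, b2) (placeholder)
    ("a1", "b2"): {"salmon": 0.7, "sea bass": 0.3},
    ("a2", "b2"): {"salmon": 0.6, "sea bass": 0.4},
    ("a3", "b2"): {"salmon": 0.4, "sea bass": 0.6},
    ("a4", "b2"): {"salmon": 0.5, "sea bass": 0.5},
}
P_c_x = {"salmon": {"c1": 0.6}, "sea bass": {"c1": 0.2}}  # P(light | x) (placeholder)

def posterior_x(evidence_b="b2", evidence_c="c1"):
    """P(x | c1, b2) via normalization; A is summed out, D plays no role."""
    scores = {}
    for x in ("salmon", "sea bass"):
        s = sum(P_a[a] * P_x_ab[(a, evidence_b)][x] for a in P_a)
        scores[x] = P_b[evidence_b] * s * P_c_x[x][evidence_c]
    z = sum(scores.values())
    return {x: v / z for x, v in scores.items()}

print(posterior_x())   # decide for the class with the larger posterior
```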

Slide 21: Bayesian Belief Networks – Example: Belief Network for Fish (continued)

Slide 22: Bayesian Belief Networks – Example: Belief Network for Fish
- And if the dependency relations are unknown? Naive Bayes ('idiot Bayes'): assume the features are conditionally independent given the class
- After normalization over the classes, this yields the posterior

Slide 23: Compound Bayesian Decision Theory
- Consecutive ω's are not statistically independent => exploit the dependence => improved performance
- Wait for n states to emerge and make all n decisions jointly: the compound decision problem
- States of nature ω = (ω(1), …, ω(n))^T, each taking one of c values {ω1, …, ωc}
- Prior P(ω) for the n states of nature
- Feature matrix X = (x1, …, xn): n observations, where xi was obtained when the state of nature was ω(i)

Slide 24: Compound Bayesian Decision Theory
- Define a loss matrix for the compound decision problem and seek the decision rule that minimizes the compound risk (the optimal procedure)
- Assumption: correct decisions incur no loss and errors are equally costly => simply calculate P(ω|X) for all ω and select the ω for which P(ω|X) is maximum
- In practice, calculating P(ω|X) is time-expensive; assumption: xi depends only on ω(i), not on the other x's or ω's
- Conditional probability density function p(X|ω) for X given the true set of ω: p(X|ω) = Πi p(xi|ω(i)); posterior joint density: P(ω|X) ∝ p(X|ω) P(ω)
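A brute-force sketch of that rule: score every candidate sequence ω by P(ω) Πi p(xi|ω(i)) and keep the maximum. The sequence prior and likelihood values are invented, and the Markov-style prior is only one possible choice.

```python
import itertools
import numpy as np

states = ["w1", "w2"]
likelihood = {                    # p(x_i | w) for three already-observed x_i (made up)
    "w1": [0.8, 0.3, 0.6],
    "w2": [0.2, 0.7, 0.4],
}
n = 3

def sequence_prior(seq):
    """Hypothetical Markov-style prior: consecutive states tend to repeat."""
    p = 1.0 / len(states)
    for prev, cur in zip(seq, seq[1:]):
        p *= 0.8 if prev == cur else 0.2
    return p

best = max(
    itertools.product(states, repeat=n),
    key=lambda seq: sequence_prior(seq) * np.prod([likelihood[s][i] for i, s in enumerate(seq)]),
)
print(best)   # the jointly most probable sequence of states
```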

Slide 25: Obrigado! (Thank you!)

Slide 26: Annex

Slide 27: Book: Pattern Cl.
Preface; Ch 1: Introduction; Ch 2: Bayesian Decision Theory; Ch 3: Maximum Likelihood and Bayesian Estimation; Ch 4: Nonparametric Techniques; Ch 5: Linear Discriminant Functions; Ch 6: Multilayer Neural Networks; Ch 7: Stochastic Methods; Ch 8: Nonmetric Methods; Ch 9: Algorithm-Independent Machine Learning; Ch 10: Unsupervised Learning and Clustering; App A: Mathematical Foundations

Slide 28: Book: Principles of...
Ch 2: Bug Algorithms (p. 17); Ch 3: Configuration Space (p. 39); Ch 4: Potential Functions (p. 77); Ch 5: Roadmaps (p. 107); Ch 6: Cell Decompositions (p. 161); Ch 7: Sampling-Based Algorithms (p. 197); Ch 8: Kalman Filtering (p. 269); Ch 9: Bayesian Methods (p. 301); Ch 10: Robot Dynamics (p. 349); Ch 11: Trajectory Planning (p. 373); Ch 12: Nonholonomic and Underactuated Systems (p. 401)

Slide 29: Book: Artificial...
Preface; Part I: Artificial Intelligence; Part II: Problem Solving; Part III: Knowledge and Reasoning; Part IV: Planning; Part V: Uncertain Knowledge and Reasoning; Part VI: Learning; Part VII: Communicating, Perceiving, and Acting; Part VIII: Conclusions

Slide 30: Book: Bayesian...
Preface (p. xix); Part I: Fundamentals of Bayesian Inference (p. 1): 1 Background (p. 3), 2 Single-parameter models (p. 33), 3 Introduction to multiparameter models (p. 73), 4 Large-sample inference and frequency properties of Bayesian inference (p. 101); Part II: Fundamentals of Bayesian Data Analysis (p. 115): 5 Hierarchical models (p. 117), 6 Model checking and improvement (p. 157), 7 Modeling accounting for data collection (p. 197), 8 Connections and challenges (p. 247), 9 General advice (p. 259); Part III: Advanced Computation (p. 273): 10 Overview of computation (p. 275), 11 Posterior simulation (p. 283), 12 Approximations based on posterior modes (p. 311), 13 Special topics in computation (p. 335); Part IV: Regression Models (p. 351): 14 Introduction to regression models (p. 353), 15 Hierarchical linear models (p. 389), 16 Generalized linear models (p. 415), 17 Models for robust inference (p. 443), 18 Mixture models (p. 463), 19 Multivariate models (p. 481), 20 Nonlinear models (p. 497), 21 Models for missing data (p. 517), 22 Decision analysis (p. 541); Appendixes (p. 571)

Slide 31: Book: Classification...
Preface; Foreword; 1. Introduction; 2. Detection and Classification; 3. Parameter Estimation; 4. State Estimation; 5. Supervised Learning; 6. Feature Extraction and Selection; 7. Unsupervised Learning; 8. State Estimation in Practice; 9. Worked Out Examples; Appendix

Slide 32: Images

Slide 33: 2. Simple Example
Designing a simple classifier for gesture recognition: the observer tries to predict which gesture might be performed next; the sequence of gestures appears to be random.
Ten types of gestures: 1. Big circle, 2. Small circle, 3. Vertical line, 4. Horizontal line, 5. Pointing north-west, 6. Pointing west, 7. Talk louder, 8. Talk more quietly, 9. Wave bye-bye, 10. I am hungry.
State of nature ω: the type of gesture (ω1 … ω10). We assume that there is some a priori probability (prior) P(ω1) that the next gesture is 'Big circle', P(ω2) that the next gesture is 'Small circle', etc. If the gesture lexicon is finite, the priors sum to one: Σj P(ωj) = 1.
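A minimal sketch of prediction from the priors alone; the prior values are invented, since the slide only assumes such priors exist.

```python
import numpy as np

gestures = ["big circle", "small circle", "vertical line", "horizontal line",
            "pointing north-west", "pointing west", "talk louder",
            "talk more quietly", "wave bye-bye", "I am hungry"]
priors = np.array([0.20, 0.15, 0.10, 0.10, 0.05, 0.05, 0.10, 0.05, 0.15, 0.05])
assert abs(priors.sum() - 1.0) < 1e-9   # priors over a finite lexicon sum to 1

prediction = gestures[int(np.argmax(priors))]
print(prediction)   # always the same gesture until an observation updates the belief
```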

Slide 34: Missing and Noisy Features
Missing features, example: x1 is missing and the measured value of x2 is x̂2. Using the mean value of x1 would point to ω3, but ω2 is the better decision, obtained by marginalizing the class posteriors over the missing x1.
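A sketch of that marginalization: compare classes using P(ωi) ∫ p(x1, x̂2|ωi) dx1 instead of plugging in the mean of x1. The class densities and numbers below are placeholders and are not meant to reproduce the slide's figure.

```python
import numpy as np
from scipy.stats import multivariate_normal

classes = {   # made-up two-dimensional class-conditional densities
    "w2": multivariate_normal(mean=[0.0, 2.0], cov=[[1.0, 0.0], [0.0, 0.3]]),
    "w3": multivariate_normal(mean=[3.0, 2.5], cov=[[0.2, 0.0], [0.0, 2.0]]),
}
priors = {"w2": 0.5, "w3": 0.5}
x2_hat = 2.0   # the measured value of x2; x1 is missing

def marginal_score(name):
    """P(w) * integral of p(x1, x2_hat | w) dx1, approximated on a grid."""
    x1_grid = np.linspace(-10, 10, 2001)
    pts = np.column_stack([x1_grid, np.full_like(x1_grid, x2_hat)])
    return priors[name] * np.trapz(classes[name].pdf(pts), x1_grid)

scores = {name: marginal_score(name) for name in classes}
print(max(scores, key=scores.get))   # decision based on the marginalized posterior
```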

