Cognitive Computer Vision Kingsley Sage and Hilary Buxton Prepared under ECVision Specific Action 8-3

Lecture 3 Graphical models Probabilistic graphical models – Directed graphs – Undirected graphs – Notation – Rolling out over time

What are graphical models? They represent salient relationships graphically, e.g. as nodes (variables) connected by edges (dependencies)

What are probabilistic graphical models? A probabilistic graphical model is a type of probabilistic network that has roots in AI, statistics and neural networks. It provides a clean mathematical formalism that makes it possible to understand the relationships between a wide variety of network-based approaches to computation, and allows us to see different methods as instances of a broader probabilistic framework

What are probabilistic graphical models? Probabilistic graphical models use graphs to represent and manipulate joint probability distributions. Graphs can be directed – usually referred to as a belief network or Bayesian network. Graphs can be undirected – usually referred to as a Markov Random Field. They also provide a basis for algorithms for computation

Joint probability – a reminder A probability defined over more than one variable, e.g. p(AND|a,b): the discrete case of a logic AND gate with binary inputs a and b, or a continuous case where p(AND|a,b) is high when both light values are high
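The discrete AND-gate case above can be tabulated directly. A minimal sketch, assuming (hypothetically) that the two binary inputs a and b are independent and each true with probability 0.5:

```python
# Joint probability table for the AND-gate example: two binary
# inputs a, b and a deterministic output AND = a & b.
# The input distributions are illustrative assumptions.
p_a = {0: 0.5, 1: 0.5}
p_b = {0: 0.5, 1: 0.5}

joint = {}
for a in (0, 1):
    for b in (0, 1):
        out = a & b
        # p(a, b, AND) = p(a) * p(b) * p(AND | a, b); the gate is deterministic,
        # so p(AND | a, b) is 1 for the correct output and 0 otherwise
        joint[(a, b, out)] = p_a[a] * p_b[b]

# Marginal probability that the gate outputs 1 (only the a=1, b=1 case)
p_out_1 = sum(p for (a, b, out), p in joint.items() if out == 1)
print(p_out_1)  # 0.25
```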

Directed graphs Intuitively capture the notion of causality (although this can be a philosophical argument). A → B: the value of A directly determines the value of B, and P(A,B) = P(B|A)·P(A)
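The factorisation P(A,B) = P(B|A)·P(A) can be checked numerically. A minimal sketch with illustrative probability tables (the numbers are assumptions, not from the slides):

```python
# A -> B: factorise the joint as P(A, B) = P(B | A) * P(A).
# Illustrative numbers for the prior and conditional tables.
p_A = {True: 0.3, False: 0.7}
p_B_given_A = {True: {True: 0.9, False: 0.1},
               False: {True: 0.2, False: 0.8}}

def p_joint(a, b):
    # The chain rule for the directed edge A -> B
    return p_B_given_A[a][b] * p_A[a]

# A sanity check: the joint must sum to 1 over all (a, b) combinations
total = sum(p_joint(a, b) for a in (True, False) for b in (True, False))
print(total)
```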

Traffic lights model from lecture 2 as a directed graph: the hidden state nodes (STOP, GO, GET READY TO GO, GET READY TO STOP) are linked to an observable node

Examples of directed graphs Hidden Markov Models (later in the course) Kalman filters Factor analysis Independent component analysis Mixtures of Gaussians (later in the course) Probabilistic expert systems The list goes on …

Joint probability – conditional independence Variables are independent if the value of one does not depend on the value of another. For example, in the graph A → C ← B, A and B are independent: p(A,B) = p(A)·p(B). But A and C, and B and C, are not: p(A,C) ≠ p(A)·p(C)
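These independence claims can be verified by brute-force marginalisation. A sketch assuming a common-effect structure A → C ← B with illustrative conditional probability tables (all numbers are assumptions):

```python
# Common-effect structure A -> C <- B: A and B are independent,
# but each is dependent on C. CPT values are illustrative.
import itertools

p_A = {0: 0.6, 1: 0.4}
p_B = {0: 0.5, 1: 0.5}
# p(C=1 | a, b): a noisy-OR-like table, assumed for illustration
p_C_given = {(0, 0): 0.1, (0, 1): 0.7, (1, 0): 0.7, (1, 1): 0.95}

def joint(a, b, c):
    pc = p_C_given[(a, b)] if c == 1 else 1 - p_C_given[(a, b)]
    return p_A[a] * p_B[b] * pc

def marginal(**fixed):
    # Sum the joint over all assignments consistent with the fixed values
    return sum(joint(a, b, c)
               for a, b, c in itertools.product((0, 1), repeat=3)
               if all({'a': a, 'b': b, 'c': c}[k] == v for k, v in fixed.items()))

# A and B are independent: p(a, b) == p(a) * p(b)
assert abs(marginal(a=1, b=1) - marginal(a=1) * marginal(b=1)) < 1e-12
# But A and C are dependent: p(a, c) != p(a) * p(c)
assert abs(marginal(a=1, c=1) - marginal(a=1) * marginal(c=1)) > 1e-3
```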

Undirected graphs Intuitively capture the notion of correlation (although this can be a philosophical argument). A — B: the values of A and B are interdependent. Directed graphs can be converted into undirected graphs (but this is beyond the scope of this course)
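In an undirected graph the interdependence between A and B is expressed by a symmetric potential function rather than a conditional distribution. A minimal sketch of a two-node pairwise Markov Random Field, with illustrative potential values (assumed) that favour the two nodes agreeing:

```python
# Minimal pairwise Markov Random Field over two binary nodes A - B.
# The joint is proportional to a symmetric potential psi(a, b);
# the values below are illustrative assumptions.
psi = {(0, 0): 4.0, (0, 1): 1.0, (1, 0): 1.0, (1, 1): 4.0}

# Partition function (normaliser): sum of all potential values
Z = sum(psi.values())
p = {ab: v / Z for ab, v in psi.items()}

print(p[(0, 0)])  # 0.4
```

Unlike the directed case, the potentials need not be probabilities; normalising by Z is what turns them into a joint distribution.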

An undirected graph for a computer vision task

Notation Squares denote discrete nodes Circles denote continuous-valued nodes Clear denotes a hidden node Shaded denotes an observed node

Rolling out over time Probabilistic graphical model notation is very good at showing how models are propagated in time, exposing the dependencies between the different elements of the graphical structure

Rolling out our traffic light example over 2 time steps: the hidden state nodes (STOP, GO, GET READY TO GO, GET READY TO STOP) and observable nodes are duplicated at t=1 and t=2, with an arrow from the hidden node at t=1 to the hidden node at t=2

Remember the concept of the temporal order of a model? In this model, rolled out over t=1 and t=2, the value of the hidden nodes (and thus the observable ones) at time t+1 depends only on the previous time step t. So this is a first-order temporal model
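The first-order dependence above is exactly what a forward recursion exploits: the belief over the hidden state at t+1 is computed from the belief at t alone. A sketch of one forward step, with assumed transition and emission tables for a simplified two-state version of the traffic-light chain (the tables and observation names are illustrative, not from the slides):

```python
# One forward step of a first-order temporal model (HMM-style):
# the hidden state at t+1 depends only on the hidden state at t.
# All probability tables below are illustrative assumptions.
states = ('GO', 'STOP')
trans = {'GO': {'GO': 0.8, 'STOP': 0.2},
         'STOP': {'GO': 0.3, 'STOP': 0.7}}
emit = {'GO': {'green': 0.9, 'red': 0.1},
        'STOP': {'green': 0.1, 'red': 0.9}}

belief = {'GO': 0.5, 'STOP': 0.5}  # prior over the hidden state at t=1

def forward_step(belief, obs):
    # Predict: roll the belief forward through the transition model
    pred = {s2: sum(belief[s1] * trans[s1][s2] for s1 in states)
            for s2 in states}
    # Update: weight by the observation likelihood, then normalise
    unnorm = {s: pred[s] * emit[s][obs] for s in states}
    z = sum(unnorm.values())
    return {s: v / z for s, v in unnorm.items()}

belief = forward_step(belief, 'green')
print(belief['GO'])
```

A second-order model would need the belief over the two previous time steps here, which is why higher-order models are more expensive to roll out.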

Remember the concept of the temporal order of a model? A second-order temporal model: rolled out over time steps t=1 to t=4, the hidden nodes at each time step depend on the hidden nodes at the two previous time steps

So why are graphical models relevant to Cognitive CV? Precisely because they allow us to see different methods as instances of a broader probabilistic framework. These methods are the basis for our model of perception guided by expectation. We can put our model of expectation on a solid theoretical foundation, and develop well-founded methods of learning rather than just being stuck with hand-coded models

Summary Probabilistic graphical models put these formalisms on a well-founded mathematical basis. We can distinguish directed and undirected graphs. Here we concentrate on directed graphs, which we can roll out over time easily

Next time … A family of graphical models A lot of excellent reference material can be found at: