Soft Computing Lecture 17 Introduction to probabilistic reasoning. Bayesian nets. Markov models.

Why probabilistic methods?

Probabilistic description is used for events, behavior, or decision making when we do not have enough knowledge about the object under observation. Probability theory, and the probabilistic methods of AI built on it, aim to bring into the treatment of random processes whatever knowledge we have about the laws or rules governing the sources of random events. This is an alternative way of describing uncertainty, in contrast to fuzzy logic.

The Problem

We normally deal with assertions and their causal connections:
– John has fever.
– John has the flu.
– If somebody has the flu, then that person has fever.

We are not certain that such assertions are true; we believe or disbelieve them to some degree. [Though "belief" and "evidence" are not the same thing, for our purposes they will be treated synonymously.] Our problem is how to associate a degree of belief or disbelief with assertions:
– How do we associate beliefs with elementary assertions?
– How do we combine beliefs in composite assertions from the beliefs of the component assertions?
– What is the relation between the beliefs of causally connected assertions?

Estimates for elementary assertions are obtained:
– From experts (subjective probability).
– From frequencies (if given enough data).

It is very hard to come up with good estimates for beliefs. Always consider the question "What if the guess is bad?" Estimates are needed, given the beliefs in assertions A and B, for the assertions ~A, A & B, A v B (a sketch of these combination rules follows this slide). Evidence must be combined in cases such as:
– We have a causal connection from assertion A to assertion B: what can we say about B if A is true, or, vice versa, about A if B is true?
– We have a causal connection from assertion A to assertions B1 and B2: what can we say about A if both B1 and B2 are true?
– We have a causal connection from assertion A1 to B and a causal connection from A2 to B: what can we say about B when both A1 and A2 are true?
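A minimal sketch of the standard probabilistic combination rules for ~A, A & B, and A v B. The numeric values are invented, and the conjunction rule assumes A and B are independent, which the slide does not claim:

```python
# Combining degrees of belief for composite assertions,
# treating beliefs as probabilities.

def bel_not(p_a):
    """Belief in ~A from belief in A."""
    return 1.0 - p_a

def bel_and(p_a, p_b):
    """Belief in A & B, ASSUMING A and B are independent."""
    return p_a * p_b

def bel_or(p_a, p_b):
    """Belief in A v B via inclusion-exclusion (independence assumed for the joint)."""
    return p_a + p_b - bel_and(p_a, p_b)

p_flu, p_fever = 0.1, 0.3          # illustrative numbers, not from the slide
print(bel_not(p_flu))              # 0.9
print(bel_and(p_flu, p_fever))     # 0.03 under independence
print(bel_or(p_flu, p_fever))      # 0.37 under independence
```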

Probabilistic methods of reasoning and learning
– Probabilistic neural networks
– Bayesian networks
– Markov models and chains
– Support Vector and Kernel Machines (SVM)
– Genetic algorithms (evolutionary learning)

Bayes' Law
– P(a, b) = P(a|b) P(b) = P(b|a) P(a)
– Joint probability of a and b = probability of b times the probability of a given b.
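A quick numeric check of the product rule; the probability values are made up for illustration:

```python
# Product rule: P(a, b) = P(a|b) P(b) = P(b|a) P(a).
p_b = 0.2          # P(b)
p_a_given_b = 0.5  # P(a|b)
p_a = 0.25         # P(a)

p_ab = p_a_given_b * p_b                      # P(a, b) = 0.1
p_b_given_a = p_ab / p_a                      # P(b|a) = 0.4, rearranging the rule
assert abs(p_b_given_a * p_a - p_ab) < 1e-12  # both factorizations agree
print(p_ab, p_b_given_a)
```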

Bayesian learning (1)–(4)
[Four slides of formulas; they appear only as images in the original, so no text survives in the transcript.]

Bayes theorem

P(A_j | B) = P(B | A_j) P(A_j) / P(B)

– P(A_j | B) – posterior probability of event A_j given event B
– P(B | A_j) – likelihood
– P(A_j) – prior probability of A_j
– P(B) – evidence

Bayes' theorem is only valid if we know all the conditional probabilities relating to the evidence in question. This makes it hard to apply the theorem in practical AI applications.
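A minimal sketch of the theorem in code, computing the posterior over two mutually exclusive, exhaustive events; all numbers are invented for illustration:

```python
# Posterior P(A_j|B) = P(B|A_j) P(A_j) / P(B),
# with P(B) = sum_j P(B|A_j) P(A_j) over mutually exclusive, exhaustive A_j.
priors      = {"A1": 0.7, "A2": 0.3}   # P(A_j)
likelihoods = {"A1": 0.1, "A2": 0.8}   # P(B|A_j)

evidence = sum(likelihoods[a] * priors[a] for a in priors)            # P(B)
posterior = {a: likelihoods[a] * priors[a] / evidence for a in priors}
print(posterior)  # {'A1': 0.2258..., 'A2': 0.7741...}
```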

Bayesian Network

A Bayesian network is a directed acyclic graph:
– A graph whose directed links indicate dependencies that exist between nodes (variables).
– Nodes represent propositions about events, or events themselves.
– Conditional probabilities quantify the strength of dependencies.

Consider the following example: the probability that my car won't start. If my car won't start, then it is likely that
– the battery is flat, or
– the starting motor is broken.

In order to decide whether to fix the car myself or send it to the garage, I reason as follows (a code sketch of this network follows the next slide):
– If the headlights do not work, then the battery is likely to be flat, so I fix it myself.
– If the starting motor is defective, then send the car to the garage.
– If the battery and the starting motor are both gone, send the car to the garage.

A simple Bayesian network
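The original slide presents the network as a figure. As a stand-in, here is a minimal sketch of how the car example from the previous slide could be written down; the structure, variable names, and every probability value are assumptions made for illustration:

```python
# A simple Bayesian network for the car example, written as
# explicit conditional probability tables (CPTs).
# Variables: Battery flat (B), starting Motor broken (M),
#            Headlights fail (H), car won't Start (S).
# Structure: B -> H, B -> S, M -> S.

p_B = 0.02                                # prior: battery is flat
p_M = 0.01                                # prior: starting motor is broken

p_H_given_B = {True: 0.95, False: 0.05}   # P(headlights fail | B)

# P(car won't start | B, M)
p_S_given_BM = {(True, True): 0.99, (True, False): 0.90,
                (False, True): 0.95, (False, False): 0.01}

def joint(b, m, h, s):
    """P(B=b, M=m, H=h, S=s) from the chain rule along the DAG."""
    pb = p_B if b else 1 - p_B
    pm = p_M if m else 1 - p_M
    ph = p_H_given_B[b] if h else 1 - p_H_given_B[b]
    ps = p_S_given_BM[(b, m)] if s else 1 - p_S_given_BM[(b, m)]
    return pb * pm * ph * ps
```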

Kinds of relations between variables in Bayesian nets
a) Serial connection (sequence): influence may flow from A to C and back as long as the value of B is unknown; observing B blocks it.
b) Diverging connection: influence may flow between the children of A as long as A is unknown; observing A blocks it.
c) Converging connection: as long as nothing is known about A except what may be inferred from its parents, the parents are independent; evidence about A (or one of its descendants) opens the path between them.

Reasoning in Bayesian nets

Probabilities on links obey the standard conditional probability axioms; therefore, follow links in reaching a hypothesis and update beliefs accordingly. A few broad classes of algorithms have been used to help with this:
– Pearl's message-passing method.
– Clique triangulation.
– Stochastic methods.
Basically, they all take advantage of clusters in the network and use limits on influence to constrain the search through the net. They also ensure that probabilities are updated correctly. Since information is local, it can readily be added and deleted with minimal effect on the whole network: ONLY affected nodes need updating. (A sketch of the brute-force baseline these algorithms improve on follows.)
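The slide names the algorithm families without spelling any of them out. As a baseline, here is a minimal sketch of exact inference by enumeration over the joint distribution, reusing the hypothetical `joint` function from the car-network sketch above; the named algorithms exist precisely to avoid this exponential enumeration:

```python
from itertools import product

def query(target, evidence):
    """P(target=True | evidence) by summing the joint over all worlds.

    Exponential in the number of variables; fine for 4 variables,
    hopeless for large networks.
    """
    names = ("b", "m", "h", "s")
    num = den = 0.0
    for world in product([True, False], repeat=4):
        w = dict(zip(names, world))
        if any(w[k] != v for k, v in evidence.items()):
            continue                        # world inconsistent with evidence
        p = joint(w["b"], w["m"], w["h"], w["s"])
        den += p
        if w[target]:
            num += p
    return num / den

# Belief that the battery is flat, given the car won't start
# and the headlights do not work:
print(query("b", {"s": True, "h": True}))
```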

Synthesis of a Bayes network based on a priori information
– Describe the task in terms of the probabilities of the values of the goal variables.
– Select the concept space of the task, determine the variables corresponding to the goal variables, and describe their possible values.
– Determine the a priori probabilities of the values of the variables.
– Describe the causal relations between nodes (variables) as a graph.
– For every node, determine the conditional probabilities of the values of its variable for the different combinations of values of its parent variables.

Applications of Bayes networks

Medical diagnostic systems
– PathFinder (1990), for diagnosing diseases of the lymphatic glands.
Space and military applications
– Vista (NASA), used to select the information needed for the real-time diagnostic display from telemetry.
– Netica (Australia), for defending territory from the sea.
Computers and software
– Control of helper agents in MS Office.
Image processing
– Extraction of 3-dimensional scenes from 2-dimensional images.
Finance and economy
– Estimation of risks and prediction of portfolio yield.

Hidden Markov Model for recognition of speech

[Figure: a three-state left-to-right HMM with emission distributions P1(.), P2(.), P3(.) and transition probabilities P(1,1), P(1,2), P(2,2), P(2,3), P(3,3), P(3,4).]
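The slide gives only the model topology. A minimal sketch of how such a model scores an observation sequence, using the forward algorithm on a three-state left-to-right HMM; every transition and emission number is invented for illustration:

```python
# Forward algorithm on a 3-state left-to-right HMM.
# trans[i][j] = P(state j at t+1 | state i at t).
trans = [[0.6, 0.4, 0.0],
         [0.0, 0.7, 0.3],
         [0.0, 0.0, 0.8]]   # remaining 0.2 mass exits the model

# emit[i][o] = P(observation o | state i), over a toy 2-symbol alphabet.
emit = [[0.9, 0.1],
        [0.5, 0.5],
        [0.2, 0.8]]

def forward(obs):
    """P(observations, still inside the model), summed over states."""
    alpha = [emit[i][obs[0]] if i == 0 else 0.0 for i in range(3)]  # start in state 1
    for o in obs[1:]:
        alpha = [sum(alpha[i] * trans[i][j] for i in range(3)) * emit[j][o]
                 for j in range(3)]
    return sum(alpha)

print(forward([0, 0, 1, 1]))  # likelihood of a short observation sequence
```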

Lexical HMMs

Create a compound HMM for each lexical entry by concatenating the phone HMMs making up the pronunciation:
– Example: the HMM for 'lab' (following 'speech', giving the cross-word triphone).
– Multiple pronunciations can be weighted by likelihood into the compound HMM for a word.
– (Tri)phone models are independent parts of word models.

[Figure: three three-state phone HMMs concatenated in sequence, each with emissions P1(.), P2(.), P3(.) and transitions P(1,1), P(1,2), P(2,2), P(2,3), P(3,3), P(3,4).]

phone: l a b
triphone: ch-l+a l-a+b a-b+#
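A minimal sketch of the concatenation step, chaining phone HMMs into one word-level model; it reuses the toy `trans` and `emit` tables from the previous sketch, and the triphone sequence follows the slide's 'lab' example:

```python
# Build a compound word HMM by chaining phone HMMs: the leftover exit
# probability of each phone's last state links to the next phone's
# first state, giving one big left-to-right model.

def concat_hmms(models):
    """models: list of (trans, emit) per phone, in pronunciation order.
    Returns (big_trans, big_emit) over the concatenated state space."""
    n = sum(len(t) for t, _ in models)
    big_trans = [[0.0] * n for _ in range(n)]
    big_emit = []
    offset = 0
    for k, (t, e) in enumerate(models):
        m = len(t)
        for i in range(m):                     # copy the phone's own block
            for j in range(m):
                big_trans[offset + i][offset + j] = t[i][j]
        exit_mass = 1.0 - sum(t[m - 1])        # probability of leaving the phone
        if k + 1 < len(models):                # link to next phone's entry state
            big_trans[offset + m - 1][offset + m] = exit_mass
        big_emit.extend(e)
        offset += m
    return big_trans, big_emit

# 'lab' as the three triphones from the slide, all sharing the same
# toy parameters for simplicity:
lab_trans, lab_emit = concat_hmms([(trans, emit)] * 3)
print(len(lab_trans), "states in the compound 'lab' model")  # 9
```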