Basic Probability and Bayes' Rule


 Uncertainty refers to epistemic (relating to knowledge) situations involving imperfect or unknown information.  Data might be missing or unavailable.  Data might be present but unreliable or ambiguous.  The representation of the data may be imprecise or inconsistent.  Data may just be the user’s best guess.

 Probability theory provides a sound, principled method for decision making under uncertainty. In propositional logic, we used variables that could take on the logical values True or False; in probability, we use random variables instead.  Probabilities can be learned from data.  Each possible assignment has a probability in the range [0, 1] indicating how likely it is that the variable takes that particular value.
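Since the slide notes that probabilities can be learned from data, here is a minimal sketch of estimating a discrete distribution by relative frequency; the weather observations are invented for illustration:

```python
from collections import Counter

# Hypothetical observations of a discrete random variable Weather.
observations = ["sunny", "rain", "sunny", "sunny",
                "cloudy", "rain", "sunny", "cloudy"]

counts = Counter(observations)
total = len(observations)

# Relative-frequency estimate: P(Weather = v) ~ count(v) / N
probs = {value: count / total for value, count in counts.items()}

print(probs["sunny"])       # 4 of 8 observations are sunny -> 0.5
print(sum(probs.values()))  # the estimates form a valid distribution -> 1.0
```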

 In probability we use random variables. There are three types of random variable: 1. Boolean random variables, which can take on the values true or false. 2. Discrete random variables, which can take on a finite number of values. 3. Continuous random variables, which can take on any value in a continuous range (an infinite number of values).
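A minimal sketch of sampling each of the three kinds of random variable with Python's standard `random` module; the specific probabilities and ranges are illustrative assumptions, not from the slides:

```python
import random

rng = random.Random(42)

# 1. Boolean random variable: true or false.
rain = rng.random() < 0.3             # True with probability 0.3

# 2. Discrete random variable: one of a finite set of values.
weather = rng.choice(["sunny", "cloudy", "rain"])

# 3. Continuous random variable: any real number in a range.
temperature = rng.uniform(-10.0, 40.0)

print(type(rain).__name__, weather, -10.0 <= temperature <= 40.0)
```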

1. P(True) = 1; P(False) = 0. True is certain to be true; False is certain not to be true. 2. 0 ≤ P(A) ≤ 1: the probability of any event A lies between 0 and 1, inclusive. 3. P(A ∪ B) = P(A) + P(B) for disjoint events A and B.
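The three axioms can be checked on a small sample space. A sketch using a fair six-sided die, where an event is a set of outcomes (the die and the particular events A and B are my own illustrative choices):

```python
from fractions import Fraction

# Sample space for one fair die roll; every outcome is equally likely.
omega = frozenset({1, 2, 3, 4, 5, 6})

def P(event):
    """Probability of an event (a subset of the sample space)."""
    return Fraction(len(event & omega), len(omega))

A = {1, 2}   # "roll is 1 or 2"
B = {5, 6}   # "roll is 5 or 6" -- disjoint from A

print(P(omega), P(set()))       # axiom 1: P(True) = 1, P(False) = 0
print(0 <= P(A) <= 1)           # axiom 2: probabilities lie in [0, 1]
print(P(A | B) == P(A) + P(B))  # axiom 3: additivity for disjoint events
```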

 Conditional probability allows us to reason with partial information. It is defined as P(A|B) = P(A ⋀ B) / P(B), provided P(B) > 0.  P(A) is called the a priori (or prior) probability of A, and P(A|B) is called the a posteriori (or posterior) probability of A given B.
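A worked check of conditional probability on a fair die, computing P(A|B) as P(A ⋀ B) / P(B); the events here are my own illustrative choice:

```python
from fractions import Fraction

omega = {1, 2, 3, 4, 5, 6}   # fair six-sided die
A = {2, 4, 6}                # "roll is even"
B = {4, 5, 6}                # "roll is greater than 3"

def P(event):
    return Fraction(len(event), len(omega))

# Conditional probability: restrict attention to the outcomes in B.
p_A_given_B = P(A & B) / P(B)
print(p_A_given_B)   # of the outcomes {4, 5, 6}, two are even -> 2/3
```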

 Bayes' theorem is also known as Bayes' rule, Bayes' law, or Bayesian reasoning; it determines the probability of an event given uncertain knowledge.  Bayes' rule specifies how to combine data and prior knowledge.  In probability theory, it relates the conditional probabilities and marginal probabilities of two random events.  Bayes' theorem is named after the British mathematician Thomas Bayes. Bayesian inference is an application of Bayes' theorem and is fundamental to Bayesian statistics.

 Bayes' theorem can be derived using the product rule and the conditional probability of event A given event B: 1. From the product rule we can write: P(A ⋀ B) = P(A|B) P(B) 2. Similarly, for the probability of event B given event A: P(A ⋀ B) = P(B|A) P(A)  Equating the right-hand sides of both equations, we get: P(A|B) = P(B|A) P(A) / P(B) ... (a)  The above equation (a) is called Bayes' rule or Bayes' theorem. This equation is the basis of most modern AI systems for probabilistic inference.
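A numeric sketch of equation (a) on a textbook-style diagnostic-test scenario; all of the rates below are invented for illustration:

```python
# Assumed numbers: 1% base rate, 90% sensitivity, 5% false-positive rate.
p_d = 0.01                  # prior P(Disease)
p_pos_given_d = 0.90        # likelihood P(Positive | Disease)
p_pos_given_not_d = 0.05    # P(Positive | no Disease)

# Total probability: P(Pos) = P(Pos|D) P(D) + P(Pos|not D) P(not D)
p_pos = p_pos_given_d * p_d + p_pos_given_not_d * (1 - p_d)

# Bayes' rule: P(D|Pos) = P(Pos|D) P(D) / P(Pos)
p_d_given_pos = p_pos_given_d * p_d / p_pos
print(round(p_d_given_pos, 3))   # 0.154: still unlikely despite the positive test
```

The low posterior despite a fairly accurate test shows why the prior matters: with a 1% base rate, most positives come from the large healthy population.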

Following are some applications of Bayes' theorem:  It is used to calculate the probability of a robot's next step given the steps already executed.  Bayes' theorem is helpful in weather forecasting.  It can solve the Monty Hall problem.
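Since the Monty Hall problem is listed as an application, here is a quick Monte Carlo sketch whose results match the Bayesian analysis (switching wins about 2/3 of the time); the simulation setup is my own:

```python
import random

def monty_hall(switch, trials=100_000, seed=0):
    """Simulate the Monty Hall game; return the empirical win rate."""
    rng = random.Random(seed)
    wins = 0
    for _ in range(trials):
        car = rng.randrange(3)    # door hiding the car
        pick = rng.randrange(3)   # contestant's initial choice
        # Host opens a door that hides a goat and was not picked.
        opened = next(d for d in range(3) if d != pick and d != car)
        if switch:
            # Switch to the remaining unopened door.
            new_pick = next(d for d in range(3) if d != pick and d != opened)
            pick = new_pick
        wins += (pick == car)
    return wins / trials

print(monty_hall(switch=False))  # approximately 1/3 (stay)
print(monty_hall(switch=True))   # approximately 2/3 (switch)
```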