CSc 411 Artificial Intelligence
Slide 1: Chapter 5, STOCHASTIC METHODS
Contents: The Elements of Counting; Elements of Probability Theory; Applications of the Stochastic Methodology; Bayes' Theorem


Slide 2: Application Areas
– Diagnostic reasoning. In medical diagnosis, for example, there is not always an obvious cause/effect relationship between the set of symptoms presented by the patient and the causes of these symptoms. In fact, the same set of symptoms often suggests multiple possible causes.
– Natural language understanding. If a computer is to understand and use a human language, it must be able to characterize how humans themselves use that language. Words, expressions, and metaphors are learned, but they also change and evolve as they are used over time.
– Planning and scheduling. When an agent forms a plan, for example a vacation trip by automobile, it is often the case that no deterministic sequence of operations is guaranteed to succeed. What happens if the car breaks down, if the car ferry is cancelled on a specific day, or if a hotel is fully booked even though a reservation was made?
– Learning. The three previous areas can also be seen as domains for automated learning. An important component of many stochastic systems is their ability to sample situations and learn over time.

Slide 3: Set Operations
Let A and B be two sets, and let U be the universe:
– Cardinality |A|: the number of elements in A
– Complement Ā: all elements of U that are not in A
– Subset: A ⊆ B
– Empty set: ∅
– Union: A ∪ B
– Intersection: A ∩ B
– Difference: A − B

Slide 4: Addition Rules
The addition rule for combining two sets:
|A ∪ B| = |A| + |B| − |A ∩ B|
The addition rule for combining three sets:
|A ∪ B ∪ C| = |A| + |B| + |C| − |A ∩ B| − |A ∩ C| − |B ∩ C| + |A ∩ B ∩ C|
This addition rule (the inclusion-exclusion principle) may be generalized to any finite number of sets.
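The two-set addition rule above can be checked directly on concrete Python sets, a minimal sketch:

```python
# Checking the addition rule |A ∪ B| = |A| + |B| − |A ∩ B|
# on two small example sets.
A = {1, 2, 3, 4}
B = {3, 4, 5}

lhs = len(A | B)                    # |A ∪ B| = 5
rhs = len(A) + len(B) - len(A & B)  # 4 + 3 − 2 = 5
print(lhs == rhs)  # True
```

Subtracting |A ∩ B| corrects for the elements counted twice, here {3, 4}.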

Slide 5: Multiplication Rules
The Cartesian product of two sets A and B:
A × B = {(a, b) | a ∈ A and b ∈ B}
The multiplication principle of counting, for two sets:
|A × B| = |A| × |B|
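The multiplication principle can be illustrated with the standard library's `itertools.product`, which enumerates the Cartesian product:

```python
from itertools import product

# The multiplication principle: |A × B| = |A| * |B|.
A = ['a', 'b']
B = [1, 2, 3]

pairs = list(product(A, B))   # the Cartesian product A × B
print(len(pairs))             # 6 = 2 * 3
```

Each of the 2 choices from A pairs with each of the 3 choices from B, giving 6 ordered pairs.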

Slide 6: Permutations and Combinations
The permutations of a set of n elements taken r at a time:
nPr = n! / (n − r)!
The combinations of a set of n elements taken r at a time:
nCr = n! / (r! (n − r)!)
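Both counting formulas are available in Python's standard library (`math.perm` and `math.comb`, Python 3.8+), which we can cross-check against the factorial definitions:

```python
from math import comb, factorial, perm

n, r = 5, 2
print(perm(n, r))  # 20, i.e. 5!/(5-2)!
print(comb(n, r))  # 10, i.e. 5!/(2!*3!)

# Cross-check against the factorial formulas:
assert perm(n, r) == factorial(n) // factorial(n - r)
assert comb(n, r) == factorial(n) // (factorial(r) * factorial(n - r))
```

Note that nCr = nPr / r!, since each unordered selection of r elements corresponds to r! orderings.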

Slide 7: Events and Probability
(Slide content appears as images in the original and was not captured in this transcript.)

Slide 8: Probability Properties
The probability of any event E from the sample space S is: 0 ≤ p(E) ≤ 1
The sum of the probabilities of all possible outcomes is 1
The probability of the complement of an event is p(Ē) = 1 − p(E)
The probability of the contradictory, or false, outcome of an event is p({}) = 0
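These properties can be demonstrated on a small sample space, here a single roll of a fair die, using exact rational arithmetic:

```python
from fractions import Fraction

# Sample space for one roll of a fair die; each outcome has probability 1/6.
S = range(1, 7)
p = {s: Fraction(1, 6) for s in S}

even = {2, 4, 6}
p_even = sum(p[s] for s in even)

print(p_even)       # 1/2
print(1 - p_even)   # 1/2, the probability of the complement (odd)
assert sum(p.values()) == 1   # probabilities of all outcomes sum to 1
```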

Slide 9: Independent Events
Two events A and B are independent if and only if p(A ∩ B) = p(A) × p(B).

Slide 10: The Kolmogorov Axioms
The three Kolmogorov axioms:
1. For any event E in sample space S: 0 ≤ p(E) ≤ 1
2. The probabilities of all outcomes of S sum to 1: p(S) = 1
3. For mutually exclusive events E1 and E2: p(E1 ∪ E2) = p(E1) + p(E2)
From these three Kolmogorov axioms, all of probability theory can be constructed.

Slide 11: Traffic Example
Problem description: a driver notices a gradual traffic slowdown and searches for a possible explanation by means of a car-based download system.
– Road construction?
– An accident?
Three Boolean parameters:
– S: whether there is a slowdown
– A: whether there is an accident
– C: whether there is road construction
Download data: next page

Slide 12: The joint probability distribution for the traffic slowdown (S), accident (A), and construction (C) variables of the example, together with a Venn diagram representation of the probability distributions, where S is traffic slowdown, A is accident, and C is construction. (The table, diagram, and download data appear as images in the original slide.)

Slide 13: Variables
(Slide content appears as images in the original and was not captured in this transcript.)

Slide 14: Expectation
The expected value of an event is the sum of the value of each possible outcome weighted by that outcome's probability. (The formulas on this slide appear as images in the original.)

Slide 15: Prior and Posterior Probability
The prior probability p(h) of a hypothesis is its probability before any evidence is seen; the posterior probability p(h|E) is its probability given evidence E. (The formulas on this slide appear as images in the original.)

Slide 16: Conditional Probability
The conditional probability of event A given that event B has occurred, for p(B) ≠ 0:
p(A|B) = p(A ∩ B) / p(B)
A Venn diagram in the original slide illustrates the calculation of p(d|s) as a function of p(s|d).
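The relation between p(d|s) and p(s|d) illustrated by the slide's Venn diagram is Bayes' rule. A minimal sketch with hypothetical numbers (the prior, likelihood, and evidence probabilities below are illustrative, not from the slide) for a disease d and a symptom s:

```python
# Relating p(d|s) to p(s|d) via Bayes' rule:
#   p(d|s) = p(s|d) * p(d) / p(s)
# All three input probabilities are hypothetical illustration values.
p_d = 0.01          # prior probability of the disease d
p_s_given_d = 0.9   # probability of the symptom s given d
p_s = 0.08          # overall probability of the symptom s

p_d_given_s = p_s_given_d * p_d / p_s
print(round(p_d_given_s, 4))  # 0.1125
```

Even with a strong likelihood p(s|d) = 0.9, the posterior stays small because the prior p(d) is small.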

Slide 17: Chain Rules
The chain rule for two sets:
p(A ∩ B) = p(A|B) × p(B)
The generalization of the chain rule to multiple sets:
p(A1 ∩ A2 ∩ … ∩ An) = p(A1) × p(A2|A1) × p(A3|A1 ∩ A2) × … × p(An|A1 ∩ … ∩ An−1)
We make an inductive argument to prove the chain rule. Consider the nth case:
p(A1 ∩ … ∩ An−1 ∩ An)
We apply the two-set rule, treating A1 ∩ … ∩ An−1 as a single set, to get:
p((A1 ∩ … ∩ An−1) ∩ An) = p(A1 ∩ … ∩ An−1) × p(An|A1 ∩ … ∩ An−1)
and then reduce again, in the same way, until the base case of two sets, which we have already demonstrated, is reached.
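The two-set chain rule can be verified numerically on a toy joint distribution over two binary variables (the probabilities below are hypothetical):

```python
# Checking p(A ∩ B) = p(A|B) * p(B) on a small hypothetical joint
# distribution over binary variables A and B.
joint = {  # keys are (A, B) truth values
    (True, True): 0.2, (True, False): 0.1,
    (False, True): 0.4, (False, False): 0.3,
}

p_B = joint[(True, True)] + joint[(False, True)]   # marginal p(B=t) = 0.6
p_A_given_B = joint[(True, True)] / p_B            # p(A=t | B=t)

# The product recovers the joint entry:
assert abs(p_A_given_B * p_B - joint[(True, True)]) < 1e-12
print(round(p_A_given_B * p_B, 10))  # 0.2
```

Applying this step repeatedly, peeling off one variable at a time, is exactly the inductive argument sketched above.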

Slide 18: Independent Events
(Slide content appears as images in the original and was not captured in this transcript.)

Slide 19: Probabilistic FSM
(Slide content appears as images in the original and was not captured in this transcript.)

Slide 20: Probabilistic Finite State Acceptor
A probabilistic finite state acceptor for the pronunciation of "tomato".
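The idea of a probabilistic acceptor can be sketched in a few lines. The states and transition probabilities below are hypothetical stand-ins, not the figure's actual values; the book's "tomato" network similarly allows alternative phones (e.g. "ow" vs. "ah") with a probability on each transition:

```python
# A minimal probabilistic finite-state acceptor sketch.
# Transition table: (state, phone) -> (next_state, probability).
# All states and probabilities here are hypothetical.
transitions = {
    (0, 't'):  (1, 1.0),
    (1, 'ow'): (2, 0.5),   # "to-MAY-to" style first vowel
    (1, 'ah'): (2, 0.5),   # alternative first vowel
    (2, 'm'):  (3, 1.0),
    (3, 'ey'): (4, 0.5),
    (3, 'aa'): (4, 0.5),
    (4, 't'):  (5, 1.0),
    (5, 'ow'): (6, 1.0),
}

def sequence_probability(phones, start=0, accept=6):
    """Probability the acceptor assigns to a phone sequence."""
    state, prob = start, 1.0
    for ph in phones:
        if (state, ph) not in transitions:
            return 0.0          # no such transition: sequence rejected
        state, p = transitions[(state, ph)]
        prob *= p
    return prob if state == accept else 0.0

print(sequence_probability(['t', 'ow', 'm', 'ey', 't', 'ow']))  # 0.25
print(sequence_probability(['t', 'x']))                         # 0.0
```

The probability of an accepted path is the product of its transition probabilities; rejected sequences score 0.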

Slide 21: The "ni" words, with their frequencies and probabilities, from the Brown and Switchboard corpora of 2.5M words, together with the "ni" phone/word probabilities from the same corpora. (Tables shown as images in the original slide.)

Slide 22: Bayes' Rule
Given a set of evidence E and a set of hypotheses H = {h_i}, the conditional probability of h_i given E is:
p(h_i|E) = (p(E|h_i) × p(h_i)) / p(E)
The maximum a posteriori hypothesis (the most probable hypothesis), since p(E) is a constant for all hypotheses:
arg max(h_i) p(E|h_i) p(h_i)
E is partitioned by the set of all hypotheses, thus:
p(E) = Σ_i p(E|h_i) p(h_i)
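MAP selection can be sketched directly from these formulas. The priors and likelihoods below are hypothetical illustration values:

```python
# MAP hypothesis selection: arg max_i p(E|h_i) * p(h_i).
# Priors p(h_i) and likelihoods p(E|h_i) are hypothetical.
priors      = {'h1': 0.5, 'h2': 0.3, 'h3': 0.2}
likelihoods = {'h1': 0.1, 'h2': 0.5, 'h3': 0.4}

# p(E) via the partition: sum_i p(E|h_i) p(h_i)
p_E = sum(likelihoods[h] * priors[h] for h in priors)
posterior = {h: likelihoods[h] * priors[h] / p_E for h in priors}

map_h = max(posterior, key=posterior.get)
print(map_h)  # h2
```

Note that the argmax over p(E|h_i) p(h_i) is the same as the argmax over the posterior, since dividing by the constant p(E) does not change the ordering.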

Slide 23: General Form of Bayes' Theorem
The general form of Bayes' theorem, where we assume the set of hypotheses H partitions the evidence set E:
p(h_i|E) = p(E|h_i) p(h_i) / Σ_k p(E|h_k) p(h_k)

Slide 24: Applications of Bayes' Theorem
Used in PROSPECTOR. A simple example: suppose you wish to purchase an automobile.

Dealer | Probability of going to dealer | Probability of purchasing automobile a1
1      | d1 = 0.2                       | p1 = (not captured in transcript)
2      | d2 = 0.4                       | p2 = (not captured in transcript)
3      | d3 = 0.4                       | p3 = 0.3

The application of Bayes' rule to the car purchase problem is shown in the original slide.
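The calculation can be sketched as follows. Note that the transcript only preserves d1 = 0.2, d2 = 0.4, d3 = 0.4 and p3 = 0.3; the values used here for p1 and p2 are hypothetical fill-ins for illustration only:

```python
# Bayes' rule on the car-purchase example.
# d[i] = p(go to dealer i); p[i] = p(purchase a1 | dealer i).
# p[0] and p[1] are assumed values, NOT from the original slide.
d = [0.2, 0.4, 0.4]
p = [0.2, 0.4, 0.3]   # first two entries hypothetical

# Total probability of purchasing a1 (the partition rule):
p_a1 = sum(di * pi for di, pi in zip(d, p))
# Posterior probability of each dealer given that a1 was purchased:
post = [di * pi / p_a1 for di, pi in zip(d, p)]

print(round(p_a1, 2))  # 0.32
```

The posterior list `post` answers questions such as "given that a1 was purchased, which dealer was it most likely bought from?"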

Slide 25: Bayes Classifier
Naïve Bayes, or the Bayes classifier, uses the partition assumption even when it is not justified: assume all pieces of evidence are independent given a particular hypothesis, so that
p(E|h_i) = Π_j p(e_j|h_i), and the classifier selects arg max(h_i) p(h_i) Π_j p(e_j|h_i)
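A minimal naïve Bayes sketch for a toy spam/ham task; all class priors and word likelihoods below are hypothetical illustration values:

```python
from math import prod

# Naive Bayes: evidence items (words) are assumed conditionally
# independent given the hypothesis (class). Numbers are hypothetical.
priors = {'spam': 0.4, 'ham': 0.6}
likelihood = {   # p(word | class) for a tiny vocabulary
    'spam': {'offer': 0.30, 'meeting': 0.05},
    'ham':  {'offer': 0.08, 'meeting': 0.20},
}

def classify(words):
    # Score each class by p(h) * prod_j p(e_j | h) and take the argmax.
    scores = {h: priors[h] * prod(likelihood[h][w] for w in words)
              for h in priors}
    return max(scores, key=scores.get)

print(classify(['offer']))              # spam
print(classify(['meeting', 'offer']))   # ham
```

The independence assumption lets the likelihood of a whole word set factor into a product of per-word terms, which is what makes the classifier cheap to train and apply.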

Slide 26: The Traffic Problem
The Bayesian representation of the traffic problem, with potential explanations, and the joint probability distribution for the traffic and construction variables (shown in the original slide).
Given bad traffic, what is the probability of road construction?
p(C|T) = p(C=t, T=t) / (p(C=t, T=t) + p(C=f, T=t)) = 0.3 / (0.3 + 0.1) = 0.75
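The slide's calculation can be reproduced by marginalizing the two joint entries it gives, using exact rational arithmetic:

```python
from fractions import Fraction

# The two joint entries the slide provides, keyed by (C, T):
joint = {('t', 't'): Fraction(3, 10), ('f', 't'): Fraction(1, 10)}

p_T = joint[('t', 't')] + joint[('f', 't')]   # p(T=t), marginalizing C
p_C_given_T = joint[('t', 't')] / p_T         # conditional probability
print(p_C_given_T)  # 3/4
```

Dividing the joint entry by the marginal p(T=t) is exactly the conditional-probability definition from slide 16.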