AI - Week 24 Uncertain Reasoning (quick mention) then REVISION Lee McCluskey, room 2/07



Logic and Reasoning in AI Other Forms of Representation and Reasoning Throughout the course (Planning, Logic, Machine Learning, Prolog) we have used "classical" True/False logic to underpin all we have done. There are, however, large branches of AI that represent knowledge in other ways, usually to capture uncertainty, belief or probability.

Logic and Reasoning in AI Example: Bayesian Representations There are many techniques for reasoning with uncertainty; the most common is the Bayesian belief network (named after Thomas Bayes, an 18th-century English clergyman). P(H given Evidence) = P(Evidence given H) * P(H) / P(Evidence) Bayesian networks are nodes connected by arcs, where probability is propagated through the network using Bayesian inference. Bayesian systems are based on statistical inference, so they suit applications where relationships can be modelled by probability or likelihood, e.g. diagnostic expert systems in medicine (the classic MYCIN system, often mentioned in this context, actually used the related but simpler certainty-factor approach).
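Bayes' rule above can be applied directly with a few lines of arithmetic. As a sketch, take a diagnostic-test scenario; all the probabilities here are invented purely for illustration:

```python
# Hypothetical numbers for a diagnostic test (all values assumed for illustration)
p_h = 0.01              # prior P(H): patient has the disease
p_e_given_h = 0.90      # P(E|H): test is positive when the disease is present
p_e_given_not_h = 0.05  # P(E|not H): false-positive rate

# P(E) by the law of total probability
p_e = p_e_given_h * p_h + p_e_given_not_h * (1 - p_h)

# Bayes' rule: P(H|E) = P(E|H) * P(H) / P(E)
p_h_given_e = p_e_given_h * p_h / p_e
print(round(p_h_given_e, 3))  # prints 0.154
```

Note how the posterior stays small despite the positive test: the rare prior (1%) dominates, which is exactly the kind of effect Bayesian reasoning makes explicit.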

Logic and Reasoning in AI Bayes
* H is a hypothesis, and E is the evidence.
* P(H) is the prior probability of H: the probability that H is correct before the evidence E is seen.
* P(E | H) is the conditional probability (the likelihood) of observing evidence E given that hypothesis H is true.
* P(E) is the probability of observing the evidence E at all.
* P(H | E) is the posterior probability: the probability that the hypothesis is true once the evidence has been taken into account.
Bayesian networks are structures for representing knowledge about uncertain facts; they give a way of computing the impact of evidence.
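To make "computing the impact of evidence" concrete, here is a minimal Bayesian network sketched in plain Python: the classic Rain/Sprinkler/WetGrass structure, with a query P(Rain | WetGrass) answered by brute-force enumeration of the joint distribution. All conditional probability table entries are invented for illustration; real networks use far more efficient propagation algorithms than this.

```python
from itertools import product

# Hypothetical CPTs (all numbers invented for illustration)
P_rain = {True: 0.2, False: 0.8}
P_sprinkler = {True: 0.1, False: 0.9}
P_wet_true = {  # P(WetGrass=True | Rain, Sprinkler)
    (True, True): 0.99,
    (True, False): 0.90,
    (False, True): 0.80,
    (False, False): 0.05,
}

def joint(rain, sprinkler, wet):
    """Joint probability P(Rain, Sprinkler, WetGrass) via the chain rule."""
    p = P_rain[rain] * P_sprinkler[sprinkler]
    pw = P_wet_true[(rain, sprinkler)]
    return p * (pw if wet else 1 - pw)

# Query: P(Rain=True | WetGrass=True), by summing out Sprinkler
numerator = sum(joint(True, s, True) for s in (True, False))
denominator = sum(joint(r, s, True) for r, s in product((True, False), repeat=2))
posterior = numerator / denominator
print(round(posterior, 3))  # prints 0.645
```

Observing wet grass raises the belief in rain from the 0.2 prior to roughly 0.65 under these assumed numbers: this is evidence propagating "backwards" along an arc, which is the essential operation a Bayesian network supports.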