Bayesian Networks
Lecture 1: Basics and Knowledge-Based Construction
Lecture 1 is based on David Heckerman’s tutorial slides (Microsoft Research).
Requirements: 50% homework; 50% exam or a project

What I hope you will get out of this course...
- What are Bayesian networks?
- Why do we use them?
- How do we build them by hand?
- How do we build them from data?
- What are some applications?
- What is their relationship to other models?
- What are the properties of conditional independence that make these models appropriate?
- Usage in genetic linkage analysis

Applications of hand-built Bayes nets
- Answer Wizard 95; Office Assistant 97, 2000
- Troubleshooters in Windows 98
- Lymph node pathology
- Trauma care
- NASA mission control
Some applications of learned Bayes nets
- Clustering users on the web (MSNBC)
- Classifying text (spam filtering)

Some factors that support intelligence
- Knowledge representation
- Reasoning
- Learning / adapting

Artificial Intelligence

Artificial Intelligence is better than none!

Artificial Intelligence is better than ours!

Outline for today
- Basics
- Knowledge-based construction
- Probabilistic inference
- Applications of hand-built BNs at Microsoft

Bayesian Networks: History
- 1920s: Wright -- analysis of crop failure
- 1950s: I.J. Good -- causality
- Early 1980s: Howard and Matheson, Pearl
- Other names:
  - directed acyclic graphical (DAG) models
  - belief networks
  - causal networks
  - probabilistic networks
  - influence diagrams
  - knowledge maps

Bayesian Network
[Figure: a directed acyclic graph over Battery, Fuel, Engine Turns Over, Fuel Gauge, and Start, with the local distributions p(b), p(f), p(t|b), p(g|f,b), p(s|f,t) attached to the nodes.]
A directed acyclic graph, annotated with probability distributions.

BN structure: Definition
Missing arcs encode independencies, such that the joint distribution factors according to the graph:
(*)  p(f, b, t, g, s) = p(f) p(b) p(t|b) p(g|f,b) p(s|f,t)

Independencies in a Bayes net
Many other independencies are entailed by (*); they can be read from the graph using d-separation (Pearl).
Example: Battery and Fuel are marginally independent, since every path between them is blocked by a collider.

Explaining Away and Induced Dependencies
[Figure: Fuel and TurnOver are both parents of Start (a v-structure).]
Observing the common effect Start induces a dependency between its causes: learning that the engine does not turn over "explains away" a failure to start, making an empty tank less likely. Such "induced dependencies" arise whenever a common effect (or one of its descendants) is observed.
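
Below is a minimal sketch of this effect, using inference by enumeration over the factorization (*). Only the p(S|T,F) entries come from the slides; all other numbers and the value names are illustrative assumptions chosen just to make the effect visible.

```python
from itertools import product

# Local distributions for the car-start network.  Only p(Start | TurnOver, Fuel)
# is taken from the slides; the remaining numbers are illustrative assumptions.
p_f = {"empty": 0.05, "not_empty": 0.95}                  # p(Fuel), assumed
p_b = {"good": 0.98, "dead": 0.02}                        # p(Battery), assumed
p_t = {"good": {"yes": 0.97, "no": 0.03},                 # p(TurnOver | Battery), assumed
       "dead": {"yes": 0.01, "no": 0.99}}
p_g = {(f, b): {"reads_empty": 0.9 if f == "empty" else 0.1,
                "reads_full":  0.1 if f == "empty" else 0.9}
       for f in p_f for b in p_b}                         # p(Gauge | Fuel, Battery), assumed
p_s = {("yes", "not_empty"): {"yes": 0.99, "no": 0.01},   # p(Start | TurnOver, Fuel), from the slide
       ("yes", "empty"):     {"yes": 0.0,  "no": 1.0},
       ("no",  "not_empty"): {"yes": 0.0,  "no": 1.0},
       ("no",  "empty"):     {"yes": 0.0,  "no": 1.0}}

def joint(f, b, t, g, s):
    """One full assignment, scored with the factorization (*)."""
    return p_f[f] * p_b[b] * p_t[b][t] * p_g[(f, b)][g] * p_s[(t, f)][s]

def p_fuel_empty(evidence):
    """p(Fuel = empty | evidence), by brute-force enumeration."""
    num = den = 0.0
    for f, b, t, g, s in product(p_f, p_b, ["yes", "no"],
                                 ["reads_empty", "reads_full"], ["yes", "no"]):
        world = {"Fuel": f, "Battery": b, "TurnOver": t, "Gauge": g, "Start": s}
        if any(world[var] != val for var, val in evidence.items()):
            continue
        den += joint(f, b, t, g, s)
        if f == "empty":
            num += joint(f, b, t, g, s)
    return num / den

print(p_fuel_empty({"Start": "no"}))                     # ~0.47: failure to start raises p(empty)
print(p_fuel_empty({"Start": "no", "TurnOver": "no"}))   # 0.05: a dead engine explains the failure away
```

With these numbers the second posterior falls all the way back to the prior, because once TurnOver = no is known, Start = no carries no further information about Fuel.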

Local distributions (table)
Variables: TurnOver T (yes, no), Fuel F (empty, not empty), Start S (yes, no).
p(S=y | T=n, F=e) = 0.0
p(S=y | T=n, F=n) = 0.0
p(S=y | T=y, F=e) = 0.0
p(S=y | T=y, F=n) = 0.99

Local distributions (tree)
The same distribution p(S | T, F) written as a decision tree: split first on TurnOver (no: p(start = yes) = 0); on the yes branch, split on Fuel (empty: p(start = yes) = 0; not empty: p(start = yes) = 0.99).
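
As a small illustration of the two encodings above, here is the same local distribution written both as a full table and as a tree-structured function. The probabilities are the ones on the slide; the value names are assumptions.

```python
# Table: one probability of Start = yes per (TurnOver, Fuel) parent configuration.
table = {
    ("no",  "empty"):     0.0,
    ("no",  "not_empty"): 0.0,
    ("yes", "empty"):     0.0,
    ("yes", "not_empty"): 0.99,
}

def tree(turn_over: str, fuel: str) -> float:
    """Tree-structured CPD: split on TurnOver first, then on Fuel."""
    if turn_over == "no":
        return 0.0                  # one leaf covers both Fuel values
    return 0.0 if fuel == "empty" else 0.99

# Both encodings agree on every parent configuration; the tree just needs fewer leaves.
for t, f in table:
    assert table[(t, f)] == tree(t, f)
```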

Lots of possibilities for a local distribution...
- y = discrete node: any probabilistic classifier
  - Decision tree
  - Neural net
- y = continuous node: any probabilistic regression model
  - Linear regression with Gaussian noise
  - Neural net
[Figure: a node y with its parents.]

Naïve Bayes Classifier
[Figure: a discrete Class node with an arc to each of Input 1, Input 2, ..., Input n.]
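
A minimal sketch of the naïve Bayes structure: the class node is the sole parent of every input, so p(class, x1, ..., xn) = p(class) · prod_i p(xi | class). The spam-filtering variables and all numbers below are illustrative assumptions, not taken from the slides.

```python
prior = {"spam": 0.4, "ham": 0.6}          # p(Class), assumed
likelihood = {                             # p(word present | Class), one CPT per input, assumed
    "offer":   {"spam": 0.7,  "ham": 0.1},
    "meeting": {"spam": 0.05, "ham": 0.4},
}

def posterior(observed: dict) -> dict:
    """p(Class | observed inputs): normalize p(class) * prod_i p(xi | class)."""
    scores = {}
    for c, p_c in prior.items():
        score = p_c
        for word, present in observed.items():
            p_word = likelihood[word][c]
            score *= p_word if present else (1.0 - p_word)
        scores[c] = score
    z = sum(scores.values())
    return {c: s / z for c, s in scores.items()}

print(posterior({"offer": True, "meeting": False}))   # heavily favours "spam"
```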

Hidden Markov Model
[Figure: a chain of discrete, hidden states H1 → H2 → H3 → H4 → H5 → ..., each Hi emitting an observation Xi.]
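
The HMM above factors as p(h1..hT, x1..xT) = p(h1) p(x1|h1) prod_t p(ht|h(t-1)) p(xt|ht); a minimal forward-algorithm sketch sums out the hidden chain left to right. The two-state weather model and all numbers are illustrative assumptions.

```python
states = ["rainy", "sunny"]
initial = {"rainy": 0.5, "sunny": 0.5}                         # p(H1), assumed
transition = {"rainy": {"rainy": 0.7, "sunny": 0.3},           # p(Ht | Ht-1), assumed
              "sunny": {"rainy": 0.3, "sunny": 0.7}}
emission = {"rainy": {"umbrella": 0.9, "no_umbrella": 0.1},    # p(Xt | Ht), assumed
            "sunny": {"umbrella": 0.2, "no_umbrella": 0.8}}

def forward(observations):
    """Return p(x1..xT) by summing out the hidden chain left to right.

    Each step costs O(K^2) for K states, instead of enumerating all K^T paths.
    """
    alpha = {h: initial[h] * emission[h][observations[0]] for h in states}
    for x in observations[1:]:
        alpha = {h: emission[h][x] * sum(alpha[g] * transition[g][h] for g in states)
                 for h in states}
    return sum(alpha.values())

print(forward(["umbrella", "umbrella", "no_umbrella"]))
```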

Feed-Forward Neural Network
[Figure: input nodes feeding a hidden layer of sigmoid units, feeding binary outputs Y1, Y2, Y3.]
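
Read as a Bayes net, each sigmoid unit above is a node whose local distribution is p(unit = 1 | parents) = sigma(w · parents + b). Below is a small sketch of the forward pass under that reading; the layer sizes and random weights are illustrative assumptions.

```python
import math
import random

def sigmoid(z: float) -> float:
    return 1.0 / (1.0 + math.exp(-z))

def layer(inputs, weights, biases):
    """One layer of sigmoid units: returns p(unit = 1) for each unit."""
    return [sigmoid(sum(w * x for w, x in zip(ws, inputs)) + b)
            for ws, b in zip(weights, biases)]

random.seed(0)
w_hidden = [[random.uniform(-1, 1) for _ in range(3)] for _ in range(4)]   # 3 inputs -> 4 hidden units
w_out    = [[random.uniform(-1, 1) for _ in range(4)] for _ in range(3)]   # 4 hidden -> 3 binary outputs
b_hidden, b_out = [0.0] * 4, [0.0] * 3

x = [0.2, -1.0, 0.5]
hidden = layer(x, w_hidden, b_hidden)
print(layer(hidden, w_out, b_out))    # p(Y1=1), p(Y2=1), p(Y3=1)
```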

Outline
- Basics
- Knowledge-based construction
- Probabilistic inference
- Decision making
- Applications of hand-built BNs at Microsoft

Building a Bayes net by hand (ok, now we're starting to be Bayesian)
- Define variables
- Assess the structure
- Assess the local probability distributions

What is a variable?
- Collectively exhaustive, mutually exclusive values (e.g., {Error Occurred, No Error})

Clarity Test: Is the variable knowable in principle?
- Is it raining? {Where, when, how many inches?}
- Is it hot? {T ≥ 100°F, T < 100°F}
- Is the user's personality dominant or submissive? {numerical result of a standardized personality test}

Assessing structure (one approach)
- Choose an ordering x1, ..., xn for the variables
- For each variable xi, identify parents Pa_i such that p(xi | x1, ..., x(i-1)) = p(xi | Pa_i)

Example
Using the ordering Fuel, Battery, TurnOver, Gauge, Start, apply the chain rule and drop the parents that the assessed independencies allow:
p(f)
p(b|f) = p(b)
p(t|b,f) = p(t|b)
p(g|f,b,t) = p(g|f,b)
p(s|f,b,t,g) = p(s|f,t)
giving p(f,b,t,g,s) = p(f) p(b) p(t|b) p(g|f,b) p(s|f,t)
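
A small sketch of the construction procedure just illustrated: given the chosen ordering and the parent sets read off the simplifications above, every parent precedes its child (so the graph is acyclic by construction), and the factorization can be assembled mechanically. The variable and helper names below are my own, not from the slides.

```python
order = ["Fuel", "Battery", "TurnOver", "Gauge", "Start"]
parents = {
    "Fuel":     [],
    "Battery":  [],                      # p(b|f) = p(b)
    "TurnOver": ["Battery"],             # p(t|b,f) = p(t|b)
    "Gauge":    ["Fuel", "Battery"],     # p(g|f,b,t) = p(g|f,b)
    "Start":    ["Fuel", "TurnOver"],    # p(s|f,b,t,g) = p(s|f,t)
}

# Every parent appears earlier in the ordering, so the graph has no cycles.
for i, var in enumerate(order):
    assert all(order.index(p) < i for p in parents[var]), f"{var} has a later parent"

factorization = " ".join(
    f"p({v[0].lower()}|{','.join(p[0].lower() for p in parents[v])})"
    if parents[v] else f"p({v[0].lower()})"
    for v in order)
print("p(f,b,t,g,s) =", factorization)
# -> p(f,b,t,g,s) = p(f) p(b) p(t|b) p(g|f,b) p(s|f,t)
```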

Why is this the wrong way? Variable order can be critical.
[Figure: the denser network produced by a poor ordering over Battery, TurnOver, Start, Fuel, Gauge.]

A better way: Use causal knowledge
[Figure: the causal network over Battery, Fuel, TurnOver, Gauge, Start.]

Conditional Independence Simplifies Probabilistic Inference
[Figure: the car-start network over Fuel, Gauge, Battery, TurnOver, Start.]

Online Troubleshooters

Define Problem

Gather Information

Get Recommendations

(see Breese & Heckerman, 1996) Portion of BN for print troubleshooting

Office Assistant 97

Lumière Project
[Figure: a model linking the User's Goals and User's Needs to observed User Activity over time.]
(see Horvitz, Breese, Heckerman, Hovel & Rommelse, 1998)

Studies with Human Subjects
- “Wizard of Oz” experiments at MS Usability Labs
[Figure: an inexperienced user's actions are relayed to a hidden expert advisor, who returns typed advice.]

Activities with Relevance to User's Needs
Several classes of evidence:
- Search: e.g., menu surfing
- Introspection: e.g., sudden pause, slowing of command stream
- Focus of attention: e.g., selected objects
- Undesired effects: e.g., command/undo, dialogue opened and cancelled
- Inefficient command sequences
- Goal-specific sequences of actions

Summary so far
Bayes nets are useful because...
- They encode independence explicitly
  - more parsimonious models (a quick parameter count is sketched below)
  - efficient inference
- They encode independence graphically
  - Easier explanation
  - Easier encoding
- They sometimes correspond to causal models
  - Easier explanation
  - Easier encoding
  - Modularity leads to easier maintenance
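
As a quick check on the "more parsimonious models" point, the snippet below counts free parameters for the five-variable car-start network (all variables binary): the full joint needs 2^5 - 1 = 31, while the factorized form needs one free parameter per parent configuration of each node, 1 + 1 + 2 + 4 + 4 = 12.

```python
# Number of (binary) parents per node in the car-start network.
parents = {
    "Fuel": 0, "Battery": 0,
    "TurnOver": 1, "Gauge": 2, "Start": 2,
}

full_joint = 2 ** len(parents) - 1                     # 31 free parameters
factorized = sum(2 ** k for k in parents.values())     # 1 + 1 + 2 + 4 + 4 = 12
print(full_joint, factorized)
```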

Teenage Bayes
MICRONEWS 97: Microsoft Researchers Exchange Brainpower with Eighth-grader: Teenager Designs Award-Winning Science Project. For her science project, which she called "Dr. Sigmund Microchip," Tovar wanted to create a computer program to diagnose the probability of certain personality types. With only answers from a few questions, the program was able to accurately diagnose the correct personality type 90 percent of the time.

Artificial Intelligence is a promising field... always was, always will be.