
1 Lecture 1 is based on David Heckerman's Tutorial slides (Microsoft Research). Bayesian Networks, Lecture 1: Basics and Knowledge-Based Construction. Requirements: 50% homework; 50% exam or a project

2 What I hope you will get out of this course...
- What are Bayesian networks?
- Why do we use them?
- How do we build them by hand?
- How do we build them from data?
- What are some applications?
- What is their relationship to other models?
- What are the properties of conditional independence that make these models appropriate?
- Usage in genetic linkage analysis

3 Applications of hand-built Bayes nets
- Answer Wizard 95, Office Assistant 97, 2000
- Troubleshooters in Windows 98
- Lymph node pathology
- Trauma care
- NASA mission control
Some applications of learned Bayes nets
- Clustering users on the web (MSNBC)
- Classifying text (spam filtering)

4 Some factors that support intelligence
- Knowledge representation
- Reasoning
- Learning / adapting

5 Artificial Intelligence

6 Artificial Intelligence is better than none!

7 Artificial Intelligence is better than ours!

8 Outline for today
- Basics
- Knowledge-based construction
- Probabilistic inference
- Applications of hand-built BNs at Microsoft

9 Bayesian Networks: History
- 1920s: Wright -- analysis of crop failure
- 1950s: I. J. Good -- causality
- Early 1980s: Howard and Matheson, Pearl
- Other names:
  - directed acyclic graphical (DAG) models
  - belief networks
  - causal networks
  - probabilistic networks
  - influence diagrams
  - knowledge maps

10 Bayesian Network
[Figure: directed acyclic graph with nodes Battery, Fuel, EngineTurnsOver, FuelGauge, and Start, annotated with the local distributions p(b), p(f), p(t|b), p(g|f,b), and p(s|f,t)]
A directed acyclic graph, annotated with probability distributions.

11 BN structure: Definition
Missing arcs encode independencies such that the joint distribution factorizes into the local distributions:
p(f,b,t,g,s) = p(f) p(b) p(t|b) p(g|f,b) p(s|f,t)   (*)
[Figure: the same DAG with nodes Battery, Fuel, EngineTurnsOver, FuelGauge, and Start]
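The factorization encoded by the missing arcs can be sketched directly in code. A minimal Python sketch of the car-start network: the structure follows the slides and the Start CPT matches the table on slide 14, but every other number is an invented placeholder.

```python
from itertools import product

# Car-start Bayesian network. Structure (which variable depends on which)
# follows the lecture; the Start CPT matches slide 14, but all other
# numbers are invented placeholders for illustration.
def p_f(f):                      # Fuel: True = "not empty"
    return 0.95 if f else 0.05

def p_b(b):                      # Battery charged
    return 0.98 if b else 0.02

def p_t(t, b):                   # p(TurnOver = t | Battery = b)
    q = 0.97 if b else 0.0
    return q if t else 1 - q

def p_g(g, f, b):                # p(Gauge reads "not empty" | Fuel, Battery)
    q = 0.99 if (f and b) else 0.1
    return q if g else 1 - q

def p_s(s, f, t):                # p(Start = s | Fuel, TurnOver), slide 14
    q = 0.99 if (f and t) else 0.0
    return q if s else 1 - q

def joint(f, b, t, g, s):
    # p(f,b,t,g,s) = p(f) p(b) p(t|b) p(g|f,b) p(s|f,t)
    return p_f(f) * p_b(b) * p_t(t, b) * p_g(g, f, b) * p_s(s, f, t)

# Sanity check: the factorization defines a proper joint distribution.
total = sum(joint(*v) for v in product([True, False], repeat=5))
```

Note the parsimony: 1 + 1 + 2 + 4 + 4 = 12 free parameters, versus 2^5 - 1 = 31 for an unrestricted joint over five binary variables.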

12 Independencies in a Bayes net
Many other independencies are entailed by (*): they can be read from the graph using d-separation (Pearl).
Example: Battery and Fuel are marginally independent -- no arc connects them, and their only connecting paths pass through unobserved common descendants.
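An entailed independence can be verified by brute-force enumeration. The sketch below rebuilds the joint with the same assumed numbers (only the Start CPT comes from the slides) and checks that Battery and Fuel are marginally independent, as d-separation predicts.

```python
from itertools import product

TF = [True, False]

def joint(f, b, t, g, s):
    # Car-start network; all numbers except the Start CPT are assumptions.
    pf = 0.95 if f else 0.05
    pb = 0.98 if b else 0.02
    qt = 0.97 if b else 0.0
    pt = qt if t else 1 - qt
    qg = 0.99 if (f and b) else 0.1
    pg = qg if g else 1 - qg
    qs = 0.99 if (f and t) else 0.0
    ps = qs if s else 1 - qs
    return pf * pb * pt * pg * ps

def prob(pred):
    """Probability of an event, by summing the joint over all 2^5 states."""
    return sum(joint(*v) for v in product(TF, repeat=5) if pred(*v))

# d-separation claims Battery _|_ Fuel with nothing observed:
# p(F = f, B = b) should equal p(F = f) * p(B = b) for every f, b.
ok = all(
    abs(prob(lambda F, B, T, G, S, f=f, b=b: F == f and B == b)
        - prob(lambda F, B, T, G, S, f=f: F == f)
        * prob(lambda F, B, T, G, S, b=b: B == b)) < 1e-12
    for f in TF for b in TF
)
```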

13 Explaining Away and Induced Dependencies
Fuel and TurnOver are both parents of Start. Observing Start makes them dependent ("induced dependencies"); once Start = no is known, learning that the engine does turn over makes an empty tank more likely ("explaining away").
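Explaining away can be seen numerically in the same toy network (all numbers other than the Start CPT are assumptions): given Start = no, observing TurnOver = yes rules out the turnover explanation for the failure, so the posterior on an empty tank rises.

```python
from itertools import product

TF = [True, False]

def joint(f, b, t, g, s):
    # Car-start network; all numbers except the Start CPT are assumptions.
    pf = 0.95 if f else 0.05
    pb = 0.98 if b else 0.02
    qt = 0.97 if b else 0.0
    pt = qt if t else 1 - qt
    qg = 0.99 if (f and b) else 0.1
    pg = qg if g else 1 - qg
    qs = 0.99 if (f and t) else 0.0
    ps = qs if s else 1 - qs
    return pf * pb * pt * pg * ps

def prob(pred):
    return sum(joint(*v) for v in product(TF, repeat=5) if pred(*v))

# p(Fuel = empty | Start = no)
p_empty_given_fail = (
    prob(lambda F, B, T, G, S: not F and not S) /
    prob(lambda F, B, T, G, S: not S)
)
# p(Fuel = empty | Start = no, TurnOver = yes)
p_empty_given_fail_turns = (
    prob(lambda F, B, T, G, S: not F and not S and T) /
    prob(lambda F, B, T, G, S: not S and T)
)
```

With these placeholder numbers the posterior on an empty tank rises from roughly 0.47 to roughly 0.84 once the turnover is observed.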

14 Local distributions
Table for p(Start = yes | TurnOver, Fuel), with TurnOver in {yes, no} and Fuel in {empty, not empty}:
p(S=y | T=n, F=e) = 0.0
p(S=y | T=n, F=n) = 0.0
p(S=y | T=y, F=e) = 0.0
p(S=y | T=y, F=n) = 0.99

15 Local distributions
Tree for the same distribution: test TurnOver first. If TurnOver = no, p(start) = 0 regardless of Fuel. If TurnOver = yes, branch on Fuel: p(start) = 0 when empty, p(start) = 0.99 when not empty.
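The table (slide 14) and the tree (slide 15) encode the same local distribution; the tree is just more compact because Fuel only matters when the engine turns over. A quick check:

```python
# p(Start = yes | TurnOver, Fuel): slide 14 table vs. slide 15 tree.
# Keys are (TurnOver, Fuel) with values "y"/"n" and "e" (empty) / "n" (not).
TABLE = {
    ("n", "e"): 0.0,
    ("n", "n"): 0.0,
    ("y", "e"): 0.0,
    ("y", "n"): 0.99,
}

def tree(turn_over, fuel):
    # The tree tests TurnOver first; Fuel is examined only on the "yes" branch.
    if turn_over == "n":
        return 0.0
    return 0.0 if fuel == "e" else 0.99

# Both representations agree on every parent configuration.
match = all(tree(t, f) == p for (t, f), p in TABLE.items())
```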

16 Lots of possibilities for a local distribution p(node | parents)...
- y = discrete node: any probabilistic classifier
  - Decision tree
  - Neural net
- y = continuous node: any probabilistic regression model
  - Linear regression with Gaussian noise
  - Neural net

17 Naïve Bayes Classifier
[Figure: a discrete Class node with arcs to Input 1, Input 2, ..., Input n]
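The slides mention spam filtering as a learned application, and a naive Bayes classifier is exactly this network: p(Class | inputs) is proportional to p(Class) times the product of per-input likelihoods. The sketch below is hand-specified in that spirit; the spam/ham classes, the words, and every number are invented for illustration.

```python
# Naive Bayes: p(Class | inputs) proportional to p(Class) * prod_i p(input_i | Class).
# All priors, words, and likelihoods below are invented assumptions.
prior = {"spam": 0.4, "ham": 0.6}
likelihood = {                      # p(word present | class)
    "spam": {"offer": 0.7, "meeting": 0.1},
    "ham":  {"offer": 0.1, "meeting": 0.6},
}

def posterior(words_present):
    """Posterior over classes given which words are present/absent."""
    scores = {}
    for c in prior:
        s = prior[c]
        for w, present in words_present.items():
            p = likelihood[c][w]
            s *= p if present else 1 - p
        scores[c] = s
    z = sum(scores.values())        # normalize
    return {c: v / z for c, v in scores.items()}

post = posterior({"offer": True, "meeting": False})
```

The conditional-independence assumption (inputs independent given the class) is what keeps the parameter count linear in the number of inputs.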

18 Hidden Markov Model
[Figure: a chain of discrete hidden nodes H1 -> H2 -> H3 -> H4 -> H5 -> ..., each Ht emitting an observation Xt]
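An HMM is a Bayes net with exactly this chain structure, so inference exploits the same factorization. A minimal forward-algorithm sketch for computing p(x_1, ..., x_T); the two hidden states, binary observations, and all parameter values are invented placeholders.

```python
from itertools import product

# Forward algorithm:
#   alpha_1(h) = p(H1 = h) * p(x_1 | h)
#   alpha_t(h) = p(x_t | h) * sum_h' alpha_{t-1}(h') * p(h | h')
states = [0, 1]
pi    = [0.5, 0.5]                  # p(H1)
trans = [[0.8, 0.2], [0.3, 0.7]]    # trans[h'][h] = p(H_t = h | H_{t-1} = h')
emit  = [[0.9, 0.1], [0.2, 0.8]]    # emit[h][x]   = p(X_t = x | H_t = h)

def forward(xs):
    alpha = [pi[h] * emit[h][xs[0]] for h in states]
    for x in xs[1:]:
        alpha = [emit[h][x] * sum(alpha[h2] * trans[h2][h] for h2 in states)
                 for h in states]
    return sum(alpha)               # p(x_1, ..., x_T)

# Sanity check: likelihoods of all length-3 observation sequences sum to 1.
total = sum(forward(list(xs)) for xs in product([0, 1], repeat=3))
```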

19 Feed-Forward Neural Network
[Figure: inputs X1, X2, X3 feeding a hidden layer of sigmoid units, which feeds binary outputs Y1, Y2, Y3]
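Read as a probabilistic model, a sigmoid output unit gives p(Y = 1 | inputs). A tiny one-hidden-layer sketch; all weights are arbitrary, untrained assumptions.

```python
import math

# One-hidden-layer sigmoid network as in the slide; the output unit is
# interpreted as p(Y = 1 | x). Weights are arbitrary placeholders.
def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

W1 = [[0.5, -0.3], [0.8, 0.1]]   # hidden-layer weights (2 units, 2 inputs)
b1 = [0.0, 0.1]                  # hidden-layer biases
W2 = [1.2, -0.7]                 # output weights
b2 = -0.2                        # output bias

def predict(x):
    hidden = [sigmoid(sum(w * xi for w, xi in zip(row, x)) + b)
              for row, b in zip(W1, b1)]
    return sigmoid(sum(w * h for w, h in zip(W2, hidden)) + b2)

p = predict([1.0, 0.0])          # a probability strictly between 0 and 1
```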

20 Outline
- Basics
- Knowledge-based construction
- Probabilistic inference
- Decision making
- Applications of hand-built BNs at Microsoft

21 Building a Bayes net by hand (ok, now we're starting to be Bayesian)
- Define variables
- Assess the structure
- Assess the local probability distributions

22 What is a variable?
A set of collectively exhaustive, mutually exclusive values, e.g. {Error Occurred, No Error}.

23 Clarity test: Is the variable knowable in principle?
- Is it raining? {Where, when, how many inches?}
- Is it hot? {T >= 100°F, T < 100°F}
- Is the user's personality dominant or submissive? {numerical result of standardized personality test}

24 Assessing structure (one approach)
- Choose an ordering x1, ..., xn for the variables
- For each variable, identify parents Pa_i such that p(x_i | x_1, ..., x_{i-1}) = p(x_i | Pa_i)

25 Example
Variables: Fuel, Gauge, Start, Battery, TurnOver. Take the ordering F, B, T, G, S.

26 Example
p(f)

27 Example
p(f)
p(b|f) = p(b)

28 Example
p(f)
p(b|f) = p(b)
p(t|b,f) = p(t|b)

29 Example
p(f)
p(b|f) = p(b)
p(t|b,f) = p(t|b)
p(g|f,b,t) = p(g|f,b)

30 Example
p(f)
p(b|f) = p(b)
p(t|b,f) = p(t|b)
p(g|f,b,t) = p(g|f,b)
p(s|f,b,t,g) = p(s|f,t)
p(f,b,t,g,s) = p(f) p(b) p(t|b) p(g|f,b) p(s|f,t)

31 Why is this the wrong way? Variable order can be critical
[Figure: the same variables under a poor ordering -- Battery, TurnOver, Start, Fuel, Gauge -- which yields a denser graph]

32 A better way: Use causal knowledge
[Figure: causal graph over Fuel, Gauge, Start, Battery, TurnOver]

33 Conditional Independence Simplifies Probabilistic Inference
[Figure: the car-start network with nodes Fuel, FuelGauge, Battery, TurnOver, Start]
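Concretely, the factorization lets sums be pushed inside products. The sketch below computes p(Start = yes) two ways in the car-start network: brute-force enumeration over all parent configurations, and an elimination order that sums Gauge and Battery out early. As before, every number except the Start CPT is an assumption.

```python
from itertools import product

TF = [True, False]
def p_f(f): return 0.95 if f else 0.05      # assumed
def p_b(b): return 0.98 if b else 0.02      # assumed
def p_t(t, b):
    q = 0.97 if b else 0.0                  # assumed
    return q if t else 1 - q
def p_g(g, f, b):
    q = 0.99 if (f and b) else 0.1          # assumed
    return q if g else 1 - q
def p_s(s, f, t):
    q = 0.99 if (f and t) else 0.0          # slide 14 table
    return q if s else 1 - q

def p_start_enum(s):
    # Brute force: sum the full joint over Fuel, Battery, TurnOver, Gauge.
    return sum(p_f(f) * p_b(b) * p_t(t, b) * p_g(g, f, b) * p_s(s, f, t)
               for f, b, t, g in product(TF, repeat=4))

def p_start_elim(s):
    # Gauge sums out to 1, and Battery matters only through p(TurnOver),
    # so precompute p(t) = sum_b p(b) p(t|b) and work with small factors.
    p_t_marg = {t: sum(p_b(b) * p_t(t, b) for b in TF) for t in TF}
    return sum(p_f(f) * p_t_marg[t] * p_s(s, f, t) for f in TF for t in TF)

same = abs(p_start_enum(True) - p_start_elim(True)) < 1e-12
```

The same idea, applied systematically, is variable elimination; the savings grow exponentially with the number of variables that can be summed out early.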

34 Online Troubleshooters

35 Define Problem

36 Gather Information

37 Get Recommendations

38 Portion of BN for print troubleshooting (see Breese & Heckerman, 1996)

39 Office Assistant 97

40 Lumière Project
[Figure: BN fragment in which User's Goals and User's Needs influence User Activity] (see Horvitz, Breese, Heckerman, Hovel & Rommelse, 1998)

41 Studies with Human Subjects
- "Wizard of Oz" experiments at MS Usability Labs
[Figure: an expert advisor sends typed advice to an inexperienced user, whose actions are observed]

42 Activities with Relevance to User's Needs
Several classes of evidence:
- Search: e.g., menu surfing
- Introspection: e.g., sudden pause, slowing of command stream
- Focus of attention: e.g., selected objects
- Undesired effects: e.g., command/undo, dialogue opened and cancelled
- Inefficient command sequences
- Goal-specific sequences of actions

43 Summary so far
Bayes nets are useful because...
- They encode independence explicitly
  - more parsimonious models
  - efficient inference
- They encode independence graphically
  - easier explanation
  - easier encoding
- They sometimes correspond to causal models
  - easier explanation
  - easier encoding
  - modularity leads to easier maintenance

44 Teenage Bayes
MICRONEWS 97: Microsoft Researchers Exchange Brainpower with Eighth-Grader; Teenager Designs Award-Winning Science Project. For her science project, which she called "Dr. Sigmund Microchip," Tovar wanted to create a computer program to diagnose the probability of certain personality types. With answers from only a few questions, the program was able to diagnose the correct personality type 90 percent of the time.

45 Artificial Intelligence is a promising field -- always was, always will be.

