1 CS 4100 Artificial Intelligence Prof. C. Hafner Class Notes March 20, 2012

2 Outline
– Midterm planning problem: solution http://www.ccs.neu.edu/course/cs4100sp12/classnotes/midterm-planning.doc
– Discuss term projects
– Continue uncertain reasoning in AI
  – Probability distribution (review)
  – Conditional Probability and the Chain Rule (cont.)
  – Bayes’ Rule
  – Independence, “Expert” systems and the combinatorics of joint probabilities
  – Bayes networks
  – Assignment 6

3 Term Projects – The Process
1. Form teams of 3 or 4 people – 10-12 teams
2. Before next class (Mar 20), each team sends an email with:
   a. Team name and a main contact person (email)
   b. All team members’ names and email addresses
   c. Optionally, a topic reservation (topics are reserved first-come, first-served)
3. Brief written project proposal due Fri March 23, 10pm (email)
4. Each team will:
   a. Submit a written project report (due April 17, last day of class)
   b. Submit a running computer application (due April 17, last day of class)
   c. Make a 15-minute presentation on their project (April 12 & 17)
5. Attendance is required and will be taken on April 12 & 17

4 Term Projects – The Content
1. Select a domain
2. Model the domain
   a. “Logical/state model”: define an ontology w/ example world state
   b. Implementation in Protégé – demo with some queries
   c. “Dynamics model” (of how the world changes), using the Situation Calculus formalism or STRIPS-type operators
3. Define and solve example planning problems: initial state → goal state
   a. Specify planning axioms or STRIPS-type operators (see the sketch after this list)
   b. Show (on paper) a proof or derivation of a trivial plan, and then a more challenging one, using resolution or the POP algorithm
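For concreteness, here is a minimal sketch of how STRIPS-type operators might be encoded. This is only one possible representation, not a project requirement; the blocks-world operator, predicate strings, and function names are all invented for illustration.

```python
# A minimal sketch of STRIPS-type operators (illustrative only; the
# blocks-world example and all names are invented, not project spec).
from dataclasses import dataclass

@dataclass(frozen=True)
class Operator:
    name: str
    preconds: frozenset   # literals that must hold before the action
    add_list: frozenset   # literals the action makes true
    del_list: frozenset   # literals the action makes false

def apply_op(op: Operator, state: frozenset) -> frozenset:
    """Apply op to state if its preconditions hold; else leave state unchanged."""
    if op.preconds <= state:
        return (state - op.del_list) | op.add_list
    return state

# A trivial one-step plan: stack block A on block B.
stack_a_on_b = Operator(
    name="Stack(A, B)",
    preconds=frozenset({"Clear(A)", "Clear(B)", "OnTable(A)"}),
    add_list=frozenset({"On(A, B)"}),
    del_list=frozenset({"Clear(B)", "OnTable(A)"}),
)
init = frozenset({"Clear(A)", "Clear(B)", "OnTable(A)", "OnTable(B)"})
goal = frozenset({"On(A, B)"})
print(goal <= apply_op(stack_a_on_b, init))   # True: the goal is achieved
```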

5 Ontology Design Example: Protégé
Simplest example: Dog project
Cooking ontology
– Overall design
– Implement class taxonomy
– Slots representing data types
– Slots containing relationships

6 Cooking ontology (for meal or party planning)
FoodItem
– Taxonomy can include Dairy, Meat, Starch, Veg, Fruit, Sweets; a higher level can be Protein, Carbs
– Should include nuts, due to possible allergies
Dish
– Taxonomy can be Appetizer, Main Course, Salad, Dessert
– A Dish has Ingredients, which are instances of FoodItem
Recipe
– Has Servings (a number)
– Has Steps; each step includes a FoodItem, an Amount, and a Prep
– An Amount is a number and units
– Prep is a string
Relationships
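Before committing the model to Protégé, the class/slot structure above can be prototyped in a few lines of Python to sanity-check the design. This is only a sketch mirroring the slide's slots; the field names are assumptions, not the Protégé model itself.

```python
# Sketch of the cooking ontology as plain Python classes (a design
# prototype mirroring the slide's slots, not the actual Protégé model).
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class FoodItem:               # subclasses: Dairy, Meat, Starch, Veg, Fruit, Sweets
    name: str
    contains_nuts: bool = False   # flagged because of possible allergies

@dataclass
class Amount:                 # an Amount is a number and units
    quantity: float
    units: str

@dataclass
class Step:                   # each step: a FoodItem, an Amount, and a Prep
    item: FoodItem
    amount: Amount
    prep: str                 # Prep is a string, e.g. "diced"

@dataclass
class Recipe:
    servings: int             # Has Servings (a number)
    steps: List[Step] = field(default_factory=list)

@dataclass
class Dish:                   # subclasses: Appetizer, MainCourse, Salad, Dessert
    name: str
    ingredients: List[FoodItem] = field(default_factory=list)
    recipe: Optional[Recipe] = None
```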

8 Bayes' Rule
Product rule: P(a ∧ b) = P(a | b) P(b) = P(b | a) P(a)
⇒ Bayes' rule: P(a | b) = P(b | a) P(a) / P(b)
or in distribution form: P(Y | X) = P(X | Y) P(Y) / P(X) = αP(X | Y) P(Y)
Useful for assessing diagnostic probability from causal probability:
P(Cause | Effect) = P(Effect | Cause) P(Cause) / P(Effect)
P(Disease | Symptom) = P(Symptom | Disease) P(Disease) / P(Symptom)
– E.g., let M be meningitis, S be stiff neck:
P(m | s) = P(s | m) P(m) / P(s) = 0.8 × 0.0001 / 0.1 = 0.0008
– Note: the posterior probability of meningitis is still very small!
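The arithmetic in the meningitis example is easy to check directly; a few lines of Python (numbers taken from the slide) confirm it:

```python
# Check the slide's meningitis example: P(m|s) = P(s|m) P(m) / P(s).
p_s_given_m = 0.8     # P(stiff neck | meningitis)
p_m = 0.0001          # prior P(meningitis)
p_s = 0.1             # P(stiff neck)

p_m_given_s = p_s_given_m * p_m / p_s
print(p_m_given_s)    # ~0.0008 -- the posterior is still very small
```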

14 Bayes' Rule and conditional independence
P(Cavity | toothache ∧ catch)
= αP(toothache ∧ catch | Cavity) P(Cavity)
= αP(toothache | Cavity) P(catch | Cavity) P(Cavity)
We say: “toothache and catch are independent, given cavity.”
This is an example of a naïve Bayes model. We will study this later as our simplest machine learning application.
P(Cause, Effect_1, …, Effect_n) = P(Cause) Π_i P(Effect_i | Cause)
The total number of parameters is linear in n (the number of symptoms). This is our first Bayesian inference net.
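As a quick illustration of the factorization above, the sketch below multiplies out P(Cause, Effect_1, …, Effect_n); the probability values are invented placeholders, not course data.

```python
# Sketch of the naive Bayes factorization:
# P(Cause, Effect_1, ..., Effect_n) = P(Cause) * prod_i P(Effect_i | Cause).
# All numbers are invented placeholders for illustration.
from math import prod

p_cavity = 0.2
p_effect_given_cavity = {"toothache": 0.6, "catch": 0.9}   # P(effect_i | cavity)

def joint(cause_prob: float, effect_probs: dict) -> float:
    """Probability that the cause and all effects occur together."""
    return cause_prob * prod(effect_probs.values())

print(joint(p_cavity, p_effect_given_cavity))   # 0.2 * 0.6 * 0.9 ~= 0.108
```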

15 Conditional independence
P(Toothache, Cavity, Catch) has 2³ − 1 = 7 independent entries.
If I have a cavity, the probability that the probe catches in it doesn't depend on whether I have a toothache:
(1) P(catch | toothache, cavity) = P(catch | cavity)
The same independence holds if I haven't got a cavity:
(2) P(catch | toothache, ¬cavity) = P(catch | ¬cavity)
Catch is conditionally independent of Toothache given Cavity:
P(Catch | Toothache, Cavity) = P(Catch | Cavity)
Equivalent statements (from the original definitions of independence):
P(Toothache | Catch, Cavity) = P(Toothache | Cavity)
P(Toothache, Catch | Cavity) = P(Toothache | Cavity) P(Catch | Cavity)

16 Conditional independence contd.
Write out the full joint distribution using the chain rule:
P(Toothache, Catch, Cavity)
= P(Toothache | Catch, Cavity) P(Catch, Cavity)
= P(Toothache | Catch, Cavity) P(Catch | Cavity) P(Cavity)
= P(Toothache | Cavity) P(Catch | Cavity) P(Cavity)
I.e., 2 + 2 + 1 = 5 independent numbers.
In most cases, the use of conditional independence reduces the size of the representation of the joint distribution from exponential in n to linear in n.
Conditional independence is our most basic and robust form of knowledge about uncertain environments.
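To see that 5 numbers really do determine all 2³ entries, this small sketch rebuilds the full joint from the factored form; the five probability values are invented for illustration.

```python
# Rebuild the full joint P(Toothache, Catch, Cavity) from the
# 2 + 2 + 1 = 5 independent numbers counted above (values invented).
from itertools import product

p_cavity = 0.2                        # 1 number: P(cavity)
p_tooth = {True: 0.6, False: 0.1}     # 2 numbers: P(toothache | Cavity=v)
p_catch = {True: 0.9, False: 0.2}     # 2 numbers: P(catch | Cavity=v)

def p(toothache: bool, catch: bool, cavity: bool) -> float:
    pt = p_tooth[cavity] if toothache else 1 - p_tooth[cavity]
    pc = p_catch[cavity] if catch else 1 - p_catch[cavity]
    pv = p_cavity if cavity else 1 - p_cavity
    return pt * pc * pv               # P(T | V) P(C | V) P(V)

# The 8 reconstructed entries sum to 1, so this is a valid distribution.
total = sum(p(t, c, v) for t, c, v in product([True, False], repeat=3))
print(round(total, 10))               # 1.0
```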

17 Remember this example

18 Example of conditional independence

21 Test your understanding of the Chain Rule

22 This is our second Bayesian inference net

31 How to construct a Bayes Net
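The slide itself is a figure; as a minimal illustration of what constructing a Bayes net amounts to in code, here is one possible representation of a net as (parents, CPT) pairs. The burglary/alarm topology and its numbers are the standard AIMA textbook example, assumed here for illustration rather than taken from the slide.

```python
# Minimal sketch of a Bayes net as {variable: (parents, CPT)}. The
# burglary/alarm network and its numbers are the standard AIMA textbook
# example, used here only as an assumed illustration.
# Each CPT maps a tuple of parent values -> P(variable = True | parents).
bayes_net = {
    "Burglary":   ((), {(): 0.001}),
    "Earthquake": ((), {(): 0.002}),
    "Alarm": (("Burglary", "Earthquake"),
              {(True, True): 0.95, (True, False): 0.94,
               (False, True): 0.29, (False, False): 0.001}),
    "JohnCalls": (("Alarm",), {(True,): 0.90, (False,): 0.05}),
    "MaryCalls": (("Alarm",), {(True,): 0.70, (False,): 0.01}),
}

def prob(var: str, value: bool, assignment: dict) -> float:
    """P(var = value | values of var's parents in assignment)."""
    parents, cpt = bayes_net[var]
    p_true = cpt[tuple(assignment[p] for p in parents)]
    return p_true if value else 1 - p_true
```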

38 Test your understanding: design a Bayes net with plausible numbers

39 Calculating using Bayes’ Nets
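The calculation slides themselves are figures; as a sketch of the standard technique (inference by enumeration), the code below continues the burglary/alarm example above: every entry of the joint is a product of CPT entries, and a query is answered by summing out the hidden variables. It assumes the `bayes_net` and `prob` definitions from the earlier sketch.

```python
# Inference by enumeration, continuing the burglary/alarm sketch above
# (assumes bayes_net and prob are already defined).
from itertools import product

VARS = ["Burglary", "Earthquake", "Alarm", "JohnCalls", "MaryCalls"]

def joint(assignment: dict) -> float:
    """P(x1,...,xn) = prod_i P(xi | parents(xi)) -- chain rule for Bayes nets."""
    result = 1.0
    for var in VARS:
        result *= prob(var, assignment[var], assignment)
    return result

def query(var: str, evidence: dict) -> dict:
    """Posterior distribution P(var | evidence), summing out hidden variables."""
    hidden = [v for v in VARS if v != var and v not in evidence]
    unnormalized = {}
    for value in (True, False):
        total = 0.0
        for combo in product([True, False], repeat=len(hidden)):
            a = {**evidence, **dict(zip(hidden, combo)), var: value}
            total += joint(a)
        unnormalized[value] = total
    alpha = sum(unnormalized.values())
    return {v: p / alpha for v, p in unnormalized.items()}

print(query("Burglary", {"JohnCalls": True, "MaryCalls": True}))
# {True: ~0.284, False: ~0.716} -- the classic AIMA result
```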
