
Presentation on theme: "3/31. Notice that sampling methods could in general be used even when we don’t know the bayes net (and are just observing the world)!  We should strive."— Presentation transcript:

1 3/31

2 Notice that sampling methods could in general be used even when we don't know the Bayes net (and are just observing the world)! We should strive to make the sampling more efficient given that we do know the Bayes net.


10 Generating a Sample from the Network
Network → Samples → Joint distribution. Note that the sample is generated in causal order, so all the required probabilities can be read off directly from the CPTs! (To see how useful this is, consider generating the sample in the order WetGrass, Rain, Sprinkler: to sample WG we would first need P(WG), then P(Rain|WG), then P(Sp|Rain,WG), and NONE of these are directly given in the CPTs; they would have to be computed.) Note that in MCMC we do (re)sample a node given its Markov blanket, which is not in causal order, since the blanket contains the node's children and their parents.
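The causal-order sampling described above can be sketched in a few lines. The CPT numbers below are the standard textbook values for the cloudy/sprinkler/rain/wet-grass example; both the numbers and the function names are illustrative assumptions, not code from the slides.

```python
import random

# Assumed CPTs for the classic cloudy/sprinkler/rain/wet-grass network
# (standard textbook numbers, not taken from the slides).
P_C = 0.5                                    # P(Cloudy)
P_S = {True: 0.10, False: 0.50}              # P(Sprinkler | Cloudy)
P_R = {True: 0.80, False: 0.20}              # P(Rain | Cloudy)
P_W = {(True, True): 0.99, (True, False): 0.90,
       (False, True): 0.90, (False, False): 0.00}  # P(WetGrass | Sprinkler, Rain)

def prior_sample(rng=random):
    """Sample every variable in causal (topological) order, so each
    probability needed can be read directly off a CPT."""
    c = rng.random() < P_C
    s = rng.random() < P_S[c]
    r = rng.random() < P_R[c]
    w = rng.random() < P_W[(s, r)]
    return {"Cloudy": c, "Sprinkler": s, "Rain": r, "WetGrass": w}
```

Sampling in the reverse order (WetGrass first) would instead require P(WG), P(Rain|WG), and P(Sp|Rain,WG), none of which appear in any CPT.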


12 That is, the rejection sampling method doesn't really use the Bayes network that much: the network is consulted only to generate candidate samples, which are then discarded whenever they contradict the evidence.
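A minimal sketch of rejection sampling makes the point concrete: evidence is handled purely by throwing samples away, so the CPTs are never used to score the evidence. The CPT numbers and names are assumed textbook values, as before.

```python
import random

# Assumed textbook CPTs for the cloudy/sprinkler/rain/wet-grass network.
P_C = 0.5
P_S = {True: 0.10, False: 0.50}              # P(Sprinkler | Cloudy)
P_R = {True: 0.80, False: 0.20}              # P(Rain | Cloudy)
P_W = {(True, True): 0.99, (True, False): 0.90,
       (False, True): 0.90, (False, False): 0.00}  # P(WetGrass | Sprinkler, Rain)

def prior_sample(rng):
    """Forward-sample all variables in causal order."""
    c = rng.random() < P_C
    s = rng.random() < P_S[c]
    r = rng.random() < P_R[c]
    w = rng.random() < P_W[(s, r)]
    return {"Cloudy": c, "Sprinkler": s, "Rain": r, "WetGrass": w}

def rejection_query(query, evidence, n, rng):
    """Estimate P(query=True | evidence) by discarding every prior
    sample that contradicts the evidence."""
    hits = kept = 0
    for _ in range(n):
        samp = prior_sample(rng)
        if all(samp[v] == val for v, val in evidence.items()):
            kept += 1
            hits += samp[query]
    return hits / kept if kept else None
```

Note that most of the work is wasted when the evidence is unlikely, which is exactly why the method "doesn't use the network that much."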


14 Notice that to attach a likelihood weight to the evidence, we are using the CPTs of the Bayes net. (Model-free empirical observation, in contrast, either gives you a sample or it doesn't; we can't get fractional samples.)
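This is the essence of likelihood weighting: evidence variables are clamped rather than sampled, and the sample's weight is the product of the CPT probabilities of the evidence values. The sketch below is an assumed implementation over the same textbook cloudy/sprinkler/rain/wet-grass CPTs; the table-driven network layout is my own convenience, not the slides'.

```python
import random

# Network in causal order: (variable, parents, CPT giving P(var=True | parents)).
# CPT values are assumed textbook cloudy/sprinkler/rain/wet-grass numbers.
NETWORK = [
    ("Cloudy",    (),                    lambda ps: 0.5),
    ("Sprinkler", ("Cloudy",),           lambda ps: 0.10 if ps[0] else 0.50),
    ("Rain",      ("Cloudy",),           lambda ps: 0.80 if ps[0] else 0.20),
    ("WetGrass",  ("Sprinkler", "Rain"), lambda ps: {(True, True): 0.99,
                                                     (True, False): 0.90,
                                                     (False, True): 0.90,
                                                     (False, False): 0.00}[ps]),
]

def weighted_sample(evidence, rng):
    """Clamp evidence variables; the weight is the product of the CPT
    probabilities of the evidence values given their sampled parents.
    Those probabilities come straight from the model, which is why a
    model-free observer cannot produce such 'fractional' samples."""
    event, weight = {}, 1.0
    for var, parents, cpt in NETWORK:
        p_true = cpt(tuple(event[p] for p in parents))
        if var in evidence:
            event[var] = evidence[var]
            weight *= p_true if evidence[var] else 1 - p_true
        else:
            event[var] = rng.random() < p_true
    return event, weight

def lw_query(query, evidence, n, rng):
    """Estimate P(query=True | evidence) from weighted samples."""
    num = den = 0.0
    for _ in range(n):
        event, w = weighted_sample(evidence, rng)
        den += w
        if event[query]:
            num += w
    return num / den
```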


21 Notice that to attach a likelihood weight to the evidence, we are using the CPTs of the Bayes net. (Model-free empirical observation, in contrast, either gives you a sample or it doesn't; we can't get fractional samples.)


26 Note that the other parents of z_j's children are also part of the Markov blanket: P(rain | cl, sp, wg) ∝ P(rain | cl) · P(wg | sp, rain)
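That proportionality is exactly one Gibbs resampling step: Rain is resampled given its Markov blanket (parent Cloudy, child WetGrass, and the child's other parent Sprinkler), using only the CPTs of Rain and WetGrass. A minimal sketch, again assuming the standard textbook CPT values:

```python
import random

# Assumed textbook CPTs (same cloudy/sprinkler/rain/wet-grass network).
P_R = {True: 0.80, False: 0.20}              # P(Rain | Cloudy)
P_W = {(True, True): 0.99, (True, False): 0.90,
       (False, True): 0.90, (False, False): 0.00}  # P(WetGrass | Sprinkler, Rain)

def resample_rain(cloudy, sprinkler, wetgrass, rng):
    """One Gibbs step: P(Rain | MB) is proportional to
    P(Rain | Cloudy) * P(WetGrass | Sprinkler, Rain).
    Only the CPT of Rain and of its child WetGrass are touched."""
    def score(rain):
        p_r = P_R[cloudy] if rain else 1 - P_R[cloudy]
        p_w = P_W[(sprinkler, rain)] if wetgrass else 1 - P_W[(sprinkler, rain)]
        return p_r * p_w
    p_true = score(True) / (score(True) + score(False))
    return rng.random() < p_true
```

Note the two factors are multiplied unnormalized and then renormalized over the two values of Rain, which is why equality in the slide should be read as proportionality.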


28 First-order Logic

29 Ontological and epistemological commitments

Language          Ontological commitment    Epistemological commitment
Prop logic        facts                     t/f/u
Prob prop logic   facts                     degree of belief
FOPC              objects, relations        t/f/u
Prob FOPC         objects, relations        degree of belief

(Assertions themselves are t/f; the agent's attitude toward them may be true, false, or unknown.)

30 Expressiveness of Representations
Atomic → Propositional → Relational → First-order
–Atomic representations: states as black boxes.
–Propositional representations: states made up of state variables.
–Relational representations: states made up of objects and relations between them.
–First-order: there are functions which "produce" objects (so essentially an infinite set of objects).
Propositional can be compiled to atomic (with exponential blow-up). Relational can be compiled to propositional (with exponential blow-up) if there are no functions; with functions, we cannot compile relational representations into any finite propositional representation. "Higher-order" representations can (sometimes) be compiled to lower order.

31 Why FOPC
If your thesis is utterly vacuous
Use first-order predicate calculus.
With sufficient formality
The sheerest banality
Will be hailed by the critics: "Miraculous!"

32 4/2

33

34 Connection to propositional logic: think of "atomic sentences" as propositions, but with general object referents. We can't have predicates of predicates; thus "first-order."

35

36

37

38 Important facts about quantifiers
Forall and There-exists are related through negation:
–~[forall x P(x)] = exists x ~P(x)
–~[exists x P(x)] = forall x ~P(x)
Quantification is allowed only on variables:
–can't quantify over predicates; can't say [forall P Reflexive(P) iff forall x,y P(x,y) => P(y,x)] (you have to write it once per relation)
Order of quantifiers matters.
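The negation duality above can be checked mechanically over any finite domain, where forall and exists become `all` and `any`. The domain and predicate below are arbitrary illustrations:

```python
# Quantifier/negation duality over a finite domain:
# ~[forall x P(x)] == exists x ~P(x), and dually for exists.
domain = range(10)

def P(x):
    # An arbitrary predicate chosen for illustration.
    return x % 2 == 0

assert (not all(P(x) for x in domain)) == any(not P(x) for x in domain)
assert (not any(P(x) for x in domain)) == all(not P(x) for x in domain)
```

Of course this only verifies the laws on one finite model; in FOPC they hold over all interpretations.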

39 Family Values: Falwell vs. Mahabharata According to a recent CTC study, “….90% of the men surveyed said they will marry the same woman..” “…Jessica Alba.”

40 Caveat: Order of quantifiers matters
Exists x Forall y Loves(x,y): "either Fido loves both Fido and Tweety, or Tweety loves both Fido and Tweety."
Forall y Exists x Loves(x,y): "Fido or Tweety loves Fido; and Fido or Tweety loves Tweety."
Loves(x,y) means x loves y. Intuitively, x can depend on y when it is in the scope of the quantification on y (foreshadowing Skolemization).
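Over the two-element domain {Fido, Tweety} the two quantifier orders can be evaluated directly. The love relation below is made up purely to separate the two readings:

```python
# Order of quantifiers over the domain {Fido, Tweety}.
# loves[(x, y)] == True means "x loves y" (an arbitrary made-up relation:
# each creature loves only itself).
domain = ["Fido", "Tweety"]
loves = {("Fido", "Fido"): True,  ("Fido", "Tweety"): False,
         ("Tweety", "Fido"): False, ("Tweety", "Tweety"): True}

# Exists x Forall y Loves(x,y): some one creature loves everybody -- False here.
e_a = any(all(loves[(x, y)] for y in domain) for x in domain)

# Forall y Exists x Loves(x,y): everybody is loved by someone -- True here,
# because the witness x may depend on y (what Skolemization makes explicit).
a_e = all(any(loves[(x, y)] for x in domain) for y in domain)

assert not e_a and a_e
```

The implication only runs one way: exists-forall entails forall-exists, and this model shows the converse fails.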

41 Caveat: Decide whether a symbol is a predicate, constant, or function
Make sure you decide what your constants, predicates, and functions are. Once you decide something is a predicate, you cannot use it in a place where a predicate is not expected! In the previous example, you cannot say

42 More on writing sentences
Forall usually goes with implications (rarely with conjunctions). There-exists usually goes with conjunctions (rarely with implications).
Everyone at ASU is smart: Forall x At(x,ASU) => Smart(x)
Someone at UA is smart: Exists x At(x,UA) & Smart(x)
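The reason for this pairing shows up immediately on a tiny finite model. The people, places, and facts below are entirely made up for illustration:

```python
# A tiny hypothetical model illustrating why forall pairs with "=>" and
# exists pairs with "and" (all names and facts invented for the example).
people = ["ann", "bob", "cal"]
at_asu = {"ann", "bob"}
at_ua  = {"cal"}
smart  = {"ann", "bob"}      # everyone at ASU is smart; cal is not

def implies(p, q):
    return (not p) or q

# Correct: forall x At(x,ASU) => Smart(x)   -- True in this model.
assert all(implies(x in at_asu, x in smart) for x in people)
# Wrong:   forall x At(x,ASU) and Smart(x)  -- claims EVERYONE is at ASU and smart.
assert not all((x in at_asu) and (x in smart) for x in people)

# Correct: exists x At(x,UA) and Smart(x)   -- False here: cal is not smart.
assert not any((x in at_ua) and (x in smart) for x in people)
# Wrong:   exists x At(x,UA) => Smart(x)    -- vacuously True via anyone not at UA.
assert any(implies(x in at_ua, x in smart) for x in people)
```

The "wrong" forms fail in characteristic ways: conjunction under forall over-commits, and implication under exists is satisfied vacuously by any individual outside the antecedent.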

43 Apt-pet
An apartment pet is a pet that is small. Dog is a pet. Cat is a pet. Elephant is a pet. Dogs and cats are small. Some dogs are cute. Each dog hates some cat. Fido is a dog.

44 Notes on encoding English statements into FOPC
You get to decide what your predicates, functions, constants etc. are. All you are required to do is be consistent in their usage. When you translate an English sentence into an FOPC sentence, you can "double check" by asking yourself whether there are worlds where the FOPC sentence doesn't hold and the English one holds, and vice versa.
The "Semantic Web" connection: since you are allowed to make up your own predicate and function names, it is quite possible that two people FOPCizing the same knowledge may wind up writing two syntactically different KBs. If each KB is used in isolation, there is no problem. However, if the knowledge written in one KB is to be used in conjunction with that in another KB, you will need "mapping axioms" which relate the vocabulary in one KB to the vocabulary in the other. This problem is pretty important in the context of the Semantic Web.


47 Two different Tarskian Interpretations
This is the same as the one on the left, except we have the green guy for Richard. Problem: there are too darned many Tarskian interpretations; given one, you can change it by just substituting new real-world objects. Substitution-equivalent Tarskian interpretations give the same valuations to the FOPC statements (and thus do not change entailment), so think in terms of equivalence classes of Tarskian interpretations (Herbrand interpretations). We had this in prop logic too: the real-world assertion corresponding to a proposition.

48 Connection to propositional logic: Think of “atomic sentences” as propositions …

49 Herbrand Interpretations
Let us think of interpretations for FOPC that are more like interpretations for prop logic.
Herbrand Universe:
–all constants: Rao, Pat
–all "ground" functional terms: Son-of(Rao); Son-of(Pat); Son-of(Son-of(…(Rao)))…
Herbrand Base:
–all ground atomic sentences made with terms in the Herbrand universe: Friend(Rao,Pat); Friend(Pat,Rao); Friend(Pat,Pat); Friend(Rao,Rao); Friend(Rao,Son-of(Rao)); Friend(Son-of(Son-of(Rao)),Son-of(Son-of(Son-of(Pat))))…
–we can think of elements of HB as propositions; interpretations give T/F values to these. Given the interpretation, we can compute the value of the FOPC database sentences.
If there are n constants and p k-ary predicates, then size of HU = n and size of HB = p·n^k. But if there is even one function, then |HU| is infinite and so is |HB|. So, when there are no function symbols, FOPC is really just syntactic sugaring for a (possibly much larger) propositional database.
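The counting claims are easy to verify concretely. The sketch below enumerates the Herbrand universe and base for the function-free Rao/Pat example, then shows how a single unary function makes the universe grow without bound (the enumeration code is my own illustration, not from the slides):

```python
from itertools import product

# Function-free case: |HU| = n constants, |HB| = p * n^k for p k-ary predicates.
constants = ["Rao", "Pat"]
predicates = {"Friend": 2}      # predicate name -> arity

herbrand_universe = list(constants)
herbrand_base = [f"{pred}({','.join(args)})"
                 for pred, k in predicates.items()
                 for args in product(herbrand_universe, repeat=k)]

assert len(herbrand_universe) == 2          # n = 2
assert len(herbrand_base) == 1 * 2 ** 2     # p * n^k = 1 * 2^2 = 4
assert "Friend(Rao,Pat)" in herbrand_base

# With even one function symbol, HU is infinite: every level of nesting
# Son-of(...) yields fresh ground terms.  We enumerate the first few levels.
level, hu_prefix = list(constants), list(constants)
for _ in range(3):
    level = [f"Son-of({t})" for t in level]
    hu_prefix += level
assert len(hu_prefix) == 2 * 4              # 2 constants * nesting depths 0..3
```

Since each ground atom in HB behaves like a proposition, the function-free case really is a (possibly much larger) propositional database, as the slide says.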


