1 Propositional Logic Reading: C. 7.4-7.8, C. 8

2 Logic: Outline
–Propositional Logic
–Inference in Propositional Logic
–First-order logic

3 Agents that reason logically
A logic is:
–A formal language in which knowledge can be expressed
–A means of carrying out reasoning in the language
A knowledge-based agent supports two operations:
–Tell: add facts to the KB
–Ask: query the KB

4 Towards General-Purpose AI
Problem-specific AI (e.g., Roomba):
–Specific data structures
–Needs a special implementation
–Can be fast
General-purpose AI (e.g., logic-based):
–Flexible and expressive
–Generic implementation possible
–Can be slow

5 Language Examples
Programming languages:
–Formal, not ambiguous
–Lack expressivity (e.g., cannot state partial information)
Natural language:
–Very expressive, but ambiguous: "Flying planes can be dangerous." "The teacher gave the boys an apple."
–Inference possible, but hard to automate
A good representation language:
–Is formal, yet can express partial information
–Can accommodate inference

6 Components of a Formal Logic
Syntax: symbols and rules for combining them (what you can say)
Semantics: specification of the way symbols (and sentences) relate to the world (what it means)
Inference procedures: rules for deriving new sentences, and therefore new knowledge, from existing sentences (reasoning)

12 Semantics
A possible world (also called a model) is an assignment of truth values to each propositional symbol.
The semantics of a logic defines the truth of each sentence with respect to each possible world.
A model of a sentence is an interpretation in which the sentence evaluates to True.
E.g., TodayIsTuesday -> ClassAI is true in the world {TodayIsTuesday=True, ClassAI=True}, so we say {TodayIsTuesday=True, ClassAI=True} is a model of the sentence.
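The evaluation of a sentence in a possible world can be sketched in a few lines of Python. This is illustrative only (the representation and the name `evaluate` are assumptions, not from the slides): a sentence is either a symbol or a nested tuple `(operator, operands...)`, and a possible world is a dict from symbols to truth values.

```python
# Minimal sketch (illustrative names): a sentence is a symbol (str) or a
# tuple (operator, operand, ...); a possible world maps symbols to bools.

def evaluate(sentence, world):
    """Return the truth value of a propositional sentence in a world."""
    if isinstance(sentence, str):              # atomic proposition
        return world[sentence]
    op, *args = sentence
    if op == "not":
        return not evaluate(args[0], world)
    if op == "and":
        return all(evaluate(a, world) for a in args)
    if op == "or":
        return any(evaluate(a, world) for a in args)
    if op == "->":                             # implication: ~A v B
        return (not evaluate(args[0], world)) or evaluate(args[1], world)
    raise ValueError(f"unknown operator: {op}")

# {TodayIsTuesday=True, ClassAI=True} is a model of TodayIsTuesday -> ClassAI:
world = {"TodayIsTuesday": True, "ClassAI": True}
print(evaluate(("->", "TodayIsTuesday", "ClassAI"), world))  # True
```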

13 Exercise: Semantics
What is the meaning of these two sentences?
–If Shakespeare ate Crunchy-Wunchies for breakfast, then Sally will go to Harvard.
–If Shakespeare ate Cocoa-Puffs for breakfast, then Sally will go to Columbia.

14 Examples
What are the models of the following sentences?
KB1: TodayIsTuesday -> ClassAI
KB2: TodayIsTuesday -> ClassAI, TodayIsTuesday

18 Proof by refutation
–A complete inference procedure
–A single inference rule: resolution
–A conjunctive normal form (CNF) for the logic
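The single resolution rule can be sketched directly once sentences are in CNF. In this illustrative representation (not from the slides), a clause is a frozenset of literals and a literal is a `(symbol, is_positive)` pair; resolving two clauses cancels one complementary pair of literals.

```python
def resolve(c1, c2):
    """Apply the resolution rule to two CNF clauses.
    A clause is a frozenset of literals; a literal is (symbol, is_positive).
    Returns the list of all possible resolvents."""
    resolvents = []
    for (sym, pos) in c1:
        if (sym, not pos) in c2:               # complementary pair found
            resolvent = (c1 - {(sym, pos)}) | (c2 - {(sym, not pos)})
            resolvents.append(frozenset(resolvent))
    return resolvents

# (P v Q) and (~P v R) resolve on P to give (Q v R):
c1 = frozenset({("P", True), ("Q", True)})
c2 = frozenset({("P", False), ("R", True)})
print(resolve(c1, c2))  # one resolvent: Q v R
```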

26 Example: Wumpus World
The agent in [1,1] perceives no breeze.
KB = R2 Λ R4 = (B1,1 <-> (P1,2 V P2,1)) Λ ¬B1,1
Goal: show ¬P1,2
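The whole refutation can be sketched by saturating the clause set with resolution. Below is an illustrative version (the function name and clause encoding are assumptions): the KB is converted to CNF clauses, the negation of the goal is added, and deriving the empty clause shows the contradiction, so KB entails ¬P1,2.

```python
def resolution_refutes(clauses, goal):
    """Proof by refutation (sketch): add the negation of the goal literal,
    then saturate with resolution; deriving the empty clause means the
    original clause set entails the goal."""
    sym, pos = goal
    clauses = set(clauses) | {frozenset({(sym, not pos)})}
    while True:
        new = set()
        for a in clauses:
            for b in clauses:
                if a == b:
                    continue
                for (s, p) in a:
                    if (s, not p) in b:        # resolve on a complementary pair
                        resolvent = (a - {(s, p)}) | (b - {(s, not p)})
                        if not resolvent:
                            return True        # empty clause: contradiction
                        new.add(frozenset(resolvent))
        if new <= clauses:
            return False                       # saturated, no contradiction
        clauses |= new

# CNF of (B11 <-> (P12 v P21)) Λ ~B11; literals are (symbol, is_positive):
kb = [frozenset({("B11", False), ("P12", True), ("P21", True)}),
      frozenset({("P12", False), ("B11", True)}),
      frozenset({("P21", False), ("B11", True)}),
      frozenset({("B11", False)})]
print(resolution_refutes(kb, ("P12", False)))  # True: KB entails ~P12
```

The key steps mirror the slides: the negated goal P1,2 resolves with (¬P1,2 V B1,1) to give B1,1, which resolves with ¬B1,1 to give the empty clause.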

27 Conversion Example

29 Resolution of Example

30 Inference Properties
An inference method A is sound (or truth-preserving) if it only derives entailed sentences.
An inference method A is complete if it can derive any sentence that is entailed.
A proof is a record of the progress of a sound inference algorithm.

33 Other Types of Inference
–Model checking
–Forward chaining with modus ponens
–Backward chaining with modus ponens

34 Model Checking
–Enumerate all possible worlds
–Restrict attention to the worlds in which the KB is true
–Check whether the goal is true in those worlds or not
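These three steps can be sketched as a truth-table enumeration (an illustrative version; the name `tt_entails` and the encoding of sentences as boolean functions are assumptions): KB entails α exactly when α is true in every world where the KB is true.

```python
from itertools import product

def tt_entails(kb, alpha, symbols):
    """Model checking: KB entails alpha iff alpha holds in every possible
    world in which KB holds. kb and alpha are functions world -> bool."""
    for values in product([True, False], repeat=len(symbols)):
        world = dict(zip(symbols, values))
        if kb(world) and not alpha(world):     # a KB-world where alpha fails
            return False
    return True

# (T -> C) together with T entails C (modus ponens, by model checking):
kb = lambda w: ((not w["T"]) or w["C"]) and w["T"]
print(tt_entails(kb, lambda w: w["C"], ["T", "C"]))  # True
```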

35 Wumpus Reasoning
Percepts: {nothing in [1,1]; breeze in [2,1]}
Assume the agent has moved to [2,1].
Goal: where are the pits?
–Construct the models of the KB based on the rules of the world
–Use entailment to determine knowledge about the pits

37 Constructing the KB

39 Properties of Model Checking
Sound, because it directly implements entailment.
Complete, because it works for any KB and any sentence α to prove, and always terminates.
Problem: there can be far too many worlds to check: O(2^n) when the KB and α have n variables in total.

40 Inference as Search
State: current set of sentences
Operator: sound inference rules that derive new entailed sentences from a set of sentences
Can be goal-directed if there is a particular goal sentence in mind
Can also try to enumerate every entailed sentence

48 Example

49 Complexity
N propositions; M rules
Every possible fact can be established with at most N linear passes over the database
Complexity: O(NM)
Forward chaining with modus ponens is complete for Horn logic
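The passes over the database can be sketched directly (an illustrative version; the name `forward_chain` and the rule encoding are assumptions, not the slides' code): Horn rules are (premises, conclusion) pairs, and each outer pass fires every rule whose premises are already known, stopping when no new fact is added.

```python
def forward_chain(rules, facts, goal):
    """Forward chaining with modus ponens over Horn rules.
    Each rule is (set_of_premises, conclusion). Repeatedly fire rules
    whose premises are all known until no new fact is derived."""
    facts = set(facts)
    changed = True
    while changed:                             # at most N passes: each pass
        changed = False                        # that continues adds a fact
        for premises, conclusion in rules:
            if conclusion not in facts and premises <= facts:
                facts.add(conclusion)          # modus ponens
                changed = True
    return goal in facts

# A, B, A Λ B -> C, C -> D: forward chaining derives D.
rules = [({"A", "B"}, "C"), ({"C"}, "D")]
print(forward_chain(rules, {"A", "B"}, "D"))  # True
```

Each outer pass scans all M rules and can add at most one new fact per rule fired; since there are only N distinct propositions, at most N passes make progress, giving the O(NM) bound from the slide.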

51 Example
