
Introduction to Artificial Intelligence, Fall 2004 (L. Joskowicz)
Lecture 7: Knowledge Representation and Logic




Slide 1: Introduction to Artificial Intelligence, Lecture 7: Knowledge Representation and Logic
– Motivation
– Knowledge bases and inferences
– Logic as a representation language
– Propositional logic

Slide 2: Motivation (1)
How should we organize the knowledge base in which we shall search? Up to now, we have concentrated on search methods for worlds that can be represented relatively easily by states and actions on them:
– a few objects and rules, relatively simple states
– problem-specific heuristics to guide the search
– complete knowledge: we know everything that is needed
– no new knowledge is deduced or added
– well-defined start and goal states
Appropriate for fully observable, static, discrete problems.

Slide 3: Motivation (2)
What about other types of problems?
– more objects, more complex relations
– not all knowledge is explicitly stated
– dynamic environments: the rules change!
– agents change their knowledge
– deduction: how to derive new conclusions
Examples:
1. queries on family relations
2. credit approval
3. diagnosis of circuits

Slide 4: Example 1: family relations
Facts:
– Sarah is the mother of Tal and Mor
– Moshe is married to Sarah
– Fanny is the mother of Gal
Query: Is Moshe the father of Tal?
Deduction:
– people have a mother and a father
– Moshe is married to Sarah, who has 2 children
– Sarah's children are Moshe's children (no divorce)
New knowledge is deduced, and assumptions apply!

Slide 5: Example 2: credit approval
Facts:
– Moshe has been employed for 5 years and earns 10,000 shekels a month.
– Credit approval rules: must be employed at least 3 years, earn at least 5,000 shekels, and have no outstanding debts.
Query: Is Moshe eligible for credit?
Decision procedure:
– build a decision tree (procedural)
– check the rules (declarative)
Each option has advantages and disadvantages.
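
As a small illustration of the declarative option, here is a minimal Python sketch; the function and parameter names are invented for this example, but the three thresholds are the ones stated in the rules above.

def eligible_for_credit(years_employed, monthly_salary, has_outstanding_debts):
    """Declarative check of the (illustrative) approval rules from the slide."""
    rules = [
        years_employed >= 3,          # employed at least 3 years
        monthly_salary >= 5000,       # earns at least 5,000 shekels a month
        not has_outstanding_debts,    # no outstanding debts
    ]
    return all(rules)

print(eligible_for_credit(5, 10000, False))   # Moshe: True, credit approved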

Slide 6: Example 3: circuit diagnosis
Facts:
– circuit topology, components, inputs/outputs
– component and connection rules
– faulty output for a given input
Query: What are the components that are likely to be faulty?
Deduction:
– classify all possible faults and their explanations
– deductive process for fault detection

Slide 7: Procedural vs. Declarative knowledge
Procedural: how to achieve a goal; a procedure to answer queries
– hard-wired, efficient, specific to a problem and situation; difficult to change and update
Declarative: relations that hold between entities + a general inference mechanism
– more general: decouples knowledge from deduction, easier to update, possibly less efficient
We will focus on declarative representations.

Slide 8: Knowledge base architecture
Diagram: a KNOWLEDGE BASE (KB) holding facts and rules, coupled to an INFERENCE MECHANISM; updates go into the KB, queries go to the inference mechanism, and answers come back.
Note: compare with problem solving as search.

Slide 9: Knowledge base issues
Representation language:
– How expressive is it? What can and cannot be said?
Inference procedure: a general procedure to derive new conclusions
– Is it sound? Do all conclusions follow rationally from the facts and rules?
– Is it complete? If a conclusion rationally follows from the KB, can I deduce it?
– Is it efficient? Does it take time polynomial in the number of facts and rules?

Slide 10: The world, its representation, and its implementation
Diagram: three levels, the microworld, its representation, and its implementation. At the microworld level, facts FOLLOW from facts; at the representation level, sentences are derived from sentences by INFERENCE. Inference over sentences should mirror what follows in the world.

Slide 11: Domain model
Specifies how the microworld will be modeled.
Ontology: the microworld we are modeling
– family relations between individuals
Domain theory: the types of facts and relations
– persons: sarah, tal, mor
– relations: mother_of, married, …
A fact is true if it follows from a set of facts based on rational arguments.

Slide 12: Representation language
A formal language to represent facts and rules about a microworld as sentences. Interpreted sentences represent a model of the microworld.
Syntax: how sentences are formed
– mother_of(sarah,tal) /\ mother_of(sarah,mor)
Semantics: how to interpret sentences as True/False
The set of all sentences (axioms, rules) is the abstract representation of the KB.

Slide 13: KB inference procedure
Works on the syntactic representation of sentences: from a => b and a, deduce b.
Independent of the meaning (semantics) of the knowledge represented.
Captures a subset of rational rules of thought:
– modus ponens, entailment, resolution
Note: these inference rules are different from the KB rules!
Base sentences are called axioms, derived sentences are theorems, and derivations are proofs.
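
To make the "purely syntactic" point concrete, here is a minimal sketch (an illustration, not part of the lecture) of modus ponens as an operation on the structure of sentences, with sentences encoded as nested tuples:

def modus_ponens(s1, s2):
    """Purely syntactic rule: from ('=>', a, b) and a, derive b; otherwise None."""
    if isinstance(s1, tuple) and len(s1) == 3 and s1[0] == '=>' and s1[1] == s2:
        return s1[2]
    return None

print(modus_ponens(('=>', 'heads', 'winme'), 'heads'))   # 'winme'

The rule never looks at what 'heads' or 'winme' mean; it only matches the shape of the sentences.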

Slide 14: Implementation
How sentences are represented in the computer: data structures for facts and relations.
How to perform inferences based on the abstract inference procedure rules.
Typical procedures:
– pattern matching
– knowledge base management

Slide 15: Logical Theory Structure
Diagram: the layers of a logical theory. The domain model (ontology and domain theory) is described by an axiomatic system (axioms stated in a formal language, with definitions justified by formal semantics), which is implemented by data structures and by procedures that operate on them.
Note: contrast with standard algorithms and math domains.

Slide 16: Example: family relations
Ontology: the family relations microworld
Domain theory: sarah, tal, the mother_of relation, …
Formal language: first-order logic
Axioms:
– mother_of(sarah,tal), …
– ∀X ∃Y mother_of(Y,X), …
Data structures: functions, structs, lists
Procedures: matching, rule ordering, …

Slide 17: Logic and knowledge representation (1)
Mathematical logics have well-defined syntax, semantics, and models:
– Propositional logic: facts that are True/False
– First-order logic: facts, objects, relations that are True/False
– Temporal logic: first-order logic + time
– Probability theory: facts with degrees of belief in [0…1]
Interpretation: a truth assignment to each element of the formula, e.g., A is True.

Slide 18: Logic and knowledge representation (2)
A sentence is:
– valid (a tautology) if it is true for any truth assignment, e.g., (A \/ ~A)
– satisfiable if there exists a truth assignment that makes it true, e.g., (A /\ B)
– unsatisfiable if there is no truth assignment that makes it true, e.g., (A /\ ~A)
A model of a sentence is an interpretation that satisfies the sentence.
Inference rules: modus ponens, deduction.

Slide 19: Logic: notation and properties
– KB |= c : c logically follows from KB
– KB |=_R c : c follows from KB using rules R
– |= c : c is a tautology
– Soundness of R: KB |=_R c implies KB |= c
– Completeness of R: KB |= c implies KB |=_R c
– Monotonicity: if KB1 |= c then (KB1 U KB2) |= c
Note: distinguish these from KB |-- c (syntactic derivation) and S => c (the implication connective inside the language).

Slide 20: Propositional Logic -- Syntax
SYNTAX:
Sentence ----> Atomic_Sentence | Complex_Sentence
Atomic_Sentence ----> True | False | P | Q | R …
Complex_Sentence ----> (~Sentence) | (Sentence Connective Sentence)
Connective ----> /\ | \/ | => | <=> | …
Facts, boolean relations between them, and True/False truth assignments to boolean sentences.
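
A minimal sketch of this grammar in code (an illustration; the tuple encoding and names are assumptions of this sketch, with 'not', 'and', 'or' standing in for ~, /\, \/): sentences are nested tuples, and a recursive check mirrors the three grammar rules.

ATOMS = {"True", "False", "P", "Q", "R"}
CONNECTIVES = {"and", "or", "=>", "<=>"}    # stand-ins for /\, \/, =>, <=>

def is_sentence(s):
    """Recursive well-formedness check mirroring the grammar above.
    Sentences are nested tuples, e.g. ('=>', ('and', 'P', 'Q'), 'R')."""
    if isinstance(s, str):
        return s in ATOMS                               # Atomic_Sentence
    if isinstance(s, tuple) and len(s) == 2 and s[0] == 'not':
        return is_sentence(s[1])                        # (~ Sentence)
    if isinstance(s, tuple) and len(s) == 3 and s[0] in CONNECTIVES:
        return is_sentence(s[1]) and is_sentence(s[2])  # (Sentence Connective Sentence)
    return False

print(is_sentence(('=>', ('and', ('or', 'P', 'Q'), ('not', 'Q')), 'P')))   # True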

Slide 21: Propositional Logic -- Semantics
Recursively defined by the truth value of atomic sentences, with Boolean truth tables for each connective and for ~.
The validity of a sentence is determined by constructing a truth table, e.g., for ((P \/ Q) /\ ~Q) => P.

P     | Q     | P /\ Q | P \/ Q
False | False | False  | False
False | True  | False  | True
True  | False | False  | True
True  | True  | True   | True
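
Continuing the illustrative tuple encoding from above (still an assumption of this sketch, not the lecture's notation), the semantics can be written as a recursive evaluation under a truth assignment, and validity checked by enumerating every row of the truth table:

from itertools import product

def evaluate(s, model):
    """Truth value of a sentence (nested tuples) under a truth assignment;
    atoms are proposition symbols looked up in the model dict."""
    if isinstance(s, str):
        return model[s]
    op = s[0]
    if op == 'not':
        return not evaluate(s[1], model)
    if op == 'and':
        return evaluate(s[1], model) and evaluate(s[2], model)
    if op == 'or':
        return evaluate(s[1], model) or evaluate(s[2], model)
    if op == '=>':
        return (not evaluate(s[1], model)) or evaluate(s[2], model)
    if op == '<=>':
        return evaluate(s[1], model) == evaluate(s[2], model)
    raise ValueError(f"unknown connective {op!r}")

def is_valid(s, symbols):
    """Validity by truth-table construction: true in every row."""
    return all(evaluate(s, dict(zip(symbols, row)))
               for row in product([False, True], repeat=len(symbols)))

# ((P \/ Q) /\ ~Q) => P  is a tautology:
s = ('=>', ('and', ('or', 'P', 'Q'), ('not', 'Q')), 'P')
print(is_valid(s, ['P', 'Q']))   # True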

Slide 22: Validity by truth-table construction
(The slide shows the full truth table for ((P \/ Q) /\ ~Q) => P; the sentence evaluates to True in every row.)

Slide 23: Proof methods
Given a knowledge base KB = {S1, S2, …, Sn} and a sentence c, determine whether c logically follows from KB: KB |= c.
Two proof methods:
– build a truth table to test the validity of the sentence (S1 /\ S2 /\ … /\ Sn) => c
– use inference rules R to determine whether KB |=_R c

Slide 24: Propositional logic example
Given: "Heads I win, tails you lose"
Prove: "I always win"
Propositions: heads, tails, winme, looseyou
Axioms:
1. heads => winme (heads: I win)
2. tails => looseyou (tails: you lose)
3. heads \/ tails (either heads or tails)
4. looseyou => winme (if you lose, I win)

Slide 25: Truth-table inference method
Let KB = {S1, S2, …, Sn} be a set of sentences and C a possible conclusion.
C logically follows from KB iff the sentence S1 /\ S2 /\ … /\ Sn => C is a tautology.
Complexity: always exponential in the number of propositions!
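
A minimal sketch of this method for the coin-toss example (the encoding of the axioms as Python expressions is our own): KB |= winme because the big implication holds in all 2^4 rows.

from itertools import product

symbols = ['heads', 'tails', 'winme', 'looseyou']

def kb_holds(m):
    """S1 /\ S2 /\ S3 /\ S4 for the coin-toss axioms."""
    return ((not m['heads'] or m['winme']) and        # heads => winme
            (not m['tails'] or m['looseyou']) and     # tails => looseyou
            (m['heads'] or m['tails']) and            # heads \/ tails
            (not m['looseyou'] or m['winme']))        # looseyou => winme

def entails(conclusion):
    """KB |= C iff (S1 /\ ... /\ Sn) => C holds in every one of the 2^n rows."""
    return all((not kb_holds(m)) or conclusion(m)
               for row in product([False, True], repeat=len(symbols))
               for m in [dict(zip(symbols, row))])

print(entails(lambda m: m['winme']))   # True: "I always win"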

Slide 26: Truth-table method: example
Columns: the four propositions, the four axioms (1-4), their conjunction S = 1 /\ 2 /\ 3 /\ 4, and S => winme.

winme looseyou heads tails | 1 2 3 4 | S | S => winme
  0      0      0     0   | 1 1 0 1 | 0 | 1
  0      0      0     1   | 1 0 1 1 | 0 | 1
  0      0      1     0   | 0 1 1 1 | 0 | 1
  0      0      1     1   | 0 0 1 1 | 0 | 1
  0      1      0     0   | 1 1 0 0 | 0 | 1
  0      1      0     1   | 1 1 1 0 | 0 | 1
  0      1      1     0   | 0 1 1 0 | 0 | 1
  0      1      1     1   | 0 1 1 0 | 0 | 1
  1      0      0     0   | 1 1 0 1 | 0 | 1
  1      0      0     1   | 1 0 1 1 | 0 | 1
  1      0      1     0   | 1 1 1 1 | 1 | 1
  1      0      1     1   | 1 0 1 1 | 0 | 1
  1      1      0     0   | 1 1 0 1 | 0 | 1
  1      1      0     1   | 1 1 1 1 | 1 | 1
  1      1      1     0   | 1 1 1 1 | 1 | 1
  1      1      1     1   | 1 1 1 1 | 1 | 1

The last column is all 1s, so S => winme is a tautology and KB |= winme.

Slide 27: Models and Inferences
A possible world, or propositional model, is an assignment of truth values to the atomic propositions. Any world in which a sentence S is true is called a model of that sentence.
Rules of inference move us from assumptions to a conclusion. A rule is sound if, in any model in which all the assumptions are true, the conclusion is also true.
In the following table, a, b, ai, etc. represent sentence patterns: they can be matched to specific sentences.

Slide 28: Propositional Logic -- Inference Rules
– Modus Ponens: from a => b and a, infer b
– And-Elimination: from a1 /\ a2 /\ … /\ an, infer ai
– And-Introduction: from a1, a2, …, an, infer a1 /\ a2 /\ … /\ an
– Or-Introduction: from ai, infer a1 \/ a2 \/ … \/ an
– Resolution: from a \/ b and ~b \/ c, infer a \/ c
– Double negation: from ~~a, infer a

Slide 29: Propositional logic example (1)
Given: "Heads I win, tails you lose"
Prove: "I always win"
Propositions: heads, tails, winme, looseyou
Axioms:
1. heads => winme (heads: I win)
2. tails => looseyou (tails: you lose)
3. heads \/ tails (either heads or tails)
4. looseyou => winme (if you lose, I win)

Slide 30: Propositional logic example (2)
The axioms in clause form:
1. ~heads \/ winme
2. ~tails \/ looseyou
3. heads \/ tails
4. ~looseyou \/ winme
Resolution: from a \/ b and ~b \/ c, infer a \/ c
Derivation:
1'. (1,3) tails \/ winme
2'. (2,4) ~tails \/ winme
3'. (1',2') winme \/ winme
3''. winme
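
The same derivation can be reproduced with a small resolution-step function (an illustrative sketch, not the lecture's code; clauses are sets of string literals, with '~p' the negation of 'p'):

def resolve(c1, c2):
    """All resolvents of two clauses (frozensets of literals)."""
    resolvents = []
    for lit in c1:
        neg = lit[1:] if lit.startswith('~') else '~' + lit
        if neg in c2:
            resolvents.append((c1 - {lit}) | (c2 - {neg}))
    return resolvents

c1 = frozenset({'~heads', 'winme'})      # 1. heads => winme
c2 = frozenset({'~tails', 'looseyou'})   # 2. tails => looseyou
c3 = frozenset({'heads', 'tails'})       # 3. heads \/ tails
c4 = frozenset({'~looseyou', 'winme'})   # 4. looseyou => winme

step1 = resolve(c1, c3)[0]               # 1': tails \/ winme
step2 = resolve(c2, c4)[0]               # 2': ~tails \/ winme
step3 = resolve(step1, step2)[0]         # 3'': winme
print(step1, step2, step3)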

Slide 31: Inference as Search
Search method: given a knowledge base that is a set of sentences, apply inference rules until the query sentence is generated. If it is not generated, then it cannot be inferred.
– state: a conjunction of sentences in the KB
– start: the initial KB
– goal: a KB containing the query sentence
– inference rules: the ones above
Are all inferences sound? Is the set of inference rules complete? What is their complexity?

Slide 32: Search Considerations
We have a knowledge base KB = {S1, S2, …, Sn} and a sentence C, and we want to know whether the knowledge base logically implies the sentence: S1 /\ S2 /\ … /\ Sn => C.
– We could start from KB and use rules to generate sentences that follow from it until we get C.
– We could also try to find a model of KB and ~C: there is no such model iff KB logically implies C.

Slide 33: Search Considerations
Both methods are used.
The second method, looking for a model of KB and ~C, is implemented as a depth-first search, really a CSP problem, with a few specific improvements: the Davis-Putnam algorithm. In a sense, D-P is the mother of all CSP heuristics. Local search, as in CSP, is also used.
The first method will be studied now: resolution is the rule most often chosen. The special case of Horn clauses will be discussed afterwards.
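
A very small sketch of the Davis-Putnam (DPLL) idea, depth-first search over truth assignments with unit propagation; this is a simplified illustration under our own clause encoding, not the full algorithm.

def dpll(clauses, assignment=None):
    """Tiny DPLL-style satisfiability check. Clauses are sets of string literals
    ('~P' negates 'P'). Returns a model (dict) or None if unsatisfiable."""
    if assignment is None:
        assignment = {}

    def value(lit):                         # True/False if decided, None otherwise
        var = lit.lstrip('~')
        if var not in assignment:
            return None
        return assignment[var] if not lit.startswith('~') else not assignment[var]

    simplified = []
    for clause in clauses:
        if any(value(lit) is True for lit in clause):
            continue                        # clause already satisfied
        remaining = {lit for lit in clause if value(lit) is None}
        if not remaining:
            return None                     # empty clause: dead end
        simplified.append(remaining)
    if not simplified:
        return assignment                   # every clause satisfied: model found
    for clause in simplified:               # unit propagation: forced literal
        if len(clause) == 1:
            lit = next(iter(clause))
            return dpll(clauses, {**assignment, lit.lstrip('~'): not lit.startswith('~')})
    var = next(iter(simplified[0])).lstrip('~')   # branch on an unassigned variable
    return (dpll(clauses, {**assignment, var: True}) or
            dpll(clauses, {**assignment, var: False}))

# KB U {~winme} from the coin-toss example: no model, so KB |= winme.
clauses = [{'~heads', 'winme'}, {'~tails', 'looseyou'},
           {'heads', 'tails'}, {'~looseyou', 'winme'}, {'~winme'}]
print(dpll(clauses))    # None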

Slide 34: Illustration of inference as search
KB = {P, P => Q, P => S, S => Q, Q => R, ~P \/ T}
To prove: R
Diagram: a search tree rooted at KB. Successors of KB include KB U {Q}, KB U {S}, and KB U {T}; further expansions yield KB U {S,Q}, KB U {T,S}, and KB U {Q,R}, until a KB containing R is reached.

Slide 35: Resolution
From
a1 \/ a2 \/ … \/ ai \/ b \/ ai+1 \/ … \/ an   and   c1 \/ c2 \/ … \/ ck \/ ~b \/ ck+1 \/ … \/ cm
infer
a1 \/ a2 \/ … \/ an \/ c1 \/ … \/ cm

Slide 36: Search Considerations
The rules should be sound: only sentences that follow logically from KB are generated.
They should be complete: all sentences that follow logically from KB are generated (at some point).
But they should also provide a way to direct the search towards the goal: C.

Slide 37: Soundness of Inference Rules
The conclusions obtained by applying the inference rules follow logically from KB.
Proof: by truth table, for each inference rule.
Example: resolution.

Slide 38: Completeness of Inference Rules
The inference rules are complete iff all sentences that follow logically from KB can be derived by the rules.
The rules above are only refutation-complete: tautologies such as (P \/ ~P) cannot be derived directly. Instead, prove that the negation of the sentence yields a contradiction.
Proof procedure: add the negation of the conclusion and apply the rules. If a contradiction is derived, the conclusion is true (e.g., the "Heads/Tails" example).

Slide 39: Refutation-Completeness
We need only refutation-completeness: if a set S of sentences is inconsistent, then False can be derived. This is enough, since we are trying to show that KB U {~C} is inconsistent.
Note: S is inconsistent iff False logically follows from S.

Slide 40: Using Resolution
1. Put every sentence (the Si and ~C) into CNF: conjunctive normal form.
2. We may now assume that we want to prove the inconsistency of a set S of clauses: disjunctions of literals.
3. Apply the resolution rule to get new clauses until no new clause can be obtained.
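
A minimal sketch of this procedure (resolution to saturation, with the same illustrative clause encoding as before): it reports unsatisfiability exactly when the empty clause is derived, which is how KB |= winme is established by refutation.

from itertools import combinations

def negate(lit):
    return lit[1:] if lit.startswith('~') else '~' + lit

def resolvents(c1, c2):
    """All clauses obtainable by resolving c1 and c2 on one complementary pair."""
    out = set()
    for lit in c1:
        if negate(lit) in c2:
            out.add(frozenset((c1 - {lit}) | (c2 - {negate(lit)})))
    return out

def unsatisfiable(clauses):
    """Saturate the clause set under resolution; True iff the empty clause appears."""
    clauses = set(clauses)
    while True:
        new = set()
        for c1, c2 in combinations(clauses, 2):
            for r in resolvents(c1, c2):
                if not r:
                    return True              # empty clause derived: inconsistent
                new.add(r)
        if new <= clauses:
            return False                     # saturated without the empty clause
        clauses |= new

kb = [frozenset({'~heads', 'winme'}), frozenset({'~tails', 'looseyou'}),
      frozenset({'heads', 'tails'}), frozenset({'~looseyou', 'winme'})]
# KB |= winme  iff  KB U {~winme} is unsatisfiable (proof by refutation).
print(unsatisfiable(kb + [frozenset({'~winme'})]))   # True
print(unsatisfiable(kb))                              # False: KB itself is satisfiable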

Slide 41: Clauses
Examples of clauses:
1. ~heads \/ winme
2. ~tails \/ looseyou
3. heads \/ tails
4. ~looseyou \/ winme \/ ~tails \/ heads
5. The empty clause, which is equivalent to False
Caution: always make sure each literal appears at most once in a clause:
~heads \/ winme \/ ~heads is NOT a legal clause.

Slide 42: Example of proof by refutation
Resolution is NOT complete: (P \/ Q) cannot be deduced from P, although it follows from it. However, resolution is refutation-complete.
To show KB |= C, prove that KB U {~C} |= False:
1) Transform ~(P \/ Q) into ~P /\ ~Q.
2) Show that S = {P, ~P, ~Q} is inconsistent: applying resolution to P and ~P gives the empty clause.
Conclusion: S is unsatisfiable, and therefore P |= (P \/ Q).

Slide 43: Decidability and complexity
Propositional logic is decidable: there exists a computational procedure to decide whether a sentence logically follows from a set of axioms.
Complexity: intractable in general; satisfiability is NP-complete, and entailment is co-NP-complete.
For the restricted Horn form there is a linear-time procedure.

Slide 44: Computational considerations
In most cases, not exponential.
Smallest number of rules:
– in math: many rules and lemmas, short proofs, derived manually
– in AI: fewest rules, automatic deduction
One rule suffices: resolution, provided formulas are represented in a normal form that eliminates syntactic differences, e.g. P /\ Q /\ P /\ (S \/ ~S).
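
A rough sketch of the standard CNF transformation (eliminate <=> and =>, push negations inwards, distribute \/ over /\); the tuple encoding and helper names are assumptions of this illustration, and binary connectives are assumed.

def to_cnf(sentence):
    """Convert a propositional sentence to CNF, returned as a list of clauses
    (each clause a set of string literals, '~P' being the negation of 'P').
    Sentences are nested tuples with operators 'not', 'and', 'or', '=>', '<=>'."""

    def eliminate(s):                       # rewrite <=> and => using and/or/not
        if isinstance(s, str):
            return s
        op, *args = s
        args = [eliminate(a) for a in args]
        if op == '<=>':
            a, b = args
            return ('and', ('or', ('not', a), b), ('or', ('not', b), a))
        if op == '=>':
            a, b = args
            return ('or', ('not', a), b)
        return (op, *args)

    def push_not(s, negate=False):          # move negations inwards (De Morgan)
        if isinstance(s, str):
            return '~' + s if negate else s
        op, *args = s
        if op == 'not':
            return push_not(args[0], not negate)
        if op == 'and':
            new_op = 'or' if negate else 'and'
        else:                               # op == 'or'
            new_op = 'and' if negate else 'or'
        return (new_op, *[push_not(a, negate) for a in args])

    def distribute(s):                      # distribute 'or' over 'and' -> clause list
        if isinstance(s, str):
            return [{s}]
        op, a, b = s
        if op == 'and':
            return distribute(a) + distribute(b)
        return [ca | cb for ca in distribute(a) for cb in distribute(b)]

    return distribute(push_not(eliminate(sentence)))

# heads => winme becomes the single clause {~heads, winme}
print(to_cnf(('=>', 'heads', 'winme')))
# ~(P \/ Q) becomes the two unit clauses {~P} and {~Q}
print(to_cnf(('not', ('or', 'P', 'Q'))))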

Slide 45: Resolution is refutation-complete (1)
Let RC(S) be the set of all clauses that can be generated from the set S of clauses by repeated use of the resolution rule.
We want to show that if RC(S) does not contain the empty clause, then S is satisfiable.
Suppose RC(S) does not contain the empty clause. Define the following assignment of truth values to the atomic propositions.

Slide 46: Resolution is refutation-complete (2)
Atomic propositions: P1, …, Pn.
For i = 1 to n do:
1. simplify all clauses by replacing each Pk, for k < i, by its assigned value;
2. ignore clauses already known to be satisfied;
3. if some clause has simplified to the single literal ~Pi, assign Pi the value False; otherwise assign it the value True.

Slide 47: Resolution is refutation-complete (3)
Claim: the assignment above satisfies all the clauses in RC(S), and in particular all elements of S, so S is satisfiable.
Proof: assume some element of RC(S) is not satisfied. Then there is an i and a clause A of RC(S), not satisfied, that contains Pi or ~Pi and no Pj for j > i, chosen such that any clause containing only Pj for j < i is satisfied.

Slide 48: Resolution is refutation-complete (4)
The clause A must contain Pi, not ~Pi (otherwise A would have simplified to ~Pi, Pi would have been assigned False, and A would be satisfied).
Since Pi was assigned False, there must be some other clause B that contains ~Pi and otherwise only Pj for j < i, such that all of its other literals are unsatisfied in the assignment.
Applying resolution to A and B gives a clause of RC(S) that contains only Pj for j < i. This clause is satisfied, so one of its literals is satisfied. It cannot be a literal of B. It must be one of A, and so A is satisfied. Contradiction!

Slide 49: Horn clauses (1)
A clause is a Horn clause iff at most one of its literals is positive.
Examples:
1. ~heads \/ winme, i.e. heads => winme
2. ~tails \/ looseyou \/ ~whatever, i.e. tails /\ whatever => looseyou
3. ~tails \/ ~whatever

Slide 50: Horn clauses (2)
If S is a set of Horn clauses and C is a Horn clause, then there is a linear-time algorithm to decide whether C is logically implied by the sentences of S.
The algorithm looks for positive literals that can be proven from the positive literals already known to hold: once a positive literal is added, subtract one from the number of prerequisites still to be fulfilled for each clause in which it appears as a prerequisite.
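
A sketch of this counting algorithm as forward chaining over definite clauses (names and encoding are ours). Note that heads \/ tails from the coin example has two positive literals and is not a definite clause, so the example below simply asserts heads as a fact.

from collections import deque

def horn_entails(kb, query):
    """Forward chaining for a Horn KB, roughly linear in the size of the KB.
    kb: list of (premises, conclusion) rules; facts have an empty premise list."""
    count = {i: len(premises) for i, (premises, _) in enumerate(kb)}
    inferred = set()
    agenda = deque(concl for premises, concl in kb if not premises)
    while agenda:
        p = agenda.popleft()
        if p == query:
            return True
        if p in inferred:
            continue
        inferred.add(p)
        for i, (premises, concl) in enumerate(kb):
            if p in premises:
                count[i] -= 1               # one fewer prerequisite to fulfill
                if count[i] == 0:
                    agenda.append(concl)    # all premises hold: conclusion is proven
    return False

# heads => winme, tails => looseyou, looseyou => winme, plus the fact heads.
kb = [(['heads'], 'winme'), (['tails'], 'looseyou'),
      (['looseyou'], 'winme'), ([], 'heads')]
print(horn_entails(kb, 'winme'))   # True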

Slide 51: Expressiveness of Propositional Logic
Propositional logic cannot express general statements of the form "every person has a father and a mother". We must list all specific instances: father_of(moshe,tal), father_of(moshe,mor), …, which usually yields many sentences.
Solution: extend the language to represent objects and relations between objects: First-Order Logic
– ∀X ∃Y, Z such that father(Y,X) and mother(Z,X)

