LECTURE 06: Logical Agent
Course: T0264 – Artificial Intelligence, Year: 2013
Learning Outcomes
At the end of this session, students will be able to:
LO 2: Explain various intelligent search algorithms to solve problems
LO 3: Explain how to use knowledge representation for reasoning purposes
Outline
- Knowledge-Based Agents
- Logic in General
- Propositional Logic (PL): A Very Simple Logic
- Propositional Theorem Proving (Resolution Algorithm)
- Forward and Backward Chaining
- Summary
Knowledge-Based Agents
- Knowledge base = set of sentences in a formal language
- Declarative approach to building an agent (or other system): Tell it what it needs to know; then it can Ask itself what to do, and the answers should follow from the KB
- Agents can be viewed at the knowledge level, i.e., what they know, regardless of how it is implemented
- Or at the implementation level, i.e., the data structures in the KB and the algorithms that manipulate them
Presentation Assignment I: Logical Agents & Wumpus World
ANDREAN HARRY RAMADHAN, DIONDY KUSUMA, RIVKY CHANDRA, WIENA MARCELINA, WILSON GUNAWAN, ANDRE IVAN
A Simple Knowledge-Based Agent
The agent must be able to:
- Represent states, actions, etc.
- Incorporate new percepts
- Update internal representations of the world
- Deduce hidden properties of the world
- Deduce appropriate actions
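As a sketch, the Tell/Ask agent loop described above can be written as follows. `KnowledgeBasedAgent`, `ToyKB`, and the tuple-shaped sentences are illustrative names chosen here, not an API from the course materials:

```python
# A minimal sketch of the knowledge-based agent loop, assuming a KB object
# with tell/ask methods. ToyKB is a trivial stand-in knowledge base.

class ToyKB:
    """Stand-in KB: records sentences and always answers 'Forward'."""
    def __init__(self):
        self.sentences = []

    def tell(self, sentence):
        self.sentences.append(sentence)

    def ask(self, query):
        return "Forward"   # a real KB would derive the action from its sentences


class KnowledgeBasedAgent:
    def __init__(self, kb):
        self.kb = kb   # the knowledge base
        self.t = 0     # time counter

    def agent_program(self, percept):
        # TELL the KB what the agent perceives at time t
        self.kb.tell(("percept", percept, self.t))
        # ASK the KB what action should be taken now
        action = self.kb.ask(("action", self.t))
        # TELL the KB that this action was executed
        self.kb.tell(("action", action, self.t))
        self.t += 1
        return action


agent = KnowledgeBasedAgent(ToyKB())
action = agent.agent_program("Breeze")
```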
Wumpus World PEAS Description
Performance measure: gold +1000, death -1000; -1 per step, -10 for using the arrow
Environment:
- Squares adjacent to the wumpus are smelly
- Squares adjacent to a pit are breezy
- Glitter if gold is in the same square
- Shooting kills the wumpus if you are facing it
- Shooting uses up the only arrow
- Grabbing picks up the gold if in the same square
- Releasing drops the gold in the same square
Sensors: Stench, Breeze, Glitter, Bump, Scream
Actuators: Left turn, Right turn, Forward, Grab, Release, Shoot
Wumpus World Characterization
- Fully observable? No: only local perception
- Deterministic? Yes: outcomes are exactly specified
- Episodic? No: sequential at the level of actions
- Static? Yes: the wumpus and the pits do not move
- Discrete? Yes
- Single-agent? Yes: the wumpus is essentially a natural feature
Exploring a Wumpus World
Legend: A = Agent, B = Breeze, G = Glitter/Gold, OK = Safe square, P = Pit, S = Stench, V = Visited, W = Wumpus
Exploring a Wumpus World (continued; figure only)
Logic in General
- Logics are formal languages for representing information such that conclusions can be drawn
- Syntax defines the sentences in the language
- Semantics defines the "meaning" of sentences, i.e., the truth of a sentence in a world
E.g., in the language of arithmetic:
- x+2 ≥ y is a sentence; x2+y > {} is not a sentence
- x+2 ≥ y is true iff the number x+2 is no less than the number y
- x+2 ≥ y is true in a world where x = 7, y = 1
- x+2 ≥ y is false in a world where x = 0, y = 6
Entailment
Entailment means that one thing follows from another: KB ╞ α
Knowledge base KB entails sentence α if and only if α is true in all worlds where KB is true
- E.g., the KB containing "the Giants won" and "the Reds won" entails "Either the Giants won or the Reds won"
- E.g., x+y = 4 entails 4 = x+y
Entailment is a relationship between sentences (i.e., syntax) that is based on semantics
Models
Logicians typically think in terms of models, which are formally structured worlds with respect to which truth can be evaluated
We say m is a model of a sentence α if α is true in m
M(α) is the set of all models of α
Then KB ╞ α iff M(KB) ⊆ M(α)
E.g., KB = "Giants won and Reds won", α = "Giants won"
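The condition KB ╞ α iff M(KB) ⊆ M(α) can be checked directly by enumerating models. The sketch below (function and symbol names are illustrative) encodes the Giants/Reds example:

```python
from itertools import product

def models(sentence, symbols):
    """Return the set of truth assignments (models) that satisfy `sentence`.
    A sentence is a predicate over a dict mapping symbol -> bool."""
    return {vals for vals in product([True, False], repeat=len(symbols))
            if sentence(dict(zip(symbols, vals)))}

symbols = ["GiantsWon", "RedsWon"]
KB    = lambda m: m["GiantsWon"] and m["RedsWon"]   # "Giants won and Reds won"
alpha = lambda m: m["GiantsWon"]                    # "Giants won"

# KB |= alpha iff every model of KB is also a model of alpha
entails = models(KB, symbols) <= models(alpha, symbols)
```

Note that the converse fails: α has models (Giants won, Reds lost) that are not models of KB, so α does not entail KB.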
Wumpus Models
KB = wumpus-world rules + observations
(figure: the models consistent with the KB)
Wumpus Models
KB = wumpus-world rules + observations
α1 = "[1,2] is safe"; KB ╞ α1, proved by model checking
Wumpus Models
KB = wumpus-world rules + observations
α2 = "[2,2] is safe"; KB ⊭ α2, since some models of the KB place a pit in [2,2]
Inference
KB ├i α means sentence α can be derived from KB by procedure i
- Soundness: i is sound if whenever KB ├i α, it is also true that KB ╞ α
- Completeness: i is complete if whenever KB ╞ α, it is also true that KB ├i α
Preview: we will define a logic (first-order logic) which is expressive enough to say almost anything of interest, and for which there exists a sound and complete inference procedure. That is, the procedure will answer any question whose answer follows from what is known by the KB.
Sentences are physical configurations of the agent, and reasoning is a process of constructing new physical configurations from old ones. The new configurations represent aspects of the world that actually follow from the aspects that the old configurations represent.
Propositional Logic: A Very Simple Logic
Standard logic symbols:
∀ = for all (e.g., everyone, everybody, any time)
∃ = there exists (e.g., someone, some time)
⇒ = implication (if … then …)
⇔ = equivalence; biconditional (if and only if)
¬ = not; negation
∨ = or; disjunction
∧ = and; conjunction
Syntax
Propositional logic is the simplest logic and illustrates the basic ideas:
- The proposition symbols S1, S2, etc. are sentences
- If S is a sentence, ¬S is a sentence (negation)
- If S1 and S2 are sentences, S1 ∧ S2 is a sentence (conjunction)
- If S1 and S2 are sentences, S1 ∨ S2 is a sentence (disjunction)
- If S1 and S2 are sentences, S1 ⇒ S2 is a sentence (implication)
- If S1 and S2 are sentences, S1 ⇔ S2 is a sentence (biconditional)
Semantics
True is true in every model and False is false in every model
The truth value of every other proposition symbol must be specified directly in the model
For complex sentences, there are five rules, for any subsentences P and Q in any model m:
- ¬P is true iff P is false in m
- P ∧ Q is true iff both P and Q are true in m
- P ∨ Q is true iff either P or Q is true in m
- P ⇒ Q is true unless P is true and Q is false in m (i.e., it is false iff P is true and Q is false)
- P ⇔ Q is true iff P and Q are both true or both false in m (i.e., P ⇒ Q is true and Q ⇒ P is true)
A simple recursive process evaluates an arbitrary sentence, e.g.,
¬P1,2 ∧ (P2,2 ∨ P3,1) = true ∧ (true ∨ false) = true ∧ true = true
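The recursive evaluation process can be sketched as follows; the nested-tuple encoding of sentences is an illustrative choice made here, not part of the slides:

```python
def pl_true(sentence, model):
    """Recursively evaluate a propositional sentence in a model.
    Sentences are nested tuples: ('not', s), ('and', s1, s2), ('or', s1, s2),
    ('implies', s1, s2), ('iff', s1, s2); a plain string is a proposition
    symbol looked up directly in the model."""
    if isinstance(sentence, str):
        return model[sentence]
    op, *args = sentence
    if op == 'not':
        return not pl_true(args[0], model)
    if op == 'and':
        return pl_true(args[0], model) and pl_true(args[1], model)
    if op == 'or':
        return pl_true(args[0], model) or pl_true(args[1], model)
    if op == 'implies':   # false only when premise true and conclusion false
        return (not pl_true(args[0], model)) or pl_true(args[1], model)
    if op == 'iff':       # true when both sides have the same truth value
        return pl_true(args[0], model) == pl_true(args[1], model)
    raise ValueError(f"unknown operator: {op}")

# The slide's example: ¬P1,2 ∧ (P2,2 ∨ P3,1) with P1,2 false, P2,2 true, P3,1 false
model = {'P12': False, 'P22': True, 'P31': False}
s = ('and', ('not', 'P12'), ('or', 'P22', 'P31'))
```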
Truth Tables for Connectives

P      Q      | ¬P    | P ∧ Q | P ∨ Q | P ⇒ Q | P ⇔ Q
false  false  | true  | false | false | true  | true
false  true   | true  | false | true  | true  | false
true   false  | false | false | true  | false | false
true   true   | false | true  | true  | true  | true
A Truth-Table Enumeration Algorithm
Depth-first enumeration of all models is sound and complete
For n symbols, time complexity is O(2^n), space complexity is O(n)
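A minimal sketch of the truth-table enumeration idea; the function name and the encoding of sentences as predicates over a model dict are assumptions made for illustration:

```python
from itertools import product

def tt_entails(kb, alpha, symbols):
    """Truth-table enumeration: KB |= alpha iff alpha is true in every
    model in which KB is true. Enumerates all 2^n assignments over
    `symbols` (time O(2^n)), holding one model at a time (space O(n))."""
    for vals in product([True, False], repeat=len(symbols)):
        model = dict(zip(symbols, vals))
        if kb(model) and not alpha(model):
            return False   # found a model of KB where alpha fails
    return True

# Example: {P, P => Q} entails Q
entailed = tt_entails(
    lambda m: m["P"] and ((not m["P"]) or m["Q"]),  # KB: P and (P => Q)
    lambda m: m["Q"],                               # query: Q
    ["P", "Q"],
)
```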
Logical Equivalences
Two sentences are logically equivalent iff they are true in the same models:
α ≡ β iff α ╞ β and β ╞ α
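Equivalence can be verified the same way, by checking that the two sentences agree in every model; here De Morgan's law ¬(P ∧ Q) ≡ ¬P ∨ ¬Q serves as the example (the names are illustrative):

```python
from itertools import product

def equivalent(a, b, symbols):
    """Two sentences are logically equivalent iff they have the same
    truth value in every model over `symbols`."""
    return all(
        a(m) == b(m)
        for vals in product([True, False], repeat=len(symbols))
        for m in (dict(zip(symbols, vals)),)
    )

# De Morgan: ¬(P ∧ Q) ≡ ¬P ∨ ¬Q
lhs = lambda m: not (m["P"] and m["Q"])
rhs = lambda m: (not m["P"]) or (not m["Q"])
```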
Propositional Theorem Proving (Resolution Algorithm)
An inference procedure based on resolution works by the principle of proof by contradiction: to show that KB ╞ α, we show that (KB ∧ ¬α) is unsatisfiable. Therefore:
First: convert (KB ∧ ¬α) into conjunctive normal form (CNF, clause form)
Second: derive new clauses by resolving pairs of clauses until the empty clause is obtained
Resolution Algorithm
1. Convert all the propositions to normal clause form (CNF).
2. Negate P and convert the result to clause form. Add it to the set of clauses from step 1.
3. Repeat until either a contradiction is found, no progress can be made, or a predetermined amount of effort has been expended:
   a. Select two clauses. Call these the parent clauses.
Algorithm (cont'd):
   b. Resolve them together. The resolvent is the disjunction of all the literals of both parent clauses, with appropriate substitutions performed, with the following exception: if there is a pair of literals T1 and ¬T2 such that one parent clause contains T1 and the other contains ¬T2, and if T1 and T2 are unifiable, then neither T1 nor ¬T2 should appear in the resolvent. If there is more than one pair of complementary literals, only one pair should be omitted from the resolvent.
   c. If the resolvent is the empty clause, then a contradiction has been found. If it is not, add it to the set of clauses available to the procedure.
Simple Pseudocode: Resolution Algorithm for Propositional Logic
(figure: PL-RESOLUTION pseudocode)
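A compact sketch of the resolution loop for propositional logic. The clause encoding (sets of literal strings, negation written as a leading '~') is an illustrative choice, not the slides' notation:

```python
def negate(lit):
    """Complement of a literal: 'P' <-> '~P'."""
    return lit[1:] if lit.startswith('~') else '~' + lit

def resolve(ci, cj):
    """All resolvents of two clauses (frozensets of literals): for each
    complementary pair, drop the pair and union the remaining literals."""
    resolvents = []
    for lit in ci:
        if negate(lit) in cj:
            resolvents.append((ci - {lit}) | (cj - {negate(lit)}))
    return resolvents

def pl_resolution(kb_clauses, negated_query_clauses):
    """Proof by contradiction: KB |= alpha iff KB ∧ ¬alpha is unsatisfiable,
    i.e. resolution eventually derives the empty clause."""
    clauses = set(map(frozenset, kb_clauses)) | set(map(frozenset, negated_query_clauses))
    while True:
        new = set()
        for ci in clauses:
            for cj in clauses:
                if ci is cj:
                    continue
                for res in resolve(ci, cj):
                    if not res:            # empty clause: contradiction found
                        return True
                    new.add(frozenset(res))
        if new <= clauses:                 # no progress: no contradiction
            return False
        clauses |= new

# KB: (P => Q) as clause {~P, Q}, plus fact P; query alpha = Q, so add ¬alpha = {~Q}
proved = pl_resolution([{'~P', 'Q'}, {'P'}], [{'~Q'}])
```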
Conversion to Normal Clause Form (CNF Algorithm)
1. Eliminate ⇒, using: a ⇒ b ≡ ¬a ∨ b
2. Reduce the scope of each ¬ to a single term, using De Morgan's laws:
   ¬(¬p) ≡ p
   ¬(a ∧ b) ≡ ¬a ∨ ¬b
   ¬(a ∨ b) ≡ ¬a ∧ ¬b
   ¬∀x: P(x) ≡ ∃x: ¬P(x)
   ¬∃x: P(x) ≡ ∀x: ¬P(x)
3. Standardize variables. For sentences like (∀x P(x)) ∨ (∃x Q(x)) which use the same variable name twice, change the name of one of the variables. This avoids confusion later when we drop the quantifiers. For example, from
   ∀x [∃y Animal(y) ∧ ¬Loves(x, y)] ∨ [∃y Loves(y, x)]
we obtain:
   ∀x [∃y Animal(y) ∧ ¬Loves(x, y)] ∨ [∃z Loves(z, x)]
Conversion to Normal Clause Form (cont'd)
4. Move all quantifiers to the left of the formula without changing their relative order.
5. Eliminate existential quantifiers by inserting Skolem functions, e.g., ∃x P(x) becomes P(A), where A is a new constant.
6. Drop universal quantifiers.
7. Convert the matrix into a conjunction of disjunctions, using associativity and distributivity (distribute ORs over ANDs).
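The purely propositional steps of this conversion (eliminate ⇒ and ⇔, push ¬ inward with De Morgan's laws, distribute ∨ over ∧) can be sketched as below; the quantifier steps (standardizing, skolemizing, dropping quantifiers) only arise in first-order logic and are omitted from this sketch:

```python
def to_cnf(s):
    """Convert a propositional sentence (nested tuples, e.g. ('implies', 'P', 'Q');
    plain strings are symbols) into conjunctive normal form."""
    if isinstance(s, str):
        return s
    op, *a = s
    if op == 'iff':                       # a <=> b  ->  (a => b) and (b => a)
        return to_cnf(('and', ('implies', a[0], a[1]), ('implies', a[1], a[0])))
    if op == 'implies':                   # a => b  ->  not a or b
        return to_cnf(('or', ('not', a[0]), a[1]))
    if op == 'not':
        x = a[0]
        if isinstance(x, str):
            return ('not', x)
        xop, *xa = x
        if xop == 'not':                  # double negation
            return to_cnf(xa[0])
        if xop == 'and':                  # De Morgan
            return to_cnf(('or', ('not', xa[0]), ('not', xa[1])))
        if xop == 'or':                   # De Morgan
            return to_cnf(('and', ('not', xa[0]), ('not', xa[1])))
        return to_cnf(('not', to_cnf(x)))  # eliminate =>/<=> first, then retry
    if op == 'and':
        return ('and', to_cnf(a[0]), to_cnf(a[1]))
    if op == 'or':
        l, r = to_cnf(a[0]), to_cnf(a[1])
        if isinstance(l, tuple) and l[0] == 'and':   # distribute OR over AND
            return ('and', to_cnf(('or', l[1], r)), to_cnf(('or', l[2], r)))
        if isinstance(r, tuple) and r[0] == 'and':
            return ('and', to_cnf(('or', l, r[1])), to_cnf(('or', l, r[2])))
        return ('or', l, r)
    raise ValueError(f"unknown operator: {op}")
```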
Example: Conversion to Normal Clause Form
Sentence: "Everybody who knows Hitler either likes Hitler or thinks that anyone who killed someone is crazy"
In first-order logic:
∀x: [body(x) ∧ know(x, Hitler)] ⇒ [like(x, Hitler) ∨ (∀y: ∃z: killed(y, z) ⇒ crazy(x, y))]
Applying the Clause-Form Algorithm
∀x: [body(x) ∧ know(x, Hitler)] ⇒ [like(x, Hitler) ∨ (∀y: ∃z: killed(y, z) ⇒ crazy(x, y))]
Eliminate ⇒ and reduce the scope of ¬:
∀x: ¬[body(x) ∧ know(x, Hitler)] ∨ [like(x, Hitler) ∨ (∀y: ¬(∃z: killed(y, z)) ∨ crazy(x, y))]
Move the quantifiers left:
∀x: ∀y: ∀z: [¬body(x) ∨ ¬know(x, Hitler)] ∨ [like(x, Hitler) ∨ ¬killed(y, z) ∨ crazy(x, y)]
Drop the universal quantifiers, leaving a single clause:
¬body(x) ∨ ¬know(x, Hitler) ∨ like(x, Hitler) ∨ ¬killed(y, z) ∨ crazy(x, y)
Conversion to Normal Form
(figures: worked conversion examples)
Forward and Backward Chaining
Idea (forward chaining): fire any rule whose premises are satisfied in the KB and add its conclusion to the KB, until the query is found
Forward Chaining Algorithm
(figure: forward-chaining pseudocode)
Forward chaining is sound and complete for Horn KBs
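A sketch of forward chaining for Horn clauses, using the standard counting idea: track how many premises of each rule are still unsatisfied, and fire a rule when its count reaches zero. The rule set at the bottom is illustrative, not taken from the slides' figures:

```python
from collections import deque

def fc_entails(clauses, facts, q):
    """Forward chaining over definite (Horn) clauses.
    clauses: list of (premises, conclusion) pairs, premises a list of symbols;
    facts: symbols known to be true; q: the query symbol."""
    count = {i: len(prem) for i, (prem, _) in enumerate(clauses)}
    inferred = set()
    agenda = deque(facts)
    while agenda:
        p = agenda.popleft()
        if p == q:
            return True
        if p in inferred:
            continue
        inferred.add(p)
        for i, (prem, concl) in enumerate(clauses):
            if p in prem:
                count[i] -= 1
                if count[i] == 0:        # all premises satisfied: fire the rule
                    agenda.append(concl)
    return False

# Illustrative Horn KB: P=>Q, L^M=>P, B^L=>M, A^P=>L, A^B=>L
horn = [(['P'], 'Q'), (['L', 'M'], 'P'), (['B', 'L'], 'M'),
        (['A', 'P'], 'L'), (['A', 'B'], 'L')]
```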
Forward Chaining Example
(figures: step-by-step forward-chaining trace)
Proof of Completeness
FC derives every atomic sentence that is entailed by the KB:
1. FC reaches a fixed point where no new atomic sentences are derived
2. Consider the final state as a model m, assigning true/false to symbols
3. Every clause a1 ∧ … ∧ ak ⇒ b in the original KB is true in m
4. Hence m is a model of KB
5. If KB ╞ q, then q is true in every model of KB, including m
Backward Chaining
Idea: work backwards from the query q: to prove q by BC, check if q is known already, or prove by BC all premises of some rule concluding q
- Avoid loops: check if the new subgoal is already on the goal stack
- Avoid repeated work: check if the new subgoal has already been proved true, or has already failed
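Backward chaining can be sketched recursively, with a goal stack to avoid loops, as described above; the Horn KB here is an illustrative example, not the one in the slides' figures:

```python
def bc_entails(clauses, facts, q, stack=frozenset()):
    """Backward chaining for Horn clauses: prove q by checking the known
    facts, or by finding a rule that concludes q and proving all of its
    premises. `stack` holds goals currently being pursued, to break loops."""
    if q in facts:
        return True
    if q in stack:                    # q is already a subgoal: avoid the loop
        return False
    for prem, concl in clauses:
        if concl == q and all(bc_entails(clauses, facts, p, stack | {q})
                              for p in prem):
            return True
    return False

# Illustrative Horn KB: P=>Q, L^M=>P, B^L=>M, A^P=>L, A^B=>L
horn = [(['P'], 'Q'), (['L', 'M'], 'P'), (['B', 'L'], 'M'),
        (['A', 'P'], 'L'), (['A', 'B'], 'L')]
```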
Backward Chaining Example
(figures: step-by-step backward-chaining trace)
Summary
- Logical agents apply inference to a knowledge base to derive new information and make decisions
- Basic concepts of logic:
  - syntax: formal structure of sentences
  - semantics: truth of sentences in models
  - entailment: necessary truth of one sentence given another
  - inference: deriving sentences from other sentences
  - soundness: derivations produce only entailed sentences
  - completeness: derivations can produce all entailed sentences
- Resolution is complete for propositional logic
- Forward and backward chaining are linear-time and complete for Horn clauses
References
- Stuart Russell, Peter Norvig, Artificial Intelligence: A Modern Approach. Pearson Education, New Jersey. Chapter 7.
- Elaine Rich, Kevin Knight, Shivashankar B. Nair, Artificial Intelligence. McGraw-Hill Education, New York. Chapter 5.
- Propositional Logic: The Resolution Method.
<< CLOSING >>
End of Session 06. Good luck!