CS 416 Artificial Intelligence, Lecture 10: Logic (Chapters 7 and 8)


CS 416 Artificial Intelligence, Lecture 10: Logic (Chapters 7 and 8)

Homework Assignment
First homework assignment is out. Due October 20th.

Midterm
October 25th. Up through Chapter 9 (excluding Chapter 5).

Review
Store information in a knowledge base
–Backus-Naur Form (BNF)
–Use equivalences to isolate what you want
–Use inference to relate sentences
  –Modus Ponens
  –And-Elimination

Constructing a proof
Proving is like searching
–Find a sequence of logical inference rules that leads to the desired result
–Note the explosion of propositions: good proof methods ignore the countless irrelevant propositions
The two inference rules above (Modus Ponens and And-Elimination) are sound but not complete!

Completeness
Recall that a complete inference algorithm is one that can derive any sentence that is entailed.
Resolution is a single inference rule that:
–Guarantees the ability to derive any sentence that is entailed (i.e. it is complete)
–Must be partnered with a complete search algorithm

Resolution
Resolution is a single inference rule that:
–Guarantees the ability to derive any sentence that is entailed (i.e. it is complete)
–Must be partnered with a complete search algorithm

Resolution
Unit Resolution Inference Rule: if m and l_i are complementary literals,
  l_1 V … V l_k,   m
  ⊢  l_1 V … V l_(i-1) V l_(i+1) V … V l_k

Resolution Inference Rule
Also works with full clauses (not just unit clauses): the resolvent is the disjunction of all remaining literals of both clauses.
But make sure each literal appears only once in the resolvent (factoring).
For example, resolving (A V B) with (A V ~B) should return A, not (A V A).
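As a concrete illustration of the rule (the representation is an assumption of this sketch: clauses are Python frozensets of string literals, with a leading '~' marking negation), resolution with factoring might look like:

```python
def resolve(c1, c2):
    """Resolve two clauses, returning the set of all resolvents.

    Clauses are frozensets of literals. Because each resolvent is built
    as a set, duplicate literals collapse automatically -- the factoring
    step that makes (A V B) resolved with (A V ~B) yield A, not (A V A).
    """
    resolvents = set()
    for lit in c1:
        complement = lit[1:] if lit.startswith('~') else '~' + lit
        if complement in c2:
            resolvents.add(frozenset((c1 - {lit}) | (c2 - {complement})))
    return resolvents

print(resolve(frozenset({'A', 'B'}), frozenset({'A', '~B'})))
```

Resolving two complementary unit clauses yields the empty frozenset, which plays the role of the empty clause later in the refutation procedure.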

Resolution and completeness
Any complete search algorithm, applying only the resolution rule, can derive any conclusion entailed by any knowledge base in propositional logic.
More specifically, resolution is refutation complete:
–Able to confirm or refute any given sentence
–Unable to enumerate all true sentences

What about “and” clauses?
Resolution only applies to “or” clauses (disjunctions)
–Every sentence of propositional logic can be transformed into a logically equivalent conjunction of disjunctions of literals
Conjunctive Normal Form (CNF)
–A sentence expressed as a conjunction of disjunctions of literals

CNF
Converting a sentence to CNF:
–Eliminate biconditionals: (a <=> b) becomes (a => b) ^ (b => a)
–Eliminate implications: (a => b) becomes (~a V b)
–Move ~ inwards using de Morgan's laws and double-negation elimination
–Distribute V over ^
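Since each conversion step preserves logical equivalence, a converted sentence can be sanity-checked by enumerating all assignments. This small sketch (function names are illustrative) verifies that (a <=> b) matches its CNF form (~a V b) ^ (a V ~b):

```python
from itertools import product

def iff(a, b):
    # a <=> b
    return a == b

def cnf_iff(a, b):
    # (~a V b) ^ (a V ~b)
    return ((not a) or b) and (a or (not b))

# The CNF form agrees with the original sentence on every assignment
same = all(iff(a, b) == cnf_iff(a, b)
           for a, b in product([False, True], repeat=2))
print(same)   # True
```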

An algorithm for resolution
We wish to prove that KB entails α
–That is, will some sequence of inferences create new sentences in the knowledge base equal to α?
–If some sequence of inferences created ~α, we would know the knowledge base could not also entail α, since α ^ ~α is impossible

An algorithm for resolution
To show that KB entails α:
–Must show (KB ^ ~α) is unsatisfiable
–No possible way for KB to be true while α is false
–Proof by contradiction

An algorithm for resolution
Algorithm:
–(KB ^ ~α) is put in CNF
–Each pair of clauses with complementary literals is resolved to produce a new clause, which is added to the KB (if novel)
–Cease if there are no new clauses to add (α is not entailed)
–Cease if the resolution rule derives the empty clause: α ^ ~α = empty clause, so α is entailed
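The loop above can be sketched as a minimal version of the standard PL-Resolution procedure. The code and its clause representation (frozensets of '~'-prefixed string literals) are my own illustration, not the lecture's:

```python
from itertools import combinations

def resolve(c1, c2):
    """All resolvents of two clauses (sets give factoring for free)."""
    out = set()
    for lit in c1:
        complement = lit[1:] if lit.startswith('~') else '~' + lit
        if complement in c2:
            out.add(frozenset((c1 - {lit}) | (c2 - {complement})))
    return out

def pl_resolution(kb_clauses, negated_query_clauses):
    """Return True iff KB entails the query, by refutation: resolve the
    clauses of KB ^ ~alpha until the empty clause appears (entailed) or
    no novel clause can be added (not entailed)."""
    clauses = set(kb_clauses) | set(negated_query_clauses)
    while True:
        new = set()
        for c1, c2 in combinations(clauses, 2):
            for r in resolve(c1, c2):
                if not r:            # empty clause: contradiction found
                    return True
                new.add(r)
        if new <= clauses:           # no new clauses to add
            return False
        clauses |= new

# KB: G, L, (~L V ~G V M).  Query alpha = M, so add ~M and refute.
kb = [frozenset({'G'}), frozenset({'L'}), frozenset({'~L', '~G', 'M'})]
print(pl_resolution(kb, [frozenset({'~M'})]))   # True
```

The empty clause (an empty frozenset) signals the α ^ ~α contradiction, so the query is entailed.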

Example of resolution
Proof that there is not a pit in [1,2], i.e. ~P_1,2
–KB ^ P_1,2 leads to the empty clause
–Therefore ~P_1,2 is true

Formal Algorithm

Horn Clauses
Horn Clause: a disjunction of literals where at most one is positive
–(~a V ~b V ~c V d) is a Horn clause
–(~a V b V c V ~d) is not a Horn clause

Horn Clauses
Why do we care?
–Every Horn clause can be written as an implication: (a conjunction of positive literals) => (a single positive literal)
–Inference algorithms can use forward chaining and backward chaining
–Entailment computation is linear in the size of the KB

Horn Clauses
Can be written as a special implication
–(~a V ~b V c) becomes (a ^ b) => c
  –(~a V ~b V c) == (~(a ^ b) V c) … de Morgan
  –(~(a ^ b) V c) == ((a ^ b) => c) … implication elimination
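As a tiny illustration (the function name and the clause representation are assumptions of this sketch), the rewrite from clause form to implication form can be mechanized:

```python
def horn_to_implication(clause):
    """Rewrite a Horn clause (an iterable of literals, at most one of
    them positive) as (premises, conclusion), meaning
    (premise_1 ^ ... ^ premise_n) => conclusion.
    A clause with no positive literal gets conclusion None (a goal clause)."""
    positives = [l for l in clause if not l.startswith('~')]
    assert len(positives) <= 1, "not a Horn clause"
    premises = sorted(l[1:] for l in clause if l.startswith('~'))
    conclusion = positives[0] if positives else None
    return premises, conclusion

print(horn_to_implication({'~a', '~b', 'c'}))   # (['a', 'b'], 'c')
```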

Forward Chaining
Does KB (Horn clauses) entail q (a single symbol)?
–Keep track of known facts (positive literals)
–If all premises of an implication are known, add the conclusion to the set of known facts
–Repeat until no further inferences can be made or q is added

Forward Chaining
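The algorithm figure for this slide is not reproduced here, but the procedure can be sketched along the lines of the standard forward-chaining idea, keeping a count of unmet premises per rule (all names below are my own):

```python
from collections import deque

def forward_chain(rules, facts, q):
    """rules: list of (premises, conclusion) Horn implications;
    facts: known positive literals. Returns True iff q is derived."""
    count = {i: len(p) for i, (p, _) in enumerate(rules)}  # unmet premises
    inferred = set()
    agenda = deque(facts)
    while agenda:
        p = agenda.popleft()
        if p == q:
            return True
        if p in inferred:
            continue
        inferred.add(p)
        for i, (premises, conclusion) in enumerate(rules):
            if p in premises:
                count[i] -= 1
                if count[i] == 0:      # all premises known: fire the rule
                    agenda.append(conclusion)
    return False

rules = [(['L', 'G'], 'M')]            # (little ^ green) => Martian
print(forward_chain(rules, ['G', 'L'], 'M'))   # True
```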

Properties of Forward Chaining
–Sound
–Complete: all entailed atomic sentences will be derived
Data-driven reasoning
–Start with what we know
–Derive new information until we discover what we want

Backward Chaining
Start with what you want to know, a query q
–Look for implications that conclude q
–Look at the premises of those implications
–Look for implications that conclude those premises, and so on…
Goal-directed reasoning
–Can be a less complex search than forward chaining
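A minimal recursive sketch of this goal-directed search (my own illustration; the `visited` set guards against circular rules):

```python
def backward_chain(rules, facts, q, visited=None):
    """Prove q by finding a rule that concludes q and recursively
    proving all of its premises. rules: list of (premises, conclusion);
    facts: set of known positive literals."""
    visited = visited or set()
    if q in facts:
        return True
    if q in visited:                 # cycle guard for circular rules
        return False
    visited = visited | {q}
    for premises, conclusion in rules:
        if conclusion == q and all(
                backward_chain(rules, facts, p, visited) for p in premises):
            return True
    return False

rules = [(['L', 'G'], 'M')]          # (little ^ green) => Martian
print(backward_chain(rules, {'G', 'L'}, 'M'))   # True
```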

Making it fast
Example: the problem to solve is in CNF
–Is Marvin a Martian, i.e. is M == 1 (true), given:
  Marvin is green: G = 1
  Marvin is little: L = 1
  (little and green) implies Martian: (L ^ G) => M, i.e. ~(L ^ G) V M, i.e. ~L V ~G V M
–Proof by contradiction: are there true/false values for G, L, and M that are consistent with the knowledge base and Marvin not being a Martian?
  Is G ^ L ^ (~L V ~G V M) ^ ~M satisfiable, or is it empty?

Searching for variable values
Want to show that no values satisfy G ^ L ^ (~L V ~G V M) ^ ~M (the sentence is identically 0)
–Randomly consider all true/false assignments to the variables until we exhaust them all or find a match (model checking)
–(G, L, M) = (1, 0, 0)… no; (0, 1, 0)… no; (0, 0, 0)… no; (1, 1, 0)… no; …
Alternatively…
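Model checking by brute force is easy to write down for this example (a sketch using Python's itertools):

```python
from itertools import product

def sentence(G, L, M):
    # G ^ L ^ (~L V ~G V M) ^ ~M
    return G and L and ((not L) or (not G) or M) and (not M)

# Enumerate all 8 assignments: none satisfies the sentence,
# so the KB entails M (Marvin is a Martian).
models = [(G, L, M) for G, L, M in product([False, True], repeat=3)
          if sentence(G, L, M)]
print(models)   # []
```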

Backtracking Algorithm
Davis-Putnam-Logemann-Loveland Algorithm (DPLL)
–Search through possible assignments to (G, L, M) via depth-first search: (0, 0, 0) to (0, 0, 1) to (0, 1, 0)…
–Each clause of the CNF must be true: terminate consideration of a branch when any clause evaluates to false
–Use heuristics to reduce repeated computation of propositions
  Early termination
  Pure symbol heuristic
  Unit clause heuristic
–Cull branches from the search tree over G, L, M
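A compact sketch of DPLL with the three heuristics named above (this is my own minimal rendition, not the lecture's code; clauses are frozensets of '~'-prefixed string literals):

```python
def dpll(clauses, assignment=None):
    """Depth-first search over assignments with early termination,
    the unit clause heuristic, and the pure symbol heuristic.
    Returns a satisfying assignment dict, or None if unsatisfiable."""
    assignment = assignment or {}

    def value(lit):
        var = lit.lstrip('~')
        if var not in assignment:
            return None
        v = assignment[var]
        return (not v) if lit.startswith('~') else v

    simplified = []
    for clause in clauses:
        vals = [value(l) for l in clause]
        if True in vals:
            continue                   # clause already satisfied
        rest = [l for l, v in zip(clause, vals) if v is None]
        if not rest:
            return None                # early termination: clause is false
        simplified.append(rest)
    if not simplified:
        return assignment              # every clause satisfied

    for clause in simplified:          # unit clause heuristic
        if len(clause) == 1:
            lit = clause[0]
            return dpll(clauses, {**assignment,
                                  lit.lstrip('~'): not lit.startswith('~')})

    lits = {l for c in simplified for l in c}
    for lit in lits:                   # pure symbol heuristic
        var = lit.lstrip('~')
        comp = var if lit.startswith('~') else '~' + var
        if comp not in lits:
            return dpll(clauses, {**assignment,
                                  var: not lit.startswith('~')})

    var = sorted(l.lstrip('~') for l in lits)[0]   # branch and backtrack
    return (dpll(clauses, {**assignment, var: False})
            or dpll(clauses, {**assignment, var: True}))

cnf = [frozenset({'G'}), frozenset({'L'}),
       frozenset({'~L', '~G', 'M'}), frozenset({'~M'})]
print(dpll(cnf))   # None -- unsatisfiable, so the KB entails M
```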

Searching for variable values
Other ways to search for a (G, L, M) assignment satisfying G ^ L ^ (~L V ~G V M) ^ ~M
Simulated Annealing (WalkSAT)
–Start with an initial guess, e.g. (0, 1, 1)
–With each iteration, pick an unsatisfied clause and flip one symbol in that clause
–Evaluation metric is the number of clauses that evaluate to true
–Move “in the direction” of guesses that cause more clauses to be true
–Many local minima, so use lots of randomness
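A WalkSAT-style local search can be sketched as follows (my own illustration; the parameter names and the 50/50 random-vs-greedy split are assumptions of the sketch):

```python
import random

def walksat(clauses, max_flips=1000, p=0.5, seed=0):
    """Local search: start from a random assignment, then repeatedly pick
    an unsatisfied clause and flip one of its symbols -- randomly with
    probability p, otherwise the flip that satisfies the most clauses.
    Returns a model, or None after max_flips."""
    rng = random.Random(seed)
    symbols = sorted({l.lstrip('~') for c in clauses for l in c})
    model = {s: rng.choice([True, False]) for s in symbols}

    def sat(clause, m):
        return any((m[l[1:]] is False) if l.startswith('~') else m[l]
                   for l in clause)

    def num_sat(m):
        return sum(sat(c, m) for c in clauses)

    for _ in range(max_flips):
        unsatisfied = [c for c in clauses if not sat(c, model)]
        if not unsatisfied:
            return model
        clause = rng.choice(unsatisfied)
        candidates = sorted(l.lstrip('~') for l in clause)
        if rng.random() < p:           # random walk step
            var = rng.choice(candidates)
        else:                          # greedy step: maximize satisfied clauses
            var = max(candidates,
                      key=lambda v: num_sat({**model, v: not model[v]}))
        model[var] = not model[var]
    return None

kb = [frozenset({'G'}), frozenset({'L'}), frozenset({'~L', '~G', 'M'})]
print(walksat(kb))
```

Note that returning None after max_flips does not prove unsatisfiability; that is exactly the termination caveat discussed on the next slide.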

WalkSAT termination
How do you know when simulated annealing is done?
–No way to know with certainty that an answer is not possible
  Could have been bad luck
  Could be there really is no answer
–Establish a max number of iterations and go with the best answer found by that point

So how well do these work?
Think about it this way:
–If 16 of the 32 possible assignments are models of (satisfy) a sentence, then on average 2 random guesses should find a solution
–WalkSAT and DPLL should work quickly on such sentences

What about more clauses?
If the number of symbols (variables) stays the same, but the number of clauses increases:
–More ways for an assignment to fail (on any one clause)
–More searching through possible assignments is needed
Let's create a ratio, m/n, measuring #clauses / #symbols
–We expect large m/n to cause slower solutions

What about more clauses?
Higher m/n means fewer assignments will work
–If fewer assignments work, it is harder for DPLL and WalkSAT

Combining it all: 4x4 Wumpus World
The “physics” of the game:
–At least one wumpus on the board
–At most one wumpus on the board (for any two squares, one is free): n(n-1)/2 rules like ~W_1,1 V ~W_1,2
Total of 155 sentences containing 64 distinct symbols

Wumpus World
Inefficiencies as the world becomes large
–The knowledge base must expand if the size of the world expands
Preferred to have sentences that apply to all squares
–We brought this subject up last week
–The next chapter addresses this issue

Chapter 8: First-Order Logic

What do we like about propositional logic?
It is:
–Declarative: relationships between variables are described, and there is a method for propagating those relationships
–Expressive: can represent partial information using disjunction
–Compositional: if A means foo and B means bar, A ^ B means foo and bar

What don’t we like about propositional logic?
It lacks the expressive power to describe the environment concisely
–Separate rules are needed for every square/square relationship in the Wumpus world

Natural Language
English appears to be expressive
–“Squares adjacent to pits are breezy”
But natural language is a medium of communication, not a knowledge representation
–Much of the information and logic conveyed by language depends on context
–Information exchange is not well defined
–It is not compositional (combining sentences may mean something different than the parts)
–It is ambiguous

But we borrow representational ideas from natural language
Natural language syntax:
–Nouns and noun phrases refer to objects (people, houses, cars)
–Properties and verbs refer to relations among objects (red, round, nearby, eaten)
–Some relationships are clearly defined functions, where there is only one output for a given input (best friend, first thing, plus)
We build first-order logic around objects and relations

Ontology
An ontology is a “specification of a conceptualization”
–A description of the objects and relationships that can exist
  Propositional logic had only true/false relationships; first-order logic has many more
–The ontological commitment of languages differs
  How much can you infer from what you know?
  Temporal logic defines additional ontological commitments because of timing constraints