1 Knowledge Representation

2 Key Issue in AI
1. Mapping between objects and relations in a problem domain and computational objects and relations in a program.
2. Results of inferences on the knowledge base (KB) should correspond to the results of actions or observations in the world.

3 Have Already Examined Two (Related) KR Schemes
- First-Order Predicate Logic
- Production Systems

4 We'll Look at Four Others
1. Semantic Nets
2. Conceptual Dependency Schemes
3. Frames
4. Scripts
1 & 2 are called network schemes; 3 & 4 are called structured schemes (alternatively, slot-and-filler schemes).

5 Problems with FOPL
1. Emphasis is on truth-preserving relations
2. Sometimes at odds with the way that humans acquire and use knowledge
3. Leads to problems in mapping human language to FOPL

6 For Example
"If … then" in English suggests causality, but in FOPL it specifies only a relationship between the truth values of the antecedent and the consequent:
(2 + 2 = 5) → color(elephants, green)
This is true (the antecedent is false, so the implication holds vacuously), but it has no common-sense meaning.

7 Categorical Interlude
Category:
- A group of objects that seem to go together
- Because they have significant attributes in common
- Example: DOG

8 Categories allow us to use our finite mental resources efficiently
- When identifying an object, we can abstract key attributes from all the sensory information presented to us.
- I am trying to determine whether that flying object is a bird or a wasp.
- I don't care that robins have orange breasts and sparrows have grey ones.
- What matters are those attributes of the category bird that exclude instances of the category wasp.

9 Categories license inductive inferences
- Most birds pose no threat to humans
- Common wasps do
- Inference from the category wasp tells us to avoid its members

10 Gelman & Markman's Experiment
Children were
- Shown a picture of a fish and told that it breathes under water
- Shown a picture of a dolphin and told that it breathes by jumping out of the water
- Shown a picture of a shark and told that it is a fish (though it looks like a dolphin)
- Then asked how the shark breathes
They answered, "Under water."

11 Semantic Nets
- Proposed by Quillian in the late 1960s
- Try to provide a formalism that captures taxonomic hierarchies
- A graph where
  - Nodes are categories
  - Arcs are of three types:
    - Isa links, indicating a subset relationship (a dog isa mammal)
    - Inst links, indicating an element-set relationship (mazel is a dog)
    - Attribute links, indicating a property held by a category (simcha is grey)

12 Example
[Diagram: a semantic net whose nodes include thing, animate thing, inanimate thing, animal, plant, ponderosa pine, furniture, table, block, Table_1, and Block_1, with attribute values such as green, cubic, and legs; the links are labelled isa, instance_of, supported_by, color, and shape.]

13 In (what else?) Prolog
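
A minimal sketch, in Prolog, of how the slide-12 net might be encoded; the predicate names (isa/2, instance_of/2, attribute/3, has_attribute/3) and the particular attribute attachments are assumptions, not the slide's original code.

% Taxonomic links: isa = subset of a category, instance_of = element of one.
isa(animate_thing, thing).
isa(inanimate_thing, thing).
isa(animal, animate_thing).
isa(plant, animate_thing).
isa(ponderosa_pine, plant).
isa(furniture, inanimate_thing).
isa(block, inanimate_thing).
isa(table, furniture).

instance_of(table_1, table).
instance_of(block_1, block).

% Attribute links (which node each attribute hangs off is an assumption).
attribute(table, supported_by, legs).
attribute(block_1, shape, cubic).
attribute(ponderosa_pine, color, green).

% An object has an attribute if it is stated directly or inherited from
% any category the object belongs to.
has_attribute(X, Attr, Val) :- attribute(X, Attr, Val).
has_attribute(X, Attr, Val) :- instance_of(X, Cat), has_attribute(Cat, Attr, Val).
has_attribute(Cat, Attr, Val) :- isa(Cat, Super), has_attribute(Super, Attr, Val).

% Example query: ?- has_attribute(table_1, supported_by, What).  gives What = legs.

This is the inheritance mentioned on the next slide: table_1 gets supported_by = legs from table without that fact being stated for table_1 itself.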

14 Strengths of Semantic Nets
1. Provides for inheritance
2. Organizes knowledge using interconnected concepts
3. Lets us discover relationships between pairs of concepts (block_1 and table_1 are both inanimate things and are supported by legs)

15 Weaknesses of Semantic Nets
1. Generality of the attribute links
2. As the task grows in complexity, so does the representation
3. No systematic basis for structuring semantic relationships
4. Puts the burden of constructing facts & links on the programmer

16 Key Issue
- Isolation of primitives for semantic network languages
- Primitives are those things that the interpreter is programmed in advance to understand
- We need a more systematic basis for structuring semantic relationships

17 Case Structure Grammars
- C. J. Fillmore, 1968
- Verb-oriented (as opposed to concept-oriented)
- Sentences are represented as verb nodes with links to the specific roles played by nouns and noun phrases
- Important links:
  - Agent
  - Object
  - Instrument
  - Location
  - Time

18 "Mary caught the ball with her glove."
[Diagram: a case-frame graph centered on the verb node catch (time: past), with an agent link to Mary, an object link to ball, and an instrument link to glove.]
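
As a rough illustration (not from the slides), the case frame above could be stored as Prolog facts keyed by an event identifier; the identifier e1 and the predicate names verb/2, time/2, agent/2, object/2, and instrument/2 are assumptions.

% Case frame for "Mary caught the ball with her glove."
verb(e1, catch).
time(e1, past).
agent(e1, mary).
object(e1, ball).
instrument(e1, glove).

% Example query: ?- agent(E, Who), verb(E, catch).  gives Who = mary.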

19 Advantages
- The representational language captures some of the deep structure of natural languages (e.g., the relationship between a verb and its subject is the agent relationship)
- This deep structure is independent of any sentence or even of any particular language

20 Leading To
- Conceptual Dependency Theory
  - Associated with Roger Schank (then of Yale, most recently of Northwestern)
  - Attempts to model the semantic structure of natural language
  - Attempts to provide a canonical form for the meaning of sentences
  - That is, all sentences that mean the same thing (whatever that means) will be represented internally by identical graphs
  - The idea is to parse two sentences that use different words but mean the same thing into identical internal representations
  - Example: John gave the book to Mary. / Mary was given the book by John.

21 Primitives in CD Theory
- ACTs – actions
- PPs – picture producers
- AAs – modifiers of actions (action aiders)
- PAs – modifiers of objects (picture aiders)

22 Further Breakdown
Each of these classes has a well-defined number of primitives (Luger, pp. 236-37). All ACTs (actions) can be reduced to:
1. ATRANS – transfer a relationship (give)
2. PTRANS – transfer a physical location (go)
3. PROPEL – apply physical force (push)
4. MOVE – move a body part by its owner (kick)
…
12. ATTEND – focus a sense organ (listen)

23 Yet More
[The arrow glyphs on this slide were lost in the transcript; only their descriptions remain.]
- An arrow indicates the direction of dependency
- p indicates past tense; f indicates future
- A two-way link between a PP and an ACT indicates the agent-verb relationship
- An arrow labelled o indicates the object of an action
- The instrument link is drawn as an arrow pointing left

24 [Diagram: the recipient of an action, shown as an ACT linked by an R arrow to two PPs.]

25 "John gave the book to Mary."
[Diagram: the CD graph: John linked to ATRANS (tense p), with an object link to book and a recipient (R) link whose branches point to Mary and John.]
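
A minimal sketch (not from the slides) of that CD graph as a single Prolog term; the functor and argument names are assumptions.

% "John gave the book to Mary." reduced to the ATRANS primitive.
cd(atrans, [actor(john),
            object(book),
            recipient(to(mary), from(john)),
            tense(past)]).

% A paraphrase such as "Mary was given the book by John." would be parsed
% to this same term; that identity is the canonical form at work.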

26 Basic Idea
1. Parse the sentence
2. Fit it into canonical form
3. Group sentences with similar meanings

27 Strengths
1. Provides a formal theory of language semantics
2. Reduces the problem of ambiguity
3. Attempts to reduce the complexity of natural language by grouping sentences of similar meaning together

28 Weaknesses
1. Reduction is not computable in polynomial time
2. No evidence that humans store knowledge in canonical forms
3. Does not address the difficult issue of meaning in discourse
Example: Bill and John always walk home together. One afternoon, Bill said to John, "Let's leave early." In effect, he asked him to go along with his plan of playing hooky. What are the referents of these three pronouns?

29 Canonical Sentences Lead to Canonical Events
- NLP programs must use a large amount of background knowledge
- There is evidence that we organize this information into structures corresponding to typical situations
- Example: if we read a story about baseball, we resolve any ambiguities in the text in a way consistent with baseball

30 Example
1. City Council refused to give the demonstrators a permit because they feared violence.
2. City Council refused to give the demonstrators a permit because they advocated revolution.
Background knowledge lets us determine the correct referent of "they" in each case.

31 Script
- A structural representation that describes a stereotypical sequence of events in a particular context
- May be viewed as a causal chain

32 Components
1. Entry conditions: must be satisfied before the script is activated
2. Result: things that will be true after the script completes
3. Props: slots representing objects that are involved in the events of the script
4. Roles: slots representing people involved in the events of the script
5. Track: a specific variation on a general pattern
6. Scene: the actual sequence of events

33 Notice
- Entry conditions/result are pre/post conditions
- Props and roles are data structures
- Track is overloading
- Scene is an algorithm

34 Example John went to a restaurant last night. He ordered penne arrabiata. When he paid, he noticed he was running out of money. He hurried home, since it had started to rain. Question: Did he eat?

35 In Action
- Activate script: Restaurant
- Roles:
  - S = Customer
  - W = Waiter
- Props:
  - F = Food
- Scenes:
  - Entering: S PTRANS S into restaurant
  - Ordering: …
  - Eating: S INGEST F
  - Exiting: S ATRANS money to W
- Result: the answer to the question is yes
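
A minimal sketch (not from the slides) of the restaurant script in Prolog, with one rule that answers the slide-34 question: the story reports a later scene (John paid, i.e. exiting), so every earlier scene, including eating, is inferred to have happened. All predicate names here are assumptions.

:- use_module(library(lists)).   % append/3, member/2

% The scenes of the script, in causal order.
scene_order(restaurant, [entering, ordering, eating, exiting]).

% What happens in each scene (CD-style primitives, simplified).
scene(restaurant, entering, ptrans(customer, restaurant)).
scene(restaurant, ordering, mtrans(customer, waiter, order)).
scene(restaurant, eating,   ingest(customer, food)).
scene(restaurant, exiting,  atrans(customer, waiter, money)).

% The story told us John paid, so the exiting scene was observed.
observed(restaurant, exiting).

% A scene is inferred if it was observed or precedes an observed scene.
inferred(Script, Scene) :-
    observed(Script, Seen),
    scene_order(Script, Order),
    precedes_or_equal(Scene, Seen, Order).

precedes_or_equal(S, S, _).
precedes_or_equal(S, Later, Order) :-
    append(_, [S | Rest], Order),
    member(Later, Rest).

% ?- inferred(restaurant, eating).   succeeds, so the answer is yes.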

36 Frames
- Associated with Marvin Minsky
- Semantic nets informally represent
  - Inheritance, through isa links
  - Relationships among entities
- Frames
  - Are a more structured form of semantic net
  - Assign structure to nodes as well as links
  - Definition: a frame is a collection of attributes (slots) and associated values (along with constraints) that describe something in the world
- Each frame either
  - Represents a set of items (isa) with given properties that are inherited by its members, or
  - Represents an instance (inst) of a class of items with given properties, some of which are inherited

37 Example Semantic Net
[Diagram: a semantic net with isa links from person down through male and ML baseball player to pitcher and outfielder, inst links to the instances Koufax (Dodgers) and Mays (Giants), and attribute links for handedness (right), height (6-1, 5-10), and batting average (.106, .262, .253).]

38 Transformed to a Frame
Issue: some attributes are to be inherited, while others refer only to the frame itself. Person has both a cardinality (8,000,000,000) and a locomotion (biped); only locomotion is to be inherited, which we indicate with an *.

39 Each of these is a frame with three slots:
Person
- isa: Mammal
- card: 8,000,000,000
- *handed: right
Male
- isa: Person
- card: 4,000,000,000
- *height: 5-10

40 Baseball Player
- isa: Male
- card: 624
- *height: 6-1
- *avg: .252
- *team:
- *uniform color:

41 Slots
- Have inherited default values
- Can be structured objects, with
  - Frames to which they can be attached (*avg makes sense for a baseball player but not for a water fowl)
  - Constraints on values (0 <= avg <= 1)
  - A default value
  - Rules for computing a value separate from inheritance
  - Whether the slot is single- or multi-valued
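
A minimal sketch (not from the slides) of the slide 39-41 frames in Prolog, where the inheritable (*) slots become frame_slot/3 facts and value/3 performs default lookup with shadowing; the predicate names are assumptions.

% Class hierarchy and one instance (Koufax, from the slide-37 net).
isa(male, person).
isa(baseball_player, male).
instance_of(koufax, baseball_player).

% Inheritable (*) slot defaults attached to each frame.
frame_slot(person,          handed, right).
frame_slot(male,            height, '5-10').
frame_slot(baseball_player, height, '6-1').
frame_slot(baseball_player, avg,    0.252).

% A slot value comes from the frame itself if present (the cut stops the
% search); otherwise it is inherited from the instance's class or the
% frame's ancestors, so more specific frames shadow more general defaults.
value(Frame, Slot, Val) :- frame_slot(Frame, Slot, Val), !.
value(Inst,  Slot, Val) :- instance_of(Inst, Class), value(Class, Slot, Val).
value(Frame, Slot, Val) :- isa(Frame, Super), value(Super, Slot, Val).

% ?- value(koufax, height, H).   gives H = '6-1'  (baseball_player shadows male)
% ?- value(koufax, handed, H).   gives H = right  (inherited from person)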

