Conceptual Dependency Theory


Schank and students – 1968
- An interlingua form
- A limited set of primitives
- Frame based

Primitives
- PP – picture producers
- ACT – actions
- LOC – locations
- T – time
- PA – attributes of PPs

Conceptual Syntax

Physical ACTs
- PTRANS – physical transfer of location; typically occurs with another physical ACT (such as MOVE or PROPEL) as its instrument
- PROPEL
- MOVE
- GRASP
- INGEST
- EXPEL
- SPEAK

Mental ACTs
- MTRANS – transfer of mental information
- ATTEND – focus a sense organ on a stimulus
- MBUILD – build new information from old

CD has Mental States
- MLOC – mental locations
- CP – conscious processor
- IM – intermediate memory
- LTM – long-term memory
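As an illustrative encoding (not taken from the slides), "Joe told Irving where the water was" is an MTRANS in which information moves from Joe's long-term memory to Irving's conscious processor:

(mtrans (actor (person (name (joe))))
        (object (loc (actor (water)) (val (river))))
        (from   (mloc (ltm (person (name (joe))))))
        (to     (mloc (cp  (person (name (irving)))))))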

Social ACTs
- ATRANS – abstract transfer (of possession, ownership, or control)

States
- Color(red)
- Height(6 feet)
- Health()
- Anger()
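In the same list notation used by Micro-ELI below, and purely as an illustration (this example is not from the slides), "John gave Mary a book" is an ATRANS, and a state is an attribute value attached to a picture producer; Health is conventionally a scale from -10 (dead) to +10:

(atrans (actor  (person (name (john))))
        (object (book))
        (from   (person (name (john))))
        (to     (person (name (mary)))))

(person (name (john)) (health (-10)))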

Micro-ELI – a CD analyzer: it parses English sentences into CD forms.

Input: (JACK WENT TO THE STORE)

CD form:
(PTRANS (ACTOR (PERSON (NAME (JACK))))
        (OBJECT (PERSON (NAME (JACK))))
        (TO (STORE)))

This is a frame:
  PTRANS
    Actor:  Person, name = Jack
    Object: Person, name = Jack
    To:     Store
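A CD form like this is just a nested list: the head is the primitive and the tail is a list of (role filler) pairs, so a slot can be read with a small helper. The function below is not part of Micro-ELI; it is a minimal sketch of how such frames can be accessed:

;; Return the filler of ROLE in the CD form CD, or NIL if the role is absent.
(defun role-filler (cd role)
  (second (assoc role (rest cd))))

;; (role-filler '(ptrans (actor (person (name (jack)))) (to (store))) 'to)
;; => (STORE)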

Micro-ELI Internals

;;; Dictionary functions: DEFWORD stores a word's definition (a list of
;;; request packets) on the word's property list and returns the word.
(defmacro defword (&body def)
  `(progn (setf (get ',(car def) 'definition) ',(cdr def))
          ',(car def)))

;; Example vocabulary item: the noun JACK builds a PERSON picture producer
;; and marks the current constituent as a noun phrase.
(defword jack
  ((assign *cd-form* '(person (name (jack)))
           *part-of-speech* 'noun-phrase)))
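Given DEFWORD above, a word's stored packets can be fetched back from its property list when the parser reads that word; WORD-DEFINITION is a hypothetical helper, not part of the Micro-ELI code shown:

;; Retrieve the list of request packets that DEFWORD stored for WORD
;; (NIL for an unknown word).
(defun word-definition (word)
  (get word 'definition))

;; (word-definition 'jack) returns the single packet defined above.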

A CD form

;; WENT builds a PTRANS frame: the actor and object come from the subject,
;; and later requests watch for "to <noun phrase>" or "home" to fill the
;; destination slot.
(defword went
  ((assign *part-of-speech* 'verb
           *cd-form* '(ptrans (actor ?go-var1)
                              (object ?go-var1)
                              (to ?go-var2)
                              (from ?go-var3))
           go-var1 *subject*
           go-var2 nil
           go-var3 nil)
   (next-packet
     ((test (equal *word* 'to))
      (next-packet
        ((test (equal *part-of-speech* 'noun-phrase))
         (assign go-var2 *cd-form*))))
     ((test (equal *word* 'home))
      (assign go-var2 '(house))))))
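The ?go-var symbols in the PTRANS pattern are placeholders that the ASSIGN requests later fill, for example go-var1 with the subject's CD form. The function below is not Micro-ELI code; it is a minimal sketch, assuming bindings are kept in an alist keyed by the ?-prefixed symbols, of how such a pattern could be instantiated:

;; Replace ?variables in PATTERN with their values from the alist BINDINGS;
;; unbound variables are left in place.
(defun instantiate (pattern bindings)
  (cond ((and pattern
              (symbolp pattern)
              (char= (char (symbol-name pattern) 0) #\?))
         (let ((binding (assoc pattern bindings)))
           (if binding (cdr binding) pattern)))
        ((consp pattern)
         (cons (instantiate (car pattern) bindings)
               (instantiate (cdr pattern) bindings)))
        (t pattern)))

;; For "Jack went to the store":
;; (instantiate '(ptrans (actor ?go-var1) (object ?go-var1) (to ?go-var2))
;;              '((?go-var1 . (person (name (jack)))) (?go-var2 . (store))))
;; => (PTRANS (ACTOR (PERSON (NAME (JACK)))) (OBJECT (PERSON (NAME (JACK)))) (TO (STORE)))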

Micro Tale-Spin
Builds a story using a story representation and world knowledge.

A Story

;; irving kills joe.
;; The story gives the main character (irving), an initial state (irving is
;; thirsty), and the characters' initial beliefs about one another.
(defvar *story2*
  '(irving
    thirsty
    (irving (like     (actor joe)    (to irving) (mode (neg))))
    (irving (dominate (actor joe)    (to irving) (mode (neg))))
    (irving (deceive  (actor joe)    (to irving) (mode (pos))))
    (irving (like     (actor irving) (to joe)    (mode (neg))))
    (joe    (deceive  (actor irving) (to joe)    (mode (neg))))))

The Output

Once upon a time ...
JOE WAS NEAR THE CAVE.
JOE KNEW THAT JOE WAS NEAR THE CAVE.
IRVING WAS NEAR THE OAK-TREE.
IRVING KNEW THAT IRVING WAS NEAR THE OAK-TREE.
JOE KNEW THAT IRVING WAS NEAR THE OAK-TREE.
THE WATER WAS NEAR THE RIVER.
JOE KNEW THAT THE WATER WAS NEAR THE RIVER.
THE HONEY WAS NEAR THE ELM-TREE.
IRVING KNEW THAT THE HONEY WAS NEAR THE ELM-TREE.
THE WORM WAS NEAR THE GROUND.
JOE KNEW THAT THE WORM WAS NEAR THE GROUND.
IRVING KNEW THAT JOE WAS NEAR THE CAVE.
THE FISH WAS NEAR THE RIVER.
IRVING KNEW THAT THE FISH WAS NEAR THE RIVER.
IRVING THOUGHT THAT JOE DID NOT LIKE IRVING.
IRVING THOUGHT THAT JOE DID NOT DOMINATE IRVING.
IRVING THOUGHT THAT JOE DECEIVED IRVING.
IRVING THOUGHT THAT IRVING DID NOT LIKE JOE

JOE THOUGHT THAT IRVING DID NOT DECEIVE JOE.
One day, IRVING WAS THIRSTY .
IRVING WANTED NOT TO BE THIRSTY .
IRVING WANTED TO BE NEAR THE WATER.
IRVING WANTED TO KNOW WHERE THE WATER WAS .
IRVING WANTED JOE TO TELL IRVING WHERE THE WATER WAS .
IRVING WANTED JOE TO THINK THAT IF JOE WOULD NOT TELL IRVING WHERE THE WATER WAS THEN IRVING WOULD STRIKE JOE.
IRVING WANTED TO BE NEAR JOE.
IRVING WENT TO THE CAVE.
IRVING WAS NEAR THE CAVE.
IRVING TOLD JOE THAT IF JOE WOULD NOT TELL IRVING WHERE THE WATER WAS THEN IRVING WOULD STRIKE JOE.
IRVING STRUCK JOE.
JOE WAS NOT ALIVE .
IRVING THOUGHT THAT JOE WOULD NOT TELL IRVING WHERE THE WATER WAS .
IRVING THOUGHT THAT IRVING DID NOT KNOW WHERE THE WATER WAS .
IRVING THOUGHT THAT IRVING WAS NOT NEAR THE WATER.
IRVING KNEW THAT IRVING WOULD BE THIRSTY .
The end.

World Knowledge

;; Initial facts: where the characters and objects are, and who knows it.
((world (loc (actor joe)    (val cave)))
 (joe   (loc (actor joe)    (val cave)))
 (world (loc (actor irving) (val oak-tree))))

;; Static properties of the characters, stored on their property lists.
(put 'joe 'is-a 'bear)
(put 'joe 'home 'cave)
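PUT is not a standard Common Lisp function; in the micro programs it is usually just a thin wrapper over property lists. A minimal sketch consistent with the calls above:

;; Store VALUE under PROPERTY on SYMBOL's property list.
(defun put (symbol property value)
  (setf (get symbol property) value))

;; After (put 'joe 'is-a 'bear), (get 'joe 'is-a) returns BEAR.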

CD: A Critique
- World knowledge holds implicit relationships, e.g. "hit" implies "touched".
- Are these the right primitives? Inference often needs higher-level concepts.
- Semantic parsing didn't pan out: lexical structure adds detail, and just looking for slots to fill was not good enough.

CD: Final Thoughts
- Without being prescriptive, CD can be used as a knowledge representation.
- The set of CD primitives needs extension.
- Make more use of parsing rather than relying on slot filling alone.