Lectures on Artificial Intelligence – CS289 Conceptual Graphs


Lectures on Artificial Intelligence – CS289 Conceptual Graphs 18th September 2006 Dr Bogdan L. Vrusias b.vrusias@surrey.ac.uk

Contents
- Definition of Conceptual Graphs
- Basic building blocks
- Concept node representation
- Exercise

Definition of Conceptual Graphs
John Sowa, formerly of IBM, is one of the key proponents of conceptual graphs (CG). Sowa's project is to create "a system of logic for representing natural language semantics". Conceptual graphs form a knowledge representation language grounded in linguistics, psychology and philosophy on the one hand, and in data structures and data processing techniques on the other.

Definition of Conceptual Graphs
The main aim is to map perception onto an abstract representation and reasoning system. A conceptual graph consists of concept nodes and relation nodes:
- Concept nodes represent entities, attributes, states, and events.
- Relation nodes show how the concepts are interconnected.

Conceptual Graphs: Basic Structure ("The cat sat on the mat") Rules for assembling percepts Words Percepts Grammar Rules CAT STAT SIT LOC MAT PS: percepts are fragments of images that fit together like pieces of a jigsaw puzzle 18th September 2006 Bogdan L. Vrusias © 2006

Conceptual Graphs: Basic Structure
Alternative notation for text-based representation:
[cat] --> (stat) --> [sit] --> (loc) --> [mat]
Square brackets denote concept nodes. Parentheses denote relation nodes.
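
As an illustration only, here is a minimal Python sketch of the two node kinds (the class names and the printing loop are assumptions for this sketch, not part of Sowa's notation), building the graph above and printing it in the linear form:

```python
# Minimal sketch: a conceptual graph as two kinds of nodes.
class Concept:
    """A concept node, written [type] in the linear notation."""
    def __init__(self, ctype):
        self.ctype = ctype
    def __repr__(self):
        return f"[{self.ctype}]"

class Relation:
    """A relation node, written (label), whose arcs point at concept nodes."""
    def __init__(self, label, *args):
        self.label = label
        self.args = args          # arcs, in order 1..n
    def __repr__(self):
        return f"({self.label})"

# "The cat sat on the mat": [cat] --> (stat) --> [sit] --> (loc) --> [mat]
cat, sit, mat = Concept("cat"), Concept("sit"), Concept("mat")
graph = [Relation("stat", cat, sit), Relation("loc", sit, mat)]

for r in graph:
    print(f"{r.args[0]} --> {r} --> {r.args[1]}")
# [cat] --> (stat) --> [sit]
# [sit] --> (loc) --> [mat]
```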

A Graph-Theoretic Definition
Conceptual graphs are finite, connected, bipartite graphs:
- Finite: any graph (in a 'human brain' or in 'computer storage') can only have a finite number of concepts and conceptual relations.
- Connected: two parts that are not connected would simply be called two conceptual graphs.
- Bipartite: there are two different kinds of nodes, concepts and conceptual relations, and every arc links a node of one kind to a node of the other kind.
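
These two structural properties are easy to test mechanically. Below is a small Python sketch, assuming a plain adjacency representation in which each node is tagged 'concept' or 'relation'; the node names come from the cat/mat example and everything else is illustrative:

```python
# Check the bipartite and connected properties of a (tiny) conceptual graph.
from collections import deque

kind = {"cat": "concept", "sit": "concept", "mat": "concept",
        "stat": "relation", "loc": "relation"}
edges = [("cat", "stat"), ("stat", "sit"), ("sit", "loc"), ("loc", "mat")]

# Bipartite: every arc must join a concept node to a relation node.
bipartite = all(kind[a] != kind[b] for a, b in edges)

# Connected: breadth-first search from any node must reach every node.
adj = {n: set() for n in kind}
for a, b in edges:
    adj[a].add(b)
    adj[b].add(a)
seen, queue = {"cat"}, deque(["cat"])
while queue:
    for nxt in adj[queue.popleft()] - seen:
        seen.add(nxt)
        queue.append(nxt)
connected = (seen == set(kind))

print(bipartite, connected)   # True True
```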

Perception
‘Perception is the process of building a working model that represents and interprets sensory input.’ The reception of sensory input, ‘a mosaic of percepts’, is converted into concepts:
- Concrete concepts, which have associated percepts.
- Abstract concepts, which do not have any associated percepts.

Perception
For Sowa, a sensory icon is matched in an ideal brain to a single percept or to a collection of percepts, which are combined to form a complete image: an interconnected set of percepts. Percepts are combined in the brain and their interconnections are stored as a conceptual graph.

Conceptual Graphs Example
Consider the sentence: "A cat sitting on a mat". This sentence can be interpreted at different levels:
- There are concrete concepts: cat, mat and sitting, which enable us to experience the external world and the motor mechanisms to react to it.
- The words of our natural language, arranged in accordance with the grammar of the language, are one way of articulating and disseminating the experience.

Conceptual Graphs Example
Each of the concepts in the sentence belongs to, or can be related to, a category or class: Animal > Cat; Furniture > Mat; Posture > Sit; Living Being > Animal; Household Objects > Furniture; Act > Posture. Thus, with increasing abstraction:
Cat – Sit – Mat
Animal – Posture – Furniture
Living Being – Act – Household Object
A hierarchy of concept types defines the relationship between concepts at different levels of generality.
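
The type hierarchy can be written down directly as a parent map with a subsumption test. A minimal Python sketch, assuming the hierarchy given on this slide (the function name is illustrative):

```python
# Concept type hierarchy as a child -> parent map, with a subsumption test.
supertype = {
    "cat": "animal", "animal": "living being",
    "mat": "furniture", "furniture": "household object",
    "sit": "posture", "posture": "act",
}

def subsumes(general, specific):
    """True if `general` is `specific` itself or one of its supertypes."""
    while specific is not None:
        if specific == general:
            return True
        specific = supertype.get(specific)
    return False

print(subsumes("animal", "cat"))      # True:  cat < animal < living being
print(subsumes("furniture", "cat"))   # False: cat is not a kind of furniture
```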

Conceptual Graphs Example
The concepts cat–sit–mat are related to each other in that:
- It is a common observation that some animate objects do sit on certain concrete objects.
- Even if we had never seen a cat sitting on a mat, we may derive the conceptual graph on the basis of observation.
- The order of the concrete concepts is important: were we to say mat–sit–cat, it would be difficult to match this stated percept with a conceptual graph in the ideal brain.
Formation rules determine how each type of concept may be linked to conceptual relations.

Conceptual Graphs Example
The above sentence relates to an episode, or to some context to which it is relevant. Each episode may have some deeper mental associations, like emotions. When we ask the question "what is the cat doing?", the answer is that the cat is sitting and that its current location is the mat. The cat's STATe, its current ACTivity, and its LOCation may each be related to a procedure of some type.

Conceptual Relations
Concepts are linked by conceptual relations to form a conceptual graph. If a conceptual relation has n arcs, it is said to be n-adic, and its arcs are labelled 1, 2, …, n.

Example
Consider the sentence: "Mary gave John the boring book authored by Tom & Jerry". There are three main parts, (1), (2) and (3), taken in turn below.

Example
(1): Mary gave John the boring book authored by Tom & Jerry
[person: Mary] <-- (agent) <-- [give] --> (recipient) --> [person: John]
Each of the two relation nodes has two arcs and is referred to as expressing a 2-ary or binary relation between two concepts.

Example
(2): Mary gave John the boring book authored by Tom & Jerry
[book] <-- (boring)
The relation node has only one arc and thus expresses a 1-ary or unary relation.

Example
(3): Mary gave John the boring book authored by Tom & Jerry
(author) links [book], [person: Tom] and [person: Jerry] through its three arcs.
The relation node has three arcs and is referred to as expressing a 3-ary or ternary relation.
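
The three fragments can be gathered into one list. In the minimal Python sketch below, the (label, arcs) tuple layout is an assumption for illustration; the arity is simply the number of arcs:

```python
# Relations of the example, from unary to ternary.
relations = [
    ("agent",     ("give", "person: Mary")),                  # binary
    ("recipient", ("give", "person: John")),                  # binary
    ("boring",    ("book",)),                                 # unary
    ("author",    ("book", "person: Tom", "person: Jerry")),  # ternary
]

for label, arcs in relations:
    print(f"({label}) is {len(arcs)}-adic; arcs labelled 1..{len(arcs)}: {arcs}")
```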

Formal Conceptual Relations
Between Entity:*x and Entity:*y:
- accompaniment (ACCM)
- attribute (ATTR)
- characteristic (CHRC)
- content (CONT)
- part (PART)
- possession (POSS)
- support (SUPP)
Between Event (Act) and Attribute:
- manner (MANR)

Formal Conceptual Relations
Between Event (Act) and Entity:
- result (RSLT)
- source (SOUR)
Between Event (Act) and Entity (Animate):
- agent (AGNT)
- recipient (RCPT)
Between Event (Act) and Entity (Place):
- destination (DEST)
- path (PATH)
Between Event (Act) and Entity (Substance):
- material (MATR)
Between Function and Data:
- argument (ARG)
Between State:*x and State:*y:
- causation (CAUS)
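
These signatures are easy to carry around as a lookup table. A minimal Python sketch summarising the two lists above (the dictionary layout and the helper name are illustrative):

```python
# Formal relation labels mapped to the expected types of their two arcs.
signatures = {
    "ACCM": ("Entity", "Entity"),   "ATTR": ("Entity", "Entity"),
    "CHRC": ("Entity", "Entity"),   "CONT": ("Entity", "Entity"),
    "PART": ("Entity", "Entity"),   "POSS": ("Entity", "Entity"),
    "SUPP": ("Entity", "Entity"),   "MANR": ("Event", "Attribute"),
    "RSLT": ("Event", "Entity"),    "SOUR": ("Event", "Entity"),
    "AGNT": ("Event", "Animate"),   "RCPT": ("Event", "Animate"),
    "DEST": ("Event", "Place"),     "PATH": ("Event", "Place"),
    "MATR": ("Event", "Substance"), "ARG":  ("Function", "Data"),
    "CAUS": ("State", "State"),
}

def expected_types(label):
    """Return the expected arc types for a relation label, if known."""
    return signatures.get(label, ("unknown",))

print(expected_types("AGNT"))   # ('Event', 'Animate')
```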

Concept Nodes
Recall that in the discussion of Collins and Quillian's semantic networks we found that these networks were logically inadequate. This was not resolved in some of the subsequent formulations of semantic networks. Specifically, it was difficult in a typical semantic network notation to distinguish between nodes describing:
- classes and subclasses
- classes and members

Concept Nodes
In the sentence "Tom is a cat, a feline mammal":
Tom (individual) --is_a--> cat (species) --is_a--> feline (subclass) --is_a--> mammal (class)
The single relation "is_a" is here being used to describe relationships between concepts that are of quite different kinds.

Concept Nodes
A good representation should allow us to distinguish between:
- Individuals and species
- Species and classes
- Classes and subclasses
Individuals may have properties that should not influence their belonging to a subclass: that "Tom is a brown tabby" should not influence the observation that "a tabby cat is a kind of cat".

Concept Nodes
In CG theory, 'every concept is a unique individual of a particular type'. Concept nodes are labelled with descriptors or names like "dog", "cat", "gravity", etc. The labels refer to the class or type of individual represented by the node. Each concept node is used to refer to either an individual concept or a generic concept. In CG theory there is also a relation called name.

Concept Nodes
CG allows nodes to be labelled simultaneously with the name of the individual the node represents and its type, the two separated by a colon (":"). Consider the example "Tom, a cat, is brown":
[cat: "Tom"] --> (colour) --> [brown]

Concept Nodes: Unnamed Individuals
Consider the case where we do not know the name of the cat that is brown. Each concept node in a CG may be used to represent a specific but unnamed individual by a unique prescribed number:
[cat: #12345] --> (colour) --> [brown]

Concept Nodes: Multiple Names
We subsequently found out that the cat is called by different names: "Sylvester", "Sugar Pie" and "Squidgy Bod":
[cat: #12345] --> (name) --> ["Sylvester"]
[cat: #12345] --> (name) --> ["Sugar Pie"]
[cat: #12345] --> (name) --> ["Squidgy Bod"]

Concept Nodes: Unspecified Individuals
Generic markers can also be used to refer to an unspecified individual. The CG
[cat: *] --> (colour) --> [brown]
refers to an unspecified cat. Notationally, unspecified individuals are shown by an asterisk ("*"), but this is usually omitted ([cat] = [cat: *]).

Concept Nodes: Named Variables
Named variables can also be used to refer to an individual. These are represented by an asterisk followed by the variable name. This is useful to indicate nodes that refer to the same unspecified individual, as in "a dog scratching its ear with its paw":
[scratch] --> (agent) --> [dog: *X]
[scratch] --> (object) --> [ear] <-- (part) <-- [dog: *X]
[scratch] --> (instrument) --> [paw] <-- (part) <-- [dog: *X]
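
Putting the four kinds of referent together, here is a minimal Python sketch of a concept node that carries both a type label and a referent field (the class and field names are illustrative assumptions):

```python
# A concept node: type label plus referent ("name", #number, generic *, or *variable).
class Concept:
    def __init__(self, ctype, referent="*"):
        self.ctype = ctype          # type label, e.g. "cat"
        self.referent = referent    # '"Tom"', "#12345", "*" or "*X"
    def __repr__(self):
        if self.referent == "*":    # the generic marker is usually omitted
            return f"[{self.ctype}]"
        return f"[{self.ctype}: {self.referent}]"

print(Concept("cat", '"Tom"'))    # [cat: "Tom"]   named individual
print(Concept("cat", "#12345"))   # [cat: #12345]  specific but unnamed individual
print(Concept("cat"))             # [cat]          generic, unspecified individual
print(Concept("dog", "*X"))       # [dog: *X]      named variable, reusable within one graph
```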

Canonical Graphs
A conceptual graph is a combination of concept nodes and relation nodes in which every arc of every conceptual relation is linked to a concept. This can sometimes lead to sensible statements like "a bunny sitting on a mat", and at other times to nonsense like "colourless green ideas sleep furiously". Sowa distinguishes the nonsensical graphs from those that represent real or possible situations in the external world by declaring the latter canonical. Certain conceptual graphs are canonical; new graphs may become canonical, or be canonised, through perception, formation rules, or "insight".
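
One way to make this distinction operational is to screen each relation's arcs against the formal signatures introduced earlier, together with a type hierarchy. A Python sketch under that assumption (the hierarchy, the single signature entry and the function names are all illustrative):

```python
# Screen out non-canonical graphs by type-checking relation arcs.
supertype = {"cat": "animal", "animal": "animate", "animate": "entity",
             "idea": "entity", "sit": "event", "sleep": "event"}

signature = {"AGNT": ("event", "animate")}   # arc 1: the act, arc 2: the agent

def conforms(ctype, expected):
    """True if ctype equals expected or has it among its supertypes."""
    while ctype is not None:
        if ctype == expected:
            return True
        ctype = supertype.get(ctype)
    return False

def canonical(relations):
    """relations: list of (label, arcs); every arc must fit the relation's signature."""
    return all(conforms(arc, want)
               for label, arcs in relations
               for arc, want in zip(arcs, signature[label]))

print(canonical([("AGNT", ("sit", "cat"))]))     # True:  a cat can be the agent of sitting
print(canonical([("AGNT", ("sleep", "idea"))]))  # False: "ideas sleep" violates the AGNT signature
```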

Exercises
Please create the conceptual graph of the following sentence: "John is between a rock and a hard place".

Solution 1
"John is between a rock and a hard place":
(between) links [person: John], [rock] and [place]
[place] --> (attribute) --> [hard]
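
For comparison with the earlier sketches, the same solution written as (label, arcs) pairs; the arc ordering of (between) is one plausible reading, not prescribed by the slide:

```python
# The exercise solution in the illustrative (label, arcs) layout used above.
solution = [
    ("between",   ("person: John", "rock", "place")),   # ternary relation
    ("attribute", ("place", "hard")),                   # binary relation
]

for label, arcs in solution:
    print(f"({label}) links " + ", ".join(f"[{a}]" for a in arcs))
# (between) links [person: John], [rock], [place]
# (attribute) links [place], [hard]
```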

Closing
Questions??? Remarks??? Comments!!! Evaluation!