The kinship domain example from owl.cs.manchester.ac.uk


The kinship domain example from http://owl.cs.manchester.ac.uk/publications/talks-and-tutorials/fhkbtutorial/ extended with rules

Describing properties

Create the A-box: add individuals and their respective mother and father
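
A rough sketch of such A-box assertions in Turtle (the base IRI is hypothetical; the individual names are the ones used on the following slides):

  @prefix :    <http://example.org/kinship#> .
  @prefix owl: <http://www.w3.org/2002/07/owl#> .

  # the father assertion below is the one discussed on the next slides;
  # a :hasMother assertion for the same individual would be added in the same way
  :Robert_David_Bright_1965 a owl:NamedIndividual ;
      :hasFather :David_Bright_1934 .
  :David_Bright_1934 a owl:NamedIndividual .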

Run the reasoner (Pellet): see all derived relations for the selected individual

Execution of the queries “Whose father is David_Bright_1934?” and “Who is the father of Robert_David_Bright_1965?” Note the entailed fact that D_B_1934 isFatherOf R_D_B_1965 – we have defined only the hasFather relation but stated that isFatherOf is its inverse.
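
The axiom behind this entailment is just an inverse-property declaration; a Turtle sketch (hypothetical prefix):

  @prefix :    <http://example.org/kinship#> .
  @prefix owl: <http://www.w3.org/2002/07/owl#> .

  :isFatherOf owl:inverseOf :hasFather .
  # given :Robert_David_Bright_1965 :hasFather :David_Bright_1934 ,
  # the reasoner entails :David_Bright_1934 :isFatherOf :Robert_David_Bright_1965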

Other inferences for R_D_B_1965 reflecting the property hierarchy – a sub-property implies its super-property
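
For example, assuming a hasParent super-property as in the original FHKB tutorial (an assumption, since the slide does not name it), the axioms look like this:

  @prefix :     <http://example.org/kinship#> .
  @prefix rdfs: <http://www.w3.org/2000/01/rdf-schema#> .

  # every hasFather/hasMother assertion is then also a hasParent assertion
  :hasFather rdfs:subPropertyOf :hasParent .
  :hasMother rdfs:subPropertyOf :hasParent .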

Introducing ancestors and descendants: add new properties (hasRelation – symmetric; hasAncestor – transitive; isAncestorOf – inverse of hasAncestor)
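
A Turtle sketch of these three declarations:

  @prefix :    <http://example.org/kinship#> .
  @prefix owl: <http://www.w3.org/2002/07/owl#> .

  :hasRelation  a owl:ObjectProperty , owl:SymmetricProperty .
  :hasAncestor  a owl:ObjectProperty , owl:TransitiveProperty .
  :isAncestorOf a owl:ObjectProperty ;
      owl:inverseOf :hasAncestor .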

Inferred property hierarchy

Execution of the queries “Who are the descendants of W_G_B_1901?” and “Who are the ancestors of Robert_David_Bright_1965?”

Who are the children and the father of D_B_1934?

Adding grandparent properties (hasGrandparent, hasGrandmother, hasGrandfather and their inverses): inserted and inferred hierarchies

Property chains for grandparents, great-grandparents, etc. reflect limited transitivity
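
In OWL 2 this “limited transitivity” is expressed with property chain axioms; a sketch (again assuming a hasParent property; the great-grandparent property name is illustrative):

  @prefix :    <http://example.org/kinship#> .
  @prefix owl: <http://www.w3.org/2002/07/owl#> .

  # a chain composes a fixed number of steps, unlike a fully transitive property
  :hasGrandparent      owl:propertyChainAxiom ( :hasParent :hasParent ) .
  :hasGreatGrandparent owl:propertyChainAxiom ( :hasParent :hasGrandparent ) .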

All inferred relations are also explicitly associated with a selected individual (only part of these relations shown here)

Who are the grandparents/grandfathers/grandmothers of R_D_B_1965?

Creating classes to represent groups of objects in the kinship domain (Person, Sex, Man, Woman). This allows us to describe the objects in terms of their type. We can put them all under the class DomainEntity to “enclose” the domain. Here is the description of the Person class. Note that restrictions under Subclass Of state necessary conditions – things we know about every member of the class – while Equivalent To states necessary and sufficient (iff) conditions.

hasSex property connects an object of type Person to an object of type Sex
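
One possible Turtle encoding of this pattern is sketched below; treating Male and Female as individuals of the class Sex is an assumption, not something stated on the slides:

  @prefix :     <http://example.org/kinship#> .
  @prefix owl:  <http://www.w3.org/2002/07/owl#> .
  @prefix rdfs: <http://www.w3.org/2000/01/rdf-schema#> .

  :Person rdfs:subClassOf :DomainEntity ;                 # necessary condition
          rdfs:subClassOf [ a owl:Restriction ;
                            owl:onProperty :hasSex ;
                            owl:someValuesFrom :Sex ] .

  :Man owl:equivalentClass [ a owl:Class ;                # necessary and sufficient (iff)
          owl:intersectionOf ( :Person
              [ a owl:Restriction ; owl:onProperty :hasSex ; owl:hasValue :Male ] ) ] .

  :hasSex rdfs:domain :Person ;
          rdfs:range  :Sex .
  :Male   a :Sex .                                        # assumed modelling of Sex
  :Female a :Sex .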

Inferred hierarchies

Who is a Person, a Man, a Woman? How do the domains and ranges of hasMother/hasFather drive these inferences? (31 Person; 10 Woman; 10 Man?)
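
The inferences come directly from domain and range axioms like these (a sketch):

  @prefix :     <http://example.org/kinship#> .
  @prefix rdfs: <http://www.w3.org/2000/01/rdf-schema#> .

  :hasMother rdfs:domain :Person ; rdfs:range :Woman .
  :hasFather rdfs:domain :Person ; rdfs:range :Man .
  # from ":x :hasMother :y" the reasoner classifies :x as a Person and :y as a Woman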

Who has a mother? Whose mother is M_G_R_1934?

Inconsistencies: a stated or derived fact cannot be added to the A-box because it contradicts facts that are already there. Example: declare that R_D_B_1965 hasMother David_Bright_1934.
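
A sketch of the offending assertion; it clashes because the range of hasMother is Woman, David_Bright_1934 is already classified as a Man, and Man and Woman are disjoint (assuming the disjointness axiom is present in the ontology):

  @prefix : <http://example.org/kinship#> .

  # forces :David_Bright_1934 into the class Woman via the range of :hasMother,
  # contradicting his membership in the (disjoint) class Man
  :Robert_David_Bright_1965 :hasMother :David_Bright_1934 .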

Adding classes for Ancestors/Descendants (before/after reasoner). Note that the reasoner inferred that Person and Descendant classes are equivalent (have the same members) – same for subclasses.
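
A sketch of such a defined class (the exact definition used in the slides is not shown, so this is an assumption):

  @prefix :    <http://example.org/kinship#> .
  @prefix owl: <http://www.w3.org/2002/07/owl#> .

  :Descendant owl:equivalentClass [ a owl:Class ;
      owl:intersectionOf ( :Person
          [ a owl:Restriction ;
            owl:onProperty :hasAncestor ;
            owl:someValuesFrom :Person ] ) ] .
  # because every Person in the ontology is known to have parents (hence ancestors),
  # the reasoner concludes that Descendant and Person have exactly the same members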

Defining domain and range of the properties makes a difference (compare with the next slide)

Defining domain and range of the properties makes a difference. Answers to the query about descendants will be different.

Introducing sisterhood/brotherhood: the rules approach
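
Rules in this style can be written in the human-readable SWRL notation used later in these slides; a sketch (the presenter's actual rules may differ, and hasSister is assumed by analogy with hasBrother):

  Man(?y) ^ hasSibling(?x, ?y) -> hasBrother(?x, ?y)
  Woman(?y) ^ hasSibling(?x, ?y) -> hasSister(?x, ?y)

With such rules the reasoner can only derive hasBrother/hasSister for individuals whose sex is actually known, which is exactly the gap discussed on the next slides.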

The missing part: sex is not implied for all individuals. J_B_1964 has siblings, but no brother? The reason: Woman/Man can be inferred only for some individuals. Defining the sex property for individuals seems the easiest way to proceed.
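
Under the encoding sketched earlier (Male/Female as individuals of Sex, which is an assumption), asserting the sex is a single statement per individual, for example:

  @prefix : <http://example.org/kinship#> .

  :David_Bright_1934 :hasSex :Male .
  # together with the class definitions, this lets the reasoner classify him as a Man,
  # so sibling rules that test for Man/Woman can now fire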

Man/Woman identified so far without sex property explicitly defined for all individuals (Person shown for comparison)

What is known for individuals so far?

Brotherhood/Sisterhood complete with sex defined for all individuals

Another query

How to define half-siblings: the role of the OWA. Here are the rules for hasHalfSibling:

hasFather(?x, ?y) ^ hasFather(?v, ?w) ^ hasMother(?x, ?z) ^ hasMother(?v, ?z) -> hasHalfSibling(?x, ?v)
hasFather(?x, ?y) ^ hasFather(?v, ?y) ^ hasMother(?x, ?z) ^ hasMother(?v, ?w) -> hasHalfSibling(?x, ?v)

These rules will NOT work correctly: although we have stated that all individuals are different, we still need to say in the rules that ?y ≠ ?w (in the first rule) and ?z ≠ ?w (in the second rule). The reason is the Open World Assumption (OWA), which states that anything we do not explicitly state is considered UNKNOWN. Nothing therefore prevents ?y and ?w (or ?z and ?w) from binding to the same individual, which makes these rules behave the same way as the rule for hasSibling.

Why the OWA? The guiding principle on the web is that “Anyone can say Anything about Any topic” – the so-called AAA slogan. The OWA is a consequence of it. In our case, we are not explicitly stating that two variables are different, therefore they can be the same.
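
Assuming the reasoner's SWRL support includes the differentFrom atom (it is part of the SWRL specification), one way to repair the rules is to add the missing inequality explicitly; this is a sketch:

  hasFather(?x, ?y) ^ hasFather(?v, ?w) ^ differentFrom(?y, ?w) ^ hasMother(?x, ?z) ^ hasMother(?v, ?z) -> hasHalfSibling(?x, ?v)
  hasFather(?x, ?y) ^ hasFather(?v, ?y) ^ hasMother(?x, ?z) ^ hasMother(?v, ?w) ^ differentFrom(?z, ?w) -> hasHalfSibling(?x, ?v)

Because we have already declared all individuals mutually different, the differentFrom atoms hold whenever the two parents really are distinct, so full siblings are no longer reported as half-siblings.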

More rules added

More interesting query: Who are the wives of WB1970? Notice that hasWife is derived by rule 15, and then isWifeOf is derived by the reasoner as an inverse property of hasWife.

Back to the OWL version of the family ontology. Notice that sisterhood/brotherhood is derived by the corresponding SWRL rules. We cannot do this in Protégé by means of OWL 2 in a purely DL fashion, because we cannot say that hasSibling(x, y) and Man(y) imply hasBrother(x, y). To handle this, we will explicitly define the sibling relationships by stating, for example, that “x isBrotherOf y” and “z isSisterOf y”. But if a relationship is not defined both ways, we will get a wrong result:

How much is inferred at this point by the reasoner for a named individual? Here is a small sample of inferred facts: