Lens effects in autonomous terminology and conceptual vector learning


Lens effects in autonomous terminology and conceptual vector learning
Mathieu Lafourcade, LIRMM - France
http://www.lirmm.fr/~lafourcade

Overview & Objectives
- lexical semantic representations: the conceptual vector model (CVM)
- autonomous learning by the system from a given "semantic space" (ontology)
- effects of switching ontologies (general → specialized)
  - global effects on the lexicon
  - local effects on a particular word
- ambiguity as noise: towards self-contained WSD annotations
  - "I made a deposit at the bank" → "I made a deposit at the bank<g:money>"

Conceptual vectors: vector space
- an idea = a combination of concepts = a vector
- idea space = vector space
- a concept = an idea = a vector V, with augmentation: V + neighborhood
- meaning space = vector space + {v}*

2D view of the "meaning space"
[Figure: 2D projection of the space, with example terms "product" and "cat"]

Conceptual vectors: thesaurus
- H: thesaurus hierarchy with K concepts (Thesaurus Larousse: 873 concepts)
- V(Ci) = <a1, …, ai, …, a873>
- aj = 1 / 2^Dum(H, i, j)
[Figure: weights decay with distance in the hierarchy, e.g. 1, 1/4, 1/16, 1/64]
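A minimal sketch of the construction above. `concept_vector` follows the slide's formula aj = 1 / 2^Dum(H, i, j); the `toy_dum` distance function is an assumption for illustration (the real Dum is computed over the Larousse hierarchy), chosen so the weights reproduce the 1, 1/4, 1/16, 1/64 pattern shown on the slide.

```python
def concept_vector(i, hierarchy_distance, k=873):
    """V(C_i) = <a_1, ..., a_k> with a_j = 1 / 2**Dum(H, i, j):
    the weight of concept j decays exponentially with its distance
    to concept i in the thesaurus hierarchy H."""
    return [1.0 / (2 ** hierarchy_distance(i, j)) for j in range(k)]

# Hypothetical stand-in for Dum: distance grows by 2 per step,
# capped at 6, giving the 1, 1/4, 1/16, 1/64 weights of the slide.
toy_dum = lambda i, j: 2 * min(abs(i - j), 3)
v = concept_vector(0, toy_dum, k=5)
```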

Conceptual vectors: the concept c4:peace
[Figure: vector of c4:peace, with strong components on conflict relations, hierarchical relations, and "the world, mankind, society"]

Conceptual vectors: the term "peace"
[Figure: vector of the term "peace", dominated by the concept c4:peace]

[Figure: example vector with dominant components exchange, profit, finance]

Angular distance
DA(x, y) = angle(x, y), with 0 ≤ DA(x, y) ≤ π
- if 0, then x and y are collinear: same idea
- if π/2, then nothing in common
- if π, then DA(x, −x), where −x is the anti-idea of x
[Figure: vectors x, x′ and y and the angle between them]

Angular distance: properties
DA(x, y) = acos(sim(x, y)) = acos(x·y / (|x| |y|))
- DA(x, x) = 0
- DA(x, y) = DA(y, x)
- DA(x, y) + DA(y, z) ≥ DA(x, z)
- DA(0, 0) = 0 and DA(x, 0) = π/2 by definition
- DA(λx, y) = DA(x, y) for λ > 0
- DA(λx, y) = π − DA(x, y) for λ < 0
- DA(x+x, x+y) = DA(x, x+y) ≤ DA(x, y)
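The definition and the boundary conventions above can be sketched directly (a minimal implementation, assuming vectors are plain sequences of floats):

```python
import math

def angular_distance(x, y):
    """D_A(x, y) = acos(x.y / (|x| |y|)); by the slide's conventions,
    D_A(0, 0) = 0 and D_A(x, 0) = pi/2."""
    nx = math.sqrt(sum(a * a for a in x))
    ny = math.sqrt(sum(b * b for b in y))
    if nx == 0.0 and ny == 0.0:
        return 0.0
    if nx == 0.0 or ny == 0.0:
        return math.pi / 2
    cos = sum(a * b for a, b in zip(x, y)) / (nx * ny)
    # Clamp against floating-point drift outside [-1, 1].
    return math.acos(max(-1.0, min(1.0, cos)))
```

The listed properties (symmetry, scale invariance for λ > 0, DA(x, −x) = π) then hold up to floating-point precision.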

Thematic distance: examples
- DA(tit, tit) = 0
- DA(tit, passerine) = 0.4
- DA(tit, bird) = 0.7
- DA(tit, train) = 1.14
- DA(tit, insect) = 0.62
(tit = insectivorous passerine bird …)

Some vector operations
- Addition: Z = X ⊕ Y, with zi = xi + yi; the vector Z is normalized
- Term-to-term multiplication: Z = X ⊗ Y, with zi = (xi · yi)^1/2; the vector Z is not normalized
- Weak contextualization: Z = X ⊕ (X ⊗ Y) = γ(X, Y); "Z is X augmented by its mutual information with Y"
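The three operations can be sketched as follows (a minimal version on plain float lists; function names `v_add`, `v_mult`, `weak_context` are mine, standing in for ⊕, ⊗ and γ):

```python
import math

def normalize(v):
    n = math.sqrt(sum(a * a for a in v)) or 1.0
    return [a / n for a in v]

def v_add(x, y):
    """Addition Z = X (+) Y: z_i = x_i + y_i, with Z normalized."""
    return normalize([a + b for a, b in zip(x, y)])

def v_mult(x, y):
    """Term-to-term multiplication Z = X (x) Y: z_i = sqrt(x_i * y_i);
    Z is deliberately left unnormalized."""
    return [math.sqrt(a * b) for a, b in zip(x, y)]

def weak_context(x, y):
    """Weak contextualization gamma(X, Y) = X (+) (X (x) Y):
    X augmented by its mutual information with Y."""
    return v_add(x, v_mult(x, y))
```

Note that `v_mult` acts as a soft intersection: a component that is zero in either operand is zero in the result, so `weak_context` reinforces only the components X shares with Y.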

2D view of weak contextualization
[Figure: positions of X, Y, X ⊗ Y, X ⊕ (X ⊗ Y) and Y ⊕ (X ⊗ Y) in the plane]

Autonomous learning 1/2
- set of known words K, set of unknown words U
- revise a word w of K, OR (try to) learn a word w of U
- from the web, ask for a definition D of w:
  - specific sites (dictionaries, synonym lists, etc.) → definition analysis
  - general sites (Google, etc.) → corpus analysis
- for each word wd of D:
  - if wd is not in K, then add wd to U and add V0 to V*
  - otherwise, get the vector of wd and add V(wd) to V*
- compute the new vector of w from the definition D and V*
- for French: 98,870 words for 400,000 senses (vectors) learned in 3 years, in a "forever" looping process
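One pass of the loop above can be sketched as follows. This is a simplified stand-in under stated assumptions: `fetch_definition` abstracts the web lookup, and `combine` replaces the real weighted-sum analysis of the parsed definition (slide 2/2) with a plain average.

```python
def combine(vectors):
    # Stand-in for the definition analysis: average the component
    # vectors, then normalize (the real system uses weighted sums
    # over the syntactic analysis of the definition).
    dim = len(vectors[0])
    s = [sum(v[i] for v in vectors) / len(vectors) for i in range(dim)]
    n = sum(a * a for a in s) ** 0.5 or 1.0
    return [a / n for a in s]

def learn_word(w, fetch_definition, K, U, vectors, v0):
    """One pass of the learning loop: fetch a definition D for w,
    collect the vectors of D's known words, queue its unknown words,
    then compute w's vector from the collected evidence."""
    v_star = []
    for wd in fetch_definition(w):
        if wd not in K:
            U.add(wd)            # to be learned on a later pass
            v_star.append(v0)    # null vector V0 as a placeholder
        else:
            v_star.append(vectors[wd])
    vectors[w] = combine(v_star)
    K.add(w)
    U.discard(w)
```

Because every unknown word of a definition is pushed onto U, the process feeds itself: each learned word can enqueue new ones, which is what lets the loop run "forever" over the lexicon.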

Autonomous learning 2/2
[Figure: analysis pipeline for the definition "insectivorous passerine bird …": weighted sums over the syntactic structure (N, GOV, ADJ, …) and γ(X, Y) contextualization produce the vector V of the entry]

Local expansion of the vector space
[Figure: specialized concepts c1 … cn of GS refine the region of G around "product" and "cat": a finer mesh locally over the space]

Folding and unfolding
- G: general ontology (873 concepts); S: specialized ontology (2000 concepts)
- unfolding: G→GS(vG) = vGS; folding: GS→G(vGS) = vG
- [Figure: components … a, b, …, c … of vG map to components … a, a+b, b, b+c … of vGS; points of G without a link in S have no counterpart in vGS]
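A minimal sketch of the two mappings, assuming a hypothetical alignment table `link` giving, for each specialized concept of GS, the index of its general parent in G (or `None` for points without a link; the real alignment is richer, as the slide's a+b components suggest):

```python
def unfold(v_g, link):
    """G -> GS: each specialized concept inherits the weight of the
    general concept it is linked to; specialized concepts with no
    link into G start at zero."""
    return [v_g[g] if g is not None else 0.0 for g in link]

def fold(v_gs, link, dim_g):
    """GS -> G: weights of the specialized concepts are accumulated
    back onto their general parent concept."""
    v_g = [0.0] * dim_g
    for weight, g in zip(v_gs, link):
        if g is not None:
            v_g[g] += weight
    return v_g
```

With the real 873-concept G and 2000-concept GS, `link` would have 2000 entries; folding after unfolding recovers vG only up to the multiplicity of each parent, which is why the two spaces are not interchangeable.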

Local lexical density
- given a point P, count the number of points at distances d1, d2, …, dn from P, with 0 ≤ d1 < d2 < … < dn ≤ π/2

Lexical distribution from local density
- medium density: curve centered on the top
- high density: curve shifted to the left
- low density: case left as an exercise
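The density measure of the previous slide can be sketched as cumulative counts over the distance thresholds (a minimal version, reusing the angular distance of the earlier slides; `ang` and `local_density` are my names for illustration):

```python
import math

def ang(x, y):
    """Angular distance between two non-zero vectors."""
    cos = sum(a * b for a, b in zip(x, y)) / (
        math.sqrt(sum(a * a for a in x)) * math.sqrt(sum(b * b for b in y)))
    return math.acos(max(-1.0, min(1.0, cos)))

def local_density(p, points, thresholds):
    """For each threshold d_i (0 <= d_1 < ... < d_n <= pi/2), count
    how many other points fall within angular distance d_i of p."""
    ds = [ang(p, q) for q in points if q is not p]
    return [sum(1 for d in ds if d <= t) for t in thresholds]
```

Plotting these counts against the thresholds gives the distribution curves of this slide: the denser the neighborhood of P, the more mass accumulates at small angles and the more the curve shifts to the left.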

Macro level
[Figure: local density variation between G and GS]

Micro level: distance variation
- small angle = high similarity; larger angle = less similarity
[Figure: the same pair of vectors seen in G and in GS]

Last words
- switching of representation:
  - coarse-grained to fine-grained → better semantic discrimination
  - … and vice versa → conservation of resources
- global and local test functions:
  - for vector quality assessment
  - for deciding on the level of representation
  - when combined with lexical functions (antonymy, etc.), the basis for self-adjustment toward a vector space of constant density
- WSD as a reduction of noise (in context or out of context)
- unification of ontologies
- self-emergent structuring of terminology