Lens effects in autonomous terminology and conceptual vector learning
Mathieu Lafourcade, LIRMM, France
Overview & Objectives
- lexical semantic representations: the conceptual vector model (CVM)
- autonomous learning by the system from a given « semantic space » (ontology)
- effects of switching ontologies (general vs. specialized)
  - global effects on the lexicon
  - local effects on particular words
- ambiguity as noise: towards self-contained WSD annotations
  « I made a deposit at the bank » vs. « I made a deposit at the bank » (same sentence, two senses)
Conceptual vectors: vector space
- an idea = a combination of concepts = a vector
- idea space = vector space
- a concept = an idea = a vector V, with augmentation: V + neighborhood
- meaning space = vector space + {v}*
2D view of « meaning space »
(figure: the terms “cat” and “product” plotted in the space)
Conceptual vectors: thesaurus
- H: thesaurus hierarchy with K concepts (Thesaurus Larousse: 873 concepts)
- V(C_i): a_j = 1 / 2^(D(H, i, j)), where D(H, i, j) is the distance between concepts i and j in H
(figure: example weights 1/4, 1/16, 1/64 down the hierarchy)
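The weighting scheme above can be sketched in a few lines; `H_dist` is a hypothetical stand-in for the hierarchy distance D(H, i, j), not the actual Larousse hierarchy:

```python
def concept_vector(H_dist, i, K):
    """Vector of concept C_i over K concepts: a_j = 1 / 2**D(H, i, j).
    H_dist(i, j) is the distance between concepts i and j in the thesaurus
    hierarchy H; since H_dist(i, i) = 0, the component a_i equals 1."""
    return [1.0 / 2 ** H_dist(i, j) for j in range(K)]

# Toy distance for illustration only (NOT the Larousse hierarchy):
toy_dist = lambda i, j: abs(i - j)
print(concept_vector(toy_dist, 1, 3))  # [0.5, 1.0, 0.5]
```

The component for the concept itself is 1, and components decay geometrically with hierarchy distance, matching the 1/4, 1/16, 1/64 weights of the figure.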
Conceptual vectors: concept
- concept c4:peace, with its hierarchical relations (The world, mankind, society) and its relations to conflict
Conceptual vectors: term
- the term “peace” and its associated concept c4:peace
(figure labels: finance, profit, exchange)
Angular distance
D_A(x, y) = angle(x, y), with 0 ≤ D_A(x, y) ≤ π
- if 0: x and y are collinear, the same idea
- if π/2: nothing in common
- if π: D_A(x, -x) = π, with -x the anti-idea of x
Angular distance: properties
- D_A(x, y) = acos(sim(x, y)) = acos(x·y / (|x| |y|))
- D_A(x, x) = 0
- D_A(x, y) = D_A(y, x)
- D_A(x, y) + D_A(y, z) ≥ D_A(x, z)
- D_A(0, 0) = 0 and D_A(x, 0) = π/2 by definition
- D_A(λx, y) = D_A(x, y) for λ > 0
- D_A(λx, y) = π - D_A(x, y) for λ < 0
- D_A(x + x, x + y) = D_A(x, x + y) ≤ D_A(x, y)
Thematic distance: examples
- D_A(tit, tit) = 0
- D_A(tit, passerine) = 0.4
- D_A(tit, bird) = 0.7
- D_A(tit, train) = 1.14
- D_A(tit, insect) = 0.62
(tit = insectivorous passerine bird …)
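The distance and its special cases for the null vector can be sketched directly from the definitions above (a minimal illustration, not the original implementation):

```python
import math

def angular_distance(x, y):
    """Angular distance D_A(x, y) = acos(x.y / (|x| |y|)), in [0, pi].
    By definition, D_A(0, 0) = 0 and D_A(x, 0) = pi/2."""
    nx = math.sqrt(sum(c * c for c in x))
    ny = math.sqrt(sum(c * c for c in y))
    if nx == 0 and ny == 0:
        return 0.0                      # D_A(0, 0) = 0 by definition
    if nx == 0 or ny == 0:
        return math.pi / 2              # D_A(x, 0) = pi/2 by definition
    cos = sum(a * b for a, b in zip(x, y)) / (nx * ny)
    return math.acos(max(-1.0, min(1.0, cos)))  # clamp for float safety

x = [1.0, 0.0]
print(angular_distance(x, x))             # 0: same idea
print(angular_distance(x, [0.0, 1.0]))    # ~pi/2: nothing in common
print(angular_distance(x, [-1.0, 0.0]))   # ~pi: the anti-idea of x
```

The clamp on the cosine guards against floating-point round-off pushing the value just outside acos's domain.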
Some vector operations
- addition: Z = X ⊕ Y, with z_i = x_i + y_i; vector Z is normalized
- term-to-term multiplication: Z = X ⊗ Y, with z_i = (x_i · y_i)^(1/2); vector Z is not normalized
- weak contextualization: Z = X ⊕ (X ⊗ Y) = γ(X, Y); “Z is X augmented by its mutual information with Y”
2D view of weak contextualization
(figure: X, Y, X ⊗ Y, Y ⊕ (X ⊗ Y), X ⊕ (X ⊗ Y))
Autonomous learning 1/2
- set of known words K, set of unknown words U
- revise a word w of K, OR (try to) learn a word w of U
- from the web: ask for a definition D of w
  - specific sites (dictionaries, synonym lists, etc.): definition analysis
  - general sites (Google, etc.): corpus analysis
- for each word wd of D:
  - if wd is not in K, add wd to U AND add V0 to V*
  - otherwise, get the vector of wd AND add V(wd) to V*
- compute the new vector of w from def(D) and V*
- words for senses (vectors) learned in 3 years for French
- a « forever » looping process
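One iteration of this looping process might be sketched as below; `fetch_definition`, `tokenize`, and `combine` are hypothetical stand-ins for the web access, definition analysis, and vector-computation steps, and `V0` is the initial vector given to not-yet-learned words:

```python
import random

def learning_step(K, U, vectors, V0, fetch_definition, tokenize, combine):
    """One pass of the « forever » loop: revise a known word, or try to
    learn an unknown one, from a definition found on the web."""
    w = random.choice(sorted(U) or sorted(K))   # prefer unknown words here
    D = fetch_definition(w)                     # dictionaries, synonym lists...
    V_star = []                                 # the V* of the slides
    for wd in tokenize(D):
        if wd not in K and wd not in U:
            U.add(wd)                           # queue wd for later learning
            V_star.append(V0)                   # it only contributes V0 for now
        else:
            V_star.append(vectors.get(wd, V0))  # known word: use its vector
    vectors[w] = combine(D, V_star)             # new vector of w from D and V*
    U.discard(w)
    K.add(w)                                    # w is now (provisionally) known
```

Because unseen words of each definition are pushed onto U, the loop never runs dry: learning one word enqueues the next ones, which is what makes the process « forever ».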
Autonomous learning 2/2
(figure: analysis of the definition “insectivorous passerine bird …”, with morphosyntactic levels ADJ, N, GOV, phrase and text; the word vectors are combined by weighted sum and weak contextualization γ(X, Y))
Local expansion of vector space
- from G to GS: concepts c1 … cn
- a finer mesh locally over the space
(figure: “cat”, “product”)
Folding and unfolding
- G: general ontology, S: specialized ontology
- unfolding, G → GS: maps v_G to v_GS
- folding, GS → G: maps v_GS back to v_G
- a point of G without a link in S
(figure: components a, b, c and their sums a+b, b+c, a+b+c across the two ontologies)
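One plausible reading of these two maps can be sketched with an explicit index mapping; the `links` dictionary (general component → linked specialized components) is an assumption standing in for the actual links between G and GS:

```python
def unfold(v_g, links, n_spec):
    """Unfolding G -> GS: each general component i spreads its value to the
    specialized components it is linked to (a point of G without a link in
    S contributes nowhere)."""
    v_gs = [0.0] * n_spec
    for i, specs in links.items():
        for j in specs:
            v_gs[j] += v_g[i]
    return v_gs

def fold(v_gs, links, n_gen):
    """Folding GS -> G: each general component gets back the sum of its
    linked specialized components (the a+b, b+c, ... sums of the figure)."""
    return [sum(v_gs[j] for j in links.get(i, [])) for i in range(n_gen)]
```

Folding after unfolding does not in general return the same vector when linked components overlap, which is precisely the lens effect of switching between the coarse and fine representations.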
Local lexical density
- given a point P, count the number of points at distance d1, d2, …, dn from P
- with 0 ≤ d1 < d2 < … < dn ≤ π/2
Lexical distribution from local density
- high density: curve shifted to the left
- medium density: curve centered at the top
- low density: case left as an exercise
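The density count above is direct to sketch (a minimal illustration; the points and thresholds below are made up, and the angular distance follows the earlier definition):

```python
import math

def local_density(P, points, thresholds):
    """For each threshold d_k (0 <= d1 < ... < dn <= pi/2), count how many
    points lie within angular distance d_k of P."""
    def ang(x, y):
        nx = math.sqrt(sum(c * c for c in x))
        ny = math.sqrt(sum(c * c for c in y))
        if not nx or not ny:
            return math.pi / 2          # null vector: pi/2 by definition
        cos = sum(a * b for a, b in zip(x, y)) / (nx * ny)
        return math.acos(max(-1.0, min(1.0, cos)))
    dists = [ang(P, q) for q in points]
    return [sum(d <= t for d in dists) for t in thresholds]

counts = local_density([1.0, 0.0],
                       [[1.0, 0.0], [1.0, 1.0], [0.0, 1.0]],
                       [math.pi / 8, math.pi / 3, math.pi / 2])
print(counts)  # [1, 2, 3]
```

Plotting these cumulative counts against the thresholds gives the distribution curves of the slide: a high-density neighborhood piles its mass at small distances (curve shifted left), a sparse one does not.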
Macro level: local density variation (G vs. GS)
Micro level: distance variation (G vs. GS)
- small angle = high similarity; larger angle = less similarity
Last words
- switching of representation: coarse-grained to fine-grained gives better semantic discrimination … and vice versa, conservation of resources
- global and local test functions: for vector quality assessment and for deciding on the level of representation
- detectors, combined with lexical functions (antonymy, etc.): the basis for self-adjustment toward a vector space of constant density
- WSD as a reduction of noise (in context or out of context)
- unification of ontologies
- self-emergent structuring of terminology