1 COMP 4060 Natural Language Processing Semantics

2 Semantics
Semantics I
– General Introduction
– Types of Semantics
– From Syntax to Semantics
Semantics II
– Desiderata for Representation
– Logic-based Semantics

3 Semantics I

4 Semantics
Distinguish between
– surface structure (syntactic structure) and
– deep structure (semantic structure) of sentences.
Different forms of Semantic Representation
– logic formalisms
– ontology / semantic representation languages
– Case Frame Structures (Fillmore)
– Conceptual Dependency Theory (Schank)
– DL and similar KR languages
– Ontologies

5 Semantic Representations
Semantic Representation based on some kind of (formal) representation language:
– Semantic Networks
– Conceptual Dependency Graphs
– Case Frames
– Ontologies
– Description Logics and similar Knowledge Representation languages

6 Constructing a Semantic Representation
General:
– Start with the surface structure derived from the parser.
– Map the surface structure to a semantic structure, using phrases as sub-structures.
– Find concepts and representations for the central phrases (e.g. VP, NP, then PP).
– Assign phrases to appropriate roles around the central concepts (e.g. bind the PP into the VP representation).

7 Ontology (Interlingua) Approach
– Ontology: a language-independent classification of objects, events, relations.
– A Semantic Lexicon which connects lexical items to nodes (concepts) in the ontology.
– An Analyzer that constructs Interlingua representations and selects the appropriate one.

8 Semantic Lexicon
– Provides a syntactic context for the appearance of the lexical item.
– Provides a mapping from the lexical item to a node in the ontology (or more complex associations).
– Provides connections from the syntactic context to semantic roles and constraints on these roles.

9 Deriving Basic Semantic Dependency (a toy example)
Input: John makes tools
Syntactic Analysis:
  cat: verb    tense: present
  subject:  root: john   cat: noun-proper
  object:   root: tool   cat: noun   number: plural
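A minimal sketch (plain Python, with illustrative field names) of how such a syntactic analysis could be held in code before semantic mapping:

```python
# Hypothetical encoding of the slide's syntactic analysis for "John makes tools"
# as a nested dictionary; the field names mirror the feature structure above.
syntactic_analysis = {
    "root": "make",
    "cat": "verb",
    "tense": "present",
    "subject": {"root": "john", "cat": "noun-proper"},
    "object": {"root": "tool", "cat": "noun", "number": "plural"},
}

print(syntactic_analysis["subject"]["root"])  # john
```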

10 Lexicon Entries for John and tool
John-n1
  syn-struc:  root: john   cat: noun-proper
  sem-struc:  human   name: john   gender: male
tool-n1
  syn-struc:  root: tool   cat: n
  sem-struc:  tool

11 Meaning Representation - Example make
Relevant extract from the specification of the ontological concept used to describe the appropriate meaning of make:
manufacturing-activity
  ...
  agent: human
  theme: artifact
  ...

12 Relevant parts of the (appropriate senses of the) lexicon entries for John and tool
John-n1
  syn-struc:  root: john   cat: noun-proper
  sem-struc:  human   name: john   gender: male
tool-n1
  syn-struc:  root: tool   cat: n
  sem-struc:  tool

13 Semantic Dependency Component
The basic semantic dependency component of the TMR for "John makes tools":
manufacturing-activity-7
  agent: human-3
  theme: set-1
    element: tool
    cardinality: > 1
  ...
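A small illustrative sketch, assuming a toy semantic lexicon and role mapping (all names hypothetical), of how the lexicon entries and the ontological concept for make could be combined into a basic semantic dependency like the one above:

```python
# Toy sketch: map lexical items to ontological concepts via a hypothetical
# semantic lexicon, then fill the verb concept's roles from the syntactic
# subject and object. A fuller treatment would turn the plural object into
# a set with cardinality > 1, as in the TMR on this slide.

semantic_lexicon = {
    "john": {"concept": "human", "name": "john", "gender": "male"},
    "tool": {"concept": "tool"},
    "make": {"concept": "manufacturing-activity",
             "role_map": {"subject": "agent", "object": "theme"}},
}

def build_semantic_dependency(syn):
    """Build a simple concept/role structure from a syntactic analysis."""
    verb_entry = semantic_lexicon[syn["root"]]
    sem = {"concept": verb_entry["concept"]}
    for syn_role, sem_role in verb_entry["role_map"].items():
        filler_root = syn[syn_role]["root"]
        sem[sem_role] = semantic_lexicon[filler_root]["concept"]
    return sem

syn = {"root": "make",
       "subject": {"root": "john", "cat": "noun-proper"},
       "object": {"root": "tool", "cat": "noun", "number": "plural"}}

print(build_semantic_dependency(syn))
# {'concept': 'manufacturing-activity', 'agent': 'human', 'theme': 'tool'}
```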

14 try-v3
  syn-struc
    root: try    cat: v
    subj:   root: $var1   cat: n
    xcomp:  root: $var2   cat: v   form: OR(infinitive, gerund)
  sem-struc
    set-1:     element-type: refsem-1   cardinality: >= 1
    refsem-1:  sem: event   agent: ^$var1   effect: refsem-2
    modality:  modality-type: epiteuctic   modality-scope: refsem-2   modality-value: < 1
    refsem-2:  value: ^$var2   sem: event

15 Constructing an Interlingua Representation
For each syntactic analysis:
– Access all semantic mappings and contexts for each lexical item.
– Create all possible semantic representations.
– Test them for coherency of structure and content.

16 “Why is Iraq developing weapons of mass destruction?”

17 Word Sense Disambiguation
– Constraint checking: making sure the constraints imposed on the context are met.
– Graph traversal: is-a links are inexpensive; other links are more expensive.
– The "cheapest" structure is the most coherent.
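A minimal sketch (hypothetical link costs and candidate readings) of the "cheapest structure" idea: score each candidate interpretation by the total cost of the ontology links it needs, with is-a links cheap and other links more expensive:

```python
# Hypothetical costs: is-a links are cheap, other relations are expensive.
LINK_COST = {"is-a": 0.1, "other": 1.0}

def coherence_cost(links):
    """Total traversal cost of the ontology links a reading relies on."""
    return sum(LINK_COST[kind] for kind in links)

# Candidate readings of an ambiguous sentence, each listed with the kinds of
# ontology links needed to connect its concepts (illustrative only).
candidates = {
    "reading-1": ["is-a", "is-a"],
    "reading-2": ["is-a", "other"],
}

best = min(candidates, key=lambda name: coherence_cost(candidates[name]))
print(best, coherence_cost(candidates[best]))  # reading-1 0.2
```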

18 Semantics II

19 Desiderata for a Semantic Representation
– Verifiability: the semantic representation must be compatible with the knowledge (base) of the system.
– Canonical Form: assign the same representation to different surface expressions which have essentially the same meaning.
– Ambiguity and Vagueness: the representation should (in relation to a knowledge base, information system access, etc.) be unambiguous and precise.

20 Semantics - Connecting Words and Worlds
Diagram: NL Input and NL Output are mapped to a Semantic Representation, which connects to a Knowledge Representation of the World State (KB: T-Box, A-Box).

21 Representation of Meaning
Representation of meaning for natural language sentences:
– Semantic Representation Language = (in most cases) some kind of formal language + semantic primitives
– For example: First-Order Predicate Logic with a specific set of predicates and functions

22 Semantic Representations
Semantic Representation based on some form of (formal) Representation Language:
– Semantic Networks
– Conceptual Dependency Graphs
– Case Frames
– Ontologies
– DL and similar KR languages
– First-Order Predicate Logic

23 Example - NL Database Access
Imagine database access using natural language, i.e. questions to the DB posed in natural language.
Example: a DB of courses in the CS department. Pose questions like:
– Who is teaching Advanced AI in Fall 2008?
– Is John Anderson teaching this term?
– What is Jacky Baltes teaching this term?
– Who is teaching AI at the University of Winnipeg?
– Who is teaching an AI related course this term?
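A toy sketch of one way the first question shape could be answered once it has been mapped to a query; the course rows and instructor names below are invented purely for illustration:

```python
# Hypothetical course DB rows (invented for illustration only).
courses = [
    {"title": "Advanced AI", "term": "Fall 2008", "instructor": "A. Instructor"},
    {"title": "NLP",         "term": "Fall 2008", "instructor": "B. Instructor"},
]

def who_is_teaching(title, term):
    """Answer 'Who is teaching <title> in <term>?' by filtering the DB."""
    return [c["instructor"] for c in courses
            if c["title"] == title and c["term"] == term]

print(who_is_teaching("Advanced AI", "Fall 2008"))  # ['A. Instructor']
```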

24 Example
Story: My car was stolen two weeks ago. They found it last week.
Levels involved: direct representation of meaning – knowledge – inference.

25 Example
Primitives in the logic language FOPL:
– my car as an individual constant: my_car, car_1
– can make a statement owns(car_1, I) about ownership of the car, or with a speaker constant: owns(car_1, Speaker)
  owns is a 2-place predicate with one place for the object (the car) and one place for the owner; each place is filled with a variable or a constant: owns(car_1, Speaker)
– Someone owns car_1:  ∃x: owns(car_1, x)
– I own all cars:  ∀x: car(x) → owns(x, Speaker)

26 Example
Primitives in the logic language FOPL:
– stolen as a predicate applied to the car: stolen(car_1)
– as an event (stolen-event), specified with a variable for the event and a constant for the specific event:
  ∃e,x: event(e) ∧ stolen(e, x) ∧ x = car_1
  or  ∃e,x: event(e) ∧ stolen(e, car_1)
– can make additional specifications, e.g. tense, time, location:
  ∃e,x: event(e) ∧ stolen(e, car_1) ∧ past(e) ∧ time(e) = UT − 2 weeks / time(e, UT − 2 weeks) ∧ loc(e) = street#1
  (street#1 refers to an identified street; UT − 2 weeks is the utterance time minus 2 weeks, i.e. the event time lies before the utterance time)

27 Example
Primitives in the logic language FOPL: They found it last week.
  found(car_1, t) ∧ time(t) ∧ t = UT − 1 week

28 NL and Logic
Levels of Representation and Transformation
– direct representation of meaning: translation into a logic expression
– knowledge: stored information about relations etc., e.g. as rules; ontology, terminology, proper axioms
– inference: gain additional information and conclusions; combine the semantic representation and the knowledge

29 Example
Concrete world description:
  car(my_car)
  stolen(my_car, t1), owns(speaker, my_car)
  found(police, my_car, t2)
  t1 < t2
General world knowledge:
  stolen(x, t1) ∧ owns(y, x) ∧ found(police, x, t2) → has(y, x, t3)
  for some timepoints t1, t2, t3 with t1 < t2 < t3
What can you infer if you instantiate x with my_car?

30 Reichenbach's Approach to English Tenses
Fig. 14.4 from Jurafsky and Martin, p. 530: U = Time of Utterance, R = Reference Time, E = Event Time.

31 Example
Concrete world description and general world knowledge (as on slide 29):
  car(my_car)
  stolen(my_car, t1), owns(speaker, my_car)
  found(police, my_car, t2)
  t1 < t2
  stolen(x, t1) ∧ owns(y, x) ∧ found(police, x, t2) → has(y, x, t3)
  for some timepoints t1, t2, t3 with t1 < t2 < t3
Pattern matching with variable binding (unification), then inference:
  stolen(my_car, t1) ∧ owns(speaker, my_car) ∧ found(police, my_car, t2) → has(speaker, my_car, t3)
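A compact sketch of this pattern matching with variable binding, using a tuple representation for facts and rule premises (the representation and helper names are illustrative, not the course's implementation):

```python
# Facts from the concrete world description above.
facts = [
    ("stolen", "my_car", "t1"),
    ("owns", "speaker", "my_car"),
    ("found", "police", "my_car", "t2"),
]

# Rule: stolen(x, t1) ∧ owns(y, x) ∧ found(police, x, t2) → has(y, x, t3)
premises = [("stolen", "?x", "?t1"),
            ("owns", "?y", "?x"),
            ("found", "police", "?x", "?t2")]
conclusion = ("has", "?y", "?x", "t3")

def unify(pattern, fact, bindings):
    """Extend bindings so pattern matches fact, or return None on a clash."""
    if len(pattern) != len(fact):
        return None
    b = dict(bindings)
    for p, f in zip(pattern, fact):
        if p.startswith("?"):
            if b.get(p, f) != f:
                return None
            b[p] = f
        elif p != f:
            return None
    return b

def match_all(remaining, bindings):
    """Match every premise against some fact, accumulating variable bindings."""
    if not remaining:
        return bindings
    for fact in facts:
        b = unify(remaining[0], fact, bindings)
        if b is not None:
            result = match_all(remaining[1:], b)
            if result is not None:
                return result
    return None

b = match_all(premises, {})
print(tuple(b.get(term, term) for term in conclusion))
# ('has', 'speaker', 'my_car', 't3')
```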

32 Example
stolen(x, t1) ∧ owns(y, x) → ?
Express that, if something is stolen, the owner does not have it anymore!

33 Predicate-Argument Structure
Verb-centered approach: thematic roles, case roles
– Describe the semantic structure based on the verb and associated roles, filled by other parts of the sentence (phrases).
Representation using e.g. logic:
– Transform the structured input sentence (syntax!) into an expression in predicate logic.
– Usually based on a central predicate: the verb, or an equivalent like 'be' + adjective, etc.
– Other parts of the sentence directly related to the verb go into the central predicate as arguments.

34 Verb Subcategorization
Consider the possible subcat frames of verbs. Example: 3 different kinds of want:
1. NP want NP           I want money.             want1(Speaker, money)
2. NP want Inf-VP       He wants to go home.      want2(he, to_go_home)
3. NP want NP Inf-VP    I want him to go away.    want3(I, him, to_go_away)
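A small sketch (illustrative representation, not a standard API) of how these subcat frames could be stored and used to pick the right predicate:

```python
# The three 'want' subcategorization frames from this slide, mapped to
# distinct semantic predicates.
subcat_frames = {
    "want": [
        {"frame": ("NP", "want", "NP"),           "pred": "want1"},  # I want money.
        {"frame": ("NP", "want", "Inf-VP"),       "pred": "want2"},  # He wants to go home.
        {"frame": ("NP", "want", "NP", "Inf-VP"), "pred": "want3"},  # I want him to go away.
    ],
}

def build_predicate(verb, args, frame_shape):
    """Pick the predicate matching the frame shape and apply it to the args."""
    for entry in subcat_frames[verb]:
        if entry["frame"] == frame_shape:
            return f"{entry['pred']}({', '.join(args)})"
    raise ValueError("no matching subcat frame")

print(build_predicate("want", ["Speaker", "money"], ("NP", "want", "NP")))
# want1(Speaker, money)
```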

35 Example - Restaurant 'Maharani'
– Maharani serves vegetarian food.
– Maharani is a vegetarian restaurant.
– Maharani is close to ICSI.
Write down logical formulas representing the three different sentences.

36 Logic Formalisms Lambda Calculus

37 Semantics - Lambda Calculus 1
Logic representations often involve the Lambda Calculus:
– represent central phrases (e.g. the verb) as λ-expressions
– a λ-expression is like a function which can be applied to terms
– insert the semantic representation of complement or modifier phrases etc. in place of variables
Example:
  ∃x, y: loves(x, y)                          FOPL sentence
  λx λy loves(x, y)                           λ-expression
  λx λy loves(x, y) (John)  ⇒  λy loves(John, y)

38 Semantics - Lambda Calculus 2
Transform a sentence into a lambda-expression: "AI Caramba is close to ICSI."
  specific:  close-to(AI Caramba, ICSI)
  general:   ∃x, y: close-to(x, y) ∧ x = AI Caramba ∧ y = ICSI
Lambda Conversion – form the λ-expression:
  λx λy: close-to(x, y) (AI Caramba)
Lambda Reduction – apply the λ-expression:
  λy: close-to(AI Caramba, y)
  close-to(AI Caramba, ICSI)
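The conversion and reduction steps can be mimicked with Python closures; a minimal sketch (the close_to helper is an assumed stand-in that just builds the logical form as a string):

```python
def close_to(x, y):
    # Stand-in semantic primitive: returns the logical form as a string.
    return f"close-to({x}, {y})"

# λx λy: close-to(x, y)
lam = lambda x: (lambda y: close_to(x, y))

# Lambda reduction: apply to "AI Caramba", then to "ICSI".
step1 = lam("AI Caramba")   # corresponds to  λy: close-to(AI Caramba, y)
print(step1("ICSI"))        # close-to(AI Caramba, ICSI)
```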

39 Semantics - Lambda Calculus 3
– Lambda expressions as the basis for semantic representations
– attached to words and syntactic categories in grammar rules
– passed between nodes during parsing, according to the grammar
Example: semantics of the verb 'serve'
  Verb → serve  { λx λy ∃e IS-A(e, Serving) ∧ Server(e, y) ∧ Served(e, x) }
Reification denotes the use of predicates as constants. It allows the use of "predicates over predicates", e.g.
  IS-A(serving, event) or IS-A(restaurant, location).
  e: serving – event, action, verb   (reification)

40 Semantics - Lambda Calculus 4
Lambda expressions are constructed from the central expression by inserting the semantic representations of the subject and complement phrases:
  Verb → serve  { λx λy ∃e IS-A(e, Serving) ∧ Server(e, y) ∧ Served(e, x) }
Fill in appropriate expressions for x and y, e.g. steak for x, derived from the direct (object) NP of the sentence, and Ay Caramba for y, derived from the subject NP of the verb.
  y: restaurant – NP, subj.
  x: food – NP, dir. obj.
  e: serving – S, event

41 Semantics - Lambda Calculus 5
The complete semantic representation is produced by combining the semantic feature structures of the phrases in the sentence, according to extended grammar rules.
  Verb → serves  { λx λy ∃e IS-A(e, Serving) ∧ Server(e, y) ∧ Served(e, x) }
Apply the lambda-expression representing the verb semantics to the semantic representations of the NPs: successively apply the λ-expression (for serves in the example above), filling the x-position with the semantics of the object-NP, and the y-position with the representation of the subject-NP.

42 Semantics - Lambda Calculus 6
Extend the grammar with semantic attachments, e.g.
  NP → ProperNoun  { ProperNoun.sem }
The "base" semantic attachment is determined through access to a lexicon or an ontology. It corresponds to the concept associated with the lexical word, or in the simplest form just the lexical word itself.
Example: Ay Caramba as an individual constant, or meat as a (reified) concept.

43 Semantics - Lambda Calculus 7
Constructive Semantics: during parsing, these semantic attachments are combined, according to the grammar rules, to form more complete representations, finally covering the whole sentence.
Combine and pass semantic attachments upwards, e.g.
  S  → NP VP    { VP.sem(NP.sem) }
  VP → Verb NP  { Verb.sem(NP.sem) }
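A sketch of these two rules in Python, with the verb semantics as a closure; the string logical form and the helper name serve_sem are assumptions for illustration, following the "AyCaramba serves meat" example on the next slide:

```python
def serve_sem(x):
    # Verb -> serves  { λx λy ∃e IS-A(e, Serving) ∧ Server(e, y) ∧ Served(e, x) }
    return lambda y: f"exists e. IS-A(e, Serving) & Server(e, {y}) & Served(e, {x})"

np_object_sem = "meat"        # NP.sem of the object NP
np_subject_sem = "AyCaramba"  # NP.sem of the subject NP

vp_sem = serve_sem(np_object_sem)   # VP -> Verb NP  { Verb.sem(NP.sem) }
s_sem = vp_sem(np_subject_sem)      # S  -> NP VP    { VP.sem(NP.sem) }
print(s_sem)
# exists e. IS-A(e, Serving) & Server(e, AyCaramba) & Served(e, meat)
```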

44 Semantic Representation in BeRP
Figure 15.3 from Jurafsky and Martin, p. 554: parse tree with semantic attachments for the sentence "AyCaramba serves meat."

45 Semantics - Lambda Calculus 8
Modifiers can be added into the semantic description as part of the grammar rules, by intersection of concepts:
  Nominal → Adj Nominal  { λx. Nominal.sem(x) ∧ IS-A(x, Adj.sem) }
Example: a "cheap restaurant"
  λx. IS-A(x, restaurant) ∧ IS-A(x, cheap)
This is a problem if the intersection of concepts is misleading, e.g. "former friend". Use a "modification" rule instead:
  Nominal → Adj Nominal  { λx. Nominal.sem(x) ∧ AdjMod(x, Adj.sem) }
Using this rule for "cheap restaurant":
  λx. IS-A(x, restaurant) ∧ AdjMod(x, cheap)
where "cheap" modifies the "restaurant" in a specific way.
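A brief sketch of the two modifier rules as closures over the nominal semantics (the function names and string forms are illustrative):

```python
def intersective(nominal_sem, adj_sem):
    # Nominal -> Adj Nominal  { λx. Nominal.sem(x) ∧ IS-A(x, Adj.sem) }
    return lambda x: f"{nominal_sem(x)} & IS-A({x}, {adj_sem})"

def adj_mod(nominal_sem, adj_sem):
    # Nominal -> Adj Nominal  { λx. Nominal.sem(x) ∧ AdjMod(x, Adj.sem) }
    return lambda x: f"{nominal_sem(x)} & AdjMod({x}, {adj_sem})"

restaurant = lambda x: f"IS-A({x}, restaurant)"
print(intersective(restaurant, "cheap")("x"))  # IS-A(x, restaurant) & IS-A(x, cheap)
print(adj_mod(restaurant, "cheap")("x"))       # IS-A(x, restaurant) & AdjMod(x, cheap)
```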

46 Semantics - Problems
Problems with modal verbs:
– apply to a predicate structure (another verb)
– referential opaqueness
– no standard implications
Example: I think Joe's flight leaves at 7pm.
  think(Speaker, leaves(Joe's flight, 7pm))
Add: Joe's flight is BA727. BA727 is delayed. What does the Speaker think now?
Add: I think I go home. Should I stay or should I go?
Problem: cannot apply a predicate to a formula in FOPL.

47 Parsing with Semantic Features
Modified Earley Algorithm. Figure 15.5 from Jurafsky and Martin, p. 570.

48 References
Jurafsky, D. & J. H. Martin, Speech and Language Processing, Prentice-Hall, 2000. (Chapters 9 and 10)
Helmreich, S., From Syntax to Semantics, Presentation in the 74.419 Course, November 2003.

