
Natural Language Processing Lecture 2: Semantics.


1 Natural Language Processing Lecture 2: Semantics

2 Last Lecture
- Motivation
- Paradigms for studying language
- Levels of NL analysis
- Syntax
  - Parsing: top-down, bottom-up, chart parsing

3 Today’s Lecture
- DCGs and parsing in Prolog
- Semantics
  - Logical representation schemes
  - Procedural representation schemes
  - Network representation schemes
  - Structured representation schemes

4 Parsing in PROLOG
- How do you represent a grammar in PROLOG?

5 Writing a CFG in PROLOG
- Consider the rule S -> NP VP
- We can reformulate this as an axiom:
  - A sequence of words is a legal S if it begins with a legal NP that is followed by a legal VP
- What about s(P1, P3) :- np(P1, P2), vp(P2, P3)?
  - There is an S between positions P1 and P3 if there is a position P2 such that there is an NP between P1 and P2 and a VP between P2 and P3

6 Inputs
- "John ate the cat" can be described as
  - word(john, 1, 2)
  - word(ate, 2, 3)
  - word(the, 3, 4)
  - word(cat, 4, 5)
- Or (better) use a list representation:
  - [john, ate, the, cat]

7 Lexicon
- First representation
  - isname(john), isverb(ate)
  - v(P1, P2) :- word(Word, P1, P2), isverb(Word)
- List representation
  - name([john|T], T).
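The two lexicon styles can be contrasted with a minimal runnable sketch; the word/3 facts follow the positional encoding of slide 6, and the list-based entry consumes one word from the front of a difference list:

```prolog
% Positional representation: word/3 facts plus category tests.
word(john, 1, 2).
word(ate, 2, 3).
isverb(ate).
v(P1, P2) :- word(W, P1, P2), isverb(W).

% List representation: the entry succeeds if the input list
% begins with the word, returning the remainder.
name([john|T], T).

% ?- v(2, 3).                         succeeds
% ?- name([john, ate, the, cat], R).  binds R = [ate, the, cat]
```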

8 A simple PROLOG grammar
  s(P1, P3)  :- np(P1, P2), vp(P2, P3).
  np(P1, P3) :- art(P1, P2), n(P2, P3).
  np(P1, P3) :- name(P1, P3).
  pp(P1, P3) :- p(P1, P2), np(P2, P3).
  vp(P1, P2) :- v(P1, P2).
  vp(P1, P3) :- v(P1, P2), np(P2, P3).
  vp(P1, P3) :- v(P1, P2), pp(P2, P3).
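Combining this grammar with the list-based lexicon of slide 7 gives a program that runs as-is; a minimal sketch (the PP rules are omitted because the tiny lexicon has no preposition):

```prolog
s(P1, P3)  :- np(P1, P2), vp(P2, P3).
np(P1, P3) :- art(P1, P2), n(P2, P3).
np(P1, P3) :- name(P1, P3).
vp(P1, P2) :- v(P1, P2).
vp(P1, P3) :- v(P1, P2), np(P2, P3).

name([john|T], T).
v([ate|T], T).
art([the|T], T).
n([cat|T], T).

% ?- s([john, ate, the, cat], []).
% true
```

The empty list as the second argument says the S must consume the whole input.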

9 Definite clause grammars
- PROLOG provides an operator (-->) that supports DCGs
- Rules look like CFG notation
- PROLOG automatically translates these into ordinary clauses

10 DCGs and Prolog
DCG notation:
  s --> np, vp.
  np --> art, n.
  np --> name.
  pp --> p, np.
  vp --> v.
  vp --> v, np.
  vp --> v, pp.
Lexicon:
  name --> [john].
  v --> [ate].
  art --> [the].
  n --> [cat].
Translated grammar:
  s(P1, P3) :- np(P1, P2), vp(P2, P3).
  np(P1, P3) :- art(P1, P2), n(P2, P3).
  np(P1, P3) :- name(P1, P3).
  pp(P1, P3) :- p(P1, P2), np(P2, P3).
  vp(P1, P2) :- v(P1, P2).
  vp(P1, P3) :- v(P1, P2), np(P2, P3).
  vp(P1, P3) :- v(P1, P2), pp(P2, P3).
Translated lexicon:
  name([john|P], P).
  v([ate|P], P).
  art([the|P], P).
  n([cat|P], P).
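A sketch of running the DCG version (using the standard phrase/2 interface, which hides the difference-list arguments):

```prolog
s --> np, vp.
np --> art, n.
np --> name.
vp --> v.
vp --> v, np.

name --> [john].
v --> [ate].
art --> [the].
n --> [cat].

% ?- phrase(s, [john, ate, the, cat]).
% true
```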

11 Building a tree with DCGs
- We can add extra arguments to DCG rules to represent a tree:
  - s --> np, vp.
  becomes
  - s(s(NP, VP)) --> np(NP), vp(VP).
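Extending every rule and lexical entry this way yields a parser that returns the tree; a minimal sketch for the running example:

```prolog
s(s(NP, VP)) --> np(NP), vp(VP).
np(np(ART, N)) --> art(ART), n(N).
np(np(NAME)) --> name(NAME).
vp(vp(V)) --> v(V).
vp(vp(V, NP)) --> v(V), np(NP).

name(name(john)) --> [john].
v(v(ate)) --> [ate].
art(art(the)) --> [the].
n(n(cat)) --> [cat].

% ?- phrase(s(Tree), [john, ate, the, cat]).
% Tree = s(np(name(john)), vp(v(ate), np(art(the), n(cat))))
```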

12 An ambiguous DCG
  s(s(NP, VP)) --> np(NP), vp(VP).
  np(np(ART, N)) --> art(ART), n(N).
  np(np(NAME)) --> name(NAME).
  pp(pp(P, NP)) --> p(P), np(NP).
  vp(vp(V)) --> v(V).
  vp(vp(V, NP)) --> v(V), np(NP).
  vp(vp(V, PP)) --> v(V), pp(PP).
  vp(vp(V, NP, PP)) --> v(V), np(NP), pp(PP).
  np(np(ART, N, PP)) --> art(ART), n(N), pp(PP).
  % Lexicon
  art(art(the)) --> [the].
  n(n(man)) --> [man].
  n(n(boy)) --> [boy].
  n(n(telescope)) --> [telescope].
  v(v(saw)) --> [saw].
  p(p(with)) --> [with].
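With this DCG consulted, backtracking should surface both attachments of the prepositional phrase (VP attachment via the vp(V, NP, PP) rule, and NP attachment via the np(ART, N, PP) rule); a query sketch:

```prolog
% ?- findall(T, phrase(s(T),
%        [the, man, saw, the, boy, with, the, telescope]), Trees),
%    length(Trees, N).
% N = 2
```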

13 Semantics
- What does it mean?

14 Semantic ambiguity
- A sentence may have a single syntactic structure but multiple semantic structures
  - Every boy loves a dog
- Vagueness: some senses are more specific than others
  - "Person" is more vague than "woman"
  - Quantifiers: Many people saw the accident

15 Logical forms
- The most common is first-order predicate calculus (FOPC)
- PROLOG is an ideal implementation language
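FOPC formulas can be held as ordinary Prolog terms; a hedged sketch (the functor names all/2, exists/2, implies/2, and/2 are illustrative, not from the lecture) of the two scopings of slide 14's "Every boy loves a dog":

```prolog
% Wide-scope universal: every boy loves some (possibly different) dog.
reading1(all(B, implies(boy(B),
             exists(D, and(dog(D), loves(B, D)))))).

% Wide-scope existential: one particular dog is loved by every boy.
reading2(exists(D, and(dog(D),
             all(B, implies(boy(B), loves(B, D)))))).
```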

16 Thematic roles
- Consider the following sentences:
  - John broke the window with the hammer
  - The hammer broke the window
  - The window broke
- The syntactic structure differs, but John, the hammer, and the window have the same semantic roles in each sentence

17 Themes/Cases
- We can define a notion of theme or case
  - John broke the window with the hammer
  - The hammer broke the window
  - The window broke
- John is the AGENT
- The window is the THEME (syntactic OBJECT -- what was Xed)
- The hammer is the INSTR(ument)
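One way to make the shared roles explicit is an event-based encoding in Prolog; a hedged sketch (the event constants e1-e3 and the role predicate names are illustrative):

```prolog
% "John broke the window with the hammer"
break(e1).  agent(e1, john).  theme(e1, window).  instr(e1, hammer).

% "The hammer broke the window"
break(e2).  theme(e2, window).  instr(e2, hammer).

% "The window broke"
break(e3).  theme(e3, window).

% The window is the THEME in all three, regardless of where it
% appears syntactically:
% ?- break(E), theme(E, window).
% succeeds for E = e1, e2, and e3.
```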

18 Case Frames
- Sarah fixed the chair with glue:
    fix
      AGENT: Sarah
      TIME:  past
      THEME: chair
      INSTR: glue
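A hedged sketch of this case frame as a single Prolog term (the predicate and slot names are illustrative):

```prolog
case_frame(fix, [agent(sarah), time(past), theme(chair), instr(glue)]).

% Roles can then be queried uniformly:
role(Verb, Role) :- case_frame(Verb, Roles), member(Role, Roles).

% ?- role(fix, instr(I)).
% I = glue
```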

19 Network Representations
- Examples:
  - Semantic networks
  - Conceptual dependencies
  - Conceptual graphs

20 Semantic networks
- A general term encompassing graph representations for semantics
- Good for capturing notions of inheritance
- Think of OOP
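The inheritance idea can be sketched in a few Prolog clauses; a minimal sketch with illustrative node and property names:

```prolog
% isa/2 links form the network; prop/2 attaches properties to nodes.
isa(canary, bird).
isa(bird, animal).
prop(bird, can_fly).
prop(animal, breathes).

% A node has a property directly, or inherits it from an ancestor.
has_prop(X, P) :- prop(X, P).
has_prop(X, P) :- isa(X, Y), has_prop(Y, P).

% ?- has_prop(canary, breathes).
% true
```

This is the same mechanism as method lookup up a class hierarchy in OOP.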


22 Strengths of semantic networks
- Ease the development of lexicons through inheritance
  - Reasonably sized grammars can incorporate hundreds of features
- Provide a richer set of semantic relationships between word senses to support disambiguation

23 Conceptual dependencies
- Influential in early semantic representations
- Base the representation on a small set of primitives

24 Primitives for conceptual dependency
- Transfer
  - ATRANS: abstract transfer (as in transfer of ownership)
  - PTRANS: physical transfer
  - MTRANS: mental transfer (as in speaking)
- Bodily activity
  - PROPEL (applying force), MOVE (a body part), GRASP, INGEST, EXPEL
- Mental action
  - CONC (conceptualize or think)
  - MBUILD (perform inference)
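A hedged sketch of how a sentence reduces to one of these primitives, using illustrative slot names (actor/object/from/to are not from the slide):

```prolog
% "John gave Mary the book": an ATRANS, since ownership (an
% abstract relationship) is transferred.
cd(atrans, [actor(john), object(book), from(john), to(mary)]).

% "John told Mary a story" would use mtrans with the same shape,
% since information rather than ownership is transferred.
cd(mtrans, [actor(john), object(story), from(john), to(mary)]).
```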

25 Problems with conceptual dependency
- A very ambitious project
  - Tries to reduce all semantics to a single canonical form that is syntactically identical for all sentences with the same meaning
- The primitives turn out to be inadequate for inference
  - Must create larger structures out of primitives and compute on those structures

26 Structured representation schemes
- Frames
- Scripts

27 Frames
- Much of the inference required for NLU involves making assumptions about what is typically true of a situation
- Encode this stereotypical information in a frame
- Looks like themes, but at a higher level of abstraction

28 Frames
- For an (old) PC:
    Class PC(p):
      Roles: Keyb, Disk1, MainBox
      Constraints:
        Keyboard(Keyb) & PART_OF(Keyb, p) &
        CONNECTED_TO(Keyb, KeyboardPlug(MainBox)) &
        DiskDrive(Disk1) & PART_OF(Disk1, p) &
        CONNECTED_TO(Disk1, DiskPort(MainBox)) &
        CPU(MainBox) & PART_OF(MainBox, p)

29 Scripts
- A means of identifying common situations in a particular domain
- A means of generating expectations
  - We precompile information rather than recomputing from first principles

30 Scripts
- Travel by plane:
  - Roles: Actor, Clerk, Source, Dest, Airport, Ticket, Money, Airplane
  - Constraints: Person(Actor), Value(Money, Price(Ticket)), ...
  - Preconditions: Owns(Actor, Money), At(Actor, Source)
  - Effects: not(Owns(Actor, Money)), not(At(Actor, Source)), At(Actor, Dest)
  - Decomposition:
    - GoTo(Actor, Airport)
    - BuyTicket(Actor, Clerk, Money, Ticket), ...
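A hedged sketch of this script as Prolog data, with illustrative predicate names, plus one way expectations could be generated from it (the next step is whatever follows the steps already observed):

```prolog
script(plane_travel,
       roles([actor, clerk, source, dest, airport, ticket, money]),
       preconds([owns(actor, money), at(actor, source)]),
       effects([not(owns(actor, money)), at(actor, dest)]),
       steps([goto(actor, airport),
              buy_ticket(actor, clerk, money, ticket)])).

% Expectation generation: Done is the list of steps seen so far;
% Next is the step the script predicts.
expected(Script, Done, Next) :-
    script(Script, _, _, _, steps(Steps)),
    append(Done, [Next|_], Steps).

% ?- expected(plane_travel, [goto(actor, airport)], Next).
% Next = buy_ticket(actor, clerk, money, ticket)
```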

31 Issues with Scripts
- Script selection: how do we decide which script is relevant?
- Where are we in the script?

32 NLP -- Where are we?
- We’re five years away (??)
- Call NUANCE9 (banking/airline ticket demo)
- LSD-TALK (weather information)
- Google
- Ask Jeeves
- Office Assistant
