
1 Grammars and Parsing (Allen's Chapter 3; Jurafsky & Martin's Chapters 8-9)

2 Syntax
Why is the structure of language (syntax) important?
How do we represent syntax?
What does an example grammar for English look like?
What strategies exist to find the structure in natural language?
A Prolog program to recognise English sentences

3 Syntax shows the role of words in a sentence. "John hit Sue" vs "Sue hit John": knowing which word is the subject tells us what is going on.

4 Syntax shows how words are related in a sentence. "Visiting aunts ARE boring" vs "Visiting aunts IS boring": subject-verb agreement lets us disambiguate (aunts who visit are boring, or the activity of visiting aunts is boring).

5 Syntax shows how words are related between sentences. (a) "Italy was beating England. Germany too." (b) "Italy was being beaten by England. Germany too." The second sentence has parts missing, so on its own we cannot understand it, but syntax lets us see what is missing.

6 But syntax alone is not enough. "Visiting museums can be boring." This is not ambiguous for us, as we know there is no such thing as a "visiting museum", but syntax alone cannot show this to a computer. Compare with … "Visiting aunts can be boring."

7 How do we represent syntax? Parse tree

8 An example: parsing the sentence "They are cooking apples." (ambiguous between "are cooking" as the verb and "cooking apples" as a noun phrase)

9 Parse 1

10 Parse 2

11 How do we represent syntax? As a list. "Sue hit John":
[s, [np, [proper_noun, sue]],
    [vp, [v, hit], [np, [proper_noun, john]]]]
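For illustration, a grammar with an extra argument on each nonterminal can build exactly this kind of nested structure. The following is a minimal sketch in the DCG notation used later in the deck; the tiny proper-noun and verb lists are assumed, not from the slides:

  % a sketch: each nonterminal carries the structure it found as an argument
  s(s(NP, VP))           --> np(NP), vp(VP).
  np(np(proper_noun(N))) --> [N], { member(N, [sue, john]) }.
  vp(vp(V, NP))          --> v(V), np(NP).
  v(v(V))                --> [V], { member(V, [hit]) }.

  % ?- phrase(s(Tree), [sue, hit, john]).
  % Tree = s(np(proper_noun(sue)), vp(v(hit), np(proper_noun(john)))).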

12 Chomsky Hierarchy
Type 0: Unrestricted: rules of the form α -> β, with no restriction
Type 1: Context-Sensitive: |LHS| ≤ |RHS|
Type 2: Context-Free: |LHS| = 1
Type 3: Regular: |RHS| = 1 or 2, rules of the form A -> a | aB, or A -> a | Ba

13 What Makes a Good Grammar? Generality, selectivity, understandability

14 Generality of Grammars
Regular: {abd, ad, bcd, b, abcd, …}
  S  -> a S1 | b S2 | c S3 | d
  S1 -> b S2 | c S3 | d
  S2 -> c S3 | d
  S3 -> d
Context-Free: {a^n b^n}
  S -> ab | a S b
Context-Sensitive: {a^n b^n c^n}, or a copy language {ww} such as {abcddabcdd, abab, asease, …}
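To see the context-free case in runnable form, the S -> ab | a S b grammar can be written as a short Prolog DCG; this is just a sketch, and the nonterminal name anbn is my own:

  % {a^n b^n}: n a's followed by n b's, n >= 1
  anbn --> [a], [b].
  anbn --> [a], anbn, [b].

  % ?- phrase(anbn, [a, a, b, b]).   % true
  % ?- phrase(anbn, [a, a, b]).      % false: the counts must match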

15 What strategies exist for trying to find the structure in natural language? Top-down vs. bottom-up.
Bottom-up:
  John, hit, the, cat
  prpn, hit, the, cat
  np, hit, the, cat
  np, v, the, cat
  np, v, det, cat
  np, v, det, n
  np, v, np
  np, vp
  s
Top-down:
  s
  s -> np, vp
  s -> prpn, vp
  s -> John, vp
  s -> John, v, np
  s -> John, hit, np
  s -> John, hit, det, n
  s -> John, hit, the, n
  s -> John, hit, the, cat

16 What strategies exist for trying to find the structure in natural language? Top-down vs. bottom-up (the same traces as on the previous slide), and their trade-offs:
Bottom-up: better if there are many alternative rules for a phrase; worse if there are many alternative terminal symbols (lexical categories) for each word.
Top-down: better if there are many alternative terminal symbols for each word; worse if there are many alternative rules for a phrase.
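As a concrete, purely illustrative companion to the bottom-up trace, here is a standalone Prolog sketch; the predicates word/2, rule/2, reduce/2 and parse/1 are my own names, not from the slides:

  % lexicon and rules for the "John hit the cat" example
  word(john, prpn).   word(hit, v).
  word(the, det).     word(cat, n).

  rule(np, [prpn]).
  rule(np, [det, n]).
  rule(vp, [v, np]).
  rule(s,  [np, vp]).

  % parse(+Words): succeed if the words can be reduced to a single s
  parse(Words) :-
      maplist(word, Words, Cats),    % words -> lexical categories
      reduce(Cats, [s]).

  % replace one occurrence of a rule's right-hand side by its left-hand side,
  % repeating until the goal sequence is reached
  reduce(Cats, Cats).
  reduce(Cats, Result) :-
      rule(LHS, RHS),
      append(Prefix, Rest, Cats),
      append(RHS, Suffix, Rest),
      append(Prefix, [LHS|Suffix], New),
      reduce(New, Result).

  % ?- parse([john, hit, the, cat]).   % true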

17 What does an example grammar for English look like? Rewrite rules:
1. sentence -> noun phrase, verb phrase
2. noun phrase -> art, noun
3. noun phrase -> art, adj, noun
4. verb phrase -> verb
5. verb phrase -> verb, noun phrase

18 Parsing as a search procedure
1. Select the first state from the possibilities list (and remove it from the list).
2. Generate the new states by trying every possible option from the selected state (there may be none if we are on a bad path).
3. Add the states generated in step 2 to the possibilities list.
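The following standalone Prolog sketch runs this loop depth-first over the rewrite rules of the previous slide; the lexicon, the state representation, and the predicate names search/2, step/2 and goal/1 are my own assumptions, not from the slides:

  % rules 1-5 from the previous slide, plus a tiny assumed lexicon
  rule(s,  [np, vp]).
  rule(np, [art, n]).
  rule(np, [art, adj, n]).
  rule(vp, [v]).
  rule(vp, [v, np]).

  word(the, art).   word(dog, n).   word(cried, v).

  % a state pairs the symbols still to be found with the words still unread;
  % the goal is to have used up both
  goal(state([], [])).

  % option 1: rewrite the leftmost symbol with a grammar rule
  step(state([Sym|Syms], Words), state(New, Words)) :-
      rule(Sym, RHS),
      append(RHS, Syms, New).
  % option 2: match the leftmost symbol against the next word
  step(state([Sym|Syms], [W|Ws]), state(Syms, Ws)) :-
      word(W, Sym).

  % the search loop: take the first state, generate its successors, and put
  % them on the FRONT of the possibilities list (depth-first)
  search([State|_], State) :- goal(State).
  search([State|Rest], Final) :-
      findall(Next, step(State, Next), Children),
      append(Children, Rest, Agenda),
      search(Agenda, Final).

  % ?- search([state([s], [the, dog, cried])], Final).   % succeeds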

19 Top-down parsing: 1 The 2 dog 3 cried 4
Step  Current state      Backup states                   Comment
1     ((S) 1)            -                               initial position
2     ((NP VP) 1)        -                               Rule 1
3     ((ART N VP) 1)     ((ART ADJ N VP) 1)              Rules 2 & 3
4     ((N VP) 2)         ((ART ADJ N VP) 1)              match ART with "the"
5     ((VP) 3)           ((ART ADJ N VP) 1)              match N with "dog"
6     ((V) 3)            ((V NP) 3), ((ART ADJ N VP) 1)  Rules 4 & 5
7     Success

20 What strategies exist for trying to find the structure in natural language? Depth-first vs. breadth-first.
Depth-first: try rules one at a time and backtrack if you get stuck; easier to program; less memory required; good if the parse tree is deep.
Breadth-first: try all rules at the same time; can be faster; the order of rules is not important; good if the tree is flat.
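Reusing goal/1 and step/2 from the earlier search sketch, switching from depth-first to breadth-first only changes where new states join the possibilities list (again, just a sketch):

  % breadth-first variant: children go to the BACK of the possibilities list,
  % so it behaves as a queue rather than a stack
  search_bf([State|_], State) :- goal(State).
  search_bf([State|Rest], Final) :-
      findall(Next, step(State, Next), Children),
      append(Rest, Children, Agenda),
      search_bf(Agenda, Final).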

21 An Example of Top-Down Parsing: 1 The 2 old 3 man 4 cried 5

22 Depth-First Search versus Breadth-First Search

23 What does a Prolog program look like that tries to recognise English sentences? As a definite clause grammar (DCG):
s --> np, vp.
np --> det, n.
np --> det, adj, n.
vp --> v, np.
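The DCG above has no lexical rules, so on its own it cannot be run. A possible completion (a sketch, reusing the word lists from the next slide) and a query with phrase/2:

  det --> [D], { member(D, [the, a, an]) }.
  n   --> [N], { member(N, [cat, dog, mat, meat, fish]) }.
  adj --> [A], { member(A, [big, fat, red]) }.
  v   --> [V], { member(V, [ate, saw, killed, pushed]) }.

  % ?- phrase(s, [the, big, cat, ate, the, fish]).   % true
  % ?- phrase(s, [cat, the, ate]).                   % false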

24 What does a Prolog program look like that tries to recognise English sentences? As plain Prolog clauses using append/3:
sentence(S) :-
    noun_phrase(NP), verb_phrase(VP), append(NP, VP, S).
noun_phrase(NP) :-
    determiner(D), noun(N), append(D, N, NP).
noun_phrase(NP) :-
    determiner(D), adj(A), noun(N), append(D, A, AP), append(AP, N, NP).
verb_phrase(VP) :-
    verb(V), noun_phrase(NP), append(V, NP, VP).
determiner([D]) :- member(D, [the, a, an]).
noun([N]) :- member(N, [cat, dog, mat, meat, fish]).
adj([A]) :- member(A, [big, fat, red]).
verb([V]) :- member(V, [ate, saw, killed, pushed]).
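This version recognises by generating candidate phrases and checking them against the input, so queries behave roughly like this (a sketch of the expected behaviour, not slide output):

  % ?- sentence([the, cat, ate, the, meat]).
  % true.
  % ?- sentence(S).            % enumerates accepted sentences on backtracking
  % S = [the, cat, ate, the, cat] ;
  % S = [the, cat, ate, the, dog] ;
  % ...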

25 Pattern matching as an alternative (e.g., Eliza)
This uses a database of input/output pairs: the input part of each pair is a template to be matched against the user's input, and the output part is given as the response.
  X computers Y => Do computers interest you?
  X mother Y => Tell me more about your family.
But …
Nothing is known about structure (syntax):
  I X you => Why do you X me?   Fine for X = "like", but not for X = "do not know".
Nothing is known about meaning (semantics):
  I feel X => I'm sorry you feel X.   Fine for X = "depressed", but not for X = "happy".
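A minimal Prolog sketch of this style of matching (the templates and respond/2 are illustrative, not Eliza's actual rule base) shows both how it works and why it goes wrong without syntax:

  % respond(+Input, -Response): match the input word list against templates
  respond(Input, [do, computers, interest, you]) :-
      member(computers, Input).
  respond(Input, [tell, me, more, about, your, family]) :-
      member(mother, Input).
  respond(Input, Response) :-
      append([i|X], [you], Input),                 % matches "I ... you"
      append([why, do, you|X], [me], Response).

  % ?- respond([i, like, you], R).
  % R = [why, do, you, like, me].                  % fine
  % ?- respond([i, do, not, know, you], R).
  % R = [why, do, you, do, not, know, me].         % nonsense: no syntax is used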

