Syllabus Text Books Classes Reading Material Assignments Grades Links Forum

Natural Language Processing - Lecture Seven
Tagging, Partial Parsing, Context-Free Grammar
Ido Dagan, Department of Computer Science, Bar-Ilan University
Up from Bigrams
The POS tagging model we described used a history of just the previous tag:
P(t_i | t_1, …, t_{i-1}) ≈ P(t_i | t_{i-1})
i.e., a first-order Markov assumption. In this case each state in the HMM corresponds to a POS tag.
One can also build an HMM for POS trigrams:
P(t_i | t_1, …, t_{i-1}) ≈ P(t_i | t_{i-2}, t_{i-1})
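The bigram transition probabilities P(t_i | t_{i-1}) are estimated from tag counts in a training corpus. A minimal sketch, using a hypothetical two-sentence toy corpus (tag sequences only):

```python
from collections import Counter

# Hypothetical toy corpus: tag sequences only, one list per sentence.
sentences = [
    ["DT", "NN", "VBZ"],
    ["DT", "JJ", "NN", "VBZ"],
]

bigram_counts = Counter()
context_counts = Counter()
for tags in sentences:
    padded = ["<s>"] + tags          # start-of-sentence symbol
    for prev, cur in zip(padded, padded[1:]):
        bigram_counts[(prev, cur)] += 1
        context_counts[prev] += 1

def p_transition(cur, prev):
    """Maximum-likelihood estimate of P(t_i = cur | t_{i-1} = prev)."""
    return bigram_counts[(prev, cur)] / context_counts[prev]

print(p_transition("NN", "DT"))   # → 0.5 (DT is followed by NN once out of twice)
print(p_transition("DT", "<s>"))  # → 1.0 (both toy sentences start with DT)
```

The same counting scheme extends to trigrams by conditioning on the previous two tags, which is exactly where the sparseness problem of the next slide comes from.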
POS Trigram HMM Model
More accurate (also empirically) than a bigram model:
–He clearly marked
–is clearly marked
Sparseness problem – smoothing, back-off
In such a model the HMM states do NOT correspond to single POS tags; each state corresponds to a pair of tags.
Why not 4-grams?
–Too many states, not enough data!
Supervised/Unsupervised
Is HMM-based tagging a supervised algorithm?
–Yes, because we need a tagged corpus to estimate the transition and emission probabilities.
What do we do if we don't have an annotated corpus but
–have a dictionary, or
–have an annotated corpus from a different domain and an un-annotated corpus in the desired domain?
Baum-Welch Algorithm
Also known as the Forward-Backward Algorithm
An EM algorithm for HMMs: maximization by iterative hill climbing
The algorithm iteratively improves the model parameters based on un-annotated training data.
Baum-Welch Algorithm…
Start off with parameters based on the dictionary:
–P(w|t) = 1 if t is a possible tag for w
–P(w|t) = 0 otherwise
–Uniform distribution on state transitions
This is enough to bootstrap from.
Could also be used to tune a system to a new domain.
But the best results, and common practice, come from supervised estimation.
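The dictionary-based initialization above can be sketched as follows. The three-word dictionary is a hypothetical toy example, and the emission mass is spread uniformly over each tag's words so that P(·|t) is a proper distribution (the slide's P(w|t) = 1 is the unnormalized version of the same idea):

```python
# Hypothetical dictionary: word → list of possible tags.
dictionary = {
    "the": ["DT"],
    "can": ["MD", "NN", "VB"],
    "fish": ["NN", "VB"],
}

tags = sorted({t for ts in dictionary.values() for t in ts})

# Uniform distribution on state transitions.
transition = {(t1, t2): 1.0 / len(tags) for t1 in tags for t2 in tags}

# P(w|t) is nonzero only if t is a possible tag for w; the mass is
# spread uniformly over the words listed for each tag.
words_for = {t: [w for w, ts in dictionary.items() if t in ts] for t in tags}
emission = {(w, t): 1.0 / len(words_for[t])
            for t in tags for w in words_for[t]}

print(transition[("DT", "NN")])   # → 0.25 (4 tags, uniform)
print(emission[("fish", "NN")])   # → 0.5  (NN can emit "can" or "fish")
```

Baum-Welch iterations would then re-estimate both tables from un-annotated text, keeping the zero emissions at zero.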
Completely unsupervised?
What if there is no dictionary and no annotated corpus?
Then POS tagging becomes clustering – the resulting classes don't correspond to linguistic POS tags.
Syntax
"The study of grammatical relations between words and other units within the sentence."
– The Concise Oxford Dictionary of Linguistics
"The way in which linguistic elements (as words) are put together to form constituents (as phrases or clauses)."
– Merriam-Webster Dictionary
Brackets
"I prefer a morning flight"
[S [NP [Pro I]] [VP [V prefer] [NP [Det a] [Nom [N morning] [N flight]]]]]
Parse Tree
(S (NP (Pronoun I))
   (VP (Verb prefer)
       (NP (Det a)
           (Nominal (Noun morning) (Noun flight)))))
Parsing
The problem of mapping from a string of words to its parse tree is called parsing.
Generative Grammar
A set of rules which indicates precisely what can and cannot be a sentence in a language.
A grammar which precisely specifies the membership of the set of all grammatical sentences in the language in question, and therefore excludes all ungrammatical sentences.
Formal Language Approach
The set of all grammatical sentences in a given natural language
–In practice, a complete grammar is not available
Are natural languages regular?
English is not a regular language!
a^n b^n is not regular.
Look at the following English sentences:
–John and Mary like to eat and sleep, respectively.
–John, Mary, and Sue like to eat, sleep, and dance, respectively.
–John, Mary, Sue, and Bob like to eat, sleep, dance, and cook, respectively.
Another example: the "anti-missile missile" construction (anti-anti-missile-missile missile, …).
Constituents
Certain groupings of words behave as constituents.
Constituents are able to occur in various sentence positions:
–ראיתי את הילד הרזה ("I saw the thin boy")
–ראיתי אותו מדבר עם הילד הרזה ("I saw him talking with the thin boy")
–הילד הרזה גר ממול ("The thin boy lives across the street")
The Noun Phrase (NP)
Examples:
–He
–Ariel Sharon
–The prime minister
–The minister of defense during the war in Lebanon
They can all appear in a similar context: ___ was born in Kfar-Malal
Prepositional Phrases
Examples:
–the man in the white suit
–Come and look at my paintings
–Are you fond of animals?
–Put that thing on the floor
Verb Phrases
Examples:
–He went
–He was trying to keep his temper.
–She quickly showed me the way to hide.
Chunking
Text chunking is dividing sentences into non-overlapping phrases.
Noun-phrase chunking deals with extracting the noun phrases from a sentence.
While NP chunking is much simpler than parsing, it is still challenging to build an accurate and very efficient NP chunker.
What is it good for?
Chunking is useful in many applications:
–Information Retrieval & Question Answering
–Machine Translation
–Preprocessing before full syntactic analysis
–Text-to-speech
–…
What kind of structures should a partial parser identify?
Different structures are useful for different tasks:
–Partial constituent structure: [NP I] [VP saw [NP a tall man in the park]].
–Prosodic segments: [I saw] [a tall man] [in the park].
–Content word groups: [I] [saw] [a tall man] [in the park].
Chunk Parsing
Goal: divide a sentence into a sequence of chunks.
Chunks are non-overlapping regions of a text:
–[I] saw [a tall man] in [the park].
Chunks are non-recursive:
–a chunk cannot contain other chunks
Chunks are non-exhaustive:
–not all words must be included in chunks
Chunk Parsing Examples
Noun-phrase chunking:
–[I] saw [a tall man] in [the park].
Verb-phrase chunking:
–The man who [was in the park] [saw me].
Prosodic chunking:
–[I saw] [a tall man] [in the park].
Chunks and Constituency
Constituents: [a tall man in [the park]].
Chunks: [a tall man] in [the park].
Chunks are not constituents:
–Constituents are recursive
Chunks are typically subsequences of constituents:
–Chunks do not cross constituent boundaries
Chunk Parsing: Accuracy
Chunk parsing achieves higher accuracy:
–Smaller solution space
–Less word-order flexibility within chunks than between chunks
–Better locality: fewer long-range dependencies, less context dependence
–No need to resolve attachment ambiguity
–Less error propagation
Chunk Parsing: Domain Specificity
Chunk parsing is less domain specific:
Dependencies on lexical/semantic information tend to occur at levels "higher" than chunks:
–Attachment
–Argument selection
–Movement
Fewer stylistic differences within chunks
Chunk Parsing: Efficiency
Chunk parsing is more efficient:
–Smaller solution space
–Relevant context is small and local
–Chunks are non-recursive
–Chunk parsing can be implemented with a finite state machine
Psycholinguistic Motivations
Chunk parsing is psycholinguistically motivated:
Chunks as processing units
–Humans tend to read texts one chunk at a time (eye-movement tracking studies)
Chunks are phonologically marked
–Pauses, stress patterns
Chunking might be a first step in full parsing
Chunk Parsing Techniques
Chunk parsers usually ignore lexical content; they only need to look at part-of-speech tags.
Techniques for implementing chunk parsing:
–Regular expression matching / Finite State Machines (see next)
–Transformation-Based Learning
–Memory-Based Learning
–Other tagging-style methods (see further)
Regular Expression Matching
Define a regular expression that matches the sequences of tags in a chunk
–A simple noun-phrase chunk regexp (over tags): <DT>? <JJ>* <NN>*
–Chunk all matching subsequences:
the/DT little/JJ cat/NN sat/VBD on/IN the/DT mat/NN
[the/DT little/JJ cat/NN] sat/VBD on/IN [the/DT mat/NN]
–If matching subsequences overlap, the first one or the longest one gets priority
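A minimal sketch of this idea in Python: match a chunk pattern (here DT? JJ* NN+, a close variant of the rule above requiring at least one noun) against the tag sequence, then map matches back to word spans. The function name and representation are illustrative, not from the slides:

```python
import re

def np_chunk(tagged):
    """Return (start, end) token spans of NP chunks in a (word, tag) list."""
    s = ""
    starts = []                       # character offset where each tag begins
    for _, tag in tagged:
        starts.append(len(s))
        s += tag + " "
    spans = []
    # DT? JJ* NN+ over the space-separated tag string.
    for m in re.finditer(r"(?:DT )?(?:JJ )*(?:NN )+", s):
        if m.start() in starts:       # keep only matches on token boundaries
            i = starts.index(m.start())
            j = starts.index(m.end()) if m.end() in starts else len(tagged)
            spans.append((i, j))
    return spans

sent = [("the", "DT"), ("little", "JJ"), ("cat", "NN"), ("sat", "VBD"),
        ("on", "IN"), ("the", "DT"), ("mat", "NN")]
for i, j in np_chunk(sent):
    print([w for w, _ in sent[i:j]])
# → ['the', 'little', 'cat'] then ['the', 'mat']
```

Because `re.finditer` is greedy and non-overlapping, the first/longest match wins, as the slide specifies.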
Chunking as Tagging
Map part-of-speech tag sequences to {I,O,B}*:
I – tag is part of an NP chunk
O – tag is not part of an NP chunk
B – the first tag of an NP chunk which immediately follows another NP chunk
Alternative tag sets: Begin, End, Outside
Example:
–Input: The teacher gave Sara the book
–Output: I I O I B I
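The mapping from chunks to {I,O,B} tags can be sketched directly; the span-based input format (token index ranges) is an assumption for illustration:

```python
def spans_to_iob(n, spans):
    """IOB-tag n tokens given non-overlapping chunk (start, end) spans."""
    tags = ["O"] * n
    prev_end = None
    for start, end in sorted(spans):
        # B only when this chunk starts right where the previous one ended.
        tags[start] = "B" if start == prev_end else "I"
        for k in range(start + 1, end):
            tags[k] = "I"
        prev_end = end
    return tags

# "The teacher gave Sara the book": [The teacher] gave [Sara] [the book]
print(spans_to_iob(6, [(0, 2), (3, 4), (4, 6)]))
# → ['I', 'I', 'O', 'I', 'B', 'I']
```

Note how "the" in "the book" gets B: its chunk immediately follows the "Sara" chunk, matching the slide's example output.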
Chunking State of the Art
When addressed as tagging, methods similar to POS tagging can be used:
–HMM – combining POS and IOB tags
–TBL – rules based on POS and IOB tags
Depending on task specification and test set: 90-95% accuracy for NP chunks
Homework
Context-Free Grammars
Putting the constituents together
Problems with FS Grammars
Reasons why finite-state grammars may be inadequate:
–Cannot represent constituency adequately
–Cannot represent structural ambiguity
–Cannot deal with recursion
Recursion occurs when an expansion of a non-terminal includes the non-terminal itself, e.g. an NP containing a PP, which contains an NP.
Context-Free Grammars
A context-free grammar consists of:
–a lexicon – a set of terminals
–a set of non-terminals
–a set of rules or productions
A Baby CF Grammar for NPs
NP → Det Nominal
NP → ProperNoun
NP → Pronoun
Nominal → Noun
Nominal → Noun Nominal
Det → a
Det → the
Pronoun → I
Noun → flight
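This grammar can be used generatively. A small sketch: represent the rules as a Python dict and expand symbols recursively. The random choice among productions and the ProperNoun terminal "Chicago" (borrowed from the combined grammar shown later) are illustrative assumptions:

```python
import random

# The baby NP grammar as a dict; a symbol is terminal iff it has no rules.
grammar = {
    "NP": [["Det", "Nominal"], ["ProperNoun"], ["Pronoun"]],
    "Nominal": [["Noun"], ["Noun", "Nominal"]],
    "Det": [["a"], ["the"]],
    "Pronoun": [["I"]],
    "Noun": [["flight"]],
    "ProperNoun": [["Chicago"]],   # terminal borrowed from the combined grammar
}

def generate(symbol, rng):
    """Randomly expand a symbol top-down into a list of terminals."""
    if symbol not in grammar:
        return [symbol]
    out = []
    for sym in rng.choice(grammar[symbol]):
        out.extend(generate(sym, rng))
    return out

print(" ".join(generate("NP", random.Random(0))))
```

The recursive Nominal rule means the language is infinite ("a flight", "the flight flight", …), something no finite list of phrases could capture.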
Context-Free Rules
Each context-free rule is of the form: A → γ
–A is a single non-terminal symbol
–γ is an ordered list of one or more terminals and non-terminals
Start symbol: S (sentence)
Context-Free Grammars
CFGs can be used for analyzing or for generating.
"→" means: rewrite the symbol on the left with the string of symbols on the right.
Example: NP → Det Nominal → Det Noun → a Noun → a flight
Context-Free Grammars
A sequence of rule expansions is called a derivation of a string of words.
Formally, a particular CF language is the set of strings which can be derived from a particular CF grammar.
A derivation is represented by a parse tree.
The Combined Baby CFG
S → NP VP
NP → Pronoun | Det Nominal | ProperNoun
Nominal → Noun | Noun Nominal
VP → Verb NP | Verb NP PP | Verb PP
PP → Preposition NP
Noun → flight | flights | trip | morning | ...
Verb → is | prefer | like | need | want | fly | ...
Adjective → cheapest | non-stop | first | direct | ...
Pronoun → me | I | you | it | ...
ProperNoun → Los Angeles | Chicago | Alaska | ...
Det → a | the | an | this | these | that | ...
Preposition → from | to | on | near | ...
Conjunction → and | or | but | ...
Formal Definition of a CFG
A context-free grammar is a 4-tuple:
–A set of non-terminal symbols N
–A set of terminal symbols Σ (disjoint from N)
–A set of productions P, each of the form A → α, where A is a non-terminal and α is a string of symbols from the infinite set of strings (Σ ∪ N)*
–A designated start symbol S
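The 4-tuple can be written down directly as a data structure; the representation choices (frozensets, productions as pairs) are illustrative. The instance encodes the baby NP grammar from earlier, omitting the ProperNoun rule, which had no terminal expansion there:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class CFG:
    nonterminals: frozenset    # N
    terminals: frozenset       # Σ, disjoint from N
    productions: tuple         # pairs (A, rhs): A in N, rhs a tuple over Σ ∪ N
    start: str                 # designated start symbol

baby_np = CFG(
    nonterminals=frozenset({"NP", "Nominal", "Det", "Pronoun", "Noun"}),
    terminals=frozenset({"a", "the", "I", "flight"}),
    productions=(
        ("NP", ("Det", "Nominal")), ("NP", ("Pronoun",)),
        ("Nominal", ("Noun",)), ("Nominal", ("Noun", "Nominal")),
        ("Det", ("a",)), ("Det", ("the",)),
        ("Pronoun", ("I",)), ("Noun", ("flight",)),
    ),
    start="NP",
)
assert baby_np.nonterminals.isdisjoint(baby_np.terminals)   # Σ disjoint from N
```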
CF Language
If A → β is a production of P, and α and γ are any strings in (Σ ∪ N)*, we say that αAγ directly derives αβγ, written αAγ ⇒ αβγ.
Derivation is a generalization of direct derivation:
–Let α1, α2, …, αm be strings in (Σ ∪ N)*, m ≥ 1, such that α1 ⇒ α2, α2 ⇒ α3, …, α(m-1) ⇒ αm
–We say that α1 derives αm, written α1 ⇒* αm
CF Language
The language L(G) generated by a grammar G is the set of strings composed of terminal symbols which can be derived from the designated start symbol S:
L(G) = {w | w ∈ Σ* and S ⇒* w}
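The definition of L(G) can be made concrete with a (very inefficient) breadth-first enumeration of derivations; real parsers do this far more cleverly. The grammar below generates a^n b^n, the non-regular language mentioned earlier, and the length-based pruning is an assumption that holds because no rule here shrinks a string:

```python
from collections import deque

grammar = {"S": [["a", "S", "b"], ["a", "b"]]}   # generates a^n b^n, n ≥ 1

def derives(start, target, max_pops=10_000):
    """Is target (a list of terminals) derivable from start, i.e. S ⇒* w?"""
    target = tuple(target)
    frontier = deque([(start,)])
    seen = {(start,)}
    pops = 0
    while frontier and pops < max_pops:
        form = frontier.popleft()
        pops += 1
        if form == target:
            return True
        if len(form) > len(target):
            continue                  # rules here never shrink a string
        for i, sym in enumerate(form):
            for rhs in grammar.get(sym, []):
                new = form[:i] + tuple(rhs) + form[i + 1:]
                if new not in seen:
                    seen.add(new)
                    frontier.append(new)
    return False

print(derives("S", ["a", "a", "b", "b"]))  # → True
print(derives("S", ["a", "b", "b"]))       # → False
```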
Grammar Equivalence
Two grammars are strongly equivalent if they generate the same set of strings and assign the same phrase structure to each sentence.
Two grammars are weakly equivalent if they generate the same set of strings but do not assign the same phrase structures.