Introduction to Deep Processing Techniques for NLP
Ling 571, January 5, 2015
Gina-Anne Levow

Roadmap: Motivation: Applications; Language and Thought; Knowledge of Language; Cross-cutting themes: Ambiguity, Evaluation, & Multi-linguality; Course Overview.

Motivation: Applications. Applications of speech and language processing include call routing, information retrieval, question answering, machine translation, dialog systems, spell- and grammar-checking, sentiment analysis, information extraction, …

Building on Many Fields. Linguistics: morphology, phonology, syntax, semantics, … Psychology: reasoning, mental representations. Formal logic. Philosophy (of language). Theory of computation: automata, … Artificial intelligence: search, reasoning, knowledge representation, machine learning, pattern matching. Probability, …

Language & Intelligence. Turing Test (1950): operationalize intelligence. Two contestants: a human and a computer; the judge is human. Test: interact via text questions. Question: can you tell which contestant is human? Crucially, this requires language use and understanding.

Limitations of Turing Test. ELIZA (Weizenbaum, 1966) simulates a Rogerian therapist. User: You are like my father in some ways. ELIZA: WHAT RESEMBLANCE DO YOU SEE. User: You are not very aggressive. ELIZA: WHAT MAKES YOU THINK I AM NOT AGGRESSIVE. … Passes the Turing Test!! (sort of): “You can fool some of the people....” It uses a simple pattern-matching technique; true understanding requires deeper analysis & processing.
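The pattern-matching idea is easy to see in code. Below is a toy, ELIZA-style responder; the rules and response templates are invented for this sketch and are far simpler than Weizenbaum's actual keyword-and-rank machinery:

```python
import re

# Ordered (pattern, response-template) rules; the first match wins.
# These rules and the fallback are invented for illustration.
RULES = [
    (r'.*\byou are (.*)', r'WHAT MAKES YOU THINK I AM \1'),
    (r'.*\bmy (mother|father|family)\b.*', r'TELL ME MORE ABOUT YOUR \1'),
    (r'.*', 'PLEASE GO ON'),
]

def respond(utterance):
    for pattern, template in RULES:
        match = re.match(pattern, utterance, re.IGNORECASE)
        if match:
            return match.expand(template).upper()

print(respond("You are not very aggressive"))
# WHAT MAKES YOU THINK I AM NOT VERY AGGRESSIVE
```

No model of meaning is involved anywhere: the response is assembled purely from the surface string, which is exactly the limitation the slide points out.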

Turing Test Revived. “On the web, no one knows you’re a….” Problem: ‘bots’: automated agents swamp services. Challenge: prove you’re human, via a test a human can pass but a ‘bot can’t. Solution: CAPTCHAs. Distorted images: trivial for a human, hard for a ‘bot. Key: perception, not reasoning.

Knowledge of Language. What does HAL (of 2001, A Space Odyssey) need to know to converse? Dave: Open the pod bay doors, HAL. HAL: I'm sorry, Dave. I'm afraid I can't do that. Phonetics & Phonology (Ling 450/550): the sounds of a language, acoustics; legal sound sequences in words.

Knowledge of Language. What does HAL (of 2001, A Space Odyssey) need to know to converse? Dave: Open the pod bay doors, HAL. HAL: I'm sorry, Dave. I'm afraid I can't do that. Morphology (Ling 570): recognize and produce variation in word forms. Singular vs. plural: door + sg → door; door + plural → doors. Verb inflection: be + 1st person sg present → am.
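As a toy illustration of generation (a lookup table standing in for what a real system would do with finite-state transducers; the entries mirror the slide's examples):

```python
# Irregular forms from a small table; a naive default '-s' plural otherwise.
LEXICON = {
    ('door', 'sg'): 'door',
    ('door', 'pl'): 'doors',
    ('be', '1sg.pres'): 'am',
}

def generate(lemma, features):
    if (lemma, features) in LEXICON:
        return LEXICON[(lemma, features)]
    if features == 'pl':          # crude default rule for regular plurals
        return lemma + 's'
    return lemma

print(generate('door', 'pl'))      # doors
print(generate('be', '1sg.pres'))  # am
```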

Knowledge of Language. What does HAL (of 2001, A Space Odyssey) need to know to converse? Dave: Open the pod bay doors, HAL. HAL: I'm sorry, Dave. I'm afraid I can't do that. Part-of-speech tagging (Ling 570): identify how each word is used in the sentence. Here bay is a noun, not a verb or adjective.
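For a concrete sense of the task, NLTK ships an off-the-shelf tagger. A minimal sketch, assuming nltk is installed along with its punkt and averaged_perceptron_tagger resources; the exact tags can vary by model version:

```python
import nltk

# One-time downloads:
#   nltk.download('punkt'); nltk.download('averaged_perceptron_tagger')
tokens = nltk.word_tokenize("Open the pod bay doors, HAL.")
print(nltk.pos_tag(tokens))
# e.g. [('Open', 'VB'), ('the', 'DT'), ('pod', 'NN'), ('bay', 'NN'),
#       ('doors', 'NNS'), (',', ','), ('HAL', 'NNP'), ('.', '.')]
```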

Knowledge of Language. What does HAL (of 2001, A Space Odyssey) need to know to converse? Dave: Open the pod bay doors, HAL. HAL: I'm sorry, Dave. I'm afraid I can't do that. Syntax (Ling 566: analysis; Ling 570: chunking; Ling 571: parsing): order and group words in a sentence. *I’m I do, sorry that afraid Dave I can’t.

Knowledge of Language. What does HAL (of 2001, A Space Odyssey) need to know to converse? Dave: Open the pod bay doors, HAL. HAL: I'm sorry, Dave. I'm afraid I can't do that. Semantics (Ling 571): word meaning, individual (lexical) and combined (compositional). ‘Open’: AGENT cause THEME to become open; ‘pod bay doors’: (pod bay) doors.

Knowledge of Language. What does HAL (of 2001, A Space Odyssey) need to know to converse? Dave: Open the pod bay doors, HAL. (request) HAL: I'm sorry, Dave. I'm afraid I can't do that. (statement) Pragmatics/Discourse/Dialogue (Ling 571): interpret utterances in context. Speech act (request, statement). Reference resolution: I = HAL; that = ‘open doors’. Politeness: I’m sorry, I’m afraid I can’t.

Language Processing Pipeline: shallow processing and deep processing.

Shallow vs Deep Processing. Shallow processing (Ling 570): usually relies on surface forms (e.g., words) and less elaborate linguistic representations, e.g., HMM POS-tagging, FST morphology. Deep processing (Ling 571): relies on more elaborate linguistic representations, e.g., deep syntactic analysis (parsing) and rich spoken language understanding (NLU).

Cross-cutting Themes Ambiguity How can we select among alternative analyses? Evaluation How well does this approach perform: On a standard data set? When incorporated into a full system? Multi-linguality Can we apply this approach to other languages? How much do we have to modify it to do so?

Ambiguity. “I made her duck” means….
- I caused her to duck down
- I made the (carved) duck she has
- I cooked duck for her
- I cooked the duck she owned
- I magically turned her into a duck

Ambiguity: POS. In “I made her duck”, duck may be a verb (I caused her to duck down) or a noun (the animal or figurine), and her may be a pronoun (I cooked duck for her) or a possessive (I cooked the duck she owned).

Ambiguity: Syntax. “I made her duck” means….
I made the (carved) duck she has: (VP (V made) (NP (POSS her) (N duck)))
I cooked duck for her: (VP (V made) (NP (PRON her)) (NP (N duck)))
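Both analyses can be built from their bracketed strings and displayed with NLTK's Tree class (a sketch, assuming nltk is installed):

```python
import nltk

# 'made' + possessive NP: I made [her duck] (the duck she has)
possessive = nltk.Tree.fromstring(
    "(VP (V made) (NP (POSS her) (N duck)))")
# 'made' + two NPs (ditransitive): I made [her] [duck]
ditransitive = nltk.Tree.fromstring(
    "(VP (V made) (NP (PRON her)) (NP (N duck)))")

possessive.pretty_print()
ditransitive.pretty_print()
```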

Ambiguity: Semantics. “I made her duck” means….
- I caused her to duck down (make: AG cause TH to do sth)
- I cooked duck for her (make: AG cook TH for REC)
- I cooked the duck she owned (make: AG cook TH)
- I magically turned her into a duck (duck: animal)
- I made the (carved) duck she has (duck: duck-shaped figurine)

Ambiguity is pervasive and pernicious, and particularly challenging for computational systems; it is a problem we will return to again and again in class.

Course Information

Syntax. Ling 571: Deep Processing Techniques for Natural Language Processing, January 5, 2015.

Roadmap: Sentence structure; motivation: more than a bag of words; constituency; representation: context-free grammars; formal definition of context-free grammars; Chomsky hierarchy; why not finite state?; aside: context-sensitivity.

More than a Bag of Words. Sentences are structured. Structure impacts meaning: dog bites man vs. man bites dog. Structure impacts acceptability: *dog man bites.

Constituency. Constituents are the basic units of sentences: a word or group of words that acts as a single unit. Phrases: noun phrase (NP), verb phrase (VP), prepositional phrase (PP), etc. A phrase acts as a single unit whose type is determined by its head (e.g., an NP is headed by an N).

Constituency. How can we tell what units are constituents? On September seventeenth, I’d like to fly from Sea-Tac Airport to Denver. Candidates: September seventeenth; on September seventeenth; Sea-Tac Airport; from Sea-Tac Airport.

Constituency Testing. Constituents appear in similar contexts (PPs, NPs). Preposed or postposed constructions: On September seventeenth, I’d like to fly from Sea-Tac Airport to Denver. / I’d like to fly from Sea-Tac Airport to Denver on September seventeenth. A constituent must move as a unit: *On I’d like to fly September seventeenth from Sea-Tac Airport to Denver. *I’d like to fly on September from Sea-Tac airport to Denver seventeenth.

Representing Sentence Structure. Captures constituent structure: basic units and phrases. Captures subcategorization and argument structure: the components expected by verbs. Hierarchical.

Representation: Context-free Grammars. A CFG is a 4-tuple: a set of terminal symbols Σ; a set of non-terminal symbols N; a set of productions P of the form A → α, where A is a non-terminal and α ∈ (Σ ∪ N)*; and a designated start symbol S. The language defined is L = { w | w ∈ Σ* and S ⇒* w }, where S ⇒* w means S derives w by some sequence of rule applications.
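A sketch of the 4-tuple written directly in Python; this tuple-of-pairs encoding of productions is an arbitrary choice made for illustration, not a standard format:

```python
# (Sigma, N, P, S) for the toy grammar on the next slide.
Sigma = {'the', 'cat', 'dog', 'bit', 'bites', 'man'}
N = {'S', 'NP', 'VP', 'Nom', 'Det', 'V', 'N'}
P = [
    ('S',   ('NP', 'VP')),
    ('NP',  ('Det', 'Nom')),
    ('Nom', ('N', 'Nom')), ('Nom', ('N',)),   # Nom -> N Nom | N
    ('VP',  ('V', 'NP')),
    ('N', ('cat',)), ('N', ('dog',)), ('N', ('man',)),
    ('Det', ('the',)), ('V', ('bit',)), ('V', ('bites',)),
]
S = 'S'
grammar = (Sigma, N, P, S)
```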

Representation: Context-free Grammars. Partial example:
Σ: the, cat, dog, bit, bites, man
N: S, NP, VP, AdjP, Nom, Det, V, N, Adj
P: S → NP VP; NP → Det Nom; Nom → N Nom | N; VP → V NP; N → cat | dog | man; Det → the; V → bit | bites
Parse of “the dog bit the man”: (S (NP (Det the) (Nom (N dog))) (VP (V bit) (NP (Det the) (Nom (N man)))))
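The same grammar can be handed to an off-the-shelf chart parser. A minimal sketch using NLTK (assuming it is installed); it prints the parse tree shown above:

```python
import nltk

grammar = nltk.CFG.fromstring("""
  S   -> NP VP
  NP  -> Det Nom
  Nom -> N Nom | N
  VP  -> V NP
  Det -> 'the'
  N   -> 'cat' | 'dog' | 'man'
  V   -> 'bit' | 'bites'
""")

parser = nltk.ChartParser(grammar)
for tree in parser.parse("the dog bit the man".split()):
    tree.pretty_print()
```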

Sentence-level Knowledge: Syntax. Different models of language specify the expressive power of a formal language. The Chomsky hierarchy:
Recursively enumerable: any rewrite rule
Context-sensitive: αAβ → αγβ
Context-free: A → γ
Regular: S → aB; regular expressions such as a*b*
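The gap between the regular and context-free levels is easy to demonstrate. The regular pattern a*b* cannot force the two counts to match, while the context-free rule S → a S b | ε can; a small sketch:

```python
import re

regular = re.compile(r'^a*b*$')   # any a's followed by any b's

def anbn(s):
    """Recognizer for the context-free language {a^n b^n},
    implementing S -> a S b | epsilon by recursion."""
    if s == '':
        return True
    return s[0] == 'a' and s[-1] == 'b' and anbn(s[1:-1])

for s in ['aabb', 'aab', 'abab']:
    print(s, bool(regular.match(s)), anbn(s))
# aabb True True; aab True False (a*b* overaccepts); abab False False
```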

Representing Sentence Structure. Why not just finite-state models? They cannot describe some grammatical phenomena and lack the expressiveness to capture generalizations, e.g., center embedding, which context-free grammars handle because they allow recursion: The luggage arrived. The luggage that the passengers checked arrived. The luggage that the passengers that the storm delayed checked arrived.
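A small sketch that generates the slide's center-embedded sentences: each added subject nests inside the last, and the verbs come out in mirror order, which is exactly the nested dependency a finite-state machine cannot track but a recursive rule can:

```python
def center_embed(n):
    """Build the n-th center-embedded sentence from the slide (1 <= n <= 3)."""
    subjects = ['the luggage', 'the passengers', 'the storm'][:n]
    verbs = ['arrived', 'checked', 'delayed'][:n][::-1]   # mirror order
    return (' that '.join(subjects) + ' ' + ' '.join(verbs)).capitalize() + '.'

for n in range(1, 4):
    print(center_embed(n))
# The luggage arrived.
# The luggage that the passengers checked arrived.
# The luggage that the passengers that the storm delayed checked arrived.
```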

Parsing Goals. Accepting: is the string legal in the language? Formally, acceptance is rigid; practically, there are degrees of acceptability. Analysis: what structure produced the string? Produce one (or all) parse trees for the string. We will develop techniques to produce analyses of sentences: to rigidly accept (with analysis) or reject, and to produce varying degrees of acceptability.

Is Context-free Enough? Natural language is provably not finite-state. Do we need context-sensitivity? Many articles have attempted to demonstrate that we do; many failed, too. There are solid proofs for Swiss German (Shieber). Key issue: cross-serial dependencies of the form a^n b^m c^n d^m.
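To make the crossing pattern concrete, here is a recognizer for {a^n b^m c^n d^m} (the checker itself proves nothing about grammar classes; the point is that the first count must match the third and the second must match the fourth, with the dependencies crossing rather than nesting):

```python
import re

def cross_serial(s):
    """Accept strings a^n b^m c^n d^m with n, m >= 1:
    the a/c counts and the b/d counts must match, cross-serially."""
    m = re.fullmatch(r'(a+)(b+)(c+)(d+)', s)
    return bool(m) and len(m.group(1)) == len(m.group(3)) \
                   and len(m.group(2)) == len(m.group(4))

print(cross_serial('aabccd'))  # True  (n=2, m=1)
print(cross_serial('aabcdd'))  # False (a/c counts do not match)
```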

Examples: in Swiss German, verbs and their arguments can be ordered cross-serially, and each argument must match its verb.

Tree Adjoining Grammars. Mildly context-sensitive (Joshi, 1979). Motivation: enables representation of crossing dependencies. Two rewriting operations: substitution, which plugs an initial tree rooted in X into a frontier node labeled X, and adjunction, which splices an auxiliary tree rooted in A into an internal node labeled A, re-attaching the excised subtree at the foot node A*. [Slide figures illustrate both operations.]
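A minimal sketch of adjunction over a hypothetical tuple encoding of trees (both the encoding and the adjoin helper are invented for this illustration; real TAG systems are far richer). The auxiliary tree for quickly is spliced in at the VP node, and the excised VP re-attaches at the foot:

```python
# A tree is (label, children); a lexical leaf is a plain string.
eats = ('S', [('NP', ['Maria']),
              ('VP', [('V', ['eats']), ('NP', [('N', ['pasta'])])])])

# Auxiliary tree VP -> VP* Ad, with the foot node marked 'VP*'.
quickly = ('VP', ['VP*', ('Ad', ['quickly'])])

def adjoin(node, aux):
    """Adjoin aux at the first node whose label matches aux's root:
    splice aux in there and re-attach the excised subtree at the foot."""
    label, children = node
    if label == aux[0]:
        foot = aux[0] + '*'
        return (aux[0], [node if c == foot else c for c in aux[1]])
    return (label, [adjoin(c, aux) if isinstance(c, tuple) else c
                    for c in children])

print(adjoin(eats, quickly))
# ('S', [('NP', ['Maria']),
#        ('VP', [('VP', [('V', ['eats']), ('NP', [('N', ['pasta'])])]),
#                ('Ad', ['quickly'])])])
```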

TAG Example. [Slide figure: elementary trees for Maria (NP → N), pasta (NP → N), and eats (S → NP VP; VP → V NP), plus an auxiliary tree for quickly (VP → VP* Ad); substitution and adjunction derive the tree for “Maria eats pasta quickly”.]

Grammar Equivalence and Form. Weak equivalence: the grammars accept the same language but may produce different analyses. Strong equivalence: they accept the same language and produce the same structures. Canonical form: Chomsky Normal Form (CNF). Every CFG has a weakly equivalent CNF grammar, with all productions of the form A → B C where B, C ∈ N, or A → a where a ∈ Σ.
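The CNF condition is mechanical to check. A small sketch over the same toy production encoding used earlier (an invented format, not a library API):

```python
def is_cnf(productions, nonterminals):
    """True iff every production is A -> B C (B, C non-terminals)
    or A -> a (a single terminal)."""
    for lhs, rhs in productions:
        binary  = len(rhs) == 2 and all(x in nonterminals for x in rhs)
        lexical = len(rhs) == 1 and rhs[0] not in nonterminals
        if not (binary or lexical):
            return False
    return True

N = {'S', 'NP', 'VP', 'Nom', 'Det', 'V', 'N'}
print(is_cnf([('S', ('NP', 'VP')), ('Det', ('the',))], N))      # True
print(is_cnf([('Nom', ('N',)), ('VP', ('V', 'NP', 'NP'))], N))  # False
```

The unit production Nom → N and the ternary VP rule both violate the form, which is why CNF conversion introduces new symbols to binarize rules and eliminates unit productions.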