CSC 594 Topics in AI – Applied Natural Language Processing


CSC 594 Topics in AI – Applied Natural Language Processing Fall 2009/2010 5. Semantics

Natural Language Semantics

The semantics of a sentence is the meaning of the sentence. The meaning comes from:
- the meanings of the words/phrases in the sentence; plus
- the (semantic) relations between the words/phrases.

Underlying assumption – Compositionality: the meaning of a sentence is a composition of the meanings of its parts. This assumption is not always true, especially for words/phrases which have idiomatic/non-literal meaning, e.g. "It's raining cats and dogs."

Semantic Analysis

Derive the meaning of a sentence. Often applied to the result of syntactic analysis.

  "John ate the cake."
     NP   V     NP

  ((action INGEST)    ; syntactic verb
   (actor  JOHN-01)   ; syntactic subj
   (object FOOD))     ; syntactic obj

To do semantic analysis, we need a (semantic) dictionary (e.g. WordNet, http://www.cogsci.princeton.edu/~wn/).
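The mapping from syntactic roles to the case frame above can be sketched in a few lines of Python. The lexicon entries stand in for a real semantic dictionary and are illustrative only, not part of any actual system.

```python
# Toy semantic analysis: map a (subj, verb, obj) parse to a case frame.
# These two lexicons stand in for a real semantic dictionary (e.g. WordNet).
VERB_LEXICON = {"ate": "INGEST"}
NOUN_LEXICON = {"John": "JOHN-01", "cake": "FOOD"}

def analyze(subj, verb, obj):
    """Return a frame pairing semantic slots with lexicon concepts."""
    return {
        "action": VERB_LEXICON[verb],  # from the syntactic verb
        "actor": NOUN_LEXICON[subj],   # from the syntactic subject
        "object": NOUN_LEXICON[obj],   # from the syntactic object
    }

frame = analyze("John", "ate", "cake")
# {'action': 'INGEST', 'actor': 'JOHN-01', 'object': 'FOOD'}
```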

Various Semantic Analyses

There are many aspects and levels in the meaning of a sentence:
- Word meanings/senses, concepts
- Semantic relations
- Quantifiers (e.g. "Every boy loves a dog")
- Truth value in a model
- Inference, e.g. entailment: "He was snoring" entails "He was sleeping"

Various representations of meaning:
- First-order Predicate Logic (FOPL)
- Slot-filler structures or frames
- Semantic networks
- Web Ontology Language (OWL) (new)
- etc.

Word Senses

Many words have more than one sense (ambiguous words); the applicable sense depends on the context. A word may also have several parts of speech.

Ontology

Word senses (not words) can be grouped into a set of broad classes – semantic concepts. The set of semantic concepts is called an ontology.

Typical major classes: substance, quantity, quality, relation, place, time, position, state, action, affection, event, situation.

Each class has many sub-classes, which are often organized in a hierarchy, e.g. WordNet.
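Such a concept hierarchy can be represented as simple is-a links. The hand-built fragment below mirrors the thing/animate/inanimate example used later in these slides and is for illustration only.

```python
# A tiny hand-built ontology: each concept points to its hypernym (is-a parent).
IS_A = {
    "child": "person", "adult": "person",
    "person": "animate", "animal": "animate",
    "book": "text", "magazine": "text",
    "text": "inanimate",
    "animate": "thing", "inanimate": "thing",
}

def hypernym_chain(concept):
    """Walk up the is-a links from a concept to the root."""
    chain = [concept]
    while concept in IS_A:
        concept = IS_A[concept]
        chain.append(concept)
    return chain

hypernym_chain("book")  # ['book', 'text', 'inanimate', 'thing']
```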

Source: Jurafsky & Martin “Speech and Language Processing”

Relations between Senses

- Synonyms: two senses of two different words that are identical, e.g. couch/sofa, vomit/throw up
- Antonyms: two senses of two different words that are opposite, e.g. long/short, cold/hot, rise/fall
- Hyponyms: one word sense is more specific (a subclass) than another word sense
- Hypernyms: one word sense is more general (a superordinate) than another word sense
  The hyponym/hypernym relation creates a taxonomy of word senses/concepts.
- Meronyms/Holonyms: part-whole relations, e.g. table -> leg (meronym), leg -> table (holonym)

Source: Jurafsky & Martin "Speech and Language Processing"
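These relations can be stored as a small directed lookup table. The entries below are the slide's own examples; encoding each relation as an ordered pair is a simplification (a real lexicon such as WordNet stores typed pointers between synsets).

```python
# Sense relations as a directed lookup: (sense1, sense2) -> relation name.
RELATIONS = {
    ("couch", "sofa"): "synonym",
    ("long", "short"): "antonym",
    ("dog", "animal"): "hyponym",    # dog is more specific than animal
    ("animal", "dog"): "hypernym",   # animal is more general than dog
    ("leg", "table"): "meronym",     # a leg is part of a table
    ("table", "leg"): "holonym",     # a table has a leg as a part
}

def relation(a, b):
    """Look up the relation holding from sense a to sense b, if any."""
    return RELATIONS.get((a, b), "unrelated")
```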

A node in WordNet is a set of synonyms (called a synset).

Relations between actions/events (as used in WordNet) Source: Jurafsky & Martin “Speech and Language Processing”

Thematic Roles (1)

Consider the following examples:
- John broke the window with the hammer.
- The hammer broke the window.
- The window broke.

Although the surface positions of the three words differ across the sentences, they play the same semantic roles (deep roles; thematic roles):
- John – AGENT
- the window – THEME (or OBJECT)
- the hammer – INSTRUMENT
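A toy role linker for the "break" examples above might look like this. The rule that a known instrument in subject position gets INSTRUMENT rather than AGENT is hardcoded lexical knowledge here, purely for illustration.

```python
# Map surface arguments of "break" onto deep thematic roles.
INSTRUMENTS = {"hammer"}  # toy lexical knowledge

def link_roles(subj, obj=None, with_pp=None):
    """Assign AGENT / THEME / INSTRUMENT from surface positions."""
    if obj is None:                        # "The window broke."
        return {"THEME": subj}
    role = "INSTRUMENT" if subj in INSTRUMENTS else "AGENT"
    frame = {role: subj, "THEME": obj}     # "The hammer broke the window."
    if with_pp is not None:                # "... with the hammer."
        frame["INSTRUMENT"] = with_pp
    return frame
```

All three surface forms then yield frames with the same role assignments for the same words.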

Thematic Roles (2)

There are several commonly used thematic roles:

[Table: commonly used thematic roles]

Source: Jurafsky & Martin "Speech and Language Processing"

Some prototypical examples of various thematic roles. But it is difficult to come up with a standard set of roles, or to define them precisely.

Source: Jurafsky & Martin "Speech and Language Processing"

Linking Syntax and Semantics

The same thematic role can be realized in various syntactic positions:
- John broke the window with the hammer.
- The hammer broke the window.
- The window broke.

To link syntax and semantics and derive a semantic representation of a sentence, one common approach is Lexical Semantics – encode the semantic constraints which a word imposes upon other words in specific relations (its arguments):

  [BREAK [AGENT animate] [THEME inanimate] [INSTR utensil]]
  [BREAK [AGENT utensil] [THEME inanimate]]
  [BREAK [AGENT inanimate]]

Selectional Restrictions

Specifications of the legal combinations of senses that can co-occur are called selectional restrictions – constraints which a predicate imposes on the semantic types of its arguments, e.g.

  [READ [AGENT person] [THEME text]]

Semantic constraints (on nouns) specify the most general types – all sub-types/hyponyms in the ontology are legal:

  thing
    animate
      person
        child
        adult
      animal
    inanimate
      text
        book
        magazine
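Checking whether an argument satisfies a selectional restriction then amounts to walking up the hierarchy. The is-a table below encodes the slide's toy ontology.

```python
# An argument type satisfies a constraint if it equals the constraint's
# type or is one of its hyponyms (sub-types) in the ontology.
IS_A = {
    "child": "person", "adult": "person",
    "person": "animate", "animal": "animate",
    "book": "text", "magazine": "text",
    "text": "inanimate",
    "animate": "thing", "inanimate": "thing",
}

def satisfies(arg_type, constraint):
    """True if arg_type is the constraint type or a hyponym of it."""
    while arg_type != constraint:
        if arg_type not in IS_A:
            return False
        arg_type = IS_A[arg_type]
    return True

# [READ [AGENT person] [THEME text]]: a child can read a magazine...
satisfies("child", "person")    # True
# ...but an animal is not a legal AGENT for READ.
satisfies("animal", "person")   # False
```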

Semantic Filtering by Selectional Restrictions

Selectional restrictions help filter semantically ill-formed (nonsense) sentences:
- The hammer broke the window.
- (*) The cake broke the window.
- (*) Green ideas sleep furiously.

They also help disambiguate syntactic ambiguities:
- "I saw a man on the hill with a telescope"
- "I saw a man on the hill with a hat"

Syntax-Semantics Integrated Approaches

Two major approaches:

1. Semantic grammar
   - In texts of a limited domain, certain constructs appear only in a specific semantic context.
   - Write a grammar that is cast in terms of the major semantic categories of the domain, e.g. the air travel domain:
       FLIGHT-NP -> DET FLIGHT-CNP
       FLIGHT-CNP -> FLIGHT-N

2. Encode both syntax and semantic constraints in the grammar
   - Example: the LINK system (by Prof. Steve Lytinen), a grammar formalism based on Unification Grammar
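A minimal recognizer for the air-travel semantic grammar above can be sketched as repeated rewriting of category sequences. The lexicon entries other than "flight", and the rewriting strategy itself, are illustrative additions, not part of any described system.

```python
# Tiny semantic-grammar recognizer: categories are semantic (FLIGHT-NP),
# not purely syntactic (NP).
LEXICON = {"the": "DET", "a": "DET", "flight": "FLIGHT-N"}
RULES = {("FLIGHT-N",): "FLIGHT-CNP", ("DET", "FLIGHT-CNP"): "FLIGHT-NP"}

def recognize(words):
    """Repeatedly rewrite category sequences by the grammar rules."""
    cats = [LEXICON[w] for w in words]
    changed = True
    while changed:
        changed = False
        for i in range(len(cats)):                 # unary rules
            if (cats[i],) in RULES:
                cats[i] = RULES[(cats[i],)]
                changed = True
        for i in range(len(cats) - 1):             # binary rules
            if (cats[i], cats[i + 1]) in RULES:
                cats[i:i + 2] = [RULES[(cats[i], cats[i + 1])]]
                changed = True
                break
    return cats

recognize(["the", "flight"])  # ['FLIGHT-NP']
```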

Unification Grammar

Originated from context-free grammar (CFG), augmented with more linguistic knowledge/features in the form of constraint equations. Represented by feature structures, and often modeled as directed acyclic graphs (DAGs). e.g. a context-free backbone with constraint equations:

  S -> NP VP
  (S head) = (VP head)
  (S head subj) = (NP head)
  (VP head agr) = (NP head agr)

[Figure: the corresponding feature-structure DAG for this rule]
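The unification operation at the heart of this formalism can be sketched over nested dicts. This ignores reentrancy (the shared DAG nodes in the slides) and is a simplification for illustration.

```python
# Minimal unification of feature structures represented as nested dicts.
def unify(f1, f2):
    """Return the unification of f1 and f2, or None if they clash."""
    if f1 == f2:
        return f1
    if not (isinstance(f1, dict) and isinstance(f2, dict)):
        return None                  # conflicting atomic values
    result = dict(f1)
    for feat, val in f2.items():
        if feat in result:
            sub = unify(result[feat], val)
            if sub is None:
                return None          # a clash anywhere fails the whole unification
            result[feat] = sub
        else:
            result[feat] = val       # new information is simply added
    return result

unify({"head": {"agr": "3sg"}}, {"head": {"tense": "past"}})
# -> {'head': {'agr': '3sg', 'tense': 'past'}}
unify({"agr": "3sg"}, {"agr": "3pl"})
# -> None (incompatible agreement values)
```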

Subsumption and Unification

Subsumption is an ordering on feature structures: a feature structure subsumes another if the other contains all of its feature-value pairs (the subsuming structure is more general).

Unification is the operation for:
- combining information (concatenation)
- checking compatibility

[Figures: examples of subsumption between feature-structure DAGs, and of successful and failed unification]

Syntax/Semantics Integrated Processing

  (define-word
    (cat) = V
    (word) = "ate"
    (head tense) = past
    (head sem cat) = EAT
    (head sem actor) = (head subj sem)
    (head sem object) = (head dobj sem))

  (define-semcat
    (cat) = EAT
    (actor cat) = ANIMATE
    (object cat) = FOOD
    (instrument cat) = UTENSIL)

[Figure: the corresponding feature-structure DAGs for "ate" and for the EAT concept]

Derivation using Unification Grammar

[Figure: a derivation in which grammar-rule DAGs are unified step by step with the feature structures of the input words]

Grammar

[Figure: the example grammar rules R0–R5 shown as feature-structure DAGs]