For Friday
Finish chapter 23
Homework
–Chapter 23, exercise 15

Program 5
Any questions?

Survey of Some Natural Language Processing Research

Speech Recognition
Two major approaches
–Neural networks
–Hidden Markov models: a statistical technique that tries to determine the probability of a certain string of words producing a certain string of sounds; choose the most probable string of words
Both approaches are “learning” approaches
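A toy sketch of "choose the most probable string of words": score each candidate word string by P(words) * P(sounds | words) and take the best. The phone string and all probabilities below are invented purely for illustration; a real recognizer estimates them from data and searches a far larger space.

```python
# Toy sketch of noisy-channel style decoding: pick the word string that
# maximizes P(words) * P(sounds | words).  All values here are invented.
observed_sounds = "r eh k ax n ay z s p iy ch"

p_words = {                     # toy language model P(words)
    "recognize speech": 0.6,
    "wreck a nice beach": 0.4,
}
p_sounds_given_words = {        # toy acoustic model P(sounds | words)
    (observed_sounds, "recognize speech"): 0.7,
    (observed_sounds, "wreck a nice beach"): 0.3,
}

def score(words):
    return p_words[words] * p_sounds_given_words.get((observed_sounds, words), 0.0)

print(max(p_words, key=score))  # -> recognize speech
```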

Syntax
Both hand-constructed approaches and data-driven (learning) approaches
Multiple levels of processing and goals of processing
Most active area of work in NLP (maybe the easiest, because we understand syntax much better than we understand semantics and pragmatics)

POS Tagging
Statistical approaches: based on the probability of sequences of tags and of words having particular tags
Symbolic learning approaches
–One of these, transformation-based learning developed by Eric Brill, is perhaps the best-known tagger
Both kinds of approaches are data-driven
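A minimal sketch of the statistical idea: a bigram HMM tagger decoded with the Viterbi algorithm. The transition and emission probabilities are hand-picked toy values; a real tagger estimates them from a tagged corpus (and Brill's tagger learns transformation rules instead).

```python
# Toy bigram HMM tagger decoded with the Viterbi algorithm.
tags = ["DET", "NOUN", "VERB"]

trans = {   # P(tag | previous tag), "<s>" is the start state
    ("<s>", "DET"): 0.8, ("<s>", "NOUN"): 0.2,
    ("DET", "NOUN"): 0.9, ("DET", "DET"): 0.1,
    ("NOUN", "VERB"): 0.8, ("NOUN", "NOUN"): 0.2,
    ("VERB", "DET"): 0.6, ("VERB", "NOUN"): 0.4,
}
emit = {    # P(word | tag)
    ("the", "DET"): 0.7, ("dog", "NOUN"): 0.4,
    ("barks", "VERB"): 0.5, ("barks", "NOUN"): 0.01,
}

def viterbi(words):
    # best[t] = (probability, tag path) of the best sequence ending in tag t
    best = {t: (trans.get(("<s>", t), 1e-6) * emit.get((words[0], t), 1e-6), [t])
            for t in tags}
    for w in words[1:]:
        new = {}
        for t in tags:
            p, path = max(
                ((best[s][0] * trans.get((s, t), 1e-6) * emit.get((w, t), 1e-6),
                  best[s][1]) for s in tags),
                key=lambda x: x[0])
            new[t] = (p, path + [t])
        best = new
    return max(best.values(), key=lambda x: x[0])[1]

print(viterbi(["the", "dog", "barks"]))   # -> ['DET', 'NOUN', 'VERB']
```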

Developing Parsers
Hand-crafted grammars, usually some variation on CFGs
Definite Clause Grammars (DCGs)
–A variation on CFGs that allows extensions like agreement checking
–Built-in handling of these in most Prolog implementations
Hand-crafted grammars follow the different types of grammars popular in linguistics
Since linguistics hasn’t produced a perfect grammar, we can’t code one
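DCGs live most naturally in Prolog, but the agreement-checking idea can be approximated in a plain CFG by splitting nonterminals on the feature value. A toy sketch using NLTK (the grammar fragment is invented):

```python
# Sketch of agreement checking in a plain CFG by splitting nonterminals on
# number (a DCG would carry the number as a feature argument instead).
import nltk

grammar = nltk.CFG.fromstring("""
S     -> NP_sg VP_sg | NP_pl VP_pl
NP_sg -> 'the' N_sg
NP_pl -> 'the' N_pl
N_sg  -> 'dog'
N_pl  -> 'dogs'
VP_sg -> 'barks'
VP_pl -> 'bark'
""")

parser = nltk.ChartParser(grammar)
print(list(parser.parse("the dog barks".split())))    # one parse
print(list(parser.parse("the dogs barks".split())))   # [] : agreement fails
```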

Efficient Parsing
Top-down and bottom-up parsing both have issues
Also common is chart parsing
–Basic idea: locate and store information about every substring that matches a grammar rule
One area of research is producing more efficient parsing
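A tiny CKY-style recognizer illustrates the chart idea: one cell per substring, filled bottom-up so no substring is analyzed twice. The grammar (in Chomsky normal form) and the sentence are toy examples.

```python
# Tiny CKY-style chart recognizer for a toy grammar in Chomsky normal form.
def cky(words, grammar):
    n = len(words)
    # chart[i][j] = set of symbols deriving words[i:j]
    chart = [[set() for _ in range(n + 1)] for _ in range(n + 1)]
    for i, w in enumerate(words):
        chart[i][i + 1] |= grammar.get((w,), set())
    for span in range(2, n + 1):
        for i in range(n - span + 1):
            j = i + span
            for k in range(i + 1, j):              # try every split point
                for b in chart[i][k]:
                    for c in chart[k][j]:
                        chart[i][j] |= grammar.get((b, c), set())
    return "S" in chart[0][n]

# Toy grammar written as: right-hand side -> set of left-hand-side symbols.
grammar = {
    ("the",): {"DET"}, ("dog",): {"N"}, ("barks",): {"VP"},
    ("DET", "N"): {"NP"}, ("NP", "VP"): {"S"},
}
print(cky("the dog barks".split(), grammar))   # -> True
```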

Data-Driven Parsing
PCFGs: probabilistic context-free grammars
Constructed from data
Parse by determining all parses (or many parses) and selecting the most probable
Fairly successful, but requires a LOT of work to create the data
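A sketch of "select the most probable parse" using NLTK's Viterbi PCFG parser. The grammar and its rule probabilities are invented for illustration; in practice they would be estimated from a treebank.

```python
# Choosing the most probable parse of an ambiguous sentence with a toy PCFG.
import nltk

pcfg = nltk.PCFG.fromstring("""
S   -> NP VP          [1.0]
NP  -> 'I'            [0.3]
NP  -> DET N          [0.5]
NP  -> NP PP          [0.2]
VP  -> V NP           [0.6]
VP  -> VP PP          [0.4]
PP  -> P NP           [1.0]
DET -> 'the'          [1.0]
N   -> 'man'          [0.5]
N   -> 'telescope'    [0.5]
V   -> 'saw'          [1.0]
P   -> 'with'         [1.0]
""")

parser = nltk.ViterbiParser(pcfg)
for tree in parser.parse("I saw the man with the telescope".split()):
    print(tree)   # the single most probable parse, with its probability
```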

Applying Learning to Parsing
A basic problem is the lack of negative examples
Also, mapping a complete string directly to a parse does not seem to be the right approach
Look at the operations of the parser and learn rules for the operations, not for the complete parse at once
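One way to read "learn the operations" is to treat each parser decision (e.g. shift vs. reduce) as a classification example. The configurations and actions below are hand-made toy data, not drawn from any real treebank.

```python
# Toy illustration: features of a parser configuration -> parser action.
from sklearn.feature_extraction import DictVectorizer
from sklearn.linear_model import LogisticRegression

examples = [
    ({"stack_top": "DET",  "next_tag": "NOUN"}, "SHIFT"),
    ({"stack_top": "DET",  "next_tag": "ADJ"},  "SHIFT"),
    ({"stack_top": "NOUN", "next_tag": "VERB"}, "REDUCE_NP"),
    ({"stack_top": "NOUN", "next_tag": "."},    "REDUCE_NP"),
]

vec = DictVectorizer()
X = vec.fit_transform([features for features, _ in examples])
y = [action for _, action in examples]

clf = LogisticRegression(max_iter=1000).fit(X, y)
print(clf.predict(vec.transform([{"stack_top": "NOUN", "next_tag": "VERB"}])))
# -> ['REDUCE_NP']
```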

Syntax Demos
http://teemapoint.fi/nlpdemo/servlet/ParserServlet
http://…/sentence-4.html

Semantics
Most work is probably hand-constructed systems
Some researchers are more interested in developing the semantics than the mappings
Basic question: what constitutes a semantic representation?
The answer may depend on the application

Possible Semantic Representations
–Logical representation
–Database query
–Case grammar
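An illustrative (entirely made-up) example of one question rendered in each of these three forms:

```python
# Made-up renderings of the same question in the three representations above.
question = "Which rivers run through Texas?"

logical_form = "answer(R) :- river(R), runs_through(R, texas)"     # logic
database_query = "SELECT name FROM rivers WHERE state = 'texas'"   # DB query
case_frame = {                                                      # case grammar
    "predicate": "run_through",
    "theme": "river(R)",
    "location": "texas",
}
print(logical_form)
print(database_query)
print(case_frame)
```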

Distinguishing Word Senses
Use context to determine which sense of a word is meant
Probabilistic approaches
Rules
Issues
–Obtaining sense-tagged corpora
–What senses do we want to distinguish?
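A minimal Lesk-style sketch of "use the context": pick the sense whose definition overlaps most with the surrounding words. The sense inventory and glosses are toy, hand-written entries.

```python
# Simplified Lesk-style word sense disambiguation on a toy sense inventory.
senses = {
    "bank/finance": "an institution that accepts deposits and lends money",
    "bank/river": "the sloping land beside a body of water such as a river",
}

def disambiguate(context_sentence):
    context = set(context_sentence.lower().split())
    # score each sense by word overlap between its gloss and the context
    return max(senses, key=lambda s: len(context & set(senses[s].split())))

print(disambiguate("she sat on the bank of the river and watched the water"))
# -> bank/river
```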

Semantic Demos

Information Retrieval
Take a query and a set of documents; select the subset of documents (or parts of documents) that match the query
Statistical approaches
–Look at things like word frequency
More knowledge-based approaches are interesting, but maybe not more helpful
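A sketch of the word-frequency idea: rank documents by TF-IDF cosine similarity to the query, here with scikit-learn and three made-up documents.

```python
# Rank toy documents by TF-IDF cosine similarity to a query.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

documents = [
    "the parser builds a chart of constituents",
    "hidden markov models are used for speech recognition",
    "retrieval systems rank documents by term frequency",
]
query = "statistical speech recognition"

vectorizer = TfidfVectorizer()
doc_vectors = vectorizer.fit_transform(documents)
query_vector = vectorizer.transform([query])

scores = cosine_similarity(query_vector, doc_vectors)[0]
best = scores.argmax()
print(documents[best])   # the speech-recognition document scores highest
```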

Information Extraction
From a set of documents, extract “interesting” pieces of data
Hand-built systems
Learning pieces of the system
Learning the entire task (for certain versions of the task)
Wrapper induction
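A toy hand-built extractor: a single regular-expression pattern that pulls (person, company) pairs out of one sentence shape. The pattern and the text are invented; real systems use many such patterns or learn them.

```python
# One hand-built extraction pattern applied to made-up text.
import re

pattern = re.compile(
    r"(?P<person>[A-Z][a-z]+ [A-Z][a-z]+) was appointed CEO of "
    r"(?P<company>[A-Z][A-Za-z]+)")

text = ("Jane Smith was appointed CEO of Acme yesterday. "
        "The board praised her record.")

for match in pattern.finditer(text):
    print(match.group("person"), "->", match.group("company"))
# -> Jane Smith -> Acme
```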

IE Demos

Question Answering
Given a question and a set of documents (possibly the web), find a small portion of text that answers the question
Some work on putting answers together from multiple sources

QA Demo

Text Mining
An outgrowth of data mining: trying to find “interesting” new facts from texts
One approach is to mine databases created using information extraction

Pragmatics
Distinctions between pragmatics and semantics get blurred in practical systems
To be practically useful, a system must deal with some aspects of pragmatics, but we don’t often see people making a strong distinction between semantics and pragmatics these days
Instead, we often distinguish between sentence processing and discourse processing

What Kinds of Discourse Processing Are There?
Anaphora resolution
–Pronouns
–Definite noun phrases
Handling ellipsis
Topic
Discourse segmentation
Discourse tagging (understanding what conversational “moves” are made by each utterance)

Approaches to Discourse
Hand-built systems that work with semantic representations
Hand-built systems that work with text (or recognized speech) or parsed text
Learning systems that work with text (or recognized speech) or parsed text

Issues
Agreement on representation
Annotating corpora
How much do we use the modular model of processing?

Pronoun Resolution Demo

Summarization
Short summaries of a single text, or summaries of multiple texts
Approaches:
–Select sentences
–Create new sentences (much harder)
–Learning has been used somewhat, but not extensively
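A minimal sketch of the "select sentences" approach: score each sentence by the frequency of its non-stopword words and keep the top scorers. The stopword list and the document are toy examples.

```python
# Frequency-based extractive summarizer on a toy document.
from collections import Counter

STOPWORDS = {"the", "a", "of", "and", "to", "is", "in", "was", "at"}

def summarize(text, n_sentences=1):
    sentences = [s.strip() for s in text.split(".") if s.strip()]
    words = [w for s in sentences for w in s.lower().split()
             if w not in STOPWORDS]
    freq = Counter(words)
    def score(sentence):
        return sum(freq[w] for w in sentence.lower().split()
                   if w not in STOPWORDS)
    return sorted(sentences, key=score, reverse=True)[:n_sentences]

doc = ("Chart parsing stores partial results in a chart. "
       "The chart avoids re-parsing the same substring. "
       "Lunch was served at noon.")
print(summarize(doc))   # the chart-parsing sentences outscore the lunch one
```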

Machine Translation
The best systems must use all levels of NLP
Semantics must deal with the overlapping senses of different languages
Requires both understanding and generation
Advantage for learning: bilingual corpora exist, but we often want some tagging of intermediate relationships
Additional issue: alignment of corpora
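A compact sketch of learning word correspondences from a bilingual corpus, in the spirit of IBM Model 1 EM (without the NULL word). The three English-French sentence pairs are made up for illustration.

```python
# EM estimation of word-translation probabilities t(f|e) on a toy corpus.
from collections import defaultdict

corpus = [
    (["the", "house"], ["la", "maison"]),
    (["the", "book"],  ["le", "livre"]),
    (["a", "book"],    ["un", "livre"]),
]

e_vocab = {e for es, _ in corpus for e in es}
f_vocab = {f for _, fs in corpus for f in fs}
t = {(f, e): 1.0 / len(f_vocab) for f in f_vocab for e in e_vocab}  # uniform start

for _ in range(10):                       # a few EM iterations
    count = defaultdict(float)            # expected counts c(f, e)
    total = defaultdict(float)            # expected counts c(e)
    for es, fs in corpus:
        for f in fs:
            z = sum(t[(f, e)] for e in es)            # normalizer
            for e in es:
                p = t[(f, e)] / z                     # P(f aligns to e)
                count[(f, e)] += p
                total[e] += p
    for f, e in t:                        # re-estimate t(f|e)
        t[(f, e)] = count[(f, e)] / total[e]

print(max(e_vocab, key=lambda e: t[("maison", e)]))   # -> house
```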

Approaches to MT
Lots of hand-built systems
Some learning is used
Probably most use a fair bit of syntactic and semantic analysis
Some operate fairly directly between texts

Generation
Producing a syntactically “good” sentence
Interesting issues are largely in the choices:
–What vocabulary to use
–What level of detail is appropriate
–Determining how much information to include