1 Ling 569: Introduction to Computational Linguistics Jason Eisner Johns Hopkins University Tu/Th 1:30-3:20 (also this Fri 1-5)

Slide 1: Ling 569: Introduction to Computational Linguistics. Jason Eisner, Johns Hopkins University. Tu/Th 1:30-3:20 (also this Fri 1-5). Email me if you need to get …

Intro to NLP – J. Eisner – Slide 2: Computational Linguistics (CL)
Use computational methods to solve problems of interest to linguists.
– Competence modeling: formal grammars (now probabilistic)
– Performance modeling: algorithms (now statistical inference); computational psycholinguistics
– Modeling language in the world from data, e.g., language contact and change

Slide 3: Natural Language Processing (NLP)
CL is science; NLP is engineering. Computers would be a lot more useful if they could handle our email, do our library research, talk to us … But they are fazed by natural human language. How can we tell computers about language? (Or help them learn it as kids do?)

Slide 4: A few examples of NLP tasks
– Spelling correction, grammar checking …
– Machine translation
– Better search engines
– Information extraction
– Psychotherapy; Harlequin romances; etc.
– New interfaces: speech recognition (and text-to-speech); dialogue systems (USS Enterprise onboard computer); machine translation (the Babel fish)

Slide 5: Goals of the course
– Introduce you to NLP problems & solutions
– Relation to linguistics & statistics
At the end you should:
– Agree that language is subtle & interesting
– Feel some ownership over the formal & statistical models
– Understand research papers in the field

Slide 6: Ambiguity: Favorite Headlines
– Iraqi Head Seeks Arms
– Is There a Ring of Debris Around Uranus?
– Juvenile Court to Try Shooting Defendant
– Teacher Strikes Idle Kids
– Stolen Painting Found by Tree
– Kids Make Nutritious Snacks
– Local HS Dropouts Cut in Half
– Obesity Study Looks for Larger Test Group

Slide 7: Ambiguity: Favorite Headlines
– British Left Waffles on Falkland Islands
– Never Withhold Herpes Infection from Loved One
– Red Tape Holds Up New Bridges
– Man Struck by Lightning Faces Battery Charge
– Clinton Wins on Budget, but More Lies Ahead
– Hospitals Are Sued by 7 Foot Doctors

Slide 8: Levels of Language
– Phonetics/phonology/morphology: what words (or subwords) are we dealing with?
– Syntax: what phrases are we dealing with? Which words modify one another?
– Semantics: what's the literal meaning?
– Pragmatics: what should you conclude from the fact that I said something? How should you react?

Slide 9: Subtler Ambiguity
Q: Why does my high school give me a suspension for skipping class?
A: Administrative error. They're supposed to give you a suspension for auto shop, and a jump rope for skipping class. (*rim shot*)

Slides 10–21: What's hard about this story?
John stopped at the donut store on his way home from work. He thought a coffee was good every few hours. But it turned out to be too expensive there.
– stopped at the donut store: to buy a donut? Or to get a donut (spare tire) for his car?
– donut store: a store where donuts shop? or that is run by donuts? or that looks like a big donut? or is made of donut? or has an emptiness at its core?
– stopped: compare "I stopped smoking freshman year, but John stopped at the donut store …" – a different sense of stopped.
– on his way home from work: describes where the store is? Or when he stopped?
– from work: well, actually, he stopped there from hunger and exhaustion, not just from work.
– thought: at that moment, or habitually? (Similarly: Mozart composed music.)
– every few hours: that's how often he thought it?
– was good every few hours: but actually, a coffee only stays good for about 10 minutes before it gets cold. (Similarly: In America a woman has a baby every 15 minutes. Our job is to find that woman and stop her.)
– it: the particular coffee that was good every few hours? the donut store? the situation?
– too expensive: too expensive for what? What are we supposed to conclude about what John did? How do we connect "it" to "expensive"?

Slide 22: n-grams
– Letter or word frequencies: 1-grams (= unigrams): useful in solving cryptograms: ETAOINSHRDLU …
– If you know the previous letter: 2-grams (= bigrams): "h" is rare in English (4%; 4 points in Scrabble), but "h" is common after "t" (20%)
– If you know the previous two letters: 3-grams (= trigrams): "h" is really common after "(space) t"
– etc. …
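The unigram-vs-bigram point on this slide can be checked in a few lines: count letters, then count letter pairs, and compare how often "h" occurs overall with how often it occurs right after "t". This is a minimal sketch on a tiny made-up corpus (not the course's data), so the exact percentages will differ from the slide's 4% and 20%, but the inequality it illustrates holds.

```python
from collections import Counter

# Toy corpus; any reasonable chunk of English shows the same pattern.
text = ("the quick brown fox jumps over the lazy dog . "
        "that this then there those them with growth ")

# Unigram (single-letter) and bigram (letter-pair) counts, spaces included.
unigrams = Counter(text)
bigrams = Counter(zip(text, text[1:]))

# Overall frequency of 'h' vs. its frequency immediately after 't'.
p_h = unigrams['h'] / sum(unigrams.values())
t_total = sum(count for (first, _), count in bigrams.items() if first == 't')
p_h_after_t = bigrams[('t', 'h')] / t_total
print(f"P(h) = {p_h:.2f},  P(h | previous = t) = {p_h_after_t:.2f}")
```

Conditioning on the previous letter sharply raises the probability of "h", which is exactly the information a bigram model exploits and a unigram model throws away.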

Slide 23: Some random n-gram text …
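"Random n-gram text" of the kind this slide shows can be generated by sampling each character conditioned only on the previous one. A minimal sketch, assuming a toy training string (the slides' actual training text is not given here): record, for each character, the list of characters that follow it in the corpus, then repeatedly draw a successor at random.

```python
import random
from collections import defaultdict

random.seed(0)  # fixed seed so the sketch is reproducible
corpus = "the cat sat on the mat and the hat was on the rat "

# Character bigram model: for each character, the multiset of characters
# that follow it in the corpus (duplicates preserve the frequencies).
successors = defaultdict(list)
for first, second in zip(corpus, corpus[1:]):
    successors[first].append(second)

# Generate text: each character depends only on the previous character.
char = 't'
out = [char]
for _ in range(40):
    char = random.choice(successors[char])
    out.append(char)
print(''.join(out))
```

The output looks vaguely word-like but not like English, which is the point of the demonstration: higher-order n-grams (trigrams, word n-grams) produce progressively more convincing text.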