
1 15-505: Lecture 11 Generative Models for Text Classification and Information Extraction Kamal Nigam Some slides from William Cohen, Andrew McCallum

2 Text Classification by Example


7 How could you build a text classifier?
Take some ideas from machine learning
– Supervised learning setting
– Examples of each class (a few or thousands)
Take some ideas from machine translation
– Generative models
– Language models
Simplify each and stir thoroughly

8 Basic Approach of Generative Modeling
1. Pick representation for data
2. Write down probabilistic generative model
3. Estimate model parameters with training data
4. Turn model around to calculate unknown values for new data

9 Naïve Bayes: Bag of Words Representation
Example document: "Corn prices rose today while corn futures dropped in surprising trading activity."
[Figure: the document mapped to occurrence counts over all words in the dictionary, e.g. corn: 2, …]
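A minimal sketch of this representation; the small vocabulary below is illustrative, standing in for the full dictionary:

```python
from collections import Counter

def bag_of_words(text, vocabulary):
    """Map a document to occurrence counts over a fixed dictionary."""
    counts = Counter(text.lower().split())
    # Every dictionary word gets a slot, even when its count is zero.
    return {word: counts.get(word, 0) for word in vocabulary}

vocab = ["corn", "prices", "futures", "trading", "soccer"]
doc = "Corn prices rose today while corn futures dropped in surprising trading activity"
print(bag_of_words(doc, vocab))
# {'corn': 2, 'prices': 1, 'futures': 1, 'trading': 1, 'soccer': 0}
```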

10 Naïve Bayes: Mixture of Multinomials Model
1. Pick the class: P(class)
2. For every word, pick from the class urn: P(word|class)
[Figure: word urns for two classes, e.g. SPORTS (soccer, polo, ball, activity, dropped, …) and COMPUTERS (web, windows, java, modem, …)]
Word independence assumption!
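A sketch of the two-step generative story; the class priors and word probabilities below are made up for illustration only:

```python
import random

# Hypothetical parameters for illustration only.
class_priors = {"COMPUTERS": 0.5, "SPORTS": 0.5}
word_probs = {
    "COMPUTERS": {"web": 0.3, "windows": 0.3, "java": 0.2, "modem": 0.2},
    "SPORTS":    {"soccer": 0.4, "polo": 0.2, "ball": 0.2, "activity": 0.2},
}

def generate_document(length):
    # Step 1: pick the class.
    cls = random.choices(list(class_priors), weights=list(class_priors.values()))[0]
    # Step 2: pick every word independently from the class urn.
    urn = word_probs[cls]
    words = random.choices(list(urn), weights=list(urn.values()), k=length)
    return cls, words

print(generate_document(5))
```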

11 Naïve Bayes: Estimating Parameters
Just like estimating biased coin flip probabilities
Estimate MAP word probabilities (with a Laplace prior):
P(w|class) = (1 + N(w, class)) / (|V| + Σ_w' N(w', class))
Estimate MAP class priors:
P(class) = (1 + N(docs in class)) / (|classes| + N(docs))
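A sketch of these estimates in code, assuming training data arrives as (label, word-list) pairs; the add-one smoothing mirrors the MAP formulas above:

```python
from collections import Counter, defaultdict

def estimate_parameters(labeled_docs, vocabulary):
    """MAP estimates with a Laplace (add-one) prior."""
    class_counts = Counter(label for label, _ in labeled_docs)
    word_counts = defaultdict(Counter)
    for label, words in labeled_docs:
        word_counts[label].update(words)

    classes = list(class_counts)
    # P(class): add-one smoothing over classes.
    priors = {c: (1 + class_counts[c]) / (len(classes) + len(labeled_docs))
              for c in classes}
    # P(word|class): add-one smoothing over the vocabulary.
    cond = {}
    for c in classes:
        total = sum(word_counts[c].values())
        cond[c] = {w: (1 + word_counts[c][w]) / (len(vocabulary) + total)
                   for w in vocabulary}
    return priors, cond
```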

12 Naïve Bayes: Performing Classification
Word independence assumption:
P(w_1 … w_n | class) = Π_i P(w_i | class)
Take the class with the highest probability:
class* = argmax_class P(class) Π_i P(w_i | class)
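A sketch of classification with the parameters estimated above, using a sum of logs rather than a product of tiny probabilities; the 1e-12 floor for out-of-vocabulary words is an illustrative choice:

```python
import math

def classify(words, priors, cond):
    """Pick the class with the highest posterior, working in log space."""
    best_class, best_score = None, float("-inf")
    for c in priors:
        # Sum of logs instead of a product that would underflow.
        score = math.log(priors[c]) + sum(
            math.log(cond[c].get(w, 1e-12)) for w in words)
        if score > best_score:
            best_class, best_score = c, score
    return best_class
```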

13 Classification Tricks of the Trade
Stemming
– run, runs, running, ran → run
– table, tables, tabled → table
– computer, compute, computing → compute
Stopwords
– Very frequent function words, generally uninformative
– if, in, the, like, …
Information gain feature selection
– Keep just the most indicative words in the vocabulary (see the sketch below)
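A sketch of information-gain feature selection, scoring each word by how much knowing its presence reduces class entropy; the (label, word-set) data layout is an assumption for illustration:

```python
import math

def entropy(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0)

def information_gain(labeled_docs, word):
    """How much does splitting on this word's presence reduce class entropy?"""
    def class_dist(subset):
        counts = {}
        for label, _ in subset:
            counts[label] = counts.get(label, 0) + 1
        return [c / len(subset) for c in counts.values()]

    with_word = [(l, d) for l, d in labeled_docs if word in d]
    without = [(l, d) for l, d in labeled_docs if word not in d]
    gain = entropy(class_dist(labeled_docs))
    for subset in (with_word, without):
        if subset:
            gain -= len(subset) / len(labeled_docs) * entropy(class_dist(subset))
    return gain
```

Keeping only the top-scoring words and discarding the rest shrinks the vocabulary to its most indicative features.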

14 Naïve Bayes Rules of Thumb
Need hundreds of labeled examples per class for good performance (~85% accuracy)
Stemming and stopwords may or may not help
Feature selection may or may not help
Predicted probabilities will be very extreme
Use a sum of logs instead of multiplying probabilities to prevent underflow (as in the classification sketch above)
Coding this up is trivial, either as a mapreduce or not

15 Information Extraction with Generative Models

16 Example: A Problem
[Figure: a keyword search for "Baker" returns many distinct senses]
– Genomics job
– Mt. Baker, the school district
– Baker Hostetler, the company
– Baker, a job opening

17 Example: A Solution

18 Job Openings:
Category = Food Services
Keyword = Baker
Location = Continental U.S.

19 Extracting Job Openings from the Web
Title: Ice Cream Guru
Description: If you dream of cold creamy…
Contact: susan@foodscience.com
Category: Travel/Hospitality
Function: Food Services

20 Potential Enabler of Faceted Search

21 Lots of Structured Information in Text

22 IE from Research Papers

27 What is Information Extraction?
Recovering structured data from formatted text
– Identifying fields (e.g. named entity recognition)
– Understanding relations between fields (e.g. record association)
– Normalization and deduplication
Today, focus on field identification

28 IE History
Pre-Web
– Mostly news articles
– De Jong's FRUMP [1982]: hand-built system to fill Schank-style "scripts" from news wire
– Message Understanding Conference (MUC): DARPA ['87-'95], TIPSTER ['92-'96]
– Most early work dominated by hand-built models, e.g. SRI's FASTUS, hand-built FSMs
– But by the 1990s, some machine learning: Lehnert, Cardie, Grishman, and then HMMs: Elkan [Leek '97], BBN [Bikel et al. '98]
Web
– AAAI '94 Spring Symposium on "Software Agents": much discussion of ML applied to the Web (Maes, Mitchell, Etzioni)
– Tom Mitchell's WebKB, '96: build KBs from the Web
– Wrapper induction: initially hand-built, then ML: [Soderland '96], [Kushmerick '97], …

29 IE Posed as a Machine Learning Task
Training data: documents marked up with ground truth
In contrast to text classification, local features are crucial. Features of:
– Contents
– Text just before the item (prefix)
– Text just after the item (suffix)
– Begin/end boundaries
[Figure: "… 00 : pm Place : Wean Hall Rm 5409 Speaker : Sebastian Thrun …" segmented into prefix, contents, and suffix]

30 Good Features for Information Extraction
Example word features:
– identity of word
– is in all caps
– ends in "-ski"
– is part of a noun phrase
– is in a list of city names
– is under node X in WordNet or Cyc
– is in bold font
– is in hyperlink anchor
– features of past & future
– last person name was female
– next two words are "and Associates"
Example features of lines: begins-with-number, begins-with-ordinal, begins-with-punctuation, begins-with-question-word, begins-with-subject, blank, contains-alphanum, contains-bracketed-number, contains-http, contains-non-space, contains-number, contains-pipe, contains-question-mark, contains-question-word, ends-with-question-mark, first-alpha-is-capitalized, indented, indented-1-to-4, indented-5-to-10, more-than-one-third-space, only-punctuation, prev-is-blank, prev-begins-with-ordinal, shorter-than-30
Creativity and domain knowledge required!
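A few of these rendered as simple predicate functions; the gazetteer contents and the particular feature set here are illustrative, not from the lecture:

```python
import re

CITY_NAMES = {"pittsburgh", "boston", "new york"}  # stand-in gazetteer

def word_features(word):
    """Compute a handful of the word features listed above."""
    return {
        "identity": word.lower(),
        "is_all_caps": word.isupper(),
        "ends_in_ski": word.lower().endswith("ski"),
        "in_city_list": word.lower() in CITY_NAMES,
        "begins_with_number": bool(re.match(r"\d", word)),
        "contains_http": "http" in word.lower(),
    }

print(word_features("Kowalski"))
```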

31 Good Features for Information Extraction
Word features:
– is capitalized, is mixed caps, is all caps, initial cap, contains digit, all lowercase, is initial
– punctuation: period, comma, apostrophe, dash
– preceded by HTML tag
– character n-gram classifier says string is a person name (80% accurate)
– in stopword list (the, of, their, etc.)
– in honorific list (Mr, Mrs, Dr, Sen, etc.)
– in person suffix list (Jr, Sr, PhD, etc.)
– in name particle list (de, la, van, der, etc.)
– in Census lastname list, segmented by P(name)
– in Census firstname list, segmented by P(name)
– in locations lists (states, cities, countries)
– in company name list ("J. C. Penny")
– in list of company suffixes (Inc, & Associates, Foundation)
– lists of job titles, lists of prefixes, lists of suffixes, 350 informative phrases
HTML/formatting features:
– {begin, end, in} × {…} × {lengths 1, 2, 3, 4, or longer}
– {begin, end} of line
Creativity and domain knowledge required!

32 Landscape of ML Techniques for IE
Any of these models can be used to capture words, formatting, or both. Each is illustrated on "Abraham Lincoln was born in Kentucky."
– Classify candidates: run a classifier ("which class?") on each candidate string, e.g. PersonName
– Sliding window: run the classifier over each window, trying alternate window sizes
– Boundary models: classify positions as BEGIN or END, then pair them up
– Finite state machines: find the most likely state sequence for the whole text
– Wrapper induction: learn and apply a pattern for a website

33 Sliding Windows & Boundary Detection

34 Information Extraction by Sliding Windows
CMU UseNet Seminar Announcement:
GRAND CHALLENGES FOR MACHINE LEARNING
Jaime Carbonell
School of Computer Science, Carnegie Mellon University
3:30 pm, 7500 Wean Hall
Machine learning has evolved from obscurity in the 1970s into a vibrant and popular discipline in artificial intelligence during the 1980s and 1990s. As a result of its success and growth, machine learning is evolving into a collection of related disciplines: inductive concept acquisition, analytic learning in problem solving (e.g. analogy, explanation-based learning), learning theory (e.g. PAC learning), genetic algorithms, connectionist learning, hybrid systems, and so on.
[Figure: a candidate window slides across the announcement, one position at a time]


39 Information Extraction with Sliding Windows [Freitag 97, 98; Soderland 97; Califf 98]
[Figure: "… 00 : pm Place : Wean Hall Rm 5409 Speaker : Sebastian Thrun …" segmented into prefix, contents, and suffix]
Standard supervised learning setting
– Positive instances: windows with a real label
– Negative instances: all other windows
– Features based on candidate, prefix and suffix (see the sketch below)
Special-purpose rule learning systems work well:
courseNumber(X) :-
  tokenLength(X, =, 2),
  every(X, inTitle, false),
  some(X, A, …, inTitle, true),
  some(X, B, <>, tripleton, true)
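A sketch of candidate generation for this setting; the window and context sizes are illustrative choices:

```python
def candidate_windows(tokens, max_len=5, context=3):
    """Enumerate token windows up to max_len as candidate fields,
    carrying prefix and suffix context along for feature extraction."""
    for start in range(len(tokens)):
        for end in range(start + 1, min(start + max_len, len(tokens)) + 1):
            prefix = tokens[max(0, start - context):start]
            contents = tokens[start:end]
            suffix = tokens[end:end + context]
            yield prefix, contents, suffix

tokens = "Place : Wean Hall Rm 5409 Speaker : Sebastian Thrun".split()
# Windows matching a labeled field (e.g. the speaker name) become positive
# instances; every other window is a negative instance.
for prefix, contents, suffix in list(candidate_windows(tokens, max_len=2))[:3]:
    print(prefix, contents, suffix)
```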

40 IE by Boundary Detection
CMU UseNet Seminar Announcement:
GRAND CHALLENGES FOR MACHINE LEARNING
Jaime Carbonell
School of Computer Science, Carnegie Mellon University
3:30 pm, 7500 Wean Hall
Machine learning has evolved from obscurity in the 1970s into a vibrant and popular discipline in artificial intelligence during the 1980s and 1990s. As a result of its success and growth, machine learning is evolving into a collection of related disciplines: inductive concept acquisition, analytic learning in problem solving (e.g. analogy, explanation-based learning), learning theory (e.g. PAC learning), genetic algorithms, connectionist learning, hybrid systems, and so on.
[Figure: candidate begin and end boundaries marked at field edges in the announcement]


45 BWI: Learning to Detect Boundaries [Freitag & Kushmerick, AAAI 2000]
Another formulation: learn three probabilistic classifiers:
– START(i) = Prob(position i starts a field)
– END(j) = Prob(position j ends a field)
– LEN(k) = Prob(an extracted field has length k)
Then score a possible extraction (i, j) by START(i) × END(j) × LEN(j - i)
LEN(k) is estimated from a histogram
START(i) and END(j) are learned by boosting over simple boundary patterns and features
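A sketch of the scoring rule, assuming the three classifiers have already been trained and are given here as plain probability tables:

```python
def best_extraction(start_probs, end_probs, length_probs):
    """Score every span (i, j) by START(i) * END(j) * LEN(j - i)."""
    best, best_score = None, 0.0
    for i, p_start in enumerate(start_probs):
        for j, p_end in enumerate(end_probs):
            if j <= i:
                continue
            score = p_start * p_end * length_probs.get(j - i, 0.0)
            if score > best_score:
                best, best_score = (i, j), score
    return best, best_score

# Hypothetical scores for a 6-token sequence; LEN comes from a histogram.
starts = [0.1, 0.7, 0.05, 0.05, 0.05, 0.05]
ends = [0.05, 0.05, 0.1, 0.7, 0.05, 0.05]
lengths = {1: 0.2, 2: 0.5, 3: 0.3}
print(best_extraction(starts, ends, lengths))  # ((1, 3), 0.245)
```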

46 Problems with Sliding Windows and Boundary Finders
Decisions in neighboring parts of the input are made independently of each other:
– A sliding window may predict a "seminar end time" before the "seminar start time".
– It is possible for two overlapping windows to both be above threshold.
– In a boundary-finding system, left boundaries are laid down independently of right boundaries, and their pairing happens as a separate step.

47 Hidden Markov Models

48 Citation Parsing
Fahlman, Scott & Lebiere, Christian (1989). The cascade-correlation learning architecture. Advances in Neural Information Processing Systems, pp. 524-532.
Fahlman, S.E. and Lebiere, C., "The Cascade Correlation Learning Architecture," Neural Information Processing Systems, pp. 524-532, 1990.
Fahlman, S. E. (1991) The recurrent cascade-correlation learning architecture. NIPS 3, 190-205.

49 Can we do this with probabilistic generative models?
Could have classes for {author, title, journal, year, pages}
Could classify every word or sequence?
– Which sequences?
Something interesting in the sequence of fields that we'd like to capture:
– Authors come first
– Title comes before journal
– Page numbers come near the end

50 Hidden Markov Models: The Representation
A document is a sequence of words
Each word is tagged by its class:
[author] fahlman s e and lebiere c [title] the cascade correlation learning architecture [journal] neural information processing systems [pages] pp 524 532 [year] 1990

51 HMM: Generative Model (1)
[Figure: state-transition diagram over the states Author, Title, Journal, Year, Pages]

52 HMM: Generative Model (2)
[Figure: a state-transition diagram with a different topology over the states Author, Title, Year, Pages]

53 HMM: Generative Model (3)
States: x_i
State transitions: P(x_i|x_j) = a[x_i|x_j]
Output probabilities: P(o_i|x_j) = b[o_i|x_j]
Markov independence assumption

54 HMMs: Estimating Parameters
With fully-labeled data, just like naïve Bayes
Estimate MAP output probabilities (with a Laplace prior):
b[o|x] = (1 + N(o, x)) / (|V| + Σ_o' N(o', x))
Estimate MAP state transitions:
a[x_i|x_j] = (1 + N(x_j → x_i)) / (|S| + Σ_x N(x_j → x))
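A sketch of these estimates, assuming fully labeled training sequences of (word, state) pairs; the add-one smoothing mirrors the naïve Bayes estimates:

```python
from collections import Counter, defaultdict

def estimate_hmm(tagged_sequences, states, vocabulary):
    """MAP transition and emission estimates from labeled sequences."""
    trans = defaultdict(Counter)
    emit = defaultdict(Counter)
    for seq in tagged_sequences:  # seq is a list of (word, state) pairs
        for k, (word, state) in enumerate(seq):
            emit[state][word] += 1
            if k + 1 < len(seq):
                trans[state][seq[k + 1][1]] += 1
    # Add-one smoothing over states and over the vocabulary.
    a = {s: {t: (1 + trans[s][t]) / (len(states) + sum(trans[s].values()))
             for t in states} for s in states}
    b = {s: {w: (1 + emit[s][w]) / (len(vocabulary) + sum(emit[s].values()))
             for w in vocabulary} for s in states}
    return a, b
```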

55 HMMs: Performing Extraction
Given output words:
– fahlman s e 1991 the recurrent cascade correlation learning architecture nips 3 190 205
Find the state sequence x_1 … x_14 that maximizes:
Π_i a[x_i|x_(i-1)] · b[o_i|x_i]
Lots of possible state sequences to test (5^14)
Hmm…

56 Representation for Paths: Trellis
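The trellis is what makes dynamic programming possible: instead of scoring all 5^14 sequences, the Viterbi algorithm keeps one best path per state at each position. A minimal sketch, assuming the a and b tables estimated above plus an initial-state distribution priors (the 1e-12 out-of-vocabulary floor is illustrative):

```python
import math

def viterbi(words, states, priors, trans, emit):
    """Most likely state sequence, via dynamic programming over the trellis
    rather than enumerating all len(states) ** len(words) paths."""
    # First trellis column: prior times emission, in log space.
    V = [{s: math.log(priors[s]) + math.log(emit[s].get(words[0], 1e-12))
          for s in states}]
    back = []
    for word in words[1:]:
        scores, pointers = {}, {}
        for s in states:
            # Best predecessor state for s at this position.
            prev, score = max(((p, V[-1][p] + math.log(trans[p][s]))
                               for p in states), key=lambda pair: pair[1])
            scores[s] = score + math.log(emit[s].get(word, 1e-12))
            pointers[s] = prev
        V.append(scores)
        back.append(pointers)
    # Recover the best path by following back-pointers from the best final state.
    state = max(V[-1], key=V[-1].get)
    path = [state]
    for pointers in reversed(back):
        state = pointers[state]
        path.append(state)
    return list(reversed(path))
```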


61 HMM Example: Nymble [Bikel et al. 97]
Task: named entity extraction
[Figure: fully connected state diagram with states Person, Org, (five other name classes), Other, plus start-of-sentence and end-of-sentence]
Train on 450k words of news wire text.
Results:
  Case  | Language | F1
  Mixed | English  | 93%
  Upper | English  | 91%
  Mixed | Spanish  | 90%
Other details: bigram model within classes, backoff to unigram, special capitalization and number features…

62 Nymble word features

63 HMMs: A Plethora of Applications
Information extraction
Part-of-speech tagging
Word segmentation
Gene finding
Protein structure prediction
Speech recognition
Economics, climatology, robotics, …

