RANLP 2005 - Bernardo Magnini1 Open Domain Question Answering: Techniques, Resources and Systems Bernardo Magnini Itc-Irst Trento, Italy


1 RANLP Bernardo Magnini1 Open Domain Question Answering: Techniques, Resources and Systems Bernardo Magnini Itc-Irst Trento, Italy

2 RANLP Bernardo Magnini2 Outline of the Tutorial I. Introduction to QA II. QA at TREC III. System Architecture - Question Processing - Answer Extraction IV. Answer Validation on the Web V. Cross-Language QA

3 RANLP Bernardo Magnini3 Previous Lectures/Tutorials on QA  Dan Moldovan and Sanda Harabagiu: Question Answering, IJCAI  Maarten de Rijke and Bonnie Webber: Knowledge-Intensive Question Answering, ESSLLI  Jimmy Lin and Boris Katz: Web-based Question Answering, EACL 2004

4 RANLP Bernardo Magnini4 I. Introduction to Question Answering n What is Question Answering? n Applications n Users n Question Types n Answer Types n Evaluation n Presentation n Brief history

5 RANLP Bernardo Magnini5 Query Driven vs Answer Driven Information Access  What does LASER stand for?  When did Hitler attack the Soviet Union?  Using Google we find documents containing the question itself, no matter whether or not the answer is actually provided.  Current information access is query driven.  Question Answering proposes an answer driven approach to information access.

6 RANLP Bernardo Magnini6 Question Answering n Find the answer to a question in a large collection of documents u questions (in place of keyword-based query) u answers (in place of documents)

7 RANLP Bernardo Magnini7 Why Question Answering? Questions: Where is Naxos? What continent is Taormina in? What is the highest volcano in Europe? Searching for: Naxos, Taormina, Etna. Document collection: From the Caledonian Star in the Mediterranean – September 23, 1990 (www.expeditions.com): On a beautiful early morning the Caledonian Star approaches Naxos, situated on the east coast of Sicily. As we anchored and put the Zodiacs into the sea we enjoyed the great scenery. Under Mount Etna, the highest volcano in Europe, perches the fabulous town of Taormina. This is the goal for our morning. After a short Zodiac ride we embarked our buses with local guides and went up into the hills to reach the town of Taormina. Naxos was the first Greek settlement at Sicily. Soon a harbor was established but the town was later destroyed by invaders.[...]

8 RANLP Bernardo Magnini8 Alternatives to Information Retrieval n Document Retrieval u users submit queries corresponding to their information need u system returns (voluminous) list of full-length documents u it is the responsibility of the users to satisfy their original information need within the returned documents n Open-Domain Question Answering (QA) u users ask fact-based, natural language questions What is the highest volcano in Europe? u system returns list of short answers … Under Mount Etna, the highest volcano in Europe, perches the fabulous town … u more appropriate for specific information needs

9 RANLP Bernardo Magnini9 What is QA? n Find the answer to a question in a large collection of documents n What is the brightest star visible from Earth? 1. Sirio A is the brightest star visible from Earth even if it is… 2. the planet is 12-times brighter than Sirio, the brightest star in the sky…

10 RANLP Bernardo Magnini10 QA: a Complex Problem (1) n Problem: discovering implicit relations between questions and answers Who is the author of the “Star Spangled Banner”? …Francis Scott Key wrote the “Star Spangled Banner” in …comedian-actress Roseanne Barr sang her famous rendition of the “Star Spangled Banner” before …

11 RANLP Bernardo Magnini11 QA: a Complex Problem (2) n Problem: discovering implicit relations between questions and answers What is Mozart's birth date? …. Mozart (1756 – 1791) ….

12 RANLP Bernardo Magnini12 QA: a complex problem (3) n Problem: discovering implicit relations between questions and answers What is the distance between Naples and Ravello? “From the Naples Airport follow the sign to Autostrade (green road sign). Follow the directions to Salerno (A3). Drive for about 6 Km. Pay toll (Euros 1.20). Drive appx. 25 Km. Leave the Autostrade at Angri (Uscita Angri). Turn left, follow the sign to Ravello through Angri. Drive for about 2 Km. Turn right following the road sign "Costiera Amalfitana". Within 100m you come to traffic lights prior to narrow bridge. Watch not to miss the next Ravello sign, at appx. 1 Km from the traffic lights. Now relax and enjoy the views (follow this road for 22 Km). Once in Ravello...”.

13 RANLP Bernardo Magnini13 QA: Applications (1) n Information access: u Structured data (databases) u Semi-structured data (e.g. comment fields in databases, XML) u Free text n To search over: u The Web u A fixed set of text collections (e.g. TREC) u A single text (reading comprehension evaluation)

14 RANLP Bernardo Magnini14 QA: Applications (2) n Domain independent QA n Domain specific (e.g. help systems) n Multi-modal QA u Annotated images u Speech data

15 RANLP Bernardo Magnini15 QA: Users n Casual users, first time users u Understand the limitations of the system u Interpretation of the answer returned n Expert users u Difference between novel and already provided information u User Model

16 RANLP Bernardo Magnini16 QA: Questions (1) n Classification according to the answer type u Factual questions (What is the largest city …) u Opinions (What is the author's attitude …) u Summaries (What are the arguments for and against…) n Classification according to the question speech act: u Yes/No questions (Is it true that …) u WH questions (Who was the first president …) u Indirect Requests (I would like you to list …) u Commands (Name all the presidents …)

17 RANLP Bernardo Magnini17 QA: Questions (2) n Difficult questions u Why, How questions require understanding causality or instrumental relations u What questions have little constraint on the answer type (e.g. What did they do?)

18 RANLP Bernardo Magnini18 QA: Answers n Long answers, with justification n Short answers (e.g. phrases) n Exact answers (named entities) n Answer construction: u Extraction: cut and paste of snippets from the original document(s) u Generation: from multiple sentences or documents u QA and summarization (e.g. What is this story about?)

19 RANLP Bernardo Magnini19 QA: Information Presentation n Interfaces for QA u Not just isolated questions, but a dialogue u Usability and user satisfaction n Critical situations u Real time, single answer n Dialog-based interaction u Speech input u Conversational access to the Web

20 RANLP Bernardo Magnini20 QA: Brief History (1) n NLP interfaces to databases: u BASEBALL (1961), LUNAR (1973), TEAM (1979), ALFRESCO (1992) u Limitations: structured knowledge and limited domain n Story comprehension: Schank (1977), Kintsch (1998), Hirschman (1999)

21 RANLP Bernardo Magnini21 QA: Brief History (2) n Information retrieval (IR) u Queries are questions u Lists of documents are answers u QA is close to passage retrieval u Well established methodologies (i.e. the Text Retrieval Conferences, TREC) n Information extraction (IE): u Pre-defined templates are questions u Filled templates are answers

22 22RANLP Bernardo Magnini Research Context (1) Question Answering can be domain-specific or domain-independent; it can work over structured data or free text; it can search the Web, a fixed set of collections, or a single document. Growing interest in QA (TREC, CLEF, NTCIR evaluation campaigns). Recent focus on multilinguality and context-aware QA

23 23RANLP Bernardo Magnini Research Context (2) Along the two dimensions of faithfulness and compactness: Machine Translation output must be as faithful as possible, Automatic Summarization output as compact as possible, while Automatic Question Answering answers must be both faithful w.r.t. the question (correctness) and compact (exactness)

24 RANLP Bernardo Magnini24 II. Question Answering at TREC n The problem simplified n Questions and answers n Evaluation metrics n Approaches

25 RANLP Bernardo Magnini25 The problem simplified: The Text Retrieval Conference n Goal u Encourage research in information retrieval based on large-scale collections n Sponsors u NIST: National Institute of Standards and Technology u ARDA: Advanced Research and Development Activity u DARPA: Defense Advanced Research Projects Agency n Since 1999 n Participants are research institutes, universities, industries

26 RANLP Bernardo Magnini26 TREC Questions Fact-based, short answer questions: Q-1391: How many feet in a mile? Q-1057: Where is the volcano Mauna Loa? Q-1071: When was the first stamp issued? Q-1079: Who is the Prime Minister of Canada? Q-1268: Name a food high in zinc. Definition questions: Q-896: Who was Galileo? Q-897: What is an atom? Reformulation questions: Q-711: What tourist attractions are there in Reims? Q-712: What do most tourists visit in Reims? Q-713: What attracts tourists in Reims? Q-714: What are tourist attractions in Reims?

27 RANLP Bernardo Magnini27 Answer Assessment n Criteria for judging an answer u Relevance: it should be responsive to the question u Correctness: it should be factually correct u Conciseness: it should not contain extraneous or irrelevant information u Completeness: it should be complete, i.e. partial answers should not get full credit u Simplicity: it should be simple, so that the questioner can read it easily u Justification: it should be supplied with sufficient context to allow a reader to determine why this was chosen as an answer to the question

28 RANLP Bernardo Magnini28 Questions at TREC

29 RANLP Bernardo Magnini29 Exact Answers n Basic unit of a response: [answer-string, docid] pair n An answer string must contain a complete, exact answer and nothing else. What is the longest river in the United States? The following are correct, exact answers: Mississippi / the Mississippi / the Mississippi River / Mississippi River / mississippi while none of the following are correct exact answers: At 2,348 miles the Mississippi River is the longest river in the US. / 2,348 miles; Mississippi / Missipp / Missouri

30 RANLP Bernardo Magnini30 Assessments n Four possible judgments for a triple [ Question, document, answer ] n Right: the answer is appropriate for the question n Inexact: used for incomplete answers n Unsupported: answers without justification n Wrong: the answer is not appropriate for the question

31 RANLP Bernardo Magnini31 Example judgments (R=Right, X=ineXact, U=Unsupported, W=Wrong):
What is the capital city of New Zealand? R 1530 XIE Wellington
What is the Boston Strangler's name? R 1490 NYT Albert DeSalvo
What is the world's second largest island? R 1503 XIE New Guinea
What year did Wilt Chamberlain score 100 points? U 1402 NYT 1962
Who is the governor of Tennessee? R 1426 NYT Sundquist
What's the name of King Arthur's sword? U 1506 NYT Excalibur
When did Einstein die? R 1601 NYT April 18, 1955
What was the name of the plane that dropped the Atomic Bomb on Hiroshima? X 1848 NYT Enola
What was the name of FDR's dog? R 1838 NYT Fala
What day did Neil Armstrong land on the moon? R 1674 APW July 20, 1969
Who was the first Triple Crown Winner? X 1716 NYT Barton
When was Lyndon B. Johnson born? R 1473 APW …
Who was Woodrow Wilson's First Lady? R 1622 NYT Ellen
Where is Anne Frank's diary? W 1510 NYT Young Girl

32 RANLP Bernardo Magnini32 1402: What year did Wilt Chamberlain score 100 points? DIOGENE: 1962 ASSESSMENT: UNSUPPORTED PARAGRAPH: NYT Petty's 200 victories, 172 of which came during a 13-year span, may be as unapproachable as Joe DiMaggio's 56-game hitting streak or Wilt Chamberlain's 100-point game.

33 RANLP Bernardo Magnini33 1506: What's the name of King Arthur's sword? ANSWER: Excalibur ASSESSMENT: UNSUPPORTED PARAGRAPH: NYT `QUEST FOR CAMELOT,' with the voices of Andrea Carr, Gabriel Byrne, Cary Elwes, John Gielgud, Jessalyn Gilsig, Eric Idle, Gary Oldman, Bronson Pinchot, Don Rickles and Bryan White. Directed by Frederik Du Chau (G, 100 minutes). Warner Brothers' shaky entrance into the Disney-dominated sweepstakes of the musicalized animated feature wants to be a juvenile feminist ``Lion King'' with a musical heart that fuses ``Riverdance'' with formulaic Hollywood gush. But its characters are too wishy-washy and visually unfocused to be compelling, and the songs (by David Foster and Carole Bayer Sager) so forgettable as to be extraneous. In this variation on the Arthurian legend, a nondescript Celtic farm girl named Kayley with aspirations to be a knight wrests the magic sword Excalibur from the evil would-be emperor Ruber (a Hulk Hogan look-alike) and saves the kingdom (Holden).

34 RANLP Bernardo Magnini34 1848: What was the name of the plane that dropped the Atomic Bomb on Hiroshima? DIOGENE: Enola ASSESSMENT: INEXACT PARAGRAPH: NYT Tibbets piloted the Boeing B-29 Superfortress Enola Gay, which dropped the atomic bomb on Hiroshima on Aug. 6, 1945, causing an estimated 66,000 to 240,000 deaths. He named the plane after his mother, Enola Gay Tibbets.

35 RANLP Bernardo Magnini35 1716: Who was the first Triple Crown Winner? DIOGENE: Barton ASSESSMENT: INEXACT PARAGRAPH: NYT Not all of the Triple Crown winners were immortals. The first, Sir Barton, lost six races in 1918 before his first victory, just as Real Quiet lost six in a row last year. Try to find Omaha and Whirlaway on anybody's list of all-time greats.

36 RANLP Bernardo Magnini36 1510: Where is Anne Frank's diary? DIOGENE: Young Girl ASSESSMENT: WRONG PARAGRAPH: NYT Otto Frank released a heavily edited version of “B” for its first publication as “Anne Frank: Diary of a Young Girl” in 1947.

37 RANLP Bernardo Magnini37 TREC Evaluation Metric: Mean Reciprocal Rank (MRR) n Reciprocal Rank = inverse of the rank at which the first correct answer was found: [1, 0.5, 0.33, 0.25, 0.2, 0] n MRR: average over all questions n Strict score: unsupported answers count as incorrect n Lenient score: unsupported answers count as correct
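A minimal sketch of the metric, assuming each question is represented only by the rank of its first correct answer (None when no correct answer is returned); whether an unsupported answer counts as correct (strict vs. lenient scoring) is decided before this step.

```python
def reciprocal_rank(rank):
    """Inverse of the rank at which the first correct answer was found; 0 if none."""
    return 0.0 if rank is None else 1.0 / rank

def mean_reciprocal_rank(first_correct_ranks):
    """Average reciprocal rank over all questions."""
    return sum(reciprocal_rank(r) for r in first_correct_ranks) / len(first_correct_ranks)

# Four questions: first correct answer at rank 1, rank 3, not found, rank 2.
print(mean_reciprocal_rank([1, 3, None, 2]))  # (1 + 0.33 + 0 + 0.5) / 4 = 0.458...
```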

38 RANLP Bernardo Magnini38 TREC Evaluation Metrics: Confidence-Weighted Score (CWS) n CWS = (1/500) * Sum for i = 1 to 500 of (# correct answers up to question i / i), with questions sorted by the system's confidence n Example over 5 questions: System A: 1 C, 2 W, 3 C, 4 C, 5 W; System B: 1 W, 2 W, 3 C, 4 C, 5 C n System A: ((1/1) + ((1+0)/2) + ((1+0+1)/3) + ((1+0+1+1)/4) + ((1+0+1+1+0)/5)) / 5 = 0.70 n System B: ((0/1) + ((0+0)/2) + ((0+0+1)/3) + ((0+0+1+1)/4) + ((0+0+1+1+1)/5)) / 5 = 0.29
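The same computation as a small sketch, assuming the per-question judgments are already sorted by decreasing system confidence (True = correct); it reproduces the two totals above.

```python
def confidence_weighted_score(judgments):
    """judgments: booleans for each question, sorted by decreasing system confidence."""
    correct_so_far = 0
    total = 0.0
    for i, correct in enumerate(judgments, start=1):
        if correct:
            correct_so_far += 1
        total += correct_so_far / i  # fraction correct among the i most confident answers
    return total / len(judgments)

system_a = [True, False, True, True, False]   # C W C C W
system_b = [False, False, True, True, True]   # W W C C C
print(f"{confidence_weighted_score(system_a):.2f}")  # 0.70
print(f"{confidence_weighted_score(system_b):.2f}")  # 0.29
```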

39 RANLP Bernardo Magnini39 Evaluation n Best result: 67% n Average over 67 runs: 23% n TREC-8: best 66%, average 25% n TREC-9: best 58%, average 24% n TREC-10: best 67%, average 23%

40 RANLP Bernardo Magnini40 Main Approaches at TREC n Knowledge-Based n Web-based n Pattern-based

41 RANLP Bernardo Magnini41 Knowledge-Based Approach n Linguistic-oriented methodology u Determine the answer type from the question form u Retrieve small portions of documents u Find entities matching the answer type category in text snippets n Majority of systems use a lexicon (usually WordNet) u To find the answer type u To verify that a candidate answer is of the correct type u To get definitions n Complex architecture...

42 RANLP Bernardo Magnini42 Web-Based Approach QUESTION → Question Processing Component → Search Component (searching the WEB, in addition to the TREC Corpus and an Auxiliary Corpus) → Answer Extraction Component → ANSWER

43 RANLP Bernardo Magnini43 Pattern-Based Approach (1/3) n Knowledge poor n Strategy u Search for predefined patterns of textual expressions that may be interpreted as answers to certain question types. u The presence of such patterns in answer string candidates may provide evidence of the right answer.

44 RANLP Bernardo Magnini44 Pattern-Based Approach (2/3) n Conditions u Detailed categorization of question types F Up to 9 types of the “Who” question; 35 categories in total u Significant number of patterns corresponding to each question type F Up to 23 patterns for the “Who-Author” type, average of 15 u Find multiple candidate snippets and check for the presence of patterns (emphasis on recall)

45 RANLP Bernardo Magnini45 Pattern-Based Approach (3/3) n Example: patterns for definition questions n Question: What is A? 1. … correct answers 2. … 26 correct answers 3. … 12 correct answers 4. … 9 correct answers 5. … 8 correct answers 6. … 7 correct answers 7. … 3 correct answers total: 88 correct answers

46 RANLP Bernardo Magnini46 Use of answer patterns 1. For generating queries to the search engine. How did Mahatma Gandhi die? → “Mahatma Gandhi die”, “Mahatma Gandhi die of”, “Mahatma Gandhi lost his life in”, … The TEXTMAP system (ISI) uses 550 patterns, grouped in 105 equivalence blocks. On TREC-2003 questions, the system produced, on average, 5 reformulations for each question. 2. For answer extraction. When was Mozart born? P=1 <NAME> ( <DATE> - ) P=.69 <NAME> was born on <DATE>
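A toy sketch of the answer-extraction use of patterns, with patterns stored as regular expressions that carry an ANSWER capture group and a precision weight; the patterns, precisions and snippets below are illustrative stand-ins, not the actual TEXTMAP or DIOGENE resources.

```python
import re

# Illustrative (precision, regex template) pairs for "When was X born?".
BIRTHDATE_PATTERNS = [
    (1.00, r"{name}\s*\(\s*(?P<ANSWER>\d{{4}})\s*[-–]"),
    (0.69, r"{name} was born (?:on|in) (?P<ANSWER>[A-Z][a-z]+ \d{{1,2}}, \d{{4}}|\d{{4}})"),
]

def extract_birthdate(name, snippets):
    """Collect (answer, precision) candidates from retrieved snippets, best patterns first."""
    candidates = []
    for precision, template in BIRTHDATE_PATTERNS:
        pattern = re.compile(template.format(name=re.escape(name)))
        for snippet in snippets:
            for match in pattern.finditer(snippet):
                candidates.append((match.group("ANSWER"), precision))
    return sorted(candidates, key=lambda c: -c[1])

snippets = ["Wolfgang Amadeus Mozart (1756-1791) was a prolific composer.",
            "Mozart was born in 1756 in Salzburg."]
print(extract_birthdate("Mozart", snippets))  # [('1756', 1.0), ('1756', 0.69)]
```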

47 RANLP Bernardo Magnini47 Acquisition of Answer Patterns Relevant approaches: u Manually developed surface pattern library (Soubbotin, Soubbotin, 2001) u Automatically extracted surface patterns (Ravichandran, Hovy 2002) Pattern learning: 1. Start with a seed, e.g. (Mozart, 1756) 2. Download Web documents using a search engine 3. Retain sentences that contain both question and answer terms 4. Construct a suffix tree for extracting the longest matching substring that spans the question term and the answer term 5. Calculate the precision of each pattern: Precision = # of matches containing the correct answer / total # of matches
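A simplified sketch of step 5, assuming candidate patterns have already been extracted from the seed (Mozart, 1756) and rewritten as regexes with a {name} placeholder and an ANSWER group; the corpus sentences are made up for illustration.

```python
import re

def pattern_precision(pattern_template, seeds, corpus):
    """Precision of a candidate pattern = matches containing the correct answer / total matches."""
    correct, total = 0, 0
    for name, answer in seeds:
        regex = re.compile(pattern_template.format(name=re.escape(name)))
        for sentence in corpus:
            for match in regex.finditer(sentence):
                total += 1
                if match.group("ANSWER") == answer:
                    correct += 1
    return correct / total if total else 0.0

# Hypothetical candidate pattern learned from the seed pair (Mozart, 1756).
candidate = r"{name}\s*\(\s*(?P<ANSWER>\d{{4}})"
corpus = ["Wolfgang Amadeus Mozart (1756-1791) composed over 600 works.",
          "A Mozart (1985) compilation album was reissued on CD."]
print(pattern_precision(candidate, [("Mozart", "1756")], corpus))  # 1 correct of 2 matches -> 0.5
```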

48 RANLP Bernardo Magnini48 Capturing variability with patterns n Pattern based QA is more effective when supported by variable typing obtained using NLP techniques and resources. When was <NAME> born? <NAME> ( <DATE> - <NAME> was born in <DATE> n Surface patterns cannot deal with word reordering and apposition phrases: Galileo, the famous astronomer, was born in … n The fact that most of the QA systems use syntactic parsing demonstrates that the successful solution of the answer extraction problem goes beyond surface form analysis

49 RANLP Bernardo Magnini49 Syntactic answer patterns (1) Answer patterns that capture the syntactic relations of a sentence. When was <X> invented? [S [NP The <X>] [VP was invented [PP in <DATE>]]]

50 RANLP Bernardo Magnini50 Syntactic answer patterns (2) [S [NP The first phonograph] [VP was invented [PP in 1877]]] The matching phase turns out to be a problem of partial match among syntactic trees.

51 RANLP Bernardo Magnini51 III. System Architecture n Knowledge Based approach u Question Processing u Search component u Answer Extraction

52 RANLP Bernardo Magnini52 Knowledge based QA QUESTION → Question Processing Component (tokenization & POS tagging, multiwords recognition, question parsing, answer type identification, keywords expansion, word sense disambiguation, query composition) → Search Component (search engine over the document collection) → Answer Extraction Component (paragraph filtering, named entities recognition, answer identification, answer validation) → ANSWER

53 RANLP Bernardo Magnini53 Question Analysis (1) n Input: natural language question n Output: u query for the search engine (i.e. a boolean composition of weighted keywords) u Answer type u Additional constraints: question focus, syntactic or semantic relations that should hold between a candidate answer entity and other entities

54 RANLP Bernardo Magnini54 Question Analysis (2) n Steps: 1. Tokenization 2. POS-tagging 3. Multi-words recognition 4. Parsing 5. Answer type and focus identification 6. Keyword extraction 7. Word Sense Disambiguation 8. Expansions
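A toy, end-to-end sketch of these steps in order; every component is a deliberately naive stand-in (tiny stop-word, multi-word and expansion lists), not the modules of an actual QA system.

```python
import re

STOPWORDS = {"who", "was", "the", "of", "what", "is", "a", "an", "in", "to"}
MULTIWORDS = {("electric", "light"): "electric_light"}
EXPANSIONS = {"inventor": ["discoverer"], "electric_light": ["incandescent_lamp", "light_bulb"]}

def analyze_question(text):
    """Run the eight question-analysis steps on a single question (toy stand-ins only)."""
    tokens = re.findall(r"\w+", text.lower())                          # 1. tokenization
    pos = ["WH" if t in {"who", "what", "when", "where"} else "TOK"
           for t in tokens]                                            # 2. POS tagging (stub)
    merged, i = [], 0                                                  # 3. multi-word recognition
    while i < len(tokens):
        if i + 1 < len(tokens) and (tokens[i], tokens[i + 1]) in MULTIWORDS:
            merged.append(MULTIWORDS[(tokens[i], tokens[i + 1])])
            i += 2
        else:
            merged.append(tokens[i])
            i += 1
    # 4. parsing is omitted in this sketch
    answer_type = "PERSON" if tokens and tokens[0] == "who" else "GENERIC"  # 5. answer type (stub)
    keywords = [t for t in merged if t not in STOPWORDS]               # 6. keyword extraction
    senses = {k: k + "#1" for k in keywords}                           # 7. WSD (stub: first sense)
    expansions = {k: EXPANSIONS.get(k, []) for k in keywords}          # 8. expansions
    return {"tokens": tokens, "pos": pos, "answer_type": answer_type,
            "keywords": keywords, "senses": senses, "expansions": expansions}

print(analyze_question("Who was the inventor of the electric light?"))
```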

55 RANLP Bernardo Magnini55 Tokenization and POS-tagging NL-QUESTION: Who was the inventor of the electric light? (token / lemma / POS / span)
Who / Who / CCHI / [0,0]
was / be / VIY / [1,1]
the / det / RS / [2,2]
inventor / inventor / SS / [3,3]
of / of / ES / [4,4]
the / det / RS / [5,5]
electric / electric / AS / [6,6]
light / light / SS / [7,7]
? / ? / XPS / [8,8]

56 RANLP Bernardo Magnini56 Multi-Words recognition NL-QUESTION: Who was the inventor of the electric light? (token / lemma / POS / span)
Who / Who / CCHI / [0,0]
was / be / VIY / [1,1]
the / det / RS / [2,2]
inventor / inventor / SS / [3,3]
of / of / ES / [4,4]
the / det / RS / [5,5]
electric_light / electric_light / SS / [6,7]
? / ? / XPS / [8,8]

57 RANLP Bernardo Magnini57 Syntactic Parsing n Identify the syntactic structure of a sentence u noun phrases (NP), verb phrases (VP), prepositional phrases (PP), etc. Why did David Koresh ask the FBI for a word processor? POS: WRB VBD NNP NNP VB DT NNP IN DT NN NN Constituents: [SBARQ [WHADVP Why] [SQ did [NP David Koresh] [VP ask [NP the FBI] [PP for [NP a word processor]]]]]

58 RANLP Bernardo Magnini58 Answer Type and Focus n Focus is the word that expresses the relevant entity in the question u Used to select a set of relevant documents u e.g. Where was Mozart born? n Answer Type is the category of the entity to be searched as answer u PERSON, MEASURE, TIME PERIOD, DATE, ORGANIZATION, DEFINITION u e.g. Where was Mozart born? F LOCATION

59 RANLP Bernardo Magnini59 Answer Type and Focus What famous communist leader died in Mexico City? RULENAME: WHAT-WHO TEST: [“what” [¬ NOUN]* [NOUN:person-p] J +] OUTPUT: [“PERSON” J] Answer type: PERSON Focus: leader This rule matches any question starting with what, whose first noun, if any, is a person (i.e. satisfies the person-p predicate)
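A rough rendering of this kind of rule in code; the PERSON_NOUNS set is a tiny hand-made stand-in for the person-p predicate, which in a real system would be checked against a lexicon such as WordNet.

```python
PERSON_NOUNS = {"leader", "president", "author", "actor", "scientist"}  # stand-in for person-p

def what_who_rule(tokens, pos_tags):
    """If a question starts with 'what' and its first noun denotes a person,
    return ('PERSON', focus); otherwise the rule does not fire."""
    if not tokens or tokens[0].lower() != "what":
        return None
    for token, tag in zip(tokens[1:], pos_tags[1:]):
        if tag == "NOUN":
            return ("PERSON", token) if token.lower() in PERSON_NOUNS else None
    return None

tokens = ["What", "famous", "communist", "leader", "died", "in", "Mexico", "City", "?"]
tags   = ["WH", "ADJ", "ADJ", "NOUN", "VERB", "PREP", "NOUN", "NOUN", "PUNCT"]
print(what_who_rule(tokens, tags))  # ('PERSON', 'leader')
```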

60 RANLP Bernardo Magnini60 Keywords Extraction NL-QUESTION: Who was the inventor of the electric light? (token / lemma / POS / span)
Who / Who / CCHI / [0,0]
was / be / VIY / [1,1]
the / det / RS / [2,2]
inventor / inventor / SS / [3,3]
of / of / ES / [4,4]
the / det / RS / [5,5]
electric_light / electric_light / SS / [6,7]
? / ? / XPS / [8,8]
Extracted keywords: inventor, electric_light

61 RANLP Bernardo Magnini61 Word Sense Disambiguation What is the brightest star visible from Earth?
STAR: star#1: celestial body (ASTRONOMY); star#2: an actor who plays … (ART)
BRIGHT: bright#1: bright, brilliant, shining (PHYSICS); bright#2: popular, glorious (GENERIC); bright#3: promising, auspicious (GENERIC)
VISIBLE: visible#1: conspicuous, obvious (PHYSICS); visible#2: visible, seeable (ASTRONOMY)
EARTH: earth#1: Earth, world, globe (ASTRONOMY); earth#2: estate, land, landed_estate, acres (ECONOMY); earth#3: clay (GEOLOGY); earth#4: dry_land, earth, solid_ground (GEOGRAPHY); earth#5: land, ground, soil (GEOGRAPHY); earth#6: earth, ground (GEOLOGY)

62 RANLP Bernardo Magnini62 Expansions - NL-QUESTION: Who was the inventor of the electric light? - BASIC-KEYWORDS: inventor, electric_light
inventor: synonyms: discoverer, artificer; derivation: invention (synonyms: innovation); derivation: invent (synonyms: excogitate)
electric_light: synonyms: incandescent_lamp, light_bulb

63 RANLP Bernardo Magnini63 Keyword Composition n Keywords and expansions are composed in a boolean expression with AND/OR operators n Several possibilities: u AND composition u Cartesian composition, as in the sketch below: (inventor AND electric_light) OR (inventor AND incandescent_lamp) OR (discoverer AND electric_light) OR … OR inventor OR electric_light
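A minimal sketch of the Cartesian composition: each keyword is replaced by the set of its alternatives (the keyword plus its expansions), all cross-keyword AND pairs are generated, and the single keywords are appended as a fallback; the query syntax is generic, not tied to a particular search engine.

```python
from itertools import product

def cartesian_query(keywords, expansions):
    """Build an OR of AND pairs over all keyword alternatives, plus single-keyword fallbacks."""
    alternatives = [[kw] + expansions.get(kw, []) for kw in keywords]
    and_pairs = []
    for i in range(len(alternatives)):
        for j in range(i + 1, len(alternatives)):
            for left, right in product(alternatives[i], alternatives[j]):
                and_pairs.append(f"({left} AND {right})")
    return " OR ".join(and_pairs + list(keywords))

expansions = {"inventor": ["discoverer"], "electric_light": ["incandescent_lamp", "light_bulb"]}
print(cartesian_query(["inventor", "electric_light"], expansions))
# (inventor AND electric_light) OR (inventor AND incandescent_lamp) OR ... OR inventor OR electric_light
```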

64 RANLP Bernardo Magnini64 Document Collection Pre-processing n For real time QA applications off-line pre-processing of the text is necessary u Term indexing u POS-tagging u Named Entities Recognition

65 RANLP Bernardo Magnini65 Candidate Answer Document Selection n Passage Selection: identify relevant, small text portions n Given a document and a list of keywords: u Fix a paragraph length (e.g. 200 words) u Consider the percentage of keywords present in the passage u Consider whether some keyword is obligatory (e.g. the focus of the question)
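A small sketch of passage selection along these lines: overlapping fixed-length word windows are scored by the fraction of query keywords they contain, and windows missing an obligatory keyword are discarded; the window size and threshold are illustrative defaults.

```python
def select_passages(document, keywords, obligatory=None, window=200, min_coverage=0.5):
    """Return (coverage, passage) pairs for word windows that cover enough keywords."""
    words = document.split()
    keyword_set = {k.lower() for k in keywords}
    passages = []
    step = max(1, window // 2)  # overlapping windows
    for start in range(0, max(1, len(words) - window + 1), step):
        chunk = words[start:start + window]
        chunk_set = {w.strip(".,;:!?\"'").lower() for w in chunk}
        if obligatory and obligatory.lower() not in chunk_set:
            continue  # the obligatory keyword (e.g. the focus) is required
        coverage = len(keyword_set & chunk_set) / len(keyword_set)
        if coverage >= min_coverage:
            passages.append((coverage, " ".join(chunk)))
    return sorted(passages, reverse=True)

doc = ("Thomas Edison patented the electric light in 1879. "
       "The inventor demonstrated his incandescent lamp at Menlo Park.")
print(select_passages(doc, ["inventor", "electric", "light"], obligatory="inventor", window=40))
```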

66 RANLP Bernardo Magnini66 Candidate Answer Document Analysis n Passage text tagging n Named Entity Recognition Who is the author of the “Star Spangled Banner”? … Francis Scott Key wrote the “Star Spangled Banner” in 1814 n Some systems: u passage parsing (Harabagiu, 2001) u Logical form (Zajac, 2001)

67 RANLP Bernardo Magnini67 Answer Extraction (1) n Who is the author of the “Star Spangled Banner”? … Francis Scott Key wrote the “Star Spangled Banner” in 1814 Answer Type = PERSON Candidate Answer = Francis Scott Key Ranking candidate answers: keyword density in the passage, apply additional constraints (e.g. syntax, semantics), rank candidates using the Web
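A toy sketch of the ranking step, assuming named entities have already been recognized in the passage; candidates of the expected answer type are scored by the density of question keywords around them (the window size and the scoring are deliberately simple).

```python
def rank_candidates(entities, passage_tokens, keywords, answer_type, window=10):
    """entities: list of (entity_text, entity_type, token_position); returns (score, entity)."""
    keyword_set = {k.lower() for k in keywords}
    ranked = []
    for text, etype, position in entities:
        if etype != answer_type:
            continue  # only keep candidates matching the expected answer type
        neighbourhood = passage_tokens[max(0, position - window): position + window]
        density = sum(1 for t in neighbourhood if t.lower() in keyword_set) / len(keyword_set)
        ranked.append((density, text))
    return sorted(ranked, reverse=True)

passage = "Francis Scott Key wrote the Star Spangled Banner in 1814".split()
entities = [("Francis Scott Key", "PERSON", 0), ("1814", "DATE", 9)]
keywords = ["author", "Star", "Spangled", "Banner"]
print(rank_candidates(entities, passage, keywords, answer_type="PERSON"))  # [(0.75, 'Francis Scott Key')]
```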

68 RANLP Bernardo Magnini68 Answer Identification (example: a candidate answer, Thomas Edison, identified in a retrieved passage)

69 RANLP Bernardo Magnini69 IV. Answer Validation n Automatic answer validation n Approach: u web-based u use of patterns u combine statistics and linguistic information n Discussion n Conclusions

70 RANLP Bernardo Magnini70 QA Architecture QUESTION → Question Processing Component (tokenization & POS tagging, question parsing, answer type identification, keywords expansion, word sense disambiguation, query composition) → Search Component (search engine over the document collection) → Answer Extraction Component (paragraph filtering, named entities recognition, answer identification, answer ranking) → ANSWER

71 RANLP Bernardo Magnini71 The problem: Answer Validation Given a question q and a candidate answer a, decide if a is a correct answer for q What is the capital of the USA? Candidates: Washington D.C. / San Francisco / Rome

72 RANLP Bernardo Magnini72 The problem: Answer Validation Given a question q and a candidate answer a, decide if a is a correct answer for q What is the capital of the USA? Washington D.C. → correct; San Francisco → wrong; Rome → wrong

73 RANLP Bernardo Magnini73 Requirements for Automatic AV n Accuracy: it has to compare well with respect to human judgments n Efficiency: large scale (Web), real time scenarios n Simplicity: avoid the complexity of QA systems

74 RANLP Bernardo Magnini74 Approach n Web-based u take advantage of Web redundancy n Pattern-based u the Web is mined using patterns (i.e. validation patterns) extracted from the question and the candidate answer n Quantitative (as opposed to content-based) u check if the question and the answer tend to appear together in the Web considering the number of documents returned (i.e. documents are not downloaded)

75 RANLP Bernardo Magnini75 Web Redundancy What is the capital of the USA?
- Capital Region USA: Fly-Drive Holidays in and Around Washington D.C.
- the Insider’s Guide to the Capital Area Music Scene (Washington D.C., USA)
- The Capital Tangueros (Washington DC Area, USA)
- I live in the Nation’s Capital, Washington Metropolitan Area (USA)
- In 1790 Capital (also USA’s capital): Washington D.C. Area: 179 square km
→ Washington

76 RANLP Bernardo Magnini76 Validation Pattern
- Capital Region USA: Fly-Drive Holidays in and Around Washington D.C.
- the Insider’s Guide to the Capital Area Music Scene (Washington D.C., USA)
- The Capital Tangueros (Washington DC Area, USA)
- I live in the Nation’s Capital, Washington Metropolitan Area (USA)
- In 1790 Capital (also USA’s capital): Washington D.C. Area: 179 square km
[Capital NEAR USA NEAR Washington]

77 RANLP Bernardo Magnini77 Related Work n Pattern-based QA u Brill, 2001 – TREC-10 u Soubbotin, 2001 – TREC-10 u Ravichandran and Hovy, ACL-02 n Use of the Web for QA u Clarke et al. – TREC-10 u Radev et al. – CIKM n Statistical approach on the Web u PMI-IR: Turney, 2001 and ACL-02

78 RANLP Bernardo Magnini78 Architecture: the question and the candidate answer are combined into a validation pattern; a filtering step checks the number of documents returned (#doc < k); an answer validity score is then computed and compared with a threshold t: score > t → correct answer, score < t → wrong answer

79 RANLP Bernardo Magnini79 Architecture (same diagram as the previous slide)

80 RANLP Bernardo Magnini80 Extracting Validation Patterns: the question is turned into a question pattern (Qp) by a stop-word filter and term expansion; the candidate answer is turned into an answer pattern (Ap) by a stop-word filter and named entity recognition driven by the answer type; together they form the validation pattern

81 RANLP Bernardo Magnini81 Architecture (same diagram as above)

82 RANLP Bernardo Magnini82 Answer Validity Score n PMI-IR algorithm (Turney, 2001): PMI(Qp, Ap) = P(Qp, Ap) / (P(Qp) * P(Ap)) n The result is interpreted as evidence that the validation pattern is consistent, which implies answer accuracy

83 RANLP Bernardo Magnini83 Answer Validity Score PMI(Qp, Ap) = hits(Qp NEAR Ap) / (hits(Qp) * hits(Ap)) n Three searches are submitted to the Web: hits(Qp), hits(Ap), hits(Qp NEAR Ap)
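A sketch of the scoring and thresholding steps; since the real system queried a search engine supporting the NEAR operator, the hit counts here are hard-coded, purely illustrative numbers, and the 0.2 relative threshold is the one used later in the example.

```python
def pmi_score(hits_q, hits_a, hits_near, max_pages=1e8):
    """PMI-style validity score from Web hit counts (probabilities estimated as hits / max_pages)."""
    if hits_q == 0 or hits_a == 0:
        return 0.0
    return (hits_near / max_pages) / ((hits_q / max_pages) * (hits_a / max_pages))

def validate(candidates, threshold_ratio=0.2):
    """candidates: {answer: (hits_q, hits_a, hits_near)}; keep answers scoring above 0.2 * max."""
    scores = {answer: pmi_score(*counts) for answer, counts in candidates.items()}
    threshold = threshold_ratio * max(scores.values())
    return {answer: (round(score, 1), score >= threshold) for answer, score in scores.items()}

# Illustrative counts for "What county is Modesto, California in?"
candidates = {"Stanislaus":    (60_000, 220_000, 8_000),
              "San Francisco": (60_000, 9_000_000, 300)}
print(validate(candidates))  # Stanislaus well above the threshold, San Francisco below it
```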

84 RANLP Bernardo Magnini84 Example What county is Modesto, California in? Answer type: Location Qp = [county NEAR Modesto NEAR California] (after stop-word filtering) P(Qp) = P(county, Modesto, California) = … * 10^8 Candidate answers: A1 = “The Stanislaus County district attorney’s” A2 = “In Modesto, San Francisco, and”

85 RANLP Bernardo Magnini85 Example (cont.) After named entity recognition (location): “The Stanislaus County district attorney’s” → A1p = [Stanislaus]; “In Modesto, San Francisco, and” → A2p = [San Francisco] P(Stanislaus) = … * 10^8 P(San Francisco) = … * 10^8

86 RANLP Bernardo Magnini86 Example (cont.) P(Qp, A1p) = … * 10^8, PMI(Qp, A1p) = 2473 (“The Stanislaus County district attorney’s”) P(Qp, A2p) = … * 10^8, PMI(Qp, A2p) = 0.89 (“In Modesto, San Francisco, and”) With threshold t = 0.2 * MAX(AVS): A1 > t → correct answer; A2 < t → wrong answer

87 RANLP Bernardo Magnini87 Experiments n Data set: u 492 TREC-2001 questions u 2726 answers: 3 correct answers and 3 wrong answers for each question, randomly selected from TREC-10 participants human-judged corpus n Search engine: Altavista u allows the NEAR operator

88 RANLP Bernardo Magnini88 Experiment: Answers Q-916: What river in the US is known as the Big Muddy ?  The Mississippi  Known as Big Muddy, the Mississippi is the longest  as Big Muddy, the Mississippi is the longest  messed with. Known as Big Muddy, the Mississip  Mississippi is the longest river in the US  the Mississippi is the longest river(Mississippi)  has brought the Mississippi to its lowest  ipes.In Life on the Mississippi,Mark Twain wrote t  Southeast;Mississippi;Mark Twain; officials began  Known; Mississippi; US; Minnesota; Gulf Mexico  Mud Island,;Mississippi;”The;--history,;Memphis

89 RANLP Bernardo Magnini89 Baseline n Consider the documents provided by NIST to TREC-10 participants (1000 documents for each question) n If the candidate answer occurs (i.e. string match) at least one time in the top 10 documents it is judged correct, otherwise it is considered wrong

90 RANLP Bernardo Magnini90 Asymmetrical Measures n Problem: some candidate answers (e.g. numbers) produce an enormous amount of Web documents n Scores for good (a_c) and bad (a_w) answers tend to be similar, making the choice more difficult: PMI(q, a_c) ≈ PMI(q, a_w) How many Great Lakes are there? … to cross all five Great Lakes completed a 19.2 …

91 RANLP Bernardo Magnini91 Asymmetric Conditional Probability (ACP) ACP(Qsp, Asp) = P(Qsp, Asp) / (P(Qsp) * P(Asp)^(2/3)) = hits(Qsp NEAR Asp) / (hits(Qsp) * hits(Asp)^(2/3))
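The asymmetric variant only dampens the answer-frequency term; the sketch below, under the same illustrative assumptions as the PMI snippet above (hard-coded hit counts, the reconstructed 2/3 exponent), uses the Great Lakes example to show why: a correct but very frequent answer string is no longer pushed down to the level of a wrong answer.

```python
def pmi_score(hits_q, hits_a, hits_near):
    """Symmetric score (proportional to PMI; constant factors omitted)."""
    return hits_near / (hits_q * hits_a) if hits_q and hits_a else 0.0

def acp_score(hits_q, hits_a, hits_near, exponent=2 / 3):
    """Asymmetric score: the hits of the answer pattern are dampened by the 2/3 exponent."""
    return hits_near / (hits_q * hits_a ** exponent) if hits_q and hits_a else 0.0

# Illustrative counts for "How many Great Lakes are there?": the correct answer "five"
# is an extremely frequent string on the Web, the wrong answer "19.2" is not.
counts = {"five": (1_000_000, 50_000_000, 50_000),
          "19.2": (1_000_000, 100_000, 100)}
for answer, c in counts.items():
    print(answer, pmi_score(*c), acp_score(*c))
# PMI gives both answers the same score (1e-09); ACP ranks "five" about 8 times higher.
```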

92 RANLP Bernardo Magnini92 Comparing PMI and ACP ACP increases the difference between the right and the wrong answer.

93 RANLP Bernardo Magnini93 Results Success rate (SR) of Baseline, MLHR, PMI and ACP, each with an absolute and a relative threshold: u on all 492 TREC-2001 questions u on the 249 factoid questions

94 RANLP Bernardo Magnini94 Discussion (1) n Definition questions are the most problematic u on the subset of 249 named-entity questions the success rate is higher (86.3%) n The relative threshold improves performance (+2%) over the fixed threshold n Non-symmetric measures of co-occurrence work better for answer validation (+2%) n Sources of errors: u Answer type recognition u Named-entity recognition u TREC answer set (e.g. tokenization)

95 RANLP Bernardo Magnini95 Discussion (2) n Automatic answer validation is a key challenge for Web-based question answering systems n Requirements: u accuracy with respect to human judgments: 80% success rate is a good starting point u efficiency: documents are not downloaded u simplicity: based on patterns n At present, it is suitable for a generate-and-test component integrated in a QA system

96 RANLP Bernardo Magnini96 V. Cross-Language QA n Motivations n Performances n Approaches

97 RANLP Bernardo Magnini97 Motivations n Answers may be found in languages different from the language of the question. n Interest in QA systems for languages other than English. n Force the QA community to design real multilingual systems. n Check/improve the portability of the technologies implemented in current English QA systems.

98 RANLP Bernardo Magnini98 Cross-Language QA Quanto è alto il Mont Ventoux? (How tall is Mont Ventoux?) “Le Mont Ventoux, impérial avec ses 1909 mètres et sa tour blanche telle un étendard, règne de toutes …” → 1909 metri (searching the English, Italian, Spanish and French corpora)

99 RANLP Bernardo Magnini99 CL-QA at CLEF n Adopt the same rules used at TREC QA u Factoid questions (i.e. no definition questions) u Exact answers + document id n Use the CLEF news corpora n Return the answer in the language of the text collection in which it has been found (i.e. no translation of the answer) n QA-CLEF-2003 was an initial step toward a more complex task organized at CLEF-2004 and 2005.

100 100RANLP Bernardo Magnini CLEF 2004 (http://clef-qa.itc.it/2004) Seven groups coordinated the QA track: - ITC-irst (IT and EN test set preparation) - DFKI (DE) - ELDA/ELRA (FR) - Linguateca (PT) - UNED (ES) - U. Amsterdam (NL) - U. Limerick (EN assessment) Two more groups participated in the test set construction: - Bulgarian Academy of Sciences (BG) - U. Helsinki (FI)

101 101RANLP Bernardo Magnini CLEF QA - Overview Workflow: question generation (2.5 person/months per group): 100 monolingual Q&A pairs with EN translation per language → 700 Q&A pairs in 1 language + EN → translation of EN into the 7 languages and selection of additional questions → Multieight-04: XML collection of 700 Q&A in 8 languages → extraction of plain text test sets → experiments on the document collections (1-week window, 10-23/5) → systems’ answers → manual assessment (2 person/days for 1 run) → evaluation

102 102RANLP Bernardo Magnini CLEF QA – Task Definition Given 200 questions in a source language, find one exact answer per question in a collection of documents written in a target language, and provide a justification for each retrieved answer (i.e. the docid of the unique document that supports the answer). Source languages: BG, DE, EN, ES, FI, FR, IT, NL, PT; target languages: DE, EN, ES, FR, IT, NL, PT. 6 monolingual and 50 bilingual tasks; teams participated in 19 tasks.

103 103RANLP Bernardo Magnini CLEF QA - Questions All the test sets were made up of 200 questions: - ~90% factoid questions - ~10% definition questions - ~10% of the questions did not have any answer in the corpora (the right answer-string was “NIL”) Problems in introducing definition questions:  What’s the right answer? (it depends on the user’s model)  What’s the easiest and most efficient way to assess their answers?  Overlap with factoid questions: F Who is the Pope? D Who is John Paul II? (possible answers: John Paul II / the Pope / the head of the Roman Catholic Church)

104 104RANLP Bernardo Magnini CLEF QA – Multieight
BG: Как умира Пазолини? TRANSLATION[убит]
DE: Auf welche Art starb Pasolini? TRANSLATION[ermordet] answer: ermordet
EN: How did Pasolini die? TRANSLATION[murdered] answer: murdered
ES: ¿Cómo murió Pasolini? TRANSLATION[Asesinado] answer: Brutalmente asesinado en los arrabales de Ostia
FR: Comment est mort Pasolini ? TRANSLATION[assassiné] answers: assassiné; assassiné en novembre 1975 dans des circonstances mystérieuses; assassiné il y a 20 ans
IT: Come è morto Pasolini? TRANSLATION[assassinato] answer: massacrato e abbandonato sulla spiaggia di Ostia
NL: Hoe stierf Pasolini? TRANSLATION[vermoord] answer: vermoord
PT: Como morreu Pasolini? answer: assassinado

105 105RANLP Bernardo Magnini CLEF QA - Assessment Judgments taken from the TREC QA tracks: - Right - Wrong - ineXact - Unsupported Other criteria, such as the length of the answer-strings (instead of X, which is underspecified) or the usefulness of responses for a potential user, have not been considered. Main evaluation measure was accuracy (fraction of Right responses). Whenever possible, a Confidence-Weighted Score was calculated: CWS = (1/Q) * Sum for i = 1 to Q of (number of correct responses in first i ranks / i)

106 106RANLP Bernardo Magnini Evaluation Exercise - Participants Distribution of participating groups (America, Europe, Asia, Australia, total) and submitted runs in different QA evaluation campaigns: TREC-8 through TREC-12, NTCIR-3 (QAC-1), CLEF-2003 and CLEF-2004.

107 107RANLP Bernardo Magnini Evaluation Exercise - Participants Number of participating teams and submitted runs at CLEF-2004, per source/target language pair (sources: BG, DE, EN, ES, FI, FR, IT, NL, PT; targets: DE, EN, ES, FR, IT, NL, PT). Comparability issue.

108 108RANLP Bernardo Magnini Evaluation Exercise - Results Systems’ performance (best system and average accuracy, %) at the TREC and CLEF QA tracks: TREC-8, TREC-9, TREC-10, TREC-11, TREC-12*, CLEF-2003** and CLEF-2004 (monolingual and bilingual). * considering only the 413 factoid questions ** considering only the answers returned at the first rank

109 109RANLP Bernardo Magnini Evaluation Exercise – CL Approaches Pipeline: INPUT (source language) → Question Analysis / keyword extraction → Candidate Document Selection → Candidate Document Analysis → Answer Extraction → OUTPUT (target language), over a preprocessed document collection. Two cross-language strategies: translation of the question into the target language vs. translation of the retrieved data. Groups in the track included U. Amsterdam, U. Edinburgh, U. Neuchatel, Bulgarian Academy of Sciences, ITC-Irst, U. Limerick, U. Helsinki, DFKI and LIMSI-CNRS.

110 110RANLP Bernardo Magnini Discussion on Cross-Language QA The CLEF multilingual QA track (like TREC QA) represents a formal evaluation, designed with an eye to replicability. As an exercise, it is an abstraction of the real problems. Future challenges: - investigate QA in combination with other applications (for instance summarization) - access not only free text, but also different sources of data (multimedia, spoken language, imagery) - introduce automated evaluation along with judgments given by humans - focus on users’ needs: develop real-time interactive systems, which means modeling a potential user and defining suitable answer types.

111 RANLP Bernardo Magnini111 References n Books n Pasca, Marius. Open Domain Question Answering from Large Text Collections. CSLI, 2003. n Maybury, Mark (Ed.). New Directions in Question Answering. AAAI Press, 2004. n Journals n Hirschman, Gaizauskas. Natural Language Question Answering: the View from Here. JNLE, 7(4), 2001. n TREC n E. Voorhees. Overview of the TREC 2001 Question Answering Track. n M.M. Soubbotin, S.M. Soubbotin. Patterns of Potential Answer Expressions as Clues to the Right Answers. n S. Harabagiu, D. Moldovan, M. Pasca, M. Surdeanu, R. Mihalcea, R. Girju, V. Rus, F. Lacatusu, P. Morarescu, R. Bunescu. Answering Complex, List and Context Questions with LCC’s Question-Answering Server. n C.L.A. Clarke, G.V. Cormack, T.R. Lynam, C.M. Li, G.L. McLearn. Web Reinforced Question Answering (MultiText Experiments for TREC 2001). n E. Brill, J. Lin, M. Banko, S. Dumais, A. Ng. Data-Intensive Question Answering.

112 RANLP Bernardo Magnini112 References n Workshop Proceedings n H. Chen and C.-Y. Lin, editors. Proceedings of the Workshop on Multilingual Summarization and Question Answering at COLING-02, Taipei, Taiwan, 2002. n M. de Rijke and B. Webber, editors. Proceedings of the Workshop on Natural Language Processing for Question Answering at EACL-03, Budapest, Hungary, 2003. n R. Gaizauskas, M. Hepple, and M. Greenwood, editors. Proceedings of the Workshop on Information Retrieval for Question Answering at SIGIR-04, Sheffield, United Kingdom, 2004.

113 RANLP Bernardo Magnini113 References n N. Kando and H. Ishikawa, editors. Working Notes of the 4th NTCIR Workshop Meeting on Evaluation of Information Access Technologies: Information Retrieval, Question Answering and Summarization (NTCIR-04), Tokyo, Japan, 2004. n M. Maybury, editor. Proceedings of the AAAI Spring Symposium on New Directions in Question Answering, Stanford, California, 2003. n C. Peters and F. Borri, editors. Working Notes of the 5th Cross-Language Evaluation Forum (CLEF-04), Bath, United Kingdom, 2004. n J. Pustejovsky, editor. Final Report of the Workshop on TERQAS: Time and Event Recognition in Question Answering Systems, Bedford, Massachusetts, 2002. n Y. Ravin, J. Prager and S. Harabagiu, editors. Proceedings of the Workshop on Open-Domain Question Answering at ACL-01, Toulouse, France, 2001.

