
1 (C) 2000, The University of Michigan 1 Language and Information Handout #4 November 9, 2000

2 (C) 2000, The University of Michigan 2 Course Information
Instructor: Dragomir R. Radev
Office: 305A, West Hall
Phone: (734)
Office hours: TTh 3-4
Course page:
Class meets on Thursdays, 5-8 PM in 311 West Hall

3 (C) 2000, The University of Michigan 3 Readings
Textbook:
–Oakes Ch.3: 95-96,
–Oakes Ch.4:
–Oakes Ch.5:
Additional readings:
–Knight "Statistical Machine Translation Workbook" (http://www.clsp.jhu.edu/ws99/)
–McKeown & Radev "Collocations"
–Optional: M&S chapters 4, 5, 6, 13, 14

4 (C) 2000, The University of Michigan 4 Statistical Machine Translation and Language Modeling

5 (C) 2000, The University of Michigan 5 The Noisy Channel Model Source-channel model of communication Parametric probabilistic models of language and translation Training such models

6 (C) 2000, The University of Michigan 6 Statistics
Given f, guess e
e -> encoder (E -> F) -> f -> decoder (F -> E) -> e'
e' = argmax_e P(e|f) = argmax_e P(f|e) P(e)
where P(f|e) is the translation model and P(e) is the language model

7 (C) 2000, The University of Michigan 7 Parametric probabilistic models
Language model (LM):
P(e) = P(e_1, e_2, ..., e_L) = P(e_1) P(e_2|e_1) ... P(e_L|e_1 ... e_L-1)
Deleted interpolation:
P(e_L|e_1 ... e_L-1) ≈ P(e_L|e_L-2, e_L-1)
Translation model (TM)
Alignment: P(f,a|e)
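To make the chain-rule factorization and the deleted-interpolation approximation concrete, here is a minimal Python sketch of a trigram language model; the toy corpus, the lambda weights, and the function names are illustrative assumptions, not anything specified in the handout.

from collections import Counter

def train_trigram_lm(sentences):
    """Count unigrams, bigrams, and trigrams over tokenized sentences."""
    uni, bi, tri = Counter(), Counter(), Counter()
    total = 0
    for sent in sentences:
        toks = ["<s>", "<s>"] + sent + ["</s>"]
        for i in range(2, len(toks)):
            uni[toks[i]] += 1
            bi[(toks[i - 1], toks[i])] += 1
            tri[(toks[i - 2], toks[i - 1], toks[i])] += 1
            total += 1
    return uni, bi, tri, total

def p_interp(w, w1, w2, model, lambdas=(0.6, 0.3, 0.1)):
    """P(w | w1 w2) by deleted interpolation of trigram, bigram, and unigram
    estimates; the lambda weights are placeholders and would normally be
    tuned on held-out data."""
    uni, bi, tri, total = model
    l3, l2, l1 = lambdas
    p3 = tri[(w1, w2, w)] / bi[(w1, w2)] if bi[(w1, w2)] else 0.0
    p2 = bi[(w2, w)] / uni[w2] if uni[w2] else 0.0
    p1 = uni[w] / total if total else 0.0
    return l3 * p3 + l2 * p2 + l1 * p1

model = train_trigram_lm([["the", "cat", "sat"], ["the", "cat", "ate"]])
print(p_interp("sat", "the", "cat", model))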

8 (C) 2000, The University of Michigan 8 IBM's EM-trained models
1. Word translation
2. Local alignment
3. Fertilities
4. Class-based alignment
5. Non-deficient algorithm (avoid overlaps, overflow)
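As a rough illustration of how the first of these models is trained, the following sketch runs EM to estimate word-translation probabilities t(f|e) in the spirit of IBM Model 1; the two-sentence toy corpus, the NULL word, and the fixed iteration count are assumptions made only for this example.

from collections import defaultdict

def train_ibm_model1(corpus, iterations=10):
    """EM training of word-translation probabilities t(f|e) for IBM Model 1.
    corpus is a list of (foreign_tokens, english_tokens) pairs; each English
    sentence is augmented with a NULL token."""
    f_vocab = {f for fs, _ in corpus for f in fs}
    t = defaultdict(lambda: 1.0 / len(f_vocab))      # uniform initialization
    for _ in range(iterations):
        count = defaultdict(float)                   # expected counts c(f, e)
        total = defaultdict(float)                   # expected counts c(e)
        for fs, es in corpus:
            es = ["NULL"] + es
            for f in fs:                             # E-step: expected alignment counts
                z = sum(t[(f, e)] for e in es)
                for e in es:
                    count[(f, e)] += t[(f, e)] / z
                    total[e] += t[(f, e)] / z
        for (f, e) in count:                         # M-step: renormalize
            t[(f, e)] = count[(f, e)] / total[e]
    return t

corpus = [(["la", "maison"], ["the", "house"]),
          (["la", "fleur"], ["the", "flower"])]
t = train_ibm_model1(corpus)
print(t[("maison", "house")])    # converges toward a high value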

9 (C) 2000, The University of Michigan 9 Lexical Semantics and WordNet

10 (C) 2000, The University of Michigan 10 Lexemes, lexicon, sense(s)
Examples:
–Red, n: the color of blood or a ruby
–Blood, n: the red liquid that circulates in the heart, arteries and veins of animals
–Right, adj: located nearer the right hand esp. being on the right when facing the same direction as the observer
Do dictionaries give us definitions?
Meanings of words

11 (C) 2000, The University of Michigan 11 Relations among words
Homonymy:
–Instead, a bank can hold the investments in a custodial account in the client's name.
–But as agriculture burgeons on the east bank, the river will shrink even more.
Other examples: be/bee?, wood/would?
Homophones
Homographs
Applications: spelling correction, speech recognition, text-to-speech
Example: Un ver vert va vers un verre vert. ("A green worm goes toward a green glass.")

12 (C) 2000, The University of Michigan 12 Polysemy
They rarely serve red meat, preferring to prepare seafood, poultry, or game birds.
He served as U.S. ambassador to Norway in 1976 and
He might have served his time, come out and led an upstanding life.
Homonymy: distinct and unrelated meanings, possibly with different etymology (multiple lexemes).
Polysemy: a single lexeme with multiple related meanings.
Example: an "idea bank"

13 (C) 2000, The University of Michigan 13 Synonymy Principle of substitutability How big is this plane? Would I be flying on a large or small plane? Miss Nelson, for instance, became a kind of big sister to Mrs. Van Tassel’s son, Benjamin. ?? Miss Nelson, for instance, became a kind of large sister to Mrs. Van Tassel’s son, Benjamin. What is the cheapest first class fare? ?? What is the cheapest first class cost?

14 (C) 2000, The University of Michigan 14 Semantic Networks Used to represent relationships between words Example: WordNet - created by George Miller’s team at Princeton (http://www.cogsci.princeton.edu/~wn) Based on synsets (synonyms, interchangeable words) and lexical matrices

15 (C) 2000, The University of Michigan 15 Lexical matrix

16 (C) 2000, The University of Michigan 16 Synsets Disambiguation –{board, plank} –{board, committee} Synonyms –substitution –weak substitution –synonyms must be of the same part of speech

17 (C) 2000, The University of Michigan 17
$ ./wn board -hypen
Synonyms/Hypernyms (Ordered by Frequency) of noun board
9 senses of board
Sense 1
board => committee, commission => administrative unit => unit, social unit => organization, organisation => social group => group, grouping
Sense 2
board => sheet, flat solid => artifact, artefact => object, physical object => entity, something
Sense 3
board, plank => lumber, timber => building material => artifact, artefact => object, physical object => entity, something
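The same query can be reproduced programmatically. The sketch below assumes the NLTK interface to WordNet, which is not part of the handout (the slides use the original wn command-line tool), so the package name and download step are assumptions.

# Requires: pip install nltk, then nltk.download('wordnet') once.
from nltk.corpus import wordnet as wn

for s in wn.synsets('board', pos=wn.NOUN):
    print(s.name(), '-', s.definition())
    # Walk up one hypernym chain, mirroring the `wn board -hypen` output above.
    chain = s.hypernym_paths()[0]
    print('  => ' + ' => '.join(h.name() for h in reversed(chain)))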

18 (C) 2000, The University of Michigan 18
Sense 4
display panel, display board, board => display => electronic device => device => instrumentality, instrumentation => artifact, artefact => object, physical object => entity, something
Sense 5
board, gameboard => surface => artifact, artefact => object, physical object => entity, something
Sense 6
board, table => fare => food, nutrient => substance, matter => object, physical object => entity, something

19 (C) 2000, The University of Michigan 19
Sense 7
control panel, instrument panel, control board, board, panel => electrical device => device => instrumentality, instrumentation => artifact, artefact => object, physical object => entity, something
Sense 8
circuit board, circuit card, board, card => printed circuit => computer circuit => circuit, electrical circuit, electric circuit => electrical device => device => instrumentality, instrumentation => artifact, artefact => object, physical object => entity, something
Sense 9
dining table, board => table => furniture, piece of furniture, article of furniture => furnishings => instrumentality, instrumentation => artifact, artefact => object, physical object => entity, something

20 (C) 2000, The University of Michigan 20 Antonymy “x” vs. “not-x” “rich” vs. “poor”? {rise, ascend} vs. {fall, descend}

21 (C) 2000, The University of Michigan 21 Other relations Meronymy: X is a meronym of Y when native speakers of English accept sentences similar to “X is a part of Y”, “X is a member of Y”. Hyponymy: {tree} is a hyponym of {plant}. Hierarchical structure based on hyponymy (and hypernymy).

22 (C) 2000, The University of Michigan 22 Other features of WordNet Index of familiarity Polysemy

23 (C) 2000, The University of Michigan 23 Familiarity and polysemy
board used as a noun is familiar (polysemy count = 9)
bird used as a noun is common (polysemy count = 5)
cat used as a noun is common (polysemy count = 7)
house used as a noun is familiar (polysemy count = 11)
information used as a noun is common (polysemy count = 5)
retrieval used as a noun is uncommon (polysemy count = 3)
serendipity used as a noun is very rare (polysemy count = 1)

24 (C) 2000, The University of Michigan 24 Compound nouns
advisory board, appeals board, backboard, backgammon board, baseboard, basketball backboard, big board, billboard, binder's board, binder board, blackboard, board game, board measure, board meeting, board member, board of appeals, board of directors, board of education, board of regents, board of trustees

25 (C) 2000, The University of Michigan 25 Overview of senses
1. board -- (a committee having supervisory powers; "the board has seven members")
2. board -- (a flat piece of material designed for a special purpose; "he nailed boards across the windows")
3. board, plank -- (a stout length of sawn timber; made in a wide variety of sizes and used for many purposes)
4. display panel, display board, board -- (a board on which information can be displayed to public view)
5. board, gameboard -- (a flat portable surface (usually rectangular) designed for board games; "he got out the board and set up the pieces")
6. board, table -- (food or meals in general; "she sets a fine table"; "room and board")
7. control panel, instrument panel, control board, board, panel -- (an insulated panel containing switches and dials and meters for controlling electrical devices; "he checked the instrument panel"; "suddenly the board lit up like a Christmas tree")
8. circuit board, circuit card, board, card -- (a printed circuit that can be inserted into expansion slots in a computer to increase the computer's capabilities)
9. dining table, board -- (a table at which meals are served; "he helped her clear the dining table"; "a feast was spread upon the board")

26 (C) 2000, The University of Michigan 26 Top-level concepts {act, action, activity} {animal, fauna} {artifact} {attribute, property} {body, corpus} {cognition, knowledge} {communication} {event, happening} {feeling, emotion} {food} {group, collection} {location, place} {motive} {natural object} {natural phenomenon} {person, human being} {plant, flora} {possession} {process} {quantity, amount} {relation} {shape} {state, condition} {substance} {time}

27 (C) 2000, The University of Michigan 27 Information Extraction

28 (C) 2000, The University of Michigan 28 Types of Information Extraction
Template filling
Language reuse
Biographical information
Question answering

29 (C) 2000, The University of Michigan 29 MUC-4 Example
INCIDENT: DATE                  30 OCT 89
INCIDENT: LOCATION              EL SALVADOR
INCIDENT: TYPE                  ATTACK
INCIDENT: STAGE OF EXECUTION    ACCOMPLISHED
INCIDENT: INSTRUMENT ID
INCIDENT: INSTRUMENT TYPE
PERP: INCIDENT CATEGORY         TERRORIST ACT
PERP: INDIVIDUAL ID             "TERRORIST"
PERP: ORGANIZATION ID           "THE FMLN"
PERP: ORG. CONFIDENCE           REPORTED: "THE FMLN"
PHYS TGT: ID
PHYS TGT: TYPE
PHYS TGT: NUMBER
PHYS TGT: FOREIGN NATION
PHYS TGT: EFFECT OF INCIDENT
PHYS TGT: TOTAL NUMBER
HUM TGT: NAME
HUM TGT: DESCRIPTION            "1 CIVILIAN"
HUM TGT: TYPE                   CIVILIAN: "1 CIVILIAN"
HUM TGT: NUMBER                 1: "1 CIVILIAN"
HUM TGT: FOREIGN NATION
HUM TGT: EFFECT OF INCIDENT     DEATH: "1 CIVILIAN"
HUM TGT: TOTAL NUMBER
On October 30, 1989, one civilian was killed in a reported FMLN attack in El Salvador.

30 (C) 2000, The University of Michigan 30 Language reuse
NP (phrase to be reused): [description] Yugoslav President + [entity] Slobodan Milosevic

31 (C) 2000, The University of Michigan 31 NP Example
NP: [entity] Andrija Hebrang [Punc] , [description] The Croatian Defense Minister

32 (C) 2000, The University of Michigan 32 Issues involved
Text generation depends on lexical resources
Lexical choice
Corpus processing vs. manual compilation
Deliberate decisions by writers
Difficult to encode by hand
Dynamically updated (Scott O'Grady)
No full semantic representation

33 (C) 2000, The University of Michigan 33 Named entities Richard Butler met Tareq Aziz Monday after rejecting Iraqi attempts to set deadlines for finishing his work. Yitzhak Mordechai will meet Mahmoud Abbas at 7 p.m. (1600 GMT) in Tel Aviv after a 16-month-long impasse in peacemaking. Sinn Fein deferred a vote on Northern Ireland's peace deal Sunday. Hundreds of troops patrolled Dili on Friday during the anniversary of Indonesia's 1976 annexation of the territory.

34 (C) 2000, The University of Michigan 34 Entities + Descriptions Chief U.N. arms inspector Richard Butler met Iraq’s Deputy Prime Minister Tareq Aziz Monday after rejecting Iraqi attempts to set deadlines for finishing his work. Israel's Defense Minister Yitzhak Mordechai will meet senior Palestinian negotiator Mahmoud Abbas at 7 p.m. (1600 GMT) in Tel Aviv after a 16-month-long impasse in peacemaking. Sinn Fein, the political wing of the Irish Republican Army, deferred a vote on Northern Ireland's peace deal Sunday. Hundreds of troops patrolled Dili, the Timorese capital, on Friday during the anniversary of Indonesia's 1976 annexation of the territory.

35 (C) 2000, The University of Michigan 35 Building a database of descriptions Size of database: 59,333 entities and 193,228 descriptions as of 08/01/98 Text processed: 494 MB (ClariNet, Reuters, UPI) Length: 1-15 lexical items Accuracy: (precision 94%, recall 55%)

36 (C) 2000, The University of Michigan 36 Multiple descriptions per entity
Profile for Ung Huot:
A senior member
Cambodia's
Cambodian foreign minister
Co-premier
First prime minister
Foreign minister
His excellency
Mr.
New co-premier
New first prime minister
Newly-appointed first prime minister
Premier

37 (C) 2000, The University of Michigan 37 Language reuse and regeneration
CONCEPTS + CONSTRAINTS = CONSTRUCTS
Corpus analysis: determining constraints
Text generation: applying constraints

38 (C) 2000, The University of Michigan 38 Language reuse and regeneration
Understanding: full parsing is expensive
Generation: expensive to use full parses
Bypassing certain stages (e.g., syntax)
Not(!) template-based: still requires extraction, analysis, context identification, modification, and generation
Factual sentences, sentence fragments
Reusability of a phrase

39 (C) 2000, The University of Michigan 39 Context-dependent solution
Redefining the relation:
DescriptionOf(E,C) = {D_i,c : D_i,c is a description of E in context C}
If named entity E appears in text and the context is C: insert DescriptionOf(E,C) in the text.
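A toy sketch of this lookup, assuming a hand-built dictionary in place of the real description database and a simple keyword-overlap score for picking the context-appropriate description; the data and the scoring rule are illustrative only.

# Toy description database: entity -> list of (description, context keywords).
DESCRIPTIONS = {
    "Bill Clinton": [
        ("U.S. President", {"foreign", "relations", "treaty"}),
        ("An Arkansas native", {"arkansas", "bomb", "alert"}),
        ("Democratic presidential candidate", {"election", "campaign"}),
    ],
}

def description_of(entity, context_words):
    """Return the stored description whose context keywords overlap most with
    the words surrounding the entity in the current article (a bag of words)."""
    candidates = DESCRIPTIONS.get(entity, [])
    if not candidates:
        return None
    return max(candidates, key=lambda dc: len(dc[1] & set(context_words)))[0]

print(description_of("Bill Clinton", ["election", "poll", "campaign"]))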

40 (C) 2000, The University of Michigan 40 Multiple descriptions per entity
Profile for Bill Clinton:
U.S. President
President
An Arkansas native
Democratic presidential candidate

41 (C) 2000, The University of Michigan 41 Choosing the right description
Bill Clinton / CONTEXT:
U.S. President ........................ foreign relations
President ............................. national affairs
An Arkansas native .................... false bomb alert in AR
Democratic presidential candidate ..... elections
Pragmatic and semantic constraints on lexical choice.

42 (C) 2000, The University of Michigan 42 Semantic information from WordNet
All words contribute to the semantic representation
Only the first sense is used
What is a synset?

43 (C) 2000, The University of Michigan 43 WordNet synset hierarchy
{ } director, manager, managing director
  { } administrator, decision maker
    { } head, chief, top dog
      { } leader
        { } person, individual, someone, somebody, human
          { } life form, organism, being, living thing
            { } entity, something

44 (C) 2000, The University of Michigan 44 Lexico-semantic matrix Profile for Ung Huot

45 (C) 2000, The University of Michigan 45 Choosing the right description
Topic approximation by context: words that appear near the entity in the text (bag)
Name of the entity (set)
Length of article (continuous)
Profile: set of all descriptions for that entity (bag) - parent synset offsets for all words w_i
Semantic information: WordNet synset offsets (bag)

46 (C) 2000, The University of Michigan 46 Choosing the right description
Ripper feature vector [Cohen 1996]: (Context, Entity, Description, Length, Profile, Parent)
Classes

47 (C) 2000, The University of Michigan 47 Example (training)

48 (C) 2000, The University of Michigan 48 Sample rules Total number of rules: 4085 for 100,000 inputs

49 (C) 2000, The University of Michigan 49 Evaluation
35,206 tuples; 11,504 distinct entities; 3.06 descriptions per distinct entity (DDPE)
Training: 90% of corpus (10,353 entities)
Test: 10% of corpus (1,151 entities)

50 (C) 2000, The University of Michigan 50 Evaluation
Rule format (each matching rule adds constraints):
X -> [A] (evidence of A)
Y -> [B] (evidence of B)
X ∧ Y -> [A] [B] (evidence of A and B)
Classes are in 2^W (powerset of WN nodes)
P&R on the constraints selected by the system

51 (C) 2000, The University of Michigan 51 Definition of precision and recall
Model          System      P        R
[A] [B] [C]    [B] [D]     50.0 %   33.3 %
[A] [B] [D]    [B] [D]              66.7 %
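Under the usual set-based definitions (precision over the constraints the system selects, recall over the constraints in the model), a small helper reproduces the 50.0% / 33.3% row, taking Model = [A] [B] [C] and System = [B] [D] as one consistent reading of the table above.

def precision_recall(model, system):
    """Set-based precision and recall over selected constraints:
    precision = |model & system| / |system|, recall = |model & system| / |model|."""
    model, system = set(model), set(system)
    overlap = len(model & system)
    p = overlap / len(system) if system else 0.0
    r = overlap / len(model) if model else 0.0
    return p, r

print(precision_recall({"A", "B", "C"}, {"B", "D"}))   # (0.5, 0.333...)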

52 (C) 2000, The University of Michigan 52 Precision and recall

53 (C) 2000, The University of Michigan 53 Question Answering

54 (C) 2000, The University of Michigan 54 Question answering
Q: When did Nelson Mandela become president of South Africa?
A: 10 May 1994
Q: How tall is the Matterhorn?
A: The institute revised the Matterhorn's height to 14,776 feet 9 inches
Q: How tall is the replica of the Matterhorn at Disneyland?
A: In fact he has climbed the 147-foot Matterhorn at Disneyland every weekend for the last 3 1/2 years
Q: If Iraq attacks a neighboring country, what should the US do?
A: ??

55 (C) 2000, The University of Michigan 55

56 (C) 2000, The University of Michigan 56 The TREC evaluation
Document retrieval
Eight years
Information retrieval?
Corpus: texts and questions

57 (C) 2000, The University of Michigan 57 Prager et al (SIGIR) Radev et al (ANLP/NAACL)

58 (C) 2000, The University of Michigan 58

59 (C) 2000, The University of Michigan 59

60 (C) 2000, The University of Michigan 60

61 (C) 2000, The University of Michigan 61

62 (C) 2000, The University of Michigan 62 Features (1)
Number: position of the span among all spans returned. Example: "Lou Vasquez" was the first span returned by GuruQA on the sample question.
Rspanno: position of the span among all spans returned within the current passage.
Count: number of spans of any span class retrieved within the current passage.
Notinq: the number of words in the span that do not appear in the query. Example: Notinq("Woodbridge high school") = 1, because both "high" and "school" appear in the query while "Woodbridge" does not. It is set to -100 when the actual value is 0.

63 (C) 2000, The University of Michigan 63 Features (2)
Type: the position of the span type in the list of potential span types. Example: Type("Lou Vasquez") = 1, because the span type of "Lou Vasquez", namely "PERSON", appears first in the SYN-set "PERSON ORG NAME ROLE".
Avgdst: the average distance in words between the beginning of the span and the words in the query that also appear in the passage. Example: given the passage "Tim O'Donohue, Woodbridge High School's varsity baseball coach, resigned Monday and will be replaced by assistant Johnny Ceballos, Athletic Director Dave Cowen said." and the span "Tim O'Donohue", the value of avgdst is equal to 8.
Sscore: passage relevance as computed by GuruQA.

64 (C) 2000, The University of Michigan 64 Combining evidence
TOTAL(span) = -0.3 * number - 0.5 * rspanno + (…) * count + (…) * notinq - 15.0 * types - 1.0 * avgdst + (…) * sscore
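A sketch of this linear combination as a function. The coefficients for count, notinq, and sscore did not survive in the transcript, so all weights are passed in explicitly; apart from -0.3, -0.5, -15.0, and -1.0, the numbers below are placeholders.

def total_score(span, weights):
    """Weighted sum of span features in the spirit of the TOTAL formula above;
    span maps feature name -> value, weights maps feature name -> coefficient."""
    return sum(weights[name] * span[name] for name in weights)

span = {"number": 1, "rspanno": 1, "count": 2, "notinq": 1,
        "types": 1, "avgdst": 8, "sscore": 0.7}
weights = {"number": -0.3, "rspanno": -0.5, "count": 3.0, "notinq": 2.0,
           "types": -15.0, "avgdst": -1.0, "sscore": 1.0}   # count/notinq/sscore weights are hypothetical
print(total_score(span, weights))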

65 (C) 2000, The University of Michigan 65 Extracted text

66 (C) 2000, The University of Michigan 66 Results
(Results for the … bytes and 250 bytes answer-length conditions)

67 (C) 2000, The University of Michigan 67 Style and Authorship Analysis

68 (C) 2000, The University of Michigan 68 Style and authorship analysis
Use of nouns, verbs…
Use of rare words
Positional and contextual distribution
Use of alternatives: "and/also", "since/because", "scarcely/hardly"

69 (C) 2000, The University of Michigan 69 Sample problem
15th-century Latin work "De Imitatione Christi"
Was it written by Thomas a Kempis or Jean Charlier de Gerson?
Answer: by Kempis
Why?

70 (C) 2000, The University of Michigan 70 Yule's K characteristic
Vocabulary richness: measure of the probability that any randomly selected pair of words will be identical
K = 10,000 x (M2 - M1) / (M1 x M1)
M1, M2 - distribution moments
M1 - total number of usages (words including repetitions)
M2 - sum, over each frequency group from 1 to the maximum word frequency, of the number of vocabulary words in that group multiplied by the square of the frequency

71 (C) 2000, The University of Michigan 71 Example
Text consisting of 12 words, where two of the words occur once, two occur twice, and two occur three times.
M0 = 6 (vocabulary size)
M1 = 12
M2 = (2 x 1^2) + (2 x 2^2) + (2 x 3^2) = 2 + 8 + 18 = 28
K = 10,000 x (28 - 12) / (12 x 12) ≈ 1111
K increases as the diversity of the vocabulary decreases.
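The computation above can be checked with a few lines of Python; the token list simply realizes the 12-word example, and the helper name is arbitrary.

from collections import Counter

def yules_k(tokens):
    """Yule's K = 10,000 * (M2 - M1) / M1^2, where M1 is the number of tokens and
    M2 sums, over each frequency f, the number of words occurring f times, times f^2."""
    freqs = Counter(tokens)
    m1 = sum(freqs.values())
    m2 = sum(n * f * f for f, n in Counter(freqs.values()).items())
    return 10000.0 * (m2 - m1) / (m1 * m1)

# Two words occur once, two twice, two three times: 12 tokens in all.
tokens = ["a", "b"] + ["c", "d"] * 2 + ["e", "f"] * 3
print(yules_k(tokens))   # 10,000 * (28 - 12) / 144 ≈ 1111.1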

72 (C) 2000, The University of Michigan 72 Example (cont'd)
Three criteria used:
–total vocabulary size
–frequency distribution of the different words
  –Yule's K
  –the mean frequency of the words in the sample
–the number of nouns unique to a particular sample
Pearson's coefficient used

73 (C) 2000, The University of Michigan 73 Federalist papers
Published in … to persuade the population of New York state to ratify the new American constitution
Published under the pseudonym Publius; the three authors were James Madison, John Jay, and Alexander Hamilton
Before dying in a duel, Hamilton claimed some portion of the essays
It was agreed that Jay wrote 5 essays, Hamilton 43, Madison …
Three others were jointly written by Hamilton and Madison, and 12 were disputed

74 (C) 2000, The University of Michigan 74 Method
Mosteller and Wallace (1963) used Bayesian statistics to determine which papers were written by whom.
The authors had tried to imitate each other, so sentence length and other easily imitable features are not useful.
Madison and Hamilton were found to vary in their use of "by" (H) and "to" (M), and of "enough" (H) and "whilst" (M).
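A minimal sketch of this kind of marker-word profiling, assuming plain per-1,000-word rates rather than Mosteller and Wallace's full Bayesian treatment; the tokenizer and the sample sentence are illustrative.

import re

MARKERS = ["by", "to", "enough", "whilst"]   # discriminators noted above

def marker_rates(text, markers=MARKERS):
    """Rate of each marker word per 1,000 words of text."""
    words = re.findall(r"[a-z']+", text.lower())
    return {m: 1000.0 * words.count(m) / len(words) for m in markers}

# Rates computed over essays of known authorship give per-author profiles
# that can then be compared against the disputed essays.
print(marker_rates("The powers delegated by the proposed Constitution to the "
                   "federal government are few and defined."))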

75 (C) 2000, The University of Michigan 75 Cluster Analysis

76 (C) 2000, The University of Michigan 76 Clustering Idea: find similar objects and group them together Examples: –all news stories on the same topic –all documents from the same genre or language Types of clustering: classification (tracking) and categorization (detection)

77 (C) 2000, The University of Michigan 77 Non-hierarchical clustering
Concept of a centroid
Document/centroid similarity
Other parameters:
–number of clusters
–maximum and minimum size for each cluster
–vigilance parameter
–overlap between clusters

78 (C) 2000, The University of Michigan 78 Hierarchical clustering
Similarity matrix (expensive: the SIM matrix needs to be updated after every iteration)
Average linkage method
Dendrograms

79 (C) 2000, The University of Michigan 79 Introduction Abundance of newswire on the Web Multiple sources reporting on the same event Multiple modalities (speech, text) Summarization and filtering

80 (C) 2000, The University of Michigan 80 Introduction
TDT participation: topic detection and tracking
–CIDR
Multi-document summarization
–statistical, domain-dependent
–knowledge-based (SUMMONS)

81 (C) 2000, The University of Michigan 81 Topics and events
Topic = event (single act) or activity (ongoing action)
Defined by content, time, and place of occurrence [Allan et al. 1998, Yang et al. 1998]
Examples:
–Marine fighter pilot's plane cuts cable in Italian Alps (February 3, 1998)
–Eduard Shevardnadze assassination attempt (February 9, 1998)
–Jonesboro shooting (March 24, 1998)

82 (C) 2000, The University of Michigan 82 TDT overview Event detection: monitoring a continuous stream of news articles and identifying new salient events Event tracking: identifying stories that belong to predefined event topics [Story segmentation: identifying topic boundaries]

83 (C) 2000, The University of Michigan 83 The TDT-2 corpus Corpus described in [ Doddington et al. 1999, Cieri et al ] One hundred topics, 54K stories, 6 sources Two newswire sources (AP, NYT); 2 TV stations (ABC, CNN-HN); 2 radio stations (PRI, VOA) 11 participants (4 industrial sites, 7 universities)

84 (C) 2000, The University of Michigan 84 Detection conditions Default: –Newswire + Audio - automatic transcription –Deferral period of 10 source files –Given boundaries for ASR

85 (C) 2000, The University of Michigan 85 Description of the system
Single-pass clustering algorithm
Normalized, tf*idf-modified, cosine-based similarity between document and centroid
Detection only, standard evaluation conditions, no deferral

86 (C) 2000, The University of Michigan 86 Research problems
Focus on speedup
Search space of five experimental parameters
Tradeoffs between parallelization and accuracy

87 (C) 2000, The University of Michigan 87 Vector-based representation
(Figure: document and centroid vectors in term space (Term 1, Term 2, Term 3), with the angle θ between them)

88 (C) 2000, The University of Michigan 88 Vector-based matching
The cosine measure:
sim(D,C) = Σ_k (d_k · c_k · idf(k)) / sqrt( Σ_k (d_k)^2 · Σ_k (c_k)^2 )
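A direct translation of this measure into Python, assuming documents and centroids are stored as term-to-weight dictionaries and that idf enters only the numerator, as in the reconstructed formula above.

import math

def cosine_sim(doc, centroid, idf):
    """tf*idf-modified cosine similarity between a document vector and a centroid,
    following sim(D,C) above; doc and centroid map term -> tf weight."""
    num = sum(doc[k] * centroid[k] * idf.get(k, 0.0) for k in doc if k in centroid)
    den = math.sqrt(sum(v * v for v in doc.values()) *
                    sum(v * v for v in centroid.values()))
    return num / den if den else 0.0

print(cosine_sim({"iraq": 2, "weapons": 1}, {"iraq": 1, "inspections": 1},
                 {"iraq": 2.0, "weapons": 3.0, "inspections": 1.5}))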

89 (C) 2000, The University of Michigan 89 Description of the system
(Figure: the document/centroid similarity sim is compared against the threshold T)

90 (C) 2000, The University of Michigan 90 Description of the system
(Figure: if sim > T the document is added to the existing cluster; if sim < T it starts a new cluster)
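Putting the threshold test together with a document/centroid similarity gives a single-pass clusterer along these lines; the centroid update (plain accumulation of term weights) and the data structures are simplifications for illustration, not CIDR's actual implementation.

def single_pass_cluster(documents, threshold, similarity):
    """Single-pass clustering: each document joins the most similar existing
    cluster if sim > T, otherwise it starts a new cluster."""
    clusters = []                                    # list of (centroid, members)
    for doc in documents:
        best, best_sim = None, threshold
        for cluster in clusters:
            s = similarity(doc, cluster[0])
            if s > best_sim:
                best, best_sim = cluster, s
        if best is None:
            clusters.append((dict(doc), [doc]))      # new cluster seeded by the doc
        else:
            best[1].append(doc)
            for term, w in doc.items():              # crude centroid update
                best[0][term] = best[0].get(term, 0.0) + w
    return clusters

# Example with a simple term-overlap similarity (the cosine sketch above works too).
docs = [{"iraq": 1, "inspections": 1}, {"iraq": 1, "weapons": 1}, {"soccer": 1, "cup": 1}]
overlap = lambda d, c: len(set(d) & set(c)) / len(set(d) | set(c))
print(len(single_pass_cluster(docs, 0.2, overlap)))   # 2 clusters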

91 (C) 2000, The University of Michigan 91 Centroid size

92 (C) 2000, The University of Michigan 92 Centroid size

93 (C) 2000, The University of Michigan 93 Centroid size

94 (C) 2000, The University of Michigan 94 Parameter space
Similarity:
–DECAY: number of words at the beginning of the document considered in computing vector similarities ( )
–IDF: minimum idf value for a word to be considered (1 - 10)
–SIM: similarity threshold ( )
Centroids:
–KEEPI: keep all words whose tf*idf scores are above a certain threshold (1 - 10)
–KEEP: keep at least that many words in the centroid (1 - 50)

95 (C) 2000, The University of Michigan 95 Parameter selection (dev-test)

96 (C) 2000, The University of Michigan 96 Cluster stability

97 (C) 2000, The University of Michigan 97 Parallelization

98 (C) 2000, The University of Michigan 98 Parallelization C(P)

99 (C) 2000, The University of Michigan 99 Parallelization

100 (C) 2000, The University of Michigan 100 Parallelization

101 (C) 2000, The University of Michigan 101 Evaluation principles
C_Det(R,H) = C_Miss · P_Miss(R,H) · P_topic + C_FalseAlarm · P_FalseAlarm(R,H) · (1 - P_topic)
C_Miss = 1
C_FalseAlarm = 1
P_Miss(R,H) = N_Miss(R,H) / |R|
P_FalseAlarm(R,H) = N_FalseAlarm(R,H) / |S - R|
P_topic = 0.02 (a priori probability)
R - set of stories in a reference target topic
H - set of stories in a system-defined topic
S - set of stories to be scored in the evaluation corpus
Task: determine H(R) = argmin_H { C_Det(R,H) }
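The cost is easy to compute once R, H, and S are represented as sets of story identifiers; this sketch uses the constants given above, while the example topic and corpus are made up.

def detection_cost(R, H, S, c_miss=1.0, c_fa=1.0, p_topic=0.02):
    """TDT detection cost C_Det(R,H) for a reference topic R and a system-defined
    topic H, both drawn from the scored story set S (all sets of story ids)."""
    R, H, S = set(R), set(H), set(S)
    p_miss = len(R - H) / len(R) if R else 0.0
    p_fa = len(H - R) / len(S - R) if len(S - R) else 0.0
    return c_miss * p_miss * p_topic + c_fa * p_fa * (1.0 - p_topic)

# A 4-story reference topic; the system cluster catches 3 of them plus 1 extra story.
print(detection_cost({1, 2, 3, 4}, {2, 3, 4, 9}, set(range(1, 101))))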

102 (C) 2000, The University of Michigan 102 Official results

103 (C) 2000, The University of Michigan 103 Results

104 (C) 2000, The University of Michigan 104 Novelty detection
German court convicts Vogel of extortion
BERLIN, Jan 9 (Reuter) - A German court on Tuesday convicted Wolfgang Vogel, the East Berlin lawyer famous for organising Cold War spy swaps, on charges that he extorted money from would-be East German emigrants. The Berlin court gave him a two-year suspended jail sentence and a fine -- less than the 3 3/8 years prosecutors had sought.
East German spy-swap lawyer convicted of extortion
BERLIN (Reuter) - The East Berlin lawyer who became famous for engineering Cold War spy swaps, Wolfgang Vogel, was convicted by a German court Tuesday of extorting money from East German emigrants eager to flee to the West. Vogel, a close confidant of former East German leader Erich Honecker and one of the Soviet bloc's rare millionaires, was found guilty of perjury, four counts of blackmail and five counts of falsifying documents. The Berlin court gave him the two-year suspended sentence and a $63,500 fine. Prosecutors had pressed for a jail sentence of 3 3/8 years and a $215,000 penalty...

