
1 A Parse-and-Trim Approach with Information Significance for Chinese Sentence Compression

2
 • Motivation & Previous Work
 • Sentence Compression Approach
 • Linguistically-motivated Heuristics
 • Word Significance
 • Compression Generation and Selection
 • Experiment Results
 • Conclusions & Future Work

3
 • no Chinese parallel corpus is available
 • a sentence/compression parallel corpus is hard to create

4 An example of system output
 [Original] 第四种子乔科维奇退赛, 让原以三比六, 六比一, 四比一领先的第二种子纳达尔获胜过关.
 (Fourth seed Djokovic withdrew from the game, and allowed second seed Nadal, who was leading 3-6, 6-1, 4-1, to claim the victory and progress through.)
 [Human] 乔科维奇退赛让纳达尔获胜过关.
 (Djokovic withdrew from the game, and allowed Nadal to claim the victory and progress through.)
 [Approach 1] 乔科维奇退赛.
 (Djokovic withdrew from the game.)
 [Approach 2] 乔科维奇退赛让种子纳达尔获胜过关.
 (Djokovic withdrew from the game, and allowed seed Nadal to claim the victory and progress through.)

5 Previous work
 • non-corpus-based
   ▪ parse tree trimming: Dorr 2003 (headline generation)
   ▪ sentence scoring: Hori 2003 (Japanese speech), Clarke 2006, Clarke 2008
 • supervised learning (needs a paraphrasing corpus)
   ▪ noisy channel: Knight 2002, Turner 2005, Galley 2007
   ▪ decision tree: Knight 2002, Nguyen 2004
   ▪ large margin learning: McDonald 2006, Cohn 2007, Cohn 2008
 • unsupervised learning
   ▪ MaxEnt: Riezler 2003
 • application areas: sentence compression, headline generation

6 Parse Tree Trimming (Dorr et al. 2003)
 • linguistically-motivated heuristics
 • hand-made rules remove low-content components
 • trims iteratively until the desired length is reached
 • reduces the risk of deleting important information by applying rules in a fixed order, from safe rules (DT, TIME) through more dangerous rules (CONJ) to the most dangerous rules (PP); a minimal sketch follows
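To make the ordered-rule idea concrete, here is a minimal sketch (the rule labels in RULE_ORDER and the toy example tree are invented for illustration; Dorr et al.'s actual rule set is richer):

```python
# Hedged sketch of Dorr-style ordered trimming: apply the safest rule
# class first and re-check the length after every deletion, so riskier
# deletions happen only when the sentence is still too long.
from nltk import Tree

# Rule classes ordered from safe to dangerous (assumed labels).
RULE_ORDER = [("DT", "NP-TMP"), ("CC",), ("PP",)]

def trim(tree: Tree, max_words: int) -> Tree:
    for labels in RULE_ORDER:
        changed = True
        while changed and len(tree.leaves()) > max_words:
            changed = False
            for parent in tree.subtrees():
                for i, child in enumerate(parent):
                    if isinstance(child, Tree) and child.label() in labels:
                        del parent[i]  # drop the low-content node
                        changed = True
                        break
                if changed:
                    break
    return tree

sent = Tree.fromstring(
    "(S (NP (DT the) (NN company)) (VP (VBD bought) (NP (DT a) (NN rival))"
    " (PP (IN in) (NP (NNP March)))))")
print(" ".join(trim(sent, 4).leaves()))  # -> company bought rival
```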

7 Parse Tree Trimming (Dorr et al. 2003)
 • Pros:
   ▪ comparatively good performance
   ▪ retains grammaticality if the parse is correct
 • Cons:
   ▪ requires considerable linguistic skill to produce proper rules in the proper order
   ▪ sensitive to POS and parsing errors
   ▪ not flexible enough to preserve informative components

8 Sentence Scoring (Hori & Furui 2004)
 • improved by Clarke & Lapata in 2006
 • given an input sentence W = w1, w2, ..., wn, rank the possible compressions
 • language model + word significance
 • Score(C) = p1 · I(C) + p2 · L(C) + p3 · S(C), where for a compression C: I(C) is the word-significance score summed over the words of C, L(C) is the language-model score of C, and S(C) is the subject-object-verb score of the words of C (a sketch follows)
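A minimal sketch of this linear combination; the component scorers and the weights p1-p3 below are illustrative stand-ins, not Hori & Furui's actual models (the next slide notes that the real weights are tuned experimentally or estimated from a parallel corpus):

```python
# Hedged sketch: score a candidate compression as a weighted sum of
# word significance, a bigram language model, and an SOV bonus.
def score_compression(words, significance, bigram_logprob, sov_bonus,
                      p1=1.0, p2=1.0, p3=1.0):
    i_score = sum(significance.get(w, 0.0) for w in words)
    l_score = sum(bigram_logprob(a, b) for a, b in zip(words, words[1:]))
    s_score = sum(sov_bonus.get(w, 0.0) for w in words)
    return p1 * i_score + p2 * l_score + p3 * s_score

# toy usage with hand-made tables
sig = {"Djokovic": 3.0, "withdrew": 2.0}
lm = lambda a, b: -1.0  # stand-in for a real bigram log-probability
sov = {"Djokovic": 0.5}
print(score_compression(["Djokovic", "withdrew"], sig, lm, sov))  # -> 4.5
```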

9 Sentence Scoring (Hori & Furui 2004)
 • language model
 • word significance
 • Pros:
   ▪ does not rely heavily on a training corpus
 • Cons:
   ▪ the weighting parameters must be optimized experimentally or estimated from a parallel corpus
   ▪ only the language model encourages compression and ensures grammaticality

10 Combine
 • Linguistically-motivated Heuristics
   ▪ ensure grammaticality
   ▪ rules are easier to develop: they only flag possible low-content components instead of selecting specific constituents for removal
 • Information Significance Scoring
   ▪ preserves the most important information
   ▪ enhances tolerance of POS and parsing errors

11 Combined Approach: Heuristics + Information Significance
 • use heuristics to determine potentially low-content constituents
 • perform the actual deletion according to word significance

12
 1. take a Chinese Treebank-style parse as input
 2. use linguistically-motivated heuristics to determine potentially removable constituents
 3. generate a series of candidate compressions by deleting removable nodes based on word significance
 4. select the best compression according to information density
 (a sketch of the whole pipeline follows)
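A runnable skeleton of the four steps; the three helpers are deliberately simplistic stand-ins (the flagged labels, the toy significance table, and the example tree are all assumptions, with the real components detailed on the following slides):

```python
from nltk import Tree

def removable_labels(tree):
    # step 2 (stub): flag a few low-content labels; the real heuristics
    # are listed on the next slides
    return {"ADJP", "DNP", "PP"}

def generate_candidates(tree, removable):
    # step 3 (stub): peel off one flagged node at a time
    cands = [tree]
    work = tree.copy(deep=True)
    while True:
        hit = next((n for n in work.subtrees() if n.label() in removable), None)
        if hit is None:
            return cands
        for parent in work.subtrees():  # locate the parent by identity
            idx = [i for i, c in enumerate(parent) if c is hit]
            if idx:
                del parent[idx[0]]
                break
        cands.append(work.copy(deep=True))

def information_density(tree):
    # step 4 (stub): toy hand-made significance table, mass per word
    SIG = {"买家": 3.0, "出现": 2.0}
    leaves = tree.leaves()
    return sum(SIG.get(w, 0.0) for w in leaves) / max(len(leaves), 1)

def compress(parse):  # steps 1-4 end to end
    cands = generate_candidates(parse, removable_labels(parse))
    return " ".join(max(cands, key=information_density).leaves())

t = Tree.fromstring("(IP (NP (ADJP (JJ 潜在)) (NN 买家)) (VP (VV 出现)))")
print(compress(t))  # -> 买家 出现
```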

13 Combined Approach: Heuristics + Information Significance
 Used to determine potentially low-content constituents (a labeling sketch follows this list)
 • Basic (same in both approaches):
   ▪ parenthetical elements
   ▪ adverbs except negatives
   ▪ adjectives
   ▪ DNPs (phrase + "的", modifiers of NP)
   ▪ DVPs (phrase + "地", modifiers of VP)
   ▪ noun coordination phrases
 • Complex (more relaxed, general):
   ▪ verb coordination phrases
   ▪ relative clauses
   ▪ appositive clauses
   ▪ prepositional phrases
   ▪ all children of NP nodes except the last noun word
   ▪ sentential coordination
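A sketch of the marking step, using nltk and Chinese Treebank labels; the label set below covers only a few of the rules above (PRN for parentheticals, ADVP with 不 standing in for the negatives) and is an assumption, not the paper's exact implementation:

```python
# Hedged sketch: mark, but do not yet delete, candidate constituents.
from nltk import Tree

REMOVABLE = {"ADJP", "DNP", "DVP", "PP", "PRN"}

def mark_removable(tree):
    """Return the nodes that the heuristics would allow deleting."""
    marked = []
    for node in tree.subtrees():
        if node.label() in REMOVABLE:
            marked.append(node)
        elif node.label() == "ADVP" and "不" not in node.leaves():
            marked.append(node)  # adverbs, except negatives
    return marked

tree = Tree.fromstring(
    "(IP (NP (NP (NR 韩国)) (ADJP (JJ 现代)) (NP (NN 汽车) (NN 公司)))"
    " (VP (VC 是) (NP (DNP (NP (NR 沃尔沃)) (DEG 的)) (ADJP (JJ 潜在))"
    " (NP (NN 买家)))))")
print([" ".join(n.leaves()) for n in mark_removable(tree)])
# -> ['现代', '沃尔沃 的', '潜在']
```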

14 Heuristics-only Approach
 Used to remove specific low-content constituents
 • Basic (same in both approaches):
   ▪ parenthetical elements
   ▪ adverbs except negatives
   ▪ adjectives
   ▪ DNPs (phrase + "的", modifiers of NP)
   ▪ DVPs (phrase + "地", modifiers of VP)
   ▪ noun coordination phrases
 • Complex (more strict, conservative):
   ▪ all children of NP nodes except temporal nouns, proper nouns, and the last noun word
   ▪ all simple clauses (IP) in sentential coordination except the first
   ▪ prepositional phrases, except those that may contain location or date information according to a hand-made list of prepositions

15 An example of applying heuristics
 *: nodes labeled as removable by the combined approach
 #: nodes trimmed out by the heuristics-only approach
 ( (IP (NP (*NP (NR 韩国)) (#*ADJP (JJ 现代)) (NP (#*NN 汽车) (NN 公司)))
      (VP (VC 是) (NP (#*DNP (NP (NR 沃尔沃)) (DEG 的)) (#*ADJP (JJ 潜在)) (NP (NN 买家))))
      (PU .)))
 ( (IP (NP (*NP (NR South Korean)) (#*ADJP (JJ Hyundai)) (NP (#*NN motor) (NN company)))
      (VP (VC is) (NP (#*DNP (NP (NR Volvo)) (DEG 's)) (#*ADJP (JJ potential)) (NP (NN buyer))))
      (PU .)))
 POS error: 现代 (Hyundai) is a proper noun but is tagged JJ (adjective)

16 An example of applying heuristics (continued)
 Same tree as on slide 15; here the callout highlights the nodes marked #, i.e. those trimmed out by the heuristics-only approach: 现代 (Hyundai), 汽车 (motor), 沃尔沃的 (Volvo's), and 潜在 (potential).

17 Event-based Word Significance Score
 • verb or common noun: tf-idf
 • proper noun: tf-idf + w
 • otherwise: 0
 • the scores produce a weighted parse tree
 • the weight depends on the word itself rather than its POS tag, which overcomes some POS errors (a sketch follows)
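A sketch of this scoring rule; the tf-idf inputs and the proper-noun bonus w are invented numbers, and the Chinese Treebank tags VV/NN/NR stand for verb, common noun, and proper noun:

```python
# Hedged sketch of the event-based word significance score.
import math

def significance(word, pos, tf, df, n_docs, w=2.0):
    tfidf = tf * math.log(n_docs / df)
    if pos in ("VV", "NN"):  # verbs and common nouns
        return tfidf
    if pos == "NR":          # proper nouns get the extra bonus w
        return tfidf + w
    return 0.0               # all other words carry no weight

print(significance("沃尔沃", "NR", tf=3, df=10, n_docs=1000))  # ≈ 15.82
```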

18 Generate a series of candidate compressions
 • repeatedly trim the weighted parse tree with a greedy algorithm (sketched below):
   ▪ remove the removable node with the lowest weight, yielding a candidate compressed sentence
   ▪ update the weights of all ancestors of the removed node
   ▪ repeat until no node is removable
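A sketch of the greedy loop; node weights are assumed to be the summed significance of the words a node spans, recomputed from the leaves on every pass (which makes the ancestor-update step implicit), the significance table is hand-made, and the root is assumed never to be flagged as removable:

```python
# Hedged sketch of greedy candidate generation.
from nltk import Tree

def node_weight(node, sig):
    return sum(sig.get(w, 0.0) for w in node.leaves())

def candidates(tree, removable, sig):
    """Yield the input sentence, then successively shorter compressions."""
    tree = tree.copy(deep=True)
    yield " ".join(tree.leaves())
    while True:
        live = [n for n in tree.subtrees() if n.label() in removable]
        if not live:
            return
        victim = min(live, key=lambda n: node_weight(n, sig))
        for parent in tree.subtrees():  # locate the parent by identity
            idx = [i for i, c in enumerate(parent) if c is victim]
            if idx:
                del parent[idx[0]]
                break
        yield " ".join(tree.leaves())

sig = {"现代": 6.0, "沃尔沃": 8.0, "潜在": 0.5}
tree = Tree.fromstring(
    "(IP (NP (NP (NR 韩国)) (ADJP (JJ 现代)) (NP (NN 公司)))"
    " (VP (VC 是) (NP (DNP (NP (NR 沃尔沃)) (DEG 的)) (ADJP (JJ 潜在))"
    " (NP (NN 买家)))))")
for c in candidates(tree, {"ADJP", "DNP"}, sig):
    print(c)  # drops 潜在, then 现代, then 沃尔沃 的
```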

19 Information Density
 • used to select the best compression among the candidates (a hedged sketch follows)
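The deck does not spell out the density formula, so the sketch below is only a guess for illustration: total word significance divided by sentence length, so that dropping a low-content word raises the density while dropping an informative word lowers it (the paper's actual definition may differ):

```python
# Assumed definition, for illustration only: significance mass per word.
def information_density(words, sig):
    return sum(sig.get(w, 0.0) for w in words) / max(len(words), 1)

sig = {"公司": 2.0, "沃尔沃": 8.0, "买家": 3.0}
cands = [["公司", "是", "沃尔沃", "的", "买家"],  # keeps Volvo
         ["公司", "是", "买家"]]                   # drops Volvo
best = max(cands, key=lambda c: information_density(c, sig))
print(best)  # -> ['公司', '是', '沃尔沃', '的', '买家']
```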

20 Information Density
  D(s)   Sentence
  0.254  韩国现代汽车公司是沃尔沃的潜在买家. (The South Korean Hyundai Motor Company is a potential buyer of Volvo.)
  0.288  韩国现代汽车公司是沃尔沃的买家. (The South Korean Hyundai Motor Company is a buyer of Volvo.)
  0.332  韩国现代公司是沃尔沃的买家. (The South Korean Hyundai Company is a buyer of Volvo.)
  0.282  韩国公司是沃尔沃的买家. (The South Korean company is a buyer of Volvo.)
  0.209  公司是沃尔沃的买家. (The company is a buyer of Volvo.)
  0.000  公司是买家. (The company is a buyer.)

21
 • 79 documents from Chinese newswires; the first sentence of each news article
 • a challenging task: headline-like compression
 • average sentence length: 61.5 characters
 • a first sentence often connects two or more self-complete sentences

22 Human evaluation
                 Compression Rate   Grammaticality (1-5)   Informativeness (0-100%)
 Human                38.5%              4.962                  90.7%
 Heuristics           54.1%              4.114                  64.9%
 Heu+Sig              52.8%              3.854 *                68.8% **
 Heu+Sig+L ***        34.3%              3.664                  56.1%
 * the combined approach sacrifices some grammaticality to reduce the linguistic complexity of the heuristics
 ** word significance improves on the heuristics in informativeness
 *** with varying length constraints, depending on original sentence length

23 Compressions with good grammar
 • perform well on most cases
 • perform badly on about 20 of the 76 cases
   ▪ POS or parsing errors
   ▪ grammatically correct but semantically incorrect
              Grammaticality (1-5)   Number of Sentences   Informativeness (0-100%)
 Heuristics        > 4.5                   45                    75.9%
 Heuristics        >= 4                    62                    ---
 Heu+Sig           > 4.5                   35                    81.8%
 Heu+Sig           >= 4                    57                    ---

24
 • first attempt in Chinese
 • heuristics ensure grammaticality
 • word significance controls word deletion, balancing sentence length against information loss
 • Pros:
   ▪ does not rely on a parallel corpus
   ▪ reduces the complexity of composing heuristics
   ▪ extends easily to other languages and domains
   ▪ overcomes some POS and parsing errors
   ▪ competitive with a finely-tuned heuristics-only approach

25
 • applications in summarization and headline generation
 • keyword selection and weighting
 • language model
 • a parallel corpus in Chinese
 • statistical and machine-learning methods

26 A Parse-and-Trim Approach with Information Significance for Chinese Sentence Compression

