Page 1 Generalized Inference with Multiple Semantic Role Labeling Systems Peter Koomen, Vasin Punyakanok, Dan Roth, (Scott) Wen-tau Yih Department of Computer Science University of Illinois at Urbana-Champaign
Page 2 Outline
- System Architecture: Pruning, Argument Identification, Argument Classification, Inference [main difference from other systems]
- Inference with Multiple Systems: the same approach the SRL system uses to ensure a coherent output is applied to input produced by multiple systems
Page 3 System Architecture
- Pruning: identify argument candidates
- Argument Identifier: classify argument candidates (binary classification)
- Argument Classifier: multi-class classification
- Inference: use the estimated probability distribution given by the argument classifier, together with expressive structural and linguistic constraints, to infer the optimal global output, modeled as a constrained optimization problem
Page 4 Pruning [Xue & Palmer, 2004]
- Significant errors due to PP attachment
- Consider the PP as attached to both the NP and the VP
- [Table: development-set Prec, Rec, and F1 for Gold and Charniak parses]
Page 5 Modified Pruning
- [Table: development-set Prec, Rec, and F1 for Gold, Charniak, and Charniak with the modified heuristic]
Page 6 Argument Identification
- The argument identifier is trained as a phrase-based binary classifier
- Learning algorithm: SNoW, a sparse network of linear classifiers
- Weight update: a regularized variation of the Winnow multiplicative update rule
- When probability estimation is needed, we use softmax
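The softmax step that turns classifier activations into a probability distribution can be sketched as follows (a generic sketch, not SNoW's actual code; the function name and inputs are illustrative):

```python
import math

def softmax(activations):
    # Subtract the max activation before exponentiating, for numerical stability.
    m = max(activations)
    exps = [math.exp(a - m) for a in activations]
    total = sum(exps)
    return [e / total for e in exps]

# Example: three class activations become a normalized distribution
# whose entries sum to 1 and preserve the activation ordering.
probs = softmax([2.0, 1.0, 0.0])
```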
Page 7 Argument Identification (Features)
- Parse tree structure from Collins' and Charniak's parsers
- Clauses, chunks, and POS tags from the UPC processors
Page 8 Argument Classification
- Similar to argument identification, using SNoW as a multi-class classifier
- Classes also include NULL
Page 9 Inference
- Occasionally, the output of the argument classifier violates some constraints
- The inference procedure [Punyakanok et al., 2004]
- Input: the probability estimates from the argument classifier, plus structural and linguistic constraints
- Output: the best legitimate global prediction
- Formulated as an optimization problem and solved via Integer Linear Programming
- Allows incorporating expressive (non-sequential) constraints on the variables (the argument types)
Page 10 Integer Linear Programming Inference
- For each argument a_i, set up a Boolean variable a_{i,t} indicating whether a_i is classified as type t
- Goal: maximize Σ_i Σ_t score(a_i = t) · a_{i,t}, subject to the (linear) constraints
- Any Boolean constraint can be encoded this way
- If score(a_i = t) = P(a_i = t), the objective is to find the assignment that maximizes the expected number of correct arguments while satisfying the constraints
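The optimization can be illustrated with a tiny brute-force search over label assignments; a real system hands the same objective and constraints to an ILP solver instead. The label set, scores, and constraint below are made up for illustration:

```python
from itertools import product

LABELS = ["NULL", "A0", "A1"]

def best_assignment(scores, constraints):
    """Exhaustively try every labeling, keeping the highest-scoring one
    that satisfies all constraints (what an ILP solver does efficiently)."""
    best, best_score = None, float("-inf")
    for assignment in product(LABELS, repeat=len(scores)):
        if not all(c(assignment) for c in constraints):
            continue
        total = sum(scores[i][t] for i, t in enumerate(assignment))
        if total > best_score:
            best, best_score = assignment, total
    return best

# Hypothetical per-candidate scores P(a_i = t).
scores = [{"NULL": 0.1, "A0": 0.8, "A1": 0.1},
          {"NULL": 0.2, "A0": 0.5, "A1": 0.3}]
# Example constraint: no duplicate A0 argument.
no_dup_a0 = lambda a: a.count("A0") <= 1
result = best_assignment(scores, [no_dup_a0])  # -> ('A0', 'A1')
```

Unconstrained, both candidates would take A0; the constraint forces the second candidate to its next-best label.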
Page 11 Constraints
- No overlapping or embedding arguments: if a_i and a_j overlap or embed, then a_{i,NULL} + a_{j,NULL} ≥ 1
Page 12 Constraints
- No overlapping or embedding arguments
- No duplicate argument classes for A0-A5
- Exactly one V argument per predicate
- If there is a C-V, there must be a V-A1-C-V pattern
- If there is an R-arg, there must be an arg somewhere
- If there is a C-arg, there must be an arg somewhere before it
- Each predicate can take only the core arguments that appear in its frame file; more specifically, we check only the minimum and maximum ids
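A few of these constraints can be sketched as checks over the 0/1 indicator variables a_{i,t} (the label set and function names here are illustrative, and only three of the listed constraints are shown):

```python
CORE = ["A0", "A1", "A2", "A3", "A4", "A5"]
LABELS = ["NULL"] + CORE

def satisfies_constraints(x, overlaps):
    """x[i][t] is the 0/1 indicator that candidate i takes label t;
    overlaps is a list of (i, j) pairs of candidates that overlap or embed."""
    # Each candidate takes exactly one label.
    if any(sum(row.values()) != 1 for row in x):
        return False
    # No duplicate core argument classes (each of A0-A5 used at most once).
    if any(sum(row[t] for row in x) > 1 for t in CORE):
        return False
    # Overlapping/embedding candidates: at least one of the pair is NULL.
    if any(x[i]["NULL"] + x[j]["NULL"] < 1 for i, j in overlaps):
        return False
    return True

# Two candidates both labeled A0 violate the no-duplicate constraint.
x = [{t: int(t == "A0") for t in LABELS},
     {t: int(t == "A0") for t in LABELS}]
ok = satisfies_constraints(x, overlaps=[])  # -> False
```

In the actual ILP these checks appear as linear inequalities over the same indicators, e.g. Σ_i a_{i,A0} ≤ 1.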
Page 13 Results
- [Table: Prec, Rec, and F1 on the Dev, WSJ, and Brown sets, for Collins and Charniak parses]
Page 14 Inference with Multiple Systems
- The performance of SRL depends heavily on the very first stage, pruning [IJCAI 2005], which is derived directly from the full parse trees
- Joint inference allows improvement over individual semantic role labeling classifiers
- Combine different SRL systems through joint inference; the systems are derived using different full parse trees
Page 15 Inference with Multiple Systems
- Multiple systems: train and test with Collins' parse outputs
- Train with Charniak's best parse outputs; test with Charniak's 5-best parse outputs
Page 16 Naïve Joint Inference
- Example sentence: "..., traders say, unable to cool the selling panic in both stocks and futures."
- [Diagram: argument candidates from two systems (a1-a4 and b1-b3), such as "traders" and "the selling panic in both stocks and futures", each with a score distribution over {Null, A0, A1, ...}]
Page 17 a1a1 a1a1 a4a4 a3a3 a2a2 b1b1 b3b3 b2b2 b4b4 NullA0A1A Joint Inference – Phantom Candidates Default Priors
Page 18 Results of Joint Inference
Page 21 Results of Different Combinations
Page 22 Conclusion The ILP inference can naturally be extended to reason over multiple SRL systems.
Page 23 Thank You