
Fixing Max-Product: A Unified Look at Message Passing Algorithms
Nicholas Ruozzi and Sekhar Tatikonda, Yale University


1 Fixing Max-Product: A Unified Look at Message Passing Algorithms
Nicholas Ruozzi and Sekhar Tatikonda, Yale University

2 Previous Work
Recent work related to max-product has focused on convergent and correct message passing schemes:
- TRMP [Wainwright et al. 2005]
- MPLP [Globerson and Jaakkola 2007]
- Max-Sum Diffusion [Werner 2007]
- "Splitting" max-product [Ruozzi and Tatikonda 2010]

3 Previous Work
Typical approach: focus on a dual formulation of the MAP LP. The message passing scheme is derived as a coordinate ascent scheme on a concave dual: MPLP, TRMP, Max-Sum Diffusion.
[Diagram: MAP → MAP LP → Concave Dual]

4 Previous Work
Many of these algorithms can be seen as maximizing a specific lower bound [Sontag and Jaakkola 2009]. The maximization is performed over reparameterizations of the objective function that satisfy specific constraints; different constraints correspond to different dual formulations.

5 This Work
Focus on the primal problem:
- Choose a reparameterization of the objective function (a reparameterization in terms of messages)
- Construct concave lower bounds from this reparameterization by exploiting the concavity of min
- Perform coordinate ascent on these lower bounds
[Diagram: MAP → Reparameterization → Concave Lower Bound]

6 This Work
- Many of the common message passing schemes can be captured by the "splitting" family of reparameterizations
- Many possible lower bounds are of interest
- The construction produces an unconstrained concave optimization problem
[Diagram: MAP → Reparameterization → Concave Lower Bound]

7 Outline
- Background (min-sum, reparameterizations)
- "Splitting" reparameterization
- Lower bounds
- Message updates

8 Min-Sum
Minimize an objective function that factorizes as a sum of potentials:
f(x) = \sum_{i} \phi_i(x_i) + \sum_{\alpha \in A} \psi_\alpha(x_\alpha)
where f is assumed to be bounded from below and A is some multiset whose elements are subsets of the variables.
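As a concrete illustration, the factorized objective can be evaluated and minimized by brute force on a tiny instance; the potentials below are hypothetical, chosen only for illustration:

```python
import itertools

# Tiny min-sum instance (hypothetical potentials, for illustration only):
# f(x) = phi_1(x_1) + phi_2(x_2) + psi_{12}(x_1, x_2) over binary variables.
phi = {1: [0.0, 1.0], 2: [0.5, 0.0]}      # variable potentials phi_i
psi = {(1, 2): [[0.0, 2.0], [2.0, 0.0]]}  # factor potential psi_alpha, alpha = {1, 2}

def f(x):
    """Objective as a sum of potentials; x maps a variable index to its value."""
    return (sum(phi[i][x[i]] for i in phi)
            + sum(psi[a][x[a[0]]][x[a[1]]] for a in psi))

# Exhaustive minimization (feasible only on tiny instances).
best = min(itertools.product([0, 1], repeat=2),
           key=lambda v: f({1: v[0], 2: v[1]}))
```

Brute force is exponential in the number of variables; the message passing schemes below aim to minimize such objectives locally.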

9 Corresponding Graph
[Figure: factor graph over variables 1, 2, 3]

10 Reparameterizations
We can rewrite the objective function as
f(x) = \sum_i \Big( \phi_i(x_i) + \sum_{\alpha \in \partial i} m_{\alpha \to i}(x_i) \Big) + \sum_\alpha \Big( \psi_\alpha(x_\alpha) - \sum_{i \in \alpha} m_{\alpha \to i}(x_i) \Big)
- This does not change the objective function as long as the messages are finite valued at each x
- The objective function is reparameterized in terms of the messages
- There is no dependence on messages passed from i to α
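The invariance is easy to check numerically: each message added to a variable term is subtracted back out at the factor, so f(x) is unchanged for every assignment. A minimal sketch on a hypothetical two-variable, one-factor instance:

```python
# Adding a message m_{alpha->i}(x_i) to the variable term and subtracting it
# from the factor term leaves f unchanged, provided the messages are finite.
phi = {1: [0.0, 1.0], 2: [0.5, 0.0]}
psi = [[0.0, 2.0], [2.0, 0.0]]               # single factor alpha = {1, 2}
m = {1: [0.3, -0.7], 2: [1.1, 0.2]}          # arbitrary finite messages

def f(x1, x2):
    return phi[1][x1] + phi[2][x2] + psi[x1][x2]

def f_repar(x1, x2):
    t1 = phi[1][x1] + m[1][x1]               # reparameterized variable terms
    t2 = phi[2][x2] + m[2][x2]
    fac = psi[x1][x2] - m[1][x1] - m[2][x2]  # messages subtracted at the factor
    return t1 + t2 + fac

assert all(abs(f(a, b) - f_repar(a, b)) < 1e-12
           for a in (0, 1) for b in (0, 1))
```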

11 Beliefs
Typically, we express the reparameterization in terms of beliefs (meant to represent min-marginals):
b_i(x_i) = \phi_i(x_i) + \sum_{\alpha \in \partial i} m_{\alpha \to i}(x_i)
b_\alpha(x_\alpha) = \psi_\alpha(x_\alpha) - \sum_{i \in \alpha} m_{\alpha \to i}(x_i)
With this definition, we have f(x) = \sum_i b_i(x_i) + \sum_\alpha b_\alpha(x_\alpha).

12 Min-Sum
The min-sum update of m_{\alpha \to i} is chosen so that, after the update, the belief b_i agrees with the corresponding min-marginal of b_\alpha. We can estimate an assignment from a collection of messages by choosing x_i^* \in \arg\min_{x_i} b_i(x_i). Upon convergence, this min-consistency holds for every pair (\alpha, i) simultaneously.
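On a tree (here, a single factor over two variables) one round of min-sum updates already produces exact min-marginals, and the estimate recovers a minimizing assignment. A sketch with hypothetical potentials:

```python
phi = {1: [0.0, 1.0], 2: [0.5, 0.0]}
psi = [[0.0, 2.0], [2.0, 0.0]]               # one factor alpha = {1, 2}

# Min-sum updates for the messages from alpha to each variable
# (with a single factor there are no other incoming messages to sum).
m1 = [min(psi[x1][x2] + phi[2][x2] for x2 in (0, 1)) for x1 in (0, 1)]
m2 = [min(psi[x1][x2] + phi[1][x1] for x1 in (0, 1)) for x2 in (0, 1)]

# Beliefs and the assignment estimate x_i in argmin b_i.
b1 = [phi[1][x] + m1[x] for x in (0, 1)]
b2 = [phi[2][x] + m2[x] for x in (0, 1)]
estimate = (min((0, 1), key=lambda x: b1[x]),
            min((0, 1), key=lambda x: b2[x]))
```

Here min(b1) equals the true minimum of f, as expected on a tree; the next slide notes that no such guarantee holds in general.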

13 Correctness Guarantees
The min-sum algorithm does not guarantee the correctness of this estimate upon convergence: assignments that minimize the b_i need not minimize f. Notable exceptions: trees, single cycles, and other singly connected graphs.

14 Lower Bounds
From a reparameterization, we can derive lower bounds that are concave in the messages:
\min_x f(x) \ge \sum_i \min_{x_i} b_i(x_i) + \sum_\alpha \min_{x_\alpha} b_\alpha(x_\alpha)
- The lower bound is a concave function of the messages (and of the beliefs)
- We want to find the choice of messages that maximizes the lower bound
- This lower bound may not be tight
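Since a minimum of a sum is at least the sum of the minimums, minimizing each reparameterized term independently lower-bounds min_x f for any finite messages; as a function of the messages this bound is a minimum of affine maps, hence concave. A numerical sketch with hypothetical potentials:

```python
phi = {1: [0.0, 1.0], 2: [0.5, 0.0]}
psi = [[0.0, 2.0], [2.0, 0.0]]               # one factor alpha = {1, 2}

def lower_bound(m1, m2):
    """Sum of independent minimizations of the reparameterized terms.
    Lower-bounds min_x f for any finite messages m1, m2; concave in
    (m1, m2) since each term is a min of affine functions."""
    t1 = min(phi[1][x] + m1[x] for x in (0, 1))
    t2 = min(phi[2][x] + m2[x] for x in (0, 1))
    tf = min(psi[a][b] - m1[a] - m2[b] for a in (0, 1) for b in (0, 1))
    return t1 + t2 + tf

f_min = min(phi[1][a] + phi[2][b] + psi[a][b] for a in (0, 1) for b in (0, 1))
assert lower_bound([0.0, 0.0], [0.0, 0.0]) <= f_min
assert lower_bound([0.5, 0.0], [0.0, 1.0]) <= f_min   # any finite messages
```

As the slide notes, the bound need not be tight; coordinate ascent over the messages tries to push it as high as possible.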

15 Outline
- Background (min-sum, reparameterizations)
- "Splitting" reparameterization
- Lower bounds
- Message updates

16 "Good" Reparameterizations
There are many possible reparameterizations. How do we choose reparameterizations that produce "nice" lower bounds?
- We want the estimates corresponding to the optimal choice of messages to minimize the objective function
- We want the bound to be concave in the messages
- We want the coordinate ascent scheme to remain local

17 "Splitting" Reparameterization
The objective is rewritten as a combination of beliefs weighted by constants c_i, c_\alpha \neq 0, with the beliefs defined in terms of the split potentials \phi_i / c_i, \psi_\alpha / c_\alpha and the messages.

18 "Splitting" Reparameterization
TRMP: given a collection of spanning trees in the factor graph and a probability distribution \mu over these trees, choose c_\alpha = \mu_\alpha, the probability that a tree drawn from \mu contains the factor \alpha. This can be extended to a collection of singly connected subgraphs [Ruozzi and Tatikonda 2010].

19 "Splitting" Reparameterization
Min-sum, TRMP, MPLP, and Max-Sum Diffusion can all be characterized as splitting reparameterizations. One possible lower bound: choose c so that f can be written as a nonnegative combination of the beliefs; minimizing each belief independently then lower-bounds \min_x f(x).

20 "Splitting" Reparameterization
TRMP lower bound:
Max-Sum Diffusion lower bound:

21 Outline
- Background (min-sum, reparameterizations)
- "Splitting" reparameterization
- Lower bounds
- Message updates

22 From Lower Bounds to Message Updates
We can construct the message updates by ensuring that each update performs coordinate ascent on our lower bounds. Block updates over trees are also possible [Meltzer et al. 2009] [Kolmogorov 2009] [Sontag and Jaakkola 2009].
Key observation: \min_x (f(x) + g(x)) \ge \min_x f(x) + \min_x g(x), with equality iff there is an x that simultaneously minimizes both functions.
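The key observation is just that a minimum of a sum is at least the sum of the minimums, with equality exactly when some x minimizes every term at once. A quick numerical check on two hypothetical functions over three states:

```python
# min_x (f(x) + g(x)) >= min_x f(x) + min_x g(x),
# with equality iff some x minimizes both terms simultaneously.
f = [3.0, 1.0, 2.0]
g = [0.0, 2.0, 1.0]
lhs = min(a + b for a, b in zip(f, g))
assert lhs >= min(f) + min(g)               # strict here: no common minimizer

g2 = [2.0, 0.0, 1.0]                        # now x = 1 minimizes both f and g2
assert min(a + b for a, b in zip(f, g2)) == min(f) + min(g2)
```

Coordinate ascent on the lower bound amounts to choosing messages that make the inequality tight term by term.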

23 Max-Sum Diffusion
We want the belief b_i to agree with the min-marginal of b_\alpha:
b_i(x_i) = \min_{x_\alpha \setminus x_i} b_\alpha(x_\alpha)
Solving this equation for m_{\alpha \to i} gives the update. Do this for all \alpha \in \partial i.
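Intuitively, raising m_{α→i}(x_i) by δ raises b_i(x_i) by δ and lowers the min-marginal of b_α by δ, so averaging the two quantities achieves equality. A sketch of one such coordinate step on a single edge, with hypothetical potentials, assuming the belief definitions b_i = φ_i + m and b_α = ψ_α − m:

```python
phi1 = [0.0, 1.0]                 # phi_1 (hypothetical)
psi = [[0.0, 2.0], [2.0, 0.0]]    # psi_alpha, alpha = {1, 2} (hypothetical)
m = [0.0, 0.0]                    # message m_{alpha -> 1}

def b1(x):
    return phi1[x] + m[x]         # variable belief b_1

def min_marginal(x):
    # min-marginal of the factor belief b_alpha = psi - m at x_1 = x
    return min(psi[x][y] for y in (0, 1)) - m[x]

# One diffusion step on the edge (alpha, 1): average the two quantities.
for x in (0, 1):
    m[x] += (min_marginal(x) - b1(x)) / 2.0

assert all(abs(b1(x) - min_marginal(x)) < 1e-12 for x in (0, 1))
```

A full pass would repeat this step over every edge (α, i) until the consistency condition holds everywhere.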

24 Splitting Update
Suppose that all of the coefficients c_\alpha are positive and that c_i > 0. We want the analogous consistency condition to hold for the weighted beliefs; solving for m_{\alpha \to i} gives the update. Do this for all \alpha \in \partial i.

25 Conclusion
- MPLP, TRMP, and Max-Sum Diffusion are all instances of the splitting reparameterization for specific choices of the constants and of the lower bound
- Different lower bounds produce different unconstrained concave optimization problems; the choice of lower bound corresponds to choosing a different dual formulation
- The maximization is performed with respect to the messages, not the beliefs
- Many more reparameterizations and lower bounds are possible. Is there a reparameterization in which the lower bounds are strictly concave?

26 Questions?

