Presentation on theme: "Particle Belief Propagation"— Presentation transcript:

Particle Belief Propagation
Alexander Ihler (Bren School of Information & Computer Science, University of California, Irvine)
David McAllester (Toyota Technological Institute, Chicago)

Outline
A "stripped down" particle approximation algorithm that enables theoretical analysis. The algorithm is consistent, with an n^{-1/2} convergence rate related to PAC bounds for learning and to convergence results for particle filtering. The analysis also suggests a choice of proposal distribution, an MCMC particle update procedure, and other extensions.

Graphical Models
The distribution is written in terms of potential functions (pairwise here; this generalizes to higher order):
    p(x) \propto \prod_{(s,t)} \psi_{s,t}(x_s, x_t) \prod_s \psi_s(x_s).
Graph separation corresponds to conditional independence. The goal is to infer marginal distributions, which can then be used to construct estimators, etc.

Belief Propagation
Notation: \Gamma_s is the neighborhood of s (the nodes adjacent to s), the message m_{t\to s}(x_s) represents the information from x_t about x_s, and the belief B_s(x_s) is the approximate marginal.
I. Message product: multiply the incoming messages (from all neighbors but s) with the local observation to form a distribution over x_t:
    M_{ts}(x_t) \propto \psi_t(x_t) \prod_{u \in \Gamma_t \setminus s} m_{u\to t}(x_t).
II. Message propagation: transform this distribution from node t to node s using the pairwise potential \psi_{s,t}, integrating over x_t to form a function summarizing t's knowledge about x_s:
    m_{t\to s}(x_s) = \int \psi_{s,t}(x_s, x_t)\, M_{ts}(x_t)\, dx_t.
BP messages have a tractable closed form for discrete or jointly Gaussian random variables. For general continuous problems there is no closed form, and discretization becomes intractable in as few as 2-3 dimensions, so approximations are needed (Koller et al. 1999; Coughlan & Ferreira 2002; Sudderth et al. 2003; Isard 2003, 2008; others).

Continuous Variables
Write the message update equation as an expectation over a proposal distribution W_t(x_t) for each node:
    m_{t\to s}(x_s) = E_{x_t \sim W_t}\!\left[ \psi_{s,t}(x_s, x_t)\, M_{ts}(x_t) / W_t(x_t) \right].
Samples x_t^{(j)} drawn from W_t define a random discretization of the state space, and messages are weightings defined on this discrete domain. The samples x_t^{(j)} and proposals W_t are held fixed for the analysis.

Nonparametric Belief Propagation (NBP) (Sudderth et al. 2003; Ihler et al. 2005)
In NBP, nodes u and s both send messages to t, and samples are drawn at the source (u and s) to represent the messages. The two sample sets will not overlap, so their product is not directly defined. The solution is to smooth the messages with a Gaussian kernel, which leads to sampling from a product of Gaussian mixtures; nominally this costs O(n^d), where d is the number of neighbors.

Particle Belief Propagation (PBP)
In PBP, the sample locations for the incoming messages are drawn at the destination node t. This ensures the message particles overlap, so no smoothing is required: the message product is O(n) and propagation is O(n^2). Approximating the expectation above by an average over the samples x_t^{(1)}, ..., x_t^{(n)} gives the particle message update
    (1)    m_{t\to s}(x_s^{(i)}) = (1/n) \sum_j \frac{\psi_{s,t}(x_s^{(i)}, x_t^{(j)})\, \psi_t(x_t^{(j)})}{W_t(x_t^{(j)})} \prod_{u \in \Gamma_t \setminus s} m_{u\to t}(x_t^{(j)}).
The corresponding belief estimate, which can be evaluated at any value of x_s, is
    (2)    \hat{B}_s(x_s) \propto \psi_s(x_s) \prod_{t \in \Gamma_s} m_{t\to s}(x_s),
with each message evaluated via (1). It is easy to show that this algorithm is consistent (it approaches the true BP messages as n \to \infty).
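As a concrete illustration of the particle update (1), the following is a minimal NumPy sketch for one message at a scalar-valued node. The function names (pairwise_pot, local_pot), the proposal array w_t, and the calling convention are illustrative assumptions, not part of the original poster.

```python
import numpy as np

def pbp_message(x_s, x_t, w_t, pairwise_pot, local_pot, incoming_msgs):
    """Particle BP message m_{t->s}, evaluated at the samples x_s drawn at node s.

    x_s           : (n_s,) samples at the destination node s
    x_t           : (n_t,) samples at the source node t, drawn from the proposal W_t
    w_t           : (n_t,) proposal density W_t evaluated at x_t
    pairwise_pot  : vectorized callable for psi_{s,t}(x_s, x_t)
    local_pot     : vectorized callable for psi_t(x_t)
    incoming_msgs : list of (n_t,) arrays, messages m_{u->t}(x_t) for u in Gamma_t other than s
    """
    # Importance weight of each source sample: local potential times the product
    # of incoming messages, divided by the proposal density (an O(n) product,
    # since all incoming messages share the same sample set x_t).
    weights = local_pot(x_t) / w_t
    for msg in incoming_msgs:
        weights = weights * msg

    # Monte Carlo estimate of the BP integral over x_t, for every destination
    # sample x_s^{(i)}; the (n_s, n_t) table makes this the O(n^2) propagation step.
    psi = pairwise_pot(x_s[:, None], x_t[None, :])
    m_ts = psi @ weights / len(x_t)
    return m_ts / m_ts.sum()          # normalize for numerical stability
```

A belief estimate as in (2) is then simply the local potential at x_s times the product of such messages from all neighbors of s.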
Consistency
Assume a large but finite set of states, and consider the states taken on by some particle together with their counts c_t. The particle messages can then be written as sums over these states weighted by their counts, and since c_t \to W_t, we have consistency.

Resampling Methods
Most stochastic versions of BP have a resampling operation, which allows "better" particles to be selected as more information becomes available. For example, NBP smooths the messages and draws from their product at each iteration, while Koller et al. (1999) fit a distribution and re-draw samples from it. In PBP, the message update can be rewritten in a form that suggests drawing the samples from the belief B_t(x_t) (as other algorithms have also done in practice). One can use MCMC to sample from the current belief: run the Metropolis-Hastings algorithm, evaluating the ratio of beliefs at any two points via (2). (A minimal Metropolis-Hastings sketch appears at the end of this transcript.)

Convergence Rates
Define a rate constant R_W and analyze finite trees; the analysis is extensible to more general cases.
Theorem 1: For a tree with k nodes, if we sample n particles at each node with n > k^2 R_W \ln(kn/\delta) and compute the message values defined by (1), then with probability at least 1-\delta over the choice of particles, the resulting belief estimates are simultaneously accurate for all nodes s and all particles x_s^{(i)}; i.e., with high probability our beliefs are accurate at the sample locations.
Moreover, the belief estimate (2) is defined at any value of x_s, not only at the samples.
Theorem 2: Under the same conditions as Theorem 1, with probability at least 1-\delta' over the choice of particles, this belief estimate is accurate for all nodes s; i.e., with high probability our beliefs are also accurate in an L1 sense.

Experimental Evaluation
Stereo depth maps (figure: left image, right image, estimated depth map, and the corresponding graph; from the Middlebury data set, Scharstein & Szeliski 2002). This problem is used to evaluate convergence properties: x_t is univariate, so discretization is tractable and particle approximations can be compared to the exact answer. Proposal choices compared: local (W_t = the local likelihood function), true belief (W_t = B_t(x_t)), estimated belief (use MCMC to sample from the current estimate), and NBP with message- and belief-based samples. PBP improves at rate n^{-1/2}, as predicted; NBP improves at a similar but slightly slower rate (roughly n^{-2/5}?), consistent with the kernel variance rate (smoothing seems to hurt performance here). A second example is sensor localization (figure: a sensor network with nodes A through J). An empirical rate of this kind can be read off from error measurements as in the sketch below.
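To make the rate comparison concrete, here is a small sketch of how an empirical convergence rate can be read off from belief-error measurements; the particle counts and error values below are made-up placeholders, not data from the poster.

```python
import numpy as np

# Hypothetical L1 belief errors measured at increasing particle counts n.
n_particles = np.array([25, 50, 100, 200, 400, 800])
errors = np.array([0.21, 0.15, 0.105, 0.075, 0.053, 0.037])   # illustrative only

# If error ~ C * n^alpha, then log(error) = log(C) + alpha * log(n), so the
# empirical rate alpha is the slope of a least-squares fit in log-log space.
alpha, log_c = np.polyfit(np.log(n_particles), np.log(errors), 1)
print(f"empirical rate: n^({alpha:.2f})")   # about -0.5 for PBP-like decay
```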

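Finally, the MCMC particle update mentioned under Resampling Methods can be sketched as a random-walk Metropolis-Hastings chain targeting the current belief; the belief_unnorm callable, the step size, and the chain length are illustrative assumptions rather than details taken from the poster.

```python
import numpy as np

def mh_resample_particles(x_t, belief_unnorm, n_steps=50, step=0.1, rng=None):
    """Refresh the particles at node t with a random-walk Metropolis-Hastings
    chain whose stationary distribution is the current belief estimate B_t.

    x_t           : (n,) current particle locations at node t
    belief_unnorm : vectorized callable returning the unnormalized belief B_t(x),
                    e.g. psi_t(x) times the incoming particle messages from (1)
    """
    rng = np.random.default_rng() if rng is None else rng
    x = x_t.astype(float).copy()
    b = belief_unnorm(x)
    for _ in range(n_steps):
        prop = x + step * rng.standard_normal(x.shape)      # symmetric proposal
        b_prop = belief_unnorm(prop)
        # Accept with probability min(1, B_t(prop)/B_t(x)); only the belief
        # ratio is needed, so the normalizing constant never appears.
        accept = rng.random(x.shape) < np.minimum(1.0, b_prop / np.maximum(b, 1e-300))
        x = np.where(accept, prop, x)
        b = np.where(accept, b_prop, b)
    return x
```

In this sketch each particle runs its own short chain started from its current location; other schedules (one long chain with thinning, for example) would serve the same purpose.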
