Bucket Renormalization for Approximate Inference

Presentation on theme: "Bucket Renormalization for Approximate Inference"— Presentation transcript:

1 Bucket Renormalization for Approximate Inference
Sungsoo Ahn¹, joint work with Michael Chertkov², Adrian Weller³, and Jinwoo Shin¹. ¹Korea Advanced Institute of Science and Technology (KAIST), ²Los Alamos National Laboratory (LANL), ³University of Cambridge. June 7th, 2018

2 Goal: approximate inference in GMs
A graphical model (GM) is a family of distributions, factorized by a graph. E.g., the Ising model [Ising, 1920] for distributions of atomic spins. This talk is about undirected GMs with discrete variables. Protein structure (A) being modeled by a graphical model (B) [Kamisetty et al., 2008]

4 Goal: approximate inference in GMs
A graphical model (GM) is a family of distributions, factorized by a graph. A general distribution over n binary variables requires space exponential in n to specify; the GM factorization allows it to be stored in space that scales with the factor sizes instead. The partition function Z is essential for inference & normalization. However, Z is NP-hard to compute, so we need approximations, e.g., MCMC, variational inference, and approximate variable elimination.
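To see concretely why Z is expensive, here is a brute-force computation of the partition function for a tiny Ising model; the 3-variable cycle and coupling strength below are illustrative choices, not from the talk:

```python
import itertools
import math

# Hypothetical 3-variable Ising model on a cycle (made up for this sketch).
edges = [(0, 1), (1, 2), (2, 0)]
beta = 0.5  # interaction strength

def partition_function(edges, beta, n):
    """Sum exp(beta * sum_{(i,j)} s_i * s_j) over all 2^n spin configurations."""
    Z = 0.0
    for spins in itertools.product([-1, +1], repeat=n):
        energy = sum(spins[i] * spins[j] for i, j in edges)
        Z += math.exp(beta * energy)
    return Z

print(partition_function(edges, beta, 3))  # enumerates all 2^3 = 8 states
```

Each added variable doubles the work of this enumeration, which is why approximate methods are needed at scale.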

5 Approximate variable elimination
Sequentially sum out variables (approximately), one by one. E.g., mini bucket elimination for upper-bounding Z: it terminates in a fixed number of iterations, and compared to other families it is much faster but less accurate. Bucket Renormalization: a new approximate variable elimination with superior performance; a variant of mini bucket elimination, but without the bounding property; it can also be seen as a low-rank approximation of GMs.
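The mini-bucket upper bound mentioned above can be sketched numerically: when a bucket {f1, f2} is too large, summing the shared variable in one mini-bucket and maximizing it in the other upper-bounds the exact elimination, since sum_x f1(x,a)·f2(x,b) ≤ (sum_x f1(x,a))·(max_x f2(x,b)) for nonnegative factors. The tables below are random placeholders:

```python
import numpy as np

# Hypothetical bucket with two factors sharing variable x: f1[x, a], f2[x, b].
rng = np.random.default_rng(0)
f1 = rng.random((3, 2))
f2 = rng.random((3, 2))

# Exact elimination of x couples a and b:
exact = np.einsum("xa,xb->ab", f1, f2)

# Mini-bucket bound: sum x in one mini-bucket, maximize it in the other.
upper = np.outer(f1.sum(axis=0), f2.max(axis=0))

print(np.all(exact <= upper))  # the bound holds entrywise
```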

6 Summary for rest of the talk
Variable (bucket) elimination Mini bucket renormalization (MBR) Global bucket renormalization (GBR)

7 Variable (bucket) elimination for exact Z
For each variable in GM: Collect adjacent factors, i.e., bucket. Sum variable over bucket to generate a new factor.

10 Variable (bucket) elimination for exact Z
For each variable in the GM: collect the adjacent factors, i.e., the bucket; generate a new factor by marginalizing the bucket over the variable. The time and memory complexity is determined by the size of the bucket (exponential in it). Key idea: replace the exact bucket sum with an approximation.

11 Summary for rest of the talk
Variable (bucket) elimination Mini bucket renormalization (MBR) Global bucket renormalization (GBR)

13 Mini bucket renormalization
Idea 1. Split variables, then add compensating factors. The number of splits is decided by the available (memory) resources. Choosing a good compensating factor is important. Idea 2. Compare with the optimal compensation.

14 Algorithm description
Given a variable to marginalize: split the variable and generate mini-buckets; add a compensating factor for each split copy; generate new factors by summing out each mini-bucket.
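A schematic sketch of these steps on a hypothetical two-factor bucket. This only illustrates the split-and-compensate mechanics; the compensation here is a simple rank-1 heuristic (leading singular vector of each mini-bucket's table), not the paper's exact update rule:

```python
import numpy as np

# Hypothetical bucket for variable x with two factors: f1[x, a] and f2[x, b].
rng = np.random.default_rng(1)
f1 = rng.random((2, 2))
f2 = rng.random((2, 2))

# Exact elimination of x couples a and b into one larger factor:
g_exact = np.einsum("xa,xb->ab", f1, f2)

# Mini-bucket split: copy x into x1 (kept with f1) and x2 (kept with f2),
# then add compensating factors c1(x1), c2(x2). As a rank-1 heuristic we take
# each from the leading left singular vector of its mini-bucket's matrix.
u1, _, _ = np.linalg.svd(f1)
u2, _, _ = np.linalg.svd(f2)
c1 = np.abs(u1[:, 0])
c2 = np.abs(u2[:, 0])

# Each mini-bucket is summed out independently; the outer product of the two
# small results replaces the exact (larger) factor.
g1 = np.einsum("x,xa->a", c1, f1)
g2 = np.einsum("x,xb->b", c2, f2)
g_approx = np.outer(g1, g2)
```

The point of the split is that g1 and g2 each depend on only one remaining variable, so the bucket's memory footprint stays within budget.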

15 Mini bucket renormalization
Idea 2. Compare with the optimal compensation. The resulting optimization, minimizing the L2-difference, is equivalent to a rank-1 truncated SVD.

16 Connection to rank-1 truncated SVD
Eventually, we are minimizing the error of a rank-1 projection of the bucket's factor matrix.
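This rank-1 projection claim can be checked numerically via the Eckart–Young theorem: the best rank-1 approximation in Frobenius norm is given by the truncated SVD, and its error equals the norm of the discarded singular values. The matrix below is a random stand-in for a bucket's factor table:

```python
import numpy as np

# Random stand-in for a bucket's factor matrix.
rng = np.random.default_rng(0)
M = rng.random((4, 4))

# Truncated SVD: keep only the leading singular triplet.
U, s, Vt = np.linalg.svd(M)
rank1 = s[0] * np.outer(U[:, 0], Vt[0, :])

# Eckart-Young: the Frobenius error of the best rank-1 approximation equals
# the root-sum-square of the discarded singular values.
err = np.linalg.norm(M - rank1)
print(err, np.sqrt(np.sum(s[1:] ** 2)))
```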

17 Algorithm description
Given compensating factors to choose: sum out over each mini-bucket, then compare the result with the optimal compensation.

18–27 Illustration of mini bucket renormalization
MBR with elimination order 1, 2, 3, 4, 5 under a fixed memory budget (ten animation steps; the step-by-step figures are not reproduced in this transcript).

28 Why MBR is called a renormalization
Splitting & compensation without variable elimination results in a renormalized GM: This can be interpreted as a tractable approximation to original GM.

29 Summary for rest of the talk
Variable (bucket) elimination Mini bucket renormalization (MBR) Global bucket renormalization (GBR)

30 Global bucket renormalization (GBR)
Recall from mini bucket renormalization: the compensation is chosen to minimize a local L2-difference. GBR aims to find a better choice of compensation at the cost of additional computation.

32 Global bucket renormalization (GBR)
Idea: increase the scope of the comparison, minimizing the L2-difference against the original (global) GM. However, this comparison is as hard as computing the partition function itself. As a heuristic, we perform the comparison in the renormalized GM instead.

34 Experiments We measure the log-Z approximation ratio of our algorithms:
mini bucket renormalization (MBR) and global bucket renormalization (GBR), and compare with four existing algorithms: mini bucket elimination (MBE), weighted mini bucket elimination (WMBE), belief propagation (BP), and mean-field approximation (MF).

35 Ising GM experiments Comparison over a varying interaction parameter (or temperature). Complete graph with 15 variables: GBR > MBR > MF > WMBE ≈ MBE > BP. Grid graph with 15x15 variables: GBR > MBR > BP > MF > WMBE > MBE.

36 UAI 2014 competition experiments
Numbers in brackets denote the # of algorithms each method dominates. Ranking: GBR ≈ MBR > BP > WMBE > MBE, on the Promedus and Linkage datasets.

37 Conclusion We proposed bucket renormalization, based on splitting & compensation. It is heavily inspired by tensor network renormalization (TNR) in statistical physics and by tensor decomposition algorithms. An arXiv version is available. Thank you for listening!


39 Ising GM experiments Comparison over a varying memory budget (ibound). Complete graph with 15 variables; grid graph with 15x15 variables.

