Survey Propagation

Outline Survey Propagation: an algorithm for satisfiability 1 – Warning Propagation – Belief Propagation – Survey Propagation Survey Propagation Revisited 2 – SP’s relation to BP – Covers
1 Braunstein, A., Mézard, M., and Zecchina, R. Survey propagation: An algorithm for satisfiability. Random Struct. Algorithms 27, 2 (Sep. 2005).
2 Kroc, L., Sabharwal, A., and Selman, B. Survey propagation revisited. In UAI 2007.

Survey Propagation and Message Passing A new threshold divides the satisfiable regime into two regions, easy SAT and hard SAT. – The easy SAT region has many, highly clustered solutions, and any two solutions are connected by a small number of steps. – The hard SAT region has many separate clusters of solutions, with fewer solutions overall than easy SAT. SP is an efficient way of computing solutions in the hard SAT region.

New Model - Factor Graph

Message Passing Treat the factor graph as a network. Depending on the current state of the network, messages are passed between the nodes. These messages are later used to determine the final value of each variable.

Cavity Fields A cavity field is used to determine the value of a given message; its exact form varies with the algorithm in use. It is calculated by removing one function node from consideration.

Warning Propagation Messages are from the set {0,1}, passed from function nodes to variable nodes as a product of cavity fields. Messages are initially generated at random; new warnings are then calculated for every edge until time runs out or a fixed point is reached for all edges. The new warnings are computed from the cavity fields: (∑_{b ∈ V₊(j)\a} u_{b→j}) − (∑_{b ∈ V₋(j)\a} u_{b→j})
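The warning update above can be sketched in code. This is a minimal toy implementation (my own encoding and names, not the authors' code), assuming DIMACS-style clauses of signed integers: a clause warns a variable exactly when every other variable in it is pushed, by the warnings it receives elsewhere, toward a value that violates the clause.

```python
import random

# A minimal toy sketch of Warning Propagation. Clauses are DIMACS-style
# signed ints: +v means x_v, -v means ¬x_v. u[(a, v)] = 1 means clause a
# warns variable v: "the other variables in me cannot satisfy me, so you must."

def warning_propagation(clauses, n_vars, max_iters=100, seed=0):
    rng = random.Random(seed)
    edges = [(a, abs(l)) for a, c in enumerate(clauses) for l in c]
    u = {e: rng.randint(0, 1) for e in edges}    # random initial warnings
    occ = {}                                     # var -> [(clause, sign)]
    for a, c in enumerate(clauses):
        for l in c:
            occ.setdefault(abs(l), []).append((a, 1 if l > 0 else -1))

    def cavity_field(v, a):
        # net push on v from the warnings of all clauses except a; a warning
        # from clause b pushes v toward the value that satisfies b
        return sum(sign * u[(b, v)] for b, sign in occ[v] if b != a)

    for _ in range(max_iters):
        changed = False
        for a, c in enumerate(clauses):
            for l in c:
                i = abs(l)
                # warn i iff every OTHER variable j of clause a is pushed
                # toward the value that does NOT satisfy a
                warn = 1
                for lj in c:
                    if abs(lj) == i:
                        continue
                    sat_sign = 1 if lj > 0 else -1
                    if not (cavity_field(abs(lj), a) * sat_sign < 0):
                        warn = 0
                        break
                if u[(a, i)] != warn:
                    u[(a, i)] = warn
                    changed = True
        if not changed:
            return u                             # fixed point reached
    return None                                  # did not converge

f = [[1], [-1, 2]]    # x1  and  (¬x1 ∨ x2)
fixed = warning_propagation(f, 2)
```

At the fixed point the unit clause warns x1; since x1 is then forced true, the second clause warns x2.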

WP Example (local fields and contradiction numbers) The whole algorithm is an iterative process involving multiple WP executions, simplifying the graph using the local fields.

Belief Propagation Finds the total number of solutions, as well as the fraction of satisfying assignments in which variable x_i is true. Two types of messages: – From function node to variable node: given a value for x_i, the probability that it satisfies the function’s clause. – From variable node to function node: the probability that variable i takes value x_i in the absence of function clause a.

Satisfiable and Unsatisfiable Sets

BP Example Blue is the cavity field of the variable node; red is the message from function to variable. These values are fed into another function to compute the probability of each variable being true.

WP and BP Both are exact when the factor graph is a tree. They can be used as heuristics when it is not a tree, but with no guarantee on the solution. Goal: create a more efficient scheme for the general case.

Basics of Survey Propagation Generally the same as BP, but it deals better with clusters of solutions. The details of the messages change: – A message sent from function to variable is the probability that a warning is sent to the variable node. – A warning is sent if all other variable nodes connected to the function do not satisfy it. Note: SP behaves exactly like WP on tree factor graphs.
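A sketch of this probabilistic warning update, written from the published Braunstein–Mézard–Zecchina equations with my own names (not their code). For each neighbor j of clause a, the other clauses of j are split by polarity relative to a; the message is the probability that every other variable is forced to violate a. Contradictory fields are crudely mapped to "no warning", which is a simplification.

```python
import random

# A sketch of the SP message update. eta[(a, i)] is the probability that
# clause a sends a warning to variable i. Clauses are DIMACS-style signed ints.

def sp_update(clauses, n_vars, iters=100, seed=0):
    rng = random.Random(seed)
    edges = [(a, abs(l)) for a, c in enumerate(clauses) for l in c]
    sign_in = {(a, abs(l)): (1 if l > 0 else -1)
               for a, c in enumerate(clauses) for l in c}
    occ = {}
    for (a, i) in edges:
        occ.setdefault(i, []).append(a)
    eta = {e: rng.random() for e in edges}        # random init in (0, 1)
    for _ in range(iters):
        for (a, i) in edges:
            prod = 1.0
            for l in clauses[a]:
                j = abs(l)
                if j == i:
                    continue
                # split j's other clauses by polarity relative to clause a
                same = [b for b in occ[j]
                        if b != a and sign_in[(b, j)] == sign_in[(a, j)]]
                diff = [b for b in occ[j]
                        if b != a and sign_in[(b, j)] != sign_in[(a, j)]]
                ps = 1.0                          # no warning from "same" side
                for b in same:
                    ps *= 1.0 - eta[(b, j)]
                pd = 1.0                          # no warning from "diff" side
                for b in diff:
                    pd *= 1.0 - eta[(b, j)]
                pi_u = (1.0 - pd) * ps            # j forced to violate a
                pi_s = (1.0 - ps) * pd            # j forced to satisfy a
                pi_0 = ps * pd                    # j unconstrained
                denom = pi_u + pi_s + pi_0
                prod *= pi_u / denom if denom > 0 else 0.0
            eta[(a, i)] = prod
    return eta

f = [[1], [-1, 2]]    # x1  and  (¬x1 ∨ x2): a tree, so messages settle to 0/1
eta = sp_update(f, 2)
```

On this tree-shaped formula the messages settle to 0/1 and match the WP warnings, illustrating the note above.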

Clustering of SAT Assignments The problem with BP’s approximations is that they do not hold globally; they do hold, however, if the computation is restricted to a single cluster. SP fixes this.

SP Results with Don’t Care State

Testing Data

Results A new way to solve SAT using cavity fields and message passing. It works well on generic input, as well as on tree-structured instances. It has not been rigorously analyzed, but it is based on similar, well-studied work.

Why does SP Work? As of 2008, SP is the only method able to solve SAT problems with 1,000,000 variables (and larger), and it does so in near-linear time, even in the hardest region. SP behaves like a backtracking search, except that it almost never has to backtrack.

SP’s relation to BP BP works well when the number of solution clusters is low; SP seems to work well ALL the time. Work in 2005 shows that SP is equivalent to BP over a new kind of object, covers, which generalize the clusters described previously.

Covers σ ∈ {0,1,*}ⁿ is a cover of F if: 1. Every clause has at least one satisfying literal, or at least two *-ed literals. 2. No unsupported variable is assigned 0 or 1 (a variable is supported if it is the sole satisfying literal of some clause). For example, x = y = z = * is a cover of (x ∨ ¬y ∨ ¬z) ∧ (¬x ∨ y ∨ ¬z) ∧ (¬x ∨ ¬y ∨ z).
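A small checker makes the two conditions concrete. This is my own sketch, assuming a variable is "supported" when it is the sole satisfying literal of some clause, with every other literal in that clause falsified (not starred):

```python
# Cover checker for {0,1,*} assignments. Clauses are DIMACS-style signed
# ints; the assignment maps variable -> 0, 1, or '*'.

def lit_state(lit, sigma):
    v = sigma[abs(lit)]
    if v == '*':
        return '*'
    sat = (v == 1) if lit > 0 else (v == 0)
    return 'sat' if sat else 'unsat'

def is_cover(clauses, sigma):
    # Condition 1: every clause has a satisfying literal or >= 2 starred ones.
    for c in clauses:
        states = [lit_state(l, sigma) for l in c]
        if 'sat' not in states and states.count('*') < 2:
            return False
    # Condition 2: every variable assigned 0/1 must be the sole satisfying
    # literal of some clause (all other literals in that clause falsified).
    for v, val in sigma.items():
        if val == '*':
            continue
        supported = any(
            abs(l) == v and lit_state(l, sigma) == 'sat'
            and all(lit_state(m, sigma) == 'unsat' for m in c if m != l)
            for c in clauses for l in c
        )
        if not supported:
            return False
    return True

F = [[1, -2, -3], [-1, 2, -3], [-1, -2, 3]]
print(is_cover(F, {1: '*', 2: '*', 3: '*'}))  # trivial all-* cover -> True
print(is_cover(F, {1: 1, 2: 0, 3: 0}))        # satisfies F, but x1 is unsupported -> False
```

The second call shows the distinction the slide is drawing: (1, 0, 0) satisfies every clause of F, yet it is not a cover, because x1 is never the sole satisfying literal of any clause.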

Computing Covers New Factor Graphs:

Solving SAT using Covers Let P(F) be the cover problem of F. The factor graph of P(F) can be seen as a Bayesian network, so marginals can be computed using Bayes’ theorem. These marginals, when fed to the BP algorithm, compute a solution for the original SAT problem (and yield the SP algorithm). The SP algorithm has been shown to compute marginals over these covers.

SP, BP and Marginals Magnetization of a variable: the difference between its marginal probabilities of being positive and negative, M(x_i) = P(x_i = 1) − P(x_i = 0).

Results The reason SP works so well appears to be related to these covers: they seem to expose hidden properties of a SAT instance, such as backbone variables.

Interesting Aside SAT and Decision SAT are self-reducible; finding covers is NOT self-reducible. Consider (x ∨ ¬y ∨ ¬z) ∧ (¬x ∨ y ∨ ¬z) ∧ (¬x ∨ ¬y ∨ z):
– x = 1? Oracle: yes → (y ∨ ¬z) ∧ (¬y ∨ z)
– y = 0? Oracle: yes → (¬z)
Is (1,0,0) a cover of the original? No!