Hybrid of search and inference: time-space tradeoffs. Chapter 10, ICS-275A, Fall 2003.


Slide 1: Hybrid of search and inference: time-space tradeoffs (Chapter 10, ICS-275A, Fall 2003).

Slide 2: Reasoning methods
Our focus: search and elimination.
Search ("guessing" assignments, reasoning by assumptions):
- Branch-and-bound (optimization)
- Backtracking search (CSPs)
- Cycle-cutset (CSPs, belief networks)
Variable elimination (inference, "propagation" of constraints, probabilities, cost functions):
- Dynamic programming (optimization)
- Adaptive consistency (CSPs)
- Join-tree propagation (CSPs, belief networks)

Slide 3: Search: backtracking search. [Figure: a backtracking search tree.]

Slide 4: Satisfiability: inference vs. search. Search = O(exp(n)).

Slide 5: Search complexity distributions. Complexity histograms (deadends, time) => continuous distributions (Frost, Rish, and Vila 1997; Selman and Gomez 1997; Hoos 1998). [Plot: frequency (probability) vs. number of nodes explored in the search space.]

Slide 6: Bucket elimination. [Figure: bucket-elimination schematic over variables A, B, C, D, E, showing the intermediate relations recorded as each bucket is processed, e.g. R_DCB, R_ACB, R_AB, and R_A.]
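To make the schematic concrete, here is a minimal, deliberately naive sketch of bucket elimination for constraints (adaptive consistency). The encoding of a relation as a (scope, set-of-tuples) pair and all function names are illustrative choices, not taken from the slide; the join/project step is done by brute-force enumeration over the bucket's scope, which matches the exponential-in-width behavior discussed later.

    # Relations are (scope, tuples): scope is a tuple of variable names,
    # tuples is a set of value tuples over that scope.
    from itertools import product

    def join_project_out(relations, var, domains):
        """Join all relations in var's bucket, then eliminate var."""
        scope = sorted({v for sc, _ in relations for v in sc if v != var})
        result = set()
        for assignment in product(*(domains[v] for v in scope + [var])):
            full = dict(zip(scope + [var], assignment))
            if all(tuple(full[v] for v in sc) in tups for sc, tups in relations):
                result.add(tuple(full[v] for v in scope))
        return tuple(scope), result

    def adaptive_consistency(order, domains, relations):
        """Process buckets from the last variable in `order` to the first."""
        pos = {v: i for i, v in enumerate(order)}
        buckets = {v: [] for v in order}
        for sc, tups in relations:
            # Place each relation in the bucket of its latest variable.
            buckets[max(sc, key=lambda v: pos[v])].append((sc, tups))
        for var in reversed(order):
            if not buckets[var]:
                continue
            new_scope, new_tuples = join_project_out(buckets[var], var, domains)
            if not new_tuples:
                return False          # an empty induced relation: inconsistent network
            if new_scope:             # record the induced relation in a lower bucket
                buckets[max(new_scope, key=lambda v: pos[v])].append((new_scope, new_tuples))
        return True   # consistent; a solution can be assembled backtrack-free along `order`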

Slide 7: DR versus DPLL: complementary properties. Uniform random 3-CNFs (large induced width); (k,m)-tree 3-CNFs (bounded induced width).

Slide 8: Exact CSP techniques: complexity.

Slide 9: Outline / road map: methods and tasks.

Slide 10: The cycle-cutset effect
A cycle-cutset is a subset of nodes in an undirected graph whose removal results in a graph with no cycles.
An instantiated variable cuts the flow of information: it cuts a cycle.
If a cycle-cutset is instantiated, the remaining problem is a tree and can be solved efficiently (a sketch of the scheme follows).
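A minimal sketch of the scheme in Python (an illustration, not the book's pseudocode): binary constraints are stored as {(X, Y): set of allowed (x, y) pairs}, and tree_order must list the non-cutset variables so that each has at most one earlier neighbor (its parent), which is exactly what makes the residual problem a tree once the cutset is instantiated.

    from itertools import product

    def allowed(x, a, y, b, constraints):
        """True if x=a, y=b violates no binary constraint between x and y."""
        if (x, y) in constraints and (a, b) not in constraints[(x, y)]:
            return False
        if (y, x) in constraints and (b, a) not in constraints[(y, x)]:
            return False
        return True

    def solve_tree(tree_order, parent, domains, constraints, conditioning):
        """Directional arc-consistency plus a backtrack-free sweep on a tree."""
        # Restrict each residual domain against the instantiated cutset.
        doms = {v: [a for a in domains[v]
                    if all(allowed(v, a, y, val, constraints)
                           for y, val in conditioning.items())]
                for v in tree_order}
        # Bottom-up: make every parent arc-consistent with its children.
        for v in reversed(tree_order):
            p = parent.get(v)
            if p is not None:
                doms[p] = [a for a in doms[p]
                           if any(allowed(p, a, v, bv, constraints) for bv in doms[v])]
        if any(not doms[v] for v in tree_order):
            return None
        # Top-down: the assignment is now backtrack-free.
        sol = dict(conditioning)
        for v in tree_order:
            p = parent.get(v)
            sol[v] = next(bv for bv in doms[v]
                          if p is None or allowed(p, sol[p], v, bv, constraints))
        return sol

    def cycle_cutset_solve(cutset, tree_order, parent, domains, constraints):
        """Enumerate cutset assignments; each residual problem is a tree."""
        for values in product(*(domains[y] for y in cutset)):
            conditioning = dict(zip(cutset, values))
            # Skip cutset assignments that already violate a constraint.
            if not all(allowed(x, conditioning[x], y, conditioning[y], constraints)
                       for x, y in product(cutset, repeat=2) if x != y):
                continue
            sol = solve_tree(tree_order, parent, domains, constraints, conditioning)
            if sol is not None:
                return sol
        return None

The enumeration visits at most k^c cutset assignments and solves each residual tree in time linear in the number of remaining variables (times k^2), which is where the bound on the next slide comes from.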

Slide 11: Example of the cycle-cutset scheme.

Slide 12: Complexity of the cycle-cutset scheme
Theorem: The cycle-cutset decomposition algorithm has time complexity O(n k^(c+2)), where n is the number of variables, c is the cycle-cutset size, and k is the domain size (each of the k^c cutset assignments leaves a tree that is solved in O(n k^2) time). The space complexity of the algorithm is linear.

Slide 13: Recursive search: a linear-space search guided by a tree-decomposition
Given a tree network, identify a node x_1 which, when removed, generates two subtrees of size roughly n/2. Let T_n be the time to solve such a binary tree starting at x_1. T_n obeys the recurrence T_n = 2k T_{n/2}, with T_1 = k, which gives T_n = n k^{log n + 1}.
Given a tree-decomposition having induced width w*, this generalizes to recursive conditioning over tree-decompositions: T_n = n k^{(w*+1) log n}, because the single conditioning value (one of k) is replaced by a tuple over a cluster of w*+1 variables (one of at most k^{w*+1}).
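Unrolling the recurrence confirms the closed form (a short derivation added here for clarity; logarithms are base 2):

    T_n = 2k\,T_{n/2} = (2k)^{\log_2 n}\,T_1 = 2^{\log_2 n}\,k^{\log_2 n}\,k = n\,k^{\log_2 n + 1}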

Slide 14: Alternative views of recursive search
Proposition 1: Given a constraint network R = (X, D, C) with graph G, and a tree-decomposition T = (X, chi, psi) that has induced width w* and diameter r (the longest path from one leaf cluster to another), there exists a DFS tree dfs(T) whose depth is bounded by O(w* log r).
Proposition 2: Recursive conditioning along a tree-decomposition T of a constraint problem R = (X, D, C) having induced width w* is identical to backjumping along the DFS ordering of its corresponding dfs(T).
Proposition 3: Recursive conditioning is a depth-first traversal of the AND/OR search tree relative to the DFS spanning tree dfs(T).

Slide 15: Example. Consider a chain graph or a k-tree.

Slide 16: Hybrid, conditioning first
Generalizing the cycle-cutset: condition on a subset of variables that yields a bounded inference problem, not necessarily a linear one.
b-cutset: a subset of nodes is called a b-cutset iff, when the subset is removed, the resulting graph has induced width less than or equal to b.
A minimal b-cutset of a graph has the smallest size among all b-cutsets of the graph.
A cycle-cutset is a 1-cutset of a graph.
Adjusted induced width: the induced width of an ordering computed after the cutset variables are removed.

Slide 17: Algorithm elim-cond(b)
Idea: run backtracking search on the b-cutset variables and bucket elimination on the remaining variables.
Input: a constraint network R = (X, D, C); Y, a b-cutset; d, an ordering that starts with Y and whose adjusted induced width along d is bounded by b; Z = X - Y.
Output: a consistent assignment, if one exists.
1. While y <- next partial solution of Y found by backtracking, do:
   (a) z <- solution found by adaptive-consistency(R_y).
   (b) If z is not false, return the solution (y, z).
2. Endwhile.
3. Return: the problem has no solutions.
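A structural sketch of this loop in Python. The two helpers are hypothetical stand-ins, not defined on the slide: backtrack_over_cutset(network, Y) is assumed to yield consistent partial assignments of the b-cutset Y one at a time, and adaptive_consistency(network, y, Z) is assumed to return an extension of y over Z (computed by bucket elimination on the conditioned network) or False.

    def elim_cond_b(network, Y, Z, backtrack_over_cutset, adaptive_consistency):
        """Search over the b-cutset Y; eliminate the remaining variables Z."""
        for y in backtrack_over_cutset(network, Y):   # search part: up to exp(|Y|) branches
            z = adaptive_consistency(network, y, Z)   # inference part: exp(b) time and space
            if z is not False:
                return {**y, **z}                     # the consistent assignment (y, z)
        return None                                   # the problem has no solutions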

Slide 18: Complexity of elim-cond(b)
Theorem: Given R = (X, D, C), if elim-cond(b) is applied along ordering d with b-cutset Y, then its space complexity is O(n exp(b)) and its time complexity is O(n exp(|Y| + b)).

Slide 19: Finding a b-cutset
Verifying that a given set is a b-cutset can be done in polynomial time.
A simple greedy heuristic: use a good induced-width ordering and, starting at the top, add to the b-cutset any variable with more than b parents (earlier neighbors in the induced graph); a sketch appears below.
Alternative: generate a tree-decomposition and select a b-cutset that reduces each cluster below b.
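A sketch of the greedy heuristic, assuming the graph is given as an adjacency dict of sets and `order` is a good induced-width ordering ("parents" of a variable are its earlier neighbors in the induced graph along that ordering):

    def greedy_b_cutset(adj, order, b):
        pos = {v: i for i, v in enumerate(order)}
        # Build the induced graph: process variables from last to first and
        # connect each variable's earlier neighbors to one another.
        induced = {v: set(adj[v]) for v in order}
        for v in reversed(order):
            parents = [u for u in induced[v] if pos[u] < pos[v]]
            for u in parents:
                for w in parents:
                    if u != w:
                        induced[u].add(w)
                        induced[w].add(u)
        # Scan from the top of the ordering; any variable with more than b
        # parents in the induced graph goes into the b-cutset.
        return [v for v in order
                if len([u for u in induced[v] if pos[u] < pos[v]]) > b]

This simple pass is not guaranteed to produce a minimal b-cutset; a more careful variant would rebuild the induced graph each time a variable is moved into the cutset.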

Slide 20: Time-space tradeoff using a b-cutset
There is no guaranteed worst-case time improvement of elim-cond(b) over pure bucket elimination.
The size of the smallest cycle-cutset (1-cutset), c_1, and the smallest induced width, w*, obey c_1 >= w* - 1. Therefore 1 + c_1 >= w*, where the left-hand side is the exponent that determines the time complexity of elim-cond(b=1), while w* governs the complexity of bucket elimination.
More generally, c_i - c_{i+1} >= 1, which gives
1 + c_1 >= 2 + c_2 >= ... >= b + c_b >= ... >= w* + c_{w*} = w*.
We therefore get a hybrid scheme whose time complexity decreases as its space increases, until it reaches the induced width.

Slide 21: DCDR(b): elim-cond(b) on propositional theories, with elimination replaced by resolution.

Slide 22: Example of conditioning on A. Consider the theory: (~C v E)(A v B v C v D)(~A v B v E v D)(B v C v D).

Slide 23: [Figure only; no text on this slide.]

Slide 24: Example of DCDR(2).

Slide 25: DCDR(b): empirical results.

Slide 26: Hybrid, inference first: super-cluster tree elimination
Algorithm CTE is time exponential in the cluster size and space exponential in the separator size. Trade space for time by increasing cluster sizes and decreasing separator sizes: join clusters connected by fat separators.

Slide 27: Example.

Slide 28: Primary and secondary tree-decompositions.

Slide 29: Separator-based time-space tradeoff
Let T be a tree-decomposition of hypergraph H, and let s_0, s_1, ..., s_n be the sizes of the separators in T, listed in strictly descending order. With each separator size s_i we associate a secondary tree-decomposition T_i, generated by combining adjacent clusters whose separator sizes are strictly greater than s_i. Denote by r_i the size of the largest cluster (set of variables) in T_i; note that as s_i decreases, r_i increases.
Theorem: The complexity of CTE applied to T_i is O(n exp(r_i)) time and O(n exp(s_i)) space.
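A compact sketch of how a secondary decomposition T_i can be built, assuming the primary decomposition is given as `clusters` (a dict mapping a cluster id to its set of variables) and `edges` (the tree edges of T); `s` is the separator-size threshold s_i:

    def secondary_decomposition(clusters, edges, s):
        """Merge adjacent clusters whose separator has more than s variables."""
        parent = {c: c for c in clusters}
        def find(c):                                  # union-find with path halving
            while parent[c] != c:
                parent[c] = parent[parent[c]]
                c = parent[c]
            return c
        for u, v in edges:
            if len(clusters[u] & clusters[v]) > s:    # fat separator: merge its endpoints
                parent[find(u)] = find(v)
        merged = {}                                   # each component becomes one fatter cluster
        for c, variables in clusters.items():
            merged.setdefault(find(c), set()).update(variables)
        new_edges = {(find(u), find(v)) for u, v in edges if find(u) != find(v)}
        return merged, new_edges

Running CTE on the returned decomposition then costs O(n exp(r_i)) time and O(n exp(s_i)) space, as in the theorem above.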

Slide 30: Super-buckets: from a bucket tree to a join tree to a super-bucket tree.

Slide 31: Super-bucket elimination (Dechter and El Fattah, 1996). Several variables are eliminated "at once"; conditioning is done only in super-buckets.

Slide 32: Non-separable components: a special case of tree-decomposition
A connected graph G = (V, E) has a separation node v if there exist nodes a and b such that all paths connecting a and b pass through v. A graph that has a separation node is called separable; one that has none is called non-separable. A subgraph with no separation nodes is called a non-separable component, or a biconnected component.
A DFS algorithm can find all non-separable components, and the components form a tree structure.
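For reference, the DFS-based decomposition is available off the shelf; a minimal sketch using networkx (the library choice and the small example graph are assumptions, not from the slide):

    # Non-separable (biconnected) components and separation nodes via DFS.
    import networkx as nx

    G = nx.Graph()
    # A small illustrative constraint graph: a triangle A-B-C attached to a chain C-D-E.
    G.add_edges_from([("A", "B"), ("B", "C"), ("A", "C"), ("C", "D"), ("D", "E")])

    components = list(nx.biconnected_components(G))   # e.g. [{'A','B','C'}, {'C','D'}, {'D','E'}]
    cut_nodes = list(nx.articulation_points(G))       # separation nodes, here C and D

These components, glued at the separation nodes, form the tree structure that the super-bucket scheme on the following slides exploits.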

Slide 33: Decomposition into non-separable components
Assume a constraint network having unary, binary, and ternary constraints: R = { R_AD, R_AB, R_DC, R_BC, R_GF, D_G, D_F, R_EHI, R_CFE }.

Slide 34: Executing ATC (adaptive tree consistency).

Slide 35: Complexity
Theorem: If R = (X, D, C) is a constraint network whose constraint graph has non-separable components of size at most r, then the super-bucket elimination algorithm whose buckets are the non-separable components is time exponential in r, O(n exp(r)), and linear in space.

Slide 36: Hybrids of hybrids
hybrid(b_1, b_2): First, a tree-decomposition with separators bounded by b_1 is created; then the CTE algorithm is applied, but each clique is processed by elim-cond(b_2). If c*_{b_2} is the size of the maximum b_2-cutset over the cliques of the b_1-tree-decomposition, the algorithm is space exponential in b_1 but time exponential in c*_{b_2}.
Special cases: hybrid(b_1, 1) applies the cycle-cutset scheme in each clique; hybrid(b, b) takes b_1 = b_2. For b = 1, hybrid(1, 1) is the non-separable-components method using the cycle-cutset scheme in each component. Its space complexity is linear, but its time complexity can be much better than that of the cycle-cutset scheme or the non-separable-components approach alone.

Slide 37: Case study: circuit diagnosis
Problem: given a circuit and its unexpected output, identify the faulty components. The problem can be modeled as a constraint optimization problem and solved by bucket elimination.

Slide 38: Case study: combinatorial circuits, a benchmark used by the fault-diagnosis and testing community
Problem (as above): given a circuit and its unexpected output, identify the faulty components; modeled as a constraint optimization problem and solved by bucket elimination.

Slide 39: Case study: C432
A circuit's primal graph: for every gate, its inputs and outputs are connected. Join-tree of C432: the separator size is 23.

Slide 40: Join-tree of C3540 (1719 variables); maximum separator size 89.

Slide 41: Secondary trees for C432.

Slide 42: Time-space tradeoffs
Time is measured by the maximum of the separator size and the cutset size; space by the maximum separator size.

Slide 43: Constraint optimization: combinatorial auctions, bucket elimination vs. search
[Figure: an auction instance with bids b1 through b6.]
Bucket elimination = dynamic programming: in each bucket, sum the costs and maximize over the constrained assignments.
Search: branch-and-bound or best-first search.
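To contrast the two approaches on a toy instance, here is a minimal depth-first branch-and-bound for winner determination; the bids, items, and values are invented for illustration, and two bids conflict when they share an item.

    bids = {                      # bid -> (items requested, offered value)
        "b1": ({"x", "y"}, 8), "b2": ({"y", "z"}, 6), "b3": ({"z"}, 5),
        "b4": ({"w", "x"}, 7), "b5": ({"w"}, 3),      "b6": ({"u", "z"}, 4),
    }

    def branch_and_bound(order, chosen, taken_items, value, best):
        if value > best[0]:
            best[0], best[1] = value, list(chosen)    # new incumbent
        if not order:
            return
        if value + sum(bids[b][1] for b in order) <= best[0]:
            return                                    # bound: cannot beat the incumbent
        b, rest = order[0], order[1:]
        items, v = bids[b]
        if not (items & taken_items):                 # accept b if it conflicts with nothing taken
            branch_and_bound(rest, chosen + [b], taken_items | items, value + v, best)
        branch_and_bound(rest, chosen, taken_items, value, best)   # reject b

    best = [0, []]
    branch_and_bound(list(bids), [], set(), 0, best)
    print(best)                                       # optimal value and winning bids

Bucket elimination would instead sweep over the bids bucket by bucket, summing values and maximizing over assignments consistent with the item-disjointness constraints, trading the search's linear space for space exponential in the induced width.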