SAT Genealogy. Alexander Nadel, Intel, Haifa, Israel; The Technion, Haifa, Israel. July 3, 2012.



Agenda Introduction Early Days of SAT Solving Core SAT Solving Conflict Analysis and Learning Boolean Constraint Propagation Decision Heuristics Restart Strategies Inprocessing Extensions to SAT Incremental SAT Solving under Assumptions Simultaneous Satisfiability (SSAT) Diverse Solutions Generation High-level (group-oriented) MUC Extraction 2

Agenda Introduction Early Days of SAT Solving Core SAT Solving Conflict Analysis and Learning Boolean Constraint Propagation Decision Heuristics Restart Strategies Inprocessing Extensions to SAT Incremental SAT Solving under Assumptions Simultaneous Satisfiability (SSAT) Diverse Solutions Generation High-level (group-oriented) MUC Extraction 3 We won’t use implication graphs for explanation, but: Duality between search and resolution

What is SAT? Find a variable assignment (AKA a solution or model) that satisfies a propositional formula, or prove that there are no solutions. SAT solvers operate on CNF formulas; any formula can be reduced to CNF. Example CNF formula: F = (a + c)(b + c)(a' + b' + c'). Each disjunction, e.g. (a + c), is a clause; a', b', c' in the last clause are negative literals, while a, b, c elsewhere are positive literals. 4
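The CNF view above can be made concrete with a few lines of Python. This is an illustrative sketch, not code from the talk; it uses a DIMACS-style integer encoding (my assumption: a=1, b=2, c=3; a negative integer is a negated literal) and checks whether a total assignment satisfies every clause.

```python
# F = (a + c)(b + c)(a' + b' + c') with a=1, b=2, c=3 (hypothetical numbering).
formula = [[1, 3], [2, 3], [-1, -2, -3]]

def satisfies(formula, assignment):
    """Check whether a total assignment {var: bool} satisfies every clause:
    each clause must contain at least one literal assigned true."""
    return all(
        any(assignment[abs(lit)] == (lit > 0) for lit in clause)
        for clause in formula
    )

# a=1, b=1, c=0 is a model: the first two clauses via a and b, the last via c'.
print(satisfies(formula, {1: True, 2: True, 3: False}))  # True
print(satisfies(formula, {1: True, 2: True, 3: True}))   # False
```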

SAT: Theory and Practice. Theory: SAT is the first known NP-complete problem (Stephen Cook, 1971). One can check a solution in polynomial time; can one find a solution in polynomial time? The P = NP question… Practice: amazingly, nowadays SAT solvers can solve industrial problems having millions of clauses and variables. SAT has numerous applications in formal verification, planning, bioinformatics, combinatorics, … 5

Approaches to SAT Solving. Backtrack search: DFS search for a solution; the baseline approach for industrial-strength solvers, and our focus today. Look-ahead: BFS search for a solution; helpful for certain classes of formulas; recently there have been attempts to combine it with backtrack search. Local search: helpful mostly for randomly generated formulas. 6

Early Days of SAT Solving. Agenda: Resolution; Backtrack Search. 7

Resolution: a Way to Derive New Valid Clauses. Resolution is applied to a pair of clauses with exactly one pivot variable: a variable appearing in the two clauses in different polarities. For example, (a + b + c' + f) and (g + h' + c + f) resolve on the pivot c to yield the resolvent (a + b + g + h' + f). The resolvent clause is a logical consequence of the two source clauses. Resolution is known to be invented by Davis & Putnam, 1960; it had been invented independently by Löwenheim in the early 1900's (as well as the DP algorithm, presented next), according to Chvátal & Szemerédi, 1988 (JACM).
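The resolution rule above is a one-liner over clauses represented as sets of integer literals. A minimal sketch (the variable numbering a=1, b=2, c=3, f=4, g=5, h=6 is my assumption):

```python
def resolve(c_pos, c_neg, v):
    """Resolve two clauses (sets of int literals) on pivot variable v, which
    must occur positively in c_pos and negatively in c_neg. The resolvent is
    a logical consequence of the two source clauses."""
    assert v in c_pos and -v in c_neg
    return (c_pos - {v}) | (c_neg - {-v})

# The slide's example: (g + h' + c + f) resolved with (a + b + c' + f) on c
# gives (a + b + g + h' + f), i.e. {1, 2, 5, -6, 4}.
print(sorted(resolve({5, -6, 3, 4}, {1, 2, -3, 4}, 3)))  # [-6, 1, 2, 4, 5]
```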

DP Algorithm: Davis & Putnam, 1960. Remove the variables one by one by resolution over all the clauses containing that variable. If the empty clause ( ) is derived, the formula is UNSAT; if all variables are eliminated without deriving it, the formula is SAT. E.g., for (a + b)(a + b')(a' + c)(a' + c')(a + b + c)(b + c' + f')(b' + e)(a + c + e)(c' + e + f)(a + e + f): resolving (a + b) with (a + b') yields (a); resolving (a) with (a' + c) and (a' + c') yields (c) and (c'), which resolve to the empty clause ( ): UNSAT. DP is sound and complete.
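One DP elimination step can be sketched as follows: replace all clauses mentioning a variable by all resolvents of its positive and negative occurrences. This is an illustrative sketch (clause representation and variable numbering are my assumptions), run on the four-clause core of the slide's example:

```python
from itertools import product

def eliminate(formula, var):
    """One DP step: drop all clauses containing var or -var and add every
    resolvent of a positive occurrence with a negative one. Clauses are
    frozensets of int literals; tautological resolvents are discarded."""
    pos = [c for c in formula if var in c]
    neg = [c for c in formula if -var in c]
    rest = [c for c in formula if var not in c and -var not in c]
    resolvents = []
    for cp, cn in product(pos, neg):
        r = (cp - {var}) | (cn - {-var})
        if not any(-lit in r for lit in r):  # skip tautologies
            resolvents.append(r)
    return rest + resolvents

# (a + b)(a + b')(a' + c)(a' + c') with a=1, b=2, c=3: eliminate b, then a, then c.
f = [frozenset(c) for c in ([1, 2], [1, -2], [-1, 3], [-1, -3])]
for v in (2, 1, 3):
    f = eliminate(f, v)
print(frozenset() in f)  # True: the empty clause was derived, so UNSAT
```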

Backtrack Search, or DLL: Davis-Logemann-Loveland, 1962
Example formula: (a + b)(b' + c)(b' + c')(a' + b)
Decide a' (decision level 1): a is the decision variable; a' is the decision literal.
Decide b' (decision level 2). A conflict: a blocking clause, i.e. a clause falsified by the current assignment, is encountered: (a + b).
Backtrack and flip: b. Decide c'. A conflict: (b' + c) is falsified.
Flip: c. A conflict: (b' + c') is falsified; all branches under a' are exhausted.
Backtrack and flip: a. Decide b, then decide c'. A conflict: (b' + c). Flip: c. A conflict: (b' + c').
Flip: b'. A conflict: (a' + b). All branches are exhausted: UNSAT!
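The walkthrough above can be condensed into a minimal recursive backtrack-search sketch. This is illustrative only (no BCP, no learning, a trivial variable-order stand-in for a decision heuristic), using the DIMACS-style encoding a=1, b=2, c=3:

```python
def dll(clauses, assignment=None):
    """Minimal DLL sketch: pick an unassigned variable, try both polarities,
    backtrack on conflict. Clauses are lists of int literals; returns a model
    dict {var: bool} or None for UNSAT."""
    if assignment is None:
        assignment = {}
    # Conflict: some clause has all of its literals assigned to false.
    for clause in clauses:
        if all(abs(l) in assignment and assignment[abs(l)] != (l > 0)
               for l in clause):
            return None
    free = {abs(l) for c in clauses for l in c} - assignment.keys()
    if not free:
        return assignment  # total assignment, no falsified clause: a model
    v = min(free)          # trivial stand-in for a decision heuristic
    for value in (False, True):
        model = dll(clauses, {**assignment, v: value})
        if model is not None:
            return model
    return None            # both polarities failed: backtrack

# The slides' running example (a + b)(b' + c)(b' + c')(a' + b):
print(dll([[1, 2], [-2, 3], [-2, -3], [-1, 2]]))  # None: UNSAT
```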

Core SAT Solving: the Principles. DLL could solve problems with <2000 clauses; how can modern SAT solvers solve problems with millions of clauses and variables? The major principles: learning and pruning (block already explored paths); locality and dynamicity (focus the search on the relevant data); well-engineered data structures (extremely fast propagation). 24

Agenda Introduction Early Days of SAT Solving Core SAT Solving Conflict Analysis and Learning Boolean Constraint Propagation Decision Heuristics Restart Strategies Inprocessing Extensions to SAT Incremental SAT Solving under Assumptions Simultaneous Satisfiability (SSAT) Diverse Solutions Generation High-level (group-oriented) MUC Extraction 25

Duality between Basic Backtrack Search and Resolution One can associate a resolution derivation with every invocation of DLL over an unsatisfiable formula

Duality between Basic Backtrack Search and Resolution
Running example: (a + b)(b' + c)(b' + c')(a' + b).
A parent clause P(x) is associated with every flip operation for variable x. It contains the flipped literal and a subset of previously assigned falsified literals. The parent clause justifies the flip: its existence proves that the explored subspace has no solutions.
Backtracking over a flipped variable x can be associated with a resolution operation: P = P(x) ⊗ P, where P is initialized with the last blocking clause; the resolvent is to become the parent clause for the upcoming flip.
When the parent clause P(x) is derived by resolution, the resolution proof π(x) of the parent clause is called the parent resolution.
The final trace of DLL is both a decision tree (top-down view) and a resolution refutation (bottom-up view).
Variables associated with the edges are both decision variables in the tree and pivot variables for the resolution.
A forest of parent resolutions is maintained; the forest converges to one resolution refutation in the end (for an UNSAT formula).

Conflict Clause Recording
The idea: update the instance with conflict clauses, that is, some of the clauses generated by resolution. Introduced in SAT by Bayardo & Schrag, 1997 (rel_sat).
Had such a clause been recorded earlier in the run, the part of the search tree it blocks would not have been explored: that part is redundant.

Conflict Clause Recording. Most of the modern solvers record every non-trivial parent clause (since Chaff).

Enhancing CCR: Local Conflict Clause Recording
The parent-based scheme is asymmetric w.r.t. polarity selection. Solution: record an additional local conflict clause: a would-be conflict clause if the last polarity selection were flipped. Dershowitz & Hanna & Nadel, 2007 (Eureka).

Managing Conflict Clauses. Keeping too many clauses slows down the solver, so deleting irrelevant clauses is very important. Some of the strategies: size-based: remove too-long clauses (Marques-Silva & Sakallah, 1996, GRASP); age-based: remove clauses that haven't recently been used in BCP (Goldberg & Novikov, 2002, Berkmin); locality-based (glue): remove clauses whose literals are assigned far away from each other in the search tree (Audemard & Simon, 2009, Glucose). 55
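The glue measure behind the locality-based strategy is simple to compute. A sketch (the trail in the example is hypothetical): the Literal Block Distance of a learned clause is the number of distinct decision levels among its literals, and clauses with low glue are kept.

```python
def lbd(clause, level_of):
    """Literal Block Distance ('glue') of a clause: the number of distinct
    decision levels among its literals (Audemard & Simon, 2009).
    `level_of` maps variable -> decision level at which it was assigned.
    Low glue = literals assigned close together = clause worth keeping."""
    return len({level_of[abs(lit)] for lit in clause})

# Hypothetical trail: vars 1 and 2 assigned at level 1; 3 at level 2; 4 at level 3.
levels = {1: 1, 2: 1, 3: 2, 4: 3}
print(lbd([-1, 2, 3], levels))  # 2: two distinct levels, a "gluey" clause
print(lbd([-1, 3, 4], levels))  # 3: three distinct levels
```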

Modern Conflict Analysis. Next, we present two techniques commonly used in modern SAT solvers: non-chronological backtracking (NCB), introduced in GRASP, and the 1UIP scheme, introduced in GRASP & Chaff. Both techniques prune the search tree and the associated forest of parent resolutions.

Non-Chronological Backtracking (NCB)
NCB is an additional pruning operation before flipping: eliminate all the decision levels adjacent to the decision level of the flipped literal, so that the parent clause is still falsified.
Example: assume we are about to flip a. The in-between decision levels that are irrelevant to a's parent clause are eliminated first, and only then is a flipped.

1UIP Scheme
The 1UIP scheme consists of:
A stopping condition for backtracking: stop whenever P contains exactly one variable of the last decision level, called the 1UIP variable.
A rewriting operation: consider the 1UIP variable as a decision variable and P as its parent clause.
A pruning technique: eliminate all the disconnected variables of the last decision level (along with their parent resolutions).

Agenda Introduction Early Days of SAT Solving Core SAT Solving Conflict Analysis and Learning Boolean Constraint Propagation Decision Heuristics Restart Strategies Inprocessing Extensions to SAT Incremental SAT Solving under Assumptions Simultaneous Satisfiability (SSAT) Diverse Solutions Generation High-level (group-oriented) MUC Extraction 68

Boolean Constraint Propagation (BCP)
The unit clause rule: a clause is unit if all of its literals but one are assigned to 0 and the remaining literal is unassigned. E.g., a + b' + c is unit under a = 0, b = 1, c unassigned, forcing c = 1.
BCP: pick the unassigned literals of unit clauses as implied assignments whenever possible. 80-90% of the running time of modern SAT solvers is spent in BCP. Introduced already in the original DLL. 69
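The unit clause rule can be sketched as a naive fixed-point loop (illustrative only; real solvers use the watched-literal scheme of the next slides instead of rescanning every clause):

```python
def bcp(clauses, assignment):
    """Naive unit propagation: repeatedly find unit clauses and assign their
    remaining literal. Returns the extended assignment, or None on conflict
    (a clause with all literals false)."""
    assignment = dict(assignment)
    changed = True
    while changed:
        changed = False
        for clause in clauses:
            unassigned = [l for l in clause if abs(l) not in assignment]
            satisfied = any(assignment.get(abs(l)) == (l > 0)
                            for l in clause if abs(l) in assignment)
            if satisfied:
                continue
            if not unassigned:
                return None           # all literals false: conflict
            if len(unassigned) == 1:  # unit clause: forced assignment
                lit = unassigned[0]
                assignment[abs(lit)] = lit > 0
                changed = True
    return assignment

# The slide's example (a + b' + c) with a = 0, b = 1: the clause is unit,
# forcing c = 1 (encoding a=1, b=2, c=3).
print(bcp([[1, -2, 3]], {1: False, 2: True}))  # {1: False, 2: True, 3: True}
```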

Data Structures for Efficient BCP. Naïve approach: for each clause, hold pointers to all its literals. How can the number of clause visits be minimized? A clause can become unit only when all of its literals but one are assigned to 0; for an N-literal clause, this can occur only after N-1 of the literals have been assigned to 0. So, theoretically, one could completely ignore the first N-2 assignments to the clause's literals. The solution: pick two literals in each clause to watch, and thus ignore any assignments to the other literals in the clause. Introduced by Zhang, 1997 (the SATO solver); enhanced by Moskewicz & Madigan & Zhao & Zhang & Malik, 2001 (Chaff). 70

Watched Lists: Example
Consider an 8-literal clause over a, b, c, d, e, f, g, h, with two watches (W) placed on two of its literals.
a is assigned 0, and a is watched: the clause is visited; the corresponding watch moves to any unassigned literal. No pointers to the previously visited literals are saved.
c is assigned 0 (unwatched): the clause is not visited!
g and e are assigned 0 (unwatched): the clause is not visited!
h is assigned 0, and h is watched: the clause is visited; the watch moves to an unassigned literal.
f is assigned 0 (unwatched): no visit.
b is assigned 0: the watched literal b is visited, and it is identified that the clause became unit!
Backtracking: b is unassigned, and the watches do not move; there is no need to visit the clause during backtracking. The same holds when f is unassigned.
When all the literals are unassigned, the watch pointers do not get back to their initial positions.
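The core of a watched-literal visit can be sketched in a few lines (an illustrative sketch, not Chaff's implementation; all names are hypothetical): when a watched literal becomes false, try to move the watch to some other non-false literal, and only if none exists report the other watch as the unit/conflicting literal.

```python
def on_literal_false(clause, watches, assignment, false_lit):
    """Visit a clause because its watched literal `false_lit` became false.
    `watches` is the mutable pair of watched literals of this clause.
    Either the watch moves (no further visits needed for other literals),
    or every literal but the other watch is false: unit or conflict."""
    other = watches[0] if watches[1] == false_lit else watches[1]
    for lit in clause:
        lit_false = assignment.get(abs(lit)) == (lit < 0)
        if lit != other and lit != false_lit and not lit_false:
            watches[watches.index(false_lit)] = lit  # move the watch, stop
            return ("watch moved", lit)
    return ("unit or conflict", other)

clause, watches = [1, 2, 3], [1, 2]   # (a + b + c), watching a and b
print(on_literal_false(clause, watches, {1: False}, 1))  # ('watch moved', 3)
```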

Watched Lists: Caching. Chu & Harwood & Stuckey, 2008: divide the clauses into various cache levels to improve cache performance. Most of the modern solvers store one literal of each clause inside the watch list itself, and use special data structures for clauses of length 2 and 3. 87

Agenda Introduction Early Days of SAT Solving Core SAT Solving Conflict Analysis and Learning Boolean Constraint Propagation Decision Heuristics Restart Strategies Inprocessing Extensions to SAT Incremental SAT Solving under Assumptions Simultaneous Satisfiability (SSAT) Diverse Solutions Generation High-level (group-oriented) MUC Extraction 88

Decision Heuristics Which literal should be chosen at each decision point? Critical for performance!

Old Days' Static Decision Heuristics. Go over all clauses that are not yet satisfied; compute some function f for each literal, based on frequency; choose the literal with maximal f.

Variable-based Dynamic Heuristics: VSIDS. VSIDS was the first dynamic heuristic (Chaff). Each literal is associated with a counter, initialized to its number of occurrences in the input. A literal's counter is increased when it participates in a conflict clause; occasionally, all counters are halved. The literal with the maximal counter is chosen. A breakthrough compared to static heuristics: dynamic (focuses the search on recently used variables and clauses) and with extremely low overhead.
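The VSIDS bookkeeping described above fits in a small class. A sketch under stated assumptions (DIMACS-style literals; class and method names are mine, not Chaff's):

```python
class VSIDS:
    """Sketch of VSIDS: per-literal activity counters, initialized from
    occurrence counts, bumped on conflict participation, occasionally halved."""
    def __init__(self, formula):
        self.score = {}
        for clause in formula:
            for lit in clause:                   # initial occurrence counts
                self.score[lit] = self.score.get(lit, 0) + 1

    def on_conflict(self, conflict_clause):
        for lit in conflict_clause:              # bump participating literals
            self.score[lit] = self.score.get(lit, 0) + 1

    def decay(self):
        for lit in self.score:                   # occasionally halve counters
            self.score[lit] //= 2

    def pick(self, assigned_vars):
        """Choose the unassigned literal with the maximal counter."""
        free = {l: s for l, s in self.score.items()
                if abs(l) not in assigned_vars}
        return max(free, key=free.get) if free else None

h = VSIDS([[1, 3], [2, 3], [-1, -2, -3]])
h.on_conflict([3, -1])
print(h.pick(set()))  # 3: frequent in the input and recently bumped
```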

Enhancements to VSIDS. Adjusting the scope: increase the scores for every literal in the newly generated parent resolution (Berkmin). Additional dynamicity: multiply the scores by 95% after each conflict, rather than occasionally halving them. Eén & Sörensson, 2003 (Minisat). 92

The Clause-Based Heuristic (CBH) The idea: use relevant clauses for guiding the decision heuristic The Clause-Based Heuristic or CBH (Eureka) All the clauses (both initial and conflict clauses) are organized in a list The next variable is chosen from the top-most unsatisfied clause After a conflict: All the clauses that participate in the newly derived parent resolution are moved to the top, then The conflict clause is placed at the top Partial clause-based heuristics: Berkmin, HaifaSAT

CBH: More. CBH is even more dynamic than VSIDS: it prefers variables from very recent conflicts. CBH also tends to pick interrelated variables: variables whose joint assignment increases the chances of satisfying clauses in satisfiable branches and of quickly reaching conflicts in unsatisfiable branches. Variables appearing in the same clause are interrelated: picking variables from the same clause results either in the clause becoming satisfied or in a contradiction. 94

Polarity Selection. Phase saving (Strichman, 2000; Pipatsrisawat & Darwiche, 2007, RSAT): assign a new decision variable the last polarity it was assigned: dynamicity rules again. 95

Decision Heuristics: the Current Status Everybody uses phase saving Most of the SAT solvers use VSIDS Intel’s Eureka uses CBH for most of the instances and VSIDS for tiny instances only We plan to compare VSIDS and CBH thoroughly in our new solver Fiver 96

Core SAT Solving: the Major Enhancements to DLL. Boolean Constraint Propagation; Conflict Analysis and Learning; Decision Heuristics; Restart Strategies; Pre- and In-processing. 97 The slides on restarts are based on Vadim Ryvchin's SAT'08 presentation.

Agenda Introduction Early Days of SAT Solving Core SAT Solving Conflict Analysis and Learning Boolean Constraint Propagation Decision Heuristics Restart Strategies Inprocessing Extensions to SAT Incremental SAT Solving under Assumptions Simultaneous Satisfiability (SSAT) Diverse Solutions Generation High-level (group-oriented) MUC Extraction 98

99 Restarts. The solver backtracks to decision level 0 when certain criteria are met; restarts have a crucial impact on performance. Motivation: dynamicity, i.e. refocusing the search on relevant data: variables identified as important will be picked first by the decision heuristic after the restart; also, avoid spending too much time in 'bad' branches.

100 Restart Criteria. Restart after a certain number of conflicts has been encountered, counted either since the previous restart (global; Gomes & Selman & Kautz, 1998) or above a certain decision level (local; Ryvchin & Strichman, 2008). Next: methods to calculate the threshold on the number of conflicts; they hold for both global and local schemes.

101 Restart Strategies. 1. Arithmetic (or fixed) series. Parameters: x, y. Init(t) = x; Next(t) = t + y.

102 Restart Strategies (cont.) 2. Luby et al. series. Parameter: x. Init(t) = x; Next(t) = t_i · x, where t_i is the Luby sequence 1, 1, 2, 1, 1, 2, 4, 1, 1, 2, 1, 1, 2, 4, 8, … Ruan & Horvitz & Kautz, 2003.
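The Luby sequence has a simple closed recurrence: t_i = 2^(k-1) when i = 2^k - 1, and t_i = t_{i - 2^(k-1) + 1} otherwise. A sketch of a 1-based generator (my implementation, not from the slides):

```python
def luby(i):
    """i-th element (1-based) of the Luby sequence 1,1,2,1,1,2,4,1,1,2,...
    The conflict limit before restart number r is then luby(r) * x."""
    k = 1
    while (1 << k) - 1 < i:      # smallest k with i <= 2^k - 1
        k += 1
    if (1 << k) - 1 == i:        # i is exactly 2^k - 1
        return 1 << (k - 1)
    return luby(i - ((1 << (k - 1)) - 1))  # recurse into the repeated prefix

print([luby(i) for i in range(1, 16)])
# [1, 1, 2, 1, 1, 2, 4, 1, 1, 2, 1, 1, 2, 4, 8]
```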

103 Restart Strategies (cont.) 3. Inner-Outer Geometric series. Parameters: x, y, z. Init(t) = x. If t·y < z then Next(t) = t·y; else Next(t) = x and Next(z) = z·y. Armin Biere, 2007 (Picosat).

Agenda Introduction Early Days of SAT Solving Core SAT Solving Conflict Analysis and Learning Boolean Constraint Propagation Decision Heuristics Restart Strategies Inprocessing Extensions to SAT Incremental SAT Solving under Assumptions Simultaneous Satisfiability (SSAT) Diverse Solutions Generation High-level (group-oriented) MUC Extraction 104

Preprocessing and Inprocessing. The idea: simplify the formula prior to the search (pre-) and during it (in-). History: Freeman, 1995 (POSIT): first mention of preprocessing in the context of SAT. Eén & Biere, 2005 (SatELite): a commonly used efficient preprocessing procedure. Heule & Järvisalo & Biere: a series of papers on inprocessing, used in the current state-of-the-art solvers Lingeling and CryptoMinisat. Nadel & Ryvchin & Strichman, 2012: apply SatELite in incremental SAT solving. 105

Inprocessing Techniques. SatELite: Subsumption: remove the clause (C + D) if (C) exists. Self-subsuming resolution: replace (D + l') by (D) if (C + l) exists such that C ⊆ D. Variable elimination: apply DP to variables whose elimination does not increase the number of clauses; example: (a + b)(a + b')(a' + c)(a' + c') → (a)(a' + c)(a' + c'). An example of other techniques: failed literal elimination with BCP: repeat for a certain subset of literals on decision level 0: propagate a literal l with BCP; if a conflict emerges, l must be 0, and the formula can be simplified. 106
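Subsumption and self-subsuming resolution reduce to set operations over clauses. A sketch (not SatELite's implementation; variable numbering a=1, b=2, c=3 is my assumption):

```python
def subsumes(c, d):
    """C subsumes D if C is a subset of D: D is then redundant."""
    return c <= d

def self_subsuming_resolvent(c, d, lit):
    """Self-subsuming resolution sketch: if (C + lit) exists with C a subset
    of D, then (D + ~lit) can be strengthened to (D). Clauses are sets of int
    literals; returns the strengthened clause, or None if the rule doesn't apply."""
    if lit in c and -lit in d and (c - {lit}) <= (d - {-lit}):
        return d - {-lit}
    return None

# (a + b) subsumes (a + b + c):
print(subsumes({1, 2}, {1, 2, 3}))                      # True
# (a + c) strengthens (a + b + c') to (a + b): resolve on c, {a} ⊆ {a, b}.
print(self_subsuming_resolvent({1, 3}, {1, 2, -3}, 3))  # {1, 2}
```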

Agenda Introduction Early Days of SAT Solving Core SAT Solving Conflict Analysis and Learning Boolean Constraint Propagation Decision Heuristics Restart Strategies Inprocessing Extensions to SAT Incremental SAT Solving under Assumptions Simultaneous Satisfiability (SSAT) Diverse Solutions Generation High-level (group-oriented) MUC Extraction 107

Extensions to SAT Nowadays, SAT solving is much more than finding one solution to a given problem Extensions to SAT: Incremental SAT under assumptions Simultaneous SAT (SSAT): SAT over multiple properties at once Diverse solution generation Minimal Unsatisfiable Core (MUC) extraction Push/pop support Model minimization ALL-SAT XOR clauses support ISSAT: assumptions are implications … 108


Incremental SAT Solving under Assumptions The challenge: speed up the solving of related SAT instances by enabling re-use of relevant data Incremental SAT solving has numerous applications Next, we review a prominent application in Formal Verification of Hardware

Reasoning about Circuit Properties with SAT-based Bounded Model Checking (BMC) BMC: given a circuit and a property, does the property hold for the first n cycles? Unroll: generate a combinational instantiation of the circuit for each cycle Run a SAT solver for each cycle over: the translation of the unrolled circuit to CNF, and the negation of the property at that cycle The property holds for n cycles iff all the SAT solver invocations return UNSAT
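The loop above can be sketched as follows. The enumeration-based `brute_sat` is a toy stand-in for a real SAT solver, and the one-latch design below is a made-up example, not the circuit of the following slides:

```python
from itertools import product

def brute_sat(clauses, n_vars):
    """Complete SAT check by enumeration (a stand-in for a real solver).
    Clauses are lists of nonzero ints; literal l is true when
    variable abs(l) equals (l > 0)."""
    for bits in product([False, True], repeat=n_vars):
        if all(any(bits[abs(l) - 1] == (l > 0) for l in cl) for cl in clauses):
            return list(bits)
    return None

def bmc(unroll, neg_property, n_cycles):
    """The property holds for n cycles iff every per-cycle check is UNSAT."""
    for k in range(n_cycles):
        clauses, n_vars = unroll(k)                  # CNF of the circuit unrolled to cycle k
        if brute_sat(clauses + neg_property(k), n_vars) is not None:
            return k                                 # counterexample at cycle k
    return None                                      # property holds for all n cycles

# Toy design: one latch x, initialized to 1, whose next-state function is
# the identity; variable i+1 encodes the value of x at cycle i.
def unroll(k):
    clauses = [[1]]                                  # initial value: x_0 = 1
    for i in range(1, k + 1):
        clauses += [[i + 1, -i], [-(i + 1), i]]      # x_i <-> x_{i-1}
    return clauses, k + 1

def neg_property(k):
    return [[-(k + 1)]]                              # negation of "x = 1 at cycle k"

result = bmc(unroll, neg_property, 4)                # None: the property holds for 4 cycles
```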

BMC Example [Circuit diagram: inputs a, b; a latch c; gates g = a ∧ b and h = g ∧ c] The property: b' → h'

BMC Example: Cycle 0 [Unrolled circuit for cycle 0: the latch c is replaced by a user-given initial value c_i] The property: b' → h'

BMC Example: Cycle 0 CNF of the unrolled circuit: (h + g' + c_i')(h' + g)(h' + c_i)(g + a' + b')(g' + a)(g' + b) The negation of the property b' → h': (b')(h) UNSAT! The property: b' → h'
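The cycle-0 check can be reproduced with a toy enumeration-based SAT check (a stand-in for a real solver; the variable numbering a=1, b=2, c_i=3, g=4, h=5 is my own choice for this sketch):

```python
from itertools import product

# Cycle-0 CNF of the unrolled circuit, with a=1, b=2, c_i=3, g=4, h=5:
CIRCUIT = [
    [5, -4, -3], [-5, 4], [-5, 3],   # h = g AND c_i
    [4, -1, -2], [-4, 1], [-4, 2],   # g = a AND b
]
NEG_PROPERTY = [[-2], [5]]           # negation of b' -> h': b = 0 and h = 1

def brute_sat(clauses, n_vars):
    """Complete SAT check by enumeration (a stand-in for a real solver)."""
    for bits in product([False, True], repeat=n_vars):
        if all(any(bits[abs(l) - 1] == (l > 0) for l in cl) for cl in clauses):
            return bits
    return None

cycle0 = brute_sat(CIRCUIT + NEG_PROPERTY, 5)   # None: UNSAT, the property holds at cycle 0
```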

BMC Example: Cycle 1 [Unrolled circuit for cycles 0 and 1: the cycle-1 copy uses variables a_x, b_x, c_x, g_x, h_x, with the latch input c_x driven by h from cycle 0] The property: b' → h'

BMC Example: Cycle 1 Cycle-0 CNF: (h + g' + c_i')(h' + g)(h' + c_i)(g + a' + b')(g' + a)(g' + b) Latch connection: (c_x + h')(c_x' + h) Cycle-1 CNF: (g_x + a_x' + b_x')(g_x' + a_x)(g_x' + b_x)(h_x + g_x' + c_x')(h_x' + g_x)(h_x' + c_x) The negation of the property b_x' → h_x': (b_x')(h_x) UNSAT! The property: b' → h'

Re-Using Relevant Information from Previous Cycles The property: b' → h' The clauses partition into four sets: C_0, the cycle-0 circuit clauses; S_0 = (b')(h), cycle-0-specific; C_1, the cycle-1 circuit clauses (the c_x, g_x, h_x clauses above); S_1 = (b_x')(h_x), cycle-1-specific C_0 and C_1 hold globally S_0 and S_1 hold solely for a particular cycle

Pervasive Clause Learning; Marques-Silva&Sakallah, 1997 (GRASP); Strichman, 2001 Cycle 0: create a CNF instance C_0 ∧ S_0 and solve it Let C_0* be the set of pervasive conflict clauses, that is, conflict clauses that depend only on C_0 Cycle 1: create a CNF instance C_0 ∧ C_1 ∧ S_1 ∧ C_0* and solve it

Example: on the instance above, C_0* can contain the pervasive conflict clause (a + h'), which depends only on C_0 and therefore remains valid at every cycle

Incremental SAT Solving under Assumptions; Eén&Sörensson, 2003 (Minisat) Cycle 0: create a CNF instance C_0 and solve it under the assumptions S_0 The S_0 clauses are not part of the instance; instead, the literals of S_0 are used as the first decisions, or assumptions The solver stops whenever one of the assumptions must be flipped Cycle 1: add the clauses C_1 to the same instance and solve under the assumptions S_1
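A sketch of this MiniSat-style interface on the running example. `ToySolver` is a hypothetical enumeration-based stand-in for an incremental solver: its clauses persist across `solve` calls, while assumptions apply to one call only (variable numbering: a=1, b=2, c_i=3, g=4, h=5, a_x=6, b_x=7, c_x=8, g_x=9, h_x=10):

```python
from itertools import product

class ToySolver:
    """Enumeration-based stand-in for an incremental SAT solver with a
    MiniSat-style interface: clauses persist, assumptions do not."""
    def __init__(self):
        self.clauses = []
        self.n_vars = 0

    def add_clause(self, clause):
        self.clauses.append(clause)
        self.n_vars = max(self.n_vars, *(abs(l) for l in clause))

    def solve(self, assumptions=()):
        cnf = self.clauses + [[l] for l in assumptions]  # assumptions as temporary units
        for bits in product([False, True], repeat=self.n_vars):
            if all(any(bits[abs(l) - 1] == (l > 0) for l in c) for c in cnf):
                return True
        return False

# Cycle 0: load C_0 and solve under the assumptions S_0 = {b', h}.
s = ToySolver()
for c in [[5, -4, -3], [-5, 4], [-5, 3], [4, -1, -2], [-4, 1], [-4, 2]]:
    s.add_clause(c)
res0 = s.solve(assumptions=[-2, 5])    # False: the cycle-0 check is UNSAT

# Cycle 1: add C_1 to the SAME instance, then solve under S_1 = {b_x', h_x}.
for c in [[8, -5], [-8, 5],                      # c_x = h
          [9, -6, -7], [-9, 6], [-9, 7],         # g_x = a_x AND b_x
          [10, -9, -8], [-10, 9], [-10, 8]]:     # h_x = g_x AND c_x
    s.add_clause(c)
res1 = s.solve(assumptions=[-7, 10])   # False: the cycle-1 check is UNSAT
```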

Incremental SAT Solving: More Minisat's method is the state of the art Advantages: it re-uses a single solver instance, so the heuristics are incremental, and all the clauses are re-used GRASP's method advantage: assumptions are unit clauses, so preprocessing can use them to simplify the formula Until recently, incremental SAT solving was not compatible with preprocessing Nadel&Ryvchin&Strichman, 2012: make incremental SAT solving compatible with SatELite and show a way to treat assumptions efficiently


Simultaneous SAT (SSAT) A SAT-based algorithm to efficiently solve chunks of related properties in one SAT solver invocation For example, one can solve multiple properties during BMC Khasidashvili&Nadel&Palti&Hanna, 2005 Khasidashvili&Nadel,

Example: Solve Both p_1 and p_2 [Diagram: two properties p_1 and p_2 with overlapping cones of influence C_1 and C_2]

Incremental SAT-based Approach Translate C_1 to CNF formula F Solve F under the assumption p_1' Update F with the clause projection of C_2 \ C_1 Solve F under the assumption p_2'

SSAT Approach Translate both C_1 and C_2 to CNF formula F Find the status of both p_1 and p_2 in the same invocation of the SAT solver

Advantages of the SSAT Approach over the Incremental SAT-based Approach Looks at all the properties at once One solution can falsify more than one property May find conflict clauses (lemmas) relevant for solving many POs

SSAT: the Algorithm Interface Input: a combinational formula F (in CNF); a list of proof objectives (POs) p_1, p_2, …, p_n Output: each p_i is either falsifiable (a model to F with p_i = 0 exists, i.e., F ∧ p_i' is SAT) or valid (p_i always holds given F, i.e., F ∧ p_i' is UNSAT)

SSAT Algorithm Interface Example F = (a + b) ∧ c' ∧ a' POs: a, b, c, a', b', c' a is falsifiable: a = 0; b = 1; c = 0 is the model b is valid: there is no model to F where b = 0 In other words, (a + b) ∧ c' ∧ a' ∧ b' is UNSAT c is falsifiable: a = 0; b = 1; c = 0 is the model a' is valid: no model to F where a = 1 b' is falsifiable with a = 0; b = 1; c = 0 c' is valid: no model to F where c = 1 Both l and l' may be falsifiable Example: F = a + b; PO: a

Basic SSAT Algorithm SSAT(F; P = {p_1, p_2, …, p_n}) F is initialized with the clause projection of the union of the cones of all the properties While P is non-empty: Pick any s ∈ P Solve F under the assumption s' If satisfiable by a satisfying assignment σ: T := {s} ∪ {other POs in P falsified by σ}; return to the user that the POs in T are falsifiable; P := P \ T If unsatisfiable: return that s is valid; P := P \ {s}

SSAT: More How to boost SSAT: take further advantage of reasoning about all the POs at once Pick all the POs as decision variables and assign them 0 Fairness: rotate unsolved POs Set an internal time threshold for an attempt to solve one PO When the threshold expires: move the unresolved PO to the end of the unsolved POs list and switch to another PO SSAT is widely used at Intel Applied as the core reasoning engine for simultaneous model checking algorithms we developed


DiversekSet: Generating Diverse Solutions DiversekSet in SAT: generate a user-given number of diverse solutions, given a CNF formula Nadel, 2011 The problem has multiple applications at Intel

Application: Semi-Formal FPV [Diagram: the formal search from the initial states is limited by the max FV bound; generating new, diverse initial states lets the search reach deep bugs]

Multi-Threaded Search to Enhance Coverage Choosing a single path through waypoints may miss the bug One must search along multiple diverse paths

Diversification Quality as the Average Hamming Distance Quality: the average Hamming distance between the solutions, normalized to [0…1] [Worked example, built up over several slides: four solutions over the variables a, b, c; a matrix of the pairwise Hamming distances between the solutions is filled in, and the quality is the average entry divided by the number of variables]

Algorithms for DiversekSet in SAT at a Glance The idea: adapt a modern CDCL SAT solver for DiversekSet, making minimal changes so as to remain efficient Compact algorithms: invoke the SAT solver once to generate all the solutions; restart after a solution is generated; modify the polarity and variable selection heuristics to generate diverse solutions

Algorithms for DiversekSet in SAT at a Glance, Cont. Polarity-based algorithms: change solely the polarity selection heuristic pRand: pick the polarity randomly pGuide: pick the polarity so as to improve the diversification quality: balance the number of 0's and 1's assigned to a variable by picking the value v ∈ {0,1} when the variable was assigned v' more times pGuide outperforms pRand in terms of both diversification quality and performance Quality can be improved further by taking BCP into account and adapting the variable ordering
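A sketch of the pGuide balancing rule in isolation (the surrounding CDCL machinery is omitted, and the class name `PGuidePolarity` is hypothetical):

```python
from collections import defaultdict

class PGuidePolarity:
    """Sketch of the pGuide idea: when deciding a variable, pick the
    polarity it has received LESS often in previously generated
    solutions, balancing 0s and 1s across solutions to raise the
    average Hamming distance."""
    def __init__(self):
        self.count = defaultdict(lambda: [0, 0])   # var -> [#times 0, #times 1]

    def record_solution(self, assignment):
        for var, value in assignment.items():
            self.count[var][int(value)] += 1

    def pick_polarity(self, var):
        zeros, ones = self.count[var]
        if zeros > ones:
            return 1                               # assigned 0 more often: try 1
        if ones > zeros:
            return 0                               # assigned 1 more often: try 0
        return 0                                   # tie: arbitrary choice

g = PGuidePolarity()
g.record_solution({'a': 1, 'b': 0})
g.record_solution({'a': 1, 'b': 0})
next_a = g.pick_polarity('a')   # 'a' was 1 twice: balance by trying 0
next_b = g.pick_polarity('b')   # 'b' was 0 twice: balance by trying 1
```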


Unsatisfiable Core Extraction An unsatisfiable core is an unsatisfiable subset of an unsatisfiable set of constraints An unsatisfiable core is minimal if the removal of any constraint makes it satisfiable (a local minimum) Has numerous applications
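Deletion-based extraction is one of the simplest ways to obtain a minimal core: try to drop each constraint in turn and keep it out whenever the rest stays unsatisfiable. A sketch over the clause-level example used later in these slides, with enumeration standing in for a SAT solver:

```python
from itertools import product

def is_sat(clauses, n_vars):
    """Complete SAT check by enumeration (a stand-in for a real solver)."""
    return any(all(any(bits[abs(l) - 1] == (l > 0) for l in c) for c in clauses)
               for bits in product([False, True], repeat=n_vars))

def minimal_core(clauses, n_vars):
    """Deletion-based MUC extraction: one pass over the constraints."""
    assert not is_sat(clauses, n_vars)
    core = list(clauses)
    i = 0
    while i < len(core):
        trial = core[:i] + core[i + 1:]
        if not is_sat(trial, n_vars):
            core = trial          # the i-th constraint is redundant: drop it
        else:
            i += 1                # the i-th constraint is necessary: keep it
    return core

# a=1, b=2, c=3: F = (a+b)(b'+c)(c')(a'+c)(b+c)(a+b+c')
F = [[1, 2], [-2, 3], [-3], [-1, 3], [2, 3], [1, 2, -3]]
core = minimal_core(F, 3)         # a MUC: (b'+c)(c')(b+c)
```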

Example Application: Proof-based Abstraction Refinement for Model Checking; McMillan et al., '03; Gupta et al., '03 Inputs: model M, property P Output: does P hold under M? Start with the abstract model A = {} Model check A: if there is no bug, P is valid Otherwise a counterexample C at depth k is obtained; run BMC(M,P,k): if SAT, the counterexample is real, a bug; if UNSAT, the counterexample is spurious, so refine with A := A ∪ {latches/gates in the UNSAT core of BMC(M,P,k)}, turning the other latches/gates into free inputs, and repeat The UNSAT core is used for refinement and is required in terms of latches/gates

Example Application 2: Assumption Minimization for Compositional Formal Equivalence Checking (FEC); Cohen et al., '10 FEC verifies the equivalence between the design (RTL) and its implementation (schematics) The whole design is too large to be verified at once, so FEC is done on small sub-blocks, restricted with assumptions Assumptions required for the proof of equivalence of sub-blocks must be proved relative to the driving logic MUC extraction in terms of assumptions is vital for feasibility [Diagram: a sub-block with inputs and outputs, an assumption on the inputs, and an assertion on the outputs]

Traditionally, a Clause-Level UC Extractor is the Workhorse Clause-level UC extraction: given a CNF formula, extract an unsatisfiable subset of its clauses F = (a + b)(b' + c)(c')(a' + c)(b + c)(a + b + c') Dozens of papers on clause-level UC extraction since 2002

Traditional UC Extraction for Practical Needs: the Input An interesting constraint The remainder (the rest of the formula) The user is interested in a MUC in terms of these constraints

Traditional UC Extraction: Example Input 1 An unrolled latch The rest of the unrolled circuit Proof-based abstraction refinement

Traditional UC Extraction: Example Input 1 An assumption Equivalence between sub-block RTL and implementation Assumption minimization for FEV

Traditional UC Extraction: Stage 1: Translate to Clauses An interesting constraint The remainder (the rest of the formula) Each small square is a propositional clause, e.g. (a + b')

Traditional UC Extraction: Stage 2: Extract a Clause-Level UC An interesting constraint The remainder (the rest of the formula) Colored squares belong to the clause-level UC

Traditional UC Extraction: Stage 3: Map the Clause-Level UC Back to the Interesting Constraints An interesting constraint The remainder (the rest of the formula) The UC contains three interesting constraints

High-Level Unsatisfiable Core Extraction Real-world applications require reducing the number of interesting constraints in the core rather than the number of clauses: latches for abstraction refinement; assumptions for compositional FEV Most of the algorithms for UC extraction are clause-level High-level UC extraction: extracting a UC in terms of the interesting constraints only Liffiton&Sakallah, 2008; Nadel, 2010; Ryvchin&Strichman, 2011
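The deletion scheme lifts directly to the high-level setting: each interesting constraint is a group of clauses that is kept or dropped as a whole, while the remainder always stays. A sketch under the same toy-solver assumption, with a made-up two-group instance:

```python
from itertools import product

def is_sat(clauses, n_vars):
    """Complete SAT check by enumeration (a stand-in for a real solver)."""
    return any(all(any(bits[abs(l) - 1] == (l > 0) for l in c) for c in clauses)
               for bits in product([False, True], repeat=n_vars))

def high_level_muc(groups, remainder, n_vars):
    """Deletion-based MUC over interesting constraints: each interesting
    constraint is a GROUP of clauses dropped or kept as a whole; the
    remainder clauses are always kept."""
    flat = lambda gs: [c for g in gs for c in g] + remainder
    assert not is_sat(flat(groups), n_vars)
    core = list(groups)
    i = 0
    while i < len(core):
        trial = core[:i] + core[i + 1:]
        if not is_sat(flat(trial), n_vars):
            core = trial          # the whole group is redundant
        else:
            i += 1                # the group is necessary
    return core

# Variables a=1, b=2. Remainder: (a'). Two interesting constraints:
g1 = [[1]]                        # group 1: (a) -- unsatisfiable with the remainder
g2 = [[2], [-2]]                  # group 2: (b)(b') -- unsatisfiable on its own
muc = high_level_muc([g1, g2], remainder=[[-1]], n_vars=2)   # only group 2 remains
```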

Small/Minimal Clause-Level UC ⇏ Small/Minimal High-Level UC [Example 1: a small clause-level UC, but the high-level UC is the largest possible] [Example 2: a large clause-level UC, but the high-level UC is empty]

High-Level Unsatisfiable Core Extraction: Main Results Minimal UC extraction: high-level algorithms solve Intel families that are out of reach for clause-level algorithms Non-minimal UC extraction: high-level algorithms are preferable, with a 2-3x boost on difficult benchmarks

Thanks!