Learning Conditional Abstractions (CAL)

1 Learning Conditional Abstractions (CAL)
Bryan A. Brady (IBM, Poughkeepsie, NY; work performed at UC Berkeley), Randal E. Bryant (CS Department, Carnegie Mellon University), Sanjit A. Seshia (EECS Department, UC Berkeley). FMCAD 2011, Austin, TX, 1 November 2011

2 Learning Conditional Abstractions
Learning Conditional Abstractions (CAL): Use machine learning from traces to compute abstraction conditions. Philosophy: Create abstractions by generalizing simulation data.

3 Abstraction Levels in FV
Three abstraction levels:
Term level: SMT-based verifiers (e.g., UCLID); able to scale to much more complex systems. Open question: how to decide what to abstract?
Bit-vector level: designs are typically written at this level.
Bit level (after bit-blasting): most tools operate at this level, including model checkers and equivalence checkers; capacity is limited by state bits and the details of bit manipulations.

4 Motivating Example Equivalence/Refinement Checking
Equivalence/refinement checking: prove fA = fB for Design A and Design B over inputs x1, x2, ..., xn. Some operators are difficult to reason about: multiply, divide, modulo, power. Term-level abstraction: replace such bit-vector operators with uninterpreted functions f(...) and represent data with an arbitrary encoding. MEMOCODE 2010

5 Term-Level Abstraction
Precise, word-level model: the ALU computes its output exactly (e.g., a comparator checks instr = JMP). Fully uninterpreted model: the ALU is replaced by a UF, losing all of its semantics. Example: for instr := JMP 1234, the precise model gives out1 := 1234, but the fully uninterpreted model leaves out2 unconstrained. We need to partially abstract.

6 Term-Level Abstraction
Three models of the same unit: (1) precise, word-level ALU; (2) fully uninterpreted UF; (3) partially interpreted: a UF, except that the unit is modeled precisely when instr = JMP.
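The three modeling styles can be sketched in software. This is an illustrative sketch only, not the actual modeling language of a term-level verifier: the opcode value, bit width, and ALU semantics below are made-up stand-ins, and the uninterpreted function is modeled as a memoized function that returns arbitrary but consistent values.

```python
import random

JMP = 7  # hypothetical opcode encoding, for illustration only

def make_uf(seed=0):
    """Model an uninterpreted function as a memoized arbitrary mapping:
    no fixed semantics, but equal arguments always give equal results."""
    table, rng = {}, random.Random(seed)
    def uf(*args):
        if args not in table:
            table[args] = rng.getrandbits(16)
        return table[args]
    return uf

def alu_precise(instr, a, b):
    """Stand-in for the precise word-level ALU semantics."""
    return (a + b) & 0xFFFF

def alu_partial(uf, instr, a, b):
    """Partially interpreted ALU: precise when instr = JMP,
    uninterpreted otherwise."""
    if instr == JMP:
        return alu_precise(instr, a, b)
    return uf(instr, a, b)

uf = make_uf()
# JMP behavior is preserved exactly...
assert alu_partial(uf, JMP, 1234, 0) == alu_precise(JMP, 1234, 0)
# ...while other opcodes keep only functional consistency.
assert alu_partial(uf, 3, 1, 2) == alu_partial(uf, 3, 1, 2)
```

The partially interpreted model keeps exactly the semantics the property needs (here, JMP handling) while abstracting everything else.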

7 Term-Level Abstraction
Manual abstraction: requires intimate knowledge of the design, produces multiple models of the same design, and risks spurious counterexamples. Automatic abstraction (RTL -> perform abstraction -> verification model): how do we choose the right level of abstraction? Some blocks require conditional abstraction, and it often takes many iterations of abstraction refinement.

8 Outline
Motivation, Related work, Background, The CAL Approach, Illustrative Example, Results, Conclusion

9 Related Work
Author/Technique | Abstraction Type | Abstraction Granularity | Method
R. E. Bryant, et al., TACAS 2007 | Data | Datapath reduction via successive approximation | CEGAR
P. Bjesse, CAV'08 | Data | Reduces datapaths without BV ops | Selective bit-blasting
Z. Andraus, et al., DAC'04, LPAR'08 | Data, Function | Fully abstracts all operators | CEGAR
ATLAS | Function | Partially abstracts some modules | Hybrid static-dynamic
CAL | Function | Conditionally abstracts some modules | Machine learning/CEGAR

10 Outline
Motivation, Related work, Background (ATLAS, Conditional Abstraction), The CAL Approach, Illustrative Example, Results, Conclusion

11 Background: The ATLAS Approach
Hybrid approach:
Phase 1: Identify abstraction candidates with random simulation
Phase 2: Use dataflow analysis to compute conditions under which it is precise to abstract
Phase 3: Generate the abstracted model

12 Identify Abstraction Candidates
Find isomorphic sub-circuits (fblocks) in Designs A and B: modules, functions. Replace each fblock with a random function (RF) over the inputs of the fblock. Verify via simulation: check the original property for N different random functions.
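The candidate check can be sketched as follows. This is a minimal sketch assuming a `check_property` callback (a placeholder, not part of the paper's tooling) that re-runs the simulation-based check with the fblock replaced by a given random function; a memoized random mapping stands in for the RF blocks.

```python
import random

def make_random_fn(seed):
    """A random function RF: arbitrary but consistent outputs per input."""
    table, rng = {}, random.Random(seed)
    def rf(*args):
        if args not in table:
            table[args] = rng.getrandbits(16)
        return table[args]
    return rf

def is_abstraction_candidate(check_property, n_trials=32):
    """check_property(rf) should simulate the design with the fblock
    replaced by rf and return True iff the original property still holds.
    Abstract only if the property survives all N random functions."""
    for i in range(n_trials):
        if not check_property(make_random_fn(i)):
            return False  # precise semantics matter: do not abstract
    return True

# A property that needs only functional consistency passes for any RF:
assert is_abstraction_candidate(lambda rf: rf(1, 2) == rf(1, 2))
# A property that depends on the block's precise semantics fails:
assert not is_abstraction_candidate(lambda rf: rf(1, 2) == rf(1, 2) + 1)
```

The intuition matches the slide: if the property only cares that equal inputs give equal outputs, any random function works and the fblock is safe to abstract.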

13 Identify Abstraction Candidates
Do not abstract fblocks that fail in some fraction of simulations. Intuition: fblocks that cannot be abstracted will fail when replaced with random functions. Replace the remaining fblocks with partially abstracted functions and compute conditions under which each fblock is modeled precisely. Intuition: an fblock can contain a corner case that random simulation did not explore.

14 Modeling with Uninterpreted Functions
Figure: a partially abstracted fblock. An interpretation condition g(y1, ..., yn) selects between the precise block and an uninterpreted function UF: when g is true the block is interpreted precisely, otherwise the UF output is used.

15 Interpretation Conditions
D1, D2: word-level designs. T1, T2: term-level models. x: input signals. c: interpretation condition. Problem: compute an interpretation condition c(x) such that f1 = f2 is valid, i.e., ∀x. f1 ⇔ f2. Trivial case (model precisely): c = true. Ideal case (fully abstract): c = false. Realistic case, we need to solve: ∃c ≠ true s.t. ∀x. f1 ⇔ f2. This problem is NP-hard, so we use heuristics to compute c.

16 Outline
Motivation, Related work, Background, The CAL Approach, Illustrative Example, Results, Conclusion

17 Related Work Previous work related to Learning and Abstraction
Learning Abstractions for Model Checking. Anubhav Gupta, Ph.D. thesis, CMU, 2006. Localization abstraction: learn the variables to make visible. Our approach: learn when to apply function abstraction.

18 The CAL Approach CAL = Machine Learning + CEGAR
1. Identify abstraction candidates with random simulation
2. Perform unconditional abstraction
3. If spurious counterexamples arise, use machine learning to refine the abstraction by computing abstraction conditions
4. Repeat Step 3 until the property is proved valid or a real counterexample is found
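The four steps can be sketched as a loop. This is a schematic of the flow only; every callback (model generator, verifier, spuriousness check, trace generator, learner) is a placeholder for a real tool in the flow, not an actual API.

```python
def cal_loop(gen_model, verify, is_spurious, gen_traces, learn, max_iters=10):
    """CEGAR-style loop of the CAL approach (sketch).
    gen_model(conditions): build a term-level model with the given
        interpretation conditions applied (empty = unconditional abstraction).
    verify(model): return None if valid, else a counterexample.
    gen_traces(cex): witnesses and similar counterexamples for learning.
    learn(traces): interpretation conditions per abstracted fblock."""
    conditions = {}  # fblock name -> interpretation condition
    for _ in range(max_iters):
        model = gen_model(conditions)
        cex = verify(model)
        if cex is None:
            return "valid", conditions
        if not is_spurious(cex):
            return "real counterexample", cex
        traces = gen_traces(cex)      # step 3: generate similar traces
        conditions = learn(traces)    # step 3: refine abstraction conditions
    return "gave up", conditions
```

A toy run: an unconditional abstraction of "alu" is spurious, one learning round produces a condition, and the refined model verifies on the second iteration.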

19 The CAL Approach
Flow: RTL -> Random Simulation -> Modules to Abstract -> Generate Term-Level Model -> Invoke Verifier. If valid: done. Otherwise, if the counterexample is real: done. If it is spurious: Generate Similar Traces -> Simulation Traces -> Learn Abstraction Conditions -> Abstraction Conditions feed back into Generate Term-Level Model.

20 Use of Machine Learning
In general, a learning algorithm takes examples (positive/negative) and produces a concept (classifier). In our setting, the learning algorithm takes simulation traces (correct/failing) and produces an interpretation condition.

21 Important Considerations in Learning
How to generate traces for learning? Random simulations (using random functions in place of UFs) and counterexamples. What are the relevant features? Inputs to the functional block being abstracted, and signals corresponding to the "unit of work" being processed.

22 Generating Traces: Witnesses
Modified version of random simulation: replace ALL of the modules being abstracted ({a, b, c}) with random functions at the same time; this differs from the one-by-one replacement used when identifying candidates. Verify via simulation for N iterations and log the signals for each passing simulation run. Important note: the initial state is selected randomly or based on a testbench.

23 Generating Traces: Similar Counterexamples
Replace the modules being abstracted with random functions one by one, with the initial state set to be consistent with the original counterexample for each verification run. Verify via simulation for N iterations and log the signals for each failing simulation run. Repeat this process for each fblock that is being abstracted. Reason for the one-by-one replacement: we only want to interpret the fblocks that actually cause errors.
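This trace-generation step can be sketched as below; `simulate` is a hypothetical callback standing in for replaying the design from the counterexample-consistent initial state with the given replacement map applied.

```python
import random

def make_random_fn(seed):
    """A random function: arbitrary but consistent outputs per input."""
    table, rng = {}, random.Random(seed)
    def rf(*args):
        if args not in table:
            table[args] = rng.getrandbits(16)
        return table[args]
    return rf

def similar_counterexamples(fblocks, simulate, n_iters=16):
    """For each abstracted fblock, rerun simulation with ONLY that block
    replaced by a random function, and log the failing runs as negative
    examples for learning. simulate(replacements) replays from the
    counterexample's initial state and returns (passed, trace)."""
    failing = {name: [] for name in fblocks}
    for name in fblocks:          # one fblock at a time
        for i in range(n_iters):
            passed, trace = simulate({name: make_random_fn(i)})
            if not passed:
                failing[name].append(trace)
    return failing
```

Only failing runs are logged here; the witness traces of the previous slide supply the passing (positive) examples.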

24 Feature Selection Heuristics
Include the inputs to the fblock being abstracted. Advantage: automatic, directly relevant. Disadvantage: might not be enough. Include signals encoding the "unit of work" being processed by the design (e.g., an instruction, a packet). Advantage: the unit of work often has a direct impact on whether or not to abstract. Disadvantage: might require limited human guidance.

25 Outline
Motivation, Related work, Background, The CAL Approach, Illustrative Example, Results, Conclusion

26 Learning Example
Example: Y86 processor design. Abstraction: ALU module. Unconditional abstraction leads to a counterexample. Sample data set (label, instr, aluOp, argA, argB):
bad, 7, 0, 1, 0
good, 11, 0, 1, -1
good, 11, 0, 1, 1
good, 6, 3, -1, -1
good, 6, 6, -1, 1
good, 9, 0, 1, 1
Attributes: label ∈ {good, bad}; instr ∈ {0, 1, ..., 15}; argA and argB are abstracted to {-1, 0, 1} by sign: x < 0 -> -1, x = 0 -> 0, x > 0 -> 1.

27 Learning Example
Example: Y86 processor design. Abstraction: ALU module. Unconditional abstraction leads to a counterexample. Feature selection is based on the "unit of work". From the sample data set on the previous slide, the interpretation condition learned is: InstrE = JXX ∧ b = 0. Verification succeeds when this interpretation condition is used!

28 Learning Example
Example: Y86 processor design. Abstraction: ALU module. Unconditional abstraction leads to a counterexample. If feature selection is based on the fblock inputs only, the data set loses the instr column:
bad, 0, 1, 0
good, 0, 1, -1
good, 0, 1, 1
good, 3, -1, -1
good, 6, -1, 1
The interpretation condition learned is: true. Recall that this means we always interpret! A poor decision tree results from a reasonable design decision; more information is needed.

29 Outline
Motivation, Related work, Background, The CAL Approach, Illustrative Example, Results, Conclusion

30 Experiments/Benchmarks
Pipeline fragment: abstract the ALU; JMP must be modeled precisely. (ATLAS: Automatic Term-Level Abstraction of RTL Designs. B. A. Brady, R. E. Bryant, S. A. Seshia, J. W. O'Leary. MEMOCODE 2010.)
Low-power multiplier: equivalence checking between two versions of a multiplier. One is a typical multiplier; the "low-power" version shuts down the multiplier and uses a shifter when one of the operands is a power of 2. (Low-Power Verification with Term-Level Abstraction. B. A. Brady. TECHCON '10.)
Y86: correspondence checking of a 5-stage microprocessor, with multiple design variations. (Computer Systems: A Programmer's Perspective. R. E. Bryant and D. R. O'Hallaron. Prentice-Hall.)

31 Experiments/Benchmarks
Pipeline fragment:
Interpretation condition | ABC (sec) | UCLID SAT (sec) | UCLID SMT (sec)
true | 0.02 | 28.51 | 27.01
op = JMP | -- | 0.31 | 0.01

Low-power multiplier (UCLID runtime in sec):
BMC depth | SAT, no abs | SAT, abs | SMT, no abs | SMT, abs
1 | 2.81 | 2.55 | 1.27 | 1.38
2 | 12.56 | 14.79 | 2.80 | 2.63
5 | 67.43 | 22.45 | 8.23 | 8.16
10 | 216.75 | 202.25 | 21.18 | 22.00

Interpretation condition for the low-power multiplier (omitted from the table): interpret whenever either input is a power of 2.

32 Experiments/Benchmarks
Interpretation Condition ABC (sec) UCLID Runtime (sec) SAT SMT true > 1200 op = ADD∧ aluB = 0 -- 133.03 105.34 InstrE = JXX ∧ aluB = 0 101.10 65.52 Y86: BTFNT Interpretation Condition ABC (sec) UCLID Runtime (sec) SAT SMT true > 1200 op = ADD∧ aluB = 0 -- 154.95 89.02 InstrE = JXX 191.34 187.64 BTFNT 94.00 52.76 red: by hand abstraction. what we thought was a good abstractino before. bold: CAL abstraction why do we not compare this against ATLAS? Because when we use ATLAS, it can’t solve Y86 BTFNT or NT if MULT is included as instruction. NOTE that for Y86 NT, the CEGAR loop is iterated twice, whereas it’s only iterated once for BTFNT. Y86: NT MEMOCODE 2010

33 Outline
Motivation, Related work, Background, The CAL Approach, Illustrative Example, Results, Conclusion

34 Summary / Future Work
Summary: use machine learning + CEGAR to compute conditional function abstractions; outperforms purely bit-level techniques. Future work: better feature selection (picking "unit of work" signals); investigate using different abstraction conditions for different instantiations of the same fblock; apply to software; investigate interactions between abstractions.

35 Thanks!

36 NP-Hard
We need to interpret MULT when f(x1, x2, ..., xn) = true. Checking the satisfiability of f(x1, x2, ..., xn) is NP-hard. (Figure: a circuit computing f over the inputs x1, ..., xn gates the MULT block.)

37 Related Work
Author | Abstraction Type | Abstraction Granularity | Method
Z. Andraus, et al., DAC'04, LPAR'08 | Data, Function | Fully abstracts all operators | CEGAR
H. Jain, et al., DAC'05 | Data | Maintains predicates over data signals | Predicate abstraction
P. Bjesse, CAV'08 | Data | Reduces datapaths without BV ops | Selective bit-blasting
R. E. Bryant, et al., TACAS 2007 | Data | Datapath reduction via successive approximation | CEGAR
v2ucl | -- | -- | Type qualifiers and inference
ATLAS | Function | Partially abstracts some modules | Hybrid static-dynamic
CAL | Function | Conditionally abstracts some modules | Machine learning/CEGAR

38 Term-Level Abstraction
Function abstraction: represent functional units (e.g., an ALU) with uninterpreted functions f. Data abstraction: represent data with arbitrary integer values; no specific encoding.

