
1 State Assignment

The problem: assign a unique code to each state to produce a minimal binary logic-level implementation.

Given:
1. |S| states,
2. at least ceil(log2 |S|) state bits (minimum-width encoding),
3. at most |S| state bits (one-hot encoding),
estimate the impact of the encoding on the size and delay of the optimized implementation. Most known techniques are directed towards reducing the size; it is difficult to estimate delay before optimization.

There are C(2^n, |S|) * |S|! = (2^n)! / (2^n - |S|)! possible state assignments with n-bit codes (n >= ceil(log2 |S|)), since there are C(2^n, |S|) ways to select |S| distinct state codes and |S|! ways to permute them among the states.
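As a sanity check on this count, a short Python sketch (the function name is ours):

```python
from math import comb, factorial

def num_assignments(num_states: int, n: int) -> int:
    """Number of ways to assign distinct n-bit codes to num_states states:
    choose |S| codes out of 2**n, then permute them among the states."""
    codes = 2 ** n
    return comb(codes, num_states) * factorial(num_states)

# 4 states with 2-bit codes: 4!/(4-4)! = 24 assignments
print(num_assignments(4, 2))   # -> 24
# 4 states with 3-bit codes: 8!/4! = 1680 assignments
print(num_assignments(4, 3))   # -> 1680
```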

2 State Assignment

Techniques fall into two categories.

Heuristic techniques try to capture some aspect of the process of optimizing the logic, e.g. common-cube extraction. Example:
1. Construct a weighted graph of the states; weights express the gain in keeping the codes of a pair of states 'close'.
2. Assign codes that minimize a proximity function (graph-embedding step).

Exact techniques model precisely the process of optimizing the resulting logic as an encoding problem. Example:
1. Perform an appropriate multi-valued logic minimization (optimization step).
2. Extract a set of encoding constraints that put conditions on the codes.
3. Assign codes that satisfy the encoding constraints.

3 SA as an Encoding Problem

State assignment is a difficult problem: transform 'optimally' a cover of multi-valued (symbolic) logic functions into an equivalent cover of two-valued logic functions. Various applications dictate various definitions of optimality.

Encoding problems are classified as:
1. input (symbolic variables appear as inputs)
2. output (symbolic variables appear as outputs)
3. input-output (symbolic variables appear as both inputs and outputs)

State assignment is the most difficult case, input-output encoding, where the state variable is both an input and an output.

4 Encoding Problems: Status

Two-level:
- Input encoding: well understood theory; efficient algorithms; many applications.
- Output encoding: well understood theory; no efficient algorithm; few applications.
- Input-output encoding: well understood theory; no efficient algorithm; few applications.

Multi-level:
- Input encoding: basic theory developed; algorithms not yet mature; few applications.
- Output encoding: no theory; heuristic algorithms; few applications.
- Input-output encoding: no theory; heuristic algorithms; few applications.

5 Input Encoding for Two-Level Implementations

Reference: [De Micheli, Brayton, Sangiovanni 1985]

To solve the input encoding problem for minimum product-term count:
1. Represent the function as a multi-valued function.
2. Apply multi-valued minimization (Espresso-MV).
3. Extract, from the minimized multi-valued cover, input encoding constraints.
4. Obtain codes (of minimum length) that satisfy all input encoding constraints.

6 Input Encoding

Example: a single primary output of an FSM.
- 3 inputs: the state variable s and two binary variables c1 and c2
- 4 states: s0, s1, s2, s3
- 1 output: y

y is 1 when any of the following conditions holds:
- (state = s0) and c2 = 1
- (state = s0 or state = s2) and c1 = 0
- (state = s1) and c1 = 1 and c2 = 0
- (state = s2 or state = s3) and c1 = 1

(Figure: a map of the points where y = 1; the values along the state axis are not ordered.)

7 Input Encoding: Function Representation

- Symbolic variables are represented by multi-valued (MV) variables Xi restricted to Pi = {0, 1, ..., ni - 1}.
- The output is a binary-valued function f of the variables Xi:
  f : P1 x P2 x ... x Pn -> B
- For S, a subset of Pi, the literal Xi^S is defined as:
  Xi^S = 1 if Xi is in S, 0 otherwise.

Example (only one MV variable):
y = X{0} c2 + X{0,2} c1' + X{1} c1 c2' + X{2,3} c1

The cover of the function can also be written in positional form (columns: c1 c2, one-hot state, output):

-1 1000 1
0- 1010 1
10 0100 1
1- 0011 1

8 Input Encoding

y = X{0} c2 + X{0,2} c1' + X{1} c1 c2' + X{2,3} c1

Minimize using classical two-level MV minimization (Espresso-MV). Minimized result:

y = X{0,2,3} c1 c2 + X{0,2} c1' + X{1,2,3} c1 c2'
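The equivalence of the minimized cover to the original can be checked by exhaustive simulation. A minimal Python sketch (the cube representation, a list of (state-set, c1, c2) triples with None as a don't-care, is ours):

```python
from itertools import product

# Each cube: (set of state values, required c1 or None, required c2 or None)
original = [({0}, None, 1), ({0, 2}, 0, None), ({1}, 1, 0), ({2, 3}, 1, None)]
minimized = [({0, 2, 3}, 1, 1), ({0, 2}, 0, None), ({1, 2, 3}, 1, 0)]

def eval_cover(cover, s, c1, c2):
    """OR of all cubes: a cube fires when the state is in its set and
    each binary input matches its requirement (None matches anything)."""
    return any(s in states
               and (a is None or c1 == a)
               and (b is None or c2 == b)
               for states, a, b in cover)

# Exhaustive check over all 4 states x 4 binary input combinations
assert all(eval_cover(original, s, c1, c2) == eval_cover(minimized, s, c1, c2)
           for s, c1, c2 in product(range(4), (0, 1), (0, 1)))
print("covers equivalent")
```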

9 Input Encoding: Extraction of Face Constraints

y = X{0,2,3} c1 c2 + X{0,2} c1' + X{1,2,3} c1 c2'

The face constraints (assign codes that keep each MV literal in a single cube) are:
(s0, s2, s3) (s0, s2) (s1, s2, s3)

- Two-level MV minimization results in the fewest product terms over all possible encodings.
- We would like to select an encoding that results in the same number of product terms.
- Each MV literal should be embedded in a face (subspace, or cube) of the encoded space of the associated MV variable.
- Unused codes may be used as don't-cares.

10 Input Encoding: Satisfaction of Face Constraints

- The lower bound can always be achieved (one-hot encoding will do it), but this results in many bits (a large code width).
- For a fixed code width, we need to find an embedding that results in the fewest cubes.

Example: a satisfying solution is s0 = 001, s1 = 111, s2 = 101, s3 = 100.

The face constraints are assigned to the following faces (2 marks a don't-care position):
(s0, s2, s3) -> 202
(s0, s2) -> 201
(s1, s2, s3) -> 122

Encoded representation:
y = X{0,2,3} c1 c2 + X{0,2} c1' + X{1,2,3} c1 c2'
  -> x2' c1 c2 + x2' x3 c1' + x1 c1 c2'

Note: 3 bits were needed instead of the minimum 2.

(Figure: the 3-cube on x1 x2 x3 with states s0, s2, s3, s1 at their codes and the unused codes marked dc.)

11 Procedure for Input Encoding: Summary

- A face constraint is satisfied by an encoding if the supercube of the codes of the symbolic values in the literal does not contain the code of any symbolic value not in the literal.
- Satisfying all the face constraints guarantees that the encoded, minimized binary-valued cover will have size no greater than the multi-valued cover.
- Once an encoding satisfying all face constraints has been found, a binary-valued encoded cover can be constructed directly from the multi-valued cover, by replacing each multi-valued literal with the supercube corresponding to that literal.
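The supercube test above can be sketched in a few lines of Python (function names are ours), using the encoding from the previous slide:

```python
def supercube(codes):
    """Smallest cube containing all codes; '-' marks a free bit position."""
    return ''.join(b.pop() if len(b) == 1 else '-'
                   for b in (set(bits) for bits in zip(*codes)))

def in_cube(code, cube):
    return all(c == '-' or c == b for b, c in zip(code, cube))

def face_satisfied(constraint, enc):
    """Satisfied iff the supercube of the codes in the literal contains
    no code of a symbol outside the literal."""
    cube = supercube([enc[s] for s in constraint])
    return not any(in_cube(enc[s], cube) for s in enc if s not in constraint)

# Encoding from the example: s0=001, s1=111, s2=101, s3=100
enc = {'s0': '001', 's1': '111', 's2': '101', 's3': '100'}
for c in [('s0', 's2', 's3'), ('s0', 's2'), ('s1', 's2', 's3')]:
    assert face_satisfied(c, enc)
print(supercube(['001', '101', '100']))   # -> -0-
```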

12 Satisfaction of Input Constraints

- Any set of face constraints can be satisfied by one-hot encoding the symbols, but the code length is too long.
- Finding a minimum-length code that satisfies a given set of constraints is NP-hard [Saldanha, Villa, Brayton, Sangiovanni 1991].
- Many approaches have been proposed. The above reference describes the best algorithm currently known, based on the concept of encoding dichotomies [Tracey 1965, Ciesielski 1989].

13 Satisfying Input Constraints

Ordered dichotomy: an ordered dichotomy is a 2-block partition of a subset of the symbols. Example: (s0 s1; s2 s3)
- The left block is {s0, s1}, associated with bit value 0.
- The right block is {s2, s3}, associated with bit value 1.

A dichotomy is complete if each symbol of the entire symbol set appears in one block:
- (s0; s1 s2) is not complete, but (s0 s1; s2 s3) is complete.

Two dichotomies d1 and d2 are compatible if
1. the left block of d1 is disjoint from the right block of d2, and
2. the right block of d1 is disjoint from the left block of d2.

14 Satisfying Input Constraints

The union of two compatible dichotomies d1 and d2 is (a; b), where
1. a is the union of the left blocks of d1 and d2, and
2. b is the union of the right blocks of d1 and d2.

The set of all compatibles of a set of dichotomies is obtained by closure:
- each initial dichotomy is a compatible;
- the union of two compatible compatibles is a compatible.

Example:
initial: (0 1; 2) (1; 4) (3; 2) (2; 4)
generated: (0 1; 2 4) (0 1 3; 2) (1 3; 2 4) (1 2; 4) (0 1 3; 2 4)
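The closure computation on the example above can be sketched in Python (the pair-of-frozensets representation is ours):

```python
from itertools import combinations

def compatible(d1, d2):
    """Left of each dichotomy must be disjoint from the right of the other."""
    (l1, r1), (l2, r2) = d1, d2
    return l1.isdisjoint(r2) and r1.isdisjoint(l2)

def all_compatibles(initial):
    """Close the initial set under union of compatible pairs."""
    comps = {(frozenset(l), frozenset(r)) for l, r in initial}
    changed = True
    while changed:
        changed = False
        for d1, d2 in combinations(list(comps), 2):
            if compatible(d1, d2):
                u = (d1[0] | d2[0], d1[1] | d2[1])
                if u not in comps:
                    comps.add(u)
                    changed = True
    return comps

initial = [({0, 1}, {2}), ({1}, {4}), ({3}, {2}), ({2}, {4})]
generated = all_compatibles(initial) - \
            {(frozenset(l), frozenset(r)) for l, r in initial}
# Matches the slide: (01;24) (013;2) (13;24) (12;4) (013;24)
assert generated == {(frozenset({0, 1}), frozenset({2, 4})),
                     (frozenset({0, 1, 3}), frozenset({2})),
                     (frozenset({1, 3}), frozenset({2, 4})),
                     (frozenset({1, 2}), frozenset({4})),
                     (frozenset({0, 1, 3}), frozenset({2, 4}))}
```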

15 Dichotomies

Dichotomy d1 covers d2 if
1. the left block of d1 contains the left block of d2, and
2. the right block of d1 contains the right block of d2.
Example: (s0; s1 s2) is covered by (s0 s3; s1 s2 s4).

A dichotomy is a prime compatible of a given set if it is not covered by any other compatible of the set.

A set of complete dichotomies generates an encoding; each dichotomy generates one bit of the encoding:
- the left-block symbols are 0 in that bit;
- the right-block symbols are 1 in that bit.

Example: (s0 s1; s2 s3) and (s0 s3; s1 s2) ==> s0 = 00, s1 = 01, s2 = 11, s3 = 10
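Deriving codes from a set of complete dichotomies, as in the example above, can be sketched as:

```python
def encode(symbols, dichotomies):
    """Each complete dichotomy contributes one bit of the code:
    0 for left-block symbols, 1 for right-block symbols."""
    return {s: ''.join('0' if s in left else '1' for left, _right in dichotomies)
            for s in symbols}

dichos = [({'s0', 's1'}, {'s2', 's3'}), ({'s0', 's3'}, {'s1', 's2'})]
codes = encode(['s0', 's1', 's2', 's3'], dichos)
assert codes == {'s0': '00', 's1': '01', 's2': '11', 's3': '10'}
```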

16 Input Constraint Satisfaction

Initial dichotomies: each face constraint with k symbols generates 2(n - k) initial dichotomies, where n is the number of symbols. With symbol set (s0 s1 s2 s3 s4 s5):
(s1 s2 s4) ==> (s1 s2 s4; s0) (s1 s2 s4; s3) (s1 s2 s4; s5)
              (s0; s1 s2 s4) (s3; s1 s2 s4) (s5; s1 s2 s4)

We also require that each symbol is encoded uniquely:
(s0; s1) (s0; s2) ... (s0; s5)
(s1; s0) (s1; s2) ... (s1; s5)
...
(s5; s0) (s5; s1) ... (s5; s4)
There are n^2 - n such additional initial dichotomies.

All initial dichotomies {d} must be satisfied by the encoding: a dichotomy d is satisfied by a set of encoding dichotomies if there exists a bit where the symbols in the left block of d are 0 and the symbols in the right block of d are 1.

Note: any initial dichotomy d that is covered by another dichotomy e can be deleted, since satisfaction of e implies satisfaction of d.
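Generating the initial dichotomies for the example above can be sketched as (representation and function names ours):

```python
def initial_dichotomies(face, symbols):
    """One dichotomy (face; s) and its mirror (s; face) for each symbol s
    outside the face: 2*(n - k) dichotomies in total."""
    face = set(face)
    outside = [s for s in symbols if s not in face]
    return [(frozenset(face), frozenset({s})) for s in outside] + \
           [(frozenset({s}), frozenset(face)) for s in outside]

def uniqueness_dichotomies(symbols):
    """(si; sj) for every ordered pair of distinct symbols: n^2 - n of them."""
    return [(frozenset({a}), frozenset({b}))
            for a in symbols for b in symbols if a != b]

symbols = ['s0', 's1', 's2', 's3', 's4', 's5']
face = ['s1', 's2', 's4']
assert len(initial_dichotomies(face, symbols)) == 2 * (6 - 3)   # 6
assert len(uniqueness_dichotomies(symbols)) == 6 * 6 - 6        # 30
```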

17 Input Constraint Satisfaction

Procedure:
1. Given the set of face constraints, generate the initial dichotomies.
2. Generate the set of prime dichotomies.
3. Form the covering matrix:
   - rows = initial dichotomies
   - columns = prime dichotomies
   - entry (i, j) is 1 if prime j covers dichotomy i.
4. Solve the unate covering problem.
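Step 4 can be illustrated with a brute-force unate covering sketch; the covering matrix below is a made-up toy, not taken from the slides:

```python
from itertools import combinations

def min_unate_cover(matrix):
    """Brute-force unate covering: matrix[i][j] == 1 iff column j covers
    row i; return a smallest set of columns covering every row.
    (Exponential: a sketch only; real tools use branch and bound.)"""
    n_rows, n_cols = len(matrix), len(matrix[0])
    for size in range(1, n_cols + 1):
        for cols in combinations(range(n_cols), size):
            if all(any(matrix[i][j] for j in cols) for i in range(n_rows)):
                return cols
    return None

# Toy matrix: 4 initial dichotomies (rows), 3 prime dichotomies (columns)
matrix = [[1, 0, 0],
          [1, 1, 0],
          [0, 1, 1],
          [0, 0, 1]]
assert min_unate_cover(matrix) == (0, 2)   # columns 0 and 2 cover all rows
```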

18 Efficient Generation of Prime Dichotomies

1. Introduce a symbol for each initial dichotomy.
2. Form a 2-clause (a + b) for each pair of dichotomies a and b that are incompatible.
3. Multiply the product of all 2-clauses out to an SOP.
4. For each term in the SOP, list the dichotomies not in the term; the unions of these sets are the prime dichotomies.

Example: initial dichotomies (a b c d e); the incompatible pairs are ab, ac, bc, cd, de.
CNF: (a + b)(a + c)(b + c)(c + d)(d + e) = abd + acd + ace + bcd + bce
Primes: c U e, b U e, b U d, a U e, a U d

19 Efficient Generation of Prime Dichotomies

Procedure for multiplying a 2-CNF expression E out to an SOP:

procedure cs(E):
    x = splitting variable
    C = clauses containing x
    R = clauses not containing x
    p = product of all variables in C other than x
    q = cs(R)
    return single_cube_containment(x*q + p*q)

Note: the procedure is linear in the number of primes.
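A runnable Python interpretation of the cs procedure (our reading of the pseudocode: clauses and product terms are frozensets of variables, and single_cube_containment becomes a subset filter) reproduces the example from the previous slide. It relies on the identity (x + y1)(x + y2)... = x + y1*y2..., so the product of the clauses containing x is x plus the product p of x's partners:

```python
def cs(clauses):
    """Multiply a product of 2-clauses out to a minimal sum of products."""
    if not clauses:
        return {frozenset()}
    x = next(iter(clauses[0]))                 # splitting variable
    with_x = [c for c in clauses if x in c]
    rest = [c for c in clauses if x not in c]
    p = frozenset().union(*with_x) - {x}       # partners of x
    q = cs(rest)
    terms = {t | {x} for t in q} | {t | p for t in q}
    # single-cube containment: drop any term that contains another term
    return {t for t in terms if not any(u < t for u in terms)}

clauses = [frozenset('ab'), frozenset('ac'), frozenset('bc'),
           frozenset('cd'), frozenset('de')]
result = cs(clauses)
assert result == {frozenset('abd'), frozenset('acd'), frozenset('ace'),
                  frozenset('bcd'), frozenset('bce')}
```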

20 Applications of Input Encoding

References:
- PLA decomposition with multi-input decoders [Sasao 1984, K.C. Chen and Muroga 1988, Yang and Ciesielski 1989]
- Boolean decomposition in multi-level logic optimization [Devadas, Wang, Newton, Sangiovanni 1989]
- Communication-based logic partitioning [Beardslee 1992, Murgai 1993]
- Approximation to state assignment [De Micheli, Brayton, Sangiovanni 1985, Villa and Sangiovanni 1989]

21 Input Encoding: PLA Decomposition

Consider the PLA:
y1 = x1 x3 x4' x6 + x2 x5 x6' x7 + x1 x4 x7' + x1' x3' x4 x7' + x3' x5' x6' x7'
y2 = x2' x4' x6 + x3 x4' x6 + x2' x5 x6' x7 + x1' x3' x4 x7' + x1' x3' x5' x6' x7' + x2' x4' x5' x7'

Decompose it into a driving PLA fed by the inputs {x4, x5, x6, x7} and a driven PLA fed by the outputs of the driving PLA and the remaining inputs {x1, x2, x3}, so as to minimize the area.

(Figure: the driving PLA on x4...x7 feeding the driven PLA, which produces y1 and y2.)

22 Input Encoding: PLA Decomposition

y1 = x1 x3 x4' x6 + x2 x5 x6' x7 + x1 x4 x7' + x1' x3' x4 x7' + x3' x5' x6' x7'
y2 = x2' x4' x6 + x3 x4' x6 + x2' x5 x6' x7 + x1' x3' x4 x7' + x1' x3' x5' x6' x7' + x2' x4' x5' x7'

Projecting onto the selected inputs, there are 5 distinct product terms:
x4' x6, x5 x6' x7, x4 x7', x5' x6' x7', x4' x5' x7'

The product-term pairs (x4 x7', x5' x6' x7') and (x5' x6' x7', x4' x5' x7') are not disjoint.

23 PLA Decomposition

y1 = x1 x3 x4' x6 + x2 x5 x6' x7 + x1 x4 x7' + x1' x3' x4 x7' + x3' x5' x6' x7'
y2 = x2' x4' x6 + x3 x4' x6 + x2' x5 x6' x7 + x1' x3' x4 x7' + x1' x3' x5' x6' x7' + x2' x4' x5' x7'

To re-encode, make all product terms involving the selected inputs disjoint (using reduce):
y1 = x1 x3 x4' x6 + x2 x5 x6' x7 + x1 x4 x7' + x1' x3' x4 x7' + x3' x4' x5' x6' x7'
y2 = x2' x4' x6 + x3 x4' x6 + x2' x5 x6' x7 + x1' x3' x4 x7' + x1' x3' x4' x5' x6' x7' + x2' x4' x5' x6' x7'

Projecting onto the selected inputs, there are now 4 disjoint product terms:
x4' x6, x5 x6' x7, x4 x7', x4' x5' x6' x7'

View these as 4 values 1, 2, 3, 4 of an MV variable S:
y1 = x1 x3 S{1} + x2 S{2} + x1 S{3} + x1' x3' S{3} + x3' S{4}
y2 = x2' S{1} + x3 S{1} + x2' S{2} + x1' x3' S{3} + x1' x3' S{4} + x2' S{4}

24 Input Encoding: PLA Decomposition

y1 = x1 x3 S{1} + x2 S{2} + x1 S{3} + x1' x3' S{3} + x3' S{4}
y2 = x2' S{1} + x3 S{1} + x2' S{2} + x1' x3' S{3} + x1' x3' S{4} + x2' S{4}

Performing MV two-level minimization:
y1 = x1 x3 S{1,3} + x2 S{2} + x3' S{3,4}
y2 = x2' S{1,2,4} + x3 S{1} + x1' x3' S{3,4}

The face constraints are: (1, 3), (3, 4), (1, 2, 4).
Codes of minimum length: enc(1) = 001, enc(2) = 011, enc(3) = 100, enc(4) = 111.

(Figure: the 3-cube on bits x8 x9 x10 with values 1, 3, 2, 4 placed at their codes.)
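As with the earlier example, the MV minimization of y1 can be checked by exhaustive simulation (cube representation ours; y2 can be checked the same way):

```python
from itertools import product

# Cubes: ((x1, x2, x3) requirements with None = don't care, set of S values)
orig_y1 = [((1, None, 1), {1}), ((None, 1, None), {2}), ((1, None, None), {3}),
           ((0, None, 0), {3}), ((None, None, 0), {4})]
min_y1 = [((1, None, 1), {1, 3}), ((None, 1, None), {2}),
          ((None, None, 0), {3, 4})]

def eval_cover(cover, x, s):
    """OR of cubes: binary requirements must match and S must be in the set."""
    return any(all(r is None or xi == r for xi, r in zip(x, req)) and s in sv
               for req, sv in cover)

assert all(eval_cover(orig_y1, x, s) == eval_cover(min_y1, x, s)
           for x in product((0, 1), repeat=3) for s in (1, 2, 3, 4))
print("y1 covers equivalent")
```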

25 Input Encoding: PLA Decomposition

The driven PLA becomes:
y1 = x1 x3 x9' + x2 x8' x9 x10 + x3' x8
y2 = x2' x10 + x3 x8' x9' x10 + x1' x3' x8

The driving PLA becomes the function f : {x4, x5, x6, x7} -> {x8, x9, x10}:
f(x4' x6) = enc(1) = 001
f(x5 x6' x7) = enc(2) = 011
f(x4 x7') = enc(3) = 100
f(x4' x5' x6' x7') = enc(4) = 111

Represented in SOP as:
x8 = x4 x7' + x4' x5' x6' x7'
x9 = x5 x6' x7 + x4' x5' x6' x7'
x10 = x4' x6 + x5 x6' x7 + x4' x5' x6' x7'

Result: area before decomposition: 160; area after decomposition: 128.

(Figure: the driving PLA on x4...x7 producing x8, x9, x10, feeding the driven PLA that produces y1 and y2.)
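The area figures can be reproduced assuming the standard PLA area model, (2 * inputs + outputs) columns times product-term rows; the model itself is our assumption, since the slide does not state one:

```python
def pla_area(n_inputs, n_outputs, n_terms):
    """PLA area: each product term is a row; each input contributes two
    columns (literal and complement) and each output one column.
    (Assumed model, not stated on the slide.)"""
    return (2 * n_inputs + n_outputs) * n_terms

# Before: 7 inputs, 2 outputs, 10 distinct product terms
# (x1' x3' x4 x7' is shared between y1 and y2)
assert pla_area(7, 2, 10) == 160

# After: driven PLA has 6 inputs (x1, x2, x3, x8, x9, x10), 2 outputs,
# 6 terms; driving PLA has 4 inputs, 3 outputs, 4 terms
assert pla_area(6, 2, 6) + pla_area(4, 3, 4) == 128
```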

26 Output Encoding for Two-Level Implementations

The problem: find binary codes for the symbolic outputs of a logic function so as to minimize a two-level implementation of the function.

Terminology:
- Assume a symbolic cover S with a symbolic output assuming n values, denoted v0, ..., v(n-1).
- The encoding of a symbolic value vi is denoted enc(vi).
- The onset of vi is denoted ONi; each ONi is a set of Di minterms {mi1, ..., miDi}.
- Each minterm mij carries a tag indicating which symbolic value's onset it belongs to; a minterm can belong to only one symbolic value's onset.

27 Output Encoding

Consider a function f with a symbolic output and two different encoded realizations of f:

symbolic cover   encoding 1   encoding 2 (one-hot)
0001 out1        0001 001     0001 10000
00-0 out2        00-0 010     00-0 01000
0011 out2        0011 010     0011 01000
0100 out3        0100 011     0100 00100
1000 out3        1000 011     1000 00100
1011 out4        1011 100     1011 00010
1111 out5        1111 101     1111 00001

Two-level logic minimization exploits the sharing between the different outputs to produce a minimum cover. In the second (one-hot) realization no sharing is possible. The first can be minimized to:

1111 001
1-11 100
0100 011
0001 101
1000 011
00-- 010

28 Dominance Constraints

Definition: enc(vi) > enc(vj) iff the code of vi bit-wise dominates the code of vj, i.e. for each bit position where vj has a 1, vi also has a 1. Example: 101 > 001.

If enc(vi) > enc(vj), then ONi can be used as a don't-care set when minimizing ONj.

Example:
symbolic cover   encoded cover   minimized encoded cover
0001 out1        0001 110        0001 110
00-0 out2        00-0 010        00-- 010
0011 out2        0011 010

Here enc(out1) = 110 > enc(out2) = 010. The input minterm 0001 of out1 has been included in the single cube 00-- that asserts the code of out2.

Algorithms exploiting dominance constraints are implemented in Cappuccino (De Micheli, 1986) and Nova (Villa and Sangiovanni, 1990).
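The dominance example can be checked by simulating the minimized cover with OR-ed outputs (helper names ours): minterm 0001 activates both cubes, and since 110 dominates 010, the OR still yields the code of out1.

```python
def matches(minterm, cube):
    return all(c == '-' or c == m for m, c in zip(minterm, cube))

def realize(cover, minterm):
    """OR together the output parts of all cubes activated by the minterm."""
    out = 0
    for cube, bits in cover:
        if matches(minterm, cube):
            out |= int(bits, 2)
    return format(out, '03b')

minimized = [('0001', '110'), ('00--', '010')]
# 0001 activates both cubes: 110 | 010 = 110 = enc(out1)
assert realize(minimized, '0001') == '110'
# onset minterms of out2 activate only 00--: 010 = enc(out2)
for m in ['0000', '0010', '0011']:
    assert realize(minimized, m) == '010'
```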

29 Disjunctive Constraints

If enc(vi) = enc(vj) + enc(vk) (bit-wise OR), then ONi can be minimized using ONj and ONk as don't-cares.

Example:
symbolic cover   encoded cover   minimized encoded cover
101 out1         101 11          10- 01
100 out2         100 01          1-1 10
111 out3         111 10

Here enc(out1) = enc(out2) + enc(out3). The input minterm 101 of out1 has been merged with the input minterm 100 of out2 (giving 10-) and with the input minterm 111 of out3 (giving 1-1). Input minterm 101 now asserts 11 (the code of out1) by activating both the cube 10-, which asserts 01, and the cube 1-1, which asserts 10.

An algorithm exploiting both dominance and disjunctive constraints is implemented in esp_sa (Villa, Saldanha, Brayton and Sangiovanni, 1995).
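Likewise, the disjunctive example can be checked by simulation (helper names ours); only minterm 101 fires both cubes, so only it receives the OR-ed code 11:

```python
def matches(minterm, cube):
    return all(c == '-' or c == m for m, c in zip(minterm, cube))

def realize(cover, minterm):
    """OR together the output parts of all cubes activated by the minterm."""
    out = 0
    for cube, bits in cover:
        if matches(minterm, cube):
            out |= int(bits, 2)
    return format(out, '02b')

minimized = [('10-', '01'), ('1-1', '10')]
assert realize(minimized, '101') == '11'   # both cubes fire: enc(out1)
assert realize(minimized, '100') == '01'   # only 10- fires: enc(out2)
assert realize(minimized, '111') == '10'   # only 1-1 fires: enc(out3)
```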

30 Exact Output Encoding

Proposed by Devadas and Newton, 1991. The algorithm consists of the following steps:
1. Generate generalized prime implicants (GPIs) from the original symbolic cover.
2. Solve a constrained covering problem: select a minimum number of GPIs that form an encodeable cover.
3. Obtain codes (of minimum length) that satisfy the encoding constraints.
4. Given the codes of the symbolic outputs and the selected GPIs, construct trivially a PLA with product-term cardinality equal to the number of GPIs.

