Design Hierarchy Guided Multilevel Circuit Partitioning


1 Design Hierarchy Guided Multilevel Circuit Partitioning
Yongseok Cheon and D.F. Wong, Department of Computer Sciences, The University of Texas at Austin

2 Outline
- Motivation & contribution
- Problem
- Design hierarchy
- Rent's rule & Rent exponent
- Our approach
  - Design hierarchy guided clustering
  - Design hierarchy guided ML partitioning
- Experimental results

3 Motivation
- Natural question: how can the design hierarchy be used for partitioning?
- Effectiveness of multilevel partitioning
- Similarity between the design hierarchy (DH) and the ML clustering tree

4 Contribution
- Rent exponent as a quality indicator
- Intelligent and systematic use of hierarchical logical grouping information for better partitioning
- Partitioning results of higher quality and greater stability

5 Partitioning problem
[Figure: a netlist hypergraph and the same hypergraph after partitioning]

6 Multilevel partitioning
(1) Multilevel clustering (coarsening)
(2) Initial partitioning
(3) Multilevel FM refinement with unclustering (uncoarsening)
e.g., hMetis
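
As a sketch of the control flow only (the phase functions are parameters, not hMetis's actual interface), the three phases in Python:

    def multilevel_partition(g, coarsen, saturated, initial, project, refine):
        # (1) Multilevel clustering: coarsen until the hypergraph saturates.
        levels = [g]
        while not saturated(levels[-1]):
            levels.append(coarsen(levels[-1]))
        # (2) Initial partitioning at the coarsest level.
        part = initial(levels[-1])
        # (3) Uncoarsen level by level, projecting the partition down
        #     and refining it with FM-style moves.
        for fine in reversed(levels[:-1]):
            part = refine(fine, project(part, fine))
        return part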

7 DH guided ML partitioning

8 Design hierarchy
- A hierarchical grouping that already carries implicit connectivity information
- Rent's rule is used to identify which hierarchical elements are good or bad in terms of physical connectivity

9 Rent's rule
Rent's rule: E = P · B^r, where
- E = external pin count
- B = # of cells inside
- P = avg # of pins per cell
- r = Rent exponent

10 Rent exponent
For a hierarchical element H, the Rent exponent of H is
    r(H) = ln(E / P) / ln |H|
where
- E = external pin count
- I = internal pin count
- P = avg # of pins per cell = (I + E) / |H|

11 Rent exponent
- Small r → more strongly connected cells inside
- Large r → more weakly connected cells inside
Examples (each element containing 10 cells):
    r = ln(4/3.4) / ln 10 = 0.071
    r = ln(15/2.5) / ln 10 = 0.778
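
These values follow directly from the previous slide's definition; a small Python check (the 10-cell element size and the pin counts E = 4, I = 30 and E = 15, I = 10 are inferred from the arithmetic, not stated on the slide):

    import math

    def rent_exponent(ext_pins, int_pins, num_cells):
        # r solves E = P * |H|^r with P = (I + E) / |H|.
        avg_pins = (int_pins + ext_pins) / num_cells
        return math.log(ext_pins / avg_pins) / math.log(num_cells)

    print(round(rent_exponent(4, 30, 10), 3))    # 0.071, strongly connected
    print(round(rent_exponent(15, 10, 10), 3))   # 0.778, weakly connected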

12 Selective preservation of DH
- Global Rent exponent r̄ = weighted average of the Rent exponents of all hierarchical elements in DH
- A hierarchical element H is preserved or broken according to r(H):
  - If r(H) ≤ r̄: H will be used as a search scope for clustering of the cells inside H (a positive scope)
  - If r(H) > r̄: H is removed from DH, and the cells inside H can be freely clustered with outside cells (a negative scope)

13 Modification of DH
- Remove all negative scopes from the design hierarchy D to obtain the scope tree D'
- H(v), the parent of v in D', serves as the clustering scope for v
[Figure: design hierarchy tree D with its positive and negative scopes, and the resulting scope tree D' with scopes H1-H4; e.g., H(u) = H1 and H(v) = H2]
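
A Python sketch of the D → D' pruning under an assumed tree representation (illustrative, not the paper's code); splicing a node out promotes its children to the parent, which is exactly what breaking a negative scope means:

    class Scope:
        def __init__(self, name, r, children=()):
            self.name, self.r = name, r
            self.children = list(children)

    def splice_out(node, drop):
        # Rebuild the tree bottom-up: dropped nodes disappear and their
        # children are promoted to the parent.
        kept = []
        for c in node.children:
            kept.extend(splice_out(c, drop))
        node.children = kept
        return kept if drop(node) else [node]

    def scope_tree(root, r_bar):
        # D -> D': negative scopes (r above the global average r_bar) are
        # spliced out; the root is kept so every cell retains some scope.
        return splice_out(root, lambda n: n.r > r_bar and n is not root)[0]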

14 DH guided ML clustering
Input: bottommost hypergraph G1 and design hierarchy D
Output: k-level clustering tree C

    Modify D to D'
    do
        cluster_one_level(Gk) with D' → upper-level hypergraph Gk+1
        Update D'
        k = k + 1
    until Gk is saturated
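
The loop transcribes directly into Python; here cluster_one_level, update_scopes, and saturated are parameters standing in for the procedures defined on the following slides:

    def dh_guided_clustering(g1, scope_tree, cluster_one_level,
                             update_scopes, saturated):
        levels = [g1]                    # G1, the bottommost hypergraph
        while True:
            gk = levels[-1]
            gk1 = cluster_one_level(gk, scope_tree)   # one coarsening pass
            scope_tree = update_scopes(scope_tree, gk1)
            levels.append(gk1)
            if saturated(len(gk1), len(gk)):
                return levels            # the k-level clustering tree C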

15 Global saturation
Saturation condition (stopping criterion):
- # of vertices ≤ α, or
- problem size reduction rate ≥ β
(α = 100, β = 0.9 in our experiments)
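
As a sketch, the stopping test with the slide's settings, written over vertex counts (this is the saturated parameter of the loop above):

    def saturated(new_size, old_size, alpha=100, beta=0.9):
        # Stop when the coarse hypergraph is small enough, or when one
        # clustering pass shrank it by less than 10% (ratio >= beta).
        return new_size <= alpha or new_size / old_size >= beta

    print(saturated(80, 500))    # True: few enough vertices
    print(saturated(480, 500))   # True: reduction rate too low
    print(saturated(300, 500))   # False: keep clustering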

16 Clustering scope
- A hierarchical node serves as a clustering scope
- For each anchor v, the best neighbor w to be matched with v is searched for within H(v)
- u is selected as an anchor before v if H(u) ⊂ H(v)
[Figure: scope tree D' with scopes H1-H4 and vertices u, v]

17 Scope restricted clustering
cluster_one_level():
- For a randomly selected unmatched vertex v, find the w within the scope H(v) that maximizes the clustering cost
- Vertices with smaller scopes are selected as anchors earlier
- Create a new upper-level cluster v' from v and w
- H(v') := H(v), since H(v) ⊆ H(w)

18 Scope restricted clustering (cont'd)
cluster_one_level(), continued:
- If there is no best target w, create v' from v alone
- If w is already matched into some cluster v', append v to v'
  - The "unmatched" condition is relaxed: already matched neighbors w are also considered → more problem size reduction
  - H(v') := H(v), since H(v) ⊆ H(v')
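
Putting slides 17 and 18 together, a hedged Python sketch of one scope-restricted pass; the data model (neighbor lists, a scope per vertex, scope sizes, a within-scope test, and a pairwise clustering cost) is assumed for illustration:

    def cluster_one_level(vertices, neighbors, scope, scope_size,
                          within, cost):
        cluster_of = {}                  # vertex -> upper-level cluster id
        clusters = []                    # cluster id -> member vertices
        # Vertices with smaller scopes act as anchors first.
        for v in sorted(vertices, key=lambda u: scope_size[scope[u]]):
            if v in cluster_of:
                continue                 # v was already pulled into a cluster
            # Candidates inside H(v); already matched neighbors remain
            # eligible (the relaxed condition) for more size reduction.
            cands = [w for w in neighbors[v] if within(w, scope[v])]
            if not cands:
                clusters.append([v])     # no target: singleton cluster v'
                cluster_of[v] = len(clusters) - 1
                continue
            w = max(cands, key=lambda u: cost(v, u))   # best target in H(v)
            if w in cluster_of:
                cluster_of[v] = cluster_of[w]
                clusters[cluster_of[w]].append(v)      # append v to w's v'
            else:
                clusters.append([v, w])                # new cluster v' = {v, w}
                cluster_of[v] = cluster_of[w] = len(clusters) - 1
        return clusters                  # each v' inherits H(v') := H(v)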

19 One level clustering
- No reduction rate control, to take full advantage of the design hierarchy → aggressively reduced # of levels in the resulting clustering tree
- Cluster sizes are controlled so that they cannot exceed bal_ratio × total_size
- Local saturation condition for scope X: # of vertices in X ≤ α(X), or size reduction rate in X ≥ β(X)
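
Both controls are simple predicates that plug into the merge decisions above; a minimal sketch with illustrative parameter names:

    def fits_size_cap(cluster_size, target_size, bal_ratio, total_size):
        # A merge is allowed only while the grown cluster stays within
        # bal_ratio * total_size.
        return cluster_size + target_size <= bal_ratio * total_size

    def scope_saturated(size_x, prev_size_x, alpha_x, beta_x):
        # Per-scope analogue of the global saturation test on slide 15.
        return size_x <= alpha_x or size_x / prev_size_x >= beta_x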

20 Scope tree restructuring
- The scope tree is restructured after one level of clustering by removing saturated scopes
- The enlarged clustering scopes are then used at the next, higher clustering level, which has bigger and fewer clusters
[Figure: scope tree D' with H(u) = H1 and H(v) = H2; after one level of clustering H1 and H2 are saturated and removed, so H(u') = H3 and H(v') = H4]
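
Restructuring is the same splice used to build D' (splice_out from the sketch after slide 13), now driven by saturation rather than the Rent exponent; saturated_scopes is an assumed set of scope nodes:

    def restructure(root, saturated_scopes):
        # Saturated scopes vanish; their clusters are then searched within
        # the enlarged parent scope at the next level.
        return splice_out(root,
                          lambda n: n in saturated_scopes and n is not root)[0]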

21 DH guided ML partitioning
dhml:
1. Perform the Rent exponent computation on D
2. Apply DH guided ML clustering to obtain the k-level clustering tree C
3. At the coarsest level, execute 20 runs of FM and pick the best result
4. From the partition at level k down to level 0, apply unclustering and FM_partition to improve the partition inherited from the upper level
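
The four steps as a Python sketch; fm_partition (returning a partition and its cut size, optionally seeded with a start partition) and project (mapping a partition one level down the clustering tree) are stand-ins, not the paper's interfaces:

    def dhml(levels, fm_partition, project, n_starts=20):
        # levels comes from dh_guided_clustering: levels[0] is G1,
        # levels[-1] is the coarsest hypergraph at level k.
        part, cut = min((fm_partition(levels[-1]) for _ in range(n_starts)),
                        key=lambda pc: pc[1])     # best of 20 FM runs
        for g in reversed(levels[:-1]):
            # Uncluster one level, then refine the projected partition.
            part, cut = fm_partition(g, start=project(part, g))
        return part, cut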

22 DH guided ML partitioning
- Multi-way partitioning: dhml RBP (recursive bi-partitioning)
- Partial design hierarchy trees are used at each sub-partitioning
- Performance compared with the hMetis RBP version
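
A sketch of the recursion for k a power of two; bipartition (one dhml 2-way cut), subgraph, and subtree (extracting the partial design hierarchy of one side) are assumed helpers:

    def rbp(graph, hierarchy, k, bipartition, subgraph, subtree):
        if k == 1:
            return [graph]               # one final block of the k-way cut
        left, right = bipartition(graph, hierarchy)
        blocks = []
        for side in (left, right):
            # Each half carries only its part of the design hierarchy.
            blocks += rbp(subgraph(graph, side), subtree(hierarchy, side),
                          k // 2, bipartition, subgraph, subtree)
        return blocks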

23 Experimental results
Circuit characteristics:

    Circuit   # cells   # nets    levels / # hier nodes
    Ind1       15186     19152     6 / 302
    Ind2      136340    183340     9 / 10427
    Ind3      224908    187595     5 / 57590
    Ind4      414633    414013    13 / 94796
    Ind5                          13 / 33277
    Ind6                          11 / 35449

24 Experimental results
Cut set size comparison (minimum cut size from 5 runs of dhml vs. 10 runs of hMetis RBP); up to 16% better quality in half the number of runs:

    Circuit    2-way            16-way           256-way
               dhml   hMetis    dhml    hMetis   dhml    hMetis
    Ind1         64       69     437      483        -       -
    Ind2        133      134    1203     1294    14633   16137
    Ind3        292      305    1454     1551     7450    7508
    Ind4        202      208    3394     3498    12013   13999
    Ind5       1376     1352    7410     7950    22474   24454
    Ind6         55       56    8275     8265    33472   35075

25 Experimental results Quality stability

26 Experimental results
Observations:
- 20-50% better quality in the initial partition at the coarsest level
- Number of levels reduced to 55-75% of hMetis's, while still producing up to 16% better cut quality
- More stable cut quality, implying that fewer runs are needed to obtain a near-best solution
- Runtime similar to, or slightly higher than, hMetis

27 Summary
- Systematic ML partitioning method exploiting the design hierarchy presented
- ML clustering guided by the design hierarchy:
  - Rent exponent
  - Clustering scope restriction
  - Dynamic scope restructuring
- Experimental results show:
  - A better clustering tree
  - More stable and higher quality solutions

