
Design Hierarchy Guided Multilevel Circuit Partitioning



Presentation Transcript


  1. Design Hierarchy Guided Multilevel Circuit Partitioning Yongseok Cheon and D.F. Wong Department of Computer Sciences The University of Texas at Austin

  2. Outline • Motivation & Contribution • Problem • Design hierarchy • Rent’s rule & Rent exponent • Our approach • Design hierarchy guided clustering • Design hierarchy guided ML partitioning • Experimental results

  3. Motivation • Natural question: How to use design hierarchy for partitioning? • Effectiveness of multilevel partitioning • Similarity between design hierarchy (DH) and ML clustering tree

  4. Contribution • Rent exponent as a quality indicator • Intelligent and systematic use of hierarchical logical grouping information for better partitioning • Higher-quality, more stable partitioning results

  5. Partitioning problem [Figure: a netlist hypergraph and the resulting partitioned hypergraph]

  6. Multilevel partitioning • (1) Multilevel clustering (coarsening) • (2) Initial partitioning • (3) Multilevel FM refinement with unclustering (uncoarsening) • hMetis

  7. DH guided ML partitioning

  8. Design hierarchy • Hierarchical grouping that already carries implicit connectivity information • Rent’s rule is used to identify which hierarchical elements are good or bad in terms of physical connectivity

  9. Rent’s rule • Rent’s rule: E = P · B^r • E = external pin count • B = # of cells inside • P = avg # of pins per cell • r = Rent exponent

  10. Rent exponent • For a hierarchical element H with • E = external pin count • I = internal pin count • P = avg # of pins per cell = (I+E)/|H| • Applying Rent’s rule E = P · |H|^r(H) gives the Rent exponent for H: r(H) = 1 + ln(E/(I+E)) / ln|H|

  11. Rent exponent • Small r → more strongly connected cells inside, e.g. r = ln(4/34)/ln 10 + 1 ≈ 0.07 • Large r → more weakly connected cells inside, e.g. r = ln(15/25)/ln 10 + 1 ≈ 0.78
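
For illustration, here is a minimal Python sketch of that computation, assuming the per-element formula r = 1 + ln(E/(I+E))/ln(B) reconstructed on slide 10; the function and argument names are illustrative, not from the paper.

```python
import math

def rent_exponent(external_pins, internal_pins, num_cells):
    """Rent exponent of a hierarchical element: from E = P * B^r with
    B = num_cells and P = (I + E) / B, so r = 1 + ln(E / (I + E)) / ln(B)."""
    total_pins = external_pins + internal_pins
    return 1.0 + math.log(external_pins / total_pins) / math.log(num_cells)

# The two ten-cell examples from slide 11:
print(rent_exponent(external_pins=4, internal_pins=30, num_cells=10))   # ~0.07 (strongly connected)
print(rent_exponent(external_pins=15, internal_pins=10, num_cells=10))  # ~0.78 (weakly connected)
```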

  12. Selective preservation of DH • Global Rent exponent r = weighted average of the Rent exponents of all hierarchical elements in DH • A hierarchical element H is determined to be preserved or broken according to r(H) • If r(H) ≤ r : H will be used as a search scope for clustering of the cells inside H – positive scope • If r(H) > r : H is removed from DH and the cells inside H can be freely clustered with outside cells – negative scope
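
A minimal sketch of this selective preservation step, assuming the global exponent is averaged with cell-count weights (the exact weighting is not stated on the slide) and using a hypothetical HierElement container:

```python
from dataclasses import dataclass

@dataclass
class HierElement:
    name: str
    rent_exp: float   # r(H) for this hierarchical element
    num_cells: int    # |H| = number of cells inside

def global_rent_exponent(elements):
    """Weighted average of the per-element Rent exponents (weighted by |H| here)."""
    total_cells = sum(h.num_cells for h in elements)
    return sum(h.rent_exp * h.num_cells for h in elements) / total_cells

def classify_scopes(elements):
    """Split hierarchical elements into positive (preserved) and negative (broken) scopes."""
    r_global = global_rent_exponent(elements)
    positive = [h for h in elements if h.rent_exp <= r_global]  # kept as clustering scopes
    negative = [h for h in elements if h.rent_exp > r_global]   # removed; cells cluster freely
    return positive, negative
```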

  13. Modification of DH • Remove all negative scopes from design hierarchy D → scope tree D' • H(v) (parent of v in D’) serves as the clustering scope for v [Figure: design hierarchy D with positive and negative scopes, and the resulting scope tree D' with scopes H1–H4, where H(u) = H1 and H(v) = H2]
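
To illustrate the D → D' modification, the sketch below splices negative scopes out of a toy hierarchy and promotes their children to the deleted node's parent; the dict-based node shape and the example tree are hypothetical, and leaf cells are treated as positive.

```python
def to_scope_tree(node):
    """Copy a design-hierarchy node, splicing out negative scopes and adopting their children."""
    kept = []
    for child in node["children"]:
        pruned = to_scope_tree(child)
        if pruned["positive"]:
            kept.append(pruned)               # positive scope (or leaf cell): keep as-is
        else:
            kept.extend(pruned["children"])   # negative scope: drop it, keep its children
    return {"name": node["name"], "positive": node["positive"], "children": kept}

# Toy example: H2 is a negative scope, so its cell moves directly under the root of D'.
D = {"name": "top", "positive": True, "children": [
        {"name": "H1", "positive": True,  "children": [{"name": "a", "positive": True, "children": []}]},
        {"name": "H2", "positive": False, "children": [{"name": "b", "positive": True, "children": []}]}]}
D_prime = to_scope_tree(D)   # "b" now hangs directly off "top", so H(b) becomes "top"
```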

  14. DH guided ML clustering • Input: bottommost hypergraph G1 & design hierarchy D • Output: k-level clustering tree C • Modify D to D’ • do • Perform cluster_one_level(Gk) with D’ → upper level hypergraph Gk+1 • Update D’ • k = k+1 • until Gk is saturated

  15. Global saturation • Saturation condition (stopping criterion): • # of vertices ≤ α, or • problem size reduction rate ≥ β • (α = 100, β = 0.9 in our experiments)
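
Putting slides 14 and 15 together, the outer coarsening loop might look like the sketch below; cluster_one_level is only a placeholder here (a scope-restricted version is sketched after slide 18), the scope-tree update is omitted, and the hypergraph is abstracted to its vertex count.

```python
ALPHA, BETA = 100, 0.9   # saturation thresholds from slide 15

def is_saturated(prev_size, cur_size):
    """Global stopping criterion: few vertices left, or the size reduction has stalled."""
    return cur_size <= ALPHA or cur_size / prev_size >= BETA

def cluster_one_level(num_vertices):
    """Placeholder for one scope-restricted clustering pass (sketched after slide 18);
    here it just pretends to merge vertices in pairs."""
    return (num_vertices + 1) // 2

def dh_guided_ml_clustering(num_vertices):
    """Return the vertex counts of the successive levels G1, G2, ..., Gk."""
    levels = [num_vertices]
    while True:
        levels.append(cluster_one_level(levels[-1]))
        if is_saturated(levels[-2], levels[-1]):
            break
    return levels

print(dh_guided_ml_clustering(100000))   # shrinks level by level until <= 100 vertices remain
```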

  16. Clustering scope • Hierarchical node as clustering scope • For each anchor v, the best neighbor w to be matched with v is searched within H(v) • u is selected as an anchor before v if H(u) ⊆ H(v) [Figure: scope tree D' with scopes H1–H4 and cells u, v]

  17. Scope restricted clustering • cluster_one_level() • For a randomly selected unmatched vertex v, find the w within the scope H(v) that maximizes the clustering cost • Vertices with smaller scopes are selected as anchors earlier • Create a new upper level cluster v’ with v and w • H(v’) := H(v), since H(v) ⊆ H(w)

  18. Scope restricted clustering (cont’d) • cluster_one_level() – continued • If there is no best target w, create v’ only with v • If w is already matched into a cluster v’, append v to v’ • The “unmatched” condition is relaxed – an already matched neighbor w is also considered → more problem size reduction • H(v’) := H(v), since H(v) ⊆ H(v’)
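
A sketch of one scope-restricted clustering pass covering slides 16–18, under several simplifying assumptions: candidate neighbors are limited to vertices carrying the same scope as the anchor (a simplification of searching within H(v)), a shared-net count stands in for the paper's clustering cost, and the cluster-size cap of slide 19 is omitted.

```python
from collections import defaultdict

def cluster_one_level(vertices, nets, scope_of, scope_size):
    """vertices: vertex ids; nets: lists of vertex ids; scope_of[v]: clustering scope of v;
    scope_size[s]: number of cells under scope s.  Returns cluster_of: vertex -> cluster id."""
    # Stand-in clustering cost: number of nets shared by a pair of vertices.
    shared = defaultdict(int)
    for net in nets:
        for i, a in enumerate(net):
            for b in net[i + 1:]:
                shared[(a, b)] += 1
                shared[(b, a)] += 1

    # Vertices with smaller scopes become anchors earlier (slide 17).
    order = sorted(vertices, key=lambda v: scope_size[scope_of[v]])
    cluster_of, next_id = {}, 0
    for v in order:
        if v in cluster_of:
            continue
        # Best neighbor w, restricted here to vertices carrying the same scope H(v).
        candidates = [w for w in vertices
                      if w != v and scope_of[w] == scope_of[v] and shared[(v, w)] > 0]
        if not candidates:
            cluster_of[v] = next_id              # no best target: singleton cluster (slide 18)
            next_id += 1
            continue
        w = max(candidates, key=lambda u: shared[(v, u)])
        if w in cluster_of:
            cluster_of[v] = cluster_of[w]        # relaxed match: append v to w's cluster
        else:
            cluster_of[v] = cluster_of[w] = next_id
            next_id += 1
    return cluster_of
```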

  19. One level clustering • No reduction rate control, to take full advantage of design hierarchy → aggressively reduced # of levels in the resulting clustering tree • Cluster sizes are controlled such that they cannot exceed bal_ratio × total_size • Local saturation condition for a scope X: • # of vertices in X ≤ α(X), or • size reduction rate in X ≥ β(X)
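
Hypothetical helpers for the two controls on this slide; the per-scope thresholds alpha_x and beta_x are left as parameters because their experimental values are not given in the transcript.

```python
def max_cluster_size(bal_ratio, total_size):
    """Upper bound on any cluster's size: bal_ratio * total_size (slide 19)."""
    return bal_ratio * total_size

def scope_saturated(prev_vertices, cur_vertices, alpha_x, beta_x):
    """Local saturation for a scope X: few vertices left inside X,
    or the size reduction within X has stalled."""
    return cur_vertices <= alpha_x or cur_vertices / prev_vertices >= beta_x
```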

  20. Scope tree restructuring • Scope tree is restructured after one level clustering by removing saturated scopes • Enlarged clustering scopes are used at higher level clustering with bigger & fewer clusters [Figure: scope tree D' with H(u) = H1 and H(v) = H2; after one level of clustering, H1 and H2 are saturated, so the restructured scope tree gives H(u') = H3 and H(v') = H4]
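
One way to realize this restructuring: after a clustering pass, each new cluster's scope is walked up to its nearest unsaturated ancestor in D'. The parent map, the saturated set, and the example tree shape are assumptions based on the slide's figure.

```python
def enlarged_scope(scope, parent, saturated):
    """Walk a cluster's scope up to its nearest unsaturated ancestor in the scope tree."""
    while scope in saturated and scope in parent:
        scope = parent[scope]
    return scope

# Example matching the slide's figure (assuming H1 sits under H3 and H2 under H4 in D'):
parent = {"H1": "H3", "H2": "H4", "H3": "D'", "H4": "D'"}
print(enlarged_scope("H1", parent, {"H1", "H2"}))   # -> H3, so H(u') = H3
print(enlarged_scope("H2", parent, {"H1", "H2"}))   # -> H4, so H(v') = H4
```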

  21. DH guided ML partitioning • dhml • Perform Rent exponent computation on D • Apply DH guided ML clustering to obtain k level clustering tree C • At the coarsest level, execute 20 runs of FM and pick the best one • From the partition at level k down to level 0, apply unclustering and FM_partition to improve the partition from upper levels
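
A high-level sketch of the dhml flow, with the FM refinement, unclustering, and cut metric left as clearly labeled placeholders since the transcript does not show them:

```python
import random

def fm_partition(partition, level):
    """Placeholder for one FM refinement pass at the given level."""
    return partition   # a real pass would move vertices to reduce the cut

def uncluster(partition, level):
    """Placeholder: project a level-(k+1) partition onto the level-k vertices."""
    return partition

def cut_size(partition):
    """Placeholder cut metric; random only so that the sketch executes."""
    return random.random()

def dhml(coarsest_partition, num_levels, fm_runs=20):
    # Steps 1-2 of slide 21 (Rent exponents on D, DH guided clustering) are sketched earlier.
    # Step 3: 20 independent FM runs at the coarsest level, keep the best one.
    best = min((fm_partition(coarsest_partition, num_levels) for _ in range(fm_runs)),
               key=cut_size)
    # Step 4: project down level by level, refining with FM after each unclustering step.
    part = best
    for level in range(num_levels - 1, -1, -1):
        part = fm_partition(uncluster(part, level), level)
    return part
```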

  22. DH guided ML partitioning • Multi-way partitioning: dhml RBP • Recursive bi-partitioning • Partial design hierarchy trees used at each sub-partitioning • Performance compared with hMetis RBP version
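
For the multi-way flow, a generic recursive bi-partitioning skeleton could look like this; bipartition is a stub standing in for a full dhml run on each sub-netlist, and power-of-two part counts with even splits are assumed for simplicity.

```python
def bipartition(cells):
    """Stub standing in for one dhml run on the sub-netlist (with its partial
    design-hierarchy tree); a real version returns the two blocks of the best cut."""
    mid = len(cells) // 2
    return cells[:mid], cells[mid:]

def recursive_partition(cells, k):
    """Split cells into k parts by recursive bi-partitioning (k a power of two here)."""
    if k == 1:
        return [cells]
    left, right = bipartition(cells)
    return recursive_partition(left, k // 2) + recursive_partition(right, k // 2)

print(recursive_partition(list(range(16)), 4))   # four blocks of four cells each
```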

  23. Experimental results • Circuit characteristics

  24. Experimental results • Cut set size comparison (minimum cut size from 5 runs of dhml & 10 runs of hMetis RBP) • Up to 16% better quality with half the # of runs

  25. Experimental results • Quality stability

  26. Experimental results • Observation • 20-50% better quality in the initial partition at the coarsest level • Number of levels reduced to 55-75% of hMetis while still producing up to 16% better cut quality • More stable cut quality, implying a smaller # of runs needed to obtain a near-best solution • Similar or slightly higher runtime than hMetis

  27. Summary • A systematic ML partitioning method exploiting design hierarchy was presented • ML clustering guided by design hierarchy • Rent exponent • Clustering scope restriction • Dynamic scope restructuring • Experimental results show… • Better clustering tree • More stable and higher-quality solutions
