
1 DUSD(Labs) GSRC bX update, March 2003. Aaron Ng, Marius Eriksen and Igor Markov, University of Michigan

2 Outline
- Motivation, issues in benchmarking
- bX in the picture
- Sample application: evaluation of tools
- Future focus
- Contact info, links

3 Motivation, issues in benchmarking
1. Evaluation
- independent reproduction of results and experiments
- explicit methods required
  - minimum room for misinterpretation of results
- evaluation of algorithms across the entire problem space
  - conflicting and correlated optimization objectives
  - separation of placement and routing tasks

4 Motivation, issues in benchmarking (cont'd)
2. Availability of results
- raw experimental results
- availability allows verification
- results provide insight into the performance of a tool

5 Motivation, issues in benchmarking (cont'd)
3. Standard formats
- meaningful comparison of results
- compatibility between tools and benchmarks
- correct interpretation of benchmarks

6 bX in the picture
1. Automation
- 'live' repository
  - support for execution of tools on benchmarks
  - distributed network of computational hosts
- online reporting of results
  - automatic updates when changes in dependencies occur (a sketch follows below)
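The slides do not show how dependency tracking is implemented, so the following is only a minimal Python sketch of the idea: when a new version of a tool or benchmark is uploaded, every stored result that depends on it is marked stale and re-queued. All names here are hypothetical, not the actual bX implementation.

```python
# Minimal sketch of dependency-triggered re-runs; names are hypothetical.
from collections import defaultdict

class ResultStore:
    def __init__(self):
        self.deps = defaultdict(set)   # result id -> artifacts it depends on
        self.stale = set()             # results queued for re-execution

    def record(self, result_id, dependencies):
        self.deps[result_id] = set(dependencies)

    def on_upload(self, changed_artifact):
        # A new tool or benchmark version invalidates dependent results.
        for result_id, deps in self.deps.items():
            if changed_artifact in deps:
                self.stale.add(result_id)

store = ResultStore()
store.record("capo-on-peko01", {"tool:Capo", "benchmark:peko01"})
store.on_upload("tool:Capo")   # a new Capo build is uploaded
print(store.stale)             # {'capo-on-peko01'}: re-run, repost online
```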

7 bX in the picture (cont'd)
2. Scripts and flows
- reproduction of results
  - scripts and flows describe experiments
  - scripts can be saved, shared and reused
- representation of the entire problem space
  - relationship between optimization objectives
  - e.g. the effect of placement results on routing (see the sketch after this list)
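As an illustration of the placement-to-routing relationship (a hedged sketch; this is not bX's actual script format), a two-stage flow can pass the placement result directly to the router, making the interaction between the two optimization objectives explicit and repeatable:

```python
# Hypothetical two-stage flow: the placement result feeds the router, so the
# effect of placement quality on routing is measured explicitly. Stub tools
# stand in for real placers and routers registered in the repository.
def run_flow(benchmark, placer, router):
    placement = placer(benchmark)            # stage 1: placement
    routing = router(benchmark, placement)   # stage 2: route that placement
    return placement, routing

place = lambda bench: {"cells": f"placed({bench})"}
route = lambda bench, pl: {"wires": f"routed({pl['cells']})"}
print(run_flow("ibm01", place, route))
```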

8 bX in the picture (cont'd)
3. Standard formats
- interoperability between tools and benchmarks
- meaningful comparison of results

9 Sample application: Evaluation of tools
1. Placers
Capo
- randomized
- fixed-die placer
- emphasis on routability
- tuned on proprietary Cadence benchmarks

10 Sample application: Evaluation of tools (cont'd)
1. Placers (cont'd)
Dragon
- randomized
- variable-die placer
- tuned on IBM-Place benchmarks

11 Sample application: Evaluation of tools (cont'd)
1. Placers (cont'd)
KraftWerk
- deterministic
- fixed-die placer
- results typically have cell overlaps
- additional legalization step by DOMINO

12 Sample application: Evaluation of tools (cont'd)
2. Benchmarks
PEKO
- artificial netlists
- designed to match statistical parameters of IBM netlists
- known optimal wirelength
- concern that they are not representative of industry circuits

13 Sample application: Evaluation of tools (cont'd)
2. Benchmarks (cont'd)
grids
- 4 fixed vertices, n^2 1x1 movables
- tests placers on datapath-like circuits
- known optimal placement
- results are easily visualized for debugging (see the sketch below)
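One plausible construction of such a grid benchmark, sketched in Python (not bX code; the exact netlist format is not shown in the slides):

```python
# Hypothetical generator for a 'grid' benchmark: one plausible construction
# with 4 fixed corner vertices and n*n movable 1x1 cells whose optimal
# positions are the grid points themselves.
def make_grid(n):
    fixed = [(0, 0), (0, n - 1), (n - 1, 0), (n - 1, n - 1)]
    optimal = [(i, j) for i in range(n) for j in range(n)]
    return fixed, optimal

fixed, optimal = make_grid(4)
print(len(optimal))   # 16 movable cells; a placer's deviation from these
                      # points is obvious when the result is plotted
```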

14 Sample application: Evaluation of tools (cont'd)
3. Example flow
[Flow diagram: benchmark + parameters -> placer -> placement -> evaluator -> evaluation -> post-processor -> post-processing]
A script in bX serves as a template describing an experiment, and can be saved and shared. Scripts are instantiated by defining the individual components of the script. Flows are instantiated scripts. Flows can be re-executed to reproduce results (see the sketch below).
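Since the slides do not show bX's concrete script syntax, here is a hedged Python sketch of the template-versus-instance distinction: a script leaves components unbound, and binding every component yields a flow that can be re-executed verbatim.

```python
# Hedged sketch of the template-vs-instance distinction; bX's concrete
# script syntax is not shown in the slides.
script = {"placer": None, "benchmark": None, "parameters": "default",
          "evaluator": None, "post-processor": None}

def instantiate(template, **bindings):
    flow = {**template, **bindings}
    unbound = [k for k, v in flow.items() if v is None]
    if unbound:
        raise ValueError(f"script not fully instantiated: {unbound}")
    return flow   # a flow: every component bound, re-executable verbatim

flow = instantiate(script, placer="Capo", benchmark="PEKO",
                   evaluator="overlap/legality & wirelength",
                   **{"post-processor": "placement map"})
```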

15 Sample application: Evaluation of tools (cont'd)
3. Example flow (cont'd)
Flow parameters: placer = Capo, benchmark = PEKO (default parameters), evaluator = overlap/legality & wirelength, post-processor = placement map.
After completion, the results of the jobs are automatically posted online. For the placement job, the results include wirelength and runtime (a minimal wirelength evaluator is sketched below).
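The slides do not define the wirelength metric; half-perimeter wirelength (HPWL) is the standard measure for placements, and a minimal evaluator looks like this:

```python
# Minimal half-perimeter wirelength (HPWL) evaluator: for each net, take the
# bounding box of its pins and sum width + height over all nets.
def hpwl(placement, nets):
    total = 0.0
    for net in nets:                        # net = list of cell names
        xs = [placement[c][0] for c in net]
        ys = [placement[c][1] for c in net]
        total += (max(xs) - min(xs)) + (max(ys) - min(ys))
    return total

placement = {"a": (0, 0), "b": (3, 1), "c": (1, 4)}
print(hpwl(placement, [["a", "b"], ["a", "b", "c"]]))  # 4.0 + 7.0 = 11.0
```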

17 Sample application: Evaluation of tools (cont'd)
3. Example flow (cont'd)
If we swap Capo with Dragon, only the placer binding changes: placer = Dragon, benchmark = PEKO (default parameters), evaluator = overlap/legality & wirelength, post-processor = placement map (see the sketch below).
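In the conventions of the earlier sketch (hypothetical, not bX syntax), the swap is a one-field change to an otherwise identical flow:

```python
# Hypothetical: derive a new flow by rebinding a single component.
capo_flow = {"placer": "Capo", "benchmark": "PEKO", "parameters": "default",
             "evaluator": "overlap/legality & wirelength",
             "post-processor": "placement map"}
dragon_flow = {**capo_flow, "placer": "Dragon"}   # everything else unchanged
print(dragon_flow["placer"])                      # Dragon
```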

20 Sample application: Evaluation of tools (cont'd)
3. Example flow (cont'd)
Modify the flow: change the benchmark from PEKO to grid, and the post-processor from placement map to grid graph. Flow parameters: placer = Capo, benchmark = grid (default parameters), evaluator = overlap/legality & wirelength, post-processor = grid graph.

25 Sample application: Evaluation of tools (cont'd)
3. Example flow (cont'd)
Swap Capo with Dragon on the grid benchmark: placer = Dragon, benchmark = grid (default parameters), evaluator = overlap/legality & wirelength, post-processor = grid graph.

28 Sample application: Evaluation of tools (cont'd)
3. Example flow (cont'd)
Change the post-processor to a congestion map, then swap Capo with KraftWerk: placer = KraftWerk, benchmark = PEKO (default parameters), evaluator = overlap/legality & wirelength, post-processor = congestion map.

31 Sample application: Evaluation of tools (cont'd)
3. Example flow (cont'd)
KraftWerk placements typically contain cell overlaps, so a legalization stage is inserted between placement and evaluation: legalizer = DOMINO. [Flow: benchmark + parameters -> placer -> placement -> legalizer -> legalization -> evaluator -> evaluation -> post-processor -> post-processing] (a sketch follows below)
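In pipeline terms, legalization is one more stage spliced between placement and evaluation; a hedged sketch with stub tools standing in for KraftWerk and DOMINO:

```python
# Hypothetical: legalization spliced between placement and evaluation.
def run(benchmark, placer, legalizer, evaluator):
    placement = placer(benchmark)
    legal = legalizer(placement)   # e.g. DOMINO removes cell overlaps
    return evaluator(legal)

place = lambda bench: [("a", 0.0, 0.1), ("b", 0.0, 0.1)]  # overlapping cells
legalize = lambda cells: [("a", 0, 0), ("b", 1, 0)]       # overlap removed
check = lambda cells: len({(x, y) for _, x, y in cells}) == len(cells)
print(run("peko01", place, legalize, check))              # True: legal now
```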

35 Sample application: Evaluation of tools (cont'd)
3. Example flow (cont'd)
The flow extends further with a routing stage after legalization: [Flow: benchmark + parameters -> placer -> placement -> legalizer -> legalization -> router -> routing -> evaluator -> evaluation -> post-processor -> post-processing]. Parameters as before: placer = KraftWerk, benchmark = PEKO (default parameters), legalizer = DOMINO, evaluator = overlap/legality & wirelength, post-processor = congestion map; the router component is not bound to a specific tool.

38 Sample application: Evaluation of tools (cont'd)
3. Example flow (cont'd)
Multiple evaluators can be attached to the same flow: evaluator1 = overlap/legality & wirelength, evaluator2 = routability, evaluator3 = timing analysis. [Flow: benchmark + parameters -> placer -> placement -> legalizer -> legalization -> evaluator1/evaluator2/evaluator3 -> evaluations] Parameters as before: placer = KraftWerk, benchmark = PEKO (default parameters), legalizer = DOMINO. (A sketch follows below.)
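Running several evaluators on the same legalized placement guarantees every metric sees identical input; a hedged sketch with stub metrics:

```python
# Hypothetical: fan one legalized placement out to several evaluators so
# every metric is computed on identical input. The metric functions below
# are stubs, not real evaluators.
def evaluate_all(legal_placement, evaluators):
    return {name: fn(legal_placement) for name, fn in evaluators.items()}

evaluators = {
    "evaluator1 (overlap/legality & wirelength)": lambda p: len(p),
    "evaluator2 (routability)": lambda p: len(p),
    "evaluator3 (timing analysis)": lambda p: len(p),
}
print(evaluate_all([("a", 0, 0), ("b", 1, 0)], evaluators))
```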

43 Future Focus
1. Easy deployment
- downloadable bX distribution
  - in the form of a binary or installation package

44 Future Focus (cont'd)
2. Interpretation of results
- multiple views and query support (see the sketch below), for example:
  - 'show all results for solver S'
  - 'show the hardest benchmarks for solver S'
  - 'has the solution quality decreased for benchmark B, since the upload of the new version of solver S?'
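If the results live in a relational store (an assumption; the slides do not specify a backend), such queries map directly onto SQL. A self-contained sketch using Python's sqlite3, with an invented schema and data:

```python
# Hypothetical results schema queried through sqlite3; the slides do not
# specify bX's storage backend, so both schema and data are invented.
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE results
              (solver TEXT, version INTEGER, benchmark TEXT, wirelength REAL)""")
db.executemany("INSERT INTO results VALUES (?,?,?,?)", [
    ("S", 1, "B", 1.05e8),
    ("S", 2, "B", 1.10e8),
])

# 'show all results for solver S'
print(db.execute("SELECT * FROM results WHERE solver='S'").fetchall())

# 'has the solution quality decreased for benchmark B since the upload of
# the new version of solver S?' (higher wirelength = lower quality)
(old_wl,), (new_wl,) = db.execute(
    """SELECT wirelength FROM results
       WHERE solver='S' AND benchmark='B' ORDER BY version""").fetchall()
print(new_wl > old_wl)   # True: quality decreased with version 2
```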

45 Future Focus (cont'd)
3. Type checking
- MIME-like affinity between solvers and benchmarks (see the sketch below)
  - compatibility checks
  - useful for performing queries on different 'families'
- 'learning' of new file types
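The MIME analogy suggests tagging each tool and benchmark with a format type and checking affinity before a run; a hedged sketch with invented type tags:

```python
# Hypothetical MIME-like affinity check; the type tags are invented.
SOLVER_ACCEPTS = {"Capo": {"bookshelf/placement"},
                  "Dragon": {"bookshelf/placement"}}
BENCHMARK_TYPE = {"peko01": "bookshelf/placement",
                  "grid4": "bookshelf/placement"}

def compatible(solver, benchmark):
    return BENCHMARK_TYPE.get(benchmark) in SOLVER_ACCEPTS.get(solver, set())

print(compatible("Capo", "peko01"))   # True: same format 'family'
```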

46 Future Focus (cont'd)
4. GSRC Bookshelf
- populate bX with implementations from the Bookshelf
  - still the same 'one-stop shop'
  - except that it will be a live repository

47 Future Focus (cont'd)
5. OpenAccess
- a method of communicating data between jobs
  - provides interoperability between tools
- a single 'design-through-manufacturing' data model

48 Contact info, links
For more info or source code: bx@umich.edu
Feedback and comments are appreciated.
OpenAccess: www.openeda.org, www.cadence.com/feature/open_access.html
GSRC Bookshelf: www.gigascale.org/bookshelf
Thanks!

