
Slide 1: A Metrics System for Continuous Improvement of Design Technology
Andrew B. Kahng and Stefanus Mantik (DARPA)

Slide 2: Motivation: Complexity of the Design Process
- The ability to make silicon has outpaced the ability to design it
- Complex data and system interactions
- SOC:
  - more functionality and customization, in less time
  - design at higher levels of abstraction; reuse existing design components
  - customized circuitry must be developed predictably, with less risk
- Key question: "Will the project succeed, i.e., finish on schedule and under budget while meeting performance goals?"
- SOC design requires an organized, optimized design process

Slide 3: Value of CAD Tool Improvement Is Not Clear
- What is the dollar value of a "better" scheduler, mapper, or placer?
- What is the dollar value of a GUI, of usability, ...?
- What is the right objective?
  - minimum wirelength -> routable?
  - minimum literals -> amenable to layout?
- Value is well-defined only in the context of the overall design process

Slide 4: What Is the Design Process?
- Not like any "flow/methodology" bubble chart:
  - backs of envelopes, budgeting wars
  - changed specs, silent decisions, e-mails, lunch discussions
  - ad hoc assignments of people and tools to meet current needs
  - proprietary databases, incompatible scripts/tools, platform-dependent GUIs, lack of usable standards
  - design managers operate on intuition; engineers focus on tool shortcomings
- Why did it fail?
  - "CAD tools"
  - "inexperienced engineers"
- Must measure to diagnose, and diagnose to improve

Slide 5: What Should Be Measured?
- Many possibilities:
  - running a tool with the wrong options, or the wrong subset of a standard
  - a bug in a translator/reader
  - assignment of a junior designer to a project with multiple clocks
  - the difference between 300 MHz and 200 MHz in the spec
  - changing an 18-bit adder into a 28-bit adder midstream
  - the decision to use domino logic in critical paths
  - one group stops attending budget/floorplan meetings
- Solution: record everything, then mine the data

Slide 6: Design Process Data Collection
- What revision of what block was what tool called on?
  - by whom?
  - when?
  - how many times? with what keystrokes?
- What happened within the tool as it ran?
  - what were the CPU time, memory usage, and solution quality?
  - what were the key attributes of the instance?
  - what iterations/branches were made, under what conditions?
- What else was occurring in the project?
  - e-mails, spec revisions, constraint and netlist changes, ...
- Everything is fair game, bounded only by server bandwidth (one possible record layout is sketched below)
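The slides do not define a concrete record format; the following C sketch is one hypothetical layout for a per-run record covering the questions listed above. All field names are illustrative, not taken from the METRICS project, which stores name/value pairs keyed by project, flow, and tool run.

    #include <time.h>

    /* Hypothetical per-run record; field names are illustrative. */
    typedef struct {
        int    project_id;       /* which project */
        int    flow_id;          /* which flow within the project */
        int    tool_id;          /* which tool run */
        char   tool_name[64];    /* e.g., "CongestionAnalysis" */
        char   block_name[64];   /* which block the tool was called on */
        int    block_revision;   /* which revision of that block */
        char   user[32];         /* by whom */
        time_t start_time;       /* when */
        int    invocation;       /* how many times so far */
        double cpu_seconds;      /* CPU usage */
        long   peak_memory_kb;   /* memory usage */
        double quality;          /* solution quality (tool-specific) */
    } ToolRunRecord;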

Slide 7: Unlimited Range of Possible Diagnoses
- User performs the same operation repeatedly with nearly identical inputs:
  - the tool is not acting as expected
  - solution quality is poor, and knobs are being twiddled
- E-mail traffic in a project:
  - missed deadline, missed revised deadline; people disengaged; project failed
- On-line docs always open to a particular page:
  - a command/option is unclear

Slide 8: METRICS System Architecture
(diagram) Each tool carries a transmitter, either as a wrapper or embedded, that sends metrics over the inter/intra-net to a server backed by the metrics data warehouse; data-mining and reporting components serve results to Java applets and web browsers.

Slide 9: METRICS Transmitter
- No functional change to the tool
  - uses an API (e.g., initToolRun(), sendMetric()) to send the available metrics
- Low overhead
  - example: a standard-cell placer using the Metrics API incurs < 2% runtime overhead
  - even less overhead with buffering
- Won't break the tool on transmittal failure
  - a child process handles transmission while the parent process continues its job (see the sketch below)
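The slides name the API calls but not how the non-blocking transmittal works; the C sketch below is one plausible shape for the fork-based scheme described above. Everything beyond the call name sendMetric() is an assumption, and the network posting is reduced to a stub.

    #include <signal.h>
    #include <stdio.h>
    #include <sys/types.h>
    #include <unistd.h>

    /* Stub: in the real system this would post an XML message to the
       METRICS server (assumption; the slides do not show this step). */
    static void post_to_server(int projectID, int flowID, int toolID,
                               const char *name, const char *value) {
        fprintf(stderr, "post %d/%d/%d %s=%s\n",
                projectID, flowID, toolID, name, value);
    }

    /* Call once at tool start so exited children are reaped
       automatically and never accumulate as zombies. */
    void initMetricsTransmitter(void) { signal(SIGCHLD, SIG_IGN); }

    /* Sketch of sendMetric(): fork a child to do the possibly slow or
       failing network transmittal, so the tool itself never blocks. */
    void sendMetric(int projectID, int flowID, int toolID,
                    const char *name, const char *value) {
        pid_t pid = fork();
        if (pid == 0) {                 /* child: transmit, then exit */
            post_to_server(projectID, flowID, toolID, name, value);
            _exit(0);
        }
        /* parent: continue immediately; a failed fork or a failed
           transmittal in the child must not break the tool */
    }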

Slide 10: METRICS Transmitter
(diagram) An EDA tool, either through the embedded API or through a tool wrapper, emits metrics as XML over the inter/intra-net to a Java servlet, which writes them via SQL into an Oracle8i database.

Slide 11: Transmitter Example

API example (C):

    /** API Example **/
    int main( int argc, char *argv[] ) {
        ...
        toolID = initToolRun( projectID, flowID );
        ...
        printf( "Hello World\n" );
        sendMetric( projectID, flowID, toolID, "TOOL_NAME", "Sample" );
        sendMetric( projectID, flowID, toolID, "TOOL_VERSION", "1.0" );
        ...
        terminateToolRun( projectID, flowID, toolID );
        return 0;
    }

Wrapper example (Perl, parsing a tool's log file):

    ## Wrapper example
    ( $File, $PID, $FID ) = @ARGV;
    $TID = `initToolRun $PID $FID`;
    chomp( $TID );                 # strip the trailing newline from the backticks
    open( IN, "< $File" );
    while ( <IN> ) {               # scan the log line by line
        if ( /Begin\s+(\S+)\s+on\s+(\S+.*)/ ) {
            system "sendMetrics $PID $FID $TID TOOL_NAME $1";
            system "sendMetrics $PID $FID $TID START_TIME $2";
        }
        ...
    }
    close( IN );
    system "terminateToolRun $PID $FID $TID";

Slide 12: Example of METRICS XML
(the XML markup itself did not survive transcription; the message is reconstructed below around the surviving values, with illustrative tag names)

    <TOOL>
      <PROJECT_ID>173</PROJECT_ID>
      <FLOW_ID>9</FLOW_ID>
      <TOOL_ID>32</TOOL_ID>
      <METRIC project="173" flow="9" host="P32" timestamp="93762541300">
        <NAME>TOOL_NAME</NAME>
        <VALUE>CongestionAnalysis</VALUE>
      </METRIC>
    </TOOL>

Slide 13: Current Testbed: A Metricized P&R Flow
(diagram) LEF/DEF -> Capo Placer -> placed DEF -> QP ECO -> legal DEF -> WRoute -> routed DEF -> CongestionAnalysis -> congestion map -> incremental WRoute -> final DEF; each step in the flow transmits its metrics to the METRICS warehouse.

Slide 14: METRICS Reporting
- Web-based
  - platform independent
  - accessible from anywhere
- Example: correlation plots created on the fly (see the sketch after this slide)
  - understand the relation between two metrics
  - find the importance of certain metrics to the flow
  - always up to date
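The slides do not say how the correlation behind such a plot is computed; a common choice, and purely an assumption here, is the Pearson correlation coefficient over paired samples of two metrics pulled from the warehouse. A minimal C sketch:

    #include <math.h>
    #include <stdio.h>

    /* Pearson correlation of n paired samples (x[i], y[i]),
       e.g., wirelength vs. via count for a set of tool runs. */
    double correlation(const double *x, const double *y, int n) {
        double sx = 0, sy = 0, sxx = 0, syy = 0, sxy = 0;
        for (int i = 0; i < n; i++) {
            sx  += x[i];        sy  += y[i];
            sxx += x[i] * x[i]; syy += y[i] * y[i];
            sxy += x[i] * y[i];
        }
        double cov  = sxy - sx * sy / n;
        double varx = sxx - sx * sx / n;
        double vary = syy - sy * sy / n;
        return cov / sqrt(varx * vary);
    }

    int main(void) {
        double wl[]  = { 1.1, 2.0, 2.9, 4.2 };  /* made-up wirelengths */
        double via[] = { 10,  19,  31,  44 };   /* made-up via counts */
        printf("r = %f\n", correlation(wl, via, 4));
        return 0;
    }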

Slide 15: METRICS Reporting
(diagram) A web browser requests a report over the inter/intra-net from a Java servlet; the servlet requests the data from Oracle8i via SQL, renders the plot with a local graphing tool (GNUplot), and returns it to the browser. A planned future path hands the data through a wrapper to third-party graphing tools (Excel, Lotus).

Slide 16: Example Reports
(figures) Two sample correlation plots: congestion vs. wirelength, and number of vias vs. wirelength.

Slide 17: METRICS Standards
- Standard metrics naming across tools (illustrated below)
  - same name -> same meaning, independent of tool supplier
  - generic metrics and tool-specific metrics
  - no more ad hoc, incomparable log files
- Standard schema for the metrics database
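The slides call for standard names but show none beyond TOOL_NAME and TOOL_VERSION; the C header sketch below only illustrates the idea. Every name other than those two is a hypothetical example, not a METRICS standard.

    /* metrics_names.h: one shared vocabulary for every transmitter.
       TOOL_NAME and TOOL_VERSION appear in the slides; the rest are
       hypothetical examples of generic vs. tool-specific metrics. */
    #ifndef METRICS_NAMES_H
    #define METRICS_NAMES_H

    /* Generic metrics: same meaning for every tool, any supplier */
    #define M_TOOL_NAME        "TOOL_NAME"
    #define M_TOOL_VERSION     "TOOL_VERSION"
    #define M_CPU_SECONDS      "CPU_SECONDS"             /* hypothetical */
    #define M_PEAK_MEMORY_KB   "PEAK_MEMORY_KB"          /* hypothetical */

    /* Tool-specific metrics, prefixed by tool class (hypothetical) */
    #define M_PLACE_WIRELENGTH "PLACE_TOTAL_WIRELENGTH"
    #define M_ROUTE_NUM_VIAS   "ROUTE_NUM_VIAS"

    #endif /* METRICS_NAMES_H */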

Slide 18: Generic and Specific Tool Metrics
(table of example generic and tool-specific metrics; its contents were not preserved in the transcript)

Slide 19: Current Status
- METRICS server completed with Oracle8i, Java servlets, and an XML parser
- Initial transmittal API in C++
- METRICS wrapper for Cadence P&R tools
- Simple reporting scheme for correlations

Slide 20: Additional Infrastructure
- Industry-standard network discovery
  - Jini, UPnP (Universal Plug and Play), SLP (Service Location Protocol), Salutation
- Security
  - encryption for XML data
  - SSL (Secure Sockets Layer)
  - user ID and password authentication (reporting)
  - registered users (transmitting)
- Third-party reporting tools
  - MS Office integration, Crystal Reports, ...
- Data mining

Slide 21: METRICS Demo
- Transmission of metrics
  - API inside tools
  - Perl wrapper for log files
- Reporting
  - correlation reports
  - progress of the current tool run, flow, and design

Slide 22: Potential Benefits to Project Management
- Accurate resource prediction at any point in the design cycle
  - up-front estimates for people, time, technology, EDA licenses, IP reuse, ...
  - go/no-go decision at the earliest possible point
- Accurate project post-mortems
  - everything is tracked: tools, flows, users, notes
  - optimize the next project based on past results
  - no loose, random data or information left behind at project end (log files!)
- Management console
  - web-based, status-at-a-glance view of tools, designs, and systems at any point in the project
- No wasted resources
  - prevent out-of-sync runs
  - no duplication of data or effort

Slide 23: Potential Benefits to Tools R&D
- A methodology for continuously tracking data over the entire lifecycle of instrumented tools
- More efficient analysis of realistic data
  - no need to rely only on extrapolations from tiny artificial "benchmarks"
  - no need to collect source files for test cases and re-run them in house
- Facilitates identification of key design metrics and their effects on tools
  - standardized vocabulary and schema for design/instance attributes
- Improves benchmarking
  - apples to apples, and what are the apples in the first place?
  - apples to oranges as well, given enough correlation research

Slide 24: Potential Research Enabled by METRICS
- Tools:
  - scope of applicability
  - predictability
  - usability
- Designs:
  - difficulty of design or manufacturing
  - verifiability, debuggability/probe-ability
  - likelihood of a bug escape
  - dollar cost (a function of design effort, integratability, migratability, ...)
- Statistical metrics, time-varying metrics
- What is the appropriate abstraction of the manufacturing process for design?
  - impact of manufacturing on design productivity
  - inter- and intra-die variation
  - topography effects
  - impact and tradeoffs of newer lithography techniques and materials

Slide 25: Ongoing Work
- Work with the EDA and designer communities to establish standards
  - tool users: a list of the metrics needed for design-process optimization
  - tool vendors: implementation of the requested metrics with the standardized naming
- Improve the transmitter (one possible shape is sketched below)
  - add message buffering
  - a "recovery" system for network/server failures
- Extend the METRICS system to include project management tools, e-mail communications, etc.
- Additional reports and data mining
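Buffering and recovery are listed as future work, so nothing of their design appears in the slides. One plausible shape, sketched here in C purely as an assumption, appends metrics to a local spool file whenever the server is unreachable and replays the file on the next successful connection; the spool path and transmit() stub are hypothetical.

    #include <stdio.h>
    #include <string.h>

    /* Stub for the real transmittal; returns 0 on success, nonzero
       when the network or server is down (assumption). */
    int transmit(const char *line);

    #define SPOOL "/tmp/metrics.spool"   /* hypothetical spool path */

    /* Send one "NAME VALUE" line, spooling it locally on failure. */
    void send_buffered(const char *line) {
        if (transmit(line) == 0) return;
        FILE *f = fopen(SPOOL, "a");     /* server down: buffer to disk */
        if (f) { fprintf(f, "%s\n", line); fclose(f); }
    }

    /* Replay the spool once the server is reachable again. */
    void recover(void) {
        char buf[1024];
        FILE *f = fopen(SPOOL, "r");
        if (!f) return;                  /* nothing buffered */
        int all_sent = 1;
        while (fgets(buf, sizeof buf, f)) {
            buf[strcspn(buf, "\n")] = '\0';
            if (transmit(buf) != 0) { all_sent = 0; break; }  /* still down */
        }
        fclose(f);
        if (all_sent) remove(SPOOL);     /* keep the file if replay stalled;
                                            a stall re-sends some duplicates
                                            later, which the server tolerates
                                            in this sketch */
    }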

