Determining Test Quality through Dynamic Runtime Monitoring of SystemVerilog Assertions Kelly D. Larson


Is this you? It's only 2 weeks till tapeout. We have over 10,000 tests in our regression… I wonder if they all work?…

How do you know your tests are doing what they should? A complex SoC can have well over 10,000 tests. Making tests completely self-checking can be difficult. A test which passes may simply mean that nothing BAD happened... and not necessarily that anything GOOD happened.

The cost of bad tests A broken test steals valuable simulation cycles. A broken test is not testing what it should, and introduces a coverage hole. Your coverage report may not be sufficient to expose these gaps.

Solution We need a way to tie the pass/fail condition of an individual test to the specific conditions or goals of that test. This needs to be a scalable solution. This needs to work with both constrained-random tests and processor-centric directed tests. Assertion Monitor!

Specific Test Requirements My test was supposed to hit a specific coverage point or fail. Did it? I know my test was supposed to make condition 'X' happen exactly five times or fail. Did it? Because of the way that I wrote my test, I should never see 'Y' happen, and if it does I want the test to fail even though Y itself is not illegal. Will it?

Assertion Monitor The standard use of SystemVerilog assertions is to target DESIGN QUALITY. Our Assertion Monitor solution targets TEST QUALITY.

What about Assertion Coverage? Using the dynamic assertion monitor is similar to analyzing assertion coverage reports, except: – The focus is on individual tests, not overall results. – It is done while the simulation is running. – It can fail the test immediately upon detecting a problem with the test. – It is more flexible: a test can fail for a condition hit, not hit, or hit outside a defined range.

Example Use Case: Arbiter An arbiter block with request inputs req_hi, req_med, req_low and grant outputs gnt_hi, gnt_med, gnt_low. check_hi: assert property (@(posedge clk) disable iff (reset) req_hi |=> gnt_hi); check_med: assert property (@(posedge clk) disable iff (reset) (req_med & !req_hi) |=> gnt_med); check_low: assert property (@(posedge clk) disable iff (reset) (req_low & !req_med & !req_hi) |=> gnt_low);

Make it interesting… The assertions in this example will catch illegal activity, but they won't actually ensure that any arbitration occurred. In this case, we'll need to add another cover point to observe an interesting condition. check_arb: cover property (@(posedge clk) disable iff (reset) (req_hi & req_med & req_low));

Plusarg directives What we'd like now is to be able to run the simulation with an additional argument which requires our interesting condition to occur in order for the test to pass. +RequireAssert=check_arb How about a test where I know a particular condition should not occur? +ProhibitAssert=check_hi

Assertion Monitor Plusargs Require directives: – +RequireAssert=myassert — assertion must fire at least once during the test (checked at end of test). – +RequireAssert=myassert:x — assertion must fire at least x times during the test (checked at end of test). – +RequireAssert=myassert:x:y — assertion must fire at least x and at most y times (checked during the test for too many, at end of test for too few). Prohibit directives: – +ProhibitAssert=myassert — assertion must never fire during the simulation (checked during the test). – +ProhibitAssert=myassert:x — assertion must not fire x or more times; fewer is OK (checked during the test). – +ProhibitAssert=myassert:x:y — assertion must not fire in the range [x:y] inclusive; fewer or more is OK (checked at end of test).
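The paper does not show the parsing code, but as a minimal sketch, a plusarg value of the form name[:x[:y]] could be split with a small C helper like the one below. The struct and function names here are hypothetical, chosen only for illustration.

```c
#include <stdio.h>
#include <string.h>

/* Hypothetical parsed form of "+RequireAssert=name[:x[:y]]".
 * A min or max of -1 means "not specified on the command line". */
typedef struct {
    char name[128];
    int  min;   /* x, or -1 */
    int  max;   /* y, or -1 */
} assert_spec;

/* Parse "name", "name:x", or "name:x:y"; returns 0 on success. */
static int parse_assert_spec(const char *arg, assert_spec *spec) {
    spec->min = -1;
    spec->max = -1;
    /* %127[^:] reads the assertion name up to the first ':' */
    int n = sscanf(arg, "%127[^:]:%d:%d", spec->name, &spec->min, &spec->max);
    return (n >= 1) ? 0 : -1;
}
```

In the real flow this parsing would run in the UVM component before the test begins, with the result handed to the DPI side.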

Assertion Monitor Components The Assertion Monitor has three main components: RTL (SVA), UVM (SystemVerilog), and DPI (C).

RTL Component The RTL is instrumented with SVA assertions and cover points. – Ideally we can make use of existing assertions written for design quality. The assertion monitor treats assertions and cover points the same. – From a test-quality perspective we don't really care whether the assertion passes or fails, only that the condition was tested.

UVM Component Before the test begins, parse command line directives and call DPI routine to instrument assertion tracking. At the end of the test, do final check for proper behavior. Provide utility functions that will allow the assertion monitor DPI routines to report UVM errors and warnings.

C DPI Component Provide a data structure to store runtime information about monitored assertions. Provide the mechanism to attach a callback routine to monitored assertions. Provide the callback routine which will be run every time a monitored assertion or cover point successfully passes. Provide an end of test routine which will do a final check.
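The four DPI responsibilities above can be sketched in plain C. This is not the paper's actual code; the struct and function names are hypothetical, the VPI plumbing is omitted, and x/y follow the plusarg semantics from the table (x defaults to 1 when unspecified, y is -1 when unspecified).

```c
#include <stdbool.h>

/* Hypothetical per-assertion record kept by the DPI side. Real code
 * would also store the vpiHandle of the monitored assertion. */
typedef enum { MON_REQUIRE, MON_PROHIBIT } mon_kind;

typedef struct {
    mon_kind kind;
    int  count;    /* successful fires observed so far        */
    int  x, y;     /* plusarg bounds; y = -1 if unspecified   */
    bool failed;   /* set as soon as a during-test check trips */
} monitored_assert;

/* Run from the VPI success callback each time the assertion or
 * cover point passes. During-test violations fail immediately. */
static void on_assert_pass(monitored_assert *m) {
    m->count++;
    if (m->kind == MON_PROHIBIT && m->y < 0 && m->count >= m->x)
        m->failed = true;   /* prohibited condition fired */
    if (m->kind == MON_REQUIRE && m->y >= 0 && m->count > m->y)
        m->failed = true;   /* fired too many times */
}

/* End-of-test check: required assertions must have fired enough,
 * and prohibited [x:y] ranges must not have been hit. */
static bool end_of_test_ok(const monitored_assert *m) {
    if (m->failed)
        return false;
    if (m->kind == MON_REQUIRE)
        return m->count >= m->x;          /* excess handled above */
    if (m->y >= 0)                        /* prohibit range [x:y] */
        return m->count < m->x || m->count > m->y;
    return true;                          /* immediate cases done */
}
```

A failure from either routine would be reported back through the UVM utility functions as a UVM error.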

SystemVerilog Assertion API Our assertion monitor makes use of two key features of the Assertion API: 1) The ability to iterate through a design to find specific assertions. 2) The ability to attach our own callback (subroutine) to an assertion, which will get called whenever the assertion (or cover point) passes successfully.

Iterating through RTL Assertions The Assertion API allows us to easily iterate through handles to all of the assertions and cover points in the design. A handle gives us access to the hierarchical path, and allows us to attach a callback. vpiHandle itr, assertion; itr = vpi_iterate(vpiAssertion, NULL); while ((assertion = vpi_scan(itr))) { /* process assertion */ }

Registering a Callback vpiHandle vpi_register_assertion_cb( vpiHandle assertion, /* handle to assertion */ PLI_INT32 reason, /* reason for which callbacks needed */ vpi_assertion_callback_func *cb_rtn, /* callback routine */ PLI_BYTE8 *user_data /* user data to be supplied to cb */ ); typedef PLI_INT32 (vpi_assertion_callback_func)( PLI_INT32 reason, /* callback reason */ p_vpi_time cb_time, /* callback time */ vpiHandle assertion, /* handle to assertion */ p_vpi_attempt_info info, /* attempt related information */ PLI_BYTE8 *user_data /* registered user data */ );
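To show the registration pattern outside a simulator, here is a standalone C sketch with the VPI types and registration function stubbed out (everything prefixed stub_ is hypothetical; in real code the types come from the simulator's sv_vpi_user.h and registration uses vpi_register_assertion_cb as shown above).

```c
#include <stdio.h>

/* Stand-ins for the real VPI types, stubbed so the pattern compiles
 * standalone. In real code these come from sv_vpi_user.h. */
typedef int  PLI_INT32;
typedef char PLI_BYTE8;

typedef PLI_INT32 (assertion_cb_func)(PLI_INT32 reason,
                                      PLI_BYTE8 *user_data);

/* One monitored assertion: the registered callback plus user data. */
typedef struct {
    assertion_cb_func *cb;
    PLI_BYTE8 *user_data;
} stub_assertion;

/* Stand-in for vpi_register_assertion_cb. */
static void stub_register_cb(stub_assertion *a,
                             assertion_cb_func *cb,
                             PLI_BYTE8 *user_data) {
    a->cb = cb;
    a->user_data = user_data;
}

/* What the simulator does when the assertion passes: it invokes
 * the registered callback with the registered user data. */
static void stub_assertion_passes(stub_assertion *a, PLI_INT32 reason) {
    if (a->cb)
        a->cb(reason, a->user_data);
}

/* Our callback: bump a success counter carried through user_data. */
static PLI_INT32 count_success(PLI_INT32 reason, PLI_BYTE8 *user_data) {
    (void)reason;
    int *count = (int *)user_data;
    (*count)++;
    return 0;
}
```

The user_data pointer is what lets one generic callback update the per-assertion statistics record for whichever assertion fired.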

Flow: Before Test 1) UVM: parse the plusargs and call into the DPI. 2) C DPI: initialize the tracking struct, create the callback, and attach it to the assertion. 3) RTL: the callback is now attached to the RTL assertion or cover point.

Flow: During Test 1) RTL: an assertion pass executes the callback. 2) C DPI: statistics are updated and checked for errors. 3) UVM: a UVM error is displayed if a check fails.

Flow: After Test 1) UVM: at end of test, call the final check. 2) C DPI: perform the final check for errors. 3) UVM: display a UVM error if the check fails.

Don't like plusargs? Assertions can also be monitored by calling the DPI routines directly from the testbench. They can be called from within random test cases to provide dynamic feedback that helps guide the progression of the test case itself.

Constrained Random Testing In this example, the assertion is registered with the DPI directly from the UVM sequence. The sequence actively monitors the assertion until 3 successful passes are detected. class bus_seq extends uvm_sequence #(bus_txn); virtual task body(); ahandle = register_assert("upper.overflow_detect", -1, -1, 1); while (successes < 3) begin `uvm_do(tr); successes = num_assert_successes(ahandle); end endtask: body endclass: bus_seq

Summary It's more important than ever to make sure every simulation cycle is well spent. Broken tests not only waste cycles, but add risk by introducing unnoticed coverage holes. SystemVerilog has built-in facilities that allow us to dynamically track assertions and cover points during a test. Dynamically tracking coverage helps ensure that a test continues to do what it's supposed to do throughout the project.

Thank You! Kelly D. Larson