Shared Secrets of WDF – Part 2: Testing WDF


2 Shared Secrets of WDF – Part 2: Testing WDF
Robert Kjelgaard, Shyamal Varma, Wei Mao

3 Agenda
Bob Kjelgaard: Introduction; DDI testing; stress, fault injection, and concurrency; white-box techniques
Shyamal Varma: State machine testing
Wei Mao: Performance testing; versioning testing
Bob, Shyamal, and Wei: Summary

4 What does WDF QA Do?
Participate in product design and monitor implementation
Assess product quality
Advocate for the customer
Develop, use, and maintain test and reporting systems
We develop test software and tools
We like to break stuff, but we are not software testers

5 We Test WDF on Many Platforms
1.9: 2k (KMDF), XP, 2k3, Vista RTM, Vista SP1/2k8, Windows 7
1.7: 2k (KMDF), XP, 2k3, Vista RTM, Vista SP1/2k8
1.5: 2k (KMDF), XP, 2k3 (KMDF), Vista RTM

6 Product Testing Is Essential
When do we test? Constantly
How do we test? Many ways, automated and manual
How do we automate tests? By design, plus our own ingenuity and creativity, reusing wherever we can
Is automation expensive? Yes, but worth it
How do we develop and maintain test code? Just like any other piece of software

7 Some of our problems are unique (so the same is true of our solutions)
We are testing generalized frameworks for drivers, not a driver or even a driver class
We cannot write all possible drivers, but we must ensure to the highest level possible that our product can be used for many kinds of drivers
Our product includes complicated internal state machines
We have to validate our packaging and versioning mechanisms
We must keep tabs on performance
We do not have unlimited resources, so we have to use them wisely and effectively

8 Testing the DDI: Design Goals
The DDIs are the primary developer interface to WDF; it is imperative that they work exactly as they should, or WDF's value is weakened
Broad coverage: it's more than just parameters, it's also when and from where you call
Automated effectively, run every day
Easily modifiable test harness
~400 KMDF DDIs, ~200 UMDF DDIs, more with each new version
Reliability, reproducibility, and diagnosability are paramount

9 Design Secrets of Our DDI Tests
Reducing design and implementation overhead of control:
  KMDF: most drivers are scriptable ActiveX controls
  KMDF: initialization-time DDIs use property bags for control and reporting
  UMDF: named pipes are used for control
Effectively handle stops, bugchecks, and breaks:
  KMDF: we use IAT hooks to convert them to exceptions
  UMDF: special hooks built into the core so stops do not occur
Leverage the Device Simulation Framework (DSF) to reduce hardware dependencies and increase reach
  Example: a USB device with 255 interfaces, or 15 pipes, or no pipes (boundary cases)

10 Typical DDI test flow using ActiveX-style COM automation
Script:
  Script creates an ActiveX object
  Script calls a method of the object to perform a test
  Script determines, logs, and reports pass / fail status
Marshalling core, user mode:
  User-mode proxy issues an object-creation IOCTL
  Proxy marshals parameters into a private IOCTL
  Proxy unpacks and returns the results
Marshalling core, kernel mode:
  KM server uses the test driver's object factory to create the object
  Server stub unpacks the IOCTL and issues the call
  Server marshals the results into the IOCTL return buffer
Driver:
  Test driver initializes its interface with the KM server
  Test driver's factory creates the object
  Test driver performs the requested call
A sketch of the user-mode proxy step follows.
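The user-mode proxy is the glue between the script-visible COM object and the kernel-mode server. Below is a minimal C++ sketch of that step only: it packs a method invocation into a private IOCTL and sends it to a hypothetical test server device. The device name, IOCTL code, and packet layout are all illustrative, not WDF's actual test interfaces.

```cpp
#include <windows.h>
#include <winioctl.h>
#include <cstdio>

// Illustrative private IOCTL; the real test harness defines its own codes.
#define IOCTL_DDITEST_INVOKE \
    CTL_CODE(FILE_DEVICE_UNKNOWN, 0x800, METHOD_BUFFERED, FILE_ANY_ACCESS)

struct DdiTestPacket {          // assumed private marshaling format
    ULONG ObjectId;             // object created earlier by the factory
    ULONG MethodId;             // which DDI test method to run
    ULONG ParamBytes;           // length of the packed parameter blob
    UCHAR Params[64];           // packed parameters
};

int main()
{
    // Hypothetical device exposed by the kernel-mode test server.
    HANDLE h = CreateFileW(L"\\\\.\\DdiTestServer", GENERIC_READ | GENERIC_WRITE,
                           0, nullptr, OPEN_EXISTING, 0, nullptr);
    if (h == INVALID_HANDLE_VALUE) return 1;

    DdiTestPacket in = { 1, 42, 0, {} };    // "call method 42 on object 1"
    ULONG status = 0;                       // server returns pass/fail here
    DWORD bytes = 0;
    BOOL ok = DeviceIoControl(h, IOCTL_DDITEST_INVOKE, &in, sizeof(in),
                              &status, sizeof(status), &bytes, nullptr);
    printf("call %s, test status 0x%08lx\n", ok ? "delivered" : "failed", status);
    CloseHandle(h);
    return 0;
}
```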

11 Concurrency, Fault Injection and Stress
Driver Verifier low-resources simulation
Our own adaptive fault injection tools (this approach is available to you in the WdfTester tool)
Random testing built into drivers
Self-testing stress drivers
Software bus drivers instead of root enumeration
Multithreaded apps with overlapped I/O and multiple handles (see the sketch below)
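As an illustration of the last item, here is a minimal sketch of a stress client that opens a separate handle per thread to a test device and issues overlapped reads. The device path is a placeholder; a real stress app would also randomize buffer sizes, cancel I/O, and vary timing.

```cpp
#include <windows.h>
#include <cstdio>

// Placeholder device path; a real stress app targets the driver under test.
static const wchar_t* kDevice = L"\\\\.\\TestStressDevice";

DWORD WINAPI StressThread(LPVOID)
{
    // Each thread gets its own handle, opened for overlapped I/O.
    HANDLE h = CreateFileW(kDevice, GENERIC_READ | GENERIC_WRITE, 0, nullptr,
                           OPEN_EXISTING, FILE_FLAG_OVERLAPPED, nullptr);
    if (h == INVALID_HANDLE_VALUE) return 1;

    BYTE buf[4096];
    for (int i = 0; i < 1000; ++i) {
        OVERLAPPED ov = {};
        ov.hEvent = CreateEventW(nullptr, TRUE, FALSE, nullptr);
        DWORD got = 0;
        // Issue the read; it may complete immediately or pend in the driver.
        if (!ReadFile(h, buf, sizeof(buf), &got, &ov) &&
            GetLastError() == ERROR_IO_PENDING) {
            GetOverlappedResult(h, &ov, &got, TRUE);   // wait for completion
        }
        CloseHandle(ov.hEvent);
    }
    CloseHandle(h);
    return 0;
}

int main()
{
    HANDLE threads[8];
    for (int t = 0; t < 8; ++t)
        threads[t] = CreateThread(nullptr, 0, StressThread, nullptr, 0, nullptr);
    WaitForMultipleObjects(8, threads, TRUE, INFINITE);
    for (HANDLE t : threads) CloseHandle(t);
    printf("stress pass complete\n");
    return 0;
}
```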

12 White-box Techniques for That Last Bit of Coverage
PDB-based techniques to directly access internals (see the sketch below):
  The PDB can give the RVA of any symbol (data or code) in the binary
  It can also return correct offsets for fields within a structure
  We get the module base address and size using a documented DDI
  The test app can pass RVAs and the binary name to a KM driver, which resolves them
IAT hooking to manipulate the OS side of the runtime:
  Similar to the user-mode Detours package from Microsoft Research (but with no trampolines)
High maintenance and risky, so used sparingly. But it beats giving up.
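The team's actual tooling is not public, but the PDB step can be illustrated with the DbgHelp library: resolve a symbol name to an address, then subtract the module base to get the RVA. The module path and symbol name below are hypothetical.

```cpp
#include <windows.h>
#include <dbghelp.h>
#include <cstdio>
#pragma comment(lib, "dbghelp.lib")

int main()
{
    HANDLE proc = GetCurrentProcess();
    SymInitialize(proc, nullptr, FALSE);

    // Load the binary's symbols at an arbitrary, nonzero base.
    DWORD64 base = SymLoadModuleEx(proc, nullptr, "C:\\test\\mydriver.sys",
                                   nullptr, 0x10000000, 0, nullptr, 0);
    if (!base) { printf("symbol load failed\n"); return 1; }

    // SYMBOL_INFO is variable-length; reserve room for the name.
    char buffer[sizeof(SYMBOL_INFO) + MAX_SYM_NAME] = {};
    SYMBOL_INFO* sym = (SYMBOL_INFO*)buffer;
    sym->SizeOfStruct = sizeof(SYMBOL_INFO);
    sym->MaxNameLen = MAX_SYM_NAME;

    // "InternalStateVariable" is a made-up symbol for illustration.
    if (SymFromName(proc, "InternalStateVariable", sym)) {
        DWORD64 rva = sym->Address - base;   // RVA = address - module base
        printf("RVA of %s: 0x%llx\n", sym->Name, rva);
        // A kernel-mode helper would add this RVA to the real module base
        // (obtained via a documented DDI) to reach the symbol directly.
    }
    SymCleanup(proc);
    return 0;
}
```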

13 WDF State Machine Testing (Shyamal Varma)

14 PnP/Power State Machine Testing
The framework manages device PnP and power-related events with state machines:
  PnP state machine (about 55 states and 112 transitions)
  Power state machine (about 82 states and 168 transitions)
  Power policy state machine (about 124 states and 246 transitions)
The state machines are complex and contain a large number of states

15 A Peek at the State Machines

16 PnP/Power State Machine Test: Goals
Generate state machine model diagrams programmatically from the state machine source code
  Allows comparison with the specification
  Makes it easier to verify changes to the code
Perform basic validation of the state machine at build time
  Look for dead states
Obtain state machine state and transition coverage information
  Obtained while running tests
  Ensure that coverage is as close to 100% as possible
  Add new tests to cover test holes

17 Generating State Machine Diagrams
A graph visualization tool is used that renders images (JPEG) from a description in a simple text language
A tool parses the state machine-related code from the WDF source tree and generates the text file for the graph visualization tool (see the sketch below)
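The slide does not name the tools, but the description matches Graphviz, whose DOT language is exactly such a simple text format. A minimal sketch of the parsing tool's output stage, assuming transitions have already been extracted from the source:

```cpp
#include <cstdio>
#include <string>
#include <vector>

// One parsed transition: from-state, event, to-state.
struct Transition { std::string from, event, to; };

// Emit a Graphviz DOT file; `dot -Tjpeg pnp.dot -o pnp.jpg` renders it.
void EmitDot(const std::vector<Transition>& edges, const char* path)
{
    FILE* f = fopen(path, "w");
    if (!f) return;
    fprintf(f, "digraph PnpStateMachine {\n");
    fprintf(f, "  rankdir=LR;\n  node [shape=box];\n");
    for (const Transition& t : edges)
        fprintf(f, "  \"%s\" -> \"%s\" [label=\"%s\"];\n",
                t.from.c_str(), t.to.c_str(), t.event.c_str());
    fprintf(f, "}\n");
    fclose(f);
}

int main()
{
    // Tiny hand-rolled sample; the real tool extracts these from source.
    std::vector<Transition> edges = {
        { "Stopped",  "StartDevice",   "Starting" },
        { "Starting", "StartComplete", "Started"  },
        { "Started",  "RemoveDevice",  "Removed"  },
    };
    EmitDot(edges, "pnp.dot");
    return 0;
}
```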

18 Checking for Dead States
“Dead state” = no state transition leading out of the state
We check for dead states while building the framework code, using the Spec Explorer tool from Microsoft Research

19 Checking for Dead States
Spec Explorer is a tool for model-based testing
  Encode a system's intended behavior (its specification) in machine-executable form (a model program)
  The model program is written in Spec#
We use a small subset of this tool's functionality to check for dead states

20 Checking for Dead States
A tool parses the state machine code from the WDF source tree and generates Spec# code
This Spec# code is passed to the Spec Explorer engine to check for dead states
Dead states will prevent the framework code from building (compiling); the sketch below shows the underlying graph check
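The production check goes through Spec# and the Spec Explorer engine, but the underlying graph property is simple. A C++ sketch of that check, assuming the transition table has been parsed as (from, to) pairs:

```cpp
#include <cstdio>
#include <set>
#include <string>
#include <utility>
#include <vector>

// A state is "dead" if it appears in the machine but has no
// transition leading out of it.
std::vector<std::string> FindDeadStates(
    const std::vector<std::pair<std::string, std::string>>& edges)
{
    std::set<std::string> all, hasExit;
    for (const auto& e : edges) {
        all.insert(e.first);
        all.insert(e.second);
        hasExit.insert(e.first);             // source states have an exit
    }
    std::vector<std::string> dead;
    for (const auto& s : all)
        if (!hasExit.count(s) && s != "Final")   // terminal states exempt (assumed)
            dead.push_back(s);
    return dead;
}

int main()
{
    // "Orphan" has no outgoing edge, so it should be flagged.
    std::vector<std::pair<std::string, std::string>> edges = {
        { "Stopped", "Starting" }, { "Starting", "Started" },
        { "Started", "Orphan" },
    };
    auto dead = FindDeadStates(edges);
    for (const auto& s : dead)
        fprintf(stderr, "error: dead state '%s'\n", s.c_str());
    // A nonzero exit code is what would break the build.
    return dead.empty() ? 0 : 1;
}
```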

21 Obtaining Coverage Information
WDF state machine transition information is logged while running various tests
A tool reads this data and generates a text file containing coverage statistics (see the sketch below)
Both state and state-transition coverage information are obtained
The graph visualization tool can be used to visualize the coverage data
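A minimal sketch of the coverage computation, assuming the model's transition set (from source) and the logged transitions (from test runs) are both available as (from, to) pairs:

```cpp
#include <cstdio>
#include <set>
#include <string>
#include <utility>

using Edge = std::pair<std::string, std::string>;

int main()
{
    // Transitions in the model vs. transitions observed in the logs.
    std::set<Edge> model = {
        { "Stopped", "Starting" }, { "Starting", "Started" },
        { "Started", "Removed" },
    };
    std::set<Edge> seen = {
        { "Stopped", "Starting" }, { "Starting", "Started" },
    };

    size_t hit = 0;
    for (const Edge& e : model)
        if (seen.count(e)) ++hit;

    printf("transition coverage: %zu / %zu (%.1f%%)\n",
           hit, model.size(), 100.0 * hit / model.size());
    // Uncovered transitions point at test holes to fill.
    for (const Edge& e : model)
        if (!seen.count(e))
            printf("  missing: %s -> %s\n", e.first.c_str(), e.second.c_str());
    return 0;
}
```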

22 Performance Testing and Versioning Testing (Wei Mao)

23 Performance Test - Goals
Measure and save performance data:
  Data transfer throughput
  Latency
Compare build to build and flag any performance regression
Capture other kernel events for future analysis
Fully automated, runs regularly
Unified reporting among multiple test applications / drivers

24 Performance Test - Implementation Considerations
Compare results on the same machine and the same OS
Get consistent results with multiple runs and calculate the average
Extra indices: CPU utilization %, working set variation
Cover both high-end and low-end machines

25 Performance Test – Test Application
Open the test device, write a pattern to it, read it back, and verify it (see the sketch below)
Log the results using Event Tracing for Windows (ETW)
Configuration options:
  Threads: 1, 2, 4, 10
  Chunk size: 10 bytes, 256 bytes, 4096 bytes
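A minimal sketch of the application's inner loop under an assumed device path and provider GUID: write the pattern, read it back, verify, time the whole run, and emit one ETW string event with the results. The real application uses its own ETW schema; EventWriteString stands in here.

```cpp
#include <windows.h>
#include <evntprov.h>
#include <cstdio>
#include <cstring>
#pragma comment(lib, "advapi32.lib")

// Hypothetical provider GUID and device path for illustration.
static const GUID kPerfProvider =
    { 0x12345678, 0x1234, 0x1234, { 0x12, 0x34, 0x12, 0x34, 0x56, 0x78, 0x9a, 0xbc } };

int main()
{
    REGHANDLE etw = 0;
    EventRegister(&kPerfProvider, nullptr, nullptr, &etw);

    HANDLE h = CreateFileW(L"\\\\.\\WdfPerfTestDevice", GENERIC_READ | GENERIC_WRITE,
                           0, nullptr, OPEN_EXISTING, 0, nullptr);
    if (h == INVALID_HANDLE_VALUE) return 1;

    const DWORD chunk = 4096;                 // one of the configured sizes
    BYTE out[4096], in[4096];
    memset(out, 0xA5, sizeof(out));           // known pattern

    LARGE_INTEGER freq, t0, t1;
    QueryPerformanceFrequency(&freq);
    QueryPerformanceCounter(&t0);

    const int iterations = 10000;
    for (int i = 0; i < iterations; ++i) {
        DWORD n = 0;
        WriteFile(h, out, chunk, &n, nullptr);
        ReadFile(h, in, chunk, &n, nullptr);
        if (memcmp(out, in, chunk) != 0) {    // verify the pattern
            fprintf(stderr, "data miscompare at iteration %d\n", i);
            return 1;
        }
    }
    QueryPerformanceCounter(&t1);

    double secs = double(t1.QuadPart - t0.QuadPart) / double(freq.QuadPart);
    double mbps = (2.0 * chunk * iterations) / (1024.0 * 1024.0) / secs;
    wchar_t msg[128];
    swprintf_s(msg, L"throughput=%.2f MB/s latency=%.1f us",
               mbps, secs * 1e6 / iterations);
    EventWriteString(etw, 4 /* informational */, 0, msg);   // log via ETW

    EventUnregister(etw);
    CloseHandle(h);
    return 0;
}
```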

26 Performance Test - KMDF Test Driver
Configuration options (see the sketch below):
  I/O type: buffered, direct, neither
  Execution level: passive, dispatch
  Synchronization scope: device, queue, none
  Register a preprocess callback: escape to WDM
Optional upper filter driver:
  Create a default queue or not
  Forward and forget vs. reformat, then forward with a completion routine
WDM test driver:
  Same configuration options as KMDF
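A sketch of how those knobs surface in a KMDF test driver's EvtDriverDeviceAdd, using real KMDF APIs. One fixed combination is shown (buffered I/O, passive execution level, queue-scope synchronization); the actual test driver would pick these from its configuration, and the read/write handlers are assumed to be defined elsewhere.

```cpp
#include <ntddk.h>
#include <wdf.h>

EVT_WDF_IO_QUEUE_IO_READ  PerfEvtIoRead;    // handlers defined elsewhere
EVT_WDF_IO_QUEUE_IO_WRITE PerfEvtIoWrite;

NTSTATUS
PerfEvtDeviceAdd(WDFDRIVER Driver, PWDFDEVICE_INIT DeviceInit)
{
    UNREFERENCED_PARAMETER(Driver);

    // I/O type knob: buffered, direct, or neither.
    WdfDeviceInitSetIoType(DeviceInit, WdfDeviceIoBuffered);

    WDFDEVICE device;
    NTSTATUS status = WdfDeviceCreate(&DeviceInit,
                                      WDF_NO_OBJECT_ATTRIBUTES, &device);
    if (!NT_SUCCESS(status)) return status;

    WDF_IO_QUEUE_CONFIG queueConfig;
    WDF_IO_QUEUE_CONFIG_INIT_DEFAULT_QUEUE(&queueConfig,
                                           WdfIoQueueDispatchParallel);
    queueConfig.EvtIoRead  = PerfEvtIoRead;
    queueConfig.EvtIoWrite = PerfEvtIoWrite;

    // Execution level and synchronization scope knobs.
    WDF_OBJECT_ATTRIBUTES attributes;
    WDF_OBJECT_ATTRIBUTES_INIT(&attributes);
    attributes.ExecutionLevel = WdfExecutionLevelPassive;
    attributes.SynchronizationScope = WdfSynchronizationScopeQueue;

    WDFQUEUE queue;
    return WdfIoQueueCreate(device, &queueConfig, &attributes, &queue);
}
```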

27 Performance Test - UMDF Test Drivers
Various test drivers:
  Memory copy
  Forward to Win32 file I/O
  UMDF filter above a KMDF stack (the KMDF stack can be a memory-copy or fixed-transfer-rate device)
Various configurations:
  Parallel / sequential queue
  Multiple queues
  Locking mode
  I/O type: sync and async, etc.

28 Performance Test - Reporting
The same ETW schema is used among the different event providers
The trace is captured with Xperf
The ETL file is converted to XML with Tracerpt
Results are compared with historic data, within a certain allowed margin
An HTML report is generated
Performance regressions are flagged

29 Versioning Test: Goals and Non-goals
Goals:
  Verify that both the WDF co-installer and the WDF hotfix install correctly on all supported platforms
  Cover a broad test matrix
Non-goal:
  API/DDI compatibility, which is verified separately

30 Versioning – Test Matrix Execution
We have many installation scenarios to cover; this was done primarily manually through WDF 1.7
Each installation scenario must begin with a clean machine
  For example: install 1.9, reimage, then upgrade 1.7 to 1.9
  System Restore is used to minimize reimage time
For effective control, we wrote our own test scenario execution scheduler:
  Input: a scenario description file in XML
  Saves and resumes execution context upon reboot (see the sketch below)
  Benefit: the scenario is self-documenting
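The save-and-resume trick is the interesting part of the scheduler. One common way to survive a reboot, sketched below with a hypothetical scheduler executable and context file: persist the position in the scenario to disk, register the scheduler under the RunOnce registry key, then reboot; Windows relaunches it at the next logon and it picks up where it left off.

```cpp
#include <windows.h>
#include <cstdio>

// Hypothetical paths for the scheduler binary and its saved context.
static const wchar_t* kSelf    = L"C:\\tests\\ScenarioRunner.exe";
static const wchar_t* kContext = L"C:\\tests\\scenario.state";

// Save the current step, arrange to be relaunched after reboot, and reboot.
bool SuspendAcrossReboot(int nextStep)
{
    // 1. Persist execution context (here, just the next step number).
    FILE* f = _wfopen(kContext, L"w");
    if (!f) return false;
    fwprintf(f, L"%d\n", nextStep);
    fclose(f);

    // 2. RunOnce: Windows runs this command once at the next logon.
    HKEY key;
    if (RegOpenKeyExW(HKEY_LOCAL_MACHINE,
            L"Software\\Microsoft\\Windows\\CurrentVersion\\RunOnce",
            0, KEY_SET_VALUE, &key) != ERROR_SUCCESS)
        return false;
    RegSetValueExW(key, L"ScenarioRunner", 0, REG_SZ,
                   (const BYTE*)kSelf,
                   (DWORD)((wcslen(kSelf) + 1) * sizeof(wchar_t)));
    RegCloseKey(key);

    // 3. Reboot (requires SE_SHUTDOWN_NAME privilege, omitted here).
    return ExitWindowsEx(EWX_REBOOT, SHTDN_REASON_MAJOR_SOFTWARE) != FALSE;
}
```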

31 Summaries

32 Our summary
WDF is heavily tested, and more testing is added regularly
The primary tools and techniques we use in test development and execution are available to you in the WDK
Others, such as Driver Verifier and App Verifier, are readily available
  [KMDF 1.9] If you test with Driver Verifier, you get the KMDF verifier for free
We have WDF-specific tools in the WDK:
  WdfTester lets you do some of the same things we do: call logging and tracing, and adaptive (or other targeted forms of) fault injection
  WdfVerifier is there to make your testing easier

33 Call to Action
Use PFD, SDV, and compile at W4 on your driver code
Use Driver Verifier and App Verifier in testing
Use WDF!
Use WdfTester and WdfVerifier on your WDF drivers
Send feedback on WDF-specific tools (and any WDF quality concerns): we can't act effectively on your behalf without your input

34 Additional Resources
Web resources:
  Hardware and Driver Developer Community Blogs
  Spec Explorer
  AppVerifier
WDK documentation:
  WdfVerifier tool
  WdfTester tool
  Driver Verifier
  DSF
  Debugging UMDF drivers

35 WDF DDC 2008 Sessions
Shared Secrets about Windows Driver Framework: Part 1 – Mon. and Wed. 8:30-9:30
Getting a Logo for your Windows Driver Foundation Driver – Mon. 4-5, Tues. 2:45-3:45
Using WinDBG to Debug Kernel-Mode Windows Driver Framework Drivers – Mon. 2:45-3:45, Wed.
Using Kernel-Mode Driver Framework in Miniport Drivers – Mon. 4-5
Packaging and Deploying KMDF and UMDF Drivers – Tues. 4-5, Wed. 8:30-9:30
Exploring a KMDF Storage Driver: Parts 1 and 2 – Tues. 9:45-12:00
What’s New in Windows Driver Framework – Mon. 8:30-9:30, Wed. 9:45-10:45
Discussion: Windows Driver Framework – Wed. 1:30-2:30
Ask the Experts Table – Tues. evening

36 Backup slides

37 Testing the DDI: Design Approach
We take a black-box testing approach to ensure all DDIs behave as per the specification
Outline of our approach:
  Exercise boundary values
  Error guessing (guessing at effects on the internal state of the framework)
  Valid test cases (valid combinations of parameters and state)
  Invalid test cases (invalid parameters, or calling at inappropriate times)
  Equivalence partitioning: identify combinations that don't add test value

38 UMDF Handling of Invalid DDI Test Cases
When a driver does something illegal, WudfHost.exe breaks into the debugger if one is present
We want to break into the debugger only when something unexpected occurs
A test hook invokes a callback in the driver instead of a debugger break (see the sketch below)
The test driver enables the callback before invoking the invalid DDI and then disables it
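The hook pattern itself is simple. A sketch of a hypothetical host-side helper (the names are illustrative, not the real WudfHost internals): if the test driver has registered an expectation callback, route the violation there; otherwise break as usual.

```cpp
#include <windows.h>

// Hypothetical host-side hook. The test driver sets/clears the callback
// around each deliberately invalid DDI call.
typedef void (__stdcall *VIOLATION_CALLBACK)(const char* message);

static VIOLATION_CALLBACK g_testCallback = nullptr;

void __stdcall SetViolationTestHook(VIOLATION_CALLBACK cb)
{
    g_testCallback = cb;   // nullptr disables the hook again
}

void ReportDriverViolation(const char* message)
{
    if (g_testCallback != nullptr) {
        // A test expected this violation: tell the driver, don't break.
        g_testCallback(message);
        return;
    }
    // Unexpected violation: break into the debugger if one is attached.
    if (IsDebuggerPresent())
        DebugBreak();
}
```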

39 Utilizing Import Address Table hooks to keep things running, and more…
We have a legacy (NT4-style) driver that modifies a kernel module's import address table, either when the module is loaded or on the fly
We use this on the KMDF runtime (and in other cases the loader) for:
  Call logging
  Fault injection
  Converting bugchecks, breakpoints, asserts, etc. into exceptions
The test driver uses SEH and knowledge of the test to handle or continue; if continued, we report the bugcheck, breakpoint [if unexpected], or assert
This allows extensive invalid-case testing without having to coordinate two machines, script attached debuggers, etc.
Not a safe approach for product drivers, but acceptable for controlled-environment test usage [there are still risks]; a user-mode sketch of the mechanics follows
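The kernel-mode driver itself can't be reproduced here, but the IAT mechanics are the same in user mode. A minimal sketch of the classic technique (not WDF test code): walk the PE import descriptors, find the thunk for the target import, and swap in a replacement pointer.

```cpp
#include <windows.h>
#include <cstdio>
#include <cstring>
#pragma comment(lib, "user32.lib")

static decltype(&MessageBoxA) realMessageBoxA;

static int WINAPI HookedMessageBoxA(HWND h, LPCSTR text, LPCSTR cap, UINT type)
{
    printf("[hook] MessageBoxA(\"%s\")\n", text);   // call logging
    return realMessageBoxA(h, text, cap, type);     // forward to the original
}

// Patch one IAT entry in `mod`; returns the original pointer, or nullptr.
void* HookIatEntry(HMODULE mod, const char* importDll,
                   const char* funcName, void* hook)
{
    BYTE* base = (BYTE*)mod;
    auto dos = (IMAGE_DOS_HEADER*)base;
    auto nt  = (IMAGE_NT_HEADERS*)(base + dos->e_lfanew);
    auto dir = nt->OptionalHeader.DataDirectory[IMAGE_DIRECTORY_ENTRY_IMPORT];
    auto imp = (IMAGE_IMPORT_DESCRIPTOR*)(base + dir.VirtualAddress);
    for (; imp->Name; ++imp) {
        if (_stricmp((char*)(base + imp->Name), importDll) != 0) continue;
        auto thunk = (IMAGE_THUNK_DATA*)(base + imp->FirstThunk);
        auto orig  = (IMAGE_THUNK_DATA*)(base + imp->OriginalFirstThunk);
        for (; orig->u1.AddressOfData; ++thunk, ++orig) {
            if (orig->u1.Ordinal & IMAGE_ORDINAL_FLAG) continue;  // by-ordinal
            auto byName = (IMAGE_IMPORT_BY_NAME*)(base + orig->u1.AddressOfData);
            if (strcmp((char*)byName->Name, funcName) != 0) continue;
            DWORD old;
            VirtualProtect(&thunk->u1.Function, sizeof(void*),
                           PAGE_READWRITE, &old);
            void* previous = (void*)thunk->u1.Function;
            thunk->u1.Function = (ULONG_PTR)hook;      // swap in the hook
            VirtualProtect(&thunk->u1.Function, sizeof(void*), old, &old);
            return previous;
        }
    }
    return nullptr;
}

int main()
{
    // Hook this process's own import of MessageBoxA from user32.dll.
    realMessageBoxA = (decltype(&MessageBoxA))HookIatEntry(
        GetModuleHandleW(nullptr), "user32.dll", "MessageBoxA",
        (void*)HookedMessageBoxA);
    if (realMessageBoxA)
        MessageBoxA(nullptr, "hello", "iat demo", MB_OK);  // goes via hook
    return 0;
}
```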

40 IoTarget test suite – a design example
The initial goal was a scenario test to exhaustively cover various aspects of R/W request processing when remote I/O targets are used
These aspects were analyzed and enumerated so a large cross product could be run to cover all of the cases
A stress test was to be used as well, so the architecture also needed stress features (such as the ability to remove devices at will mid-test)
Three software-only drivers (all KMDF, using ActiveX for test programming as in the DDI tests):
  Bus driver with minimal wake / idle simulation support (a COM interface specifies the HW ID and bus address to add / remove)
  Target (looks like a raw memory block device)
  Hunter (accepts requests and processes them using a target)

41 IoTarget Test Suite – analyzing the test space
Some aspects we wanted to vary had to be set at AddDevice time (e.g., passive synchronization, default queue model, buffered/direct/neither I/O); these are “fixed attributes”
Others could be changed on the fly (e.g., complete the request at passive or dispatch level, use asynchronous or synchronous sends); these are “live attributes”
Fixed attributes map to the device bus address, so the PDO PnP capabilities query is an unambiguous way to know them
Live attributes are set via a single scripted call to the hunter or target
Total combinations = (Target Fixed × Target Live) × (Hunter Fixed × Hunter Live)
The “scenario test” runs all combos: 130K+ variations currently

42 IoTarget Test Suite – adding stress and fault injection
The hunter and target drivers have additional programmable behaviors:
  The capability to fail calls, change delays, start / stop targets, etc.
  Either fixed, or at random via a programmable distribution for each behavior (see the sketch below)
  Hunters can arbitrarily pick their own targets at random
  All via the ActiveX interfaces [and thus independent of any I/O paths]
The bus driver (as noted) can add or remove a hunter or target at any time
The stress app allows selection of behaviors and distributions, issues multithreaded I/O to the hunters, and adds / removes devices at random
Run in conjunction with Driver Verifier low-resources simulation for additional stress
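A sketch of the "programmable distribution" idea: each behavior carries a configurable firing probability, and the request path rolls against it. All names are illustrative, not the suite's actual interfaces.

```cpp
#include <cstdio>
#include <cstdlib>
#include <ctime>

// Each programmable behavior fires with a configurable probability.
struct BehaviorConfig {
    unsigned failPercent;    // e.g., fail 5% of calls
    unsigned delayPercent;   // e.g., insert a delay in 10% of calls
    unsigned delayMs;
};

static bool Roll(unsigned percent)
{
    return (unsigned)(rand() % 100) < percent;
}

// Called on every simulated request in the hunter/target I/O path.
bool ProcessRequest(const BehaviorConfig& cfg)
{
    if (Roll(cfg.failPercent)) {
        printf("injected failure\n");
        return false;                     // complete with an error status
    }
    if (Roll(cfg.delayPercent))
        printf("injected %u ms delay\n", cfg.delayMs);  // stand-in for a real sleep
    return true;                          // process normally
}

int main()
{
    srand((unsigned)time(nullptr));
    BehaviorConfig cfg = { 5, 10, 50 };   // set via the ActiveX interface in the real suite
    unsigned failures = 0;
    for (int i = 0; i < 1000; ++i)
        if (!ProcessRequest(cfg)) ++failures;
    printf("%u of 1000 requests failed by injection\n", failures);
    return 0;
}
```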

43 Versioning – Test Matrix (examples)
Minor version update: install a higher minor version of the coinstaller
  No WDF device was present on the system (e.g., Windows XP)
  An inbox WDF device is running (e.g., Vista); may require a reboot
Lower or same minor version: install the v1.7 package on a v1.9 system, and it should still work
Coinstaller is different from the one used by the driver at build time:
  An IHV driver was developed with v1.7, and the INF referenced 1.7 too
  A critical bug is fixed in v1.9
  Without recompiling, the IHV modifies the INF to use v1.9 and resubmits for logo

44 Versioning – Coinstaller Limitation
We will publicly release only one coinstaller per major.minor version
PnP requires that any update to the coinstaller have a different file name; i.e., we cannot patch the coinstaller once it is released
We can release multiple hotfixes for the current major.minor version
You may have to direct customers to the latest hotfix if your driver needs it
Remember: a hotfix will not introduce any DDI change (add/remove, etc.)
New versions are our fix of record for previous versions

