November 5, 2007
ACM WEASEL Tech

Efficient Time-Aware Prioritization with Knapsack Solvers

Sara Alspaugh, Kristen R. Walcott, Mary Lou Soffa (University of Virginia)
Michael Belanich, Gregory M. Kapfhammer (Allegheny College)

Test Suite Prioritization

Testing occurs throughout the software development life cycle
- Challenge: time consuming and costly

Prioritization: reordering the test suite
- Goal: find errors sooner in testing
- Does not consider the overall time budget

Alternative: time-aware prioritization
- Goal 1: find errors sooner in testing
- Goal 2: execute within the time constraint

Motivating Example

Original test suite with fault information (assume equal execution times and unique faults found):
T2: 1 fault, 2 min.   T4: 6 faults, 2 min.   T3: 2 faults, 2 min.   T1: 4 faults, 2 min.

Prioritized test suite: T4, T1, T2, T3

Testing time budget: 4 minutes. Within that budget, the original order runs T2 and T4 and exposes 7 faults, while the prioritized order runs T4 and T1 and exposes 10.

The Knapsack Problem for Time-Aware Prioritization

Maximize Σ_{i=1}^{n} c_i · x_i, where c_i is the code coverage of test i and x_i is either 0 or 1.

Subject to the constraint Σ_{i=1}^{n} t_i · x_i ≤ t_max, where t_i is the execution time of test i and t_max is the time budget.

The Knapsack Problem for Time-Aware Prioritization

T1: 4 lines, 2 min.   T2: 1 line, 2 min.   T3: 2 lines, 2 min.   T4: 5 lines, 2 min.
Time budget: 4 min. (Assume the test cases cover unique requirements.)

Selection steps:
- Start: total value 0, 4 min. remaining
- Select T4 (5 lines): total value 5, 2 min. remaining
- Select T1 (4 lines): total value 9, 0 min. remaining
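
As a concrete illustration, the sketch below solves the formulation from the previous slide on this example instance with a standard 0/1 knapsack dynamic program. This is illustrative Python, not the authors' implementation; the coverage values, execution times, and budget are taken from the slide.

```python
# Minimal sketch: 0/1 knapsack via dynamic programming, applied to the
# example instance above. Not the implementation evaluated in the paper.

def knapsack_01(values, weights, capacity):
    """Return (best_value, chosen_indices) for the 0/1 knapsack problem."""
    n = len(values)
    # best[w] = best value achievable with total weight <= w
    best = [0] * (capacity + 1)
    keep = [[False] * (capacity + 1) for _ in range(n)]
    for i in range(n):
        for w in range(capacity, weights[i] - 1, -1):
            candidate = best[w - weights[i]] + values[i]
            if candidate > best[w]:
                best[w] = candidate
                keep[i][w] = True
    # Recover the chosen set by walking the keep table backwards.
    chosen, w = [], capacity
    for i in range(n - 1, -1, -1):
        if keep[i][w]:
            chosen.append(i)
            w -= weights[i]
    return best[capacity], list(reversed(chosen))

# Example from the slide: coverage in lines, execution time in minutes.
tests = ["T1", "T2", "T3", "T4"]
coverage = [4, 1, 2, 5]
times = [2, 2, 2, 2]
budget = 4

value, chosen = knapsack_01(coverage, times, budget)
print(value, [tests[i] for i in chosen])   # 9 ['T1', 'T4']
```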

The Extended Knapsack Problem

The value of each test case depends on the test cases already in the prioritization, because test cases may cover the same requirements.

T1: 4 lines, 2 min.   T2: 1 line, 2 min.   T3: 2 lines, 2 min.   T4: 5 lines, 2 min.
Time budget: 4 min.

Selection steps:
- Start: total value 0, 4 min. remaining
- Select T4 (5 lines): total value 5, 2 min. remaining; UPDATE: T1 now contributes 0 new lines
- Select T3 (2 lines): total value 7, 0 min. remaining
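
To show the UPDATE step concretely, here is a small sketch of overlap-aware greedy selection in which a test's value is re-computed as the number of not-yet-covered lines. The coverage sets are hypothetical, chosen so that T1's coverage lies entirely inside T4's as the slide implies; this is not the paper's solver.

```python
# Sketch of overlap-aware greedy selection: a test's value is the number of
# lines it covers that are not already covered by the selection so far.

def overlap_aware_greedy(coverage_sets, times, budget):
    selected, covered, remaining = [], set(), budget
    candidates = set(coverage_sets)
    while True:
        # Re-score every remaining candidate against what is already covered.
        best, best_gain = None, 0
        for t in candidates:
            if times[t] <= remaining:
                gain = len(coverage_sets[t] - covered)
                if gain > best_gain:
                    best, best_gain = t, gain
        if best is None:
            break
        selected.append(best)
        covered |= coverage_sets[best]
        remaining -= times[best]
        candidates.remove(best)
    return selected, len(covered)

coverage_sets = {
    "T1": {1, 2, 3, 4},          # hypothetical: entirely inside T4's coverage
    "T2": {9},
    "T3": {7, 8},
    "T4": {1, 2, 3, 4, 5},
}
times = {"T1": 2, "T2": 2, "T3": 2, "T4": 2}

print(overlap_aware_greedy(coverage_sets, times, budget=4))
# (['T4', 'T3'], 7): once T4 is chosen, T1's value drops to 0 lines,
# so T3 is selected and the total value is 7.
```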

Goals and Challenges

Evaluate traditional and extended knapsack solvers for use in time-aware prioritization
- Effectiveness: coverage-based metrics
- Efficiency: time overhead and memory overhead

How does overlapping code coverage affect the results of traditional techniques?
Is the cost of the extended knapsack algorithms worthwhile?

The Knapsack Solvers

- Random: select test cases at random
- Greedy by Ratio: order by coverage/time
- Greedy by Value: order by coverage
- Greedy by Weight: order by time
- Dynamic Programming: break the problem into sub-problems; use the sub-problem results to build the main solution
- Generalized Tabular: use large tables to store sub-problem solutions
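
The three greedy variants differ only in their sort key, so they can be sketched as a single function. The example below is illustrative Python with hypothetical inputs, not the implementation evaluated in the paper.

```python
# Sketch of the greedy family: sort by a chosen key, then add test cases
# in that order while they still fit in the remaining time budget.

def greedy_prioritize(tests, key, budget):
    """tests: list of (name, coverage, time); key: 'ratio', 'value', or 'weight'."""
    if key == "ratio":
        ordered = sorted(tests, key=lambda t: t[1] / t[2], reverse=True)
    elif key == "value":
        ordered = sorted(tests, key=lambda t: t[1], reverse=True)
    elif key == "weight":
        ordered = sorted(tests, key=lambda t: t[2])       # cheapest first
    else:
        raise ValueError(key)
    selected, remaining = [], budget
    for name, cov, time in ordered:
        if time <= remaining:
            selected.append(name)
            remaining -= time
    return selected

tests = [("T1", 4, 2), ("T2", 1, 2), ("T3", 2, 2), ("T4", 5, 2)]
print(greedy_prioritize(tests, "ratio", budget=4))   # ['T4', 'T1']
```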

The Knapsack Solvers (continued)

- Core: compute the optimal fractional solution, then exchange items until an optimal integral solution is found
- Overlap-Aware: use a genetic algorithm to solve the extended knapsack problem for time-aware prioritization
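
The overlap-aware solver is built around a genetic algorithm; the sketch below is a simplified, hypothetical version whose representation, operators, and parameters are assumptions rather than the published configuration. It evolves orderings of the test suite and scores each ordering by the unique lines covered by the tests that fit within the budget.

```python
import random

# Hypothetical GA sketch for the extended (overlap-aware) knapsack problem.
# A chromosome is a permutation of the test cases; fitness is the number of
# unique lines covered by the tests, taken in order, that fit in the budget.

def fitness(order, coverage_sets, times, budget):
    covered, remaining = set(), budget
    for t in order:
        if times[t] <= remaining:
            covered |= coverage_sets[t]
            remaining -= times[t]
    return len(covered)

def crossover(a, b):
    """Order crossover: keep a slice of parent a, fill the rest from b."""
    i, j = sorted(random.sample(range(len(a)), 2))
    child = a[i:j]
    child += [t for t in b if t not in child]
    return child

def mutate(order, rate=0.1):
    order = list(order)
    if random.random() < rate:
        i, j = random.sample(range(len(order)), 2)
        order[i], order[j] = order[j], order[i]
    return order

def ga_prioritize(coverage_sets, times, budget, pop_size=20, generations=50):
    names = list(coverage_sets)
    population = [random.sample(names, len(names)) for _ in range(pop_size)]

    def score(order):
        return fitness(order, coverage_sets, times, budget)

    for _ in range(generations):
        population.sort(key=score, reverse=True)
        parents = population[: pop_size // 2]            # truncation selection
        children = [mutate(crossover(random.choice(parents),
                                     random.choice(parents)))
                    for _ in range(pop_size - len(parents))]
        population = parents + children
    return max(population, key=score)
```

With coverage sets and times in the same dictionary form as the overlap-aware greedy sketch above, it would be invoked as ga_prioritize(coverage_sets, times, budget=4).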

The Scaling Heuristic

Order the test cases by their coverage-to-execution-time ratio such that:
c_1/t_1 ≥ c_2/t_2 ≥ … ≥ c_n/t_n

If c_1 × ⌊t_max/t_1⌋ ≥ c_2 × (t_max/t_2), then it is possible to find an optimal solution that includes T_1.

Check the inequality for each test case T_x, x ∈ [1, n], until it no longer holds; the test cases [T_1, …, T_{x-1}] belong in the final prioritization.
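
One way to read the heuristic in code, as an illustrative sketch under the reading above rather than the authors' implementation:

```python
import math

# Sketch of the scaling heuristic: after sorting by coverage/time ratio,
# keep committing test cases to the prioritization while the inequality
#   c_x * floor(t_max / t_x) >= c_{x+1} * (t_max / t_{x+1})
# holds between consecutive test cases.

def scaling_heuristic(tests, t_max):
    """tests: list of (name, coverage, time). Returns the committed prefix."""
    ordered = sorted(tests, key=lambda t: t[1] / t[2], reverse=True)
    committed = []
    for (n1, c1, t1), (n2, c2, t2) in zip(ordered, ordered[1:]):
        if c1 * math.floor(t_max / t1) >= c2 * (t_max / t2):
            committed.append(n1)
        else:
            break
    return committed

tests = [("T1", 4, 2), ("T2", 1, 2), ("T3", 2, 2), ("T4", 5, 2)]
print(scaling_heuristic(tests, t_max=4))   # ['T4', 'T1', 'T3'] under this reading
```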

Implementation Details

Tool components: Coverage Calculator, Knapsack Solver, Test Transformer
Inputs: Test Suite (T), Program Under Test (P)
Output: New Test Suite (T')

Knapsack Solver Parameters:
1. Selected Solver
2. Reduction Preference
3. Knapsack Size
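
A rough sketch of how such components might be wired together; the class and function names, the parameter meanings, and the interfaces here are hypothetical and are not taken from the tool.

```python
# Hypothetical wiring of the pipeline: the coverage calculator produces
# per-test coverage data for the program under test (P), a knapsack solver
# selects and orders test cases within the budget, and the test transformer
# emits the reordered suite T'.
from dataclasses import dataclass
from typing import Callable, Dict, List, Set

@dataclass
class SolverParameters:
    selected_solver: Callable[[Dict[str, Set[int]], Dict[str, float], float], List[str]]
    reduction_preference: str      # assumed: e.g. whether to apply the scaling heuristic
    knapsack_size: float           # budget as a fraction of total suite time

def prioritize(test_suite: List[str],
               coverage: Dict[str, Set[int]],     # from the coverage calculator
               times: Dict[str, float],
               params: SolverParameters) -> List[str]:
    budget = params.knapsack_size * sum(times[t] for t in test_suite)
    return params.selected_solver(coverage, times, budget)   # new suite T'
```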

Evaluation Metrics

- Code coverage: percentage of requirements executed when the prioritization is run (basic block coverage used)
- Coverage preservation: proportion of code covered by the prioritization versus code covered by the entire original test suite
- Order-aware coverage: considers the order in which test cases execute in addition to overall code coverage
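
The slide does not give exact formulas, so the following sketches use plausible definitions and should be read as assumptions: code coverage as the fraction of all basic blocks executed by the prioritized suite, coverage preservation as that coverage relative to the original suite's coverage, and order-aware coverage as the average cumulative coverage after each test, so that covering code earlier scores higher.

```python
# Hypothetical metric sketches. `coverage_sets` maps a test case to the set
# of basic blocks it covers; `prioritization` is the reordered (and possibly
# truncated) suite; `original_suite` is the full original test suite.

def code_coverage(prioritization, coverage_sets, total_blocks):
    covered = set().union(*(coverage_sets[t] for t in prioritization))
    return len(covered) / total_blocks

def coverage_preservation(prioritization, original_suite, coverage_sets):
    new = set().union(*(coverage_sets[t] for t in prioritization))
    old = set().union(*(coverage_sets[t] for t in original_suite))
    return len(new) / len(old)

def order_aware_coverage(prioritization, coverage_sets, total_blocks):
    # Average of the cumulative coverage after each test: covering more
    # blocks earlier in the ordering raises the score.
    covered, running = set(), 0.0
    for t in prioritization:
        covered |= coverage_sets[t]
        running += len(covered) / total_blocks
    return running / len(prioritization)
```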

Experiment Design

Goals of the experiment:
- Measure the efficiency of the algorithms and of scaling, in terms of time and space overhead
- Measure the effectiveness of the algorithms and of scaling, in terms of three coverage-based metrics

Case studies: JDepend and Gradebook

Knapsack sizes: 25, 50, and 75% of the execution time of the original test suite
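
For instance, those percentages translate into the following time budgets, computed from the suite execution times listed in the case-study table at the end of the deck:

```python
# Budgets used in the experiments: 25%, 50%, and 75% of each original test
# suite's execution time (suite times from the case-study table, in seconds).
suite_times = {"Gradebook": 7.008, "JDepend": 5.468}

budgets = {app: {pct: round(pct / 100 * t, 3) for pct in (25, 50, 75)}
           for app, t in suite_times.items()}
# Gradebook: 1.752 s, 3.504 s, 5.256 s; JDepend: 1.367 s, 2.734 s, 4.101 s
```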

Summary of Experimental Results

Prioritizer effectiveness:
- The overlap-aware solver had the highest overall coverage for each time limit
- The Greedy by Value solver performed well for Gradebook
- All greedy solvers performed well for JDepend

Prioritizer efficiency:
- All algorithms required little time and memory except Dynamic Programming, Generalized Tabular, and Core
- The overlap-aware solver required hours to run
- Generalized Tabular had prohibitively large memory requirements
- The scaling heuristic reduced overhead in some cases

Conclusions

The most sophisticated algorithm is not necessarily the most effective or the most efficient; there is a trade-off between effectiveness and efficiency.

Is efficiency or effectiveness more important?
- Effectiveness → the overlap-aware prioritizer
- Efficiency → a low-overhead prioritizer

The choice of prioritizer depends on the nature of the test suite:
- Time versus coverage of each test case
- Coverage overlap between test cases

Future Research

- Use larger case studies with bigger test suites
- Use case studies written in other languages
- Evaluate other knapsack solvers, such as branch-and-bound and parallel solvers
- Incorporate other metrics, such as APFD
- Use synthetically generated test suites

Questions? Thank you!

Case Study Applications

                         Gradebook    JDepend
Classes                  5            22
Functions                73           305
NCSS
Test Cases               28           53
Test Suite Exec. Time    7.008 s      5.468 s