
1 Computer Science Automated Test Data Generation for Aspect-Oriented Programs Mark Harman (King’s College London, UK) Fayezin Islam (T-Zero Processing Services, US) Tao Xie (North Carolina State University, US) Stefan Wappler (Berner & Mattner, Germany)

2 Background
Automated testing of aspect-oriented programs:
– Testing aspectual composition behavior (pointcut behavior) [Ferrari et al. ICST 08, Anbalagan&Xie ISSRE 08, …]
– Testing aspectual behavior (advice behavior) [Xie&Zhao AOSD 06, Xie et al. ISSRE 06, …]

3 Testing Aspectual Behavior
Aspect weaving (e.g., with ajc, abc):
– Aspect → Class (bytecode)
– Advice → Method (bytecode)
Straightforward unit testing: feed aspect classes to OO test generation tools based on bytecode
– Issue: arguments can be thisJoinPoint or AroundClosure objects
Aspectra [Xie&Zhao AOSD 05]: generate tests for woven classes but focus on aspectual behavior
– Feed woven classes to OO test generation tools; base classes serve as "test drivers"
– Leverages existing OO tools for testing AOP programs, e.g., Parasoft Jtest (a random testing tool)
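The weaving step above (aspect → class, advice → method) has a rough dynamic analogy in Python: before-advice can be modeled as a wrapper installed on a method, so the "woven" class keeps its original interface while the advice body runs at the join point. This is a hedged sketch, not AspectJ; the Account/debit names follow the account.debit(amount) running example used later in the deck, and the advice logic (call logging) is invented for illustration.

```python
# Sketch: simulating AspectJ-style before-advice by wrapping a method.
# Account/debit mirror the deck's running example; the advice is invented.

import functools

class Account:
    def __init__(self, balance):
        self.balance = balance

    def debit(self, amount):
        self.balance -= amount
        return self.balance

log = []

def before_advice(join_point_name):
    """Wrap a method so the 'advice' runs before it (a stand-in for weaving)."""
    def weave(method):
        @functools.wraps(method)
        def woven(self, *args, **kwargs):
            log.append((join_point_name, args))   # advice body: record the call
            return method(self, *args, **kwargs)  # proceed to the original method
        return woven
    return weave

# "Weave" the advice into the class: the advice becomes an ordinary wrapper.
Account.debit = before_advice("Account.debit")(Account.debit)

acct = Account(100)
acct.debit(30)  # advice fires first, then the balance drops to 70
```

In real AspectJ the weaving happens at compile or load time on bytecode, which is why the woven classes can be handed directly to OO test generation tools.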

4 Example Program Under Test [figure: base program plus aspect]

5 New Contributions
A new system of automated test data generation for AOP based on search-based testing, i.e., evolutionary testing
– Input domain: receiver object and method parameters, e.g., account.debit(amount)
Empirical studies demonstrating the benefits of the system in AOP structural testing
– Effectiveness: better than random testing
– Efficiency: AOP domain reduction techniques
– Efficiency: focusing test effort on aspectual branches

6 What is Search-Based Testing? In search-based testing, we apply search techniques to explore large input spaces, guided by a fitness function. The fitness function measures how close an input comes to reaching the test goal, e.g., covering the true branch of a given condition.

7 Evolutionary Algorithms [diagram: loop over chromosomes with fitness evaluation, termination check (End?), selection, recombination, mutation, and insertion]

8 Evolutionary Testing [diagram: the same loop with chromosomes encoding test cases]

9 Evolutionary Testing [diagram: the loop extended with test-case execution before fitness evaluation]

10 Evolutionary Testing [diagram: the loop with execution and monitoring; each chromosome encodes a method sequence]
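The loop in slides 7–10 can be sketched minimally: evolve integer inputs toward covering the true branch of an if (A < B)-style condition, using branch distance as fitness. This is a hedged toy under invented parameters (population 20, crossover of one coordinate from each parent, ±10 mutation); real evolutionary testing evolves whole method sequences, not bare integer pairs. For a predicate this easy the initial random population often already covers the branch; the loop matters for tighter conditions.

```python
# Toy evolutionary testing loop: search (a, b) pairs to cover `a < b`.
import random

def fitness(a, b):
    # Branch distance for the TRUE branch of `if a < b`:
    # 0 when the branch is taken; (a - b) + 1 otherwise (lower is better).
    return 0 if a < b else (a - b) + 1

def evolve(pop_size=20, generations=200, seed=1):
    rng = random.Random(seed)
    pop = [(rng.randint(-100, 100), rng.randint(-100, 100))
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda ind: fitness(*ind))   # fitness evaluation
        if fitness(*pop[0]) == 0:                 # End? target branch covered
            return pop[0]
        parents = pop[: pop_size // 2]            # selection
        children = []
        while len(children) < pop_size - len(parents):
            (a1, _), (_, b2) = rng.sample(parents, 2)
            a, b = a1, b2                         # recombination
            if rng.random() < 0.3:                # mutation
                a += rng.randint(-10, 10)
                b += rng.randint(-10, 10)
            children.append((a, b))
        pop = parents + children                  # insertion / reinsertion
    return min(pop, key=lambda ind: fitness(*ind))

best = evolve()  # an (a, b) pair with a < b, i.e., fitness 0
```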

11 Structural Evolutionary Testing [figure: example code with the target true branch highlighted]

12 Structural Evolutionary Testing
Evaluation of the predicate in the branching condition if (A < B); target: true branch
Fitness = (A – B) + 1
= (100 – 0) + 1 = 101
= (100 – 50) + 1 = 51
= (100 – 101) + 1 = 0
The lower the fitness value, the better; a fitness of 0 means the target is reached.
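The branch-distance computation on this slide can be written directly; a minimal sketch, assuming the standard (A – B) + 1 distance for the true branch of if (A < B):

```python
def branch_distance(a, b):
    """Distance to covering the TRUE branch of `if (a < b)`.

    0 means the branch is taken (target reached); otherwise (a - b) + 1,
    so lower values indicate inputs closer to satisfying a < b.
    """
    return 0 if a < b else (a - b) + 1

# The slide's worked examples:
print(branch_distance(100, 0))    # 101
print(branch_distance(100, 50))   # 51
print(branch_distance(100, 101))  # 0: a < b, target reached
```

The "+ 1" keeps the distance strictly positive whenever the branch is missed, so a distance of 0 unambiguously signals that the target branch was taken.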

13 AOP Domain Reduction: Motivation
Input domain: receiver object and method parameters, e.g., account.debit(amount)
Not all input variables are relevant to covering the target branch inside aspects
[figure: example aspect code; target: false branch]

14 Slicing-Based Domain Reduction
Irrelevant-input-variable identification:
– Start with the slicing criterion: the predicates of the target branches
– Extract backward program slices (based on data and control dependence)
– Identify relevant input variables: the input variables that appear in the slices, e.g., for account.debit(amount)
Domain reduction: search over only the relevant input variables
[figure: example aspect code; target: false branch]
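The identification step above can be sketched as a backward reachability walk over a dependence graph: start from the variables used in the target predicate, follow data and control dependences backward, and keep only the input variables reached. A hedged toy with a hand-built graph whose edges are invented for illustration; real slicers such as Indus work on full interprocedural dependence graphs of Java bytecode.

```python
# Toy backward slicing for input-variable relevance.
from collections import deque

def backward_slice(deps, criterion):
    """Variables that the criterion (transitively) depends on.

    deps maps a variable to the variables it is directly
    data- or control-dependent on.
    """
    seen = set(criterion)
    work = deque(criterion)
    while work:
        v = work.popleft()
        for d in deps.get(v, ()):
            if d not in seen:
                seen.add(d)
                work.append(d)
    return seen

# Invented dependence edges for something like account.debit(amount):
# the target predicate reads `balance`, which depends on the receiver's
# state but (here) not on `amount` -- so `amount` drops out of the search.
deps = {
    "predicate": ["balance"],
    "balance": ["account"],
}
inputs = ["account", "amount"]
relevant = [v for v in inputs if v in backward_slice(deps, ["predicate"])]
# relevant == ["account"]: the search domain shrinks to one input variable
```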

15 EvolutionaryAspectTester (EAT) System Implementation [architecture diagram: builds on the Indus slicer [Ranganath et al. 07], EvoUnit [Wappler 08], and Aspectra [Xie&Zhao 06]]

16 Evaluation Benchmarks: 14 benchmarks from [Xie&Zhao 06, Rinard et al. 04, Dufour et al. 04, Hannemann&Kiczales 02]

17 Study 1: Assessment of Evolutionary Testing
RQ 1.1: Can evolutionary testing outperform random testing for AOP testing?

18 RQ 1.1: Assessment of Evolutionary Testing [chart: coverage improvement of evolutionary testing over random testing; better branch coverage on 5/14 benchmarks, up to 43%]

19 RQ 1.1: Assessment of Evolutionary Testing (cont.) [chart: effort reduction of evolutionary testing over random testing; effort reduction on 9/14 benchmarks, up to 61%]

20 Findings: Assessment of Evolutionary Testing
RQ 1.1: Can evolutionary testing outperform random testing for testing aspect-oriented programs?
– Better branch coverage on 5/14 benchmarks (0%~43%)
– Effort reduction on 9/14 benchmarks (0%~61%)

21 Study 2: Impact of Domain Reduction
RQ 2.1: How many branches have irrelevant parameters, and what percentage of parameters is irrelevant for each such branch?
RQ 2.2: What is the percentage effort reduction for each such branch?
RQ 2.3: What is the percentage effort reduction for each program?

22 RQ 2.1: Impact of Domain Reduction [chart: 90/434 aspectual branches have irrelevant parameters]

23 RQ 2.1: Impact of Domain Reduction (cont.) [chart: input domain reduction for branches with non-zero reduction, ranging from 25% to 100%]

24 RQ 2.2: Impact of Domain Reduction [chart: per-branch effort reduction from domain reduction, ranging from -88% to 94%; effort increase on 25% of branches, no change on 6%, reduction on 69% (increases mainly on easy/trivial branches)]

25 RQ 2.3: Impact of Domain Reduction [chart: per-program effort reduction from domain reduction, ranging from 17% to 93%]

26 Findings: Impact of Domain Reduction
RQ 2.1: 90/434 branches have irrelevant parameters; 25%~100% of parameters are irrelevant for each such branch
RQ 2.2: Per-branch effort reduction of -88%~94%; 69% of branches see a reduction
RQ 2.3: Per-program effort reduction of 17%~93%

27 Study 3: Impact of Focusing on Testing Aspectual Behavior
RQ 3.1: How much test data generation effort is reduced by focusing on aspectual behavior instead of all behavior?

28 RQ 3.1: Impact of Aspect Focusing [chart: effort reduction from focusing on aspectual behavior rather than all behavior; effort reduction on all 14 benchmarks, from 3% to 99.99%]

29 RQ 3.1: Impact of Aspect Focusing (cont.) [chart: coverage improvement from focusing on aspectual behavior rather than all behavior; improvement on 6/14 benchmarks, up to 62%]

30 Findings: Impact of Focusing on Testing Aspectual Behavior
RQ 3.1: How much test data generation effort is reduced by focusing on aspectual behavior instead of all behavior?
– Effort reduction on all 14 benchmarks (3%~99.99%)
– Coverage improvement on 6/14 benchmarks (0%~62%)

31 Conclusion
A new system of automated test data generation for AOP based on search-based testing
Empirical studies demonstrating the benefits of the system in AOP structural testing
– Effectiveness: better than random testing
– Efficiency: AOP domain reduction techniques
– Efficiency: focusing test effort on aspectual branches
Future work: more advanced techniques (e.g., symbolic execution), more testing objectives, larger AOP programs

32 Questions?

33 Structural Evolutionary Testing (backup)
Fitness = Approximation_Level + Local_Distance
Evaluation of the predicate in the branching condition if (A < B): Local_Distance = (A – B) + 1
= 0 + (100 – 0) + 1 = 101
= 0 + (100 – 50) + 1 = 51
= 0 + (100 – 101) + 1 = 0
Approximation level: identify relevant branching statements using control dependence, e.g., (#expected/#actual – 1)
The lower the fitness value, the better; 0 means the target is reached.
[figure: nested branches at approximation levels 1–4 leading to the target]
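The combined fitness on this backup slide can be sketched concretely for a nested target, where the approximation level counts the branching nodes still between the current execution and the target. The nested if example below and its bounds (10, 20) are invented to illustrate a nonzero approximation level; the slide's own worked examples use a single A < B condition at level 0.

```python
def local_distance_lt(a, b):
    # Distance to the true branch of `a < b`: 0 once taken.
    return 0 if a < b else (a - b) + 1

def local_distance_gt(a, b):
    # Distance to the true branch of `a > b`: 0 once taken.
    return 0 if a > b else (b - a) + 1

def fitness(x):
    """Target: the body of `if x > 10: if x < 20:` (two nested branches).

    If the outer branch is missed, the approximation level is 1 and the
    local distance is measured at the outer condition; otherwise the level
    is 0 and the distance is measured at the inner condition.
    """
    if not x > 10:
        return 1 + local_distance_gt(x, 10)  # one control node still to pass
    return 0 + local_distance_lt(x, 20)

# fitness(5)  -> 1 + (10 - 5) + 1 = 7   (outer branch missed)
# fitness(30) -> 0 + (30 - 20) + 1 = 11 (outer taken, inner missed)
# fitness(15) -> 0                      (target reached)
```

Guiding the search through the outer condition first is what the approximation level buys: inputs that pass more of the control-dependent branches always score better than inputs that fail earlier ones.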

34 RQ 2.4: Impact of Domain Reduction (backup) [chart: collateral coverage effect of domain reduction; 9 branches show a statistically significant change in collateral coverage]

