Generation of Conformance Test Suites for Compositions of Web Services Using Model Checking
José García-Fanjul, Claudio de la Riva and Javier Tuya
University of Oviedo (Spain)
This work is supported by the Ministry of Science and Education (Spain) under the National Program for Research, Development and Innovation, projects IN2TEST (TIN C03-02) and REPRIS (TIN E).
TAIC PART. Testing: Academic & Industrial Conference. Windsor (UK), August 2006

Motivation
Investment in web services software is increasing [IDC, 2006]:
It doubled from 2003 to 2004 (reaching $2.3 billion).
It is expected to reach $15 billion by 2009.
... but there are still few research works on testing web services software.

Do we need new testing methods for web services?
Testing web services software is different. Some unresolved challenges (from [Zhang and Zhang, 2005] and [Canfora and Di Penta, 2006]):
The need to remotely test web services, with its associated cost.
The lack of observability of the service code and structure.
The ability to dynamically search for and invoke web services.

The research hypothesis
A new method for generating test suites for compositions of web services is needed, with these characteristics:
It will be static (no need to execute the software to obtain the test cases).
The selection of the test cases will be guided by adequacy criteria.
The only required input will be a specification of the composition (BPEL): no knowledge about the partners’ particular behaviour is required.

Background: Model checking
A formal verification technique to automatically ascertain whether a property holds in a model.
1) A model must be built for the system we want to check.
2) The properties must be specified.
3) The tool (model checker) searches all the possible states within the model.
4) If a property does not hold, it provides a counterexample showing its violation.
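
To make these steps concrete, here is a minimal sketch of a PROMELA model and an LTL property that SPIN can check exhaustively; it is purely illustrative and not taken from the paper (the process name, variable and bound are invented):

/* Toy model: a process that increments x, but never beyond 5. */
byte x = 0;

active proctype Incrementer() {
    do
    :: x < 5  -> x++        /* keep incrementing while below the bound */
    :: x == 5 -> break      /* stop once the bound is reached          */
    od
}

/* Property: x never exceeds 5. The model checker explores every reachable
   state; if the claim were violated it would record a counterexample trail. */
ltl bounded { [] (x <= 5) }

Here the property holds, so the verifier simply reports no errors; the interesting case for testing is when a property is violated and a counterexample is produced, which is exactly what the next slide exploits.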

Using model checking for software testing [Ammann, 1998]
The goal is to obtain a test case for a certain condition C.
(Diagram: model of the software + “C never holds” → model checker → counterexample fulfilling C.)
1) The model checker (for instance, SPIN) is fed with a model of the software...
2) ...and an LTL formula stating that C never holds.
3) The output obtained from the tool is a counterexample in which the software fulfils C.
4) That counterexample can be transformed into a test case, as it describes an execution of the software in which the desired test condition holds.
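
A minimal sketch of this trap-property idea (with invented names, not the paper's actual models): the condition C is represented by a boolean flag in the model, and the LTL claim given to SPIN states that the flag is never set.

/* c marks the test condition C; it is set on one of the possible paths. */
bool c = false;

active proctype System() {
    if
    :: skip            /* a path that does not reach C */
    :: c = true        /* a path that reaches C        */
    fi
}

/* Claim: "C never holds". It is deliberately false whenever C is reachable,
   so any counterexample is an execution that fulfils C. */
ltl trap_C { [] (!c) }

Verifying trap_C yields a trail in which c becomes true; that trail is the raw material for a test case.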

Overview of the method
1) A model will be built from the BPEL specification of the business process.
2) Adequacy criteria will be used to select test requirements. To this end, the PROMELA code will be instrumented to discern whether an execution meets a requirement, and LTL properties will be constructed as the negation of those requirements.
3) The model checker is executed and counterexamples are obtained.
4) The counterexamples are transformed into test case specifications.
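
As an illustration of steps 1 and 2, the sketch below shows what an instrumented PROMELA model of a branching business process could look like. It is an assumption about the general shape of such a translation, not the paper's actual BPEL-to-PROMELA mapping, and the names (request, bpel, customer, tran1, tran2) are illustrative:

/* One channel per partner link, one boolean flag per BPEL transition. */
chan request = [1] of { int };         /* the amount sent by the customer  */
bool tran1 = false;                    /* coverage flag: transition 1      */
bool tran2 = false;                    /* coverage flag: transition 2      */

active proctype bpel() {
    int amount;
    request ? amount;                  /* receive the customer's request   */
    if
    :: amount < 4  -> tran1 = true     /* transition 1: low amount path    */
    :: amount >= 4 -> tran2 = true     /* transition 2: high amount path   */
    fi
}

active proctype customer() {
    if                                 /* nondeterministic test input      */
    :: request ! 3
    :: request ! 10
    fi
}

/* Test requirement "cover transition 1", negated for the model checker. */
ltl cover_tran1 { [] (!tran1) }

Checking cover_tran1 forces SPIN to produce a counterexample in which the customer sends an amount below four, which already specifies the inputs of a test case exercising transition 1.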

Preliminary work
The method has been applied to a sample composition called “loan approval”. Its objective is to decide whether a certain request for a loan will be approved or not. The chosen criterion has been a transition coverage criterion.

A representation of the “loan approval” sample composition:
1) The process receives a request from a partner called “customer”.
2) The “assessor” partner measures the risk associated with low-amount requests.
3) Requests made for a large amount of money, or which the assessor evaluates as not having a low risk, are examined by another partner (“approver”).

Test cases for the “loan approval” sample composition.
First test case: transition #1.
LTL property: [] (!tran1)
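
In practice, such a trap property can be checked with SPIN roughly as follows; this is a sketch of the usual SPIN workflow rather than a command sequence given in the paper, and the file name loan.pml is assumed:

/* Assumed workflow, shown as comments:
     spin -a loan.pml      generates the verifier source (pan.c)
     cc -o pan pan.c       compiles the verifier
     ./pan -a              runs the verification of the claim
     spin -t -p loan.pml   replays the recorded trail, printing each statement
*/
ltl cover_tran1 { [] (!tran1) }   /* the first test requirement, negated */

The replayed trail is the counterexample shown on the next slide, from which the inputs and the expected output of the test case are read.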

Extracting a test case from the counterexample.
(The slide shows the raw SPIN output of the counterexample; the execution read from it is summarised below.)
customer: request.amount = 3
customer: BPEL_loanApprovalPort_IN!request
bpel: request.amount < 4
bpel: tran1 = true
bpel: loanassessor_riskAssessmentPort_IN!request
assessor: riskAssessment.risk = low
assessor: loanassessor_riskAssessmentPort_OUT!riskAssessment
bpel: tran3 = true
bpel: approvalInfo.accept = yes
bpel: tran5 = true
bpel: BPEL_loanApprovalPort_OUT!approvalInfo
bpel: bpel_ends = true
INPUTS
1. The customer makes a request for an amount of 3 (less than four).
2. The risk assessment from the assessor is low.
EXPECTED OUTPUT
The reply to the customer is affirmative.

Test cases for the “loan approval” sample composition.
First test case: it covers transition #1, but also #3 and #5.
Second test case: it covers transitions #2 and #6.
Third test case: it covers transition #4 (and also #1 and #6).

Expected contributions
Main contribution: the definition of a new method to obtain conformance test suites for compositions of web services. The method will rely on a model checking tool (SPIN) to obtain the test case specifications.
Specifically, the research will address how to:
Transform a BPEL specification into a PROMELA model.
Instrument the PROMELA code, considering the adequacy criteria.
Construct LTL properties so that the counterexamples show sample executions of the model that meet the criteria.
Automatically obtain a test suite specification from the counterexamples that SPIN provides.

Thank you for your attention.
José García-Fanjul