


Presentation on theme: "Research @ UC, Marco Vieira, University of Coimbra" - Presentation transcript:

1 Research @ UC. Marco Vieira, University of Coimbra (mvieira@dei.uc.pt)

2 Naples, 14th April 2010. Outline:
- Fault Injection for Failure Prediction Methods Validation
- Robustness Testing for Web Services
- Web Services Robustness Improvement
- Other Research Topics
- Questions & Comments

3 Fault Injection for Failure Prediction Methods Validation Research @ UC

4 Why do computers fail?

5 Hardware problems

6 Environment problems

7 Bad configuration

8 Misuse

9 Unproven design

10 But most of the time it is due to…

11 Research @ UC

12 What can we do?
- Don't use computers: sooner or later they will fail!
- OR: build better software. Many tried… most have failed. Find ways to identify failure-prone situations, and react accordingly…
- But… how do we know when a failure is about to happen?

13 Predict!

14 But prediction is hard…
- Prophets do not exist!
- Failure prediction methods are complex
- They need lots of data
- How to improve this? That is the goal of our research…

15 Our idea: inject realistic software faults to validate and improve failure prediction methods

16 Research goals
- Use the injection of realistic software faults to validate and improve failure prediction mechanisms
- Four scenarios:
  1. Help identify symptoms for failure prediction
  2. Accelerate the learning/training phase of prediction algorithms
  3. Evaluate the figures of merit of prediction algorithms
  4. Integrate fault injection in the prediction algorithm, as a form of continuous training

17 Identifying symptoms for prediction
- Preparation phase: set up the experimental environment
- Profiling phase: collect data to build a profile for each parameter
- Fault injection phase: generate failure-related data
- Symptoms identification phase: build a model of the behavior of each parameter; identify the parameters that show potential symptoms by deviating from that model
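The profiling and symptom-identification phases can be sketched with a simple per-parameter statistical model. This is a minimal illustration only: the mean/standard-deviation baseline, the 3-sigma deviation threshold, and the parameter names are assumptions, not the models actually used in the research.

```python
import statistics

def build_profile(baseline):
    """Profiling phase: model each monitored parameter by the mean and
    standard deviation observed over fault-free runs."""
    return {name: (statistics.mean(vals), statistics.stdev(vals))
            for name, vals in baseline.items()}

def potential_symptoms(profile, reading, threshold=3.0):
    """Fault injection phase output analysis: flag parameters whose current
    value deviates from the fault-free model by more than `threshold`
    standard deviations."""
    flagged = []
    for name, value in reading.items():
        mean, stdev = profile[name]
        if stdev > 0 and abs(value - mean) > threshold * stdev:
            flagged.append(name)
    return flagged

# Hypothetical monitored parameters and fault-free measurements:
baseline = {"mem_free_mb": [512, 520, 508, 515, 511],
            "handles":     [100, 102, 99, 101, 100]}
profile = build_profile(baseline)
print(potential_symptoms(profile, {"mem_free_mb": 120, "handles": 101}))
# → ['mem_free_mb']
```

A real system would use richer behavioral models, but the structure (profile under fault-free load, then look for deviations under injected faults) mirrors the phases listed above.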

18 Variables ranking
- Correlate potential symptoms with the observed failures
- Rank variables by the highest rate of valid symptoms, characterized using the F-measure: the harmonic mean of precision and recall

19 Experimental demo. Two workloads:
- WKL#1, light workload: 7-Zip compressing a 4 GB file
- WKL#2, heavier workload: COSBI OpenSourceMark benchmark suite

20 Overall results: WKL#1 vs. WKL#2

21 Top-10 parameters

22

23 Robustness Testing for Web Services Research @ UC

24 The problem
- Web Services must provide a robust service to client applications
- Development tools lack mechanisms to: characterize the robustness of Web Services code; compare the robustness of alternative Web Services

25 Web Services robustness testing
- Erroneous Web Services call parameters: generated using a set of predefined rules based on the data types of each parameter; injected during Web Services execution. Example: GetWeather(city, day) → GetWeather("Coimbra", null)
- Key components needed: workload, robustness tests, failure modes classification

26 Preparing the tests
- Obtain the web service definition: list of operations, parameters, data types, domains
- The WSDL file is processed automatically to obtain the required information
- The domain of each parameter cannot be deduced from the WSDL description and must be provided by the user

27 Workload
- A workload is needed to exercise each operation of the web service
- A generic workload that fits all Web Services is not feasible: we need to generate a workload for each web service
- Workload generation: user-defined workload or random workload

28 Parameter value mutation

Type: String → mutations:
- Replace by null value
- Replace by empty string
- Replace by predefined string
- Replace by string with nonprintable characters
- Add nonprintable characters to the string
- Replace by alphanumeric string
- Add characters to overflow the maximum size

Type: Number → …
Type: List → …
Type: Date → …
Type: Boolean → …
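The String row of the mutation table can be sketched as a generator of invalid variants. This is an illustrative sketch only: the function name, the predefined string, the choice of nonprintable characters, and the maximum size are assumptions, not the rules actually coded in the tool.

```python
def string_mutations(value, max_size=256):
    """Apply the string mutation rules from the table above.
    Each returned item is one invalid/boundary variant of `value`."""
    nonprintable = "\x00\x01\x02"          # assumed nonprintable payload
    return [
        None,                              # replace by null value
        "",                                # replace by empty string
        "predefined",                      # replace by a predefined string
        nonprintable,                      # replace by nonprintable characters
        value + nonprintable,              # add nonprintable characters
        "abc123XYZ",                       # replace by alphanumeric string
        "A" * (max_size + 1),              # overflow the maximum size
    ]

# Mutating the first parameter of GetWeather(city, day), one mutant per call:
for mutant in string_mutations("Coimbra"):
    call = ("GetWeather", mutant, "2010-04-14")  # hypothetical invocation
```

Each mutant replaces exactly one parameter of one call, so a failure can be traced back to a single rule.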

29 The wsrbench tool
- Implements the Web Services testing approach
- Available online: http://wsrbench.dei.uc.pt

30 Experimental evaluation
- Tested 250 publicly available Web Services (1200 operations); the majority are listed at http://webservices.seekda.com/
- Web Services owned by different parties, e.g., Microsoft and Xara
- Some services implement the same functionality, e.g., Text Disguise and Free Captcha Service

31 Failure modes observed

32 Main causes of the problems

33

34 Web Services Robustness Improvement Research @ UC

35 Goal
- A practical way to improve web services robustness: a domain expression language and a transparent server-side domain validation process
- Very important for creating highly robust services
- Can also be used in legacy services where no source code is available: changes are applied by bytecode instrumentation

36 Approach
1. Accurately describe the service (improved)
2. Generate and execute a service workload
3. Execute robustness tests (improved for providers)
4. Correct disclosed issues and verify behavior

37 Service description
- Domain information is typically unavailable: lack of adequate development tools for domain expression; XSD Schema's inability to describe parameter dependencies

38 Example: Service domain
- Operation1 takes two integer parameters, A and B
- Valid domain for A: [1, 5] ∪ [6, 10]
- Valid domain for B: [10, 20] ∪ [30, 40]
- However, the service requires that when A is in [1, 5], B must be in [30, 40], and vice-versa
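The dependency in this example collapses the four combinations of sub-ranges into two valid regions. A minimal sketch of the combined check, where the labels r1 and r4 are assumed to denote the two permitted combinations mentioned in the following slides:

```python
def in_domain(a, b):
    """Combined domain check for Operation1(A, B): the dependency between
    A and B rules out two of the four sub-range combinations."""
    r1 = 1 <= a <= 5 and 30 <= b <= 40   # A in [1, 5]  requires B in [30, 40]
    r4 = 6 <= a <= 10 and 10 <= b <= 20  # A in [6, 10] requires B in [10, 20]
    return r1 or r4

print(in_domain(3, 35))  # → True  (region r1)
print(in_domain(3, 15))  # → False (A in [1, 5], but B outside [30, 40])
```

Note that both parameters are individually inside their announced domains in the second call; only the cross-parameter dependency makes it invalid, which is exactly what plain XSD cannot express.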

39 Example: Service domain

40 Example: Service domain. We need to announce our service as accepting r1 and r4

41 Example: Service domain
- We need to announce our service as accepting r1 and r4
- Problem: XSD is unable to express this!
- Solution: use an XSD extension that can express domain dependencies

42 Extended Domain Expression Language .........

43 Extended Domain Expression Language .........
- Logical OR and XPath functions (starts-with, contains, etc.) can also be used for more complex domains

44 Robustness tests execution
- We automatically generate robustness tests using the service definitions and exercise services: through their public interface; through external services' responses (composite services)
- Based on a rule set, we use fault injection to mutate incoming messages
- We also perform exception injection: all declared exceptions plus a set of runtime exceptions
- At the end we check for any response domain violation

45 Robustness testing tool: the service is instrumented using AOP (AspectJ)

46 Robustness problems removal: incoming requests undergo a transparent and complete validation against the announced domain
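The transparent validation step can be illustrated as a guard that rejects out-of-domain requests before the operation body runs. The slides do this on the server via bytecode instrumentation/AOP, so this Python decorator, with a hypothetical operation1 and its domain predicate, is only an analogy of the mechanism, not the actual implementation:

```python
import functools

def validate_domain(checker):
    """Wrap a service operation so every incoming request is validated
    against the announced domain before the operation body executes
    (analogue of the slides' bytecode-instrumentation approach)."""
    def wrap(op):
        @functools.wraps(op)
        def guarded(*args):
            if not checker(*args):
                raise ValueError("request outside announced domain")
            return op(*args)
        return guarded
    return wrap

# Hypothetical operation with the domain predicate for region r1:
@validate_domain(lambda a, b: 1 <= a <= 5 and 30 <= b <= 40)
def operation1(a, b):
    return a + b

print(operation1(3, 35))  # → 38; operation1(3, 15) would be rejected
```

The key property is transparency: the operation code itself is unchanged, so the same approach applies to legacy services without source code.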

47 Experimental evaluation
- Three versions of the TPC-App web services (A, B, and C)
1. We analyzed the WSDL and XSD of each service
2. Manually extended each XSD to use EDEL
3. Operation domains were fully defined
4. Created a test workload and measured its coverage
5. Cobertura indicated over 80% overall coverage

48 Results overview. Robustness problems per web service version:

Web Service            A   B   C
ChangePaymentMethod   14  13   0
NewCustomer           44  26   0
NewProducts            4   1   0
ProductDetail          0   4   0

49 Improvement verification
- We deployed three EDEL-based protected versions of the same web services
- No robustness issues were uncovered!
- We re-ran the workload and identified no out-of-domain responses
- Responses were double-checked

50

51 Other Research Topics Research @ UC

52 Security
- Vulnerability & attack injection: validation of security mechanisms
- Tools for developing non-vulnerable web services: penetration testing, static analysis, anomaly detection
- Security benchmarking for transactional systems: comparing systems in terms of security features

53 Dynamic systems
- Resilience benchmarking for self-adaptive systems: assessing resilience-related metrics; comparing systems with autonomic capabilities
- V&V of large-scale, dynamic service systems: support traceability to evolving requirements; cope with agile software development processes; explore the notion of regression in V&V; cope with successive software releases; dynamic and evolving system V&V

54 Questions & Comments. Marco Vieira, Center for Informatics and Systems, University of Coimbra, mvieira@dei.uc.pt

