
1 A Framework for the Assessment and Selection of Software Components and Connectors in COTS-based Architectures
Jesal Bhuta, Chris Mattmann {jesal, mattmann}@usc.edu
USC Center for Systems & Software Engineering, http://csse.usc.edu
February 13, 2007

2 Outline
• Motivation and Context
• COTS Interoperability Evaluation Framework
• Demonstration
• Experimentation & Results
• Conclusion and Future Work

3 COTS-Based Applications Growth Trend
• The number of systems using OTS components is steadily increasing
  – USC e-Services projects show the share of CBAs rising from 28% in 1997 to 70% in 2002
  – The Standish Group's 2000 survey found similar results (54%) in industry [Standish 2001 - Extreme Chaos]
[Figures: CBA growth trend in USC e-Services projects; Standish Group results]

4 COTS Integration: Issues
• COTS products are created with their own sets of assumptions, which are not always compatible
  – Example: integrating a Java-based Customer Relationship Management (CRM) system with Microsoft SQL Server
    • The CRM supports JDBC; Microsoft SQL Server supports ODBC
[Figure: Java CRM and Microsoft SQL Server, with mismatched JDBC and ODBC interfaces]
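At the time of this talk, one common workaround for exactly this JDBC/ODBC mismatch was Sun's JDBC-ODBC bridge (available up to Java 7, since removed). A minimal sketch, assuming an ODBC data source name "CrmDsn" has been configured on the host for the SQL Server instance; the DSN and the query are illustrative, not from the slide:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class JdbcOdbcBridgeSketch {
    public static void main(String[] args) throws Exception {
        // Load the Sun JDBC-ODBC bridge driver (shipped with the JDK up to Java 7).
        Class.forName("sun.jdbc.odbc.JdbcOdbcDriver");
        // "CrmDsn" is an assumed ODBC DSN pointing at the SQL Server database.
        try (Connection conn = DriverManager.getConnection("jdbc:odbc:CrmDsn");
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery("SELECT id, name FROM customers")) {
            while (rs.next()) {
                System.out.println(rs.getInt("id") + ": " + rs.getString("name"));
            }
        }
    }
}
```

The slide's point stands either way: without such a bridge (or a native JDBC driver for the database), the two products' interface assumptions simply do not meet.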

5 Case Study [Garlan et al. 1995]
• Goal: develop a software architecture toolkit
• COTS selected:
  – OBST, a public-domain object-oriented database
  – InterViews, a GUI toolkit
  – SoftBench, an event-based tool integration mechanism
  – Mach RPC interface generator, an RPC mechanism
• Estimated time to integrate: 6 months and 1 person-year
• Actual time to integrate: 2 years and 5 person-years

6 Problem: Reduced Trade-Off Space
• Detailed interoperability assessment is effort-intensive
  – Requires detailed analysis of interfaces and COTS characteristics, plus prototyping
• A large number of COTS products are available in the market
  – Over 100 CRM solutions × over 50 databases = over 5,000 possible combinations
• As a result, interoperability assessment is often neglected until late in the development cycle
• This reduces the trade-off space between medium- and low-priority requirements and the cost to integrate COTS

7 Statement of Purpose
To develop an efficient and effective COTS interoperability assessment framework by:
1. Utilizing existing research and observations to introduce concepts for representing COTS products
2. Developing rules that define when specific interoperability mismatches can occur
3. Synthesizing (1) and (2) into a comprehensive framework for performing interoperability assessment early (late inception) in the system development cycle
Efficient: acting or producing effectively with a minimum of unnecessary effort
Effective: producing the desired effect (effort reduction during COTS integration)

8 Proposed Framework: Scope
• Specifically addresses the problem of technical interoperability
• Does not address non-technical interoperability issues:
  – Human-computer interaction incompatibilities
  – Inter- and intra-organizational incompatibilities

9 Motivating Example: Large-Scale Distributed Scenario
• Manage and disseminate digital content (planetary science data)
• Data disseminated at multiple intervals
• Two user classes separated by geographically distributed networks (the Internet):
  – Scientists from the European Space Agency (ESA)
  – External users

10 Interoperability Evaluation Framework
[Figure: interfaces]

11 COTS Representation Attributes

12 COTS Definition Example: Apache 2.0
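The attribute table for this slide is lost in the transcript. Purely as an illustration of the idea of an attribute-based COTS definition, a hypothetical fragment for Apache 2.0; every attribute name and value below is an assumption, not the paper's actual schema:

```java
import java.util.Map;

// Hypothetical attribute-value definition of one COTS product.
class CotsDefinitionExample {
    static final Map<String, String> APACHE_2_0 = Map.of(
        "name", "Apache HTTP Server",
        "version", "2.0",
        "role", "web server",
        "controlInputs", "HTTP requests",
        "dataOutputs", "HTTP responses (HTML, binary)",
        "errorHandling", "notify (HTTP status codes, error log)",
        "dependencies", "operating system: Unix/Windows");

    public static void main(String[] args) {
        APACHE_2_0.forEach((k, v) -> System.out.println(k + ": " + v));
    }
}
```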

13 COTS Interoperability Evaluation Framework

14 Integration Rules
• Interface analysis rules
  – Example: 'Failure due to incompatible error communication'
• Internal assumption analysis rules
  – Example: 'Data connectors connecting components that are not always active'
• Dependency analysis rules
  – Example: 'Parent node does not support dependencies required by the child components'
• Each rule consists of pre-conditions and results (one possible encoding is sketched after slide 17)

15 Integration Rules: Interface Analysis
• 'Failure due to incompatible error communication'
  – Pre-conditions:
    • Two components (A and B) communicate via data and/or control (bidirectional)
    • Component A's error-handling mechanism is 'notify'
    • The two components have incompatible error output/error input methods
  – Result:
    • A failure in component A is not communicated to component B, causing a permanent block or failure in component B

16 Integration Rules: Internal Assumption Analysis
• 'Data connectors connecting components that are not always active'
  – Pre-conditions:
    • Two components are connected via a data connector
    • One of the components does not have a central control unit
  – Result:
    • Potential data loss
[Figure: Component A connected to Component B via a pipe]

17 Integration Rules: Dependency Analysis
• 'Parent node does not support dependencies required by the child components'
  – Pre-condition:
    • A component in the system requires one or more software components to function
  – Result:
    • The component will not function as expected
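The talk does not show how the tool encodes these rules internally. A minimal Java sketch of one possible encoding, using the slide-16 rule as the example; all type and field names here are hypothetical:

```java
import java.util.List;
import java.util.function.Predicate;

// Hypothetical encoding of an integration rule: a named set of pre-conditions
// over a candidate interaction, plus the mismatch reported when all hold.
record Component(String name, boolean hasCentralControlUnit) {}
record Connector(String type, Component from, Component to) {}

record IntegrationRule(String name,
                       List<Predicate<Connector>> preConditions,
                       String result) {
    boolean fires(Connector c) {
        return preConditions.stream().allMatch(p -> p.test(c));
    }
}

class RuleDemo {
    public static void main(String[] args) {
        // The internal-assumption rule from slide 16, expressed as predicates.
        IntegrationRule rule = new IntegrationRule(
            "Data connectors connecting components that are not always active",
            List.of(
                c -> c.type().equals("data"),            // connected via a data connector
                c -> !c.from().hasCentralControlUnit()
                  || !c.to().hasCentralControlUnit()),   // a component lacks a central control unit
            "Potential data loss");

        Connector pipe = new Connector("data",
            new Component("Component A", true),
            new Component("Component B", false));
        if (rule.fires(pipe)) {
            System.out.println(rule.name() + " -> " + rule.result());
        }
    }
}
```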

18 Voluminous Data-Intensive Interaction Analysis
• An extension-point implementation of the level-of-service connector selector
• Distribution connector profiles (DCPs)
  – Data access, distribution, and streaming metadata [Mehta et al. 2000] captured for each profiled connector
  – Can be generated manually or by an automatic process
• Distribution scenarios
  – Constraint queries phrased against the architectural vocabulary of data distribution: total volume, number of users, number of user types, delivery intervals, data types, geographic distribution, access policies, performance requirements

19 Voluminous Data-Intensive Interaction Analysis
• Need to understand the relationship between the scenario dimensions and the connector metadata
  – If we understood this relationship, we would know which connectors to select for a given scenario
• The current approach allows both Bayesian inference and linear equations as means of relating the connector metadata to the scenario dimensions (a Bayesian-style sketch follows this slide)
• For our motivating example:
  – 3 connectors, C1–C3
  – Profiled 12 major OTS connector technologies, including bbFTP, GridFTP, UDP-bursting technologies, FTP, etc.
  – Applied the selection framework to 'rank' the most appropriate of the 12 OTS connector solutions for the given example scenarios
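The slide does not give the model's details. The sketch below illustrates the Bayesian-inference option in a naive-Bayes style, scoring each connector by its prior times the likelihood of each observed scenario dimension; the connector names come from the slide, but every probability is invented:

```java
import java.util.Comparator;
import java.util.List;
import java.util.Map;

// Naive-Bayes-style ranking of connectors for a scenario:
// score(c) = log P(c) + sum over observed dimensions of log P(dim=value | c).
class ConnectorRanker {
    // P(dimension=value | connector), as might be learned from profiled DCPs
    // (all values here are invented for illustration).
    static final Map<String, Map<String, Double>> LIKELIHOOD = Map.of(
        "gridFTP", Map.of("totalVolume=high", 0.8, "users=many", 0.6),
        "bbFTP",   Map.of("totalVolume=high", 0.7, "users=many", 0.3),
        "FTP",     Map.of("totalVolume=high", 0.1, "users=many", 0.4));

    static double score(String connector, List<String> scenario) {
        double logScore = Math.log(1.0 / LIKELIHOOD.size()); // uniform prior P(c)
        for (String observation : scenario) {
            double p = LIKELIHOOD.get(connector)
                                 .getOrDefault(observation, 0.05); // smoothing for unseen values
            logScore += Math.log(p);
        }
        return logScore;
    }

    public static void main(String[] args) {
        List<String> scenario = List.of("totalVolume=high", "users=many");
        LIKELIHOOD.keySet().stream()
            .sorted(Comparator.comparingDouble((String c) -> score(c, scenario)).reversed())
            .forEach(c -> System.out.printf("%s: %.3f%n", c, score(c, scenario)));
    }
}
```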

20 Voluminous Data-Intensive Interaction Analysis
• Precision-recall analysis (the evaluation step is sketched after this slide)
  – Evaluated the framework against 30 real-world data distribution scenarios: 10 high-volume, 9 medium-volume, and 11 low-volume
  – Used expert analysis to develop an 'answer key' for each scenario: a set of 'right' connectors and a set of 'wrong' connectors
• Applied the Bayesian and linear-programming connector selection algorithms
  – Clustered the ranked connector lists using k-means clustering (k=2) to derive a comparable answer key for each algorithm
• Bayesian selection algorithm: 80% precision; linear programming: 48%
  – The Bayesian algorithm is more 'white box'; the linear algorithm is more 'black box'; white box is preferable here
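A minimal sketch of the evaluation step as described: cluster a ranked score list with one-dimensional k-means (k=2), treat the higher-scoring cluster as the algorithm's 'right' set, and measure precision against the expert answer key. The scores and the key below are invented:

```java
import java.util.Arrays;
import java.util.HashSet;
import java.util.Map;
import java.util.Set;

class PrecisionEval {
    // 1-D k-means with k=2 (Lloyd's algorithm); returns the midpoint
    // between the two converged cluster centers as a cut-off score.
    static double kmeans2(double[] scores) {
        double c1 = Arrays.stream(scores).min().getAsDouble();
        double c2 = Arrays.stream(scores).max().getAsDouble();
        for (int iter = 0; iter < 100; iter++) {
            double s1 = 0, s2 = 0; int n1 = 0, n2 = 0;
            for (double s : scores) {
                if (Math.abs(s - c1) < Math.abs(s - c2)) { s1 += s; n1++; }
                else { s2 += s; n2++; }
            }
            double nc1 = n1 > 0 ? s1 / n1 : c1, nc2 = n2 > 0 ? s2 / n2 : c2;
            if (nc1 == c1 && nc2 == c2) break; // converged
            c1 = nc1; c2 = nc2;
        }
        return (c1 + c2) / 2;
    }

    public static void main(String[] args) {
        Map<String, Double> ranked = Map.of("gridFTP", 0.9, "bbFTP", 0.7, "FTP", 0.2);
        Set<String> answerKey = Set.of("gridFTP", "bbFTP"); // expert "right" set (invented)
        double cut = kmeans2(ranked.values().stream()
                                   .mapToDouble(Double::doubleValue).toArray());
        Set<String> selected = new HashSet<>();
        ranked.forEach((c, s) -> { if (s > cut) selected.add(c); });
        long hits = selected.stream().filter(answerKey::contains).count();
        System.out.printf("precision = %.2f%n", (double) hits / selected.size());
    }
}
```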

21 Demonstration

22 Experiment 1
• Conducted in a graduate software engineering course on 8 projects
  – 6 projects were COTS-based applications: 2 web-based (3-tier) projects, 1 shared-data project, 1 client-server project, 1 web-service interaction project, and 1 single-user system
  – Teams applied this framework before the RLCA* milestone on their respective projects
  – Data collected using surveys: immediately after the interoperability assessment, and after completion of the project
* Rebaselined Life Cycle Architecture

23 Experiment 1 Results

Data set                   Group                                          Mean      Std. dev.  P-value
Dependency accuracy*       Pre-framework application                      79.3%     17.9       0.017
                           Post-framework application                    100%      0
Interface accuracy**       Pre-framework application                      76.9%     14.4       0.0029
                           Post-framework application                    100%      0
Actual assessment effort   Projects using this framework                  1.53 hrs  1.71       0.053
                           Equivalent projects not using this framework   5 hrs     3.46
Actual integration effort  Projects using this framework                  9.5 hrs   2.17       0.0003
                           Equivalent projects not using this framework   18.2 hrs  3.37

* Accuracy of dependency assessment: 1 – (number of unidentified dependencies / total number of dependencies)
** Accuracy of interface assessment: 1 – (number of unidentified interface interaction mismatches / total number of interface interactions)
Accuracy: a quantitative measure of the magnitude of error [IEEE 1990]
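To make the accuracy formulas concrete (numbers invented for illustration): a team that identified 8 of a project's 10 actual dependencies would score 1 – 2/10 = 80% dependency accuracy, close to the pre-framework mean above.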

24 Experiment 2 – Controlled Experiment

                              Treatment group  Control group
Number of students            75               81
On-campus students            60               65
DEN students                  15               16
Average experience            1.473 years      1.49 years
Average on-campus experience  0.54 years       0.62 years
Average DEN experience        5.12 years       5 years

25 Experiment 2 – Cumulative Results

Data set                                  Group                 Mean      Std. dev.  P-value
Hypothesis IH1: Dependency accuracy       Treatment group (75)  100%      0          <0.0001 (t=20.7; sdev=8.31; DOF=154)
                                          Control group (81)    72.5%     11.5
Hypothesis IH2: Interface accuracy        Treatment group (75)  100%      0          <0.0001 (t=13.0; sdev=9.37; DOF=154)
                                          Control group (81)    80.5%     13.0
Hypothesis IH3: Actual assessment effort  Treatment group (75)  72.8 min  28.8       <0.0001 (t=-9.04; sdev=77.5; DOF=154)
                                          Control group (81)    185 min   104

26 Experiment 2 – On-Campus Results

Data set                                  Group                 Mean      Std. dev.  P-value
Hypothesis IH1: Dependency accuracy       Treatment group (60)  100%      0          <0.0001 (t=17.9; sdev=8.50; DOF=123)
                                          Control group (65)    72.6%     11.8
Hypothesis IH2: Interface accuracy        Treatment group (60)  100%      0          <0.0001 (t=12.0; sdev=9.12; DOF=123)
                                          Control group (65)    80.4%     12.6
Hypothesis IH3: Actual assessment effort  Treatment group (60)  67.1 min  23.1       <0.0001 (t=-8.75; sdev=74.2; DOF=123)
                                          Control group (65)    183 min   100

27 Conclusion and Future Work
• Results (so far) indicate a 'sweet spot' in small e-services projects
• The framework-based tool automates initial interoperability analysis: interface, internal assumption, and dependency mismatches
• Further experimental analysis is ongoing:
  – Different software development domains
  – Projects with greater COTS complexity
• Additional quality-of-service extensions

28 Questions

