Computing Department On the Design of a Testbed for AOSD Alessandro Garcia May 2007.


Key Researchers
– Lancaster, UK: Phil Greenwood, Alessandro Garcia, Eduardo Figueiredo, Nelio Cacho, Claudio Sant'Anna, Americo Sampaio, Awais Rashid
– Recife, Brazil: Sergio Soares, Marcos Dosea, Paulo Borba
– Kiel, Germany & Waterloo, Canada: Thiago Bartolomei
– Lisbon, Portugal: Joao Araujo, Ana Moreira, Isabel Brito, Ricardo Argenton
– Malaga, Spain: Monica Pinto, Lidia Fuentes
– Salvador & Natal, Brazil: Thais Batista, Christina Chavez, Lyrene Silva
– Other contributors: Milan/Italy, Fraunhofer/Germany, Colorado/USA, Rio/Brazil, INRIA/France, Siemens/Germany…

AOSD: from embryonic techniques… to integration and testing in real-world settings
– Growing need to assess AO methodologies: AOSD is becoming a sufficiently established research community
– Need to compare AO approaches with other contemporary modularization approaches
– Creation of an experimental environment for end-to-end evaluation of AOSD techniques: requirements, architecture, design, implementation, maintenance

Numerous barriers
– Available systems lack proper documentation
– It is difficult to find multiple AO and non-AO implementations of the same system; even worse, guaranteeing that the non-AO and AO decompositions are good ones is tricky
– PhD research studies: it is difficult to find, or develop from scratch, a plausible "benchmark"; many risks (time-consuming task, inherent bias, etc.), so collaboration is the only alternative left
– Quantitative and qualitative indicators are often NOT ready for use
– Replication of studies becomes painful

A Testbed for AOSD: towards more scientific and cohesive research
– Serve as a communication and collaboration vehicle: achieve widely-accepted exemplars, indicators, and data that can be reused and refined
– Facilitate the identification of "unknown" problems and benefits inherent to AOSD: effects throughout the lifecycle; bottlenecks specific to certain SE phases and their transitions
– Accelerate progress in the area by offering context to pinpoint technique-specific problems

Testbeds vs. Software Engineering
– Recent recognition of the pivotal role of benchmarking in community cohesion and rapid progress 1
– Some fields have made progress on benchmarking, e.g. reverse engineering, software refactoring, and program comprehension
– However, there is not much work on benchmarking modularization techniques, and reports about the process of designing, instantiating, and evolving benchmarks in software engineering are rare
1 S. Sim, S. Easterbrook, R. Holt. Using Benchmarking to Advance Research: A Challenge to Software Engineering. Proc. 25th Intl. Conf. on Software Engineering, Portland, Oregon, 3-10 May 2003.

Timeline (June–December 2006)
– June 2006: proposal accepted
– Subsequent milestones: choice of the benchmark goal; contributions of artefacts start; circulation of the questionnaire; 1st benchmark definition starts; indicators definition; choice of the change scenarios; preparation of the 1st pilot stability study; preparation of the pilot AO requirements study; conclusion of the 1st study
– New needs identified along the way, e.g.: concern interaction metrics; redefinition of metrics for CaesarJ; measurement reliability (tool support)
– Overall phases: testbed design; benchmark instantiations

Outline
– Testbed design: the first benchmark
– Testbed elements
– Testbed instantiation
– Testbed evolution
– EA & the Testbed

Testbed design: the first benchmark
A number of decisions, such as application selection:
– it should be a system likely to be universally usable for different assessment purposes
– ten candidate applications were examined: Tourist Guide System, Pet Store, J2ME Games, CVS Eclipse Plug-In, OpenORB middleware system, etc.
– each application was ranked according to weighted criteria

Selection Criteria
Examples:
– availability of AO and non-AO implementations (important)
– availability of documentation (least important)
– system generality (important)
– heterogeneous types of concern interactions (most important)
– aspects emerging in different phases (least important)
– previous acceptance by the research community (most important)
– paradigm neutrality (most important)
– a variety of crosscutting and non-crosscutting concerns (important), e.g. widely-scoped vs. more localized ones, or those requiring different uses of AO mechanisms
– elegance of the AO and non-AO decompositions (important)
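The weighted ranking above can be sketched as a simple score computation. The criteria weights and per-application scores below are hypothetical placeholders (the deck does not publish the actual numbers), chosen only to illustrate the scheme:

```java
import java.util.*;

public class CandidateRanking {
    public static void main(String[] args) {
        // Hypothetical weights reflecting the slide's importance levels
        // (most important = 3, important = 2, least important = 1).
        Map<String, Integer> weights = new LinkedHashMap<>();
        weights.put("AO and non-AO implementations", 2);
        weights.put("documentation", 1);
        weights.put("concern interaction heterogeneity", 3);
        weights.put("community acceptance", 3);

        // Illustrative per-criterion scores (0-5) for two of the candidates.
        Map<String, int[]> candidates = new LinkedHashMap<>();
        candidates.put("Health Watcher", new int[]{5, 3, 4, 5});
        candidates.put("Pet Store", new int[]{2, 4, 2, 3});

        for (Map.Entry<String, int[]> c : candidates.entrySet()) {
            int total = 0, i = 0;
            // Weighted sum: dot product of weights and criterion scores.
            for (int w : weights.values()) total += w * c.getValue()[i++];
            System.out.println(c.getKey() + ": " + total);
        }
    }
}
```

Each candidate's total is the weight-score dot product and the highest total wins; the actual selection ranked ten candidates against the full criteria list.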

Health Watcher (HW) System 1
– The Java version was developed by a company in Brazil
– Several desirable properties: real-life system; non-trivial; Java and AspectJ implementations available, with elegant OO and AO designs; some requirements, architecture, and design documentation available; designed with modularity, reusability, maintainability, and stability in mind; used in a reasonable number of studies that report well-accepted non-AO and AO design decompositions (OOPSLA.02, FSE.06, S:P&E 2006, ICSM.06, EWSA.06, EA.06, ESEM.07, etc.)
– It is important that multiple applications are used in the testbed to allow broad conclusions to be drawn
1 Soares et al. Implementing Distribution and Persistence Aspects with AspectJ. OOPSLA 2002

Health Watcher Architecture

Artefacts Repository
Initially a limited number of approaches have been applied:
– Requirements: e.g. Use Cases, V-Graph, AOV-Graph, AORE, AORA
– Architecture: e.g. UML, ACME, AO-ADL, AspectualACME, AOGA
– Design: UML, Theme/UML, aSideML
– Implementation: Java, AspectJ, CaesarJ, AWED, JBoss
Contributors reported:
– strengths and weaknesses of the HW system
– issues to be benchmarked

What issues to benchmark?
Questionnaires were sent to a representative set of SE institutions to understand which areas of the existing AO techniques:
– were mature enough: phases such as requirements engineering, detailed design, and implementation (e.g. pointcut languages)
– were still in an evolution stage (e.g. aspect interaction)
– target which quality attributes (e.g. enhanced maintainability and reusability)
Investigation of typical "ilities" in previous empirical studies involving modularization techniques (e.g. OO, AO, etc.):
– modularity, maintainability, and reusability (e.g. software stability)
– reliability (e.g. error proneness)
– specification effort and outcome quality (e.g. time spent, recall, and precision)

What issues to benchmark?
Impact of AO mechanisms on particular SE activities or phases:
– phases are often assessed in isolation
– it is desirable to determine the effects of one phase on subsequent phases, e.g. how do changes in an AO program impact the stability of the architecture decomposition (compared with OO program changes)?
Which motivating comparison?
– OO vs. AO? or
– multiple AO techniques?

Enhancing the HW System…
– … to include changes and produce releases: both widely-scoped and localized changes; changes to both CCCs and non-CCCs; different categories (perfective changes and refactorings, corrective changes, evolutionary changes, etc.)
– … to address the identified weaknesses w.r.t. our original criteria (e.g. include localized CCCs, such as design patterns) and feedback received from the contributors (e.g. need for improving the categories of aspect interactions)
– … based on the history of HW changes in the deployed Java system

Stability Indicators
– Generality: indicators not tied to one specific artefact/technique type
– Traceability in the assessment process: support assessment of the effects of one phase on subsequent phases
– SE-wide properties: modularity (cohesion, coupling, SoC, interface simplicity, etc.); change impact and stability; concern interaction
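As a minimal illustration of a change-impact indicator of the kind listed above, the sketch below computes the fraction of modules touched between two releases. The module names and the hash-based change test are assumptions made for illustration; the testbed's actual metric suites are defined elsewhere and are more refined:

```java
import java.util.*;

public class ChangeImpact {
    // Fraction of modules touched between two releases: a simple,
    // technique-neutral stability indicator (hypothetical definition).
    // Each map associates a module name with a content hash.
    static double touchedRatio(Map<String, Integer> prev, Map<String, Integer> next) {
        Set<String> all = new TreeSet<>(prev.keySet());
        all.addAll(next.keySet());
        int touched = 0;
        for (String m : all) {
            // A module counts as touched if it was added, removed,
            // or its content hash changed between releases.
            if (!Objects.equals(prev.get(m), next.get(m))) touched++;
        }
        return (double) touched / all.size();
    }

    public static void main(String[] args) {
        Map<String, Integer> r1 = Map.of("Persistence", 11, "GUI", 22, "Business", 33);
        Map<String, Integer> r2 = Map.of("Persistence", 11, "GUI", 25, "Business", 33, "EH", 44);
        // GUI changed and EH was added: 2 of 4 modules touched.
        System.out.println(touchedRatio(r1, r2));
    }
}
```

A lower ratio across the change scenarios suggests a more stable decomposition; the real studies complement such counts with modularity and concern-interaction measures.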

Testbed Elements
– Design Stability Study; consequence: more mature elements

Outline
– Testbed design: the first benchmark
– Testbed elements
– Testbed instantiation: study on architecture and implementation stability 1 (Java vs. AspectJ vs. CaesarJ); study on AO requirements engineering 2
1 P. Greenwood et al. On the Impact of Aspectual Decompositions on Design Stability: An Empirical Study. Proceedings of the 21st European Conference on Object-Oriented Programming (ECOOP.07), July 2007, Germany. (to appear)
2 A. Sampaio et al. A Comparative Study of Aspect-Oriented Requirements Engineering Approaches. Proc. of the 1st International Symposium on Empirical Software Engineering and Measurement (ESEM.07), September 2007. (to appear)

Instantiation of the Benchmark (Design Stability Study)
Application of the selected metric suites to each of the artefacts generated:
– Java, AspectJ, and CaesarJ programs
– non-AO architecture (N-tier) vs. AO architecture
Multi-dimensional analysis, including:
– modularity sustenance
– observance of architectural and design ripple effects
– which categories of aspects (and respective interfaces) have exhibited stability, and which have not
– satisfaction of basic design principles through the releases

Instantiation of the Benchmark (Design Stability Study)
Outcomes overview 1:
+ concerns aspectized upfront tend to show superior modularity stability
+ AO solutions required less intrusive modification of modules
+ aspectual decompositions demonstrated superior satisfaction of the Open-Closed Principle
+ architectural ripple effects were observed only in the OO solution: undesirable changes related to exception handling in multiple layers
– highlighted the "fragile pointcut" problem: ripple effects observed in interacting aspect interfaces
– AO modifications tended to propagate to seemingly unrelated modules
1 P. Greenwood et al. On the Impact of Aspectual Decompositions on Design Stability: An Empirical Study. Proceedings of the 21st European Conference on Object-Oriented Programming (ECOOP.07), July 2007, Germany.


Instantiation of the Benchmark (AO Requirements Study)
Comparison of four prominent AORE approaches:
– time effectiveness (person-minutes)
– accuracy of the produced outcome: precision and recall of the models produced
Example of research question:
– which activities are the main bottlenecks in terms of effort for each AORE approach?
– target: the 1st author was interested in learning which tasks should be automated in the EA-Miner tool
– main outcome: composition specification and conflict analysis
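Precision and recall of a produced model are computed against a reference set of expected elements. The concern names below are hypothetical; the sketch only illustrates how the two accuracy figures the study reports are obtained:

```java
import java.util.*;

public class ModelAccuracy {
    public static void main(String[] args) {
        // Reference set: concerns an "ideal" requirements model should identify.
        Set<String> expected = Set.of("Persistence", "Distribution", "Security", "Concurrency");
        // Concerns actually identified by a (hypothetical) AORE analysis.
        Set<String> identified = Set.of("Persistence", "Distribution", "Logging");

        // True positives: identified concerns that are also expected.
        Set<String> truePositives = new HashSet<>(identified);
        truePositives.retainAll(expected);

        // precision = |correct| / |identified|; recall = |correct| / |expected|
        double precision = (double) truePositives.size() / identified.size();
        double recall = (double) truePositives.size() / expected.size();
        // Locale.ROOT keeps the decimal separator as '.' regardless of platform.
        System.out.printf(Locale.ROOT, "precision=%.2f recall=%.2f%n", precision, recall);
    }
}
```

High precision with low recall indicates an approach that misses concerns; the reverse indicates one that over-reports, which is why the study tracks both.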

Timeline - Evolution
Feedback from the studies fed back into the testbed design:
– design stability study: lack of architectural changes, so exception handling (EH) was added; bugs encountered were fixed; "alignments" between artefacts improved; metrics redefined to account for CaesarJ mechanisms; more details added to the architecture documentation; architecture metrics refined; improved definition of concern interaction metrics
– requirements study: common naming scheme; common activities

Evolution: feedback from the studies
New categories of crosscutting concerns:
– implementation level: checked exceptions; EH aspectization is more challenging (use of the exception-softening mechanism; complex, context-sensitive exception handlers; use of around advice)
– detailed design level: use of design patterns; plenty of different uses of AO mechanisms (role-based composition, multiple inheritance, etc.)
Particular aspect interactions still not investigated:
– more than two aspects sharing the same join point
– pointcuts picking out advice executions

EA and the Testbed
Status:
– repository of AO and non-AO artefacts
– no changes have been applied
Improvements are necessary, e.g.:
– there is no detailed problem description, only use cases; requirements information is missing
– most of the requirements-level aspects are directly mapped to architecture and implementation aspects
– alignment of existing AO and non-AO artefacts needs to be improved
– some architecture models are abstract, and some architectural views are missing

EA and the Testbed
Elements of the testbed repository have proven useful even in unanticipated assessment contexts, e.g.:
– AO measurement (U. Waterloo: Thiago Bartolomei)
– dynamic AO metrics (U. Milan: Walter Cazzola)
– AO design heuristics (U. Lancaster: Figueiredo, Sant'Anna, Garcia)
– architectural styles and aspects (U. Bologna, U. Lancaster, UFBA, UFRN)
Used and extended in several ways:
– investigate the interplay of AO requirements composition mechanisms and several attributes: requirements description stability, traceability, change impact analysis, understandability, etc.

EA and the Testbed
Other lessons learned:
– it is very difficult to design a proper testbed without the effective participation of the technique experts (e.g. J. Araujo and A. Moreira for the AORE technique; T. Bartolomei from the CaesarJ team)
– the testbed is an effective collaboration/communication tool: it enables developers/researchers of emerging EA techniques to communicate through a common set of artefacts; improved problem understanding; not targeted to one specific phase
– developers gain an improved awareness of all development phases; enables focused discussions at EA workshops
– we need more funding $$$

Future Expansions
Other benchmarks:
– for assessing stability in early aspects techniques
– for error proneness
Expand testbed elements:
– new applications
– apply more approaches
– develop new metrics
The testbed repository is a semi-open resource for now. The elements used and generated in the stability study are available at:


Contributing to the Testbed
– The aim is to become an extensive open resource
– Only a limited number of approaches have initially been applied to the testbed
– Further contributions from the SE community are required: applications, new approaches, metric suites

Summary
– Provided an overview of the various elements that contribute to the testbed
– Illustrated how traceability can be achieved across development phases in terms of assessing approaches
– Gave a concrete example of how the testbed can be instantiated, which can also be achieved in other development phases
– Highlighted the benefits of using a common testbed for the community

Other issues
– It is important that the testbed is an open resource; users of the testbed need to contribute the results they gather
– Repository of data
– Guidelines on how to select the benchmarks and indicators (and previous data)
– Validation of the benchmark (which issues should we consider?)
– Plethora of new composition mechanisms in AOSD: how much should they affect the benchmark design? E.g. CaesarJ has feature-oriented programming mechanisms that are best suited to product lines (PLs)

Outline
– Provide an overview of the testbed: aims, elements, design decisions
– Detail the targeted development phases: approaches, metrics
– Example instantiation of the testbed: stability case study at the implementation phase
– Subset of results: comparison of AORE approaches; results of the stability case study
– Benefits and future work

Testbed design: the first benchmark
– Answer key questions regarding the effectiveness of AOSD throughout the development life-cycle
– Provide a valuable resource to the software engineering community
– A common testbed used to assess and compare AO and non-AO approaches
– A communication vehicle for AO proponents

Possible focus of upcoming benchmarks:
– design stability
– error proneness
– impact of aspects in adjacent phases, e.g. requirements -> architecture (traceability, quality of decisions made, etc.)

Achieving Traceability
– Phases are often assessed in isolation
– It is desirable to determine the effects of one phase on subsequent phases
– A number of attributes are common across development phases: concern interaction, modularity, stability, change impact

Requirements Phase
– A number of approaches have been applied: Viewpoint-based AORE, AO Requirements Analysis (AORA), MDSOC, AOV-Graph
– It is difficult to compare such varied approaches
– The testbed project initiated related work for comparing AORE approaches 2: it provides common schemes for comparison
– Some commonalities exist for comparison: effort (time to produce documentation), modularity
2 A. Sampaio et al., "A Comparative Study of Aspect-Oriented Requirements Engineering Approaches", Proc. of the 1st International Symposium on Empirical Software Engineering and Measurement (ESEM), September 2007. (to appear)

Architecture Design Phase
– A variety of architecture approaches have been applied: ACME, AspectualACME, AO-ADL, Aspectual Template, AOSD-Europe Notation
– A specific metric suite has been developed for assessing architecture design approaches: coupling, cohesion, interface complexity, SoC, interactions
– Other general attributes are also measured: effort, stability, change impact
– These metrics allow correlation to the requirements phase
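A crude sketch of one of the listed measures: an efferent-coupling count over a component-dependency view of the architecture. The component names and this simplified metric definition are assumptions for illustration; the suite's real architecture metrics are more refined:

```java
import java.util.*;

public class ArchitectureCoupling {
    public static void main(String[] args) {
        // Hypothetical component-dependency view of a layered architecture:
        // each entry maps a component to the components it depends on.
        Map<String, List<String>> deps = new LinkedHashMap<>();
        deps.put("GUI", List.of("Business"));
        deps.put("Business", List.of("Data", "Distribution"));
        deps.put("Data", List.of());
        deps.put("Distribution", List.of("Data"));

        // Simplified efferent-coupling count per component: the number of
        // distinct components it depends on.
        for (Map.Entry<String, List<String>> e : deps.entrySet()) {
            System.out.println(e.getKey() + " coupling=" + new HashSet<>(e.getValue()).size());
        }
    }
}
```

Because the same dependency view can be extracted from requirements-level compositions, counts like this are one way such metrics correlate across phases.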

Instantiation of the Benchmark (Implementation Phase) (1)
– The aim was to compare/assess the stability of AO and non-AO approaches
– Involved selecting various elements provided by the testbed: application, metric suites, etc.
– Apply new approaches to the base artefacts (Java/AspectJ implementations) to create new artefacts: CaesarJ

Using the timeline to give examples:
– how the studies fed back into the definition of the benchmarks
– change scenarios (different HW releases): can be reused for studies involving traceability, reuse, effectiveness of change impact analysis techniques, etc.
– indicators (concern interaction analysis)
– common naming scheme

Results gathered can influence future development of the testbed
– Metrics collected in the stability study highlighted deficiencies in some changes: additional changes were added to improve coverage
– Development of new metrics: modularity metrics were unable to capture all variations in the code due to their level of granularity, so change propagation metrics were developed and applied to analyse all phenomena and explicitly investigate the differences between AspectJ and CaesarJ

The Testbed as a Communication Tool
– Enables developers/researchers across phases to communicate: a common set of artefacts; improved problem understanding; not targeted to one specific phase
– Developers gain an improved awareness of all development phases
– Enables focused discussions at workshops, etc.

Need to establish commonalities between approaches in order for comparisons to be made:
– tasks, e.g. concerns, concern interaction, change propagation, modularity

Instantiation of the Benchmark (Design Stability Study)
– both architecture and implementation measures

Instantiation of the Benchmark (AO Requirements Study)
Outcomes overview:
– composition is the cornerstone of AORE; composition specification is a time-consuming activity, but it improves change management and conflict analysis, and this trade-off requires further analysis
– conflict analysis is also a significant task
– main bottlenecks: composition specification and conflict analysis
– future: comparison with non-AO RE approaches