
Slide 1: Architectural Evaluation – Overview, Questioning Techniques
Henrik Bærbak Christensen, DAIMI

Slide 2: Literature
Main
– [Bass et al., 2003] chap. 11 (ATAM): Bass, L., Clements, P., Kazman, R. Software Architecture in Practice. Addison-Wesley, 2003.
– [Barbacci et al., 2003] Barbacci, M., Ellison, R., Lattanze, A., Stafford, J., Weinstock, C., and Wood, W. (2003). Quality Attribute Workshops, 3rd Edition. Technical Report CMU/SEI-2003-TR-016, Software Engineering Institute.
– [Bardram et al., 2004] Bardram, J., Christensen, H. B., and Hansen, K. M. (2004). Architectural Prototyping: An Approach for Grounding Architectural Design and Learning. In Proceedings of the 4th Working IEEE/IFIP Conference on Software Architecture (WICSA 2004), pages 15-24, Oslo, Norway.

Slide 3: A Simplified Development Process
– NB! Iterative, incremental, parallel, … process?

Slide 4: Rationale for Architectural Evaluation
Software architecture allows or precludes almost all system quality attributes
– Need to evaluate the impact of architectural design on quality attributes
– Should be possible…
Architecture needs to be designed early
– E.g., a prerequisite for work assignment
Architecture embodies fundamental design decisions
– Hard/costly to change these
– The later the change, the costlier it is
Cheap techniques for architectural evaluation exist

Slide 5: When?
Architecture designed, design not implemented
– I.e., early in a cycle in an iteration
– E.g., “We have designed this architecture; will it possibly fit the quality requirements?”
Early
– Evaluate design decisions
  Either design decisions made or considered
  Architecture description may not be available
– Completeness and fidelity are a product of the state of the architectural description
– E.g., “Will any architecture meet these quality requirements?”
  “You can have any combination of features the Air Ministry desires, so long as you do not also require that the resulting airplane fly” (Messerschmitt)
Late
– Evaluate architecture for an existing system
  Possible redesign, evolution, …
– E.g., “We need to integrate with this product; will it be suitable in our architecture?”

Slide 6: Who?
Roles
– Evaluation team
  Preferably not project staff (objectivity reasons…)
– Project stakeholders
  Articulate requirements
  “Garden variety” and decision makers
  Producers: software architect, developer, maintainer, integrator, standards expert, performance engineer, security expert, project manager, reuse czar
  Consumers: customer, end user, application builder (for a product line)
  Servicers: system administrator, network administrator, service representatives
Needs depend on the actual type of evaluation

Slide 7: What?
Probe of suitability of architecture(s)
– Will the system meet quality goals?
– Will it still be buildable?
Context-specific
– E.g., modifiability depends on modification scenarios
Need articulated goals
– Major part of the evaluation process
– Particularly in large, organizationally complex projects
Most often the result is not a scalar
– Not a grade like “00, 03, …, 13”
– Often focus on finding risks: which decisions affect which qualities?

Slide 8: Questioning Techniques
Analytical
– Any project area, typically review
– Any state of completeness
Questionnaires
Checklists
Scenario-based methods

Slide 9: Overview – Checklists / Questionnaires

Slide 10: Questionnaires
List of relatively open questions
– Apply to all architectures
Process and product questions
– Cf. Capability Maturity Model (CMM) for software
E.g., Kruchten

Slide 11: Checklists
Detailed questions
– Based on experience
– Domain-specific
E.g., AT&T

Slide 12: Overview – Scenario-Based Methods

Slide 13: Scenario-Based Methods
Scenarios
– Description of interaction with the system from the point of view of a stakeholder
– System-specific – developed as part of the project
– Cf. quality-attribute scenarios as in [Bass et al., 2003]
Types
– Quality Attribute Workshops (QAW)
  Structured way of involving stakeholders in scenario generation and prioritization
– Architecture Tradeoff Analysis Method (ATAM)
  Creates utility trees to represent quality attribute requirements
  Analysis of architectural decisions to identify sensitivity points, tradeoffs, and risks
– Software Architecture Analysis Method (SAAM)
  Brainstorming of modifiability and functionality scenarios
  Scenario walkthrough to verify functionality support and estimate change costs
– Active Reviews for Intermediate Designs (ARID)
  Active Design Review of software architecture
  E.g., “Is the performance of each component adequately specified?” vs. “For each component, write down its maximum execution time and list the shared resources that it may consume”
– Survivable Network Analysis method (SNA)
  Focus on survivability as a quality attribute
  Process:
  – Determine essential components based on essential services and assets
  – Map intrusion scenarios onto the architecture to find “soft-spot” components (essential, but vulnerable)
  – Analyze these with respect to:
    » Resistance
    » Recognition
    » Recovery of/from attacks

Slide 14: Quality Attribute Workshops (QAWs)
Facilitated method that engages stakeholders in discovering and prioritizing quality attribute requirements
– [Barbacci et al., 2003]
– Typically started before the architecture is designed/finished
Steps
1. QAW Presentation and Introductions
2. Business/Mission Presentation
3. Architectural Plan Presentation
4. Identification of Architectural Drivers
5. Scenario Brainstorming
6. Scenario Consolidation
7. Scenario Prioritization
8. Scenario Refinement

Slide 15: QAW Steps (2)
1. Presentation and Introductions
2. Business/Mission Presentation
– Stakeholders present drivers for the system
  E.g., profitability, time-to-market, better quality of service, …
– High-level functional requirements, constraints, quality attribute requirements
3. Architectural Plan Presentation
– How will business/mission drivers be satisfied?
– Key technical requirements
  E.g., mandated technical decisions
– Existing, preliminary architectural descriptions
4. Identification of Architectural Drivers
– Keys to realizing the quality attribute goals of the system
  E.g., key requirements, business drivers, quality attributes
– Distilled version of steps 2 and 3

Slide 16: QAW Steps (3)
5. Scenario Brainstorming
– Goal
  Come up with as many well-formed quality attribute scenarios as possible
  Stimulus, environment, response (see the sketch after this slide)
– Participants
  Come up with quality attribute scenarios
  No critique as such, only clarification questions
– Facilitator
  Write scenarios on a whiteboard
  Ensure that scenarios are usable
  – “The system shall be modifiable” vs. “The user interface of … is changed to a different look & feel in two person-days”
  Make sure architectural drivers are covered
– Either a fixed time period or whenever participants run out of good ideas
Usually easy to create 30-40 scenarios
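A well-formed quality attribute scenario can be captured as a small structured record. The sketch below is illustrative only: the three fields follow the stimulus/environment/response form named on the slide, while the class QualityAttributeScenario and its example values are invented and not part of the QAW material.

/** Minimal sketch of a well-formed quality attribute scenario (illustrative only). */
public class QualityAttributeScenario {
    private final String stimulus;     // e.g., "The look & feel of the UI is changed"
    private final String environment;  // e.g., "during normal development"
    private final String response;     // e.g., "change completed within two person-days"

    public QualityAttributeScenario(String stimulus, String environment, String response) {
        this.stimulus = stimulus;
        this.environment = environment;
        this.response = response;
    }

    /** A scenario is well-formed only when all three parts are stated. */
    public boolean isWellFormed() {
        return !stimulus.isBlank() && !environment.isBlank() && !response.isBlank();
    }

    @Override
    public String toString() {
        return stimulus + " [" + environment + "] -> " + response;
    }

    public static void main(String[] args) {
        QualityAttributeScenario s = new QualityAttributeScenario(
                "UI look & feel is changed", "during development", "completed within two person-days");
        System.out.println(s + " well-formed: " + s.isWellFormed());
    }
}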

Slide 17: QAW Steps (4) – Scenario Types
Use case scenarios
– Expected use of the system
Growth scenarios
– Anticipated changes to the system
Exploratory scenarios
– Unanticipated stresses to the system

Slide 18: QAW Steps (5)
6. Scenario Consolidation
– Merge similar scenarios
– Requires stakeholder agreement; majority consensus if need be
7. Scenario Prioritization
– Each stakeholder has N = ceil(#scenarios * 0.3) votes on the consolidated scenarios
– Round-robin voting (see the worked example after this slide)
  Two passes
  Each pass: allocate half of the votes
– Resulting count = prioritization
  Typically high, medium, low priority
8. Scenario Refinement
– Develop high-priority scenarios according to the scheme of [Bass et al., 2003]
  Stimulus, source of stimulus, environment, artifact stimulated, response, response measure
– Describe relevant quality attributes
– Elicit questions and issues
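As a rough sketch of the voting arithmetic above (the scenario count and vote tallies are invented for illustration): with 35 consolidated scenarios each stakeholder gets ceil(35 * 0.3) = 11 votes, cast in two round-robin passes of roughly half the votes each.

import java.util.Map;

/** Illustrative sketch of QAW vote allocation; scenario names and counts are made up. */
public class QawVoting {
    /** Votes per stakeholder: 30% of the consolidated scenario count, rounded up. */
    static int votesPerStakeholder(int consolidatedScenarios) {
        return (int) Math.ceil(consolidatedScenarios * 0.3);
    }

    public static void main(String[] args) {
        int scenarios = 35;                            // assumed consolidated scenario count
        int votes = votesPerStakeholder(scenarios);    // 11 votes
        int firstPass = (int) Math.ceil(votes / 2.0);  // roughly half in the first pass
        int secondPass = votes - firstPass;            // remainder in the second pass
        System.out.printf("Scenarios: %d, votes per stakeholder: %d (%d + %d)%n",
                scenarios, votes, firstPass, secondPass);

        // Prioritization = total vote count per scenario, e.g. grouped into high/medium/low.
        Map<String, Integer> tally = Map.of("S1", 14, "S2", 9, "S3", 2); // invented tallies
        tally.forEach((scenario, count) ->
                System.out.println(scenario + ": " + count + " votes"));
    }
}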

Slide 19: QAW Results
Outputs
– List of architectural drivers
– Raw scenario list
– Prioritized scenario list
– Refined scenarios
Uses
– Update architectural vision, refine requirements
– Guide design and development
– Facilitate further analysis/evaluation
  E.g., ATAM

Slide 20: Architecture Tradeoff Analysis Method (ATAM)
Context
– “Large US public projects”
  Government, military, space: DoD, NASA, …
Goals
– Reveal how well the architecture satisfies quality goals
– Insight into tradeoffs among quality attributes
– Identify risks
Structured method
– Repeatable analysis
  Should yield the same results when applied twice
– Guides users to look for conflicts and resolutions
Parts
– Presentation
– Investigation
– Testing
– Reporting

Slide 21: ATAM Steps (1) – Presentation
1. Present the ATAM
– Evaluation leader describes the method…
2. Present Business Drivers
– Most important functions
– Relevant constraints: technical, managerial, economic
– Business goals
– Major stakeholders
– Architectural drivers: major quality attribute goals, which shape the architecture
3. Present the Architecture
– In terms of views…
– Focus on how the drivers are met

Slide 22: ATAM Steps (2) – Investigation and Analysis
4. Identify Architectural Approaches Used
– The architect names the approaches used
  Approach = here, a set of architectural decisions, e.g., tactics that are used
– Goal
  Eventually match desired qualities and decisions

Slide 23: ATAM Steps (3) – Investigation and Analysis
5. Generate the Quality Attribute Utility Tree
– Decision makers refine the most important quality attribute goals
  Quality: Performance
  – Refinement: Data latency
    » Scenario: Deliver video in real-time
– Specify prioritization
  Importance with respect to system success: High, Medium, Low
  Difficulty in achieving: High, Medium, Low
  (H,H), (H,M), (M,H) most interesting (see the sketch after this slide)
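To make the utility-tree structure concrete, here is a minimal sketch; it is not from the ATAM material, and names such as UtilityTreeNode and the example labels are invented. Each node carries an importance/difficulty rating, and a small filter picks out the (H,H), (H,M), (M,H) combinations called out above.

import java.util.ArrayList;
import java.util.List;

/** Illustrative sketch of an ATAM utility tree node; class and field names are invented. */
public class UtilityTreeNode {
    enum Rating { HIGH, MEDIUM, LOW }

    private final String label;                  // e.g., "Performance" or "Data latency"
    private final Rating importance;              // importance for system success
    private final Rating difficulty;              // difficulty of achieving
    private final List<UtilityTreeNode> children = new ArrayList<>();

    UtilityTreeNode(String label, Rating importance, Rating difficulty) {
        this.label = label;
        this.importance = importance;
        this.difficulty = difficulty;
    }

    void addChild(UtilityTreeNode child) { children.add(child); }

    /** (H,H), (H,M) and (M,H) leaves are the most interesting to analyze further. */
    boolean isHighPriority() {
        return (importance == Rating.HIGH && difficulty != Rating.LOW)
            || (importance == Rating.MEDIUM && difficulty == Rating.HIGH);
    }

    public static void main(String[] args) {
        UtilityTreeNode performance = new UtilityTreeNode("Performance", Rating.HIGH, Rating.MEDIUM);
        UtilityTreeNode latency = new UtilityTreeNode(
                "Data latency: deliver video in real-time", Rating.HIGH, Rating.HIGH);
        performance.addChild(latency);
        System.out.println(latency.label + " high priority? " + latency.isHighPriority());
    }
}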

Slide 24: ATAM Steps (4) – Investigation and Analysis
6. Analyze Architectural Approaches
– Match quality requirements in the utility tree with architectural approaches
  How is each high-priority scenario realized?
  Quality- and approach-specific questions asked, e.g., well-known weaknesses of the approach
– Decision makers identify
  Sensitivity points
  – Property of component(s) critical in achieving a particular quality attribute response
  – E.g., security: level of confidentiality vs. number of bits in the encryption key
  Tradeoff points
  – Property that is a sensitivity point for more than one attribute
  – E.g., encryption level vs. security and performance, in particular if hard real-time guarantees are required
  Risks
  – Potentially unacceptable values of responses
  (see the sketch after this slide)
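To clarify the distinction between sensitivity points and tradeoff points, here is a tiny sketch; the class name AtamFindings and the mapping data are illustrative, not part of the ATAM material. A decision property is a sensitivity point for each quality attribute it affects, and a tradeoff point when it affects more than one, as in the encryption key example above.

import java.util.List;
import java.util.Map;

/** Illustrative sketch of ATAM analysis terms; names and data are invented. */
public class AtamFindings {
    public static void main(String[] args) {
        // Architectural property -> quality attributes whose responses are sensitive to it.
        Map<String, List<String>> sensitivities = Map.of(
                "encryption key length", List.of("security", "performance"),
                "cache size",            List.of("performance"));

        sensitivities.forEach((property, attributes) -> {
            boolean tradeoff = attributes.size() > 1;   // affects more than one attribute
            System.out.println(property + ": sensitivity point for " + attributes
                    + (tradeoff ? " -> tradeoff point" : ""));
        });
    }
}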

Slide 25: (figure slide – no transcript text)

Slide 26: ATAM Steps (6) – Testing
7. Brainstorm and Prioritize Scenarios
– Akin to QAWs
  Larger groups of stakeholders
  Place results in the utility tree
8. Analyze Architectural Approaches
9. Present Results
– Outputs
  Documented architectural approaches
  Set of scenarios and prioritizations
  Utility tree
  (Non-)risks
  Sensitivity and tradeoff points

Slide 27: ATAM and QAW – Discussion
How to ensure completeness?
– And thus, eventually, the suitability of the architecture evaluated
– Does process quality => product quality?
How do the approaches fare in an iterative setting?
– Need to rework scenarios etc. iteratively…

Slide 28: Overview – Measuring Techniques

Slide 29: Measuring Techniques
Prerequisite
– Artifacts to do measurements on…
May answer specific quality attribute scenarios
– Cf. experimental vs. exploratory prototyping
Types
– Metrics
– Simulation, prototypes, experiments
– Rate-Monotonic Analysis
– ADL-based

Slide 30: Metrics
Quantitative interpretation of observable measurements of a software architecture
– Cf. [ISO, 2001]
E.g., measuring complexity to predict modifiability
– Real-time object-oriented telecommunication systems
  Number of events reacted to
  Number of {asynchronous, synchronous} calls made
  Number of component clusters
  – Object decomposition units, e.g., a car as wheels, transmission, steering, …
  Depth of inheritance tree
  …
E.g., flow metrics to predict reliability
– Call-graph metrics
  Number of modules used by a module
  Total calls to other modules
  Unique calls to other modules
– Control-flow metrics
  Number of if-then conditional arcs
  Number of loop arcs, given the control-flow graph
(see the sketch after this slide)
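As a rough illustration of the call-graph metrics listed above (the module names and call data are invented, and the code is not tied to any particular metrics tool), the sketch below counts total and unique calls from one module to other modules.

import java.util.HashMap;
import java.util.List;
import java.util.Map;

/** Illustrative sketch of simple call-graph metrics; names and data are invented. */
public class CallGraphMetrics {
    public static void main(String[] args) {
        // Calls made by module "Billing" to other modules (one entry per call site).
        List<String> callsFromBilling = List.of("Db", "Db", "Logging", "Auth", "Db", "Auth");

        int totalCalls = callsFromBilling.size();               // total calls to other modules
        Map<String, Integer> perCallee = new HashMap<>();
        for (String callee : callsFromBilling) {
            perCallee.merge(callee, 1, Integer::sum);
        }
        int uniqueCalls = perCallee.size();                      // unique modules called

        System.out.println("Total calls to other modules:  " + totalCalls);
        System.out.println("Unique calls to other modules: " + uniqueCalls);
        System.out.println("Per callee: " + perCallee);
    }
}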

Slide 31: Simulations, Prototypes, Experiments
Architectural prototyping: [Bardram et al., 2004]
– Exploratory: finding solutions
– Experimental: evaluating solutions

Slide 32: Rate-Monotonic Analysis
Static, quantitative analysis
– Ensuring dependability in hard real-time systems
  Preemptive multitasking; execute the task with the highest priority
Basic idea
– Assign a priority to each process according to its period (rate)
  The shorter the period, the higher the priority
RMA ensures schedulability of processes
– No process misses its execution deadline
– Worst-case schedulable utilization bound: W(n) = n(2^(1/n) - 1), which tends to ln 2 ≈ 0.69 as n → ∞
– Guarantees schedulability if the concurrency model is observed in the implementation
(see the sketch after this slide)
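A minimal sketch of the schedulability test implied by the bound above (the task execution times and periods are invented for illustration): a set of n periodic tasks is schedulable under rate-monotonic priorities if its total utilization does not exceed W(n) = n(2^(1/n) - 1).

/** Illustrative rate-monotonic schedulability check; task data is invented. */
public class RmaBound {
    /** Worst-case utilization bound W(n) = n * (2^(1/n) - 1), tending to ln 2 as n grows. */
    static double bound(int n) {
        return n * (Math.pow(2.0, 1.0 / n) - 1.0);
    }

    public static void main(String[] args) {
        // Assumed task set: {execution time C, period T} in milliseconds.
        double[][] tasks = { {1, 4}, {2, 10}, {3, 20} };

        double utilization = 0.0;
        for (double[] task : tasks) {
            utilization += task[0] / task[1];    // C_i / T_i
        }
        double w = bound(tasks.length);           // W(3) ≈ 0.78

        System.out.printf("Utilization: %.3f, bound W(%d): %.3f%n",
                utilization, tasks.length, w);
        System.out.println(utilization <= w
                ? "Schedulable (sufficient test passes)"
                : "Bound exceeded; deeper (e.g. response-time) analysis needed");
    }
}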

Slide 33: Automated Tools
UML-based
– Simulation
– Semantic checks
– Code generation to ensure conformance
– …

Slide 34: Summary
A wide range of evaluation approaches exist
– Questioning techniques
– Measuring techniques
Establishing the suitability of an architecture is the main goal
– Does the architecture fulfill the quality goals?
– Is the architecture buildable within project constraints?
Utility of approaches depends on context
– Project state
– Expertise
– Tools used
– Domain
– …

