Informatics 121 Software Design I, Lecture 12
Department of Informatics, UC Irvine
Software Design and Collaboration Laboratory (SDCL), sdcl.ics.uci.edu
Duplication of course material for any commercial purpose without the explicit written permission of the professor is prohibited.

Slide 2: Discussion
There will be no discussion this Friday.

Slide 3: Today's lecture
– Design methods (interaction design)
– Design methods (architecture design)

Slide 4: Intermezzo
– Experts curtail digressions
– Experts retain their orientation
– Experts think about what they are not designing
– Experts re-assess the landscape
– Experts invest now to save later

Slide 5: Software design methods
Analysis
– Application design: competitive testing, contextual inquiry, feature comparison, stakeholder analysis, task analysis
– Interaction design: critical incident technique, interaction logging, personas, scenarios
– Architecture design: framework assessment, model-driven engineering, quality-function-deployment, reverse engineering, world modeling
– Implementation design: release planning, summarization, test-driven design, visualization
Synthesis
– Application design: affinity diagramming, concept mapping, mind mapping, morphological chart
– Interaction design: design/making, participatory design, prototyping, storyboarding
– Architecture design: architectural styles, generative programming, component reuse, decomposition
– Implementation design: pair programming, refactoring, search, software patterns
Evaluation
– Application design: requirements review, role playing, wizard of oz
– Interaction design: cognitive walkthrough, evaluative research, heuristic evaluation, think-aloud protocol
– Architecture design: formal verification, simulation, weighted objectives
– Implementation design: correctness proofs, inspections/reviews, parallel deployment, testing

Slide 6: Evaluative research
Evaluative research is the process of testing the designed interactions with real prospective users.

Slide 7: Procedure
– Choose demographics
– Define a set of representative tasks
– Choose deployment strategy
– Choose metrics
– Perform the tasks
– Analyze results

Slide 8: Example: choose demographics
– Random sample among all potential users
– Targeted selection of potential spokespeople
– Anyone willing to try
– Task experts

Slide 9: Example: define a set of representative tasks
– Find the book 'Software Designers in Action' and order it
– Find five books that are similar to this book, and choose the book that is most similar
– Post a review of this book
– Order a prescription for patient X consisting of 20 pills of penicillin, at a dosage of 250 mg, 2 pills per day
– Change the prescription for patient X to 10 pills at a dosage of 500 mg, 1 pill a day

Slide 10: Example: choose deployment strategy
– Usability lab
– Native environment
– Crowdsourcing

Slide 11: Example: choose metrics
– Time spent completing tasks
– Number of clicks per completed task
– Number of tasks completed
– Number of mistakes made
– Number of requests for help
– Eye focus where expected
– …

Slide 12: Example: perform the tasks (figure)

Slide 13: Example: perform the tasks (figure)

Slide 14: Example: analyze results
– Posting a review is the most mistake-prone activity, and also involves the greatest number of clicks
– Help was only requested when a prescription needed to be changed
– Eye focus was relatively constant across all tasks, suggesting no fundamental issues

Slide 15: Typical notation: table

                   Participant 1   Participant 2   Participant 3
Time spent         22 minutes      30 minutes      28 minutes
Number of clicks   9               7               8
Completed tasks    23              29              21
Mistakes           8               18              6
Help requests      3               7               2
Eye focus          22%             17%             31%
…                  …               …               …
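A table like the one on slide 15 can be assembled mechanically once per-participant measurements have been logged. A minimal sketch, using the slide's example numbers; the data layout and helper names are illustrative, not part of the lecture:

```python
# Sketch: tabulating usability-test metrics per participant.
# The numbers come from the example table on slide 15; everything
# else (structure, function names) is an assumption for illustration.

metrics = {
    "Time spent (min)": [22, 30, 28],
    "Clicks per task":  [9, 7, 8],
    "Completed tasks":  [23, 29, 21],
    "Mistakes":         [8, 18, 6],
    "Help requests":    [3, 7, 2],
    "Eye focus (%)":    [22, 17, 31],
}

def summarize(metrics):
    """Return the per-metric mean across participants, for quick comparison."""
    return {name: sum(vals) / len(vals) for name, vals in metrics.items()}

def print_table(metrics):
    """Print the raw values plus a mean column, one row per metric."""
    print("{:<18}{:>6}{:>6}{:>6}{:>8}".format("Metric", "P1", "P2", "P3", "Mean"))
    means = summarize(metrics)
    for name, vals in metrics.items():
        print("{:<18}{:>6}{:>6}{:>6}{:>8.1f}".format(name, *vals, means[name]))

print_table(metrics)
```

The mean column is one of many possible aggregations; medians or per-task normalization may be more appropriate depending on the chosen metrics.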

Slide 16: Criteria for successful use
– The design solution must be sufficiently complete to enable real use
– The tasks must be representative of the future use by the eventual audience
– Must have a clear sense of what the evaluative research is to accomplish

Slide 17: Strengths and weaknesses
Strengths
– Feedback from actual users
– Can reach a broad audience
– Builds a benchmark for future improvements
– Feedback might be broader than just the goal of the study
Weaknesses
– Can be difficult to conduct
– Requires appropriate resources
– Feedback may be confined to what the users already know (their existing way of working)

Slide 18: Software design methods (the overview table from slide 5, repeated unchanged)

Slide 19: Heuristic evaluation
Heuristic evaluation is the process of assessing a user interface against a standard set of criteria.

Slide 20: Procedure
– Choose heuristics
– Choose evaluators
– Identify goal of the interface
– Define a set of representative tasks
– Perform tasks
– Tabulate results

Slide 21: Example: choose heuristics
– Nielsen's heuristics of user interface design
– Gerhardt-Powals' cognitive engineering principles
– Standards: accessibility, responsive design, color guides
– Internal company guidelines

Slide 22: Example: choose evaluators
– Choose the number of evaluators (typically three to five)
– Choose the required background (ideally user interface experts, with complementary skills)

Slide 23: Example: identify goal of the interface
– The goal of the interface is to make it possible for individuals both to quickly find and order the book they need and to explore the richness of the available collection
– The goal of the interface is to allow doctors to easily order the right prescription for their patients while minimizing the number of wrong prescriptions being ordered

Slide 24: Example: define a set of representative tasks
– Find the book 'Software Designers in Action' and order it
– Find five books that are similar to this book, and choose the book that is most similar
– Post a review of this book
– Order a prescription for patient X consisting of 20 pills of penicillin, at a dosage of 250 mg, 2 pills per day
– Change the prescription for patient X to 10 pills at a dosage of 500 mg, 1 pill a day

Slide 25: Example: perform the tasks
Each evaluator performs the set of tasks on their own and, per task, rates how well the interface supports the task in terms of the heuristics.

Slide 26: Example: tabulate results (figure)

Slide 27: Typical notation: table (figure)
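Slides 26 and 27 present the tabulation only as images. As one illustration of what such a tabulation might compute, the sketch below averages severity ratings per heuristic across evaluators and tasks; the evaluator names, tasks, heuristic labels, and scores are all invented for the example:

```python
# Sketch: aggregating heuristic-evaluation results.
# Each record is a hypothetical severity rating (0 = no problem .. 4 = severe)
# given by one evaluator for one (task, heuristic) pair.

from collections import defaultdict

ratings = [
    # (evaluator, task, heuristic, severity)
    ("E1", "order book",  "visibility of system status", 1),
    ("E1", "post review", "error prevention",            3),
    ("E2", "order book",  "visibility of system status", 2),
    ("E2", "post review", "error prevention",            4),
    ("E3", "post review", "error prevention",            3),
]

def mean_severity_by_heuristic(ratings):
    """Average severity per heuristic, across all evaluators and tasks."""
    by_heuristic = defaultdict(list)
    for _evaluator, _task, heuristic, severity in ratings:
        by_heuristic[heuristic].append(severity)
    return {h: sum(s) / len(s) for h, s in by_heuristic.items()}

# Sort worst-first so the most problematic heuristics surface at the top.
for heuristic, avg in sorted(mean_severity_by_heuristic(ratings).items(),
                             key=lambda kv: -kv[1]):
    print(f"{heuristic:32} mean severity {avg:.2f}")
```

Aggregating across multiple evaluators, rather than trusting a single one, is exactly why the procedure recommends three to five of them.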

Slide 28: Criteria for successful use
– The evaluators must be well-versed in the heuristics
– Multiple evaluators must be applied
– Requires attention to detail

Slide 29: Strengths and weaknesses
Strengths
– Is based on best practices
– Useful for ironing out problems before testing with real users
– Can be performed relatively quickly
– Can be used on early design artifacts
– Sharing results improves design practice
Weaknesses
– Focuses on problems; does not identify opportunities
– Will not uncover all problems, because of the focus on a given set of heuristics

Slide 30: Variants
– Cognitive walkthrough

Slide 31: Software design methods (the overview table from slide 5, repeated unchanged)

Slide 32: Software design methods (the overview table from slide 5, with 'architectural tactics' in place of 'generative programming' among the architecture-design synthesis methods)

Slide 33: Software design methods (the same table as slide 32, repeated)

Slide 34: World modeling
World modeling is the process of precisely defining the inputs and outputs of the system to be designed.

Slide 35: Procedure
– For each task to be supported by the system, identify precisely what information flows, and how, why, and when
  – manual tasks
  – automated tasks
– Detail each information flow in a data dictionary
– Draw the world model

Slide 36: Example: for each task…, …what information…
– Weather sensors
  – what: temperature, wind speed, wind direction, moisture level
  – when: every 30 seconds
  – how: RPC request over bus
– Maintenance display
  – what: fuel levels, battery levels
  – when: every 30 seconds
  – how: TCP/IP request
– Engineer
  – what: launch command
  – when: manual
  – how: TCP/IP request

Slide 37: Example: detail each information flow in a data dictionary (figure)
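A data dictionary like the one shown on slide 37 can also be kept in machine-readable form. The sketch below encodes the information flows listed on slide 36; the InformationFlow type and its field names are assumptions made for illustration, not the notation used in the lecture:

```python
# Sketch: a machine-readable data dictionary for the world model.
# The flow entries mirror slide 36; the type itself is hypothetical.

from dataclasses import dataclass

@dataclass
class InformationFlow:
    source: str        # who or what produces the information
    what: list         # the data items carried by the flow
    when: str          # trigger or frequency
    how: str           # transport mechanism

world_model = [
    InformationFlow("weather sensors",
                    ["temperature", "wind speed", "wind direction", "moisture level"],
                    "every 30 seconds", "RPC request over bus"),
    InformationFlow("maintenance display",
                    ["fuel levels", "battery levels"],
                    "every 30 seconds", "TCP/IP request"),
    InformationFlow("engineer",
                    ["launch command"],
                    "manual", "TCP/IP request"),
]

for flow in world_model:
    print(f"{flow.source}: {', '.join(flow.what)} ({flow.when}, via {flow.how})")
```

Keeping the dictionary in a structured form makes the completeness and consistency checks mentioned on slide 43 easier to automate.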

Slide 38: Example: draw the world model (figure)

Slide 39: Example: draw the world model (figure)

Slide 40: Typical notation: ER diagram (figure)

Slide 41: Typical notation: use case diagram (figure)

Slide 42: Typical notation: system context diagram (figure)

Slide 43: Criteria for successful use
– Must have a clear understanding of the audience and the overall goals of the system
– Should verify for completeness, correctness, and consistency
– Must exhibit great attention to detail

Slide 44: Strengths and weaknesses
Strengths
– Provides a complete and precise record of the boundaries of the software to be designed
– Documents all tasks performed with the system
– Provides clear guidance for the remainder of the design project
Weaknesses
– On its own, without prior design methods, very difficult to perform
– Commits to a significant amount of detail and therefore introduces inertia with respect to change
– The impact of a piecemeal approach is unclear and can be problematic

Slide 45: Software design methods (the same table as slide 32, repeated)

Slide 46: Framework assessment
Framework assessment is the process of finding, comparing, and selecting an infrastructure upon which to build the system.

Slide 47: Procedure
– Identify desired characteristics
– Identify potential frameworks
– Assess potential frameworks

Slide 48: Example: identify desired characteristics
– General: scalability, security, ease of use, performance
– Programming model: model-view-controller, push-based / pull-based, three-tier
– Application-specific: programming language, data replication & synchronization, transaction support, form-based input
– External factors: programmers' expertise, license, cost

Slide 49: Example: identify potential frameworks
– Struts
– Django
– Ruby on Rails
– Symfony
– Spring
– JBoss
– Node.js
– Angular.js
– …
– Oracle ADF
– Swing
– JMS
– TIBCO
– Processing
– …

Slide 50: Example: assess potential frameworks (figure)

Slide 51: Typical notation: table (figure)
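One common way to fill in an assessment table like the one on slide 51 is a weighted-objectives calculation, a method that also appears in the overview table under architecture-design evaluation. In the sketch below, the criteria echo slide 48, but every weight and score is hypothetical:

```python
# Sketch: weighted-objectives comparison of candidate frameworks.
# Weights and 1..5 scores are invented for illustration; a real
# assessment would derive them from the team's own criteria.

weights = {"scalability": 0.3, "ease of use": 0.2,
           "performance": 0.3, "license/cost": 0.2}

scores = {
    "Django":        {"scalability": 4, "ease of use": 5, "performance": 3, "license/cost": 5},
    "Spring":        {"scalability": 5, "ease of use": 3, "performance": 4, "license/cost": 5},
    "Ruby on Rails": {"scalability": 3, "ease of use": 5, "performance": 3, "license/cost": 5},
}

def weighted_score(framework):
    """Sum of (criterion weight x criterion score) for one framework."""
    return sum(weights[c] * scores[framework][c] for c in weights)

ranking = sorted(scores, key=weighted_score, reverse=True)
for fw in ranking:
    print(f"{fw:15} {weighted_score(fw):.2f}")
```

The single number per framework is only a starting point for discussion; as the weaknesses slide notes, the long-term impact of a framework choice is hard to capture in a score.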

Slide 52: Typical notation: graph (figure)

Slide 53: Criteria for successful use
– Must have a clear understanding of the architectural goals of the system
– Must be able to precisely assess the criteria
– Must select the right comparison criteria

Slide 54: Strengths and weaknesses
Strengths
– Forces the detailed articulation of architectural goals
– Might identify new, unexpected opportunities
– Can lead to significant savings in cost and effort
Weaknesses
– It is often difficult to precisely understand the long-term impact of choosing a given framework
– Might well require getting hands-on experience in using some frameworks for some time
– Does not work well for legacy code

Slide 55: Software design methods (the same table as slide 32, repeated)

Slide 56: Reverse engineering
Reverse engineering is the process of extracting existing design knowledge from a code base.

Slide 57: Procedure
– Determine what design knowledge you need to obtain, and why
– Decide upon a strategy
– Execute
– Verify

Slide 58: Example: determine what design knowledge…
– What: the architecture of Hadoop. Why: to consider possible refactorings.
– What: data anonymization in the Google AdSense pipeline. Why: to make adjustments because of changes in privacy laws.

Slide 59: Example: decide upon strategy
– Look for existing documentation
– Read code and make diagrams
– Talk to other developers
– Use tools

Slide 60: Example: execute (figure)
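For the "use tools" strategy on slide 59, even a small script can recover structural facts from a code base. The sketch below is an illustration rather than a real recovery tool: it extracts top-level import dependencies from Python source using the standard-library ast module.

```python
# Sketch: recovering module dependencies from Python source with ast.
# Real reverse-engineering tools go much further (call graphs,
# clustering, architecture recovery); this only maps imports.

import ast

def imported_modules(source: str) -> set:
    """Return the top-level module names imported by a piece of source code."""
    tree = ast.parse(source)
    deps = set()
    for node in ast.walk(tree):
        if isinstance(node, ast.Import):
            deps.update(alias.name.split(".")[0] for alias in node.names)
        elif isinstance(node, ast.ImportFrom) and node.module:
            # node.module is None for relative imports like "from . import x"
            deps.add(node.module.split(".")[0])
    return deps

example = "import os\nimport json\nfrom collections import defaultdict\n"
print(sorted(imported_modules(example)))  # ['collections', 'json', 'os']
```

Run over every file in a project and aggregated into a graph, output like this is one input to the kind of architecture-recovery comparison shown on slide 61.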

Slide 61: Example: verify
Comparing the Bash "ground truth" architecture against the architectures recovered by the ACDC, ZBR, and Bunch tools (figure)

Slide 62: Typical notation: diagram (figure)

Slide 63: Criteria for successful use
– Should have access to as much information as possible
– Should look for focused, localized knowledge
– Must iterate

Slide 64: Strengths and weaknesses
Strengths
– Reveals the actual (often messy) implementation structures of the current system
– Can help re-orient the team if the new knowledge is shared
Weaknesses
– Just like the original design documentation, the new knowledge can be lost
– Typically can only recover the results of design decisions, not the decisions or deliberations behind them (the rationale)
– Can be very labor-intensive
– Results may be inaccurate

