Web Content Development Dr. Komlodi Class 25: Evaluative testing.


1 Web Content Development Dr. Komlodi Class 25: Evaluative testing

2 Web Design and Evaluation
– Information organization (Site map)
– User, content, context research (Site scope)
– Labeling and navigation design (Wireframes)
– User-system interaction design (Application flow)
– Graphics design and branding
– Content creation (Content inventory)
– Evaluate

3 The aims
– Introduction to the goals and methods of user interface evaluation
– Practice methods
– Focus on:
  – Usability evaluation
  – Expert reviews: heuristic evaluation

4 The need for evaluation
– Usable and useful user interfaces and information architectures need evaluation
– Evaluation should not be carried out by the designers themselves
– Two main types of evaluation:
  – Formative evaluation is done at different stages of development to check that the product meets users' needs.
  – Summative evaluation assesses the quality of a finished product.
– Our focus is on formative evaluation

5 Bruce Tognazzini tells you why you need to evaluate “Iterative design, with its repeating cycle of design and testing, is the only validated methodology in existence that will consistently produce successful results. If you don’t have user-testing as an integral part of your design process you are going to throw buckets of money down the drain.” See AskTog.com for topical discussion about design and evaluation.

6 Steve Krug tells you why you need to evaluate
http://www.peachpit.com/podcasts/episode.aspx?e=F6578A16-F53E-489E-93CF-2C839EA840CF

7 What to evaluate
Iterative design & evaluation is a continuous process that examines:
– Early ideas for the conceptual model
– Early prototypes of the new system
– Later, more complete prototypes
Designers need to check that they understand users' requirements.

8 What To Evaluate - Examples
– Evaluate a paper prototype: "Paper prototype usability test" (www.youtube.com)
– Evaluate a mockup: "Balsamiq Mockups Intro" (www.youtube.com)

9 When to evaluate
– Throughout design: from the first descriptions, sketches, etc. of users' needs through to the final product
– Design proceeds through iterative cycles of 'design-test-redesign'
– Evaluation is a key ingredient for a successful design.

10 Four evaluation paradigms
– 'Quick and dirty'
– Usability testing
– Field studies
– Expert reviews

11 Quick and dirty
'Quick & dirty' evaluation describes the common practice in which designers informally get feedback from users or consultants to confirm that their ideas are in line with users' needs and are liked. Quick & dirty evaluations can be done at any time. The emphasis is on fast input to the design process rather than carefully documented findings.

12 Usability testing
Usability testing involves recording typical users' performance on typical tasks in controlled settings; field observations may also be used. As the users perform these tasks, they are watched and recorded on video, and their key presses are logged. This data is used to calculate performance times, identify errors, and help explain why the users did what they did. User satisfaction questionnaires and interviews are used to elicit users' opinions.
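As an illustration of how logged data becomes performance metrics, the sketch below reduces an event log to task time and error count. The log format, timestamps, and event names are hypothetical, not from the slides:

```python
from datetime import datetime

# Hypothetical event log for one participant on one task:
# (timestamp, event_type) pairs captured during the session.
log = [
    ("2024-01-01 10:00:00", "task_start"),
    ("2024-01-01 10:00:35", "error"),
    ("2024-01-01 10:01:10", "error"),
    ("2024-01-01 10:02:20", "task_complete"),
]

def parse(ts):
    return datetime.strptime(ts, "%Y-%m-%d %H:%M:%S")

# Performance time: task_start to task_complete.
start = next(parse(t) for t, e in log if e == "task_start")
end = next(parse(t) for t, e in log if e == "task_complete")
errors = sum(1 for _, e in log if e == "error")

print(f"Task time: {(end - start).total_seconds():.0f} s, errors: {errors}")
# Prints: Task time: 140 s, errors: 2
```

A real logger would also record which UI element each event touched, so analysts can explain why the errors occurred, not just count them.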

13 Usability Testing Videos
– With adults: "UIQ Usability Test" (www.youtube.com)
– With kids: "Math Flash Game Usability Test" (www.youtube.com)

14 Evaluation: Observation methods
– Define typical user tasks
– Collect background information:
  – Demographic questionnaire
  – Skills questionnaire
– Define success metrics
– Collect performance and satisfaction data
– Do not interfere with the user
– Think aloud: prompt with "What are you thinking? What are you doing?", but ask follow-up questions on problems
– Analyze data
– Suggest improvements
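Satisfaction data is often collected with a standard instrument such as the System Usability Scale (SUS), whose published scoring rule is easy to script. The responses below are made up for illustration; SUS itself is not mentioned in the slides:

```python
# SUS: ten items rated 1-5. Odd items contribute (score - 1),
# even items contribute (5 - score); the sum is scaled by 2.5
# to yield a 0-100 score.
def sus_score(responses):
    assert len(responses) == 10, "SUS has exactly ten items"
    total = 0
    for i, r in enumerate(responses, start=1):
        total += (r - 1) if i % 2 == 1 else (5 - r)
    return total * 2.5

# Made-up responses from one participant.
print(sus_score([4, 2, 5, 1, 4, 2, 5, 1, 4, 2]))  # 85.0
```

Scores are usually averaged across participants; a mean around 68 is commonly treated as average usability.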

15 Usability Testing Exercise
Teams of three:
– Participant
– Test administrator
– Note-taker
Test the following sites:
– USMAI catalog (http://catalog.umd.edu/)
– Research Port (http://researchport.umd.edu)

16 Usability Testing Exercise: Procedure
1. Whole group: familiarize yourself with the site and try to figure out its goals and intended user group; the note-taker should take notes
2. The test administrator and note-taker should read and modify the usability evaluation script, including devising two tasks
3. Conduct the study
4. Post your notes and lessons learned about the site and the usability evaluation process


21 Next class: visit the Usability Lab

22 Field studies
– Field studies are done in natural settings.
– The aim is to understand what users do naturally and how technology impacts them.
– In product design, field studies can be used to:
  – identify opportunities for new technology
  – determine design requirements
  – decide how best to introduce new technology
  – evaluate technology in use

23 Expert reviews
– Experts apply their knowledge of typical users, often guided by heuristics, to predict usability problems.
– A key feature of predictive evaluation is that users need not be present.
– Relatively quick and inexpensive.
– Expert reviews entail half a day to one week of effort, although a lengthy training period may sometimes be required to explain the task domain or operational procedures.
– There is a variety of expert review methods to choose from:
  – Heuristic evaluation
  – Guidelines review
  – Consistency inspection
  – Cognitive walkthrough
  – Formal usability inspection

24 Expert reviews (cont.)
– Expert reviews can be scheduled at several points in the development process, when experts are available and the design team is ready for feedback.
– Different experts tend to find different problems in an interface, so 3-5 expert reviewers can be highly productive, as can complementary usability testing.
– The danger with expert reviews is that the experts may not have an adequate understanding of the task domain or user communities.
– Even experienced expert reviewers have great difficulty knowing how typical users, especially first-time users, will really behave.
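The 3-5 reviewer recommendation is often justified with Nielsen and Landauer's model: the expected proportion of usability problems found by n evaluators is 1 - (1 - λ)^n, where λ is the chance that a single evaluator finds a given problem (about 0.31 in their data). A quick sketch of that rule of thumb:

```python
def proportion_found(n, lam=0.31):
    """Expected share of usability problems found by n evaluators
    (Nielsen & Landauer model; lam is the single-evaluator hit rate)."""
    return 1 - (1 - lam) ** n

# Diminishing returns: most problems surface within ~5 evaluators.
for n in (1, 3, 5, 10):
    print(n, round(proportion_found(n), 2))
```

With λ = 0.31, five evaluators are expected to find roughly 84% of the problems, which is why small review panels are considered cost-effective.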

25 Heuristic Evaluation Example
– Information visualization tool for intrusion detection
– Project sponsored by the Department of Defense
– Review created by Enrique Stanziola and Azfar Karimullah

26 Heuristics
We developed a set of heuristics to evaluate the system effectively. We looked at the following criteria:
– Match between user tasks and the transitions provided on the interface
– Object grouping according to relatedness
– Color usage: accessibility evaluation
– Interface provides just enough information
– Speak the user's language
– User's conceptual model evaluation
– User memory load (design issues)
– Consistency evaluation
– User feedback
– Clearly marked exits
– Shortcuts
– Constructing error messages
– Error handling
– Help and documentation
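When several reviewers rate findings against a heuristics list like this, a common practice (not shown in the slides) is to record each finding with a severity on Nielsen's 0-4 scale and take the median across reviewers. The findings and reviewer names below are hypothetical:

```python
from collections import defaultdict
from statistics import median

# Hypothetical findings: (heuristic violated, reviewer, severity 0-4).
findings = [
    ("Speak the user's language", "A", 3),
    ("Speak the user's language", "B", 2),
    ("Clearly marked exits", "A", 1),
    ("Clearly marked exits", "B", 2),
    ("Clearly marked exits", "C", 2),
]

# Group severities by heuristic, then summarize with the median.
by_heuristic = defaultdict(list)
for heuristic, _, severity in findings:
    by_heuristic[heuristic].append(severity)

for heuristic, sevs in sorted(by_heuristic.items()):
    print(f"{heuristic}: median severity {median(sevs)}")
```

The median is preferred over the mean here because severity is an ordinal scale, not an interval one.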

27 Findings
The rest of this document focuses on the individual findings of each expert reviewer. We report the comments of each reviewer as they completed all the tasks.

Expert Reviewer A:
c.1. In the File menu, the user's language is not used: there is no term like "New" or "New Session" to indicate the initial step the user must take to start a session.
c.2. No help is provided.
c.3. Labels on the color bar in the graph window are too small; the font size is not consistent with the font size used in the 3D graph display.
c.4. User language: the term 'Binding' used in the menu is hard to understand. The window title 'dGUI' could also be made more meaningful.
c.5. No keyboard navigation functions are available to the user in the data configuration window.
c.6. No clue as to how to select a variable (double clicking) or how to deselect a selected variable; the dragging function is not evident to the user. Balloon help could be useful. Buttons next to the Visualization attribute list have no label.

28 Heuristic Evaluation Exercise
Using Louis Rosenfeld's IA heuristics (2004):
– Select an area of heuristics:
  – Main page
  – Search interface
  – Search results
  – Site-wide & contextual navigation
– Evaluate the UMBC library site in light of these
– Report your results to the class

29 Choose the evaluation paradigm & techniques based on:
– Goals
– Budgets
– Participants
– Time limits
– Context

30 Evaluating the 1984 OMS (Olympic Message System)
– Early tests of printed scenarios & user guides
– Early simulations of the telephone keypad
– An Olympian joined the team to provide feedback
– Interviews & demos with Olympians outside the US
– Overseas interface tests with friends and family
– Free coffee and donut tests
– Usability tests with 100 participants
– A 'try to destroy it' test
– Pre-Olympic field test at an international event
– Reliability tests of the system under heavy traffic

31 Design Example Video
Allison Druin et al.: designing with and for children
http://www.umiacs.umd.edu/~allisond/
Video:
– Juan Pablo Hourcade, Allison Druin, Lisa Sherman, Benjamin B. Bederson, Glenda Revelle, Dana Campbell, Stacey Ochs & Beth Weinstein (2002). SearchKids: A Digital Library Interface for Young Children. ACM SIGCHI 2002 Conference.
Questions:
– Who: who are the designers, evaluators, and other participants?
– What & how: what evaluation methods are they applying, and how are they using them?

