1
Trends in UCD HCDE 518 Winter 2011
Handouts: syllabus, A1, R1, group form
With credit to Jake Wobbrock, Dave Hendry, Andy Ko, Jennifer Turns, & Mark Zachry
2
Agenda
Announcements, hand in assignments
Sketching Critiques
Lecture – Analytical Evaluation
Class Exercise: Heuristic Evaluation
Break – 10 mins
Discussion of UCD readings
Break – 10 mins
P3 Demos
Class Evaluations
Group Project Work Time
3
Announcements
R7 returned today
A3 returned today
4
Sketching Critiques – Friends & Family
Break into groups of about 4 people
Take turns showing and explaining your 3 sketches to each other
Each critic should offer advice and feedback on the idea: strengths, weaknesses, originality, feasibility
Sketcher: take notes on the feedback offered
Critic: be critical, but constructive and courteous!
Each critic should sign the page after the sketches and date it with today’s date
5
Lecture – Analytical Evaluation
6
Analytical Evaluation
Heuristic Evaluation
Have usability experts go through your prototype to uncover common usability problems
Cognitive Walkthrough
Have experts analyze your prototype in detail to understand how users will make sense of it
Best for understanding novel use, not expert use
7
Heuristic Evaluation Developed by Jakob Nielsen
Helps find usability problems in a UI design
Small set (3-5) of evaluators examine the UI independently
Check for compliance with usability principles (“heuristics”)
Different evaluators will find different problems
Evaluators only communicate afterwards; findings are then aggregated
Can perform on a working UI or on sketches
8
Heuristic Evaluation Process
Evaluators go through the UI several times
Inspect various dialogue elements
Compare with list of usability principles
Consider other principles/results that come to mind
Usability principles:
Nielsen’s “heuristics”
Supplementary list of category-specific heuristics
Competitive analysis & user testing of existing products
Use violations to redesign/fix problems
9
Heuristics (Nielsen, 1994)
Visibility of system status
Match between system and the real world
User control and freedom
Consistency and standards
Error prevention
Recognition rather than recall
Flexibility and efficiency of use
Aesthetic and minimalist design
Help users recognize, diagnose, and recover from errors
Help and documentation
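For the class exercise it can help to have this list in machine-readable form. A minimal sketch in Python (illustrative only; the constant name is an assumption, not part of the course materials):

    # Nielsen's ten usability heuristics, in the order listed above.
    NIELSEN_HEURISTICS_1994 = (
        "Visibility of system status",
        "Match between system and the real world",
        "User control and freedom",
        "Consistency and standards",
        "Error prevention",
        "Recognition rather than recall",
        "Flexibility and efficiency of use",
        "Aesthetic and minimalist design",
        "Help users recognize, diagnose, and recover from errors",
        "Help and documentation",
    )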
10
Phases of Heuristic Evaluation
1) Pre-evaluation training: give evaluators needed domain knowledge & information on the scenario
2) Evaluation: each individual evaluates the UI & makes a list of problems
3) Severity rating: determine how severe each problem is
4) Aggregation: group meets & aggregates problems (w/ ratings)
5) Debriefing: discuss the outcome with the design team
11
How to Perform Evaluation
At least two passes for each evaluator (3-5 people)
First to get a feel for the flow and scope of the system
Second to focus on specific elements
If the system is walk-up-and-use or evaluators are domain experts, no assistance needed
Otherwise, might supply evaluators with scenarios
Each evaluator produces a list of problems
Explain why, with reference to a heuristic or other information
Be specific & list each problem separately
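A minimal sketch of how each evaluator's problem list could be recorded so that every finding is specific, listed separately, and tied to a heuristic (Python; the field names are illustrative assumptions, not the course's evaluation form):

    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class Finding:
        evaluator: str                    # who found it; evaluators work independently until aggregation
        location: str                     # where in the UI the problem occurs (be specific)
        description: str                  # one problem per finding
        heuristic: str                    # which heuristic (or other evidence) it violates
        suggested_fix: Optional[str] = None
        severity: Optional[int] = None    # 0-4, assigned later and independently by each judge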
12
Example Errors from Evaluators
Can’t copy info from one window to another
Violates “Minimize the users’ memory load” (H3)
Fix: allow copying
Typography uses different fonts in 3 dialog boxes
Violates “Consistency and standards” (H4)
Slows users down
Probably wouldn’t be found by user testing
Fix: pick a single format for the entire interface
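Using the hypothetical Finding record sketched earlier, these two example errors might be logged like this (illustrative only):

    example_findings = [
        Finding(
            evaluator="Evaluator 1",
            location="between windows",
            description="Can't copy info from one window to another",
            heuristic="Minimize the users' memory load (H3)",
            suggested_fix="Allow copying",
        ),
        Finding(
            evaluator="Evaluator 1",
            location="3 dialog boxes",
            description="Typography uses different fonts in 3 dialog boxes",
            heuristic="Consistency and standards (H4)",
            suggested_fix="Pick a single format for the entire interface",
        ),
    ]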
13
Severity Rating
Used to allocate resources to fix problems
Estimates the need for more usability efforts
Combination of frequency, impact, persistence (one time or repeating)
Should be calculated after all evals. are in
Should be done independently by all judges
14
Severity Ratings (cont.)
0 - don’t agree that this is a usability problem
1 - cosmetic problem
2 - minor usability problem
3 - major usability problem; important to fix
4 - usability catastrophe; imperative to fix
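Because each judge rates every problem independently, the ratings have to be combined before deciding what to fix first. A minimal sketch, assuming the team averages the 0-4 ratings per problem and sorts worst-first (averaging is a common convention, not something the slides mandate):

    from statistics import mean

    def rank_problems(ratings_by_problem):
        """Average each problem's independent 0-4 ratings and sort the worst problems first."""
        averaged = {problem: mean(ratings) for problem, ratings in ratings_by_problem.items()}
        return sorted(averaged.items(), key=lambda item: item[1], reverse=True)

    # Example: three judges rate two problems independently.
    print(rank_problems({
        '"Save" vs. "Write file" wording': [3, 3, 4],
        "Inconsistent dialog fonts": [2, 1, 2],
    }))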
15
Debriefing
Conduct with evaluators, observers, and development team members
Discuss general characteristics of the UI
Suggest potential improvements to address major usability problems
Dev. team rates how hard things are to fix
Make it a brainstorming session; little criticism until end of session
16
Severity Ratings Example
1. [H4 Consistency] [Severity 3] The interface used the string "Save" on the first screen for saving the user's file, but used the string "Write file" on the second screen. Users may be confused by this different terminology for the same function.
17
HE vs. User Testing
HE is much faster: 1-2 hours per evaluator vs. days-weeks
HE doesn’t require interpreting users’ actions
User testing is far more accurate (by def.); takes into account actual users and tasks
HE may miss problems & find “false positives”
Good to alternate between HE & user testing: they find different problems, and you don’t waste participants
18
Class Activity: Heuristic Evaluation
Electronic voting machine
Download prototype:
Download form:
Use the form and Nielsen’s 1994 heuristics to evaluate the voting interface
19
Break – 10 minutes
20
Trends in UCD discussion
21
Readings
Spinuzzi, C. (2005). The methodology of participatory design. Technical Communication, 52(2), 163–174.
Sears, A. and Jacko, J. (2008). Future trends in human-computer interaction. In The Human-Computer Interaction Handbook, A. Sears and J.A. Jacko (eds.). Mahwah, NJ: Lawrence Erlbaum.
Vredenburg, K., Mao, J.Y., Smith, P.W., and Carey, T. (2002). A survey of user-centered design practice. In Proc. CHI '02.
Mao, J.Y., Vredenburg, K., Smith, P.W., and Carey, T. (2005). The state of user-centered design practice. Communications of the ACM, 48(3).
Norman, D.A. (2005). Human-centered design considered harmful. interactions, 12(4).
OPTIONAL: Hendry, D.G. (2008). Public participation in proprietary software development through user roles and discourse. International Journal of Human-Computer Studies, 66(7).
22
Participatory Design
What is it? Why should you do it?
What advantages? What disadvantages?
23
Activity Centered Design vs. HCD
Define Activity Centered Design
Example? Thoughts?
24
Trends in UCD
How does this relate to your own experiences?
Is it still up to date?
25
Sears & Jacko
Six questions posed to 5 members of the HCI community
What are HCI’s 3 grand challenges?
What are the three most important relevant results from the last 10 years?
What are the exciting emerging domains?
What are the most innovative changes in the next 5 years?
What do educators need to change?
What is the future?
26
Grand Challenges
Carroll: Organizational issues, ubicomp, end-user programming, collaboration
Ogawa: Integration of telecom & broadcast, HCI for mobile appliances, communication tools (“cyberspace”)
Rau: Make HCI profitable, new methodologies, impact user experience (e.g., “killer apps”)
Salvendy: Science base for HCI, comprehensive education program, push the needed technology
Stephanidis: Universal access, HCI theories and methodologies, digitization of HCI practices
Kientz: Scaling novel computing technologies, personalizing technologies in meaningful ways, supporting activities and long-term goals
27
Important Results
Carroll: Interactive information visualization, collaboration via the web, powerful information retrieval tools
Ogawa: Universal designs, portable devices, dispatching individual information (e.g., blogs and homepages)
Rau: Website usability, UIs for handheld devices, cellphones & mp3 players
Salvendy: Concepts, metaphors, and tools; visualization, adaptive interfaces
Stephanidis: User-centered approach to design, computer accessibility, user interface personalization
Kientz: Usable mobile devices and always-on internet (e.g., iPhone), sensing activities of human behavior, shift to engaging user experiences rather than goal-oriented tasks
28
Exciting Emerging Domains
Carroll: Security and privacy, universal accessibility, applications (e.g., healthcare), affect
Ogawa: Portable devices for the elderly, search functions
Rau: Emotional design, computer games, smart environments, cross-cultural designs, fun
Salvendy: Nanotechnology, different cultures, system science
Stephanidis: Services, multimodal interaction, cooperation, access to information, robots
Kientz: Healthcare (especially preventive health and public health), games with a purpose, ubiquitous computing
29
Innovative Changes of the Next 5 Years
Carroll: Cell phones, agents
Ogawa: Agents/robots
Rau: Wearable & ubiquitous computing
Salvendy: Disappearing computer, miniaturized computing systems, intelligent interfaces
Stephanidis: Mobile interaction, home environment, biometrics
Kientz: Personalization of computing, activity-based computing
30
Visions of the Future
Where will human-computer interaction be in 10 years? 25 years? 50 years?
31
Apple’s Knowledge Navigator
32
Microsoft Labs’ Visions of the Future
Productivity:
Manufacturing:
Health Care:
Retail:
Banking:
Home:
33
Minority Report Vision
34
Class Activity: Envisioning the future
In small groups, come up with YOUR answers to three of the questions posed by Sears & Jacko:
What are HCD’s grand challenges?
What are the exciting emerging domains?
What are the innovative changes of the next 5 years?
Spend 10 minutes, then we’ll share
35
Next Class: Tuesday, March 1st
Final Project Presentations
Overview of Class (in prep for final take-home exam)
Due Next Week:
P4 Final Presentations
Sketching Reflection
36
P3 Demos
37
Order (random!):
Healthy Eating
Teleworkers
Health Bridge
Daily Errands
Urban Gardeners
38
Course Evaluations & Group Project Meeting Time