Cognitive Walkthrough
More evaluation with experts

Review
Discount evaluation techniques
– Use experts to predict usability problems
– Likely to be cheaper and faster, with fewer ethical issues
Heuristic evaluation
– Experts identify potential problems in a design or prototype based on a set of guiding heuristics

Cognitive Walkthrough
Assess learnability and usability by simulating the way users explore and become familiar with an interactive system
A usability "thought experiment"
Like a code walkthrough in software engineering
From Polson, Lewis, et al. at the University of Colorado Boulder

Cognitive Walkthrough
Qualitative
Predictive
Uses experts to examine learnability and novice behavior

CW: Process
Construct carefully designed tasks from the system spec or screen mock-ups
Walk through the (cognitive and operational) activities required to go from one screen to another
Review the actions needed for the task and attempt to predict how users would behave and what problems they will encounter

CW: Assumptions
The user has a rough plan
The user explores the system, looking for actions that will contribute to performing the task
The user selects the action that seems best for the desired goal
The user interprets the response and assesses whether progress has been made toward completing the task

CW: Requirements
Description of the users and their backgrounds
Description of the task the user is to perform
Complete list of the actions required to complete the task
Prototype or description of the system

CW: Methodology
Step through the action sequence
– Action 1 – Response A, B, ...
– Action 2 – Response A
– ...
For each action/response pair, ask the four questions and try to construct a believable story
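Although a cognitive walkthrough is a paper exercise, the per-action bookkeeping it calls for can be captured in a simple record structure. The Python sketch below is only an illustration of that bookkeeping; the names (ActionStep, WalkthroughRecord, problem_steps) are hypothetical and not part of the walkthrough literature.

```python
from dataclasses import dataclass, field
from typing import List, Optional

# The four questions asked for every action (see "CW: Questions" below).
CW_QUESTIONS = (
    "Will users be trying to produce the effect this action has?",
    "Will users notice that the correct action is available?",
    "Will users know it is the right action for the desired effect?",
    "Will users understand the feedback after the action?",
)

@dataclass
class ActionStep:
    """One action in the walkthrough, with its expected system response."""
    action: str
    response: str
    answers: List[bool] = field(default_factory=list)  # one yes/no per question above
    failure_story: Optional[str] = None                # filled in when an answer is "no"

@dataclass
class WalkthroughRecord:
    """A single evaluator's walkthrough of one task."""
    task: str
    user_profile: str
    steps: List[ActionStep] = field(default_factory=list)

    def problem_steps(self) -> List[ActionStep]:
        # A step is a potential problem if any of the four questions got a "no".
        return [s for s in self.steps if s.answers and not all(s.answers)]
```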

CW: Questions
1. Will users be trying to produce whatever effect the action has?
2. Will users be able to notice that the correct action is available? (Is it visible?)
3. Once found, will they know it is the right one for the desired effect? (Is it correct?)
4. Will users understand the feedback after the action?

CW: Answering the Questions
1. Will the user be trying to produce the effect?
Typical supporting evidence:
– It is part of their original task
– They have experience using the system
– The system tells them to do it
No evidence? Construct a failure scenario; explain and back up your opinion

CW: Next Question
2. Will the user notice that the action is available?
Typical supporting evidence:
– Experience
– A visible device, such as a button
– A perceivable representation of an action, such as a menu item

CW: Next Question
3. Will the user know it is the right one for the effect?
Typical supporting evidence:
– Experience
– The interface provides a visual item (such as a prompt) connecting the action to its effect
– All other actions look wrong

CW: Next Question
4. Will the user understand the feedback?
Typical supporting evidence:
– Experience
– The user recognizes a connection between the system response and what they were trying to do

Let’s practice: My Internet Radio

User characteristics
Technology-savvy users
Familiar with computers
Understand the Internet radio concept
Just joined and downloaded this radio application

Task: add a station to presets
Click genre
Scroll the list and choose a genre
Assuming the station is on the first page, add the station to presets: right-click on the station and choose "Add to Presets" from the popup menu
Click OK on Presets
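Continuing the hypothetical record structure sketched earlier, the preset task could be written down as data before the walkthrough starts. The action strings come from the list above; the expected responses are placeholders, since the prototype's actual behaviour is not shown here.

```python
# Continues the earlier sketch (ActionStep, WalkthroughRecord are the hypothetical
# classes defined there); the responses are assumed placeholders, not the real UI text.
radio_task = WalkthroughRecord(
    task="Add a station to presets",
    user_profile="Tech-savvy, familiar with computers and Internet radio, new to this application",
    steps=[
        ActionStep("Click genre", "List of genres is shown"),
        ActionStep("Scroll list and choose genre", "Stations for that genre are listed"),
        ActionStep("Right-click station, choose 'Add to Presets'", "Popup menu appears; station is added"),
        ActionStep("Click OK on Presets", "Preset list is confirmed"),
    ],
)
```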

Action: Click genre
1. Will users be trying to produce whatever effect the action has?
2. Will users be able to notice that the correct action is available?
3. Once found, will they know it is the right one for the desired effect?
4. Will users understand the feedback after the action?

Action: Scroll list and choose genre
1. Will users be trying to produce whatever effect the action has?
2. Will users be able to notice that the correct action is available?
3. Once found, will they know it is the right one for the desired effect?
4. Will users understand the feedback after the action?

Action: Right-click on station and choose "Add to Presets"
1. Will users be trying to produce whatever effect the action has?
2. Will users be able to notice that the correct action is available?
3. Once found, will they know it is the right one for the desired effect?
4. Will users understand the feedback after the action?

Action: Click OK
1. Will users be trying to produce whatever effect the action has?
2. Will users be able to notice that the correct action is available?
3. Once found, will they know it is the right one for the desired effect?
4. Will users understand the feedback after the action?

Problems
Did I pick the right task?
Did I list out the right sequence of actions?

CW Summary
Advantages
– Explores an important characteristic: learnability
– Takes a novice perspective
– Detailed, careful examination
– A working prototype is not necessary
Disadvantages
– Can be time-consuming
– May find problems that aren't really problems
– Narrow focus; may not evaluate the entire interface

Your turn
Library: finding a book
What are our tasks? What are the actions?

CW: Questions
1. Will users be trying to produce whatever effect the action has?
2. Will users be able to notice that the correct action is available? (Is it visible?)
3. Once found, will they know it is the right one for the desired effect? (Is it correct?)
4. Will users understand the feedback after the action?

CW: Responsibilities
The design team creates the prototype and the description of user characteristics
The design team chooses the tasks and lists out every action and response
Experts answer the four questions for every action/response
The design team gathers the responses and feedback
The design team determines how to modify the design
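A minimal sketch of the "gather responses" step, building on the earlier hypothetical WalkthroughRecord sketch: assuming each expert fills in one record per task, the design team can merge them and see which actions any evaluator flagged. The function name gather_feedback is invented for this illustration.

```python
from collections import defaultdict
from typing import Dict, List

def gather_feedback(records: List["WalkthroughRecord"]) -> Dict[str, List[str]]:
    """Collect failure stories from all experts, keyed by the action they concern."""
    problems: Dict[str, List[str]] = defaultdict(list)
    for record in records:
        for step in record.problem_steps():
            if step.failure_story:
                problems[step.action].append(step.failure_story)
    return dict(problems)

# Usage (hypothetical): expert_records is the list of filled-in WalkthroughRecords.
# for action, stories in gather_feedback(expert_records).items():
#     print(action, "->", stories)
```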