Imran Hussain University of Management and Technology (UMT)

Virtual University: Human-Computer Interaction
Lecture 41: Asking Users and Experts

In Last Lecture …
- How to observe users
- Field studies
- Usability testing
- How to collect data while observing users

In Today’s Lecture …
- Users and experts
- Interviews
- Questionnaires
- Inspections
- Walkthroughs

Types of Interviews
- Unstructured interviews
- Structured interviews
- Semi-structured interviews

Types of Interviews
Interviews may be conducted to ask users about certain aspects of an application.
- Unstructured: not directed by a script. Rich but not replicable.
- Structured: tightly scripted, often like a questionnaire. Replicable but may lack richness.
- Semi-structured: guided by a script, but interesting issues can be explored in more depth. Can provide a good balance between richness and replicability.

Things to avoid when preparing interview questions
- Long questions
- Compound sentences: avoid them by splitting them in two
- Jargon and language that the interviewee may not understand
- Leading questions that make assumptions, e.g., "Why do you like …?"
- Unconscious biases, e.g., gender stereotypes

The interview process
- Dress in a similar way to the interviewees if possible; if in doubt, dress neatly and avoid standing out.
- Prepare a consent form and ask the interviewee to sign it.
- If you are recording the interview, which is advisable, make sure the equipment works in advance and that you know how to use it.
- Record answers exactly; do not make any cosmetic adjustments, or correct or change answers in any way.

Preparing for an unstructured interview
- Have an interview agenda that supports the study goals and questions.
- Be prepared to follow new lines of inquiry that contribute to your agenda.
- Pay attention to ethical issues.
- Work on gaining acceptance and putting the interviewees at ease.
- Respond with sympathy if appropriate, but be careful not to put ideas into the heads of respondents.
- Always indicate to the interviewee the beginning and the end of the interview session.
- Start to order and analyze your data as soon as possible after the interview.

Probing
Start with some preplanned questions and then probe the interviewee to say more. For example:
- Which web sites do you visit most frequently?
- Why do you like this web site?
- Tell me more about web site X.
- Anything else?

You can also make use of probes and prompts.

Group interviews
Also known as 'focus groups'.

Questionnaires
- Make the questions clear and specific.
- When possible, ask closed questions and offer a range of answers.
- Consider including a "no opinion" option for questions that seek opinions.
- Think about the ordering of questions.
- Avoid complex multiple questions.
- When scales are used, make sure the range is appropriate, and that the ordering of the scale is intuitive and consistent.
- Avoid jargon, and consider whether you need different versions of the questionnaire for different populations.
- Provide clear instructions on how to complete the questionnaire.

Questionnaire style
Questionnaires can have various styles and formats. Formats can include:
- checkboxes
- ranges
- Likert rating scales
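The closed formats above lend themselves to straightforward analysis. As a minimal sketch (all names here are illustrative, not from any particular survey tool), a Likert item can be represented as a prompt, a fixed scale, and a list of recorded responses:

```python
# Minimal sketch of a closed Likert-scale questionnaire item.
# Class and field names are illustrative assumptions.
from dataclasses import dataclass, field
from statistics import mean

@dataclass
class LikertItem:
    prompt: str
    scale: tuple = (1, 2, 3, 4, 5)   # 1 = strongly disagree ... 5 = strongly agree
    responses: list = field(default_factory=list)

    def record(self, answer: int) -> None:
        # Reject answers outside the fixed scale, keeping the data clean.
        if answer not in self.scale:
            raise ValueError("answer outside the Likert scale")
        self.responses.append(answer)

    def average(self) -> float:
        return mean(self.responses)

item = LikertItem("The website is easy to use.")
for answer in (4, 5, 3, 4):
    item.record(answer)
print(item.average())   # 4.0
```

Because the answers are constrained to a fixed scale, aggregation (here a simple mean) can be done automatically, which is one reason closed questions are easier to analyze than open ones.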

Encouraging a good response
- Ensure the questionnaire is well designed.
- Provide a short overview section.
- Include a stamped, self-addressed envelope for its return.
- Explain why you need the questionnaire to be completed, and assure anonymity.
- Contact respondents through follow-up letters, phone calls, or emails.
- Offer incentives such as payments.

Advantages of online questionnaires
- Responses are usually received quickly.
- Copying and postage costs are lower than for paper surveys.
- Data can be collected in a database for analysis.
- Time required for data analysis is reduced.
- Errors in questionnaire design can be corrected easily.
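The database point is worth making concrete. A minimal sketch, using Python's built-in sqlite3 module (table and column names are illustrative assumptions): online responses can be inserted directly into a table, and summary statistics computed in SQL rather than by hand.

```python
# Sketch: collecting questionnaire responses in a database for analysis.
# Table and column names are made up for illustration.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE responses (respondent TEXT, question TEXT, answer INTEGER)")

# Each row: who answered, which question, and the (closed) answer given.
rows = [("r1", "q1", 4), ("r1", "q2", 5), ("r2", "q1", 3)]
conn.executemany("INSERT INTO responses VALUES (?, ?, ?)", rows)

# Average answer per question, computed by the database itself.
for question, avg in conn.execute(
        "SELECT question, AVG(answer) FROM responses GROUP BY question"):
    print(question, avg)
```

This is the sense in which online collection reduces analysis time: the data arrive already structured, so no transcription step is needed before analysis.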

Nielsen’s heuristics
- Visibility of system status
- Match between system and the real world
- User control and freedom
- Consistency and standards
- Help users recognize, diagnose, and recover from errors
- Error prevention
- Recognition rather than recall
- Flexibility and efficiency of use
- Aesthetic and minimalist design
- Help and documentation
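In practice, evaluators record each problem they find against one of these heuristics, together with a severity rating. A hypothetical sketch (the findings and names are invented for illustration; the 0-4 severity scale follows Nielsen's commonly used rating, 0 = not a problem, 4 = usability catastrophe):

```python
# Sketch: logging heuristic-evaluation findings with severity ratings.
# All findings here are invented examples.
from collections import Counter

findings = [
    {"heuristic": "Visibility of system status",
     "problem": "No progress indicator during upload", "severity": 3},
    {"heuristic": "Error prevention",
     "problem": "Delete has no confirmation step", "severity": 4},
    {"heuristic": "Consistency and standards",
     "problem": "Two different icons used for 'save'", "severity": 2},
]

# How many problems were logged against each heuristic.
per_heuristic = Counter(f["heuristic"] for f in findings)

# Severe problems (severity 3 or above), worst first, to prioritize fixes.
severe = sorted((f for f in findings if f["severity"] >= 3),
                key=lambda f: -f["severity"])

print(per_heuristic)
print([f["problem"] for f in severe])
```

Sorting by severity gives the design team a prioritized fix list, which is the usual output of a heuristic evaluation session.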

Web heuristics
The following heuristics are more useful for evaluating commercial websites:
- Does the web site have high-quality content?
- Is the web site updated often?
- Does the web site offer minimal download time?
- Does the web site ensure ease of use?
- Is the web site relevant to users' needs?

Discount evaluation
Heuristic evaluation is referred to as discount evaluation when about five evaluators are used. Empirical evidence suggests that, on average, five evaluators identify 75-80% of usability problems.
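The diminishing return from adding evaluators is often modeled with the Nielsen-Landauer estimate, found(i) = 1 - (1 - lam)^i, where lam is the probability that a single evaluator finds a given problem. A sketch (the value of lam below is an assumed illustrative figure, not a measured one):

```python
# Sketch of the Nielsen-Landauer estimate of problems found vs. evaluators.
# lam (probability one evaluator finds a given problem) is an assumed value.
def proportion_found(evaluators: int, lam: float = 0.26) -> float:
    # Each evaluator independently misses a problem with probability (1 - lam),
    # so i evaluators all miss it with probability (1 - lam) ** i.
    return 1 - (1 - lam) ** evaluators

for i in (1, 3, 5, 10):
    print(i, round(proportion_found(i), 2))
```

With lam around 0.26, five evaluators find roughly three quarters of the problems, consistent with the 75-80% figure above, while doubling to ten evaluators adds comparatively little; this is why a small panel is considered a good "discount".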

Cognitive walkthroughs
Cognitive walkthroughs involve simulating a user's problem-solving process at each step in the human-computer dialog, checking to see whether the user's goals and memory for actions can be assumed to lead to the next correct action.

Steps of a cognitive walkthrough
1. The characteristics of typical users are identified and documented, and sample tasks are developed that focus on the aspects of the design to be evaluated.
2. The designer and one or more expert evaluators then come together to do the analysis.
3. The evaluators walk through the action sequences for each task, placing each within the context of a typical scenario.
4. As the walkthrough is being done, a record of critical information is compiled:
   - Assumptions about what would cause problems, and why, are recorded.
   - Notes about side issues and design changes are made.
   - A summary of the results is compiled.
5. The design is revised to fix the problems presented.


The 3 questions
- Will the correct action be sufficiently evident to the user?
- Will the user notice that the correct action is available?
- Will the user associate and interpret the response from the action correctly?
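During the walkthrough, the analysts answer these three questions at every step of the action sequence; a "no" anywhere flags a potential usability problem. A hypothetical sketch (the task, steps, and yes/no answers are invented for illustration):

```python
# Sketch: applying the three walkthrough questions to each step of an
# action sequence. Steps and answers below are invented examples.
QUESTIONS = (
    "Will the correct action be sufficiently evident to the user?",
    "Will the user notice that the correct action is available?",
    "Will the user associate and interpret the response correctly?",
)

# One entry per step: the action, plus a yes/no answer per question.
steps = [
    ("Open the booking form", (True, True, True)),
    ("Select a return date",  (True, False, True)),   # control easy to miss
    ("Confirm the booking",   (True, True, True)),
]

# Flag any step where at least one question was answered "no",
# keeping the failing question(s) as the record of the problem.
problem_steps = [
    (action, [q for q, ok in zip(QUESTIONS, answers) if not ok])
    for action, answers in steps
    if not all(answers)
]
for action, failed in problem_steps:
    print(action, "->", failed)
```

The output is exactly the kind of critical-information record the steps above call for: which step fails, and which of the three questions it fails on.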

Pluralistic walkthrough
A pluralistic walkthrough is conducted by following this sequence of steps:
1. Scenarios are developed in the form of a series of hard-copy screens representing a single path through the interface.
2. The scenarios are presented to a panel of evaluators, and the panel is asked to write down the sequence of actions they would take to move from one screen to another.
3. When the evaluators have written down their actions, the panelists discuss the actions they suggested for that round of the review. Usually the representative users go first, so that they are not influenced by the other panel members and are not deterred from speaking. The usability experts then present their findings, and finally the designers offer their comments.
4. The panel then moves on to the next round of screens. This process continues until all the scenarios have been evaluated.