
1 Alan Woolrych alan.woolrych@sunderland.ac.uk

2 My Background
Currently – Research & Liaison Officer (DMN)
From 1st January 2003, Usability Researcher with NITRO – North East IT Reach Out Project
MPhil on Assessment of Usability Inspection Methods (currently part-time PhD)
Co-author, International Handbook on HCI, UIM chapter
Usability consultancy and collaboration with a variety of local and international companies: Leighton Internet, domainnames, IBM
Publications in international conferences and journals
HEFCE capital project management: usability lab (x2), multimedia lab (x2)

3 This Week
Evaluation Methods (…or Inspection Methods)
My Research
Evaluation Exercise

4 Poor Usability
What is ‘bad usability’?
Something ‘happens’ that you don’t understand
Something happens outside of your control
Why does it happen? Lots of reasons! Misunderstanding the user, out of context…
Leads to: frustration, anger, confusion etc.
When it happens, what do you do?

5 Strange But True…
Cannot delete tmp150_3.tmp: There is not enough free disk space. Delete one or more files to free disk space, and then try again.
Error: Keyboard not found. Press F1 to continue.
Error 0000: No errors found, restarting computer.
Windows has found an unknown device and is installing a driver for it.

6 Approaches to Evaluation
Analytical: deduction, inference, constructing arguments based on inspection of web sites
Empirical: factual, evidence gathered from real usage by real people

7 Inspection Methods
Heuristic Evaluation
Heuristic Walkthrough
Expert Inspection
Cognitive Walkthrough – novice users, learning the site for the first time; 4 questions
And many more…

8 Cognitive Walkthrough
Novice users, learning the site for the first time
4 questions:
1. Will the user be trying to achieve the right effect?
2. Will the user notice that the correct action is available?
3. Will the user associate the correct action with the desired effect?
4. If the correct action is performed, will the user see that progress is being made?
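The four questions are normally asked of every action step in a task. As a minimal sketch of how walkthrough answers could be recorded (the interface, field names, and example step are assumptions, not part of the lecture material), in TypeScript:

```typescript
// One record per action step in the walked-through task.
interface WalkthroughStep {
  action: string;            // what the user must do, e.g. "select a destination station"
  rightEffect: boolean;      // Q1: will the user be trying to achieve the right effect?
  actionVisible: boolean;    // Q2: will the user notice the correct action is available?
  actionAssociated: boolean; // Q3: will the user associate the action with the desired effect?
  progressVisible: boolean;  // Q4: will the user see that progress is being made?
  notes?: string;            // evidence behind the answers
}

// Any "no" answer marks a likely learning problem for a first-time user.
function problemSteps(steps: WalkthroughStep[]): WalkthroughStep[] {
  return steps.filter(
    (s) => !(s.rightEffect && s.actionVisible && s.actionAssociated && s.progressVisible)
  );
}
```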

9 Heuristic Evaluation 1: Visibility of system status (Nielsen)
The system should always keep users informed about what is going on, through appropriate feedback within reasonable time.
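As a small illustration of the feedback idea (the helper, the callback, and the 300 ms threshold are assumptions, not from the slide), a long-running operation can show a busy indicator whenever the response is not near-instant:

```typescript
// Show a busy indicator if an operation takes longer than a short
// threshold, so the user always knows the system is still working.
async function withStatus<T>(
  operation: Promise<T>,
  showBusy: (on: boolean) => void
): Promise<T> {
  const timer = setTimeout(() => showBusy(true), 300); // assumed threshold
  try {
    return await operation;
  } finally {
    clearTimeout(timer);
    showBusy(false);
  }
}
```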


11 Heuristic Evaluation 2: Match between system and the real world
The system should speak the users' language, with words, phrases and concepts familiar to the user, rather than system-oriented terms. Follow real-world conventions, making information appear in a natural and logical order.

12 Heuristic Evaluation 3: User control and freedom
Users often choose system functions by mistake and need a clearly marked "emergency exit" to leave unwanted states without having to go through an extended dialogue. Support undo and redo.
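One way to support this heuristic in code (an illustrative sketch only, not something prescribed in the lecture) is to keep explicit undo and redo stacks so that every state change has an exit:

```typescript
// Minimal undo/redo support: each change pushes the previous state onto
// an undo stack, so the user always has a way back out of a mistake.
class History<T> {
  private undoStack: T[] = [];
  private redoStack: T[] = [];

  constructor(private current: T) {}

  apply(next: T): void {
    this.undoStack.push(this.current);
    this.current = next;
    this.redoStack = []; // a fresh action invalidates the redo history
  }

  undo(): T {
    if (this.undoStack.length > 0) {
      this.redoStack.push(this.current);
      this.current = this.undoStack.pop() as T;
    }
    return this.current;
  }

  redo(): T {
    if (this.redoStack.length > 0) {
      this.undoStack.push(this.current);
      this.current = this.redoStack.pop() as T;
    }
    return this.current;
  }

  value(): T {
    return this.current;
  }
}
```

A text field's contents, a shopping basket, or a form's state could all be wrapped in a history like this.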

13 Heuristic Evaluation 4: Consistency and standards
Users should not have to wonder whether different words, situations, or actions mean the same thing. Follow platform conventions.

14 Heuristic Evaluation 5: Error prevention
Even better than good error messages is a careful design which prevents a problem from occurring in the first place.
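A common way to apply this heuristic (a sketch with invented field names and messages, not part of the exercise material) is to validate input before submission so the error never occurs:

```typescript
// Prevent the error rather than report it afterwards: the form cannot
// be submitted until the travel date parses and is not in the past.
function validateTravelDate(input: string): string | null {
  if (!/^\d{4}-\d{2}-\d{2}$/.test(input) || Number.isNaN(Date.parse(input))) {
    return "Please enter the travel date as YYYY-MM-DD.";
  }
  const today = new Date().toISOString().slice(0, 10);
  if (input < today) {
    return "The travel date must be today or later.";
  }
  return null; // null means no problem, so submission can go ahead
}
```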

15 Heuristic Evaluation 6: Recognition rather than recall
Make objects, actions, and options visible. The user should not have to remember information from one part of the dialogue to another. Instructions for use of the system should be visible or easily retrievable whenever appropriate.

16 Heuristic Evaluation 7: Flexibility and efficiency of use
Accelerators - unseen by novices - may speed up interaction for experts, so that systems can cater to both inexperienced and experienced users. Let users tailor frequent actions.

17 Heuristic Evaluation 8: Aesthetic and minimalist design
Dialogues should not contain information that is irrelevant or rarely needed. Every extra unit of information in a dialogue competes with relevant units of information and diminishes their relative visibility.

18 Heuristic Evaluation 9: Help users recognise, diagnose, and recover from errors
Express error messages in plain language (no codes), precisely indicate the problem, and constructively suggest a solution.
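As an illustration of the difference (the error codes and wording below are invented for the example), internal failure codes can be mapped to messages that state the problem in plain language and suggest a recovery:

```typescript
// Map internal failure codes to messages that say what went wrong in
// plain language and suggest what the user can do about it.
const friendlyMessages: Record<string, string> = {
  ERR_DISK_FULL:
    "The file could not be saved because the disk is full. " +
    "Free some space or choose another location, then try again.",
  ERR_NET_TIMEOUT:
    "The page took too long to load. Check your connection and retry.",
};

function reportError(code: string): string {
  // Fall back to a generic but still constructive message.
  return (
    friendlyMessages[code] ??
    "Something went wrong. Please try again, or contact support if it keeps happening."
  );
}
```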

19 Heuristic Evaluation 10: Help and documentation
Even though systems are best used without documentation, it may be necessary to provide help. Any such help should be easy to search, focused on the user's tasks, list concrete steps to be carried out, and not be too large.

20 Empirical Testing
Paper prototype testing
Lab testing
Field testing
Remote observation and instrumentation
Site feedback


23 My Research… and yours!
To accurately assess UIMs, in particular Heuristic Evaluation
Why? Inspection methods are unreliable! (but would be great if they weren't)
Previous research took a quantitative approach: HE can find 75% of usability problems in an interface (Nielsen and others…)
Fundamentally flawed! How can we know what 100% is? What if the most serious usability problems are in the undiscovered 25%?

24 My Research
Heuristic evaluations of a set of tasks produce predicted problems; user testing on the same task sets produces actual problems. Compare the two to assess the quality of the predictions.
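One concrete way to make that comparison (the thoroughness and validity metrics come from the wider UEM-assessment literature rather than this slide, and the problem identifiers are invented) is to match predicted problems against those observed in user testing:

```typescript
// Thoroughness: how many of the real problems the inspection predicted.
// Validity: how many of the predictions turned out to be real problems.
function assessPredictions(predicted: Set<string>, actual: Set<string>) {
  const hits = [...predicted].filter((p) => actual.has(p));
  return {
    thoroughness: actual.size === 0 ? 0 : hits.length / actual.size,
    validity: predicted.size === 0 ? 0 : hits.length / predicted.size,
  };
}

// Invented example: two of three predictions were confirmed by testing,
// and one real problem (the broken map link) was missed entirely.
const predicted = new Set(["unclear-fare-table", "hidden-search", "no-undo"]);
const actual = new Set(["unclear-fare-table", "no-undo", "broken-map-link"]);
console.log(assessPredictions(predicted, actual)); // thoroughness ≈ 0.67, validity ≈ 0.67
```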

25 My Research

26 Your Work
Be usability experts
Inspect a web site
Produce a usability problem report
In groups or individually

27 Evaluation Exercise
You will be supplied with:
Lecture…
HE training manual (available on module website)
Problem report template (available on module website)
Problem report template guide (available on module website)

28 Recommendations
Read the training manual thoroughly
You are welcome to consult other material if you wish
Read and understand the requirements of the problem report format
Perform the heuristic evaluation

29 Problem Reporting 1
Section 1 – Problem Description
Brief description
Specific likely/actual difficulties
Specific context
Assumed causes

30 Problem Reporting 2
Section 2 – Discovery Method
Individual/group testing
Adopted method: scanning, searching, goal playing, method following
Explain

31 Problem Reporting 3
Section 3 – Heuristic Application
Heuristic breached
Evidence of non-conformance (important!)
Confirmation rationale

32 Problem Reporting 4
Section 4 – Exclusion Rationale
Elimination discussion
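Taken together, the four report sections (slides 29–32) could be captured in a single record; the sketch below only paraphrases the template's headings as field names and is not the template's actual format:

```typescript
// One usability problem report, mirroring the four template sections.
interface ProblemReport {
  // Section 1 - Problem Description
  briefDescription: string;
  likelyOrActualDifficulties: string;
  specificContext: string;
  assumedCauses: string;
  // Section 2 - Discovery Method
  testing: "individual" | "group";
  adoptedMethod: "scanning" | "searching" | "goal playing" | "method following";
  discoveryExplanation: string;
  // Section 3 - Heuristic Application
  heuristicBreached: string;
  evidenceOfNonConformance: string;
  confirmationRationale: string;
  // Section 4 - Exclusion Rationale
  eliminationDiscussion: string;
}
```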

33 URL For Exercise http://www.tyneandwearmetro.co.uk/

