Understanding and Evaluating the User Experience with Information Spaces
Andrew Dillon, HCI Lab, Indiana University


2 Why does user experience matter?
"The improvements in performance gained through usable interface design are 3 or 4 times larger than those gained through designing better search algorithms."
Sue Dumais, Microsoft, invited presentation to IU's Computer Science Horizon Day, March 2000.

3 Why do we need to test users?
Bailey (1993) asked 81 designers to assess 4 interfaces for users like themselves:

Interface | Designer rating | Actual performance
A         | 4               | 1
B         | 3               | 2
C         | 1               | 3
D         | 2               | 4

NB: 95% of designers selected an interface other than the one they performed best on.

4 So what to test?
[Slide diagram: the interaction basics of User, Task, Tool, and Context]

5 Basic user tendencies:
- Users don't estimate their own performance well
- Users change over time
- Users are impatient
- Users see things in their own way
- Users seek to minimize cognitive effort

6 Traditional approach: usability engineering
Usability can be defined:
- Semantically
- Featurally
- Operationally

7 So what is usability?
Semantic definitions:
- 'user-friendliness'?
- 'ease-of-use'?
- 'ease-of-learning'?
- 'transparency'?
These tend toward circularity and provide little value to design practice. However, the term captures something that people recognize as important.

8 Usability as a collection of features
Is an interface usable if it:
- Has links, a search engine, a nav bar, a back button?
- Uses a graphical user interface (GUI)?
- Is based on style-guide recommendations?
- Meets Nielsen's or Shneiderman's principles of design?

9 The attribution fallacy
- The attribution fallacy treats usability as a quality of an interface, determined by the presence or absence of specific interface features.
- This attribution leads to an over-reliance on guidelines and prescriptive rules for design.

10 Experience requires more than features
- Users' experience is contextually determined by their needs, their tasks, their history, and their location.
- Understanding this, and knowing how to evaluate experience, is the primary purpose of this talk.

11 Operational definition
Usability (of an application) refers to the effectiveness, efficiency, and satisfaction with which specified users can achieve specified goals in particular environments.
ISO Ergonomic requirements, ISO 9241 part 11: Guidance on usability specification and measures.
Useful but overlooked, and still not the full story…

12 Effectiveness
The extent to which users can achieve their task goals. Effectiveness measures the degree of accuracy and/or completion.
E.g., if the desired task goal is to locate information on a web site, then:
Effectiveness = success of the user in locating the correct data

13 Effectiveness can be a scale or an absolute value
- If the outcome is ALL or NOTHING, then effectiveness is an absolute value: the user either locates the info or does not.
- If the outcome can be graded (the user can be partially right), then effectiveness should be measured on a scale: as a %, or a score from 1 (poor) to 5 (complete).
- The scale should be determined by the evaluator in conjunction with developers and users.
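The absolute-versus-graded distinction above can be sketched in code. This is a minimal illustration, not part of the original talk; the function names and the fraction-of-items grading rubric are my own assumptions about how one might operationalize the measure.

```python
def effectiveness_absolute(found_correct_info: bool) -> int:
    """All-or-nothing outcome: 1 if the user located the info, else 0."""
    return 1 if found_correct_info else 0


def effectiveness_graded(items_found: int, items_required: int) -> float:
    """Graded outcome: fraction of the required items the user located."""
    if items_required <= 0:
        raise ValueError("items_required must be positive")
    return items_found / items_required


# A user who located 3 of the 4 required pieces of information
# scores 0.75 on the graded measure.
print(effectiveness_graded(3, 4))
```

Which function applies is exactly the judgment call the slide describes: it should be agreed between the evaluator, the developers, and the users before testing begins.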

14 Quality?
Some tasks do not have a definitive correct answer:
- Creative production (writing, design)
- Information retrieval
- Data analysis
- Management
- Making a purchase…
Effectiveness alone misses something…

15 Efficiency
- Measures the resources used to perform a task, i.e., time, effort, cost
- In the case of web site use, efficiency might equal the time taken to complete a task, the navigation path followed, etc.

16 Efficiency of using a redesigned web site
- Time taken to complete a task, compared across tasks, across users, or against a benchmark score
- Number of steps taken
- Number of deviations from the ideal path
Such variables are frequently highly positively correlated, but they needn't be.

17 Efficiency in path analysis
[Slide diagram: an ideal navigation path of 3 steps]

18 Efficiency in path analysis
[Slide diagram: actual vs. ideal user navigation, 7 steps against 3]
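The path-analysis comparison above can be sketched as a small function that summarizes an actual click path against the ideal one. The page names and the exact summary fields are illustrative assumptions, not the talk's instrument; deviations are counted here simply as visits to pages outside the ideal path.

```python
def path_efficiency(actual: list, ideal: list) -> dict:
    """Summarize a user's navigation efficiency relative to an ideal path."""
    deviations = sum(1 for page in actual if page not in ideal)
    return {
        "actual_steps": len(actual),
        "ideal_steps": len(ideal),
        "deviations": deviations,
        # Ratio of ideal to actual steps: 1.0 means the ideal path was taken,
        # smaller values mean more wandering.
        "efficiency": len(ideal) / len(actual) if actual else 0.0,
    }


# Mirroring the 7:3 example above: a 3-step ideal path
# and a 7-step actual path with two off-path detours.
ideal = ["home", "teaching", "courses"]
actual = ["home", "research", "home", "about", "home", "teaching", "courses"]
print(path_efficiency(actual, ideal))
```

As the later slides caution, such a ratio should be interpreted carefully: not every deviation from the ideal path is inefficient from the user's own point of view.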

19 But is it efficiency that users want?
- The push for efficiency is symptomatic of an engineering-oriented approach
- Who determines efficiency?
- Are path deviations always inefficient?
- Is time weighted equally by user, designer, and owner?
- Suggests a need for negotiation beyond typical usability tests

20 Satisfaction
- Measures the affective reaction (likes, dislikes, attitudinal response) of users to the application
- Assumed to be influenced by, but not the same as, effectiveness or efficiency. E.g.:
- Two applications with equal effectiveness and efficiency may not be equally satisfying to use
- What users like might not be what they need!

21 Basis for satisfaction?
- Positively influenced by effectiveness and efficiency
- Also:
  - Personal experience with other technologies?
  - Working style?
  - Manner of introduction?
  - Personality of the user?
  - Aesthetics of the product?

22 Satisfaction is important
- Good usability studies recognize this
But satisfaction is not enough…
- People often like what they don't use well
- What about empowerment, challenge, etc.?

23 Beyond usability: P-O-A
User experience can be thought of at three levels:
- Process
- Outcome
- Affect
A full evaluation needs to cover these bases.

24 Experiencing IT at 3 levels:
- What the user does
- What the user attains
- How the user feels

25 Process: what the user does
- Navigation paths taken
- Use of the back button or links
- Use of menus, help, etc.
- Focus of attention
The emphasis is on tracking the user's moves and attention through the information space.

26 Outcome: what the user attains
- What constitutes the end of the interaction?
- Purchase made?
- Details submitted?
- Information located?
The emphasis is on observing what it means for a user to feel accomplishment or closure.

27 Affect: how the user feels
Beyond satisfaction, we need to know if the user feels:
- Empowered?
- Annoyed, frustrated?
- Enriched?
- Unsure or wary?
- Confident?
- Willing to come back?
The emphasis is on identifying what the interaction means for the user.

28 User experience = behavior + result + emotion
[Slide diagram: Behavior, Result, and Emotion as the three components of user experience]

29 Interesting 'new' measures of UE
- Aesthetics
- Perceived usability
- Cognitive effort
- Perception of information shapes
- Acceptance level
- Self-efficacy
UE proposes a range of measures not normally associated with usability testing.

30 Aesthetics and user performance (Dillon and Black, 2000)
- Took 7 interface designs with known user performance data
- Asked 15 similar users to rate the 'aesthetics' and 'likely usability' of each alternative design
- Compared ratings with performance data

31 Rankings of 7 interfaces
[Slide charts: two ranking correlations, R = .85 and R = .83]
Correlation between aesthetics and performance = 0

32 Follow-up study:
- 30 users
- Rated the aesthetics and likely usability, then used 4 web search interfaces
- Rated aesthetics and usability again
- No correlation with performance!

33 So what?
- Users respond to interface beauty
- Users do not predict their own performance (process and outcome) accurately
- Designers cannot usefully predict user response through introspection, theory, or asking their colleagues!

34 Time matters…
[Slide chart: error scores for regular users of software, plotted over trial days]
So design stops being important?

35 NO… it remains important…

36 So what?
- User experience is dynamic; most evaluations miss this
- User data is the best indicator of interaction quality… REPEAT THIS TO SELF DAILY!
- To be valid and reliable, user data must reflect all aspects of the user experience: P-O-A
- The targets are moving… user experience is growing daily in web environments

37 Genres in information space
- Users have expectations of information spaces
- Documents have genres
- E-business is "shopping"
- A website is a website is a website…
- Expectations activate mental models, which drive what users see and how they interpret it

38 What does a home page look like? (Dillon and Gushrowski, 2000)
- We analyzed a sample of 100 home pages for features
- Then tested 8 candidate pages, manipulating the most common or uncommon features of existing pages
- New users were asked to rate which pages they thought were 'best'
- A significant positive correlation resulted

40 Correlation between features and user ranking: r = 0.95, d.f. = 6, p < .01
[Slide chart: each page number reflects its feature score; the plotted number reflects the user ranking]
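The statistic above is a Pearson correlation over 8 paired scores (hence d.f. = 6). As a rough sketch of how such a figure is computed, the function below implements the standard product-moment formula; the data values are invented for illustration, since the study's actual scores are not reproduced in this transcript.

```python
import math


def pearson_r(xs, ys):
    """Pearson product-moment correlation between two equal-length samples."""
    n = len(xs)
    if n != len(ys) or n < 2:
        raise ValueError("need two equal-length samples of size >= 2")
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)


# Invented example: a feature-conformance score for each of 8 candidate
# pages vs. its mean user ranking (higher = rated better).
feature_scores = [10, 9, 8, 7, 5, 4, 3, 1]
user_rankings = [8, 7, 6, 6, 4, 3, 2, 1]
print(round(pearson_r(feature_scores, user_rankings), 2))
```

With n = 8 pairs, the degrees of freedom for the significance test are n - 2 = 6, matching the d.f. reported on the slide.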

41 Implications
- Expectations for digital information spaces are forming quickly
- Violation of expectancy impacts initial user ratings
- Full report available online

42 Maximizing your evaluations:
- Measure the 3 aspects of UE: Process, Outcome, and Affect
- Design user tests that capture multiple sources of data: protocols, screen capture, attitude, speed, free-form answers
- Don't rely on gurus or guidelines!
- A little data goes a long way!

43 Example web site protocol
1.32: "What do I choose here?... looks like there is no direct link... and I don't like the colors here, too" (user guesses; negative comments)
(SELECTS TEACHING): "'Teaching and courses' sounds right"
(SCREEN CHANGES): "Oh, this is all about missions and stuff... hang on..."
(HITS BACK BUTTON)
1.48: "Well... that looks the best of these, you know." (navigation strategy)

44 Biggest user complaints in our lab
- Poor content
- Slow loading
- Poor aesthetics
- Unclear menu options (menus with example sub-items are much preferred and lead to more efficient use)
- Too much clicking and "forced" navigation
- No site map
- Poor search facilities
