
Slide 1 – Toolkit Support for Usability Evaluation
05-830, Spring 2013 – Karin Tsai

Slide 2 – Overview
– Motivation
– Definitions
– Background from Literature
– Examples of Modern Tools

Slide 3 – Motivation
– To improve or validate usability
– Comparison between products, A/B tests, etc.
– Measuring progress
– Verifying adherence to guidelines or standards
– Discovering features of human cognition

Slide 4 – Usability Attributes
– Learnability: easy to learn
– Efficiency: efficient to use
– Memorability: easy to remember how to use
– Errors: low error rate; easy to recover
– Satisfaction: pleasant to use and likable

Slide 5 – Evaluation Categories
Predictive
– psychological modeling techniques
– design reviews
Observational
– observations of users interacting with the system
Participative
– questionnaires
– interviews
– "think aloud" user testing

Slide 6 – Challenges and Tradeoffs
Quality vs. Quantity
– "Quality" here means abstraction, interpretability, etc.
– User testing: high quality, low quantity
– Counting mouse clicks: low quality, high quantity (a minimal sketch of this tradeoff follows this slide)
Observing Context
Abstraction
– Event reporting in applications places a burden on developers
– It complicates software evolution
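The quality-vs-quantity tradeoff is easy to see in code. Below is a minimal sketch in Python: a raw click logger that can be attached once and forgotten, versus developer-defined semantic events that are interpretable but must be hand-written and maintained at every call site. All function and event names here are hypothetical.

```python
import json
import time

def log_raw_click(x: int, y: int) -> None:
    """High quantity, low quality: trivial to collect globally,
    but hard to interpret later (what did the user mean?)."""
    print(json.dumps({"t": time.time(), "event": "click", "x": x, "y": y}))

def log_semantic_event(name: str, **properties) -> None:
    """High quality, lower quantity: interpretable, but every call site
    is code a developer must write and keep in sync as the UI evolves."""
    print(json.dumps({"t": time.time(), "event": name, **properties}))

log_raw_click(412, 88)                                # cheap, ambiguous
log_semantic_event("checkout_started", cart_items=3)  # clear, maintained by hand
```

The second logger is exactly the "event reporting places a burden on developers" problem: each semantic event is one more piece of code that has to evolve with the application.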

Slide 7 – CogTool
Evaluation Type: Predictive
Description: Uses a predictive human-performance model (a "cognitive crash dummy") to evaluate designs.
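CogTool itself is driven through a GUI, but the flavor of prediction it automates is that of the Keystroke-Level Model (KLM). As a rough illustration, here is a minimal KLM estimator in Python using the classic Card, Moran & Newell operator times; the example task breakdown is an assumption for illustration, not CogTool output.

```python
# Classic KLM operator times (Card, Moran & Newell), in seconds.
KLM_OPERATORS = {
    "K": 0.28,  # keystroke (average typist)
    "P": 1.10,  # point with the mouse
    "B": 0.10,  # mouse button press or release
    "H": 0.40,  # home hands between mouse and keyboard
    "M": 1.35,  # mental preparation
}

def klm_estimate(sequence: str) -> float:
    """Predicted expert task time: the sum of operator times for a
    task written as a string of operator codes."""
    return sum(KLM_OPERATORS[op] for op in sequence)

# Hypothetical task: point at a search box (M, P), click it (B, B:
# press and release), home to the keyboard (H), then type a
# 5-character query plus Enter (6 K).
print(f"{klm_estimate('MPBBH' + 'K' * 6):.2f} s")  # 4.73 s
```

A tool like CogTool runs this kind of arithmetic against a storyboard of the actual UI, which is what makes it "instantly accessible" without participants or a working system.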

Slide 8 – CogTool: Pros and Cons
Pros:
– Free
– Good for getting a baseline evaluation of prototypes
– Instantly accessible (not limited by participant availability or completion of the system's functionality)
– Neat concept; gives insight into human cognition
Cons:
– Limited realism
– Quite confusing at first (extremely steep learning curve)
– Documentation is "daunting"
– Limited usefulness
Overall Score: 6.5/10

Slide 9 – Mixpanel
Evaluation Type: Observational
Description: Aggregates developer-defined event data in useful ways.

Slide 10 – Mixpanel: Pros and Cons
Pros:
– Very powerful built-in analysis tools
– Good API for automated scripting
– Scalable
– Flexible to fit the needs of developers
Cons:
– Steep learning curve
– Expensive
– Application events = developer burden/maintainability issues (see the sketch after this slide)
– Rate-limited (one request at a time)
Overall Score: 9.5/10
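The "application events = developer burden" con is concrete: every interaction worth measuring becomes an explicit tracking call in application code. A minimal sketch using the official mixpanel Python library (pip install mixpanel); the token, user id, event name, and properties are placeholders.

```python
from mixpanel import Mixpanel

mp = Mixpanel("YOUR_PROJECT_TOKEN")  # placeholder project token

# One developer-defined event: Mixpanel aggregates these server-side,
# but each call site like this must track UI changes by hand.
mp.track("user-123", "Search Performed", {
    "query_length": 14,
    "results_shown": 20,
    "source": "navbar",
})
```

Calls like this are what feed Mixpanel's analysis tools; they are also exactly the maintainability cost noted above, since renaming a feature or redesigning a flow means revisiting every tracking call it touches.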

Slide 11 – Chartbeat
Evaluation Type: Observational
Description: Real-time data visualization.

Slide 12 – Chartbeat: Pros and Cons
Pros:
– Data is real-time
– Captures data that is hard to obtain via events (reading, writing, idling, active time, referrals, social integration, etc.)
– Great for site monitoring
– Really awesome visualization
– Easy to use
Cons:
– Does not scale well (financially) with huge sites
– Limited in the data it captures (you have to "hack" it if you want event-like data)
– Only records "page-level" interactions
– Limited historical data access
– Not built for usability evaluation
Overall Score: 7/10

Slide 13 – User Testing
Evaluation Type: Participative
Description: Watch a user complete a task on your system while thinking aloud.

Slide 14 – User Testing: Pros and Cons
Pros:
– Probably the best method for catching usability issues
– Most thorough recording of user interaction with the system
– "Think aloud" yields insights not attainable from interaction data alone
– Can observe certain demographics without requesting personal information in the system itself
Cons:
– Small sample size (hit or miss)
– Not easily scalable (expensive)
– Limited user availability
– Sometimes, it's painful to watch…
Overall Score: 8.5/10

Slide 15 – Questions?

