An Introduction to Treejack

Presentation transcript:

An Introduction to Treejack Out on a limb with your IA Dave O’Brien Optimal Usability

Welcome Dave O’Brien, Optimal Usability, Wellington, New Zealand 22 Jan 2010, 36 attendees (USA, CA, UK, NZ, AU, BR, CO)

Agenda Quickie Treejack tour What is tree testing? Planning a tree test Setting up Treejack Running a test High-level results Detailed results Lessons learned (Q&A throughout)

Poll Have you used Treejack yet? No, haven’t tried it yet = 20% Yes, but only a practice test = 60% Yes, have run a "real" test = 20%

Tree testing - the 5-minute tour Creating a medium or large website Does your top-down structure make sense?

Does your structure work? Can users find particular items in the tree? Can they find them directly, without having to backtrack? Can they choose between topics quickly, without having to think too much? Which parts of your tree work well? Which fall down? The Krug Test – “Don’t make me think!”

Create a site tree

Write some tasks

Put this into Treejack

Invite participants

Participants do the test Browsing down Backing up Skipping

You see the results

Live demo for participants* https://wiki.optimalworkshop.com/treejack/survey/foodshopping1 Q: Can you control font size and color? Q: It would be great if all the options had an “I’d pick this” button – when it’s on just one, it feels leading. Q: When you get to the deepest level, if all things are even, have the “I’d pick this” button on every row.

What is tree testing, really? Testing a site structure for Findability Labeling

What’s it good for? Improving organisation of your site Improving top-down navigation Improving your structure’s terminology (labels) Comparing structures (before/after, or A vs. B) Isolating the structure itself Getting user data early (before site is built) Making it cheap & quick to try out ideas Quick to get a test up and running Quick/easy for users: 10 min for 10 tasks Analysis is straightforward Can test baselines and revisions quickly By isolating the site structure - by removing other variables at this early stage of design - we can more clearly see how the tree itself performs, and revise until we have a solid structure. We can then move on in the design process with confidence. It’s like unit-testing a site’s organisation and labeling. Or as my colleague Sam Ng says, “Think of it as analytics for a website you haven’t built yet.”

What it’s NOT NOT testing other navigation routes NOT testing page layout NOT testing visual design NOT a substitute for full user testing NOT a replacement for card sorting

Origin Paper tree testing “card-based classification” – Donna Spencer Show lists of topics on index cards In person, score manually, analyse in Excel http://www.boxesandarrows.com/view/card_based_classification_evaluation

Make it faster & easier Create a web tool for remote testing Quick for a designer to learn and use Simple for participants to do the test Able to handle a large sample of users Able to present clear results Quick turnaround for iterating

But I already do card sorting! Open card sorting is generative Suggests how your users mentally group content Helps you create new structures Closed card sorting – almost, but not quite Tree testing is evaluative Tests a given site structure Shows you where the structure is strong & weak Lets you compare alternative structures While closed card sorting mimics how users may file a particular item of content (e.g. where they might store a new document in a document-management system), it doesn’t necessarily model how users find information in a site. They don’t start with a document – they start with a task, just as they do in a usability test. What we wanted was a technique that more closely simulates how users browse sites when looking for something specific. Yes, closed card sorting was better than nothing, but it just didn’t feel like the right approach.

A useful IA approach Run a baseline tree test (existing structure) What works? What doesn’t? Run an open card sort on the content How do your users classify things? Come up with some new structures Run tree tests on them (same tasks) Compare to each other Compare to the baseline results

Planning a tree test Stakeholder interview Find out who, what, when, etc. fill in "planning questions" template Get the tree(s) in digital format use Excel tree-import template, etc. Show “planning questions” template – a canned set of questions specific to tree testing

Getting the tree Import a digital format Or enter in Treejack Excel Text file Word Or enter in Treejack Show Excel template – multiple trees on worksheets
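The indented-text import mentioned above can be pictured as a tiny parser. A minimal sketch, assuming one topic per line with one tab per nesting level; `parse_tree` and the sample outline are illustrative, and the real Treejack/Excel import format may differ:

```python
# Minimal sketch: parse an indented text outline (one topic per line,
# one tab per nesting level) into a nested dict tree. The real Treejack
# import format may differ; parse_tree and the outline are illustrative.

def parse_tree(text: str) -> dict:
    root = {"label": "Top", "children": []}
    stack = [root]  # stack[d] is the most recent node at depth d
    for line in text.splitlines():
        if not line.strip():
            continue
        depth = len(line) - len(line.lstrip("\t"))
        node = {"label": line.strip(), "children": []}
        stack = stack[: depth + 1]       # climb back up to the parent level
        stack[-1]["children"].append(node)
        stack.append(node)
    return root

outline = "Products\n\tBikes\n\tFishing\nSupport\nContact Us"
tree = parse_tree(outline)
print([c["label"] for c in tree["children"]])  # → ['Products', 'Support', 'Contact Us']
```

Editing and reviewing the tree in a spreadsheet first, then importing it in one go, matches the lesson-learned on the next slide.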

Poll How big are your trees? Small (less than 50 items) = 25% Medium (50 - 150 items) = 39% Large (150 - 250 items) = 22% Huge (more than 250 items) = 14%

Tree tips Recommend <1000 items Bigger? Cut it down by: Using top N levels (e.g. 3 or 4) Testing subtrees separately* Pruning branches that are unlikely to be visited Remove “helper” topics e.g. Search, Site Map, Help, Contact Us Watch for implicit topics! *which assumes that users can navigate the top level easily – a big assumption Helper topics - If we leave them in, it makes it too easy for users to choose them as alternatives to browsing the tree, and we don’t learn as much. Q: We are testing some IAs which need supporting text for the links - not ideal, but are there any ways to factor this sort of support into a Treejack evaluation? (If not, that’s fine.)

Implicit topics Create your tree based on the content, not just the page structure. [Diagram: a page tree – Home > Products / Support / Contact Us – where the Contact Us page merely lists North America, South America, and Europe in its body text, versus a content tree where North America, South America, and Europe appear as subtopics under Contact Us.] Beware content-management systems that spit out page trees. They need careful review!

User groups and tasks Identify your user groups Draft representative tasks for each group Tasks must be “real” for those users! ~10 tasks per participant Beware the learning effect Small tree ~8, large tree ~12 More tasks? Limit per participant Randomise the task order Example – Shimano website with sections for bike parts and fishing equipment. Q: What’s the upper limit on # of tasks? Q: Since you can only have 1 user test 10 or so paths, can you combine data sets so you can get a bigger sampling? So by running 100 people through test A, 100 through test B, etc..... I can combine results from A-Z tests to see a picture of something with 100's of paths

Drafting tasks What parts of the tree do you want to test? Coverage should reflect importance Each task must: Be specific Be clearly worded Use the customer’s language Be concise Beware “give-away” words! Review now, preview before the real test Show Excel template – tasks and coverage Customer’s language – “How would you upgrade a fixed asset, such as your laptop?” Q: Can task instructions include images (e.g. “Where would you find this?” with a picture of a sofa)? In ecommerce, it’s much better to show an image, since the text description for a product can be misleading, ambiguous, or biasing. A: If you add images, please keep in mind that some may be administering tests in a couple of languages, so images should not contain text (or should be available in a couple of languages) --- sorry, us Canadian government folk have to do things in English and French! Q: With the instruction “I smell garlic bread”, this influenced my search because I thought it was an in-store bakery item; was this what you intended? Does this point to the importance of instruction phrasing?

Setting up a Treejack project Creating a Treejack project Entering your tree Entering the tasks and answers Less on mechanics, more on tips

Creating a project New vs. Duplicate Survey name vs. address Identification The “Other” option Passing an argument in the URL https://demo.optimalworkshop.com/treejack/survey/test1?i=12345
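Passing an identifier in the URL, as in the `?i=12345` example above, is easy to script when generating invitations. A minimal sketch using the demo address from the slide; `invite_url` is a hypothetical helper, not part of Treejack:

```python
from urllib.parse import urlencode

# Sketch: build per-participant survey links by passing an identifier in
# the URL, as in the ?i=12345 example. invite_url is a hypothetical
# helper; the base URL is the demo address from the slide.

BASE = "https://demo.optimalworkshop.com/treejack/survey/test1"

def invite_url(participant_id: str) -> str:
    return f"{BASE}?{urlencode({'i': participant_id})}"

print(invite_url("12345"))
# → https://demo.optimalworkshop.com/treejack/survey/test1?i=12345
```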

Entering your tree Paste from Excel, Word, text file, etc. “Top” – how to replace Randomising Not the same as randomising tasks Changing the tree after entering answers Lesson learned: Edit/review/finalise the tree elsewhere before putting it into Treejack

Entering tasks and answers Preview is surprisingly useful Multiple correct answers The “main” answer is usually not enough Check the entire tree yourself Must choose bottom-level topics Workaround: Mark all subtopics correct Workaround: Remove the subtopics Choose answers LAST

Task options Randomising tasks – almost always Limiting the # of tasks 20-30 tasks = 10 per participant Increase the # of participants to get enough results per task Skip limit Eliminate users who didn’t really try Defaults to 50% Q: does it 'randomize' or 'rotate' the order of the tasks (i.e. will the tasks be evenly distributed across the sample of participants)?
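The "limit the number of tasks, randomise the order" options above amount to sampling and shuffling. An illustrative sketch, not Treejack's implementation (and it does not settle the randomise-vs-rotate question asked above); `assign_tasks` and the task pool are assumptions:

```python
import random

# Illustrative sketch (not Treejack's implementation): give each
# participant a random subset of the task pool, in random order, to
# dampen the learning effect.

def assign_tasks(all_tasks, per_participant=10, rng=random):
    subset = (list(all_tasks) if per_participant >= len(all_tasks)
              else rng.sample(all_tasks, per_participant))
    rng.shuffle(subset)  # randomise presentation order
    return subset

pool = [f"task-{i}" for i in range(25)]  # e.g. 25 tasks, 10 per participant
print(assign_tasks(pool, per_participant=10))
```

Note that plain random sampling does not guarantee even coverage across participants; a rotation scheme (cycling through shuffled copies of the pool) would, which is why the question matters.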

Testing the test Not previewing/piloting is just plain dumb Spot mistakes before launch Preview the entire test yourself Pilot it with stakeholders and sample users Launch it, get feedback, duplicate, revise Look for: Task wording (unclear, ambiguous, typos) Unexpected “correct” answers Misc. problems (e.g. instructions)

Poll How many participants do you get per test? 1 – 20 = 44% 21 – 40 = 20% 41 – 100 = 24% Over 100 = 12%

Running the tree test Invite participants Website-page invitations Email invitations Recommend >30 users per user group/test Monitor early results for problems Low # of surveys started Email invitation not clear? Subject = spam? Not engaging? Low completion rate Email didn’t set expectations? Test too long? Too hard? Generally less taxing than card sorting Q: Is there any benefit in testing 100 people rather than 5-7? Traditional user testing doesn’t need statistical significance in sample size, so what benefit does this add?

Skimming high-level results 10/100/1000 level of detail Middling overall score Often many highs with a few lows Inspect tasks with low scores (low total or low sub-scores) Inspect the pie charts Show sample results Q: Is there any way to export these high-level views into Excel as well? They can be useful in presentations when reporting back results.

Success % who chose a correct answer (directly or indirectly) low Success score check the spreadsheet to see where they went wrong Destinations tab Path tab

Directness % of successful users who did not backtrack Coming soon: making this independent of success low Directness score check the spreadsheet for patterns in their wandering Paths tab
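The Success and Directness definitions above reduce to simple ratios. A minimal sketch over illustrative result rows; the `(chosen_topic, backtracked)` shape is an assumption, not Treejack's export format:

```python
# Sketch of the Success and Directness scores as defined on these slides:
# Success = % of participants who chose a correct answer; Directness = %
# of successful participants who never backtracked. The row shape
# (chosen_topic, backtracked) is illustrative, not Treejack's export.

def success_and_directness(results, correct):
    successes = [(c, b) for c, b in results if c in correct]
    success = len(successes) / len(results)
    direct = (sum(1 for _, b in successes if not b) / len(successes)
              if successes else 0.0)
    return success, direct

rows = [("A", False), ("A", True), ("B", False), ("A", False)]
print(success_and_directness(rows, correct={"A"}))  # success = 0.75, directness = 2/3
```

This also shows why Directness is currently entangled with Success: it is computed over successful participants only, which is what the "coming soon" note addresses.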

Speed % who completed this task at about the same speed as their other tasks % who completed task within 2 standard deviations of their average task time for all tasks 70% Speed score 7/10 users went their “normal” speed 3/10 users took substantially longer than normal for them Low Speed score indicates that user hesitated when making choices e.g. choices are not clear or not mutually distinguishable Wish: add the raw times to the spreadsheet, so you can do your own crunching as needed. Overall score uses a grid to combine these scores in a semi-intelligent fashion
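The 2-standard-deviation rule above is easy to reproduce once you have raw times. A minimal sketch, assuming per-user timing data (the data shapes are illustrative; as the wish above notes, the spreadsheet does not currently export raw times):

```python
import statistics

# Sketch of the Speed score described on this slide: the share of
# participants who completed this task within 2 standard deviations of
# their own average task time. Data shapes are illustrative, not
# Treejack's export format.

def speed_score(task_times, times_by_user):
    """task_times: {user: seconds spent on this task}
       times_by_user: {user: [seconds for each of their tasks]}"""
    normal = 0
    for user, t in task_times.items():
        times = times_by_user[user]
        mean = statistics.mean(times)
        sd = statistics.pstdev(times)
        if t <= mean + 2 * sd:
            normal += 1
    return normal / len(task_times)

task_x = {"u1": 5, "u2": 20}
history = {"u1": [5, 5, 5], "u2": [1] * 9 + [20]}
print(speed_score(task_x, history))  # → 0.5 (u2 took far longer than their norm)
```

Comparing each user against their own average, rather than a global average, is what makes the score robust to naturally slow or fast participants.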

Detailed results – destinations Where did people end up? # who chose a given topic as the answer Wrong answers High totals - problem with that topic (perhaps in relation to its siblings) Clusters of totals – problem with the parent level Ignore outliers For >30 sessions, ignore topics that get <3 clicks. Show sample results from food shopping – Destinations tab

Detailed results – destinations Look for high “indirect success” rates (>20%) Check paths for patterns of wandering Look for high “failure” rates (>25%) Check the wrong answers above Look for high skip rates (> 10%) Check paths for where they bailed out. Look for "evil attractors" Topics that get clicks across several seemingly unrelated tasks. Usually a vague term that needs tightening up Q: Need to split out the values: How many skipped before even clicking a level; how many quit after truly despairing? Evil attractors - We saw this happen to a consumer-review site that had a “Personal” category; they meant personal-care products like electric shavers, but participants also went there for “personal” items like cell phones, watches, etc.
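Spotting evil attractors can be semi-automated by counting which topics draw a meaningful share of clicks across several tasks. A hedged sketch; the data shape and the `min_tasks`/`min_share` thresholds are assumptions, not Treejack features, and the example data echoes the "Personal" anecdote above:

```python
from collections import defaultdict

# Hedged sketch of spotting "evil attractors": topics that draw a
# meaningful share of clicks across several seemingly unrelated tasks.
# The data shape and min_tasks/min_share thresholds are assumptions,
# not Treejack features.

def evil_attractors(clicks_by_task, min_tasks=3, min_share=0.1):
    """clicks_by_task: {task: {topic: click_count}}"""
    tasks_attracted = defaultdict(int)
    for counts in clicks_by_task.values():
        total = sum(counts.values())
        for topic, n in counts.items():
            if total and n / total >= min_share:
                tasks_attracted[topic] += 1
    return [t for t, k in tasks_attracted.items() if k >= min_tasks]

clicks = {
    "find a shaver":   {"Personal": 5, "Electronics": 45},
    "find a sofa":     {"Personal": 6, "Home": 44},
    "find a trowel":   {"Personal": 7, "Garden": 43},
    "find stationery": {"Office": 50},
}
print(evil_attractors(clicks))  # → ['Personal']
```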

Detailed results – first clicks Where they went on their first click Important for task success Which sections they visited overall Did they visit the right section but back out? Show sample results from food shopping – First Click tab The first click is the one most highly correlated with eventual success – if we can get users to the right section of a site, they’re much more likely to find the right topic. Perhaps the participants completely ignored the correct section, or maybe they visited it, but backed out and went somewhere else instead.

Detailed results – paths Click-by-click paths that they took through the tree Useful when asking: How the heck did they get way over there? Did a lot of them take the same detour? No web UI for removing participants. Email Support and we’ll fix you up. Show sample results from food shopping – Paths tab

Some lessons learned Test new against old Revise and test again – quick cycles Test a few alternatives at the same time Cover the sections according to their importance Analysis is easier than for card sorting Use in-person testing to get the “why” Paper is still effective (and free!) for this Tree testing is only part of your IA work New vs. old - we promised a client we would deliver a better IA, not just a different one. Tree testing proved to be a great way to demonstrate this. In our baseline test, the original structure notched a 31% success rate. Using the same tasks, the new structure scored 67% - a solid quantitative improvement. Q: Can you do A/B testing in the tool? Q: We’ve used it in face-to-face sessions as we get other data too. We’ve never run it as remote-only as we’re nervous of losing the rich insights. Any experiences or thoughts? Tree testing not enough - A good tree still needs to be married up to an effective navigation system, content design, and visual treatment before we can say we’ve built a good information architecture.

What’s coming Better scoring for Directness, Speed Improved results (10/100/1000) General enhancements across Treejack, OptimalSort, and Chalkmark Whatever you yell loudest for…  GetSatisfaction lets you “vote” for issues

Tree testing – more resources Boxes & Arrows article on tree testing http://www.boxesandarrows.com/view/tree-testing Donna Spencer’s article on paper tree testing http://www.boxesandarrows.com/view/card_based_classification_evaluation Treejack website Webinars, slides, articles, user forum http://www.optimalworkshop.com

Getting your input Specific issues/questions Feature requests support@optimalworkshop.com Feature requests Check the support forum (GetSatisfaction) “Feedback” button Show GetSatisfaction

Thanks!