© 2006 AT&T Knowledge Ventures. All rights reserved. AT&T is a registered trademark of AT&T Knowledge Ventures.

Wireless B2B eCommerce Manager (Nano) Usability Test Plan
Presented by Bo Lora
May 29, 2007
Introduction

This document describes the usability evaluation plan for the Nano project. The purpose of a usability evaluation is to predict the expected performance of actual customers using the current product and materials, and to detect any serious problems prior to the release of the product. This plan includes the following sections:
– Purpose of the Usability Evaluation
– Target Audience
– Design of the Usability Evaluation
– Data Collection Methodology
– Deliverables
– Resources
– Schedule
Introduction

What is Nano?
Nano is a proposed solution for managing one aspect of AT&T B2B Wireless success: the need for improved website building and data management within the ATG / Premier data schema. Due to increased demand for Wireless B2B websites, the Premier Profile team has recently received more website requests than it can fulfill within its SLA (Service Level Agreement).

Design Usability Goals
The evaluation will focus on determining whether the needs of the user are met in an easy-to-understand, useful, and productive manner. Specific measurable goals for the usability evaluation are outlined in the Usability Evaluation Goals section of this document.
Purpose of the Usability Evaluation

The usability evaluation of the Wireless B2B eCommerce Manager application will assess the potential for errors and difficulties involved in using the application for B2B site creation and maintenance activities. The areas to be tested through the usability evaluation process were identified by the project team and Profile team management.
Concerns

This section outlines the specific concerns that the Profile team, development team, and Web Solutions team may have:
– Can users successfully navigate through the application?
– Is the information logically organized and grouped for the novice Profile team member?
– Can the application be used with only the on-line help, or is a paper-based user guide required?
– How will users feel about using the on-line help? Is context-sensitive help a requirement for our users?
– Are there tasks that users will want to perform that are not currently supported by the Wireless B2B eCommerce Manager application?
Usability Evaluation Goals

Specific usability goals were derived from the preceding concerns. These goals allow for the creation of evaluation scenarios and tasks that will tell us whether our concerns are valid, and they define the measures that will indicate whether participants are having trouble completing the tasks. This evaluation will be based on the participants' ability to:
– Begin using the application with no documentation.
– Locate a specific B2B site in less than a minute.
– Create a B2B site in 5 minutes or less, with no assistance from a Senior Profile Team member.
– Move from one section to another with no expressed or visible difficulty.
– Find related information with no expressed or visible frustration.
– Have no more than two false attempts in finding specific information.
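The measurable goals above could be checked programmatically against recorded session data. The following is a minimal sketch; the session fields, threshold names, and all timing values are hypothetical illustrations, not part of the actual evaluation tooling:

```python
# Sketch: checking one participant session against the measurable goals
# above. Field names and all data values are hypothetical.

# Thresholds taken from the evaluation goals (times in seconds)
GOALS = {
    "locate_site_secs": 60,     # locate a specific B2B site in under a minute
    "create_site_secs": 300,    # create a B2B site in 5 minutes or less
    "max_false_attempts": 2,    # no more than two false attempts
}

def evaluate_session(session):
    """Return a dict mapping each goal to True (met) or False (missed)."""
    return {
        "locate_site": session["locate_site_secs"] < GOALS["locate_site_secs"],
        "create_site": session["create_site_secs"] <= GOALS["create_site_secs"],
        "false_attempts": session["false_attempts"] <= GOALS["max_false_attempts"],
    }

# Example (hypothetical) session record
session = {"locate_site_secs": 45, "create_site_secs": 280, "false_attempts": 1}
print(evaluate_session(session))
```

A per-participant report like this would make it straightforward to see which goals fail most often across the ten planned sessions.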
Usability Evaluation Goals

We will also use a survey to determine subjective reactions, specifically whether:
– Users feel that the pictures used on the icons are recognizable and facilitate system use and understanding.
– Users feel comfortable using only the on-line help, or whether context-sensitive help is a requirement.
– Users feel the on-line help provides them with all of the information necessary to use the system.
– Users feel that the on-line reference answers all of their B2B site creation and maintenance questions.
Proposed Scenarios

The following specific scenarios will be used to evaluate the proposed application:
– Find a site by FAN
– Find a site by keyword
– Create a B2B site
– Add a user to a B2B site
– Restrict a Motorola phone in the buy flow of a B2B site
– Add a promotion to a B2B site
– Change default shipping
– Disable a B2B site
– Create a Blackberry-only version of a B2B site
– Change the welcome message
– Change B2B site contact information
– Change a B2B site to browse-only
– Add a new rate plan
Target Audience

The selection of participants whose background and abilities are representative of the product's intended end users is a crucial element of the evaluation process. For this reason, valid results will be obtained only from active Profile team members whose daily tasks include creating and maintaining B2B sites.
Participant Selection Criteria

The following list shows the key characteristics of the end users that are considered critical differentiators for successful adoption and use of the Wireless B2B eCommerce Manager. These characteristics are the basis for participant selection, and participants will be chosen to reflect the range of characteristics shown below:
– Senior Profile Analyst: someone who has extensive knowledge of ACC and is involved in troubleshooting high-priority outages.
– Profile Analyst: someone who is involved in the daily task of creating and maintaining Premier profiles.
– Junior Profile Analyst: someone who is fairly new to the Profile team.
Design of the Usability Evaluation

A single usability evaluation will be run in at least 10 individual participant sessions. Each session will consist of a set of tasks and an interview/questionnaire for the participant to complete.
The Evaluation Process

Participant greeting and orientation
Each participant will be personally greeted by the evaluator and made to feel comfortable and relaxed. The participants will receive a short, scripted verbal introduction and orientation to the evaluation. This introduction will explain the purpose and objective of the evaluation, the need for product anonymity until after the evaluation, and additional information about what is expected of them. They will be assured that the product, not themselves, is the focus of the evaluation, and that they should perform in whatever manner is typical and comfortable for them. The participants will be informed that they are being observed and videotaped.
The Evaluation Process

Performance evaluation
The performance evaluation consists of a series of tasks that are evaluated separately and sequentially. Each participant completes the tasks while being recorded on video and observed by the usability specialists. The scenario is as follows:
– After the orientation, the participants will be asked to sit down at the computer.
– The evaluator will give the participants defined tasks and instruct them to work through each task, verbalizing their thoughts out loud.
– The participants will be encouraged to work without guidance; if they become stuck or hopelessly confused, the evaluator will note these occurrences and may ask probing questions to help pinpoint the cause of the problem.
The Evaluation Process

Participant debriefing
After all tasks are complete or the time expires, each participant will be debriefed by the evaluator. The debriefing will be taped and will include the following:
– Completion of a brief post-evaluation questionnaire in which the participants share their opinions on the product's usability, the appearance of application screens, and their general impressions of the application.
– The participant's overall comments about his or her experience.
– The participant's responses to probes from the evaluator about specific errors or problems encountered during the evaluation.
The debriefing session serves several functions:
– It allows the participants to say whatever they like, which is important if tasks are frustrating.
– It provides important information about each participant's rationale for performing specific actions.
– It allows the collection of subjective preference data about the application and its supporting documentation.
Logistics

Location: We will simulate a typical office environment during the usability evaluation. We will use an office (Bothell 6 – 3023A) large enough to comfortably accommodate a desk for the participant to sit at while completing the evaluation. The prototype application will be run on a docked laptop with an external keyboard and mouse.

Prototype: The prototype employed for this evaluation will be a stand-alone HTML prototype that requires no network connection. This prototype will be representative of the look and feel of the final product and will adhere to current AT&T Intranet Guidelines.

Observation: Each usability evaluation will be broadcast via eMeeting to a meeting room with a projector and speakers. The session will be recorded via screen capture on the laptop with an external microphone. A video camera will be set up to record the participant's facial expressions and comments. If possible, a video signal will be fed to the meeting room to allow observation of the participant.
Data Collection Methodology

Data will be collected through the use of a thinking-aloud protocol. Measures to be collected include the following:
– The average amount of time to complete each task
– The percentage of participants who finished each task successfully
– The number of cases in which the participants were not able to complete a task due to an error from which they could not recover
– The number of times the participant used the help line or on-line documentation for each task
– The number of positive or critical statements about the on-line help documentation
– The number and types of findings, including:
  – Good: events when the participant found the application to be intuitive and correct in its approach.
  – Bad: events when the participant makes a mistake but is able to recover during the task in the allotted time.
  – Catastrophic: events when the participant makes a mistake and is unable to recover and complete the task on time. The participant may or may not realize a mistake has been made.
– The number of indications of frustration or joy from the participant
– The number of subjective opinions on the usability and aesthetics of the product expressed by the participants
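The quantitative measures above (average task time, completion percentage, finding types) could be aggregated per task along the following lines. This is a sketch only; the observation format, task names, and all timings are hypothetical:

```python
from statistics import mean

# Each observation: (participant, task, seconds, outcome).
# Outcomes follow the finding types above: "good", "bad" (mistake but
# recovered), "catastrophic" (could not recover). All data is hypothetical.
observations = [
    ("P1", "Find site by FAN", 42, "good"),
    ("P2", "Find site by FAN", 75, "bad"),
    ("P3", "Find site by FAN", 120, "catastrophic"),
    ("P1", "Create B2B site", 260, "good"),
    ("P2", "Create B2B site", 310, "bad"),
]

def summarize(task):
    """Average completion time and success percentage for one task."""
    rows = [o for o in observations if o[1] == task]
    completed = [o for o in rows if o[3] != "catastrophic"]
    return {
        "avg_time_secs": mean(o[2] for o in rows),
        "pct_completed": 100 * len(completed) / len(rows),
    }

print(summarize("Find site by FAN"))
```

Summaries like this map directly onto the first three measures listed above and would feed the final evaluation report.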
Deliverables

At the completion of the usability evaluation, a formal analysis will be performed. A final evaluation report and a highlight tape will be delivered to the project team, detailing the significant problems and observations from the usability evaluation along with recommendations to address the findings.
Schedule

The usability evaluation will be conducted in June. This date is dependent on approval of this evaluation plan by the Profile team and on the identification and selection of qualified participants.

Time        Tue June 12th       Wed June 13th        Thu June 14th
9:00 a.m.   Pilot Session       Shannon Harrington   Terry Reid
1:00 p.m.   Victor Diaz         Eugene Yon           David Thai
3:00 p.m.   Molly Utsick        William Rynearson    Gladys Woo