1 Each person is working on a GUI subset. Your goal is to create screens for three specific tasks your user will perform in your GUI Project. You also need to prepare the materials for a usability evaluation of your initial design, conducted by the ULAB team.

2 1. Identify the intended user GROUP that your screens are designed for.
2. Identify the user GOALs that your screens will satisfy.
3. Inventory the user tasks they could possibly do in your screens. Make sure that your screen tasks are prototyped completely (meaning they can be done). Only the tasks from #2 need to be prototyped completely.
4. Perform a User Analysis for your specific tasks; give the storyboard of interaction for your tasks.
5. Refine the usability goals and concerns for the evaluation that the ULAB team will do for you.
6. Establish the parts of the evaluation (give the dates when you gave/sent materials to the ULAB team and when you expect results back).
7. Create a user profile (this is the same as #1).
8. Develop two questionnaires for the user: a PRE-questionnaire and a POST-questionnaire (make sure the POST obtains subjective data from the user).
9. Create at least 3 task scenarios (make sure your tasks have a setting).
10. Develop usability baseline criteria, determining the quantitative and qualitative measures. We will use the following steps to establish benchmark criteria, events to record, the script of the actual evaluation, the recommendation format, etc. Benchmark on the standard measures, e.g., time-on-task, errors, user satisfaction, etc.
11. Done by the ULAB team ==> Give the roles of the ULAB teammates during the evaluation.
12. Done by the ULAB team ==> Establish the events to be recorded and the method of data analysis.
13. Done by the ULAB team ==> Conduct the ULAB evaluation, recording all raw data of the evaluation.
14. Done by the ULAB team ==> Give all the data, results, and analysis of the ULAB evaluation; give usability recommendations justified by the data and analysis of the evaluation.
15. Done by the ULAB team ==> Send the ULAB report to the GUI team (cc to

3 - Evaluate a prototype of an early design (the system is at the beginning of the design, i.e., conceptual).
- The scenarios for the tasks are not fully developed nor fully functional, so this is a challenge for a user who wants to "play" with everything on the screen.
- Get the user's cooperation.

4 Part X = the plan and preparation materials for an evaluation
- Set a usability objective and goals.
- Decide on the user tasks.
- Create the Pre-Questionnaire:
  1. Assurance that you have representative users.
  2. Obtaining the specific user profile (characteristics).
- Decide on recorded events & baseline criteria:
  1. What events will you log from the user's experience?
  2. What formula (baseline) will decide whether you redo the interface?
- Create the Post-Questionnaire: subjective data, perceptions.
Part Y = evaluation with users (observation)
- Preparing the ULAB & performing the evaluations.
Part Z = evaluating the results of the evaluation
- Use the results and give recommendations.

5 Before you do any testing, take time to figure out what you're testing and what you're not. In other words, determine an objective for your test that focuses on a specific aspect of the product. By limiting the scope of the test, you're more likely to get information that helps you solve a specific problem. Give the target user group, the usability objective for this evaluation, and the goal for the prototype. Remember slide 4 of the sample.

6 - Your test participant will work through specific tasks. These should be real tasks that you expect most users will do when they use your product. The entire user observation should not run over 20 minutes, so design tasks that focus on the part of the product you're studying. For example, if you want to know whether your menus are useful, you could design a task that requires the participant to access the menus frequently. After you determine which tasks to use, write them out as short, simple instructions.
- The tasks the user will attempt need to be printed on separate pages.
- Important: your instructions must be clear and complete, but they should not explain how to do the things you're trying to test. For example, if you want to find out whether users can navigate through your program easily, don't give them instructions for navigation. Or, if you want to know whether your interface is self-explanatory, don't describe how it works. This concept is extremely important to remember: if you teach your participants about something you're trying to test, your data will not be useful.

7 Create a questionnaire that will verify that you have a true (representative) user. Also include brief profile questions for your user. The Briefer/Debriefer will administer this at the start of the evaluation next week.

8 - The ULAB team obtains data during the evaluation tasks. While the user is actively doing the tasks, they will record events (metrics).
- Immediately afterward, there is an opportunity to gather impressions and qualitative data from the user.
- Create a post-questionnaire that obtains all of this data from the user.
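The post-questionnaire's subjective data can be summarized numerically. A minimal sketch, assuming hypothetical 1-5 Likert-scale statements (the questions below are illustrative, not prescribed by the assignment):

```python
# Hypothetical post-questionnaire scoring: average 1-5 Likert responses
# into a single subjective-satisfaction score. Question wording is assumed.
responses = {
    "The tasks were easy to complete": 4,
    "The screens were self-explanatory": 3,
    "I would use this product again": 5,
}

satisfaction = sum(responses.values()) / len(responses)
print(f"Mean satisfaction (1-5 scale): {satisfaction:.2f}")  # 4.00
```

A per-question breakdown is usually kept alongside the mean, since one low-scoring item often points at a specific area of concern.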

9 a. Decide on the events (metrics) that you will log. List the events to be recorded during the evaluation; indicate which are quantitative and which are qualitative.
b. Decide on the baseline criteria for the evaluation. Decide on a formula that combines the events (metrics). Give this formula in the Report for Web F. A sample of this is in this presentation.
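One way to express such a baseline formula is as a pass/fail check over the standard measures (time-on-task, errors, satisfaction). The thresholds below are illustrative assumptions, not values given in the assignment:

```python
# Hypothetical baseline formula; the thresholds are assumptions chosen
# for illustration, not prescribed values.
def baseline_met(time_on_task_s, errors, satisfaction):
    """Return True if the interface passes the (assumed) baseline criteria."""
    return (
        time_on_task_s <= 300        # each task under 5 minutes
        and errors <= 2              # at most 2 errors per task
        and satisfaction >= 3.5      # mean Likert score on a 1-5 scale
    )

print(baseline_met(time_on_task_s=240, errors=1, satisfaction=4.2))  # True
print(baseline_met(time_on_task_s=420, errors=1, satisfaction=4.2))  # False
```

Writing the criteria as an explicit function makes the "redo the interface" decision mechanical: if any user's measured events fail the check, the result is recorded as a baseline miss.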

10 - Prepare the computer to the same "opening scene" for each user.
- Test the video and audio recording to ensure you will have hard data that you can review later.
- Practice your conversation with the user to obtain their initial attitude and their pre-questionnaire data.
- Explain "thinking out loud" and let the user rehearse it.
- Give the task handouts to the user and let them do the tasks alone.
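During the observation, the recorded events need timestamps so they can later be aligned with the video. A minimal logging sketch, assuming hypothetical event names (the assignment does not prescribe a vocabulary):

```python
import time

# Minimal event logger for the observation session; event names below
# ("task_start", "error", ...) are illustrative assumptions.
class EventLog:
    def __init__(self):
        self.events = []  # list of (timestamp, event_name, detail)

    def record(self, name, detail=""):
        self.events.append((time.time(), name, detail))

log = EventLog()
log.record("task_start", "Task 1: find the settings menu")
log.record("error", "opened the wrong menu")
log.record("task_end", "Task 1 completed")
print(len(log.events))  # 3
```

In practice the note-taker on the ULAB team fills this role by hand; the point is that every event carries a time, a category, and a free-text detail.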

11 - List the data logged and compile it into lists of pros & cons.
- State whether the baseline criteria were met.
- Give recommendations, i.e., the areas of concern, not the solutions!
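Compiling the logged events into pros & cons can be sketched as a simple partition. The categorization rule below (errors become cons; completions and positive comments become pros) is an assumption for illustration:

```python
# Sketch: partition logged (event_name, detail) pairs into pros & cons.
# The mapping of event names to pros/cons is an illustrative assumption.
events = [
    ("task_end", "Task 1 completed quickly"),
    ("error", "user opened the wrong menu twice"),
    ("comment", "user liked the icon labels"),
]

pros = [detail for name, detail in events if name in ("task_end", "comment")]
cons = [detail for name, detail in events if name == "error"]

print("Pros:", pros)
print("Cons:", cons)
```

Each "con" then becomes a candidate area of concern in the recommendations, backed by the raw event that produced it.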

