Slide 1: Testing—Tuning and Monitoring
© 2013 by Larson Technical Services

Slide 2: Reasons for Testing
During development, developers focus on the system, not on the person who will use it. Developers believe they are typical users and fail to understand why other users have trouble using the speech application. The design of a usable system is difficult and unpredictable, yet managers believe that usability is just "common sense."

Slide 3: Questions Answered Only by Testing
– Is the current application usable?
– Is the current application ready to deploy, or does it need more work?

Slide 4: Consequences of Not Testing
User problems – People will not use the application if it is difficult to use or fails. The help line may be flooded with calls for help.
Lost revenue – Revenues will drop no matter what the business model.
Bad reputation – The company developing the speech application will gain a bad reputation, which may linger for years after the problem itself disappears.

Slide 5: When to Test?
Test early. Test often.

Slide 6: In-Class Exercise
Sketch five usage scenarios for your application.
A usage scenario is a detailed example of what the user does when using your application, including example dialogs between user and application.
Hint: consider using state transition systems to describe dialogs, with each state containing the output to the user and transitions describing user input (see the sketch below).
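Following the hint above, here is a minimal Python sketch of one dialog described as a state transition system, with each state holding the output to the user and transitions keyed by user input. The home-lighting states, prompts, and inputs are hypothetical, chosen only to illustrate the representation:

```python
# Minimal dialog as a state transition system (hypothetical example).
# Each state carries the prompt spoken to the user; each transition is
# keyed by a recognized user input.

STATES = {
    "start":   {"prompt": "Which room?",          # output to the user
                "next": {"bedroom": "command", "kitchen": "command"}},
    "command": {"prompt": "Lights on or off?",
                "next": {"on": "done", "off": "done"}},
    "done":    {"prompt": "Done. Goodbye.", "next": {}},
}

def run_dialog(user_inputs):
    """Walk the state machine, printing each prompt and consuming inputs."""
    state = "start"
    for heard in user_inputs:
        print(f"System: {STATES[state]['prompt']}")
        print(f"User:   {heard}")
        state = STATES[state]["next"].get(heard, state)  # stay put on misrecognition
    print(f"System: {STATES[state]['prompt']}")

run_dialog(["bedroom", "on"])  # one example scenario: turn on the bedroom lights
```

Writing each scenario as a path through such a machine makes the expected dialog explicit and gives testers a concrete script to compare against observed behavior.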

Slide 7: Testing
Performance—Is the application useful?
– Measure what users actually accomplished
– Validate that users achieved success
Preference—Is the application enjoyable?
– Measure users' likes and dislikes
– Validate that users enjoyed the application and will use it again
(Section 11.2)

Slide 8: Performance—General Approach
Ask users to perform specific scenarios, then measure their successes and failures.
Example tasks and measurements:
– User speaks a command—word error rate (computed in the sketch below)
– User hears a prompt—whether the user performs an appropriate action
– User requests a bank transfer—time/turns to complete the transaction successfully
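Word error rate, the measurement named in the first example above, is conventionally the minimum number of word substitutions, deletions, and insertions needed to turn the recognizer's output into the reference transcript, divided by the number of reference words. A minimal sketch (the example utterances are invented):

```python
def word_error_rate(reference, hypothesis):
    """WER = (substitutions + deletions + insertions) / reference length,
    computed with the classic Levenshtein dynamic program over words."""
    ref, hyp = reference.split(), hypothesis.split()
    # d[i][j] = edit distance between ref[:i] and hyp[:j]
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i                      # delete all reference words
    for j in range(len(hyp) + 1):
        d[0][j] = j                      # insert all hypothesis words
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            sub = d[i - 1][j - 1] + (ref[i - 1] != hyp[j - 1])
            d[i][j] = min(sub, d[i - 1][j] + 1, d[i][j - 1] + 1)
    return d[len(ref)][len(hyp)] / len(ref)

# "transfer fifty dollars" misheard as "transfer fifteen dollars":
# one substitution in three reference words
print(word_error_rate("transfer fifty dollars", "transfer fifteen dollars"))  # 0.333...
```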

Slide 9: Preference—General Approach
Ask users specific questions about their likes and dislikes, plus open-ended questions.
Examples:
– On a scale from –5 to +5, rate the help facility (see the summary sketch below)
– Do you prefer listening to the male or female voice?
– What would you change about the application?
– What do you like best about the application?
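One practical detail the rating-scale question implies is how to tally the –5 to +5 responses during analysis. A minimal sketch, assuming hypothetical response values; only the question wording comes from the slide:

```python
# Summarize rating-scale (-5 to +5) preference responses; the response
# values are hypothetical, the question text comes from the slide.
from statistics import mean, median

responses = {
    "Rate the help facility (-5 to +5)": [3, -1, 4, 0, 2, -2, 5, 1],
}

for question, ratings in responses.items():
    assert all(-5 <= r <= 5 for r in ratings), "ratings must stay on the scale"
    print(f"{question}: mean={mean(ratings):+.1f}, median={median(ratings):+.1f}")
```

Open-ended answers, by contrast, are kept verbatim and coded by theme rather than averaged.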

Slide 10: In-Class Exercise
Write ten preference questions.

Slide 11: When to Test?
The development lifecycle spans five stages—Investigation, Design, Development, Testing, and Sustaining—with activities to test throughout:
– Identify the application: conduct ethnography studies, identify candidate applications, conduct focus groups, select the application
– Develop the business model
– Specify the application: construct the content model, construct scenarios, specify performance and preference requirements
– Develop the application: specify the persona, specify the dialog structure, specify the dialog script, validate the initial design, validate application functions
– Choose technology: test components, integrate components
– Test the application: usability test, qualification test, stress test, field test
– Deploy and monitor the application: monitor the application

Slide 12: Some Types of Tests
Conceptual design, qualification, stress, field, continual monitoring, and component tests.
Test at every stage of development.
(Section 9.4)

Slide 13: Test Plan
1. Purpose
2. Problem statement/test objectives
3. Subject profile
4. Test design
5. Monitor
6. Evaluation measures (data to be collected)
7. Report contents and presentation

Slide 14: 1. Purpose
Which type of test?
– Comparison test—which alternative is "better"?
– Pilot test—test the test procedure itself
– Field test—can real users actually benefit from using the system?
– Beta test—users help debug the system
– Acceptance test—validate that the system satisfies the requirements
What do we hope to learn?

Slide 15: 2. Test Objectives
Avoid unfocused and vague problem statements.
Examples of poor objectives:
– Is the current product usable?
– Is the product ready for release, or does it need more work?
Examples of good objectives:
– Do the screens reflect the final content model?
– Is help easier to access via a "hot key" or via a mouse selection?

Slide 16: 3. Subject Profile
General computer experience – Range: none to two years
Education – Range: 10% high school, 60% college, 20% master's, 10% PhD
Age – Range: 85% ages 20–50, 15% other
Gender
Education major – 0% CS, 100% other
(A quota-check sketch follows.)
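One way to put such a profile to work is as a recruiting quota check. A minimal sketch, using the education percentages from the slide; the data layout, field names, and the 5-point tolerance are assumptions:

```python
# Check a recruited panel against the target subject profile (quotas from
# the slide; field names and the 5-point tolerance are assumptions).
from collections import Counter

TARGET_EDUCATION = {"high school": 10, "college": 60, "masters": 20, "phd": 10}

def education_quota_gaps(subjects, tolerance=5):
    """Return education levels whose share differs from the target
    by more than `tolerance` percentage points."""
    counts = Counter(s["education"] for s in subjects)
    gaps = {}
    for level, target_pct in TARGET_EDUCATION.items():
        actual_pct = 100 * counts[level] / len(subjects)
        if abs(actual_pct - target_pct) > tolerance:
            gaps[level] = (actual_pct, target_pct)
    return gaps

panel = [{"education": "college"}] * 7 + [{"education": "phd"}] * 3
print(education_quota_gaps(panel))  # college and phd over quota, others under
```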

Slide 17: 4. Test Design
A detailed plan for conducting the test:
– Groups of subjects, e.g., Group A: Mary, Fred, Sam, Jose; Group B: Sue, Ron, Bob, Sally
– Tests: Group A does test 1, Group B does test 2 (see the assignment sketch below)
– Subjects read from a printed instruction script
– Conduct a pilot test
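For the group-assignment step, here is a minimal sketch of randomly splitting subjects into test groups so that recruitment order does not bias the comparison. The names come from the slide; the two-group split and fixed seed are assumptions:

```python
# Randomly assign subjects to test groups so assignment order does not
# bias the comparison (names from the slide; seed chosen arbitrarily).
import random

subjects = ["Mary", "Fred", "Sam", "Jose", "Sue", "Ron", "Bob", "Sally"]

def assign_groups(names, n_groups=2, seed=42):
    """Shuffle the subject list and deal names round-robin into groups."""
    rng = random.Random(seed)       # fixed seed so the split is reproducible
    shuffled = names[:]
    rng.shuffle(shuffled)
    return [shuffled[i::n_groups] for i in range(n_groups)]

group_a, group_b = assign_groups(subjects)
print("Group A (test 1):", group_a)
print("Group B (test 2):", group_b)
```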

Slide 18: Instruction Script
Orientation:
– Introduce yourself
– Offer refreshments
– Explain why they are here
– Describe the equipment
– Explain what is expected of the participant
– Assure the participant that they are not being tested
– Ask for questions

Slide 19: Instruction Script (continued)
Nondisclosure form
Scenarios the user is to perform
– Describe the scenario, but not how to do it. For example, "Turn on the bedroom lights," but not "Click or speak the desired widget, then click or speak the desired command."
Written debriefing questionnaire
– Preference questions
– Open-ended questions

Slide 20: In-Class Exercise
Write a test instruction script.
Note that the tester may NOT speak with the user during the test.

Slide 21: 5. Monitor
– Try to be objective
– Enable rather than lead the subject
– Avoid acting too knowledgeable
– Don't jump to conclusions
– Let the subject struggle
– Inform the subject that we are testing the application, not the user

Slide 22: 6. Evaluation Measures
Performance – data collected during the test
Preference – opinions collected during debriefing

Slide 23: 7. Report Contents and Presentation
Collect data during the test and during debriefing.
Summarize the data.
Analyze the data (see the analysis sketch below):
– Focus on tasks that did not meet the criterion
– Analyze the source of each error
– Prioritize problems by criticality
Develop recommendations:
– Focus on solutions that will have the widest impact
– Ignore "political considerations"
– Provide short-term and long-term recommendations
– Identify areas for further research
Prepare the final report:
– Executive summary section
– Method section
– Results section
– Findings and recommendations discussion
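As a sketch of the "analyze the data" step above, the following flags tasks that missed their success criterion and orders the resulting problems by criticality. All task names, rates, and criticality scores are hypothetical:

```python
# Flag tasks that missed their success criterion and rank them by
# criticality (task names, criteria, and scores are hypothetical).

tasks = [
    # (task, observed success rate, required success rate, criticality 1-5)
    ("speak a command", 0.72, 0.90, 5),
    ("hear a prompt",   0.95, 0.90, 3),
    ("bank transfer",   0.80, 0.85, 4),
]

def prioritized_problems(results):
    """Keep tasks below criterion, sorted most-critical (then worst) first."""
    misses = [t for t in results if t[1] < t[2]]
    return sorted(misses, key=lambda t: (-t[3], t[1] - t[2]))

for name, actual, required, crit in prioritized_problems(tasks):
    print(f"criticality {crit}: '{name}' at {actual:.0%} (criterion {required:.0%})")
```

Sorting by criticality first, then by how far below criterion a task fell, matches the slide's advice to spend recommendation effort where it has the widest impact.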

