CS160 Discussion Section Final review David Sun May 8, 2007.


1 CS160 Discussion Section Final review David Sun May 8, 2007

2 Design Patterns
Pattern style (presented in class):
1. Pattern title
2. Context
3. Forces
4. Problem statement
5. Solution (with a solution sketch)
6. Other patterns to consider
Tips:
1. Know the pattern format.
2. We are not fussy about terminology, but make sure the description covers the major conceptual components.

3 Exercise: Design Pattern for…
Pick an object and come up with a design pattern for it in 15 minutes, e.g.:
– Bike
– Coffee mug
– Desk lamp

4 Object Action Model
An interaction/cognitive model for how users interact with a system. Elements:
– Task: the universe of objects the user works with and the actions they apply to those objects.
– Interface: metaphoric representations of those objects and actions.

5 OAI Example: the Calculator
Task level:
– Objects (universe): the reals and the addition op, e.g. first number, second number.
– Actions (intention): operations, e.g. add two numbers (pick two numbers, then perform the addition operation).

6 Object Action Model
Interface level:
– Objects (metaphor): the calculator, e.g. buttons, display.
– Actions (plan): operate the calculator, e.g. write out an equation (press 1, press +, press 2, press =).
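The two calculator slides above can be sketched in code. This is a hypothetical illustration (the class and method names are mine, not from the slides): interface-level actions (button presses, the "plan") are metaphors that carry out task-level actions (arithmetic on reals, the "intention").

```python
# Hypothetical sketch of the Object-Action model for the calculator example.
# Interface level: buttons and a display (the metaphor).
# Task level: real numbers and the addition operation (the universe).

class Calculator:
    def __init__(self):
        self.display = "0"
        self._first = None
        self._op = None

    def press(self, key):
        # Each press is an interface action ("plan") that maps onto a task
        # action ("intention"): picking numbers, performing the addition.
        if key.isdigit():
            self._entry = float(key)
            self.display = key
            if self._op is None:
                self._first = self._entry
        elif key == "+":
            self._op = "+"
        elif key == "=" and self._op == "+":
            # Task level: add the two chosen numbers.
            self.display = str(self._first + self._entry)
        return self.display

calc = Calculator()
for key in ["1", "+", "2", "="]:   # "write out an equation": 1 + 2 =
    shown = calc.press(key)
print(shown)  # the display now shows the task-level result
```

The point of the model is visible in the mapping: the user plans in terms of button presses, but the system must translate those into the task the user actually intends.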

7 Infovis
Information tasks:
– Specific fact finding
– Extended fact finding
– Open-ended browsing
– Exploration of availability
Info search follows a 4-phase pattern:
1. Formulation
2. Action
3. Review of results
4. Refinement

8 Infovis
Tasks for a visualization system:
1. Overview: get an overview of the collection
2. Zoom: zoom in on items of interest
3. Filter: remove uninteresting items
4. Details on demand: select items and get details
5. Relate: view relationships between items
6. History: keep a history of actions to support undo, replay, and refinement
7. Extract: make subcollections
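Three of these seven tasks (overview, filter, details on demand) can be shown on a toy collection. A minimal sketch; the item fields and function names are illustrative, not from the slides:

```python
# Illustrative sketch of three visualization tasks over a small collection.

items = [
    {"name": "alpha", "size": 12, "tag": "doc"},
    {"name": "beta",  "size": 98, "tag": "img"},
    {"name": "gamma", "size": 45, "tag": "doc"},
]

def overview(collection):
    """Overview: summarize the whole collection at a glance."""
    return {"count": len(collection),
            "tags": sorted({i["tag"] for i in collection})}

def filter_items(collection, predicate):
    """Filter: remove uninteresting items."""
    return [i for i in collection if predicate(i)]

def details(item):
    """Details on demand: the full record for one selected item."""
    return ", ".join(f"{k}={v}" for k, v in item.items())

docs = filter_items(items, lambda i: i["tag"] == "doc")
print(overview(items))
print(details(docs[0]))
```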

9 Infovis
Some key concepts:
– Query building: visual builders and QBE (query by example)
– Multidimensional scaling
– Focus + context: distortion, fish-eye lenses, overview + details
– Network visualization
– Animation
– 3D

10 User Testing

11 Evaluation Methodologies
Expert analysis:
– Cognitive walkthrough
– Heuristic evaluation
– Model-based evaluation (GOMS)
User participation:
– Lab studies
– Field studies

12 Ethical Considerations
Sometimes tests can be distressing:
– Users have left in tears (embarrassed by their mistakes).
You have a responsibility to alleviate this:
– Make participation voluntary, with informed consent.
– Avoid pressure to participate.
– Let participants know they can stop at any time [Gomoll].
– Stress that you are testing the system, not them.
– Make collected data as anonymous as possible.
You must often get human-subjects approval.

13 Measuring User Preference
How much do users like or dislike the system?
– Can ask them to rate it on a scale of 1 to 10.
– Or have them choose among statements: “best UI I’ve ever…”, “better than average”…
– Hard to be sure what the data will mean: novelty of the UI, feelings, an unrealistic setting, etc.
If many users give you low ratings, you are in trouble.
Can get some useful data by asking:
– what they liked, disliked, where they had trouble, the best part, the worst part, etc. (redundant questions).
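Summarizing 1-to-10 ratings like those described above is straightforward with the standard library. A sketch under my own assumptions (the sample ratings and the trouble threshold of 5 are made up for illustration):

```python
# Sketch: summarizing 1-10 preference ratings from a user test.
# The ratings and the "trouble" threshold below are illustrative only.
from statistics import mean, median

ratings = [7, 8, 3, 9, 6, 2, 8, 7]

def summarize(scores, trouble_threshold=5):
    low = [s for s in scores if s <= trouble_threshold]
    return {
        "mean": round(mean(scores), 2),
        "median": median(scores),
        "pct_low": round(100 * len(low) / len(scores)),  # share of low ratings
    }

print(summarize(ratings))
```

Per the slide, a large share of low ratings is the warning sign; the free-text answers (liked, disliked, trouble spots) are analyzed separately.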

14 Comparing Two Alternatives
Between-groups experiment:
– Two groups of test users.
– Each group uses only one of the systems.
Within-groups experiment:
– One group of test users; each person uses both systems.
– Can’t use the same tasks or the same order (learning effects).
– Best for low-level interaction techniques.
A between-groups experiment requires many more participants than a within-groups experiment.
See if the differences are statistically significant:
– assumes a normal distribution and the same standard deviation.
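The significance check mentioned above can be sketched with a two-sample t statistic. This uses Welch's form (a common variant that relaxes the equal-standard-deviation assumption); the completion times below are invented for illustration, and in practice you would use a statistics package rather than hand-rolling the test:

```python
# Sketch of a between-groups significance check: Welch's two-sample
# t statistic, stdlib only. The timing data is made up for illustration.
from math import sqrt
from statistics import mean, variance

def welch_t(a, b):
    """Welch's t statistic for two independent samples."""
    va, vb = variance(a) / len(a), variance(b) / len(b)
    return (mean(a) - mean(b)) / sqrt(va + vb)

# Hypothetical task completion times (seconds) for the two interfaces:
group_a = [41, 38, 45, 39, 42, 40]
group_b = [52, 49, 55, 47, 51, 50]

t = welch_t(group_a, group_b)
print(round(t, 2))  # a large |t| suggests the difference is unlikely to be chance
```

As the slide warns, the test still assumes roughly normal data; with small samples the result should be read cautiously.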

15 Experimental Details
Order of tasks:
– Choose one simple order (simple to complex), unless doing a within-groups experiment.
Training:
– Depends on how the real system will be used.
What if someone doesn’t finish?
– Assign a very large time and a large number of errors.
Pilot study:
– Helps you fix problems with the study.
– Do two: first with colleagues, then with real users.
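For the within-groups case, where one fixed task order would let learning effects favor the second system, a common remedy is to counterbalance the order across participants. A small sketch of a Latin-square-style rotation (the function and system names are mine):

```python
# Illustrative sketch: counterbalance condition order across participants
# so each condition appears equally often in each position.

def counterbalanced_orders(conditions, n_participants):
    """Rotate the condition list once per participant."""
    k = len(conditions)
    return [[conditions[(p + i) % k] for i in range(k)]
            for p in range(n_participants)]

orders = counterbalanced_orders(["System A", "System B"], 4)
for p, order in enumerate(orders, 1):
    print(f"Participant {p}: {' then '.join(order)}")
```

With two systems this alternates A-then-B and B-then-A, so neither system systematically benefits from being tried second.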

16 Errors and Help

17 Types of errors
Mistakes:
– The user intended to do what they did, and it led to an error. The user would probably do the same thing again.
Slips:
– The user did not mean to do what they did. They can recover by doing it differently next time.
– Slips are not just for beginners. Experts often make them because they devote less conscious attention to the task.

18 Minimizing Error
User errors:
– Use intuitive command names (from the user’s domain of knowledge).
– Include short explanations as tool tips.
– Put longer explanations in the help system.
Recognition over recall:
– It is easier to select a file icon from a folder than to remember and type the filename.
– Auto-completion can also help here.
Use appropriate representations:
– E.g., a graphical file selector is good for choosing individual files.
– Textual file names support automation and richer organization (using command-line options).
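The auto-completion point is easy to make concrete: prefix completion turns recall (type the exact name) into recognition (pick from matches). A minimal sketch with invented file names:

```python
# Sketch of "recognition over recall": prefix completion lets the user
# pick a file name rather than recall and type it exactly.

def complete(prefix, names):
    """Return all known names starting with prefix (case-insensitive)."""
    p = prefix.lower()
    return sorted(n for n in names if n.lower().startswith(p))

files = ["report.txt", "Readme.md", "results.csv", "notes.txt"]
print(complete("re", files))
```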

20 Description errors
Description error:
– The action is insufficiently specified by the user.
– The user may not know all the command-line switches, or all the installation options for a program.
Solution:
– Warn the user that the command is ambiguous or “unusual”. Provide help about the options in several standard ways.
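The suggested warning can be sketched as a command resolver that refuses to guess when an abbreviation is underspecified. The command set below is hypothetical:

```python
# Sketch of warning on an insufficiently specified (ambiguous) command.
# The command list is illustrative only.

COMMANDS = ["install", "uninstall", "init", "info", "update"]

def resolve(abbrev):
    """Return the unique command for abbrev, or a warning listing matches."""
    matches = [c for c in COMMANDS if c.startswith(abbrev)]
    if len(matches) == 1:
        return matches[0]
    if not matches:
        return f"unknown command: {abbrev!r}"
    return f"ambiguous command {abbrev!r}: did you mean {', '.join(matches)}?"

print(resolve("up"))   # unique: resolves to one command
print(resolve("in"))   # ambiguous: several commands start with "in"
```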

21 Capture error
Capture error (aka the tongue-twister error):
– Command sequences overlap, and one is more common.
– The user reflexively does the common one when trying to do the unusual one.
– E.g., try typing “soliton” very fast.
Solution:
– Be aware of and test for this error. Try different command names.
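Part of "testing for this error" can be automated: flag command names where one is a prefix of another, since the more common sequence can capture the rarer one typed from muscle memory. A small lint-style sketch with invented command names:

```python
# Sketch of a check for capture-error risk: flag command-name pairs where
# one name is a prefix of the other (overlapping command sequences).

def capture_risks(commands):
    """Return (shorter, longer) pairs where one command prefixes another."""
    return [(a, b) for a in commands for b in commands
            if a != b and b.startswith(a)]

print(capture_risks(["save", "saveall", "quit"]))  # 'save' can capture 'saveall'
```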

22 Mode errors
Mode errors:
– The user forgets what mode they’re in and issues the command appropriate for another mode.
– Digital watches, VCRs, etc.
Several contributing factors:
– There aren’t enough command keys for all the operations, so the mode determines what each button does.
– There isn’t enough display space to provide strong feedback about the mode.

23 Mode errors
Solutions:
– Strive for consistent behavior of buttons across modes.
– Provide display feedback about the behavior of the keys in the current mode.
– Provide an option for scrolling help tips if possible.
– Allow the device to be programmed externally (e.g., from a PC over Bluetooth).
– If you don’t have a tiny screen, make the context clear: use color, tabs, navigation graphics, etc. to show the user “where” they are in the interface.
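The first two solutions can be illustrated with the slides' digital-watch example: behavior is dispatched per mode, and every response echoes the current mode so the feedback names "where" the user is. The modes and key bindings below are invented:

```python
# Illustrative sketch of mode-error mitigation on a digital watch:
# mode-dependent key dispatch plus explicit mode feedback on every press.

MODES = {
    "time":  {"A": "toggle 12/24h", "B": "show seconds"},
    "alarm": {"A": "set hour",      "B": "set minute"},
}

class Watch:
    def __init__(self):
        self.mode = "time"

    def press(self, key):
        action = MODES[self.mode][key]
        # Feedback always names the mode, so the user knows where they are.
        return f"[{self.mode}] {key}: {action}"

    def switch(self):
        self.mode = "alarm" if self.mode == "time" else "time"

w = Watch()
print(w.press("A"))
w.switch()
print(w.press("A"))  # same key, different mode, clearly labeled
```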

24 Detecting Errors
The earlier the better:
– Check for consistency whenever possible (“asserts” for user input).
– If there’s a high risk of error, check for unusual input, or for common slips (spelling correction), e.g. Google’s “did you mean XX?” response.
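A "did you mean" check is typically built on edit distance: suggest the closest known command when the input matches nothing. A self-contained sketch (the commands and distance cutoff are illustrative):

```python
# Sketch of spelling-correction-style error detection:
# suggest the nearest known command using Levenshtein edit distance.

def edit_distance(a, b):
    """Classic Levenshtein distance via dynamic programming."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,               # deletion
                           cur[j - 1] + 1,            # insertion
                           prev[j - 1] + (ca != cb))) # substitution
        prev = cur
    return prev[-1]

def did_you_mean(word, known, max_dist=2):
    """Return the closest known word, or None if nothing is close enough."""
    best = min(known, key=lambda k: edit_distance(word, k))
    return best if edit_distance(word, best) <= max_dist else None

print(did_you_mean("comit", ["commit", "checkout", "push"]))
```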

25 Help
Types of help:
– Task-specific
– Quick reference
– Full explanation
– Tutorial
Key concepts:
– Sandboxing
– Context-sensitive help
– Adaptive help
