Week 5 LBSC 690 Information Technology Human-Computer Interaction.


1 Week 5 LBSC 690 Information Technology Human-Computer Interaction

2 Agenda
Questions
HCI overview
Input and output devices
Interface design
Interaction design
Evaluation

3 What are Humans Good At?
Sense low-level stimuli
Recognize patterns
Reason inductively
Communicate with multiple channels
Apply multiple strategies
Adapt to changes or unexpected events

4 What are Computers Good At?
Sense stimuli outside the human range
Calculate quickly and accurately
Store large quantities of data and recall it accurately
Respond rapidly and consistently
Perform repetitive actions reliably
Work under heavy load for an extended period

5 Synergy
Humans do what they are good at
Computers do what they are good at
The strengths of one cover the weaknesses of the other

6 The Discipline of HCI
From the ACM SIGCHI Curricula for Human-Computer Interaction

7 Types of Applications
Life-critical
– Low error rate first and foremost
– Justifies an enormous design and testing effort
Custom commercial
– Speed and error rate
Office and home
– Easy learning, high user satisfaction, low cost
Creative
– User needs assessment is very challenging

8 User Characteristics
Physical
– Anthropometric (height, left-handed, etc.)
– Age (mobility, dexterity, etc.)
Cognitive
Perceptual
– Sight, hearing, etc.
Personality
– Including cultural factors

9 Modeling Interaction
[Diagram: the human holds mental models of the task and the system, and interacts through sight, sound, hands, and voice; the computer holds software models of the task and the user, and interacts through keyboard, mouse, display, and speaker.]

10 Discussion Point: Mental Models
As a user, what do you need to know about a machine in order to interact with it effectively?

11 Mental Models
How the user thinks the machine works
– What actions can be taken?
– What results are expected from an action?
– How should system output be interpreted?
Mental models exist at many levels
– Hardware, operating system, and network
– Application programs
– Information resources

12 The GOMS Perspective
Goals
– What the user is trying to achieve
Operators
– What capabilities the system provides
Methods
– How those capabilities can be used
Selection strategies
– Which method to choose in a specific case

13 Input Devices
Text
– Keyboard, optical character recognition
– Speech recognition, handwriting recognition
Direct manipulation
– 2-D: mouse, trackball, touch pad, touch panel
– 3-D: wand, data glove
Remote sensing
– Camera, speaker ID, head tracker, eye tracker

14 Keyboard
Produces character codes
– ASCII: American English
– Latin-1: European languages
– Unicode: (nearly) any language
Pictographic languages need "entry methods"
Keyboard shortcuts help with data entry
– Different conventions for standard tasks abound, though "VT-100 standard" functions are common
– Differing layouts can inhibit usability
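The difference between these character codes is easy to see in code. A small Python sketch (illustrative, not from the slides) encoding one accented word under each scheme:

```python
# The same text occupies different byte sequences under different encodings.
text = "café"

print(text.encode("latin-1"))  # 4 bytes: b'caf\xe9'
print(text.encode("utf-8"))    # 5 bytes: b'caf\xc3\xa9' (Unicode via UTF-8)

# ASCII covers American English only; "é" is out of its range.
try:
    text.encode("ascii")
except UnicodeEncodeError:
    print("ascii cannot represent 'é'")
```

Latin-1 fits "é" in one byte; UTF-8 needs two, but can represent (nearly) any language.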

15 Design Example: QWERTY Keyboard From http://home.earthlink.net/~dcrehr/whyqwert.html

16 Dvorak Keyboard From http://www.mwbrooks.com/dvorak/

17 2-D Direct Manipulation
Match control actions with on-screen behavior
– Use a cursor for visual feedback if needed
Rotary devices
– Mouse, trackball
Linear devices
– Touch pad, touch screen, iPod shuttle, joystick
Rate devices
– Laptop eraserhead

18 Modeling Interaction
[Diagram repeated from slide 9: the human holds mental models of the task and the system, and interacts through sight, sound, hands, and voice; the computer holds software models of the task and the user, and interacts through keyboard, mouse, display, and speaker.]

19 Human Senses
Visual
– Position/motion, color/contrast, symbols
Auditory
– Position/motion, tones/volume, speech
Haptic
– Mechanical, thermal, electrical, kinesthetic
Olfactory
– Smell, taste
Vestibular

20 Computer Output
Image display
– Fixed view, movable view, projection
Acoustic display
– Headphones, speakers, within-ear monitors
Tactile display
– Vibrotactile, pneumatic, piezoelectric
Force feedback
– Dexterous handmaster, joystick, pen

21 Computer Output
Inertial display
– Motion-based simulators
Olfactory display
– Chemical (requires resupply)
Locomotive display
– Stationary bicycle, treadmill, ... (trip hazards)
Temperature display

22 Interaction Styles
Graphical user interfaces (GUI)
– Direct manipulation (2-D, 3-D)
– Menus
Language-based interfaces
– Command-line interfaces
– Interactive voice response systems
Virtual reality (VR)
– Direct manipulation
Ubiquitous computing

23 WIMP Interfaces
Windows
– Spatial context
Icons
– Direct manipulation
Menus
– Hierarchy
Pointing devices
– Spatial interaction

24 GUI Components
Windows (and panels)
– Resize, drag, iconify, scroll, destroy
Selectors
– Menu bars, pulldown lists
Buttons
– Labeled buttons, radio buttons, checkboxes
Icons (images)
– Select, open, drag, group

25 Direct Manipulation
Select a metaphor
– Desktop, CD player, map, …
Use icons to represent conceptual objects
– Watch out for cultural differences
Manipulate those objects with feedback
– Select (left/right/double click), move (drag/drop)

26 Spreadsheets: Direct Manipulation

27 Menus
Conserve screen space by hiding functions
– Menu bar, pop-up
Can be hierarchically structured
– By the application's logic
– By convention (e.g., where is the print function?)
Tradeoff between breadth and depth
– Too deep → things become hard to find
– Too broad → becomes direct manipulation
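The breadth/depth tradeoff can be quantified: with b items visible per menu level, reaching one of n functions takes about log_b(n) levels. A quick sketch (the item counts are invented for illustration):

```python
import math

def menu_depth(n_items, breadth):
    """Smallest number of levels so that breadth ** depth >= n_items."""
    return math.ceil(math.log(n_items) / math.log(breadth))

print(menu_depth(1000, 32))  # broad and shallow: 2 levels
print(menu_depth(1000, 4))   # narrow and deep: 5 levels
```

Doubling breadth shrinks depth only logarithmically, which is why very large function sets strain both ends of the tradeoff.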

28 Dynamic Queries
What to do when menus become too deep
– Merges keyboard and direct manipulation
Select menu items by typing part of a word
– After each letter, update the menu
– Once the word is displayed, the user can click on it
Example: Windows help index
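The letter-by-letter narrowing can be sketched in a few lines of Python (the index entries are invented for illustration):

```python
def narrow(menu, typed):
    """Keep only the entries that start with what the user has typed so far."""
    return [entry for entry in menu if entry.lower().startswith(typed.lower())]

index = ["Print", "Print Preview", "Page Setup", "Paste", "Properties"]
print(narrow(index, "p"))    # all five entries still match
print(narrow(index, "pr"))   # ['Print', 'Print Preview', 'Properties']
print(narrow(index, "pri"))  # ['Print', 'Print Preview']
```

Each keystroke re-filters the list, so the user types only until the target is short enough to click.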

29 Language-Based Interfaces
Command entry
– Compact and flexible
– Powerful in the hands of expert users
– Difficult for novices to learn
Natural language
– Intuitive and expressive
– Ambiguity makes reliable interpretation difficult

30 Ben’s “Seamless Interfaces”
Informative feedback
Easy reversal
User in control
– Anticipatable outcomes
– Explainable results
– Browsable content
Limited working-memory load
– Query context
– Path suspension
Alternatives for novices and experts
– Scaffolding

31 Doug’s “Synergistic Interaction”
Interdependence with process
– Co-design with search strategy, speed
System initiative
– Guided process, exposing the structure of knowledge
Support for reasoning
– Representation of uncertainty
– Meaningful dimensions
Synergy with features used for search
– Weakness of similarity, strength of language
Easily learned
– Familiar metaphors (timelines, ranked lists, maps)

32 Demo: Lighthouse
Language for searching
Language for results (1-D list)
3-D direct manipulation similarity search
Table display for timeline (2-D)

33 Design Critique
www.philipglass.com
http://www.michaelkamen.com/

34 Aural Perception
We respond to sounds without prior focus
– Lack of focus limits simultaneous stimuli
Absolute amplitude and pitch are hard to interpret
– But changes stand out clearly
The stereo effect provides a sense of direction
– Relative amplitude, phase difference

35 Speech Output
Replay of digitized speech clips
– High fidelity, but limited vocabulary
Speech synthesis
– Generates spoken output from unrestricted input, based on pronunciation rules and lists of exceptions
– Sounds unnatural due to misplaced emphasis
Prosody-guided speech synthesis
– Uses the pronunciation of similar words as a guide
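The rules-plus-exceptions idea can be sketched as a toy grapheme-to-phoneme converter. Everything here is illustrative: real synthesizers use far richer rule sets, and the phoneme strings and rule fragments below are made up for the example.

```python
# Exceptions are consulted first; otherwise simple letter-to-sound rules apply.
EXCEPTIONS = {"colonel": "K ER N AH L", "of": "AH V"}   # irregular words
RULES = {"ph": "F", "th": "TH", "c": "K"}               # hypothetical rule fragment

def to_phonemes(word):
    word = word.lower()
    if word in EXCEPTIONS:          # exception list wins over the rules
        return EXCEPTIONS[word]
    out, i = [], 0
    while i < len(word):
        if word[i:i + 2] in RULES:  # two-letter rules first ("ph", "th")
            out.append(RULES[word[i:i + 2]]); i += 2
        elif word[i] in RULES:      # then single-letter rules
            out.append(RULES[word[i]]); i += 1
        else:                       # fall back to the letter itself
            out.append(word[i].upper()); i += 1
    return " ".join(out)

print(to_phonemes("colonel"))  # K ER N AH L (from the exception list)
print(to_phonemes("phone"))    # F O N E (the "ph" rule fires)
```

Words the rules mishandle ("colonel" would otherwise start with "K O L ...") are exactly what the exception list is for.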

36 Auditory Display
Nonspeech audio output for user interfaces
Same objectives as graphical output:
– Alert the user to exceptional conditions
– Provide ubiquitous feedback
– Present information
But different characteristics
– Effective even without focus
– Fairly low resolution

37 Auditory Display Design
Need a metaphor
– Clock ticking, alarm bells, keyboard clicks, etc.
The channel is easily overloaded
– Focus helps manage cognitive load
Changes are more useful than absolute values
– Pitch, amplitude, position, harmonics, etc.

38 An Auditory Image Display
Display 2-D images using only sound
– Sweep from left to right every second
– Audible pause and click between sweeps
– Top pixels are high frequency, bottom are low
Blind users can detect objects and motion
– Time indicates horizontal position
– Pitch indicates vertical position
– Sweep-to-sweep differences indicate motion
http://www.visualprosthesis.com/javoice.htm
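The image-to-sound mapping can be sketched directly: each column of a binary image becomes one time step in the left-to-right sweep, and each lit row contributes a tone whose frequency rises toward the top of the image. The frequency range below is an assumption, not taken from the actual system.

```python
def column_tones(image, f_low=200.0, f_high=2000.0):
    """For each column (time step), list the tone frequencies of the 'on' pixels."""
    n_rows = len(image)
    sweep = []
    for col in range(len(image[0])):
        tones = []
        for row in range(n_rows):
            if image[row][col]:
                # row 0 is the top of the image -> highest frequency
                frac = 1 - row / (n_rows - 1)
                tones.append(f_low + frac * (f_high - f_low))
        sweep.append(tones)
    return sweep

# One bright pixel at top-left, one at bottom-right:
# heard as a high tone early in the sweep, then a low tone late.
img = [[1, 0],
       [0, 0],
       [0, 1]]
print(column_tones(img))  # [[2000.0], [200.0]]
```

Playing each column's tones for 1/width of a second reproduces the one-second sweep the slide describes.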

39 Interactive Voice Response Systems
Operate without graphical interfaces
– Hands-free operation (e.g., driving)
– Telephone access
Built on three technologies
– Speech recognition (input)
– Text-to-speech (output)
– Dialogue management (control)
Example: TellMe (1-800-555-TELL)

40 Dialogue Management
User initiative
System initiative
– Allows a smaller vocabulary
Mixed initiative (e.g., barge-in)

41 Interaction Design
[Dialogue flow diagram for a flight-booking system: the system asks "Where are you departing from?" and "Where do you want to go?" (valid answers: Baltimore, National, Dulles, San Francisco, Oakland, San Jose), then "What day do you want to travel?"; a day with flights leads to Verification and then Confirmed and Goodbye, or "Wrong" restarts; unrecognized input ("Not a day" → "Sorry") re-prompts, and "Anywhere else" / "Another day" loop back to the questions.]
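A system-initiative dialogue like this one is essentially a small state machine: each state asks one question and accepts only a state-specific vocabulary, which is what keeps the recognition problem tractable. A minimal sketch (the function and vocabulary names are illustrative, not from the slides):

```python
# Each state pairs a prompt with the small vocabulary it will accept.
AIRPORTS = {"baltimore", "national", "dulles", "san francisco", "oakland", "san jose"}
DAYS = {"monday", "tuesday", "wednesday", "thursday", "friday", "saturday", "sunday"}

STATES = [
    ("Where are you departing from?", AIRPORTS),
    ("Where do you want to go?", AIRPORTS),
    ("What day do you want to travel?", DAYS),
]

def run_dialogue(answers):
    """Walk the states in order; bail out on out-of-vocabulary input."""
    slots = {}
    for (prompt, vocab), answer in zip(STATES, answers):
        if answer.lower() not in vocab:
            return None, prompt  # a real system would re-prompt here
        slots[prompt] = answer.lower()
    return slots, None

slots, retry = run_dialogue(["Baltimore", "Dulles", "Friday"])
print(slots)  # all three slots filled
```

Because each state accepts at most seven phrases, the recognizer never has to distinguish among the full set of utterances at once.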

42 Evaluation Measures
Time to learn
Speed of performance
Error rate
Retention over time
Subjective satisfaction
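Two of these measures fall directly out of logged user trials. A small sketch with a hypothetical log of (task time in seconds, error count) pairs:

```python
# Hypothetical trial log: one (seconds, errors) pair per completed task.
trials = [(12.4, 0), (9.8, 1), (15.1, 0), (11.0, 2), (10.3, 0)]

speed = sum(t for t, _ in trials) / len(trials)       # mean time per task
error_rate = sum(e for _, e in trials) / len(trials)  # mean errors per task
print(f"speed: {speed:.1f}s/task, error rate: {error_rate:.1f} errors/task")
```

Time to learn and retention need the same measurements repeated across sessions; subjective satisfaction needs a questionnaire rather than a log.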

43 Evaluation Approaches
Extrinsic vs. intrinsic
Formative vs. summative
Human subjects vs. simulated users
Deductive vs. abductive

44 Summary
HCI design starts with user needs and abilities
– Users have a wide range of both
Users must understand their tools
– And those tools can learn about their users!
Many techniques are available
– Direct manipulation, languages, menus, etc.
– Choosing the right technique is important
LBSC 795 in Spring 2006 has this focus

