
1 Input and Output
Mobile and Physical Interaction, WS 2010/2011
Jörg Müller, Michael Nischt (Hans-joerg.mueller@telekom.de)
Deutsche Telekom Laboratories, TU Berlin
The slides for this lecture are based on the lecture "Mobile and Physical Interaction" by Michael Rohs and Georg Essl

2 Review
Organization
History of HCI
Some Research Topics
Exponential developments

3 Preview
Input and output modalities for mobile and physical interaction
Design space of input devices
Text input for mobile devices
Display output and input technologies
Haptics and audio output
Exercise Session
–Simple Computer Vision with Processing and JMyron

4 The Design Space of Input Devices

5 Input Devices and Interaction Techniques
"An input device is a transducer from the physical properties of the world into logical parameters of an application" (Card et al.)
Interaction techniques combine input with feedback
–Control processes generally need a feedback loop

6 Input Devices
Design of human-machine dialogue = design of artificial languages
Input devices enable human-machine dialogues
–Communicative intention → movements → application
Manipulations of input devices as sentences in a formal language
–Primitive moves
–Composition operators to generate sentences of moves

7 Properties of Input Devices
Property sensed (position, motion, force, rotation, etc.)
–Absolute vs. relative sensing (position vs. movement)
–Absolute sensing issue: nulling problem (physical position not in agreement with value set in software)
–Relative sensing issue: clutching
Number of dimensions
–1D, 2D, 3D, 6D
Indirect vs. direct
–Indirect: input space and output space are separate
–Direct: input space = output space
Device acquisition time
Control-to-display (C:D) ratio
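To make the absolute/relative distinction and the C:D ratio concrete, here is a minimal sketch (all names are hypothetical, not from the slides): the absolute mapping shows why the nulling problem appears when software changes the value behind the device's back, and the relative mapping shows how a control-to-display ratio scales reported deltas.

```java
// Minimal sketch (hypothetical names): absolute vs. relative mapping of a 1D input.
public class InputMapping {
    // Absolute sensing: device position maps directly to the application value.
    // Nulling problem: if software changes the value, the physical position no longer agrees.
    static double absolute(double devicePosition, double deviceRange, double appRange) {
        return devicePosition / deviceRange * appRange;
    }

    // Relative sensing: only movement deltas are reported; the C:D ratio scales them.
    // Clutching: the user lifts and repositions the device when it runs out of room.
    static double relative(double currentAppValue, double deviceDelta, double cdRatio) {
        return currentAppValue + deviceDelta / cdRatio; // higher C:D ratio = finer control, more movement
    }

    public static void main(String[] args) {
        System.out.println(absolute(30, 100, 1000)); // device at 30% of its range -> value 300
        System.out.println(relative(300, 5, 2.0));   // move 5 device units at C:D = 2 -> value 302.5
    }
}
```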

8 Language of Input Device Interaction
Modeling the language of input device interaction
–Primitive movement vocabulary
–Set of composition operators
Input device (M, In, S, R, Out, W)
–M: manipulation operator
–In: input domain
–S: current state
–R: resolution function (input-to-output mapping)
–Out: output domain set
–W: work properties (e.g. lag)
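One rough way to make the six-tuple tangible is to encode it as a data structure. The sketch below is an illustrative approximation under that assumption, not Card et al.'s formal notation; field names and the slider example are invented.

```java
// Illustrative sketch of an input device as the tuple (M, In, S, R, Out, W).
import java.util.function.DoubleUnaryOperator;

public class InputDevice {
    String manipulationOperator;    // M: e.g. "linear position", "rotation"
    double[] inputDomain;           // In: range of raw values the device can sense
    double state;                   // S: current state of the device
    DoubleUnaryOperator resolution; // R: resolution function mapping In to Out
    double[] outputDomain;          // Out: domain of values delivered to the application
    double lagMillis;               // W: work properties, e.g. lag

    InputDevice(String m, double[] in, DoubleUnaryOperator r, double[] out, double lag) {
        this.manipulationOperator = m;
        this.inputDomain = in;
        this.resolution = r;
        this.outputDomain = out;
        this.lagMillis = lag;
    }

    double sense(double rawValue) {
        state = rawValue;
        return resolution.applyAsDouble(rawValue); // map the sensed value into the output domain
    }

    public static void main(String[] args) {
        // Example: a 1D slider sensing 0..1023 raw units, mapped linearly to 0..100.
        InputDevice slider = new InputDevice("linear position",
                new double[] {0, 1023}, raw -> raw / 1023.0 * 100.0, new double[] {0, 100}, 5.0);
        System.out.println(slider.sense(512)); // about 50
    }
}
```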

10 Motivation for Design Space
Need for systematization
–Bring order to knowledge about input devices
"Design Space of Input Devices" (Card et al.)
–Framework for describing and analyzing input devices
–Group input devices into families
–Find gaps in the design space
Morphological design space analysis
–Morphé: shape or form (Greek)
–Morphology: the study of the form and structure of things
–Input device designs as points in a design space

11 Generating the Design Space
Primitive movement vocabulary
Composition operators
–Merge composition: cross product
–Layout composition: collocation
–Connect composition: output → input
Design space of input devices
–Possible combinations of composition operators with the primitive vocabulary

12 The Design Space of Input Devices
Set of possible combinations of composition operators with the primitive vocabulary
Composition operators: Merge, Layout, Connect
Where do these devices fit? Touch screen? Keyboard? Trackball? Isometric joystick?

13 The Design Space of Input Devices
More input devices
–absolute position heavily populated
–relative force and relative torque are empty
Touch screen: absolute position (X, Y)
Keyboard: absolute position (Z)
Trackball: relative rotation (rX, rY)
Isometric joystick: absolute force (F)

14 Testing Points in the Design Space
Use the space to evaluate devices
Expressiveness
–"The input conveys exactly and only the intended meaning"
–Problematic if Out and In do not match
Out ⊃ In: can input illegal values
Out ⊂ In: cannot input all legal values
–Cf. 3D position with a mouse
Effectiveness
–"The input conveys the intended meaning with felicity"
–Pointing speed: device might be slower than unaided hand
–Pointing precision: convenient selection of small targets
–Errors
–Time to learn
–Time to grasp the device

15 Bandwidth
Speed of use depends on
–Human: bandwidth of the muscle group to which the input device attaches
–Application: precision requirements of the task
–Device: effective bandwidth of the input device

16 Mobile Text Entry

17 Text Entry on Mobile Devices
Mobile text entry
–SMS (>1 billion SMS messages sent each day)
–Email, calendars, etc.
Small devices require alternative input methods
–Smaller keyboards, stylus input, finger input, gestures
Many text entry methods exist
–Companies are ambitiously searching for improvements
Categories: key-based, finger-based, stylus-based, tilt-based

18 Text Entry Speed on Mobile Devices
Goal: high-speed entry at low error rates
–Movement minimization
–Low attention demand
–Low cognitive demand
Entry speeds depend on task type and practice
Typical text entry speeds
–Desktop touch typing: 60+ words per minute (wpm)
–Soft keyboards: 40+ wpm after lots of practice, typically 18-28 wpm for qwerty, 5-7 wpm for an unfamiliar layout
–Handwriting: 13-22 wpm

19 Keyboard Layouts for Mobile Devices
Qwerty variations
–Designed to be slow
–Prevent typewriters from jamming: letters that are frequently typed in sequence alternate between sides of the keyboard

20 Dvorak Keyboard
Speed typing by
–Maximizing use of the home row (where the fingers rest)
–Alternate-hand typing
Most frequent letters and digraphs easiest to type

21 Fitaly and Opti Keyboards
Designed for stylus input on soft (on-screen) keyboards
Minimize stylus movement during text entry
(Figure: stylus movement for entering the ten most and least frequent digrams)

22 Half-Qwerty and ABC Keyboards
Half-qwerty
–One-handed operation
–30 wpm
ABC keyboards
–Familiar arrangement
–Non-qwerty shape

23 Very Small Devices
5 keys (e.g., pager)
3 keys (e.g., watch)

24 Keyboards and Ambiguity
Keyboard miniaturization: smaller keys, fewer keys
Unambiguous keyboards
–One key, one character
Ambiguous keyboards
–One key, many characters
–Disambiguation methods (manually driven, semiautomatic)
Ambiguity continuum: 1, 3, 5, 12, >26 keys

25 Ambiguity
Ambiguity occurs if there are fewer keys than symbols in the language
Disambiguation needed to select the intended letter from the possibilities
Typical example: phone keypad, where one key sequence can match several words (e.g. RUNNER, SUMMER, STONES)

26 Unambiguous Keyboards
One key, one character
Fastap keyboard
–Extra keys in the space between keys
–9.3 wpm

27 Ambiguous Keyboards
One key, many characters
Standard 12-button phone keyboard, larger variants
Blackberry 7100
Nokia N73
Twiddler (chord keyboard)

28 Manual Disambiguation
Consecutive disambiguation
–Press key, then disambiguate (multitap)
Concurrent disambiguation
–Disambiguate while pressing the key (via tilting or a chord)
Multitap
–More keystrokes
–Disambiguating presses on the same key (timeout or timeout kill)
Tilting
–Tilt in a certain direction while pressing
Chord keyboard on rear of device
–Not picked up by device manufacturers

29 Disambiguation by Multitap
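As a concrete illustration of consecutive (multitap) disambiguation, here is a hedged sketch of a decoder: repeated presses of the same key cycle through its letters, and a letter is committed when a different key is pressed or the timeout expires. The keypad mapping and the timeout value are assumptions for illustration, not a specific phone's firmware.

```java
// Sketch of a multitap decoder: repeated presses of one key cycle through its letters;
// a timeout (or a press on a different key) commits the current letter.
import java.util.HashMap;
import java.util.Map;

public class Multitap {
    static final Map<Character, String> KEYS = new HashMap<>();
    static {
        KEYS.put('2', "abc"); KEYS.put('3', "def"); KEYS.put('4', "ghi");
        KEYS.put('5', "jkl"); KEYS.put('6', "mno"); KEYS.put('7', "pqrs");
        KEYS.put('8', "tuv"); KEYS.put('9', "wxyz");
    }
    static final long TIMEOUT_MS = 1000;   // assumed timeout before a press starts a new letter

    private final StringBuilder text = new StringBuilder();
    private char lastKey = 0;
    private int cycle = 0;
    private long lastPressTime = Long.MIN_VALUE;

    void press(char key, long timeMillis) {
        boolean sameKeyWithinTimeout =
                key == lastKey && (timeMillis - lastPressTime) < TIMEOUT_MS;
        if (sameKeyWithinTimeout) {
            cycle = (cycle + 1) % KEYS.get(key).length();            // cycle through the key's letters
            text.setCharAt(text.length() - 1, KEYS.get(key).charAt(cycle));
        } else {
            cycle = 0;                                               // previous letter is committed
            text.append(KEYS.get(key).charAt(0));
        }
        lastKey = key;
        lastPressTime = timeMillis;
    }

    public static void main(String[] args) {
        Multitap m = new Multitap();
        // "hi": h = key 4 pressed twice, i = key 4 pressed three times after the timeout
        m.press('4', 0); m.press('4', 200);
        m.press('4', 1500); m.press('4', 1700); m.press('4', 1900);
        System.out.println(m.text);   // prints "hi"
    }
}
```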

30 Partridge: TiltType
Text input method for watches or pagers
Press and hold a button while tilting the device
9 tilting directions (corners + edges)
Buttons select the character set
Kurt Partridge et al.: TiltType: Accelerometer-Supported Text Entry for Very Small Devices. UIST 2002 technote. portolano.cs.washington.edu/projects/tilttype

31 Dictionary-Based Disambiguation (T9)
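The following sketch illustrates the idea behind dictionary-based disambiguation; it is not the actual T9 implementation, and the tiny word list is invented. Each dictionary word is mapped to its keypad digit sequence, and a lookup returns all candidates for an entered sequence. The ambiguous example from the slides works here: 786637 matches RUNNER, SUMMER, and STONES.

```java
// Sketch of dictionary-based disambiguation: map every dictionary word to its
// digit sequence on a phone keypad, then look up candidates for an entered sequence.
import java.util.*;

public class DictionaryDisambiguation {
    static final String[] KEYPAD = { "", "", "abc", "def", "ghi", "jkl", "mno", "pqrs", "tuv", "wxyz" };

    static String toDigits(String word) {
        StringBuilder digits = new StringBuilder();
        for (char c : word.toLowerCase().toCharArray()) {
            for (int key = 2; key <= 9; key++) {
                if (KEYPAD[key].indexOf(c) >= 0) { digits.append(key); break; }
            }
        }
        return digits.toString();
    }

    public static void main(String[] args) {
        List<String> dictionary = Arrays.asList("runner", "summer", "stones", "hello", "world");
        Map<String, List<String>> index = new HashMap<>();
        for (String word : dictionary) {
            index.computeIfAbsent(toDigits(word), k -> new ArrayList<>()).add(word);
        }
        // All three words share the key sequence 786637 and must be disambiguated.
        System.out.println(index.get("786637"));   // [runner, summer, stones]
    }
}
```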

32 Simplified Handwriting: Unistroke
Single-stroke handwriting recognition
–Each letter is a single stroke, simple recognition
–Users have to learn the strokes
–"Graffiti" intuitive unistroke alphabet (5 min practice: 97% accuracy)
Slow (15 wpm)
Users have to attend to and respond to the recognition process
Recognition constrains variability of writing styles
Requires considerable processing power

33 Unipad: Language-Based Acceleration for Unistrokes
Speeding up stylus-based text entry
–Visual feedback required for handwriting
–Eyes-free entry possible for unistrokes: look at suggestions during eyes-free unistroke entry
Language-based acceleration techniques
–Word completion list based on a corpus (word, frequency): tap a candidate
–Suffix completion based on a suffix list ("ing", "ness", "ly", etc.): top-left to bottom-right stroke, then tap a suffix
–Frequent word prompting ("for", "the", "you", "and", etc.): tap the frequent word
MacKenzie, Chen, Oniszczak: Unipad: Single-stroke text entry with language-based acceleration. NordiCHI 2006.

34 Unipad: Acceleration by Word Completion
Word completion example
–User is entering "hours"
–State after two strokes ("ho")
Experimental interface
–First line shows the text to enter
–Second line shows the text already entered
–Pad below: entering strokes, word completion list
MacKenzie, Chen, Oniszczak: Unipad: Single-stroke text entry with language-based acceleration. NordiCHI 2006. http://www.yorku.ca/mack/nordichi2006.html
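As a rough approximation of the word-completion step, the sketch below ranks dictionary words that start with the entered prefix by corpus frequency. The corpus, frequencies, and list length are invented for illustration; Unipad's actual corpus and ranking are described in the cited paper.

```java
// Sketch of prefix-based word completion ranked by corpus frequency (word, frequency pairs).
import java.util.*;
import java.util.stream.Collectors;

public class WordCompletion {
    static List<String> complete(String prefix, Map<String, Integer> corpus, int listSize) {
        return corpus.entrySet().stream()
                .filter(e -> e.getKey().startsWith(prefix))
                .sorted((a, b) -> b.getValue() - a.getValue())   // most frequent first
                .limit(listSize)
                .map(Map.Entry::getKey)
                .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        Map<String, Integer> corpus = Map.of(
                "hours", 120, "house", 300, "hot", 200, "home", 250, "horse", 80);
        // After the strokes "ho" the completion list might show:
        System.out.println(complete("ho", corpus, 4));   // [house, home, hot, hours]
    }
}
```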

35 Unipad: Acceleration by Frequent Word
Frequent word example
–User is about to enter "of"
–Pad shows the frequent word list
–User taps "of"
MacKenzie, Chen, Oniszczak: Unipad: Single-stroke text entry with language-based acceleration. NordiCHI 2006. http://www.yorku.ca/mack/nordichi2006.html

36 Unipad: Acceleration by Suffix Completion
Suffix completion example
–User is entering "parking"
–State after 3 strokes ("par"): pad shows the word completion list (incl. "park")
–User enters a top-left to bottom-right stroke on "park": pad shows the suffix list
–User taps "ing"
MacKenzie, Chen, Oniszczak: Unipad: Single-stroke text entry with language-based acceleration. NordiCHI 2006. http://www.yorku.ca/mack/nordichi2006.html

37 Unipad: Performance
Entry speed >40 wpm possible
–KSPC ≈ 0.5 (keystrokes per character)
Expert performance simulated on the sentence "the quick brown fox jumps over the lazy dog" (43 chars, 27 strokes)
MacKenzie, Chen, Oniszczak: Unipad: Single-stroke text entry with language-based acceleration. NordiCHI 2006. http://www.yorku.ca/mack/nordichi2006.html

38 EdgeWrite
Provides physical constraints
Moving a stylus along the edges and diagonals of a square input area
Designed for people with motor impairments
Input = sequence of visited corners
Example: digits
Wobbrock, Myers, Kembel: EdgeWrite: a stylus-based text entry method designed for high accuracy and stability of motion. UIST '03. http://depts.washington.edu/ewrite/
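A hedged sketch of the core idea follows: a character is recognized purely from the sequence of corners the stylus visits, looked up in a table. The corner codes and the mini-alphabet below are made up for illustration and are not the actual EdgeWrite alphabet.

```java
// Sketch of EdgeWrite-style recognition: a character is identified purely by the
// sequence of square corners the stylus visits (corner codes and alphabet are illustrative).
import java.util.HashMap;
import java.util.Map;

public class EdgeWriteSketch {
    // Corners: 1 = top-left, 2 = top-right, 3 = bottom-left, 4 = bottom-right.
    static final Map<String, Character> ALPHABET = new HashMap<>();
    static {
        ALPHABET.put("1234", 'z');   // illustrative entries only, not the real EdgeWrite alphabet
        ALPHABET.put("2143", 's');
        ALPHABET.put("3124", 'a');
    }

    static char recognize(int[] visitedCorners) {
        StringBuilder key = new StringBuilder();
        for (int corner : visitedCorners) key.append(corner);
        return ALPHABET.getOrDefault(key.toString(), '?');   // '?' = unknown sequence
    }

    public static void main(String[] args) {
        System.out.println(recognize(new int[] {1, 2, 3, 4}));   // 'z'
    }
}
```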

39 Quikwriting: Gesture-Based Input
Combines visual keyboards with stylus movements
Motor memory for paths
Follow a path through the letters of the word to enter
Reduced stress and fatigue compared to continuous tapping
Ken Perlin: Quikwriting: Continuous Stylus-based Text Entry. UIST '98. http://mrl.nyu.edu/~perlin/demos/quikwriting.html

40 Display and Touch Screen Technologies

41 Light Emitting Diodes (LED)
Cheap
Fairly high power consumption when on
Use outside of the visible range
–Infrared
–Ultraviolet

42 Liquid Crystal Display (LCD)
An LCD cell is a voltage-controlled "light valve"
Twisted nematic effect
–Liquid crystal molecules form a helix structure between the electrodes
–Electric field aligns the molecules (controls the amount of twisting)
–Orientation of the molecules controls the orientation of the polarized light
(Figure: LCD cell in off and on state)

43 Plasma Display
Emits light, like a fluorescent lamp
UV light hits phosphor, creating visible light
Very large implementations
High power consumption (depends on image)

44 Organic Light Emitting Diode (OLED)
Emits light (like an LED)
Very thin, bendable, can be printed on foil
High contrast
Aging effect

45 E-Paper
Reflective display (the more sun shining on it, the better)
Only consumes power while changing content
No large displays available yet
Slow update rate

46 Projection
Back- or front-projection
Robust and cheap for large displays
Difficult with ambient lighting
Technologies: LCD, DLP, LED (uses DLP), LCoS (like LCD)

47 Laser Projection
Nice for large distances
High contrast
Expensive
May be dangerous

48 Touch Screens
Resistive
Capacitive
–iPhone
Surface Acoustic Wave (SAW)
Light Array
Frustrated Total Internal Reflection (FTIR)
–Jeff Han's multitouch table
Diffuse Illumination

49 Resistive Touch Screens
Layers (top to bottom):
1. Polyester film
2. Upper resistive circuit layer
3. Conductive ITO (transparent metal coating)
4. Lower resistive circuit layer
5. Insulating dots
6. Glass/acrylic substrate
Touching the overlay surface causes the upper resistive circuit layer (2) to contact the lower resistive circuit layer (4), producing a circuit switch at the activated area.
The touchscreen controller reads the alternating voltages between the two circuit layers and converts them into the digital X and Y coordinates of the activated area.
(www.fastpoint.com)
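To illustrate the controller step described above, here is a minimal sketch that converts raw voltage readings from the two layers into screen coordinates with a simple linear calibration. The calibration constants, screen size, and ADC ranges are assumed example values, not those of any particular controller.

```java
// Sketch of the controller step: convert raw ADC readings from the two resistive
// layers into screen coordinates using simple linear calibration (values assumed).
public class ResistiveController {
    // Calibration: raw ADC values measured at the screen corners (hypothetical numbers).
    static final int RAW_X_MIN = 150, RAW_X_MAX = 3900;
    static final int RAW_Y_MIN = 180, RAW_Y_MAX = 3850;
    static final int SCREEN_W = 320, SCREEN_H = 480;

    static int[] toScreen(int rawX, int rawY) {
        int x = (rawX - RAW_X_MIN) * (SCREEN_W - 1) / (RAW_X_MAX - RAW_X_MIN);
        int y = (rawY - RAW_Y_MIN) * (SCREEN_H - 1) / (RAW_Y_MAX - RAW_Y_MIN);
        // Clamp to the visible area in case of noise at the edges.
        x = Math.max(0, Math.min(SCREEN_W - 1, x));
        y = Math.max(0, Math.min(SCREEN_H - 1, y));
        return new int[] { x, y };
    }

    public static void main(String[] args) {
        int[] p = toScreen(2000, 2000);
        System.out.println(p[0] + ", " + p[1]);   // roughly the middle of the screen
    }
}
```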

50 Capacitive Touch Screens
Senses capacitive changes
–Only works with a finger, not with a stylus
iPhone
–Uses an additional grid for better multitouch disambiguation
(www.unwiredview.com)

51 Surface Acoustic Wave (SAW) Screens
Sense diffracted surface waves from the interaction point
–Works on glass
–Robust: used for public space displays
(www.fastpoint.com)

52 Light Array Screens
Light array senses disruption in intensity
–Translucent
–Costly
–Low resolution (grid bound)
(www.unwiredview.com) (www.actkern.info)

53 Digital Vision Touch (DViT, SMART Technologies)
Cameras in the corners
–IR LEDs in the frame
–Up to 2 touch points
(www.smart-technologies.com)

54 FTIR (Frustrated Total Internal Reflection)
Infrared light is totally reflected inside acrylic glass
Light scatters out where a finger touches the glass
Camera captures the scattered light
Can track multiple fingers
No identity for multiple users
No hovering
http://multi-touchscreen.com/
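As an illustration of what the camera-processing step might look like, the sketch below thresholds an IR frame and reports the centroid of the bright pixels. This is a simplification under stated assumptions; a real FTIR tracker would segment and track multiple blobs (e.g. with connected-component labeling) rather than compute a single centroid.

```java
// Simplified sketch of FTIR blob detection: threshold the IR camera frame and
// compute the centroid of bright pixels (a real tracker would segment multiple blobs).
public class FtirCentroid {
    static double[] centroid(int[][] frame, int threshold) {
        long sumX = 0, sumY = 0, count = 0;
        for (int y = 0; y < frame.length; y++) {
            for (int x = 0; x < frame[y].length; x++) {
                if (frame[y][x] > threshold) {   // pixel lit by light frustrated at a touch point
                    sumX += x; sumY += y; count++;
                }
            }
        }
        if (count == 0) return null;             // no touch detected
        return new double[] { (double) sumX / count, (double) sumY / count };
    }

    public static void main(String[] args) {
        int[][] frame = new int[8][8];
        frame[3][4] = 255; frame[3][5] = 255; frame[4][4] = 255;   // a small bright blob
        double[] c = centroid(frame, 128);
        System.out.println(c[0] + ", " + c[1]);   // approx. 4.33, 3.33
    }
}
```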

55 Rear Diffuse Illumination
Illumination (IR) and camera from behind
–Can detect hover
(wiki.nuigroup.com)

56 Front Diffuse Illumination
Illumination (IR) from the front, camera from behind
–Can detect hover
(wiki.nuigroup.com)

57 Computer Vision
Frontal cameras
Ceiling-mounted cameras
Stereo cameras
Time-of-flight cameras

58 Frontal Cameras
Usually installed above or below the display
Can be used for face detection or movement detection; background subtraction or color tracking also possible
Problems with field of view and occlusions
Difficult to find the position of people

59 Ceiling-Mounted Cameras
Installed on the ceiling above the display
Often used for tracking the position of people
Fewer problems with occlusion, but no view of the face
Often combined with frontal cameras

60 Stereo Cameras
Two calibrated cameras to compute a depth field
Only used when good texture is available
Computationally intensive, relatively difficult to do
Problems with reflections

61 Time-of-Flight Cameras
Illuminate the scene with modulated infrared light
Use the phase shift of the reflected light to calculate the distance of the reflecting object
Calculates a distance for each pixel
Generates a 2½D point cloud
Currently strong noise and occlusion problems
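To make the phase-shift principle concrete, here is a minimal sketch computing per-pixel distance as d = c·Δφ / (4π·f_mod); the 20 MHz modulation frequency is just an example value, not a property of any specific camera.

```java
// Sketch of time-of-flight depth from phase shift: d = c * deltaPhi / (4 * pi * f_mod).
// The factor 4*pi (instead of 2*pi) accounts for the light travelling to the object and back.
public class TimeOfFlight {
    static final double C = 299_792_458.0;          // speed of light in m/s

    static double distanceMeters(double phaseShiftRad, double modulationFrequencyHz) {
        return C * phaseShiftRad / (4 * Math.PI * modulationFrequencyHz);
    }

    public static void main(String[] args) {
        double fMod = 20e6;                          // example: 20 MHz modulation
        // A phase shift of pi/2 at 20 MHz corresponds to about 1.87 m.
        System.out.println(distanceMeters(Math.PI / 2, fMod));
        // Unambiguous range at this frequency: c / (2 * f_mod), about 7.5 m.
    }
}
```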

62 Audio and Haptics

63 Multimodality
Involve different senses through different modalities
–Audio, tactile
–Suit different users, tasks, and contexts
Problems of the visual modality
–Screen space is small
–Eyes are heavily used when mobile
Interaction problems
–A particular modality may not always be available
–User involved in other tasks → attention may be occupied
–Contexts may vary
–Sole use of one modality is not effective

64 Multimodal Interaction
Allow people to do everyday tasks while using mobile technology
–"Eyes-free" or "hands-free"
Interaction techniques that suit real environments
–Non-speech sounds + tactile feedback
–Sensors for gestural input
–Speech input

65 Audio: Loudspeakers
Electromagnetic induction
–A coil is moved by changing currents, which in turn moves a membrane
Piezoelectric speakers

66 Non-Speech Audio
Earcons (Blattner)
–Musically structured sounds (abstract)
Auditory Icons (Gaver)
–Natural, everyday sounds (representational)
Sonification
–Visualization using sound
–Mapping data parameters to audio parameters (abstract)

67 Earcons
Structured audio messages based on abstract sounds
–Created by manipulation of sound properties: timbre, rhythm, pitch, tempo, spatial location (stereo, 3D sound), etc.
Composed of motives
Can be compound
–Sub-units combined to make messages
Or hierarchical
–Sounds manipulated to make complex structures
Examples (Open, Close, Undo): http://www.dcs.gla.ac.uk/~stephen/earconexperiment1/earcon_expts_1.shtml

68 Auditory Icons
Sounds mapped to interface events by analogy to everyday sound-producing events
–E.g., selecting: tapping; copying: filling
–Iconic vs. symbolic mapping
Auditory icons can be parameterized
–E.g. material for type, loudness for size
–Multiple layers of information in single sounds
–Reduces repetition and annoyance
The SonicFinder
–Selecting, copying, dragging
Gaver, W. W. (1986). Auditory icons: Using sound in computer interfaces. Human-Computer Interaction, 2, 167-177.

69 Sonification
Mapping of data values to auditory parameters
Most commonly: x-axis to time, y-axis to pitch
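A minimal sketch of this most common mapping follows: sample index becomes time, data value becomes pitch. The 220-880 Hz range and the logarithmic mapping are arbitrary illustrative choices, and the example only prints the frequencies rather than synthesizing audio.

```java
// Sketch of basic sonification: each data point becomes a tone, x -> time, y -> pitch.
// Pitch range and mapping are arbitrary choices for illustration.
public class Sonification {
    static double toFrequency(double value, double min, double max) {
        double low = 220.0, high = 880.0;                 // map the data range to 220-880 Hz
        double normalized = (value - min) / (max - min);
        return low * Math.pow(high / low, normalized);    // logarithmic pitch mapping
    }

    public static void main(String[] args) {
        double[] data = { 0.1, 0.4, 0.9, 0.6, 0.2 };
        for (int i = 0; i < data.length; i++) {
            double freq = toFrequency(data[i], 0.0, 1.0);
            // x-axis -> time: tones would be played one after another (here we only print them).
            System.out.printf("t=%d: %.1f Hz%n", i, freq);
        }
    }
}
```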

70 Sonification of Luminance Histograms in Digital Cameras
Difficult to focus visual attention on both the subject and the technical parameters
–Exposure, aperture, battery life, image mode, etc.
Idea: sonified luminance histogram
Sonification of remaining memory space
Brewster and Johnston: Multimodal Interfaces for Camera Phones. MobileHCI 2008.

71 3D Audio Interaction
Increase the audio display space
3D audio
–"Cocktail party effect"
–Provides a larger display area
–Monitor more sound sources
"Audio Windows" (Cohen)
–Each application gets its own part of the audio space
Pie Menus (Brewster, CHI '03; Marentakis, CHI '06)
–Audio items placed around the head
Cohen: Integrating graphics and audio windows. Presence: Teleoperators and Virtual Environments 1(4), Oct. 1992, 468-481.
Brewster, Lumsden, Bell, Hall, Tasker: Multimodal 'eyes-free' interaction techniques for wearable devices. CHI '03.
Marentakis, Brewster: Effects of feedback, mobility and index of difficulty on deictic spatial audio target acquisition in the horizontal plane. CHI '06.

72 Haptics
Definition: sense and/or motor activity based in the skin, muscles, joints, and tendons
Two parts
–Kinaesthesis: sense and motor activity based in the muscles, joints, and tendons
–Touch: sense based on receptors in the skin
Tactile: mechanical stimulation of the skin

73 Why Haptic Interaction?
Has benefits over visual display
–Eyes-free
Has benefits over audio display
–Personal, not public
–Only the receiver knows there has been a message
People have a tactile display with them all the time
–Mobile phone

74 Tactile Technologies
Vibration motor with asymmetric weight (e.g. PMD 310)
–Eccentricity induces vibrations
–Speed controls vibration frequency
–Precision limited (several ms startup time)

75 Design of Tactons
Tactons = tactile icons
–Structured, abstract messages that can be used to communicate non-visually (Brown, 2005)
–Tactile equivalent to Earcons
Encode information using parameters of cutaneous perception
–Body location
–Rhythm
–Duration
–Waveform
–Intensity

76 Tacton Parameters
Spatial location (on forearm, waist, hand) very effective
–Good performance with up to 4 locations
–Wrist and ankle less effective, especially when mobile

77 Tacton Parameters
Rhythm very effective
–Easily identified with three levels
Waveform
–Carefully designed sine, square, and sawtooth waveforms very effective (tuned to the capabilities of the actuator)
Intensity
–Two levels
–Hard to use and may need to be controlled by the user

78 Example: Tactile Button Feedback
Touchscreen phones have no tactile feedback for buttons
–More errors when typing text and numbers
Performance comparison of physical buttons, touchscreen, and touchscreen + tactile feedback
–In the lab and on the subway
Touchscreen + tactile was as good as physical buttons
–Touchscreen alone was poor
Brewster, Chohan, Brown: Tactile feedback for mobile interactions. CHI '07.

79 Example: Tactile Navigation
Non-visual interface for GPS + compass
Belt of 4 actuators
–Placed north, south, east, west
Vibrations gave direction and distance
Users could follow paths accurately without a screen
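As a hedged sketch of the mapping just described, the code below picks one of the four belt actuators from the bearing to the next waypoint and scales the vibration pulse interval with distance. The bearing thresholds and interval values are invented for illustration, not taken from the study.

```java
// Sketch of belt-based tactile navigation: choose the actuator (N/E/S/W) closest to the
// bearing of the next waypoint and encode distance in the vibration pulse interval.
public class TactileNavigation {
    enum Actuator { NORTH, EAST, SOUTH, WEST }

    static Actuator forBearing(double bearingDegrees) {
        double b = ((bearingDegrees % 360) + 360) % 360;   // normalize to 0..360
        if (b >= 315 || b < 45)  return Actuator.NORTH;
        if (b < 135)             return Actuator.EAST;
        if (b < 225)             return Actuator.SOUTH;
        return Actuator.WEST;
    }

    // Closer waypoints vibrate more often (interval values are illustrative).
    static long pulseIntervalMillis(double distanceMeters) {
        if (distanceMeters < 10)  return 250;
        if (distanceMeters < 50)  return 1000;
        return 3000;
    }

    public static void main(String[] args) {
        System.out.println(forBearing(80) + " every " + pulseIntervalMillis(30) + " ms");
        // prints: EAST every 1000 ms
    }
}
```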

80 Exercise

