
1 designing easily learnable eyes-free interaction Kevin Li University of California, San Diego

2 why eyes free?

3 PCs… PC screens have the user's undivided attention → design for the visual channel

4 environment

5 screen-less device

6 visual impairment

7 socially unacceptable

8 can’t see screen

9 I propose that the way to enable easily learnable eyes-free interaction is to leverage pre-learned associations

10 current eyes-free interaction

sense   | stimulus class | example project
--------|----------------|-------------------------------------
touch   | human          | Tapping and Rubbing
touch   | object         | Simulating buttons
hearing | music          | PeopleTones
hearing | speech         | Mapping language to tactile stimulus
hearing | sounds         | PeopleTones

→ lessons for application designers

11 auditory feedback

12 calendar preview “Monday 9am” “tic, tic, sssssh” “How about Monday morning?” “Yeah, looks like I’m free after 10” blindSight [Li et al. 2008]

13 tradeoffs

Pros
– Easily learnable
– Can convey lots of information
Doesn't always work
– Requires direct interaction with device
– Not usable in all scenarios (concerts, meetings, driving)

14 what about touch?

15 binary is easy

16 multidimensional tactons [Brown et al. 2006]

17 anything can be learned [Geldard 1956]

Vibratese language: communicate via vibrations
– Map alphabet and digits to 5 vibrators
– User with 65 hours of training → 90% accuracy, 38 wpm
– System capable of delivering at 67 wpm

18 vibratese language

19 human-to-human interaction

20 Tapping and Rubbing

21 research questions

Can we create tactile stimuli that feel like tapping or rubbing? Will people associate them with what they already associate with tapping and rubbing from human-human communication?

22 Tapping Rubbing [Li et al. in submission]

23 soundTouch

24 what feels like a finger?

25

26 applications

27 messaging

28 in-car navigation

29 gaming

30 questionnaire

– How would you describe the tactile sensations you just experienced to someone who has not experienced them?
– Which aspects of the experience felt natural and which aspects did not?
– If your phone could generate these types of sensations, what would you like to use them for?

31 results

Tapping has a human quality
– 13 of 16 used the word "tap" in their description
– 12 of 16 volunteered that it had a human quality
Fast and slow are perceptually different
– 12 participants mentioned harder taps don't feel natural
– 5 said fast ones don't feel natural

32 lessons

Taps have a number of characteristics that make them good for alerts
– Quiet
– Strong
Rubbing is more subtle
– Useful for in-the-hand scenarios
Number of taps and rubs is a key element
– Sometimes this has pre-learned meaning
– Limits the number of viable distinct icons

33 mapping music to vibrations

34 PeopleTones: Buddy Proximity Notification

35 PeopleTones [Li et al. 2008]

– Only two states: nearby and far away
– When a buddy is near, play their song
– If the phone is in vibrate mode, play a matching vibrotactile sequence
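The cue logic described on this slide amounts to a small state check. A minimal sketch, assuming hypothetical names (`choose_cue` and its parameters are illustrative, not PeopleTones source code):

```python
# Hypothetical sketch of the PeopleTones cue logic: two proximity
# states, a song cue when a buddy becomes nearby, and a matching
# vibrotactile cue when the phone is in vibrate mode.

def choose_cue(buddy_nearby: bool, was_nearby: bool, vibrate_mode: bool):
    """Return the cue to play on a far -> nearby transition, else None."""
    if buddy_nearby and not was_nearby:
        return "vibrotactile sequence" if vibrate_mode else "buddy's song"
    return None

# A buddy just came into range while the phone is in vibrate mode:
print(choose_cue(True, False, True))   # vibrotactile sequence
```

Note that a cue fires only on the far-to-nearby transition, so a buddy who stays nearby does not retrigger it.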

36 measuring vibrations

37 zzz z z z

38 capturing the essence of music

39 high-level approach

– Just using beat doesn't always work
– Mapping lyrics doesn't work well

40 remove noise

– Isolate 6.6 kHz to 17.6 kHz components using an 8th-order Butterworth filter
– Use an amplitude threshold to keep only components greater than the average

41 take running sum

– Take the running sum of the absolute value, generating one value every 20 ms
– This keeps the length consistent

42 exaggerate features

Compose the output from the previous step with the power function A·xⁿ, where x is the sample and A and n are constants, 10 ≤ A < 15, 1 ≤ n ≤ 2.
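The three steps on slides 40–42 can be sketched in a few lines of pure Python. This is a minimal sketch, not the actual implementation: the Butterworth band-pass stage is omitted (assume `samples` already holds band-limited audio at `rate` Hz), and the function and parameter names are assumptions.

```python
# Illustrative sketch of the music-to-vibration pipeline above:
# amplitude threshold, running sum per 20 ms window, power function.

def music_to_vibration(samples, rate=8000, A=12.0, n=1.5):
    # 1. Amplitude threshold: keep only components whose magnitude
    #    exceeds the average magnitude.
    mags = [abs(s) for s in samples]
    avg = sum(mags) / len(mags)
    kept = [m if m > avg else 0.0 for m in mags]

    # 2. Running sum of absolute values, emitting one value per 20 ms
    #    window, which keeps the output length consistent.
    win = max(1, int(rate * 0.020))
    sums = [sum(kept[i:i + win]) for i in range(0, len(kept), win)]

    # 3. Exaggerate features with the power function A * x**n,
    #    with 10 <= A < 15 and 1 <= n <= 2 as on the slide.
    return [A * (x ** n) for x in sums]
```

With n > 1 the power function stretches the dynamic range, so loud passages map to noticeably stronger vibration than quiet ones.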

43 field study

44 participants 3 groups of friends, 2 weeks

45 did you act on the cue?

46 could you tell who it was?

47 lessons

– Cues in the wild should be music
– Higher comprehension rate when users select their own cues
– Obtrusiveness of music cues was not a concern → longer cues
– Semantic association is key

48 mapping physical objects to tactile feedback

49 problem

Tactile feedback is always the same, but visual and motor feedback have a directional aspect; information is lost in the conversion.

50 solution

Add state information to tactile feedback
– Hover state
– Moved to the left/right/up/down
Where do we put it?
– Under the button
– Make the sides move
– Tapping?
– Solenoid
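The idea on this slide is to give each pointer state its own distinct tactile pattern so directional information survives the conversion. A toy sketch, where the event names and pulse encodings are invented for illustration:

```python
# Hypothetical mapping from pointer state to a distinct pulse pattern
# (1 = pulse, 0 = pause). Encodings are illustrative, not the actual
# solenoid driver.

PATTERNS = {
    "hover":       (1,),
    "moved_left":  (1, 0, 1),
    "moved_right": (1, 1),
    "moved_up":    (1, 0, 0, 1),
    "moved_down":  (1, 1, 1),
}

def feedback_for(event: str):
    """Return the pulse pattern for a pointer event, or None if unknown."""
    return PATTERNS.get(event)

print(feedback_for("moved_left"))  # (1, 0, 1)
```

The design requirement is only that the patterns be mutually distinguishable; the specific encodings would come from user testing.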

51 mapping speech to touch

52 research questions what are relevant characteristics of speech when mapping to tactile? how do users naturally perceive these to be mapped?

53 pilot study

– 5 common phrases from the text-messaging literature
– 4 vibration sequences per phrase → 20 sequences
– Which phrase does this vibration feel like?

54 5 phrases Hello. Goodbye. Where are you? Are you busy? I miss you.

55 participants agree…

56 lessons

– Intonation is important
– Number of pulses should match number of syllables
– Duration should match (roughly)
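The second and third lessons can be sketched as a simple mapping: one pulse per syllable, with pulse lengths chosen so the sequence roughly matches the spoken duration. The gap and minimum-pulse constants here are assumptions, not values from the pilot study:

```python
# Illustrative sketch: one vibration pulse per syllable, sized so the
# total sequence duration roughly matches the spoken phrase.

def phrase_to_pulses(syllables: int, spoken_ms: int):
    """Return a list of per-pulse durations (ms), one pulse per syllable."""
    gap_ms = 80                                      # assumed inter-pulse gap
    total_gap = gap_ms * (syllables - 1)
    pulse_ms = max(40, (spoken_ms - total_gap) // syllables)
    return [pulse_ms] * syllables

# "Where are you?" -- three syllables, roughly 900 ms spoken:
print(phrase_to_pulses(3, 900))  # [246, 246, 246]
```

This captures pulse count and duration but not intonation; encoding intonation would require varying pulse intensity or rhythm as well.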

57 potential applications learning to sign augmented sms messages messaging backchannel

58 thesis summary

59 touch current eyes-free interaction humanobject hearing musicspeechsounds Tapping and Rubbing Simulating buttons People Tones People Tones Mapping language to tactile stimulus class sense example project People Tones People Tones lessons for application designers

60 timeline

61 2008 timeline (Jan–Dec)

Paper deadlines: CHI Doctoral Consortium, UbiComp Doctoral Colloquium, UIST Doctoral Symposium
Projects: Mapping language to tactile; Simulating buttons

62 2009 timeline (Jan–Dec)

Paper deadlines: CHI, UIST
Projects: Mapping language to tactile, part 2: field deployment; Tapping and Rubbing, part 2; find a job

63 2010 timeline (Jan–Dec)

Projects: Formalize music mapping; dissertation; find a job

64 designing easily learnable eyes-free interaction Kevin Li University of California, San Diego

65

66 EXTRA SLIDES

67 Please listen carefully as our options have changed…

68

69

70 blindSight evaluation

71 interfaces

Smartphone 2003 (sighted) vs. blindSight (eyes-free)

72 task

(1) Schedule appointments and (2) add contacts, while "driving" and while idle

73 overall preference results

[Chart: ratings (0–8), blindSight vs. Smartphone, on: Was not missing information; Knew position in the menu; Knew what day/time I was at; Felt in control of the conversation; Better for setting meeting times; Prefer if driving and talking; Prefer overall]

74 lessons

1. Brevity is good, but use in moderation; clarification of navigation overrides brevity
2. A predictable, modeless user interface is key
3. Auditory feedback goes a long way, even during a phone call (disclaimer: need to study how it interferes with activities… driving)

75 generating vibrations

76 task: identify

– Given a tap, rate its strength on a scale of 1–7.
– Given a tap, rate its speed on a scale of 1–7.

77

78

