blindSight: Eyes-free mobile phone interaction Kevin Li, University of California, San Diego Patrick Baudisch, Microsoft Research Ken Hinckley, Microsoft Research
blindSight
calendar preview
“How about Monday morning?”
“Monday 9am” “tic, tic, sssssh”
“Yeah, looks like I’m free after 10”
blindSight is an application running on Microsoft Windows Smartphone. It is launched when the user places or receives a call, and then replaces the in-call menu. Unlike the in-call menu, blindSight uses auditory feedback.
interactive voice response
Users should be able to “dial ahead” [Perugini et al., CHI 2007]
Zap and Zoom allows users to jump to locations using shortcuts [Hornstein, UBILAB Rep 1994]
Use the visual channel to inform users about options [Yin and Zhai, CHI 2006]
phone interaction mid-conversation
Time-compress audio [Dietz and Yerazunis, UIST 2001]
Integrate speech commands into the conversation [Lyons et al., CHI 2004]
audio is heard only by the user, not by the person at the other end
rationale
people can recover from audio interruptions, as long as the interruption is short
human-human conversation contains redundancy
can we use this redundancy to inject auditory feedback from the device?
how do we make sure device feedback fits into these time windows of low information content?
rules
1. feedback only on-demand
home screen: hear voice note, mute, speaker phone, hear task list, add contact, record voice, find contact, calendar, hear emails, hear text message
rules
2. brevity
find contact (typed on the phone keypad): type 6 “200 hits”, type 2 “12 hits”, type 7 “Marion”
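The brevity rule above can be sketched as incremental keypad filtering: each keypress narrows the contact list, and the spoken feedback is only a hit count until a single match remains. This is a minimal illustrative sketch, not the actual blindSight code; the contact list and helper names are invented for the example.

```python
# Sketch of the "brevity" rule for find-contact (hypothetical, not the
# actual blindSight implementation): each keypress narrows the list,
# and feedback is just a hit count until one name remains.

KEYPAD = {  # standard phone keypad letter groups
    '2': 'abc', '3': 'def', '4': 'ghi', '5': 'jkl',
    '6': 'mno', '7': 'pqrs', '8': 'tuv', '9': 'wxyz',
}

def matches(name: str, digits: str) -> bool:
    """True if the name's first letters map onto the typed digit sequence."""
    name = name.lower()
    if len(name) < len(digits):
        return False
    return all(name[i] in KEYPAD.get(d, '') for i, d in enumerate(digits))

def feedback(contacts: list, digits: str) -> str:
    """Return the short auditory prompt: a count, or the unique match."""
    hits = [c for c in contacts if matches(c, digits)]
    if len(hits) == 1:
        return hits[0]           # speak the single remaining name
    return f"{len(hits)} hits"   # otherwise just the count, for brevity

contacts = ["Marion", "Mark", "Adam", "Peter"]
print(feedback(contacts, "6"))     # two names start with "m" -> "2 hits"
print(feedback(contacts, "6274"))  # only "Marion" matches m-a-r-i -> "Marion"
```

With a realistic address book, the same mechanism produces the slide's sequence: “200 hits”, “12 hits”, “Marion”.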
rules
3. non-speech previews of composites
calendar: navigate by week, day, 3 hours, or ½-hour block (+/–); whereAmI; go today; preview day; preview 3 hours
(what if the content is a long list, such as appointments for a day?)
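A non-speech preview of a composite like a day can be sketched as mapping each ½-hour block to a very short sound, so a whole morning plays in a fraction of the time speech would take. This sketch is hypothetical; the sound names echo the “tic, tic, sssssh” example from the opening scenario.

```python
# Sketch of a non-speech calendar preview (assumed design, not the
# actual blindSight audio code): each half-hour slot becomes a short
# sound, "tic" when free and "sssh" when busy.

def preview(blocks: list) -> str:
    """blocks[i] is True if the i-th half-hour slot is busy."""
    return ", ".join("sssh" if busy else "tic" for busy in blocks)

# 9:00 free, 9:30 free, 10:00-11:00 busy
print(preview([False, False, True, True]))  # tic, tic, sssh, sssh
```

Because the sounds are non-speech and uniform in length, the listener hears the *shape* of the day without a long spoken list of appointments.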
interfaces
Smartphone 2003 (sighted) vs. blindSight (eyes-free)
task
(1) schedule appointments and (2) add contacts, while idle and while “driving”
results
overall preference (ratings 0–8, blindSight vs. Smartphone):
Was not missing information
Knew position in the menu
Knew what day/time I was at
Felt in control of the conversation
Better for setting meeting times
Prefer if driving and talking
Prefer overall
lessons
1. brevity is good, but use it in moderation: clarification of navigation overrides brevity
2. a predictable/modeless user interface is key
3. auditory feedback goes a long way, even during a phone call (disclaimer: need to study how it interferes with activities such as driving)
next:
environment: can’t see screen
visual impairment
screen-less device
calendar
navigate by week, day, 3 hours, or ½-hour block (+/–); whereAmI; go today; preview day; preview 3 hours
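The nested granularities above (week, day, 3 hours, ½-hour block) can be sketched as a cursor that moves in steps of different sizes. This is an illustrative sketch under assumed semantics; the class and step values are invented for the example, with the cursor position kept as minutes from Monday 0:00.

```python
# Sketch of hierarchical calendar navigation (hypothetical): +/- move
# the cursor at the currently selected granularity, and granularities
# nest: week > day > 3 hours > half-hour block.

GRANULARITY = {            # step sizes in minutes (assumed values)
    "week": 7 * 24 * 60,
    "day": 24 * 60,
    "3 hours": 3 * 60,
    "half hour": 30,
}

class CalendarCursor:
    def __init__(self) -> None:
        self.minute = 0    # cursor position, minutes from Monday 0:00

    def move(self, level: str, steps: int) -> int:
        """Move +/- steps at the given granularity; return new position."""
        self.minute = max(0, self.minute + steps * GRANULARITY[level])
        return self.minute

cur = CalendarCursor()
cur.move("day", 1)         # Monday -> Tuesday
cur.move("3 hours", 3)     # 0:00 -> 9:00
print(cur.minute)          # 1440 + 540 = 1980
```

Because every key always moves by its own fixed granularity, the interface stays modeless: the same keypress means the same jump regardless of where the cursor is.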
full interface overview: keypad mappings for the home screen (hear voice note, mute, speaker phone, hear task list, add contact, record voice, find contact, calendar, hear emails, hear text message), find/add contact, calendar, and the email/tasks/voice/SMS folders (type, folder, n items, item, play, preview); hold bottom left for help, hold bottom right for menu
blindSight…
…is a phenomenon in which people who are perceptually blind in a certain area of their visual field demonstrate some visual awareness, without any qualitative experience [wikipedia]
don’t mode me in · blind sight · 10 design rules to allow eyes-free use · flow · tactile features
phones…
…are used in mobile situations. If they require visual attention, users will fail at their current activity: interference with social activities, driving off the road…