1 Informal Tools for Multimodal, Context-based User Interface Design
James A. Landay
July 7, 1999 HCC Retreat
http://guir.berkeley.edu

2 Natural Tides of Innovation
[chart: waves of innovation plotted against time (log scale of integration): Mainframe, Minicomputer, Personal Computer, Workstation, Server, and, as of 7/99, Universal Computing (??)]

3 Universal
1 : including or covering all or a whole collectively or distributively without limit or exception
2 a : present or occurring everywhere
  b : existent or operative everywhere or under all conditions
3 a : embracing a major part or the greatest portion (as of mankind)
  b : comprehensively broad and versatile
4 a : affirming or denying something of all members of a class or of all values of a variable
  b : denoting every member of a class
5 : adapted or adjustable to meet varied requirements (as of use, shape, or size)

4 Away From the “Average Device”
• Powerful, personal capabilities from specialized devices
  * small, highly mobile or embedded in the environment
• “Intelligence” + immense storage and processing in the infrastructure
• Everything connected (laptops, desktops, specialized devices)

5 HCI Challenges
• Universal computing devices will not have the same UI as “dad’s PC”
  * a wide range of devices
    - often w/ small or no screens & alternative I/O
      + e.g., pens, speech, vision
    - special purpose to particular applications
      + “information appliances”
  * lots of devices per user
    - all working in concert

6 HCI Challenges (cont.)
• Design of good appliances will be hard
  * single-device design is easy
  * hard to design the same “application” in a consistent manner across many devices
    - e.g., a calendar app: one speech-based & one GUI-based
  * hard to make different devices work together
    - which device is used when?
    - multiple UIs & modes per device: which to “display”?
    - building awareness of the context of use into the design is the key to some of these issues
  * multimodal input is assumed, but there is little design support for creating multimodal interfaces

7 Our Approach
• Build
  * novel applications on existing appliances
    - e.g., NotePals on the Palm PDA & CrossPad
  * new information appliances
• Evaluate appliances in realistic settings
• Iterate
  * use the resulting experience to build
    - more interesting appliances
    - better design tools & analysis techniques

8 Outline
• HCI Challenges for Universal Computing
• Multimodal interaction
• Why is building MM UIs hard?
• Best practices for designing GUIs
• Proposed approach to building MM UIs
• Using context to our advantage

9 Multimodal Interaction
• When communicating with people we use more than one mode at a time!
  * gesture & speak
  * sketch & speak
  * etc.
• Computers would be easier to use and more useful if they also worked like this

10 Benefits of Multimodal (MM) Interaction on Heterogeneous Devices
• Computers could be used in
  * more situations
    - when hands are full or vision is used for something else
  * more places
    - e.g., walking, in the car, on the factory floor, etc.
• Interfaces would be easier to use
  * use innate perceptual, motor, & cognitive skills
• More people would be able to use computers
  * including users w/ vision or motor impairments
• MM UIs are likely to become predominant

11 Why is building MM UIs hard?
• Often require “recognition” technology
  * speech, handwriting, sketches, gesture, etc.
• Recognition technology is immature
  * finally “just good enough”
  * single-mode toolkits are just appearing now
  * no prototyping tools
• Hard to combine recognition technologies
  * still requires experts to build systems
  * few toolkits or prototyping tools!
• This was the state of GUIs in 1980

12 Best Practices for Designing GUIs
• Iterative design: Design -> Prototype -> Evaluate (and repeat)
• Prototyping tools are key to this success

13 Early Stage UI Design
• Brainstorming
  * put designs in a tangible form
  * consider different ideas rapidly
• Incomplete designs
  * do not need to cover all cases
  * illustrate important examples
• Present several designs to client or design team
• No need at this stage for “coding”

16 Prototyping Tools for Multimodal UI Design Should Support
• Iterative design methodology
• Informal techniques designers currently use in the early stage of UI design
  * sketching
  * storyboarding
  * “Wizard of Oz”
[mockup: “Landay Dictation Machine” displaying dictated text: “Um…. I’ll see you in the morning.”]
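
To make the “Wizard of Oz” technique concrete: a minimal sketch (not from the deck; all names hypothetical) in which a human operator supplies the “recognition” results, so a speech UI design can be tested before any real recognizer exists.

```python
# Hypothetical Wizard-of-Oz harness: the "recognizer" is a human wizard
# at a second console, so designers can test a speech UI design before
# real recognition technology is in place.

def wizard_recognize(utterance: str) -> str:
    """The wizard hears the user and types what the system should
    pretend to have recognized."""
    print(f"[wizard console] user said: {utterance!r}")
    return input("[wizard console] type recognition result> ")

def dictation_machine():
    # The end user believes they are talking to working software.
    utterance = input("[user] speak> ")        # stand-in for a microphone
    recognized = wizard_recognize(utterance)   # human plays the recognizer
    print(f"[machine display] {recognized}")

if __name__ == "__main__":
    dictation_machine()
```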

17 Our Approach: Sketches, Models & Context-aware Design Tools
• Infer models from design “sketches” or other informal representations
  * a model is an abstraction of the app’s UI design
  * a model for representing contexts & their UI implications
• Use models to
  * semi-automatically generate UIs on diverse platforms
  * dynamically adapt a particular appliance UI to changing context of use
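
One possible concrete form for such an inferred model, sketched as a small data structure (hypothetical names; the deck does not specify the representation): each element records the designer’s intent rather than pixels, so a single model can drive very different devices.

```python
# Hypothetical abstract UI model, as might be inferred from a sketch:
# elements capture roles and labels, not widgets or pixels.
from dataclasses import dataclass, field

@dataclass
class AbstractElement:
    role: str        # e.g., "text-input", "choice", "action"
    label: str
    modality_hints: list = field(default_factory=list)  # e.g., ["speech", "visual"]

@dataclass
class AbstractUI:
    task: str
    elements: list

calendar = AbstractUI(
    task="add appointment",
    elements=[
        AbstractElement("text-input", "event title"),
        AbstractElement("choice", "day of week", ["speech", "visual"]),
        AbstractElement("action", "save"),
    ],
)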

18 How to Specify Events
• We have a good idea how to do this for visual UIs
  * visually!
• But how about speech or gestures?

19 Specifying Non-Visual Events
• How do designers do this now?
  * speech
    - scripts or grammars (advanced designers only)
    - flowcharts on the whiteboard
    - “Wizard of Oz” -> fake it!
  * gestures
    - give an example & then tell the programmer what it does
• We can do the same by demonstration (PBD)
  * demonstrate an example of the act (e.g., speech)
  * demonstrate the result
  * system infers the program
    - it’s just a prototype, so it doesn’t have to be too general
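
A minimal sketch of that PBD loop (hypothetical; the inference here is deliberately a bare lookup table, in the spirit of “doesn’t have to be too general” for a prototype):

```python
# Hypothetical programming-by-demonstration (PBD) sketch: the designer
# demonstrates a stimulus (a spoken phrase, a named gesture) and the
# result it should produce; the "inferred program" is just a lookup
# table, which can be enough for an early prototype.

class PbdPrototype:
    def __init__(self):
        self.rules = {}  # stimulus -> result

    def demonstrate(self, stimulus: str, result: str) -> None:
        """One example: this input should cause this output."""
        self.rules[stimulus] = result

    def run(self, stimulus: str) -> str:
        """At test time, replay the demonstrated behavior."""
        return self.rules.get(stimulus, "<unhandled: fall back to the wizard>")

proto = PbdPrototype()
proto.demonstrate("pigtail gesture", "delete selected item")
proto.demonstrate("what time is it", "speak the current time")
print(proto.run("pigtail gesture"))  # -> delete selected item
```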

20 Specifying Non-Visual Events
[screenshot: the result of demonstrating a pen gesture for delete]

21 Combining the Visual & Non-Visual
• How do you see what the system inferred?
  * necessary for editing
  * generate a visual representation
    - a flowchart seems like a start (common in speech UIs)
      + appropriate? what should it look like?

22 Specifying Non-Visual Events
A flowchart representing inferences made from the demonstration of a flight arrival time application:
  [System response, speech] “Please name the departure city”
    -> [User stimulus, speech] DEP_CITY
    -> [System response, speech] “DEP_CITY departures arriving to which city?”
    -> [User stimulus, speech] ARR_CITY
    -> [Computation] look up ARRIVAL_TIME in “flight times” using DEP_CITY, ARR_CITY
    -> [System response, speech] “The flight from DEP_CITY to ARR_CITY will arrive at ARRIVAL_TIME”
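
Read as straight-line dialog logic, the flowchart might execute as follows (a sketch only: input()/print() stand in for speech recognition and synthesis, and the flight table is made-up sample data):

```python
# Hypothetical executable form of the flight-arrival flowchart above.

FLIGHT_TIMES = {("Berkeley", "Seattle"): "11:45 AM"}  # (dep, arr) -> time

def flight_arrival_dialog():
    print("Please name the departure city")          # system response
    dep_city = input("> ")                           # user stimulus: DEP_CITY
    print(f"{dep_city} departures arriving to which city?")
    arr_city = input("> ")                           # user stimulus: ARR_CITY
    arrival = FLIGHT_TIMES.get((dep_city, arr_city), "unknown")  # computation
    print(f"The flight from {dep_city} to {arr_city} will arrive at {arrival}")

flight_arrival_dialog()
```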

23 Combining the Visual & Non-Visual (cont.)
• How do you see what the system inferred? (as above: a flowchart is a start)
• Combining visual & non-visual events
  * e.g., end user dragging a truck while saying “fast”
  * use a visual language that combines a visual storyboard of the GUI w/ a flowchart for non-visual events
    - the VL had better be simple...

24 Supporting Heterogeneous Devices
• Consider sketches as an abstraction
• Infer a “model” from the sketches
• Use methods from model-based UI tools to
  * generate UIs for multiple devices
  * generate alternative modes for a single spec on one device
• Hard problems
  * how to abstract?
  * how do you generate a “good” UI?
    - keep the designer in the loop
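
One way to picture the generation step (a self-contained sketch with hypothetical names, mirroring the abstract model above): a single spec rendered once as a GUI layout and once as a speech prompt sequence. A real tool would only produce a starting point, keeping the designer in the loop.

```python
# Hypothetical multi-device generation: one abstract spec, two renderers.

SPEC = {
    "task": "add appointment",
    "elements": [
        {"role": "text-input", "label": "event title"},
        {"role": "choice",     "label": "day of week"},
        {"role": "action",     "label": "save"},
    ],
}

def render_gui(spec):
    """Desktop rendering: a window with one widget per element."""
    lines = [f"[window] {spec['task']}"]
    lines += [f"  [{e['role']}] {e['label']}" for e in spec["elements"]]
    return "\n".join(lines)

def render_speech(spec):
    """Speech-only rendering: the same spec as a prompt sequence."""
    prompts = [f"Let's {spec['task']}."]
    for e in spec["elements"]:
        if e["role"] == "text-input":
            prompts.append(f"Please say the {e['label']}.")
        elif e["role"] == "choice":
            prompts.append(f"Which {e['label']}?")
    return " ".join(prompts)

print(render_gui(SPEC))
print(render_speech(SPEC))
```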

26 Take Advantage of Context: Monitor Environment & Actions to Improve Interaction
• Which devices are present & available?
  * there is a wall display -> use it for my wearable
    - device discovery
• What is the state of the user?
  * hands using tools -> use speech input & visual output
    - tangible tools or vision processing
• What is occurring in the environment?
  * people are talking -> don’t rely on speech I/O
    - speech sensing
• Solution: UI design tools that understand context as well as multiple devices & modalities
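
The slide’s examples can be read as simple context-to-modality rules. A sketch (hypothetical names; sensing the context reliably is the hard part):

```python
# Hypothetical rules mapping sensed context to the modalities a UI
# should rely on, following the slide's three examples.

def choose_modalities(context: dict) -> dict:
    modes = {"input": {"speech", "pen"}, "output": {"visual", "speech"}}
    if context.get("hands_busy"):             # hands using tools
        modes["input"].discard("pen")         # -> speech in, visual out
        modes["output"].discard("speech")
    if context.get("people_talking"):         # conversation nearby
        modes["input"].discard("speech")      # -> don't rely on speech I/O
        modes["output"].discard("speech")
    if context.get("wall_display_available"): # device discovery
        modes["output"].add("wall display")   # -> use it for my wearable
    return modes

print(choose_modalities({"hands_busy": True, "wall_display_available": True}))
```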

27 Design Goals
• Let designers rapidly produce “rough cuts”
  * doesn’t need to handle all cases
• Allow end-user testing & fast modification
• Generate code that can help start the UI for multiple devices
  * designer adds more detail & improves the interaction
  * programmers add necessary code

30 What We’ve Accomplished So Far
• Informal tools for UI design
  * sketch-based tools for GUI / Web design
    - built & tested 1st generation; building next generation now
  * informal tool for speech UI design
    - designed; implementation in progress
• Automatic generation of simple control UIs
• First-cut designs for multimodal
  * UI design tool & appliance (SpeechCorder w/ ICSI)
• Experience w/ appliances & simple context
  * NotePals

31 Take Home Ideas
• Universal Computing is about supporting people
• Success will require the design & evaluation of new appliances (device + app + UI) that
  * take advantage of natural modes of input
    - especially multimodal input!
  * take advantage of context
  * are used in realistic settings
• Experience, new architectures, and new tools will make this design problem easier

32 Informal Tools for Multimodal, Context-based User Interface Design
James A. Landay
July 7, 1999 HCC Retreat
http://guir.berkeley.edu

33 Research Plan
• Finish implementation of informal tools
  * study usage (especially of speech UI design)
  * use results to design the multimodal design tool
• Develop algorithms for extracting the app model
• Build context-aware applications w/o tools
  * two testbeds to create & study
    - wirelessly networked PDAs in classroom/learning
    - extraction of tacit context using social networks
  * build a taxonomy of contexts
    - how should they affect the UI?

34 Research Plan (cont.)
• Implement tool for multimodal UI design
  * extracts a model & generates UIs for 2 diverse platforms
  * uses simple context cues
• Develop algorithms for capturing context
• Evaluate usage (apps & tools) in target settings
• Extend multimodal UI design tool
  * generate multi-platform UIs that dynamically adapt
    - allow context to be fully integrated in decisions

36 HCI Goals of Universal Computing
• Some of the roots of Universal Computing are in the ideas of Mark Weiser
  * Ubiquitous/Pervasive/Invisible/Calm Computing
    - “... does not live on a personal device of any sort [PDA or dynabook], but is in the woodwork everywhere.”
    - “you don’t want personal technology, you want personal relationships”
• Universal Computing is about
  * supporting people’s tasks
    - most often includes working with other humans
  * making people’s lives easier
    - just creating ubiquitous technology does not solve this

37 Computers Support Human-Human Communication (HHC)
[diagram: examples of HHC artifacts: design ideas, presentations, e-mail, reports]

38 Traditional Software Interfaces
• Force translations to formal representations
  * sometimes we want this (e.g., conference slides)
  * sometimes we don’t (e.g., creative tasks)

39 Traditional Representations
• Rigid and unambiguous
  * hard to mix (e.g., few tools support rough sketches)
  * warp perceptions of the viewer and user
• Increase time
  * encourage precision
• Inhibit creativity
  * “tunnel vision”
“Put me in a room with a pad & a pencil and set me up against a hundred people with a hundred computers -- I’ll outcreate every goddamn sonofabitch in the room.” -- Ray Bradbury, Wired 6.10

40 Informal Communications Styles
• Speaking
• Writing
• Gesturing
• Sketching
Informal UIs do not immediately translate natural input, allowing users to work more naturally

