
1 V11: User Interaction in AmI. Dr.-Ing. Reiner Wichert, Holger Graf, Mohammad-Reza (Saied) Tazari; Fraunhofer-Institut für Graphische Datenverarbeitung IGD. Ambient Intelligence, WS 10/11.

2 Outline: scoping the topic; an analysis; UI description languages; example models & approaches; the UI framework of PERSONA.

3 RECAP FROM LECTURE #1 + SCOPING THE TOPIC. User Interaction in Smart Environments

4 Methodical approach: changing the interaction. 1. User goal 2. Strategy 3. Execution

5 Ambient Intelligence bundles all areas of computer science: (1) Natural-language interaction becomes possible; multimodal interaction becomes possible. (2) Software architecture for the coordination of devices and applications. (3) Interaction models, mental models, inference mechanisms, inductive reasoning. (4) Devices / sensors.

6 Vision: "Brighter!" Source: EMBASSI

7 Implicit versus explicit interaction (I). Implicit: observing the user's actions, even when they are not meant as direct interaction with the environment; capturing the actions and sharing them within the system as contextual events; analysing the resulting situations and deriving possible user goals; deriving possible reactions by the environment. Partly covered in lectures #4 & #7; not the subject of this lecture.

8 Implicit versus explicit interaction (II). Explicit: the user consciously interacts with the environment, either as a direct instruction, e.g. user: "Close the window in the bedroom!", or in response to a query from the environment, e.g. system: "Too much smoke in the kitchen; should the window be opened?" User: "Yes!" The subject of this lecture.

9 The importance of explicit user interaction. In AmI, explicit UI over I/O channels has long stood in the shadow of implicit interaction over sensing channels. Advances that help explicit UI become more important: proliferation of (multi-)touch sensing, HD displays, and displays embedded in all possible devices; new interaction forms supported by special devices (e.g. WiiMote); qualitative advances in speech recognition, natural language processing, and gesture recognition (e.g., Kinect); socio-political pressure on accessibility for all.

10 AN ANALYSIS. User Interaction in Smart Environments

11 The Pipe-Lines Model by Nigay & Coutaz (1997). Source:

12 Intelligent environments with networked devices

13 Multiplicity in intelligent environments. Source: dissertation of Marco Blumendorf

14 I/O devices in emerging smart homes: living-room TV, bedroom TV, a display in the entrance, a display integrated in the fridge door, mirrors capable of becoming displays, microphone arrays installed in all rooms, loudspeakers installed in all rooms, phones providing displays, microphones and (loud)speakers, hi-fi providing loudspeakers. An infrastructure of available I/O channels.

15 Smart Environments as Open Distributed Systems

16 The consequence: an I/O infrastructure within an open distributed system

17 Terminology. 1. According to Blumendorf; source: _marco.pdf 2. According to Tazari; source: RSONA_Architektur_Manual.pdf

18 Terminology according to Blumendorf. 1. Interaction Resource (IR): an atomic (one-way, single-modality) I/O channel exploitable by a user for executing a task, e.g. keyboards, mice, screens, speakers, microphones, or cameras. 2. Interaction Device (ID): a computing system that handles the input from, or sends output to, the individual IRs connected to it. Hence, an ID is a collection of IRs together with the computing unit. It comprises the hardware used for the interaction (e.g. screen, keyboard, touch-pad) as well as a software platform for communication and presentation tasks.
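The IR/ID distinction can be sketched in code. The following Python sketch is illustrative only; the class and attribute names are assumptions for this lecture, not part of Blumendorf's work. An interaction device bundles the atomic, one-way interaction resources connected to its computing unit.

```python
from dataclasses import dataclass, field
from enum import Enum
from typing import List

class Direction(Enum):
    INPUT = "input"    # e.g. keyboard, microphone
    OUTPUT = "output"  # e.g. screen, loudspeaker

@dataclass
class InteractionResource:
    """Atomic (one-way, single-modality) I/O channel: an IR."""
    name: str
    direction: Direction
    modality: str  # e.g. "visual", "acoustic", "haptic"

@dataclass
class InteractionDevice:
    """Computing unit together with the IRs connected to it: an ID."""
    name: str
    resources: List[InteractionResource] = field(default_factory=list)

    def inputs(self) -> List[InteractionResource]:
        return [r for r in self.resources if r.direction is Direction.INPUT]

# A phone as an ID: several IRs plus the computing platform.
phone = InteractionDevice("phone", [
    InteractionResource("display", Direction.OUTPUT, "visual"),
    InteractionResource("touch-sensor", Direction.INPUT, "haptic"),
    InteractionResource("microphone", Direction.INPUT, "acoustic"),
    InteractionResource("loudspeaker", Direction.OUTPUT, "acoustic"),
])
print([r.name for r in phone.inputs()])  # ['touch-sensor', 'microphone']
```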

19 Terminology according to Tazari. Channel: smart environments need to bridge between the physical world and the virtual realm with the help of certain devices. Channel denotes the bridging passage provided by such devices between the physical world and the virtual realm. Depending on the kind of channel opened, a channel might be called a sensing channel (provided by sensors), an acting channel (provided by actuators), an input channel (provided by microphones, keyboards, etc.), or an output channel (provided by displays, loudspeakers, etc.). The latter two types of channels might be referred to as I/O channels. I/O Device: an abbreviation for input and / or output device; a device that provides an input and / or output channel for facilitating explicit interaction between a smart environment and its human users. Input devices, such as a microphone, a keyboard, or a mouse, can capture an instruction or response provided by a human user and represent it as data in the virtual realm. Upon receipt of data within the virtual realm that is intended to be presented to human users, output devices, such as displays and loudspeakers, can make it perceivable to the addressed humans.

20 Recall Concept Maps from V3

21 Requirements. 1. According to Blumendorf; source: _marco.pdf 2. According to Tazari; source:

22 Requirements according to Blumendorf: shapeability, to address different layouts for users, device capabilities and usage contexts; distribution across multiple interaction devices; multimodality, to support various input and output modalities; shareability between multiple users; mergability and interoperability of different applications. In essence, all of these are different aspects of adaptability.

23 Adaptability requirements according to Blumendorf: shapeability. The layout changes depending on the user's distance to the screen.

24 Adaptability requirements according to Blumendorf: distribution. The user interface can be distributed across multiple interaction devices and kept continuously synchronized.

25 Adaptability requirements according to Blumendorf: multimodality. The user is able to utilize multiple interaction resources and modalities, including voice, touch, and gesture, simultaneously.

26 Adaptability requirements according to Blumendorf: shareability. Two users sharing applications.

27 Adaptability requirements according to Blumendorf: mergability. The UI of a cooking assistant embedded in a meta-UI controlling interaction parameters.

28 Requirements according to Tazari: separating I/O channel management from applications; modality- / layout-neutral dialogs; brokerage mechanisms; support for adaptive dialogs; task division between layers; availability of user context, capabilities, and preferences to all layers; handling input & output; modality fusion & fission; context-free input.

29 UI DESCRIPTION LANGUAGES. User Interaction in Smart Environments

30 Need for declarative languages: a direct consequence of separating the application layer from the presentation layer. E.g., browser = Firefox, language = HTML, protocol = HTTP.

31 The problem with HTML: not really modality-neutral; sometimes imposes a certain layout. More abstract and neutral languages have been investigated for more than 10 years: UIML, TERESA XML, UsiXML, SMIL, EMMA, XISL, XForms.

32 XForms: separation of values from controls. There are two parts to the essence of XForms. The first is to separate what is being returned from how the values are filled in: the model specifies the values being collected (the instance) and their related logic (types, restrictions; initial values; relations between values). The body of the document then binds form controls to values in the instance. Source:
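The model/controls split can be mimicked outside of XML. The following Python sketch is a loose analogy, not real XForms syntax, and all names in it are made up for illustration: the model owns the instance values and their logic, while controls bind to instance values by name and state an intent rather than a concrete widget.

```python
# Conceptual sketch of the XForms split (not real XForms syntax):
# the *model* declares the instance values, their types and constraints;
# *controls* only bind to values by reference and never store data.

model = {
    "instance": {"departure": "2011-01-15", "seats": 1},
    "types": {"departure": "date", "seats": "integer"},
    "constraints": {"seats": lambda v: 1 <= v <= 4},
}

controls = [
    # intent-based: "pick a date" / "pick a number" -- the renderer
    # decides whether that becomes a calendar widget, a spin box, ...
    {"intent": "pick-date", "bind": "departure", "label": "Departure"},
    {"intent": "pick-number", "bind": "seats", "label": "Seats"},
]

def submit(model):
    """Validate the instance against the model's logic and return it."""
    for key, check in model["constraints"].items():
        if not check(model["instance"][key]):
            raise ValueError(f"invalid value for {key}")
    return model["instance"]  # only the data is returned, not the UI

print(submit(model))  # {'departure': '2011-01-15', 'seats': 1}
```

Note how the controls never touch the data: swapping the renderer (GUI, speech, braille) only reinterprets the intents, while the model and its validation stay unchanged, which is exactly the modality-neutrality argument of the slide.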

33 XForms: intent-based controls. Source:

34 EXAMPLE MODELS & APPROACHES. User Interaction in Smart Environments

35 The W3C Multimodal Interaction Framework: overview. Source (also for the next 2 slides):

36 The W3C Multimodal Interaction Framework: input side

37 The W3C Multimodal Interaction Framework: output side

38 The approach of Sottet et al. Source:

39 A runtime architecture according to Clerckx et al.

40 The MASP architecture according to Blumendorf

41 THE UI FRAMEWORK OF PERSONA. User Interaction in Smart Environments

42 Dialog descriptions. Goal: modality- & layout-neutral dialogs. The PERSONA solution is inspired by XForms, apparently the most advanced form-based solution: separating the form's UI description from the form data; defining a dialog package based on the XForms UI controls; using an own RDF-based data model instead of adding new complexity.

43 The Dialog Package

44 Cornerstone: I/O buses. The brokerage / adaptation is driven by the capabilities of the I/O handlers: appropriateness for certain access impairments; supported languages and modalities; locations where output can be presented; modality-specific tuning capabilities; and the dialog ID.
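Brokerage on such a bus can be sketched as capability matching. The following Python sketch is hypothetical; the handler profiles, field names, and matching rules are assumptions for illustration, not the actual PERSONA API. Each registered I/O handler advertises its capabilities, and the bus selects the handlers whose profile covers the adapted output event.

```python
# Hypothetical capability profiles of two registered output handlers.
handlers = [
    {"id": "kitchen-display", "modality": "gui", "location": "kitchen",
     "languages": {"de", "en"}, "suits_impairments": set()},
    {"id": "tts-engine", "modality": "speech", "location": "everywhere",
     "languages": {"en"}, "suits_impairments": {"sight"}},
]

def match(handlers, event):
    """Return the ids of handlers whose capabilities cover the event."""
    result = []
    for h in handlers:
        if h["modality"] != event["modality"]:
            continue  # cannot render in the requested modality
        if h["location"] not in (event["location"], "everywhere"):
            continue  # cannot present at the target location
        if event["language"] not in h["languages"]:
            continue  # language not supported
        if not event["impairments"] <= h["suits_impairments"]:
            continue  # not appropriate for the user's impairments
        result.append(h["id"])
    return result

event = {"modality": "speech", "location": "kitchen",
         "language": "en", "impairments": {"sight"}}
print(match(handlers, event))  # ['tts-engine']
```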

45 Supporting the output bus in adaptation. Parameters provided by the app: content language & privacy level; addressed user. Parameters added by the UI framework: the presentation location and modality; access impairments to be considered; modality-specific recommendations.

46 More on the Dialog Manager: coherent representation of the whole system; management of dialogs; per-user & priority-based management of dialog queues; suspending dialogs and continuing them later; providing the system main menu; handling context-free input.
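The per-user, priority-based dialog queues can be sketched with a heap. This is a minimal sketch under assumed semantics (highest priority first, FIFO among equal priorities, suspended dialogs re-enter the queue); the function names are illustrative, not the Dialog Manager's real interface.

```python
import heapq
import itertools
from collections import defaultdict

_counter = itertools.count()   # tie-breaker: FIFO within equal priority
queues = defaultdict(list)     # user -> heap of pending dialogs

def submit_dialog(user, dialog_id, priority=0):
    # negate priority: heapq is a min-heap, we want max-priority first
    heapq.heappush(queues[user], (-priority, next(_counter), dialog_id))

def next_dialog(user):
    """Pop the most urgent pending dialog for this user, if any."""
    if queues[user]:
        return heapq.heappop(queues[user])[2]
    return None

def suspend(user, dialog_id, priority=0):
    """Put a running dialog back so it can be continued later."""
    submit_dialog(user, dialog_id, priority)

submit_dialog("saied", "weather-report", priority=1)
submit_dialog("saied", "medication-reminder", priority=5)
print(next_dialog("saied"))  # medication-reminder
```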

47 Input. Input handlers such as speech recognition and gesture recognition publish input events to the input bus; Application 1 and Application 2 subscribe to them.

48 Input fusion. The spoken command "Switch on!" from speech recognition and the referent "TV set" from gesture recognition are fused on the input bus into "Switch on TV set", which is delivered to Application 1.
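The fusion step on the slide can be sketched as pairing close-in-time events from different recognizers. This is a simplified illustration, not PERSONA's fusion algorithm; the 2-second fusion window and all names are assumptions.

```python
# Sketch of modality fusion: the spoken command ("Switch on!") lacks a
# referent; a roughly simultaneous pointing gesture ("TV set") supplies
# it. Timestamps are in seconds.

FUSION_WINDOW = 2.0  # assumed maximum time gap between fused events

def fuse(speech_events, gesture_events):
    """Pair each underspecified command with a close-in-time referent."""
    fused = []
    for t_speech, command in speech_events:
        for t_gesture, referent in gesture_events:
            if abs(t_speech - t_gesture) <= FUSION_WINDOW:
                fused.append({"action": command, "target": referent})
                break
    return fused

speech = [(10.1, "switch-on")]   # from the speech recognizer
gesture = [(10.8, "tv-set")]     # from the gesture recognizer
print(fuse(speech, gesture))  # [{'action': 'switch-on', 'target': 'tv-set'}]
```

A real fusion component would also resolve conflicts and handle events that arrive out of order; this sketch only shows the time-window pairing idea.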

49 Output. Application 1 and Application 2 publish output events to the output bus; output handlers such as text-to-speech subscribe to them.

50 Context awareness: privacy awareness. Example: Application 1 sends the output "Take your Prozac™!" to the output bus; whether it may be rendered via text-to-speech depends on the privacy of the situation.

51 Context awareness: dynamic adaptation (output bus, Application 1)

52 Input Bus

53 Input Bus Members

54 Functional Model of Input Events

55 Output Bus

56 Output Bus Members

57 Output event properties & their providers: addressedUser (app), contentPrivacyLevel (app), channelPrivacyLevel (DM), dialogForm (app), dialogPriority (app), hasAccessImpairment (DM), outputLanguage (app), outputModality (DM), altOutputModality (DM), presentationLocation (DM). Examples of modality-specific parameters (DM): screenResolutionMaxX, screenResolutionMaxY, screenResolutionMinX, screenResolutionMinY, voiceGender, voiceLevel.

58 Modelling of Access Impairments

59 Thank you for your attention, and see you at the next lecture.
