
Multimodal SIG © 2007 IBM Corporation
Position Paper for the W3C Workshop on Multimodal Architecture and Interfaces: Application Control Based on Device Modality
Daisuke Tomoda, Seiji Abe, Teruo Nameki, Mitsuru Shioya

Contents
- Introduction
- Discussion point
- Use-case
- Requirements
- Proposal (4 points)
- Further discussion

Introduction
- Software engineer at IBM Japan Ltd.
- Line of business: developing pervasive computing software
  - Car navigation in conjunction with speech recognition
- Academic affiliation: Project Management Academy Association

Discussion Point - our interest
- Input/output methods depend on the device and may change with the environment, user preference, and other conditions.
- A multimodal system designer needs to consider application control based on the client's modality.
- We should verify that such an application can be developed on the current MMI (Multimodal Architecture and Interfaces) architecture.

Discussion Point (continued)
- Static and dynamic modalities of devices:
  - Mobile (cell phone/PDA): voice, pointing device (static); voice cannot be used in some environments (dynamic)
  - In car: voice, pointing/touch device (static); voice only while driving (dynamic)
  - Desktop: many types of input/output (static/dynamic); user preference (static/dynamic)
  - Telephone: voice, DTMF (mostly static, some dynamic)

Use-case
- Navigation application on mobile devices
  - Cell phone/PDA and in-car
  - Speech recognition/TTS and pointing device
  - Example interaction: user: "I want to go here." / "I want to go to xxx."; system: "Next, turn left."

Use-case (continued)
[Diagram: data exchanged between the device and the servers in its environment]

Use-case (continued) - Text Only
1. Data is not sent to the Voice Server.
2. Data from the Voice Server is not expected.

Use-case (continued) - Voice Only
[Diagram: voice-only operation; the non-voice data paths are disabled]

Requirements
1. Static and dynamic device capabilities are to be described.
2. Applications can know the current device status:
   2.1 Client devices are notified of modality changes made by the user or triggered by their own events.
   2.2 The Interaction Manager is notified of modality changes by client devices.
3. Modality properties of devices can be queried, modified, and transferred between server and client.
4. Application behavior can be changed dynamically according to the device modality status.

Proposal (1)
1. Static and dynamic device capabilities are to be described.
→ Define a description of device capabilities:
  - Static capabilities
  - Dynamic capability changes (changeable capabilities)
Perhaps they can be described as an XML document.

Static modality information
- Capabilities of the client device
  - Speech input/output
  - Keyboard
  - Pointing device
  - Handwriting
  - Image recognition
  - etc.
- Default settings
  - User preferences
  - Priority
  - etc.
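A minimal sketch of how such static capability information might be written as an XML document. The element and attribute names below are illustrative assumptions, not taken from any W3C specification:

  <!-- Hypothetical static device capability description (names are assumptions) -->
  <device-capabilities device-class="cell-phone">
    <static>
      <modality type="speech"   input="true" output="true"/>
      <modality type="keyboard" input="true"/>
      <modality type="pointing" input="true"/>
      <modality type="handwriting" input="false"/>
    </static>
    <defaults>
      <preference modality="speech"   priority="1"/>
      <preference modality="pointing" priority="2"/>
    </defaults>
  </device-capabilities>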

Dynamic modality change
- Re-initialization of the client device
  - Default settings/user preferences are loaded
- Modality change triggers
  - The user enabled or disabled a modality
  - The application or system changed a modality
    - Based on an environment change (automatic/manual)
    - Based on the application scenario
- Priority of modalities
  - Can be changed based on the situation
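As a sketch of how such a dynamic change could be reported, the trigger, the affected modality, and the new priority might travel in a small XML fragment; again, the vocabulary here is a hypothetical illustration:

  <!-- Hypothetical notification: the system disables pointing while driving (names are assumptions) -->
  <modality-change device-class="car-unit" source="system" reason="driving">
    <modality type="pointing" enabled="false"/>
    <modality type="speech"   enabled="true" priority="1"/>
  </modality-change>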

Proposal (2)
2. Applications can know the current device status:
  2.1 Client devices are notified of modality changes made by the user or triggered by their own events.
  2.2 The Interaction Manager is notified of modality changes by client devices.
→ 2.1 Define the interface between the device (driver) and the client Modality Component (if it is not already defined).
→ 2.2 This will be covered by discussion in the W3C Device Independence Working Group, which defines the interface between the Interaction Manager and the Delivery Context Component.

Device and environment state change
[Diagram: a state change in the device or its environment is reported to the application]

Proposal (3)
3. Modality properties of devices can be queried, modified, and transferred between server and client.
→ Apply the MMI Life-Cycle Event API.

Device and environment state change via MMI life-cycle events
- The initial device state is included in the New Context Request event or the Prepare event.
- A device state change is included in a Data event.
- A device query is done through the Status Request event.
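A rough sketch of a Data event carrying a changed device state. Only the life-cycle event names come from this proposal; the serialization and the device-state payload below are assumptions for illustration:

  <!-- Illustrative only: a Data life-cycle event with the changed device state.
       Element and attribute names are assumptions, not normative MMI syntax. -->
  <data context="navi-session-001" source="ClientModalityComponent" target="InteractionManager">
    <device-state>
      <modality type="speech"   enabled="true"/>
      <modality type="pointing" enabled="false" reason="driving"/>
    </device-state>
  </data>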

Proposal (4)
4. Application behavior can be changed dynamically according to the device modality status.
→ (a) Describe device-state-driven control in the SCXML document.
→ (b) Select a device-specific SCXML document.

(a)
[Diagram: Interaction Manager, Delivery Context Component, and the devices in the environment]
Step 1: The IM queries the status of each device through the APIs of the Delivery Context Component.
Step 2: The IM may vary its behavior depending on the result of Step 1.
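A minimal SCXML sketch of option (a): the Interaction Manager branches on the reported device state. The event name and the data fields used in the cond expressions are assumptions, and the condition syntax assumes an ECMAScript data model:

  <!-- Illustrative SCXML: switch between multimodal and text-only behavior
       based on a (hypothetical) device.stateChanged event -->
  <scxml xmlns="http://www.w3.org/2005/07/scxml" version="1.0"
         datamodel="ecmascript" initial="multimodal">
    <state id="multimodal">
      <transition event="device.stateChanged"
                  cond="_event.data.voiceEnabled == false" target="textOnly"/>
    </state>
    <state id="textOnly">
      <transition event="device.stateChanged"
                  cond="_event.data.voiceEnabled == true" target="multimodal"/>
    </state>
  </scxml>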

(b)
[Diagram: the IM issues a New Context Request with the SCXML document that corresponds to the device]
The point: the IM issues a "New Context Request" depending on the states of the devices, using the SCXML document that corresponds to each device. The open question is how to choose the pertinent SCXML document for each device.
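One possible way to handle the selection problem is a simple lookup table that maps a device class and its available modalities to an SCXML document; the format and file names below are purely hypothetical:

  <!-- Hypothetical selection table, not part of any specification -->
  <scxml-selection>
    <rule device-class="cell-phone" modalities="voice gui" scxml="navi-multimodal.scxml"/>
    <rule device-class="cell-phone" modalities="gui"       scxml="navi-text-only.scxml"/>
    <rule device-class="car-unit"   modalities="voice"     scxml="navi-voice-only.scxml"/>
  </scxml-selection>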

Further Discussion
- Device-dependent application control can be developed on the current MMI architecture, but it needs more detailed specifications, especially for the device state change interface.
- More discussion is needed on:
  - Whether an SCXML document can be replaced or switched dynamically based on the client state
  - The timing of application behavior changes (when the modality state should be reflected)
  - How to apply this proposal to session-type applications
