Aiming Computing Technology at Enhancing the Quality of Life of People with ALS: Some Sketches on Directions in Minimal Signaling Communication

Presentation transcript:

Aiming Computing Technology at Enhancing the Quality of Life of People with ALS: Some Sketches on Directions in Minimal Signaling Communication. Communication Working Group, Microsoft Research & ALS Society of British Columbia

Directions
– Tools and methods for planning and creating long-term corpora and strategies for communication
– Creating effective input strategies and systems for different phases of the progression of ALS
– Exploration of novel signaling strategies, such as eye gaze, as an effective input modality
– Coupling contextual information as a source of evidence in reasoning about intentions (a minimal sketch follows below)
– Potential for integrated robotic systems for motion
Multiple directions
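
One bullet above, coupling contextual information as evidence in reasoning about intentions, could be prototyped with a simple Bayesian re-ranking of candidate intents. This is only a sketch: the intents, context features, and probabilities below are invented for illustration.

```python
# A minimal, hypothetical sketch of using contextual evidence (time of day,
# who is present) to re-rank candidate intentions before the user signals
# anything. All intents, contexts, and probabilities are invented.

PRIORS = {"request_water": 0.2, "adjust_position": 0.3,
          "start_conversation": 0.3, "watch_video": 0.2}

# P(context feature | intent) -- toy likelihood tables.
LIKELIHOODS = {
    ("time=evening", "watch_video"): 0.6,
    ("time=evening", "start_conversation"): 0.4,
    ("visitor=present", "start_conversation"): 0.7,
    ("visitor=present", "watch_video"): 0.2,
}

def posterior(context_features, priors=PRIORS, default_lik=0.3):
    """Naive-Bayes style re-ranking of intents given observed context."""
    scores = dict(priors)
    for feature in context_features:
        for intent in scores:
            scores[intent] *= LIKELIHOODS.get((feature, intent), default_lik)
    total = sum(scores.values())
    return {intent: s / total for intent, s in scores.items()}

if __name__ == "__main__":
    ranked = posterior(["time=evening", "visitor=present"])
    for intent, p in sorted(ranked.items(), key=lambda kv: -kv[1]):
        print(f"{intent}: {p:.2f}")
```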

Core Challenges & Opportunities with Gaze-Centric Systems
– Move beyond potentially frustrating point-and-dwell approaches
– Toward new UI designs, metaphors & control methodologies that can ease user effort and enhance the accuracy of selection and control
– Adaptive techniques and use of context can ease user effort and error
– Promise of creating multiple rewarding and valuable applications for communication, creative expression, information access, enjoyable experiences, and control of real-world sensors and effectors
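
As one way to make the point-and-dwell issue concrete, here is a minimal sketch of an adaptive dwell-time selector: the dwell threshold for a target shrinks when selections are confirmed and grows when they are cancelled. The class and parameter names (AdaptiveDwellSelector, base_dwell_s, etc.) are hypothetical, not from the talk.

```python
import time

class AdaptiveDwellSelector:
    """Dwell-based gaze selection with a per-target adaptive threshold.

    Targets the user selects reliably earn shorter dwell times; targets
    that are often cancelled get longer ones, trading speed against
    accidental activation.
    """

    def __init__(self, base_dwell_s=1.0, min_dwell_s=0.4, max_dwell_s=2.0):
        self.base = base_dwell_s
        self.min = min_dwell_s
        self.max = max_dwell_s
        self.dwell = {}          # target id -> current dwell threshold
        self._fix_target = None  # target currently fixated
        self._fix_start = None   # when the fixation began

    def _threshold(self, target):
        return self.dwell.get(target, self.base)

    def update(self, target, now=None):
        """Feed the currently fixated target (or None); returns a target
        id when a dwell completes, else None."""
        now = time.monotonic() if now is None else now
        if target != self._fix_target:          # gaze moved to a new target
            self._fix_target, self._fix_start = target, now
            return None
        if target is None:
            return None
        if now - self._fix_start >= self._threshold(target):
            self._fix_start = now               # re-arm after selection
            return target
        return None

    def feedback(self, target, accepted):
        """Adapt the threshold from confirmations / cancellations."""
        t = self._threshold(target)
        t = t * 0.9 if accepted else t * 1.2
        self.dwell[target] = max(self.min, min(self.max, t))

# Example: simulated gaze samples landing on a "yes" button.
sel = AdaptiveDwellSelector()
sel.update("yes", now=0.0)
print(sel.update("yes", now=1.1))   # -> "yes" (dwell completed)
```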

Sample Applications

Example: Gaze-centric interaction with (previously authored) databases of images and audiovisual snippets for enhanced communication

– Memories library from a video and image photo library
– Social discourse library of videos and stills for sharing emotions, moods, gestures
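
A small, hypothetical data-structure sketch of such previously authored libraries: snippets carry tags (mood, topic, gesture) so a gaze-driven UI can surface a short, relevant grid of candidates rather than a long list. All paths and tags below are invented examples.

```python
from dataclasses import dataclass, field

@dataclass
class Snippet:
    """One previously authored item: a photo, video clip, or audio snippet."""
    path: str
    kind: str                    # "image" | "video" | "audio"
    tags: set = field(default_factory=set)

class SnippetLibrary:
    """Tiny illustrative library of communication snippets."""
    def __init__(self):
        self.items = []

    def add(self, snippet):
        self.items.append(snippet)

    def query(self, *tags, limit=9):
        """Return up to `limit` snippets matching the most tags, e.g. a
        3x3 grid of candidates for gaze selection."""
        wanted = set(tags)
        scored = sorted(self.items,
                        key=lambda s: len(wanted & s.tags), reverse=True)
        return [s for s in scored if wanted & s.tags][:limit]

lib = SnippetLibrary()
lib.add(Snippet("clips/family_trip.mp4", "video", {"memories", "family"}))
lib.add(Snippet("stills/thumbs_up.png", "image", {"mood:positive", "gesture"}))
lib.add(Snippet("audio/laugh.wav", "audio", {"mood:positive", "humor"}))
print([s.path for s in lib.query("mood:positive")])
```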

Example: Gaze-centric control of confirmation, negation, and selection of alphabetic controls for writing and editing.

[On-screen keyboard mockup: quick responses (Yes, No, Big Time!, No way.), speech-act buttons ([ask], [tell], [have]), a QWERTY letter grid with punctuation, and a message line beginning "I want to …".] Predictive language model with correction.
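
The slide's "predictive language model with correction" could be prototyped with something as simple as a bigram predictor plus prefix-based correction over a personalized corpus; the corpus and helper functions below are placeholders, not the system described in the talk.

```python
from collections import Counter, defaultdict

# Hypothetical, minimal bigram word predictor for the on-screen keyboard:
# trained on a personalized corpus, it proposes completions after each
# selected word so fewer letter-by-letter gaze selections are needed.

CORPUS = ("i want to watch the news . i want to see my family . "
          "i want to rest now . please turn the lights off .")

bigrams = defaultdict(Counter)
words = CORPUS.split()
for prev, nxt in zip(words, words[1:]):
    bigrams[prev][nxt] += 1

def predict_next(prev_word, k=3):
    """Top-k candidate next words after `prev_word`."""
    return [w for w, _ in bigrams[prev_word.lower()].most_common(k)]

def correct(prefix, k=3):
    """Crude correction/completion: rank vocabulary words sharing the
    longest common prefix with what has been typed so far."""
    vocab = Counter(words)
    def overlap(w):
        n = 0
        for a, b in zip(prefix.lower(), w):
            if a != b:
                break
            n += 1
        return n
    ranked = sorted(vocab, key=lambda w: (-overlap(w), -vocab[w]))
    return ranked[:k]

print(predict_next("want"))   # e.g. ['to']
print(correct("famli"))       # ranks 'family' near the top
```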

Example: Gaze-centric control of integrated, supportive robotic systems
– Head motion
– Head gestures (nod for yes; shake for no)
– Arm (and hand?) motion
– …etc.

Turn head left or right (e.g., in increments)

Conversation confirmation: nod "yes"; shake head "no". (Beyond confirmation: the nod as a natural cue for understanding within the stream of conversation.)

Toward more general application of "integrated robotics," e.g., arm controls, etc., for atomic and patterned moves (e.g., wave hello).
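
A hedged sketch of how gaze selections might dispatch the atomic and patterned robot moves mentioned above; the joint names, angles, and send_joint_command transport are placeholders for whatever robot API would actually be used.

```python
import time

# Hypothetical gaze-to-robot command layer: atomic moves (incremental head
# turns) plus patterned macros (nod "yes", shake "no", wave hello).

HEAD_STEP_DEG = 15          # size of one incremental head turn

def send_joint_command(joint, angle_deg):
    """Placeholder transport: print instead of driving hardware."""
    print(f"move {joint} to {angle_deg:+.0f} deg")

class RobotGestures:
    def __init__(self):
        self.head_yaw = 0.0

    # --- atomic moves -----------------------------------------------
    def turn_head(self, direction):
        """direction is 'left' or 'right'; moves one increment."""
        self.head_yaw += HEAD_STEP_DEG if direction == "left" else -HEAD_STEP_DEG
        send_joint_command("head_yaw", self.head_yaw)

    # --- patterned macros -------------------------------------------
    def nod_yes(self, repeats=2):
        for _ in range(repeats):
            send_joint_command("head_pitch", 20)
            time.sleep(0.3)
            send_joint_command("head_pitch", 0)
            time.sleep(0.3)

    def shake_no(self, repeats=2):
        for _ in range(repeats):
            for angle in (25, -25, 0):
                send_joint_command("head_yaw_offset", angle)
                time.sleep(0.25)

    def wave_hello(self):
        send_joint_command("arm_shoulder", 90)
        for angle in (30, -30, 30, 0):
            send_joint_command("arm_wrist", angle)
            time.sleep(0.2)
        send_joint_command("arm_shoulder", 0)

# Gaze-selected on-screen buttons dispatch to these moves:
COMMANDS = {"LEFT": lambda r: r.turn_head("left"),
            "RIGHT": lambda r: r.turn_head("right"),
            "YES": RobotGestures.nod_yes,
            "NO": RobotGestures.shake_no,
            "WAVE": RobotGestures.wave_hello}

robot = RobotGestures()
COMMANDS["YES"](robot)      # user dwelled on the "YES" button
```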

Example: Gaze-centric control of navigation through cached real-world or 3D virtual-reality animation sequences, potentially with other appropriate sensory stimulation.

Example: Gaze-centric communication and linking for single- and multiplayer games, including chess, card playing, and virtual worlds (Second Life, etc.)

Example: Gaze-centric control of telepresence applications for real-time viewing from different viewpoints.

Example: Gaze-centric control of painting, CAD, sculpting machinery, machine lathes, and other artistry and crafts.

Longer-Term: Addressing the Totally Locked-In State (TLS)
– Challenge of using EEG for preference assessment, acknowledgment, confirmation, and selection
– E.g., rotating selection rings with an "other" option; navigation among a hierarchy of rings; use of simple input for halting, selecting, and confirming
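
A hedged sketch of the rotating-ring idea: a single binary signal (e.g., a thresholded EEG feature) halts a rotating highlight to select the current item, and an "other" slot descends into a child ring. The ring contents, hierarchy, and simulated signal below are all hypothetical.

```python
# Hypothetical rotating-ring selector driven by a single binary signal,
# such as a thresholded EEG feature. The highlight advances at a fixed
# pace; asserting the signal while an item is highlighted selects it,
# and the "other" slot descends into a child ring.

RINGS = {
    "top":     ["yes", "no", "comfort", "people", "other"],
    "comfort": ["pain", "reposition", "temperature", "back", "other"],
}
CHILD_OF = {("top", "other"): "comfort"}   # toy hierarchy

def run_ring(signal, name="top", max_steps=50):
    """`signal` yields True when the user asserts the EEG-derived cue
    during the currently highlighted step, else False."""
    items = RINGS[name]
    for step in range(max_steps):
        item = items[step % len(items)]
        asserted = next(signal, False)
        print(f"[{name}] highlighted: {item}" + ("  <-- select" if asserted else ""))
        if asserted:
            child = CHILD_OF.get((name, item))
            if child:                      # descend into the child ring
                return run_ring(signal, child, max_steps)
            return item
    return None

# Simulated session: the user stays quiet for four steps, asserts on
# "other" (descending into the comfort ring), then asserts on "reposition".
simulated = iter([False, False, False, False, True, False, True])
print("selected:", run_ring(simulated))
```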

Multiple Directions and Possibilities. Comments?