“EyeMouse”: An interaction device for severely motor-disabled people


“EyeMouse”: An interaction device for severely motor-disabled people Carlo Colombo, Massimiliano Corsini Media Integration and Communication Center University of Florence

EyeMouse
A human-machine interaction system that replaces the mouse with eye movements.
Designed to exploit the residual mobility of severely motor-disabled people (e.g. people with multiple sclerosis).

Interaction
User eye movements are captured through computer vision, and then transformed into commands for the on-screen PC interface.
[Diagram: live camera images → CV interpreter → interface commands → screen; the screen provides action feedback to the user]

Eye capture
The external eye and the iris are captured by elastic template matching (snakes).
[Snake template: iris modeled as a circle, external eye as an ellipse]
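The slides name snake-based (elastic template) matching, which is an iterative contour-energy method. As a much simpler stand-in for the circle part of that template, the sketch below fits a circle to iris edge points with an algebraic (Kåsa) least-squares fit; the point data and function name are illustrative, not from the EyeMouse system.

```python
import numpy as np

def fit_circle(points):
    """Algebraic (Kasa) least-squares circle fit to 2-D edge points.

    Solves 2*a*x + 2*b*y + c = x^2 + y^2; the center is (a, b)
    and the radius is sqrt(c + a^2 + b^2).
    """
    pts = np.asarray(points, dtype=float)
    x, y = pts[:, 0], pts[:, 1]
    A = np.column_stack([2 * x, 2 * y, np.ones_like(x)])
    rhs = x**2 + y**2
    (a, b, c), *_ = np.linalg.lstsq(A, rhs, rcond=None)
    radius = np.sqrt(c + a**2 + b**2)
    return (a, b), radius

# Synthetic iris contour: a circle of radius 12 centered at (40, 30).
theta = np.linspace(0, 2 * np.pi, 50, endpoint=False)
contour = np.column_stack([40 + 12 * np.cos(theta),
                           30 + 12 * np.sin(theta)])
center, radius = fit_circle(contour)
```

A real tracker would first extract edge points around a predicted iris location (e.g. by thresholding gradients) and could use this fit to initialize or regularize the snake.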

Eye remapping
The iris position in the image plane is remapped onto the screen plane.
The image-to-screen map is calibrated at startup.
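The slides do not specify the form of the image-to-screen map; a common choice for a planar mapping is a 3×3 homography calibrated from point correspondences, e.g. iris positions recorded while the user fixates known screen targets. The sketch below estimates such a homography with the standard DLT construction; the calibration coordinates are illustrative assumptions.

```python
import numpy as np

def calibrate_homography(image_pts, screen_pts):
    """Estimate a 3x3 homography H mapping image points to screen
    points from >= 4 correspondences, via the DLT linear system."""
    rows = []
    for (x, y), (u, v) in zip(image_pts, screen_pts):
        rows.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        rows.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    A = np.asarray(rows, dtype=float)
    # The null-space vector of A (last row of V^T) is H up to scale.
    _, _, vt = np.linalg.svd(A)
    return vt[-1].reshape(3, 3)

def remap(H, point):
    """Apply H to an image point, returning screen coordinates."""
    u, v, w = H @ np.array([point[0], point[1], 1.0])
    return u / w, v / w

# Startup calibration: iris positions observed while the user fixates
# the four screen corners (all coordinates are made-up examples).
image_pts = [(100, 80), (220, 85), (215, 170), (105, 165)]
screen_pts = [(0, 0), (1280, 0), (1280, 1024), (0, 1024)]
H = calibrate_homography(image_pts, screen_pts)
cursor = remap(H, (160, 125))  # iris near image center -> mid-screen
```

With exactly four correspondences the fit is exact at the calibration points; more targets would let the same least-squares machinery average out measurement noise.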

Mouse functionalities
Navigation: eye movements.
Selection: eye persistence, eye blinks.
Interface feedback compensates for slight remapping errors.
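"Eye persistence" selection is usually implemented as a dwell timer: if the gaze point stays within a small radius for long enough, a click is issued. A minimal sketch of that state machine follows; the radius and dwell-time thresholds are illustrative assumptions, not the EyeMouse authors' values.

```python
import math

class DwellSelector:
    """Turn gaze persistence into click events: if successive gaze
    samples stay within `radius` pixels of the first sample for at
    least `dwell_time` seconds, a selection fires at that point."""

    def __init__(self, radius=40.0, dwell_time=0.8):
        self.radius = radius
        self.dwell_time = dwell_time
        self._anchor = None      # fixation point being timed
        self._anchor_t = None    # timestamp when the fixation began

    def update(self, point, t):
        """Feed one gaze sample (screen coords, time in seconds).
        Returns the fixation point when a dwell completes, else None."""
        if self._anchor is None or math.dist(point, self._anchor) > self.radius:
            # Gaze moved: restart the dwell timer at the new fixation.
            self._anchor, self._anchor_t = point, t
            return None
        if t - self._anchor_t >= self.dwell_time:
            anchor = self._anchor
            self._anchor = None  # reset so one dwell yields one click
            return anchor
        return None

# Feed samples as the tracker produces them, e.g. at 25 Hz:
sel = DwellSelector()
clicks = [sel.update((100, 100), t) for t in (0.0, 0.4, 0.9)]
```

Blink-based selection would sit alongside this: a blink detector (e.g. the snake's eye-ellipse collapsing) can trigger the same click path without the dwell delay.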

“X” movement

“O” movement

Current work
Accommodation of head motion.
Automatic recovery from tracking errors during prolonged interaction.

Contacts
Prof. Carlo Colombo
Dipartimento Sistemi e Informatica, Via S. Marta 3 – Firenze
Centro di Eccellenza MIUR per la Comunicazione e Integrazione dei Media, Sede RAI, Largo A. de Gasperi 1 – Firenze
www.dsi.unifi.it/users/colombo
colombo@dsi.unifi.it