ICRA 2009: Evaluation of a robot as embodied interface for Brain-Computer Interface systems. E. Menegatti, L. Tonin, Intelligent Autonomous Systems Laboratory.

Presentation transcript:

ICRA 2009: Evaluation of a robot as embodied interface for Brain-Computer Interface systems
E. Menegatti, L. Tonin (Intelligent Autonomous Systems Laboratory (IAS-Lab), Department of Information Engineering, University of Padua, Italy)
F. Piccione, S. Silvoni (I.R.C.C.S. San Camillo, Venice, Italy)
K. Priftis (Department of General Psychology, University of Padua, Italy)
ICRA, IEEE International Conference on Robotics and Automation, Kobe, Japan, May 12-17, 2009

Goals:
Evaluate the advantages of a BCI system when the actions triggered by the subject's brain activity are performed by a physical device in the real world (i.e., a mobile robot instead of a GUI).

Motivations and purposes:
Can feedback from the robot and its on-board camera lead to higher engagement of the subjects? Can it lead to better BCI performance? Can telepresence improve patients' quality of life?

Figure 1: BCI with a robot as a physical device providing feedback from the real world.
Figure 2: P300-related peaks in a sample EEG trace.

The holonomic robot:
In the first experiment, the robot simply replicates in the real world the motion of the virtual cursor on the screen. We used a holonomic robot with an omnidirectional camera because:
- it can move to any position in the plane without needing to rotate;
- the omnidirectional video can be used both for streaming images to the user and for generating feedback to the user.

Figure 3: BENDER, the holonomic robot with omnidirectional camera.

BCI data acquisition:
Recording electrodes were placed according to the international 10-20 system at Fz, Cz, Pz, and Oz; the electrooculogram (EOG) was recorded from a pair of electrodes below and lateral to the right eye; all electrodes were referenced to the left earlobe.

Figure 4: Tuning up the electrode configuration.

The five channels were amplified, band-pass filtered between 0.15 Hz and 30 Hz, and digitized (with 16-bit resolution) at a 200 Hz sampling rate.
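As a concrete illustration of the acquisition chain just described, here is a minimal Python sketch of the 0.15-30 Hz band-pass at a 200 Hz sampling rate, assuming NumPy and SciPy; the function name and filter order are our assumptions, not details from the paper:

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt

FS = 200.0              # sampling rate (Hz), as stated in the poster
LOW, HIGH = 0.15, 30.0  # band-pass edges (Hz), as stated in the poster

def bandpass(raw, fs=FS, low=LOW, high=HIGH, order=4):
    """Zero-phase Butterworth band-pass; raw is (n_channels, n_samples)."""
    sos = butter(order, [low, high], btype="band", fs=fs, output="sos")
    return sosfiltfilt(sos, raw, axis=-1)

# Five channels (Fz, Cz, Pz, Oz, EOG), 2 s of simulated raw signal.
raw = np.random.randn(5, int(2 * FS))
filtered = bandpass(raw)  # same shape, band-limited to 0.15-30 Hz
```

A zero-phase (forward-backward) filter is used in the sketch so the P300 peak latency is not shifted by the filtering; an on-line system would instead use a causal filter.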
Each ERP epoch, synchronized with the stimulus, began 500 ms before stimulus onset and extended to 1000 ms after the stimulus trigger signal (1500 ms total). Thus, after each stimulus (trial) presentation, the system recorded a matrix of 300 samples for each of the 5 channels, available for on-line and off-line data processing.

Experiments:
We performed two experiments.

Experiment 1: the task is performed using only the graphical interface as user feedback. The environment is the monitor showing the graphical interface. The user commands a virtual object, moving it to reach one of the four virtual goal-icons displayed.

Experiment 2: the task is performed using the robot as actuator and the robot's camera view as user feedback. The robot is positioned in the middle of a square room. Physical goal-objects are positioned in the room according to the goal-icons used in the virtual interface during Experiment 1. The feedback for the subject is the change in the image (grabbed by the robot) displayed on the screen: the image of a goal-object grows as commands move the robot closer to it.

Figure 5: Representation of a trial during Experiments 1 and 2. (1a) The central position of the virtual cursor and (1b) of the robot, with goal-objects and the four arrows, one of which is flashed; (2a) the movement of the cursor and (2b) of the robot after P300 recognition.

Results using the robot:
- Performance is comparable to that obtained with the old GUI.
- The performance achieved encourages further evaluation of the BCI with the robot as embodied interface.

Figure 6: Classification accuracy (%) of the 5 healthy subjects who performed Experiment 1, and classification accuracy (%) of the one subject who performed Experiment 2.

Figure 7: Screenshot of the new museum graphical user interface, and Rovio, the household holonomic robot.

Future goals:
Telepresence for museum visits: autonomous navigation, and a new graphical user interface showing the masterpieces closest to the current location.
Telepresence for household rehabilitation: Rovio, a consumer holonomic robot with a frontal camera for remote perception.
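The trial pipeline described earlier (epochs from 500 ms before to 1000 ms after the stimulus at 200 Hz, i.e. 300 samples per channel, followed by P300 recognition that moves the cursor or the holonomic robot toward one of four goals) can be sketched as follows. The classifier is stubbed out, and the direction names and speed value are our assumptions, not the paper's implementation:

```python
import numpy as np

FS = 200              # sampling rate (Hz), poster value
PRE, POST = 0.5, 1.0  # seconds before/after stimulus onset, poster values
SPEED = 0.1           # m/s, hypothetical step speed for the holonomic base

def extract_epoch(signal, onset_idx, fs=FS, pre=PRE, post=POST):
    """Return a (n_channels, 300) epoch around one stimulus onset."""
    return signal[:, onset_idx - int(pre * fs): onset_idx + int(post * fs)]

# A holonomic base can translate in any direction without rotating, so
# each of the four goal directions maps directly to a (vx, vy) command.
DIRECTIONS = {
    "up": (0.0, SPEED),
    "down": (0.0, -SPEED),
    "left": (-SPEED, 0.0),
    "right": (SPEED, 0.0),
}

# Five channels, 10 s of simulated recording, one stimulus at t = 3 s.
signal = np.random.randn(5, 10 * FS)
epoch = extract_epoch(signal, onset_idx=3 * FS)
assert epoch.shape == (5, 300)  # 300 samples x 5 channels, as in the poster

# A real system classifies the epoch for a P300; here a stub decision.
target = "up"                # placeholder for the classifier output
vx, vy = DIRECTIONS[target]  # velocity command sent to the robot
```

The same mapping serves both experiments: in Experiment 1 the (vx, vy) step moves the virtual cursor on screen, while in Experiment 2 it is sent to the robot base.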