Activity 3: Multimodality HMI for Hands-free Control of an Intelligent Wheelchair. L. Wei, T. Theodoridis, H. Hu, D. Gu, University of Essex, 27 January 2012.

1 Activity 3: Multimodality HMI for Hands-free Control of an Intelligent Wheelchair. L. Wei, T. Theodoridis, H. Hu, D. Gu, University of Essex, 27 January 2012. École Centrale de Lille. Part-financed by the European Regional Development Fund.

2 1. Outline of the task within the context of the project
1) To develop novel multimodal human-machine interfaces by integrating voice, gesture, brain and muscle signals.
2) To understand:
 the user who interacts with the system
 the system (the computer technology and its usability)
 the interaction between the user and the system
3) To strike a proper balance between:
 Functionality: the set of actions or services the system provides to its users, based on system usability.
 Usability: the range and degree to which the system can be used efficiently and adequately by certain users.

3 1. Outline of the task within the context of the project: System Integration
[Diagram: multimodal HMI inputs (voice, gesture, EEG via electrodes) and on-board sensors (GPS, gyro, laser) feeding the wheelchair; linked to Activity 1 (Navigation) and Activity 2 (Communication).]

4 II. Main results – Gesture-based HMI: System Software Structure

5 System GUI

6 Experiment 1
[Map: planned routes between Docking Area A and Docking Area B, with a wood box barrier and pitch boundary; marked dimensions 5600, 4100 and 1300.]

7 Experiment 1 results: (upper) multi-modality control; (lower) joystick control

8 Experiment 2
Fig. 10: Planned task 2 map for the indoor experiment

9 Experiment 2 results: (left) multi-modality control; (right) joystick control

10 II. Main results – Voice-based HMI
Task: to use voice recognition for controlling a wheelchair
Purpose: to aid people with limited physical capability
Software: the Microsoft Speech SDK
Hardware: the Essex robotic wheelchair
Experimentation: the Essex robotic arena
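To make the control loop concrete, here is a minimal sketch of how recognized command words could be mapped to wheelchair motion. The velocity values and the `dispatch` helper are hypothetical illustrations, not the actual Essex wheelchair interface, which is not shown in the slides.

```python
# Hypothetical mapping from recognized command words to (linear m/s, angular rad/s)
# velocity pairs; the real wheelchair interface and speeds are assumptions.
COMMANDS = {
    "forward": (0.3, 0.0),
    "back":    (-0.2, 0.0),
    "left":    (0.0, 0.5),
    "right":   (0.0, -0.5),
    "stop":    (0.0, 0.0),
}

def dispatch(word):
    """Return the velocity command for a recognized word; stop if unrecognized."""
    return COMMANDS.get(word.lower(), (0.0, 0.0))

print(dispatch("Forward"))  # (0.3, 0.0)
```

Falling back to a stop command on an unrecognized word is a conservative safety choice for a mobility aid.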

11 Speech Recognition Structure
Driving components:
 Start: capture the voice command
 Sampling: sample the voice signal in real time
 Calculate energy: validate the signal's presence
 Calculate zero-crossing rate: validate the signal's changes
 Calculate entropy: validate the signal's utterance
 Speech recognition by parser: Microsoft Speech SDK
 Driving: Forward, Back, Left, Right, Stop
[Flowchart: Start -> Sampling real-time signals -> Calculate energy -> Calculate zero-crossing rate -> Calculate entropy -> Speech recognition by parser -> Driving]
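The three validation steps above (energy, zero-crossing rate, entropy) are standard voice-activity features. The following is a minimal NumPy sketch of how they could be computed per frame; the frame length, sample rate and thresholds are assumptions, not the authors' implementation.

```python
import numpy as np

def frame_features(frame):
    """Compute three voice-activity features for one audio frame."""
    frame = frame.astype(np.float64)
    # Short-time energy: validates the signal's presence.
    energy = np.sum(frame ** 2) / len(frame)
    # Zero-crossing rate: validates the signal's changes (fraction of sign flips).
    zcr = np.mean(np.abs(np.diff(np.sign(frame))) > 0)
    # Spectral entropy: validates the signal's utterance
    # (speech concentrates energy in few bins; broadband noise spreads it out).
    power = np.abs(np.fft.rfft(frame)) ** 2
    p = power / (np.sum(power) + 1e-12)
    entropy = -np.sum(p * np.log2(p + 1e-12))
    return energy, zcr, entropy

# Example: a 10 ms frame of a 440 Hz tone at an assumed 16 kHz sample rate.
t = np.arange(160) / 16000.0
energy, zcr, entropy = frame_features(np.sin(2 * np.pi * 440 * t))
```

A detector would compare each feature against a threshold tuned on silence recordings before passing the frame on to the recognizer.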

12 Microsoft Speech SDK
Features:
 Developed by Microsoft's Speech Technologies Group
 Recognizes audio speech and performs text-to-speech synthesis
 The API can be used from common programming languages, including C++
FFTW Core:
 FFTW is a ready-made library for computing the discrete Fourier transform (DFT)
 Developed at MIT, written in C
 Used to increase the running speed
Recognition Accuracy:
 Five commands are employed for control
 Recognition accuracy between 86% and 93%
 Adequate real-time control

Command | Accuracy
Forward | 90%
Back    | 93%
Right   | 92%
Left    | 86%
Stop    | 90%
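To illustrate the transform FFTW accelerates, here is a short sketch using NumPy's FFT as a stand-in (it computes the same DFT; FFTW itself is a C library). The sample rate and tone frequency are assumed values for the example only.

```python
import numpy as np

# The DFT maps N time-domain samples x[n] to N frequency bins:
#   X[k] = sum_n x[n] * exp(-2j*pi*k*n/N)
# FFTW computes this in O(N log N); numpy.fft is used here as a stand-in.
fs = 16000                                # assumed sample rate (Hz)
n = np.arange(512)
x = np.sin(2 * np.pi * 1000 * n / fs)     # 1 kHz test tone, exactly 32 cycles

spectrum = np.abs(np.fft.rfft(x))
peak_bin = int(np.argmax(spectrum))
peak_hz = peak_bin * fs / len(x)          # convert bin index to frequency
print(peak_hz)                            # 1000.0
```

Because 1 kHz falls exactly on a bin at this length and rate, the peak lands at bin 32 with no spectral leakage.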

13 Testing Results
Environment 1: a simple corridor with no obstacles. Task: reach a destination at the same horizontal coordinate as the origin.
Environment 2: an open area with two obstacles. Task: avoid the obstacles in a zigzag fashion and return to the origin.

Test    | Time (sec), Environment 1 | Time (sec), Environment 2
1       | 135.1                     | 220.1
2       | 134.9                     | 223.5
3       | 134.5                     | 218.3
4       | 135.3                     | 214.4
5       | 126.4                     | 209.1
6       | 123.1                     | 208.8
7       | 114.8                     | 204.6
8       | 116.1                     | 197.9
9       | 112.9                     | 205.3
10      | 115.3                     | 206.4
Average | ≈ 2 min                   | ≈ 3.5 min
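The reported averages can be checked directly from the table:

```python
# Per-test completion times (seconds) copied from the table above.
env1 = [135.1, 134.9, 134.5, 135.3, 126.4, 123.1, 114.8, 116.1, 112.9, 115.3]
env2 = [220.1, 223.5, 218.3, 214.4, 209.1, 208.8, 204.6, 197.9, 205.3, 206.4]

avg1 = sum(env1) / len(env1)   # 124.84 s, i.e. roughly 2 min
avg2 = sum(env2) / len(env2)   # 210.84 s, i.e. roughly 3.5 min
```

So the corridor runs average about 124.8 s and the obstacle runs about 210.8 s, consistent with the ≈ 2 min and ≈ 3.5 min figures.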

14 III. Future challenges and the work to be done
1) A novel multi-modal HMI will be developed by integrating voice control, gesture control, and brain- and muscle-actuated control, in order to meet the needs of different users.
2) The novel navigation and control algorithms developed in Activity 1 will be integrated into the wheelchair, including map-building, path planning, obstacle avoidance, self-localization, trajectory generation, etc.
3) An integrated communication system, developed in Activity 2, will allow confidential data to be made available on the intelligent wheelchair.

