Implementation of Gesture Recognition in the NIST Immersive Visualization Environment
Luis D. Catacora, under the guidance of Judith Terrill and Terence Griffin
National Institute of Standards and Technology

Abstract

The purpose of this research project was to study and implement gesture recognition for the Immersive Visualization Environment (IVE) at NIST. The IVE is a virtual room that lets a user manipulate 3D objects with a wand controller and visualize them through a headset. Although the software at NIST already included flexible and intuitive navigation methods, its use of pointer gestures had been limited. To establish better interaction between the immersive environment and its user, game technology was researched to develop efficient gesture recognition for the RAVE (Reconfigurable Automatic Virtual Environment). The procedure followed a basic four-step plan for each gesture implemented in the RAVE. The data gathered covered the position, velocity, and acceleration of the RAVE wand while the user performed a specific gesture. The implemented gesture actions were written as command scripts that can easily be replaced with any executable method, making the actions completely user definable.

Background

The IVE enables scientific visualization of large amounts of data for analysis and measurement. Data analyzed in the RAVE can originate from large parallel computations, such as biological and chemical data sets of interest. The RAVE offers the user various commands and capabilities, such as onscreen menus and displays, but it lacks the user-friendly interaction that more modern technologies provide through gesture recognition. This virtual room therefore requires gesture recognition to increase efficiency and compatibility between the user and the technology. As visualizations become increasingly complex, better methods of human interaction are needed; consequently, detailed and accurate gestures should be investigated. Game technology was chosen for research because it holds the most up-to-date expertise in gesture recognition, and predefined gaming gestures turn out to be very useful in the RAVE's domain of data manipulation and visualization. This project focuses on the development of gesture recognition (e.g., flicks, stabs, and lines) and the process followed to accomplish it.
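The recognition algorithms described under Methods below operate on the wand's position and on the velocity and acceleration derived from it. Here is a minimal sketch, in the project's implementation language of C++, of how those derivatives can be estimated from successive tracker samples by finite differences; the WandSample struct, its field names, and the timestamped sampling are assumptions for illustration, not the poster's actual code.

// Estimate wand velocity and acceleration from three consecutive tracker
// samples using central differences that tolerate uneven sample spacing.
#include <array>

struct WandSample {
    std::array<double, 3> pos;  // wand position (x, y, z) in RAVE coordinates
    double t;                   // sample timestamp in seconds
};

struct WandDerivatives {
    std::array<double, 3> vel;  // estimated velocity
    std::array<double, 3> acc;  // estimated acceleration
};

WandDerivatives estimateDerivatives(const WandSample& prev,
                                    const WandSample& cur,
                                    const WandSample& next) {
    WandDerivatives d{};
    const double dt1 = cur.t - prev.t;   // spacing before the middle sample
    const double dt2 = next.t - cur.t;   // spacing after the middle sample
    for (int i = 0; i < 3; ++i) {
        // Velocity: slope of position across the three-sample window.
        d.vel[i] = (next.pos[i] - prev.pos[i]) / (dt1 + dt2);
        // Acceleration: second difference of position, general spacing.
        d.acc[i] = 2.0 * (dt1 * (next.pos[i] - cur.pos[i]) -
                          dt2 * (cur.pos[i] - prev.pos[i])) /
                   (dt1 * dt2 * (dt1 + dt2));
    }
    return d;
}

With a tracker running at a fixed interval h, these reduce to the familiar (p[k+1] - p[k-1]) / 2h and (p[k+1] - 2 p[k] + p[k-1]) / h^2 estimates, the kind of position, velocity, and acceleration traces against which the thresholds below are calibrated.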
Methods

Current work in gesture recognition for the IVE was researched and compared against established game technology. The main game console of interest was the Wii because of its similar dependency on a wand controller in 3D space. The only data collected was used to aid the gesture-recognizing algorithms in determining gesture patterns, and to verify that the correct gesture was recognized upon completion.

Steps taken for each gesture (code sketches illustrating steps 2 and 3 appear at the end of this section):
1. Define the gesture: specify what happens when the gesture is performed, and define what the recognizer will look for.
2. Collect data: determine how each variable changes during the gesture, and analyze the data to determine thresholds for calibration.
3. Program the gesture: use the algorithms and data analysis to write recognition code in C++, and write the gesture number to a shared memory file.
4. Test the gesture: compile and execute the code, and check that the correct gesture output is performed on both the desktop version and the RAVE version.

Figure 1: The desktop version of the IVE.

Results

The resulting program [Figure 2] was able to successfully recognize basic gestures from human movement. Raw data was stored in text files [Figure 3] and reviewed to analyze and correct gesture recognition. The gestures performed were identical to gestures recognized by the Wii console [Figure 4]. The various algorithms looked for changes in the data similar to the patterns shown in Figures 5.1-5.3. The display of gesture detection progressed from onscreen terminal messages, to pop-up images, to 3D arrows. The pop-up images [Figures 6.1-6.2] were helpful but slightly distracting, obscuring the item the user was viewing in the IVE, so a constantly changing arrow was created in 3D space that reacted to the user's gesture [Figures 7.1-7.3].

Figure 2: Sample code from the main program that uses algorithms to recognize gestures; specific variables track changes in position for lines and in acceleration for flicks.
Figure 3: Sample raw data stored in the text file "zout.txt", showing the main variables used: the wand's position along X, Y, and Z, along with its three Euler angles (yaw, pitch, and roll).
Figure 4: A sample of the various gestures that the Wii console recognizes.
Figures 5.1-5.3: Patterns in position over time, speed over time, and acceleration over time (in that order) that the coded algorithms looked for; the examples shown represent the components of a line gesture.
Figures 6.1-6.2: Pop-up images displayed in the desktop version of the IVE.
Figures 7.1-7.3: Sample images of the 3D arrow geometry.
Figure 8: The desktop version of the IVE with the final work: pictures, data, and gesture recognition calibration options.

Conclusions

In the end, the completed program was able to successfully identify a total of 15 gestures, and it lays the basic building blocks for more complex gestures in the future. Advancements in gesture recognition will improve the analysis and measurement of data and the interaction between the user and complex data within the IVE.

Future Research

Future work includes curve fitting the points traced by the wand to recognize more complex gestures, leading to quicker interaction between users and the IVE. This upgrade could let users define not only the action a gesture triggers but also how the gesture itself is recognized in terms of the given variables.
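As a companion to Figure 3, here is a minimal sketch of how one tracker sample per line might be appended to a raw-data log such as "zout.txt"; the exact file layout, the precision, and the WandState struct are assumptions, since the poster only names the variables that were stored.

// Append one whitespace-separated record per tracker sample to the raw log:
// x y z yaw pitch roll.
#include <fstream>
#include <iomanip>

struct WandState {
    double x, y, z;           // wand position in the environment
    double yaw, pitch, roll;  // wand orientation as Euler angles
};

void logSample(std::ofstream& out, const WandState& s) {
    out << std::fixed << std::setprecision(4)
        << s.x << ' ' << s.y << ' ' << s.z << ' '
        << s.yaw << ' ' << s.pitch << ' ' << s.roll << '\n';
}

int main() {
    std::ofstream out("zout.txt", std::ios::app);
    WandState s{0.12, 1.50, -0.33, 10.0, -5.0, 0.5};  // example sample
    logSample(out, s);
    return 0;
}

Logs in this form can be replayed offline to pick the calibration thresholds called for in step 2 of the Methods.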
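A minimal sketch of steps 3 and 4 of the Methods follows: a single threshold test for one gesture (a flick, detected as an acceleration spike) and the handoff of the recognized gesture number to other processes. The gesture numbering, the threshold value, and the use of a POSIX memory-mapped file are all assumptions; the poster states only that the gesture number is written to a shared memory file.

// Classify a flick from the acceleration magnitude and publish the gesture
// number through a memory-mapped file that a dispatcher process can poll.
#include <fcntl.h>
#include <sys/mman.h>
#include <unistd.h>

enum GestureId { GESTURE_NONE = 0, GESTURE_FLICK = 1, GESTURE_LINE = 2 };

// In practice this threshold is calibrated from recorded gesture data;
// the value here is a placeholder.
constexpr double kFlickAccelThreshold = 30.0;  // m/s^2

GestureId classify(double accelMagnitude) {
    return (accelMagnitude > kFlickAccelThreshold) ? GESTURE_FLICK
                                                   : GESTURE_NONE;
}

void writeGestureNumber(const char* path, int gesture) {
    int fd = open(path, O_RDWR | O_CREAT, 0666);
    if (fd < 0) return;
    if (ftruncate(fd, sizeof(int)) == 0) {
        void* mem = mmap(nullptr, sizeof(int), PROT_READ | PROT_WRITE,
                         MAP_SHARED, fd, 0);
        if (mem != MAP_FAILED) {
            *static_cast<int*>(mem) = gesture;  // visible to the reader
            munmap(mem, sizeof(int));
        }
    }
    close(fd);
}

A reader process, such as the dispatcher that runs the user-definable command scripts, can map the same file and launch whatever script is bound to the gesture number it finds there.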

