Presentation transcript:

Facilitating User Interaction with Complex Systems via Hand Gesture Recognition
Joshua R. New, Erion Hasanbelliu, and Mario Aguilar
MCIS Department, Knowledge Systems Laboratory, Jacksonville State University

Outline
Motivation
System Architecture
Implementation Overview
Proposed Approach
Demonstration
Future Directions

Motivation
Gesturing is a natural form of communication
Interaction problems with the mouse:
–Users must first locate the cursor
–Difficult for some users to control (e.g., users with Parkinson's disease or users on a moving train)
–Limited forms of input from the mouse

Motivation (2)
Interaction problems with the virtual reality glove:
–Reliability
–Always tethered to the computer
–Encumbrance

System Architecture
[Diagram: the user's hand movement is captured by a standard web camera; the image input feeds the gesture recognition system, which updates the rendered object shown on the user interface display.]
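The original system was built on OpenCV and Intel's IPL with a Qt front end; the loop below is only a minimal Python/OpenCV sketch of the same capture, recognize, render cycle, with recognize_gesture and update_view as hypothetical placeholders for the recognition and display stages.

    import cv2

    def recognize_gesture(frame):
        """Hypothetical placeholder for the gesture recognition stage."""
        return None  # e.g., (centroid, finger_count)

    def update_view(result, frame):
        """Hypothetical placeholder for updating the rendered object/display."""
        cv2.imshow("Gesture UI", frame)

    cap = cv2.VideoCapture(0)                  # standard web camera
    cap.set(cv2.CAP_PROP_FRAME_WIDTH, 640)     # 640x480 input, as on the slides
    cap.set(cv2.CAP_PROP_FRAME_HEIGHT, 480)

    while True:
        ok, frame = cap.read()                 # image capture
        if not ok:
            break
        result = recognize_gesture(frame)      # gesture recognition system
        update_view(result, frame)             # update object / display
        if cv2.waitKey(1) & 0xFF == ord('q'):
            break

    cap.release()
    cv2.destroyAllWindows()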

Implementation Overview
System: 1.6 GHz AMD Athlon, OpenCV and IPL libraries (from Intel)
Input: 640x480 video image, hand calibration measure
Output:
–Rough estimate of the hand centroid
–Refined estimate of the centroid
–Number of fingers being held up
–Manipulation of a 3D skull in a Qt interface in response to gesturing
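The outputs listed above can be carried through the pipeline in a small result structure; the container below is a hypothetical convenience for the sketches that follow, not something shown in the slides.

    from dataclasses import dataclass
    from typing import Tuple

    @dataclass
    class GestureResult:
        rough_centroid: Tuple[float, float]    # rough estimate of the hand centroid
        refined_centroid: Tuple[float, float]  # refined estimate after the arm is segmented away
        finger_count: int                      # number of fingers being held up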

Implementation Overview (2)
Hand calibration measure: maximum hand size in the x and y orientations, in number of pixels
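The slides do not show how this measure is computed; a minimal sketch, assuming a binary hand mask is available, is to take the extent of the nonzero pixels along each axis.

    import numpy as np

    def hand_extent(mask: np.ndarray) -> tuple[int, int]:
        """Return the hand's (width in x, height in y) in pixels.
        `mask` is a binary image in which hand pixels are nonzero."""
        ys, xs = np.nonzero(mask)
        if xs.size == 0:
            return 0, 0
        return int(xs.max() - xs.min() + 1), int(ys.max() - ys.min() + 1)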

Implementation Overview (3)
Saturation channel extraction (HSL color space)
[Figure: the original image alongside its hue, lightness, and saturation channels]
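A minimal OpenCV sketch of this extraction, together with the threshold and rough-centroid steps that follow it in the processing pipeline; OpenCV's HLS conversion returns channels in H, L, S order, so the saturation plane is index 2. The threshold value of 128 and the file name are illustrative assumptions, not values from the slides.

    import cv2

    # `frame` could also come from the capture loop sketched earlier.
    frame = cv2.imread("hand.png")                 # "hand.png" is a placeholder path
    hls = cv2.cvtColor(frame, cv2.COLOR_BGR2HLS)   # OpenCV channel order: H, L, S
    saturation = hls[:, :, 2]                      # extract the saturation channel

    # Threshold the saturation image to get a binary hand mask.
    _, mask = cv2.threshold(saturation, 128, 255, cv2.THRESH_BINARY)

    # Rough centroid of the mask from image moments.
    m = cv2.moments(mask, True)                    # True: treat the input as binary
    if m["m00"] > 0:
        cx, cy = m["m10"] / m["m00"], m["m01"] / m["m00"]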

Proposed Approach

Proposed Approach (2)

Proposed Approach (3)
The finger-finding function sweeps out a circle around the refined center of mass (rCoM), counting the number of white and black pixels as it progresses.
A finger is defined as any run of 10 or more white pixels separated by 17 or more black pixels (for salt-and-pepper noise tolerance).
The total finger count is the number of such runs minus one, which accounts for the hand itself.
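A minimal sketch of this sweep, assuming a binary hand mask and the refined centroid from the earlier steps; the sweep radius, the number of samples, and the exact run-detection bookkeeping are illustrative choices, not details given on the slides.

    import numpy as np

    def count_fingers(mask, center, radius, n_samples=360,
                      min_white=10, min_black=17):
        """Sweep a circle around the refined centroid, counting white runs.
        A run of >= min_white white pixels bounded by >= min_black black
        pixels counts once; one run is subtracted for the hand itself."""
        cx, cy = center
        angles = np.linspace(0.0, 2.0 * np.pi, n_samples, endpoint=False)
        xs = np.clip((cx + radius * np.cos(angles)).astype(int), 0, mask.shape[1] - 1)
        ys = np.clip((cy + radius * np.sin(angles)).astype(int), 0, mask.shape[0] - 1)
        samples = mask[ys, xs] > 0                 # True = white (hand) pixel

        runs, white, black = 0, 0, 0
        for is_white in samples:
            if is_white:
                white += 1
                black = 0
            else:
                black += 1
                if black >= min_black and white >= min_white:
                    runs += 1                      # closed off a qualifying white run
                    white = 0
                elif black >= min_black:
                    white = 0
        return max(runs - 1, 0)                    # minus one for the hand itself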

Proposed Approach (4)
System runtime: about 41 ms per camera image, a processing capability of roughly 24 frames per second on the 1.6 GHz Athlon.
Per-step timing on the Athlon XP 1900 (1.6 GHz), in milliseconds:
1) Extract saturation channel: 9
2) Threshold: 3
3) Connected contour fill: 14
4) Centroid: 2
5) Segment hand from arm: 9
6) Refined centroid: 4
7) Count number of fingers: 0
Total time: 41
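A per-step breakdown like this can be reproduced by timing each stage around its call; the profile helper below is a hypothetical sketch, and the identity lambdas merely stand in for the real processing stages.

    import time

    def profile(steps, frame):
        """Run each (name, fn) stage on the running result, timing it in ms."""
        timings, data = {}, frame
        for name, fn in steps:
            t0 = time.perf_counter()
            data = fn(data)
            timings[name] = (time.perf_counter() - t0) * 1000.0
        timings["Total"] = sum(timings.values())
        return timings, data

    # Usage: identity lambdas stand in for the seven real stages.
    steps = [("Extract saturation channel", lambda x: x),
             ("Threshold", lambda x: x),
             ("Count number of fingers", lambda x: x)]
    print(profile(steps, frame=None)[0])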

Demonstration
[Figures: system configuration; system GUI layout]

Demonstration (2)
Gesture-to-interaction mapping, by number of fingers:
–2: roll left
–3: roll right
–4: zoom in
–5: zoom out
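A minimal sketch of this mapping; the command names and the view object are placeholders for whatever calls drive the 3D skull in the Qt interface, which the slides do not show.

    # Maps the detected finger count to a viewer command (names are illustrative).
    GESTURE_COMMANDS = {
        2: "roll_left",
        3: "roll_right",
        4: "zoom_in",
        5: "zoom_out",
    }

    def apply_gesture(finger_count, view):
        """Invoke the viewer method that corresponds to the finger count, if any."""
        command = GESTURE_COMMANDS.get(finger_count)
        if command is not None:
            getattr(view, command)()   # e.g., view.zoom_in() on the 3D skull widget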

Demonstration (3)

Demonstration (4)

Future Directions
Optimization
Calibration Phase
Defining Hand Orientation
Learning System Interface
Extensions
For additional information, please visit