EEC-492/592 Kinect Application Development Lecture 15 Wenbing Zhao


1 EEC-492/592 Kinect Application Development Lecture 15 Wenbing Zhao wenbing@ieee.org

2 Outline
Approaches to gesture recognition
- Rules based
  - Single pose based (this lecture)
  - Multiple poses based
- Machine learning based

3 Gesture Recognition Engine
The recognition engine typically performs the following tasks:
- It accepts user actions in the form of skeleton data
- It matches the data points with predefined logic for a specific gesture
- It executes actions if the gesture is recognized
- It responds to the users
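The four tasks above can be sketched as a minimal rules-based engine. This is an illustrative Python sketch only (the lecture's implementation is in C# and appears on the following slides); the dictionary keys and joint heights are made-up example values.

```python
class GestureEngine:
    """Minimal rules-based gesture engine: accept data, match a rule, respond."""

    def __init__(self):
        self.listeners = []  # callbacks fired when a gesture is recognized

    def on_recognized(self, callback):
        self.listeners.append(callback)

    def feed(self, skeleton):
        """Accept skeleton data and match it against predefined logic."""
        if skeleton is None:
            return
        # Example rule: both hands above the head counts as a gesture.
        if (skeleton["hand_left_y"] > skeleton["head_y"] and
                skeleton["hand_right_y"] > skeleton["head_y"]):
            for cb in self.listeners:  # respond to the user
                cb("TwoHandsRaised")


engine = GestureEngine()
results = []
engine.on_recognized(results.append)
engine.feed({"head_y": 1.5, "hand_left_y": 1.8, "hand_right_y": 1.7})
# results now contains "TwoHandsRaised"
```

The callback list plays the role of the C# `GestureRecognized` event in the slides that follow.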

4 Gesture Recognition Engine

5

6 Recognizing Two Gestures

7 Gesture Recognition Engine
The engine uses a GestureType enum, a RecognitionResult enum, and a GestureEventArgs class:

public enum GestureType { HandsClapping, TwoHandsRaised }

public enum RecognitionResult { Unknown, Failed, Success }

public class GestureEventArgs : EventArgs
{
    public GestureType gsType { get; internal set; }
    public RecognitionResult Result { get; internal set; }

    public GestureEventArgs(GestureType t, RecognitionResult result)
    {
        this.Result = result;
        this.gsType = t;
    }
}

8 Gesture Recognition Engine
public class GestureRecognitionEngine
{
    public GestureRecognitionEngine() { }

    public event EventHandler<GestureEventArgs> GestureRecognized;
    public Skeleton Skeleton { get; set; }
    public GestureType GestureType { get; set; }

    public void StartRecognize(GestureType t)
    {
        this.GestureType = t;
        switch (t)
        {
            case GestureType.HandsClapping:
                this.MatchHandClappingGesture(this.Skeleton);
                break;
            case GestureType.TwoHandsRaised:
                this.MatchTwoHandsRaisedGesture(this.Skeleton);
                break;
            default:
                break;
        }
    }

9 Gesture Recognition Engine
    float previousDistance = 0.0f;

    private void MatchHandClappingGesture(Skeleton skeleton)
    {
        if (skeleton == null)
        {
            return;
        }
        if (skeleton.Joints[JointType.HandRight].TrackingState == JointTrackingState.Tracked &&
            skeleton.Joints[JointType.HandLeft].TrackingState == JointTrackingState.Tracked)
        {
            float currentDistance = GetJointDistance(
                skeleton.Joints[JointType.HandRight],
                skeleton.Joints[JointType.HandLeft]);
            // A clap: the hands move within 0.1 m of each other after
            // previously being farther apart
            if (currentDistance < 0.1f && previousDistance > 0.1f)
            {
                if (this.GestureRecognized != null)
                {
                    this.GestureRecognized(this,
                        new GestureEventArgs(GestureType.HandsClapping, RecognitionResult.Success));
                }
            }
            previousDistance = currentDistance;
        }
    }
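The clap rule, i.e. firing only when the hand distance crosses below the threshold, can be sketched in Python on synthetic hand positions. The 0.1 m threshold comes from the slide; the coordinates are made up for illustration.

```python
import math


class ClapDetector:
    """Fire once when hand distance crosses below the threshold."""

    def __init__(self, threshold=0.1):  # 0.1 m, as in the slide
        self.threshold = threshold
        self.previous = 0.0  # distance from the previous frame

    def update(self, left_hand, right_hand):
        """left_hand/right_hand are (x, y, z) positions in meters."""
        current = math.dist(left_hand, right_hand)  # Euclidean distance
        clapped = current < self.threshold and self.previous > self.threshold
        self.previous = current
        return clapped


detector = ClapDetector()
apart = detector.update((0.0, 1.0, 2.0), (0.5, 1.0, 2.0))    # hands 0.5 m apart
together = detector.update((0.0, 1.0, 2.0), (0.05, 1.0, 2.0))  # hands 0.05 m apart
```

Tracking `previous` is what prevents the gesture from re-firing on every frame while the hands stay together, since the condition requires an actual crossing of the threshold.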

10 Gesture Recognition Engine
    private void MatchTwoHandsRaisedGesture(Skeleton skeleton)
    {
        if (skeleton == null)
        {
            return;
        }
        float threshold = 0.3f;
        if (skeleton.Joints[JointType.HandRight].Position.Y > skeleton.Joints[JointType.Head].Position.Y + threshold &&
            skeleton.Joints[JointType.HandLeft].Position.Y > skeleton.Joints[JointType.Head].Position.Y + threshold)
        {
            if (this.GestureRecognized != null)
            {
                this.GestureRecognized(this,
                    new GestureEventArgs(GestureType.TwoHandsRaised, RecognitionResult.Success));
            }
        }
    }
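The rule itself is a single comparison per hand; a Python sketch of the check, using the 0.3 m threshold from the slide and made-up Y coordinates:

```python
def two_hands_raised(left_hand_y, right_hand_y, head_y, threshold=0.3):
    """True when both hands are at least `threshold` meters above the head."""
    return (left_hand_y > head_y + threshold and
            right_hand_y > head_y + threshold)


both_up = two_hands_raised(1.9, 1.85, 1.5)   # both hands above 1.8 m
one_low = two_hands_raised(1.9, 1.7, 1.5)    # right hand below 1.8 m
```

The threshold keeps the gesture from triggering when the hands are only slightly above the head, which reduces false positives from normal arm movement.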

11 Gesture Recognition Engine
    private float GetJointDistance(Joint firstJoint, Joint secondJoint)
    {
        float distanceX = firstJoint.Position.X - secondJoint.Position.X;
        float distanceY = firstJoint.Position.Y - secondJoint.Position.Y;
        float distanceZ = firstJoint.Position.Z - secondJoint.Position.Z;
        return (float)Math.Sqrt(Math.Pow(distanceX, 2) + Math.Pow(distanceY, 2) + Math.Pow(distanceZ, 2));
    }
}

12 Vectors, Dot Product, Angles
Segments of the body can be represented using vectors
- You can create a class/struct for Vector3, or use the Unity Vector3 type

Vector3 seg;
seg.X = Joint1.Position.X - Joint2.Position.X;
seg.Y = Joint1.Position.Y - Joint2.Position.Y;
seg.Z = Joint1.Position.Z - Joint2.Position.Z;

Dot product of two vectors:

Vector3 seg1, seg2;
float dotproduct = seg1.X*seg2.X + seg1.Y*seg2.Y + seg1.Z*seg2.Z;

Angle formed by two vectors, in degrees:

float seg1magnitude = (float)Math.Sqrt(seg1.X*seg1.X + seg1.Y*seg1.Y + seg1.Z*seg1.Z);
float seg2magnitude = (float)Math.Sqrt(seg2.X*seg2.X + seg2.Y*seg2.Y + seg2.Z*seg2.Z);
float angle = (float)(Math.Acos(dotproduct / (seg1magnitude * seg2magnitude)) * 180 / Math.PI);
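The dot-product and angle formulas above can be checked with a short Python sketch (illustrative only; the lecture code is C#):

```python
import math


def angle_between(u, v):
    """Angle in degrees between 3D vectors u and v, via the dot product."""
    dot = sum(a * b for a, b in zip(u, v))
    mag_u = math.sqrt(sum(a * a for a in u))
    mag_v = math.sqrt(sum(b * b for b in v))
    # cos(theta) = (u . v) / (|u| |v|); convert radians to degrees
    return math.degrees(math.acos(dot / (mag_u * mag_v)))


angle_between((1, 0, 0), (0, 1, 0))  # perpendicular vectors: 90 degrees
angle_between((1, 0, 0), (1, 1, 0))  # 45 degrees
```

With body segments as the vectors (e.g. upper arm and forearm), this gives joint angles such as elbow flexion, which is a common building block for pose-based gesture rules.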

13 Build a Gesture Recognition App
The app can recognize two gestures:
- Hand clapping
- Two hands raised

Setting up the project:
- Create a new C# WPF project named GestureRecognitionBasic
- Add the Microsoft.Kinect reference, import the namespace, etc.
- Add the GUI components
- Create a new C# file named GestureRecognitionEngine.cs, and copy the engine code into this class (in Solution Explorer, right-click, then Add => New Item)

14 Build a Gesture Recognition App
User interface: a Canvas, a TextBox, and an Image

15 Build a Gesture Recognition App
Add member variables:

KinectSensor sensor;
private WriteableBitmap colorBitmap;
private byte[] colorPixels;
Skeleton[] totalSkeleton = new Skeleton[6];
Skeleton skeleton;
GestureRecognitionEngine recognitionEngine;

Modify the constructor:

public MainWindow()
{
    InitializeComponent();
    Loaded += new RoutedEventHandler(WindowLoaded);
}

16 Build a Gesture Recognition App
private void WindowLoaded(object sender, RoutedEventArgs e)
{
    if (KinectSensor.KinectSensors.Count > 0)
    {
        this.sensor = KinectSensor.KinectSensors[0];
        if (this.sensor != null && !this.sensor.IsRunning)
        {
            this.sensor.Start();
            this.sensor.ColorStream.Enable();
            this.colorPixels = new byte[this.sensor.ColorStream.FramePixelDataLength];
            this.colorBitmap = new WriteableBitmap(this.sensor.ColorStream.FrameWidth,
                this.sensor.ColorStream.FrameHeight, 96.0, 96.0, PixelFormats.Bgr32, null);
            this.image1.Source = this.colorBitmap;
            this.sensor.ColorFrameReady += this.colorFrameReady;
            this.sensor.SkeletonStream.Enable();
            this.sensor.SkeletonFrameReady += skeletonFrameReady;
            recognitionEngine = new GestureRecognitionEngine();
            recognitionEngine.GestureRecognized += gestureRecognized;
        }
    }
}

17 Build a Gesture Recognition App
Gesture recognized event handler (colorFrameReady(), DrawSkeleton(), drawBone(), and ScalePosition() are the same as before):

void gestureRecognized(object sender, GestureEventArgs e)
{
    textBox1.Text = e.gsType.ToString();
}

18 Build a Gesture Recognition App
Handle the skeleton frame ready event:

void skeletonFrameReady(object sender, SkeletonFrameReadyEventArgs e)
{
    canvas1.Children.Clear();
    using (SkeletonFrame skeletonFrame = e.OpenSkeletonFrame())
    {
        if (skeletonFrame == null)
        {
            return;
        }
        skeletonFrame.CopySkeletonDataTo(totalSkeleton);
        skeleton = (from trackskeleton in totalSkeleton
                    where trackskeleton.TrackingState == SkeletonTrackingState.Tracked
                    select trackskeleton).FirstOrDefault();
        if (skeleton == null)
            return;
        DrawSkeleton(skeleton);
        recognitionEngine.Skeleton = skeleton;
        recognitionEngine.StartRecognize(GestureType.HandsClapping);
        recognitionEngine.StartRecognize(GestureType.TwoHandsRaised);
    }
}

19 Challenge Tasks
Add recognition of two more gestures:
- Right hand is raised
- Left hand is raised

Add the GestureRecognitionEngine.cs to a Unity+Kinect app, and add visual feedback on the gestures recognized

