
EEC-492/592 Kinect Application Development Lecture 16 Wenbing Zhao


1 EEC-492/592 Kinect Application Development Lecture 16 Wenbing Zhao wenbing@ieee.org

2 Outline
- Algorithmic gesture recognition overview
- Swipe-to-left gesture
- Object-oriented redesign of the recognition engine
- Implementation details
- Build an app to test it

3 Algorithmic Gesture Recognition
- Uses a set of predefined conditions and parameters, and detects and validates a gesture against each of them
- Validates a gesture as it is being performed, by ensuring that the start pose, the constraints and parameters, and the end pose remain valid
- The algorithmic approach not only recognizes the gesture, it also tracks whether the gesture is performed correctly
- Example gestures that can be handled algorithmically:
  - Hand moving in the same direction
  - A swipe to the right or the left
  - Zooming in and out
  - Waving hands

4 Algorithmic Gesture Recognition
Recognition phases: Start -> Condition -> Validation -> Finish

5 SwipeToLeft Example
Start:
- The left hand joint should be below the left elbow and the spine joint
- The right hand joint should be below the right shoulder joint and above the right elbow joint
Condition & validation:
- The user should move the right hand from right to left while maintaining the right hand and left hand joint positions
Finish:
- After a specific number of frames, when the gesture reaches the last condition to validate, check whether the distance between the right hand joint and the left shoulder has decreased since the starting point

6 Implementation of Algorithmic Gesture Recognition
The GestureType enum and the GestureEventArgs event argument:

public enum GestureType
{
    SwipeToRight,
    SwipeToLeft,
    ZoomIn,
    ZoomOut
}

public class GestureEventArgs : EventArgs
{
    public RecognitionResult Result { get; internal set; }
    public GestureType GestureType { get; internal set; }

    public GestureEventArgs(RecognitionResult result, GestureType type)
    {
        this.Result = result;
        this.GestureType = type;
    }
}
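RecognitionResult is used by GestureEventArgs and by GestureRecognitionEngine (slide 19) but is not defined on these slides. A minimal sketch of an assumed definition; only the Success member actually appears in the slide code, the other member names are a guess:

// Assumed definition: only RecognitionResult.Success is referenced in the slides.
public enum RecognitionResult
{
    Unknown,
    Success,
    Failed
}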

7 Implementation of Algorithmic Gesture Recognition


9 Add a new C# file called GestureBase.cs
- Common operations for all types of gestures suitable for algorithmic gesture recognition

using Microsoft.Kinect;

public abstract class GestureBase
{
    public GestureBase(GestureType type)
    {
        this.CurrentFrameCount = 0;
        this.GestureType = type;
    }

    public bool IsRecognitionStarted { get; set; }
    private int CurrentFrameCount { get; set; }
    public GestureType GestureType { get; set; }
    protected virtual int MaximumNumberOfFrameToProcess { get { return 15; } }

10 GestureBase.cs

    public long GestureTimeStamp { get; set; }

    protected abstract bool ValidateGestureStartCondition(Skeleton skeleton);
    protected abstract bool ValidateGestureEndCondition(Skeleton skeleton);
    protected abstract bool ValidateBaseCondition(Skeleton skeleton);
    protected abstract bool IsGestureValid(Skeleton skeleton);

11 GestureBase.cs

    public virtual bool CheckForGesture(Skeleton skeleton)
    {
        if (this.IsRecognitionStarted == false)
        {
            if (this.ValidateGestureStartCondition(skeleton))
            {
                this.IsRecognitionStarted = true;
                this.CurrentFrameCount = 0;
            }
        }
        else
        {
            if (this.CurrentFrameCount == this.MaximumNumberOfFrameToProcess)
            {
                this.IsRecognitionStarted = false;
                if (ValidateBaseCondition(skeleton) && ValidateGestureEndCondition(skeleton))
                {
                    return true;
                }
            }
            this.CurrentFrameCount++;
            if (!IsGestureValid(skeleton) && !ValidateBaseCondition(skeleton))
            {
                this.IsRecognitionStarted = false;
            }
        }
        return false;
    }
}

12 SwipeToLeftGesture.cs

using Microsoft.Kinect;

public class SwipeToLeftGesture : GestureBase
{
    // intermediate right hand position used for validation of the gesture
    private SkeletonPoint validatePosition;

    // right hand position when the gesture start condition is met (starting pose)
    private SkeletonPoint startingPosition;

    // distance between the right hand and the left shoulder
    private float shoulderDiff;

    // constructor
    public SwipeToLeftGesture() : base(GestureType.SwipeToLeft) { }

    // check to see if the starting pose is seen
    // called for every skeleton frame received
    protected override bool ValidateGestureStartCondition(Skeleton skeleton)
    {
        // ... (see next slide)
    }

    // ... (remaining overrides on the following slides)
}

13 SwipeToLeftGesture.cs

protected override bool ValidateGestureStartCondition(Skeleton skeleton)
{
    var handRightPosition = skeleton.Joints[JointType.HandRight].Position;
    var handLeftPosition = skeleton.Joints[JointType.HandLeft].Position;
    var shoulderRightPosition = skeleton.Joints[JointType.ShoulderRight].Position;
    var spinePosition = skeleton.Joints[JointType.Spine].Position;

    // Starting pose:
    // right hand lower than right shoulder && right hand higher than right elbow
    // && left hand lower than spine
    if ((handRightPosition.Y < shoulderRightPosition.Y)
        && (handRightPosition.Y > skeleton.Joints[JointType.ElbowRight].Position.Y)
        && (handLeftPosition.Y < spinePosition.Y))
    {
        shoulderDiff = GestureHelper.GetJointDistance(skeleton.Joints[JointType.HandRight],
                                                      skeleton.Joints[JointType.ShoulderLeft]);
        validatePosition = skeleton.Joints[JointType.HandRight].Position;
        startingPosition = skeleton.Joints[JointType.HandRight].Position;
        return true;
    }
    return false;
}

14 SwipeToLeftGesture.cs

// called for every skeleton frame
protected override bool IsGestureValid(Skeleton skeletonData)
{
    // current right hand position
    var currentHandRightPosition = skeletonData.Joints[JointType.HandRight].Position;

    // the current right hand should be to the left of the previous right hand position,
    // i.e., the right hand is moving to the left
    if (validatePosition.X < currentHandRightPosition.X)
    {
        // the right hand is moving to the right, so stop recognizing this gesture
        return false;
    }

    // update validatePosition with the current right hand position
    validatePosition = currentHandRightPosition;

    // gesture so far so good
    return true;
}

15 SwipeToLeftGesture.cs

// check if the final pose has been reached
protected override bool ValidateGestureEndCondition(Skeleton skeleton)
{
    // distance between the starting right hand position and
    // the last right hand position
    double distance = Math.Abs(startingPosition.X - validatePosition.X);

    // distance between the current right hand and the left shoulder
    float currentShoulderDiff = GestureHelper.GetJointDistance(skeleton.Joints[JointType.HandRight],
                                                               skeleton.Joints[JointType.ShoulderLeft]);

    // the right hand has moved at least 0.1 m from its starting position and
    // the right hand is getting closer to the left shoulder => we are done!
    if (distance > 0.1 && currentShoulderDiff < shoulderDiff)
        return true;

    // otherwise, the right hand has not yet moved far enough
    return false;
}

16 SwipeToLeftGesture.cs

protected override bool ValidateBaseCondition(Skeleton skeleton)
{
    var handRightPosition = skeleton.Joints[JointType.HandRight].Position;
    var handLeftPosition = skeleton.Joints[JointType.HandLeft].Position;
    var shoulderRightPosition = skeleton.Joints[JointType.ShoulderRight].Position;
    var spinePosition = skeleton.Joints[JointType.Spine].Position;

    // right hand is lower than the right shoulder, and
    // right hand is higher than the right elbow, and
    // left hand is lower than the spine
    if ((handRightPosition.Y < shoulderRightPosition.Y)
        && (handRightPosition.Y > skeleton.Joints[JointType.ElbowRight].Position.Y)
        && (handLeftPosition.Y < spinePosition.Y))
    {
        // the swipe to the left is ongoing, so far so good
        return true;
    }

    // the base condition is not met, terminate recognition
    return false;
}

17 GestureRecognitionEngine.cs
Add a new C# file called GestureRecognitionEngine.cs to the project
- Resembles the previous recognition engine, but uses inheritance
- Add GestureType, RecognitionResult, and GestureEventArgs to the new file, or to three separate files

class GestureRecognitionEngine
{
    int SkipFramesAfterGestureIsDetected = 0;
    public event EventHandler<GestureEventArgs> GestureRecognized;
    public GestureType GestureType { get; set; }
    public Skeleton Skeleton { get; set; }
    public bool IsGestureDetected { get; set; }

    // list of gestures to be detected
    private List<GestureBase> gestureCollection = null;

    public GestureRecognitionEngine()
    {
        this.InitializeGesture();
    }
    // .....
}

18 GestureRecognitionEngine.cs

private void InitializeGesture()
{
    this.gestureCollection = new List<GestureBase>();
    //this.gestureCollection.Add(new ZoomInGesture());
    //this.gestureCollection.Add(new ZoomOutGesture());
    //this.gestureCollection.Add(new SwipeToRightGesture());

    // add the SwipeToLeftGesture recognizer to the list
    this.gestureCollection.Add(new SwipeToLeftGesture());
}

// reset data structures for a new round of gesture recognition
private void ResetGesture()
{
    this.gestureCollection = null;
    this.InitializeGesture();
    this.SkipFramesAfterGestureIsDetected = 0;
    this.IsGestureDetected = false;
}

19 GestureRecognitionEngine.cs

public void StartRecognize()
{
    if (this.IsGestureDetected)
    {
        // take a short break once a round of gesture recognition is done
        while (this.SkipFramesAfterGestureIsDetected <= 30)
        {
            this.SkipFramesAfterGestureIsDetected++;
        }
        // reset our data structures for a new round of gesture recognition
        this.ResetGesture();
        return;
    }

    // run every gesture recognizer in our list against the current skeleton
    foreach (var item in this.gestureCollection)
    {
        if (item.CheckForGesture(this.Skeleton))
        {
            if (this.GestureRecognized != null)
            {
                // fire a gesture event when a gesture is recognized
                this.GestureRecognized(this, new GestureEventArgs(RecognitionResult.Success, item.GestureType));
                this.IsGestureDetected = true;
            }
        }
    }
}

20 GestureHelper.cs
Add a GestureHelper.cs file to your project
- The class has only the following static method

public static class GestureHelper
{
    public static float GetJointDistance(Joint firstJoint, Joint secondJoint)
    {
        float distanceX = firstJoint.Position.X - secondJoint.Position.X;
        float distanceY = firstJoint.Position.Y - secondJoint.Position.Y;
        float distanceZ = firstJoint.Position.Z - secondJoint.Position.Z;
        return (float)Math.Sqrt(Math.Pow(distanceX, 2) + Math.Pow(distanceY, 2) + Math.Pow(distanceZ, 2));
    }
}

21 Build a Gesture Recognition App for the SwipeToLeft Gesture
User interface: a Canvas, a TextBox, and an Image

22 Build a Gesture Recognition App
Add member variables and modify the constructor:

KinectSensor sensor;
private WriteableBitmap colorBitmap;
private byte[] colorPixels;
Skeleton[] totalSkeleton = new Skeleton[6];
Skeleton skeleton;
GestureRecognitionEngine recognitionEngine;

public MainWindow()
{
    InitializeComponent();
    Loaded += new RoutedEventHandler(WindowLoaded);
}

23 Build a Gesture Recognition App

private void WindowLoaded(object sender, RoutedEventArgs e)
{
    if (KinectSensor.KinectSensors.Count > 0)
    {
        this.sensor = KinectSensor.KinectSensors[0];
        if (this.sensor != null && !this.sensor.IsRunning)
        {
            this.sensor.Start();
            this.sensor.ColorStream.Enable();
            this.colorPixels = new byte[this.sensor.ColorStream.FramePixelDataLength];
            this.colorBitmap = new WriteableBitmap(this.sensor.ColorStream.FrameWidth,
                this.sensor.ColorStream.FrameHeight, 96.0, 96.0, PixelFormats.Bgr32, null);
            this.image1.Source = this.colorBitmap;
            this.sensor.ColorFrameReady += this.colorFrameReady;

            this.sensor.SkeletonStream.Enable();
            this.sensor.SkeletonFrameReady += skeletonFrameReady;

            recognitionEngine = new GestureRecognitionEngine();
            recognitionEngine.GestureRecognized += gestureRecognized;
        }
    }
}

24 Build a Gesture Recognition App
Gesture recognized event handler; colorFrameReady(), DrawSkeleton(), drawBone(), and ScalePosition() are the same as before:

void gestureRecognized(object sender, GestureEventArgs e)
{
    textBox1.Text = e.GestureType.ToString();
}

25 Build a Gesture Recognition App
Handle the skeleton frame ready event:

void skeletonFrameReady(object sender, SkeletonFrameReadyEventArgs e)
{
    canvas1.Children.Clear();
    using (SkeletonFrame skeletonFrame = e.OpenSkeletonFrame())
    {
        if (skeletonFrame == null)
        {
            return;
        }
        skeletonFrame.CopySkeletonDataTo(totalSkeleton);
        skeleton = (from trackskeleton in totalSkeleton
                    where trackskeleton.TrackingState == SkeletonTrackingState.Tracked
                    select trackskeleton).FirstOrDefault();
        if (skeleton == null)
            return;
        DrawSkeleton(skeleton);

        recognitionEngine.Skeleton = skeleton;
        recognitionEngine.StartRecognize();
    }
}

26 Challenge Tasks
Add recognition of a zoom-in gesture (a possible sketch follows this slide).
- Start condition:
  - Right hand lower than the right shoulder, left hand lower than the right shoulder
  - Right hand higher than the hip center
  - Left hand higher than the hip center
  - Distance between the two hands smaller than 0.5 m; record this distance in a variable for later comparison
- Gesture validity check:
  - The current distance between the two hands must be larger than the initial value
- End condition:
  - Distance between the two hands exceeds 1.0 m
- Base condition:
  - Right hand lower than the right shoulder
  - Left hand lower than the right shoulder
  - Right hand higher than the hip center
  - Left hand higher than the hip center
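Not part of the slides: a minimal sketch of how the zoom-in gesture could be built on top of GestureBase, mirroring SwipeToLeftGesture. The joint comparisons and the 0.5 m / 1.0 m thresholds are taken literally from the list above; the private helpers (HandsInBaseRegion, HandDistance) and the member names are my own.

using Microsoft.Kinect;

// Sketch only, not the instructor's solution.
public class ZoomInGesture : GestureBase
{
    // distance between the two hands when the start condition was met
    private float startingHandDistance;

    public ZoomInGesture() : base(GestureType.ZoomIn) { }

    // both hands below the right shoulder and above the hip center (the base condition above)
    private static bool HandsInBaseRegion(Skeleton skeleton)
    {
        var handRight = skeleton.Joints[JointType.HandRight].Position;
        var handLeft = skeleton.Joints[JointType.HandLeft].Position;
        var shoulderRight = skeleton.Joints[JointType.ShoulderRight].Position;
        var hipCenter = skeleton.Joints[JointType.HipCenter].Position;
        return handRight.Y < shoulderRight.Y && handLeft.Y < shoulderRight.Y
            && handRight.Y > hipCenter.Y && handLeft.Y > hipCenter.Y;
    }

    private static float HandDistance(Skeleton skeleton)
    {
        return GestureHelper.GetJointDistance(skeleton.Joints[JointType.HandRight],
                                              skeleton.Joints[JointType.HandLeft]);
    }

    protected override bool ValidateGestureStartCondition(Skeleton skeleton)
    {
        float distance = HandDistance(skeleton);
        if (HandsInBaseRegion(skeleton) && distance < 0.5f)
        {
            // remember the initial hand separation for the validity check
            startingHandDistance = distance;
            return true;
        }
        return false;
    }

    protected override bool ValidateBaseCondition(Skeleton skeleton)
    {
        return HandsInBaseRegion(skeleton);
    }

    protected override bool IsGestureValid(Skeleton skeleton)
    {
        // the hands must keep moving apart: current distance larger than the initial one
        return HandDistance(skeleton) > startingHandDistance;
    }

    protected override bool ValidateGestureEndCondition(Skeleton skeleton)
    {
        // done once the hands are more than 1.0 m apart
        return HandDistance(skeleton) > 1.0f;
    }
}

To try it, uncomment (or add) this.gestureCollection.Add(new ZoomInGesture()); in InitializeGesture() on slide 18.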

