Lecture 6: Ogre and Kinect
EIE360 Integrated Project
Department of Electronic and Information Engineering
Dr Daniel Lun

References:
1. J.M. Hart, Windows System Programming, 4th Ed., Addison-Wesley, 2010, Ch. 12
2. Microsoft Kinect SDK for Developers
3. Kinect, Wikipedia

Architecture of the Interactive Virtual Aquarium System
[Diagram: the Kinect sensor device connects to Computer A through a USB port; your program on Computer A sends the sensor data over the network to your program on Computer B, whose program drives the 3D graphics system.]

Kinect for Xbox 360 Sensor
Kinect, originally known by the code name "Project Natal", is a motion sensing input device made by Microsoft for the Xbox 360 video game console
It lets users control and interact with the Xbox 360 without touching a game controller, through a natural user interface (NUI) based on gestures and spoken commands
It was launched in North America on November 4, 2010
After selling a total of 8 million units in its first 60 days, Kinect holds the Guinness World Record as the "fastest selling consumer electronics device"!

Kinect for Xbox 360 Sensor (hardware components)
IR illuminator and depth image sensor: measure the distance of objects from the sensor and hence allow motion capture
RGB camera: allows facial recognition
Multi-array microphone: allows sound source tracking and facilitates voice recognition
Motorized tilt

Kinect for Xbox 360 Sensor
Some technical data of the Kinect sensor:
RGB video: 8-bit VGA resolution (640x480) at 30 FPS
Depth sensing video stream: resolution 640x480, 320x240, or 80x60, with an 11-bit dynamic range, hence 2048 levels of sensitivity when measuring depth
Distance limit when used with the Xbox software: 4 - 11 ft
The sensor has an angular field of view of 57° horizontally and 43° vertically
Because it is an optical sensor, it can have line-of-sight problems (e.g. occluded body parts cannot be measured)
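To get a feel for these numbers (an illustrative calculation added here, not from the original slide), the 11-bit dynamic range gives 2^11 = 2048 depth levels, and the horizontal coverage at a distance d is roughly 2 * d * tan(57°/2):

#include <cmath>
#include <cstdio>

int main() {
    const double fovHorizontalDeg = 57.0;              // horizontal field of view
    const double pi = 3.14159265358979323846;

    std::printf("Depth levels: %d\n", 1 << 11);        // 11-bit range -> 2048 levels

    // Visible width at several working distances: width = 2 * d * tan(FOV / 2)
    for (double d = 1.0; d <= 3.5; d += 0.5) {
        double width = 2.0 * d * std::tan(fovHorizontalDeg / 2.0 * pi / 180.0);
        std::printf("Distance %.1f m -> horizontal coverage %.2f m\n", d, width);
    }
    return 0;
}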

Kinect for Windows SDK
A non-commercial Kinect software development kit (SDK) for Windows was released for Windows 7 in June 2011
The SDK includes Windows 7 compatible PC drivers for the Kinect device
It provides Kinect capabilities to developers so that they can build applications with C++, C#, or Visual Basic using Microsoft Visual Studio 2010
It includes the following features: raw sensor streams, skeletal tracking, advanced audio capabilities, and sample code and documentation

Natural User Interface (NUI) API
The NUI API is the core of the Kinect for Windows API
It supports fundamental image and device management features, including the following:
Access to the Kinect sensors connected to the computer
Access to the image and depth data streams from the Kinect image sensors
Delivery of a processed version of the image and depth data to support skeletal tracking
[Diagram: the NUI API exposes an image stream, a depth stream, and an audio stream.]
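As a minimal sketch (not taken from the original slides) of how an application typically opens the sensor through the NUI API, assuming the Kinect for Windows SDK header NuiApi.h is available:

#include <windows.h>
#include <NuiApi.h>   // Kinect for Windows SDK

bool initKinect() {
    // Ask the NUI runtime for the colour stream, the depth stream
    // (with player index) and skeletal tracking.
    HRESULT hr = NuiInitialize(NUI_INITIALIZE_FLAG_USES_COLOR |
                               NUI_INITIALIZE_FLAG_USES_DEPTH_AND_PLAYER_INDEX |
                               NUI_INITIALIZE_FLAG_USES_SKELETON);
    if (FAILED(hr))
        return false;          // no sensor connected, or it is already in use

    // Enable skeletal tracking so that skeleton frames are delivered.
    hr = NuiSkeletonTrackingEnable(NULL, 0);
    return SUCCEEDED(hr);
}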

NUI Skeleton Tracking
The NUI Skeleton API provides information about the location of up to two players standing in front of the Kinect sensor array, with detailed position and orientation information
The data is provided to application code as a set of points, called skeleton positions, that compose a skeleton
Twenty skeleton positions have been identified to indicate the major joints of the human body (listed on the next slide)

Skeleton Positions
1. NUI_SKELETON_POSITION_HIP_CENTER
2. NUI_SKELETON_POSITION_SPINE
3. NUI_SKELETON_POSITION_SHOULDER_CENTER
4. NUI_SKELETON_POSITION_HEAD
5. NUI_SKELETON_POSITION_SHOULDER_LEFT
6. NUI_SKELETON_POSITION_ELBOW_LEFT
7. NUI_SKELETON_POSITION_WRIST_LEFT
8. NUI_SKELETON_POSITION_HAND_LEFT
9. NUI_SKELETON_POSITION_SHOULDER_RIGHT
10. NUI_SKELETON_POSITION_ELBOW_RIGHT
11. NUI_SKELETON_POSITION_WRIST_RIGHT
12. NUI_SKELETON_POSITION_HAND_RIGHT
13. NUI_SKELETON_POSITION_HIP_LEFT
14. NUI_SKELETON_POSITION_KNEE_LEFT
15. NUI_SKELETON_POSITION_ANKLE_LEFT
16. NUI_SKELETON_POSITION_FOOT_LEFT
17. NUI_SKELETON_POSITION_HIP_RIGHT
18. NUI_SKELETON_POSITION_KNEE_RIGHT
19. NUI_SKELETON_POSITION_ANKLE_RIGHT
20. NUI_SKELETON_POSITION_FOOT_RIGHT
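As a sketch (not part of the original slides) of how an application can read these positions from a skeleton frame with the Kinect for Windows SDK; note that in the SDK itself the NUI_SKELETON_POSITION_* names form a 0-based enum, so NUI_SKELETON_POSITION_HIP_CENTER is index 0 and NUI_SKELETON_POSITION_HEAD is index 3:

#include <windows.h>
#include <cstdio>
#include <NuiApi.h>

// Reads one skeleton frame and prints the head position of every tracked player.
void readSkeletonFrame() {
    NUI_SKELETON_FRAME skeletonFrame = {0};
    if (FAILED(NuiSkeletonGetNextFrame(0, &skeletonFrame)))
        return;                                   // no new frame available yet

    // Optional smoothing of the raw joint positions.
    NuiTransformSmooth(&skeletonFrame, NULL);

    for (int i = 0; i < NUI_SKELETON_COUNT; i++) {
        const NUI_SKELETON_DATA& data = skeletonFrame.SkeletonData[i];
        if (data.eTrackingState != NUI_SKELETON_TRACKED)
            continue;                             // this slot holds no tracked player

        Vector4 head = data.SkeletonPositions[NUI_SKELETON_POSITION_HEAD];
        std::printf("Player %d head at (%.2f, %.2f, %.2f) m\n", i, head.x, head.y, head.z);
    }
}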

Skeletal Viewer
Skeletal Viewer is one of the sample applications of the Kinect for Windows SDK
It provides both video outputs of the Kinect sensor, RGB and depth, plus the skeleton of the player constructed from the detected skeleton positions

Motion Tracking Server and Skeleton Viewer
The Motion Tracking Server of our project is built based on Skeletal Viewer
The application runs on the server, where the skeleton positions are extracted and sent to the client program via Winsock
[Diagram: the SkeletalViewer application and the server program run inside the Motion Tracking Server; the server program sends the skeleton data through Winsock onto the network backbone.]

Motion Tracking Server
Software architecture of the Motion Tracking Server:
[Diagram: the server program contains a CSkeletalViewerApp object (SkeletalViewer), referenced through the pointer mSkeletalViewerApp, and a ServerSocket object (Winsock). CSkeletalViewerApp holds the shared data bool mSkeletonExist[2]; char mMessage[2][160]; char mMessage3D[2][320]. ServerSocket provides ListenOnPort(), AcceptConnection(), ProcessClient() and CloseConnection().]
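The slides do not show how ServerSocket implements these methods; the following is only a minimal sketch of what ListenOnPort() and AcceptConnection() might look like with standard Winsock calls (the member names mListenSocket and mClientSocket are assumptions for illustration, and ProcessClient() and CloseConnection() are omitted):

#include <winsock2.h>
#pragma comment(lib, "ws2_32.lib")

// Minimal stand-in for the ServerSocket class in the diagram.
class ServerSocket {
public:
    bool ListenOnPort(int port);
    bool AcceptConnection();
private:
    SOCKET mListenSocket;   // socket that waits for clients
    SOCKET mClientSocket;   // socket returned by accept()
};

bool ServerSocket::ListenOnPort(int port) {
    WSADATA wsaData;
    if (WSAStartup(MAKEWORD(2, 2), &wsaData) != 0)
        return false;

    mListenSocket = socket(AF_INET, SOCK_STREAM, IPPROTO_TCP);
    if (mListenSocket == INVALID_SOCKET)
        return false;

    sockaddr_in service = {0};
    service.sin_family = AF_INET;
    service.sin_addr.s_addr = htonl(INADDR_ANY);    // accept clients on any local interface
    service.sin_port = htons((u_short)port);

    if (bind(mListenSocket, (SOCKADDR*)&service, sizeof(service)) == SOCKET_ERROR)
        return false;
    return listen(mListenSocket, 1) != SOCKET_ERROR;   // queue at most one pending client
}

bool ServerSocket::AcceptConnection() {
    mClientSocket = accept(mListenSocket, NULL, NULL); // blocks until a client connects
    return mClientSocket != INVALID_SOCKET;
}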

Useful Parameters in CSkeletalViewerApp
bool mSkeletonExist[2];
The Kinect sensor can detect at most 2 players at the same time
If the skeleton of a player is detected, the corresponding mSkeletonExist element is set to true
char mMessage[2][160];
If the skeleton of a player is detected, this array keeps the information of the 20 skeleton positions
Each element is a POINT, which is a structure composed of 2 LONG integers (4 bytes each):
struct POINT { LONG x; LONG y; };

Useful Parameters in CSkeletalViewerApp (cont)
Each POINT in fact holds the (x, y) coordinates of a skeleton position in the display window of the Skeletal Viewer
The window coordinates range from (0, 0) to (639, 479)

Useful Parameters in CSkeletalViewerApp (cont)
char mMessage[2][160];
[Diagram: mMessage[0] holds the data of the 1st player and mMessage[1] holds the data of the 2nd player. Each 160-byte row stores the 20 skeleton positions in order, from 1. NUI_SKELETON_POSITION_HIP_CENTER to 20. NUI_SKELETON_POSITION_FOOT_RIGHT, as consecutive (x, y) pairs of LONG integers at byte offsets 0, 8, 16, ...]
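As an illustration of this layout (a sketch only; packSkeleton2D and the joints array are hypothetical names, not from the Skeletal Viewer source), the 20 POINTs of one player can be packed into a 160-byte row with memcpy:

#include <windows.h>   // for POINT and LONG
#include <cstring>     // for memcpy

// Packs the 20 window-coordinate joints of one player into a 160-byte row
// of mMessage, following the byte layout described above.
void packSkeleton2D(const POINT joints[20], char row[160]) {
    char* p = row;
    for (int j = 0; j < 20; j++) {
        std::memcpy(p, &joints[j].x, sizeof(LONG));  // bytes 8j .. 8j+3: x coordinate
        p += sizeof(LONG);
        std::memcpy(p, &joints[j].y, sizeof(LONG));  // bytes 8j+4 .. 8j+7: y coordinate
        p += sizeof(LONG);
    }
}

Since POINT is just two 4-byte LONGs with no internal padding, a single std::memcpy(row, joints, 160) produces the same result.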

Useful Parameters in CSkeletalViewerApp (cont)
char mMessage3D[2][320];
If the skeleton of a player is detected, this array keeps the 3D coordinates of the 20 skeleton positions of the player in the 3D space with reference to the sensor
Each element of the array is a Vector4 composed of 4 float numbers (4 bytes each):
struct Vector4 { float x; float y; float z; float w; };
Using a 4-d vector rather than a 3-d vector facilitates efficient matrix operations, e.g. a translation can be applied to a vector by a single matrix multiplication
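To see why the extra w component is useful (an illustrative example added here, not from the slides), a translation can be written as a 4x4 matrix acting on a homogeneous vector whose w is 1:

// Multiplies a 4x4 matrix (row-major) with a homogeneous 4-vector.
struct Vec4 { float x, y, z, w; };     // local stand-in for the SDK's Vector4

Vec4 multiply(const float m[4][4], const Vec4& v) {
    return {
        m[0][0]*v.x + m[0][1]*v.y + m[0][2]*v.z + m[0][3]*v.w,
        m[1][0]*v.x + m[1][1]*v.y + m[1][2]*v.z + m[1][3]*v.w,
        m[2][0]*v.x + m[2][1]*v.y + m[2][2]*v.z + m[2][3]*v.w,
        m[3][0]*v.x + m[3][1]*v.y + m[3][2]*v.z + m[3][3]*v.w
    };
}

int main() {
    // Translation by (1, 2, 3): the last column carries the offset,
    // which only takes effect because w = 1.
    float T[4][4] = {
        {1, 0, 0, 1},
        {0, 1, 0, 2},
        {0, 0, 1, 3},
        {0, 0, 0, 1}
    };
    Vec4 p = {0.5f, 1.0f, 2.0f, 1.0f};     // a skeleton position with w = 1
    Vec4 q = multiply(T, p);               // q = (1.5, 3.0, 5.0, 1.0)
    return (int)q.w;
}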

Useful Parameters in CSkeletalViewerApp (cont)
Among the 4 elements, w can be ignored in this project
[Diagram: the sensor sits at the origin (0, 0, 0, 1) of the x, y, z coordinate system, with the z axis pointing along the sensor direction.]

Useful Parameters in CSkeletalViewerApp (cont)
char mMessage3D[2][320];
[Diagram: mMessage3D[0] holds the data of the 1st player and mMessage3D[1] holds the data of the 2nd player. Each 320-byte row stores the 20 skeleton positions in order, from 1. NUI_SKELETON_POSITION_HIP_CENTER to 20. NUI_SKELETON_POSITION_FOOT_RIGHT, as consecutive (x, y, z, w) groups of floats at byte offsets 0, 16, 32, ...]

Sending the Kinect Data to Client
Assume that a connection has been made with a client. We can send the Kinect data to the client using send()
Before that, the array data need to be packed into one character buffer for send(). A simple way to copy data between types is the function memcpy()
[Diagram: the send buffer holds, in order, mSkeletonExist[0], mMessage[0], mMessage3D[0], mSkeletonExist[1], mMessage[1], mMessage3D[1].]

Sending the Kinect Data to Client (cont)

char bufferSkel[1024];
char *bufferPtr = bufferSkel;
for (int i = 0; i < 2; i++)
{
    // Low-level byte-by-byte memory copies, without considering the type
    memcpy(bufferPtr, &(mSkeletalViewerApp->mSkeletonExist[i]), sizeof(bool));
    bufferPtr += sizeof(bool);
    memcpy(bufferPtr, &(mSkeletalViewerApp->mMessage[i]), 160);
    bufferPtr += 160;
    memcpy(bufferPtr, &(mSkeletalViewerApp->mMessage3D[i]), 320);
    bufferPtr += 320;
}
// mClientSocket is the socket created by accept();
// 962 = 2 x (1 + 160 + 320) bytes, assuming sizeof(bool) == 1
rVal = send(mClientSocket, bufferSkel, 962, 0);

Architecture of the Interactive Virtual Aquarium System
[Diagram repeated from the beginning of the lecture: Kinect sensor -> USB port -> your program on Computer A -> network -> your program on Computer B -> 3D graphics system. The focus now moves to the client side on Computer B.]

Setting up a Winsock Client
In Lecture 5, the procedure for setting up a Winsock server was discussed
To communicate with the server, a Winsock client must also be set up
The procedure for setting up a Winsock client is much simpler:

WSAStartup(...);          // 1. Initialize Winsock
socket(...);              // 2. Create a client socket
connect(...);             // 3. Connect to the server
send(...); / recv(...);   // 4. Send or receive data
  :
closesocket(...);         // 5. Close the socket after using it
WSACleanup();             // 6. Free the resources allocated

connect() and Its Parameters

mSocket = socket(AF_INET, SOCK_STREAM, IPPROTO_TCP);
// If mSocket == INVALID_SOCKET, an error has occurred

struct sockaddr_in clientService;
clientService.sin_family = AF_INET;
clientService.sin_addr.s_addr = inet_addr(" ");   // IP address of the server (dotted-decimal string)
clientService.sin_port = htons(8888);             // port number of the server to be connected to

iResult = connect(mSocket, (SOCKADDR*)&clientService, sizeof(clientService));
// If iResult == SOCKET_ERROR, an error has occurred
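Once connected, the client can pull the 962-byte skeleton packet off the socket. Because TCP is a byte stream, recv() may return fewer bytes than requested, so a short loop is needed. This is a sketch only; the packet size follows the server code earlier, while recvAll and the buffer name are assumptions for illustration:

#include <winsock2.h>
#pragma comment(lib, "ws2_32.lib")

// Receives exactly 'length' bytes into 'buffer'; returns false on error or disconnect.
bool recvAll(SOCKET sock, char* buffer, int length) {
    int received = 0;
    while (received < length) {
        int r = recv(sock, buffer + received, length - received, 0);
        if (r <= 0)
            return false;      // 0 = connection closed, SOCKET_ERROR = failure
        received += r;
    }
    return true;
}

// Usage after connect():
//   char buffer[1024];
//   if (recvAll(mSocket, buffer, 962)) { /* unpack the data as shown later */ }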

Architecture of the Interactive Virtual Aquarium System
[Same architecture diagram as before: the Kinect data received over the network is now used by your program on Computer B to drive the 3D graphics system.]

Using the Received Kinect Data
Assume that the Kinect data are successfully received from the network. We should convert them back into arrays to ease further analysis
[Diagram: the received buffer is split back into mSkeletonExist[ ], mSkeletonPoints[ ] and mSkeletonPositions[ ] on the client side.]

Using the Received Kinect Data (cont)

char *bufferPtr = buffer;
for (int i = 0; i < 2; i++)
{
    // Unpack the data of player i in the same order as it was packed by the server
    memcpy(&mSkeletonExist[i], bufferPtr, sizeof(bool));
    bufferPtr += sizeof(bool);
    memcpy(&mSkeletonPoints[i], bufferPtr, 160);     // 20 POINTs (window coordinates)
    bufferPtr += 160;
    memcpy(mSkeletonPositions[i], bufferPtr, 320);   // 20 Vector4s (3D coordinates)
    bufferPtr += 320;
}

Using the Received Kinect Data (cont)
We can compare the magnitudes of different data values to understand the motion of the player
E.g. the following code lets us detect whether the left hand of the 1st player is raised above his head:

// Index 8 is NUI_SKELETON_POSITION_HAND_LEFT and index 4 is NUI_SKELETON_POSITION_HEAD,
// following the numbering on the Skeleton Positions slide; the y coordinate stands for height
if (mSkeletonPoints[0][8].y > mSkeletonPoints[0][4].y)
{
    ...
}

Using the Received Kinect Data (cont)
We may also estimate the velocity of a motion
E.g. to estimate the velocity of the left-hand motion in the y direction, the calculation is done once per frame (between screen updates):

processCalculation()
{
    Ogre::Real eTime = evt.timeSinceLastFrame;        // time elapsed since the last frame
    float currentY = mSkeletonPoints[0][8].y;          // current y position of the left hand
    Ogre::Real velocity = (currentY - mPreviousY) / eTime;
    mPreviousY = currentY;                             // remember the position for the next frame
}

This may not be accurate, as it is only the instantaneous result between 2 frames
We may need to average the results across a few frames
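One simple way to average across a few frames, as the slide suggests, is a small circular buffer of recent velocity samples. This is a sketch only; the names kWindow, mSamples and mSampleIndex are not from the original code:

// Keeps the last kWindow velocity samples and returns their average,
// which is less noisy than the instantaneous two-frame estimate.
const int kWindow = 5;
float mSamples[kWindow] = {0};
int   mSampleIndex = 0;

float smoothedVelocity(float instantaneousVelocity)
{
    mSamples[mSampleIndex] = instantaneousVelocity;    // overwrite the oldest sample
    mSampleIndex = (mSampleIndex + 1) % kWindow;

    float sum = 0.0f;
    for (int i = 0; i < kWindow; i++)
        sum += mSamples[i];
    return sum / kWindow;                              // average over the last kWindow frames
}

// Usage inside processCalculation():
//   Ogre::Real velocity = smoothedVelocity((currentY - mPreviousY) / eTime);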