
Verification of specifications and aptitude for short-range applications of the Kinect v2 depth sensor Cecilia Chen, Cornell University Lewis’ Educational and Research Collaborative Internship Project (LERCIP) NASA Glenn Research Center, Graphics & Visualization/Dr. Herb Schilling 7/22/2014

Purpose of this study

- Validate the published specifications for the Microsoft Kinect v2 depth sensor:
  - Resolution and x-y position accuracy
  - Depth-sensing accuracy
  - Near-range sensing limit
- Determine the sensor's potential for use in short-range applications:
  - Feasibility of repurposing the Kinect for functions requiring an operating distance of approximately 0.5 m from the sensor

Overview of topics discussed

- Introduction
  - Microsoft Kinect v2
  - Infrared and Depth streams
  - Time-of-flight
- Preliminary Calculations
  - Is the depth camera's error range acceptable?
  - Possibility of errors in position due to low resolution
  - Possibility of errors in angle due to noise

Overview of topics discussed

- Calibration and Experimental Verification
  - Software preparation
  - Calibration
  - Experimental verification
- Conclusions
  - Findings
  - Sponsors

Introduction

Microsoft Kinect v2

- Primarily used for gaming and natural user interface
- Color, infrared, and depth streams
- 512 × 424 depth resolution
- 0.5–4.5 m depth-sensing range
- 30 frames per second

Infrared and Depth streams

Time-of-flight

- TOF is a form of LIDAR
- The emitter sends pulses of infrared light; the detector senses the returning light
- Software calculates the distance between a point and the sensor from the round-trip time and the speed of light (c = 3 × 10⁸ m/s)
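The round-trip relationship can be sketched numerically. A minimal illustration of the principle only, not the Kinect SDK's internal code (the Kinect v2 actually measures phase shift of modulated light rather than timing a single pulse):

```python
# Time-of-flight ranging: distance = (speed of light × round-trip time) / 2.
C = 3e8  # speed of light, m/s

def tof_distance(round_trip_s: float) -> float:
    """Distance to the target in meters, given the pulse's round-trip time."""
    return C * round_trip_s / 2.0

# A target 0.5 m away returns the pulse after roughly 3.33 nanoseconds.
t = 2 * 0.5 / C
print(tof_distance(t))  # 0.5
```

At these time scales, a 1 mm depth error corresponds to only a few picoseconds of timing error, which is why phase-based measurement is used in practice.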

Preliminary Calculations

Is the depth camera's error range acceptable?

- Microsoft claims depth measurements are accurate to within 1 mm
- Guidelines for short-range use of the depth sensor:
  - Defined by an operating distance (between the Kinect and the object) of approximately 0.5 m
  - The near range of the Kinect v2 supposedly starts at 0.5 m
- Given a surface oriented perpendicular to the depth axis:
  - Position: be able to locate a point on the surface to within 2 mm
  - Angle: be able to measure inclinations on the surface to within 10°
- Both to be confirmed through calculations using an upper error bound

Possibility of errors in position due to low resolution

- 512 × 424 depth resolution
- Field of view: 70° horizontal, 60° vertical

(figure: sensor coordinate axes x, y, z)

Possibility of errors in position due to low resolution

Assuming a distance of 0.7 m between the surface and the Kinect (horizontal slice of the FOV, half-angle 35°):

tan(35°) = (x/2) / 0.7 m
x ≈ 0.980 m

Possibility of errors in position due to low resolution

Assuming a distance of 0.7 m between the surface and the Kinect (vertical slice of the FOV, half-angle 30°):

tan(30°) = (y/2) / 0.7 m
y ≈ 0.808 m

Possibility of errors in position due to low resolution

Assuming a distance of 0.7 m between the surface and the Kinect:

Area seen by Kinect = xy = (0.980 m)(0.808 m) = 0.792 m²

Pixel density:
512 × 424 = 217,088 pixels
217,088 pixels / 0.792 m² ≈ 274,101 pixels/m²
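The FOV footprint and pixel density above can be reproduced in a few lines; a sketch of the same arithmetic:

```python
import math

# Footprint of the 70° × 60° field of view at 0.7 m, and the resulting
# depth-pixel density over that area.
z = 0.7                                  # distance to surface, m
x = 2 * z * math.tan(math.radians(35))   # horizontal extent: ~0.980 m
y = 2 * z * math.tan(math.radians(30))   # vertical extent: ~0.808 m
area = x * y                             # ~0.792 m²
density = (512 * 424) / area             # ~274,000 pixels/m²
print(round(x, 3), round(y, 3), round(area, 3), round(density))
```

Using the unrounded area gives ≈273,976 pixels/m²; the slide's 274,101 comes from dividing by the rounded 0.792 m².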

Possibility of errors in position due to low resolution

Given a circular section of the surface 50 mm in diameter:

A = πr² = π · (0.025 m)² = π · 0.000625 m² = 0.00196 m²

Pixels across surface = pixel density × area
274,101 pixels/m² × 0.00196 m² ≈ 538 pixels

This works out to a ratio of 3.65 square millimeters (3.65 ≈ 1.9²) per pixel.
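The per-pixel footprint follows directly; a sketch using the slide's rounded density value:

```python
import math

# Pixel coverage of a 50 mm diameter circle at 0.7 m.
density = 274_101                       # pixels/m² at 0.7 m (previous slide)
circle_area = math.pi * 0.025 ** 2      # ~0.00196 m²
pixels = density * circle_area          # ~538 pixels
mm2_per_pixel = circle_area * 1e6 / pixels  # ~3.65 mm² per pixel
print(round(pixels), round(mm2_per_pixel, 2))
```

Since a pixel covers about 1.9 mm × 1.9 mm at this distance, the 2 mm position guideline is met with almost no margin.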

Possibility of errors in angle due to noise

For simplicity, take a plane defined by two points 50 mm apart, representing the same circular surface.

(figure: 50 mm diameter circle; axis orientation x, y, z)

Possibility of errors in angle due to noise

(figure: the Kinect viewing the 50 mm wide plane inclined at angle θ, producing a depth difference Δz)

Possibility of errors in angle due to noise

How large can Δz get before θ falls outside the allowed angle range?

tan(θ) = Δz / 50 mm

Let θ = 10°: Δz ≤ 8.8 mm
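The noise tolerance above is a one-line computation; a sketch:

```python
import math

# Worst-case depth noise Δz across a 50 mm baseline that still keeps the
# apparent tilt within the 10° guideline: tan(θ) = Δz / 50 mm.
baseline_mm = 50.0
max_angle_deg = 10.0
dz_max = baseline_mm * math.tan(math.radians(max_angle_deg))
print(round(dz_max, 1))  # 8.8
```

An 8.8 mm tolerance is generous compared with the claimed ±1 mm depth accuracy, which is why the angle guideline ends up easy to satisfy.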

Calibration and Experimental Verification

Software preparation

- Use C# with Microsoft Visual Studio and the Kinect for Windows SDK
- Modify and expand the Depth Basics sample code included in the SDK (Preview 1404):
  - Decrease the grayscale gradient range so different depths are easier to distinguish visually
  - Make the window recognize a hovering cursor
  - Write the (x, y, z) coordinates of the cursor to the image window in real time
  - Average depth readings at a given point to smooth out jumpy data
  - Capture depth arrays over multiple frames
  - Output collected data to CSV files for further analysis
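The author's tool was written in C# against the Kinect for Windows SDK; as a language-neutral illustration only, here is a hypothetical Python sketch of two of the listed modifications — averaging the depth reading at one pixel across captured frames, and dumping frames to CSV (frame layout and function names are assumptions, not the author's code):

```python
import csv
import os
import tempfile

def average_depth_at(frames, x, y, width=512):
    """Mean depth (mm) at pixel (x, y) over a list of flat depth arrays."""
    idx = y * width + x
    return sum(frame[idx] for frame in frames) / len(frames)

def dump_frames_to_csv(frames, path):
    """Write one captured depth frame per CSV row for offline analysis."""
    with open(path, "w", newline="") as f:
        csv.writer(f).writerows(frames)

# Three tiny fake 2x2 "frames" (width=2) standing in for 512x424 captures.
frames = [[700, 701, 702, 703], [702, 699, 700, 705], [701, 700, 704, 701]]
print(average_depth_at(frames, 0, 0, width=2))  # 701.0

path = os.path.join(tempfile.gettempdir(), "depth_frames.csv")
dump_frames_to_csv(frames, path)
```

Averaging over frames trades temporal resolution for stability, which suits static calibration targets like the blocks used later.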

Before After

Calibration

- Step 0: Design and assemble a rig to hold the Kinect steady
- Step 1: Level the sensor
- Step 2: Set up a base surface for measurements at a height similar to the intended distance between the Kinect and the interaction area

Calibration

- Step 3: Check for irregularities

Experimental verification

- Step 4: Measure a sloped calibration block of known height (50 mm)

              Base, z1   Top of block, z2    Height, Δz
Actual (mm)   --         --                  50
Measured (mm) 571        0 (538 minimum)     --
Error (mm)    --         --                  --

Experimental verification

- Step 5: Lower the base height
- Step 6: Measure a sloped calibration block of known height (50 mm)

              Base, z1   Top of block, z2    Height, Δz
Actual (mm)   --         --                  50
Measured (mm) 712        662                 50

Experimental verification

- Step 7: Repeat with calibration block(s) of different heights (64 mm)

              Base, z1   Top of block, z2    Height, Δz
Actual (mm)   --         --                  64
Measured (mm) 712        647                 65
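The height column in these tables is just the difference of two averaged depth readings; a minimal sketch (variable names are illustrative):

```python
# Block height from the two readings: Δz = z1 (base) - z2 (top of block).
def block_height(z1_mm, z2_mm):
    """Measured block height in mm from base and top-of-block depths."""
    return z1_mm - z2_mm

print(block_height(712, 662))  # 50 mm block: measured 50, error 0 mm
print(block_height(712, 647))  # 64 mm block: measured 65, error +1 mm
```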

Conclusions

Findings

- Successful verification of the published specs
- The Kinect v2 appears to be a promising depth sensor for short-range applications

                      Expectation or guideline   Experimental confirmation           Acceptable?
Position (resolution) ±2 mm                      (1.9 mm)² per-pixel resolution      Yes
Angle (noise)         ≤10°                       Requires Δz ≤ 8.8 mm at z = 0.7 m   Very
Depth accuracy        ±1 mm                      --                                  --
Near range            0.5 m                      0.538 m                             No