1
Verification of specifications and aptitude for short-range applications of the Kinect v2 depth sensor
Cecilia Chen, Cornell University
Lewis' Educational and Research Collaborative Internship Project (LERCIP)
NASA Glenn Research Center, Graphics & Visualization / Dr. Herb Schilling
7/22/2014

2
Purpose of this study

- Validate the published specifications for the Microsoft Kinect v2 depth sensor
  - Resolution and x-y position accuracy
  - Depth-sensing accuracy
  - Near-range sensing limit
- Determine the sensor's potential for use in short-range applications
  - Feasibility of repurposing the Kinect for functions requiring an operating distance of approximately 0.5 m from the sensor

3
**Overview of topics discussed**

- Introduction
  - Microsoft Kinect v2
  - Infrared and Depth streams
  - Time-of-flight
- Preliminary Calculations
  - Is the depth camera's error range acceptable?
  - Possibility of errors in position due to low resolution
  - Possibility of errors in angle due to noise

4
**Overview of topics discussed**

- Calibration and Experimental Verification
  - Software preparation
  - Calibration
  - Experimental verification
- Conclusions
  - Findings
  - Sponsors

5
Introduction

6
Microsoft Kinect v2

- Primarily used for gaming and natural user interfaces
- Color, infrared, and depth streams
- 512 × 424 depth resolution
- 0.5–4.5 m depth sensing range
- 30 frames per second

7
**Infrared and Depth streams**

8
**Time-of-flight**

TOF is a form of LIDAR.

The software calculates the distance between a point and the sensor from the round-trip travel time and the speed of light (c = 3 × 10⁸ m/s): distance = c × t / 2. The emitter sends pulses of infrared light; the detector senses the returning light.
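The time-of-flight principle can be sketched in a few lines. This is only an illustration of the formula above, not the Kinect SDK's actual implementation:

```python
# Minimal sketch of the time-of-flight principle: depth is half the
# round trip of a light pulse at speed c.

C = 3.0e8  # speed of light, m/s

def tof_distance(round_trip_time_s):
    """Distance (m) to the reflecting point for a given round-trip time."""
    return C * round_trip_time_s / 2.0

# Light reflecting off a point 0.7 m away returns after about 4.7 ns:
t = 2 * 0.7 / C
print(tof_distance(t))  # ~0.7 m
```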

9
**Preliminary Calculations**

10
**Is the depth camera’s error range acceptable?**

- Microsoft claims depth measurements are accurate to within 1 mm
- Guidelines for short-range use of the depth sensor:
  - Defined by an operating distance (between the Kinect and the object) of approximately 0.5 m
  - The near range of the Kinect v2 supposedly starts at 0.5 m
- Given a surface oriented perpendicular to the depth axis:
  - Position: be able to locate a point on the surface to within 2 mm
  - Angle: be able to measure inclinations on the surface to within 10°
- Both to be confirmed through calculations using an upper error bound

11
**Possibility of errors in position due to low resolution**

512 × 424 depth resolution
Field of view: 70° horizontal, 60° vertical
[Diagram: axis orientation, x, y, z]

13
**Possibility of errors in position due to low resolution**

Assuming a distance of 0.7 m between the surface and the Kinect:

tan 35° = (x/2) / 0.7 m, so x ≈ 0.980 m

[Diagram: horizontal slice of the FOV, with half-angle 35°, depth 0.7 m, and half-width x/2]

14
**Possibility of errors in position due to low resolution**

Assuming a distance of 0.7 m between the surface and the Kinect:

tan 30° = (y/2) / 0.7 m, so y ≈ 0.808 m

[Diagram: vertical slice of the FOV, with half-angle 30°, depth 0.7 m, and half-height y/2]

15
**Possibility of errors in position due to low resolution**

Assuming a distance of 0.7 m between the surface and the Kinect:

Area seen by Kinect = x · y = 0.980 m × 0.808 m = 0.792 m²

Pixel density: 512 × 424 = 217,088 pixels, and 217,088 pixels / 0.792 m² ≈ 274,101 pixels/m²

16
**Possibility of errors in position due to low resolution**

Given a circular section of the surface 50 mm in diameter:

A = π r² = π · (0.025 m)² ≈ 0.00196 m²

Pixels across surface = pixel density × area: 274,101 pixels/m² × 0.00196 m² ≈ 538 pixels

This works out to a ratio of about 3.65 mm² per pixel (3.65 ≈ 1.91², i.e. each pixel covers roughly a 1.9 mm × 1.9 mm square).

[Diagram: circular surface section, 50 mm diameter]
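The whole resolution estimate can be reproduced in one short script. The FOV half-angles (35° horizontal, 30° vertical) and the 0.7 m operating distance come from the slides; the arithmetic is the same trigonometry:

```python
import math

# Preliminary resolution estimate at a 0.7 m operating distance,
# using the Kinect v2's 70 x 60 degree FOV and 512 x 424 depth grid.

d = 0.7                                    # distance from Kinect to surface, m
x = 2 * d * math.tan(math.radians(35))     # viewed width, ~0.980 m
y = 2 * d * math.tan(math.radians(30))     # viewed height, ~0.808 m
density = (512 * 424) / (x * y)            # pixels per square meter, ~274,000

circle_area = math.pi * 0.025 ** 2         # 50 mm diameter circle, m^2
pixels_on_circle = density * circle_area   # ~538 pixels
mm2_per_pixel = 1e6 / density              # ~3.65 mm^2 per pixel
```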

17
**Possibility of errors in angle due to noise**

For simplicity, take a plane defined by two points 50 mm apart, representing the same circular surface.

[Diagram: 50 mm diameter circle; axis orientation x, y, z]

18
**Possibility of errors in angle due to noise**

[Diagram: side view of the Kinect viewing the 50 mm surface, inclined at angle θ so its edges differ in depth by Δz]

19
**Possibility of errors in angle due to noise**

How large can Δz get before θ falls outside the allowed angle range?

tan θ = Δz / 50 mm

Let θ = 10°: Δz ≤ 50 mm × tan 10° ≈ 8.8 mm

[Diagram: right triangle with base 50 mm, rise Δz, and angle θ]
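The bound above computed directly: with two points 50 mm apart, the apparent tilt is θ = atan(Δz / 50 mm), so the largest depth noise that still reads under 10° is 50 mm × tan 10°:

```python
import math

# Largest depth difference dz (mm) across a baseline before the
# apparent tilt exceeds the allowed angle.

def max_dz_mm(baseline_mm, max_angle_deg):
    """Max depth noise (mm) keeping the apparent tilt within bounds."""
    return baseline_mm * math.tan(math.radians(max_angle_deg))

print(round(max_dz_mm(50, 10), 1))  # prints 8.8
```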

20
**Calibration and Experimental Verification**

21
Software preparation

- Use C# with Microsoft Visual Studio and the Kinect for Windows SDK
- Modify and expand the Depth Basics sample code included in the SDK (Preview 1404):
  - Decreasing the grayscale gradient range so different depths are easier to distinguish visually
  - Making the window recognize a hovering cursor
  - Writing the (x, y, z) coordinates of the cursor to the image window in real time
  - Averaging depth readings at a given point to smooth out jumpy data
  - Capturing depth arrays of multiple frames
  - Outputting collected data to CSV files for further analysis
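The project's tool is C# on the Kinect for Windows SDK, but two of the listed modifications, averaging depth readings to smooth jumpy data and dumping frames to CSV, can be sketched language-agnostically. In this Python illustration, `frames` stands in for the SDK's flat depth arrays and the function names are hypothetical, not SDK APIs:

```python
import csv
from statistics import mean

# Illustrative sketch only (the actual tool is C# / Kinect SDK).
# A depth frame is modeled as a flat list of width*height depth values in mm.

def averaged_depth(frames, x, y, width=512):
    """Mean depth (mm) at pixel (x, y) across a list of flat depth frames."""
    return mean(frame[y * width + x] for frame in frames)

def dump_frame_to_csv(frame, path, width=512, height=424):
    """Write one flat depth frame to CSV, one image row per CSV row."""
    with open(path, "w", newline="") as out:
        writer = csv.writer(out)
        for row in range(height):
            writer.writerow(frame[row * width:(row + 1) * width])
```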

22
[Before/after screenshots of the modified Depth Basics window]

23
Calibration

- Step 0: Design and assemble a rig to hold the Kinect steady
- Step 1: Level the sensor
- Step 2: Set up a base surface for measurements at a height similar to the intended distance between the Kinect and the interaction area

24
Calibration Step 3: Check for irregularities

25
**Experimental verification**

Step 4: Measure a sloped calibration block of known height (50 mm)

26
| | Base, z₁ | Top of block, z₂ | Height, Δz |
|---|---|---|---|
| Actual (mm) | -- | -- | 50 |
| Measured (mm) | 571 | 0 (538 minimum) | -- |
| Error (mm) | | | -- |

27
**Experimental verification**

Step 5: Lower the base height
Step 6: Measure a sloped calibration block of known height (50 mm)

28
| | Base, z₁ | Top of block, z₂ | Height, Δz |
|---|---|---|---|
| Actual (mm) | -- | -- | 50 |
| Measured (mm) | 712 | 662 | 50 |

29
**Experimental verification**

Step 7: Repeat with calibration block(s) of different heights (64 mm)

30
| | Base, z₁ | Top of block, z₂ | Height, Δz |
|---|---|---|---|
| Actual (mm) | -- | -- | 64 |
| Measured (mm) | 712 | 647 | 65 |
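The height computation behind these verification measurements is simply the base depth minus the depth read on top of the block:

```python
# Block height from two depth readings: base depth z1 minus top depth z2.
# Values are the measured depths (mm) reported in the experiment.

def block_height_mm(base_z1, top_z2):
    """Height of a block standing on the base surface, in mm."""
    return base_z1 - top_z2

print(block_height_mm(712, 662))  # prints 50  (50 mm block: 0 mm error)
print(block_height_mm(712, 647))  # prints 65  (64 mm block: 1 mm error)
```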

31
Conclusions

32
**Findings: successful verification of the published specs**

Kinect v2 appears to be a promising depth sensor for short-range applications.

| Criterion | Expectation or Guideline | Experimental confirmation | Acceptable? |
|---|---|---|---|
| Position (resolution) | ±2 mm | (1.9 mm)² per-pixel resolution | Yes |
| Angle (noise) | ≤10° | Requires Δz ≤ 8.8 mm at z = 0.7 m | Yes |
| Depth accuracy | ±1 mm | Height errors of 0–1 mm | Yes |
| Near range | 0.5 m | Measured limit of 0.538 m | No |
