Reprojection of 3D points of Superquadrics Curvature captured by the Kinect IR-depth sensor to the CCD of the RGB camera. Mariolino De Cecco, Nicolo Biasi, Ilya Afanasyev.

Presentation transcript:

Reprojection of 3D points of Superquadrics Curvature captured by the Kinect IR-depth sensor to the CCD of the RGB camera
Mariolino De Cecco, Nicolo Biasi, Ilya Afanasyev
Facoltà di Ingegneria, Trento, 21/11/2011 (slide 1/20)

Content (slide 2/20, 21/11/2011)
1. Kinect installation.
2. Kinect calibration.
3. Kinect outputs.
4. MATLAB preprocessing of Kinect data.
5. MATLAB inputs.
6. SQ Curvature software.
7. Tests.
8. Results.
9. Links.

Kinect installation (slide 3/20, 21/11/2011). 1st attempt, OpenNI backend:
1. Nicolas Burrus software [1]: Kinect RGBDemo supports the OpenNI/NITE backends [2] and has experimental infrared support with OpenNI (still buggy) -> I tried this for the OpenNI backend under Windows 64 bit.
2. Installed the SensorKinect drivers under Windows 32 bit [3] and the OpenNI/NITE modules (Win32) [2] according to the instructions [4] and verified that they work properly.
3. Used RGB-D Capture [1] to grab Kinect RGB and IR images together with intensity.raw and depth.raw files -> the software does not grab IR images!! -> only the RGB image, intensity.raw, depth.raw and a calibration file (calibration.yml, without distortion parameters)!!
4. "calibrate-openni-intrinsics --pattern-size grab1 calibration.yml" -> gives "openni_calibration.yml" with the intrinsic matrix and distortion coefficients for the Kinect RGB camera!!
5. "calibrate-openni-depth.exe --pattern-size grab1" gives partly processed figures with a message about errors.

The comparison of Kinect calibration results (23/11/2011):
- Libfreenect backend (under OS Linux / Ubuntu)
- OpenNI backend (under OS Windows 32 bit)
The intrinsic matrices of the Kinect RGB and depth cameras have the same values for both backends.
[Figures: intrinsic matrices, Libfreenect backend vs. OpenNI backend]
The depth camera calibration for the OpenNI backend is NOT available!!!

The comparison of Kinect calibration results (23/11/2011):
- Libfreenect backend (under OS Linux / Ubuntu)
- OpenNI backend (under OS Windows 32 bit)
There is NO extrinsic mapping between the Kinect depth and RGB cameras for the OpenNI backend.
[Figures: extrinsic parameters, Libfreenect backend vs. OpenNI backend]
The reprojection of 3D points from the Kinect depth camera to the RGB image is NOT possible for OpenNI!!

Kinect installation (slide 6/20, 21/11/2011). 2nd attempt, Libfreenect backend (Windows):
1. Nicolas Burrus software [1]: Kinect RGBDemo supports the Libfreenect backend [2] and has RGB and infrared support -> I tried this under Windows 64 bit.
2. Installed the Xbox NUI Motor drivers under Windows 32 bit [3] and the OpenNI/NITE modules (Win32) [2] according to the instructions [4] and verified that they work properly.
3. Used RGB-D Capture [1] to grab Kinect RGB and IR images together with intensity.raw and depth.raw files -> the software does not grab IR images!! -> only the RGB image, intensity.raw, depth.raw and a calibration file (calibration.yml, without distortion parameters)!!
4. "calibrate-openni-intrinsics --pattern-size grab1 calibration.yml" -> gives "openni_calibration.yml" with the intrinsic matrix and distortion coefficients for the Kinect RGB camera!!
5. "calibrate-openni-depth.exe --pattern-size grab1" gives partly processed figures with a message about errors.

Kinect installation (slide 7/20, 21/11/2011). New idea, OpenNI backend:
1. Nicolas Burrus software [1]: Kinect RGBDemo supports the OpenNI/NITE backends [2] and has experimental infrared support with OpenNI (still buggy) -> I tried this for the OpenNI backend under Windows 64 bit.
2. Installed the SensorKinect drivers under Windows 32 bit [3] and the OpenNI/NITE modules (Win32) [2] according to the instructions [4] and verified that they work properly.
3. Used RGB-D Capture [1] to grab Kinect RGB and IR images together with intensity.raw and depth.raw files -> the software does not grab IR images!! -> only the RGB image, intensity.raw, depth.raw and a calibration file (calibration.yml, without distortion parameters)!!
4. "calibrate-openni-intrinsics --pattern-size grab1 calibration.yml" -> gives "openni_calibration.yml" with the intrinsic matrix and distortion coefficients for the Kinect RGB camera!!
5. "calibrate-openni-depth.exe --pattern-size grab1" gives partly processed figures with a message about errors.

5. MATLAB inputs (slide 8/20, 21/11/2011)
Cube_Curvatura_ \ main_reproject.m
-> color.png (480x640 pixels)
-> calibData.mat:
   - matrices of intrinsic parameters for the IR camera (K_ir) and the RGB camera (K_rgb)
   - extrinsic mapping between the IR (depth) and RGB Kinect cameras (R, T)
   - distortion coefficients (kc_ir, kc_rgb)
-> Points.mat (N x 6), where N is the number of points with (x, y, z, R, G, B) info
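As a concrete illustration, the inputs listed above might be loaded in MATLAB as in the sketch below; the variable names inside calibData.mat and Points.mat are assumptions taken from the labels on this slide and may differ in the actual files.

rgb   = imread('color.png');      % 480x640 Kinect RGB image
calib = load('calibData.mat');    % assumed to hold K_ir, K_rgb, R, T, kc_ir, kc_rgb
pts   = load('Points.mat');       % assumed to hold an N x 6 array named Points

K_ir   = calib.K_ir;              % 3x3 intrinsic matrix of the IR (depth) camera
K_rgb  = calib.K_rgb;             % 3x3 intrinsic matrix of the RGB camera
R      = calib.R;                 % 3x3 rotation, IR -> RGB camera frame
T      = calib.T;                 % 3x1 translation, IR -> RGB camera frame
kc_rgb = calib.kc_rgb;            % distortion coefficients of the RGB camera

P_ir = pts.Points(:, 1:3)';       % 3xN 3D points in the IR (depth) camera frame
col  = pts.Points(:, 4:6);        % per-point RGB colour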

6. MATLAB software. 6.1 Elimination of the ground (slide 9/20, 21/11/2011)
RANSAC search for the ground using the plane equation; the points that fit the estimated plane are removed as ground before the object is processed.
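A minimal MATLAB sketch of such a RANSAC plane search, assuming a simple three-point sampling scheme; the threshold, iteration count, and the helper name ransac_ground_plane are illustrative, not the presentation's actual code.

function [inliers, plane] = ransac_ground_plane(P, thresh, nIter)
% P: 3xN point cloud; thresh: inlier distance (m); nIter: number of RANSAC trials.
% Returns indices of ground inliers and the plane [a b c d] with a*x + b*y + c*z + d = 0.
    N = size(P, 2);
    bestCount = 0; plane = [0 0 1 0]; inliers = [];
    for k = 1:nIter
        idx = randperm(N, 3);                    % minimal sample: 3 points
        n = cross(P(:,idx(2)) - P(:,idx(1)), P(:,idx(3)) - P(:,idx(1)));
        if norm(n) < eps, continue; end          % skip collinear samples
        n = n / norm(n);
        d = -n' * P(:, idx(1));
        dist = abs(n' * P + d);                  % point-to-plane distances
        cand = find(dist < thresh);
        if numel(cand) > bestCount               % keep the plane with most inliers
            bestCount = numel(cand); inliers = cand; plane = [n' d];
        end
    end
end

Calling, for example, [ground, plane] = ransac_ground_plane(P_ir, 0.01, 500) and discarding the inlier indices would leave only the object points for the later superquadric fitting stage.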

3D points reprojection (slide 10/20, 21/11/2011)
Transformation from the IR (depth) to the RGB camera reference system:
P_rgb = R * P_ir + T,
where R, T is the extrinsic mapping between the IR and RGB Kinect cameras, and P_rgb, P_ir are points in the RGB and IR camera reference systems.
Cube reprojection without considering distortion (top figures) and with distortion taken into account (bottom figures).
??? Bad calibration? The distance from the Kinect to the cube is z = 0.5 m.
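A minimal MATLAB sketch of this reprojection, assuming the K_rgb, R, T, kc_rgb variables loaded earlier and the usual Bouguet/OpenCV radial-tangential distortion model with coefficients ordered [k1 k2 p1 p2 k3]; the presentation does not state which distortion model its calibration files use, so treat that as an assumption.

function uv = reproject_to_rgb(P_ir, R, T, K_rgb, kc_rgb)
% Reproject 3xN points from the IR (depth) camera frame onto the RGB image plane.
    P_rgb = R * P_ir + repmat(T(:), 1, size(P_ir, 2));   % P_rgb = R*P_ir + T
    x = P_rgb(1, :) ./ P_rgb(3, :);                      % normalised image coordinates
    y = P_rgb(2, :) ./ P_rgb(3, :);
    r2 = x.^2 + y.^2;
    k1 = kc_rgb(1); k2 = kc_rgb(2); p1 = kc_rgb(3); p2 = kc_rgb(4);
    if numel(kc_rgb) > 4, k3 = kc_rgb(5); else, k3 = 0; end
    radial = 1 + k1*r2 + k2*r2.^2 + k3*r2.^3;            % radial distortion factor
    xd = x.*radial + 2*p1*x.*y + p2*(r2 + 2*x.^2);       % plus tangential terms
    yd = y.*radial + p1*(r2 + 2*y.^2) + 2*p2*x.*y;
    uvh = K_rgb * [xd; yd; ones(1, numel(xd))];          % apply RGB intrinsics
    uv = uvh(1:2, :);                                    % pixel coordinates (u, v)
end

Skipping the distortion step (projecting [x; y; 1] directly through K_rgb) gives the "without distortion" variant shown in the top figures.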

3D points reprojection (slide 11/20, 23/11/2011)
Cube and cylinder reprojection without considering distortion (top figures) and with distortion taken into account (bottom figures).
??? Bad calibration? The distance from the Kinect to the objects is z = 0.6 m.

RANSAC fitting of an SQ to the 3D data points (slide 12/20, 21/11/2011)
The Levenberg-Marquardt algorithm is used to minimise the distance from the SQ (superquadric) model to the 3D points.
Red points: outliers. Blue points: inliers. Green points: SQ model.
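A minimal sketch of how such a fit could be set up in MATLAB with lsqnonlin, which supports the Levenberg-Marquardt algorithm. The radial-distance residual and the five-parameter superquadric (semi-axes a1, a2, a3 and exponents e1, e2, with the points assumed already expressed in the superquadric frame and the pose omitted) are assumptions; the presentation does not show its actual cost function.

% Fit a superquadric to the inlier points P_in (3xN, in the SQ frame).
opts = optimoptions('lsqnonlin', 'Algorithm', 'levenberg-marquardt');
p0   = [0.05 0.05 0.05 1 1];                  % initial semi-axes (m) and exponents e1, e2
pfit = lsqnonlin(@(p) sq_residuals(p, P_in), p0, [], [], opts);

function d = sq_residuals(params, P)
% Radial distance approximation from each point to the superquadric surface.
    a1 = params(1); a2 = params(2); a3 = params(3);   % semi-axis lengths
    e1 = params(4); e2 = params(5);                   % shape exponents
    x = P(1, :); y = P(2, :); z = P(3, :);
    F = (abs(x/a1).^(2/e2) + abs(y/a2).^(2/e2)).^(e2/e1) + abs(z/a3).^(2/e1);
    d = sqrt(x.^2 + y.^2 + z.^2) .* (1 - F.^(-e1/2)); % ~0 for points on the surface
end

RANSAC would wrap this fit: repeatedly fit on a random subset, count the points whose residual falls below a threshold, and keep the model with the largest inlier set (the blue points above).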

Object structure creation and reprojection onto the image (slide 13/20, 21/11/2011)
Reprojection of the lines between the 3D cube vertices onto the CCD of the Kinect RGB camera (figure a) and onto the image with the reprojected 3D points (figure b). The vertex positions were obtained in the previous stage of RANSAC cube pose estimation.
Red lines: the cube wireframe. (Figure a, Figure b)
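The drawing step might look like the sketch below, reusing the reproject_to_rgb helper from the reprojection slide; the vertex array V and the edge list are illustrative assumptions.

% V: 3x8 cube vertices in the IR camera frame (from the RANSAC pose estimate).
uvV = reproject_to_rgb(V, R, T, K_rgb, kc_rgb);        % project vertices to pixels
E = [1 2; 2 3; 3 4; 4 1;                               % bottom face edges
     5 6; 6 7; 7 8; 8 5;                               % top face edges
     1 5; 2 6; 3 7; 4 8];                              % vertical edges
imshow(rgb); hold on;
for k = 1:size(E, 1)
    plot(uvV(1, E(k, :)), uvV(2, E(k, :)), 'r-', 'LineWidth', 2);  % red cube edges
end
hold off;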

9. References (slide 14/20, 21/11/2011)
1. Nicolas Burrus. Kinect RGBDemo: calibrate and visualize Kinect output.
2. OpenNI modules.
3. SensorKinect drivers, github social coding.
4. How-to: Successfully install Kinect on Windows (OpenNI and NITE). Vangos Pterneas blog: successfully-install-kinect-windows-openni-nite.aspx
5. Install OpenKinect for Windows 7 and XP. openkinect-windows-7-and-xp
6. Camera Calibration and 3d Reconstruction. OpenCV (Open Source Computer Vision) v2.1 documentation: ation_and_3d_reconstruction.html