1 Finding depth

2 Overview
Depth from stereo
Depth from structured light
Depth from focus / defocus
Laser rangefinders

4 Depth from stereo
Two cameras with known parameters observe the same scene point, which projects to P1 in the left image and P2 in the right image.
Infer the 3D location of the point seen in both images.
Subproblem, correspondences: for a point seen in the left image, find its projection in the right image.

5 Depth from stereo: déjà-vu math
The unknowns are the depths w1 and w2 along the two viewing rays.
The system is overconstrained: more equations than unknowns.
The (u2, v2) coordinates of a point seen at (u1, v1) are constrained to an epipolar line.
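
An overconstrained ray-intersection system like this is typically solved in the least-squares sense. A minimal sketch (toy numbers, not values from the slides), writing each viewing ray as P = C + w·d with unit direction d, which is one common parameterization:

```python
import numpy as np

def triangulate_midpoint(c1, d1, c2, d2):
    """Solve for the depths w1, w2 that bring the two viewing rays
    P = c + w*d (unit direction d) as close together as possible,
    then return the midpoint of closest approach as the 3D point."""
    A = np.stack([d1, -d2], axis=1)              # 3x2 system [d1 -d2] w = c2 - c1
    (w1, w2), *_ = np.linalg.lstsq(A, c2 - c1, rcond=None)
    return (c1 + w1 * d1 + c2 + w2 * d2) / 2.0

# Hypothetical setup: cameras 1 m apart, both seeing the point (0, 0, 5)
P  = np.array([0.0, 0.0, 5.0])
c1 = np.array([-0.5, 0.0, 0.0])
c2 = np.array([ 0.5, 0.0, 0.0])
d1 = (P - c1) / np.linalg.norm(P - c1)           # ray directions through the
d2 = (P - c2) / np.linalg.norm(P - c2)           # image points P1 and P2
print(triangulate_midpoint(c1, d1, c2, d2))      # recovers approximately (0, 0, 5)
```

With noisy correspondences the two rays no longer intersect exactly, which is why the least-squares midpoint, rather than an exact intersection, is used.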

6 Epipolar line
C1, C2, and P1 define a plane.
P2 will be on that plane.
P2 is also on image plane 2.
So P2 will be on the line defined by the intersection of the two planes.

7 Search for correspondences on the epipolar line
Reduces the dimensionality of the search space.
Walk along the epipolar segment rather than searching the entire image.

8 Parallel views
The preferred stereo configuration: the epipolar lines are horizontal and easy to search.

9 Parallel views
Limit the search to the epipolar segment from u2 = u1 (P infinitely far away) down to u2 = 0 (P close to the cameras).
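
For rectified parallel views the correspondence search reduces to comparing patches along the same scanline. A minimal sum-of-squared-differences (SSD) block matcher as a sketch; the toy images, patch size, and disparity range are all made-up values, not something from the slides:

```python
import numpy as np

def disparity_ssd(left, right, patch=2, max_disp=8):
    """Brute-force SSD block matching for rectified stereo: for every left
    pixel, compare a (2*patch+1)^2 window against windows on the SAME
    scanline of the right image, over disparities 0..max_disp."""
    h, w = left.shape
    disp = np.zeros((h, w), dtype=int)
    for y in range(patch, h - patch):
        for x in range(patch + max_disp, w - patch):
            ref = left[y - patch:y + patch + 1, x - patch:x + patch + 1]
            costs = [np.sum((ref - right[y - patch:y + patch + 1,
                                         x - d - patch:x - d + patch + 1]) ** 2)
                     for d in range(max_disp + 1)]
            disp[y, x] = int(np.argmin(costs))   # disparity with the lowest cost
    return disp

# Toy data: the right image is the left image shifted by 3 pixels, so the
# true disparity is 3 everywhere away from the wrap-around border.
rng = np.random.default_rng(0)
left = rng.random((20, 40))
right = np.roll(left, -3, axis=1)
d = disparity_ssd(left, right)
```

Real matchers add refinements (normalized correlation, subpixel interpolation, smoothness constraints), but the scanline-only search is the part the parallel configuration buys us.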

10 Depth precision analysis
1/z is linear in the disparity (u1 - u2).
Better depth resolution for nearby objects.
It is important to determine correspondences with subpixel accuracy.
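
In the parallel configuration depth follows from disparity as z = f·b / (u1 - u2), with f the focal length in pixels and b the baseline. A small numeric sketch (the focal length and baseline below are hypothetical values) showing why a fixed half-pixel matching error hurts distant objects far more than nearby ones:

```python
# Depth from disparity in the parallel configuration: z = f * b / (u1 - u2).
# f (focal length, in pixels) and b (baseline, in meters) are hypothetical.
f_px, baseline = 700.0, 0.12

def depth(disparity_px):
    return f_px * baseline / disparity_px

for d in (40.0, 10.0, 2.0):
    z = depth(d)
    err = abs(depth(d - 0.5) - z)    # effect of a half-pixel matching error
    print(f"disparity {d:4.1f} px -> z = {z:5.2f} m, half-pixel error ~ {err:.3f} m")
```

Since z is inversely proportional to disparity, a constant disparity error produces a depth error that grows roughly quadratically with z, which is the slide's point about nearby objects and subpixel accuracy.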

11 Overview
Depth from stereo
Depth from structured light
Depth from focus / defocus
Laser rangefinders

13 The depth from stereo problem
Correspondences are difficult to find.
Structured light approach:
– replace one camera with a projector
– project easily detectable patterns
– establishing correspondences becomes a lot easier

14 Depth from structured light
C1 is now a projector; it projects a pattern centered at (u1, v1).
The pattern center hits the scene at P.
Camera C2 sees the pattern at (u2, v2), which is easy to find.
The 3D location of P is determined.
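
One classic family of "easily detectable patterns" is a sequence of black-and-white stripe images that Gray-code the projector columns: each camera pixel observes a bit string that decodes directly to the projector column it corresponds to. This particular coding scheme is a standard technique, not something the slides specify; a sketch of the decoding step:

```python
def gray_to_binary(g):
    """Invert the Gray code g ^ (g >> 1)."""
    b = g
    while g:
        g >>= 1
        b ^= g
    return b

def decode_column(bits):
    """bits: the 0/1 values one camera pixel observed across the projected
    stripe patterns, most significant bit first. The result is the projector
    column, i.e. the correspondence needed for triangulation."""
    g = 0
    for bit in bits:
        g = (g << 1) | bit
    return gray_to_binary(g)

print(decode_column([1, 1, 0]))   # Gray code 110 decodes to column 4
```

Gray codes are preferred over plain binary stripes because adjacent columns differ in only one bit, so a decoding error at a stripe boundary is off by at most one column.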

17 Depth from structured light: challenges
Associated with using projectors: expensive, cannot be used outdoors, not portable.
Difficult to identify the pattern: I found a corner, but which corner is it?
Invasive, changes the color of the scene (one could use invisible light, e.g. IR).

18 Overview
Depth from stereo
Depth from structured light
Depth from focus / defocus
Laser rangefinders

20 Depth of field
Thin lenses:
– rays through the lens center (C) do not change direction
– rays parallel to the optical axis go through the focal point (F')

21 Depth of field
For a given focal length, only objects at a certain depth are in focus.

22 Out of focus
When the object is at a different depth, one point projects to several locations in the image: an out-of-focus, blurred image.

23 Focusing
Move the lens to focus at the new depth.
The relationship between focus and depth can be exploited to extract depth.

24 Determine z for points in focus
From similar triangles on the lens geometry (object height h, image height hi, aperture a, focal length f), the thin-lens equation relates the depth z of a point in focus to the lens-to-image-plane distance i: 1/z + 1/i = 1/f.
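
A minimal sketch of that relation, solving the thin-lens equation for z given the focal length and the image distance at which the point comes into focus (the 50 mm numbers below are hypothetical):

```python
def depth_in_focus(f, i):
    """Thin-lens equation 1/z + 1/i = 1/f solved for the object depth z,
    given focal length f and lens-to-image-plane distance i (with i > f)."""
    return 1.0 / (1.0 / f - 1.0 / i)

# Hypothetical 50 mm lens: an image plane 50.5 mm behind the lens brings
# objects at about 5.05 m into focus.
print(depth_in_focus(0.050, 0.0505))
```

Note how small the lever arm is: half a millimeter of lens travel spans meters of object depth, which is why focus-based depth needs precise calibration.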

25 Depth from defocus
Take images of a scene with various camera parameters.
By measuring how the defocus varies, infer the range to objects.
Does not need to find the best focusing plane for each object.
Examples by Shree Nayar, Columbia University.

26 Overview
Depth from stereo
Depth from structured light
Depth from focus / defocus
Laser rangefinders

28 Laser rangefinders
Send a laser beam to measure the distance:
– like RADAR, measures time of flight
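
The time-of-flight principle is a one-line formula: the pulse travels to the target and back, so z = c·t / 2. A tiny numeric sketch:

```python
C = 299_792_458.0   # speed of light in m/s

def range_from_tof(t):
    """Time-of-flight ranging: the pulse travels out and back, so z = c*t/2."""
    return C * t / 2.0

# A 10 m target returns the pulse after roughly 66.7 ns; note that a mere
# 1 ns of timing error already corresponds to about 15 cm of range error.
print(range_from_tof(66.7e-9))
```

The 15 cm/ns figure is why practical scanners rely on very fast timing electronics or on phase-shift measurement rather than naive pulse timing.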

29 DeltaSphere: a depth and color acquisition device
Lars Nyland et al.; courtesy 3rdTech Inc.

30 360° x 300° panorama: this is the reflected light

31 360° x 300° panorama: this is the range data

32 Spherical range panoramas and a planar re-projection (courtesy 3rdTech Inc.)

33 Jeep: one scan (courtesy 3rdTech Inc.)

34 Jeep: one scan (courtesy 3rdTech Inc.)

35 Complete Jeep model (courtesy 3rdTech Inc.)
