Planar Orientation from Blur Gradients in a Single Image
Scott McCloskey, Honeywell Labs, Golden Valley, MN, USA
Michael Langer, McGill University, Montreal, QC, Canada

Outline
Introduction
Relation to Previous Work
Modelling the Blur Gradient
Planar Orientation Estimation Algorithm
◦ Estimating Tilt
◦ Estimating Slant
Test Data and Experimental Results

Introduction A focus-based method to recover the orientation of a textured planar surface patch from a single image

Relation to Previous Work
Depth from Defocus
Shape from Texture
◦ Distance effect
◦ Foreshortening effect

Modelling the Blur Gradient (1/3)
The goal of planar orientation algorithms is to accurately estimate the slant and tilt of a 3D plane.

Modelling the Blur Gradient (2/3)
The visible surface is a plane of depth Z(x, y).
The slant and tilt are the same at all positions in the image patch.
Focal length: f
Distance from the sensor plane to the lens: v

Modelling the Blur Gradient (3/3)
Camera aperture: F
Focal length: f
Sensor distance: v
Blur radius: b
Image position: (x, y)
The blur radius is a linear function of inverse depth 1/Z.
On a planar surface, inverse depth is linear in image position, so the blur radius is a linear function of image position (x, y): this is the blur gradient.
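The two linearities above can be sketched numerically. The sketch below uses the standard thin-lens blur-circle model; the symbols v (sensor distance), b (blur radius), and the plane parameters p, q, c are our notation and hypothetical values, not taken from the paper.

```python
def blur_radius(inv_depth, f, v, aperture_diam):
    """Blur-circle radius for a point at inverse depth 1/Z under the
    thin-lens model: b = (A/2) * v * |1/f - 1/v - 1/Z|.
    For fixed camera settings, b is linear in 1/Z (up to the abs)."""
    return 0.5 * aperture_diam * v * abs(1.0 / f - 1.0 / v - inv_depth)

def plane_inverse_depth(x, y, p, q, c):
    """For a planar surface, 1/Z(x, y) = p*x + q*y + c is linear in (x, y)."""
    return p * x + q * y + c

# Combining the two: b(x, y) varies linearly across the image patch,
# and the direction of that variation is the blur gradient.
f, v, A = 0.05, 0.051, 0.02   # focal length, sensor distance, aperture (m)
p, q, c = 0.001, 0.0, 0.5     # hypothetical plane parameters (1/m per pixel, 1/m)
b0 = blur_radius(plane_inverse_depth(0, 0, p, q, c), f, v, A)
b1 = blur_radius(plane_inverse_depth(100, 0, p, q, c), f, v, A)
```

Here the blur radius grows with x because the plane recedes in that direction; the gradient direction (p, q) is exactly what the tilt estimate recovers.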

Planar Orientation Estimation Algorithm (1/3)
Image blur is best observed in the middle to high spatial frequencies
◦ remove the low frequencies by subtracting a low-pass-filtered copy of the image
Comparing the blur along different lines in an image requires a sharpness measure.
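A minimal stand-in for such a sharpness measure, on a 1-D line of pixels, is the high-frequency energy left after the low frequencies are removed. This toy measure (mean squared first difference) is our illustration, not the paper's exact measure:

```python
def sharpness(signal):
    """Toy sharpness measure: mean squared first difference, i.e. the
    energy of the signal after its low-frequency content is suppressed.
    Blurring attenuates high frequencies, so blur lowers this score."""
    diffs = [b - a for a, b in zip(signal, signal[1:])]
    return sum(d * d for d in diffs) / len(diffs)

sharp  = [0.0, 1.0, 0.0, 1.0, 0.0, 1.0, 0.0, 1.0]   # in-focus texture
blurry = [0.4, 0.6, 0.4, 0.6, 0.4, 0.6, 0.4, 0.6]   # same texture, defocused
```

Comparing this score along different image lines is what lets the algorithm detect where blur increases or decreases.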

Planar Orientation Estimation Algorithm (2/3)
Estimating Tilt
◦ Equifocal contour: a contour along which the amount of optical blur remains constant
◦ Finding the surface tilt amounts to searching for the direction in which the sharpness gradient is maximized
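A toy version of this tilt step, assuming we already have a per-pixel sharpness map: averaging finite differences (our simplification of the paper's search over candidate directions) gives the direction of steepest sharpness change; equifocal contours run perpendicular to it.

```python
import math

def estimate_tilt(sharp_map):
    """Tilt direction = direction in which the sharpness gradient is
    maximal. Average the finite differences of a sharpness map and
    take the angle of the mean gradient (a hypothetical simplification)."""
    h, w = len(sharp_map), len(sharp_map[0])
    gx = sum(sharp_map[y][x + 1] - sharp_map[y][x]
             for y in range(h) for x in range(w - 1)) / (h * (w - 1))
    gy = sum(sharp_map[y + 1][x] - sharp_map[y][x]
             for y in range(h - 1) for x in range(w)) / ((h - 1) * w)
    return math.atan2(gy, gx)

# Sharpness increasing along +x only: tilt direction should be ~0 radians.
smap = [[0.1 * x for x in range(5)] for _ in range(5)]
```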

Planar Orientation Estimation Algorithm (3/3)
Estimating Slant
◦ Slant is estimated as the angle whose back-projection produces the smallest gradient in the sharpness measure in the direction of the former depth variation
◦ A correct back-projection yields a uniformly blurred image (a "doubly blurred image")
◦ Perspective-induced size change
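A toy version of the slant search, under the assumption that sharpness falls off linearly along the tilt direction with slope proportional to tan(slant). The compensation model and the function names here are hypothetical simplifications of the paper's back-projection test, kept only to show the "pick the angle that flattens the gradient" idea:

```python
import math

def residual_gradient(profile, slant, k=1.0):
    """After compensating the depth-induced component (assumed slope
    k*tan(slant)), how much sharpness gradient remains along the
    former depth-variation direction."""
    comp = [v + k * math.tan(slant) * x for x, v in enumerate(profile)]
    return abs(comp[-1] - comp[0]) / (len(comp) - 1)

def estimate_slant(profile, k=1.0):
    """Pick the candidate slant whose compensation leaves the smallest
    residual gradient, mirroring the paper's criterion that the correct
    back-projection gives a uniformly blurred image."""
    candidates = [math.radians(d) for d in range(0, 80)]
    return min(candidates, key=lambda s: residual_gradient(profile, s, k))

true_slant = math.radians(30)
profile = [1.0 - math.tan(true_slant) * x for x in range(10)]  # synthetic sharpness line
```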

Test Data and Experimental Results (1/4)
Test set: 1404 camera images
◦ 9 planar textures
◦ 26 carefully controlled orientations (listed in Table 1)
◦ 6 different apertures (F = 22, 16, 11, 8, 5.6, 4)

Test Data and Experimental Results (2/4)
Orientation Estimation Results

Test Data and Experimental Results (3/4)
Experiments with Image Size

Test Data and Experimental Results (4/4)
Experiments with Natural Images