Extended Depth of Field For Long Distance Biometrics

Presentation transcript:

Extended Depth of Field For Long Distance Biometrics
Shree K. Nayar, Columbia University
ONR MURI Annual Meeting, October 28, 2011
Graduate Students: Oliver Cossairt, Daniel Miau, Sophia Li

Depth Effects: Magnification and Blur
Diagram: lens (FL = 2 m) and sensor; object distance = 300 m, depth range = 100 m; labeled dimensions of 200 mm and 5 mm; captured image.
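To make these depth effects concrete, here is a minimal thin-lens sketch using the slide's nominal numbers (2 m focal length, focus at 300 m); it assumes the 200 mm label is the aperture diameter and takes the 100 m depth range as 250-350 m for illustration.

```python
import numpy as np

# Thin-lens sketch of the depth effects above. Assumptions: f = 2 m and focus
# at 300 m from the slide; the 200 mm label is treated as the aperture diameter.
f = 2.0            # focal length [m]
D = 0.2            # assumed aperture diameter [m]
o_focus = 300.0    # focus distance [m]

def image_distance(o):
    """Thin-lens equation: 1/o + 1/i = 1/f."""
    return 1.0 / (1.0 / f - 1.0 / o)

i_focus = image_distance(o_focus)   # sensor position for the in-focus plane

for o in (250.0, 300.0, 350.0):     # 100 m depth range around the focus distance
    i = image_distance(o)
    m = i / o                                  # magnification
    blur = D * abs(i - i_focus) / i            # geometric blur-circle diameter [m]
    print(f"object at {o:5.0f} m: magnification {m:.2e}, "
          f"blur circle {blur * 1e6:6.1f} um on the sensor")
```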

Extending Depth of Field: Approaches
Focal sweep cameras [Hausler ‘72] [Nagahara et al. ’08]: lens, sensor, and a swept focal plane.
Diffusion coding [Cossairt et al. ‘10]: lens, diffuser, and sensor.

Approach 1: Focal Sweep [Hausler ’72] [Nagahara et al. ’08]
Diagram: lens, scene, and sensor; the instantaneous PSFs at t = 1 through t = 7 sum to the final PSF.
One way of extending DOF that was explored here in the lab is to use sensor motion during exposure. The idea is to move the sensor during the exposure so that every object is in focus for at least one instant. The result is a PSF that is depth-invariant and preserves high frequencies.
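As a rough illustration (toy parameters, not the camera model from the talk) of why integrating defocus over the sweep yields a nearly depth-invariant PSF, the sketch below sums disk PSFs whose radius passes through zero during the exposure; two objects at different depths end up with almost the same integrated PSF.

```python
import numpy as np

def disk_psf(radius_px, size=65):
    """Uniform defocus disk PSF of the given radius (pixels), normalized to sum to 1."""
    c = size // 2
    y, x = np.mgrid[0:size, 0:size]
    d = ((x - c) ** 2 + (y - c) ** 2 <= max(radius_px, 0.5) ** 2).astype(float)
    return d / d.sum()

def focal_sweep_psf(r_start, r_end, steps=50, size=65):
    """Integrate the instantaneous defocus PSFs over the exposure as the
    sensor sweeps and the object's defocus radius passes through zero."""
    psf = np.zeros((size, size))
    for r in np.linspace(r_start, r_end, steps):
        psf += disk_psf(abs(r), size)     # the sign only marks which side of focus
    return psf / psf.sum()

# Two objects at different depths see different defocus histories during the
# same sweep, but both pass through focus, so the integrated PSFs are close.
near = focal_sweep_psf(r_start=12, r_end=-4)
far = focal_sweep_psf(r_start=4, r_end=-12)
print("max |difference| between the two integrated PSFs:", np.abs(near - far).max())
```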

Focal Sweep PSF: Depth Invariance
Figure: PSFs for different depths.

Focal Sweep Experiments with Telescope
Setup: Meade LX200 Schmidt-Cassegrain 8″ telescope system.
Figure: conventional image vs. focal sweep image of targets at 50 m and 75 m.

Focal Sweep without Moving Parts?
Can we achieve the performance of focal sweep without moving parts?
Diagram: ray distribution with motion (lens sweeping focus) vs. without motion (lens plus an unknown optical device).
We can achieve the same effect without motion if we can use an optical device that distributes the ray energy in exactly the same way.

Approach 2: Diffusion Coding
Diagram: a conventional diffuser vs. a radially symmetric diffuser, placed between the lens and the sensor.
These optical devices exist; they are called diffusers. Diffusers are typically random surfaces whose statistics are chosen to give the desired scattering profile. In our case, we were interested in a design that scatters light only along radial lines, so we designed a radially symmetric diffuser.

Diffusion Coding: Performance
Plot: deblurring error vs. depth for focus sweep and diffusion coding; noise level indicated.
Diffusion coding achieves performance similar to focal sweep, without moving parts.
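To connect this plot to something concrete, here is a minimal sketch (an illustrative metric, not the one from the paper): each depth's true PSF blurs a test image, the result is Wiener-deconvolved with a single nominal PSF, and the RMS residual is reported. With a conventional defocus PSF the error grows with depth, while a sweep-style depth-invariant PSF keeps it comparatively flat in this toy setup.

```python
import numpy as np

def disk_psf(radius_px, size=65):
    """Uniform defocus disk PSF, normalized to sum to 1."""
    c = size // 2
    y, x = np.mgrid[0:size, 0:size]
    d = ((x - c) ** 2 + (y - c) ** 2 <= max(radius_px, 0.5) ** 2).astype(float)
    return d / d.sum()

def sweep_psf(r_lo, r_hi, steps=40, size=65):
    """Sweep-style PSF: average of defocus disks as focus passes through the object."""
    p = sum(disk_psf(abs(r), size) for r in np.linspace(r_lo, r_hi, steps))
    return p / p.sum()

def blur(img, psf):
    """Circular convolution via the FFT (adequate for this illustration)."""
    return np.real(np.fft.ifft2(np.fft.fft2(img) * np.fft.fft2(psf, img.shape)))

def wiener_deblur(img, psf, nsr=1e-3):
    """Wiener deconvolution with a flat noise-to-signal ratio."""
    H = np.fft.fft2(psf, img.shape)
    return np.real(np.fft.ifft2(np.fft.fft2(img) * np.conj(H) / (np.abs(H) ** 2 + nsr)))

def rmse(a, b):
    return float(np.sqrt(np.mean((a - b) ** 2)))

rng = np.random.default_rng(0)
scene = rng.random((256, 256))            # stand-in for a textured scene

print("Conventional camera, deblurred with the in-focus PSF:")
nominal = disk_psf(1)
for r in (1, 4, 8, 12):                   # defocus radius grows away from the focal plane
    err = rmse(wiener_deblur(blur(scene, disk_psf(r)), nominal), scene)
    print(f"  defocus radius {r:2d} px -> RMSE {err:.3f}")

print("Sweep-style camera, deblurred with one fixed sweep PSF:")
nominal_sweep = sweep_psf(-12, 12)
for offset in (0, 3, 6, 9):               # depth shifts where the sweep crosses focus
    observed = blur(scene, sweep_psf(-12 + offset, 12 + offset))
    err = rmse(wiener_deblur(observed, nominal_sweep), scene)
    print(f"  depth offset {offset:2d} px -> RMSE {err:.3f}")
```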

Diffusion Coding: Implementation
Figures: diffuser surface profile, diffuser scatter function vs. radius r (mm), and diffuser height map.
Fabricated by RPC Photonics [www.rpcphotonics.com].

Diffusion Coding: Experiments
Experimental setup: fabricated diffuser, Canon 50mm EF lens, Canon 450D sensor.
Figure: measured PSFs across depth, without and with the diffuser.

Face Detection with Diffusion Coding Conventional Camera (F:2.0) Diffusion Coding Camera (F:2.0)

Towards a Diffusion Coded Telescope
Diffusion coding for long range EDOF imaging?

Diffusion Coded Telescope: Optical Design
Diagram: diffuser, annular aperture, mirror 1, mirror 2, sensor; 8″ diameter, 80″ focal length.

Work in Progress: Telescope Diffuser
Diffuser design issues: 8″ diameter, focal length = 2000 mm, diffusion angle < 50 μradians.
Fabrication issues: 8″ diameter optical element, surface thickness < 30 μm, precision optical alignment.
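For a sense of scale (assuming the 50 μradian figure is the maximum scatter half-angle), a small geometric check of how far a ray deflected by that angle lands from its nominal spot at the focal plane of the 2000 mm system:

```python
import math

focal_length_mm = 2000.0
half_angle_rad = 50e-6            # assumed maximum scatter half-angle

# Lateral displacement at the focal plane for a ray deflected by the half-angle.
spread_mm = focal_length_mm * math.tan(half_angle_rad)
print(f"scatter of {half_angle_rad * 1e6:.0f} urad over {focal_length_mm:.0f} mm "
      f"-> ~{spread_mm * 1000:.0f} um at the focal plane")
```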

Depth of Field and Biometrics
Diagram: a conventional camera (lens and sensor) captures face images at depths 1 through 5; each captured image is compared against an image database (e.g., a labeled photo of Brad Pitt) by a verification classifier that decides same or different.

Attribute-Based Face Verification [Kumar et al. 2009]
Pipeline: images → low-level features (RGB, HOG, LBP, SIFT, …) → attributes (e.g., dark hair, male, Asian, round jaw) → verification decision (same or different).
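A minimal sketch of the attribute idea (hypothetical feature and attribute models, not the trained classifiers of Kumar et al.): each face is mapped to a vector of attribute scores, and a pair is declared same or different by comparing the two attribute vectors.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical stand-ins: the real system uses RGB, HOG, LBP and SIFT features
# and a separately trained classifier per attribute; here both are simulated
# with random linear models purely to show the structure of the pipeline.
N_FEATURES, N_ATTRIBUTES = 512, 65
attribute_models = rng.normal(size=(N_ATTRIBUTES, N_FEATURES))

def extract_features(face_image):
    """Placeholder for low-level feature extraction (RGB / HOG / LBP / SIFT)."""
    return face_image.reshape(-1)[:N_FEATURES]

def attribute_scores(features):
    """Map low-level features to attribute scores in (0, 1), e.g. male, dark hair."""
    return 1.0 / (1.0 + np.exp(-attribute_models @ features))

def verify(face_a, face_b, threshold=0.2):
    """Same/different decision from the distance between the two attribute vectors.
    (The actual pipeline trains a binary classifier on such pairs.)"""
    a = attribute_scores(extract_features(face_a))
    b = attribute_scores(extract_features(face_b))
    return float(np.mean(np.abs(a - b))) < threshold

# Toy usage: random arrays standing in for aligned face crops.
same_a = rng.random((32, 32))
same_b = same_a + 0.01 * rng.random((32, 32))   # slightly perturbed copy of the same face
other = rng.random((32, 32))
print(verify(same_a, same_b), verify(same_a, other))
```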

EDOF Face Verification: Experiment
Four simulated cameras, each imaging faces at depths 1 through 5: ground truth, conventional camera (lens and sensor), pinhole camera, and focal sweep camera.

EDOF Face Verification: Experiment
For each camera type, simulate 5 different depth locations for all 200 people in the PubFig database.
For each person, generate 20 positive matches and 200 negative matches.
Test verification accuracy using the attribute classifier trained on the LFW database [Kumar et al. ‘09].
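A minimal sketch of how such an experiment can be scored (hypothetical similarity scores, not the actual PubFig/LFW pipeline): sweep a threshold over the positive and negative pair scores to trace the ROC curves shown on the following slides.

```python
import numpy as np

def roc_curve(pos_scores, neg_scores, n_thresholds=200):
    """True/false positive rates as the accept threshold is swept."""
    lo = min(pos_scores.min(), neg_scores.min())
    hi = max(pos_scores.max(), neg_scores.max())
    thresholds = np.linspace(lo, hi, n_thresholds)
    tpr = np.array([(pos_scores >= t).mean() for t in thresholds])
    fpr = np.array([(neg_scores >= t).mean() for t in thresholds])
    return fpr, tpr

rng = np.random.default_rng(2)
# Hypothetical similarity scores for one camera type at one simulated depth:
# 20 positive and 200 negative pairs per person, for 200 people.
pos = rng.normal(loc=0.7, scale=0.2, size=200 * 20)    # same-person pairs
neg = rng.normal(loc=0.4, scale=0.2, size=200 * 200)   # different-person pairs

fpr, tpr = roc_curve(pos, neg)
order = np.argsort(fpr)
auc = np.trapz(tpr[order], fpr[order])                 # area under the ROC curve
print(f"simulated AUC for this condition: {auc:.3f}")
```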

EDOF Face Verification: Results
ROC curves (true positive rate vs. false positive rate) for the ground truth camera (800 mm EFL, f/10, no blur, zero noise) at depths of 100, 136, 163, 190, and 218 m.

EDOF Face Verification: Results
ROC curves for the conventional camera (800 mm EFL, f/10) at depths of 100, 136, 163, 190, and 218 m.

EDOF Face Verification: Results
ROC curves for the pinhole camera (800 mm EFL, f/10) at depths of 100, 136, 163, 190, and 218 m.

EDOF Face Verification: Results
ROC curves for the focal sweep camera (800 mm EFL, f/10) at depths of 100, 136, 163, 190, and 218 m.

The Problem of Magnification
Diagram: a focal sweep camera (lens and sensor) imaging faces at depths 1 through 5; the captured faces shrink with depth, and performance decreases.
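To see why magnification matters here, a small check (assumed face height of 0.25 m and pixel pitch of 5 μm; the 800 mm focal length and the depth values are taken from the results slides) of how many pixels a face spans at each depth:

```python
# Approximate image size of a face at the depths used in the experiments.
# Assumptions: 0.25 m face height, 5 um pixel pitch; 800 mm EFL from the slides.
focal_length_m = 0.8
face_height_m = 0.25
pixel_pitch_m = 5e-6

for depth_m in (100.0, 136.0, 163.0, 190.0, 218.0):
    image_height_m = face_height_m * focal_length_m / depth_m   # pinhole projection
    print(f"depth {depth_m:5.0f} m -> face spans ~{image_height_m / pixel_pitch_m:4.0f} pixels")
```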

Focal sweep produces a depth-invariant PSF (figure: input image).

Generalized Focal Sweep
Figure: input image and PSF at depths 1 through 5.
Generalized focal sweep preserves energy at smaller magnifications.

Generalized Focal Sweep: Motion Trajectories
Plots: sensor position vs. time for sinusoid-wave and triangle-wave trajectories at 5 Hz and 25 Hz. Blue: input trajectory signal; red: monitored trajectory signal.
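For reference, a minimal sketch (illustrative amplitude and sampling, not the actual controller commands) of the two trajectory shapes sent to the positioning stage: a sinusoid and a triangle wave at the frequencies shown.

```python
import numpy as np

def sweep_trajectory(t, freq_hz, amplitude_mm, shape="sinusoid"):
    """Periodic sensor-position command for a focal sweep.
    shape is "sinusoid" or "triangle"; amplitude_mm is the peak displacement."""
    phase = (t * freq_hz) % 1.0
    if shape == "sinusoid":
        return amplitude_mm * np.sin(2 * np.pi * phase)
    # Triangle wave: ramps between +amplitude and -amplitude once per period.
    return amplitude_mm * (4.0 * np.abs(phase - 0.5) - 1.0)

t = np.linspace(0.0, 0.2, 1000)            # 0.2 s of commands sampled at 5 kHz
for freq in (5, 25):                        # the two rates shown on the slide
    sin_cmd = sweep_trajectory(t, freq, amplitude_mm=0.5, shape="sinusoid")
    tri_cmd = sweep_trajectory(t, freq, amplitude_mm=0.5, shape="triangle")
    print(f"{freq:2d} Hz: sinusoid peak-to-peak {np.ptp(sin_cmd):.2f} mm, "
          f"triangle peak-to-peak {np.ptp(tri_cmd):.2f} mm")
```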

Generalized Focal Sweep: Telephoto System
Components: BEI Kimco voice coil motor, Elmo Motion Control driver, Lumenera 1/3″ CMOS sensor, power supply, positioning stage, Canon 800mm EFL lens.

Future Work
Diffusion coded telescope: fabricate 8″ diameter diffuser; design experiments for long range EDOF videos.
Generalized focal sweep: develop generalized focal sweep theory; implement generalized focal sweep camera system.
EDOF face verification: experiments; optimize motion trajectory for face verification.