Cameras Digital Visual Effects, Spring 2005 Yung-Yu Chuang 2005/3/2 with slides by Brian Curless, Steve Seitz and Alexei Efros.


Announcements Classroom is changed to Room 101 Assignment schedule –Image morphing (3/9-3/30) –Image stitching (3/30-4/20) –Matchmove (4/20-5/11) –Final project (5/11-6/22) Scribe Send to subscribe

Outline Pinhole camera Film camera Digital camera Video camera High dynamic range imaging

Camera trial #1 scene film Put a piece of film in front of an object.

Pinhole camera (scene, barrier with pinhole, film) Add a barrier to block off most of the rays. This reduces blurring. The pinhole is known as the aperture. The image is inverted.

Shrinking the aperture Why not make the aperture as small as possible? Less light gets through, and diffraction effects blur the image.

Shrinking the aperture

Camera Obscura Drawing from “The Great Art of Light and Shadow,” Jesuit Athanasius Kircher, 1646.

High-end commercial pinhole cameras

Adding a lens

Adding a lens (scene, lens, film) “circle of confusion” A lens focuses light onto the film. There is a specific distance at which objects are “in focus”; other points project to a “circle of confusion” in the image.

Lenses Thin lens equation: 1/d_o + 1/d_i = 1/f, where d_o is the object distance, d_i the image distance, and f the focal length. Any object point satisfying this equation is in focus. Thin lens applet:
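
A minimal sketch of applying the thin lens equation, assuming the standard form 1/f = 1/d_o + 1/d_i (the helper name is my own, not from the slides):

```python
# Thin lens equation: 1/f = 1/d_o + 1/d_i.
# Given focal length f and object distance d_o (both in meters),
# solve for the image distance d_i.

def image_distance(f, d_o):
    """Return the image distance d_i from the thin lens equation."""
    if d_o <= f:
        raise ValueError("object at or inside the focal length: no real image")
    return 1.0 / (1.0 / f - 1.0 / d_o)

# An object 1 m from a 50 mm lens focuses about 52.6 mm behind the lens;
# an object at 2f images at 2f.
d_i = image_distance(0.050, 1.0)
```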

Exposure = aperture + shutter speed An aperture of diameter D restricts the range of rays (the aperture may be on either side of the lens). Shutter speed controls how long light is allowed to pass through the aperture.

Aperture Aperture is usually specified by f-stop, f/D. Each full f-stop change doubles or halves the light: a lower f-stop means more light (larger lens opening); a higher f-stop means less light (smaller lens opening).
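
Since the f-number is N = f/D and the light gathered scales with the aperture's area, the admitted light is proportional to 1/N². A hedged sketch of the doubling/halving rule (function name is mine):

```python
# Relative light admitted at f-number N, proportional to aperture area
# (D = f/N, area ~ D**2), so light ~ 1/N**2. Each full stop multiplies
# N by sqrt(2) and therefore halves the light.

def relative_light(n_stop):
    """Light admitted relative to f/1."""
    return 1.0 / n_stop ** 2

# f/2.8 admits roughly twice the light of f/4 (one full stop apart;
# the marked value 2.8 is a rounding of 2*sqrt(2), so the ratio is ~2).
ratio = relative_light(2.8) / relative_light(4.0)
```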

Depth of field Changing the aperture size affects depth of field: a smaller aperture increases the range in which objects are approximately in focus.

Distortion Radial distortion of the image –Caused by imperfect lenses –Deviations are most noticeable for rays that pass through the edge of the lens (No distortion / pincushion / barrel)
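
A sketch of the common polynomial radial distortion model (my own illustration, not the slides' model; under this sign convention k1 < 0 pulls points toward the center, giving barrel distortion, and k1 > 0 pushes them outward, giving pincushion):

```python
# Polynomial radial distortion: a normalized image point (x, y) at radius r
# maps to (x, y) * (1 + k1*r^2 + k2*r^4).

def distort(x, y, k1, k2=0.0):
    """Apply radial distortion with coefficients k1, k2."""
    r2 = x * x + y * y
    scale = 1.0 + k1 * r2 + k2 * r2 * r2
    return x * scale, y * scale

xd, yd = distort(0.5, 0.0, k1=-0.2)  # barrel: the point moves inward
```

Correcting distortion (as on the next slide) amounts to inverting this mapping, typically by iterative refinement since the polynomial has no closed-form inverse.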

Correcting radial distortion (from Helmut Dersch)

Film camera: scene, lens & motor, aperture & shutter, film

Digital camera: scene, lens & motor, aperture & shutter, sensor array A digital camera replaces film with a sensor array. Each cell in the array is a light-sensitive diode that converts photons to electrons.

CCD vs. CMOS CCD is less susceptible to noise (special process, higher fill factor). CMOS is more flexible, less expensive (standard process), and consumes less power.

Sensor noise Blooming Diffusion Dark current Photon shot noise Amplifier readout noise

Color So far, we’ve only talked about monochrome sensors. Color imaging has been implemented in a number of ways: Field sequential Multi-chip Color filter array X3 sensor

Field sequential

Prokudin-Gorskii (early 1900’s) Lantern projector

Prokudin-Gorskii (early 1900’s)

Multi-chip wavelength dependent

Embedded color filters Color filters can be manufactured directly onto the photodetectors.

Color filter array Color filter arrays (CFAs)/color filter mosaics Kodak DCS620x

Color filter array Color filter arrays (CFAs)/color filter mosaics Bayer pattern

Bayer’s pattern

Demosaicking CFAs: bilinear interpolation (original / input / linear interpolation)
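
A minimal bilinear demosaicking sketch (my own version, not the slides' code), assuming an RGGB Bayer layout with R at (0,0), G at (0,1) and (1,0), and B at (1,1). Each missing sample is the average of its available neighbors, computed as a normalized 3x3 convolution:

```python
import numpy as np

def conv3(a, k):
    """3x3 convolution with zero padding (kernel is symmetric)."""
    p = np.pad(a, 1)
    out = np.zeros(a.shape)
    for i in range(3):
        for j in range(3):
            out += k[i, j] * p[i:i + a.shape[0], j:j + a.shape[1]]
    return out

def demosaic_bilinear(raw):
    """Interpolate a single-channel Bayer mosaic to a full RGB image."""
    h, w = raw.shape
    masks = np.zeros((h, w, 3), bool)
    masks[0::2, 0::2, 0] = True   # R sites
    masks[0::2, 1::2, 1] = True   # G sites on R rows
    masks[1::2, 0::2, 1] = True   # G sites on B rows
    masks[1::2, 1::2, 2] = True   # B sites
    k = np.array([[0.25, 0.5, 0.25],
                  [0.5,  1.0, 0.5],
                  [0.25, 0.5, 0.25]])
    out = np.zeros((h, w, 3))
    for c in range(3):
        # normalized convolution: sum of known neighbors / their count
        out[..., c] = (conv3(np.where(masks[..., c], raw, 0.0), k)
                       / conv3(masks[..., c].astype(float), k))
    return out
```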

Demosaicking CFAs Constant hue-based interpolation (Cok) Hue (e.g., the ratios R/G and B/G): interpolate G first, then interpolate the hues.

Demosaicking CFAs Median-based interpolation (Freeman) 1. Linear interpolation 2. Median filter on color differences
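
A sketch of Freeman's key refinement step (my own minimal version, with assumed array shapes): after a first linear interpolation, median-filter the color differences R-G and B-G, then add G back. Object edges survive in the differences, so the median suppresses the color fringes of plain interpolation:

```python
import numpy as np

def median3(img):
    """3x3 median filter with edge replication."""
    p = np.pad(img, 1, mode='edge')
    h, w = img.shape
    stack = [p[i:i + h, j:j + w] for i in range(3) for j in range(3)]
    return np.median(np.stack(stack), axis=0)

def freeman_refine(rgb):
    """Refine an interpolated RGB image via medians of color differences."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    r2 = g + median3(r - g)
    b2 = g + median3(b - g)
    return np.stack([r2, g, b2], axis=-1)
```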

Demosaicking CFAs Median-based interpolation (Freeman) (original / input / linear interpolation / color difference / median filter / reconstruction)

Demosaicking CFAs Gradient-based interpolation (LaRoche-Prescott) 1. Interpolation on G

Demosaicking CFAs Gradient-based interpolation (LaRoche-Prescott) 2. Interpolation of color differences

Demosaicking CFAs (bilinear / Cok / Freeman / LaRoche)

Demosaicking CFAs Generally, Freeman’s is the best, especially for natural images.

Foveon X3 sensor Light penetrates to different depths for different wavelengths; a multilayer CMOS sensor gets 3 different spectral sensitivities.

Foveon X3 sensor (X3 sensor / Bayer CFA)

Color processing After color values are recorded, more color processing usually happens: –White balance –Non-linearity to approximate film response or match TV monitor gamma
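
Gray-world white balance is one common automatic heuristic for the white-balance step (a sketch of my own, not necessarily what any particular camera uses): scale each channel so the average color of the scene becomes neutral gray.

```python
import numpy as np

def gray_world(rgb):
    """Gray-world white balance: make the mean color neutral."""
    means = rgb.reshape(-1, 3).mean(axis=0)
    gains = means.mean() / means          # per-channel gain
    return rgb * gains

# A uniformly reddish image is pulled back to neutral gray.
balanced = gray_world(np.array([[[2.0, 1.0, 1.0]]]))
```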

White balance (automatic white balance / warmer +3)

Manual white balance white balance with the white book white balance with the red book

Autofocus Active –Sonar –Infrared Passive

Digital camera review website

Camcorder

Interlacing (with interlacing / without interlacing)

Deinterlacing: blend, weave

Deinterlacing: discard, progressive scan
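
Two of the deinterlacing strategies named above can be sketched as follows (my own minimal versions, assuming even rows hold one field and odd rows the other): discard keeps one field and duplicates its lines, halving vertical resolution; blend averages vertically, which softens combing at the cost of sharpness.

```python
import numpy as np

def deinterlace_discard(frame):
    """Keep the even field and duplicate its lines into the odd rows."""
    out = frame.copy()
    even = frame[0::2]
    out[1::2] = even[:len(out[1::2])]
    return out

def deinterlace_blend(frame):
    """Vertical blur across fields to suppress combing artifacts."""
    out = frame.astype(float)
    out[1:-1] = (frame[:-2] + 2 * frame[1:-1] + frame[2:]) / 4.0
    return out
```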

Hard cases

High dynamic range imaging

Camera pipeline

High dynamic range image

Short exposure Real-world radiance has a wide dynamic range; picture intensity is limited to pixel values 0 to 255.

Long exposure Real-world radiance has a wide dynamic range; picture intensity is limited to pixel values 0 to 255.

Real-world response functions

Camera calibration Geometric –How pixel coordinates relate to directions in the world Photometric –How pixel values relate to radiance amounts in the world

Camera is not a photometer Limited dynamic range: perhaps use multiple exposures? Unknown, nonlinear response: not possible to convert pixel values to radiance directly. Solution: –Recover the response curve from multiple exposures, then reconstruct the radiance map

Varying exposure Ways to change exposure –Shutter speed –Aperture –Neutral density filters

Shutter speed Note: shutter times usually obey a power series – each “stop” is a factor of 2: ¼, 1/8, 1/15, 1/30, 1/60, 1/125, 1/250, 1/500, 1/1000 sec. Usually it really is: ¼, 1/8, 1/16, 1/32, 1/64, 1/128, 1/256, 1/512, 1/1024 sec.

Varying shutter speeds

Algorithm Exposures: Δt = 4 sec, 1 sec, 1/4 sec, 1/16 sec, 1/64 sec. Z = F(exposure), exposure = radiance × Δt, log exposure = log radiance + log Δt.
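
Under the simplifying assumption that the response curve F has already been recovered and is linear, the relation log exposure = log radiance + log Δt lets us merge multiple exposures into a radiance map. This sketch follows the spirit of the Debevec-Malik weighted average in the log domain (function name and hat weight are my own choices):

```python
import numpy as np

def merge_hdr(images, dts, eps=1e-6):
    """Merge exposures into a radiance map, assuming a linear response.

    images: list of float arrays with values in [0, 1]
    dts:    matching exposure times in seconds
    """
    num = np.zeros_like(images[0])
    den = np.zeros_like(images[0])
    for z, dt in zip(images, dts):
        w = 1.0 - np.abs(2.0 * z - 1.0)          # hat weight, peak at 0.5
        num += w * (np.log(z + eps) - np.log(dt))  # log radiance estimate
        den += w
    return np.exp(num / np.maximum(den, eps))
```

Pixels near 0 or 1 (under- or over-exposed) get little weight, so each exposure contributes where it is most reliable.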

Response curve (plots of pixel value vs. ln exposure): assuming unit radiance for each pixel, and after adjusting radiances to obtain a smooth response curve.

Results (color film)

Recovered response function

Reconstructed radiance map

What is this for? Human perception Vision/graphics applications

Easier HDR reconstruction raw image = 12-bit CCD snapshot

Easier HDR reconstruction Z = F(exposure), exposure = radiance × Δt, log exposure = log radiance + log Δt.

Reference Ramanath, Snyder, Bilbro, and Sander, “Demosaicking Methods for Bayer Color Arrays,” Journal of Electronic Imaging, 11(3). Paul E. Debevec and Jitendra Malik, “Recovering High Dynamic Range Radiance Maps from Photographs,” SIGGRAPH 1997.