Computational Photography lecture 13 – HDR CS 590 Spring 2014 Prof. Alex Berg (Credits to many other folks on individual slides)


Today: Assignment 3; light probes and high dynamic range.

Slides from Derek Hoiem and others

How to render an object inserted into an image?

Traditional graphics way Manually model BRDFs of all room surfaces Manually model radiance of lights Do ray tracing to relight object, shadows, etc.

How to render an object inserted into an image? Image-based lighting Capture incoming light with a “light probe” Model local scene Ray trace, but replace distant scene with info from light probe Debevec SIGGRAPH 1998

Key ideas for Image-based Lighting Environment maps: tell what light is entering at each angle within some shell

Cubic Map Example

Spherical Map Example

Key ideas for Image-based Lighting Light probes: a way of capturing environment maps in real scenes

Mirrored Sphere

Mirror ball -> equirectangular

Mirror ball -> equirectangular: normals -> reflection vectors -> phi/theta of reflection vectors -> phi/theta in the equirectangular domain
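The chain above can be sketched numerically. The following is a minimal, illustrative numpy sketch (the function name and conventions are mine, not from the lecture): it assumes an orthographic camera looking down the -z axis at a unit mirror ball, recovers the surface normal at each pixel from the sphere geometry, reflects the view direction with R = 2(N·V)N - V, and converts the reflected ray to the (phi, theta) coordinates that index an equirectangular map.

```python
import numpy as np

def mirrorball_to_angles(h, w):
    """For each pixel of an h-by-w mirror-ball crop, return (theta, phi)
    of the reflected ray, or NaN outside the ball.
    Assumes an orthographic view along -z of a unit mirror sphere."""
    ys, xs = np.meshgrid(np.linspace(-1, 1, h), np.linspace(-1, 1, w),
                         indexing="ij")
    r2 = xs**2 + ys**2
    inside = r2 <= 1.0
    # Sphere surface normal at each pixel (z component points at the camera)
    nz = np.sqrt(np.clip(1.0 - r2, 0.0, 1.0))
    n = np.stack([xs, ys, nz], axis=-1)
    # View direction V = (0, 0, 1); reflection R = 2 (N . V) N - V
    v = np.array([0.0, 0.0, 1.0])
    ndotv = n @ v
    refl = 2.0 * ndotv[..., None] * n - v
    # Spherical angles of the reflected ray index the equirectangular map
    theta = np.arccos(np.clip(refl[..., 2], -1.0, 1.0))  # polar angle
    phi = np.arctan2(refl[..., 1], refl[..., 0])         # azimuth
    theta[~inside] = np.nan
    phi[~inside] = np.nan
    return theta, phi

theta, phi = mirrorball_to_angles(256, 256)
# At the ball's center the reflection points back at the camera (theta near 0)
```

A full converter would then bilinearly sample the ball image at these angles to fill in the equirectangular image.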

One small snag: how do we deal with light sources (sun, lights, etc.)? They are much, much brighter than the rest of the environment. Use High Dynamic Range photography!

Key ideas for Image-based Lighting Capturing HDR images: needed so that light probes capture full range of radiance

Problem: Dynamic Range

Long Exposure: the real world's high dynamic range is squeezed into the picture's 0-to-255 range; a long exposure recovers shadow detail but saturates the bright regions.

Short Exposure: a short exposure keeps the bright regions within the picture's 0-to-255 range but loses detail in the shadows.

LDR -> HDR by merging exposures: each of exposures 1 … n maps a different slice of the real world's high dynamic range into 0 to 255; merging them recovers the full range.
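A hedged sketch of the merge step (names are illustrative, and this assumes a linear camera response for simplicity; the real pipeline first recovers the inverse response g, as described below): each exposure contributes an estimate log Z - log Δt of log radiance, weighted so that pixels near 0 or 255 count less.

```python
import numpy as np

def merge_exposures(images, times):
    """Merge aligned 8-bit exposures (arrays in [0, 255]) taken with the
    given shutter times into one log-radiance map. Assumes a linear
    camera response; a real pipeline would apply the recovered g(z)."""
    eps = 1.0 / 255.0
    log_radiance = np.zeros(images[0].shape, dtype=np.float64)
    weight_sum = np.zeros_like(log_radiance)
    for img, dt in zip(images, times):
        z = img.astype(np.float64) / 255.0
        w = 1.0 - np.abs(2.0 * z - 1.0)                  # hat weight: low near 0 and 1
        est = np.log(np.clip(z, eps, 1.0)) - np.log(dt)  # log-radiance estimate
        log_radiance += w * est
        weight_sum += w
    return log_radiance / np.maximum(weight_sum, 1e-8)

# Two synthetic exposures of the same scene, 4x apart in shutter time
scene = np.array([[0.05, 0.5, 2.0]])        # "true" radiance
short = np.clip(scene * 0.25, 0, 1) * 255   # dt = 0.25 s
long_ = np.clip(scene * 1.0, 0, 1) * 255    # dt = 1.0 s
hdr = merge_exposures([short, long_], [0.25, 1.0])
```

Note how the brightest pixel saturates in the long exposure (weight 0 there) and is recovered entirely from the short one.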

Ways to vary exposure:
- Shutter Speed (*)
- F/stop (aperture, iris)
- Neutral Density (ND) Filters

Shutter Speed
Ranges: Canon EOS-1D X: 30 to 1/8,000 sec. ProCamera for iOS: ~1/10 to 1/2,000 sec.
Pros: directly varies the exposure; usually accurate and repeatable.
Issues: noise in long exposures.

Recovering High Dynamic Range Radiance Maps from Photographs. Paul Debevec, Jitendra Malik. August 1997. Computer Science Division, University of California at Berkeley.

The Approach: Get pixel values Z_ij for images with shutter times Δt_j (i-th pixel location, j-th image). Exposure is radiance integrated over time: E_ij = R_i · Δt_j. Pixel values are a non-linear mapping of the E_ij's: Z_ij = f(E_ij) = f(R_i · Δt_j). Taking logs of the inverted mapping rewrites this as a (not so obvious) linear system: g(Z_ij) = ln R_i + ln Δt_j, where g = ln f^(-1).

The objective: Solve for the radiance R_i at each pixel site and the mapping g at each of the 256 pixel values to minimize the sum of a fitting term and a smoothness term, where: pixels near 0 or 255 get less weight; the shutter time Δt_j is known for each image j; the radiance at a particular pixel site is the same in every image; exposure should smoothly increase as pixel intensity increases; and g gives exposure as a function of pixel value.
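Written out (following the formulation in Debevec and Malik, SIGGRAPH 97, where g is the log inverse response, E_i is the irradiance at pixel site i — the slides' radiance R_i — and w is the weighting function), the objective being minimized is:

```latex
\mathcal{O} \;=\; \sum_{i=1}^{N}\sum_{j=1}^{P}
    \left\{ w(Z_{ij})\left[\, g(Z_{ij}) - \ln E_i - \ln \Delta t_j \,\right] \right\}^2
  \;+\; \lambda \sum_{z=Z_{\min}+1}^{Z_{\max}-1} \left[\, w(z)\, g''(z) \,\right]^2
```

The fitting term enforces g(Z_ij) = ln E_i + ln Δt_j at every observation; the smoothness term penalizes the discrete second derivative g''(z) = g(z-1) - 2g(z) + g(z+1), which is exactly what the three-coefficient rows (l, -2l, l) in the gsolve code build.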

The Math: Let g(z) be the discrete inverse response function. For each pixel site i in each image j, we want g(Z_ij) = ln R_i + ln Δt_j. Solve the resulting overdetermined linear system, which combines a fitting term with a smoothness term.

Matlab Code

function [g,lE] = gsolve(Z,B,l,w)
% Z(i,j): pixel value at pixel site i in image j
% B(i,j): log shutter time (log delta-t) for pixel i in image j
% l:      lambda, the smoothness weight
% w:      weighting function over pixel values
n = 256;
A = zeros(size(Z,1)*size(Z,2)+n+1, n+size(Z,1));
b = zeros(size(A,1),1);

% Include the data-fitting equations
k = 1;
for i=1:size(Z,1)
  for j=1:size(Z,2)
    wij = w(Z(i,j)+1);
    A(k,Z(i,j)+1) = wij;
    A(k,n+i) = -wij;
    b(k,1) = wij * B(i,j);
    k = k+1;
  end
end

% Fix the curve by setting its middle value to 0
A(k,129) = 1;
k = k+1;

% Include the smoothness equations
for i=1:n-2
  A(k,i)   =    l*w(i+1);
  A(k,i+1) = -2*l*w(i+1);
  A(k,i+2) =    l*w(i+1);
  k = k+1;
end

% Solve the overdetermined system in the least-squares sense
x = A\b;
g = x(1:n);
lE = x(n+1:size(x,1));

Illustration: image series with Δt = 4, 1, 1/4, 1/16, and 1/64 sec. Exposure = Radiance × Δt, so log Exposure = log Radiance + log Δt, and Pixel Value Z = f(Exposure).

Illustration: Response Curve (pixel value vs. ln exposure)

Response Curve (pixel value vs. ln exposure): first assuming unit radiance for each pixel, then after adjusting radiances to obtain a smooth response curve.

Results: Digital Camera. Recovered response curve (pixel value vs. log exposure), Kodak DCS460, exposures from 1/30 to 30 sec.

Reconstructed radiance map

Results: Color Film Kodak Gold ASA 100, PhotoCD

Recovered Response Curves: Red, Green, Blue, and all three overlaid (RGB).

How to display HDR? Linearly scaled to display device

Simple Global Operator
The compression curve needs to:
- Bring everything within range
- Leave dark areas alone
In other words:
- Asymptote at 255
- Derivative of 1 at 0
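A curve with exactly those two properties is the global operator of Reinhard et al. shown on the next slide, L_d = L/(1+L): slope 1 at L = 0 (dark areas pass through unchanged), asymptoting at 1, i.e. at 255 after quantization. A minimal sketch (assuming world luminances have already been scaled by the key/exposure):

```python
import numpy as np

def reinhard_global(luminance):
    """Reinhard et al. simple global operator: L_d = L / (1 + L).
    Maps [0, inf) into [0, 1): derivative 1 at 0, so dark areas are
    left alone, while arbitrarily bright values compress toward 1."""
    L = np.asarray(luminance, dtype=np.float64)
    return L / (1.0 + L)

# HDR luminances spanning eight orders of magnitude fit in 8 bits
L = np.array([0.0, 0.01, 1.0, 100.0, 1e6])
display = np.round(reinhard_global(L) * 255).astype(np.uint8)
```

The same idea extends to the local operator on the following slides, where the denominator uses a spatially varying average instead of a global one.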

Global Operator (Reinhard et al)

Global Operator Results

Darkest 0.1% scaled to display device vs. the Reinhard Operator

Local operator

What do we see? Vs.

Acquiring the Light Probe

Assembling the Light Probe

Real-World HDR Lighting Environments from the Light Probe Image Gallery: Funston Beach, Uffizi Gallery, Eucalyptus Grove, Grace Cathedral.

Not just shiny… We have captured a true radiance map, so we can treat it as an extended (e.g. spherical) light source and use Global Illumination to simulate light transport in the scene. All objects, not just shiny ones, can be lit this way. What's the limitation?

Illumination Results: rendered with Greg Larson's RADIANCE synthetic imaging system.

Comparison: Radiance map versus single image HDR LDR

Synthetic Objects + Real light!

CG Objects Illuminated by a Traditional CG Light Source

Illuminating Objects using Measurements of Real Light Object Light Environment assigned “glow” material property in Greg Ward’s RADIANCE system.

Paul Debevec. A Tutorial on Image-Based Lighting. IEEE Computer Graphics and Applications, Jan/Feb 2002.

Rendering with Natural Light SIGGRAPH 98 Electronic Theater

Movie

Illuminating a Small Scene

We can now illuminate synthetic objects with real light. - Environment map - Light probe - HDR - Ray tracing How do we add synthetic objects to a real scene?

Real Scene Example Goal: place synthetic objects on table

real scene Modeling the Scene light-based model

Light Probe / Calibration Grid

The Light-Based Room Model

real scene Modeling the Scene synthetic objects light-based model local scene

Differential Rendering Local scene w/o objects, illuminated by model

The Lighting Computation synthetic objects (known BRDF) synthetic objects (known BRDF) distant scene (light-based, unknown BRDF) local scene (estimated BRDF) local scene (estimated BRDF)

Rendering into the Scene Background Plate

Rendering into the Scene Objects and Local Scene matched to Scene

Differential Rendering: the difference in the local scene (rendered with objects minus rendered without objects).

Differential Rendering Final Result
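The compositing arithmetic behind these slides can be sketched as image arithmetic (array names are illustrative, following the structure of Debevec's differential rendering): outside the objects, only the difference the objects cause in the rendered local scene (shadows, interreflections) is added to the real background plate; object pixels come straight from the render.

```python
import numpy as np

def differential_render(background, with_objects, without_objects, obj_mask):
    """Composite synthetic objects into a background photograph.
    with_objects / without_objects: renders of the local scene with and
    without the synthetic objects; obj_mask: 1 where an object is visible.
    The real photo is modified only by the *change* the objects cause."""
    diff = with_objects - without_objects   # shadows, interreflections (+/-)
    composite = background + diff           # apply the change to the photo
    return obj_mask * with_objects + (1 - obj_mask) * composite

bg = np.full((2, 2), 0.5)                   # background plate
no_obj = np.full((2, 2), 0.6)               # rendered local scene, no objects
with_obj = np.array([[0.6, 0.4],            # 0.4: shadow cast by the object
                     [0.9, 0.6]])           # 0.9: the object itself
mask = np.array([[0, 0], [1, 0]], dtype=float)
out = differential_render(bg, with_obj, no_obj, mask)
```

The subtraction is what makes modeling errors in the local scene's estimated BRDF forgivable: where the renders with and without objects agree, the photograph passes through untouched.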

How to render an object inserted into an image? Image-based lighting Capture incoming light with a “light probe” Model local scene Ray trace, but replace distant scene with info from light probe Debevec SIGGRAPH 1998

Image-Based Lighting in Fiat Lux. Paul Debevec, Tim Hawkins, Westley Sarokin, H. P. Duiker, Christine Cheng, Tal Garfinkel, Jenny Huang. SIGGRAPH 99 Electronic Theater.

Fiat Lux

HDR Image Series 2 sec 1/4 sec 1/30 sec 1/250 sec 1/2000 sec 1/8000 sec

Stp1 Panorama

Assembled Panorama

Light Probe Images

Capturing a Spatially-Varying Lighting Environment

What if we don't have a light probe? Zoom in on an eye, extract an environment map from the eye, and use it to insert a relit face (Nishino and Nayar 2004).

Environment Map from an Eye

Can Tell What You Are Looking At: from the eye image, a retinal image can be computed.

Summary: Real scenes have complex geometries and materials that are difficult to model. We can use an environment map, captured with a light probe, as a replacement for distant lighting. We can get an HDR image by combining bracketed shots. We can then relight objects at that position using the environment map.

Take away: Read the paper (siggraph97.pdf) for next class (Tuesday); we will go over the math again. Make progress on your assignments.