10/14/14 Image-based Lighting. Computational Photography, Derek Hoiem, University of Illinois. Many slides from Debevec, some from Efros.


Next two classes Today: mid-semester feedback; distribute light probes; start on ray tracing, environment maps, and relighting 3D objects (project 4 topics) Thursday: more HDR, light probes, etc.

Project 4

How to render an object inserted into an image? What’s wrong with the teapot?

How to render an object inserted into an image? Traditional graphics way Manually model BRDFs of all room surfaces Manually model radiance of lights Do ray tracing to relight object, shadows, etc.

Relighting is important! (Examples: famous government Photoshop fails; a North Korean Photoshop fail.)


How to render an object inserted into an image? Image-based lighting Capture incoming light with a “light probe” Model local scene Ray trace, but replace distant scene with info from light probe Debevec SIGGRAPH 1998

Key ideas for Image-based Lighting Environment maps: tell what light is entering at each angle within some shell

Key ideas for Image-based Lighting Light probes: a way of capturing environment maps in real scenes

Key ideas for Image-based Lighting Capturing HDR images: needed so that light probes capture full range of radiance

Key ideas for Image-based Lighting Ray tracing: synthesize light paths between camera and light sources

Key ideas for Image-based Lighting Ray tracing: determining pixel intensity by shooting out rays from the camera

Key ideas for Image-based Lighting Relighting: environment map acts as light source, substituting for distant scene

Today Ray tracing Capturing environment maps

A photon’s life choices After leaving the light source, a photon may undergo absorption, diffuse reflection, specular reflection, transparency, refraction, fluorescence (absorbed at wavelength λ1 and re-emitted at λ2), subsurface scattering, phosphorescence (absorbed at time t=1 and re-emitted at t=n), or interreflection (including specular interreflection), possibly several of these in sequence, before reaching the camera center and image plane.

Where are the light sources in this room?

Rendering Equation Outgoing light = generated (emitted) light + total reflected light. The reflected term integrates, over all incoming directions, the incoming light weighted by the BRDF and the incident-angle foreshortening: L_o(x, ω_o) = L_e(x, ω_o) + ∫_Ω f_r(x, ω_i, ω_o) L_i(x, ω_i) (ω_i · n) dω_i

Rendering a scene with ray tracing

Ray tracing: basics camera center image plane λ light source
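The core operation in a ray tracer is intersecting rays with scene geometry. As a minimal sketch (function and variable names are illustrative, not from the slides), here is the classic ray-sphere intersection test:

```python
import math

def intersect_sphere(origin, direction, center, radius):
    """Return the smallest t >= 0 where origin + t*direction hits the
    sphere, or None if the ray misses. `direction` must be unit length."""
    oc = [o - c for o, c in zip(origin, center)]        # center -> origin
    b = 2.0 * sum(d * v for d, v in zip(direction, oc))
    c = sum(v * v for v in oc) - radius * radius
    disc = b * b - 4.0 * c                              # a = 1 (unit dir)
    if disc < 0:
        return None                                     # ray misses
    t = (-b - math.sqrt(disc)) / 2.0                    # nearer root
    return t if t >= 0 else None

# A ray shot along +z from the origin toward a sphere at z=5, radius 1:
t = intersect_sphere((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0)  # t = 4.0
```

For each pixel, a renderer shoots such a ray through the image plane and shades at the nearest hit.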

Diffuse shading
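Diffuse shading at a hit point reduces to a cosine law. A toy sketch (hypothetical names, vectors assumed unit length):

```python
def lambertian(normal, light_dir, albedo, light_intensity=1.0):
    """Diffuse (Lambertian) shading: brightness falls off with the cosine
    of the angle between the surface normal and the direction to the
    light; surfaces facing away from the light receive nothing."""
    cos_theta = sum(n * l for n, l in zip(normal, light_dir))
    return albedo * light_intensity * max(0.0, cos_theta)

# Light hitting a surface head-on vs. at 60 degrees off the normal:
head_on = lambertian((0, 0, 1), (0, 0, 1), albedo=0.8)    # 0.8
oblique = lambertian((0, 0, 1), (0, 0.866, 0.5), 0.8)     # ~0.4
```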

Reflections

Shadows

Ray casting Store colors of surfaces, cast out rays, see what colors each ray hits Wolfenstein 3D (1992)

Ray tracing: fast approximation Upon hitting a surface Cast reflection/refraction ray to determine reflected or refracted surface Cast shadow ray: go towards light and see if an object is in the way
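The reflection and shadow rays described above come down to simple vector math. A hedged sketch (helper names are invented for illustration; the shadow test accepts any intersection routine, such as a ray-sphere test):

```python
def reflect(d, n):
    """Mirror direction d about unit normal n: r = d - 2(d.n)n."""
    dn = sum(a * b for a, b in zip(d, n))
    return tuple(a - 2.0 * dn * b for a, b in zip(d, n))

def in_shadow(point, light_pos, occluders, intersect):
    """Cast a shadow ray from `point` toward the light. A hit closer than
    the light means the point is shadowed. `intersect(origin, direction,
    center, radius)` returns a hit distance or None."""
    to_light = [l - p for l, p in zip(light_pos, point)]
    dist = sum(v * v for v in to_light) ** 0.5
    direction = tuple(v / dist for v in to_light)
    return any(
        (t := intersect(point, direction, *obj)) is not None and t < dist
        for obj in occluders
    )

# A ray heading down-and-right bounces off a floor with normal +y:
r = reflect((1, -1, 0), (0, 1, 0))   # (1.0, 1.0, 0.0)
```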

Ray tracing: interreflections Reflect light N times before heading to light source N=16 N=2

Ray tracing Conceptually simple but hard to do fast Full solution requires tracing millions of rays for many inter-reflections Design choices – Ray paths: light to camera vs. camera to light? – How many samples per pixel (to avoid aliasing)? – How to sample diffuse reflections? – How many inter-reflections to allow? – How to deal with subsurface scattering, etc.?

Environment Maps The environment map may take various forms: – Cubic mapping – Spherical mapping – other The chosen form describes the shape of the surface on which the map “resides”, and determines how the map is generated and how it is indexed

Cubic Map Example

Cubic Mapping The map resides on the surfaces of a cube around the object – Typically, align the faces of the cube with the coordinate axes To generate the map: – For each face of the cube, render the world from the center of the object with the cube face as the image plane Rendering can be arbitrarily complex (it’s off-line) To use the map: – Index the R ray into the correct cube face – Compute texture coordinates
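The face-selection step ("index the R ray into the correct cube face") can be sketched as follows. Real APIs (e.g., OpenGL cube maps) fix specific per-face axis conventions; this toy version uses a simplified one, so treat the (u, v) axes as illustrative:

```python
def cube_face_uv(d):
    """Map a direction to (face, u, v). The component with the largest
    magnitude selects the face; the other two components, divided by it,
    give coordinates that are remapped from [-1, 1] to [0, 1]."""
    x, y, z = d
    ax, ay, az = abs(x), abs(y), abs(z)
    if ax >= ay and ax >= az:
        face, u, v = ('+x' if x > 0 else '-x'), y / ax, z / ax
    elif ay >= az:
        face, u, v = ('+y' if y > 0 else '-y'), x / ay, z / ay
    else:
        face, u, v = ('+z' if z > 0 else '-z'), x / az, y / az
    return face, 0.5 * (u + 1), 0.5 * (v + 1)

# A mostly-downward reflection ray indexes into the -y face:
face, u, v = cube_face_uv((0.2, -0.9, 0.1))   # face = '-y'
```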

Spherical Map Example

Sphere mapping Map lives on a sphere To generate the map: – Render a spherical panorama from the desired center point Rendering with environment map: – Use the orientation of the R ray to index directly into the sphere

What approximations are made? The map should contain a view of the world with the point of interest on the object as the Center of Projection – We can’t store a separate map for each point, so one map is used with the COP at the center of the object – Introduces distortions in the reflection, but we usually don’t notice – Distortions are minimized for a small object in a large room The object will not reflect itself!

Rendering with environment maps and local models camera center image plane

Storing spherical environment maps

Equirectangular (latitude-longitude) projection
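A direction is mapped into an equirectangular image by its longitude and latitude. A minimal sketch (axis and wrap conventions vary between tools; the ones here are illustrative):

```python
import math

def dir_to_equirect(d, width, height):
    """Map a unit direction to pixel coordinates in a latitude-longitude
    image: longitude spans the width, latitude spans the height."""
    x, y, z = d
    lon = math.atan2(x, -z)                      # azimuth in [-pi, pi]
    lat = math.asin(max(-1.0, min(1.0, y)))      # elevation in [-pi/2, pi/2]
    u = (lon / math.pi + 1.0) / 2.0              # [0, 1] across the width
    v = (lat / (math.pi / 2) + 1.0) / 2.0        # [0, 1] along the height
    return u * (width - 1), v * (height - 1)

# The straight-up direction (+y) maps to the middle column, last row:
px = dir_to_equirect((0, 1, 0), 100, 50)         # (49.5, 49.0)
```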

What about real scenes? From Flight of the Navigator

What about real scenes? from Terminator 2

Real environment maps We can use photographs to capture environment maps – The first use of panoramic mosaics – Fisheye lens – Mirrored balls (light probes)

Mirrored Sphere

Mirror balls for image-based lighting

Sources of Mirrored Balls  2-inch chrome balls, ~$20 ea. – McMaster-Carr Supply Company  6-12 inch large gazing balls – Baker’s Lawn Ornaments  Hollow spheres, 2in-4in – Dube Juggling Equipment

Calibrating Mirrored Sphere Reflectivity => 59% reflective

Spherical map domain transformations Many rendering programs only accept one format (mirror ball, equirectangular, cube map, etc.) – e.g., Blender only accepts equirectangular maps How do we convert a mirror ball to equirectangular?

Mirror ball -> equirectangular

Spherical coordinates – Convert the light directions incident to the ball into spherical coordinates (phi, theta) – Map from mirror-ball (phi, theta) to equirectangular (phi, theta), interpolating each color channel (MATLAB, where d is the number of color channels):

for i = 1:d
    F = TriScatteredInterp(phi_ball(:), theta_ball(:), reshape(mirrorball(:,:,i), [], 1));
    latlon(:,:,i) = F(phi_latlon, theta_latlon);
end

Mirror ball -> equirectangular: normals -> reflection vectors -> phi/theta of reflection vectors -> phi/theta in the equirectangular domain
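The normals-to-reflection-vectors step above can be sketched as follows, assuming an orthographic camera looking down -z at the ball (a common simplification; function and variable names are illustrative):

```python
import math

def ball_pixel_to_direction(px, py, cx, cy, radius):
    """The pixel offset from the ball center gives the sphere normal
    (under an orthographic view); reflecting the viewing ray d = (0,0,-1)
    about that normal gives the world direction seen at that pixel."""
    nx = (px - cx) / radius
    ny = (py - cy) / radius
    nz2 = 1.0 - nx * nx - ny * ny
    if nz2 < 0:
        return None                          # pixel lies outside the ball
    nz = math.sqrt(nz2)
    d_dot_n = -nz                            # d = (0, 0, -1)
    rx = -2.0 * d_dot_n * nx                 # r = d - 2(d.n)n
    ry = -2.0 * d_dot_n * ny
    rz = -1.0 - 2.0 * d_dot_n * nz
    theta = math.acos(max(-1.0, min(1.0, ry)))   # polar angle from +y
    phi = math.atan2(rx, -rz)                    # azimuth
    return (rx, ry, rz), phi, theta

# The ball's center pixel reflects the view straight back (+z here):
r_vec, phi, theta = ball_pixel_to_direction(100, 100, 100, 100, 50)
```

The resulting (phi, theta) per pixel is what gets resampled onto the equirectangular grid.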

One small snag How do we deal with light sources? Sun, lights, etc? – They are much, much brighter than the rest of the environment

Problem: Dynamic Range

The real world is high dynamic range: scene luminances span many orders of magnitude, reaching values around 2,000,000,000 in relative units.

Long Exposure The high-dynamic-range real world is mapped to a 0-to-255 picture: dark regions are captured, but bright regions saturate.

Short Exposure The high-dynamic-range real world is mapped to a 0-to-255 picture: bright regions are captured, but dark regions are lost.

Camera Calibration Geometric – How pixel coordinates relate to directions in the world Photometric – How pixel values relate to radiance amounts in the world

The Image Acquisition Pipeline scene radiance (W/sr/m²) → [Lens] → sensor irradiance → [Shutter, open for Δt] → sensor exposure Film path: latent image → [Development] → film density Electronic path: [CCD] → analog voltages → [ADC] → digital values → [Remapping] → pixel values

Imaging system response function: pixel value (e.g., CCD photon count) plotted against log Exposure, where log Exposure = log(Radiance × Δt)

Varying Exposure

Camera is not a photometer! Limited dynamic range → perhaps use multiple exposures? Unknown, nonlinear response → not possible to convert pixel values directly to radiance Solution: – Recover the response curve from multiple exposures, then reconstruct the radiance map
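A simplified version of the reconstruction step: if we assume a linear response (Debevec and Malik additionally recover the nonlinear response curve from the exposures themselves), merging a bracketed stack reduces to a weighted average of exposure-normalized pixel values. This sketch uses invented names and flat per-channel pixel lists:

```python
def merge_exposures(images, exposure_times):
    """Merge exposure-bracketed shots into a radiance map, assuming a
    LINEAR camera response for simplicity. Each image is a flat list of
    values in [0, 255]; a hat weighting trusts mid-range pixels most and
    ignores clipped ones."""
    def weight(z):
        return z if z <= 127.5 else 255.0 - z

    radiance = []
    for pix in zip(*images):                 # same pixel across exposures
        num = den = 0.0
        for z, dt in zip(pix, exposure_times):
            w = weight(z)
            num += w * (z / dt)              # linear model: E ~ Z / dt
            den += w
        radiance.append(num / den if den > 0 else 0.0)
    return radiance

# Two exposures of a two-pixel scene; the longer exposure saturates at
# 255, so the shorter exposure supplies that pixel's radiance estimate:
hdr = merge_exposures([[100, 120], [200, 255]], [1.0, 2.0])  # [100.0, 120.0]
```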

Next class How to capture HDR image using “bracketing” How to relight an object from an environment map