
Progressively Refined Reflectance Fields from Natural Illumination
Wojciech Matusik, Matt Loper, Hanspeter Pfister
Mitsubishi Electric Research Labs

Motivation
–Complex natural scenes are difficult to acquire
–Acquisition needs to be easy and robust
–Image-based lighting offers high realism
–We would like to relight image-based models at any scale (from small objects to cities)

Motivation: Image-based Relighting
–no scene geometry, just images
–no assumptions about scene reflectance properties

Previous Work
–Forward approaches: Georghiades 99, Debevec 2000, Malzbender 01, Masselus 02, Peers 03
–Inverse approaches: Zongker 99, Chuang 00, Wexler 02
–Pre-computed light transport: Sloan 02, Ng 03

Reflectance Field
An 8D function R(u_i, v_i, θ_i, φ_i; u_r, v_r, θ_r, φ_r) relating incident rays (u_i, v_i, θ_i, φ_i) to reflected rays (u_r, v_r, θ_r, φ_r) [Debevec 2000]

Reflectance (Weighting) Function
Assuming the incident illumination originates at infinity, each output pixel is described by a weighting function W_{x,y}(θ_i, φ_i) over incident directions, where x, y are image-space coordinates.

Light Transport Model
Light flow in the scene can be modeled as a multiple-input / multiple-output linear system: B = T L, where L is the incident light (unrolled to a vector), T is the scene light transport matrix, and B is the observed image (unrolled to a vector).

Light Transport Model
Solving independently for each output pixel gives a multiple-input / single-output linear system: b_i = T_i · L, where T_i is the scene light transport vector for pixel i, L is the incident light, and b_i is the observed pixel value.
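To make the model concrete, here is a minimal NumPy sketch of the linear system above; the array sizes and the random transport matrix are illustrative assumptions, not the paper's data.

```python
import numpy as np

rng = np.random.default_rng(0)

n_light = 16 * 16        # incident illumination unrolled to a vector L (assumed resolution)
n_pixels = 32 * 32       # observed image unrolled to a vector B (assumed resolution)

T = rng.random((n_pixels, n_light))   # scene light transport matrix (unknown in practice)
L = rng.random(n_light)               # incident light vector

# Multiple-input / multiple-output: relight the whole image at once.
B = T @ L

# Multiple-input / single-output: each output pixel i depends only on row T_i of T.
i = 123
b_i = T[i] @ L
assert np.isclose(b_i, B[i])
```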

Representation
Approximate T_i as a sum of 2D rectangular kernels R_{k,i} over the incident-light (θ_i, φ_i) domain, each with weight w_{k,i}: T_i ≈ Σ_k w_{k,i} R_{k,i}.
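A small sketch of this piecewise-constant representation, assuming a 16×16 incident-light grid; the particular kernel layout and weights are made up for illustration.

```python
import numpy as np

H, W = 16, 16   # resolution of the incident-light (theta_i, phi_i) domain (assumption)

# Each kernel R_{k,i} is an axis-aligned box (r0, r1, c0, c1) with a single weight w_{k,i}.
kernels_i = [((0, 8, 0, 16), 0.2),    # top half
             ((8, 16, 0, 8), 0.7),    # bottom-left quadrant
             ((8, 16, 8, 16), 0.1)]   # bottom-right quadrant

def dense_transport(kernels, shape):
    """Reconstruct the piecewise-constant approximation of T_i on the light domain."""
    T_i = np.zeros(shape)
    for (r0, r1, c0, c1), w in kernels:
        T_i[r0:r1, c0:c1] = w          # boxes are non-overlapping, so assignment suffices
    return T_i

T_i = dense_transport(kernels_i, (H, W))
L = np.random.rand(H, W)               # some incident illumination
b_i = float(np.sum(T_i * L))           # observed pixel value under this approximation
```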

Inverse Estimation
Given input images L_j, we record observed pixel values b_{ij} = T_i · L_j. Given the matrix L of input illuminations and the vector b_i of observations, the goal is to estimate T_i:
–positions and sizes of the rectangular kernels R_{k,i}
–weights w_{k,i}

Estimating Kernel Weights
Assume the sizes and positions of the kernels R_{k,i} are known and we want to compute their weights; an efficient solution is obtained with quadratic programming.
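A minimal sketch of the weight-estimation step, assuming fixed kernels and non-negative weights; it uses SciPy's non-negative least squares as a stand-in for the quadratic-programming solver mentioned above, and the design matrix A is synthetic.

```python
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(1)
n_images, n_kernels = 50, 8

# A[j, k]: integral of input illumination j over kernel k's rectangle (synthetic here).
# b[j]: observed value of pixel i under input illumination j.
A = rng.random((n_images, n_kernels))
true_w = rng.random(n_kernels)
b = A @ true_w + 0.01 * rng.standard_normal(n_images)

# With the kernel positions and sizes fixed, the weights solve a least-squares problem;
# adding a non-negativity constraint turns it into a small quadratic program,
# which SciPy's NNLS solver handles as a special case.
w, residual_norm = nnls(A, b)
print(w)
```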

Estimating Kernel Positions & Sizes Hierarchical kd-tree subdivision of the kernels input image domain At each level choose subdivision that reduces error the most Kernels are non-overlapping

Kernel Subdivisions
[Figure: example kernel subdivisions for different transport effects — specular, refractive, subsurface scattering, glossy, hard shadow]

Spatial Correction
The kernel search strategy does not always work. Solution: for each output pixel,
–try the kernel positions and sizes of neighboring output pixels
–try shifted versions of the current kernels
–solve for new weights
–keep the new kernels if the error decreases
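A hedged sketch of this correction loop: for one pixel, candidate kernel sets (e.g. a neighbour's kernels or shifted copies of the current ones) are re-fit and kept only if the residual drops. The names refit and design, and the synthetic data, are placeholders rather than the paper's implementation.

```python
import numpy as np

def refit(A, b):
    """Re-solve the kernel weights by least squares; return (weights, squared residual)."""
    w, *_ = np.linalg.lstsq(A, b, rcond=None)
    return w, float(np.sum((A @ w - b) ** 2))

def spatial_correction(kernels_i, candidates, design, b_i):
    """Keep pixel i's kernels unless a candidate set (a neighbour's kernels or a shifted
    copy of the current ones) gives a lower residual after re-solving the weights."""
    best_k = kernels_i
    best_w, best_err = refit(design(kernels_i), b_i)
    for cand in candidates:
        w, err = refit(design(cand), b_i)
        if err < best_err:                     # keep the new kernels only if error decreases
            best_k, best_w, best_err = cand, w, err
    return best_k, best_w, best_err

# Tiny synthetic usage: a "kernel set" is just a tuple of column indices into a design matrix.
rng = np.random.default_rng(2)
full = rng.random((40, 6))
design = lambda cols: full[:, list(cols)]
b_i = full[:, [0, 2]] @ np.array([0.8, 0.3])
print(spatial_correction((0, 1), [(0, 2), (1, 3)], design, b_i))
```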

Integration with Incident Illumination
Relighting is very efficient: for each output pixel i, b_i = Σ_k w_{k,i} · (integral of the incident illumination over R_{k,i}). The incident illumination is stored as a summed-area table, so each rectangular integral is evaluated in constant time.
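A minimal sketch of relighting with a summed-area table, assuming the rectangle/weight representation from the earlier slides; the illumination and kernels here are synthetic.

```python
import numpy as np

def summed_area_table(img):
    """SAT with a zero row/column prepended so box sums need no edge-case handling."""
    sat = np.zeros((img.shape[0] + 1, img.shape[1] + 1))
    sat[1:, 1:] = np.cumsum(np.cumsum(img, axis=0), axis=1)
    return sat

def box_sum(sat, r0, r1, c0, c1):
    """Sum of img[r0:r1, c0:c1] via four table lookups (constant time per kernel)."""
    return sat[r1, c1] - sat[r0, c1] - sat[r1, c0] + sat[r0, c0]

L = np.random.rand(16, 16)           # incident illumination on a (theta_i, phi_i) grid
sat = summed_area_table(L)

# Rectangle/weight pairs for one output pixel i, as produced by the estimation stage.
kernels_i = [((0, 8, 0, 16), 0.2), ((8, 16, 0, 8), 0.7), ((8, 16, 8, 16), 0.1)]

# Relighting pixel i is a handful of constant-time rectangle integrals.
b_i = sum(w * box_sum(sat, *rect) for rect, w in kernels_i)
assert np.isclose(b_i, sum(w * L[r0:r1, c0:c1].sum() for (r0, r1, c0, c1), w in kernels_i))
```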

Data Acquisition
We have built two acquisition systems:
–indoor scenes / small objects
–outdoor scenes (city)

Acquisition System I

Example Input Images

Results: refractive and specular elements (prediction vs. actual)

Results – New Illumination

Results – white vertical bar (prediction vs. actual)

Results: diffuse elements and shadows (estimate vs. actual)

Results - White Vertical Bar

Results: subsurface scattering (estimate vs. actual)

Results - White Vertical Bar

Results: glossy elements and interreflections (estimate vs. actual)

Results - White Vertical Bar

Results: one shifted version of the same image used as the input illumination

Acquisition System II: two synchronized cameras (Camera #1 and Camera #2)

Example Observed Images

Results – relighting the city (white vertical bar illumination)

Lessons
–Inverse approaches benefit from good kernel search strategies and more computation power
–Inverse approaches are more efficient than forward approaches
Challenges:
–the scene needs to be static
–a varied set of input illuminations is required
–the real illumination is not at infinity

Conclusions
Advantages of our algorithm:
–natural illumination input
–all-frequency robustness
–compact representation
–progressive refinement
–fast evaluation
–simplicity

Future Work
–New acquisition systems: object and camera are fixed with respect to each other and rotate together in a single, natural environment
–Combining representations from different viewpoints with proxy geometry
–Coarse-to-fine estimation in the observed image space: start with low-resolution observed images, search exhaustively for the best kernels, then propagate the kernels to higher-resolution images

Acknowledgements
Jan Kautz, Barb Cutler, Jennifer Roderick Pfister, EGSR Reviewers