Relighting Framework
Sebastian Enrique, Columbia University
COMS 6160 – Real-Time High Quality Rendering
Nov 3rd, 2004

About Relighting
Why is it useful? Photorealistic real-time rendering of complex scenes with complex illumination is an open problem. One image-based rendering (IBR) approach is to capture or render a set of original images of a scene and then relight it in real time, producing the same scene under novel illumination!
[Diagram: images + lights → relighted scene]
What is relighting? "Given a set of differently illuminated images of a scene, relighting is the process of producing new images of the same scene under new lighting conditions by composing the original data in some way."

About Relighting (cont.)
What is one of the most challenging parts? A large number of images is needed to produce good results, so the challenge is to find and use a good compression technique in order to relight fast while using as little memory as possible.
[Diagram: images → compressed images]

Some Previous Work
Dorsey et al. used a progressive radiosity method, pre-rendering synthetic scenes and superimposing single-light-source images to simulate lighting conditions on opera stages.
Many others (Hallinan '94, Epstein '95, etc.) have pre-acquired real images while changing the lighting direction.
Debevec compressed the pre-acquired images as JPEGs and processed them in the compressed domain.
Sloan used low-frequency spherical harmonics on geometry; in 2003, Sloan used clustered PCA (VQPCA) on the spherical harmonic coefficients.
Ng compressed the data using wavelets.
Sloan (Local Deformable PRT) and Ramamoorthi (Triple Product Wavelets) are the most recent related techniques.

LSD Relighting Approach
The Lighting Sensitive Display (LSD) was developed by Shree Nayar, Peter Belhumeur, and Terry Boult. Basically, it is a monitor with an attached camera that captures the lighting conditions of the environment. The monitor shows a scene, which changes (is relighted) as the illumination in the environment changes. They adopted an image-based approach, using a large set of images.

LSD Relighting Approach (cont.)
They developed a novel algorithm (using two stages of Principal Component Analysis, or PCA) that compresses that large set of images and allows relighting in real time under complex lighting conditions. They reached compression ratios of 476:1 for color images. The algorithm simultaneously exploits correlations over the lighting domain and coherence over the spatial domain of the image.

LSD Relighting Approach (cont.)
We based our Relighting Framework on the LSD approach. In the following slides we will explain:
How the input images for the LSD algorithm should be taken.
The first compression stage of the algorithm.
The second compression stage of the algorithm.
Real-time rendering.
Then we will move on to:
Extending the LSD approach with the use of cube maps.
Problems found.
Future directions.
Discussion and videos.

Setting Up Images
[Diagram: fixed viewpoint; scene to render/capture; grid of positions for distant light sources on the front face of an imaginary cube; one distant light source per image]
To take the images, a grid of light source positions is generated on the front face of an imaginary cube (a plane parallel to the image plane). For each position on the grid, place the distant light with maximum intensity and generate (render or capture) an image.
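
Below is a minimal sketch (not from the slides) of how such a grid of distant-light directions might be generated, assuming the cube spans [-1, 1]^3 around the viewpoint, the front face sits at z = +1, and the grid resolution (here 64 per side) is a free parameter:

```python
import numpy as np

def front_face_light_directions(grid_res=64):
    """Unit directions toward distant lights placed at the cell centers of a
    grid_res x grid_res grid on the front face (z = +1) of a [-1, 1]^3 cube."""
    # Cell centers along one axis: grid_res values strictly inside (-1, 1).
    coords = (np.arange(grid_res) + 0.5) / grid_res * 2.0 - 1.0
    u, v = np.meshgrid(coords, coords)                    # (g, g) face coordinates
    dirs = np.stack([u, v, np.ones_like(u)], axis=-1)     # points on the z = +1 face
    dirs /= np.linalg.norm(dirs, axis=-1, keepdims=True)  # normalize to unit length
    return dirs.reshape(-1, 3)                            # (n, 3), n = grid_res**2

# One image is then rendered or captured per direction, with the light at maximum intensity.
light_dirs = front_face_light_directions(64)              # n = 4096, as on the later slides
```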

First Stage SVD
The first SVD (Singular Value Decomposition) exploits the fact that locally (within small regions of the images) the variation due to changes in illumination can be approximated by a small number of bases. The image is divided into m square blocks, each containing p pixels, and bases are computed for each block. Each image I_i is an image of the scene illuminated by a single distant point light source.
[Diagram: each image I_i divided into m square blocks of p pixels each]
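
As an illustration of this block layout, here is a sketch (single-channel images whose dimensions are assumed to be multiples of the block size; color images would be handled per channel):

```python
import numpy as np

def build_block_matrices(images, block=16):
    """From n images of the same scene, build one p x n matrix per block:
    column i of I_blocks[j] holds the p = block*block pixels of block j in image i."""
    n = len(images)
    H, W = images[0].shape
    rows, cols = H // block, W // block                  # m = rows * cols blocks
    p = block * block
    I_blocks = [np.empty((p, n)) for _ in range(rows * cols)]
    for i, img in enumerate(images):
        for r in range(rows):
            for c in range(cols):
                j = r * cols + c
                patch = img[r*block:(r+1)*block, c*block:(c+1)*block]
                I_blocks[j][:, i] = patch.ravel()        # p pixels as one column
    return I_blocks                                      # list of m matrices, each p x n
```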

First Stage SVD (cont.)
Let I_{i,j} denote the j-th block of the i-th image. For each block of the scene, a low-dimensional approximation is computed as follows:
I_j is a p x n matrix representing a collection of image blocks: its i-th column is formed by the p pixels of the j-th block of image I_i. Each I_j therefore gathers the pixels of the same block across all n images.
Using SVD, a rank-b approximation to I_j is found as I_j ≈ E_j S_j C_j^T, where:
E_j is a p x b column-orthogonal matrix, called the block bases.
S_j is a b x b diagonal matrix.
C_j is an n x b column-orthogonal matrix.
The singular values from S_j are absorbed into C_j^T, giving a b x n matrix L_j, called the block lighting coefficients.

First Stage SVD (cont.)
SVD: I_j = E_j S_j C_j^T, with E_j of size p x p, S_j of size p x n (diagonal matrix of singular values), and C_j^T of size n x n.
Applying rank b: I_j ≈ E_j S_j C_j^T, with E_j of size p x b, S_j of size b x b, and C_j^T of size b x n.
Absorbing the S_j values into C_j^T: I_j ≈ E_j L_j, with E_j of size p x b and L_j of size b x n.
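
A sketch of this per-block step with NumPy (the rank b is assumed small relative to p and n):

```python
import numpy as np

def block_rank_b(I_j, b=10):
    """Rank-b factorization of one p x n block matrix: I_j ~= E_j @ L_j,
    with E_j (p x b) the block bases and L_j (b x n) the block lighting coefficients."""
    E, S, Ct = np.linalg.svd(I_j, full_matrices=False)   # I_j = E @ diag(S) @ Ct
    E_j = E[:, :b]                                       # keep the first b left singular vectors
    L_j = S[:b, None] * Ct[:b, :]                        # absorb the singular values into C^T
    return E_j, L_j
```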

First Stage SVD (cont.)
Then, stacking all m of the L_j block matrices, we get the lighting coefficient matrix L, of size (m·b) x n. The m block bases E_j are likewise stacked into the matrix E, of size (m·p) x b. The collection of submatrices within E and L contains all the information needed to approximate the collection of images corresponding to the n lighting directions.
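
The stacking might look like this, reusing the hypothetical block_rank_b helper sketched above and the list of block matrices from build_block_matrices:

```python
import numpy as np

def first_stage(I_blocks, b=10):
    """Stack the per-block factors: E is (m*p) x b, L is (m*b) x n."""
    factors = [block_rank_b(I_j, b) for I_j in I_blocks]
    E = np.vstack([E_j for E_j, _ in factors])   # stacked block bases
    L = np.vstack([L_j for _, L_j in factors])   # stacked block lighting coefficients
    return E, L
```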

First Stage SVD (cont.)
Example parameters: image 640x480; n = 64x64 = 4096; m = 40x30 = 1200; p = 16x16 = 256; b = 10.

Second Stage SVD
The fact that there is much coherence among the image blocks is exploited with a second SVD, applied to the lighting coefficient matrix L:
L ≈ U V, with L of size (m·b) x n, U of size (m·b) x q, and V of size q x n.
U is an (m·b) x q column-orthogonal matrix, called the lighting coefficient bases.
V is a q x n matrix, called the compressed coefficient matrix.
The rank q denotes the number of linear bases kept to approximate L.
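
A corresponding sketch for the second stage (assuming, by analogy with the first stage, that the singular values are absorbed into V):

```python
import numpy as np

def second_stage(L, q=200):
    """Rank-q factorization of the (m*b) x n lighting coefficient matrix:
    L ~= U @ V, with U ((m*b) x q) the lighting coefficient bases and
    V (q x n) the compressed coefficient matrix."""
    Uf, S, Vt = np.linalg.svd(L, full_matrices=False)
    U = Uf[:, :q]                        # column-orthogonal lighting coefficient bases
    V = S[:q, None] * Vt[:q, :]          # singular values folded into V
    return U, V
```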

Second Stage SVD (cont.)
Example parameters: image 640x480; n = 64x64 = 4096; m = 40x30 = 1200; p = 16x16 = 256; b = 10; q = 200.

Examples
Bad rank selection: relighted image with first-stage SVD rank 3 vs. original image (showing only the red color component). Note the block discontinuities!

Examples (cont.)
Higher ranks: relighted image with first-stage SVD rank 10 and second-stage SVD rank 20 vs. original image (in fact, two combined original images). Only small differences in brightness are noticeable, and there are no visible block discontinuities.

Real-Time Rendering
The real-time illumination is represented by the illumination field vector s. Each element s_i corresponds to a point light, and its value is the intensity with which that point light contributes to the scene. To render the relighted scene in real time, the preprocessed matrices E, U, and V are used:
1. Compute the compressed coefficient vector as the product v = V s (q elements).
2. Compute the lighting coefficient vector l = U v (m·b elements).
3. To render block j, multiply the subvector l_j of l (b elements) by the corresponding stacked block bases E_j, giving the p pixels of that block. This must be done for all m blocks every frame.
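
A sketch of this per-frame reconstruction, assuming a single color channel and the stacking order used in the earlier sketches:

```python
import numpy as np

def relight_frame(E, U, V, s, m, p, b):
    """Reconstruct one relit frame from the precomputed matrices and the
    illumination field vector s (one intensity per point light)."""
    v = V @ s                            # compressed coefficient vector, q elements
    l = U @ v                            # lighting coefficient vector, m*b elements
    block_pixels = []
    for j in range(m):
        E_j = E[j*p:(j+1)*p, :]          # p x b block bases of block j
        l_j = l[j*b:(j+1)*b]             # b lighting coefficients of block j
        block_pixels.append(E_j @ l_j)   # p relit pixels of block j
    return block_pixels                  # reassembled into the image using the block layout
```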

Real-Time Rendering (cont.)

Demo: Point Lights / David
Parameters: image 480x640; n = 6x16x16 = 1536; m = 32x32 = 1024; p = 16x16 = 256; b = 10; q = 20.
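
As a rough back-of-the-envelope check (not from the slides), counting raw matrix entries per color channel with these parameters, storage drops from n·(m·p) original pixel values to (m·p)·b + (m·b)·q + q·n entries for E, U, and V:

$$
\frac{n\,(m\,p)}{(m\,p)\,b + (m\,b)\,q + q\,n}
= \frac{1536 \cdot 262144}{2621440 + 204800 + 30720}
\approx 141:1 .
$$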

Extending LSD: Cube Lighting
Instead of using only the front face of the cube to capture images under different lighting conditions, use all 6 faces. The parameters that change are the number of input images and the length of the s light field vector (both now 6·n). In this way, the scene can be relighted with illumination coming from all around.
[Diagram: fixed viewpoint; grid of positions for distant light sources over the full cube; one distant light source per image]

High-Dynamic Range Cubic Environment Map
Having the relighting scheme for the full cube, we can add cubic environment maps to relight the scene. For more accuracy, High-Dynamic Range (HDR) cubic environment maps are used (floating-point values instead of clamped low-dynamic-range ones). The cube map can be rotated, and each element of the s light field vector is given by the value of the corresponding texel, weighted by its solid angle:

s_i = HDR(i) · (2/m)² · (N · R) / |R|³

where HDR(i) is the corresponding texel value; N is the corresponding cube face normal; R is the vector from the origin to the center of element i in the cube grid; m is the cube face grid resolution (its square is the number of images per face); and each cube face has a size of 2 x 2.
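
A sketch of building the light vector from an HDR cube map under these conventions (single channel; the map is assumed to already be rotated into the light-grid frame, and the face/cell ordering must match the ordering of the input images; RGB would be handled per channel):

```python
import numpy as np

def light_vector_from_hdr_cubemap(hdr_faces, face_normals, face_grids):
    """Illumination field vector s from an HDR cube map.
    hdr_faces[f]   : (m, m) HDR texel values of face f (one channel)
    face_normals[f]: unit outward normal N of face f
    face_grids[f]  : (m, m, 3) centers R of the grid cells of face f,
                     on a cube of side 2 centered at the viewpoint."""
    parts = []
    for hdr, N, R in zip(hdr_faces, face_normals, face_grids):
        m = hdr.shape[0]
        area = (2.0 / m) ** 2                   # area of one grid cell on the face
        dist = np.linalg.norm(R, axis=-1)       # |R|
        cos_theta = (R @ np.asarray(N)) / dist  # N . R / |R|
        omega = area * cos_theta / dist**2      # solid angle subtended by the cell
        parts.append((hdr * omega).ravel())     # HDR(i) weighted by its solid angle
    return np.concatenate(parts)                # length 6*m*m vector s
```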

Demo: CubeMap & Point Lights / Nicole
Parameters: image 512x512; n = 6x16x16 = 1536; m = 32x32 = 1024; p = 16x16 = 256; b = 10; q = 20.

Framework Summary
Preprocess (SVD stages 1 & 2): for each scene to relight, take its set of images and generate the scene matrices E, U, and V.
Real-time relighting framework: relight the given scene matrices in real time with an HDR cube map and point lights, producing the relighted scene.

Problems Found
The preprocess takes TOO MUCH TIME.
The preprocess uses TOO MUCH MEMORY.
Little could be done on current graphics hardware (the HDR cubemap processing especially).

Future Directions
Optimize the preprocessing stages (distributed computation?).
Use error metrics to automatically select adequate ranks.
Extend the user interface to allow relighting with lower ranks than those given in the input matrices.
Allow viewpoint changes.

End of Talk
Ready to hear comments, suggestions, discussion, and questions.
More videos to show while chatting…