Recent Methods for Image-based Modeling and Rendering


Recent Methods for Image-based Modeling and Rendering. IEEE VR 2003 tutorial. Darius Burschka, Johns Hopkins University; Dana Cobzas, University of Alberta; Zach Dodds, Harvey Mudd College; Greg Hager, Johns Hopkins University; Martin Jagersand, University of Alberta; Keith Yerex, Virtual Universe Corporation.

IEEE Virtual Reality 2003. Next Lectures: Single view geometry and camera calibration. Multiple view projective, affine and Euclidean geometry. Scene and object modeling from images. Differential image variability and dynamic textures. Real-time visual tracking and video processing. Hardware-accelerated image-based rendering. Software system and hands-on lab.

Image-based Modeling and Rendering. IBR/IBM is a label for a wide range of techniques. It is promising for various reasons, e.g.: cameras are cheap and common, while 3D laser range sensors are expensive and manual modeling is time consuming; achieving photo-realism is easier if we start with real photos; graphics rendering can be sped up by warping and blending whole images instead of building them from components in each frame. Common trait: images serve an important role, partially or wholly replacing geometry and modeling.

Image-based Models from consumer cameras. Rendering of models obtained using a $100 webcam and a home PC (Cobzas, Yerex & Jagersand 2002). We'll learn how to do this in the lab this afternoon…

Photo-Realism from images: 1. Geometry + images (Debevec et al., Façade). 2. Set of all light rays: the plenoptic function. Capture, then render new views.

Rendering speed-up Post-warping images Blending a light basis

Modeling: Two Complementary Approaches. Conventional graphics: geometry and physics are fed to computer algorithms that produce synthetic images. Image-based modeling and rendering: real images are fed to computer algorithms, which recover geometry and physics and produce synthetic images.

Confluence of Computer Graphics and Vision. Traditional computer graphics (image synthesis, forward modeling): creating artificial images and videos from scratch. Computer vision & image processing (image analysis & transformation, inverse modeling): analyzing photographs & videos of the real world. Both fields rely on the same physical & mathematical principles and a common set of representations; they mainly differ in how these representations are built.

Object & Environment Modeling. Basic techniques from the conventional (hand) modeling perspective: Declarative: write it down (e.g. a typical graphics course). Interactive: sculpt it (Maya, Blender, …). Programmatic: let it grow (L-systems for plants, fish motion control). Basic techniques from the image-based perspective: Collect many pictures of a real object/environment and rely on image analysis to unfold the picture formation process (principled). Collect one or more pictures of a real object/environment and manipulate them to achieve the desired effect (heuristic).

Rendering. Traditional rendering: 1. Input: 3D description of the scene & camera. 2. Solve light transport through the environment. 3. Project to the camera's viewpoint. 4. Perform ray tracing. Image-based rendering: 1. Collect one or more images of a real scene. 2. Warp, morph, or interpolate between these images to obtain new views.

Important Issues in Image-Based Modeling and Rendering. What are the theoretical limits on the information obtained from one or multiple images? (Geometry) How can we stably and reliably compute properties of the real world from image data? (Computer Vision) How can we efficiently represent image-based objects and merge multiple objects into new scenes? (CG) How can we efficiently render new views and animate motion in scenes? (IBR)

Information obtained from images. Viewing geometry describes global properties of the scene structure and camera motion: traditionally Euclidean geometry; the past decade has seen a surge in applying non-Euclidean (projective, affine) geometry to describe camera imaging. Differential properties of the intensity image give clues to local shape and motion: shape from shading, texture, and small motion.

Viewing Geometry and Camera Models. Each viewing geometry corresponds to a visually equivalent camera model and a group of shape-invariant transforms:
Euclidean (calibrated camera): $C_{sim} = \{\, g \in GL(4) \mid g = \begin{pmatrix} R & t \\ 0 & 1 \end{pmatrix},\ R \in SO(3) \,\}$
Affine ("infinite" camera): $C_{aff} = \{\, g \in GL(4) \mid g = \begin{pmatrix} A & t \\ 0 & 1 \end{pmatrix},\ A \in GL(3) \,\}$
Projective (uncalibrated camera): $C_{proj} = \{\, g \mid g \in GL(4) \,\}$. Possibly ambiguous shape!
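A minimal numpy sketch of these three groups, assuming an axis-angle rotation parameterization; the helper names are illustrative, not from the tutorial:

```python
import numpy as np

def euclidean_transform(axis_angle, t):
    """Element of C_sim: a rotation R in SO(3) plus a translation t (rigid motion)."""
    theta = np.linalg.norm(axis_angle)
    k = axis_angle / theta if theta > 0 else np.zeros(3)
    K = np.array([[0, -k[2], k[1]], [k[2], 0, -k[0]], [-k[1], k[0], 0]])
    R = np.eye(3) + np.sin(theta) * K + (1 - np.cos(theta)) * (K @ K)  # Rodrigues' formula
    g = np.eye(4)
    g[:3, :3], g[:3, 3] = R, t
    return g

def affine_transform(A, t):
    """Element of C_aff: any invertible 3x3 block A plus a translation (last row 0 0 0 1)."""
    g = np.eye(4)
    g[:3, :3], g[:3, 3] = A, t
    return g

def projective_transform(G):
    """Element of C_proj: any invertible 4x4 matrix (defined only up to scale)."""
    assert abs(np.linalg.det(G)) > 1e-12
    return G

# Rigid motions preserve distances; affine and projective maps in general do not,
# which is why shape from an uncalibrated camera carries a larger ambiguity.
g = euclidean_transform(np.array([0.0, 0.0, np.pi / 4]), np.array([1.0, 0.0, 0.0]))
X, Y = np.array([0.0, 1.0, 0.0, 1.0]), np.array([0.0, 2.0, 0.0, 1.0])
print(np.linalg.norm((g @ X - g @ Y)[:3]))  # 1.0, length preserved
```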

Intensity-based Information. We get information only where there is an intensity difference (Baker et al. 2003); hence there are often local ambiguities.

Photo-Consistent Hull In cases of structural ambiguity it is possible to define a photo-consistent shape – “visual hull” (Kutulakos and Seitz 2001)

Two main representations in Image-Based Modeling: (1) a ray set, the plenoptic function, representing "the intensity of light rays passing through the camera center at every location (X, Y, Z), at every possible viewing angle (θ, φ)" (5D); (2) geometry and texture.

Image Mosaics. When images sample a planar surface or are taken from the same point of view, corresponding pixels m = [u, v]^T and m' = [u', v']^T are related by a linear projective transformation (a homography), so the images can be mosaicked into a larger image, a sample of the plenoptic function.
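A minimal sketch of mosaicking by inverse homography warping, assuming the 3x3 homography H mapping source pixels to mosaic pixels is already known (e.g. estimated from point correspondences) and the image is a single-channel array; the function name is illustrative:

```python
import numpy as np

def warp_into_mosaic(src, H, out_shape):
    """Inverse-warp src into a mosaic canvas using a 3x3 homography H that maps
    source pixels [u, v, 1]^T to mosaic pixels [u', v', 1]^T (up to scale)."""
    Hinv = np.linalg.inv(H)
    h_out, w_out = out_shape
    vs, us = np.mgrid[0:h_out, 0:w_out]                       # mosaic pixel grid
    pts = np.stack([us.ravel(), vs.ravel(), np.ones(h_out * w_out)])
    src_pts = Hinv @ pts                                      # pull back to the source image
    su = np.round(src_pts[0] / src_pts[2]).astype(int)        # dehomogenize
    sv = np.round(src_pts[1] / src_pts[2]).astype(int)
    mosaic = np.zeros(out_shape, dtype=src.dtype)
    valid = (su >= 0) & (su < src.shape[1]) & (sv >= 0) & (sv < src.shape[0])
    mosaic.ravel()[valid] = src[sv[valid], su[valid]]         # nearest-neighbour sampling
    return mosaic
```

In practice, bilinear sampling and blending of overlapping images would replace the nearest-neighbour lookup used here.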

Cylindrical Panorama Mosaics. QuickTime VR: warp from a cylindrical panorama to create a new planar view (from the same viewpoint).
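A sketch of the cylinder-to-plane warp behind such viewers, assuming the panorama covers a full 360 degrees horizontally and its rows span the cylindrical height coordinate y / sqrt(x^2 + z^2) in [-1, 1]; parameter names are illustrative:

```python
import numpy as np

def planar_view_from_cylinder(pano, f_view, theta0, out_shape):
    """Render a planar perspective view (heading theta0, focal length f_view in pixels)
    from a cylindrical panorama captured at the same viewpoint."""
    h_out, w_out = out_shape
    h_pan, w_pan = pano.shape[:2]
    ys, xs = np.mgrid[0:h_out, 0:w_out]
    x = xs - w_out / 2.0                              # planar coords, principal point at center
    y = ys - h_out / 2.0
    theta = theta0 + np.arctan2(x, f_view)            # azimuth of each viewing ray
    height = y / np.sqrt(x ** 2 + f_view ** 2)        # cylindrical height coordinate
    col = (((theta % (2 * np.pi)) / (2 * np.pi)) * w_pan).astype(int) % w_pan
    row = np.clip(((height + 1.0) / 2.0 * h_pan).astype(int), 0, h_pan - 1)
    return pano[row, col]                             # nearest-neighbour lookup
```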

Image and View Morphing. Generate intermediate views by image, view, or flow-field interpolation. Can produce geometrically incorrect images.
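A minimal sketch of flow-field morphing, assuming a dense flow field from the first image to the second is given; both images are forward-splatted toward the intermediate view and cross-dissolved, which also illustrates why the result can be geometrically incorrect (and may contain holes):

```python
import numpy as np

def flow_morph(img0, img1, flow01, alpha):
    """Interpolate a view at fraction alpha between img0 and img1.
    flow01[v, u] = (du, dv) maps a pixel of img0 to its correspondence in img1."""
    h, w = img0.shape[:2]
    vs, us = np.mgrid[0:h, 0:w]

    def forward_splat(img, du, dv):
        out = np.zeros_like(img)
        u2 = np.clip(np.round(us + du).astype(int), 0, w - 1)
        v2 = np.clip(np.round(vs + dv).astype(int), 0, h - 1)
        out[v2, u2] = img              # nearest-neighbour splat; holes stay black
        return out

    warped0 = forward_splat(img0, alpha * flow01[..., 0], alpha * flow01[..., 1])
    warped1 = forward_splat(img1, -(1 - alpha) * flow01[..., 0], -(1 - alpha) * flow01[..., 1])
    return (1 - alpha) * warped0 + alpha * warped1     # cross-dissolve
```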

Image and View Morphing: Examples. Beier & Neely, "Feature-Based Image Metamorphosis": an image processing technique used as an animation tool for metamorphosis from one image to another. Correspondence between source and destination is specified using a set of line segment pairs.

View Morphing along a line Generate new views that represent a physically-correct transition between two reference images. (Seitz & Dyer)

Light Field Rendering. Sample a 4D plenoptic function if the scene can be constrained to a bounding box; approximate the resampling process by interpolating the 4D function from the nearest samples (Levoy & Hanrahan).
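A sketch of that resampling step, assuming the light field is stored as a dense 4D array indexed by camera-plane coordinates (s, t) and focal-plane coordinates (u, v); a query ray is reconstructed by quadrilinear interpolation from its 16 nearest stored samples. Names and the toy data are illustrative:

```python
import numpy as np
from itertools import product

def sample_light_field(lf, s, t, u, v):
    """Quadrilinear interpolation of a two-plane light field lf of shape (S, T, U, V)."""
    coords = np.array([s, t, u, v])
    lo = np.floor(coords).astype(int)
    lo = np.clip(lo, 0, np.array(lf.shape) - 2)       # clamp to the sample grid
    frac = coords - lo
    value = 0.0
    for corner in product((0, 1), repeat=4):          # 16 surrounding samples
        weight = np.prod([f if c else 1 - f for c, f in zip(corner, frac)])
        value += weight * lf[tuple(lo + np.array(corner))]
    return value

# Toy example: an 8x8 grid of 32x32 images; query one ray between the samples.
lf = np.random.rand(8, 8, 32, 32)
print(sample_light_field(lf, 3.4, 2.7, 10.2, 15.8))
```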

The Lumigraph (Gortler et al.; Microsoft). The Lumigraph is reconstructed by a linear sum of the products of a basis function and the value at each grid point (u, v, s, t). Pipeline: acquisition stage, volumetric model, novel view.

Concentric Mosaics H-Y Shum, L-W He; Microsoft Sample a 3D plenoptic function when camera motion is restricted to planar concentric circles.

Pixel Reprojection Using Scene Geometry. Geometric constraints between images and renderings: depth/disparity, the epipolar constraint, the trilinear tensor. Laveau and Faugeras: use a collection of images (reference views) and the disparities between images to compute a novel view using a ray-tracing process.
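A minimal sketch of depth-based pixel reprojection, assuming known intrinsics K (shared by both views) and the relative rigid motion from the reference camera to the novel camera; overlaps would then be resolved by keeping the nearest reprojected depth at each target pixel (a z-buffer):

```python
import numpy as np

def reproject(depth, K, R_rel, t_rel):
    """Map every pixel of a reference view, given its depth, to coordinates in a new view."""
    h, w = depth.shape
    vs, us = np.mgrid[0:h, 0:w]
    pix = np.stack([us.ravel(), vs.ravel(), np.ones(h * w)])   # homogeneous pixels
    rays = np.linalg.inv(K) @ pix                              # back-project to viewing rays
    X = rays * depth.ravel()                                   # 3D points in the reference frame
    X_new = R_rel @ X + t_rel[:, None]                         # move into the new camera frame
    p = K @ X_new                                              # project into the new view
    u_new, v_new = p[0] / p[2], p[1] / p[2]
    return u_new.reshape(h, w), v_new.reshape(h, w), X_new[2].reshape(h, w)
```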

Plenoptic Modeling McMillan and Bishop: Plenoptic modeling (5D plenoptic function): compute new views from cylindrical panoramic images.

Virtualized Reality (T. Kanade, CMU). 49 cameras for images and six uniformly spaced microphones for sound. 3D reconstruction uses a volumetric method called Shape from Silhouette.

Layered Depth Images (Shade et al.). An LDI is a view of the scene from a single input camera view, but with multiple pixels along each line of sight.
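A sketch of the LDI data structure, assuming a simple per-pixel list of depth-sorted samples; class and field names are illustrative:

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class LDISample:
    depth: float                 # distance along the line of sight
    color: Tuple[int, int, int]  # (r, g, b)

class LayeredDepthImage:
    """One input camera view, but each pixel keeps *all* surfaces pierced by its
    line of sight, sorted front to back, rather than only the nearest one."""

    def __init__(self, width: int, height: int):
        self.width, self.height = width, height
        self.pixels: List[List[List[LDISample]]] = [
            [[] for _ in range(width)] for _ in range(height)
        ]

    def insert(self, u: int, v: int, depth: float, color: Tuple[int, int, int]):
        layers = self.pixels[v][u]
        layers.append(LDISample(depth, color))
        layers.sort(key=lambda s: s.depth)   # keep front-to-back order

# Two surfaces along the same ray are both retained, so a warped novel view can
# reveal the back surface where the front one disoccludes.
ldi = LayeredDepthImage(4, 4)
ldi.insert(1, 2, depth=3.0, color=(255, 0, 0))
ldi.insert(1, 2, depth=7.5, color=(0, 255, 0))
print([s.depth for s in ldi.pixels[2][1]])   # [3.0, 7.5]
```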

Relief Textures (Oliveira 2000). Augment a 2D texture map with a depth map. Render new views by a 3D pre-warp onto the polygon followed by a perspective warp into the camera.

Image-based Objects. Envelop the object with coarse geometry and map textures + depth through a 3D warp from each polygon (3D texturing, as opposed to plain 2D texturing).

Rendering Architecture from Photographs. Combines both image-based and geometry-based techniques: "Façade" (Debevec et al.).

Structure from motion. A structure-from-motion algorithm takes tracked features and estimates camera poses and scene structure. The estimated geometry is at best an approximation of the true geometry.
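A sketch of one standard recipe for this step, a rank-3 factorization of the measurement matrix in the style of Tomasi and Kanade, which assumes an affine camera; the array layout is illustrative:

```python
import numpy as np

def affine_structure_from_motion(tracks):
    """tracks: (F, N, 2) array of (u, v) positions of N features tracked over F frames.
    Returns per-frame affine camera (pose) matrices and 3D structure, up to an affine ambiguity."""
    F, N, _ = tracks.shape
    W = np.concatenate([tracks[..., 0], tracks[..., 1]], axis=0)   # 2F x N measurement matrix
    W = W - W.mean(axis=1, keepdims=True)                          # register each row to its centroid
    U, S, Vt = np.linalg.svd(W, full_matrices=False)
    M = U[:, :3] * np.sqrt(S[:3])            # 2F x 3 motion (camera) matrix
    P = np.sqrt(S[:3])[:, None] * Vt[:3]     # 3 x N structure
    return M, P
```

The recovered geometry is defined only up to an ambiguity of the viewing geometry, and tracking noise degrades it further.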

Geometric re-projection errors. Texturing: static vs. dynamic.

Spatial Basis Intro (Jagersand 1997). A moving sine wave can be modeled, under small image motion, by a spatially fixed basis, e.g. 2 basis vectors, or 6 basis vectors for more motion freedom.
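A small numerical check of the sine-wave case: a translated sine image lies exactly in the span of two spatially fixed basis images (a sine and a cosine), so two coefficients suffice for any shift; the data here are illustrative:

```python
import numpy as np

x = np.linspace(0, 2 * np.pi, 200, endpoint=False)
basis = np.stack([np.sin(x), np.cos(x)], axis=1)      # two spatially fixed basis vectors

for shift in (0.1, 0.5, 1.2):
    moved = np.sin(x + shift)                          # the "moving" image
    coeffs, *_ = np.linalg.lstsq(basis, moved, rcond=None)
    residual = np.linalg.norm(basis @ coeffs - moved)
    print(f"shift={shift:.1f}  coeffs={coeffs.round(3)}  residual={residual:.1e}")
```

For general image patches the same idea holds only approximately and only for small motions, and more basis vectors are needed as the allowed motion gains degrees of freedom.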

Example: Light variation

Geometric SFM and dynamic textures (Cobzas, Yerex & Jagersand 2002). Training: from images I_1 … I_t, estimate structure P, motion parameters (R_1, a_1, b_1) … (R_t, a_t, b_t), a texture basis, and texture coefficients y_1 … y_t. Model: a new view is rendered from a new pose (R, a, b) by combining the texture basis with texture coefficients into a warped texture.
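A minimal sketch of the texture half of this pipeline, assuming the training textures have already been warped to a reference shape and flattened into vectors; it fits a PCA-style texture basis and synthesizes a new texture from coefficients, which would then be warped onto the coarse geometry at the new pose. Function names and the toy data are illustrative:

```python
import numpy as np

def fit_texture_basis(textures, k):
    """textures: (T, P) array, one flattened training texture per frame.
    Returns the mean texture and k principal basis vectors."""
    mean = textures.mean(axis=0)
    U, S, Vt = np.linalg.svd(textures - mean, full_matrices=False)
    return mean, Vt[:k]                      # shapes (P,) and (k, P)

def synthesize_texture(mean, basis, y):
    """New texture = mean + linear combination of the basis, weighted by coefficients y."""
    return mean + y @ basis

# Toy example: 50 training textures of 32x32 pixels, a 6-vector texture basis.
textures = np.random.rand(50, 1024)
mean, basis = fit_texture_basis(textures, k=6)
new_texture = synthesize_texture(mean, basis, 0.1 * np.random.randn(6)).reshape(32, 32)
```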

Geometric SFM and dynamic textures: Example Renderings. Rendering of models obtained using a $100 webcam and a home PC (Cobzas, Yerex & Jagersand 2002). We'll learn how to do this in the lab this afternoon…

Summary: IBMR techniques, their input data, how renderings are produced, and their pros (+) and cons (-).
Image and view morphing (interpolation). Input: 2 images. Rendering: interpolate the reference images. + Easy to generate images. - Non-realistic.
Interpolation from dense samples (4D plenoptic function of a constrained scene). Input: samples of the plenoptic function. Rendering: interpolate the 4D function. + Easy to generate renderings. - Needs exact camera calibration; mostly synthetic scenes; large amount of data.
Geometrically valid pixel reprojection (use geometric constraints). Input: 2, 3, or more images of the same scene. Rendering: pixel reprojection. + Low amount of data; geometrically correct renderings. - Requires depth/disparity.
Geometric SFM + dynamic texture (obtain coarse geometry from images). Input: many (100+) images of the same scene. Rendering: geometric projection and texture mapping. + Integrates with standard computer graphics scenes. - Large amount of data.

IEEE Virtual Reality 2003. Next Lectures: Single view geometry and camera calibration. Multiple view projective, affine and Euclidean geometry. Scene and object modeling from images. Differential image variability and dynamic textures. Real-time visual tracking and video processing. Hardware-accelerated image-based rendering. Software system and hands-on lab.