Parameterized Environment Maps Ziyad Hakura, Stanford University John Snyder, Microsoft Research Jed Lengyel, Microsoft Research.




1 Parameterized Environment Maps Ziyad Hakura, Stanford University John Snyder, Microsoft Research Jed Lengyel, Microsoft Research

2 Static Environment Maps (EMs)
Generated using standard techniques:
Photograph a physical sphere in an environment
Render six faces of a cube from object center
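Rendering six cube faces implies a standard lookup at run time: reflect the view vector about the surface normal, then index one of the six faces by the reflection's dominant axis. A minimal sketch of that convention; face ordering and (s, t) orientation vary by API, so this is illustrative rather than the paper's code:

```python
import numpy as np

def reflect(view, normal):
    """Mirror the view vector about the surface normal: r = v - 2(v.n)n."""
    view = view / np.linalg.norm(view)
    normal = normal / np.linalg.norm(normal)
    return view - 2.0 * np.dot(view, normal) * normal

def cube_face_lookup(r):
    """Map a reflection direction to a cube face index and (s, t) in [0, 1]."""
    ax = np.abs(r)
    major = int(np.argmax(ax))       # 0 = x, 1 = y, 2 = z
    m = ax[major]
    sign = 0 if r[major] >= 0 else 1
    face = 2 * major + sign          # +x, -x, +y, -y, +z, -z
    # The two remaining axes become the in-face coordinates.
    others = [i for i in range(3) if i != major]
    s = 0.5 * (r[others[0]] / m + 1.0)
    t = 0.5 * (r[others[1]] / m + 1.0)
    return face, s, t

# A ray toward the surface (0, 0, -1) hitting a normal (0, 0, 1)
# reflects to (0, 0, 1), landing at the center of the +z face.
face, s, t = cube_face_lookup(reflect(np.array([0.0, 0.0, -1.0]),
                                      np.array([0.0, 0.0, 1.0])))
```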

3 Ray-Traced vs. Static EM Self-reflections are missing

4

5 Parameterized Environment Maps (PEM)

6 3-Step Process
1) Preprocess: Ray-trace images at each viewpoint
2) Preprocess: Infer environment maps (EMs)
3) Run-time: Blend between 2 nearest EMs

7 Environment Map Geometry

8 Why Parameterized Environment Maps?
Captures view-dependent shading in environment
Accounts for geometric error due to approximation of environment with simple geometry

9 How to Parameterize the Space?
Experimental setup: 1D view space, 1˚ separation between views, 100 sampled viewpoints
In general, the author specifies the parameters
Space can be 1D, 2D, or more: viewpoint, light changes, object motions

10 Ray-Traced vs. PEM Closely match local reflections like self-reflections

11

12 Movement Away from Viewpoint Samples Ray-Traced PEM

13

14 Previous Work
Reflections on Planar Surfaces [Diefenbach96]
Reflections on Curved Surfaces [Ofek98]
Image-Based Rendering Methods: Light Field, Lumigraph, Surface Light Field, LDIs
Decoupling of Geometry and Illumination [Cabral99, Heidrich99]
Parameterized Texture Maps [Hakura00]

15 Surface Light Fields [Miller98, Wood00]
Surface Light Field: dense sampling over surface points of low-resolution lumispheres
PEM: sparse sampling over viewpoints of high-resolution EMs

16 Parameterized Texture Maps [Hakura00]
(figure: textures sampled over light and view parameters)
Captures realistic pre-rendered shading effects

17 Comparison with Parameterized Texture Maps
Parameterized Texture Maps [Hakura00]:
Static texture coordinates
Pasted-on look away from sampled views
Parameterized Environment Maps:
Bounce rays off reflector, intersect simple geometry
Layered maps for local and distant environment
Better quality away from sampled views
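The "bounce rays off, intersect simple geometry" step can be sketched for a finite sphere: intersect the reflected ray with the sphere approximating the environment, then look up the EM by the direction from the sphere's center to the hit point. A hypothetical illustration of that idea, not the paper's implementation:

```python
import numpy as np

def finite_sphere_em_dir(p, r_dir, center, radius):
    """Intersect a reflected ray (origin p, direction r_dir) with a finite
    sphere approximating the environment, and return the EM lookup
    direction from the sphere center to the hit point.  The reflector is
    assumed to sit inside the sphere, so an intersection always exists."""
    r_dir = r_dir / np.linalg.norm(r_dir)
    oc = p - center
    b = np.dot(oc, r_dir)
    c = np.dot(oc, oc) - radius * radius
    disc = b * b - c                 # non-negative when p is inside
    t = -b + np.sqrt(disc)           # far intersection along the ray
    hit = p + t * r_dir
    d = hit - center
    return d / np.linalg.norm(d)

# A ray from the center straight up hits the sphere at its north pole.
d = finite_sphere_em_dir(np.zeros(3), np.array([0.0, 0.0, 1.0]),
                         np.zeros(3), 2.0)
```

With a sphere at infinity the lookup direction is just the reflection vector; the finite geometry is what lets nearby scenery shift plausibly as the reflector moves.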

18

19 EM Representations
EM Geometry: how the reflected environment is approximated
Examples: sphere at infinity; finite cubes, spheres, and ellipsoids
EM Mapping: how the geometry is represented in a 2D map
Examples: gazing ball (OpenGL) mapping; cubic mapping

20 Layered EMs
Segment environment into local and distant maps
Allows different EM geometries in each layer
Supports parallax between layers

21 Segmented, Ray-Traced Images
(figure: Distant, Local Color, Local Alpha, Fresnel)
EMs are inferred for each layer separately

22 Distant Layer Ray directly reaches distant environment

23 Distant Layer Ray bounces more times off reflector

24 Distant Layer Ray propagated through reflector

25 Local Layer
(figure: Local Color, Local Alpha)

26 Fresnel Layer Fresnel modulation is generated at run-time
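The talk does not state which Fresnel formula is evaluated at run time; a common per-vertex choice is Schlick's approximation, driven by the angle between the view direction and the surface normal. A sketch under that assumption, where f0 is the reflectance at normal incidence:

```python
def schlick_fresnel(cos_theta, f0):
    """Schlick's approximation to the Fresnel reflectance:
    F = f0 + (1 - f0) * (1 - cos_theta)^5,
    where cos_theta is the cosine of the angle between the view
    direction and the surface normal."""
    return f0 + (1.0 - f0) * (1.0 - cos_theta) ** 5
```

At normal incidence (cos_theta = 1) this returns f0; at grazing angles (cos_theta near 0) it approaches 1, which is what makes edge-on reflections brighter.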

27 EM Inference
Solve A x = b, where x is the unknown EM texels, b is the ray-traced image, and A holds the hardware filter coefficients
(figure: hardware render from EM texture to screen)
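The inference step solves this linear system in a least-squares sense. A toy sketch with synthetic data standing in for the hardware filter coefficients and the ray-traced image:

```python
import numpy as np

# Toy instance of the inference system A x = b: one row of A per screen
# pixel, one column per EM texel; b is the ray-traced image and x the EM
# texels to recover.  Sizes and data here are synthetic stand-ins.
rng = np.random.default_rng(0)
n_pixels, n_texels = 64, 16
A = rng.random((n_pixels, n_texels))   # hardware filter coefficients
x_true = rng.random(n_texels)          # "true" EM texels
b = A @ x_true                         # observed ray-traced image

# Least-squares solve; with noise-free b this recovers x_true exactly.
x_est, *_ = np.linalg.lstsq(A, b, rcond=None)
```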

28 Inferred EMs per Viewpoint
(figure: Distant, Local Color, Local Alpha)

29 Run-Time
Over blending mode to composite local/distant layers
Fresnel modulation, F, generated on-the-fly per vertex
Blend between neighboring viewpoint EMs
Teapot object requires 5 texture map accesses: 2 EMs (local/distant layers) at each of 2 viewpoints (for smooth interpolation) and 1 1D Fresnel map (for better polynomial interpolation)
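The run-time steps above can be sketched as plain array math; the real system does this with hardware blending modes, so the function names and per-pixel granularity here are illustrative only:

```python
import numpy as np

def composite_pixel(local_rgb, local_alpha, distant_rgb, fresnel):
    """Over-blend the local layer onto the distant layer, then apply the
    Fresnel modulation F (computed per vertex in the talk; per pixel
    here for brevity)."""
    env = local_alpha * local_rgb + (1.0 - local_alpha) * distant_rgb
    return fresnel * env

def blend_viewpoint_ems(em_a, em_b, w):
    """Linearly blend the two EMs nearest the current viewpoint,
    with w in [0, 1] measuring progress from viewpoint a to b."""
    return (1.0 - w) * em_a + w * em_b

# A local layer at quarter coverage over a black distant layer.
c = composite_pixel(np.ones(3), 0.25, np.zeros(3), 1.0)
# Halfway between two constant EMs.
m = blend_viewpoint_ems(np.zeros(3), np.ones(3), 0.5)
```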

30 Video Results
Experimental setup: 1D view space, 1˚ separation between views, 100 sampled viewpoints

31 Layered PEM vs. Infinite Sphere PEM

32

33 Real-time Demo

34

35 Summary
Parameterized Environment Maps:
Layered
Parameterized by viewpoint
Inferred to match ray-traced imagery
Accounts for environment geometry and view-dependent shading
Mirror-like, local reflections
Hardware-accelerated display

36 Future Work
Placement/partitioning of multiple environment shells
Automatic selection of EM geometry
Incomplete imaging of environment off the manifold
Refractive objects
Glossy surfaces

37 Questions

38 Timing Results
texgen time: 35 ms
frame time: 45 ms on the manifold, 57 ms off the manifold

39 Texel Impulse Response
To measure the hardware impulse response, render with a single texel set to 1.
(figure: hardware render from texture to screen)
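Measuring one impulse response per texel yields the columns of the filter matrix A used in the inference step. A sketch with a stand-in render function in place of the actual hardware pass:

```python
import numpy as np

def impulse_response_matrix(render, n_texels):
    """Recover the hardware filter matrix A one column at a time: set a
    single texel to 1, render, and record its footprint on the screen.
    `render` maps a texel vector to a screen-pixel vector."""
    cols = []
    for i in range(n_texels):
        tex = np.zeros(n_texels)
        tex[i] = 1.0                  # unit impulse at texel i
        cols.append(render(tex))      # one column per texel
    return np.stack(cols, axis=1)     # one row per screen pixel

# Stand-in "hardware": a known linear filter M, which the probing
# procedure should reconstruct exactly.
M = np.array([[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]])
A = impulse_response_matrix(lambda t: M @ t, 2)
```

This works because rendering is linear in the texel values, so probing with unit impulses reads out A column by column, matching slide 41's "one column per texel, one row per screen pixel".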

40 Single Texel Response

41 Model for Single Texel
One column per texel; one row per screen pixel

42 Model for MIPMAPs

43 Conclusion
PEMs provide:
Faithful approximation to ray-traced images at pre-rendered viewpoint samples
Plausible movement away from those samples
Using real-time graphics hardware

44 PEM vs. Static EM

45

