CS348B Lecture 9, Pat Hanrahan, Spring 2005: presentation transcript

Slide 1: Overview
- Earlier lecture: statistical sampling and Monte Carlo integration
- Last lecture: signal processing view of sampling
- Today: variance reduction; importance sampling; stratified sampling; multidimensional sampling patterns; discrepancy and quasi-Monte Carlo
- Later: path tracing for interreflection; density estimation

Slide 2: Cameras
Example images: depth of field and motion blur (sources: Cook, Porter, Carpenter, 1984; Mitchell, 1991).

Slide 3: Variance
Comparison: 1 shadow ray per eye ray vs. 16 shadow rays per eye ray.

Slide 4: Variance
Definition: V[F] = E[(F - E[F])^2] = E[F^2] - (E[F])^2.
Variance decreases with sample size: averaging N independent samples gives V[F_N] = V[F]/N, so the standard error falls as N^(-1/2).
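A minimal Python sketch (not from the slides) illustrating the 1/N behavior: repeated Monte Carlo estimates of the integral of x^2 over [0,1] (true value 1/3), whose run-to-run variance shrinks roughly in proportion to 1/N.

```python
import random
import statistics

def mc_estimate(n):
    """Monte Carlo estimate of the integral of x^2 over [0,1] (true value 1/3)."""
    return sum(random.random() ** 2 for _ in range(n)) / n

# Variance across repeated runs shrinks roughly as 1/n.
for n in (10, 100, 1000):
    runs = [mc_estimate(n) for _ in range(2000)]
    print(n, statistics.variance(runs))
```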

Slide 5: Variance Reduction
Efficiency measure: efficiency = 1 / (variance x cost per sample).
- If one technique has twice the variance of another, it takes twice as many samples to achieve the same estimator variance.
- If one technique has twice the cost of another with the same variance, it takes twice as much time to achieve the same variance.
Techniques to increase efficiency: importance sampling; stratified sampling.

Slide 6: Biasing
Previously we used a uniform probability distribution. We can draw samples from another probability distribution, but then we must change the estimator to compensate.

Slide 7: Unbiased Estimate
Probability: draw X_i from a density p(x) with p(x) > 0 wherever f(x) is nonzero.
Estimator: F_N = (1/N) sum_i f(X_i)/p(X_i), which is unbiased: E[F_N] = integral of f(x) dx.
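A short sketch of this estimator in Python (the function names are mine, not the slides'): any sampler/pdf pair gives an unbiased estimate as long as the pdf is nonzero wherever f is.

```python
import random

def mc_estimate(f, sample, pdf, n):
    """Unbiased Monte Carlo estimate of the integral of f: average f(X)/p(X) for X ~ p."""
    return sum(f(x) / pdf(x) for x in (sample() for _ in range(n))) / n

# Example: integrate f(x) = x over [0, 4] (true value 8) with uniform sampling, p = 1/4.
print(mc_estimate(lambda x: x, lambda: 4 * random.random(), lambda x: 0.25, 100_000))
```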

Slide 8: Importance Sampling
Sample according to f: choose the density p(x) proportional to f(x), i.e., p(x) = f(x) / integral of f.

Slide 9: Importance Sampling Variance
Sample according to f: with p(x) = f(x) / integral of f, every sample yields f(X)/p(X) = integral of f, a constant. Zero variance! (Of course, normalizing p this way requires already knowing the integral, so in practice we choose p to approximate the shape of f.)

Slide 10: Example (Peter Shirley, Realistic Ray Tracing)
Estimating the integral of f(x) = x over [0, 4], whose true value is 8 (the integrand is reconstructable here: each pdf below integrates to 1 on [0, 4], and x/8, which is proportional to the integrand, gives zero variance):

| Method     | Sampling function | Variance   | Samples needed for standard error of 0.008 |
|------------|-------------------|------------|--------------------------------------------|
| importance | (6-x)/16          | 56.8 N^-1  | 887,500                                    |
| importance | 1/4               | 21.3 N^-1  | 332,812                                    |
| importance | (x+2)/16          | 6.4 N^-1   | 98,432                                     |
| importance | x/8               | 0          | 1                                          |
| stratified | 1/4               | 21.3 N^-3  | 70                                         |
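A hedged Python check of three rows (assuming, as reconstructed above, the integrand f(x) = x on [0, 4]): each pdf is sampled by inverting its CDF, and the per-sample variance of f(X)/p(X) should come out near the table's 21.3, 6.4, and 0.

```python
import math
import random
import statistics

f = lambda x: x  # reconstructed integrand on [0, 4]; true integral = 8

# (inverse CDF, pdf) pairs for three of the table's sampling functions
samplers = {
    "1/4":      (lambda u: 4 * u,                     lambda x: 0.25),
    "(x+2)/16": (lambda u: math.sqrt(4 + 32 * u) - 2, lambda x: (x + 2) / 16),
    "x/8":      (lambda u: 4 * math.sqrt(u),          lambda x: x / 8),
}

for name, (inv_cdf, pdf) in samplers.items():
    xs = (inv_cdf(random.random()) for _ in range(100_000))
    vals = [f(x) / pdf(x) for x in xs]
    print(name, statistics.variance(vals))  # roughly 21.3, 6.4, 0
```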

Slide 11: Examples
Sampling by projected solid angle vs. sampling by area, each with 4 eye rays per pixel and 100 shadow rays.

Slide 12: Irradiance
Irradiance is the integral of L(omega) cos(theta) over the hemisphere, so we generate a cosine-weighted distribution of directions, importance sampling the cos(theta) factor.

Slide 13: Cosine Weighted Distribution
p(omega) = cos(theta) / pi over the hemisphere. One way to sample it: pick phi = 2 pi xi_1 and cos(theta) = sqrt(xi_2); equivalently, sample a unit disk uniformly and project the point up onto the hemisphere (Malley's method).
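A small Python sketch of Malley's method (a standard construction, not verbatim from the slides): sample the unit disk uniformly, then set z so the point lies on the hemisphere; the resulting direction has pdf cos(theta) / pi.

```python
import math
import random

def cosine_sample_hemisphere():
    """Return a direction on the +z hemisphere with pdf cos(theta) / pi."""
    u1, u2 = random.random(), random.random()
    r, phi = math.sqrt(u1), 2 * math.pi * u2   # uniform point on the unit disk
    x, y = r * math.cos(phi), r * math.sin(phi)
    z = math.sqrt(max(0.0, 1.0 - u1))          # project up onto the hemisphere
    return x, y, z
```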

Slide 14: Sampling a Circle
Equi-areal mapping: r = sqrt(xi_1), theta = 2 pi xi_2 gives uniform density in area; the naive mapping r = xi_1 clusters samples near the center.
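A side-by-side of the naive and equi-areal radius mappings in Python (illustrative, not from the slides):

```python
import math
import random

def naive_sample_disk():
    r, theta = random.random(), 2 * math.pi * random.random()  # clusters near center
    return r * math.cos(theta), r * math.sin(theta)

def equiareal_sample_disk():
    r, theta = math.sqrt(random.random()), 2 * math.pi * random.random()  # uniform in area
    return r * math.cos(theta), r * math.sin(theta)
```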

Slide 15: Shirley's Mapping
Shirley's concentric mapping takes the unit square to the unit disk, sending concentric squares to concentric circles. It is equi-areal and has lower distortion than the polar mapping above, so it preserves stratification well.
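A sketch of the concentric mapping in Python, following the common formulation (e.g., as presented in PBRT); treat the details as an illustration rather than the slide's exact code:

```python
import math

def concentric_sample_disk(u1, u2):
    """Map (u1, u2) in [0,1)^2 to the unit disk, preserving fractional area."""
    ox, oy = 2 * u1 - 1, 2 * u2 - 1      # map to [-1, 1]^2
    if ox == 0 and oy == 0:
        return 0.0, 0.0
    if abs(ox) > abs(oy):
        r, theta = ox, (math.pi / 4) * (oy / ox)
    else:
        r, theta = oy, (math.pi / 2) - (math.pi / 4) * (ox / oy)
    return r * math.cos(theta), r * math.sin(theta)
```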

Slide 16: Stratified Sampling
Stratified sampling is like jittered sampling: divide the domain into regions (strata) and allocate samples per region.
New variance: with N equal strata and one sample each, V = (1/N^2) sum_k sigma_k^2, versus (1/N^2) sum_k sigma_k^2 + (1/N^2) sum_k (mu_k - mu)^2 for N unstratified samples. Thus, if the variance within regions is less than the overall variance, the resulting variance is reduced.
For example: an edge through a pixel; most strata are constant (zero variance) and only the strata crossing the edge contribute, as the sketch below illustrates.
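A minimal Python experiment (my construction, consistent with the slide's edge example): estimate the mean of a step function with 16 uniform samples vs. 16 stratified (jittered) samples.

```python
import random
import statistics

f = lambda x: 0.0 if x < 0.3 else 1.0    # a 1-D "edge"

def uniform_est(n):
    return sum(f(random.random()) for _ in range(n)) / n

def stratified_est(n):
    # one jittered sample in each of n equal strata
    return sum(f((k + random.random()) / n) for k in range(n)) / n

for est in (uniform_est, stratified_est):
    runs = [est(16) for _ in range(5000)]
    print(est.__name__, statistics.variance(runs))  # stratified is much lower
```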

Slide 17: Mitchell 91
Uniform random vs. spectrally optimized sampling patterns (Mitchell, 1991).

Slide 18: Discrepancy
Discrepancy measures how far a point set deviates from a perfectly uniform distribution: D_N is the supremum, over a family of regions (e.g., axis-aligned boxes), of |fraction of points inside the region - area of the region|. The star discrepancy D*_N restricts the family to boxes anchored at the origin.
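In one dimension the star discrepancy has a closed form over sorted points; a small Python sketch (standard formula, my code):

```python
import random

def star_discrepancy_1d(points):
    """Exact star discrepancy of points in [0, 1)."""
    xs = sorted(points)
    n = len(xs)
    # max deviation between the empirical CDF and the uniform CDF
    return max(max((i + 1) / n - x, x - i / n) for i, x in enumerate(xs))

print(star_discrepancy_1d([random.random() for _ in range(256)]))   # ~ N^(-1/2) scale
print(star_discrepancy_1d([(i + 0.5) / 256 for i in range(256)]))   # 1/(2N), much lower
```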

Slide 19: Theorem on Total Variation
Theorem (Koksma's inequality): the integration error of a sample average is bounded by the total variation of the integrand times the star discrepancy of the sample points:
|(1/N) sum_i f(x_i) - integral of f over [0,1]| <= V(f) * D*_N.
Proof: integrate by parts.

Slide 20: Quasi-Monte Carlo Patterns
Radical inverse Phi_b(i): write integer i in base b and reflect its digits about the radix point.

| i | base 2 | radical inverse | value |
|---|--------|-----------------|-------|
| 1 | 1      | .1              | 1/2   |
| 2 | 10     | .01             | 1/4   |
| 3 | 11     | .11             | 3/4   |
| 4 | 100    | .001            | 1/8   |
| 5 | 101    | .101            | 5/8   |

Hammersley points: (i/N, Phi_2(i), Phi_3(i), ...), which require knowing N in advance.
Halton points (sequential): (Phi_2(i), Phi_3(i), ...), which can be extended one point at a time.
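A compact Python sketch of the radical inverse and the two point sets built from it (function names are mine):

```python
def radical_inverse(i, b):
    """Reflect the base-b digits of integer i about the radix point."""
    inv, scale = 0.0, 1.0 / b
    while i > 0:
        inv += (i % b) * scale
        i //= b
        scale /= b
    return inv

def halton_2d(i):
    # sequential: point i can be generated without knowing the total count
    return radical_inverse(i, 2), radical_inverse(i, 3)

def hammersley_2d(i, n):
    # requires the total count n up front
    return i / n, radical_inverse(i, 2)

print([radical_inverse(i, 2) for i in range(1, 6)])  # 0.5, 0.25, 0.75, 0.125, 0.625
```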

Slide 21: Hammersley Points

Slide 22: Edge Discrepancy
Note: the SGI InfiniteReality multisampling extension uses an 8x8 subpixel grid with 1, 2, 4, or 8 samples.

Slide 23: Low-Discrepancy Patterns
Discrepancy of random edges, from Mitchell (1992):

| Process      | 16 points | 256 points | 1600 points |
|--------------|-----------|------------|-------------|
| Zaremba      | 0.0504    | 0.00478    | 0.00111     |
| Jittered     | 0.0538    | 0.00595    | 0.00146     |
| Poisson-Disk | 0.0613    | 0.00767    | 0.00241     |
| N-Rooks      | 0.0637    | 0.0123     | 0.00488     |
| Random       | 0.0924    | 0.0224     | 0.00866     |

- Random sampling converges as N^(-1/2).
- Zaremba converges faster and has lower discrepancy.
- Zaremba has a relatively poor blue-noise spectrum.
- Jittered and Poisson-disk sampling are recommended.

Slide 24: High-dimensional Sampling
Numerical quadrature: for a rule of order r in d dimensions, the error behaves like O(N^(-r/d)), so the sample count needed for a given error grows exponentially with dimension.
Random sampling: for a given variance, the standard error is O(N^(-1/2)) regardless of dimension.
Monte Carlo therefore requires fewer samples for the same error in high-dimensional spaces.
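A small illustration of the cost scaling in Python (my example, not the slide's): a tensor-product midpoint rule needs m^d evaluations just to place m points per axis, while a Monte Carlo estimate can use any sample budget independent of d.

```python
import itertools
import math
import random

def midpoint_rule(f, m, d):
    """Tensor-product midpoint rule over [0,1]^d: costs m**d evaluations."""
    axis = [(k + 0.5) / m for k in range(m)]
    return sum(f(x) for x in itertools.product(axis, repeat=d)) / m ** d

def monte_carlo(f, n, d):
    return sum(f([random.random() for _ in range(d)]) for _ in range(n)) / n

d = 6
f = lambda x: math.exp(-sum(x))        # smooth test integrand
print((1 - 1 / math.e) ** d)           # true integral
print(midpoint_rule(f, 3, d), "with", 3 ** d, "evaluations")  # cost explodes with d
print(monte_carlo(f, 3 ** 6, d), "with", 3 ** 6, "evaluations")
```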

Slide 25: Block Design
A Latin square over an alphabet of size n: each symbol appears exactly once in each row and each column, so the rows and columns are both stratified. For example, with n = 3:

1 2 3
2 3 1
3 1 2

Slide 26: Block Design
The N-rooks pattern is an incomplete block design: it replaces n^2 samples with n samples, one per row and one per column, placed by a random permutation pi so that sample i lands in cell (i, pi(i)). Generalizations: N-queens patterns, and designs with good 2D projections.
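A minimal N-rooks (Latin hypercube) sampler in Python, jittering one sample inside each permuted cell (my sketch):

```python
import random

def n_rooks(n):
    """n jittered 2-D samples, one per row and one per column."""
    perm = list(range(n))
    random.shuffle(perm)                  # random column assignment per row
    return [((i + random.random()) / n,
             (perm[i] + random.random()) / n) for i in range(n)]

print(n_rooks(4))
```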

Slide 27: Space-time Patterns
- Distribute samples in time.
- Complete in space: samples in space should have a blue-noise spectrum.
- Incomplete in time.
- Decorrelate space and time: nearby samples in space should differ greatly in time.
Examples: pan-diagonal magic square; Cook pattern.

Slide 28: Path Tracing
Complete: 4 eye rays per pixel, 16 shadow rays per eye ray. Incomplete: 64 eye rays per pixel, 1 shadow ray per eye ray. (Both trace 64 shadow rays per pixel.)

Slide 29: Views of Integration
1. Signal processing: sampling and reconstruction, aliasing and antialiasing; blue noise is good.
2. Statistical sampling (Monte Carlo): sampling is like polling; variance; high-dimensional sampling converges as 1/N^(1/2).
3. Quasi-Monte Carlo: discrepancy; asymptotic efficiency in high dimensions.
4. Numerical quadrature/integration rules: smooth functions.

