1 Importance Resampling for Global Illumination – Justin F. Talbot, Master’s Thesis Defense, Brigham Young University, Provo, UT

2 Global Illumination Goal: Create realistic images from virtual scenes.

3 Global Illumination Goal: Create realistic images from virtual scenes. Approach: Treat each pixel as an integral.
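
As a concrete instance of such a per-pixel integral (a standard formulation, not reproduced from the slides), the direct lighting application discussed later evaluates an integral of the form

```latex
L_o(\omega_o) = \int_{\Omega} L_e(\omega_i)\, f_r(\omega_i, \omega_o)\, V(\omega_i)\, \cos\theta_i \,\mathrm{d}\omega_i
```

where L_e is the radiance emitted by the lights, f_r is the BRDF, V is visibility, and the integral runs over incoming directions.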

4 Monte Carlo Integration Approximation: Generate random samples {y_1, …, y_N} from density q, evaluate f at each sample, and compute the estimate.
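
A minimal sketch of this estimator in Python (the integrand and density below are illustrative placeholders, not from the thesis):

```python
import numpy as np

def mc_estimate(f, q_sample, q_pdf, n, rng):
    # Standard importance-sampled Monte Carlo estimate of the integral of f:
    # draw y_1..y_N from q, then average f(y_i) / q(y_i).
    ys = np.array([q_sample(rng) for _ in range(n)])
    return np.mean(f(ys) / q_pdf(ys))

# Illustrative use: integrate f(x) = x^2 over [0, 1] with q = uniform(0, 1).
rng = np.random.default_rng(0)
print(mc_estimate(lambda x: x**2,
                  lambda r: r.random(),
                  lambda x: np.ones_like(x),
                  n=1000, rng=rng))   # close to the true value 1/3
```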

5 Monte Carlo Integration – Importance Sampling: Choose q to be nearly proportional to f. Restrictions: q must be normalized (integrate to 1), and q must be easy to sample.

6 Thesis Question: Can we generalize importance sampling to allow an unnormalized q or a difficult-to-sample q? Motivation: if so, we can pick a q that is more nearly proportional to f, giving more variance reduction.

7 Thesis Contributions: Resampled Importance Sampling (RIS); proofs of RIS unbiasedness and RIS variance; efficiency-optimal parameters; robust approximate parameters; RIS combined with stratified sampling and with multiple importance sampling; application to the direct lighting problem.

8 Resampled Importance Sampling: a generalization of importance sampling that permits an unnormalized q and a difficult-to-sample q. Based on a sampling technique called importance resampling.

9 Importance Resampling – Goal: generate samples from q. Problems: q may not be normalized, and q can’t be sampled using simpler techniques. Solution: use two-stage sampling (resampling).

10 Importance Resampling

11 Importance Resampling 1) Generate proposals from density p. p should be easy to sample. Proposals = {x_1, …, x_M}

12 Importance Resampling 1) Generate proposals from density p. 2) Compute weights. Weighted proposals form a discrete approximation of q

13 Importance Resampling 1) Generate proposals from density p. 2) Compute weights. 3) Draw samples from the proposals with probability proportional to weight. Samples are approximately distributed according to q! Samples = {y_1, …, y_N}
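
A compact sketch of this two-stage procedure in Python, using the standard resampling weight w(x) = q(x)/p(x) (the densities below are illustrative placeholders):

```python
import numpy as np

def importance_resample(p_sample, p_pdf, q_unnorm, M, N, rng):
    # 1) Generate M proposals from the easy-to-sample density p.
    xs = np.array([p_sample(rng) for _ in range(M)])
    # 2) Weight each proposal by w(x) = q(x) / p(x); q may be unnormalized.
    ws = q_unnorm(xs) / p_pdf(xs)
    # 3) Resample N values with probability proportional to the weights.
    ys = rng.choice(xs, size=N, p=ws / ws.sum(), replace=True)
    return xs, ws, ys   # proposals, weights, samples approximately from q

# Illustrative use: target q(x) proportional to x^2 on [0, 1], proposals uniform.
rng = np.random.default_rng(0)
xs, ws, ys = importance_resample(lambda r: r.random(),
                                 lambda x: np.ones_like(x),
                                 lambda x: x**2, M=500, N=10, rng=rng)
```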

14 Importance Resampling provides a way to generate samples from a “difficult” distribution. Limitations: the distribution is only an approximation for any finite number of proposals M, and samples may be repeated if drawn from the same set of proposals.

15 Resampled Importance Sampling: How do we combine the proposals {x_1, …, x_M}, the weights {w(x_1), …, w(x_M)}, and the samples {y_1, …, y_N} to create an unbiased estimate of the integral?

17 Resampled Importance Sampling: How do we combine the proposals, weights, and samples into an unbiased estimate? The first part of the answer is the same as the standard Monte Carlo integration estimate, except that q is not normalized.

18 Resampled Importance Sampling: How do we combine the proposals, weights, and samples into an unbiased estimate? An additional term corrects for the importance resampling approximation and for the unnormalized q.
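
In the RIS literature the estimator has the following form (a reconstruction consistent with the notes on slides 17 and 18, not copied verbatim from the slides):

```latex
\hat{I}_{\mathrm{RIS}} \;=\; \underbrace{\frac{1}{N}\sum_{i=1}^{N} \frac{f(y_i)}{q(y_i)}}_{\text{standard MC estimate}}
\;\cdot\; \underbrace{\frac{1}{M}\sum_{j=1}^{M} \frac{q(x_j)}{p(x_j)}}_{\text{average proposal weight}}
```

The first factor is the usual importance-sampling average over the resampled y_i; the second factor, the average resampling weight, corrects both for the resampling approximation and for q being unnormalized.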

19 Thesis Question: Can we generalize importance sampling to allow an unnormalized q or a difficult-to-sample q? Motivation: if so, we can pick a q that is more nearly proportional to f, giving more variance reduction. YES!

20 Resampled Importance Sampling: The variance of RIS is derived in the thesis. To give more variance reduction than standard importance sampling, proposals must be computationally cheaper than samples AND q must be more nearly proportional to f than p (i.e., a better importance sampling density).

21 Resampled Importance Sampling: We also have to choose M (the number of proposals) and N (the number of samples). For a fixed time budget, we have to trade them off.

22 Example – Choosing M and N: N=100, M=100 (better shadows, color) ↔ N=1, M=450 (better direct lighting)

23 Resampled Importance Sampling: We could directly minimize the variance equation, but that is too hard, so we approximate.

24 Resampled Importance Sampling: M* = 0.5 · T_total / T_proposal and N* = 0.5 · T_total / T_sample. Simple: give equal time to proposals and samples. Robust: results in no more than twice the variance of the true optimal values.
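
A trivial sketch of this rule (the timing names are illustrative):

```python
def approx_optimal_counts(t_total, t_proposal, t_sample):
    # Spend half of the time budget on proposals and half on samples.
    m_star = 0.5 * t_total / t_proposal   # approximate optimal proposal count
    n_star = 0.5 * t_total / t_sample     # approximate optimal sample count
    return m_star, n_star

# e.g. a 1 s budget with 1 ms proposals and 5 ms samples:
# approx_optimal_counts(1.0, 1e-3, 5e-3) -> (500.0, 100.0)
```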

25 Results – Direct Lighting: RIS using estimated optimal values M* = 218, N* = 64.8; 57% variance reduction (equal time)

26 Results – Direct Lighting: N=100, M=100 | N=64.8, M=218 | N=1, M=450

27 Results II 34% variance reduction

28 Results III 33% variance reduction

29 Stratifying RIS: Stratified sampling

30 Stratifying RIS – Stratified sampling: divide the domain into strata and take a single sample in each stratum; this avoids clustering of samples.
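
A minimal sketch of one-dimensional stratified (jittered) sampling, the kind of standard technique that can be applied to the proposals (illustrative, not thesis code):

```python
import numpy as np

def stratified_samples_1d(n, rng):
    # Divide [0, 1) into n equal strata and place one jittered sample in each.
    return (np.arange(n) + rng.random(n)) / n

rng = np.random.default_rng(0)
us = stratified_samples_1d(16, rng)   # 16 samples, one per stratum, no clustering
```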

31 Stratifying RIS – In RIS: stratify the proposals. This avoids clustering, and standard stratification techniques apply.

32 Stratifying Proposals: RIS without stratification | Proposals only – 34% variance reduction

34 Stratifying RIS – In RIS: stratify the proposals (avoids clustering; standard techniques apply) and stratify the samples (avoids clustering and avoids duplicates).

35 Stratifying RIS: How do we stratify the samples? Two options: equal-proposals and equal-weights.
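
A plausible reading of the equal-weights option is stratified resampling over the cumulative weight distribution: split the total weight into N equal-weight intervals and draw one sample from each. The exact construction used in the thesis is not shown on these slides, so the sketch below is an assumption:

```python
import numpy as np

def equal_weight_stratified_resample(xs, ws, N, rng):
    # Assumed interpretation of the "equal-weights" option, not thesis code:
    # one jittered point per equal-weight stratum of the weight CDF.
    cdf = np.cumsum(ws) / np.sum(ws)
    u = (np.arange(N) + rng.random(N)) / N          # stratified points in [0, 1)
    idx = np.minimum(np.searchsorted(cdf, u), len(xs) - 1)
    return xs[idx]
```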

36 Stratifying Samples: Proposals only – 34% variance reduction | Equal-proposals – 37% variance reduction | Equal-weights – 42% variance reduction

37 Multiple Importance Sampling: We can often generate proposals from multiple densities (e.g., start at the surface or start at the light). How can we combine them?

38 Multiple Importance Sampling: We can often generate proposals from multiple densities (start at the surface, start at the light). How can we combine them? With multiple importance sampling.

39 Multiple Importance Sampling: 1) Generate proposals from densities p_1, …, p_K. Each p_k should be easy to sample, e.g., using CDF inversion or rejection sampling. Proposals = {x_1, …, x_M}

40 Multiple Importance Sampling: 1) Generate proposals from densities p_1, …, p_K. 2) Compute weights.

42 Multiple Importance Sampling: 1) Generate proposals from densities p_1, …, p_K. 2) Compute weights. 3) Draw samples from the proposals with probability proportional to weight.
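
One unbiased way to weight proposals drawn from several densities is to treat them as draws from the mixture of the p_k, a balance-heuristic-style choice; whether this matches the exact weighting in the thesis is an assumption:

```python
import numpy as np

def mixture_resampling_weights(xs, q_unnorm, p_pdfs, counts):
    # w(x) = q(x) / p_mix(x), where p_mix mixes the proposal densities p_k
    # in proportion to how many proposals each technique contributed.
    # Balance-heuristic-style choice; an assumption, not necessarily the
    # weighting used in the thesis.
    fracs = np.asarray(counts, dtype=float) / np.sum(counts)
    p_mix = sum(c * p(xs) for c, p in zip(fracs, p_pdfs))
    return q_unnorm(xs) / p_mix
```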

43 Multiple Importance Sampling: Start at surface | Start at light

44 Multiple Importance Sampling: MIS without RIS | MIS with RIS – 30% variance reduction

45 Thesis Contributions: Resampled Importance Sampling (RIS); proofs of RIS unbiasedness and RIS variance; efficiency-optimal parameters; robust approximate parameters; RIS combined with stratified sampling and with multiple importance sampling; application to the direct lighting problem.

46 Concluding Thoughts: RIS is better than IS when q is a better importance sampling density than p AND computing proposals is much cheaper than computing samples. Intuition: RIS takes advantage of differences in variance or computational expense.

47 Concluding Thoughts – Future Work: application to other problems in global illumination; application to other fields; development of better choices of q and p; examining the trade-off between computational expense and importance sampling quality.

48 Questions

