
1 02/10/03© 2003 University of Wisconsin Last Time Participating Media Assignment 2 –A solution program now exists, so you can preview what your solution should look like –It uses OpenGL with smooth shading, so you would have to implement Gouraud shading in your ray-tracer to see the same thing –It is OK to use OpenGL to generate the image in this project (but you’ll get better results if you “gather to a pixel” in a ray-tracer)

2 02/10/03© 2003 University of Wisconsin Today Monte-Carlo introduction Probability overview Sampling methods

3 02/10/03© 2003 University of Wisconsin Monte Carlo Rendering Central theme is sampling: –Determine what happens at a set of discrete points, or for a set of light paths, and extrapolate from there Central algorithm is ray tracing –Rays from the eye, the lights or other surfaces Theoretical basis is probability theory –The study of random events

4 02/10/03© 2003 University of Wisconsin Distributed Ray Tracing (Cook, Porter, Carpenter 1984) Arguably the simplest Monte-Carlo method Addresses the inability of ray tracing to capture: –non-ideal reflection/transmission –soft shadows –motion blur –depth of field Basic idea: Use multiple sample rays to build the image for each pixel Rays are distributed, not the algorithm (should probably be called distribution ray tracing)
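A minimal sketch of the per-pixel sampling loop this describes, assuming a hypothetical trace_ray(x, y) that builds and shades a ray through image-plane point (x, y); the sample count and jittering scheme are illustrative choices, not values from the slides.

```python
import random

def trace_ray(x, y):
    # Hypothetical stand-in: a real tracer would build a ray through the
    # image-plane point (x, y) and return the shaded color it finds.
    return (0.0, 0.0, 0.0)

def render_pixel(px, py, samples_per_pixel=16):
    # Distribute several jittered sample rays over the pixel and average them.
    r = g = b = 0.0
    for _ in range(samples_per_pixel):
        cr, cg, cb = trace_ray(px + random.random(), py + random.random())
        r, g, b = r + cr, g + cg, b + cb
    n = samples_per_pixel
    return (r / n, g / n, b / n)
```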

5 02/10/03© 2003 University of Wisconsin Specific Cases Cast multiple rays per pixel, through different lens paths, to get depth-of-field Sample several directions around the reflected direction to get non-ideal reflection Send multiple rays to area light sources to get soft shadows Cast multiple rays per pixel, spread in time, to get motion blur Must be careful not to get correlated samples, which would be particularly apparent in an image –What should you do for lights?
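As one concrete case from the list, a sketch of soft-shadow sampling against a rectangular area light; the parallelogram light description and the hypothetical visible() occlusion query are assumptions made for illustration.

```python
import random

def sample_rect_light(corner, edge_u, edge_v):
    # Uniform point on a parallelogram light: corner + s*edge_u + t*edge_v.
    s, t = random.random(), random.random()
    return tuple(c + s * u + t * v for c, u, v in zip(corner, edge_u, edge_v))

def soft_shadow(point, corner, edge_u, edge_v, visible, n_samples=16):
    # Fraction of shadow rays from 'point' that reach the light unoccluded.
    hits = 0
    for _ in range(n_samples):
        light_pt = sample_rect_light(corner, edge_u, edge_v)
        if visible(point, light_pt):   # hypothetical occlusion test
            hits += 1
    return hits / n_samples
```

Drawing a fresh set of light samples at every shading point avoids the kind of correlated sampling pattern the slide warns about.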

6 02/10/03© 2003 University of Wisconsin Depth-of-Field Regardless of where they pass through the lens, all the rays from a point on the focal plane end up at the same point on the image Rays from a point off the focal plane converge somewhere else, resulting in a circle on the image plane Note that the aperture also allows a sampling of directions through a world point (Diagram: focal plane, lens, aperture and image plane; a point off the focal plane gives a circle of confusion)

7 02/10/03© 2003 University of Wisconsin Distributed Depth-of-Field Sample multiple rays through the aperture from the image point If there are objects outside the focal plane, different rays will hit different points, and contribute different colors Also cast rays from within the area of the pixel Average all rays to get the resulting pixel color (Diagram: lens, aperture and image plane)
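A hedged sketch of how the aperture could be sampled with a thin-lens camera model; the lens is placed at the origin looking down +z, and aperture_radius and focal_dist are illustrative parameters, not values from the slides.

```python
import math, random

def sample_disk(radius):
    # Uniform point on a disk, by rejection from the bounding square.
    while True:
        u = random.uniform(-radius, radius)
        v = random.uniform(-radius, radius)
        if u * u + v * v <= radius * radius:
            return u, v

def depth_of_field_ray(pixel_dir, aperture_radius, focal_dist):
    # pixel_dir: direction of the pinhole ray from the lens center (z > 0).
    # Every ray for this image sample must pass through the point where the
    # pinhole ray meets the focal plane z = focal_dist.
    t = focal_dist / pixel_dir[2]
    focus = tuple(t * d for d in pixel_dir)
    # Random ray origin on the aperture (the lens lies in the z = 0 plane).
    ox, oy = sample_disk(aperture_radius)
    origin = (ox, oy, 0.0)
    direction = tuple(f - o for f, o in zip(focus, origin))
    length = math.sqrt(sum(d * d for d in direction))
    return origin, tuple(d / length for d in direction)
```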

8 02/10/03© 2003 University of Wisconsin Distributed Directional Reflectance Cast multiple rays to determine what is reflected in a surface Rays should be sampled according to the BRDF Average contribution to get pixel color Why is this a poor strategy for very diffuse surfaces?
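A sketch of one common way to realize "rays sampled according to the BRDF": draw directions from a Phong-style cosine-power lobe about the mirror direction. The lobe shape, the default exponent, and the frame construction are illustrative assumptions; a real tracer would sample its own BRDF.

```python
import math, random

def normalize(v):
    l = math.sqrt(sum(c * c for c in v))
    return tuple(c / l for c in v)

def cross(a, b):
    return (a[1]*b[2] - a[2]*b[1], a[2]*b[0] - a[0]*b[2], a[0]*b[1] - a[1]*b[0])

def sample_phong_lobe(mirror_dir, n=50.0):
    # Direction with density proportional to cos^n of the angle to mirror_dir.
    u1, u2 = random.random(), random.random()
    cos_t = u1 ** (1.0 / (n + 1.0))
    sin_t = math.sqrt(max(0.0, 1.0 - cos_t * cos_t))
    phi = 2.0 * math.pi * u2
    local = (sin_t * math.cos(phi), sin_t * math.sin(phi), cos_t)
    # Orthonormal frame with w along the mirror direction.
    w = normalize(mirror_dir)
    a = (1.0, 0.0, 0.0) if abs(w[0]) < 0.9 else (0.0, 1.0, 0.0)
    u = normalize(cross(a, w))
    v = cross(w, u)
    return tuple(local[0]*u[i] + local[1]*v[i] + local[2]*w[i] for i in range(3))
```

For a very diffuse surface the lobe is so wide that huge numbers of samples are needed, which is the weakness the slide's closing question points at.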

9 02/10/03© 2003 University of Wisconsin Better Approaches Distributed ray tracing would need to cast extremely dense ray-trees to get good diffuse effects Other approaches improve upon this: –Don’t cast complete trees, just single paths –Observe that the same point may be hit many times, but distributed ray-tracing does not ever re-use partial ray-trees But first, some math…

10 02/10/03© 2003 University of Wisconsin Random Variables A random variable, x –Takes values from some domain, Ω –Has an associated probability density function, p(x) A probability density function, p(x) –Is positive over the domain, Ω –“Integrates” to 1 over Ω: $\int_\Omega p(x)\,dx = 1$

11 02/10/03© 2003 University of Wisconsin Expected Value The expected value of a random variable, x, is defined as: $E[x] = \int_\Omega x\,p(x)\,dx$ The expected value of a function, f(x), is defined as: $E[f(x)] = \int_\Omega f(x)\,p(x)\,dx$ The sample mean, for samples x_i, i = 1..N, is defined as: $\bar{x} = \frac{1}{N}\sum_{i=1}^{N} x_i$

12 02/10/03© 2003 University of Wisconsin Variance and Standard Deviation The variance of a random variable is defined as: $\sigma^2 = E[(x - E[x])^2] = E[x^2] - E[x]^2$ The standard deviation of a random variable is defined as the square root of its variance: $\sigma = \sqrt{\sigma^2}$ The sample variance is: $s^2 = \frac{1}{N-1}\sum_{i=1}^{N}(x_i - \bar{x})^2$

13 02/10/03© 2003 University of Wisconsin Sampling A process samples according to the distribution p(x) if it randomly chooses a value for x such that: $\Pr\{x \in S\} = \int_S p(x)\,dx$ for any subset S of the domain Weak Law of Large Numbers: If x_i are independent samples from p(x), then in the limit of infinite samples, the sample mean is equal to the expected value: $\lim_{N\to\infty} \frac{1}{N}\sum_{i=1}^{N} x_i = E[x]$
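A small numerical illustration of the weak law: the sample mean of f(x) = x² for x uniform on [0,1) approaches E[f(x)] = 1/3 as the number of samples grows. The function and sample counts are made up for the demonstration.

```python
import random

def sample_mean_of_square(n):
    # Monte Carlo estimate of E[x^2] for x uniform on [0, 1).
    return sum(random.random() ** 2 for _ in range(n)) / n

for n in (10, 1000, 100000):
    print(n, sample_mean_of_square(n))   # tends toward 1/3 as n grows
```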

14 02/10/03© 2003 University of Wisconsin Algorithms for Sampling Pseudo-random number generators give independent samples from the uniform distribution on [0,1), p(x)=1 Transformation method: take random samples from the uniform distribution and convert them to samples from another distribution Rejection sampling: sample from a different distribution, but reject some samples Importance sampling (to compute an expectation): Sample from an easy distribution, but weight the samples

15 02/10/03© 2003 University of Wisconsin Distribution Functions Assume a density function f(y) defined on [a,b] Define the probability distribution function, or cumulative distribution function, as: $F(x) = \int_a^x f(y)\,dy$ Monotonically increasing function with F(a)=0 and F(b)=1

16 02/10/03© 2003 University of Wisconsin Transformation Method for 1D Generate z_i uniform on [0,1) Compute $x_i = F^{-1}(z_i)$ Then the x_i are distributed according to f(x) To apply, must be able to compute and invert the distribution function F
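A minimal example of the 1D transformation method for an illustrative density f(x) = 2x on [0,1]: its distribution function is F(x) = x², so F⁻¹(z) = √z.

```python
import math, random

def sample_linear_density(n):
    # f(x) = 2x on [0, 1]  =>  F(x) = x^2  =>  x = F^{-1}(z) = sqrt(z).
    return [math.sqrt(random.random()) for _ in range(n)]

xs = sample_linear_density(100000)
print(sum(xs) / len(xs))   # should approach E[x] = 2/3
```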

17 02/10/03© 2003 University of Wisconsin Multi-Dimensional Transformation Assume density function f(x,y) defined on [a,b] × [c,d] Distribution function: $F(x,y) = \int_a^x \int_c^y f(u,v)\,dv\,du$ Sample x_i according to the marginal density: $f_X(x) = \int_c^d f(x,y)\,dy$ Sample y_i according to the conditional density: $f(y \mid x_i) = f(x_i,y) / f_X(x_i)$
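The same idea for an illustrative separable density f(x,y) = 4xy on [0,1]²: the marginal in x is 2x and the conditional in y given x is 2y, so both steps reduce to the 1D transformation above.

```python
import math, random

def sample_xy(n):
    samples = []
    for _ in range(n):
        # Marginal f_X(x) = 2x       =>  x = sqrt(z1)
        x = math.sqrt(random.random())
        # Conditional f(y | x) = 2y  =>  y = sqrt(z2)  (independent of x here)
        y = math.sqrt(random.random())
        samples.append((x, y))
    return samples
```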

18 02/10/03© 2003 University of Wisconsin Rejection Sampling Say we wish to sample x_i according to f(x) Find a function g(x) such that g(x) > f(x) for all x in the domain Geometric interpretation: generate a sample uniform in the area under g, and accept it if it also lies under f Transform a uniform sample according to $x_i = G^{-1}(z)$, where G is the distribution function of the normalized g, and generate y_i uniform on [0, g(x_i)] Keep the sample if y_i < f(x_i), otherwise reject
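A sketch of the method for an illustrative target density f(x) = (3/4)(1 - x²) on [-1,1], with the constant envelope g(x) = 3/4; a constant envelope makes the G⁻¹ step a plain uniform sample.

```python
import random

def f(x):
    # Illustrative target density on [-1, 1]: f(x) = 3/4 * (1 - x^2).
    return 0.75 * (1.0 - x * x)

def rejection_sample():
    # Envelope g(x) = 3/4, a constant >= f everywhere, so sampling under g
    # is a uniform x on [-1, 1] and a uniform y on [0, g].
    g = 0.75
    while True:
        x = random.uniform(-1.0, 1.0)
        y = random.uniform(0.0, g)
        if y < f(x):      # accept only points that also lie under f
            return x
```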

19 02/10/03© 2003 University of Wisconsin Important Example Consider uniformly sampling a point on a sphere –Uniformly means that the probability that the point is in a region depends only on the area of the region, not its location on the sphere Generate points inside the cube [-1,1]x[-1,1]x[-1,1] Reject if the point lies outside the sphere Push accepted points onto the surface (normalize them) Fraction of points accepted: π/6 Bad strategy in higher dimensions
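A sketch of that recipe: sample the cube, reject points outside the unit ball, and normalize the survivors onto the sphere. Roughly π/6 ≈ 52% of the candidates are accepted.

```python
import math, random

def uniform_on_sphere():
    # Rejection from the cube [-1,1]^3, then projection onto the unit sphere.
    while True:
        x = random.uniform(-1.0, 1.0)
        y = random.uniform(-1.0, 1.0)
        z = random.uniform(-1.0, 1.0)
        r2 = x * x + y * y + z * z
        if 0.0 < r2 <= 1.0:            # accept only points inside the ball
            r = math.sqrt(r2)
            return (x / r, y / r, z / r)
```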

20 02/10/03© 2003 University of Wisconsin Estimating Integrals Say we wish to estimate $I = \int_\Omega h(x)\,dx$ Write h = gf, where f is a density you choose (something you can sample from) If we sample x_i according to f, then: $I = \int_\Omega g(x)\,f(x)\,dx = E[g(x)] \approx \frac{1}{N}\sum_{i=1}^{N} g(x_i) = \frac{1}{N}\sum_{i=1}^{N} \frac{h(x_i)}{f(x_i)}$
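A small illustration of the estimator with a made-up integrand: h(x) = x² on [0,1] (exact value 1/3), using the density f(x) = 2x from the transformation example, so g(x) = h(x)/f(x) = x/2.

```python
import math, random

def estimate_integral(n):
    # Estimate I = integral of h(x) = x^2 over [0,1] (exact value 1/3)
    # by sampling the chosen density f(x) = 2x.
    total = 0.0
    for _ in range(n):
        x = math.sqrt(random.random())   # x ~ f, via the transformation method
        total += x / 2.0                 # g(x) = h(x) / f(x) = x^2 / (2x) = x / 2
    return total / n

print(estimate_integral(100000))   # should be close to 1/3
```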

21 02/10/03© 2003 University of Wisconsin Standard Deviation of the Estimate Expected error in the estimate after N samples is measured by the standard deviation of the estimate: $\sigma_{\text{est}} = \frac{\sigma[g(x)]}{\sqrt{N}}$ Note that the error goes down only as $1/\sqrt{N}$ (quadrupling the number of samples halves the error) This technique is called importance sampling f should follow the shape of the integrand h as closely as possible, so that g is nearly constant Same principle for higher dimensional integrals
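A quick empirical check of the 1/√N behaviour, reusing the illustrative estimator above: quadrupling the sample count should roughly halve the standard deviation of the estimate. The run counts are arbitrary.

```python
import math, random, statistics

def estimate(n):
    # Same estimate of the integral of x^2 over [0,1] as above.
    return sum(math.sqrt(random.random()) / 2.0 for _ in range(n)) / n

for n in (100, 400, 1600):
    runs = [estimate(n) for _ in range(200)]
    print(n, statistics.stdev(runs))   # stdev should roughly halve each row
```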

22 02/10/03© 2003 University of Wisconsin Example: Form Factor Integrals We wish to estimate $F_{ij} = \frac{1}{A_i}\int_{P_i}\int_{P_j} \frac{\cos\theta_x \cos\theta_y}{\pi \|x-y\|^2}\,V(x,y)\,dA_y\,dA_x$ Sample from f by sampling x_i uniformly on P_i and y_i uniformly on P_j, so $f(x,y) = \frac{1}{A_i A_j}$ Define $g(x,y) = \frac{A_j \cos\theta_x \cos\theta_y}{\pi \|x-y\|^2}\,V(x,y)$ Estimate is $F_{ij} \approx \frac{1}{N}\sum_{k=1}^{N} g(x_k, y_k)$
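A sketch of that estimator for an assumed configuration chosen for illustration: two parallel, directly opposed unit squares a distance d apart, sampled uniformly, with no occluder (V = 1).

```python
import math, random

def form_factor_parallel_squares(d, n=200000):
    # Monte Carlo estimate of F_ij for two parallel unit squares in the
    # planes z = 0 and z = d, sampled uniformly (f = 1 / (A_i * A_j)).
    area_j = 1.0
    total = 0.0
    for _ in range(n):
        x = (random.random(), random.random(), 0.0)   # point on P_i
        y = (random.random(), random.random(), d)     # point on P_j
        dx, dy, dz = y[0] - x[0], y[1] - x[1], y[2] - x[2]
        r2 = dx * dx + dy * dy + dz * dz
        cos_i = dz / math.sqrt(r2)   # both normals are along +/- z,
        cos_j = dz / math.sqrt(r2)   # so both cosines are d-component / r
        total += area_j * cos_i * cos_j / (math.pi * r2)   # g(x, y), with V = 1
    return total / n

print(form_factor_parallel_squares(1.0))
```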

