Monte Carlo Integration Robert Lin April 20, 2004.

Outline: Integration Applications; Random Variables, Probability, Expected Value, Variance; Integration Approximation; Monte Carlo Integration; Variance Reduction (sampling methods)

Integration Applications Antialiasing

Integration Applications Soft Shadows

Integration Applications Indirect Lighting

Random Variables, Probability Density Function Continuous random variable x: a scalar or vector quantity that randomly takes on a value in (-∞, +∞). The probability density function p associated with x (denoted x ~ p) describes the distribution of x. Properties: p(x) ≥ 0 for all x, ∫ p(x) dx = 1, and the probability that x falls in [a, b] is P(a ≤ x ≤ b) = ∫_a^b p(x) dx.

Random Variables, Probability Density Function Example: Let ε be a random variable taking on values in [0, 1) uniformly. Its probability density function q (ε ~ q) is q(x) = 1 for x in [0, 1) and 0 otherwise. The probability that ε falls in an interval [a, b] within [0, 1) is P(a ≤ ε ≤ b) = ∫_a^b q(x) dx = b - a.
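As a quick numerical check (not part of the original slides), a minimal Python sketch that estimates P(a ≤ ε ≤ b) by sampling and compares it to b - a; the helper name and the endpoints 0.3 and 0.7 are arbitrary illustrative choices:

    import random

    def prob_in_interval(a, b, n=100000):
        # Count how often a uniform sample on [0, 1) lands in [a, b].
        hits = sum(1 for _ in range(n) if a <= random.random() <= b)
        return hits / n

    print(prob_in_interval(0.3, 0.7))  # close to 0.7 - 0.3 = 0.4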

Expected Value The average value of a function f(x) under a probability density function (pdf) p(x) is called the expected value: E[f(x)] = ∫ f(x) p(x) dx. The expected value of a 1D random variable itself is obtained by letting f(x) = x: E[x] = ∫ x p(x) dx. Expected value properties: 1. E[f(x) + g(x)] = E[f(x)] + E[g(x)]. 2. E[a f(x)] = a E[f(x)] for any constant a.

Multidimensionality Random variables and expected values extend naturally to multiple dimensions. Let S be a multidimensional space with measure μ, and let x be a random variable on S with pdf p. The probability that x takes on a value in a region S_i, a subset of S, is P(x ∈ S_i) = ∫_{S_i} p(x) dμ.

Multidimensionality Example: Let α be a 2D random variable uniformly distributed on a disk of radius R. Then p(α) = 1 / (πR²), so the probability that α falls in a subregion of the disk is the area of that subregion divided by πR².
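As an illustrative sketch (my addition, not in the slides), points can be drawn uniformly on such a disk with the standard polar transform r = R·sqrt(u), θ = 2πv, and the fraction landing in any subregion then approaches its area divided by πR²; the radius 2.0 and the left-half test region are arbitrary choices:

    import math
    import random

    def sample_disk(R):
        # Uniform point on a disk of radius R: r = R*sqrt(u) keeps the area density constant.
        r = R * math.sqrt(random.random())
        theta = 2.0 * math.pi * random.random()
        return r * math.cos(theta), r * math.sin(theta)

    R, n = 2.0, 100000
    left = sum(1 for _ in range(n) if sample_disk(R)[0] < 0.0)
    print(left / n)  # close to 0.5, the area fraction of the left half of the disk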

Multidimensionality Example: Given the unit square S = [0, 1] x [0, 1] and the pdf p(x, y) = 4xy, the expected value of the x coordinate is found by setting f(x, y) = x: E[x] = ∫_0^1 ∫_0^1 x · 4xy dx dy = 4 (∫_0^1 x² dx)(∫_0^1 y dy) = 4 · (1/3) · (1/2) = 2/3.
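A numerical cross-check of this result (added here, not in the slides): because p(x, y) = 4xy factors into marginals 2x and 2y, each coordinate can be sampled by inverse transform (CDF t², inverse sqrt(u)), and averaging the sampled x coordinates reproduces 2/3:

    import math
    import random

    def sample_xy():
        # p(x, y) = 4xy factors as (2x)(2y); each marginal has CDF t^2, so invert with sqrt(u).
        return math.sqrt(random.random()), math.sqrt(random.random())

    n = 100000
    print(sum(sample_xy()[0] for _ in range(n)) / n)  # close to E[x] = 2/3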

Variance The variance of a random variable x is defined as the expected value of the squared difference between x and E[x]: σ² = E[(x - E[x])²]. Some algebra converts this to the form σ² = E[x²] - (E[x])².
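A small sketch (added for illustration) confirming that the two forms agree on a set of samples; the uniform [0, 1) draws are an arbitrary choice:

    import random

    xs = [random.random() for _ in range(100000)]
    mean = sum(xs) / len(xs)
    var_def = sum((x - mean) ** 2 for x in xs) / len(xs)      # E[(x - E[x])^2]
    var_alt = sum(x * x for x in xs) / len(xs) - mean * mean  # E[x^2] - E[x]^2
    print(var_def, var_alt)  # both close to 1/12 for the uniform [0, 1) distribution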

Integration Problems Integrals for rendering can be difficult to evaluate: they are multi-dimensional, and the integrands are often non-continuous (e.g., highlights, occluders).

Integration Approximation How do we evaluate the integral of f(x)?

Integration Approximation Can approximate using another function g(x)

Integration Approximation Can approximate by taking the average value

Integration Approximation Estimate the average by taking N samples

Monte Carlo Integration The basic Monte Carlo estimate of ∫_a^b f(x) dx is I_m = ((b - a) / N) Σ_{i=1}^N f(x_i), where I_m is the Monte Carlo estimate, N is the number of samples, and x_1, x_2, …, x_N are uniformly distributed random numbers between a and b.
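A minimal Python sketch of this estimator (added here; the helper name and the test integrand sin(x) on [0, π], whose exact integral is 2, are illustrative choices):

    import math
    import random

    def mc_integrate(f, a, b, n):
        # I_m = (b - a)/N times the sum of f at N uniform samples in [a, b].
        return (b - a) / n * sum(f(random.uniform(a, b)) for _ in range(n))

    print(mc_integrate(math.sin, 0.0, math.pi, 100000))  # close to the exact value 2.0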

Monte Carlo Integration

We have the definition of the expected value and a way to estimate it from samples. Since an integral can be rewritten as an expected value, the same sum approximates the integral: to evaluate ∫ g(x) dx, choose a pdf p and substitute g(x) = (g(x)/p(x)) p(x), so that ∫ g(x) dx = E[g(x)/p(x)] ≈ (1/N) Σ_{i=1}^N g(x_i)/p(x_i) with x_i ~ p.

Variance The variance describes how much the sampled values vary from each other, and hence how noisy the estimate is. It is proportional to 1/N: for the estimator above, Var[I_m] = ((b - a)² / N) Var[f(x)].

Variance The standard deviation is just the square root of the variance, so it is proportional to 1 / sqrt(N): we need 4X the samples to halve the error.
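A rough way to see the 1/sqrt(N) behaviour (added for illustration, reusing the sin(x) test integrand from the sketch above): repeat the estimate many times for N and 4N samples and compare the spread of the results.

    import math
    import random
    import statistics

    def mc_integrate(f, a, b, n):
        return (b - a) / n * sum(f(random.uniform(a, b)) for _ in range(n))

    def spread(n, trials=200):
        # Standard deviation of repeated Monte Carlo estimates that use n samples each.
        return statistics.stdev(mc_integrate(math.sin, 0.0, math.pi, n) for _ in range(trials))

    print(spread(1000), spread(4000))  # the second value is roughly half the first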

Variance Problem: variance (noise) decreases slowly; using more samples only removes a small amount of noise.

Variance Reduction There are several ways to reduce the variance: Importance Sampling, Stratified Sampling, Quasi-random Sampling, Metropolis Random Mutations.

Importance Sampling Idea: use more samples in important regions of the function. If the function is large in small regions, place more samples there.

Importance Sampling With samples x_i ~ p, the estimate is (1/N) Σ g(x_i)/p(x_i), so we want g/p to have low variance. Choose a pdf p with a shape similar to g, so that g/p is nearly constant.
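A sketch of the idea (added here, not the slides' own example): to estimate ∫_0^1 x² dx = 1/3, drawing samples from p(x) = 2x (inverse CDF sqrt(u)) and averaging g/p = x/2 fluctuates noticeably less than uniform sampling, because p has a shape similar to g.

    import math
    import random

    def uniform_estimate(n):
        # Uniform sampling: average g(x) = x^2 over x ~ U[0, 1); the variance of g is relatively large.
        return sum(random.random() ** 2 for _ in range(n)) / n

    def importance_estimate(n):
        # Importance sampling with p(x) = 2x: draw x = sqrt(u), average g(x)/p(x) = x/2.
        return sum(math.sqrt(random.random()) / 2.0 for _ in range(n)) / n

    n = 10000
    print(uniform_estimate(n), importance_estimate(n))  # both near 1/3; the second fluctuates less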

Stratified Sampling Partition S into smaller domains S_i and evaluate the integral as the sum of the integrals over the S_i: ∫_S f = Σ_i ∫_{S_i} f, with each sub-integral estimated from samples placed inside S_i. Example: jittering for pixel sampling. Often works much better than importance sampling in practice.
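A 1D sketch of stratification (added for illustration, again using the sin(x) test integrand): split [a, b] into N equal strata and take one jittered sample per stratum; compared with N independent uniform samples, the estimate is typically much closer to the true value.

    import math
    import random

    def stratified_integrate(f, a, b, n):
        # One jittered sample in each of n equal-width strata of [a, b].
        width = (b - a) / n
        return width * sum(f(a + (i + random.random()) * width) for i in range(n))

    print(stratified_integrate(math.sin, 0.0, math.pi, 100))  # very close to the exact value 2.0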

Examples

Conclusion Monte Carlo Integration Pros: good for estimating integrals with many dimensions; good for estimating integrals of complex functions; a general integration method with many applications. Monte Carlo Integration Cons: variance decreases slowly (error appears as noise); variance can be reduced with importance sampling, stratified sampling, etc., and other methods (filtering) can be used to remove remaining noise.

References
Peter Shirley, R. Keith Morley. Realistic Ray Tracing. Natick, MA: A K Peters, Ltd., 2003, pages 47-51.
Henrik Wann Jensen. Realistic Image Synthesis Using Photon Mapping. Natick, MA: A K Peters, Ltd., 2001.
Pat Hanrahan. Monte Carlo Integration 1 (lecture notes).
Thomas Funkhouser. Monte Carlo Integration For Image Synthesis.
Eric Veach. Robust Monte Carlo Methods for Light Transport Simulation. Ph.D. Thesis, Stanford University, Dec. 1997.