
Summer Seminar Ruizhen Hu

Sampling
– Spectral Sampling of Manifolds (SIGGRAPH Asia 2010)
– Accurate Multidimensional Poisson-Disk Sampling (TOG)
– Efficient Maximal Poisson-Disk Sampling
– Blue-Noise Point Sampling using Kernel Density Model
– Differential Domain Analysis for Non-uniform Sampling
Noise & filtering
– Filtering Solid Gabor Noise
– Accelerating Spatially Varying Gaussian Filters

Spectral Sampling of Manifolds A. Cengiz Öztireli Marc Alexa Markus Gross ETH Zürich TU Berlin ETH Zürich

Authors CENGİZ ÖZTİRELİ PhD Candidate Computer Graphics Laboratory ETH Zürich Research interests: Reconstruction, sampling and processing of surfaces, and sketch based modeling. Marc Alexa Professor Electrical Engineering & Computer Science TU Berlin Markus Gross Professor Department of Computer Science ETH Zürich Research interests: Computer graphics, image generation, geometric modeling, computer animation, and scientific visualization

Motivation Goal: finding optimal sampling conditions for a given surface representation Work: propose a new method to solve this problem based on spectral analysis of manifolds, kernel methods and matrix perturbation theory

Contributions – Efficient, simple to implement, easy to control through intuitive parameters, feature sensitive – Results in accurate reconstructions with kernel-based approximation methods and high-quality isotropic samplings – A discrete spectral analysis of manifolds using results from kernel methods and matrix perturbation theory

Main Algorithm Input: a set of points lying near a manifold, with normals; together with a kernel function definition, this gives a continuous surface

Algorithms for Sampling Subsampling: measure the effect of a point on the manifold using the Laplace-Beltrami spectrum; the measure s(x) quantifies the contribution of a point to the manifold definition

Algorithms for Sampling Resampling: maximizing and equalizing s(x) for all points – use local operations and move points in a simple gradient ascent

Results

Conclusions New algorithms for the simplification and resampling of manifolds depending on a measure that restricts changes to the Laplace-Beltrami spectrum Limitations: the algorithms are greedy and thus not theoretically guaranteed to give the optimal sampling Future Directions: – texture on a surface – isotropic adaptive remeshing

Accurate Multi-Dimensional Poisson-Disk Sampling Manuel N. Gamito Steve Maddock Lightwork Design Ltd The University of Sheffield

Authors Manuel Noronha Gamito software engineer Lightwork Design Ltd Steve Maddock Senior Lecturer The University of Sheffield Research interests: character animation, specifically modelling and animating faces

Poisson-Disk Sampling Definition: – Each sample is placed with uniform probability density – No two samples are closer than 2r, where r is some chosen distribution radius – A distribution is maximal if no more samples can be inserted Poisson-Disk sampling is useful for: – Distributed ray tracing [Cook 1986; Hachisuka et al. 2008] – Object placement and texturing [Lagae and Dutré 2006; Cline et al. 2009] – Stippling and dithering [Deussen et al. 2000; Secord et al. 2002] – Global Illumination [Lehtinen et al. 2008]
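As a minimal sketch of this definition (brute-force dart throwing in the unit square; illustrative names, not the paper's accelerated algorithm):

    import math, random

    def dart_throwing(r, max_failures=100000):
        """Brute-force Poisson-disk sampling in the unit square.
        Candidates are drawn with uniform probability density and rejected
        if they fall closer than 2*r to an existing sample; a long run of
        consecutive rejections is used as a crude stand-in for maximality."""
        samples, failures = [], 0
        while failures < max_failures:
            p = (random.random(), random.random())
            if all(math.dist(p, q) >= 2 * r for q in samples):
                samples.append(p)
                failures = 0
            else:
                failures += 1
        return samples

    points = dart_throwing(r=0.02)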

Previous Methods Approximate Methods – Relax at least one of the sampling conditions Accurate Methods – Brute force Dart Throwing [Dippé and Wold 1985] – Assisted by a spatial data structure Voronoi diagram [Jones 2006] Scalloped sectors [Dunbar and Humphreys 2006] Uniform grid [Bridson 2007] Simplified subdivision tree and uniform grid [White et al. 2007]

Main Algorithm

Radius vs. Number of Samples A distribution can be specified by supplying either – The distribution radius r – The desired number of samples N When the number of samples is specified – The algorithm uses a radius r based on N and on the measured packing density of sample disks – The packing density was obtained by averaging the packing densities measured from 100 distributions generated by our algorithm – The number of samples of the resulting maximal distribution is approximately equal to the desired number N (error < 5%)
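A hedged sketch of that radius computation in 2D (the packing-density constant below is a stand-in, not the value measured in the paper):

    import math

    def radius_for_sample_count(n_samples, domain_area=1.0, packing_density=0.55):
        """Pick a distribution radius r so that a maximal distribution holds
        roughly n_samples points: n_samples * pi * r^2 ~= packing_density * area."""
        return math.sqrt(packing_density * domain_area / (n_samples * math.pi))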

Results (table: number of samples, sampling time, samples per second)

Results

Conclusions A Poisson-Disk Sampling Algorithm that – Is statistically correct (see proof in paper) – Is efficient through the use of a subdivision tree – Works in any number of dimensions Subject to available physical memory – Generates maximal distributions – Allows approximate control over the number of samples – Can enforce periodic or wall boundary conditions on the boundaries of the domain

Future Work Make it multi-threaded – Distant parts of the domain can be sampled in parallel with different threads – Some synchronisation between threads is still required Generate non-uniform distributions – Have the distribution radius be a function of the position in the domain Work over irregular domains – Discard subdivided tree nodes that fall outside the domain

Efficient Maximal Poisson-Disk Sampling

Authors Mohamed S. Ebeida Postdoctoral researcher Carnegie Mellon University Anjul Patney PhD student University of California, Davis Scott A. Mitchell Principal Member of Technical Staff Sandia National Laboratories Andrew Davidson PhD student University of California, Davis Patrick M. Knupp Distinguished Member of Technical Staff Sandia National Laboratories John D. Owens Associate Professor University of California, Davis

Work generating a uniform Poisson-disk sampling that is both maximal and unbiased over bounded non-convex domains

Motivation Maximal Poisson-disk sampling distributions: – Avoid aliasing – Have blue noise property Bias-free: – Crucial in fracture propagation simulations

Conditions Maximal: the sample disks together cover the whole domain, leaving no room to insert an additional point Bias-free: the likelihood of a sample being inside any subdomain is proportional to the area of the subdomain, provided the subdomain is completely outside all prior samples’ disks

Previous methods relax the unbiased or maximal conditions, or require potentially unbounded time or space Dart-throwing – unbiased but not maximal Tile-based – biased and requires relatively large storage

Main Algorithm First phase: – an unbiased, near-maximal covering – voids: the part of a grid cell outside all circles Second phase: – completes the maximal covering – darts are thrown directly into the voids, maintaining the bias-free condition – A maximal distribution is achieved when the domain is completely covered, leaving no room for new points to be selected

Sequential Sampling 1. Generate a background grid; mark interior and boundary cells 2. Phase I: throw darts into square cells; remove hit cells 3. Generate polygonal approximations to the remaining voids 4. Phase II: throw darts into voids; update remaining areas
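A structural sketch of those two phases in Python (illustrative only: it replaces the paper's void tracking and polygonal approximations with a simple per-cell retry budget, so it is neither provably maximal nor bias-free):

    import math, random

    def two_phase_sampling(r, phase1_tries=10, phase2_tries=1000):
        """Phase I: throw a few darts into each uncovered grid cell.
        Phase II: keep throwing only into the cells that may still hold a void."""
        cell = r / math.sqrt(2)                  # at most one sample per cell
        n = int(math.ceil(1.0 / cell))
        samples = []
        active = {(i, j) for i in range(n) for j in range(n)}

        def conflict(p):
            return any(math.dist(p, q) < r for q in samples)

        def throw(tries):
            for ij in list(active):
                i, j = ij
                for _ in range(tries):
                    p = ((i + random.random()) * cell, (j + random.random()) * cell)
                    if not conflict(p):
                        samples.append(p)
                        active.discard(ij)
                        break

        throw(phase1_tries)   # Phase I: unbiased, near-maximal covering
        throw(phase2_tries)   # Phase II: concentrate on the remaining voids
        return samples

In the paper, Phase II instead throws darts directly into polygonal approximations of the remaining voids, which is what keeps the final distribution both maximal and bias-free.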

Algorithm through Phase I

Voids Polygonal approximations to arc-voids

Results

Implementation Performance

Conclusions An efficient algorithm for maximal Poisson-disk sampling in two dimensions – the final result is provably maximal – the sampling is unbiased – it is O(n log n) in expected time – it is O(n) in deterministic memory required – not limited to convex domains – efficiently implemented in both sequential and parallel forms Future work: 3D maximal Poisson-disk sampling algorithm

Blue-Noise Point Sampling using Kernel Density Model Raanan Fattal Hebrew University of Jerusalem, Israel

Author Raanan Fattal Alon faculty member School of Computer Science and Engineering The Hebrew University of Jerusalem

Work A new approach for generating point sets with high-quality blue noise properties that formulates the problem using a statistical mechanics interacting particle model

Contributions – Present a new approach that formulates the problem using a statistical mechanics interacting particle model – Derive a highly efficient multi-scale sampling scheme for drawing random point distributions – Avoid the critical slowing-down phenomenon that plagues this type of model

Previous work Dart throwing – constrain a minimal distance between every pair of points Relaxation – follow a greedy strategy that maximizes this distance – two main shortcomings: imprecise termination and apparent blur

New Approach Model the target density as a sum of nonnegative radially-symmetric kernels The j-th kernel is centered around the point x_j:

New Approach The error of this approximation is denoted E. Minimizing E with respect to the kernel centers – gives a configuration equivalent to the one obtained by converged Lloyd’s iterations – has the ability to achieve spectral enhancement We unify error minimization and randomness by defining a statistical mechanics particle model using E

New Approach Assigning each configuration a probability density according to the following Boltzmann-Gibbs distribution:
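A hedged reconstruction of the two quantities referenced here (the exact kernel form, normalization, and temperature schedule follow the paper, not this sketch):

    E(X) = \int_{\Omega} \Big( \rho(\mathbf{x}) - \sum_j K_j(\mathbf{x}) \Big)^2 d\mathbf{x},
    \qquad
    P(X) \propto \exp\!\big( -E(X) / T \big),

where X = {x_1, ..., x_n} is the point configuration, ρ is the target density, and the temperature T trades randomness off against error minimization.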

Drawing samples Markov-chain Monte Carlo (MCMC) Gibbs sampler Langevin method Metropolis-Hastings (MH) test
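A minimal sketch of one such update (a random-walk Metropolis-Hastings move of a single point; the energy callable stands in for the model's E and is not Fattal's implementation):

    import math, random

    def mh_move(points, i, energy, step=0.01, temperature=1.0):
        """Propose a small Gaussian displacement of point i and accept it
        with probability min(1, exp(-(E_new - E_old) / T))."""
        old = points[i]
        e_old = energy(points)
        points[i] = (old[0] + random.gauss(0, step), old[1] + random.gauss(0, step))
        e_new = energy(points)
        if random.random() >= math.exp(min(0.0, -(e_new - e_old) / temperature)):
            points[i] = old        # reject: restore the previous position
        return points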

Critical slowing down

Multi-scale sampling

Results

Differential Domain Analysis for Non-uniform Sampling Li-Yi Wei Rui Wang Microsoft Research University of Massachusetts Amherst

Authors Li-Yi Wei Researcher Microsoft Research Rui Wang Assistant Professor Department of Computer Science University of Massachusetts

Work new methods for analyzing non-uniform sample distributions

Previous work Two common methodologies exist for evaluating the quality of samples – spatial uniformity: discrepancy & relative radius – power spectrum analysis, including radial mean and anisotropy However, existing methods are primarily designed for uniform Euclidean domains and cannot be easily extended to general non-uniform scenarios, such as – adaptive – anisotropic – non-Euclidean domains

Contributions A reformulation of standard Fourier spectrum analysis into a form that depends on sample location differentials A generalization of this basic formulation, including different distance transformations for various domains, and range selection for better control of quality and speed Applications in spectral and spatial analysis for non-uniform sample distributions

Core idea Fourier power spectrum – Fourier transform: – power spectrum: Differential representation:
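In standard form (a sketch, up to normalization conventions), the quantities referred to here are, for samples s_1, ..., s_N:

    F(\mathbf{f}) = \sum_{j=1}^{N} e^{-2\pi i\, \mathbf{f} \cdot \mathbf{s}_j},
    \qquad
    P(\mathbf{f}) = \frac{1}{N} |F(\mathbf{f})|^2
                  = \frac{1}{N} \sum_{j} \sum_{k} e^{-2\pi i\, \mathbf{f} \cdot (\mathbf{s}_j - \mathbf{s}_k)},

so the power spectrum depends on the samples only through the differentials d = s_j − s_k, which is exactly what the differential representation p(d) accumulates.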

Core idea Integral form: General kernel: – Range selection (for computational reasons): where d(s, s’) is a local distance measure of s’ with respect to the local frame centered at s. In uniform Euclidean domains, d(s, s’) = s’ − s.

Core idea Non-uniform domain: where T is a differential domain transformation function that locally warps each d from the non-uniform domain to a (hypothetical) uniform one

Kernel selection A cos kernel may amplify some information while obscuring others A Gaussian kernel, in contrast, displays the main peak clearly without undulations: it can manifest the distribution properties more clearly, e.g. characteristic structures become more apparent

Spectral Analysis Exact – Isotropic Euclidean domain – Anisotropic Euclidean domain Range selection Radial measures – We can compute the circular average and variance of p(d) – The former gives the radial mean, indicating the overall distance-based property of p(d) – The latter gives the anisotropy, which reveals if there is any directional bias/structure in the distribution
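A small sketch of those radial measures (assuming p(d) has been accumulated on a square grid centred at d = 0; the binning and the anisotropy normalization are illustrative):

    import numpy as np

    def radial_measures(p, n_bins=64):
        """Radial mean (circular average) and anisotropy (relative radial
        variance) of a 2D distribution p sampled on a grid centred at d = 0."""
        h, w = p.shape
        y, x = np.indices((h, w))
        r = np.hypot(x - (w - 1) / 2.0, y - (h - 1) / 2.0)
        bins = np.minimum((r / (r.max() + 1e-12) * n_bins).astype(int), n_bins - 1)
        radial_mean = np.zeros(n_bins)
        anisotropy = np.zeros(n_bins)
        for b in range(n_bins):
            ring = p[bins == b]
            if ring.size:
                radial_mean[b] = ring.mean()
                anisotropy[b] = ring.var() / max(radial_mean[b] ** 2, 1e-12)
        return radial_mean, anisotropy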

Comparisons How does our method relate and compare to traditional Fourier spectrum analysis?

Results

Filtering Solid Gabor Noise Ares Lagae George Drettakis Katholieke Universiteit Leuven REVES/INRIA Sophia-Antipolis

Authors Ares Lagae Postdoctoral Fellow Computer Graphics Research Group Katholieke Universiteit Leuven George Drettakis Group Leader REVES/Inria Sophia-Antipolis

Work we show that a slicing approach is required to preserve continuity across sharp edges, and we present a new noise function that supports anisotropic filtering of sliced solid noise

Filtering Filtering the noise on the surface using frequency clamping works better if the power spectrum of the noise on the surface is bandpass. The Fourier Slice Theorem: projecting in the spatial domain corresponds to slicing in the frequency domain, and vice versa

Previous works Perlin noise [Perlin 1985]: – use slicing and frequency clamping – Filtering introduces an aliasing vs. detail loss trade-off since the power spectrum of Perlin noise is not band-pass Wavelet noise [Cook and DeRose 2005]: – use projection and frequency clamping – Even though the power spectrum of the noise on the surface is band-pass, filtering does not fully solve the aliasing vs. detail loss trade-off Gabor noise [Lagae et al. 2009]: – use projection and a filtering approach specific to Gabor noise – This results in high-quality anisotropic filtering, but introduces discontinuities at sharp edges

New noise Solid random-phase Gabor noise: – uses slicing (preserves continuity at sharp edges) – Since filtering is inherently a 2D operation, we have to explicitly model the slicing of the 3D Gabor kernels to be able to filter the resulting 2D Gabor kernels. This requires the introduction of a new Gabor kernel, the phase-augmented Gabor kernel, and a new Gabor noise, random-phase Gabor noise

Slicing Solid Gabor Noise The Gabor kernel of Lagae et al. [2009] is not closed under slicing:

Phase-augmented Gabor kernel The phase-augmented Gabor kernel is closed under slicing: solid random-phase Gabor noise (also closed under slicing):
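As a rough sketch of the kernels involved (the parameterization differs from the paper; K is an amplitude, a a bandwidth, F_0 and ω a frequency and direction, φ a phase):

    g(\mathbf{x}) = K\, e^{-\pi a^2 \|\mathbf{x}\|^2} \cos\big( 2\pi F_0\, \boldsymbol{\omega} \cdot \mathbf{x} \big),
    \qquad
    g_{\varphi}(\mathbf{x}) = K\, e^{-\pi a^2 \|\mathbf{x}\|^2} \cos\big( 2\pi F_0\, \boldsymbol{\omega} \cdot \mathbf{x} + \varphi \big).

Restricting the 3D Gaussian-times-cosine to a plane again gives a Gaussian times a cosine, but in general with a nonzero phase, which is why only the phase-augmented form is closed under slicing.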

Slicing Random-Phase Gabor Noise Solid random-phase Gabor noise is also closed under slicing; the statistical properties of a sliced solid random-phase Gabor noise are obtained using the analytical expressions for the statistical properties of a 2D random-phase Gabor noise

Filtering Sliced Solid Gabor Noise

Conclusion A new procedural noise function, random-phase Gabor noise, that supports – continuity across sharp edges – high-quality anisotropic filtering – Anisotropy Future work: – exploring volumetric filtering of solid Gabor noise – further exploring anisotropy in the context of solid noise – designing user interfaces for interacting with anisotropic solid noise

Results

Accelerating Spatially Varying Gaussian Filters Jongmin Baek David E. Jacobs Stanford University

Authors Jongmin Baek PhD student Stanford University David E. Jacobs PhD candidate Stanford University

Motivation (images: input, Gaussian filter, spatially varying Gaussian filter)

Roadmap 1) Gaussian Filters 2) Spatially Varying Gaussian Filters 3) Accelerating Spatially Varying Gaussian Filters 4) Applications

Gaussian Filters Position Value

Gaussian Filters Each output value …

Gaussian Filters … is a weighted sum of input values …

Gaussian Filters … whose weight is a Gaussian …

Gaussian Filters … in the space of the associated positions.
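In symbols (a sketch, with p_i the position and v_i the value of sample i, and Σ the kernel covariance):

    \hat{v}_i = \frac{ \sum_j \exp\!\big( -\tfrac{1}{2} (p_i - p_j)^{\top} \Sigma^{-1} (p_i - p_j) \big)\, v_j }
                     { \sum_j \exp\!\big( -\tfrac{1}{2} (p_i - p_j)^{\top} \Sigma^{-1} (p_i - p_j) \big) }.

The choice of position space is what distinguishes the uses on the next slides: pixel coordinates give a Gaussian blur, pixel coordinates augmented with intensity give the bilateral filter, and patch descriptors give non-local means.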

Gaussian Blur Gaussian Filters: Uses

Bilateral Filter Gaussian Filters: Uses

Non-local Means Filter Gaussian Filters: Uses

Applications  Denoising images and meshes  Data fusion and upsampling  Abstraction / Stylization  Tone-mapping ... Gaussian Filters: Summary Previous work on fast Gaussian Filters  Bilateral Grid (Chen, Paris, Durand; 2007)  Gaussian KD-Tree (Adams et al.; 2009)  Permutohedral Lattice (Adams, Baek, Davis; 2010)

Gaussian Filters: Implementations Summary of Previous Implementations: – A separable blur flanked by resampling operations – Exploit the separability of the Gaussian kernel

Spatially Varying Gaussian Filters A spatially varying covariance matrix, versus the spatially invariant case

Spatial Variance in Previous Work Trilateral Filter (Choudhury and Tumblin, 2003) – Tilt the kernel of a bilateral filter along the image gradient – “Piecewise linear” instead of “piecewise constant” model

Spatially Varying Gaussian Filters: Tradeoff Benefits:  Can adapt the kernel spatially.  Better filtering performance. Cost:  No longer separable.  No existing acceleration schemes. Input Bilateral-filtered Trilateral-filtered

Problem:  Spatially varying (thus non-separable) Gaussian filter Existing Tool:  Fast algorithms for spatially invariant Gaussian filters Solution:  Re-formulate the problem to fit the tool.  Need to obey the “piecewise-constant” assumption Acceleration

Naïve Approach (Toy Example) Filter the entire input signal once with each desired kernel (filtered w/ kernel 1, 2, 3, 4), then assemble the output signal from the copy filtered with each pixel’s own kernel

Challenge #1 In practice, the # of kernels needed can be very large (the desired kernel K(x) varies with pixel location x)

Solution #1 Sample a few kernels (K1, K2, K3) along the range of desired kernels K(x) and interpolate the filtered results

Assumptions Interpolation needs an extra assumption to work: – The covariance matrix Σ_i is either piecewise-constant or smoothly varying – The kernel is spatially varying, but locally spatially invariant

Challenge #2 Runtime scales with the # of sampled kernels. Filter only some regions of the image with each kernel (its “support”)

Defining the Support In this example, x needs to be in the support of K1 & K2

Dilating the Support (figure: desired kernel K(x) over pixel location x, with sampled kernels K1–K3)

Algorithm 1) Identify kernels to sample. 2) For each kernel, compute the support needed. 3) Dilate each support. 4) Filter each dilated support with its kernel. 5) Interpolate from the filtered results.
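A 1D toy sketch of those five steps (hypothetical names; for brevity it skips the support computation and dilation of steps 2–3 and filters the whole signal with each sampled kernel, which is exactly the cost the supports are meant to avoid):

    import numpy as np

    def kernel_sampling_filter(signal, sigmas, sampled_sigmas):
        """Filter the signal once per sampled kernel width, then linearly
        interpolate the filtered copies at each pixel according to that
        pixel's desired width sigmas[i]."""
        def gaussian_blur(x, sigma):              # naive spatially invariant filter
            radius = int(3 * sigma) + 1
            t = np.arange(-radius, radius + 1)
            k = np.exp(-t**2 / (2 * sigma**2))
            return np.convolve(x, k / k.sum(), mode="same")

        sampled_sigmas = np.asarray(sampled_sigmas, dtype=float)
        filtered = np.stack([gaussian_blur(signal, s) for s in sampled_sigmas])
        out = np.empty(len(signal))
        for i, s in enumerate(sigmas):            # interpolate per pixel
            j = np.clip(np.searchsorted(sampled_sigmas, s), 1, len(sampled_sigmas) - 1)
            s0, s1 = sampled_sigmas[j - 1], sampled_sigmas[j]
            w = (s - s0) / (s1 - s0)
            out[i] = (1 - w) * filtered[j - 1, i] + w * filtered[j, i]
        return out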

Applications  HDR Tone-mapping  Joint Range Data Upsampling

Application #1: HDR Tone-mapping Pipeline: filter the input HDR into a base layer plus detail, attenuate the base, and recombine for the output
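A compact sketch of that pipeline for a single-channel HDR image (the brute-force bilateral filter and the fixed compression factor are stand-ins for the paper's accelerated, spatially varying filter):

    import numpy as np

    def bilateral_filter(img, sigma_s=4.0, sigma_r=0.4):
        """Naive brute-force bilateral filter on a small grayscale float image."""
        h, w = img.shape
        out = np.zeros_like(img)
        rad = int(2 * sigma_s)
        for y in range(h):
            for x in range(w):
                y0, y1 = max(0, y - rad), min(h, y + rad + 1)
                x0, x1 = max(0, x - rad), min(w, x + rad + 1)
                patch = img[y0:y1, x0:x1]
                yy, xx = np.mgrid[y0:y1, x0:x1]
                wgt = np.exp(-((yy - y) ** 2 + (xx - x) ** 2) / (2 * sigma_s ** 2)
                             - (patch - img[y, x]) ** 2 / (2 * sigma_r ** 2))
                out[y, x] = (wgt * patch).sum() / wgt.sum()
        return out

    def tonemap(hdr, compression=0.4):
        """Filter log-luminance into a base layer, attenuate the base,
        keep the detail, and recombine."""
        log_lum = np.log10(hdr + 1e-6)
        base = bilateral_filter(log_lum)          # large-scale structure
        detail = log_lum - base                   # preserved unchanged
        return 10 ** (compression * base + detail)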

Tone-mapping Example Bilateral Filter Kernel Sampling

Application #2: Joint Range Data Upsampling Range finder data is sparse, unstructured, and noisy; filter it jointly with the scene image to produce the upsampled output

Synthetic Example Scene Image Ground Truth Depth

Synthetic Example Scene Image Simulated Sensor Data

Synthetic Example: Result Kernel Sampling Bilateral Filter

Synthetic Example: Relative Error Bilateral Filter: 2.41% mean relative error Kernel Sampling: 0.95% mean relative error

Real-World Example Scene Image Range Finder Data *Dataset courtesy of Jennifer Dolson, Stanford University

Real-World Example: Result Input Bilateral Naive Kernel Sampling

Performance (table: running times in seconds for the tone-mapping and depth-upsampling examples, comparing Kernel Sampling, Kernel Sampling without segmentation, Choudhury and Tumblin (2003), and the naïve approach)

Conclusion 1. A generalization of Gaussian filters – Spatially varying kernels – Lose the piecewise-constant assumption 2. Acceleration via Kernel Sampling – Filter only necessary pixels (and their support) and interpolate 3. Applications