Smoothing 3D Meshes using Markov Random Fields


Vedrana Andersen

Overview
Aim: to investigate the use of Markov Random Fields (MRFs) for formulating priors on 3D surfaces represented as triangle meshes.
Focus: mesh smoothing and feature-preserving mesh smoothing (preserving surface ridges); the vertex process and the edge process.

MRF THEORY: Markov Random Fields
Random fields: sites, labels, labelings.
Markov Random Fields: Markovianity, neighborhood systems and cliques.
Markov-Gibbs equivalence: Gibbs Random Fields (the Gibbs distribution).
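The Markov-Gibbs equivalence (the Hammersley-Clifford theorem) can be stated explicitly; the notation below is the generic MRF notation, not copied from the slides:

```latex
% Gibbs distribution: the prior is determined by clique potentials V_c
P(f) = \frac{1}{Z}\exp\!\left(-\frac{U(f)}{T}\right),
\qquad
U(f) = \sum_{c \in \mathcal{C}} V_c(f),
\qquad
Z = \sum_{f'} \exp\!\left(-\frac{U(f')}{T}\right)
```

A random field is Markovian with respect to a neighborhood system exactly when its distribution is a Gibbs distribution over the cliques C of that system.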

MESH SMOOTHING: Smoothness Prior
Mesh smoothing as MRF labeling: sites are vertices, labels are the spatial positions of the vertices.
Absolute mean curvature, expressed through dihedral angles and edge lengths.
The potential of the smoothness prior.

MESH SMOOTHING: Smoothness Prior
The potential is assigned to all 4-cliques of vertices (the vertices of two triangles sharing an edge).
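One plausible reading of this edge-based potential is the integrated absolute mean curvature of an edge: edge length times the unsigned deviation of the dihedral angle from pi. A minimal sketch, assuming that form (the function names and the exact potential are illustrative, not the slides' definition):

```python
import math

def sub(a, b):
    return (a[0] - b[0], a[1] - b[1], a[2] - b[2])

def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def dot(a, b):
    return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]

def length(a):
    return math.sqrt(dot(a, a))

def smoothness_potential(v0, v1, v2, v3):
    """Potential of the 4-clique: (v0, v1) is the shared edge, v2 and v3 the
    opposite vertices of the two triangles.  Returns edge length times the
    angle between the two triangle normals, i.e. the unsigned deviation of
    the dihedral angle from pi (a flat configuration has zero potential)."""
    n_a = cross(sub(v1, v0), sub(v2, v0))  # normal of triangle (v0, v1, v2)
    n_b = cross(sub(v0, v1), sub(v3, v1))  # normal of triangle (v1, v0, v3)
    c = dot(n_a, n_b) / (length(n_a) * length(n_b))
    angle = math.acos(max(-1.0, min(1.0, c)))
    return length(sub(v1, v0)) * angle
```

A coplanar 4-clique yields zero, while a right-angle fold across a unit edge yields pi/2, so the potential penalizes sharp dihedral angles in proportion to edge length.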

MRF THEORY: MAP-MRF Labeling
The maximum a posteriori (MAP) solution, via Bayes' rule, expressed in terms of energy.
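In generic notation (d the observed data, f the labeling), the MAP estimate and its energy form read:

```latex
f^{*} = \arg\max_{f} P(f \mid d)
      = \arg\max_{f} \frac{P(d \mid f)\,P(f)}{P(d)}
      = \arg\min_{f} \bigl( E_d(f) + E_p(f) \bigr)
```

where the data energy $E_d(f) = -\log P(d \mid f)$ comes from the likelihood and the prior energy $E_p(f) = -\log P(f)$ from the MRF prior.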

MESH SMOOTHING: Likelihood Function
Input mesh: the underlying surface corrupted by noise.
Noise model: isotropic Gaussian.
The likelihood energy.
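For isotropic Gaussian noise with variance $\sigma^2$, writing $x_i$ for the current position of vertex $i$ and $x_i^0$ for its position in the noisy input mesh (generic notation), the likelihood energy is, up to an additive constant:

```latex
E_d(f) = \frac{1}{2\sigma^{2}} \sum_{i} \bigl\lVert x_i - x_i^{0} \bigr\rVert^{2}
```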

MESH SMOOTHING: Optimization
Metropolis sampler with simulated annealing:
- sampling of new candidate positions,
- the Metropolis acceptance criterion,
- a cooling scheme.
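The sampler can be sketched generically; a toy 1D energy stands in for the per-vertex posterior energy, and all names and parameter values here are illustrative assumptions:

```python
import math
import random

def metropolis_anneal(energy, x0, step, t0, alpha, iters, rng):
    """Metropolis sampler with geometric cooling: perturb the current state,
    always accept downhill moves, accept uphill moves with probability
    exp(-dE / T), and shrink the temperature T by alpha each iteration."""
    x, e = x0, energy(x0)
    t = t0
    for _ in range(iters):
        cand = x + rng.uniform(-step, step)       # new candidate position
        d_e = energy(cand) - e
        if d_e <= 0 or rng.random() < math.exp(-d_e / t):
            x, e = cand, e + d_e                  # Metropolis criterion
        t *= alpha                                # cooling scheme
    return x, e

# Toy 1D quadratic energy with minimum at 3.0, started far away.
rng = random.Random(0)
x, e = metropolis_anneal(lambda v: (v - 3.0) ** 2, 10.0, 0.5, 1.0, 0.99, 2000, rng)
```

The three tunable quantities match the slides' list: the sampling step size (`step`), and the initial temperature and annealing constant (`t0`, `alpha`).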

MESH SMOOTHING: Optimization
Optimization parameters: the sampling step size, the initial temperature and the annealing constant.
Modeling parameter: the weight of the data term.

MESH SMOOTHING: Results
A 10x10x10 cube corrupted with Gaussian noise (σ = 0.3 of the average edge length).
Very slow cooling, 500 iterations (Fig. 7.3).

MESH SMOOTHING: Results
Monitoring the potentials and the number of updates over time.

MESH SMOOTHING: Results
Smoothing at zero temperature, 100 iterations (Fig. 7.5).

MESH SMOOTHING: Discussion
Convergence and how to monitor it; sensitivity to the size of the sampling step.
Open questions: optimization? annealing? what is a smooth mesh?

MESH SMOOTHING: Alternative Formulations
Original formulation: dihedral angles and edge lengths.
Quadratic and square-root potentials.

MESH SMOOTHING: Alternative Formulations
Angle-based potential: indifferent to features.
Quadratic potential: over-smoothing.
Square-root potential: feature-preserving.

MESH SMOOTHING: Alternative Formulations
Results of using the curvature-based, quadratic and square-root potentials, 300 iterations (Fig. 7.15).

MESH SMOOTHING: Alternative Formulations
Feature preservation via a thresholded smoothness potential (implicit edge labeling).
Conclusion: control is achieved through the choice of the smoothness potential.

FEATURE DETECTION: Edge Labeling
Detecting feature edges: the ridges of the underlying surface.
Idea: use edge labels as weights for the smoothing process.
Based on edge sharpness and neighborhood support.

FEATURE DETECTION: Edge Labeling
Labels can be continuous or discrete; the labeling can be deterministic or stochastic.

FEATURE DETECTION: Ridge Sharpness
Sharpness threshold Φ0.
Deterministic case: thresholding, or a cut-off function.
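A minimal sketch of the two deterministic variants, assuming the cut-off ramps linearly from zero up to the sharpness threshold (the exact cut-off shape is an assumption, not the slides' definition):

```python
def hard_label(phi, phi0):
    """Deterministic thresholding: the edge is sharp (1) exactly when its
    sharpness measure phi reaches the threshold phi0."""
    return 1.0 if abs(phi) >= phi0 else 0.0

def cutoff_label(phi, phi0):
    """Cut-off function: ramps linearly from 0 (flat edge) to 1 at the
    threshold, producing a continuous label instead of a binary one."""
    return min(1.0, abs(phi) / phi0)
```

The continuous version is what makes the later use of labels as smoothing weights graceful near the threshold.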

FEATURE DETECTION: Ridge Sharpness
Stochastic case (MRF framework, assigning a sharpness potential to 1-edge cliques):
- linear potential,
- alternative: the difference from the cut-off function.

FEATURE DETECTION: Neighborhood Support
Support threshold θ0.
Idea: the presence of sharp edges in the neighborhood influences the labeling.

FEATURE DETECTION: Neighborhood Support
Stochastic case: a discrete linear formulation.
Alternative: penalizing label differences along a line.

FEATURE DETECTION: Edge Labeling Results
Labeling the edges of the fandisk model (Fig. 9.2).

FEATURE DETECTION: Two Questions
Neighborhood support: does it help? And if not, is edge sharpness needed at all?
Both questions are relevant for the coupled models.

FEATURE-PRESERVING MESH SMOOTHING: Coupled Model
Edge labels are used as weights.
Potentials contributing to the total posterior energy: smoothness, likelihood, and edge labeling (sharpness and neighborhood support).

FEATURE-PRESERVING MESH SMOOTHING: Coupled Model
Minimizing the total posterior energy, or alternating between the vertex process and the edge process.
Independent cooling schemes; the ordering of the vertex and edge processes.
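The alternating scheme can be sketched as a generic outer loop with independent cooling; the two sweep functions are placeholders for Metropolis passes over vertex positions and edge labels, and every name here is illustrative:

```python
def coupled_smoothing(vertices, labels, vertex_sweep, edge_sweep,
                      n_outer, t_vertex, t_edge, alpha_v, alpha_e):
    """Alternate one sweep of the vertex process (positions) and one sweep of
    the edge process (labels), each with its own temperature and cooling rate."""
    for _ in range(n_outer):
        vertices = vertex_sweep(vertices, labels, t_vertex)
        labels = edge_sweep(vertices, labels, t_edge)
        t_vertex *= alpha_v   # independent cooling schedules
        t_edge *= alpha_e
    return vertices, labels
```

Swapping the two calls inside the loop changes the ordering of the vertex and edge processes, which is one of the design choices the slides raise.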

FEATURE-PRESERVING MESH SMOOTHING: Results
Smoothing the noisy cube, noise 0.2 of the average edge length (Fig. 11.5).

FEATURE-PRESERVING MESH SMOOTHING: Results
Reconstructing the fandisk model, noise 0.2 of the average edge length (Fig. 11.7).

FEATURE-PRESERVING MESH SMOOTHING: Discussion
The two questions revisited: neighborhood support? edge labeling?
Control vs. automation.

FEATURE-PRESERVING MESH SMOOTHING: Possible Improvements
Sampling (shape of the proposal, adaptive step size).
Optimization (deterministic methods?).
A larger neighborhood for edge labeling.
Surface-to-surface smoothing; mesh optimization.
Future work: piecewise-quadratic surfaces.

Thank you!