MAP Estimation of Semi-Metric MRFs via Hierarchical Graph Cuts
M. Pawan Kumar, Daphne Koller

Presentation transcript:

Aim: To obtain accurate, efficient maximum a posteriori (MAP) estimation for Markov random fields (MRFs) with semi-metric pairwise potentials.

MAP Estimation
Given variables V and labels L, a labeling is a function f : {a, b, ...} → {1, ..., H}. MAP estimation minimizes the energy
  min_f Q(f),  Q(f) = ∑_a θ_a(f(a)) + ∑_(a,b) θ_ab(f(a), f(b)),
where θ_a(i) is the unary potential of assigning label i to variable v_a, and θ_ab(i,k) is the pairwise potential of assigning labels i and k to neighboring variables v_a and v_b.

Semi-Metric Potentials
θ_ab(i,k) = w_ab · d(i,k), where the distance function d satisfies
  d(i,i) = 0,  d(i,j) = d(j,i) > 0,  d(i,j) − d(j,k) ≤ ρ · d(i,k).

Bounds
For ρ = 1 (metric): linear programming relaxation, O(log H); graph cuts (expansion), 2·d_max/d_min; our method, O(log H).

r-HST Metrics
An r-HST is a rooted tree with the labels l_1, ..., l_H at its leaves, whose edge lengths shrink by at least a factor r at each level: if the edge into a node has length A, the edges to its children have lengths B ≤ A/r and C ≤ A/r. The tree distance d_T(i,k) is the length of the path between leaves l_i and l_k.

Overview
Approximate the semi-metric as a mixture of r-HST metrics, d ≈ σ_1·d_T1 + σ_2·d_T2 + …. Solve min_f Q(f; d_Tt) to obtain a labeling f_Tt for each tree, then combine f_T1, f_T2, ….

r-HST Metric Labeling: Efficient Divide-and-Conquer Approach
Partition the labels according to the subtrees of the root (e.g. {1,2}, {3,4}, {5,6}) and solve each subproblem, using α-expansion at the bottom level:
  f_1 = argmin_f Q(f) subject to f(a) ∈ {1,2}
  f_2 = argmin_f Q(f) subject to f(a) ∈ {3,4}
  f_3 = argmin_f Q(f) subject to f(a) ∈ {5,6}
Combine the f_i with α-expansion-style moves: initialize f_0 = f_1; then, at each iteration, choose an f_i and set either f_t(a) = f_{t-1}(a) or f_t(a) = f_i(a) for every variable, computing the optimal move using graph cuts.

Learning a Mixture of r-HSTs (Hierarchical Clustering)
Fit the mixture to minimize the worst-case distortion:
  min max_{i,k} ( ∑_t σ_t · d_Tt(i,k) ) / d(i,k).
A single r-HST is built by randomized hierarchical clustering (Fakcharoenphol et al., 2003): the root is a single cluster containing all labels; choose a random permutation π; for each l_i in cluster C_j, assign l_i to the first l_k in π such that d(i,k) ≤ T; decrease T by a factor of r and repeat to obtain the clusters C_{j+1}.

Derandomization
Boosting-style descent: set residuals y_ik; solve min ∑ y_ik · d_T(i,k); update the y_ik; repeat.

Experiments
Image denoising, stereo reconstruction and scene registration: for each application, tables report the energy Q and the running time of expansion (Exp), TRW, BP, our method, and our method with hard-EM refinement (+EM).
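The randomized hierarchical clustering step can be made concrete with a short sketch. This is a minimal Python illustration under stated assumptions, not the authors' implementation: the function names (`build_rhst`, `leaves`), the nested-dict tree encoding, and the dense distance matrix `D` are all choices made for this example.

```python
import random

def build_rhst(D, labels, T, r=2, rng=None):
    """Recursively cluster labels into an r-HST.

    D[i][k] is the semi-metric distance between labels i and k; T is the
    current diameter bound, which shrinks by a factor r at each level.
    Returns a nested dict: internal nodes are {'children': [...]} where
    each child is {'edge': T, 'subtree': ...}; leaves are {'leaf': i}.
    """
    rng = rng or random.Random(0)
    if len(labels) == 1:
        return {'leaf': labels[0]}
    # A random permutation decides cluster centers (Fakcharoenphol-style).
    pi = list(labels)
    rng.shuffle(pi)
    clusters = {}
    for i in labels:
        # Assign i to the first permutation element within distance T
        # (i itself always qualifies, since d(i,i) = 0).
        center = next(k for k in pi if D[i][k] <= T)
        clusters.setdefault(center, []).append(i)
    if len(clusters) == 1:
        # No split at this scale; retry with a smaller diameter bound.
        return build_rhst(D, labels, T / r, r, rng)
    return {'children': [
        {'edge': T, 'subtree': build_rhst(D, c, T / r, r, rng)}
        for c in clusters.values()]}

def leaves(t):
    """Collect the label leaves of an r-HST subtree."""
    if 'leaf' in t:
        return [t['leaf']]
    return [l for ch in t['children'] for l in leaves(ch['subtree'])]
```

With two well-separated label groups, the first effective split recovers them regardless of the permutation, and recursion bottoms out in singleton leaves once T falls below the smallest positive distance.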
Bounds for Semi-Metrics
For ρ = 1 the multiplicative bound is O(log H); for ρ > 1 it is O((ρ log H)²).

Analysis
The bound is proved by mathematical induction over the tree: assume it holds for the children subtrees. The bound of 1 for the unary potentials follows from α-expansion. For the pairwise potentials, the bound is 2·d_max/d_min = 2r/(r−1), using the r-HST edge-length constraints B ≤ A/r, C ≤ A/r.

Refinement (Hard EM)
Given an initial labeling f, let y_ik be the contribution of the label pair (i,k) to the current labeling:
  y_ik = ∑_(a,b) w_ab [f(a) = i][f(b) = k].
The M-step fits a tree by solving min ∑ y_ik · d_T(i,k); the E-step computes a new labeling f′ under the new tree. Both the E and M steps are approximate.

Applications
Image denoising: clean up an image with noise and missing data. Stereo reconstruction: find the correspondence between two epipolar-rectified images of a scene. Scene registration: find the correspondence between two scenes with common elements (building, fire).

Synthetic Experiments
100 randomly generated 4-connected grid graphs of size 100×100. Tables report the energy Q and the running time of Exp, Swap, TRW, BP, range swap (RSwp), range expansion (RExp), our method, and +EM, for truncated linear (T-L), r-HST, metric (Met) and semi-metric (SMet) distance functions.
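The residual statistics used by the hard-EM refinement amount to a single pass over the MRF edges. A minimal Python sketch, assuming an edge list of (a, b, w_ab) triples and a labeling dict; the names `residuals` and `m_step_objective` are illustrative, and the M-step here only evaluates the objective ∑ y_ik · d_T(i,k) for a given candidate tree metric rather than fitting the tree itself.

```python
from collections import defaultdict

def residuals(edges, f):
    """Accumulate y[(i,k)] = sum of w_ab over edges (a,b) labeled (i,k).

    edges: iterable of (a, b, w_ab); f: dict mapping variable -> label.
    Label pairs are stored with i < k, since d(i,k) = d(k,i).
    """
    y = defaultdict(float)
    for a, b, w in edges:
        i, k = f[a], f[b]
        if i != k:  # d(i,i) = 0, so equal labels contribute nothing
            y[min(i, k), max(i, k)] += w
    return dict(y)

def m_step_objective(y, dT):
    """Evaluate sum_{i,k} y_ik * d_T(i,k) for a candidate tree metric dT."""
    return sum(w * dT[i][k] for (i, k), w in y.items())
```

For example, with edges [(0, 1, 2.0), (1, 2, 1.0), (0, 2, 3.0)] and labeling {0: 0, 1: 1, 2: 1}, only the two edges with differing endpoint labels contribute, giving y = {(0, 1): 5.0}.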