1
What Energy Functions Can be Minimized Using Graph Cuts? Shai Bagon, Advanced Topics in Computer Vision, June 2010
2
What is an Energy Function? An energy E maps a suggested solution to a number. For a given problem, e.g. image segmentation, the slide shows two candidate segmentations scoring 237 and −20. A useful energy function: 1. Good solution ↔ low energy. 2. Tractable: can be minimized.
3
Families of Functions, or Outline: F² submodular; non-submodular; F³; beyond F³.
4
Foreground Selection. Let y_i be the color of the i-th pixel and x_i ∈ {0,1} its BG/FG label (the variables). Given BG/FG scribbles: Pr(x_i | y_i) = how likely each pixel is to be FG/BG; Pr(x_m | x_n) = adjacent pixels should have the same label. F² energy: E(x) = ∑_i E_i(x_i) + ∑_ij E_ij(x_i, x_j).
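As a hedged illustration of how the data term could be built (the probabilities below are made up, not the deck's actual color model), one might convert per-pixel foreground probabilities into unary costs with a negative log:

```python
import math

# Hedged sketch: turn assumed per-pixel foreground probabilities Pr(x_i = FG | y_i)
# into unary terms E_i via -log, using the convention x_i = 1 for FG.
def unary_terms(p_fg, eps=1e-9):
    return [(-math.log(max(1.0 - p, eps)),   # E_i(0): cost of labeling the pixel BG
             -math.log(max(p, eps)))         # E_i(1): cost of labeling the pixel FG
            for p in p_fg]

print(unary_terms([0.9, 0.5, 0.1]))   # confident FG, uncertain, confident BG
```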
5
Submodular — a known concept from set-functions. E(x) = ∑_i E_i(x_i) + ∑_ij E_ij(x_i, x_j), x_i ∈ {0,1}. Write the pairwise term E_ij(x_i, x_j) as a 2×2 table:

            x_j = 0   x_j = 1
  x_i = 0      A         B
  x_i = 1      C         D

What does submodularity mean here? B + C − A − D ≥ 0.
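A minimal sketch of this condition in code (the 2×2 table layout below is an assumption following the slide's A, B, C, D naming):

```python
# Check submodularity of a pairwise term E_ij given as a 2x2 table
# [[A, B], [C, D]], rows indexed by x_i and columns by x_j.
def is_submodular(E_ij):
    A, B = E_ij[0][0], E_ij[0][1]   # E_ij(0,0), E_ij(0,1)
    C, D = E_ij[1][0], E_ij[1][1]   # E_ij(1,0), E_ij(1,1)
    return B + C - A - D >= 0       # i.e. E(0,1) + E(1,0) >= E(0,0) + E(1,1)

# A Potts-like term that penalizes disagreement is submodular.
print(is_submodular([[0, 1], [1, 0]]))   # True
print(is_submodular([[1, 0], [0, 1]]))   # False (non-submodular)
```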
6
How to Minimize? E(x) = ∑_i E_i(x_i) + ∑_ij E_ij(x_i, x_j), x_i ∈ {0,1}. Local "beliefs": the data term. Prior knowledge: the smoothness term. This is F² submodular.
7
Graph Partitioning. A weighted graph G = (V, E, w) with two special nodes s and t. An s-t cut is a partition of V into S and T with S ∪ T = V, S ∩ T = ∅, s ∈ S, t ∈ T. Cost of a cut: Cut(S, T) = ∑_{i∈S, j∈T} w_ij. Nice property: a 1:1 mapping between s-t cuts and binary assignments {0,1}^(|V|−2).
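A tiny sketch of the cut-cost definition itself (the toy graph below is made up):

```python
# Cut(S, T) = sum of w_ij over directed edges going from S to T.
# The graph is given as a dict {(i, j): w_ij}.
def cut_cost(edges, S, T):
    return sum(w for (i, j), w in edges.items() if i in S and j in T)

edges = {('s', 'a'): 2.0, ('a', 't'): 1.0, ('s', 'b'): 3.0, ('b', 't'): 4.0}
print(cut_cost(edges, S={'s', 'a'}, T={'b', 't'}))  # pays ('a','t') + ('s','b') = 4.0
```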
8
Graph Partitioning – Energy. The energy E(x) = ∑_i E_i(x_i) + ∑_ij E_ij(x_i, x_j) is mapped to a graph partitioning problem. Writing A = E_ij(0,0), B = E_ij(0,1), C = E_ij(1,0), D = E_ij(1,1), each pairwise table decomposes as E_ij(x_i, x_j) = A + (C − A)·x_i + (D − C)·x_j + (B + C − A − D)·(1 − x_i)·x_j. So besides the t-links carrying the unary terms E_i(0) and E_j(1), the graph gets capacity C − A on the edge s→i, D − C on s→j, and B + C − A − D on the edge i→j; the constant A only shifts the energy (a negative t-link capacity is handled by moving it to the opposite t-link).
9
Graph Partitioning – Energy. With this construction there is a 1:1 correspondence: s-t cut ↔ binary assignment, cut cost ↔ energy of the assignment, and therefore min cut ↔ energy minimum. (Here B = E_ij(0,1), as before.)
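To make the correspondence concrete, here is a hedged sketch that builds the graph for a tiny two-variable submodular energy and reads the labeling off a min-cut; networkx is used only as a convenient max-flow/min-cut solver, and all the numbers are made up.

```python
import networkx as nx

# Minimize E(x_i, x_j) = E_i(x_i) + E_j(x_j) + E_ij(x_i, x_j) by a min s-t cut.
# Convention: source side <-> label 0, sink side <-> label 1.
# Unaries: E_i(0)=1, E_i(1)=4, E_j(0)=3, E_j(1)=2.
# Pairwise table: A=E_ij(0,0)=0, B=E_ij(0,1)=2, C=E_ij(1,0)=1, D=E_ij(1,1)=1.
A, B, C, D = 0.0, 2.0, 1.0, 1.0

G = nx.DiGraph()
G.add_edge('i', 't', capacity=1.0)            # paid when x_i = 0  (E_i(0))
G.add_edge('s', 'i', capacity=4.0 + (C - A))  # paid when x_i = 1  (E_i(1) + (C - A))
G.add_edge('j', 't', capacity=3.0)            # paid when x_j = 0  (E_j(0))
G.add_edge('s', 'j', capacity=2.0 + (D - C))  # paid when x_j = 1  (E_j(1) + (D - C))
G.add_edge('i', 'j', capacity=B + C - A - D)  # paid when x_i = 0, x_j = 1

cut_value, (S, _) = nx.minimum_cut(G, 's', 't')
labels = {v: 0 if v in S else 1 for v in ('i', 'j')}
print(labels, 'min energy =', cut_value + A)   # {'i': 0, 'j': 0}, min energy = 4.0
```

Brute-forcing the four assignments of this toy energy gives the same minimum, 4 at (x_i, x_j) = (0, 0).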
10
Recap. F² submodular: E(x) = ∑_i E_i(x_i) + ∑_ij E_ij(x_i, x_j) with E_ij(1,0) + E_ij(0,1) ≥ E_ij(0,0) + E_ij(1,1). Mapping from energy to graph partition: minimizing the energy = computing a min-cut. Global optimum in polynomial time for submodular functions!
11
Next… Multi-label F²: E(x) = ∑_i E_i(x_i) + ∑_ij E_ij(x_i, x_j) s.t. x_i ∈ {1,…,L}. Fusion moves: solving binary sub-problems. Applications to stereo, stitching, segmentation… "Alpha expansion" is a fusion move: given the current labeling and a suggested labeling (the constant label α), solve the binary problem in which x_i = 0 keeps the current label and x_i = 1 takes the suggested one (see the sketch below).
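A hedged sketch of the alpha-expansion outer loop on a made-up 4-variable, 3-label chain; in practice each binary fusion sub-problem is solved with a graph cut (it is submodular for metric pairwise terms), but here it is brute-forced so the example stays self-contained.

```python
from itertools import product

L = 3                      # labels {0, 1, 2}
n = 4                      # variables
unary = [[2, 0, 3], [1, 2, 0], [3, 1, 2], [0, 3, 1]]   # unary[i][label], made up
lam = 1.0                  # Potts smoothness weight

def energy(x):
    return (sum(unary[i][x[i]] for i in range(n))
            + lam * sum(x[i] != x[i + 1] for i in range(n - 1)))

def expand(x, alpha):
    """Best fusion of the current labeling x with the constant proposal alpha."""
    best = list(x)
    for switch in product([0, 1], repeat=n):          # 1 = take alpha, 0 = keep x_i
        cand = [alpha if s else xi for s, xi in zip(switch, x)]
        if energy(cand) < energy(best):
            best = cand
    return best

x = [0] * n
changed = True
while changed:                                        # sweep over labels until no move helps
    changed = False
    for alpha in range(L):
        new_x = expand(x, alpha)
        if energy(new_x) < energy(x):
            x, changed = new_x, True
print(x, energy(x))
```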
12
Stereo matching (see http://vision.middlebury.edu/stereo/). Input images, ground truth, and the pairwise-MRF result [Boykov et al. '01]. Slide by Carsten Rother, ICCV'09.
13
Panoramic stitching slide by Carsten Rother, ICCV’09
14
Panoramic stitching slide by Pushmeet Kohli, ICCV’09
15
AutoCollage http://research.microsoft.com/en-us/um/cambridge/projects/autocollage/ [Rother et al., Siggraph '05]
16
Next… Multi-label F²: E(x) = ∑_i E_i(x_i) + ∑_ij E_ij(x_i, x_j) s.t. x_i ∈ {1,…,L}; fusion moves (solving binary sub-problems) and applications to stereo, stitching, segmentation… Then: non-submodular energies, and beyond pair-wise interactions: F³.
17
Merging Regions. Input image → regions (Ncuts) → "edge" probabilities p_i, separating "weak" edges from "strong" edges; p_i is the probability that boundary fragment i is a true edge. GOAL: find a labeling x_i ∈ {0,1} of the boundary fragments that maximizes the probability of the selected edges; taking −log turns this into a minimization.
18
Merging Regions Adding and subtracting the same number
19
Merging Regions. Solving for edges, with consistency constraints: no "dangling" edge at a junction J of three boundary fragments x_1, x_2, x_3, e.g.:

  x_1  x_2  x_3 | E_J
   0    0    0  |  0
   1    1    1  |  0
   0    1    1  |  0
   0    0    1  |  λ

This term couples three variables at once, so the energy is no longer pair-wise: F³.
20
Minimization trick. Freedman D., Turek M.W., "Graph cuts with many pixel interactions: theory and applications to shape modeling", Image and Vision Computing, 2010.
21
Merging Regions. The resulting energy: (+) pair-wise, (−) non-submodular!
22
Quadratic Pseudo-Boolean Optimization. (Figure: the QPBO graph construction over s, t, i, j and their copies.) Kolmogorov V., Rother C., "Minimizing non-submodular functions with graph cuts – a review", PAMI '07.
23
Quadratic Pseudo-Boolean Optimization. All edges have positive capacities, and no submodularity constraint is required. Labeling rule: the cut yields only a partial labeling.
24
Quadratic Pseudo-Boolean Optimization. Properties of the partial labeling y: 1. Let z = FUSE(y, x) for any complete labeling x; then E(z) ≤ E(x). 2. y is a subset of (agrees with) some optimal labeling y*. y is complete when: 1. E is submodular, or 2. there exists a flipping of variables that makes it so (e.g. inference in trees).
25
QPBO – Probing. Probe node p: run QPBO once with p fixed to 0 and once with p fixed to 1, and compare the resulting partial labelings of the remaining variables (r, q, s, t in the example). What can we say about the variables? r is always 0; s is always equal to q; t is 0 when q = 1. Slide by Pushmeet Kohli, ICCV'09.
26
QPBO – Probing. Probe nodes in some order until the energy no longer changes. The simplified energy preserves global optimality and (sometimes) yields the global minimum. Slide by Pushmeet Kohli, ICCV'09.
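A hedged toy illustration of the probing idea on a made-up three-variable energy; exact brute-force minimization stands in here for the QPBO solves used in practice.

```python
from itertools import product

# Probe a variable p: fix it to 0 and to 1, minimize the rest each time, and
# look for conclusions that hold in both cases (e.g. "r is always 0").
def minimize(energy, fixed, names):
    best, best_e = None, float('inf')
    for vals in product([0, 1], repeat=len(names)):
        x = dict(zip(names, vals))
        if any(x[k] != v for k, v in fixed.items()):
            continue
        e = energy(x)
        if e < best_e:
            best, best_e = x, e
    return best

names = ['p', 'q', 'r']
energy = lambda x: 2 * x['r'] + (x['p'] != x['q']) + 0.5 * x['q']   # made-up energy
sol0 = minimize(energy, {'p': 0}, names)
sol1 = minimize(energy, {'p': 1}, names)
# A variable that takes the same value under both probes can be fixed
# without losing a global optimum.
for v in ('q', 'r'):
    if sol0[v] == sol1[v]:
        print(v, 'can be fixed to', sol0[v])   # prints: r can be fixed to 0
```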
27
Merging Regions. Result using QPBO-P, shown alongside the input image and the initial Ncuts regions.
28
Recap. F³ and more: the minimization trick. Non-submodular: QPBO approximation – partial labeling.
29
Beyond F³ … [Kohli et al. CVPR '07, '08, PAMI '08, IJCV '09]
30
Image Segmentation. E(X) = ∑_i c_i x_i + ∑_{i,j} d_ij |x_i − x_j|, where E: {0,1}^n → R, 0 → fg, 1 → bg, and n is the number of pixels. [Boykov and Jolly '01] [Blake et al. '04] [Rother et al. '04] (Figure: image, unary cost, segmentation.)
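A hedged sketch that just evaluates this energy on a made-up 2×3 labeling of a 4-connected grid, with a constant pairwise weight d standing in for the per-edge d_ij:

```python
import numpy as np

# E(X) = sum_i c_i x_i + sum_{ij} d |x_i - x_j| over horizontal and vertical
# neighbors; c and x below are toy arrays, not real unary costs.
def segmentation_energy(x, c, d=1.0):
    unary = np.sum(c * x)
    pairwise = d * (np.sum(np.abs(np.diff(x, axis=0)))    # vertical neighbors
                    + np.sum(np.abs(np.diff(x, axis=1)))) # horizontal neighbors
    return unary + pairwise

x = np.array([[0, 0, 1], [0, 1, 1]])                      # 0 = fg, 1 = bg
c = np.array([[1., -2., -1.], [2., -1., -3.]])
print(segmentation_energy(x, c))                          # -5 (unary) + 3 (pairwise) = -2
```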
31
Pⁿ Potts Potentials. Patch dictionary (tree), with penalties ranging from 0 to C_max: h(X_p) = 0 if x_i = 0 for all i ∈ p, C_max otherwise. [slide credits: Kohli]
32
Pⁿ Potts Potentials. E(X) = ∑_i c_i x_i + ∑_{i,j} d_ij |x_i − x_j| + ∑_p h_p(X_p), with h_p(X_p) = 0 if x_i = 0 for all i ∈ p and C_max otherwise; E: {0,1}^n → R, 0 → fg, 1 → bg, n = number of pixels. [slide credits: Kohli]
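A minimal sketch of the rigid Pⁿ Potts term h_p (the segment contents and C_max below are made up):

```python
# A segment/patch p pays nothing only if every pixel in it is labeled 0 (fg);
# any disagreement costs the full C_max.
def pn_potts(x_p, c_max):
    return 0.0 if all(xi == 0 for xi in x_p) else c_max

print(pn_potts([0, 0, 0, 0], c_max=5.0))   # 0.0  (segment fully consistent)
print(pn_potts([0, 1, 0, 0], c_max=5.0))   # 5.0  (one disagreeing pixel already costs C_max)
```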
33
Image Segmentation. E(X) = ∑_i c_i x_i + ∑_{i,j} d_ij |x_i − x_j| + ∑_p h_p(X_p); E: {0,1}^n → R, 0 → fg, 1 → bg, n = number of pixels. (Figure: image, pairwise segmentation, final segmentation.) [slide credits: Kohli]
34
Application: Recognition and Segmentation (from [Kohli et al. '08]). Comparison: image; unaries only (TextonBoost [Shotton et al. '06]); pairwise CRF only [Shotton et al. '06]; Pⁿ Potts with one super-pixelization and with another super-pixelization.
35
Robust (soft) Pⁿ Potts model: h(X_p) = 0 if x_i = 0 for all i ∈ p, f(∑_{i∈p} x_i) otherwise. (Comparison from [Kohli et al. '08]: Pⁿ Potts vs. robust Pⁿ Potts.)
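A hedged sketch of the robust version; the truncated-linear f below is just one possible concave choice, not necessarily the exact f used in [Kohli et al. '08]:

```python
# Instead of jumping straight to C_max, the penalty grows with the number of
# disagreeing pixels via a concave f, so a few outliers in a segment are tolerated.
def robust_pn_potts(x_p, c_max, q):
    k = sum(x_p)                      # number of pixels in p labeled 1
    if k == 0:
        return 0.0
    return min(c_max, c_max * k / q)  # truncated linear f; saturates after q pixels

print(robust_pn_potts([0, 1, 0, 0, 0, 0], c_max=6.0, q=3))  # 2.0 (one outlier, small cost)
print(robust_pn_potts([1, 1, 1, 1, 0, 0], c_max=6.0, q=3))  # 6.0 (saturated at C_max)
```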
36
Application: Recognition and Segmentation (from [Kohli et al. '08]). Comparison: image; unaries only (TextonBoost [Shotton et al. '06]); pairwise CRF only [Shotton et al. '06]; Pⁿ Potts; robust Pⁿ Potts (different f); each with one super-pixelization and with another super-pixelization.
37
Same idea for surface-based stereo [Bleyer '10]. (Figure: one input image, ground-truth depth, stereo with hard segmentation, stereo with robust Pⁿ Potts.) This approach gets the best result on the Middlebury Teddy image pair.
38
How is it done… The most general such binary potential is H(X) = F(∑_i x_i) with F concave (figure: a concave F plotted against ∑_i x_i). The transformation is to a submodular pair-wise MRF, hence the optimization is globally optimal. [slide credits: Kohli]
39
Higher order to Quadratic. Start with the Pⁿ Potts model: f(x) = 0 if all x_i = 0, C_1 otherwise, for x ∈ {0,1}^n. Then min_x f(x) = min_{x, a ∈ {0,1}} C_1·a + C_1·(1 − a)·∑_i x_i, turning the higher-order function into a quadratic submodular function: if ∑_i x_i = 0 then a = 0 and f(x) = 0; if ∑_i x_i > 0 then a = 1 and f(x) = C_1. [slide credits: Kohli]
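A quick brute-force check of this identity over all assignments of three binary variables (C_1 = 4 is arbitrary):

```python
from itertools import product

# Verify: f(x) = 0 if all x_i = 0 else C1  equals  min_a C1*a + C1*(1-a)*sum_i x_i.
C1 = 4.0
for x in product([0, 1], repeat=3):
    f = 0.0 if sum(x) == 0 else C1
    q = min(C1 * a + C1 * (1 - a) * sum(x) for a in (0, 1))
    assert f == q
print("identity holds for all", 2 ** 3, "assignments")
```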
40
Higher order to Quadratic. min_x f(x) = min_{x, a ∈ {0,1}} C_1·a + C_1·(1 − a)·∑_i x_i: higher-order function ↔ quadratic submodular function. (Figure: the constant C_1 and the line C_1·∑_i x_i plotted against ∑_i x_i = 1, 2, 3, …) [slide credits: Kohli]
41
Higher order to Quadratic. min_x f(x) = min_{x, a ∈ {0,1}} C_1·a + C_1·(1 − a)·∑_i x_i: higher-order submodular function ↔ quadratic submodular function. In the plot, a = 1 selects the constant C_1 and a = 0 selects the line C_1·∑_i x_i; the lower envelope of concave functions is concave. [slide credits: Kohli]
42
Summary. Submodular F². F³ and beyond: the minimization trick. Non-submodular: QPBO(P). Beyond F³: robust higher-order potentials.