

1
Graph Cut Algorithms for Computer Vision & Medical Imaging
Ramin Zabih, Computer Science & Radiology, Cornell University
Joint work with Y. Boykov, V. Kolmogorov, A. Raj and O. Veksler

2
Outline
- Pixel labeling problems
- Piecewise constant property of the image
- Graph cuts
- Expansion move algorithm
- Beyond regularity
- Reconstructing MRI's via graph cuts
- LBP versus graph cuts

3
Pixel labeling problem
Given:
- An assignment cost for giving a particular label to a particular node, written D
- A separation cost for assigning a particular pair of labels to neighboring nodes, written V
Find:
- A labeling f = (f_1, …, f_n) such that the sum of the assignment costs and separation costs (the energy E) is small
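The energy on this slide can be written down directly. A minimal sketch, assuming D and V are given as callables and the neighborhood structure as a list of pixel pairs (all names here are illustrative, not from the slides):

```python
def energy(f, D, V, edges):
    """Energy E(f) of a labeling f: assignment costs D(p, f_p) summed
    over pixels, plus separation costs V(f_p, f_q) summed over
    neighboring pairs (p, q)."""
    assignment = sum(D(p, f[p]) for p in range(len(f)))
    separation = sum(V(f[p], f[q]) for p, q in edges)
    return assignment + separation
```

Minimizing this E over all labelings f is the pixel labeling problem.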

4
Solving pixel labeling problems
- We want to minimize the energy E(f)
- Classical problem in vision and beyond
- Bayesian justification: Markov Random Fields (MRF's)

5
Choices of V
- Robust: Potts model, truncated linear model
- Not robust: linear model, quadratic model
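The four choices of V can be sketched as follows; the parameter names (`lam` for the Potts penalty, `k` for the truncation point) are my own, not from the slides:

```python
def potts(a, b, lam=1.0):
    """Potts model: fixed penalty for any label difference (robust)."""
    return 0.0 if a == b else lam

def truncated_linear(a, b, k=2.0):
    """Truncated linear: grows linearly, then saturates at k (robust)."""
    return min(abs(a - b), k)

def linear(a, b):
    """Linear: penalty grows without bound (not robust)."""
    return abs(a - b)

def quadratic(a, b):
    """Quadratic: heavily penalizes large jumps (not robust)."""
    return (a - b) ** 2
```

"Robust" here means the penalty is bounded, so large discontinuities (e.g. at object borders) are not punished arbitrarily hard.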

6
Stereo
- Pixel labeling for stereo: labels are shifts (hence depths)
- Assignment cost from intensity difference
- Neighboring pixels should be at similar depths, except at the borders of objects!

7
How to minimize the energy?
- Until the late 90's, poor solutions
- Problem is NP-hard [K/BVZ PAMI '01]
- In vision, we tend to focus on deriving the "right" energy function, then minimize via general-purpose methods
- Computer scientists disagree: general-purpose methods must be weak, and nearby energy functions can be "easy"

8
Sample results
[Figure: stereo results compared — right answers, correlation, dynamic programming, graph cuts]

9
Statistical performance

10
Graph cuts and expansion moves

11
Graph cuts
- Reduce the energy minimization problem to computing the min s-t cut on a graph
- Cuts are labelings, cut costs are energy
- Rapidly solvable by max flow: running times are linear in the number of pixels and labels; asymptotically, low-order polynomial
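A minimal sketch of this reduction for a two-label problem, assuming the standard construction (the graph layout and the tiny two-pixel example are mine, not from the slides): each pixel gets a node; an edge from the source with capacity D_p(1) is cut when p ends up on the sink side (label 1), an edge to the sink with capacity D_p(0) is cut when p stays on the source side (label 0), and neighbor edges carry the Potts penalty. The min cut value then equals the minimum energy, computable by any max-flow routine:

```python
from collections import deque

def max_flow(cap, s, t):
    """Edmonds-Karp max flow on an adjacency-matrix capacity graph.
    By the max-flow/min-cut theorem this equals the min s-t cut value."""
    n = len(cap)
    flow = 0
    while True:
        # BFS for an augmenting path in the residual graph
        parent = [-1] * n
        parent[s] = s
        q = deque([s])
        while q and parent[t] == -1:
            u = q.popleft()
            for v in range(n):
                if parent[v] == -1 and cap[u][v] > 0:
                    parent[v] = u
                    q.append(v)
        if parent[t] == -1:
            return flow
        # Find the bottleneck capacity along the path, then push flow
        bottleneck = float('inf')
        v = t
        while v != s:
            bottleneck = min(bottleneck, cap[parent[v]][v])
            v = parent[v]
        v = t
        while v != s:
            cap[parent[v]][v] -= bottleneck
            cap[v][parent[v]] += bottleneck
            v = parent[v]
        flow += bottleneck

# Two pixels: pixel 1 prefers label 0 (D = (0,9)), pixel 2 prefers
# label 1 (D = (9,0)), Potts penalty 2 between them.
# Nodes: 0 = source, 1-2 = pixels, 3 = sink.
cap = [[0, 9, 0, 0],   # s -> p: capacity D_p(1)
       [0, 0, 2, 0],   # neighbor edge, Potts penalty
       [0, 2, 0, 9],   # p -> t: capacity D_p(0)
       [0, 0, 0, 0]]
```

The best labeling assigns label 0 to pixel 1 and label 1 to pixel 2, paying only the Potts penalty of 2, and indeed the max flow (= min cut = min energy) is 2.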

12
What do graph cuts provide?
- For less interesting V, a polynomial algorithm for the global minimum!
- For a particularly interesting V, an approximation algorithm, plus a proof of NP-hardness
- For many choices of V, algorithms that find a "strong" local minimum
- Very strong experimental results

13
Spectrum of results (special-purpose to general-purpose)
- Convex V: global min
- Potts V: 2-approximation
- Regular V: strong local min (expansion move algorithm [BVZ PAMI '01])
- Arbitrary V: local min

14
Gradient descent methods
- Subproblem: pick a pixel, find the label that minimizes E, repeat
- Minimizes a restricted version of E (line search)
- Computes a local minimum
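This pick-a-pixel-and-repeat scheme (iterated conditional modes) can be sketched as follows; the function signature and the `neighbors` callable are my illustration, not from the slides:

```python
def icm(labels, D, V, neighbors, f):
    """One-pixel-at-a-time descent: repeatedly set each pixel to the
    label minimizing E with all other pixels held fixed. Stops at a
    (weak) local minimum."""
    f = list(f)
    changed = True
    while changed:
        changed = False
        for p in range(len(f)):
            def local_cost(label):
                # Only the terms touching pixel p depend on its label
                return D(p, label) + sum(V(label, f[q]) for q in neighbors(p))
            best = min(labels, key=local_cost)
            if best != f[p]:
                f[p] = best
                changed = True
    return f
```

Because each step only considers one pixel, the minimum reached is much weaker than what the expansion moves below provide.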

15
Gradient descent vs. graph cuts
- Continuous vs. discrete: no floating point with graph cuts
- Local min in line search vs. global min: minimize over a line vs. a hypersurface containing O(2^n) candidates
- Local minimum: weak vs. strong; 2-approximation for the Potts model, within much less than 1% of the global min!

16
Expansion move algorithm
- Find the red expansion move that most decreases E
- Move there, then find the best blue expansion move, etc.
- Done when no α-expansion move decreases the energy, for any label α
- Many nice theoretical properties
[Figure: input labeling f, and a red expansion move from f]
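The outer loop of the algorithm can be sketched as below. For brevity the inner binary step is solved by brute force over "switch to α or keep" choices; as the later slides explain, a real implementation solves it with a single min cut. All names and the small three-pixel Potts example are my own:

```python
from itertools import product

def energy(f, D, V, edges):
    return (sum(D(p, f[p]) for p in range(len(f))) +
            sum(V(f[p], f[q]) for p, q in edges))

def best_alpha_expansion(f, alpha, D, V, edges):
    """Cheapest alpha-expansion from f: each pixel either keeps its
    label or switches to alpha. Brute force here; min cut in practice."""
    n = len(f)
    best, best_e = list(f), energy(f, D, V, edges)
    for mask in product([False, True], repeat=n):
        g = [alpha if m else fp for m, fp in zip(mask, f)]
        e = energy(g, D, V, edges)
        if e < best_e:
            best, best_e = g, e
    return best, best_e

def expansion_moves(labels, f, D, V, edges):
    """Cycle over labels, taking the best alpha-expansion each time;
    stop when no expansion decreases the energy for any label."""
    improved = True
    while improved:
        improved = False
        for alpha in labels:
            g, e = best_alpha_expansion(f, alpha, D, V, edges)
            if e < energy(f, D, V, edges):
                f, improved = g, True
    return f
```

On a three-pixel chain where the end pixels prefer labels 0 and 2, the loop converges to the labeling that pays only one Potts penalty.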

17
2-approximation for the Potts model
- Compare the local minimum with the optimal solution; summing the per-label inequality over all labels gives the 2-approximation for the Potts model

18
Expansion moves in action
- For each move we choose the α-expansion that gives the largest decrease in the energy: a binary energy minimization subproblem
[Figure: initial solution and successive α-expansions]

19
Binary sub-problem
[Figure: input labeling, expansion move, and the corresponding binary image]

20
Expansion move energy
- Goal: find the binary image with the lowest energy
- The binary image energy is a restriction of E; it depends on f and α

21
Graph cuts solution
- This can be done as long as V has a specific form (works for arbitrary D)
- Regularity constraint [KZ PAMI '04]: can find the cheapest α-expansion from f if, for all labels β and γ, V(α,α) + V(β,γ) ≤ V(β,α) + V(α,γ)
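The regularity constraint is easy to check mechanically. A sketch, assuming V is a callable and the inequality V(α,α) + V(β,γ) ≤ V(β,α) + V(α,γ) stated above (the function name is mine):

```python
def is_regular_for_expansion(V, labels, alpha):
    """Check the [KZ PAMI '04] regularity constraint for alpha-expansion:
    V(a,a) + V(b,g) <= V(b,a) + V(a,g) for all labels b, g."""
    return all(V(alpha, alpha) + V(b, g) <= V(b, alpha) + V(alpha, g)
               for b in labels for g in labels)
```

The Potts model passes for every α, while the (non-robust, non-metric) quadratic model fails: with β = 0, γ = 4, α = 2 the left side is 16 but the right side is only 8.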

22
Regular choices of V
- Suppose that V is a metric: then V(α,α) = 0, and the triangle inequality gives V(β,γ) ≤ V(β,α) + V(α,γ), so V is regular

23
Applications in vision
Two tricks to get the best stereo answers:
- Monocular cues ("fragile constraint"): combine segmentation and stereo, without understanding image statistics
- Continuous label sets: exploit the power of the Potts model; labels can be planes or smooth surfaces

24
Applications outside vision
- Kleinberg & Tardos [FOCS '98][JACM '02] gave an approximation algorithm when V is a metric
- Various follow-up papers
- Recent applications in SIGGRAPH: 1 paper in '03, >5 papers (!) in '04
- Key limitation: regularity

25
Beyond regularity What energy functions can’t be minimized via graph cuts?

26
Beyond regularity
- Arbitrary non-regular functions are NP-hard
- Only regular functions can be solved via graph cuts, i.e., by computing the optimal expansion move
- Very recent work has relaxed this restriction: Kolmogorov & coworkers (Digital Tapestry); Raj & Zabih

27
Other energy functions?
- You can make a non-regular function regular, then find the optimal expansion move for the new energy
- What does this say about the original energy? If you do this correctly, the original energy never increases!
- Digital Tapestry: careful truncation for arbitrary V
- Raj & Zabih: linear inverse problems

28
Linear inverse problems
- Observed image y, unknown image x, noise n: y = Hx + n
- Denoising if H is the identity matrix
- Data cost for x is ||y − Hx||²
- Goal: piecewise constant solution
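The data cost on this slide is a one-liner; a minimal sketch with NumPy (the function name is mine):

```python
import numpy as np

def data_cost(y, H, x):
    """Data cost ||y - Hx||^2 for a hypothesized image x.
    Denoising corresponds to H being the identity matrix."""
    r = y - H @ x
    return float(r @ r)
```

This is the term that graph cuts must trade off against the separation costs V to obtain a piecewise constant solution.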

29
What about non-diagonal H?
- Example: H performs local averaging
- The data cost depends on the neighbors' hypothesized values also!

30
Regularity is a challenge
- For non-negative H, the energy function is regular only under a restrictive condition: the optimal α-expansion move can be computed for a pixel below α whose neighbors are all above α (or vice versa)
- This is true for very few pixels!

31
Strategy
- At a given point we are given f and α; the energy function will depend on them, dynamically updated as the algorithm runs
- E′(f, α) is a regular approximation to the non-regular E we want to minimize
- Can find the α-expansion that most reduces E′

32
Approximation
- For each α, split pixel pairs into those with regular cost vs. non-regular cost
- Approximate the non-regular pairwise (E_2) terms using the input labeling f

33
Approximation properties
- Additional approximations are also made to increase the number of pixels that can move to α
- Reducing the modified energy is guaranteed to reduce the original energy
- The modified energy is very close to the original when few pixels move to α

34
Reconstructing MRI’s via Graph Cuts (or: MRF’s for MRI’s)

35
Reconstructing MRI's
- MR requires substantial cleverness in image formation; unique among image modalities
- Under-appreciated task of radiologists
- Acquisition speed really matters: physiological processes take place at different timescales (heartbeat, respiration, etc.)

36
Evaluating reconstructions is easy
[Figure: expert radiologist vs. computer]

37
Parallel imaging system
[Figure: imaging target → coils encode different outputs → combiner → reconstructed image]

38
Graph cut reconstruction
- Reconstruct the image to be consistent with the observed data
- Each coil gives aliased data; coils have different spatial sensitivities
- Standard reconstruction algorithm (SENSE) uses least squares, equivalent to maximum likelihood
- Graph cuts can impose smoothness
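The least-squares baseline mentioned here is one line of NumPy; a sketch of the idea only, not the actual SENSE implementation (the function name is mine):

```python
import numpy as np

def least_squares_recon(y, H):
    """SENSE-style baseline: argmin_x ||y - Hx||^2, the maximum
    likelihood solution under Gaussian noise. No smoothness prior
    is imposed here -- that is what graph cuts add."""
    x, *_ = np.linalg.lstsq(H, y, rcond=None)
    return x
```

Graph cuts replace this unconstrained fit with minimization of the data cost plus a separation cost V, yielding piecewise smooth reconstructions.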

39
Results on MRI reconstruction
[Figure: original phantom, SENSE recon, SENSE with regularization, GC recon]

40
Zoomed results
[Figure: original phantom, SENSE recon, SENSE with regularization, GC recon]

41
Conclusions
- Powerful optimization tool for vision, and beyond…
- Trade off generality versus power
- More general than thought; even applicable to medical imaging

42
Graph cuts versus LBP
- Evaluation criteria: application effectiveness, speed, quality of minima, guarantees, generality
- Application effectiveness: comparable; probably the most important criterion
- Speed: LBP is now faster for stereo, but graph cuts use O(n) space vs. O(mn)
- BP has better ties to statistics

43
Minima quality: graph cuts
[Figure: data from [TF ICCV '03]]

44
Guarantees and generality
- Graph cuts are better understood: they always converge to some kind of minimum (global, strong local, or weak local, depending on the class of problem)
- This doesn't make graph cuts a better method, just one we know more about
- LBP has gotten faster, graph cuts have gotten more general (just in the last year!)
