Presentation transcript:

1 Applications of belief propagation in low-level vision Bill Freeman Massachusetts Institute of Technology Jan. 12, 2010 Joint work with: Egon Pasztor, Jonathan Yedidia, Yair Weiss, Thouis Jones, Edward Adelson, Marshall Tappen.

2 Derivation of belief propagation [figure: a chain Markov network with observed nodes y1, y2, y3 attached to hidden nodes x1, x2, x3]

3 The posterior factorizes [figure: the same chain, with the posterior written as a product of pairwise terms]
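Restated as an equation (a reconstruction consistent with the joint probability given on slide 14), the factorization for the three-node chain is:

```latex
P(x_1, x_2, x_3 \mid y_1, y_2, y_3) \propto
  \Phi(x_1, y_1)\,\Phi(x_2, y_2)\,\Phi(x_3, y_3)\,
  \Psi(x_1, x_2)\,\Psi(x_2, x_3)
```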

4 Propagation rules [figure: the chain, annotated with the message-passing rules that follow from the factorization]

5 [figure: derivation continued on the chain network]

6 [figure: derivation continued on the chain network]

7 Belief propagation messages. To send a message from node i to node j: multiply together all the incoming messages, except the one from the node you're sending to, then multiply by the compatibility matrix and marginalize over the sender's states. A message can be thought of as a set of weights on each of your possible states. [figure: message passing from node i to node j]
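As a concrete illustration, here is a minimal sketch of that update rule, assuming discrete states and numpy; the function name and argument layout are my own, not from the talk.

```python
import numpy as np

def send_message(psi_ij, incoming, recipient):
    """Message from node i to node j, following the rule above.

    psi_ij    : (n_i, n_j) compatibility matrix Psi(x_i, x_j)
    incoming  : dict neighbor_id -> message vector over node i's states
    recipient : id of node j, whose incoming message is excluded
    """
    # Multiply all incoming messages except the one from the recipient.
    prod = np.ones(psi_ij.shape[0])
    for k, m in incoming.items():
        if k != recipient:
            prod *= m
    # Multiply by the compatibility matrix and marginalize over the
    # sender's states x_i.
    msg = psi_ij.T @ prod
    return msg / msg.sum()  # normalize to keep values well-scaled
```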

8 Belief propagation: the "nosey neighbor" rule. "Given everything that I've heard, here's what I think is going on inside your house." (Given my incoming messages, affecting my state probabilities, and knowing how my states affect your states, here's how I think you should modify the probabilities of your states.)

9 Beliefs. To find a node's beliefs: multiply together all the messages coming in to that node. (Show this for the toy example.)
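And the corresponding belief computation, in the same hypothetical setup (local evidence, if present, simply arrives as one more incoming message, from the y node):

```python
import numpy as np

def belief(incoming):
    """Belief at a node: the normalized product of ALL incoming messages."""
    b = None
    for m in incoming.values():
        b = m.copy() if b is None else b * m
    return b / b.sum()
```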

10 Optimal solution in a chain or tree: Belief Propagation. The "do the right thing" Bayesian algorithm. For Gaussian random variables over time: the Kalman filter. For hidden Markov models: the forward/backward algorithm (and the MAP variant is Viterbi).

11 Markov Random Fields allow rich probabilistic models for images, but are built in a local, modular way: learn local relationships, get global effects out.

12 MRF nodes as pixels [figure: grid MRF over pixels; Winkler, 1995, p. 32]

13 MRF nodes as patches [figure: image patches y_i connected to scene patches x_i; Φ(x_i, y_i) links each scene patch to its image patch, and Ψ(x_i, x_j) links neighboring scene patches]

14 Network joint probability: P(x, y) = (1/Z) ∏_{(i,j)} Ψ(x_i, x_j) ∏_i Φ(x_i, y_i), where x is the scene, y is the image, Ψ is the scene-scene compatibility function between neighboring scene nodes, and Φ is the image-scene compatibility function for the local observations.
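To make the joint concrete, here is a brute-force evaluation on a toy three-node chain with binary states; all of the tables below are made up for illustration.

```python
import numpy as np
from itertools import product

rng = np.random.default_rng(0)
phi = rng.random((3, 2))              # phi[i, x_i] = Phi(x_i, y_i), y held fixed
edges = [(0, 1), (1, 2)]
psi = rng.random((len(edges), 2, 2))  # psi[e, x_i, x_j] = Psi(x_i, x_j) on edge e

def unnormalized(x):
    p = np.prod([phi[i, x[i]] for i in range(3)])
    for e, (i, j) in enumerate(edges):
        p *= psi[e, x[i], x[j]]
    return p

Z = sum(unnormalized(x) for x in product(range(2), repeat=3))
P = {x: unnormalized(x) / Z for x in product(range(2), repeat=3)}  # sums to 1
```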

15 In order to use MRFs: Given the observations y and the parameters of the MRF, how do we infer the hidden variables x? How do we learn the parameters of the MRF?

16 Inference in Markov Random Fields: Gibbs sampling / simulated annealing; iterated conditional modes (ICM); belief propagation (application examples: super-resolution, motion analysis, shading/reflectance separation); graph cuts; variational methods.

17 Inference in Markov Random Fields (outline repeated): Gibbs sampling / simulated annealing; iterated conditional modes (ICM); belief propagation (application examples: super-resolution, motion analysis, shading/reflectance separation); graph cuts; variational methods.

18 Derivation of belief propagation [figure: the chain network again, as the starting point for the derivation]

19 No factorization with loops! [figure: the chain closed into a loop by an extra edge with compatibility Ψ(x_1, x_3); the posterior no longer factorizes]

20 Applications of belief propagation in low-level vision Bill Freeman Massachusetts Institute of Technology Jan. 12, 2010 Joint work with: Egon Pasztor, Jonathan Yedidia, Yair Weiss, Thouis Jones, Edward Adelson, Marshall Tappen.

21 Belief and message updates: the belief is b_j(x_j) ∝ ∏_{k ∈ N(j)} m_kj(x_j), and the message from node i to node j is m_ij(x_j) = Σ_{x_i} Ψ(x_i, x_j) ∏_{k ∈ N(i)\j} m_ki(x_i). [figure: message passing from node i to node j]

22 Optimal solution in a chain or tree: Belief Propagation. The "do the right thing" Bayesian algorithm. For Gaussian random variables over time: the Kalman filter. For hidden Markov models: the forward/backward algorithm (and the MAP variant is Viterbi).

23 Justification for running belief propagation in networks with loops. Experimental results: error-correcting codes (Kschischang and Frey, 1998; McEliece et al., 1998); vision applications (Freeman and Pasztor, 1999; Frey, 2000). Theoretical results: for Gaussian processes, the means are correct (Weiss and Freeman, 1999); a large-neighborhood local maximum for MAP (Weiss and Freeman, 2000); equivalent to the Bethe approximation in statistical physics (Yedidia, Freeman, and Weiss, 2000); tree-based reparameterization (Wainwright, Willsky, Jaakkola, 2001).

24 Results from Bethe free energy analysis. The fixed points of the belief propagation equations are exactly the stationary points of the Bethe approximation, and belief propagation always has a fixed point. Connection with variational methods for inference: both minimize approximations to the free energy; variational methods usually work with the primal variables, while belief propagation solves fixed-point equations for the dual variables. Kikuchi approximations lead to more accurate belief propagation algorithms. There are other Bethe free energy minimization algorithms (Yuille, Welling, etc.).
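For reference, the Bethe free energy in question can be written as follows (the standard form from Yedidia et al.; b_i and b_ij are the node and pairwise beliefs, and q_i is the number of neighbors of node i):

```latex
F_{\mathrm{Bethe}} =
  \sum_{(ij)} \sum_{x_i, x_j} b_{ij}(x_i, x_j)
    \ln \frac{b_{ij}(x_i, x_j)}{\Psi(x_i, x_j)\,\Phi(x_i, y_i)\,\Phi(x_j, y_j)}
  \;-\; \sum_i (q_i - 1) \sum_{x_i} b_i(x_i)
    \ln \frac{b_i(x_i)}{\Phi(x_i, y_i)}
```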

25 References on BP and GBP: J. Pearl, 1985 (the classic); Y. Weiss, NIPS 1998 (inspired the application of BP to vision); W. Freeman et al., Learning Low-Level Vision, IJCV 1999 (applications in super-resolution, motion, shading/paint discrimination); H. Shum et al., ECCV 2002 (application to stereo); M. Wainwright, T. Jaakkola, A. Willsky (reparameterization version); J. Yedidia, AAAI 2000 (the clearest place to read about BP and GBP).

26 Inference in Markov Random Fields: Gibbs sampling / simulated annealing; iterated conditional modes (ICM); belief propagation (application examples: super-resolution, motion analysis, shading/reflectance separation); graph cuts; variational methods.

27 Super-resolution. Image: the low-resolution image. Scene: the high-resolution image. [figure: image and scene patch grids; the ultimate goal is to infer the high-resolution scene from the low-resolution image]

28 Polygon-based graphics images are resolution independent; pixel-based images are not. [figure panels: pixel replication; cubic spline; cubic spline, sharpened; training-based super-resolution]

29 Three approaches to perceptual sharpening: (1) sharpening: boost existing high frequencies; (2) use multiple frames to obtain a higher sampling rate in a still frame; (3) estimate high frequencies not present in the image, although implicitly defined. In this talk, we focus on (3), which we'll call super-resolution. [figure: amplitude vs. spatial frequency spectra]

30 Super-resolution: other approaches. Schultz and Stevenson, 1994; Pentland and Horowitz, 1993; fractal image compression (Polvere, 1998; Iterated Systems); astronomical image processing (e.g., Gull and Daniell, 1978; pixons). Follow-on: Jianchao Yang, John Wright, Thomas S. Huang, Yi Ma, Image Super-Resolution as Sparse Representation of Raw Image Patches, CVPR 2008.

31 Training images: ~100,000 image/scene patch pairs. Images from two Corel database categories: giraffes and urban skyline.

32 Do a first interpolation. [figure panels: low-resolution input; zoomed low-resolution]

33 [figure panels: low-resolution input; zoomed low-resolution; full-frequency original]

34 [figure panels: zoomed low-frequency input; representation; full-frequency original]

35 [figure panels: low-band input (contrast normalized, PCA fitted); true high frequencies; zoomed low-frequency input; representation; full-frequency original.] To minimize the complexity of the relationships we have to learn, we remove the lowest frequencies from the input image and normalize the local contrast level.
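A sketch of that preprocessing step, assuming scipy; the filter widths and the exact normalization constant are guesses, not the values used in the talk:

```python
import numpy as np
from scipy.ndimage import gaussian_filter, zoom

def preprocess(low_res, scale=2, sigma=3.0, eps=0.01):
    """Interpolate, drop the lowest frequencies, and contrast-normalize.

    Returns the normalized mid-band (the network input) and the local
    contrast map, which is needed later to undo the normalization.
    """
    up = zoom(low_res.astype(float), scale, order=3)  # initial cubic interpolation
    lowpass = gaussian_filter(up, sigma)
    band = up - lowpass                               # remove lowest frequencies
    contrast = gaussian_filter(np.abs(band), sigma) + eps
    return band / contrast, contrast
```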

36 Training data samples (magnified): gather ~100,000 patch pairs, each pairing a low-frequency patch with its high-frequency counterpart.

37 Nearest-neighbor estimate. [figure: input low frequencies; the training data samples (magnified) whose low frequencies best match; the estimated high frequencies, with the true high frequencies for comparison]

38 Nearest-neighbor estimate, continued. [figure: input low frequencies; matching training data samples (magnified); estimated high frequencies]

39 Example: input image patch and the closest matches from the database. [figure panels: input patch; closest image patches from the database; corresponding high-resolution patches from the database]
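A minimal version of the candidate lookup, assuming the training patches are stored as flattened rows of two numpy arrays; the names and the choice of k are mine:

```python
import numpy as np

def nearest_candidates(input_low, train_low, train_high, k=16):
    """Return the k training pairs whose low-freq. patches best match the
    input patch (squared L2 distance); the k high-freq. patches become
    the candidate states of the corresponding scene node."""
    d2 = np.sum((train_low - input_low) ** 2, axis=1)
    idx = np.argsort(d2)[:k]
    return train_low[idx], train_high[idx]
```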

40

41 Scene-scene compatibility function Ψ(x_i, x_j). Assume the overlapped regions d of the candidate high-resolution patches differ by Gaussian observation noise: Ψ(x_i, x_j) ∝ exp(−|d_ij − d_ji|² / 2σ²), where d_ij denotes node i's candidate pixels in the overlap with node j. This is a uniqueness constraint, not smoothness.

42 Image-scene compatibility function Φ(x_i, y_i). Assume Gaussian noise takes you from the observed image patch y to the synthetic sample: Φ(x_i, y_i) ∝ exp(−|y_i − y(x_i)|² / 2σ²).
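Together, the two compatibility functions might be tabulated like this for one node and one neighbor; the Gaussian form follows the slides, while the array layout and σ values are assumptions:

```python
import numpy as np

def psi_overlap(over_i, over_j, sigma=1.0):
    """Psi(x_i, x_j): agreement of candidate high-res patches in the region
    d where neighbors overlap. over_i, over_j: (K, D) arrays holding each
    candidate's pixels in the shared overlap strip."""
    d2 = np.sum((over_i[:, None, :] - over_j[None, :, :]) ** 2, axis=2)
    return np.exp(-d2 / (2.0 * sigma**2))   # (K, K) compatibility matrix

def phi_obs(obs_patch, cand_low, sigma=1.0):
    """Phi(x_i, y_i): Gaussian match between the observed low-freq. patch
    and each candidate's stored low-freq. patch."""
    d2 = np.sum((cand_low - obs_patch) ** 2, axis=1)
    return np.exp(-d2 / (2.0 * sigma**2))   # (K,) evidence vector
```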

43 Markov network. [figure: image patches y_i connected to scene patches x_i via Φ(x_i, y_i); neighboring scene patches connected via Ψ(x_i, x_j)]

44 Belief propagation. [figure panels: input; iteration 0; iteration 1; iteration 3.] After a few iterations of belief propagation, the algorithm selects spatially consistent high-resolution interpretations for each low-resolution patch of the input image.
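Putting the pieces together, here is a toy iteration loop, shown on a 1-D chain of patch nodes for brevity; the actual model is a 2-D grid with loops, where the same updates are simply repeated for a few iterations:

```python
import numpy as np

def run_bp(psi, phi, n_iter=3):
    """Synchronous BP on a chain of patch nodes.

    psi[i] : (K, K) compatibility between node i and node i+1
    phi[i] : (K,) evidence vector for node i
    """
    n = len(phi)
    fwd = [np.ones_like(phi[i]) for i in range(n)]  # message from i-1 into i
    bwd = [np.ones_like(phi[i]) for i in range(n)]  # message from i+1 into i
    for _ in range(n_iter):
        for i in range(1, n):
            m = psi[i - 1].T @ (phi[i - 1] * fwd[i - 1])
            fwd[i] = m / m.sum()
        for i in range(n - 2, -1, -1):
            m = psi[i] @ (phi[i + 1] * bwd[i + 1])
            bwd[i] = m / m.sum()
    beliefs = [phi[i] * fwd[i] * bwd[i] for i in range(n)]
    # Taking np.argmax of each belief selects one candidate patch per node.
    return [b / b.sum() for b in beliefs]
```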

45 Zooming 2 octaves. [figure panels: 85x51 input; cubic spline zoom to 340x204; maximum-likelihood zoom to 340x204.] We apply the super-resolution algorithm recursively, zooming up 2 powers of 2, or a factor of 4 in each dimension.
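In code, the recursion is just repeated application of a hypothetical single-octave routine built from the pieces sketched above:

```python
# super_resolve_octave: hypothetical one-octave pipeline (preprocess,
# candidate lookup, BP over the patch grid, then reassemble the patches).
img_2x = super_resolve_octave(low_res)  # e.g. 85x51   -> 170x102
img_4x = super_resolve_octave(img_2x)   # e.g. 170x102 -> 340x204
```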

46 [figure panels: original 50x58; true 200x232.] Now we examine the effect of the prior assumptions made about images on the high-resolution reconstruction. First, cubic spline interpolation (cubic spline implies a thin-plate prior).

47 [figure panels: original 50x58; cubic spline; true 200x232 (cubic spline implies a thin-plate prior)]

48 [figure panels: original 50x58; training images; true.] Next, train the Markov network algorithm on a world of random-noise images.

49 [figure panels: original 50x58; training images; Markov network result; true.] The algorithm learns that, in such a world, we add random noise when zooming to a higher resolution.

50 [figure panels: original 50x58; training images; true.] Next, train on a world of vertically oriented rectangles.