1 Lecture 8: Image Alignment and RANSAC
CS6670: Computer Vision, Noah Snavely

2 Reading Szeliski Chapter 6.1

3 Estimating Translations
Problem: more equations than unknowns
An "overdetermined" system of equations
We will find the least squares solution

4 Least squares formulation
For each point i, we define the residuals as
r_xi(t_x) = (x_i + t_x) - x_i'
r_yi(t_y) = (y_i + t_y) - y_i'

5 Least squares formulation
Goal: minimize the sum of squared residuals
C(t_x, t_y) = sum_i [ r_xi(t_x)^2 + r_yi(t_y)^2 ]
This is the "least squares" solution
For translations, the minimizer is equal to the mean displacement over all matches

6 Least squares formulation
Can also write as a matrix equation A t = b,
where A is 2n x 2, t is 2 x 1, and b is 2n x 1

7 Least squares
Find the t that minimizes ||A t - b||^2
To solve, form the normal equations: A^T A t = A^T b, i.e., t = (A^T A)^-1 A^T b
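As a concrete illustration (a sketch, not code from the lecture), here is a minimal numpy version that forms and solves the normal equations for the translation model; the function name and array layout are choices made for this example:

```python
import numpy as np

def estimate_translation(pts, pts_prime):
    """Least-squares translation from n matched points.

    pts, pts_prime: (n, 2) arrays of matched (x, y) coordinates.
    Stacks the 2n equations t = p' - p into A t = b (A is 2n x 2)
    and solves the normal equations A^T A t = A^T b.
    """
    n = len(pts)
    A = np.tile(np.eye(2), (n, 1))         # 2n x 2: stacked 2x2 identities
    b = (pts_prime - pts).reshape(-1)      # 2n: stacked displacements
    t = np.linalg.solve(A.T @ A, A.T @ b)  # normal equations
    return t
```

Because A^T A = n I here, the solution works out to exactly the mean of the per-match displacements, matching the slide above.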

8 Affine transformations
How many unknowns? Six: a, b, c, d, e, f
How many equations per match? Two: one for x' and one for y'
How many matches do we need? At least three

9 Affine transformations
Residuals:
r_xi = (a x_i + b y_i + c) - x_i'
r_yi = (d x_i + e y_i + f) - y_i'
Cost function:
C(a, b, c, d, e, f) = sum_i [ r_xi^2 + r_yi^2 ]

10 Affine transformations
Matrix form: A t = b,
where A is 2n x 6, t is 6 x 1, and b is 2n x 1
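A matching numpy sketch for the affine case (again illustrative, not lecture code); np.linalg.lstsq minimizes ||A t - b||^2 directly, which is equivalent to solving the normal equations:

```python
import numpy as np

def estimate_affine(pts, pts_prime):
    """Least-squares affine fit from n >= 3 matches.

    Unknowns t = (a, b, c, d, e, f) with
    x' = a*x + b*y + c and y' = d*x + e*y + f.
    """
    n = len(pts)
    A = np.zeros((2 * n, 6))               # 2n x 6
    b = np.zeros(2 * n)                    # 2n
    for i, ((x, y), (xp, yp)) in enumerate(zip(pts, pts_prime)):
        A[2 * i]     = [x, y, 1, 0, 0, 0]
        A[2 * i + 1] = [0, 0, 0, x, y, 1]
        b[2 * i], b[2 * i + 1] = xp, yp
    t, *_ = np.linalg.lstsq(A, b, rcond=None)  # minimizes ||A t - b||^2
    return t.reshape(2, 3)                 # [[a, b, c], [d, e, f]]
```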

11 Homographies
To unwarp (rectify) an image, solve for the homography H given corresponding points p and p'
Solve equations of the form w p' = H p
Linear in the unknowns: w and the coefficients of H
H is defined up to an arbitrary scale factor
How many points are necessary to solve for H?

12 Solving for homographies
Writing p = (x_i, y_i, 1) and p' = (x_i', y_i', 1), the equation w p' = H p gives, after dividing out w,
x_i' = (h00 x_i + h01 y_i + h02) / (h20 x_i + h21 y_i + h22)
y_i' = (h10 x_i + h11 y_i + h12) / (h20 x_i + h21 y_i + h22)

13 Solving for homographies
Cross-multiplying yields two equations per point, linear in the entries of H:
x_i' (h20 x_i + h21 y_i + h22) = h00 x_i + h01 y_i + h02
y_i' (h20 x_i + h21 y_i + h22) = h10 x_i + h11 y_i + h12

14 Solving for homographies
Stacking the equations for all n matches gives A h = 0, where A is 2n x 9 and h is the 9 x 1 vector of entries of H
Defines a least squares problem: minimize ||A h||^2
Since h is only defined up to scale, solve for a unit vector h_hat
Solution: h_hat = eigenvector of A^T A with the smallest eigenvalue
Works with 4 or more points
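A minimal direct-linear-transform sketch in numpy, under the assumption that no coordinate normalization is applied (a practical implementation would typically normalize the points first, following Hartley). The last row of Vt from np.linalg.svd is the unit vector minimizing ||A h||, equivalently the smallest eigenvector of A^T A:

```python
import numpy as np

def estimate_homography(pts, pts_prime):
    """DLT: homography H from n >= 4 matches.

    Builds the 2n x 9 system A h = 0 from the two linear
    equations per point derived on the previous slides.
    """
    rows = []
    for (x, y), (xp, yp) in zip(pts, pts_prime):
        rows.append([x, y, 1, 0, 0, 0, -xp * x, -xp * y, -xp])
        rows.append([0, 0, 0, x, y, 1, -yp * x, -yp * y, -yp])
    A = np.asarray(rows)                   # 2n x 9
    _, _, Vt = np.linalg.svd(A)
    h = Vt[-1]                             # unit vector minimizing ||A h||
    return h.reshape(3, 3)                 # defined only up to scale
```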

15 Questions?

16 Image Alignment Algorithm
Given images A and B:
1. Compute image features for A and B
2. Match features between A and B
3. Compute the homography between A and B using least squares on the set of matches
What could go wrong?
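One way to realize these steps, sketched with OpenCV; the SIFT features, the 0.75 ratio-test threshold, and method=0 (plain least squares) in cv2.findHomography are choices made for this example, not prescribed by the lecture:

```python
import cv2
import numpy as np

def align(img_a, img_b):
    """Steps 1-3: features, matches, least-squares homography."""
    sift = cv2.SIFT_create()
    kp_a, des_a = sift.detectAndCompute(img_a, None)
    kp_b, des_b = sift.detectAndCompute(img_b, None)

    # Match features; Lowe's ratio test prunes ambiguous matches.
    matches = cv2.BFMatcher().knnMatch(des_a, des_b, k=2)
    good = [m for m, n in matches if m.distance < 0.75 * n.distance]

    src = np.float32([kp_a[m.queryIdx].pt for m in good])
    dst = np.float32([kp_b[m.trainIdx].pt for m in good])
    # method=0: plain least squares over ALL matches -- exactly the
    # step that outliers can ruin (hence RANSAC, below).
    H, _ = cv2.findHomography(src, dst, method=0)
    return H
```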

17 Robustness
Outliers

18 Robustness Let’s consider a simpler example… How can we fix this?
Problem: Fit a line to these datapoints Least squares fit

19 Idea
Given a hypothesized line:
Count the number of points that "agree" with the line
"Agree" = within a small distance of the line, i.e., the inliers to that line
For all possible lines, select the one with the largest number of inliers

20 Counting inliers

21 Counting inliers Inliers: 3

22 Counting inliers Inliers: 20

23 How do we find the best line?
Unlike least squares, there is no simple closed-form solution
Hypothesize-and-test: try out many lines, keep the best one
Which lines?

24 Translations

25 RAndom SAmple Consensus
Select one match at random, count inliers

26 RAndom SAmple Consensus
Select another match at random, count inliers

27 RAndom SAmple Consensus
Output the translation with the highest number of inliers

28 RANSAC
Idea: all the inliers will agree with each other on the translation vector; the (hopefully small) number of outliers will (hopefully) disagree with each other
RANSAC only has guarantees if there are fewer than 50% outliers
"All good matches are alike; every bad match is bad in its own way." – Tolstoy via Alyosha Efros

29 RANSAC
The inlier threshold is related to the amount of noise we expect in inliers
Often model noise as Gaussian with some standard deviation (e.g., 3 pixels)
The number of rounds is related to the percentage of outliers we expect, and the probability of success we'd like to guarantee
Suppose there are 20% outliers, and we want to find the correct answer with 99% probability
How many rounds do we need?

30 RANSAC
[Figure: match displacements plotted in (x translation, y translation) space]
Set the inlier threshold so that, e.g., 95% of the Gaussian lies inside that radius

31 RANSAC
Back to linear regression: how do we generate a hypothesis?

32 RANSAC
Back to linear regression: how do we generate a hypothesis?
Pick a minimal sample, here two points, which together determine a candidate line

33 RANSAC
General version (a line-fitting sketch follows):
1. Randomly choose s samples; typically s = the minimum sample size that lets you fit a model
2. Fit a model (e.g., a line) to those samples
3. Count the number of inliers that approximately fit the model
4. Repeat N times
5. Choose the model that has the largest set of inliers
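The sketch referenced above, specializing the loop to line fitting with s = 2; the threshold, the round count, and the use of vertical distance as the residual are illustrative simplifications:

```python
import numpy as np

def ransac_line(points, n_rounds=100, threshold=1.0, seed=0):
    """RANSAC for y = a*x + b: hypothesize from s = 2 points,
    score by inlier count, keep the best hypothesis."""
    rng = np.random.default_rng(seed)
    best, best_inliers = None, np.zeros(len(points), dtype=bool)
    for _ in range(n_rounds):
        i, j = rng.choice(len(points), size=2, replace=False)
        (x1, y1), (x2, y2) = points[i], points[j]
        if x1 == x2:                       # degenerate (vertical) sample
            continue
        a = (y2 - y1) / (x2 - x1)
        b = y1 - a * x1
        # Inliers: points within `threshold` of the hypothesized line
        # (vertical residual used here for simplicity).
        resid = np.abs(points[:, 1] - (a * points[:, 0] + b))
        inliers = resid < threshold
        if inliers.sum() > best_inliers.sum():
            best, best_inliers = (a, b), inliers
    return best, best_inliers
```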

34 How many rounds?
If we have to choose s samples each time, with an outlier ratio e, and we want the right answer with probability p, we need N rounds, where
N = log(1 - p) / log(1 - (1 - e)^s)
For p = 0.99:

proportion of outliers e
s      5%    10%    20%    25%    30%    40%    50%
2       2     3      5      6      7     11     17
3       3     4      7      9     11     19     35
4       3     5      9     13     17     34     72
5       4     6     12     17     26     57    146
6       4     7     16     24     37     97    293
7       4     8     20     33     54    163    588
8       5     9     26     44     78    272   1177

Source: M. Pollefeys
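The table follows directly from the formula, since a round succeeds only if all s samples are inliers, which happens with probability (1 - e)^s. A small helper (illustrative, not lecture code) reproduces any entry:

```python
import math

def ransac_rounds(p, e, s):
    """N = log(1 - p) / log(1 - (1 - e)**s), rounded up."""
    return math.ceil(math.log(1 - p) / math.log(1 - (1 - e) ** s))

print(ransac_rounds(p=0.99, e=0.50, s=4))  # 72, matching the table
```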

35 How big is s?
For alignment, it depends on the motion model
Here, each sample is a correspondence (a pair of matching points)
Translation needs s = 1 match; affine, s = 3; homography, s = 4

36 RANSAC pros and cons
Pros:
Simple and general
Applicable to many different problems
Often works well in practice
Cons:
Parameters to tune
Sometimes too many iterations are required
Can fail for extremely low inlier ratios
We can often do better than brute-force sampling

37 Final step: least squares fit
Find average translation vector over all inliers
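In code, this refit is a one-liner over the inlier mask returned by RANSAC (function and argument names are illustrative):

```python
import numpy as np

def refine_translation(pts, pts_prime, inliers):
    """Refit on inliers only: for translations, the least-squares
    solution is just the mean displacement over the inlier set."""
    return (pts_prime[inliers] - pts[inliers]).mean(axis=0)
```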

38 RANSAC
An example of a "voting"-based fitting scheme
Each hypothesis gets voted on by each data point; the best hypothesis wins
There are many other types of voting schemes, e.g., Hough transforms

39 Questions? 3-minute break

40 Blending We’ve aligned the images – now what?

41 Blending Want to seamlessly blend them together

42 Image Blending

43 Feathering
[Figure: image 1 x ramp weights + image 2 x complementary weights = feathered blend; weights fall from 1 to 0 across the overlap]
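A minimal feathering sketch, assuming two color float images of the same height, already aligned so that the last and first `overlap` columns of the left and right images cover the same scene region:

```python
import numpy as np

def feather_blend(left, right, overlap):
    """Linearly ramp blend weights across a horizontal overlap."""
    ramp = np.linspace(1.0, 0.0, overlap)[None, :, None]  # 1 -> 0
    blend = left[:, -overlap:] * ramp + right[:, :overlap] * (1.0 - ramp)
    return np.concatenate([left[:, :-overlap], blend,
                           right[:, overlap:]], axis=1)
```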

44 Effect of window size
[Figure: feathered blends of the left and right images with one window size]

45 Effect of window size
[Figure: the same blend with a different window size]

46 Good window size
"Optimal" window: smooth but not ghosted
Doesn't always work...

47 Pyramid blending
Create a Laplacian pyramid, blend each level
Burt, P. J. and Adelson, E. H. A multiresolution spline with applications to image mosaics. ACM Transactions on Graphics, 2(4):217-236, October 1983.

48 The Laplacian Pyramid
[Figure: Gaussian pyramid alongside the Laplacian pyramid; each Laplacian level = Gaussian level - expand(next coarser Gaussian level)]
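A compact blending sketch using OpenCV's pyrDown/pyrUp; it assumes float32 inputs whose height and width are divisible by 2**levels, and a mask of the same shape as the images (these constraints keep the reduce/expand sizes consistent):

```python
import cv2
import numpy as np

def pyramid_blend(img1, img2, mask, levels=5):
    """Blend two images by blending each Laplacian pyramid level
    with the Gaussian pyramid of the mask, then collapsing."""
    # Gaussian pyramids of both images and of the mask.
    g1, g2, gm = [img1], [img2], [mask]
    for _ in range(levels):
        g1.append(cv2.pyrDown(g1[-1]))
        g2.append(cv2.pyrDown(g2[-1]))
        gm.append(cv2.pyrDown(gm[-1]))

    # Laplacian pyramids: L_i = G_i - expand(G_{i+1});
    # the coarsest Gaussian level is kept as the base.
    l1 = [g1[i] - cv2.pyrUp(g1[i + 1]) for i in range(levels)] + [g1[-1]]
    l2 = [g2[i] - cv2.pyrUp(g2[i + 1]) for i in range(levels)] + [g2[-1]]

    # Blend each level with the smoothed mask, then collapse back up.
    blended = [m * a + (1 - m) * b for m, a, b in zip(gm, l1, l2)]
    out = blended[-1]
    for lap in reversed(blended[:-1]):
        out = cv2.pyrUp(out) + lap
    return out
```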

49 Alpha Blending
Optional: see Blinn (CGA, 1994) for details
Encoding blend weights: I(x, y) = (αR, αG, αB, α), i.e., α-premultiplied RGB plus an α channel
Color at p = (accumulated premultiplied RGB at p) / (accumulated α at p)
Implement this in two steps:
1. Accumulate: add up the (α-premultiplied) RGB values at each pixel
2. Normalize: divide each pixel's accumulated RGB by its α value
Q: what if α = 0? A: render as black
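The two-step scheme in numpy, assuming each input layer already stores α-premultiplied RGBA:

```python
import numpy as np

def composite(layers):
    """Accumulate-then-normalize blending of premultiplied layers.

    Each layer is an (H, W, 4) float array storing (aR, aG, aB, a).
    """
    acc = np.sum(layers, axis=0)           # step 1: accumulate
    rgb, alpha = acc[..., :3], acc[..., 3:]
    # Step 2: normalize; where alpha == 0 there is no data,
    # so render those pixels as black.
    return np.divide(rgb, alpha, out=np.zeros_like(rgb),
                     where=alpha > 0)
```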

50 Poisson Image Editing
For more info: Pérez et al., SIGGRAPH 2003

51 Some panorama examples
Before SIGGRAPH deadline:

52 Some panorama examples
Every image on Google Streetview

53 Magic: ghost removal
M. Uyttendaele, A. Eden, and R. Szeliski. Eliminating ghosting and exposure artifacts in image mosaics. In Proceedings of the International Conference on Computer Vision and Pattern Recognition, volume 2, Kauai, Hawaii, December 2001.

54 Magic: ghost removal
How to handle motion?
(Uyttendaele, Eden, and Szeliski, 2001; full citation above)

55 Questions?

