Lecture 10: Robust fitting CS4670: Computer Vision Noah Snavely.

Announcements
– Quiz on Friday
– Project 2a due Monday
– Prelim?

Least squares: translations
Matrix form: A t = b, with A of size 2n x 2, t of size 2 x 1, b of size 2n x 1

Least squares
Find the t that minimizes ||A t - b||^2
To solve, form the normal equations: (A^T A) t = A^T b
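
As a concrete sketch of this step (assuming numpy; the function name is illustrative, not from the slides), we can stack one 2x2 identity block per match to build the 2n x 2 system from the previous slide and solve it in the least-squares sense:

```python
import numpy as np

def fit_translation(p, q):
    """Least-squares translation t minimizing sum_i ||p_i + t - q_i||^2.
    p, q: (n, 2) arrays of matched 2D points."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    n = len(p)
    # A stacks n copies of the 2x2 identity: one row pair per match (2n x 2).
    A = np.tile(np.eye(2), (n, 1))
    # b stacks the difference vectors q_i - p_i (2n,).
    b = (q - p).reshape(-1)
    # lstsq solves the normal equations (A^T A) t = A^T b internally.
    t, *_ = np.linalg.lstsq(A, b, rcond=None)
    return t
```

For a pure translation this reduces to the mean of the difference vectors, but the matrix form mirrors the slide and generalizes to the affine case.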

Least squares: affine transformations
Matrix form: A x = b, with A of size 2n x 6, x of size 6 x 1, b of size 2n x 1
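
A minimal sketch of the affine case (assuming numpy; function name and parameter ordering are illustrative): each match contributes two rows to the 2n x 6 matrix, one for the x equation and one for the y equation.

```python
import numpy as np

def fit_affine(p, q):
    """Least-squares affine transform q_i ~ M p_i + t.
    Unknowns ordered as [a, b, tx, c, d, ty], so
    x' = a x + b y + tx and y' = c x + d y + ty."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    n = len(p)
    A = np.zeros((2 * n, 6))
    b = q.reshape(-1)                      # [x'_0, y'_0, x'_1, ...]
    A[0::2, 0:2] = p; A[0::2, 2] = 1.0     # rows for the x' equations
    A[1::2, 3:5] = p; A[1::2, 5] = 1.0     # rows for the y' equations
    params, *_ = np.linalg.lstsq(A, b, rcond=None)
    M = params[[0, 1, 3, 4]].reshape(2, 2)
    t = params[[2, 5]]
    return M, t
```

With three or more non-collinear matches the 2n x 6 system is overdetermined, and least squares averages out small matching noise.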

Least squares: generalized linear regression
Model: y = mx + b, fit to data points (x_i, y_i)

Linear regression

Homographies
To unwarp (rectify) an image, solve for the homography H given p and p’
Solve equations of the form: w p’ = H p
– linear in the unknowns: w and the coefficients of H
– H is defined up to an arbitrary scale factor
– how many points are necessary to solve for H?

Solving for homographies

Defines a least squares problem: minimize ||A h||^2, where A is 2n x 9 and h is the 9-vector of the entries of H
Since h is only defined up to scale, solve for a unit vector ĥ
Solution: ĥ = eigenvector of A^T A with the smallest eigenvalue
Works with 4 or more points
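
The standard way to compute this in practice is via the SVD of A, whose last right singular vector equals the eigenvector of A^T A with the smallest eigenvalue. A sketch assuming numpy (function name illustrative):

```python
import numpy as np

def fit_homography(p, q):
    """Direct linear transform (DLT): each match (p_i, q_i) gives two
    rows of the 2n x 9 matrix A; the solution h is the right singular
    vector of A with the smallest singular value."""
    A = []
    for (x, y), (u, v) in zip(p, q):
        # From w [u, v, 1]^T = H [x, y, 1]^T, eliminate w:
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    A = np.asarray(A, float)
    _, _, Vt = np.linalg.svd(A)
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]       # fix the arbitrary scale factor
```

With exactly 4 points (no 3 collinear) the null space of A is one-dimensional and the solution is exact; with more points, the SVD gives the least-squares solution.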

Questions?

Image Alignment Algorithm
Given images A and B:
1. Compute image features for A and B
2. Match features between A and B
3. Compute the homography between A and B using least squares on the set of matches
What could go wrong?

Outliers
[figure: feature matches, with inliers and outliers labeled]

Robustness
Let’s consider a simpler example…
Problem: fit a line to these data points
[figure: least squares fit pulled away from the data by outliers]
How can we fix this?

Idea
Given a hypothesized line:
– Count the number of points that “agree” with the line
– “Agree” = within a small distance of the line, i.e., the inliers to that line
For all possible lines, select the one with the largest number of inliers

Counting inliers
[figure: two candidate lines, one with 3 inliers and one with 20]

How do we find the best line?
Unlike least squares, there is no simple closed-form solution
Hypothesize-and-test:
– Try out many lines, keep the best one
– Which lines?

Translations

RAndom SAmple Consensus Select one match at random, count inliers

RAndom SAmple Consensus Select another match at random, count inliers

RAndom SAmple Consensus Output the translation with the highest number of inliers

RANSAC
Idea:
– All the inliers will agree with each other on the translation vector; the (hopefully small) number of outliers will (hopefully) disagree with each other
– RANSAC only has guarantees if there are < 50% outliers
– “All good matches are alike; every bad match is bad in its own way.” – Tolstoy via Alyosha Efros

RANSAC
Inlier threshold related to the amount of noise we expect in inliers
– Often model noise as Gaussian with some standard deviation (e.g., 3 pixels)
Number of rounds related to the percentage of outliers we expect, and the probability of success we’d like to guarantee
– Suppose there are 20% outliers, and we want to find the correct answer with 99% probability
– How many rounds do we need?

RANSAC
[figure: translation hypotheses in (x translation, y translation) space]
Set the threshold so that, e.g., 95% of the Gaussian lies inside that radius

RANSAC
Back to linear regression: how do we generate a hypothesis?
[figure: 2D data points with sampled candidate lines]

RANSAC
General version:
1. Randomly choose s samples
– Typically s = the minimum sample size that lets you fit a model
2. Fit a model (e.g., a line) to those samples
3. Count the number of inliers that approximately fit the model
4. Repeat N times
5. Choose the model that has the largest set of inliers
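
The five steps above can be sketched for the line-fitting example (assuming numpy; function name, threshold, and round count are illustrative choices, not values from the slides):

```python
import numpy as np

def ransac_line(points, n_rounds=200, threshold=0.1, seed=0):
    """RANSAC for y = m*x + b: s = 2 (minimum to define a line),
    inliers counted by vertical distance to the hypothesized line."""
    rng = np.random.default_rng(seed)
    pts = np.asarray(points, float)
    best_inliers = np.zeros(len(pts), dtype=bool)
    best_model = None
    for _ in range(n_rounds):
        # 1. Randomly choose s = 2 distinct samples.
        i, j = rng.choice(len(pts), size=2, replace=False)
        (x1, y1), (x2, y2) = pts[i], pts[j]
        if x1 == x2:
            continue                      # skip degenerate vertical sample
        # 2. Fit a line to those samples.
        m = (y2 - y1) / (x2 - x1)
        b = y1 - m * x1
        # 3. Count the inliers that approximately fit the model.
        inliers = np.abs(pts[:, 1] - (m * pts[:, 0] + b)) < threshold
        # 5. Keep the model with the largest inlier set.
        if inliers.sum() > best_inliers.sum():
            best_inliers, best_model = inliers, (m, b)
    return best_model, best_inliers
```

Returning the inlier mask as well lets the final refit (the least-squares step at the end of the lecture) use only the inliers.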

How many rounds?
If we have to choose s samples each time
– with an outlier ratio e
– and we want the right answer with probability p
then we need N = log(1 - p) / log(1 - (1 - e)^s) rounds
[table: required rounds N for p = 0.99, outlier ratios e from 5% to 50%, sample sizes s]
Source: M. Pollefeys
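
The standard round-count formula solves (1 - (1 - e)^s)^N <= 1 - p for N, i.e. the probability that every one of N samples contains at least one outlier should be at most 1 - p. A one-line sketch (function name illustrative):

```python
import math

def ransac_rounds(s, e, p=0.99):
    """Rounds N so that at least one all-inlier sample of size s is
    drawn with probability p, given per-point outlier ratio e:
    N = log(1 - p) / log(1 - (1 - e)^s), rounded up."""
    return math.ceil(math.log(1 - p) / math.log(1 - (1 - e) ** s))
```

For the slide’s question (20% outliers, 99% success) with s = 2 this gives 5 rounds; note how quickly N grows with s and e, e.g. s = 4 at 50% outliers already needs 72 rounds.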

How big is s?
For alignment, it depends on the motion model
– Here, each sample is a correspondence (a pair of matching points)

RANSAC pros and cons
Pros:
– Simple and general
– Applicable to many different problems
– Often works well in practice
Cons:
– Parameters to tune
– Sometimes too many iterations are required
– Can fail for extremely low inlier ratios
– We can often do better than brute-force sampling

Final step: least squares fit
Find the average translation vector over all inliers

RANSAC
An example of a “voting”-based fitting scheme:
– Each hypothesis gets voted on by each data point; the best hypothesis wins
– There are many other types of voting schemes, e.g., Hough transforms…

Hough transform