Fitting and Registration Computer Vision CS 543 / ECE 549 University of Illinois Derek Hoiem 02/14/12

Announcements: HW 1 due today. Early feedback form.

Fitting: find the parameters of a model that best fit the data. Alignment: find the parameters of the transformation that best align matched points.

Fitting and Alignment: Design challenges
– Design a suitable goodness of fit measure: similarity should reflect application goals; encode robustness to outliers and noise.
– Design an optimization method: avoid local optima (how can we be sure we have the best fit?) and find the best parameters quickly (find ways to avoid testing all parameters).

Fitting and Alignment: Methods
Global optimization / search for parameters:
– Least squares fit
– Robust least squares
– Iterative closest point (ICP)
Hypothesize and test:
– Generalized Hough transform
– RANSAC

Simple example: Fitting a line

Least squares line fitting. Data: (x_1, y_1), …, (x_n, y_n). Line equation: y_i = m x_i + b. Find (m, b) to minimize the sum of squared vertical residuals, E = Σ_i (y_i − m x_i − b)². Writing this as minimizing ||Ap − y||² with rows of A equal to [x_i, 1] and p = [m; b], the Matlab solution is p = A \ y. Modified from S. Lazebnik.
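A minimal Matlab sketch of this fit, assuming made-up sample data x and y; A and p follow the slide's A \ y formulation:

```matlab
% Least squares line fit: solve [x 1] * [m; b] = y in the least squares sense.
x = [1; 2; 3; 4; 5];                  % made-up sample data
y = [2.1; 3.9; 6.2; 8.1; 9.8];        % roughly y = 2x
A = [x, ones(size(x))];               % design matrix: each row is [x_i, 1]
p = A \ y;                            % backslash solves the normal equations
m = p(1); b = p(2);
fprintf('m = %.3f, b = %.3f\n', m, b);
```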

Problem with “vertical” least squares: it is not rotation-invariant and fails completely for vertical lines. Slide from S. Lazebnik.

Total least squares. Line: ax + by + c = 0, with unit normal N = (a, b), so a² + b² = 1. Then the distance between point (x_i, y_i) and the line is |ax_i + by_i + c|. Find (a, b, c) to minimize the sum of squared perpendicular distances, E = Σ_i (ax_i + by_i + c)². Slide modified from S. Lazebnik; proof: LineDistance2-Dimensional.html.

Total least squares. Find (a, b, c) to minimize the sum of squared perpendicular distances. Minimizing over c gives c = −(a x̄ + b ȳ); substituting back, minimize ||An||² with rows of A equal to [x_i − x̄, y_i − ȳ] and n = (a, b), subject to ||n|| = 1. The solution is the eigenvector corresponding to the smallest eigenvalue of AᵀA (see details on the Rayleigh quotient). Slide modified from S. Lazebnik.
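A minimal Matlab sketch of total least squares under the same made-up data; centering the points eliminates c, and eig supplies the smallest-eigenvalue eigenvector:

```matlab
% Total least squares line fit: minimize sum of squared perpendicular
% distances. Center the data so c drops out, then take the eigenvector
% of A'*A with the smallest eigenvalue as the unit normal (a, b).
x = [1; 2; 3; 4; 5];                  % made-up sample data
y = [2.1; 3.9; 6.2; 8.1; 9.8];
A = [x - mean(x), y - mean(y)];       % centered data matrix
[V, D] = eig(A' * A);                 % columns of V are eigenvectors
[~, idx] = min(diag(D));              % index of smallest eigenvalue
n = V(:, idx);                        % unit normal (a, b)
a = n(1); b = n(2);
c = -(a * mean(x) + b * mean(y));     % line: a*x + b*y + c = 0
```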

Recap: Two Common Optimization Problems.
1. Minimize ||Ax − b||²: solution x = (AᵀA)⁻¹Aᵀb (Matlab: x = A \ b).
2. Minimize ||Ax||² subject to ||x|| = 1: solution is the eigenvector of AᵀA with the smallest eigenvalue (Matlab: [V, D] = eig(A' * A)).

Least squares (global) optimization.
Good: clearly specified objective; optimization is easy.
Bad: may not be what you want to optimize; sensitive to outliers (bad matches, extra points); doesn't allow you to get multiple good fits (detecting multiple objects, lines, etc.).

Robust least squares (to deal with outliers). General approach: minimize Σ_i ρ(u_i(x_i, θ); σ), where u_i is the residual of the i-th point w.r.t. the model parameters θ and ρ is a robust function with scale parameter σ. The robust function ρ favors configurations with small residuals but assigns a roughly constant penalty to large residuals. Slide from S. Savarese.
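The slide leaves ρ unspecified beyond these properties; one common choice with exactly this behavior is the Geman-McClure function ρ(u; σ) = u²/(σ² + u²). A Matlab sketch of iteratively reweighted least squares (IRLS) with that assumed ρ, on made-up data containing one outlier:

```matlab
% Iteratively reweighted least squares (IRLS) for a robust line fit.
% rho is assumed to be Geman-McClure, rho(u) = u^2/(sigma^2 + u^2);
% the corresponding IRLS weight is sigma^2 / (sigma^2 + u^2)^2.
x = [1; 2; 3; 4; 5; 6];  y = [2; 4; 6; 8; 10; 30];   % last point is an outlier
A = [x, ones(size(x))];
sigma = 1.0;                             % scale parameter
p = A \ y;                               % ordinary least squares initialization
for iter = 1:20
    u = y - A * p;                       % residuals
    w = sigma^2 ./ (sigma^2 + u.^2).^2;  % downweights large residuals
    W = diag(w);
    p = (A' * W * A) \ (A' * W * y);     % weighted least squares update
end
```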

Robust Estimator

Demo – part 1

Other ways to search for parameters (for when no closed-form solution exists).
Line search: (1) for each parameter, step through values and choose the value that gives the best fit; (2) repeat until no parameter changes.
Grid search: (1) propose several sets of parameters, evenly sampled in the joint set; (2) choose the best (or top few) and sample joint parameters around the current best; repeat.
Gradient descent: (1) provide an initial position (e.g., random); (2) locally search for better parameters by following the gradient.
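As a concrete illustration of the gradient-descent option, a small Matlab sketch on the line-fit objective (where a closed form does exist; it is used here only because the gradient is simple). The data and step size are made-up illustration values:

```matlab
% Gradient descent on the least squares objective
% E(m, b) = sum_i (y_i - m*x_i - b)^2.
x = [1; 2; 3; 4; 5];  y = [2.1; 3.9; 6.2; 8.1; 9.8];
theta = [0; 0];                            % initial position [m; b]
eta = 0.01;                                % step size
for iter = 1:5000
    r = y - (theta(1) * x + theta(2));     % residuals
    grad = [-2 * sum(r .* x); -2 * sum(r)];% [dE/dm; dE/db]
    theta = theta - eta * grad;            % follow the negative gradient
end
```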

Hypothesize and test.
1. Propose parameters: try all possible; each point votes for all consistent parameters; or repeatedly sample enough points to solve for parameters.
2. Score the given parameters: number of consistent points, possibly weighted by distance.
3. Choose from among the set of parameters: global or local maximum of scores.
4. Possibly refine parameters using inliers.

Hough Transform: Outline.
1. Create a grid of parameter values.
2. Each point votes for a set of parameters, incrementing those values in the grid.
3. Find the maximum or local maxima in the grid.

Hough transform: given a set of points, find the curve or line that explains the data points best. A line y = mx + b in image space (x, y) corresponds to a single point (m, b) in Hough space. P.V.C. Hough, Machine Analysis of Bubble Chamber Pictures, Proc. Int. Conf. High Energy Accelerators and Instrumentation, 1959. Slide from S. Savarese.

Hough transform: conversely, a single point (x_0, y_0) in image space maps to the line b = −x_0 m + y_0 in Hough space; points lying on the same line produce Hough-space lines that intersect at that line's (m, b). Slide from S. Savarese.

Hough transform. Issue: the parameter space [m, b] is unbounded (vertical lines have infinite slope). Use a polar representation for the parameter space instead, x cos θ + y sin θ = ρ, which bounds both parameters. P.V.C. Hough, Machine Analysis of Bubble Chamber Pictures, Proc. Int. Conf. High Energy Accelerators and Instrumentation, 1959. Slide from S. Savarese.

Hough transform experiments: clean features (left) and their votes in parameter space (right). Slide from S. Savarese.

Hough transform experiments, noisy data: features and votes; need to adjust grid size or smooth. Slide from S. Savarese.

Hough transform experiments. Issue: spurious peaks in the votes due to uniform noise in the features. Slide from S. Savarese.

1. Image → Canny

2. Canny → Hough votes

3. Hough votes → Edges: find peaks and post-process.

Hough transform example

Finding lines using the Hough transform.
– Using the (m, b) parameterization.
– Using the (ρ, θ) parameterization; optionally using oriented gradients, so each point votes only for the θ consistent with its gradient.
Practical considerations: bin size, smoothing, finding multiple lines, finding line segments.
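A minimal Matlab sketch of line detection with the (ρ, θ) parameterization on made-up points; the bin sizes (1 degree, 1 pixel) are illustrative choices:

```matlab
% Hough transform for lines: rho = x*cos(theta) + y*sin(theta).
pts = [ (1:20)', 2*(1:20)' + 1 ];          % made-up points on y = 2x + 1
thetas = (-90:1:89) * pi/180;              % theta bins, 1-degree steps
rhoMax = ceil(max(sqrt(sum(pts.^2, 2))));  % largest possible |rho|
rhos = -rhoMax:1:rhoMax;                   % rho bins, 1-pixel steps
H = zeros(numel(rhos), numel(thetas));     % accumulator grid
for i = 1:size(pts, 1)                     % each point votes...
    for j = 1:numel(thetas)                % ...for all consistent (rho, theta)
        rho = pts(i,1)*cos(thetas(j)) + pts(i,2)*sin(thetas(j));
        k = round(rho) + rhoMax + 1;       % rho bin index
        H(k, j) = H(k, j) + 1;             % cast a vote
    end
end
[~, idx] = max(H(:));                      % strongest peak = best line
[kBest, jBest] = ind2sub(size(H), idx);
fprintf('rho = %d, theta = %.1f deg\n', rhos(kBest), thetas(jBest)*180/pi);
```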

Finding circles (x_0, y_0, r) using the Hough transform. With fixed r, each point votes for a circle of candidate centers in the (x_0, y_0) plane; with variable r, each point votes along a cone in the full (x_0, y_0, r) parameter space.

Hough transform conclusions.
Good: robust to outliers (each point votes separately); fairly efficient (much faster than trying all sets of parameters); provides multiple good fits.
Bad: some sensitivity to noise; bin size trades off between noise tolerance, precision, and speed/memory (can be hard to find the sweet spot); not suitable for more than a few parameters, since grid size grows exponentially.
Common applications: line fitting (also circles, ellipses, etc.); object instance recognition (parameters are an affine transform); object category recognition (parameters are position/scale).

RANSAC (RANdom SAmple Consensus), Fischler & Bolles 1981: a learning technique to estimate the parameters of a model by random sampling of the observed data. Source: Savarese.

RANSAC algorithm (Fischler & Bolles 1981):
1. Sample (randomly) the number of points required to fit the model.
2. Solve for the model parameters using the samples.
3. Score by the fraction of inliers within a preset threshold of the model.
Repeat 1-3 until the best model is found with high confidence.

Line fitting example (two points per sample, #=2): the same sample/solve/score loop, illustrated step by step. Illustrations by Savarese.
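A minimal Matlab sketch of RANSAC line fitting on made-up data; it scores with vertical rather than perpendicular distance for simplicity, and the trial count and threshold are illustrative values:

```matlab
% RANSAC line fit: sample 2 points, fit a line, count inliers, keep the best.
x = rand(100, 1) * 10;  y = 2*x + 1 + 0.1*randn(100, 1);
y(1:20) = rand(20, 1) * 25;            % make 20% of the points gross outliers
t = 0.3;                               % inlier distance threshold
bestInliers = 0;  bestP = [0 0];
for iter = 1:100                       % N trials
    s = randperm(100, 2);              % 1. sample the 2 points a line needs
    p = polyfit(x(s), y(s), 1);        % 2. solve for [m, b] through them
    r = abs(y - polyval(p, x));        % vertical residuals of all points
    nIn = sum(r < t);                  % 3. score: number of inliers
    if nIn > bestInliers
        bestInliers = nIn;  bestP = p;
    end
end
inl = abs(y - polyval(bestP, x)) < t;  % 4. refine: refit using inliers only
bestP = polyfit(x(inl), y(inl), 1);
```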

How to choose parameters?
Number of samples N: choose N so that, with probability p, at least one random sample is free from outliers (e.g., p = 0.99), given outlier ratio e: N = log(1 − p) / log(1 − (1 − e)^s).
Number of sampled points s: the minimum number needed to fit the model.
Distance threshold t: choose t so that a good point with noise is likely (e.g., with probability 0.95) within the threshold; for zero-mean Gaussian noise with std. dev. σ, t² = 3.84 σ².
Required N for p = 0.99 as a function of s and the proportion of outliers e:
s \ e    5%   10%   20%   25%   30%   40%   50%
2         2     3     5     6     7    11    17
3         3     4     7     9    11    19    35
4         3     5     9    13    17    34    72
5         4     6    12    17    26    57   146
6         4     7    16    24    37    97   293
7         4     8    20    33    54   163   588
8         5     9    26    44    78   272  1177
Modified from M. Pollefeys.
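The table entries follow directly from the sampling formula; a one-line Matlab check with illustrative values of p, e, and s:

```matlab
% Number of RANSAC samples N such that, with probability p, at least one
% sample of s points is outlier-free: N = log(1-p) / log(1 - (1-e)^s).
p = 0.99;                                    % desired confidence
e = 0.3;                                     % assumed outlier ratio
s = 2;                                       % points per sample (2 for a line)
N = ceil(log(1 - p) / log(1 - (1 - e)^s));   % gives N = 7, matching the table
```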

RANSAC conclusions.
Good: robust to outliers; applicable to a larger number of objective-function parameters than the Hough transform; optimization parameters are easier to choose than for the Hough transform.
Bad: computation time grows quickly with the fraction of outliers and the number of parameters; not good for getting multiple fits.
Common applications: computing a homography (e.g., image stitching); estimating the fundamental matrix (relating two views).

Demo – part 2

What if you want to align but have no prior matched pairs? Hough transform and RANSAC are not applicable. Important applications: medical imaging (matching brain scans or contours) and robotics (matching point clouds).

Iterative Closest Points (ICP) Algorithm. Goal: estimate the transform between two dense sets of points.
1. Initialize the transformation (e.g., compute the difference in means and scale).
2. Assign each point in {Set 1} to its nearest neighbor in {Set 2}.
3. Estimate the transformation parameters, e.g., by least squares or robust least squares.
4. Transform the points in {Set 1} using the estimated parameters.
5. Repeat steps 2-4 until the change is very small.
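A translation-only Matlab sketch of ICP on made-up point sets, as a minimal instance of the loop above; nearest neighbors are found by brute force, and a full implementation would also estimate rotation and scale:

```matlab
% Translation-only ICP between point sets P ({Set 1}) and Q ({Set 2}).
Q = rand(50, 2) * 10;                  % made-up target point set (Nx2)
P = Q(randperm(50, 30), :) + 2.5;      % subset of Q shifted by (2.5, 2.5)
t = mean(Q) - mean(P);                 % 1. initialize from difference in means
for iter = 1:20
    Pt = P + t;                        % current transformed Set 1
    D = sqrt((Pt(:,1) - Q(:,1)').^2 + (Pt(:,2) - Q(:,2)').^2);  % all pairwise distances
    [~, nn] = min(D, [], 2);           % 2. nearest neighbor in Q for each point
    tNew = t + mean(Q(nn, :) - Pt, 1); % 3./4. least squares translation update
    if norm(tNew - t) < 1e-6, break; end   % 5. stop when change is very small
    t = tNew;
end
```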

Algorithm Summary.
Least squares fit: closed-form solution; robust to noise; not robust to outliers.
Robust least squares: improves robustness to noise; requires iterative optimization.
Hough transform: robust to noise and outliers; can fit multiple models; only works for a few parameters (1-4 typically).
RANSAC: robust to noise and outliers; works with a moderate number of parameters (e.g., 1-8).
Iterative closest point (ICP): for local alignment only; does not require initial correspondences.

Example: solving for translation. Given matched points A_1, …, A_n in {A} and B_1, …, B_n in {B}, estimate the translation of the object: (x_i^B, y_i^B) = (x_i^A + t_x, y_i^A + t_y).

Example: solving for translation, least squares solution for (t_x, t_y):
1. Write down the objective function, E = Σ_i (x_i^A + t_x − x_i^B)² + (y_i^A + t_y − y_i^B)².
2. Derived solution: (a) compute the derivative; (b) set it to zero and solve.
3. Computational solution: (a) write in the form Ax = b; (b) solve using the pseudo-inverse or eigenvalue decomposition.
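For pure translation the derived solution has a closed form, t = mean(B) − mean(A); a minimal Matlab sketch with made-up matched points:

```matlab
% Closed-form least squares translation from matched points: setting the
% derivative of E to zero gives t = mean(B) - mean(A).
A = [0 0; 1 0; 0 1];                   % matched points in {A}, one per row
B = A + [3 2];                         % same points translated by (3, 2)
t = mean(B, 1) - mean(A, 1);           % recovers [t_x, t_y] = [3, 2]
```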

Example: solving for translation. Problem: outliers (some points, e.g. A_4, A_5 and B_4, B_5, are bad matches). RANSAC solution for (t_x, t_y):
1. Sample a set of matching points (one pair suffices for a translation).
2. Solve for the transformation parameters.
3. Score the parameters by the number of inliers.
4. Repeat steps 1-3 N times.

Example: solving for translation. Problem: outliers, multiple objects, and/or many-to-one matches. Hough transform solution for (t_x, t_y):
1. Initialize a grid of parameter values.
2. Each matched pair casts a vote for consistent values.
3. Find the parameters with the most votes.
4. Solve using least squares with the inliers.

Example: solving for translation. Problem: no initial guesses for correspondence. ICP solution for (t_x, t_y):
1. Find nearest neighbors for each point.
2. Compute the transform using the matches.
3. Move the points using the transform.
4. Repeat steps 1-3 until convergence.

Next class: Object Recognition. Keypoint-based object instance recognition and search.