Parallel implementation of RAndom SAmple Consensus (RANSAC)
Adarsh Kowdle

Algorithm description
RANSAC is an iterative method for estimating the parameters of a mathematical model from a set of observed data that contains outliers. A simple form of RANSAC is considered for this project: fitting a line to a set of 2D points.

Algorithm description
Step 1: Randomly sample the data to obtain two points.
Step 2: Determine the parameters of the line joining these two points.
Step 3: Evaluate the distance of every other point from this line; this serves as the error function to be minimized.
Step 4: Repeat from Step 1 until the required number of iterations has been completed.
Step 5: The resulting line parameters represent the best-fit line for the given observations of 2D points.
A serial sketch of this loop is given below.
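The slides do not include source code, so the following is only a minimal serial sketch of the loop described above, written in C. The names (Point, Line, fit_line_ransac) and the sum-of-distances error function are illustrative assumptions, not taken from the original implementation.

#include <math.h>
#include <stdlib.h>

typedef struct { double x, y; } Point;

/* Line in implicit form a*x + b*y + c = 0, normalised so a^2 + b^2 = 1. */
typedef struct { double a, b, c; } Line;

/* Step 2: parameters of the line joining two sampled points. */
Line line_from_points(Point p, Point q)
{
    Line l;
    l.a = q.y - p.y;
    l.b = p.x - q.x;
    double norm = sqrt(l.a * l.a + l.b * l.b);
    l.a /= norm;
    l.b /= norm;
    l.c = -(l.a * p.x + l.b * p.y);
    return l;
}

/* Steps 1-5: repeatedly sample two points, fit the line through them,
 * and keep the candidate whose summed point-to-line distance (the
 * error function) is smallest. */
Line fit_line_ransac(const Point *pts, int n, int iterations)
{
    Line best = { 0.0, 0.0, 0.0 };
    double best_err = INFINITY;

    for (int it = 0; it < iterations; ++it) {
        int i = rand() % n;                      /* Step 1: random sample */
        int j = rand() % n;
        if (i == j)
            continue;                            /* need two distinct points */

        Line cand = line_from_points(pts[i], pts[j]);

        double err = 0.0;                        /* Step 3: total distance */
        for (int k = 0; k < n; ++k)
            err += fabs(cand.a * pts[k].x + cand.b * pts[k].y + cand.c);

        if (err < best_err) {                    /* keep the best candidate */
            best_err = err;
            best = cand;
        }
    }
    return best;                                 /* Step 5: best-fit line */
}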

Parallel design proposed: OpenMP
[Diagram: a master thread dispatching work to Thread 1, Thread 2, Thread 3, ..., Thread N]
Suppose there are N data points. The master thread randomly samples data points and passes them to the worker threads. The data points reside in shared memory, and threads are instantiated to compute the error in parallel. Different variants of the code were tried out; one possible variant is sketched below.
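As an illustration of this design rather than the author's actual code, one plausible OpenMP variant keeps the candidate sampling on the master thread and parallelises the per-point error evaluation (Step 3) with a parallel-for reduction. It reuses the hypothetical Point/Line types and line_from_points helper from the serial sketch above.

#include <math.h>
#include <stdlib.h>
#include <omp.h>

/* The data points stay in shared memory; each thread accumulates a
 * private partial error sum, and the sums are combined by the
 * reduction clause. */
Line fit_line_ransac_omp(const Point *pts, int n, int iterations)
{
    Line best = { 0.0, 0.0, 0.0 };
    double best_err = INFINITY;

    for (int it = 0; it < iterations; ++it) {
        int i = rand() % n;                      /* master samples two points */
        int j = rand() % n;
        if (i == j)
            continue;

        Line cand = line_from_points(pts[i], pts[j]);

        double err = 0.0;
        /* pts is shared; the per-point distances are computed in parallel */
        #pragma omp parallel for reduction(+:err) schedule(static)
        for (int k = 0; k < n; ++k)
            err += fabs(cand.a * pts[k].x + cand.b * pts[k].y + cand.c);

        if (err < best_err) {
            best_err = err;
            best = cand;
        }
    }
    return best;
}

Another natural variant, possibly among those tried, would distribute whole RANSAC iterations across the threads, each keeping a private best candidate that is merged at the end; the slides do not say which variant performed best.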

Results and conclusions
Data points were extracted offline using Matlab. The data points were created by adding noise to a known line, so the best-fit line is known in advance. The parallel version of RANSAC was implemented using OpenMP directives and tested with these data points.
Performance:
– Serial code took 0.24 seconds for 5000 iterations.
– The OpenMP parallel implementation takes 0.05 seconds for 5000 iterations.
A hypothetical timing harness is sketched below.
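The slides do not show how the timings were collected; a hypothetical harness along the following lines could generate the noisy test data and time both versions with omp_get_wtime(). The point count, iteration count and the known line y = 2x + 1 are illustrative values only, and the code reuses the sketches above.

#include <omp.h>
#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    enum { N = 10000, ITER = 5000 };             /* illustrative sizes */
    static Point pts[N];

    /* Synthesise noisy samples of the known line y = 2x + 1. */
    for (int k = 0; k < N; ++k) {
        double x = (double)k / N;
        double noise = ((double)rand() / RAND_MAX - 0.5) * 0.1;
        pts[k].x = x;
        pts[k].y = 2.0 * x + 1.0 + noise;
    }

    double t0 = omp_get_wtime();
    Line s = fit_line_ransac(pts, N, ITER);      /* serial version */
    double t1 = omp_get_wtime();
    Line p = fit_line_ransac_omp(pts, N, ITER);  /* OpenMP version */
    double t2 = omp_get_wtime();

    printf("serial:   %.3f s  (%.3f x + %.3f y + %.3f = 0)\n",
           t1 - t0, s.a, s.b, s.c);
    printf("parallel: %.3f s  (%.3f x + %.3f y + %.3f = 0)\n",
           t2 - t1, p.a, p.b, p.c);
    return 0;
}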

Results and conclusions
A parallel version of RANSAC has been implemented. With the use of OpenMP directives, an approximately five-fold reduction in execution time has been achieved, which is a significant speedup.