Computational Support for RRTs David Johnson. Basic Extend.


Computational Support for RRTs David Johnson

Basic Extend

Example in holonomic empty space

Computational Bottlenecks
Collision detection
–Not much different here than for PRMs
–Any differences?
Finding the nearest neighbor to a vertex
–Linear search takes O(n) time
–May have >10K points
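The O(n) baseline mentioned above is just a scan over every stored vertex. A minimal sketch (the point set reused from the construction slide below is illustrative, not from an actual RRT):

```python
import math

def nearest_linear(points, q):
    """Brute-force nearest neighbor: compare q against every point, O(n) per query."""
    best, best_d = None, math.inf
    for p in points:
        d = math.dist(p, q)
        if d < best_d:
            best, best_d = p, d
    return best

pts = [(2, 3), (5, 4), (9, 6), (4, 7), (8, 1), (7, 2)]
nearest_linear(pts, (9, 2))  # → (8, 1)
```

With tens of thousands of tree vertices and a nearest-neighbor query per EXTEND step, this scan dominates the planner's running time, which motivates the kd-tree below.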

Kd-trees
The kd-tree is a powerful data structure based on recursively subdividing a set of points with alternating axis-aligned hyperplanes. The classical kd-tree answers queries in time logarithmic in n, but exponential in d.

Construction
Given the points
–[(2,3), (5,4), (9,6), (4,7), (8,1), (7,2)]
Split at the median along one axis
–For big point sets, might use the median of a few random samples
Switch the splitting axis at each level
Based on the Wikipedia article on kd-trees
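The construction recipe above (median split, axis cycling per level) can be sketched directly; the `Node` type is my own scaffolding, not from the slides:

```python
from collections import namedtuple

Node = namedtuple("Node", ["point", "axis", "left", "right"])

def build_kdtree(points, depth=0):
    """Recursive kd-tree construction: the median point along the current
    axis becomes the node, and the axis alternates at each level (2-D here)."""
    if not points:
        return None
    axis = depth % 2                      # cycle x, y, x, y, ...
    points = sorted(points, key=lambda p: p[axis])
    mid = len(points) // 2                # median point is the split
    return Node(points[mid], axis,
                build_kdtree(points[:mid], depth + 1),
                build_kdtree(points[mid + 1:], depth + 1))

tree = build_kdtree([(2, 3), (5, 4), (9, 6), (4, 7), (8, 1), (7, 2)])
tree.point  # → (7, 2), the median by x-coordinate
```

Sorting at every level makes this O(n log² n); presorting once per axis, or the sampled-median trick from the slide, brings construction closer to O(n log n).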

Kd-trees: Construction
[Figure: a point set partitioned by splitting lines l1–l10, with the corresponding kd-tree alongside]
Split heuristics
–Median point
–Median value
–Clustering

Kd-trees: Query for Point Existence
[Figure: the same partitioned point set and kd-tree, traversed for a query point q]

Another Example

Find Nearest Neighbor

Check Neighbor Cells

Can Be Efficient

Might Not Be Efficient
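The search sketched in the slides above — descend to the query's cell, then back out, checking a neighbor cell only when the splitting plane is closer than the best point found so far — can be written as follows (a sketch; `Node` and the construction mirror the earlier example):

```python
import math
from collections import namedtuple

Node = namedtuple("Node", ["point", "axis", "left", "right"])

def build_kdtree(points, depth=0):
    """Median-split construction, alternating axes (2-D)."""
    if not points:
        return None
    axis = depth % 2
    points = sorted(points, key=lambda p: p[axis])
    mid = len(points) // 2
    return Node(points[mid], axis,
                build_kdtree(points[:mid], depth + 1),
                build_kdtree(points[mid + 1:], depth + 1))

def nn_search(node, q, best=None):
    """Descend toward the leaf cell containing q, then unwind, visiting
    the far child only when the best ball crosses the splitting plane
    (the neighbor-cell check)."""
    if node is None:
        return best
    if best is None or math.dist(node.point, q) < math.dist(best, q):
        best = node.point
    diff = q[node.axis] - node.point[node.axis]
    near, far = (node.left, node.right) if diff < 0 else (node.right, node.left)
    best = nn_search(near, q, best)
    if abs(diff) < math.dist(best, q):    # plane closer than best: check far side
        best = nn_search(far, q, best)
    return best

tree = build_kdtree([(2, 3), (5, 4), (9, 6), (4, 7), (8, 1), (7, 2)])
nn_search(tree, (9, 2))  # → (8, 1)
```

Whether this is efficient depends entirely on how often the pruning test fails: if the best ball keeps crossing splitting planes, the search degenerates toward visiting every cell, which is exactly the high-dimensional failure mode of the next slide.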

k-d Trees in High Dimensions
Rule of thumb
–Need the number of points n >> 2^d for k-d trees to give much efficiency over linear search
This suggests that approximate answers may be worthwhile

ANN – Approximate Nearest Neighbor
Approximate nearest neighbor (ANN) problem:
–Find a point p ∈ P that is an ε-approximate nearest neighbor of the query q, in that for all p′ ∈ P, d(p, q) ≤ (1 + ε) d(p′, q)
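The ε-criterion above is easy to state as a predicate. A small sketch (the helper name and test points are illustrative, not part of the ANN library the slide title alludes to):

```python
import math

def is_eps_approx_nn(p, q, points, eps):
    """True iff d(p, q) <= (1 + eps) * d(p', q) for every p' in points,
    i.e. p is within a factor (1 + eps) of the true nearest neighbor."""
    d_true = min(math.dist(pp, q) for pp in points)
    return math.dist(p, q) <= (1 + eps) * d_true

pts = [(2, 3), (5, 4), (9, 6), (4, 7), (8, 1), (7, 2)]
is_eps_approx_nn((7, 2), (9, 2), pts, eps=0.5)  # → True: within 1.5x of (8, 1)
```

In a kd-tree search, the same idea is applied by shrinking the pruning radius by a factor of 1/(1 + ε), which lets the search skip many neighbor cells at the cost of a bounded error in the answer.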

Visualization of ANN