Topological Signatures For Fast Mobility Analysis


Topological Signatures For Fast Mobility Analysis Abhirup Ghosh, Benedek Rozemberczki, *Subramanian Ramamoorthy, Rik Sarkar University of Edinburgh, *FiveAI Ltd., Edinburgh

Mobility analysis
We want to analyze mobility for:
- Clustering similar trajectories
- Predicting motion at large scale
- Finding nearest neighbor trajectories

Mobility analysis is challenging
Trajectories are:
- Complex
- Sequential
- Of different lengths
Trajectory distances (Fréchet) are expensive to compute, and standard learning and mining tools for point clouds do not apply.

Overview: Topological signatures
- Fixed-dimensional Euclidean vectors
- Efficiently compare trajectories using Euclidean distance
- Can apply standard learning and mining tools: nearest neighbor search, clustering, motion prediction at large scale (e.g., 500 m)

Related work
Markov models and neural networks [IJCAI '17] are popular for modelling mobility. They can predict motion at small scale, but:
- They are not accurate for prediction at large scale
- The models are compute intensive
- The memoryless assumption of a Markov model does not fit real trajectories
- Neural networks are not general enough for other analytical tasks, such as clustering trajectories

Perspective of obstacles
The complexity of trajectories arises from navigating around obstacles. Homotopy types classify trajectories by their navigation patterns.
(Figure: trajectories of the same homotopy type around an obstacle, and a trajectory of a different homotopy type from the red one.)

Limitations of topological analysis
- Homotopy types cannot compare trajectories with different source-destination pairs
- Homotopy types are categorical – difficult to use in further analysis
(Figure: trajectories that cannot be compared using homotopy.)

Algorithm: Constructing topological signatures

Angles encode how a trajectory navigates obstacles
Key observations:
- A trajectory creates an angle at each obstacle
- Angles are equal when trajectories navigate an obstacle similarly
- Angles differ when trajectories navigate an obstacle differently
(Figure: the angle 𝛼 at obstacles 𝑂1 and 𝑂2, mapped to the points (−𝜋, 𝛼) and (𝜋, 𝛼) in the signature space.)

Method using differential forms
We formalise the idea using differential forms on a graph. Steps to build topological signatures:
1. Discretize the domain – create a planar graph
2. Construct differential forms on the edges
3. Build topological signatures by integrating the differential forms
This is more general than angles and works without localization.

Discrete domain – Planar graph
- Road networks naturally discretize the domain
- Triangulating random points creates a planar graph
- Trajectories are sequences of edges
- Obstacles – regions with no or little mobility
(Figures: discretization by a road network; triangulation of random points.)

Differential 1-form
Differential 1-forms are weights on the edge set 𝐸 of the planar graph, 𝑓: 𝐸 → ℝ. Weights are associated with directed edges so that 𝑓(𝑎𝑏) = −𝑓(𝑏𝑎), e.g. 𝑓(𝑎𝑏) = 1 and 𝑓(𝑏𝑎) = −1.
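A discrete 1-form like this can be stored as a map on undirected edges, with the antisymmetry 𝑓(𝑎𝑏) = −𝑓(𝑏𝑎) enforced at lookup time. The following is a minimal sketch; the class and method names are illustrative, not from the paper.

```python
# A minimal sketch of a discrete differential 1-form on a graph:
# store one weight per undirected edge and read it with a sign that
# depends on the direction of traversal.

class OneForm:
    def __init__(self):
        self._w = {}  # keyed by the sorted vertex pair

    def set(self, a, b, value):
        # Store f(ab) = value, which implies f(ba) = -value.
        if a < b:
            self._w[(a, b)] = value
        else:
            self._w[(b, a)] = -value

    def __call__(self, a, b):
        # Evaluate the form on the directed edge (a, b); 0 if unset.
        if a < b:
            return self._w.get((a, b), 0)
        return -self._w.get((b, a), 0)

f = OneForm()
f.set("a", "b", 1)
print(f("a", "b"), f("b", "a"))  # antisymmetric: 1 -1
```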

Differential forms on a planar graph
- Take multiple straight walks from an obstacle to the boundary, in random directions
- Assign differential forms (directed weights) on the edges crossed by the walks
- The weight at an edge is the number of walks crossing it
(Figure: directed edge weights 1 and 2 assigned along the crossing walks.)
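The crossing test behind this construction can be sketched with a standard segment-intersection check. Everything below (the three-vertex graph, the single vertical walk, the sign convention) is a toy assumption for illustration, not the paper's exact procedure.

```python
# Toy sketch: assign a 1-form by shooting a straight walk from an
# obstacle to the domain boundary and weighting each edge it crosses,
# signed by which side of the walk the edge starts on.

def cross(o, a, b):
    # 2D cross product of vectors o->a and o->b.
    return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

def segments_intersect(p1, p2, q1, q2):
    # Proper (interior) intersection test for segments p1p2 and q1q2.
    d1, d2 = cross(q1, q2, p1), cross(q1, q2, p2)
    d3, d4 = cross(p1, p2, q1), cross(p1, p2, q2)
    return (d1 * d2 < 0) and (d3 * d4 < 0)

# Vertex positions of a tiny planar graph, and one walk: a vertical ray
# from the obstacle at (0.5, 0.5) up to the boundary.
pos = {"a": (0.0, 1.0), "b": (1.0, 1.0), "c": (0.5, 0.0)}
edges = [("a", "b"), ("a", "c"), ("b", "c")]
walk = ((0.5, 0.5), (0.5, 2.0))

weights = {}
for (u, v) in edges:
    if segments_intersect(pos[u], pos[v], *walk):
        sign = 1 if cross(walk[0], walk[1], pos[u]) > 0 else -1
        weights[(u, v)] = weights.get((u, v), 0) + sign

print(weights)  # only edge (a, b) crosses the walk
```

With several walks per obstacle, the same loop accumulates counts per edge, matching the "weight = number of crossing walks" rule on the slide.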

Integrate differential forms along a trajectory
Integration over a path = summing the differential forms along its edges. The integration values (e.g. −2 vs. 5 in the figure) can separate the trajectories.
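The integration step above is just a signed sum over the trajectory's edge sequence. In this sketch the edge weights and the two example trajectories are made up for illustration.

```python
# Integrating a discrete 1-form along a trajectory: sum the signed edge
# weights in the order the trajectory traverses them.

def integrate(form, path):
    """Sum form values over consecutive directed edges of `path`."""
    return sum(form(a, b) for a, b in zip(path, path[1:]))

# Signed weights on a few directed edges (antisymmetry handled inline).
raw = {("a", "b"): 1, ("b", "c"): 1, ("c", "d"): 1}

def form(a, b):
    if (a, b) in raw:
        return raw[(a, b)]
    if (b, a) in raw:
        return -raw[(b, a)]
    return 0  # edges no walk crossed contribute nothing

fwd = integrate(form, ["a", "b", "c", "d"])   # traverses with the weights
rev = integrate(form, ["d", "c", "b", "a"])   # same edges, reversed
print(fwd, rev)  # opposite traversals give opposite integrals: 3 -3
```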

Differential forms for all obstacles
Construct differential forms for all obstacles and maintain them separately at each edge.
(Figure: the differential forms of obstacles 𝑂1 and 𝑂2 stored separately for the edge between 𝑎 and 𝑏.)

Topological signature
The topological signature of a trajectory is the integration of the differential forms along it, with the integrations for different obstacles maintained in separate dimensions.
(Figure: example signature vectors over obstacles 𝑂1–𝑂4.)
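Putting the pieces together: one form per obstacle, one integral per dimension. The per-obstacle edge weights below are hypothetical toy data, not values from the paper.

```python
# Sketch: the topological signature of a trajectory is the vector of
# integrals of one 1-form per obstacle. One dict of directed-edge
# weights per obstacle, invented for illustration.

forms = {
    "O1": {("a", "b"): 2, ("b", "c"): 1},
    "O2": {("b", "c"): -1, ("c", "d"): 3},
}

def value(raw, a, b):
    # Antisymmetric lookup: f(ab) = -f(ba), 0 on untouched edges.
    return raw.get((a, b), -raw.get((b, a), 0))

def signature(trajectory, forms):
    """Fixed-dimensional vector: one integral per obstacle."""
    return [
        sum(value(raw, a, b) for a, b in zip(trajectory, trajectory[1:]))
        for raw in forms.values()
    ]

sig = signature(["a", "b", "c", "d"], forms)
print(sig)  # one coordinate per obstacle: [3, 2]
```

Because the result is a plain Euclidean vector of fixed dimension, standard distance-based tools apply directly, which is the point of the construction.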

Signatures preserve topological properties
Theorem 4.1 – Trajectories of the same homotopy type have the same signature, and trajectories of different homotopy types have different signatures.
Theorem 4.3 – We can efficiently reconstruct a trajectory, up to topological equivalence, from its signature.
The theorems hold for non-self-intersecting trajectories. (Recall that trajectories of the same homotopy type share the same start and end points.)

More on properties of signatures
- Signatures are compact representations
- Analysis algorithms run efficiently on signatures
- Signatures can be computed efficiently, both online and distributed
- The framework is general – there are flexible ways to create differential forms

Applications & Experiments

Experimental setup
- Public datasets of GPS trajectories: Rome Taxi [CRAWDAD], Porto Taxi [Kaggle]
- 3900 trajectories over a 2.5 km × 2.1 km area
- A random point every 250 m² – about 20,000 points; we construct the planar graph by Delaunay triangulation of these points
- Obstacles – regions with <5 trajectories passing by

Direction prediction at large scale
Given a history path, predict the direction of motion at scale 𝑟 (e.g., 500 m). Methods compared:

Prediction method | Feature
LSTM (neural network for sequence modelling) | Location history
Standard regressors | Topological signature

Accuracy – LSTM vs. signature based
(Plot: % of test cases vs. prediction error in degrees.)
Even simple kNN prediction outperforms the LSTM – signatures encode powerful features.
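A "simple kNN prediction" over signatures can be sketched in a few lines: treat signatures as Euclidean vectors and average the directions of the nearest training trajectories. The training pairs below are invented purely for illustration (and the naive angle average ignores wraparound at 0°/360°).

```python
# Minimal kNN sketch over signature vectors: predict a direction (in
# degrees) for a query signature from its k nearest training signatures.
import math

train = [
    ([3.0, 2.0], 90.0),
    ([3.1, 1.9], 85.0),
    ([-4.0, 0.5], 270.0),
]

def knn_predict(query, train, k=2):
    # Signatures are plain Euclidean vectors, so Euclidean distance works.
    by_dist = sorted(train, key=lambda t: math.dist(query, t[0]))
    # Naive mean of the k nearest angles (fine away from the 0/360 seam).
    return sum(angle for _, angle in by_dist[:k]) / k

pred = knn_predict([3.05, 2.0], train)
print(pred)  # averages the two nearest neighbors: 87.5
```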

Efficiency – LSTM vs. signature based
(Plots: training time (min) and query time (sec) vs. number of trajectories in the dataset.)
Signatures enable efficient prediction.

Trajectory clustering
Trajectory clustering reduces to standard clustering on signatures.

Trajectory clustering
DBSCAN on signatures can separate complex overlapping patterns – signatures contain the important features.

Searching the Fréchet nearest neighbor
Prune candidates using Locality Sensitive Hashing (LSH) on signatures:
- Hash function – project signatures onto a random line and segment the line into buckets
- Trajectories falling in the same bucket as the query are similar to the query
(Figure: signatures of the query trajectory and its nearest neighbours projected onto a random line; a bucket contains similar trajectories.)
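The random-line hashing step can be sketched as a dot product with a random direction followed by bucketing. The signatures, bucket width, and single hash table below are illustrative assumptions; practical LSH uses several tables to boost recall.

```python
# Sketch of the LSH step: project signatures onto a random line and cut
# the line into fixed-width buckets; trajectories hashing into the
# query's bucket become candidates for the exact Fréchet check.
import random

random.seed(0)
dim, width = 4, 2.0
line = [random.gauss(0, 1) for _ in range(dim)]  # random projection direction

def bucket(sig):
    proj = sum(s * l for s, l in zip(sig, line))
    return int(proj // width)

sigs = {
    "t1": [5, -6, -7, 4],
    "t2": [5, -6, -7, 5],    # nearly identical to t1
    "t3": [-9, 4, -5, 0],    # very different
}
query = [5, -6, -7, 4]

candidates = [name for name, s in sigs.items() if bucket(s) == bucket(query)]
print(candidates)
```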

Nearest neighbor search – Accuracy
(Plot: % success vs. number of candidate trajectories tested with the Fréchet distance.)
A small number of candidates suffices to find the Fréchet nearest neighbor.

Nearest neighbor search – Efficiency
(Plot: compute time (min) of pairwise Fréchet vs. LSH, as the number of trajectories in the dataset grows.)
LSH on signatures finds nearest neighbors faster than pairwise Fréchet computation.

Dimensionality of signatures
Signatures can be high dimensional when there are many obstacles. Selecting 5 dimensions out of 67, low-dimensional signatures still preserve nearest neighbor accuracy.
(Plot: % success vs. number of candidate trajectories tested with the Fréchet distance.)

Summary
- Signatures preserve topological properties
- They enable motion prediction at large scale
- They enable standard analytic tools: clustering, nearest neighbor search, density estimation, dimension reduction
- The framework to produce signatures is fast and robust – robust to localization noise and localization frequency
- It can be extended to multi-resolution signatures