
1 Learning the topology of a data set. Michaël Aupetit, Research Engineer (CEA); Pierre Gaillard, Ph.D. Student; Gérard Govaert, Professor (University of Technology of Compiègne). PASCAL BOOTCAMP, Vilanova, July 2007

2 Introduction (12/07/2007, BootCamp PASCAL). Given a set of M data points in R^D, estimating their density allows solving various problems: classification, clustering, regression.

3 A question without an answer… Generative models cannot answer this question: what is the "shape" of this data set?

4 A subjective answer. The expected answer is: 1 point and 1 curve, not connected to each other. The problem: what is the topology of the principal manifolds?

5 Why learn topology: (semi-)supervised applications.
- Estimate the complexity of the classification task [Lallich02, Aupetit05Neurocomputing]
- Add a topological prior to design a classifier [Belkin05Nips]
- Add topological features to statistical features
- Classify through the connected components or the intrinsic dimension [Belkin]

6 Why learn topology: unsupervised applications [Zeller, Schulten, IEEE ISIC 1996].
- Clusters defined by the connected components
- Data exploration (e.g. shortest paths)
- Robotics (optimal paths, inverse kinematics)

7 Generative manifold learning: GTM [Bishop], Revisited Principal Curves [Hastie, Stuetzle], MPPCA [Bishop], Gaussian mixtures. Problem: fixed or incomplete topology.

8 Computational Topology. All previous work on topology learning is grounded in the result of Edelsbrunner and Shah (1997), which proved that, given a manifold M and a set of N prototypes near M, there exists a subgraph* of the Delaunay graph of the prototypes which has the same topology as M. (*More exactly, a subcomplex of the Delaunay complex.)

13 Computational Topology. Extracting this topology costs O(DN^3).
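As an illustration of the object this result relies on, the Delaunay graph of a set of prototypes can be computed with off-the-shelf tools. A minimal sketch in Python (not the authors' code), assuming the prototypes are in general position:

```python
import numpy as np
from scipy.spatial import Delaunay

def delaunay_edges(prototypes):
    """Return the edge set of the Delaunay graph of the prototypes,
    as sorted (i, j) index pairs extracted from the Delaunay simplices."""
    tri = Delaunay(prototypes)
    edges = set()
    for simplex in tri.simplices:
        for i in range(len(simplex)):
            for j in range(i + 1, len(simplex)):
                a, b = int(simplex[i]), int(simplex[j])
                edges.add((min(a, b), max(a, b)))
    return sorted(edges)

# Four corners of the unit square: two triangles sharing one diagonal,
# hence 5 distinct edges (4 sides + 1 diagonal).
pts = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
print(len(delaunay_edges(pts)))  # 5
```

The cubic cost quoted on the slide refers to extracting the topology-preserving subcomplex, not to this graph construction itself.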

14 Application with a known manifold: topology of molecules [Edelsbrunner 1994].

15 Approximation: the manifold is known only through a data set.

16 Topology Representing Network [Martinetz, Schulten 1994]. Connect the 1st and 2nd nearest prototypes of each data point.
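The TRN construction rule can be sketched in a few lines of Python (a hedged illustration of the competitive Hebbian rule, not the original implementation):

```python
import numpy as np

def trn_edges(data, prototypes):
    """Topology Representing Network: for each data point, add an edge
    between its first and second nearest prototypes."""
    edges = set()
    for x in data:
        d = np.linalg.norm(prototypes - x, axis=1)
        first, second = np.argsort(d)[:2]
        a, b = int(first), int(second)
        edges.add((min(a, b), max(a, b)))
    return sorted(edges)

# Three collinear prototypes; each data point lies in one gap,
# so only adjacent prototypes get connected.
protos = np.array([[0.0], [1.0], [2.0]])
data = np.array([[0.4], [1.6]])
print(trn_edges(data, protos))  # [(0, 1), (1, 2)]
```

Each data point touches exactly one edge, which is what makes the construction O(DNM) for N prototypes and M data points.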


23 Topology Representing Network. Good points: (1) O(DNM) time complexity; (2) if there are enough prototypes and they are well located, the resulting graph is "good" in practice. But it has some drawbacks from the machine learning point of view.

24 Topology Representing Network, some drawbacks: noise sensitivity.

25 Topology Representing Network, some drawbacks: not self-consistent [Hastie].

26 Topology Representing Network, some drawbacks: no quality measure. How to measure the quality of the TRN when D > 3? How to compare two models? For all these reasons, we propose a generative model.

27-30 General assumptions on data generation. Unknown principal manifolds… from which data are drawn with an unknown pdf… corrupted by some unknown noise, leading to the observations. The goal is to learn, from the observed data, the principal manifolds so that their topological features can be extracted.

31-33 Three assumptions… one generative model. The manifold is close to the Delaunay graph of some prototypes; we associate to each component a weighted uniform distribution; and we convolve the components with an isotropic Gaussian noise.

34 A Gaussian-point and a Gaussian-segment. How to define a generative model based on points and segments? The Gaussian-segment density can be expressed in terms of "erf".
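The erf expression follows by splitting the noise into components parallel and perpendicular to the segment: the perpendicular part stays Gaussian, and the parallel integral over the segment yields a difference of erf terms. A sketch of the resulting density, assuming isotropic noise N(0, sigma^2 I) and a uniform distribution on the segment [a, b] (the notation is ours, not the slides'):

```python
import numpy as np
from math import erf, exp, pi, sqrt

def gaussian_segment_pdf(x, a, b, sigma):
    """Density at x of a uniform distribution on the segment [a, b]
    convolved with isotropic Gaussian noise N(0, sigma^2 I) in D dims."""
    x, a, b = np.asarray(x), np.asarray(a), np.asarray(b)
    D = x.size
    L = np.linalg.norm(b - a)
    u = (b - a) / L                              # unit vector along the segment
    s = float(np.dot(x - a, u))                  # coordinate along the segment
    r2 = float(np.dot(x - a, x - a)) - s * s     # squared perpendicular distance
    norm = (2 * pi * sigma ** 2) ** (-D / 2)
    # (1/L) * integral_0^L exp(-(s-t)^2 / (2 sigma^2)) dt, in closed form:
    along = (sigma * sqrt(pi / 2) / L) * (
        erf((L - s) / (sigma * sqrt(2))) + erf(s / (sigma * sqrt(2))))
    return norm * exp(-r2 / (2 * sigma ** 2)) * along
```

A quick check against numerical integration of the convolution confirms the closed form; the Gaussian-point component is simply an ordinary isotropic Gaussian.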

35 Hello!

36 Proposed approach, 3 steps. 1. Initialization: locate the prototypes with a "classical" isotropic Gaussian mixture, then build the Delaunay graph and initialize the generative model (equiprobable components).

37 Number of prototypes: minimize BIC ~ -log-likelihood + complexity of the model.
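The BIC trade-off on this slide can be made concrete. In the sketch below the log-likelihood values are hypothetical (for illustration only), and the parameter count assumes an isotropic Gaussian mixture in D = 2 with a shared variance:

```python
import math

def bic(log_likelihood, n_params, n_data):
    """Bayesian Information Criterion: penalized fit, lower is better."""
    return -2.0 * log_likelihood + n_params * math.log(n_data)

def n_params(K, D=2):
    # K means (D each) + (K - 1) free weights + 1 shared variance
    return K * D + (K - 1) + 1

N = 100
# Hypothetical log-likelihoods of fits with K = 2..5 prototypes:
fits = {2: -260.0, 3: -240.0, 4: -238.0, 5: -237.5}
scores = {K: bic(ll, n_params(K), N) for K, ll in fits.items()}
best = min(scores, key=scores.get)
print(best)  # 3
```

Beyond K = 3 the likelihood barely improves, so the log(N) penalty on extra parameters dominates and BIC picks the smaller model.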

38 Proposed approach, 3 steps. 2. Learning: update the variance of the Gaussian noise, the weights of the components, and the locations of the prototypes with the EM algorithm, in order to maximize the likelihood of the model w.r.t. the N observed data.
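For the point components these EM updates reduce to those of a classical isotropic Gaussian mixture with a single shared noise variance. A minimal sketch of that reduced case (the full model of the talk adds Gaussian-segment components, whose responsibilities use the erf-based density instead):

```python
import numpy as np

def em_isotropic_gmm(X, means, n_iter=50):
    """EM for an isotropic Gaussian mixture with one shared variance:
    updates component weights, prototype locations, and the noise variance."""
    N, D = X.shape
    K = len(means)
    weights = np.full(K, 1.0 / K)       # equiprobable initialization
    var = np.var(X)
    for _ in range(n_iter):
        # E-step: responsibilities r[n, k] (log-domain for stability)
        d2 = ((X[:, None, :] - means[None, :, :]) ** 2).sum(-1)
        logp = np.log(weights) - d2 / (2 * var) \
               - 0.5 * D * np.log(2 * np.pi * var)
        logp -= logp.max(axis=1, keepdims=True)
        r = np.exp(logp)
        r /= r.sum(axis=1, keepdims=True)
        # M-step: weights, prototype locations, shared noise variance
        Nk = r.sum(axis=0)
        weights = Nk / N
        means = (r.T @ X) / Nk[:, None]
        d2 = ((X[:, None, :] - means[None, :, :]) ** 2).sum(-1)
        var = (r * d2).sum() / (N * D)
    return weights, means, var
```

On two well-separated 1-D clusters this recovers the cluster centers and near-equal weights; the slow convergence and local optima mentioned later in the talk apply to this loop as well.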

39-40 EM updates.

41 Proposed approach, 3 steps. 3. After learning: some components have a (quasi-)null probability (weight); they do not explain the data and can be pruned from the initial graph.
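The pruning step amounts to discarding the graph components whose learned mixture weight is (quasi-)null. A sketch for edge components (vertices are handled the same way); the weights and threshold below are hypothetical:

```python
import numpy as np

def prune_components(weights, edges, threshold=1e-3):
    """Keep only the components whose learned mixture weight exceeds
    the threshold: near-zero-weight components explain no data."""
    return [e for e, w in zip(edges, weights) if w > threshold]

edges = [(0, 1), (1, 2), (0, 2)]
weights = np.array([0.55, 0.449, 1e-6])   # hypothetical learned weights
print(prune_components(weights, edges))    # [(0, 1), (1, 2)]
```

The surviving edges form the pruned Delaunay subgraph whose topology (connected components, holes) is then read off.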

42 Threshold setting, analogous to the "Cattell scree test".

43 Toy experiment: thresholding on the number of witnesses.

44-45 Toy experiment (results).

46 Other applications: "+" and "O" shaped data sets.

47 Comments. There is "no free lunch":
- Time complexity O(DN^3) (initial Delaunay graph)
- Slow convergence (EM)
- Local optima

48 Key points.
- Statistical learning of the topology of a data set. Assumption: the initial Delaunay graph is rich enough to contain a subgraph having the same topology as the principal manifolds. Based on a statistical criterion (the likelihood), available in any dimension.
- A "generalized" Gaussian mixture: it can be seen as a generalization of the Gaussian mixture (no edges), and as a finite mixture (over the Gaussian-segments) of infinite mixtures (each Gaussian-segment).
- This preliminary work is an attempt to bridge the gap between statistical learning theory and computational topology.

49 Open questions. Is the assumption valid? Does a "good" penalized likelihood yield a "good" topology? Is there a theorem of "universal approximation" of manifolds?

50 Related works. Publications: NIPS 2005 (unsupervised) and ESANN 2007 (supervised: analysis of the Iris and oil-flow data sets). A workshop submission at NIPS on this topic, in collaboration with F. Chazal (INRIA Futurs), D. Cohen-Steiner (INRIA Sophia), S. Canu and G. Gasso (INSA Rouen).

51 Thanks

52 Equations

53 Topology. Intrinsic dimension: d_i = 0 (points), d_i = 1 (curves), d_i = 2 (surfaces)… Number of holes (Betti numbers), connectedness… Homeomorphism: topological equivalence.
