1
Graphs, Vectors, and Matrices. Daniel A. Spielman, Yale University. AMS Josiah Willard Gibbs Lecture, January 6, 2016.
2
From Applied to Pure Mathematics: algebraic and spectral graph theory; sparsification (approximating graphs by graphs with fewer edges); the Kadison-Singer problem.
3
A Social Network Graph
5
“vertex” or “node”; “edge” = a pair of vertices.
7
A Big Social Network Graph
8
A Graph: V = the vertices, E = the edges, which are pairs of vertices. (Figure: an example graph on 12 numbered vertices.)
9
The Graph of a Mesh
10
Examples of Graphs. (Figure: the 12-vertex example graph.)
12
How to understand large-scale structure: draw the graph; identify communities and hierarchical structure; use physical metaphors (edges as resistors or rubber bands); examine processes (diffusion of gas / random walks).
13
The Laplacian quadratic form of G: x^T L_G x = Σ_{(u,v) in E} (x(u) − x(v))².
14
The Laplacian quadratic form of G, evaluated on an example assignment of the values 0, 0.5, 1, 1, 1.5 to the vertices.
16
The Laplacian matrix of G: the symmetric matrix L_G with x^T L_G x = Σ_{(u,v) in E} (x(u) − x(v))²; equivalently, L_G = D_G − A_G, where D_G is the diagonal matrix of degrees and A_G is the adjacency matrix.
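The Laplacian and its quadratic form are easy to verify numerically. A minimal sketch (the 4-vertex example graph here is my own, not the one pictured in the slides):

```python
import numpy as np

def laplacian(n, edges, weights=None):
    """Laplacian L = D - A of a graph on vertices 0..n-1."""
    L = np.zeros((n, n))
    if weights is None:
        weights = [1.0] * len(edges)
    for (u, v), w in zip(edges, weights):
        L[u, u] += w
        L[v, v] += w
        L[u, v] -= w
        L[v, u] -= w
    return L

# Example: a triangle plus a pendant vertex.
edges = [(0, 1), (1, 2), (0, 2), (2, 3)]
L = laplacian(4, edges)

# The quadratic form x^T L x equals the sum of squared
# differences across the edges.
x = np.array([0.0, 0.5, 1.0, 1.5])
quad = x @ L @ x
direct = sum((x[u] - x[v]) ** 2 for u, v in edges)
print(quad, direct)
```

Each row of L sums to zero, which is why the constant vector is always in its kernel.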
17
Graphs as Resistor Networks. View edges as resistors connecting vertices. Apply voltages at some vertices (here 1V and 0V); measure the induced voltages and current flow.
18
Graphs as Resistor Networks. The induced voltages minimize the Laplacian quadratic form x^T L x, subject to the constraints fixing the applied voltages (1V and 0V).
21
Graphs as Resistor Networks. The induced voltages minimize x^T L x subject to the fixed boundary voltages; in the example, applying 1V and 0V induces voltages of 0.5V, 0.625V, and 0.375V at the free vertices. Effective conductance = the current flow when one volt is applied.
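The induced voltages can be computed by pinning the boundary values and minimizing the quadratic form, which reduces to one small linear solve. A sketch on a hypothetical 4-vertex network (not the network drawn in the slides):

```python
import numpy as np

# Hypothetical resistor network (unit resistors): a path
# 0-1-2-3 plus the edge 1-3. Pin vertex 0 at 1V and vertex 3
# at 0V, then solve for the induced interior voltages.
edges = [(0, 1), (1, 2), (2, 3), (1, 3)]
n = 4
L = np.zeros((n, n))
for u, v in edges:
    L[u, u] += 1; L[v, v] += 1
    L[u, v] -= 1; L[v, u] -= 1

boundary = [0, 3]
interior = [1, 2]
vb = np.array([1.0, 0.0])          # pinned boundary voltages

# Minimizing x^T L x with the pinned values gives the linear
# system L_II x_I = -L_IB x_B for the free vertices.
L_II = L[np.ix_(interior, interior)]
L_IB = L[np.ix_(interior, boundary)]
vi = np.linalg.solve(L_II, -L_IB @ vb)

volt = np.zeros(n)
volt[boundary] = vb
volt[interior] = vi
current = (L @ volt)[0]            # net current injected at the 1V terminal
print(vi, current)
```

Since one volt is applied, the injected current is the effective conductance between the two pinned vertices.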
22
Weighted Graphs. Each edge is assigned a non-negative real weight measuring the strength of the connection (= 1/resistance).
23
Spectral Graph Drawing (Hall ’70). Want to map the vertices to the real line with most edges short. (Edges are drawn as curves for visibility.)
24
Spectral Graph Drawing (Hall ’70). Want to map x : V → ℝ with most edges short. Minimize Σ_{(u,v) in E} (x(u) − x(v))²; to fix the scale, require Σ_u x(u)² = 1.
26
Courant-Fischer Theorem: the minimum of x^T L x over unit vectors x is λ_1, where λ_1 is the smallest eigenvalue of L and the minimizer is ψ_1, the corresponding eigenvector.
27
Courant-Fischer Theorem: for a Laplacian, λ_1 = 0 and ψ_1 is a constant vector, so the unconstrained minimizer gives an uninformative drawing.
28
Spectral Graph Drawing (Hall ’70). Want to map x : V → ℝ with most edges short. Minimize Σ_{(u,v) in E} (x(u) − x(v))² such that ‖x‖ = 1 and x ⊥ 1 (ruling out the constant vector).
29
Spectral Graph Drawing (Hall ’70). Minimize Σ_{(u,v) in E} (x(u) − x(v))² such that ‖x‖ = 1 and x ⊥ 1. Courant-Fischer Theorem: the solution is ψ_2, the eigenvector of λ_2, the second-smallest eigenvalue.
30
Spectral Graph Drawing (Hall ’70). (Figure: the quantity being minimized equals the area under the blue curves.)
32
Space the points evenly
33
And, move them to the circle
34
Finish by putting me back in the center
35
Spectral Graph Drawing (Hall ’70). Want to map the vertices to the plane with most edges short: minimize Σ_{(u,v) in E} (x(u) − x(v))² + (y(u) − y(v))² such that ‖x‖ = ‖y‖ = 1 and x, y ⊥ 1.
37
Spectral Graph Drawing (Hall ’70). Minimize Σ_{(u,v) in E} (x(u) − x(v))² + (y(u) − y(v))² such that ‖x‖ = ‖y‖ = 1, x, y ⊥ 1, and x ⊥ y, to prevent the two coordinate vectors from coinciding.
38
Spectral Graph Drawing (Hall ’70). Minimize Σ_{(u,v) in E} (x(u) − x(v))² + (y(u) − y(v))² subject to those constraints. Courant-Fischer Theorem: the solution is x = ψ_2 and y = ψ_3, up to rotation.
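Hall's drawing can be sketched in a few lines: take ψ_2 and ψ_3 as the x and y coordinates. The 6-cycle used below is my own example; its spectral drawing is a regular hexagon, so every vertex lands at the same radius.

```python
import numpy as np

# Spectral drawing sketch: coordinates from the eigenvectors
# of the two smallest nonzero Laplacian eigenvalues.
# Example graph: a 6-cycle.
n = 6
L = np.zeros((n, n))
for u in range(n):
    v = (u + 1) % n
    L[u, u] += 1; L[v, v] += 1
    L[u, v] -= 1; L[v, u] -= 1

vals, vecs = np.linalg.eigh(L)      # eigenvalues in ascending order
x, y = vecs[:, 1], vecs[:, 2]       # psi_2 and psi_3
coords = np.column_stack([x, y])
print(coords)
```

For the 6-cycle, λ_2 = 1 with multiplicity two, so the drawing is determined only up to rotation, exactly as the slide says.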
39
Spectral Graph Drawing (Hall ’70). (Figures: an arbitrary drawing and the spectral drawing of a 9-vertex example graph.)
40
Spectral Graph Drawing (Hall ’70) Original Drawing Spectral Drawing
41
Spectral Graph Drawing (Hall ’70) Original Drawing Spectral Drawing
42
Dodecahedron: best embedded in three dimensions by the first three nontrivial eigenvectors.
43
Spectral drawing of the Erdős collaboration graph: an edge between each pair of co-authors of papers.
44
When there is a “nice” drawing (most edges are short; vertices are spread out and don’t clump too much), λ_2 is close to 0. When λ_2 is big, we say there is no nice picture of the graph.
45
Expanders: when λ_2 is big. Formally: infinite families of graphs of constant degree d and large λ_2. Examples: random d-regular graphs, Ramanujan graphs. They have no communities or clusters. Incredibly useful in computer science: they act like random graphs (pseudo-random) and are used in many important theorems and algorithms.
46
Good Expander Graphs: d-regular graphs with all nonzero Laplacian eigenvalues close to d. Courant-Fischer: for all x ⊥ 1, x^T L x ≈ d ‖x‖².
47
Good Expander Graphs: d-regular graphs with all nonzero Laplacian eigenvalues close to d, so that (by Courant-Fischer) x^T L x ≈ d ‖x‖² for all x ⊥ 1. For K_n, the complete graph on n vertices, every nonzero Laplacian eigenvalue equals n, so x^T L x = n ‖x‖² for all x ⊥ 1.
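This eigenvalue fact about K_n is easy to check numerically:

```python
import numpy as np

# For K_n, the Laplacian is n*I - J (J = all-ones matrix); its
# eigenvalues are 0 once and n with multiplicity n-1, making
# K_n the extreme example of an expander.
n = 5
L = n * np.eye(n) - np.ones((n, n))
vals = np.sort(np.linalg.eigvalsh(L))
print(vals)   # approximately [0, 5, 5, 5, 5]
```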
49
Sparse Approximations of Graphs (S-Teng ’04). A graph H is a sparse approximation of G if H has few edges and L_H ≈ L_G, where H ≈_ε G means (1 − ε) x^T L_H x ≤ x^T L_G x ≤ (1 + ε) x^T L_H x for all x.
52
Sparse Approximations of Graphs (S-Teng ’04). The number of edges in H can be made nearly linear in the number of vertices of G.
53
Why we sparsify graphs: to save memory when storing graphs; to speed up algorithms for flow problems in graphs (Benczur-Karger ’96) and for linear equations in Laplacians (S-Teng ’04).
54
Graph Sparsification Theorems. For every graph G and every ε > 0, there is an H with O(n/ε²) edges s.t. H ≈_ε G (Batson-S-Srivastava ’09).
55
Graph Sparsification Theorems. For every G and ε > 0, there is an H with O(n/ε²) edges s.t. H ≈_ε G (Batson-S-Srivastava ’09). By careful random sampling, one can quickly get O(n log n / ε²) edges (S-Srivastava ’08).
56
Laplacian Matrices: L_G = Σ_{(u,v) in E} w_{u,v} (e_u − e_v)(e_u − e_v)^T, a sum of rank-one outer products, one per edge.
59
Matrix Sparsification
60
Matrix Sparsification: approximate a sum of outer products Σ_i v_i v_i^T by Σ_i s_i v_i v_i^T, where the s_i are nonzero only for a small subset of the vectors, scaled up.
63
Simplification of Matrix Sparsification: approximating A = Σ_i v_i v_i^T is equivalent to approximating the identity matrix, after a change of variables.
64
Simplification of Matrix Sparsification. Set u_i = A^{−1/2} v_i; then Σ_i u_i u_i^T = I, and we need Σ_i s_i u_i u_i^T ≈ I.
65
Simplification of Matrix Sparsification. Set u_i = A^{−1/2} v_i, so that Σ_i u_i u_i^T = I: a “decomposition of the identity,” a “Parseval frame,” or vectors in “isotropic position.”
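Putting vectors into isotropic position is a single change of variables. A sketch (the random vectors are purely illustrative; any spanning set works):

```python
import numpy as np

rng = np.random.default_rng(0)

# Put arbitrary vectors v_1..v_m into isotropic position:
# u_i = A^{-1/2} v_i with A = sum_i v_i v_i^T, so that
# sum_i u_i u_i^T = I.
m, n = 20, 4
V = rng.normal(size=(m, n))          # rows are the v_i
A = V.T @ V                          # the sum of outer products
w, Q = np.linalg.eigh(A)             # A is positive definite here
A_inv_sqrt = Q @ np.diag(1.0 / np.sqrt(w)) @ Q.T
U = V @ A_inv_sqrt                   # rows are the u_i
print(U.T @ U)                       # the identity matrix
```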
66
Matrix Sparsification by Sampling (Rudelson ’99, Ahlswede-Winter ’02, Tropp ’11). For v_i with Σ_i v_i v_i^T = I: choose each v_i with probability p_i proportional to ‖v_i‖² (for a graph, the edge’s effective conductance term); if v_i is chosen, set s_i = 1/p_i.
69
Matrix Sparsification by Sampling (Rudelson ’99, Ahlswede-Winter ’02, Tropp ’11). With high probability, this chooses O(n log n / ε²) vectors, and Σ_i s_i v_i v_i^T ≈_ε I.
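One standard variant of this sampling scheme (i.i.d. draws with replacement rather than independent inclusion, which changes constants but not the idea) can be sketched as:

```python
import numpy as np

rng = np.random.default_rng(1)

# Sampling sketch: given u_i in isotropic position, draw k
# indices i.i.d. with probability proportional to ||u_i||^2
# and rescale each chosen outer product by 1/(k p_i), so the
# sampled sum is an unbiased estimate of I.
m, n = 2000, 4
U = rng.normal(size=(m, n))
w, Q = np.linalg.eigh(U.T @ U)
U = U @ (Q @ np.diag(1.0 / np.sqrt(w)) @ Q.T)   # now sum u_i u_i^T = I

p = (U ** 2).sum(axis=1) / n        # ||u_i||^2 / n, sums to 1
k = 400
picks = rng.choice(m, size=k, p=p)
S = sum(np.outer(U[i], U[i]) / (k * p[i]) for i in picks)
err = np.linalg.norm(S - np.eye(n), 2)
print(err)                          # small spectral-norm error: S is close to I
```

Matrix concentration inequalities (Rudelson, Ahlswede-Winter, Tropp) are what guarantee the spectral-norm error is small with high probability.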
70
Optimal (?) Matrix Sparsification (Batson-S-Srivastava ’09). For v_i with Σ_i v_i v_i^T = I, one can choose O(n/ε²) of the vectors and nonzero values for the s_i so that (1 − ε) I ⪯ Σ_i s_i v_i v_i^T ⪯ (1 + ε) I.
73
The Kadison-Singer Problem (’59). Equivalent to: Anderson’s Paving Conjectures (’79, ’81), the Bourgain-Tzafriri Conjecture (’91), the Feichtinger Conjecture (’05), and many others. Implied by: Weaver’s KS_2 conjecture (’04).
74
Weaver’s Conjecture. Isotropic vectors: Σ_i v_i v_i^T = I, equivalently Σ_i ⟨v_i, x⟩² = 1 for every unit vector x.
75
Partition into approximately ½-Isotropic Sets
78
Partition into approximately ½-Isotropic Sets: split the vectors into S_1 and S_2 with Σ_{i in S_j} v_i v_i^T ≈ ½ I, which is plausible because the two parts sum to I.
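A concrete instance where an exact ½-isotropic partition exists: equally spaced directions in the plane, split into alternating halves. The construction is my own illustration, not from the talk.

```python
import numpy as np

# m equally spaced unit directions in R^2, scaled so they sum
# to the identity; splitting into alternating halves gives two
# sets that each sum to exactly I/2.
m = 12
theta = np.pi * np.arange(m) / m
V = np.sqrt(2.0 / m) * np.column_stack([np.cos(theta), np.sin(theta)])
assert np.allclose(V.T @ V, np.eye(2))        # isotropic position

even = V[0::2].T @ V[0::2]                    # sum of outer products, one half
odd = V[1::2].T @ V[1::2]                     # the other half
print(even)
print(odd)
```

Here every ‖v_i‖² = 2/m is small; the next slide explains why big vectors are the obstruction in general.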
79
Big vectors make this difficult
81
Weaver’s Conjecture KS_2: there exist positive constants α and ε so that if all ‖v_i‖² ≤ α and Σ_i v_i v_i^T = I, then there exists a partition into S_1 and S_2 with ‖Σ_{i in S_j} v_i v_i^T‖ ≤ 1 − ε.
82
Theorem (Marcus-S-Srivastava ’15). For all α > 0, if all ‖v_i‖² ≤ α and Σ_i v_i v_i^T = I, then there exists a partition into S_1 and S_2 with ‖Σ_{i in S_j} v_i v_i^T‖ ≤ (1/√2 + √α)².
83
We want the largest eigenvalue of each part, Σ_{i in S_j} v_i v_i^T, to be at most (1/√2 + √α)².
85
Consider the expected characteristic polynomial of a random partition. We want its largest root to be at most (1/√2 + √α)².
86
Proof Outline. 1. Prove the expected characteristic polynomial has real roots. 2. Prove its largest root is at most (1/√2 + √α)². 3. Prove the polynomials form an interlacing family, so there exists a partition whose polynomial has largest root at most that of the expected polynomial.
87
Interlacing. A polynomial q of degree n − 1 interlaces a polynomial p of degree n if between any two consecutive roots of p there lies a root of q. Example: x interlaces x² − 1.
88
Common Interlacing. p_1 and p_2 have a common interlacing if one can partition the line into intervals so that each interval contains one root from each polynomial.
89
Common Interlacing. If p_1 and p_2 have a common interlacing, then for some i the largest root of p_i is at most the largest root of the average (p_1 + p_2)/2.
91
Without a common interlacing, this can fail (the average need not even be real-rooted).
96
Common Interlacing. p_1 and p_2 have a common interlacing iff t p_1 + (1 − t) p_2 is real-rooted for all t in [0, 1].
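This equivalence suggests a numerical test for a common interlacing: check real-rootedness of convex combinations. The two example pairs below are my own.

```python
import numpy as np

def real_rooted(coeffs, tol=1e-9):
    """Check that a polynomial (numpy coefficient order) has all real roots."""
    return np.all(np.abs(np.roots(coeffs).imag) < tol)

# p1 and p2 with a common interlacing: the roots {0, 2} and
# {1, 3} can be covered by intervals each holding one root of
# each polynomial, so every convex combination is real-rooted.
p1 = np.array([1.0, -2.0, 0.0])    # x(x - 2)
p2 = np.array([1.0, -4.0, 3.0])    # (x - 1)(x - 3)
ok = all(real_rooted(t * p1 + (1 - t) * p2) for t in np.linspace(0, 1, 101))
print(ok)

# Without a common interlacing the test can fail: the roots
# {1, 2} and {3, 4} cannot share such intervals, and the
# average (x^2 - 5x + 7) has complex roots.
q1 = np.array([1.0, -3.0, 2.0])    # (x - 1)(x - 2)
q2 = np.array([1.0, -7.0, 12.0])   # (x - 3)(x - 4)
bad = real_rooted(0.5 * q1 + 0.5 * q2)
print(bad)
```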
97
Interlacing Family of Polynomials. {p_s} is an interlacing family if its members can be placed on the leaves of a tree so that, when every node is labeled with the average of the leaves below it, siblings have common interlacings.
100
Interlacing Family of Polynomials. Theorem: if {p_s} is an interlacing family, there is an s so that the largest root of p_s is at most the largest root of the average of the whole family.
103
Our family is interlacing: form the other polynomials in the tree by fixing the choices of where some vectors go.
104
Summary. 1. Prove the expected characteristic polynomial has real roots. 2. Prove its largest root is at most (1/√2 + √α)². 3. Prove the family is an interlacing family, so there exists a partition whose polynomial has largest root at most that of the expected polynomial.
105
To learn more about Laplacians, see: my web page on Laplacian linear equations, sparsification, etc.; my class notes from “Spectral Graph Theory” and “Graphs and Networks.” To learn more about Kadison-Singer: the papers in the Annals of Mathematics and the survey from the ICM, available on arXiv and my web page.