Talk 2: Graph Mining Tools - SVD, ranking, proximity Christos Faloutsos CMU.



Lipari 2010(C) 2010, C. Faloutsos 2 Outline Introduction – Motivation Task 1: Node importance Task 2: Recommendations Task 3: Connection sub-graphs Conclusions

Lipari 2010(C) 2010, C. Faloutsos 3 Node importance - Motivation: Given a graph (eg., web pages containing the desirable query word) Q: Which node is the most important?

Lipari 2010(C) 2010, C. Faloutsos 4 Node importance - Motivation: Given a graph (eg., web pages containing the desirable query word) Q: Which node is the most important? A1: HITS (SVD = Singular Value Decomposition) A2: eigenvector (PageRank)

Lipari 2010(C) 2010, C. Faloutsos 5 Node importance - motivation SVD and eigenvector analysis: very closely related

Lipari 2010(C) 2010, C. Faloutsos 6 SVD - Detailed outline Motivation Definition - properties Interpretation Complexity Case studies

Lipari 2010(C) 2010, C. Faloutsos 7 SVD - Motivation problem #1: text - LSI: find ‘concepts’ problem #2: compression / dim. reduction

Lipari 2010(C) 2010, C. Faloutsos 8 SVD - Motivation problem #1: text - LSI: find ‘concepts’

Lipari 2010(C) 2010, C. Faloutsos 9 SVD - Motivation Customer-product, for recommendation system: bread lettuce beef vegetarians meat eaters tomatos chicken

Lipari 2010(C) 2010, C. Faloutsos 10 SVD - Motivation problem #2: compress / reduce dimensionality

Lipari 2010(C) 2010, C. Faloutsos 11 Problem - specs ~10**6 rows; ~10**3 columns; no updates; random access to any cell(s) ; small error: OK

Lipari 2010(C) 2010, C. Faloutsos 12 SVD - Motivation

Lipari 2010(C) 2010, C. Faloutsos 13 SVD - Motivation

Lipari 2010(C) 2010, C. Faloutsos 14 SVD - Detailed outline Motivation Definition - properties Interpretation Complexity Case studies Additional properties

Lipari 2010(C) 2010, C. Faloutsos 15 SVD - Definition (reminder: matrix multiplication) [3 x 2] x [2 x 1] = [3 x 1]

Lipari 2010(C) 2010, C. Faloutsos 20 SVD - Definition A [n x m] = U [n x r] Σ [r x r] (V [m x r] ) T A: n x m matrix (eg., n documents, m terms) U: n x r matrix (n documents, r concepts) Σ: r x r diagonal matrix (strength of each ‘concept’) (r: rank of the matrix) V: m x r matrix (m terms, r concepts)

Lipari 2010(C) 2010, C. Faloutsos 21 SVD - Definition A = U Σ V T - example:

Lipari 2010(C) 2010, C. Faloutsos 22 SVD - Properties THEOREM [Press+92]: always possible to decompose matrix A into A = U Σ V T, where U, Σ, V: unique (*) U, V: column orthonormal (ie., columns are unit vectors, orthogonal to each other) –U T U = I; V T V = I (I: identity matrix) Σ: singular values are positive, and sorted in decreasing order

Lipari 2010(C) 2010, C. Faloutsos 23 SVD - Example A = U Σ V T - example: data inf. retrieval brain lung = CS MD x x

Lipari 2010(C) 2010, C. Faloutsos 24 SVD - Example A = U Σ V T - example: data inf. retrieval brain lung = CS MD x x CS-concept MD-concept

Lipari 2010(C) 2010, C. Faloutsos 25 SVD - Example A = U Σ V T - example: data inf. retrieval brain lung = CS MD x x CS-concept MD-concept doc-to-concept similarity matrix

Lipari 2010(C) 2010, C. Faloutsos 26 SVD - Example A = U Σ V T - example: data inf. retrieval brain lung = CS MD x x ‘strength’ of CS-concept

Lipari 2010(C) 2010, C. Faloutsos 27 SVD - Example A = U Σ V T - example: data inf. retrieval brain lung = CS MD x x term-to-concept similarity matrix CS-concept
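The decomposition and the theorem above can be checked numerically. A minimal numpy sketch, using a made-up 5 x 4 document-term matrix in the spirit of the CS/MD example (the actual slide values are not reproduced here):

```python
import numpy as np

# Toy document-term matrix: rows = documents (3 CS-like, 2 MD-like),
# columns = terms (data, inf. retrieval, brain, lung). Illustrative values.
A = np.array([
    [1, 1, 0, 0],
    [2, 2, 0, 0],
    [1, 1, 0, 0],
    [0, 0, 1, 1],
    [0, 0, 2, 2],
], dtype=float)

U, s, Vt = np.linalg.svd(A, full_matrices=False)

# Columns of U and V are orthonormal: U^T U = I, V^T V = I
assert np.allclose(U.T @ U, np.eye(U.shape[1]))
assert np.allclose(Vt @ Vt.T, np.eye(Vt.shape[0]))

# Singular values are non-negative and sorted in decreasing order
assert np.all(s >= 0) and np.all(np.diff(s) <= 0)

# A is exactly reconstructed as U Sigma V^T
assert np.allclose(A, U @ np.diag(s) @ Vt)
```

With this block-structured matrix, only two singular values are (essentially) non-zero: one per ‘concept’.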

Lipari 2010(C) 2010, C. Faloutsos 29 SVD - Detailed outline Motivation Definition - properties Interpretation Complexity Case studies Additional properties

Lipari 2010(C) 2010, C. Faloutsos 30 SVD - Interpretation #1 ‘documents’, ‘terms’ and ‘concepts’: U: document-to-concept similarity matrix V: term-to-concept sim. matrix Σ: its diagonal elements: ‘strength’ of each concept

Lipari 2010(C) 2010, C. Faloutsos 32 SVD – Interpretation #1 ‘documents’, ‘terms’ and ‘concepts’: Q: if A is the document-to-term matrix, what is A T A? A: term-to-term ([m x m]) similarity matrix Q: A A T ? A: document-to-document ([n x n]) similarity matrix

Lipari 2010(C) 2010, C. Faloutsos 33 Copyright: Faloutsos, Tong (2009) 2-33 SVD properties V are the eigenvectors of the covariance matrix A T A; U are the eigenvectors of the Gram (inner-product) matrix A A T Further reading: 1. Ian T. Jolliffe, Principal Component Analysis (2nd ed), Springer, 2002. 2. Gilbert Strang, Linear Algebra and Its Applications (4th ed), Brooks Cole, 2005.

Lipari 2010(C) 2010, C. Faloutsos 34 SVD - Interpretation #2 best axis to project on: (‘best’ = min sum of squares of projection errors)

Lipari 2010(C) 2010, C. Faloutsos 35 SVD - Motivation

Lipari 2010(C) 2010, C. Faloutsos 36 SVD - interpretation #2 minimum RMS error SVD: gives best axis to project v1 first singular vector

Lipari 2010(C) 2010, C. Faloutsos 37 SVD - Interpretation #2

Lipari 2010(C) 2010, C. Faloutsos 38 SVD - Interpretation #2 A = U Σ V T - example: = x x v1

Lipari 2010(C) 2010, C. Faloutsos 39 SVD - Interpretation #2 A = U Σ V T - example: = x x variance (‘spread’) on the v1 axis

Lipari 2010(C) 2010, C. Faloutsos 40 SVD - Interpretation #2 A = U Σ V T - example: –U Σ gives the coordinates of the points in the projection axis = x x

Lipari 2010(C) 2010, C. Faloutsos 41 SVD - Interpretation #2 More details Q: how exactly is dim. reduction done? = xx

Lipari 2010(C) 2010, C. Faloutsos 42 SVD - Interpretation #2 More details Q: how exactly is dim. reduction done? A: set the smallest singular values to zero: = xx

Lipari 2010(C) 2010, C. Faloutsos 43 SVD - Interpretation #2 ~ xx

Lipari 2010(C) 2010, C. Faloutsos 44 SVD - Interpretation #2 ~ xx

Lipari 2010(C) 2010, C. Faloutsos 45 SVD - Interpretation #2 ~ xx

Lipari 2010(C) 2010, C. Faloutsos 46 SVD - Interpretation #2 ~
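The truncation in the slides above ("set the smallest singular values to zero") can be sketched in numpy. This is the best rank-k approximation in the least-squares (Frobenius) sense, matching the optimality claim in Interpretation #2; the matrix here is random, purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.random((6, 5))          # made-up data matrix

U, s, Vt = np.linalg.svd(A, full_matrices=False)

# keep only the top-k terms; equivalently, zero out the smallest
# singular values and re-multiply
k = 2
A_k = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

# the squared Frobenius error equals the sum of the squared,
# dropped singular values
err = np.linalg.norm(A - A_k, 'fro')
assert np.isclose(err, np.sqrt((s[k:] ** 2).sum()))
```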

Lipari 2010(C) 2010, C. Faloutsos 47 SVD - Interpretation #2 Exactly equivalent: ‘spectral decomposition’ of the matrix: = x x

Lipari 2010(C) 2010, C. Faloutsos 48 SVD - Interpretation #2 Exactly equivalent: ‘spectral decomposition’ of the matrix: = x x u 1 u 2 λ 1 λ 2 v 1 v 2

Lipari 2010(C) 2010, C. Faloutsos 49 SVD - Interpretation #2 Exactly equivalent: ‘spectral decomposition’ of the matrix: A [n x m] = λ 1 u 1 v 1 T + λ 2 u 2 v 2 T +...

Lipari 2010(C) 2010, C. Faloutsos 50 SVD - Interpretation #2 Exactly equivalent: ‘spectral decomposition’ of the matrix: A [n x m] = λ 1 u 1 v 1 T + λ 2 u 2 v 2 T +... (each u i : n x 1; each v i T : 1 x m; r terms)

Lipari 2010(C) 2010, C. Faloutsos 51 SVD - Interpretation #2 approximation / dim. reduction: by keeping the first few terms (Q: how many?) A [n x m] ~ λ 1 u 1 v 1 T + λ 2 u 2 v 2 T +... assume: λ 1 >= λ 2 >=...

Lipari 2010(C) 2010, C. Faloutsos 52 SVD - Interpretation #2 A (heuristic - [Fukunaga]): keep 80-90% of ‘energy’ (= sum of squares of λ i ’s) assume: λ 1 >= λ 2 >=...
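The [Fukunaga] energy heuristic on the slide above can be sketched as a small helper. The function name and the example singular values are made up for illustration:

```python
import numpy as np

def rank_for_energy(singular_values, energy=0.9):
    """Smallest k such that the first k singular values hold the given
    fraction of the total 'energy' (= sum of squared singular values)."""
    sq = np.asarray(singular_values, dtype=float) ** 2
    cum = np.cumsum(sq) / sq.sum()          # cumulative energy fraction
    return int(np.searchsorted(cum, energy) + 1)

s = np.array([10.0, 5.0, 1.0, 0.1])
# squared values: 100, 25, 1, 0.01 -> the first two already hold ~99%
k = rank_for_energy(s, 0.9)
assert k == 2
```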

Lipari 2010(C) 2010, C. Faloutsos 53 SVD - Detailed outline Motivation Definition - properties Interpretation –#1: documents/terms/concepts –#2: dim. reduction –#3: picking non-zero, rectangular ‘blobs’ Complexity Case studies Additional properties

Lipari 2010(C) 2010, C. Faloutsos 54 SVD - Interpretation #3 finds non-zero ‘blobs’ in a data matrix = xx

Lipari 2010(C) 2010, C. Faloutsos 55 SVD - Interpretation #3 finds non-zero ‘blobs’ in a data matrix = xx

Lipari 2010(C) 2010, C. Faloutsos 56 SVD - Interpretation #3 finds non-zero ‘blobs’ in a data matrix = ‘communities’ (bi-partite cores, here) Row 1 Row 4 Col 1 Col 3 Col 4 Row 5 Row 7

Lipari 2010(C) 2010, C. Faloutsos 57 SVD - Detailed outline Motivation Definition - properties Interpretation Complexity Case studies Additional properties

Lipari 2010(C) 2010, C. Faloutsos 58 SVD - Complexity O( n * m * m) or O( n * n * m) (whichever is less) less work, if we just want singular values or if we want first k singular vectors or if the matrix is sparse [Berry] Implemented: in any linear algebra package (LINPACK, matlab, Splus, mathematica...)

Lipari 2010(C) 2010, C. Faloutsos 59 SVD - conclusions so far SVD: A = U Σ V T : unique (*) U: document-to-concept similarities V: term-to-concept similarities Σ: strength of each concept dim. reduction: keep the first few strongest singular values (80-90% of ‘energy’) –SVD: picks up linear correlations SVD: picks up non-zero ‘blobs’

Lipari 2010(C) 2010, C. Faloutsos 60 SVD - Detailed outline Motivation Definition - properties Interpretation Complexity SVD properties Case studies Conclusions

Lipari 2010(C) 2010, C. Faloutsos 61 SVD - Other properties - summary can produce orthogonal basis (obvious) (who cares?) can solve over- and under-determined linear problems (see C(1) property) can compute ‘fixed points’ (= ‘steady state prob. in Markov chains’) (see C(4) property)

Lipari 2010(C) 2010, C. Faloutsos 62 SVD -outline of properties (A): obvious (B): less obvious (C): least obvious (and most powerful!)

Lipari 2010(C) 2010, C. Faloutsos 63 Properties - by defn.: A(0): A [n x m] = U [n x r] Σ [r x r] V T [r x m] A(1): U T [r x n] U [n x r] = I [r x r] (identity matrix) A(2): V T [r x m] V [m x r] = I [r x r] A(3): Σ k = diag( λ 1 k, λ 2 k,... λ r k ) (k: ANY real number) A(4): A T = V Σ U T

Lipari 2010(C) 2010, C. Faloutsos 64 Less obvious properties A(0): A [n x m] = U [n x r] Σ [r x r] V T [r x m] B(1): A [n x m] (A T ) [m x n] = ??

Lipari 2010(C) 2010, C. Faloutsos 65 Less obvious properties A(0): A [n x m] = U [n x r] Σ [r x r] V T [r x m] B(1): A [n x m] (A T ) [m x n] = U Σ 2 U T symmetric; Intuition?

Lipari 2010(C) 2010, C. Faloutsos 66 Less obvious properties A(0): A [n x m] = U [n x r] Σ [r x r] V T [r x m] B(1): A [n x m] (A T ) [m x n] = U Σ 2 U T symmetric; Intuition? ‘document-to-document’ similarity matrix B(2): symmetrically, for ‘V’: (A T ) [m x n] A [n x m] = V Σ 2 V T Intuition?

Lipari 2010(C) 2010, C. Faloutsos 67 Less obvious properties A: term-to-term similarity matrix B(3): ( (A T ) [m x n] A [n x m] ) k = V Σ 2k V T and B(4): (A T A ) k ~ v 1 λ 1 2k v 1 T for k>>1 where v 1 : [m x 1] first column (singular-vector) of V λ 1 : strongest singular value

Lipari 2010(C) 2010, C. Faloutsos 68 Less obvious properties B(4): (A T A ) k ~ v 1 λ 1 2k v 1 T for k>>1 B(5): (A T A ) k v’ ~ (constant) v 1 ie., for (almost) any v’, it converges to a vector parallel to v 1 Thus, useful to compute first singular vector/value (as well as the next ones, too...)

Lipari 2010(C) 2010, C. Faloutsos 69 Less obvious properties - repeated: A(0): A [n x m] = U [n x r] Σ [r x r] V T [r x m] B(1): A [n x m] (A T ) [m x n] = U Σ 2 U T B(2): (A T ) [m x n] A [n x m] = V Σ 2 V T B(3): ( (A T ) [m x n] A [n x m] ) k = V Σ 2k V T B(4): (A T A ) k ~ v 1 λ 1 2k v 1 T B(5): (A T A ) k v’ ~ (constant) v 1
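Property B(5) is exactly the power method: repeatedly multiplying (almost) any start vector by A T A converges to the strongest right singular vector. A minimal numpy sketch on a random, made-up matrix:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.random((6, 4))               # made-up data matrix

# power iteration on A^T A, per property B(5)
v = rng.random(4)                    # (almost) any start vector v'
for _ in range(100):
    v = A.T @ (A @ v)                # one application of A^T A
    v /= np.linalg.norm(v)           # renormalize to avoid overflow

# compare against v1 from a full SVD (up to sign)
v1 = np.linalg.svd(A)[2][0]
assert np.allclose(np.abs(v @ v1), 1.0, atol=1e-6)
```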

Lipari 2010(C) 2010, C. Faloutsos 70 Least obvious properties - cont’d A(0): A [n x m] = U [n x r] Σ [r x r] V T [r x m] C(2): A [n x m] v 1 [m x 1] = λ 1 u 1 [n x 1] where v 1, u 1 the first (column) vectors of V, U. (v 1 == right-singular-vector) C(3): symmetrically: u 1 T A = λ 1 v 1 T u 1 == left-singular-vector Therefore:

Lipari 2010(C) 2010, C. Faloutsos 71 Least obvious properties - cont’d A(0): A [n x m] = U [n x r] Σ [r x r] V T [r x m] C(4): A T A v 1 = λ 1 2 v 1 (fixed point - the dfn of eigenvector for a symmetric matrix)

Lipari 2010(C) 2010, C. Faloutsos 72 Least obvious properties - altogether A(0): A [n x m] = U [n x r] Σ [r x r] V T [r x m] C(1): A [n x m] x [m x 1] = b [n x 1] then, x 0 = V Σ (-1) U T b: shortest, actual or least-squares solution C(2): A [n x m] v 1 [m x 1] = λ 1 u 1 [n x 1] C(3): u 1 T A = λ 1 v 1 T C(4): A T A v 1 = λ 1 2 v 1

Lipari 2010(C) 2010, C. Faloutsos 73 Properties - conclusions A(0): A [n x m] = U [n x r] Σ [r x r] V T [r x m] B(5): (A T A ) k v’ ~ (constant) v 1 C(1): A [n x m] x [m x 1] = b [n x 1] then, x 0 = V Σ (-1) U T b: shortest, actual or least-squares solution C(4): A T A v 1 = λ 1 2 v 1
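Property C(1), solving over-determined linear systems via x 0 = V Σ (-1) U T b, can be checked against numpy's own least-squares solver. The 8 x 3 system is made up for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.random((8, 3))          # over-determined: more equations than unknowns
b = rng.random(8)

U, s, Vt = np.linalg.svd(A, full_matrices=False)
x0 = Vt.T @ ((U.T @ b) / s)     # x0 = V Sigma^{-1} U^T b, per property C(1)

# agrees with the least-squares solution
x_ls = np.linalg.lstsq(A, b, rcond=None)[0]
assert np.allclose(x0, x_ls)
```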

Lipari 2010(C) 2010, C. Faloutsos 74 SVD - detailed outline... SVD properties case studies –Kleinberg’s algorithm –Google’s algorithm Conclusions

Lipari 2010(C) 2010, C. Faloutsos 75 Kleinberg’s algo (HITS) Kleinberg, Jon (1998). Authoritative sources in a hyperlinked environment. Proc. 9th ACM-SIAM Symposium on Discrete Algorithms.

Lipari 2010(C) 2010, C. Faloutsos 76 Recall: problem dfn Given a graph (eg., web pages containing the desirable query word) Q: Which node is the most important?

Lipari 2010(C) 2010, C. Faloutsos 77 Kleinberg’s algorithm Problem dfn: given the web and a query find the most ‘authoritative’ web pages for this query Step 0: find all pages containing the query terms Step 1: expand by one move forward and backward

Lipari 2010(C) 2010, C. Faloutsos 78 Kleinberg’s algorithm Step 1: expand by one move forward and backward

Lipari 2010(C) 2010, C. Faloutsos 79 Kleinberg’s algorithm on the resulting graph, give high score (= ‘authorities’) to nodes that many important nodes point to give high importance score (‘hubs’) to nodes that point to good ‘authorities’) hubsauthorities

Lipari 2010(C) 2010, C. Faloutsos 80 Kleinberg’s algorithm observations recursive definition! each node (say, ‘i’-th node) has both an authoritativeness score a i and a hubness score h i

Lipari 2010(C) 2010, C. Faloutsos 81 Kleinberg’s algorithm Let E be the set of edges and A be the adjacency matrix: the (i,j) entry is 1 if the edge from i to j exists Let h and a be [n x 1] vectors with the ‘hubness’ and ‘authoritativeness’ scores. Then:

Lipari 2010(C) 2010, C. Faloutsos 82 Kleinberg’s algorithm Then: a i = h k + h l + h m that is a i = Sum (h j ) over all j that (j,i) edge exists or a = A T h k l m i

Lipari 2010(C) 2010, C. Faloutsos 83 Kleinberg’s algorithm symmetrically, for the ‘hubness’: h i = a n + a p + a q that is h i = Sum (a j ) over all j that (i,j) edge exists or h = A a p n q i

Lipari 2010(C) 2010, C. Faloutsos 84 Kleinberg’s algorithm In conclusion, we want vectors h and a such that: h = A a a = A T h Recall properties: C(2): A [n x m] v 1 [m x 1] = 1 u 1 [n x 1] C(3): u 1 T A = 1 v 1 T =

Lipari 2010(C) 2010, C. Faloutsos 85 Kleinberg’s algorithm In short, the solutions to h = A a a = A T h are the left- and right- singular-vectors of the adjacency matrix A. Starting from random a’ and iterating, we’ll eventually converge (Q: to which of all the singular-vectors? why?)

Lipari 2010(C) 2010, C. Faloutsos 86 Kleinberg’s algorithm (Q: to which of all the singular-vectors? why?) A: to the ones of the strongest singular-value, because of property B(5): B(5): (A T A ) k v’ ~ (constant) v 1
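The iteration from the last few slides (h = A a, a = A T h, renormalizing each step) can be sketched directly in numpy. The four-node graph below is made up; node 2 is pointed to by everyone else, so it should emerge as the top authority:

```python
import numpy as np

# Adjacency matrix of a tiny directed graph: A[i, j] = 1 iff edge i -> j.
# Edges: 0->1, 0->2, 1->2, 2->3, 3->2
A = np.array([
    [0, 1, 1, 0],
    [0, 0, 1, 0],
    [0, 0, 0, 1],
    [0, 0, 1, 0],
], dtype=float)

h = np.ones(4)                 # start from uniform hub scores
for _ in range(100):
    a = A.T @ h                # authorities: pointed to by good hubs
    a /= np.linalg.norm(a)
    h = A @ a                  # hubs: point to good authorities
    h /= np.linalg.norm(h)

# node 2 collects the most (and best) in-links -> top authority
assert np.argmax(a) == 2
```

By property B(5), a and h converge to the strongest right and left singular vectors of A, as the slide states.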

Lipari 2010(C) 2010, C. Faloutsos 87 Kleinberg’s algorithm - results Eg., for the query ‘java’: java.sun.com (“the java developer”)

Lipari 2010(C) 2010, C. Faloutsos 88 Kleinberg’s algorithm - discussion ‘authority’ score can be used to find ‘similar pages’ (how?)

Lipari 2010(C) 2010, C. Faloutsos 89 SVD - detailed outline... Complexity SVD properties Case studies –Kleinberg’s algorithm (HITS) –Google’s algorithm Conclusions

Lipari 2010(C) 2010, C. Faloutsos 90 PageRank (google) Brin, Sergey and Lawrence Page (1998). Anatomy of a Large-Scale Hypertextual Web Search Engine. 7th Intl World Wide Web Conf. Larry Page Sergey Brin

Lipari 2010(C) 2010, C. Faloutsos 91 Problem: PageRank Given a directed graph, find its most interesting/central node A node is important, if it is connected with important nodes (recursive, but OK!)

Lipari 2010(C) 2010, C. Faloutsos 92 Problem: PageRank - solution Given a directed graph, find its most interesting/central node Proposed solution: Random walk; spot most ‘popular’ node (-> steady state prob. (ssp)) A node has high ssp, if it is connected with high ssp nodes (recursive, but OK!)

Lipari 2010(C) 2010, C. Faloutsos 93 (Simplified) PageRank algorithm Let A be the adjacency matrix; let B be the transition matrix: transpose, column-normalized - then = To From B

Lipari 2010(C) 2010, C. Faloutsos 94 (Simplified) PageRank algorithm B p = p

Lipari 2010(C) 2010, C. Faloutsos 95 Definitions AAdjacency matrix (from-to) DDegree matrix = (diag ( d1, d2, …, dn) ) BTransition matrix: to-from, column normalized B = A T D -1

Lipari 2010(C) 2010, C. Faloutsos 96 (Simplified) PageRank algorithm B p = 1 * p thus, p is the eigenvector that corresponds to the highest eigenvalue (=1, since the matrix is column-normalized ) Why does such a p exist? –p exists if B is nxn, nonnegative, irreducible [Perron–Frobenius theorem]

Lipari 2010(C) 2010, C. Faloutsos 97 (Simplified) PageRank algorithm In short: imagine a particle randomly moving along the edges compute its steady-state probabilities (ssp) Full version of algo: with occasional random jumps Why? To make the matrix irreducible

Lipari 2010(C) 2010, C. Faloutsos 98 Full Algorithm With probability 1-c, fly-out to a random node Then, we have p = c B p + (1-c)/n 1 => p = (1-c)/n [I - c B] -1 1
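The full algorithm above can be sketched in numpy, both as the iteration p = c B p + (1-c)/n 1 and via the closed form p = (1-c)/n [I - c B] -1 1. The graph is made up; note it has no dangling (zero out-degree) nodes, which the simplified sketch assumes:

```python
import numpy as np

# Toy directed graph, edges: 0->1, 0->2, 1->2, 2->3, 3->2
A = np.array([
    [0, 1, 1, 0],
    [0, 0, 1, 0],
    [0, 0, 0, 1],
    [0, 0, 1, 0],
], dtype=float)
d = A.sum(axis=1)                   # out-degrees (all > 0: no dangling nodes)
B = A.T / d                         # transition matrix: to-from, B = A^T D^-1

c, n = 0.85, 4                      # fly-out with probability 1-c
p = np.ones(n) / n
for _ in range(100):
    p = c * (B @ p) + (1 - c) / n   # one step, with occasional random jump
assert np.isclose(p.sum(), 1.0)     # stays a probability vector

# closed form: p = (1-c)/n [I - cB]^{-1} 1
p_direct = (1 - c) / n * np.linalg.solve(np.eye(n) - c * B, np.ones(n))
assert np.allclose(p, p_direct, atol=1e-6)
```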

Lipari 2010(C) 2010, C. Faloutsos 99 Alternative notation MModified transition matrix M = c B + (1-c)/n 1 1 T Then p = M p That is: the steady state probabilities = PageRank scores form the first eigenvector of the ‘modified transition matrix’

Lipari 2010(C) 2010, C. Faloutsos 100 Parenthesis: intuition behind eigenvectors

Lipari 2010(C) 2010, C. Faloutsos 101 Formal definition If A is a (n x n) square matrix: (λ, x) is an eigenvalue/eigenvector pair of A if A x = λ x CLOSELY related to singular values:

Lipari 2010(C) 2010, C. Faloutsos 102 Property #1: Eigen- vs singular-values if B [n x m] = U [n x r] Σ [r x r] (V [m x r] ) T then A = ( B T B ) is symmetric and C(4): B T B v i = λ i 2 v i ie, v 1, v 2,...: eigenvectors of A = (B T B)

Lipari 2010(C) 2010, C. Faloutsos 103 Property #2 If A [nxn] is a real, symmetric matrix Then it has n real eigenvalues (if A is not symmetric, some eigenvalues may be complex)

Lipari 2010(C) 2010, C. Faloutsos 104 Property #3 If A [nxn] is a real, symmetric matrix Then it has n real eigenvalues And they agree with its n singular values, except possibly for the sign

Lipari 2010(C) 2010, C. Faloutsos 105 Intuition A as vector transformation Axx’ = x

Lipari 2010(C) 2010, C. Faloutsos 106 Intuition By defn., eigenvectors remain parallel to themselves (‘fixed points’): A v 1 = λ 1 v 1 (here, λ 1 = 3.62)

Lipari 2010(C) 2010, C. Faloutsos 109 Convergence Usually, fast: depends on ratio λ 1 : λ 2

Lipari 2010(C) 2010, C. Faloutsos 110 Kleinberg/google - conclusions SVD helps in graph analysis: hub/authority scores: strongest left- and right- singular-vectors of the adjacency matrix random walk on a graph: steady state probabilities are given by the strongest eigenvector of the (modified) transition matrix

Lipari 2010(C) 2010, C. Faloutsos 111 Conclusions SVD: a valuable tool given a document-term matrix, it finds ‘concepts’ (LSI)... and can find fixed-points or steady-state probabilities (google/ Kleinberg/ Markov Chains)

Lipari 2010(C) 2010, C. Faloutsos 112 Conclusions cont’d (We didn’t discuss/elaborate, but, SVD... can reduce dimensionality (KL)... and can find rules (PCA; RatioRules)... and can solve optimally over- and under- constraint linear systems (least squares / query feedbacks)

Lipari 2010(C) 2010, C. Faloutsos 113 References Berry, Michael: Brin, S. and L. Page (1998). Anatomy of a Large-Scale Hypertextual Web Search Engine. 7th Intl World Wide Web Conf.

Lipari 2010(C) 2010, C. Faloutsos 114 References Christos Faloutsos, Searching Multimedia Databases by Content, Springer (App. D) Fukunaga, K. (1990). Introduction to Statistical Pattern Recognition, Academic Press. I.T. Jolliffe, Principal Component Analysis, Springer, 2002 (2nd ed.)

Lipari 2010(C) 2010, C. Faloutsos 115 References cont’d Kleinberg, J. (1998). Authoritative sources in a hyperlinked environment. Proc. 9th ACM-SIAM Symposium on Discrete Algorithms. Press, W. H., S. A. Teukolsky, et al. (1992). Numerical Recipes in C, Cambridge University Press.

Lipari 2010(C) 2010, C. Faloutsos 116 Outline Introduction – Motivation Task 1: Node importance Task 2: Recommendations & proximity Task 3: Connection sub-graphs Conclusions

Lipari 2010(C) 2010, C. Faloutsos 117 Acknowledgement : Most of the foils in ‘Task 2’ are by Hanghang TONG

Lipari 2010(C) 2010, C. Faloutsos 118 Detailed outline Problem dfn and motivation Solution: Random walk with restarts Efficient computation Case study: image auto-captioning Extensions: bi-partite graphs; tracking Conclusions

Lipari 2010(C) 2010, C. Faloutsos 119 Motivation: Link Prediction Should we introduce Mr. A to Mr. B?

Lipari 2010(C) 2010, C. Faloutsos 120 Motivation - recommendations customers Products / movies ‘ smith ’ Terminator 2 ??

Lipari 2010(C) 2010, C. Faloutsos 121 Answer: proximity ‘yes’, if ‘A’ and ‘B’ are ‘close’ ‘yes’, if ‘smith’ and ‘terminator 2’ are ‘close’ QUESTIONS in this part: -How to measure ‘closeness’/proximity? -How to do it quickly? -What else can we do, given proximity scores?

Lipari 2010(C) 2010, C. Faloutsos 122 How close is ‘A’ to ‘B’? a.k.a Relevance, Closeness, ‘Similarity’…

Lipari 2010(C) 2010, C. Faloutsos 123 Why is it useful? Recommendation And many more Image captioning [Pan+] Conn. / CenterPiece subgraphs [Faloutsos+], [Tong+], [Koren+] and Link prediction [Liben-Nowell+], [Tong+] Ranking [Haveliwala], [Chakrabarti+] Management [Minkov+] Neighborhood Formulation [Sun+] Pattern matching [Tong+] Collaborative Filtering [Fouss+] …

Lipari 2010(C) 2010, C. Faloutsos 124 Automatic Image Captioning (test image; candidate keywords: sea, sun, sky, wave, cat, forest, tiger, grass) Q: How to assign keywords to the test image? A: Proximity! [Pan+ 2004]

Lipari 2010(C) 2010, C. Faloutsos 125 Center-Piece Subgraph(CePS) Original Graph CePS Q: How to find hub for the black nodes? A: Proximity! [Tong+ KDD 2006] CePS guy Input Output

Detailed outline Problem dfn and motivation Solution: Random walk with restarts Efficient computation Case study: image auto-captioning Extensions: bi-partite graphs; tracking Conclusions Lipari 2010(C) 2010, C. Faloutsos 126

Lipari 2010(C) 2010, C. Faloutsos 127 How close is ‘A’ to ‘B’? Should be close, if they have - many, - short - ‘heavy’ paths

Lipari 2010(C) 2010, C. Faloutsos 128 Why not shortest path? A: ‘pizza delivery guy’ problem Some ``bad’’ proximities

Lipari 2010(C) 2010, C. Faloutsos 129 Why not max. netflow? A: No penalty for long paths Some ``bad’’ proximities

Lipari 2010(C) 2010, C. Faloutsos 130 What is a ``good’’ Proximity? Multiple Connections Quality of connection Direct & In-directed Conns Length, Degree, Weight… …

Lipari 2010(C) 2010, C. Faloutsos Random walk with restart [Haveliwala’02]

Lipari 2010(C) 2010, C. Faloutsos 132 Random walk with restart (restarting from Node 4) Ranking vector: more red, more relevant; nearby nodes get higher scores

Lipari 2010(C) 2010, C. Faloutsos 133 Why is RWR a good score? It rewards all paths from i to j - those with length 1, length 2, length 3,... - with longer paths discounted by the damping factor c (A: adjacency matrix; c: damping factor)
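The RWR score solves the linear system p = c W p + (1-c) e i (the PageRank recursion with the restart vector replaced by the query node, as noted later in the slides). A minimal numpy sketch on a made-up 5-node path-like graph:

```python
import numpy as np

# Undirected toy graph, edges: 0-1, 0-2, 1-2, 2-3, 3-4
A = np.array([
    [0, 1, 1, 0, 0],
    [1, 0, 1, 0, 0],
    [1, 1, 0, 1, 0],
    [0, 0, 1, 0, 1],
    [0, 0, 0, 1, 0],
], dtype=float)
W = A / A.sum(axis=0)          # column-normalized transition matrix

c = 0.9                        # damping factor; restart prob. is 1 - c
n = A.shape[0]
e = np.zeros(n); e[0] = 1.0    # restart at query node 0
# closed form: r = (1-c) [I - cW]^{-1} e
r = (1 - c) * np.linalg.solve(np.eye(n) - c * W, e)

# scores form a probability distribution; nearby node 1 beats far node 4
assert np.isclose(r.sum(), 1.0)
assert r[1] > r[4]
```

Each entry of [I - cW]^{-1} = I + cW + c^2 W^2 + ... sums the discounted paths from the slide's intuition.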

Detailed outline Problem dfn and motivation Solution: Random walk with restarts –variants Efficient computation Case study: image auto-captioning Extensions: bi-partite graphs; tracking Conclusions Lipari 2010(C) 2010, C. Faloutsos 134

Lipari 2010(C) 2010, C. Faloutsos 135 Variant: escape probability Define Random Walk (RW) on the graph Esc_Prob(CMU  Paris) –Prob (starting at CMU, reaches Paris before returning to CMU) CMU Paris the remaining graph Esc_Prob = Pr (smile before cry)

Lipari 2010(C) 2010, C. Faloutsos 136 Other Variants Other measures by RWs –Commute Time/Hitting Time [Fouss+] –SimRank [Jeh+] Equivalence of Random Walks –Electric Networks: EC [Doyle+]; SAEC [Faloutsos+]; CFEC [Koren+] –Spring Systems Katz [Katz], [Huang+], [Scholkopf+] Matrix-Forest-based Alg [Chebotarev+]

Lipari 2010(C) 2010, C. Faloutsos 137 Other Variants Other measures by RWs –Commute Time/Hitting Time [Fouss+] –SimRank [Jeh+] Equivalence of Random Walks –Electric Networks: EC [Doyle+]; SAEC [Faloutsos+]; CFEC [Koren+] –Spring Systems Katz [Katz], [Huang+], [Scholkopf+] Matrix-Forest-based Alg [Chebotarev+] All are “related to” or “similar to” random walk with restart!

Lipari 2010(C) 2010, C. Faloutsos 138 Map of proximity measurements RWR Esc_Prob + Sink Hitting Time/ Commute Time Effective Conductance Spring System Regularized Un-constrained Quad Opt. Harmonic Func. Constrained Quad Opt. Mathematic Tools X out-degree “voltage = position” relax 4 ssp decides 1 esc_prob Katz Normalize Physical Models

Notice: asymmetry (even in undirected graphs). In the 5-node example (A, B, C, D, E), prox(C -> A) is high while prox(A -> C) is low.

Summary of proximity definitions. Goal: summarize multiple relationships. Solutions: basic: random walk with restarts [Haveliwala'02][Pan+ 2004][Sun+ 2006][Tong+ 2006]; properties: asymmetry [Koren+ 2006][Tong+ 2007][Tong+ 2008]; variants: Esc_Prob and many others [Faloutsos+ 2004][Koren+ 2006][Tong+ 2007].

Detailed outline: problem definition and motivation; solution: random walk with restarts; efficient computation; case study: image auto-captioning; extensions: bipartite graphs, tracking; conclusions.

Reminder: PageRank. With probability 1 - c, fly out to a random node. Then p = c B p + ((1 - c)/n) 1, and hence p = ((1 - c)/n) [I - c B]^{-1} 1.
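The fixed-point and closed-form expressions above can be verified against each other. A minimal sketch on a hypothetical 4-node directed graph:

```python
import numpy as np

# Hypothetical 4-node directed toy graph; A[j, i] = 1 if edge i -> j
A = np.array([[0, 0, 1, 1],    # edges: 2->0, 3->0
              [1, 0, 0, 0],    #        0->1
              [1, 1, 0, 0],    #        0->2, 1->2
              [0, 0, 1, 0]],   #        2->3
             dtype=float)
B = A / A.sum(axis=0)          # column-stochastic: B[j, i] = 1/out-degree(i)
n, c = 4, 0.85

# Power iteration on  p = c B p + (1 - c)/n * 1
p = np.full(n, 1.0 / n)
for _ in range(300):
    p = c * (B @ p) + (1 - c) / n

# Closed form:  p = (1 - c)/n * [I - c B]^{-1} * 1
p_closed = (1 - c) / n * np.linalg.solve(np.eye(n) - c * B, np.ones(n))
print(np.round(p, 4))
```

Since B is column-stochastic and the restart term redistributes exactly 1 - c of the mass, each iteration preserves the total probability, so p stays a distribution throughout.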

Random walk with restart: the same equation, p = c B p + (1 - c) e_i, with B the adjacency matrix (column-normalized), p the ranking vector, and e_i the starting vector. The only difference from PageRank is the restart vector: e_i (restart at the query node) instead of the uniform ((1/n) 1).

Computing RWR: p = c B p + (1 - c) e_i, where B is the n x n adjacency matrix, e_i the n x 1 starting (restart) vector, and p the n x 1 ranking vector.

Q: Given query node i, how do we solve for its ranking vector?

OntheFly (iterate per query): no pre-computation and light storage, but slow on-line response: O(mE) per query (m iterations over E edges).

PreCompute: pre-compute and store the matrix Q = (1 - c) [I - c B]^{-1}; each query then reads off one column of Q.

PreCompute: fast on-line response, but heavy pre-computation and storage cost: O(n^3) pre-computation, O(n^2) storage.
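The PreCompute trade-off can be sketched concretely: pay O(n^3) once to invert, then answer any query with a column lookup. A sketch on a hypothetical 4-node toy graph:

```python
import numpy as np

# Hypothetical 4-node toy graph; W is the column-normalized adjacency
A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
W = A / A.sum(axis=0)
c, n = 0.9, 4

# Off-line, once: O(n^3) time, O(n^2) storage
Q = (1 - c) * np.linalg.inv(np.eye(n) - c * W)

def rwr(i):
    """On-line: the ranking vector for query node i is just column i of Q."""
    return Q[:, i]

print(np.round(rwr(0), 3))
```

Each column of Q is itself a probability distribution (it sums to 1), which is a quick sanity check on the pre-computed matrix.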

Q: How to balance on-line and off-line cost?

How to balance? Idea ('B-Lin'): break the graph into communities; pre-compute everything within each community; adjust (with Sherman-Morrison) for the 'bridge' edges. H. Tong, C. Faloutsos & J.-Y. Pan. Fast Random Walk with Restart and Its Applications. ICDM 2006.

Detailed outline: problem definition and motivation; solution: random walk with restarts; efficient computation; case study: image auto-captioning; extensions: bipartite graphs, tracking; conclusions.

gCaP: automatic image captioning. Q: given captioned training images (e.g., {sea, sun, sky, wave} and {cat, forest, grass, tiger}) and a new image with unknown caption {?, ?, ?}, how to caption it? A: proximity! [Pan+ KDD 2004]

Test image: build a graph connecting images, their regions, and keywords (sea, sun, sky, wave, cat, forest, tiger, grass).

Test image: run RWR from the test image on the image-region-keyword graph; the top-ranked keywords give the caption {grass, forest, cat, tiger}.

C-DEM (screenshot).

C-DEM: multi-modal query system for Drosophila embryo databases [Fan+ VLDB 2008].

Detailed outline: problem definition and motivation; solution: random walk with restarts; efficient computation; case study: image auto-captioning; extensions: bipartite graphs, tracking; conclusions.

Problem: update. On an n-author x m-conference bipartite graph, E' edges change, involving n' authors and m' conferences. How to update the proximities quickly?

Solution: use the Sherman-Morrison lemma to quickly update the inverse matrix.
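The lemma states (M + u v^T)^{-1} = M^{-1} - (M^{-1} u v^T M^{-1}) / (1 + v^T M^{-1} u), so a single changed edge (a rank-1 perturbation) refreshes the stored inverse in O(n^2) instead of re-inverting in O(n^3). A sketch with a random well-conditioned matrix standing in for the system matrix (the specific update indices and weight are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 20
M = np.eye(n) + 0.05 * rng.standard_normal((n, n))  # stand-in system matrix
M_inv = np.linalg.inv(M)                            # expensive: O(n^3), done once

# A single changed edge perturbs the system matrix by a rank-1 term  u v^T
u = np.zeros(n); u[3] = 0.2
v = np.zeros(n); v[7] = 1.0

# Sherman-Morrison: refresh the inverse in O(n^2)
Mu = M_inv @ u
vM = v @ M_inv
M_inv_new = M_inv - np.outer(Mu, vM) / (1.0 + v @ Mu)

# Matches a full re-inversion of the updated matrix
print(np.allclose(M_inv_new, np.linalg.inv(M + np.outer(u, v))))
```

With E' changed edges the same idea applies repeatedly (or in block form via Sherman-Morrison-Woodbury), which is what makes the fast incremental tracking below feasible.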

Fast-Single-Update: 40x to 176x speedups over recomputation across datasets (log-scale running-time plot).

pTrack: Philip S. Yu's top-5 conferences up to each year (snapshots over time): ICDE, ICDCS, SIGMETRICS, PDIS, VLDB; then CIKM, ICDCS, ICDE, SIGMETRICS, ICMCS; then KDD, SIGMOD, ICDM, CIKM, ICDCS; then ICDM, KDD, ICDE, SDM, VLDB; i.e., a shift from databases / performance / distributed systems toward databases / data mining. Data: DBLP (author x conference), ~400k authors, ~3.5k conferences, 20 years.


KDD's rank w.r.t. VLDB over the years (proximity rank vs. year): data mining and databases are getting closer and closer.

cTrack: the 10 most influential authors in the NIPS community up to each year, from the NIPS author-paper bipartite graph (2,037 authors, spreading over 13 years); e.g., T. Sejnowski, M. Jordan.

Conclusions - take-home messages. Proximity definitions: RWR, and a lot of variants. Computation: Sherman-Morrison lemma; fast incremental computation. Applications: recommendations; auto-captioning; tracking; center-piece subgraphs (next); management; anomaly detection; ...

References
L. Page, S. Brin, R. Motwani & T. Winograd. The PageRank Citation Ranking: Bringing Order to the Web. Technical report, Stanford Digital Library Technologies Project, 1998.
T. H. Haveliwala. Topic-Sensitive PageRank. In WWW, 2002.
J.-Y. Pan, H.-J. Yang, C. Faloutsos & P. Duygulu. Automatic multimedia cross-modal correlation discovery. In KDD, 2004.

References
C. Faloutsos, K. S. McCurley & A. Tomkins. Fast discovery of connection subgraphs. In KDD, 2004.
J. Sun, H. Qu, D. Chakrabarti & C. Faloutsos. Neighborhood Formation and Anomaly Detection in Bipartite Graphs. In ICDM, 2005.
W. Cohen. Graph Walks and Graphical Models. Draft, 2007.

References
P. Doyle & J. Snell. Random Walks and Electric Networks. Mathematical Association of America, 1984.
Y. Koren, S. C. North & C. Volinsky. Measuring and extracting proximity in networks. In KDD, 245-255, 2006.
A. Agarwal, S. Chakrabarti & S. Aggarwal. Learning to rank networked entities. In KDD, 14-23, 2006.

References
S. Chakrabarti. Dynamic personalized PageRank in entity-relation graphs. In WWW, 2007.
F. Fouss, A. Pirotte, J.-M. Renders & M. Saerens. Random-Walk Computation of Similarities between Nodes of a Graph with Application to Collaborative Recommendation. IEEE Trans. Knowl. Data Eng. 19(3), 2007.

References
H. Tong & C. Faloutsos. Center-piece subgraphs: problem definition and fast solutions. In KDD, 2006.
H. Tong, C. Faloutsos & J.-Y. Pan. Fast Random Walk with Restart and Its Applications. In ICDM, 2006.
H. Tong, Y. Koren & C. Faloutsos. Fast direction-aware proximity for graph mining. In KDD, 2007.

References
H. Tong, B. Gallagher, C. Faloutsos & T. Eliassi-Rad. Fast best-effort pattern matching in large attributed graphs. In KDD, 2007.
H. Tong, S. Papadimitriou, P. S. Yu & C. Faloutsos. Proximity Tracking on Time-Evolving Bipartite Graphs. In SDM, 2008.

References
B. Gallagher, H. Tong, T. Eliassi-Rad & C. Faloutsos. Using Ghost Edges for Classification in Sparsely Labeled Networks. In KDD, 2008.
H. Tong, Y. Sakurai, T. Eliassi-Rad & C. Faloutsos. Fast Mining of Complex Time-Stamped Events. In CIKM, 2008.
H. Tong, H. Qu & H. Jamjoom. Measuring Proximity on Graphs with Side Information. In ICDM, 2008.

Resources. For software, papers, and PPTs of the presentations: www.cs.cmu.edu/~htong/tut/cikm2008/cikm_tutorial.html (the CIKM'08 tutorial on graphs and proximity). Again, thanks to Hanghang Tong for permission to use his foils in this part.

Outline: introduction & motivation; Task 1: node importance; Task 2: recommendations & proximity; Task 3: connection sub-graphs; conclusions.

Detailed outline: problem definition; solution; results. H. Tong & C. Faloutsos. Center-piece subgraphs: problem definition and fast solutions. In KDD, 2006.

Center-Piece Subgraph (CePS). Given Q query nodes, find the center-piece subgraph. Input: the Q query nodes, a budget b, and k (the softAND number). Applications: social networks, law enforcement, gene networks, ...


Challenges in CePS. Q1: How to measure importance? A: "proximity", but how to combine scores? (Q2: How to extract the connection subgraph? Q3: How to do it efficiently?)


AND: combine scores. Q: How to combine the per-query proximity scores? A: Multiply them: the product is the probability that the (e.g., 3) random particles, one started from each query node, coincide on node j.
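The "multiply" rule can be sketched as follows, assuming the RWR formulation from the earlier slides and a hypothetical 5-node toy graph with two query nodes (the graph and names are illustrative):

```python
import numpy as np

def rwr_scores(W, i, c=0.85):
    """RWR ranking vector for query node i (W: column-normalized adjacency)."""
    n = W.shape[0]
    e = np.zeros(n); e[i] = 1.0
    return (1 - c) * np.linalg.solve(np.eye(n) - c * W, e)

# Hypothetical 5-node toy graph: node 2 is a hub touching everyone
A = np.array([[0, 1, 1, 0, 0],
              [1, 0, 1, 0, 0],
              [1, 1, 0, 1, 1],
              [0, 0, 1, 0, 1],
              [0, 0, 1, 1, 0]], dtype=float)
W = A / A.sum(axis=0)

# AND semantics: multiply the per-query vectors elementwise -- the product is
# the probability that independent particles, one per query node, coincide on j
queries = [0, 4]
and_score = np.prod([rwr_scores(W, q) for q in queries], axis=0)
center = int(np.argmax(and_score))   # the hub (node 2) wins here
print(center, np.round(and_score, 4))
```

Nodes that are only close to one query node get a high factor times a low factor, so the product rewards nodes close to all queries at once, which is exactly the AND semantics.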

Detailed outline: problem definition; solution; results.

Case study: AND query.


Conclusions: proximity (e.g., with RWR) helps answer 'AND' and 'k_softAND' queries.

Overall conclusions. SVD: a powerful tool (HITS / PageRank; dimensionality reduction). Proximity: random walk with restarts (recommendation systems; auto-captioning; center-piece subgraphs).