Combinatorial and algebraic tools for multigrid. Yiannis Koutis, Computer Science Department, Carnegie Mellon University. Aladdin Lamps 05, Carnegie Mellon School of Computer Science, 05/11/2005.


1 Combinatorial and algebraic tools for multigrid. Yiannis Koutis, Computer Science Department, Carnegie Mellon University.

2 multilevel methods
- citations
- 25 free software packages
- 10 special conferences since 1983
- Algorithms do not always work
- Limited theoretical understanding

3 multilevel methods: our goals
- provide theoretical understanding
- solve multilevel design problems
- small changes in current software
- study the structure of eigenspaces of Laplacians
- extensions for multilevel eigensolvers

4 Overview
- Quick definitions
- Subgraph preconditioners
- Support graph preconditioners
- Algebraic expressions
- Low frequency eigenvectors and good partitionings
- Multigrid introduction and current state
- Multigrid: our contributions

5 quick definitions
Given a graph G with edge weights w_ij:
- Laplacian: A(i,j) = -w_ij for i ≠ j, with every row sum equal to 0 (so A(i,i) = Σ_j w_ij)
- Normalized Laplacian: D^{-1/2} A D^{-1/2}, where D is the diagonal of A (the weighted degrees)
- The condition number κ(A,B) is a measure of how well B approximates A (and vice versa)
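These definitions can be made concrete in a short numpy sketch (illustrative code, not from the talk; `laplacian` and `normalized_laplacian` are hypothetical helper names):

```python
import numpy as np

def laplacian(n, weighted_edges):
    """Graph Laplacian: A[i, j] = -w_ij off the diagonal, every row sums to 0."""
    A = np.zeros((n, n))
    for i, j, w in weighted_edges:
        A[i, j] -= w
        A[j, i] -= w
        A[i, i] += w
        A[j, j] += w
    return A

def normalized_laplacian(A):
    """D^{-1/2} A D^{-1/2}, where D holds the weighted degrees (diagonal of A)."""
    s = 1.0 / np.sqrt(np.diag(A))
    return A * np.outer(s, s)

# Path graph 0-1-2 with unit weights
A = laplacian(3, [(0, 1, 1.0), (1, 2, 1.0)])
```

The normalized Laplacian has all ones on its diagonal, which is what makes the D^{1/2} scalings appear in the projection operators later in the talk.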

6 linear systems: preconditioning
Goal: solve Ax = b via an iterative method, where A is a Laplacian of size n with m edges. The complexity depends on κ(A,I) and m.
Solution: solve B^{-1}Ax = B^{-1}b instead, where
- Bz = y must be easily solvable
- κ(A,B) is small
B is the preconditioner.
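A minimal sketch of the preconditioned iteration (a plain preconditioned conjugate gradient loop; the diagonal/Jacobi preconditioner used here only stands in for a real subgraph preconditioner B, and the regularized path Laplacian is made-up test data):

```python
import numpy as np

def pcg(A, b, B_solve, tol=1e-8, maxiter=500):
    """Preconditioned conjugate gradient: solves Ax = b, using solves with B."""
    x = np.zeros_like(b)
    r = b - A @ x
    z = B_solve(r)
    p = z.copy()
    rz = r @ z
    for _ in range(maxiter):
        Ap = A @ p
        alpha = rz / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        if np.linalg.norm(r) < tol:
            break
        z = B_solve(r)
        rz_new = r @ z
        p = z + (rz_new / rz) * p
        rz = rz_new
    return x

# Path-graph Laplacian, slightly regularized so it is nonsingular
n = 50
A = np.diag(2.0 * np.ones(n)) - np.diag(np.ones(n - 1), 1) - np.diag(np.ones(n - 1), -1)
A[0, 0] = A[-1, -1] = 1.0
A += 1e-2 * np.eye(n)

b = np.sin(np.arange(n, dtype=float))
x = pcg(A, b, B_solve=lambda r: r / np.diag(A))   # Jacobi preconditioner
```

The number of iterations until convergence grows with κ(A,B); a better B (e.g. a subgraph preconditioner) means fewer outer iterations, at the price of a more expensive solve with B.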


8 combinatorial preconditioners: the Vaidya thread
B is a sparse subgraph of A, possibly with additional edges. Solving Bz = y is performed as follows:
1. Gaussian elimination on the degree ≤ 2 nodes of B.
2. A new, smaller system must be solved.
3. Recursively call the same algorithm on it to get an approximate solution.
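Step 1 can be illustrated on a toy example: one step of Gaussian elimination on a node of a Laplacian is a Schur complement, and eliminating a degree-2 node merges its two edges into one, with weight w1*w2/(w1+w2), like resistors in series (a sketch under these assumptions; `eliminate` is a hypothetical helper):

```python
import numpy as np

def laplacian(n, edges):
    A = np.zeros((n, n))
    for i, j, w in edges:
        A[i, j] -= w; A[j, i] -= w
        A[i, i] += w; A[j, j] += w
    return A

def eliminate(A, k):
    """One step of Gaussian elimination (Schur complement) on node k."""
    keep = [i for i in range(A.shape[0]) if i != k]
    return A[np.ix_(keep, keep)] - np.outer(A[keep, k], A[k, keep]) / A[k, k]

# Path 0-1-2 with weights w1, w2; eliminating the degree-2 middle node
# leaves a single edge of weight w1*w2/(w1+w2).
w1, w2 = 2.0, 3.0
A = laplacian(3, [(0, 1, w1), (1, 2, w2)])
reduced = eliminate(A, 1)
expected = laplacian(2, [(0, 1, w1 * w2 / (w1 + w2))])
```

The reduced matrix is again a graph Laplacian, which is why elimination of degree ≤ 2 nodes keeps the recursion inside the same problem class.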


10 combinatorial preconditioners: the Vaidya thread
- Graph sparsification [Spielman, Teng]
- Low stretch trees [Elkin, Emek, Spielman, Teng]
- Near optimal O(m poly(log n)) complexity
- Focus is on constructing a good B
- κ(A,B) is well understood; B is sparser than A
- B can look complicated even for simple graphs A



13 combinatorial preconditioners: the Gremban-Miller thread
The support graph S is bigger than A.

14 combinatorial preconditioners: the Gremban-Miller thread
- The preconditioner S is often a natural graph
- S inherits the sparsity properties of A
- S is equivalent to a dense graph B of size equal to that of A: κ(A,S) = κ(A,B)
- Analysis of κ(A,S) made easy by the work of [Maggs, Miller, Ravi, Woo, Parekh]
- Existence of a good S by the work of [Räcke]


16 algebraic expressions
Suppose we are given m clusters in A. The clustering matrix R is n x m, with R(i,j) = 1 if the j-th cluster contains node i.
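A small numpy sketch of the clustering matrix and the quotient operator it induces (illustrative; the 4-cycle graph and the pairing into two clusters are made-up data):

```python
import numpy as np

# Clustering matrix R for n = 4 nodes in m = 2 clusters: {0, 1} and {2, 3}.
# R[i, j] = 1 iff node i belongs to cluster j, so R is n x m.
clusters = [[0, 1], [2, 3]]
n, m = 4, len(clusters)
R = np.zeros((n, m))
for j, cluster in enumerate(clusters):
    for i in cluster:
        R[i, j] = 1.0

# A is the Laplacian of the 4-cycle 0-1-2-3-0 with unit weights;
# the quotient (coarse) operator is Q = R^T A R.
A = np.array([[ 2., -1.,  0., -1.],
              [-1.,  2., -1.,  0.],
              [ 0., -1.,  2., -1.],
              [-1.,  0., -1.,  2.]])
Q = R.T @ A @ R
```

Note that Q is again a Laplacian: its off-diagonal entry is minus the total weight of edges crossing between the two clusters (here 2, from the two cut edges of the cycle).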

17 algebraic expressions
The inverse preconditioner, and its normalized version: R^T D^{1/2} is the weighted clustering matrix.



20 good partitions and low frequency invariant subspaces
Suppose the graph A has a good clustering defined by the clustering matrix R, and let y be any vector satisfying the associated condition [equations not captured in the transcript]. Theorem: the inequality is tight up to a constant for certain graphs. Does this give a quality test?

21 good partitions and low frequency invariant subspaces
Let y be any vector satisfying the condition above, and let x be mostly a linear combination of eigenvectors corresponding to eigenvalues near the bottom of the spectrum. Theorem [statement not captured in the transcript]. To verify in practice, we can pick a random such vector x and check its distance to the closest y.


23 multigrid: a short introduction
Multigrid is a general class of algorithms. Richardson iteration: x ← x + a(b - Ax), so the error evolves as e ← (I - aA)e. High frequency components of the error are reduced quickly.
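The smoothing effect can be checked numerically (a sketch on a 1-D path Laplacian; the step size a = 1/λ_max and the number of sweeps are arbitrary illustrative choices):

```python
import numpy as np

# 1-D path Laplacian; Richardson iteration x <- x + a*(b - A x) propagates the
# error as e <- (I - a*A) e, which damps high-frequency components.
n = 64
A = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
A[0, 0] = A[-1, -1] = 1.0

eigvals, eigvecs = np.linalg.eigh(A)          # ascending eigenvalues
low, high = eigvecs[:, 1], eigvecs[:, -1]     # a smooth and an oscillatory mode

a = 1.0 / eigvals[-1]                         # step size 1/lambda_max
S = np.eye(n) - a * A                         # error propagation matrix
e = low + high                                # mixed-frequency error
for _ in range(5):
    e = S @ e

# After a few sweeps, essentially only the low-frequency part remains.
low_part, high_part = abs(low @ e), abs(high @ e)
```

This is exactly the motivation for the coarse grid: the smoother alone barely touches the low-frequency components.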

24 initial and smoothed error
[Figure: the initial error compared with the smoothed error]

25 the basic multigrid algorithm
Define a smaller graph Q, a projection operator R_project, and a lift operator R_lift.
1. Apply t rounds of smoothing (how many? with which iteration?)
2. Take the residual r = b - Ax_old
3. Solve Qz = R_project r (by recursion)
4. Form the new iterate x_new = x_old + R_lift z
5. Apply t rounds of smoothing (is this needed?)
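A two-grid instance of steps 1-5 (a minimal sketch: Richardson smoothing, pairwise clusters as the lift operator, and an exact coarse solve standing in for the recursive call; all parameters and the test matrix are illustrative choices, not the talk's construction):

```python
import numpy as np

def two_grid(A, b, R, x, a, t=2):
    """One two-grid cycle: smooth, coarse-grid correction with Q = R^T A R, smooth.
    R is the n x m lift operator; its transpose is used as the projection."""
    def smooth(x):
        for _ in range(t):
            x = x + a * (b - A @ x)           # 1. / 5. Richardson smoothing
        return x
    x = smooth(x)
    r = b - A @ x                             # 2. residual
    Q = R.T @ A @ R                           # coarse operator
    z = np.linalg.solve(Q, R.T @ r)           # 3. solve Qz = R^T r (exactly here)
    x = x + R @ z                             # 4. lift and correct
    return smooth(x)

# Path Laplacian, regularized to be nonsingular, with pairwise clustering.
n = 32
A = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
A[0, 0] = A[-1, -1] = 1.0
A += 0.05 * np.eye(n)

R = np.zeros((n, n // 2))
for j in range(n // 2):
    R[2 * j, j] = R[2 * j + 1, j] = 1.0       # clusters {0,1}, {2,3}, ...

b = np.random.default_rng(1).standard_normal(n)
x = np.zeros(n)
for _ in range(50):
    x = two_grid(A, b, R, x, a=0.25)
```

The smoother handles the oscillatory error; the coarse correction handles the piecewise-constant (low-frequency) error, which is exactly the part the smoother leaves behind.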



28 algebraic multigrid (AMG)
Goals:
- The range of R_project must approximate the unreduced error very well.
- The error not reduced by smoothing must be reduced on the smaller grid.
Smoothing: Jacobi iteration or scaled Richardson.
Construction:
- Find a clustering [heuristic]
- R_project = (R_lift)^T [heuristic]
- Q = R_project^T A R_project

29 two level analysis
Analyze the maximum eigenvalue of the two-level error propagation matrix [formula not captured in the transcript]. The matrix T_1 eliminates the error in a particular subspace, and a low frequency eigenvector has a significant component in that subspace.

30 two level analysis
Starting hypothesis: let X be the subspace corresponding to eigenvalues smaller than a threshold, and let Y be the null space of R_project. Assume the angle between X and Y is suitably bounded. Then two level convergence: the error is reduced by a constant factor.
Proving the hypothesis? Only in limited cases.

31 current state
"there is no systematic AMG approach that has proven effective in any kind of general context" [BCFHJMMR, SIAM Journal on Scientific Computing, 2003]


33 our contributions: two level
There exists a good clustering given by R, whose quality is measured by the condition number κ(A,S). Take Q = R^T A R, Richardson smoothing, and projection matrix R_project = R^T D^{1/2}.

34 our contributions: two level analysis
Starting hypothesis: let X be the subspace corresponding to eigenvalues smaller than a threshold, and let Y be the null space of R_project = R^T D^{1/2}. Assume the angle between X and Y is suitably bounded. Then two level convergence: the error is reduced by a constant factor.
Proving the hypothesis? Yes, using κ(A,S). The result holds for t = 1 smoothing; additional smoothings do not help.

35 our contributions: recursion
There is a matrix M which characterizes the error reduction after one full multigrid cycle. We need to upper bound its maximum eigenvalue as a function of the two-level eigenvalues. The maximum eigenvalue of M is upper bounded by the sum of the maximum eigenvalues over all two-level analyses.

36 towards full convergence
Goal: the error not reduced by smoothing must be reduced by the smaller grid.
A different point of view: the small grid does not reduce part of the error; rather, it changes its spectral profile.

37 full convergence for regular d-dimensional toroidal meshes
A simple change in the implementation of the algorithm: insert an operator T_2 with eigenvalues 1 and -1, so that T_2 x_low = x_high (low frequencies are mapped to high frequencies).

38 full convergence for regular d-dimensional toroidal meshes
With t = O(log log n) smoothings, the recursive analysis gives λ_max(M) ≤ 1/2. Both pre-smoothings and post-smoothings are needed. The result also holds for perturbations of toroidal meshes.


40 Thanks!

