Presentation transcript: Sparse Matrix Methods

1 Sparse Matrix Methods
Day 1: Overview
- Matlab and examples
- Data structures
- Ax=b
- Sparse matrices and graphs
- Fill-reducing matrix permutations
- Matching and block triangular form
Day 2: Direct methods
Day 3: Iterative methods

2 (Demo in Matlab)
- sprand, spy
- sparse, full
- matrix arithmetic and indexing
- examples of sparse matrices from different applications (from the UF Sparse Matrix Collection site)
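A minimal sketch of the kind of commands these demos exercise (the matrix size and density below are arbitrary choices):

    % Build a random sparse matrix: 1000-by-1000, about 1% density.
    A = sprand(1000, 1000, 0.01);
    spy(A)                      % visualize the nonzero pattern

    % Convert between sparse and full storage.
    F = full(A(1:5, 1:5));      % a small dense block
    S = sparse(F);              % back to sparse

    % Arithmetic and indexing work as for full matrices.
    nnz(A)                      % number of stored nonzeros
    B = A + A';                 % result is sparse
    x = A(:, 10);               % a sparse column vector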

3 Matlab sparse matrices: design principles
- All operations should give the same results for sparse and full matrices (almost all)
- Sparse matrices are never created automatically, but once created they propagate
- Performance is important, but usability, simplicity, completeness, and robustness are more important
- Storage for a sparse matrix should be O(nonzeros)
- Time for a sparse operation should be O(flops) (as nearly as possible)
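A small illustration of these principles, as a hedged sketch (the test matrix is arbitrary):

    S = sprand(500, 500, 0.01);     % sparse matrix, roughly 2500 nonzeros
    F = full(S);                    % the same matrix in full storage

    issparse(S * S)                 % true: sparse operands give a sparse result
    norm(full(S * S) - F * F, 1)    % ~0: same numerical result either way

    whos S F                        % S uses O(nnz) memory, F uses O(n^2)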

4 Data structures
Full: 2-dimensional array of real or complex numbers; (nrows*ncols) memory

    31   0  53
     0  59   0
    41  26   0

Sparse: compressed column storage; about (1.5*nzs + .5*ncols) memory

    value:  31 41 59 26 53
    row:     1  3  2  3  1
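The small example above can be reproduced in Matlab; sparse() builds the compressed-column structure from (row, column, value) triplets and find recovers them (a sketch):

    % Rebuild the 3-by-3 example from (row, column, value) triplets.
    i = [1 3 2 3 1];                % row indices, column by column
    j = [1 1 2 2 3];                % column indices
    v = [31 41 59 26 53];           % nonzero values

    A = sparse(i, j, v, 3, 3);      % stored internally in compressed-column form
    full(A)                         % [31 0 53; 0 59 0; 41 26 0]
    nnz(A)                          % 5 stored nonzeros
    [ii, jj, vv] = find(A);         % recover the triplets, in column order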

5 (Demos in Matlab)
- A \ b
- Cholesky factorization and orderings

6 Solving linear equations: x = A \ b;
- Is A square?  no => use QR to solve the least squares problem
- Is A triangular or permuted triangular?  yes => sparse triangular solve
- Is A symmetric with positive diagonal elements?  yes => attempt Cholesky after symmetric minimum degree ordering
- Otherwise => use LU on A(:, colamd(A))
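A rough sketch of this dispatch as Matlab code. This is not the actual implementation of backslash; the function name sparse_solve_sketch is made up, and symamd/colamd stand in for the minimum degree orderings mentioned above:

    function x = sparse_solve_sketch(A, b)
    % Rough sketch (not the real implementation) of the decision tree
    % behind x = A \ b for a sparse matrix A.
    [m, n] = size(A);
    if m ~= n
        % Rectangular: least squares via sparse QR (C = Q'*b).
        [C, R] = qr(A, b);
        x = R \ C;
        return
    end
    if istril(A) || istriu(A)
        % Triangular (the real code also detects permuted triangular forms).
        x = A \ b;
        return
    end
    if isequal(A, A') && all(diag(A) > 0)
        % Symmetric with positive diagonal: try Cholesky after a
        % symmetric fill-reducing ordering.
        p = symamd(A);
        [R, flag] = chol(A(p, p));
        if flag == 0
            x = zeros(n, 1);
            x(p) = R \ (R' \ b(p));
            return
        end                         % not positive definite: fall through to LU
    end
    % General case: LU with a column fill-reducing ordering.
    q = colamd(A);
    [L, U, P] = lu(A(:, q));
    x = zeros(n, 1);
    x(q) = U \ (L \ (P * b));
    end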

7 Matrix factorizations in Matlab
- Cholesky: R = chol(A);  simple left-looking column algorithm
- Nonsymmetric LU: [L,U,P] = lu(A);  left-looking "GPMOD", depth-first search, symmetric pruning
- Orthogonal: [Q,R] = qr(A);  George-Heath algorithm: row-wise Givens rotations
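A short sketch exercising the three factorizations and checking their defining identities (the test matrices, built with delsq/numgrid and sprand, are arbitrary choices):

    % Symmetric positive definite example: 2D model Laplacian.
    A = delsq(numgrid('S', 20));        % sparse, SPD
    R = chol(A);                        % R'*R = A, R upper triangular
    norm(R' * R - A, 1)                 % ~0

    % Nonsymmetric example.
    B = sprand(200, 200, 0.02) + speye(200);
    [L, U, P] = lu(B);                  % L*U = P*B
    norm(L * U - P * B, 1)              % ~0

    % Orthogonal factorization.
    [Q, Rq] = qr(B);                    % Q*Rq = B
    norm(Q * Rq - B, 1)                 % ~0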

8 Graphs and sparse matrices: Cholesky factorization
[Figure: graph G(A) and filled graph G+(A) (chordal), vertices 1-10]
Symmetric Gaussian elimination:
for j = 1 to n, add edges between j's higher-numbered neighbors
Fill: new nonzeros in the factor
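The elimination rule translates directly into code; a sketch that computes the filled graph by symbolic elimination (the function name filled_graph_sketch is made up, and a dense logical matrix is used for clarity, so it is only meant for small examples):

    function Gplus = filled_graph_sketch(A)
    % Symbolic symmetric Gaussian elimination on the graph of A (A symmetric).
    % Gplus(i,j) is true if edge (i,j) is in G+(A), i.e. if the factor has
    % a nonzero in position (i,j) (barring cancellation).
    n = size(A, 1);
    Gplus = full((A ~= 0) | logical(speye(n)));   % start from G(A) plus the diagonal
    for j = 1:n
        nbrs = find(Gplus(j, :) & ((1:n) > j));   % higher-numbered neighbors of j
        Gplus(nbrs, nbrs) = true;                 % connect them pairwise: fill edges
    end
    end

Barring exact numerical cancellation, the nonzero pattern of chol(A) for a positive definite A matches the upper triangle of Gplus.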

9 Elimination tree
[Figure: Cholesky factor, filled graph G+(A), and elimination tree T(A), vertices 1-10]
T(A): parent(j) = min { i > j : (i,j) in G+(A) }
- T describes dependencies among columns of the factor
- T can be computed from G(A) in almost linear time
- G+(A) can be computed easily from T
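Matlab exposes the elimination tree directly through etree; a brief sketch (the test matrix and the column j checked are arbitrary):

    A = delsq(numgrid('S', 10));    % small SPD test matrix
    parent = etree(A);              % parent(j) = parent of column j in T(A); 0 for a root
    etreeplot(A)                    % draw the elimination tree

    % Check against the definition: parent(j) is the smallest i > j
    % such that the Cholesky factor has a nonzero in (j, i).
    R = chol(A);
    j = 5;
    [parent(j), min(find(R(j, j+1:end)) + j)]    % the two values should agree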

10 (Demos in Matlab)
- matrix and graph
- elimination tree
- orderings in detail

11 Fill-reducing matrix permutations
Minimum degree: eliminate the row/column with the fewest nonzeros, add fill, repeat
- Theory: can be suboptimal even on the 2D model problem
- Practice: often wins for medium-sized problems
Nested dissection: find a separator, number it last, proceed recursively
- Theory: approximately optimal separators => approximately optimal fill and flop count
- Practice: often wins for very large problems
Banded orderings (reverse Cuthill-McKee, Sloan, ...): try to keep all nonzeros close to the diagonal
- Theory, practice: often wins for "long, thin" problems
The best modern general-purpose orderings are ND/MD hybrids.

12 Fill-reducing permutations in Matlab
Nonsymmetric approximate minimum degree: p = colamd(A);
- column permutation: lu(A(:,p)) often sparser than lu(A)
- also useful for QR factorization
Symmetric approximate minimum degree: p = symamd(A);
- symmetric permutation: chol(A(p,p)) often sparser than chol(A)
Reverse Cuthill-McKee: p = symrcm(A);
- A(p,p) often has smaller bandwidth than A
- similar to Sparspak RCM
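A quick comparison sketch of these orderings (matrix sizes and densities are arbitrary choices):

    A = delsq(numgrid('S', 60));            % 2D model problem, SPD

    nnz(chol(A))                            % fill with the natural ordering
    p = symamd(A);
    nnz(chol(A(p, p)))                      % usually much sparser

    % Reverse Cuthill-McKee: smaller bandwidth, not necessarily less fill.
    r = symrcm(A);
    [i, j] = find(A);        bw0 = max(abs(i - j));
    [i, j] = find(A(r, r));  bw1 = max(abs(i - j));
    [bw0 bw1]                               % bandwidth before and after symrcm

    % Nonsymmetric analogue for LU.
    B = sprand(500, 500, 0.01) + speye(500);
    q = colamd(B);
    [L0, U0, P0] = lu(B);
    [L1, U1, P1] = lu(B(:, q));
    [nnz(L0) + nnz(U0), nnz(L1) + nnz(U1)]  % column ordering usually reduces fill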

13 (Demos in Matlab)
- nonsymmetric LU
- dmperm, dmspy, components

14 Matching and block triangular form
Dulmage-Mendelsohn decomposition: bipartite matching followed by strongly connected components
Square, full-rank A: [p, q, r] = dmperm(A);
- A(p,q) has a nonzero diagonal and is in block upper triangular form
- also gives strongly connected components of a directed graph
- also gives connected components of an undirected graph
Arbitrary A: [p, q, r, s] = dmperm(A);
- maximum-size matching in a bipartite graph
- minimum-size vertex cover in a bipartite graph
- decomposition into strong Hall blocks
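A sketch of dmperm on a small, deliberately scrambled block-triangular matrix (the construction is an arbitrary illustration):

    % Build a matrix with hidden block upper triangular structure, then scramble it.
    A = blkdiag(sprand(4, 4, 0.8) + speye(4), sprand(3, 3, 0.8) + speye(3));
    A(1:4, 5:7) = sprand(4, 3, 0.5);        % coupling above the diagonal only
    B = A(randperm(7), randperm(7));        % scrambled rows and columns

    [p, q, r] = dmperm(B);                  % Dulmage-Mendelsohn decomposition
    C = B(p, q);                            % block upper triangular form
    full(all(diag(C) ~= 0))                 % true: the matching puts nonzeros on the diagonal
    r                                       % block k occupies rows/cols r(k) : r(k+1)-1
    spy(C)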

15 Complexity of direct methods
Model problems: 2D mesh (n^(1/2) x n^(1/2)) and 3D mesh (n^(1/3) x n^(1/3) x n^(1/3))

                    2D            3D
    Space (fill):   O(n log n)    O(n^(4/3))
    Time (flops):   O(n^(3/2))    O(n^2)
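These asymptotics can be observed experimentally; a rough sketch for the 2D model problem, assuming the nested dissection ordering dissect is available (it appears in recent Matlab releases; symamd is a substitute otherwise):

    % Fill of the Cholesky factor on k-by-k model grids (n = k^2 unknowns).
    for k = [32 64 128]
        A = delsq(numgrid('S', k + 2));     % 2D Laplacian on a k-by-k interior grid
        p = dissect(A);                     % nested dissection ordering
        n = size(A, 1);
        fillA = nnz(chol(A(p, p)));
        fprintf('n = %6d   fill = %9d   fill/(n*log2(n)) = %.2f\n', ...
                n, fillA, fillA / (n * log2(n)));
    end

The last column should stay roughly constant as n grows, in line with the O(n log n) space bound.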

16 The landscape of sparse Ax=b solvers
[Figure: 2x2 map of solvers. One axis: Direct (A = LU) vs. Iterative (y' = Ay), labeled "more robust" toward direct and "less storage" toward iterative. Other axis: symmetric positive definite vs. nonsymmetric, labeled "more robust" toward symmetric positive definite and "more general" toward nonsymmetric.]

17 (Demos in Matlab)
- conjugate gradients
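A minimal conjugate gradients sketch using pcg (the tolerance, iteration limit, and incomplete-Cholesky preconditioner via ichol are arbitrary choices here):

    A = delsq(numgrid('S', 50));            % SPD 2D model problem
    b = ones(size(A, 1), 1);

    % Unpreconditioned conjugate gradients.
    [x0, flag0, relres0, iter0] = pcg(A, b, 1e-8, 1000);

    % With an incomplete Cholesky preconditioner.
    L = ichol(A);
    [x1, flag1, relres1, iter1] = pcg(A, b, 1e-8, 1000, L, L');

    [iter0 iter1]                           % preconditioning typically cuts the iteration count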

