
1 Krylov-Subspace Methods - I Lecture 6 Alessandra Nardi Thanks to Prof. Jacob White, Deepak Ramaswamy, Michal Rewienski, and Karen Veroy

2 Last lecture review
Iterative Methods Overview
– Stationary
– Non-Stationary
QR factorization to solve Mx=b
– Modified Gram-Schmidt Algorithm
– QR Pivoting
– Minimization View of QR
– Basic Minimization approach
– Orthogonalized Search Directions
– Pointer to Krylov Subspace Methods

3 Last lecture reminder
QR Factorization – by picture

4 QR Factorization – Minimization View
Minimization Algorithm:
For i = 1 to N "For each target column"
  For j = 1 to i-1 "For each source column left of target"
    Orthogonalize Search Direction
  end
  Normalize
end
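Below is a minimal NumPy sketch of this minimization-view loop as modified Gram-Schmidt QR; the function name mgs_qr and the dense-matrix setting are illustrative assumptions, not part of the lecture.

```python
import numpy as np

def mgs_qr(M):
    """Modified Gram-Schmidt QR: a minimal sketch of the loop above.
    Orthogonalizes each target column against the columns to its left."""
    N = M.shape[1]
    Q = M.astype(float).copy()
    R = np.zeros((N, N))
    for i in range(N):                      # "for each target column"
        for j in range(i):                  # "for each source column left of target"
            R[j, i] = Q[:, j] @ Q[:, i]     # orthogonalize against column j
            Q[:, i] -= R[j, i] * Q[:, j]
        R[i, i] = np.linalg.norm(Q[:, i])   # normalize
        Q[:, i] /= R[i, i]
    return Q, R
```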

5 Iterative Methods
Solve Mx=b minimizing the residual r = b - Mx
Stationary: x^(k+1) = Gx^(k) + c
– Jacobi (sketched below)
– Gauss-Seidel
– Successive Overrelaxation
Non-Stationary: x^(k+1) = x^(k) + a_k p_k
– CG (Conjugate Gradient): requires M symmetric and positive definite
– GCR (Generalized Conjugate Residual)
– GMRES, etc.
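As a concrete illustration of the stationary form x^(k+1) = Gx^(k) + c, here is a minimal Jacobi sweep in NumPy (a sketch assuming M has a nonzero diagonal and that the iteration converges, e.g. M diagonally dominant; the names are illustrative):

```python
import numpy as np

def jacobi(M, b, iters=100):
    """Stationary iteration x^(k+1) = G x^(k) + c with G = -D^(-1)(L+U)."""
    D = np.diag(M)                  # diagonal entries of M
    R = M - np.diagflat(D)          # off-diagonal part L + U
    x = np.zeros_like(b, dtype=float)
    for _ in range(iters):
        x = (b - R @ x) / D         # one Jacobi sweep
    return x
```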

6 Iterative Methods - CG
Convergence is related to:
– Number of distinct eigenvalues
– Ratio between max and min eigenvalue
Why? How?

7 Outline
General Subspace Minimization Algorithm
– Review orthogonalization and projection formulas
Generalized Conjugate Residual Algorithm
– Krylov-subspace
– Simplification in the symmetric case
– Convergence properties
Eigenvalue and Eigenvector Review
– Norms and Spectral Radius
– Spectral Mapping Theorem

8 Arbitrary Subspace Methods: Residual Minimization
Pick x^k in span{w_0, ..., w_k} so as to minimize the residual norm ||r^k|| = ||b - M x^k||

9 Arbitrary Subspace Methods: Residual Minimization
Use Gram-Schmidt on the Mw_i's!

10 Arbitrary Subspace Methods: Orthogonalization

11 Arbitrary Subspace Solution Algorithm
1. Given M, b, and a set of search directions: {w_0, ..., w_k}
2. Make the w_i's M^T M-orthogonal (i.e., make the Mw_i's mutually orthogonal) and get new search directions: {p_0, ..., p_k}
3. Minimize the residual ||b - M x^k|| over span{p_0, ..., p_k}

12 Arbitrary Subspace Solution Algorithm
For i = 0 to k
  For j = 0 to i-1
    Orthogonalize Search Direction
  end
  Normalize
  Update Solution
end

13 Krylov Subspace
How about the initial set of search directions {w_0, ..., w_k}?
A particular choice that is commonly used is: {w_0, ..., w_k} = {b, Mb, M^2 b, ...}
K_m(A, v) ≡ span{v, Av, A^2 v, ..., A^(m-1) v} is called a Krylov subspace
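A minimal sketch of generating the raw Krylov vectors {b, Mb, M^2 b, ...} (illustrative only; in practice these vectors quickly become nearly parallel, so methods orthogonalize them rather than use them directly):

```python
import numpy as np

def krylov_basis(M, b, m):
    """Columns span K_m(M, b) = span{b, Mb, ..., M^(m-1) b}."""
    W = np.zeros((len(b), m))
    W[:, 0] = b
    for i in range(1, m):
        W[:, i] = M @ W[:, i - 1]   # next power of M applied to b
    return W
```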

14 Krylov Subspace Methods
Searching over the Krylov subspace means x^k = p_k(M) b, where p_k is a kth-order polynomial in M

15 Krylov Subspace Methods: Subspace Generation
The set of residuals can also be used as a representation of the Krylov subspace
Generalized Conjugate Residual Algorithm: nice because the residuals generate the next search directions

16 Krylov-Subspace Methods: Generalized Conjugate Residual Method (k-th step)
1. Determine the optimal stepsize in the kth search direction
2. Update the solution (trying to minimize the residual) and the residual
3. Compute the new orthogonalized search direction (using the most recent residual)
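The three steps can be sketched in NumPy as follows (an illustrative dense-matrix sketch, not the lecture's code; it keeps all previous directions and normalizes so that ||Mp_j|| = 1, which simplifies the orthogonalization and stepsize formulas):

```python
import numpy as np

def gcr(M, b, tol=1e-10, max_iter=None):
    """A minimal GCR sketch following the slide's three steps."""
    n = len(b)
    max_iter = max_iter or n
    x = np.zeros(n)
    r = b.astype(float)
    P, MP = [], []                      # search directions p_j and M p_j
    for k in range(max_iter):
        p = r.copy()                    # new direction from latest residual
        Mp = M @ p
        for pj, Mpj in zip(P, MP):      # make M p orthogonal to earlier M p_j
            beta = Mpj @ Mp
            p -= beta * pj
            Mp -= beta * Mpj
        nrm = np.linalg.norm(Mp)        # normalize so that ||M p|| = 1
        p, Mp = p / nrm, Mp / nrm
        alpha = r @ Mp                  # optimal stepsize in this direction
        x += alpha * p                  # update the solution ...
        r -= alpha * Mp                 # ... and the residual
        P.append(p); MP.append(Mp)
        if np.linalg.norm(r) < tol:
            break
    return x
```

With this normalization the optimal stepsize reduces to the single inner product a_k = r · (Mp_k), since it minimizes ||r - a_k Mp_k|| over a_k.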

17 Krylov-Subspace Methods: Generalized Conjugate Residual Method (Computational Complexity for k-th step)
Vector inner products: O(n)
Matrix-vector product: O(n) if sparse
Vector adds: O(n)
Orthogonalization needs O(k) inner products, so the total cost of the k-th step is O(nk)
If M is sparse, as k (# of iters) approaches n, the total cost over all steps, O(k^2 n), approaches O(n^3)
Better converge fast!

18 Krylov-Subspace Methods: Generalized Conjugate Residual Method (Symmetric Case – Conjugate Gradient Method)
An amazing fact that will not be derived: if M is symmetric, orthogonalization is done in one step, against the most recent search direction only
If k (# of iters) approaches n, then for M symmetric and sparse, GCR is O(n^2)
Better converge fast!
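A sketch of the one-step orthogonalization in the symmetric case (assuming M symmetric; shown only to illustrate why the per-step cost drops, not as a full conjugate gradient implementation):

```python
import numpy as np

def gcr_symmetric(M, b, tol=1e-10, max_iter=None):
    """Symmetric-case sketch: only the most recent direction is needed
    for orthogonalization, so each step does O(1) inner products."""
    n = len(b)
    max_iter = max_iter or n
    x = np.zeros(n)
    r = b.astype(float)
    p_prev = Mp_prev = None
    for k in range(max_iter):
        p, Mp = r.copy(), M @ r
        if p_prev is not None:          # one-step orthogonalization
            beta = Mp_prev @ Mp
            p -= beta * p_prev
            Mp -= beta * Mp_prev
        nrm = np.linalg.norm(Mp)
        p, Mp = p / nrm, Mp / nrm
        alpha = r @ Mp
        x += alpha * p
        r -= alpha * Mp
        if np.linalg.norm(r) < tol:
            break
        p_prev, Mp_prev = p, Mp
    return x
```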

19 Summary
What an iterative non-stationary method is: x^(k+1) = x^(k) + a_k p_k
How to calculate:
– the search directions (p_k)
– the step along the search directions (a_k)
Krylov Subspace → GCR
GCR is O(k^2 n)
– Better converge fast!
Now look at the convergence properties of GCR

20 Krylov Methods Convergence Analysis: Basic Properties
The GCR residual at step k+1 can be written as r^(k+1) = p_(k+1)(M) r^0, where p_(k+1) is a (k+1)-th order polynomial with p_(k+1)(0) = 1

21 Krylov Methods Convergence Analysis: Optimality of the GCR Polynomial
GCR optimality property (key property of the algorithm): GCR picks the best (k+1)-th order polynomial, i.e., it minimizes ||r^(k+1)|| = ||p_(k+1)(M) r^0|| subject to p_(k+1)(0) = 1

22 Krylov Methods Convergence Analysis: Optimality of the GCR Polynomial
Therefore, any polynomial which satisfies the constraint p(0) = 1 can be used to get an upper bound on ||r^(k+1)|| / ||r^0||

23 Eigenvalues and Eigenvectors Review: Basic Definitions
The eigenvalues λ_i and eigenvectors u_i of a matrix M satisfy M u_i = λ_i u_i
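A quick NumPy check of the definition (the example matrix is arbitrary):

```python
import numpy as np

M = np.array([[2.0, 1.0],
              [1.0, 3.0]])
lam, U = np.linalg.eig(M)
for i in range(len(lam)):
    # Each column U[:, i] is an eigenvector for eigenvalue lam[i]:
    # M u_i = lambda_i u_i.
    assert np.allclose(M @ U[:, i], lam[i] * U[:, i])
```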

24 Eigenvalues and Eigenvectors Review: A Simplifying Assumption
Almost all NxN matrices have N linearly independent eigenvectors
The set of all eigenvalues of M is known as the spectrum of M

25 Eigenvalues and Eigenvectors Review: A Simplifying Assumption
Almost all NxN matrices have N linearly independent eigenvectors; such matrices can be diagonalized as M = U Λ U^(-1), where the columns of U are the eigenvectors

26 Eigenvalues and Eigenvectors Review: Spectral Radius
The spectral radius of M is the radius of the smallest circle, centered at the origin, that encloses all of M's eigenvalues

27 Eigenvalues and Eigenvectors Review: Vector Norms
L2 (Euclidean) norm: ||x||_2 = (Σ_i |x_i|^2)^(1/2)
L1 norm: ||x||_1 = Σ_i |x_i|
L∞ norm: ||x||_∞ = max_i |x_i|
(Figure: the unit ball of the L2 norm is the unit circle; that of the L∞ norm is the unit square)
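The same norms in NumPy (illustrative values):

```python
import numpy as np

x = np.array([3.0, -4.0])
print(np.linalg.norm(x, 2))       # L2: sqrt(9 + 16) = 5.0
print(np.linalg.norm(x, 1))       # L1: |3| + |-4| = 7.0
print(np.linalg.norm(x, np.inf))  # Linf: max(|3|, |-4|) = 4.0
```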

28 Eigenvalues and Eigenvectors Review: Matrix Norms
Vector-induced norm: ||A|| = max over x ≠ 0 of ||Ax|| / ||x||
The induced norm of A is the maximum "magnification" of x by A
||A||_1 = max absolute column sum
||A||_∞ = max absolute row sum
||A||_2 = (largest eigenvalue of A^T A)^(1/2)
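These identities can be checked directly in NumPy (illustrative):

```python
import numpy as np

A = np.array([[1.0, -2.0],
              [3.0,  4.0]])
print(np.linalg.norm(A, 1))       # max absolute column sum = 6.0
print(np.linalg.norm(A, np.inf))  # max absolute row sum = 7.0
print(np.linalg.norm(A, 2))       # sqrt of largest eigenvalue of A^T A
```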

29 Eigenvalues and Eigenvectors Review: Induced Norms
Theorem: any induced norm is a bound on the spectral radius: ρ(M) ≤ ||M||
Proof: for any eigenpair M u_i = λ_i u_i, we have |λ_i| ||u_i|| = ||M u_i|| ≤ ||M|| ||u_i||, so |λ_i| ≤ ||M|| for every eigenvalue, and hence ρ(M) ≤ ||M||
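A numerical illustration of the theorem (the example matrix is arbitrary):

```python
import numpy as np

M = np.array([[0.5, 0.4],
              [0.1, 0.3]])
rho = max(abs(np.linalg.eigvals(M)))     # spectral radius
print(rho <= np.linalg.norm(M, 1))       # True: rho(M) <= ||M||_1
print(rho <= np.linalg.norm(M, np.inf))  # True: rho(M) <= ||M||_inf
```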

30 Useful Eigenproperties: Spectral Mapping Theorem
Given a polynomial f(x) = a_0 + a_1 x + ... + a_k x^k
Apply the polynomial to a matrix: f(M) = a_0 I + a_1 M + ... + a_k M^k
Then the eigenvalues of f(M) are f(λ_i), where the λ_i are the eigenvalues of M
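A small NumPy check of the theorem (the polynomial and matrix are arbitrary):

```python
import numpy as np

M = np.array([[2.0, 1.0],
              [0.0, 3.0]])
# f(x) = 1 - 2x + x^2 applied to M: f(M) = I - 2M + M^2.
fM = np.eye(2) - 2 * M + M @ M
lam = np.linalg.eigvals(M)
# Eigenvalues of f(M) match f(lambda_i) for each eigenvalue of M.
print(np.sort(np.linalg.eigvals(fM).real))  # [1. 4.]
print(np.sort((1 - 2 * lam + lam**2).real)) # [1. 4.]
```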

31 Krylov Methods Convergence Analysis: Overview
Any (k+1)-th order polynomial p_(k+1) subject to p_(k+1)(0) = 1 may be used to get an upper bound on ||r^(k+1)||:
– GCR optimality property: ||r^(k+1)|| ≤ ||p_(k+1)(M) r^0||
– Matrix norm property: ||p_(k+1)(M) r^0|| ≤ ||p_(k+1)(M)|| ||r^0||

32 Krylov Methods Convergence Analysis: Overview
Review on eigenvalues and eigenvectors:
– Induced norms: relate matrix eigenvalues to matrix norms
– Spectral mapping theorem: relates matrix eigenvalues to matrix polynomials
Now we are ready to relate the convergence properties of Krylov subspace methods to the eigenvalues of M

33 Summary
Generalized Conjugate Residual Algorithm
– Krylov-subspace
– Simplification in the symmetric case
– Convergence properties
Eigenvalue and Eigenvector Review
– Norms and Spectral Radius
– Spectral Mapping Theorem

