
Krylov-Subspace Methods - II Lecture 7 Alessandra Nardi Thanks to Prof. Jacob White, Deepak Ramaswamy, Michal Rewienski, and Karen Veroy.


1 Krylov-Subspace Methods - II Lecture 7 Alessandra Nardi Thanks to Prof. Jacob White, Deepak Ramaswamy, Michal Rewienski, and Karen Veroy

2 Last Lectures Review
Overview of iterative methods to solve Mx = b
– Stationary
– Non-stationary
QR factorization
– Modified Gram-Schmidt algorithm
– Minimization view of QR
General subspace minimization algorithm
Generalized Conjugate Residual (GCR) algorithm
– Krylov subspace
– Simplification in the symmetric case
– Convergence properties
Eigenvalue and eigenvector review
– Norms and spectral radius
– Spectral mapping theorem

3 Arbitrary Subspace Methods: Residual Minimization
Given search directions $w_1, \dots, w_k$, pick $x^k \in \operatorname{span}\{w_1, \dots, w_k\}$ so as to minimize the residual $\|r^k\| = \|b - M x^k\|$.

4 Arbitrary Subspace Methods: Residual Minimization
Use Gram-Schmidt on the $M w_i$'s!

5 Krylov Subspace Methods: Krylov Subspace
The k-th order Krylov subspace is $K^k(M, r^0) = \operatorname{span}\{r^0, M r^0, \dots, M^{k-1} r^0\}$. With $x^k \in K^k$, the residual is $r^k = p_k(M)\, r^0$, where $p_k$ is a k-th order polynomial with $p_k(0) = 1$.
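
As a concrete illustration (not part of the original slides), here is a minimal NumPy sketch that builds an orthonormal basis for $K^k(M, r^0)$ by modified Gram-Schmidt; the function name krylov_basis is ours.

```python
import numpy as np

def krylov_basis(M, r0, k):
    """Orthonormal basis for span{r0, M r0, ..., M^(k-1) r0} via modified Gram-Schmidt."""
    Q = np.zeros((len(r0), k))
    q = r0 / np.linalg.norm(r0)
    for j in range(k):
        Q[:, j] = q
        q = M @ q                      # next Krylov direction
        for i in range(j + 1):         # orthogonalize against previous basis vectors
            q -= (Q[:, i] @ q) * Q[:, i]
        q /= np.linalg.norm(q)
    return Q
```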

6 Krylov Subspace Methods: Subspace Generation
The set of residuals can also be used as a representation of the Krylov subspace.
Generalized Conjugate Residual (GCR) algorithm: nice because the residuals generate the next search directions.

7 Krylov-Subspace Methods: Generalized Conjugate Residual Method (k-th step)
Determine the optimal step size in the k-th search direction:
$\alpha_k = \dfrac{(r^k)^T M p_k}{(M p_k)^T M p_k}$
Update the solution (trying to minimize the residual) and the residual:
$x^{k+1} = x^k + \alpha_k p_k, \qquad r^{k+1} = r^k - \alpha_k M p_k$
Compute the new orthogonalized search direction (using the most recent residual):
$p_{k+1} = r^{k+1} - \sum_{j \le k} \dfrac{(M r^{k+1})^T M p_j}{(M p_j)^T M p_j}\, p_j$
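
A minimal NumPy sketch of these three steps (our own illustration; the function name gcr and all variable names are ours, not from the slides):

```python
import numpy as np

def gcr(M, b, tol=1e-10, max_iter=None):
    """Generalized Conjugate Residual: minimize ||b - M x|| over the Krylov subspace."""
    n = len(b)
    max_iter = max_iter or n
    x = np.zeros(n)
    r = b.copy()
    P, MP = [], []                       # search directions p_j and their images M p_j
    for k in range(max_iter):
        p = r.copy()
        Mp = M @ p
        for pj, Mpj in zip(P, MP):       # orthogonalize M p against previous M p_j
            beta = (Mp @ Mpj) / (Mpj @ Mpj)
            p -= beta * pj
            Mp -= beta * Mpj
        alpha = (r @ Mp) / (Mp @ Mp)     # optimal step in direction p
        x += alpha * p                   # update the solution ...
        r -= alpha * Mp                  # ... and the residual
        P.append(p); MP.append(Mp)
        if np.linalg.norm(r) < tol:
            break
    return x, k + 1
```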

8 Krylov-Subspace Methods: Generalized Conjugate Residual Method (computational complexity of the k-th step)
Vector inner products: O(n)
Matrix-vector product: O(n) if M is sparse
Vector adds: O(n)
Orthogonalization: O(k) inner products, so the total cost of the k-th step is O(nk)
Even if M is sparse, as k (the number of iterations) approaches n the cumulative cost approaches O(n^3). Better converge fast!
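
Summing this per-step cost over k iterations (a spelled-out step, not on the original slide) gives the O(k^2 n) total quoted on the next slide:

```latex
\sum_{j=1}^{k} O(nj) \;=\; O\!\Big(n \sum_{j=1}^{k} j\Big) \;=\; O\!\Big(n\,\tfrac{k(k+1)}{2}\Big) \;=\; O(k^2 n),
\qquad \text{so } k \to n \implies O(n^3).
```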

9 Summary
What an iterative non-stationary method is: $x^{(k+1)} = x^{(k)} + \alpha_k p_k$
How to calculate:
– the search directions ($p_k$)
– the step along each search direction ($\alpha_k$)
Krylov subspace → GCR
GCR is O(k^2 n) – better converge fast!
→ Now look at the convergence properties of GCR.

10 Krylov Methods Convergence Analysis Basic properties

11 Krylov Methods Convergence Analysis: Optimality of the GCR Polynomial
GCR optimality property: GCR picks the residual-minimizing polynomial, so $\|r^{k+1}\| \le \|p(M)\, r^0\|$ for any (k+1)-th order polynomial $p$ with $p(0) = 1$.
Therefore any polynomial which satisfies the constraints can be used to get an upper bound on $\|r^{k+1}\|$.

12 Eigenvalues and Eigenvectors Review: Induced Norms
Theorem: Any induced norm is a bound on the spectral radius: $\rho(A) \le \|A\|$.
Proof: if $Au = \lambda u$ with $u \ne 0$, then $\|A\| = \max_{x \ne 0} \|Ax\| / \|x\| \ge \|Au\| / \|u\| = |\lambda|$.
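
A quick numerical check of the theorem (our illustration, not from the slides), using the induced 2-norm:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((5, 5))          # generic nonsymmetric matrix
spectral_radius = max(abs(np.linalg.eigvals(A)))
induced_2norm = np.linalg.norm(A, 2)     # largest singular value
print(spectral_radius, "<=", induced_2norm)
assert spectral_radius <= induced_2norm + 1e-12
```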

13 Useful Eigenproperties: Spectral Mapping Theorem
Given a polynomial $f(x) = a_0 + a_1 x + \dots + a_k x^k$,
apply the polynomial to a matrix: $f(M) = a_0 I + a_1 M + \dots + a_k M^k$.
Then $\operatorname{spectrum}(f(M)) = f(\operatorname{spectrum}(M))$: the eigenvalues of $f(M)$ are $f(\lambda_i)$, with the same eigenvectors as M.
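
A small numerical check of the theorem (our illustration): the eigenvalues of $f(M)$ should equal $f$ applied to the eigenvalues of M.

```python
import numpy as np

rng = np.random.default_rng(1)
M = rng.standard_normal((4, 4))
lam = np.linalg.eigvals(M)

# f(x) = 2x^2 - 3x + 5, applied to the matrix and to the eigenvalues
fM = 2 * np.linalg.matrix_power(M, 2) - 3 * M + 5 * np.eye(4)
lhs = np.sort_complex(np.linalg.eigvals(fM))
rhs = np.sort_complex(2 * lam**2 - 3 * lam + 5)
print(np.allclose(lhs, rhs))   # True: eigenvalues of f(M) are f(eigenvalues of M)
```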

14 Krylov Methods Convergence Analysis: Overview
$\|r^{k+1}\| \le \|p(M)\|\, \|r^0\|$, where $p$ is any (k+1)-th order polynomial subject to $p(0) = 1$, may be used to get an upper bound on $\|r^{k+1}\|$.
(Matrix norm property + GCR optimality property.)

15 Krylov Methods Convergence Analysis: Overview
Review of eigenvalues and eigenvectors:
– Induced norms: relate matrix eigenvalues to matrix norms
– Spectral mapping theorem: relates matrix eigenvalues to matrix polynomials
Now ready to relate the convergence properties of Krylov-subspace methods to the eigenvalues of M.

16 Krylov Methods Convergence Analysis: Norm of Matrix Polynomials
If $M = V \Lambda V^{-1}$ is diagonalizable, then $p(M) = V p(\Lambda) V^{-1}$, so
$\|p(M)\| \le \|V\|\, \|V^{-1}\| \max_i |p(\lambda_i)| = \operatorname{cond}(V)\, \max_i |p(\lambda_i)|$.
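
A numerical sanity check of this bound (our illustration), with a diagonalizable matrix, the 2-norm, and an arbitrary test polynomial:

```python
import numpy as np

rng = np.random.default_rng(2)
M = rng.standard_normal((6, 6))
lam, V = np.linalg.eig(M)

# test polynomial p(x) = x^3 - 2x + 1 applied to M and to the eigenvalues
pM = np.linalg.matrix_power(M, 3) - 2 * M + np.eye(6)
lhs = np.linalg.norm(pM, 2)
rhs = np.linalg.cond(V) * np.max(np.abs(lam**3 - 2 * lam + 1))
print(lhs <= rhs, lhs, rhs)    # ||p(M)|| <= cond(V) * max|p(lambda_i)|
```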

18 Krylov Methods Convergence Analysis: Important Observations
1) The GCR algorithm converges to the exact solution in at most n steps.
2) If M has only q distinct eigenvalues, the GCR algorithm converges in at most q steps.
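
Observation 2 can be seen numerically with the gcr sketch shown after slide 7 (our illustration): a diagonalizable matrix with q distinct eigenvalues is solved in q iterations, up to roundoff.

```python
import numpy as np

# Matrix with q = 3 distinct eigenvalues {1, 2, 3}
rng = np.random.default_rng(3)
V = rng.standard_normal((8, 8))
M = V @ np.diag([1.0, 1, 1, 2, 2, 2, 3, 3]) @ np.linalg.inv(V)
b = rng.standard_normal(8)

x, iters = gcr(M, b)                 # the gcr sketch from slide 7
print(iters)                         # 3: one step per distinct eigenvalue
print(np.linalg.norm(b - M @ x))     # ~0
```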

19 Krylov Methods Convergence Analysis: Convergence for $M^T = M$ – Residual Polynomial
If $M = M^T$ then:
1) M has orthonormal eigenvectors
2) M has real eigenvalues
Hence $\operatorname{cond}(V) = 1$ and $\|r^{k+1}\| \le \max_i |p(\lambda_i)|\, \|r^0\|$.
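
A quick NumPy check of both facts for a symmetric matrix (our illustration):

```python
import numpy as np

rng = np.random.default_rng(4)
A = rng.standard_normal((5, 5))
M = A + A.T                              # symmetric by construction
lam, V = np.linalg.eig(M)

print(np.allclose(lam.imag, 0))          # True: real eigenvalues
print(np.allclose(V.T @ V, np.eye(5)))   # True: orthonormal eigenvectors (cond(V) = 1)
```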

20 Krylov Methods Convergence Analysis: Residual Polynomial Picture (n = 10)
[Figure: residual polynomials plotted over the spectrum. Legend: * = eigenvalues of M; one curve = 5th-order polynomial; other curve = 8th-order polynomial.]

21 Krylov Methods Convergence Analysis: Residual Polynomial Picture (n = 10)
Strategically place the zeros of the polynomial.

22 Krylov Methods Convergence Analysis: Convergence for $M^T = M$ – Polynomial Min-Max Problem
The residual bound leads to the min-max problem
$\min_{p,\; p(0)=1}\; \max_{\lambda \in [\lambda_{\min},\, \lambda_{\max}]} |p(\lambda)|$.

23 Krylov Methods Convergence Analysis: Convergence for $M^T = M$ – Chebyshev Solves Min-Max
The solution of the min-max problem is the shifted and scaled Chebyshev polynomial:
$p(\lambda) = \dfrac{T_k\!\left(\frac{\lambda_{\max} + \lambda_{\min} - 2\lambda}{\lambda_{\max} - \lambda_{\min}}\right)}{T_k\!\left(\frac{\lambda_{\max} + \lambda_{\min}}{\lambda_{\max} - \lambda_{\min}}\right)}$

24 Chebyshev polynomials minimizing over [1, 10]
[Figure: scaled Chebyshev polynomials of increasing order over the interval [1, 10].]
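
A sketch (our illustration) evaluating the scaled Chebyshev polynomial from the previous slide on [1, 10], showing how its maximum on the interval shrinks with the order k while keeping p(0) = 1:

```python
import numpy as np

def residual_cheb(lam, k, lmin=1.0, lmax=10.0):
    """Shifted, scaled Chebyshev polynomial with p(0) = 1, small on [lmin, lmax]."""
    t = (lmax + lmin - 2 * lam) / (lmax - lmin)
    t0 = (lmax + lmin) / (lmax - lmin)
    Tk = lambda x: np.polynomial.chebyshev.chebval(x, [0] * k + [1])  # T_k
    return Tk(t) / Tk(t0)

lam = np.linspace(1, 10, 1000)
for k in (2, 5, 8):
    print(k, np.max(np.abs(residual_cheb(lam, k))))   # max shrinks as k grows
print(residual_cheb(0.0, 5))                          # p(0) = 1 by construction
```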

25 Krylov Methods Convergence Analysis: Convergence for $M^T = M$ – Chebyshev Bounds
$\dfrac{\|r^k\|}{\|r^0\|} \;\le\; 2\left(\dfrac{\sqrt{\lambda_{\max}/\lambda_{\min}} - 1}{\sqrt{\lambda_{\max}/\lambda_{\min}} + 1}\right)^{k}$

26 Krylov Methods Convergence Analysis: Convergence for $M^T = M$ – Chebyshev Result
The convergence rate is governed by the condition number $\kappa = \lambda_{\max}/\lambda_{\min}$: the residual shrinks by roughly $\frac{\sqrt{\kappa}-1}{\sqrt{\kappa}+1}$ per iteration, so clustered eigenvalues (small $\kappa$) mean fast convergence.
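
A small helper (our illustration) turning the Chebyshev bound into an iteration estimate; the iteration count grows like $\sqrt{\kappa}$:

```python
import numpy as np

def cheb_iters(kappa, eps=1e-6):
    """Smallest k with 2 * ((sqrt(kappa)-1)/(sqrt(kappa)+1))**k <= eps."""
    rho = (np.sqrt(kappa) - 1) / (np.sqrt(kappa) + 1)
    return int(np.ceil(np.log(2 / eps) / np.log(1 / rho)))

for kappa in (10, 100, 1000):
    print(kappa, cheb_iters(kappa))   # iterations grow roughly like sqrt(kappa)
```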

27 Krylov Methods Convergence Analysis: Examples
For which problem will GCR converge faster?

28 Which convergence curve is GCR?
[Figure: convergence curves plotted against iteration count.]

29 Krylov Methods Convergence Analysis: Chebyshev Is a Bound
The GCR algorithm can eliminate outlying eigenvalues by placing polynomial zeros directly on them.

30 Iterative Methods – CG
Convergence is related to:
– the number of distinct eigenvalues
– the ratio between the max and min eigenvalues
Why? How? Now we know.

31 Summary
Reminder about GCR:
– residual-minimizing solution
– Krylov subspace
– polynomial connection
Review of eigenvalues:
– induced norms bound the spectral radius
– spectral mapping theorem
Estimating the convergence rate:
– Chebyshev polynomials

