Lecture 13 - Eigen-analysis CVEN 302 July 1, 2002

Lecture’s Goals
– Shift Method
– Inverse Power Method
– Accelerated Power Method
– QR Factorization
– Householder
– Hessenberg Method

Shift method It is possible to obtain another eigenvalue from the set of equations by using a technique known as shifting the matrix. Subtract s·x from each side of A x = λ x, which gives (A - sI) x = (λ - s) x and thereby changes the maximum eigenvalue.

Shift method The shift s is the maximum eigenvalue of the matrix A (found with the Power method). The matrix is rewritten in the form [B] = [A] - s[I]. Use the Power method to obtain the largest eigenvalue of [B].
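A minimal MATLAB sketch of the shifted power iteration described above (the function name shift_power, the convergence test, and the normalization choice are illustrative, not taken from the course's code):

function [lambda, x] = shift_power(A, s, x0, maxit, tol)
% Power method applied to the shifted matrix [B] = [A] - s*I.
% s is assumed to be the dominant eigenvalue of A, already found.
B = A - s*eye(size(A,1));
x = x0 / norm(x0);
mu_old = 0;
for k = 1:maxit
    w = B * x;                          % multiply by the shifted matrix
    [~, idx] = max(abs(w));
    mu = w(idx);                        % estimate of the dominant eigenvalue of [B]
    x = w / mu;                         % normalize by the largest component
    if abs(mu - mu_old) < tol, break; end
    mu_old = mu;
end
lambda = mu + s;                        % shift back to get an eigenvalue of [A]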

Example of Shift Method Consider the following matrix A. Assume an arbitrary starting vector x_0 = {1 1 1}^T.

Example of Shift Method Multiply the matrix [A] by the vector {x}, then normalize the result of the product.

Example of Shift Method Continue with the iteration; the final value is λ_B = -5. However, to get the true eigenvalue you need to shift back by adding s:

Inverse Power Method The inverse power method is similar to the power method, except that it finds the smallest eigenvalue, using the following technique.

Inverse Power Method The algorithm is the same as the Power method, applied to A^-1, so the value it converges to is not the smallest eigenvalue itself but its reciprocal (the largest eigenvalue of A^-1). To obtain the smallest eigenvalue of A, take the reciprocal of the result from the power iteration.

Inverse Power Method In practice the inverse algorithm avoids calculating the inverse matrix: it uses an LU decomposition of A to solve for the new {x} vector at each iteration.
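A sketch of how the LU factorization replaces the explicit inverse in the iteration (inverse_power is an illustrative name; the course's InversePower routine may be organized differently):

function [lambda, x] = inverse_power(A, x0, maxit, tol)
% Power method applied to inv(A) without ever forming inv(A):
% factor A once (P*A = L*U), then solve A*w = x at every iteration.
[L, U, P] = lu(A);
x = x0 / norm(x0);
mu_old = 0;
for k = 1:maxit
    w = U \ (L \ (P * x));              % forward/back substitution instead of inv(A)*x
    [~, idx] = max(abs(w));
    mu = w(idx);                        % dominant eigenvalue estimate of inv(A)
    x = w / mu;
    if abs(mu - mu_old) < tol, break; end
    mu_old = mu;
end
lambda = 1 / mu;                        % smallest eigenvalue of A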

Example The matrix is defined as:

Matlab Program There is a set of programs, Power and InversePower. InversePower(A, x_0, iter, tol) performs the inverse power method.

Accelerated Power Method The Power method can be accelerated by using the Rayleigh Quotient instead of the largest w_k value. The Rayleigh Quotient is defined as λ = (x^T A x) / (x^T x).

Accelerated Power Method The value of the next z term is defined as: The Power method is adapted to use the new value.
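A sketch of the accelerated iteration, using the Rayleigh quotient as the eigenvalue estimate (rayleigh_power is an illustrative name; the exact update of the z term on the slide may differ from the normalization assumed here):

function lambda = rayleigh_power(A, x0, maxit, tol)
% Power method with the Rayleigh quotient (x'*A*x)/(x'*x) as the eigenvalue estimate.
x = x0 / norm(x0);
lambda_old = 0;
for k = 1:maxit
    w = A * x;
    lambda = (x' * w) / (x' * x);       % Rayleigh quotient estimate
    x = w / norm(w);                    % next iterate (2-norm normalization assumed)
    if abs(lambda - lambda_old) < tol, break; end
    lambda_old = lambda;
end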

Example of Accelerated Power Method Consider the following matrix A. Assume an arbitrary starting vector x_0 = {1 1 1}^T.

Example of Accelerated Power Method Multiply the matrix [A] by the vector {x}.

Example of Accelerated Power Method Multiply the matrix [A] by the new vector {x} and repeat.

Example of Accelerated Power Method

And so on...

QR Factorization The technique can be used to find the eigenvalues by successive iteration, using Householder transformations to produce a matrix similar to [A] that has the eigenvalues on its diagonal.

QR Factorization Another form of factorization: A = Q*R. It produces an orthogonal matrix ("Q") and a right (upper) triangular matrix ("R"). Orthogonal matrix: its inverse is its transpose.

QR Factorization Why do we care? We can use Q and R to find eigenvalues:
1. Get Q and R (A = Q*R)
2. Let A = R*Q
3. Diagonal elements of A are eigenvalue approximations
4. Iterate until converged
Note: the QR eigenvalue method gives all eigenvalues simultaneously, not just the dominant one.
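A minimal MATLAB sketch of these four steps, using the built-in qr and a small symmetric test matrix chosen only for illustration (not the matrix from the lecture):

A = [4 1 0; 1 3 1; 0 1 2];              % illustrative test matrix
for k = 1:50                            % 4. iterate (fixed sweep count for the sketch)
    [Q, R] = qr(A);                     % 1. get Q and R  (A = Q*R)
    A = R * Q;                          % 2. let A = R*Q (similar to the old A)
end
disp(diag(A))                           % 3. diagonal elements approximate all eigenvalues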

QR Eigenvalue Method In practice, QR factorization of an arbitrary matrix requires a number of steps. First transform A into Hessenberg form (a Hessenberg matrix is upper triangular plus the first sub-diagonal). Special properties of the Hessenberg matrix make it easier to find Q, R, and the eigenvalues.

QR Factorization Construction of QR Factorization

QR Factorization Use Householder reflections and Givens rotations to reduce selected elements of a vector to zero. Use the QR factorization in similarity transformations that preserve the eigenvalues; the eigenvalues of the transformed matrix are much easier to obtain.

Schur Form Any square matrix is unitarily similar to an upper triangular matrix with its eigenvalues on the diagonal; for a real matrix with real eigenvalues, the similarity can be chosen orthogonal.

Similarity Transformation A transformation of the matrix A of the form H^-1 A H is known as a similarity transformation. A real matrix Q is orthogonal if Q^T Q = I. If Q is orthogonal, then A and Q^-1 A Q are said to be orthogonally similar. The eigenvalues are preserved under a similarity transformation.
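A two-line MATLAB check of this property, with an arbitrary illustrative matrix A and nonsingular H (not from the lecture):

A = [2 1; 1 3];
H = [1 2; 0 1];                         % any nonsingular matrix
disp(sort(eig(A)))                      % eigenvalues of A
disp(sort(eig(H \ A * H)))              % identical eigenvalues of H^-1*A*H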

Upper Triangular Matrix The diagonal elements R_ii of the upper triangular matrix R are the eigenvalues.

Householder Reflector A Householder reflector is a matrix of the form Q = I - 2 v v^T / (v^T v). It is straightforward to verify that Q is symmetric and orthogonal.

Householder Matrix The Householder matrix reduces z_(k+1), …, z_n to zero. To achieve this operation, v must be a linear combination of x and e_k.

Householder Transformation

Householder Matrix Corollary (k-th Householder matrix): Let A be an n×n matrix and x any vector. If k is an integer with 1 ≤ k ≤ n-1, we can construct a vector w^(k) and a matrix H^(k) = I - 2 w^(k) (w^(k))^T so that

Householder Matrix Define the value α so that: The vector w is found by: Choose α = sign(x_k) g to reduce round-off error.
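A sketch of a standard construction of the k-th Householder matrix (the name householder_k is illustrative, and the sign and indexing conventions here may differ from the slide's; the sign of α is chosen so the subtraction in v(k) avoids cancellation):

function H = householder_k(x, k)
% Build H = I - 2*v*v'/(v'*v) so that H*x has zeros in positions k+1..n.
n = length(x);
g = norm(x(k:n));
s = sign(x(k));
if s == 0, s = 1; end                   % guard for x(k) = 0
alpha = -s * g;                         % sign chosen to reduce round-off error
v = zeros(n, 1);
v(k:n) = x(k:n);
v(k) = v(k) - alpha;                    % v = (entries k..n of x) minus alpha*e_k
H = eye(n) - 2 * (v * v') / (v' * v);   % symmetric, orthogonal reflector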

Householder Matrices

Example: Householder Matrix

Basic QR Factorization [A] = [Q][R], where [Q] is orthogonal (Q^T Q = I) and [R] is upper triangular. QR factorization using Householder matrices: Q = H^(1) H^(2) … H^(n-1).
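A sketch of the factorization built this way, accumulating the reflectors from the householder_k sketch above (qr_householder is an illustrative name, not the course's QR_factor routine):

function [Q, R] = qr_householder(A)
% QR factorization of a square matrix by Householder reflectors.
n = size(A, 1);
Q = eye(n);
R = A;
for k = 1:n-1
    H = householder_k(R(:,k), k);       % zero out R(k+1:n, k)
    R = H * R;                          % R becomes upper triangular column by column
    Q = Q * H;                          % Q = H(1)*H(2)*...*H(n-1)
end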

Example: QR Factorization

QR Factorization The similarity transformation B = Q^T A Q preserves the eigenvalues; with the factorization QR = A this gives B = Q^T (QR) Q = RQ.

Finding Eigenvalues Using QR Factorization Generate a sequence of matrices A^(m) that are orthogonally similar to A. Under these Householder-based similarity transformations H^-1 A H, the iterates converge to an upper triangular matrix with the eigenvalues on the diagonal. This finds all eigenvalues simultaneously!

QR Eigenvalue Method QR factorization: A = QR. Similarity transformation: A^(new) = RQ.

Example: QR Eigenvalue

MATLAB Example: QR factorization and eigenvalues (numeric output appeared on the original slide)
» A = [1 2 -1; … ; … ]
» [Q,R] = QR_factor(A)
» e = QR_eig(A,6);

Improved QR Method Use a similarity transformation to form an upper Hessenberg matrix (upper triangular plus one nonzero band below the diagonal). It is more efficient to form the Hessenberg matrix without explicitly forming the Householder matrices (not given in the textbook).
function A = Hessenberg(A)
% Reduce A to upper Hessenberg form by explicit Householder similarity transformations.
[n,nn] = size(A);
for k = 1:n-2
    H = Householder(A(:,k), k+1);       % reflector that zeros A(k+2:n, k)
    A = H*A*H;                          % similarity transformation preserves the eigenvalues
end
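A possible check of the routine above, assuming the course's Householder(v,k) function is on the MATLAB path and using an arbitrary symmetric test matrix (not from the lecture); the built-in hess(A) computes its own orthogonally similar Hessenberg form and is shown only for comparison:

A = [4 1 -2 2; 1 2 0 1; -2 0 3 -2; 2 1 -2 -1];   % illustrative test matrix
B = Hessenberg(A);                      % routine above
disp(tril(B, -2))                       % entries below the first sub-diagonal should be ~0
disp(sort(eig(B)) - sort(eig(A)))       % eigenvalues preserved by the similarity
disp(hess(A))                           % built-in Hessenberg form, for comparison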

Improved QR Method MATLAB example: Hessenberg matrix, QR eigenvalue function, and eig comparison (numeric output appeared on the original slide)
» A = [1 2 -1; … ; … ]
» [Q,R] = QR_factor_g(A)
» e = QR_eig_g(A,6);
» eig(A)

Summary
Single-value eigen-analysis
– Power Method
– Shifting technique
– Inverse Power Method
QR Factorization
– Householder matrix
– Hessenberg matrix

Homework Check the Homework webpage