Numerical Analysis – Linear Equations (I)
Hanyang University, Jong-Il Park
Linear equations
A system of M equations in N unknowns, written in matrix form as Ax = b, where A is the M x N coefficient matrix, x the vector of unknowns, and b the right-hand-side vector.
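Written out in full, the system reads:

```latex
\begin{aligned}
a_{11}x_1 + a_{12}x_2 + \cdots + a_{1N}x_N &= b_1\\
a_{21}x_1 + a_{22}x_2 + \cdots + a_{2N}x_N &= b_2\\
&\;\;\vdots\\
a_{M1}x_1 + a_{M2}x_2 + \cdots + a_{MN}x_N &= b_M
\end{aligned}
\qquad\Longleftrightarrow\qquad
A\mathbf{x} = \mathbf{b},\quad A = [a_{ij}] \in \mathbb{R}^{M\times N}
```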
Solving methods
- Direct methods: Gauss elimination, Gauss-Jordan elimination, LU decomposition, singular value decomposition, …
- Iterative methods: Jacobi iteration, Gauss-Seidel iteration, …
Basic properties of matrices (I)
Definitions:
- element, row, column
- row matrix, column matrix
- square matrix; order = M x N (M rows, N columns)
- diagonal matrix; identity matrix I
- upper/lower triangular matrix; tri-diagonal matrix
- transposed matrix A^T; symmetric matrix: A = A^T
- orthogonal matrix: A^T A = I
Basic properties of matrices (II)
- Diagonal dominance
- Transpose facts
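The standard definitions behind these two bullets, for an n x n matrix A = [a_ij]:

```latex
% strict diagonal dominance
|a_{ii}| \;>\; \sum_{j \ne i} |a_{ij}| \qquad \text{for every row } i
% common transpose facts
(A^T)^T = A, \qquad (A + B)^T = A^T + B^T, \qquad (AB)^T = B^T A^T
```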
Basic properties of matrices (III)
Matrix multiplication
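As a concrete reference for the definition c_ij = sum_k a_ik b_kj, a minimal C sketch (the function name and the flat row-major storage are illustrative choices, not from the slides):

```c
#include <stddef.h>

/* C = A*B, where A is m x p, B is p x n, C is m x n.
   Matrices are stored row-major in flat arrays (0-indexed). */
void mat_mul(const double *A, const double *B, double *C,
             size_t m, size_t p, size_t n)
{
    for (size_t i = 0; i < m; i++) {
        for (size_t j = 0; j < n; j++) {
            double sum = 0.0;
            for (size_t k = 0; k < p; k++)
                sum += A[i*p + k] * B[k*n + j];   /* c_ij = sum_k a_ik * b_kj */
            C[i*n + j] = sum;
        }
    }
}
```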
Determinant
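The standard definition via cofactor (Laplace) expansion along a fixed row i of an n x n matrix A:

```latex
\det A = \sum_{j=1}^{n} a_{ij}\,C_{ij}, \qquad C_{ij} = (-1)^{i+j} M_{ij}
```

where M_ij is the minor obtained by deleting row i and column j of A.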
Determinant facts (I)
Determinant facts (II)
Geometrical interpretation of determinant
Over-determined / under-determined problems
- Over-determined problem (m > n, more equations than unknowns): least-squares estimation, robust estimation, etc.
- Under-determined problem (m < n, fewer equations than unknowns): singular value decomposition
Augmented matrix
[A | b]: the coefficient matrix A with the right-hand side b appended as an extra column.
Cramer’s rule
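The standard statement of the rule for a nonsingular n x n system Ax = b:

```latex
x_i = \frac{\det A_i}{\det A}, \qquad i = 1, \dots, n
```

where A_i is A with its i-th column replaced by b. It is mainly of theoretical interest: evaluating n+1 determinants is far more expensive than elimination.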
Triangular coefficient matrix
Substitution
- Upper triangular matrix: backward substitution
- Lower triangular matrix: forward substitution
(Both substitutions are sketched in C below.)
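A minimal C sketch of both substitutions (0-indexed, row-major storage; function names are illustrative). It assumes the triangular matrices have nonzero diagonal entries:

```c
#include <stddef.h>

/* Solve U x = b for upper-triangular U (n x n, row-major): backward substitution. */
void back_substitute(const double *U, const double *b, double *x, size_t n)
{
    for (size_t ii = 0; ii < n; ii++) {
        size_t i = n - 1 - ii;              /* i runs n-1, n-2, ..., 0 */
        double sum = b[i];
        for (size_t j = i + 1; j < n; j++)
            sum -= U[i*n + j] * x[j];
        x[i] = sum / U[i*n + i];
    }
}

/* Solve L x = b for lower-triangular L: forward substitution. */
void forward_substitute(const double *L, const double *b, double *x, size_t n)
{
    for (size_t i = 0; i < n; i++) {
        double sum = b[i];
        for (size_t j = 0; j < i; j++)
            sum -= L[i*n + j] * x[j];
        x[i] = sum / L[i*n + i];
    }
}
```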
Gauss elimination
Step 1: Gauss reduction (forward elimination): the coefficient matrix is transformed into an upper triangular matrix
Step 2: Backward substitution
(A C sketch of the procedure follows.)
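A minimal C sketch of Step 1 without any pivoting (pivoting is added a few slides later); the function name and storage layout are illustrative, and a nonzero pivot a_kk is assumed at every step:

```c
#include <stddef.h>

/* Step 1: forward elimination. Reduces A (n x n, row-major) to upper
   triangular form in place, applying the same row operations to b.
   Assumes every pivot a[k][k] is nonzero (no pivoting here). */
void forward_eliminate(double *A, double *b, size_t n)
{
    for (size_t k = 0; k < n - 1; k++) {          /* pivot column */
        for (size_t i = k + 1; i < n; i++) {      /* rows below the pivot */
            double factor = A[i*n + k] / A[k*n + k];
            for (size_t j = k; j < n; j++)
                A[i*n + j] -= factor * A[k*n + j];
            b[i] -= factor * b[k];
        }
    }
}

/* Step 2: backward substitution on the resulting upper-triangular system,
   e.g. with back_substitute() from the previous sketch:
   forward_eliminate(A, b, n);  back_substitute(A, b, x, n); */
```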
Gauss reduction
Example: Gauss elimination (I)
Example: Gauss elimination (II)
Troubles in Gauss elimination
- Harmful effect of round-off error when the pivot coefficient is small
- Remedy: pivoting strategy
Example: Trouble (I)
Example: Trouble (II)
Pivoting strategy
Partial pivoting: at elimination step k, determine the smallest p >= k such that |a_pk| = max over k <= i <= n of |a_ik|, and perform the row interchange E_p <-> E_k.
Partial pivoting gives a dramatic enhancement in accuracy! (Sketched in C below.)
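A minimal C sketch of forward elimination with the partial-pivoting row interchange added (an extension of the earlier forward_eliminate sketch; names and storage remain illustrative):

```c
#include <math.h>
#include <stddef.h>

/* Forward elimination with partial pivoting: before eliminating column k,
   swap in the row (at or below k) whose entry in column k has the largest
   absolute value. A is n x n, row-major; b is the right-hand side. */
void forward_eliminate_pp(double *A, double *b, size_t n)
{
    for (size_t k = 0; k < n - 1; k++) {
        /* choose the pivot row p = argmax_{i >= k} |a[i][k]| */
        size_t p = k;
        for (size_t i = k + 1; i < n; i++)
            if (fabs(A[i*n + k]) > fabs(A[p*n + k]))
                p = i;
        if (p != k) {                         /* row interchange E_p <-> E_k */
            for (size_t j = 0; j < n; j++) {
                double t = A[k*n + j]; A[k*n + j] = A[p*n + j]; A[p*n + j] = t;
            }
            double t = b[k]; b[k] = b[p]; b[p] = t;
        }
        for (size_t i = k + 1; i < n; i++) {  /* eliminate below the pivot */
            double factor = A[i*n + k] / A[k*n + k];
            for (size_t j = k; j < n; j++)
                A[i*n + j] -= factor * A[k*n + j];
            b[i] -= factor * b[k];
        }
    }
}
```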
Effect of partial pivoting
Scaled partial pivoting
Scaling ensures that the largest element in each row has a relative magnitude of 1 before the comparison for row interchange is performed. (The pivot selection is sketched in C below.)
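A minimal C sketch of the scaled pivot selection only; the elimination step itself is unchanged. The scale factors s_i = max_j |a_ij| are computed once from the original rows (function names are illustrative):

```c
#include <math.h>
#include <stddef.h>

/* Scale factors: s[i] = max_j |a[i][j]| (assumed nonzero, i.e. no zero row). */
void scale_factors(const double *A, double *s, size_t n)
{
    for (size_t i = 0; i < n; i++) {
        s[i] = 0.0;
        for (size_t j = 0; j < n; j++)
            if (fabs(A[i*n + j]) > s[i]) s[i] = fabs(A[i*n + j]);
    }
}

/* Scaled partial pivoting: pick the pivot row by comparing |a[i][k]| / s[i]
   instead of |a[i][k]| alone. Returns the chosen row index p >= k. */
size_t choose_pivot_scaled(const double *A, const double *s, size_t n, size_t k)
{
    size_t p = k;
    double best = fabs(A[k*n + k]) / s[k];
    for (size_t i = k + 1; i < n; i++) {
        double r = fabs(A[i*n + k]) / s[i];
        if (r > best) { best = r; p = i; }
    }
    return p;
}
```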
Example: Effect of scaling
Complexity of Gauss elimination
The operation count grows on the order of n^3: too much for large systems!
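The standard operation count for an n x n system, counting multiplications/divisions:

```latex
\underbrace{\tfrac{n^3}{3} + O(n^2)}_{\text{forward elimination}}
\;+\;
\underbrace{\tfrac{n^2}{2} + O(n)}_{\text{backward substitution}}
\;\approx\; \tfrac{n^3}{3}
```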
Summary: Gauss elimination
1) Scale (normalize) each row of the augmented matrix so that its largest element becomes 1
2) Partial pivoting so that the largest element of the first column is moved to the pivot row
3) Eliminate so that the first-column entries of the second row and below all become 0
4) Repeat 1)-3) for rows 2 through n
5) Obtain the solution by backward substitution
Gauss-Jordan elimination
Example: Obtaining the inverse matrix (I)
Example: Obtaining the inverse matrix (II)
Backward substitution for each column (of the identity matrix on the right-hand side)
LU decomposition
Principle: solve a set of linear equations by decomposing the given coefficient matrix into a product of a lower triangular matrix L and an upper triangular matrix U, A = LU.
Then Ax = b becomes LUx = b. Writing
  Ux = c   (1)
  Lc = b   (2)
and solving equations (2) and (1) successively, we get the solution x.
Various LU decompositions
- Doolittle decomposition: the diagonal elements of L are all set to 1
- Crout decomposition: the diagonal elements of U are all set to 1
- Cholesky decomposition: the diagonal elements of L and U are made equal; suitable for symmetric, positive-definite matrices
Crout decomposition
Implementation of Crout method
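A minimal self-contained C sketch of the Crout factorization A = LU with a unit diagonal on U, without pivoting (the function name and flat row-major storage are illustrative assumptions):

```c
#include <stddef.h>

/* Crout decomposition A = L*U, where L is lower triangular and U is upper
   triangular with unit diagonal (u_kk = 1). All matrices are n x n,
   row-major, 0-indexed. No pivoting; fails if a zero pivot appears. */
int crout(const double *A, double *L, double *U, size_t n)
{
    for (size_t i = 0; i < n * n; i++) { L[i] = 0.0; U[i] = 0.0; }

    for (size_t k = 0; k < n; k++) {
        /* column k of L: l_ik = a_ik - sum_{m<k} l_im * u_mk */
        for (size_t i = k; i < n; i++) {
            double sum = A[i*n + k];
            for (size_t m = 0; m < k; m++)
                sum -= L[i*n + m] * U[m*n + k];
            L[i*n + k] = sum;
        }
        if (L[k*n + k] == 0.0)
            return -1;                 /* zero pivot: pivoting would be needed */

        /* row k of U: u_kk = 1, u_kj = (a_kj - sum_{m<k} l_km * u_mj) / l_kk */
        U[k*n + k] = 1.0;
        for (size_t j = k + 1; j < n; j++) {
            double sum = A[k*n + j];
            for (size_t m = 0; m < k; m++)
                sum -= L[k*n + m] * U[m*n + j];
            U[k*n + j] = sum / L[k*n + k];
        }
    }
    return 0;
}
```

Once L and U are available, Ax = b is solved by the two triangular substitutions sketched earlier: Lc = b by forward substitution, then Ux = c by backward substitution.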
Programming using NR in C (I)
Solving a set of linear equations
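A sketch of the usual calling sequence, assuming the standard Numerical Recipes in C routines ludcmp()/lubksb() and the nrutil.h allocators ivector()/free_ivector() (NR arrays are 1-indexed floats); the wrapper name is illustrative:

```c
#include "nr.h"        /* ludcmp(), lubksb() */
#include "nrutil.h"    /* ivector(), free_ivector() */

/* Solve A x = b for one right-hand side with the NR LU routines.
   NR matrices/vectors are 1-indexed: a[1..n][1..n], b[1..n]. */
void solve_linear_system(float **a, float b[], int n)
{
    int *indx = ivector(1, n);    /* row-permutation record from pivoting */
    float d;                      /* +1/-1 depending on the number of row swaps */

    ludcmp(a, n, indx, &d);       /* LU-decompose a in place */
    lubksb(a, n, indx, b);        /* b is overwritten with the solution x */

    free_ivector(indx, 1, n);
}
```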
Programming using NR in C (II)
Obtaining the inverse matrix
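Under the same assumptions about the NR routines, the inverse is obtained column by column by solving A y_j = e_j for each unit vector e_j; the wrapper name is illustrative:

```c
#include "nr.h"
#include "nrutil.h"

/* Compute y = a^{-1} with the NR LU routines (a and y are 1-indexed n x n).
   a is destroyed (replaced by its LU decomposition). */
void invert_matrix(float **a, float **y, int n)
{
    int i, j, *indx = ivector(1, n);
    float d, *col = vector(1, n);

    ludcmp(a, n, indx, &d);            /* decompose once */
    for (j = 1; j <= n; j++) {         /* solve for each column of the identity */
        for (i = 1; i <= n; i++) col[i] = 0.0;
        col[j] = 1.0;
        lubksb(a, n, indx, col);
        for (i = 1; i <= n; i++) y[i][j] = col[i];
    }

    free_vector(col, 1, n);
    free_ivector(indx, 1, n);
}
```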
Programming using NR in C (III)
Calculating the determinant of a matrix
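Under the same assumptions, det(A) is the product of the diagonal elements of the LU factorization times the sign d that ludcmp() returns for the row interchanges; the wrapper name is illustrative:

```c
#include "nr.h"
#include "nrutil.h"

/* Determinant via LU decomposition: det(A) = d * prod_j u_jj,
   where d = +1/-1 accounts for the row interchanges. a is destroyed. */
float determinant(float **a, int n)
{
    int j, *indx = ivector(1, n);
    float d;

    ludcmp(a, n, indx, &d);
    for (j = 1; j <= n; j++)
        d *= a[j][j];

    free_ivector(indx, 1, n);
    return d;
}
```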
Homework #5 (Cont’) [Due: 10/22]
Homework #5 (Cont’)