C.C.Tscherning & M.Veicherts, University of Copenhagen, Jan. 2008. Accelerating generalized Cholesky decomposition using multiple processors.


Application in Least-Squares Collocation

Error-covariance estimation

Cholesky Factorization (L: lower triangular matrix)
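The slide's formulas are not preserved in the transcript; for reference, the standard Cholesky recursion for a symmetric positive-definite matrix A is:

```latex
A = L L^{T}, \qquad
l_{jj} = \Bigl(a_{jj} - \sum_{k=1}^{j-1} l_{jk}^{2}\Bigr)^{1/2}, \qquad
l_{ij} = \frac{1}{l_{jj}}\Bigl(a_{ij} - \sum_{k=1}^{j-1} l_{ik}\, l_{jk}\Bigr), \quad i > j.
```

Each column j depends only on the columns to its left, which is what the parallelization discussed later exploits.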

Generalized Cholesky
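The transcript does not preserve which generalization the slide shows; one common variant (stated here as an assumption, not necessarily the authors' exact method) is the square-root-free LDLᵀ factorization, with L unit lower triangular and D diagonal:

```latex
A = L D L^{T}, \qquad
d_{j} = a_{jj} - \sum_{k=1}^{j-1} l_{jk}^{2}\, d_{k}, \qquad
l_{ij} = \frac{1}{d_{j}}\Bigl(a_{ij} - \sum_{k=1}^{j-1} l_{ik}\, l_{jk}\, d_{k}\Bigr), \quad i > j.
```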

More Generalized Cholesky

Parallelization. Once the diagonal element has been computed, each element in its row can be reduced independently; hence each processor may take care of one column.
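A minimal sketch of this column-independence (not the authors' Fortran NES_MP code): a right-looking Cholesky in which, after the diagonal step, each trailing column is updated by a separate worker.

```python
import math
from concurrent.futures import ThreadPoolExecutor

def cholesky_column_parallel(a, nproc=4):
    """Right-looking Cholesky A = L L^T on the lower triangle of `a`
    (a list of lists). After step j's diagonal element and column scaling,
    the reduction of each trailing column is independent, so the columns
    are handed to a pool of workers."""
    n = len(a)
    L = [row[:] for row in a]
    with ThreadPoolExecutor(max_workers=nproc) as pool:
        for j in range(n):
            L[j][j] = math.sqrt(L[j][j])
            for i in range(j + 1, n):          # scale column j
                L[i][j] /= L[j][j]
            def update(k, j=j):                # reduce trailing column k
                for i in range(k, n):
                    L[i][k] -= L[i][j] * L[k][j]
            list(pool.map(update, range(j + 1, n)))
    for i in range(n):                         # clear unused upper triangle
        for k in range(i + 1, n):
            L[i][k] = 0.0
    return L
```

The columns touched by the workers are disjoint, so no locking is needed; only the step-j column is read concurrently.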

Blockwise factorization. Should one row be factorized at a time, or should we factorize blocks of elements? Out-of-core factorization is needed for large matrices, so let the processors work on blocked matrices.
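A sketch of the blocked alternative (again an illustration, not the GEOCOL implementation): factor the diagonal block, triangular-solve the panel below it, then apply the trailing updates, which are mutually independent and so can be distributed over processors or staged from disk block-row by block-row.

```python
import math

def _factor_diag(A, d0, d1):
    # unblocked Cholesky of the diagonal block A[d0:d1, d0:d1], in place
    for j in range(d0, d1):
        A[j][j] = math.sqrt(A[j][j] - sum(A[j][k] ** 2 for k in range(d0, j)))
        for i in range(j + 1, d1):
            A[i][j] = (A[i][j] - sum(A[i][k] * A[j][k] for k in range(d0, j))) / A[j][j]

def _solve_panel(A, r0, r1, d0, d1):
    # triangular solve: rows r0:r1 of the panel below the diagonal block
    for i in range(r0, r1):
        for j in range(d0, d1):
            A[i][j] = (A[i][j] - sum(A[i][k] * A[j][k] for k in range(d0, j))) / A[j][j]

def _update_trailing(A, r0, r1, c0, c1, d0, d1):
    # Schur-complement update of one trailing block (lower triangle only)
    for i in range(r0, r1):
        for j in range(c0, min(c1, i + 1)):
            A[i][j] -= sum(A[i][k] * A[j][k] for k in range(d0, d1))

def blocked_cholesky(A, bs):
    """In-place blocked Cholesky of the lower triangle of A, block size bs.
    The trailing-block updates of each step are independent of one another."""
    n = len(A)
    for d0 in range(0, n, bs):
        d1 = min(d0 + bs, n)
        _factor_diag(A, d0, d1)
        for r0 in range(d1, n, bs):
            _solve_panel(A, r0, min(r0 + bs, n), d0, d1)
        for r0 in range(d1, n, bs):            # independent block updates
            for c0 in range(d1, r0 + 1, bs):
                _update_trailing(A, r0, min(r0 + bs, n),
                                 c0, min(c0 + bs, n), d0, d1)
    return A
```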

Block division, column-wise and rectangular (figure): the same lower-triangular matrix (elements c11 … c66) divided into 'column-wise' 1-dim. blocks of size 9 and into 3 rectangular 2-dim. blocks of size 3*3.

Blocksize tests: NEQ = 10000 with Nproc = 4, and NEQ = 20000 with Nproc = 2 (timing plots not reproduced in the transcript).

Parallelization. Flowchart of the Cholesky factorization with NES_MP and related subroutine(s).

Parallelization Results. Performance test on two PCs, compiler PGF90. Table columns: PROC, NEQ, and timings for NES vs. NES_MP on GOCE (4x3GHz, 2GB) and IKOS (4x2.66GHz, 4GB); the table values did not survive the transcript.

Integration in GEOCOL18. Geocol integration tests: timing (in s) for equation solving only, per server (GOCE, IKOS) and NEQ, comparing Geocol17a and Geocol18zr with 1, 2, and 4 processors; the table values did not survive the transcript.

Performance Increase

Conclusion. Generalized Cholesky factorization enables the use of parallelization for the solution and for error-covariance computation. The time gained by parallelization depends on the number of processors, the block size, and how busy the computer is with other tasks.

Note: further uses of multiprocessing. Evaluation of spherical harmonic series (N. Pavlis et al.). Establishing the normal-equation matrix or computing a column of covariances: factorization may start as soon as a row of blocks has been established. This gives realistic speeds for LSC applications (minutes instead of days).
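The overlap idea above can be sketched as follows (hypothetical helper names; the real routines are Fortran): block-rows of the normal-equation matrix are built by a pool of workers, and a placeholder "factor step" consumes each row as soon as it arrives instead of waiting for the full matrix.

```python
from concurrent.futures import ThreadPoolExecutor

def build_block_row(r, nblocks):
    # hypothetical stand-in for evaluating one block-row of covariances
    return [[float(r * nblocks + c)] for c in range(r + 1)]

def assemble_and_factor(nblocks, nproc=4):
    """Overlap matrix set-up and factorization: workers build block-rows
    concurrently, while the consumer processes them in row order, so the
    factorization starts before the whole matrix has been established."""
    factored = []
    with ThreadPoolExecutor(max_workers=nproc) as pool:
        futures = [pool.submit(build_block_row, r, nblocks)
                   for r in range(nblocks)]
        for fut in futures:                 # consume rows in order
            row = fut.result()
            factored.append(len(row))       # placeholder for the factor step
    return factored
```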