1 Implementing Hypre AMG in NIMROD via PETSc
S. Vadlamani (Tech-X), S. Kruger (Tech-X), T. Manteuffel (CU APPM), S. McCormick (CU APPM)
Funding: DE-FG02-07ER84730

2 Goals
SBIR funding for "improving existing multigrid linear solver libraries applied to the extended MHD system to work efficiently on petascale computers".
HYPRE chosen because:
- multigrid has been shown to "scale"
- it is developed at CU
- it is callable from the PETSc interface
- development of the library (i.e. the AMG method) will benefit all CEMM efforts
Phase I:
- explore HYPRE's solvers applied to the positive-definite matrices in NIMROD
- start a validation process for petascale scalings
Phase II:
- push development for the non-symmetric operators of the extended MHD system on high-order finite-element (FE) grids

3 Equations of interest
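The equation images from this slide are not reproduced in the transcript. As a hedged sketch only (a reconstruction, not the slide's actual content), the single-fluid advances that the next slide refers to, for velocity, magnetic field, and temperature, take roughly the form

\rho\bigl(\partial_t \mathbf{v} + \mathbf{v}\cdot\nabla\mathbf{v}\bigr) = \mathbf{J}\times\mathbf{B} - \nabla p + \nabla\cdot\boldsymbol{\Pi},
\partial_t \mathbf{B} = -\nabla\times\bigl(\eta(T)\,\mathbf{J} - \mathbf{v}\times\mathbf{B}\bigr), \qquad \mu_0\,\mathbf{J} = \nabla\times\mathbf{B},
\frac{n}{\gamma-1}\bigl(\partial_t T + \mathbf{v}\cdot\nabla T\bigr) = -p\,\nabla\cdot\mathbf{v} + \nabla\cdot\Bigl[n\bigl(\kappa_\parallel\,\hat{\mathbf{b}}\hat{\mathbf{b}} + \kappa_\perp(\mathsf{I}-\hat{\mathbf{b}}\hat{\mathbf{b}})\bigr)\cdot\nabla T\Bigr],

with the extended MHD terms adding, e.g., Hall and electron-pressure contributions to Ohm's law.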

4 Major Difficulties for the MHD System
The MHD equations yield difficult-to-invert matrices for three reasons:
- the velocity advance has a 3D matrix that stabilizes the MHD waves to high accuracy,
- the magnetic field advance has a 3D matrix due to the temperature-dependent resistivity, which varies by 5 orders of magnitude across the simulation domain,
- the temperature advance has a three-dimensional, anisotropic operator whose parallel diffusion coefficient is 5 to 10 orders of magnitude larger than the perpendicular coefficient (sketched below).
All of the matrices discussed, while ill-conditioned, are Hermitian.
Inclusion of the extended MHD terms not only increases the condition number of the matrices (by making the largest eigenvalue larger) but is fundamentally non-symmetric in nature.
These are very hard linear problems, so solvers that scale are essential.
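For concreteness, the anisotropic temperature operator referred to above has, schematically (a sketch of the standard form, not necessarily NIMROD's exact discrete operator), the structure

\nabla\cdot\Bigl[\kappa_\parallel\,\hat{\mathbf{b}}\hat{\mathbf{b}} + \kappa_\perp\bigl(\mathsf{I}-\hat{\mathbf{b}}\hat{\mathbf{b}}\bigr)\Bigr]\cdot\nabla T,
\qquad \frac{\kappa_\parallel}{\kappa_\perp} \sim 10^{5}\ \text{to}\ 10^{10},

where \hat{\mathbf{b}} is the unit vector along the magnetic field. The implicit time advance must invert an operator of the form \mathsf{I} - \Delta t\,\nabla\cdot\boldsymbol{\kappa}\cdot\nabla, whose conditioning degrades as this ratio grows.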

5 Reminder: Algebraic Multigrid
The smoothing process (also known as relaxation) is an application of a linear solver (usually iterative) that results in a smooth error.
The coarse-grid correction is made up of three subprocesses (written out below):
- (1) restriction: a particular transfer of information to a coarse grid,
- (2) coarse-grid solve: solving the linear system on the chosen coarse grid,
- (3) prolongation: a transfer or interpolation of information back to the finer grid.
The effectiveness of this algorithm relies on carefully choosing the restriction and the coarse-grid solve, which depend on attributes of the system of equations being solved.
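Written out, the correction that these three subprocesses implement is (with R the restriction, P the prolongation, and the usual Galerkin choice of coarse operator, which the slide does not spell out):

x \leftarrow x + P\,A_c^{-1}\,R\,(b - A x), \qquad A_c = R\,A\,P, \qquad R = P^{T}\ \text{(typically)}.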

6 Typical AMG process
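The figure from this slide is not in the transcript. As a generic sketch of the process it presumably shows, one V-cycle on level \ell combines smoothing with a recursive coarse-grid correction:

x_\ell \leftarrow S_\ell(x_\ell, b_\ell) \quad \text{(pre-smooth)}
b_{\ell+1} = R_\ell\,(b_\ell - A_\ell x_\ell) \quad \text{(restrict the residual)}
e_{\ell+1} = \text{V-cycle}(A_{\ell+1}, b_{\ell+1}) \quad \text{(recurse; direct solve on the coarsest level)}
x_\ell \leftarrow x_\ell + P_\ell\,e_{\ell+1} \quad \text{(prolongate and correct)}
x_\ell \leftarrow S_\ell(x_\ell, b_\ell) \quad \text{(post-smooth)}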

7 Using PETSc
Level of PETSc compliance:
- for developer support
- PETSc programs usually initialize and tear down their own MPI communicators; NIMROD's startup needs to match those patterns (see the sketch below)
Calling from Fortran (Fortran 77 mentality):
- #include "include/finclude/includefile.h", and use *.F files so the preprocessor runs
- be careful to "include" only once in each encapsulated subroutine
- arrays must be accessed via an integer index name internal to PETSc
- zero-based indexing *IMPORTANT*
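A minimal sketch of the initialize/finalize pattern (the program name is illustrative; the include path and argument lists follow the PETSc 2.x-era conventions named on the slide and vary between releases):

      program petsc_init_sketch
! use a .F extension so the C preprocessor expands the include
#include "include/finclude/petsc.h"
      PetscErrorCode ierr
! PETSc starts MPI here (and shuts it down in PetscFinalize) unless it
! is handed an existing communicator, so NIMROD's MPI startup must match
      call PetscInitialize(PETSC_NULL_CHARACTER,ierr)
!     ... create Mat/Vec/KSP objects, solve, destroy ...
      call PetscFinalize(ierr)
      end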

8 Sample code to set elements of an array

! Fortran access to a PETSc Vec: VecGetArray returns a base array xx_v
! plus an offset xx_i, and the macro xx_a(ib) indexes the local entries
! through them.
#define xx_a(ib) xx_v(xx_i + (ib))
      double precision xx_v(1)
      PetscOffset xx_i
      PetscErrorCode ierr
      integer i, n
      Vec x

      call VecGetArray(x,xx_v,xx_i,ierr)
      call VecGetLocalSize(x,n,ierr)
      do i=1,n
         xx_a(i) = 3*i + 1
      enddo
      call VecRestoreArray(x,xx_v,xx_i,ierr)

9 Hypre calls in PETSc
No change to matrix and vector creation.
Set the preconditioner type to one of the HYPRE types within the linear system solver:
- through the KSP package
- needed for the smoothing process
- PCHYPRESetType()
- PetscOptionsSetValue()
A sketch of the calling sequence follows.
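A hedged sketch of that sequence (Fortran, PETSc 2.3.x-era argument lists, assuming the Fortran bindings are present in the build; A, b, and x are placeholder names for an already assembled matrix and vectors):

      Mat A
      Vec b, x
      KSP ksp
      PC  pc
      PetscErrorCode ierr

      call KSPCreate(PETSC_COMM_WORLD,ksp,ierr)
      call KSPSetOperators(ksp,A,A,SAME_NONZERO_PATTERN,ierr)
      call KSPGetPC(ksp,pc,ierr)
!     select hypre as the preconditioner and BoomerAMG as its type
      call PCSetType(pc,'hypre',ierr)
      call PCHYPRESetType(pc,'boomeramg',ierr)
!     equivalently, through the options database:
!     call PetscOptionsSetValue('-pc_hypre_type','boomeramg',ierr)
      call KSPSolve(ksp,b,x,ierr)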

10 Work Plan for the Phase I SBIR
Implement HYPRE solvers in NIMROD via PETSc:
- understand the full NIMROD data structure
- keep backward compatibility with the current solvers
- run comparative simulations with benchmark cases
Establish metrics for solver efficiency.
Carry out an initial analysis of MG capability for extended MHD.

11 This Week
Revisit the work done on the SuperLU interface:
- implementing the distributed interface will give better insight into the NIMROD data structure and its communication patterns
Obtain troublesome matrices in triplet format:
- send them to Sherry Li for further analysis and SuperLU development
- possibility of visualization (matplotlib, etc.)

12 Summary
Beginning to implement PETSc in NIMROD.
Will explore the HYPRE solvers, using derived metrics to establish their effectiveness.
Will explore the mathematical properties of the extended MHD system to understand whether AMG can still scale while solving these particular non-symmetric matrices.
[Way in the future]: may need to use BoxMG (LANL) for the anisotropic temperature advance.

