
Slide 1: The Paramesh Grid Implementation in FLASH3
The Center for Astrophysical Thermonuclear Flashes
An Advanced Simulation & Computing (ASC) Academic Strategic Alliances Program (ASAP) Center at The University of Chicago
Klaus Weide, May 2006

Slide 2: Preliminaries
Scope:
- The PARAMESH3 (actually now: PARAMESH 4.0) implementation of the Grid unit in FLASH3
- Ignoring particles (a related but separate topic)
- Ignoring NO_PERMANENT_GUARDCELLS etc.
- What users* of FLASH3 need to know about this Grid
*In particular, writers of non-infrastructure code that will work with FLASH3.

Slide 3: Preliminaries
A warning that shall be repeated:
- This presentation includes discussion of internal structures of the Grid implementation, in order to explain the function of the Grid.
- Please do not access these structures in code outside the Grid unit. You may get away with it for a while, but we don't want to support it! If you break it, you get to keep both parts.
- Please do use the official interfaces. For example, use Grid_getBlkIndexLimits instead of the constants NXB, NYB, etc. This will keep your code working if the Grid implementation changes!

Slide 4: Basic Computational Unit: Block
- The grid is composed of blocks.
- Blocks are composed of interior cells and surrounding guard cells.
- Paramesh: all blocks have the same size (number of cells) but can have different resolution.
- Paramesh in FLASH3: block size is determined at setup time.

Slide 5: First Look at the Paramesh Grid
- Purpose of the Grid: represent data (more on UNK variables etc. below).
- Each block resides on exactly one processor (at a given point in time).
- Limitations imposed by Paramesh:
  - All blocks have the same number of cells and guard cell layers.
  - The resolution ("Delta") of a block changes by multiples of 2.
  - The resolution of neighboring blocks differs by at most a factor of 2.
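The two resolution rules above can be stated as one-line checks. The following is a toy Python sketch, not Paramesh code; the names delta and jump_ok are made up here for illustration:

```python
# Toy illustration of the Paramesh resolution rules (not Paramesh code).

def delta(delta_root: float, lrefine: int) -> float:
    """Cell size at refinement level lrefine: each level halves Delta,
    so resolutions differ only by powers of 2."""
    return delta_root / 2 ** (lrefine - 1)

def jump_ok(lref_a: int, lref_b: int) -> bool:
    """Neighboring blocks may differ by at most one refinement level,
    i.e. by at most a factor of 2 in resolution."""
    return abs(lref_a - lref_b) <= 1

assert delta(1.0, 3) == 0.25
assert jump_ok(3, 4) and not jump_ok(3, 5)
```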

Slide 6: How Blocks are Identified
- At a given time, a block is globally uniquely identified by a pair (PE, BlockID), where
  - 0 ≤ PE < numprocs
  - 1 ≤ BlockID ≤ MAXBLOCKS
- Locally, BlockID alone is sufficient to specify a block; user code can't directly access remote blocks anyway.
- Morton numbers provide another way to identify blocks globally (more later).
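The (PE, BlockID) ranges above can be captured in a small validity check. This is an illustrative Python sketch, not FLASH code; NUMPROCS and the function name are hypothetical:

```python
# Toy sketch of the global block identity (not FLASH code).
NUMPROCS = 4      # hypothetical number of processors
MAXBLOCKS = 200   # hypothetical hardwired block limit

def is_valid_global_id(pe: int, block_id: int) -> bool:
    """Ranges from the slide: 0 <= PE < numprocs and
    1 <= BlockID <= MAXBLOCKS (BlockIDs are 1-based, Fortran-style)."""
    return 0 <= pe < NUMPROCS and 1 <= block_id <= MAXBLOCKS

assert is_valid_global_id(0, 1)
assert not is_valid_global_id(NUMPROCS, 1)   # PE out of range
assert not is_valid_global_id(0, 0)          # BlockIDs start at 1
```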

Slide 7: How Blocks are Stored
- Solution data, per-block meta data, and tree information (for local blocks!) are stored in F90 arrays declared like this (leading dimensions elided here):
  real,    dimension(…,MAXBLOCKS) :: UNK
  real,    dimension(…,MAXBLOCKS) :: bnd_box
  integer, dimension(…,MAXBLOCKS) :: parent
- MAXBLOCKS is a hardwired constant (from setup time).
- "Inactive" (non-leaf) blocks also use storage.
- These structures are internal to the Grid unit and should not be accessed directly by other code. Use the appropriate Grid_something subroutine calls instead!
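The key point of this layout is that every per-block structure has a fixed number of slots, indexed by block, whether or not each slot is in use. A toy Python analogue (shapes and names are illustrative only, not FLASH's actual declarations):

```python
# Toy analogue of FLASH's block-indexed storage (not FLASH code).
MAXBLOCKS = 8           # hardwired at setup time, as in FLASH
n_local_blocks = 3      # only some slots are in use at any moment

# One slot per possible local block, whether it holds a LEAF,
# a PARENT, an inactive ANCESTOR, or nothing at all.
unk     = [None] * MAXBLOCKS   # would be a (nvars, nx, ny, nz) array per slot
parent  = [-1]   * MAXBLOCKS   # tree linkage for local blocks only
bnd_box = [None] * MAXBLOCKS   # physical bounding box of each block

# Storage cost is independent of how many blocks are actually active:
assert len(unk) == len(parent) == len(bnd_box) == MAXBLOCKS
```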

Slide 8: How Blocks are Woven Together I
- The tree (1d: binary tree; 3d: "oct" tree): parent/child relationships.
- Neighbors: geometric relationships.
- Morton ordering: a way to serialize blocks.
  - used only internally, implicitly
  - determines storage of blocks

Slide 9: Connecting Blocks: The Tree
- The tree (1d: binary tree; 3d: "oct" tree) encodes parent/child relationships.
- Three types of blocks:
  - LEAF: physics acts only on data of these blocks
  - PARENT: have at least one LEAF child
  - ANCESTOR (inactive): all other blocks; their data may be invalid
- Not all blocks have a parent: there is at least one root node, and there may be many!
- Each block has a refinement level, 1 ≤ lrefine(blockID) ≤ lrefine_max.
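The three node types follow mechanically from the children: no children means LEAF, at least one LEAF child means PARENT, and everything else is an ANCESTOR. A toy Python classifier over a hypothetical child map (not PARAMESH's representation):

```python
# Toy classifier for the three block types (not PARAMESH code).
def classify(tree, blk):
    """tree maps block id -> list of child block ids."""
    kids = tree.get(blk, [])
    if not kids:
        return "LEAF"
    if any(classify(tree, k) == "LEAF" for k in kids):
        return "PARENT"
    return "ANCESTOR"

# A small 1-D (binary) tree: root 1 with children 2 and 3,
# which in turn have leaf children 4,5 and 6,7.
tree = {1: [2, 3], 2: [4, 5], 3: [6, 7]}
assert classify(tree, 4) == "LEAF"
assert classify(tree, 2) == "PARENT"
assert classify(tree, 1) == "ANCESTOR"
```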

Slide 10: Connecting Blocks: Neighbors
- Neighbor relationships:
  - neigh array: neighbors across faces
  - surr_blks: surrounding blocks, including diagonal neighbors
  - regenerated by PARAMESH3 from other information
  - "wrap around" at periodic boundaries
- Combined tree and neighbor info:
  - the gr_gid array combines parent/child/neigh info
  - this is how linkage info is stored in checkpoint files

Slide 11: Connecting Blocks: Morton Order
- Morton ordering: a way to serialize blocks.
- The Morton function maps blocks to integers; sorting by these integers gives the Morton order of blocks.
- The Morton order determines
  - how blocks are ordered in memory on each processor, AND
  - how blocks are distributed across processors.
- Morton curve: an illustration of the Morton order; connect the centers of leaf blocks in Morton order.
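The classic Morton function interleaves the bits of the integer block coordinates; sorting by the result produces the Z-shaped curve. A minimal 2-D sketch of the idea (PARAMESH's internal implementation differs and, as the next slide stresses, is not something user code needs):

```python
def morton2d(ix: int, iy: int, bits: int = 8) -> int:
    """Interleave the bits of integer block coordinates (ix, iy).
    Toy 2-D version of the Morton function idea."""
    m = 0
    for b in range(bits):
        m |= ((ix >> b) & 1) << (2 * b)        # x bits go to even positions
        m |= ((iy >> b) & 1) << (2 * b + 1)    # y bits go to odd positions
    return m

# Blocks sorted by Morton number trace the familiar Z-shaped curve:
cells = sorted([(x, y) for x in range(2) for y in range(2)],
               key=lambda c: morton2d(*c))
assert cells == [(0, 0), (1, 0), (0, 1), (1, 1)]
```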

Slide 12: How Blocks are Woven Together II
Warnings:
- The structures linking blocks should not be accessed directly by user code; parent, neigh, etc. may point to remote blocks.
- The implementation of the Morton function is internal magic of Paramesh. You don't need to understand it to use FLASH3/Paramesh.
- User code generally does not need to know where a block's neighbor is. It can rely on Paramesh to fill guard cells from neighbors correctly.

Slide 13: Per-Block Meta-Information
Paramesh keeps information associated with each local block:
- parent, child, surr_blks, neigh: block linkage
- lrefine, nodetype: tree-related information
- coord, bnd_box, bsize: physical coordinates
There are public accessor interfaces Grid_getThisOrThat to get the parts of this information that may be useful to user code. Use them!

Slide 14: Block Data: Fluid Variables I
FLASH3 provides the following kinds of variables:
- UNK variables: stored in adjacent slots in the UNK array for each cell.
  - cell-centered (or "volume-centered")
  - basic hydrodynamic and other physical continuous variables, aka solution variables, aka "unknowns"
  - data maintained by PARAMESH
- UNK contains three kinds of variables, from different Config declarations:
  - General solution variables: VARIABLE name  (e.g., VARIABLE dens -> DENS_VAR)
  - Species mass fractions: SPECIES NAME  (e.g., SPECIES NI56; for technical reasons there is always at least MFRAC_SPEC)
  - "Mass scalars", additional passively advected variables: MASS_SCALAR NAME  (e.g., MASS_SCALAR YE)
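The point of "adjacent slots" is that all three declaration kinds end up as indices into the same first dimension of UNK. A purely illustrative Python sketch of that mapping (the suffix scheme shown is an assumption based on the DENS_VAR example above; FLASH's setup script generates the real index constants):

```python
# Illustrative only: three Config declaration kinds -> adjacent UNK slots.
declarations = [
    ("VARIABLE", "dens"),       # general solution variable
    ("SPECIES", "NI56"),        # species mass fraction
    ("MASS_SCALAR", "YE"),      # passively advected mass scalar
]
# Hypothetical suffix scheme, modeled on the DENS_VAR example:
suffix = {"VARIABLE": "VAR", "SPECIES": "SPEC", "MASS_SCALAR": "MSCALAR"}

index_of = {f"{name.upper()}_{suffix[kind]}": i + 1   # 1-based, Fortran-style
            for i, (kind, name) in enumerate(declarations)}

assert index_of["DENS_VAR"] == 1      # all three kinds share one index space
assert index_of["YE_MSCALAR"] == 3
```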

Slide 15: Block Data: Fluid Variables II
Additional kinds of variables:
- GRIDVAR aka SCRATCH variables: GRIDVAR name  (e.g., GRIDVAR otmp)
  - cell-centered (or face-centered)
  - for temporary use; data is not meant to survive Grid changes
  - data maintained by FLASH3
- Face variables: FACEVAR name  (e.g., FACEVAR Mag)
  - face-centered
  - currently only used by MHD
  - data maintained by PARAMESH
- Work space (mentioned for completeness): maintained by PARAMESH, not currently used by FLASH3.
- Fluxes (mentioned for completeness): FLUX name  (e.g., FLUX rho); maintained by PARAMESH/FLASH3, used in Hydro.

Slide 16: Applying Boundary Conditions
- Specify the type of boundaries in flash.par.
- Inner boundaries are now also possible.
- Non-default boundary condition handling:
  - Usually: provide a custom Grid_applyBCEdge.
  - Provide a custom Grid_applyBCEdgeAllUnkVars when needed.
  - Not recommended: provide a custom amr_1blk_bcset.
- Hydrostatic boundary conditions:
  - work in progress
  - an implementation as in the FLASH2 default is now available

Slide 17: Life of The Mesh I
What happens to the Grid, from start to finish of a calculation:
- Initialization
  - from scratch if restart is FALSE
  - from a checkpoint file if restart is TRUE
- Changes during evolution
  - physics units access and modify block data
  - physics units call Grid_fillGuardCells when they need it
  - the Driver unit calls Grid_updateRefinement after each time step
- Data and meta info are periodically stored in checkpoint files.

Slide 18: Life of The Mesh: Initialization from Scratch
How is the Grid constructed initially?
- One or more initial blocks are created to become the root block(s) of tree(s); see runtime parameters nblockx, nblocky, nblockz.
- These are distributed by FLASH3 across processors from the very beginning ("parallel divide domain"), as evenly as possible.
- Some of the initial blocks can now be turned into internal boundary blocks (obstacles); see Simulation_defineDomain.
- PARAMESH's magic machinery is let loose on the remaining initial block(s) repeatedly until the desired lrefine_max can be reached. It takes care of
  - creating new (finer) blocks as children of coarser parents
  - redistributing blocks across processors (rebalancing)
- The user can influence this by
  - setting runtime parameters (lrefine_max, refine_var_n, etc.)
  - supplying a custom Grid_markRefineDerefine implementation
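The repeated refine-and-rebalance cycle above can be summarized as a simple loop. A schematic Python sketch, with the refinement criterion abstracted into a callback (needs_refinement stands in for the role of Grid_markRefineDerefine; everything here is illustrative, not FLASH code):

```python
# Schematic of the initialization-from-scratch loop (not FLASH code).
def build_initial_grid(lrefine_max, needs_refinement):
    """Return the final refinement level reached, starting from level 1.
    needs_refinement(level) plays the role of the mark-refine step;
    creating children and rebalancing across processors are elided."""
    level = 1
    while level < lrefine_max and needs_refinement(level):
        level += 1   # refine: new finer children, then rebalance
    return level

# Refinement criterion stops asking for more below level 3:
assert build_initial_grid(5, lambda lvl: lvl < 3) == 3
# lrefine_max caps the loop even if the criterion always fires:
assert build_initial_grid(2, lambda lvl: True) == 2
```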

Slide 19: Life of The Mesh: Initialization from Checkpoint
How is the Grid reconstructed?
- The checkpoint contains Grid blocks in their original (PE, blockID) order (Morton order).
- This logically one-dimensional sequence of blocks is chopped up and distributed evenly across the available processors.
- The same applies to meta info (e.g., gr_gid for tree info): each processor reads and stores only the chunk assigned to it. No processor stores all the global tree data.
- PARAMESH3 is called to initialize additional data structures.
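The "chopped up and distributed evenly" step is just contiguous chunking of the Morton-ordered sequence. A toy Python version of the idea (the function name and exact rounding rule are illustrative, not FLASH's restart code):

```python
def distribute_evenly(nblocks: int, nprocs: int):
    """Chop a Morton-ordered sequence of nblocks into contiguous
    chunks, one per processor, as evenly as possible."""
    base, extra = divmod(nblocks, nprocs)
    return [base + (1 if p < extra else 0) for p in range(nprocs)]

# 10 checkpointed blocks restarted on 4 processors:
assert distribute_evenly(10, 4) == [3, 3, 2, 2]
# Every block lands somewhere, and chunk sizes differ by at most 1:
assert sum(distribute_evenly(10, 4)) == 10
```

Because the chunks are contiguous in Morton order, spatial locality of the original layout is preserved across the restart.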

Slide 20: Life of The Mesh: Evolution
How does the Grid change during a simulation?
- Physics units access and modify block data.
- Grid_fillGuardCells is called when needed.
  - PARAMESH takes care of communication
  - interpolation and restriction of data may occur
- The Driver unit calls Grid_updateRefinement periodically. PARAMESH's magic machinery is let loose on the blocks. It takes care of
  - creating new (finer) blocks as children of coarser parents
  - deleting leaf blocks where resolution is too high
  - redistributing blocks across processors (rebalancing)
- The user can influence this by
  - setting runtime parameters (lrefine_max, refine_var_n, etc.)
  - supplying a custom Grid_markRefineDerefine implementation

Slide 21: Life of The Mesh: The End
How does the Grid save data?
- The state of the blocks and meta-info is saved to checkpoint files.
- Blocks end up stored sequentially in global (PE, blockID) order, i.e., Morton order.
- More on this in the IO part of the tutorial.

