Fusion-SDM – Presentation transcript

1 Fusion-SDM (1)
Problem description
–Each run in future: ¼ trillion particles, 10 variables, 8 bytes
–Each time step, generated every 60 sec, is (250x10^9) x 8 x 10 bytes = 2x10^13 bytes (20 terabytes); required I/O rate is roughly 300 GB/sec (not possible)
–Put analysis into simulation: need to embed analysis into the computation
Approach
–Reduce data by summarizing into bins
–55 GB per 100x10 bins; break into 64 files (probably)
–2000 time steps x 64 files x 4 runs = 512,000 files
–Every 60 sec: assume you get 20 GB/s; need to reduce data accordingly
–Per run: 55 GB x 2000 = 110 TB
Archival: move data to HPSS
–110 TB / 300 MB/s = 4.2 days per simulation, x 4 runs (these numbers are worked through in the sketch after this slide)
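The numbers on this slide follow from simple arithmetic; below is a minimal sketch in Python that reproduces them, using only the particle count, variable count, step cadence, and transfer rates quoted in the bullets above:

```python
# Back-of-the-envelope check of the data volumes quoted on this slide.

particles = 250e9          # 1/4 trillion particles per run
variables = 10             # variables stored per particle
bytes_per_value = 8        # 8 bytes per value

step_bytes = particles * variables * bytes_per_value
print(f"per time step: {step_bytes:.1e} bytes = {step_bytes / 1e12:.0f} TB")   # 2.0e+13 bytes = 20 TB

cadence_s = 60             # a new time step every 60 seconds
io_rate = step_bytes / cadence_s
print(f"required I/O rate: {io_rate / 1e9:.0f} GB/s")                          # ~333 GB/s, not feasible

# Reduced (binned) output instead: 55 GB per step, 2000 steps per run
reduced_per_run = 55e9 * 2000
print(f"reduced data per run: {reduced_per_run / 1e12:.0f} TB")                # 110 TB

# Archiving 110 TB to HPSS at 300 MB/s
archive_s = reduced_per_run / 300e6
print(f"archive time: {archive_s / 86400:.1f} days per run")                   # ~4.2 days, x 4 runs
```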

2 Fusion-SDM (2)
Tasks
–Task 1: help check that 20 GB/s can be sustained (see the bandwidth sketch after this slide)
–Task 2: integrate the workflow into the process of generating images, etc., for monitoring progress
–Task 3: move data to HPSS with HSI – workflow task
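For Task 1, here is a minimal sketch of the kind of scripted write-bandwidth probe that could feed the workflow; the scratch path, block size, and total size are illustrative assumptions, and a real test of a parallel file system would exercise the simulation's own (parallel) output path rather than a single writer like this.

```python
import os
import time

# Crude single-writer bandwidth probe: write a large file in fixed-size
# blocks and report the sustained rate.  Path and sizes are illustrative.
path = "/scratch/bandwidth_test.dat"   # hypothetical scratch location
block = b"\0" * (64 * 1024 * 1024)     # 64 MB per write call
total_bytes = 16 * 1024**3             # 16 GB written in total

start = time.time()
with open(path, "wb") as f:
    written = 0
    while written < total_bytes:
        f.write(block)
        written += len(block)
    f.flush()
    os.fsync(f.fileno())               # make sure the data really reached storage
elapsed = time.time() - start

print(f"wrote {written / 1e9:.1f} GB in {elapsed:.1f} s "
      f"-> {written / elapsed / 1e9:.2f} GB/s sustained")
os.remove(path)
```

Task 3 would typically be a workflow step that shells out to the HSI client (for example an `hsi put` of each reduced file into HPSS).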

3 Fusion-SDM (3)
Analysis scenario
–Goal: to find coherent structures
–Need to generate coarser bins ("reduced" data) or sample data
–How to run parallel analysis on the entire dataset?
–Approach: incremental progress
Task: Chandrika
–Finding coherent structures
–Analyzing particle data on a 5D mesh
–Reduce the 55 GB mesh down to a 440 MB mesh (4x spatially, 2x in one velocity dimension); do this for the last 1K time steps = 430 GB (see the coarsening sketch after this slide)
–Pick a few time steps – toroidal / poloidal data
–Then increase granularity
–Then take more time steps
–Then need to run the whole analysis in parallel, etc.
–Then apply to XGC-1 data – in CPES, etc.
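Below is a minimal sketch of the block-coarsening that the reduction step above describes, assuming the 55 GB mesh is held as a 5D NumPy histogram (three spatial axes plus two velocity axes) and that "4x spatially" means a factor of 4 along each spatial axis; the array shape and the choice to sum (rather than average) fine bins are assumptions for illustration:

```python
import numpy as np

def coarsen(bins, factors):
    """Reduce a binned (histogram) array by an integer factor per axis,
    summing the fine bins that fall inside each coarse bin."""
    assert bins.ndim == len(factors)
    new_shape = []
    for n, f in zip(bins.shape, factors):
        assert n % f == 0, "each axis length must be divisible by its factor"
        new_shape += [n // f, f]
    # Split every axis (n) into (n // f, f), then sum over the fine sub-axes.
    return bins.reshape(new_shape).sum(axis=tuple(range(1, 2 * bins.ndim, 2)))

# Illustrative 5D particle histogram: (x, y, z, v_parallel, v_perp).
fine = np.random.rand(32, 32, 32, 16, 16)

# 4x coarser in each spatial dimension and 2x in one velocity dimension,
# as on the slide, giving a 4*4*4*2 = 128x smaller mesh.
coarse = coarsen(fine, (4, 4, 4, 2, 1))
print(fine.shape, "->", coarse.shape)   # (32, 32, 32, 16, 16) -> (8, 8, 8, 8, 16)
```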

4 Provenance
Provenance
–Currently: weak coupling (10 MB every few seconds); static / dynamic libraries
–Future: first-principles codes and models – fast, much longer time simulations: stronger coupling
–Need multiple codes to be coupled strongly; need an electronic notebook (find a new name for that); keep track of what is sent where – perhaps profiles (1D volume-averaged slices?)
–Provenance: which machine, what parameters, etc.; what code – got to save the code, what libraries
Task: work with Seung-Hoe / Julian Cummings on capturing metadata (XGC) (see the sketch after this slide)
–From makefiles / build environment?
–Get information from job submission
–Get information from PAPI
Task: changes in the dashboard
–Generate movies on the fly (more real-time image generation)
–More analysis (IDL scripts, VisIt, FastBit, run this script from the dashboard, parallel coordinates, correlation functions, …)
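A minimal sketch of the kind of provenance record the metadata-capture task might emit; the field names, the JSON layout, the git call for recording the code revision, and the PBS_JOBID environment variable are all illustrative assumptions, not an existing XGC or dashboard format:

```python
import json
import os
import platform
import subprocess
from datetime import datetime, timezone

def capture_provenance(code_dir, parameters_file):
    """Collect a basic provenance record for one simulation run.
    Field names and layout are illustrative, not an existing format."""
    # Which code: record the exact source revision if the tree is under git.
    try:
        revision = subprocess.run(
            ["git", "-C", code_dir, "rev-parse", "HEAD"],
            capture_output=True, text=True).stdout.strip()
    except OSError:
        revision = ""
    return {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "machine": platform.node(),            # which machine the run used
        "platform": platform.platform(),
        "code_revision": revision or "unknown",
        # Job-submission information, if the batch scheduler exposes it;
        # the variable name depends on the scheduler (PBS shown here).
        "job_id": os.environ.get("PBS_JOBID", "unknown"),
        # Run parameters: point at the input file that was used.
        "parameters_file": parameters_file,
    }

if __name__ == "__main__":
    record = capture_provenance(code_dir=".", parameters_file="xgc_input.namelist")
    with open("provenance.json", "w") as f:
        json.dump(record, f, indent=2)
```

Build-environment details (compiler and library versions from the makefiles) and PAPI counter data could be appended to the same record to cover the remaining bullets.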

