Brief Outline of the Earth Simulator and Our Research Activities AND A Lesson Learnt for the Past Three Years (Wataru Ohfuchi, 2004/12/22)


2004/12/22 1 Brief Outline of the Earth Simulator and Our Research Activities AND A Lesson Learnt for the Past Three Years
Wataru Ohfuchi (ohfuchi@jamstec.go.jp)
Earth Simulator Center, Japan Agency for Marine-Earth Science and Technology; Atmospheric and Oceanic Simulation Group; AFES Working Team


2004/12/22 3 Earth Simulator Building

2004/12/22 4 Inside the Earth Simulator Building
Floor plan (65 m / 71 yd by 50 m / 55 yd): PN cabinets (320), IN cabinets (65), double floor for cables, power supply system, cartridge tape library system, magnetic disk system, air conditioning system, seismic isolation system

2004/12/22 5 Comparison of PN Size
NEC SX-4 (1 node): about 6 m x 7 m; peak performance 64 Gflops; electric power about 90 kVA; air cooling
Earth Simulator (1 PN): about 70 cm x 100 cm; peak performance 64 Gflops; electric power about 8 kVA; air cooling

2004/12/22 6 Configuration of the Earth Simulator
Each processor node (PN #0 to PN #639): 8 arithmetic processors (AP #0 to AP #7) sharing 16 GB of memory
640 PNs connected by an interconnection network (full crossbar switch)
Peak performance/AP: 8 Gflops; peak performance/PN: 64 Gflops; shared memory/PN: 16 GB
Total number of APs: 5120; total number of PNs: 640
Total peak performance: 40 Tflops; total main memory: 10 TB
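A minimal check of this slide's totals in Python; the per-unit figures are the slide's, while the unit conversions and rounding are mine:

```python
# Earth Simulator configuration arithmetic (per-unit figures from the slide).
APS_PER_PN = 8          # arithmetic processors per processor node
GFLOPS_PER_AP = 8       # peak Gflops per AP
MEM_PER_PN_GB = 16      # shared memory per PN, in GB
NUM_PNS = 640           # processor nodes

total_aps = NUM_PNS * APS_PER_PN                 # 5120 APs
peak_tflops = total_aps * GFLOPS_PER_AP / 1000   # 40.96, quoted as 40 Tflops
total_mem_tb = NUM_PNS * MEM_PER_PN_GB / 1024    # exactly 10 TB
print(total_aps, peak_tflops, total_mem_tb)
```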

2004/12/22 7 Connection among Nodes
128 crossbar switches (XSW #0 to XSW #127) in 64 cabinets, plus crossbar control units XCT #0 and XCT #1
640 processor nodes (PN #0 to PN #639) in 320 cabinets
PN-IN electric cables: 640 x 130 = 83,200
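The factor of 130 is consistent with every PN cabling to all 128 XSWs plus the two XCTs; that reading is my inference from the counts above, not something the slide states:

```python
# Cable-count check: one link from each PN to every crossbar switch and
# (assumed) to each of the two crossbar control units.
NUM_PNS, NUM_XSWS, NUM_XCTS = 640, 128, 2
cables = NUM_PNS * (NUM_XSWS + NUM_XCTS)  # 640 * 130 = 83,200
print(cables)
```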

2004/12/22 8 Electric Cables under the Floor


2004/12/22 12 An Overview of AFES (AGCM for the Earth Simulator)
–Primitive equation system (hydrostatic approximation); valid (arguably) down to 10 km (T1279); see the balance equation below
–Spectral Eulerian dynamical core
–Physical processes: cumulus parameterizations (Arakawa-Schubert, Kuo, MCA, Emanuel); radiation (mstranX: Sekiguchi et al. 2004); surface model MATSIRO (Takata et al. 2004); etc.
–Adapted from CCSR/NIES AGCM 5.4.02 (Center for Climate System Research, the Univ. of Tokyo; Japanese National Institute for Environmental Studies), but rewritten totally from scratch in Fortran 90 with MPI and microtasking
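For reference, the hydrostatic approximation named above replaces the vertical momentum equation with hydrostatic balance; this is textbook background added here, not content from the slide:

```latex
% Hydrostatic balance: the vertical pressure gradient supports the weight of
% the air exactly, so resolved vertical accelerations are neglected.
\frac{\partial p}{\partial z} = -\rho g
```

This is also why validity is only "arguable" down to ~10 km grid spacing: at that scale deep convection starts to be resolved, and its vertical accelerations are no longer negligible.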

2004/12/22 13 T1279L96 AFES Scalability

2004/12/22 14 AFES won the Gordon Bell Award for Peak Performance!!!

2004/12/22 15 Meso-scale Resolving T1279L96 Simulations
Typhoons, wintertime cyclogenesis, and the Baiu-Meiyu front
–Interactions between large-scale circulations and meso-scale phenomena
–Self-organization of meso-scale circulations in a larger circulation field
Short-term integrations (10 days to 2 weeks)
–CPU power is NOT a problem; data size (~terabytes) is the problem

2004/12/22 16 10-km Mesh Global Simulations

2004/12/22 17 Typhoons over the Western Pacific

2004/12/22 18 Winter Cyclogenesis over Japan

2004/12/22 19 The Baiu-Meiyu Front over Japan

2004/12/22 20 But, So What?

2004/12/22 21 Our ES Project 2004
Mechanism and predictability of atmospheric and oceanic variations induced by interactions between the large-scale field and meso-scale phenomena
–Project leader: Wataru Ohfuchi
–The FES models + THORPEX:
AFES (AGCM) –Sub-project leader: Takeshi Enomoto (ESC)
OFES (MOM3-based OGCM) –Sub-project leaders: Hideharu Sasaki (ESC), Hirofumi Sakuma (FRCGC), Yukio Masumoto (FRCGC/U. Tokyo)
CFES (coupled model: AFES + OIFES, i.e. OFES + IARC sea ice model) –Sub-project leader: Nobumasa Komori (ESC)
THORPEX (high-resolution singular vector method and predictability) –Sub-project leader: Tadashi Tsuyuki (NPD, JMA)

2004/12/22 22 Summary
With the combination of the ES and models well optimized for its architecture, it is now possible to conduct meaningful (ultra-)high-resolution global simulations for the first time in the history of computational atmospheric and oceanic sciences and geophysical fluid dynamics.
Interactions between meso-scale phenomena and the larger-scale circulation can be studied.
Scientifically new knowledge and contributions to society are expected.

2004/12/22 23 A Possible Future Direction of High Performance Computing in Atmospheric and Oceanic Sciences: A Lesson Learnt for the Past Three Years with the Earth Simulator
The Earth Simulator was, of course, unfortunately not perfect.
What I foresee as a future modeling strategy.
What I foresee as future HPC in AOS.
To be published in Advances in Science: Earth Science, edited by Prof. Peter Sammonds, in the Royal Society's Philosophical Transactions (2005).

2004/12/22 24 How Many Points Are There in the 10-km Mesh AGCM?
T1279L96
–1279 spherical harmonics with the so-called triangular truncation.
–3840 (longitude) X 1920 (latitude) X 96 (layers) = …
–~700 M points…
Assume double precision (8 B) and 100 variables…
–~560 GB
–Actually, the T1279L96 AFES needs about 1.2 TB of memory.
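The slide's arithmetic as a minimal Python sketch; the spectral-coefficient count implied by triangular truncation is my addition, the rest follows the slide:

```python
# Grid-point and memory arithmetic for the T1279L96 configuration.
NLON, NLAT, NLEV = 3840, 1920, 96
points = NLON * NLAT * NLEV                # 707,788,800, i.e. ~700 M points

BYTES_PER_VALUE = 8                        # double precision
NUM_VARIABLES = 100                        # assumed on the slide
memory_gb = points * BYTES_PER_VALUE * NUM_VARIABLES / 1e9   # ~566, quoted as ~560 GB

# Triangular truncation T1279 keeps coefficients with total wavenumber n <= 1279.
T = 1279
spectral_coeffs = (T + 1) * (T + 2) // 2   # 819,840 complex coefficients per level
print(points, round(memory_gb), spectral_coeffs)
```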

2004/12/22 25 How Much Data Are We Producing with the 10-km Mesh AGCM?
One 3-D snapshot: 2.6 GB.
Oh, we need 6-hourly output!!! Ten 3-D variables!!! For one day…
–2.6 GB X 4/day (6-hourly) X 10 variables = 104 GB/day.
Oh, we want to integrate for 10 days… ~1 TB.
Oh, we are climatologists!!! We want to integrate for 10,000 days… ~1 PB.
Oh, we need ten sensitivity tests!!! 10 PB.
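The same escalation as one short sketch; the single-precision assumption behind the 2.6 GB snapshot is mine (~700 M points at 4 bytes each), the multipliers are the slide's:

```python
# Output-volume escalation for the 10-km mesh AGCM.
points = 3840 * 1920 * 96                   # ~700 M grid points
snapshot_gb = points * 4 / 2**30            # ~2.6 GB, assuming 4-byte (single) precision

per_day_gb = snapshot_gb * 4 * 10           # 6-hourly output, 10 variables: ~104 GB/day
ten_days_tb = per_day_gb * 10 / 1024        # ~1 TB
climate_pb = per_day_gb * 10_000 / 1024**2  # 10,000 days: ~1 PB
ten_tests_pb = climate_pb * 10              # ten sensitivity tests: ~10 PB
print(round(snapshot_gb, 1), round(per_day_gb), round(ten_days_tb, 1),
      round(climate_pb, 1), round(ten_tests_pb))
```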

2004/12/22 26 Future HPC in the World (in 2010…)
VERY unfortunately, the HPC hardware business is currently dominated by IMPERIAL JAPAN and the US of A!!!
Suppose an emerging supercomputing country, Taiwan, Republic of China, takes over the submerging J and USA within a few years:
The Earth System Simulator, the National Taiwan Normal University.
–1 PFlops machine (25x the ES).
–1 exabyte of hard disk / PROJECT!!!
–1 zettabyte of long-term storage / PROJECT!!!

2004/12/22 27 So, What Can We Do with The Earth System Simulator at the National Taiwan Normal University in 2010?
When you increase the resolution by a factor of two…
–2 (longitude) X 2 (latitude) X 2 (vertical levels) X 2 (time steps) = 16, which roughly uses up the ~25x speedup.
The current biggest global atmospheric simulation project on the ES is ~3-km mesh (nonhydrostatic).
–So, a ~1.5-km mesh simulation.
–Sorry, it's not cloud resolving, yet!!!
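That scaling argument in a minimal Python sketch; the 2^4 cost model (double each spatial dimension, halve the time step) is the slide's, the function framing is mine:

```python
# Cost of refining resolution: doubling longitude, latitude, and vertical
# resolution while halving the time step multiplies the work by 2**4 = 16.
def cost_factor(doublings: int) -> int:
    return 16 ** doublings

speedup = 1_000_000 / 40_000      # 1 PFlops vs the ES's 40 Tflops: 25x
print(cost_factor(1) <= speedup)  # True: the 25x machine buys about one doubling
print(3.0 / 2)                    # so a ~3-km mesh becomes a ~1.5-km mesh
```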

2004/12/22 28 We Need to Think Better Than That!!!
Multi-scale modeling:
–Stand-alone global hydrostatic model.
–Stand-alone regional nonhydrostatic model (as super-parameterization).
–Stand-alone 3-D turbulence model.
–Super-parameterization-like links between these models (see the toy sketch below).
We may have to go down to explicit cloud physics.
Of course, 3-D radiation!!!
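To make the linking idea concrete, here is a toy sketch of a super-parameterization-like loop; it is entirely my illustration (placeholder relaxation dynamics, not AFES or any real model): a coarse model hands each of its columns to an embedded fine model, which returns an upscaled sub-grid tendency.

```python
import numpy as np

# Toy super-parameterization-like coupling between a coarse "global" model and
# an embedded fine model per column. Purely illustrative dynamics.
N_COLUMNS, N_FINE, DT = 8, 32, 0.1
rng = np.random.default_rng(0)

def embedded_fine_model(coarse_value: float) -> float:
    """Stand-in for a regional nonhydrostatic/turbulence model in one column:
    downscale the coarse state, apply fine-scale 'physics', upscale the result."""
    fine = coarse_value + 0.01 * rng.standard_normal(N_FINE)  # add fine structure
    tendency = -0.5 * fine                                    # relax toward zero
    return float(tendency.mean())                             # column-mean tendency

state = rng.standard_normal(N_COLUMNS)    # coarse state, one value per column
for _ in range(100):
    subgrid = np.array([embedded_fine_model(v) for v in state])
    state += DT * subgrid                 # apply the upscaled sub-grid tendencies
print(state)                              # decays toward zero under the toy physics
```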

2004/12/22 29 Conclusions 1
HPC is not only a number-crunching capability.
–Linpack has become totally obsolete.
Data handling is much, much MORE important. We are already in the middle of the storm of data!
–Hard disk.
–Long-term data storage.
–Software.
But still we need to think much better.
–Just increasing resolution does not seem to lead to a breakthrough.

2004/12/22 30 Conclusions 2
A HUGE HPC system should be used as a whole.
–The ES consists of 640 nodes.
–Sorry, jobs that require fewer than ~320 nodes should go away.
Expensive vs. cheapo:
–Vector vs. scalar?
–We need to think about cost effectiveness.
–It may depend on the problem.
GRID?
–Probably very good for data sharing.
–Simulations?
We need to integrate science and engineering.
–At least we need to understand both and have strong opinions.

