2004/12/22 1 Brief Outline of the Earth Simulator and Our Research Activities, and a Lesson Learnt from the Past Three Years
Wataru Ohfuchi
Earth Simulator Center, Japan Agency for Marine-Earth Science and Technology
Atmospheric and Oceanic Simulation Group and AFES Working Team

2004/12/22 3 Earth Simulator Building

2004/12/22 4 Inside the Earth Simulator Building
Floor plan, 65 m (71 yd) X 50 m (55 yd):
– PN cabinets (320)
– IN cabinets (65)
– Double floor for cables
– Power supply system
– Cartridge tape library system
– Magnetic disk system
– Air conditioning system
– Seismic isolation system

2004/12/22 5 Comparison of PN Size
                    NEC SX-4 (1 node)     Earth Simulator (1 PN)
Peak performance    64 Gflops             64 Gflops
Electric power      about 90 kVA          about 8 kVA
Cooling             air                   air
Footprint           about 6 m X 7 m       about 70 cm X 100 cm

2004/12/22 6 Configuration of the Earth Simulator
– 640 processor nodes (PN #0 to PN #639), connected by an interconnection network (full crossbar switch)
– Each PN: 8 arithmetic processors (AP #0 to AP #7) sharing 16 GB of memory
– Peak performance/AP: 8 Gflops
– Peak performance/PN: 64 Gflops
– Shared memory/PN: 16 GB
– Total number of APs: 5,120
– Total number of PNs: 640
– Total peak performance: 40 Tflops
– Total main memory: 10 TB
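The totals follow directly from the per-node figures on this slide; a minimal back-of-envelope check in Python (the numbers are from the slide, the script itself is only illustrative):

```python
# Back-of-envelope check of the Earth Simulator totals quoted above.
APS_PER_PN = 8        # arithmetic processors per processor node
GFLOPS_PER_AP = 8     # peak Gflops per AP
MEM_PER_PN_GB = 16    # shared memory per PN
NUM_PNS = 640         # processor nodes

total_aps = NUM_PNS * APS_PER_PN                      # 5,120 APs
total_peak_tflops = total_aps * GFLOPS_PER_AP / 1000  # 40.96 Tflops, quoted as 40 Tflops
total_memory_tb = NUM_PNS * MEM_PER_PN_GB / 1024      # 10 TB

print(total_aps, total_peak_tflops, total_memory_tb)
```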

2004/12/22 7 Connection among Nodes
– 128 crossbar switches (XSW #0 to XSW #127) in 64 cabinets, plus two crossbar control units (XCT #0 and XCT #1)
– 640 processor nodes (PN #0 to PN #639) in 320 cabinets
– PN-IN electric cables: 640 X 130 = 83,200 (each PN connects to all 128 XSWs and both XCTs, i.e. 130 cables per node)

2004/12/22 8 Electric Cables under the Floor

2004/12/22 12 An Overview of AFES (AGCM for the Earth Simulator)
– Primitive equation system (hydrostatic approximation): valid (arguably) down to 10 km (T1279)
– Spectral Eulerian dynamical core
– Physical processes: cumulus parameterizations (Arakawa-Schubert, Kuo, MCA, Emanuel); radiation (mstranX: Sekiguchi et al. 2004); surface model MATSIRO (Takata et al. 2004); etc.
– Adapted from the CCSR/NIES AGCM (Center for Climate System Research, the Univ. of Tokyo; Japanese National Institute for Environmental Studies)
– Rewritten totally from scratch in FORTRAN 90 with MPI and microtasking

2004/12/22 13 Scalability of the T1279L96 AFES

2004/12/22 14 AFES won the Gordon Bell Award for Peak Performance!!!

2004/12/22 15 Meso-scale Resolving T1279L96 Simulations
Typhoons, wintertime cyclogenesis and the Baiu-Meiyu front:
– Interactions between large-scale circulations and meso-scale phenomena
– Self-organization of meso-scale circulations in a larger circulation field
Short-term integrations (10 days to 2 weeks):
– CPU power is NOT a problem; data size (~terabytes) is the problem

2004/12/22 16 10-km Mesh Global Simulations

2004/12/22 17 Typhoons over Western Pacific

2004/12/22 18 Winter Cyclogenesis over Japan

2004/12/22 19 Baiu-Meiyu Front over Japan

2004/12/22 20 But, So What?

2004/12/22 21 Our ES Project 2004
Mechanism and predictability of atmospheric and oceanic variations induced by interactions between the large-scale field and meso-scale phenomena
– Project leader: Wataru Ohfuchi
– The -FES models + THORPEX
AFES
– Sub-project leader: Takeshi Enomoto (ESC)
– AGCM
OFES
– Sub-project leaders: Hideharu Sasaki (ESC), Hirofumi Sakuma (FRCGC), Yukio Masumoto (FRCGC/U. Tokyo)
– MOM3-based OGCM
CFES
– Sub-project leader: Nobumasa Komori (ESC)
– Coupled model: AFES + OIFES (OFES + IARC sea ice model)
THORPEX
– Sub-project leader: Tadashi Tsuyuki (NPD, JMA)
– High-resolution singular vector method and predictability

2004/12/22 22 Summary
With the combination of the ES and models well optimized for its architecture, it is now possible, for the first time in the history of computational atmospheric and oceanic sciences and geophysical fluid dynamics, to conduct meaningful (ultra-)high-resolution global simulations. Interactions between meso-scale phenomena and the larger-scale circulation can be studied. Scientifically new knowledge and contributions to society are expected.

2004/12/22 23 A Possible Future Direction of High Performance Computing in Atmospheric and Oceanic Sciences: A Lesson Learnt from the Past Three Years with the Earth Simulator
The Earth Simulator was, unfortunately, not perfect, of course. What I foresee as a future modeling strategy. What I foresee as the future of HPC in AOS. This will be published in Advances in Science: Earth Science, edited by Prof. Peter Sammonds, in the Royal Society's Philosophical Transactions (2005).

2004/12/22 24 How Many Points Are There in the 10-km Mesh AGCM?
T1279L96:
– 1279 spherical harmonics with the so-called triangular truncation.
– 3840 (longitude) X 1920 (latitude) X 96 (layers) = ~700 M points.
Assume double precision (8 B) and 100 variables:
– ~560 GB.
– Actually, the T1279L96 AFES needs about 1.2 TB of memory.
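The slide's estimate is easy to reproduce; a minimal sketch in Python (the grid dimensions and the 100-variable assumption are from the slide, the script is only a back-of-envelope check):

```python
# Back-of-envelope memory estimate for the T1279L96 grid quoted above.
nlon, nlat, nlev = 3840, 1920, 96    # T1279L96 grid of AFES
points = nlon * nlat * nlev          # 707,788,800 ~ 700 M points

bytes_per_value = 8                  # double precision
num_variables = 100                  # assumed number of 3-D variables
total_gb = points * bytes_per_value * num_variables / 1e9

print(f"{points:,} points, ~{total_gb:.0f} GB")  # ~566 GB, quoted as ~560 GB
```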

2004/12/22 25 How Much Data Are We Producing with the 10-km Mesh AGCM?
One 3-D snapshot:
– 2.6 GB.
Oh, we need 6-hourly output!!! Ten 3-D variables!!! For one day:
– 2.6 GB X 4/day (6-hourly) X 10 variables = 104 GB/day.
Oh, we want to integrate for 10 days:
– ~1 TB.
Oh, we are climatologists!!! We want to integrate for 10,000 days:
– ~1 PB.
Oh, we need ten sensitivity tests!!!
– 10 PB.
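The escalation on this slide is repeated multiplication; a minimal sketch (the 2.6 GB snapshot size is from the slide; the 10,000-day run length is inferred from the ~1 PB total, since 104 GB/day X 10,000 days ~ 1 PB):

```python
# Data-volume escalation for 10-km mesh AGCM output, following the slide.
snapshot_gb = 2.6                       # one 3-D field on the ~700 M point grid
per_day_gb = snapshot_gb * 4 * 10       # 6-hourly output, ten variables -> 104 GB/day
ten_days_tb = per_day_gb * 10 / 1e3     # ~1 TB for a 10-day integration
climate_pb = per_day_gb * 10_000 / 1e6  # ~1 PB for a 10,000-day climate run
ensemble_pb = climate_pb * 10           # ~10 PB for ten sensitivity tests

print(per_day_gb, ten_days_tb, climate_pb, ensemble_pb)
```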

2004/12/22 26 Future HPC in the World (in 2010…)
VERY unfortunately, the HPC hardware business is currently dominated by the IMPERIAL JAPAN and the US of A!!! Suppose an emerging supercomputing country, Taiwan, Republic of China, takes over the submerging J and USA within a few years. The Earth System Simulator, the National Taiwan Normal University:
– 1 Pflops machine (25X larger than the ES).
– 1 exabyte of hard disk/PROJECT!!!
– 1 zettabyte of long-term storage/PROJECT!!!

2004/12/22 27 So, What Can We Do with the Earth System Simulator at the National Taiwan Normal University in 2010?
When you increase the resolution by a factor of two:
– 2 (longitude) X 2 (latitude) X 2 (vertical levels) X 2 (time step) = 16, so the 25X machine buys roughly one doubling of resolution.
The current biggest global atmospheric simulation project on the ES is ~3-km mesh (nonhydrostatic):
– So, ~1.5-km mesh simulation.
– Sorry, it's not cloud resolving, yet!!!
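The scaling argument written out; the factor-of-16 per doubling and the 25X speedup are from the slides, while solving for the affordable mesh is my own back-of-envelope arithmetic:

```python
# 4-D (lon, lat, level, time) simulation cost grows as (refinement ratio)**4.
current_mesh_km = 3.0   # biggest global (nonhydrostatic) project on the ES
speedup = 25.0          # hypothetical 1 Pflops machine vs the 40 Tflops ES

# Affordable refinement: speedup = ratio**4  ->  ratio = speedup**0.25
ratio = speedup ** 0.25                 # ~2.24, roughly one doubling (a doubling costs 2**4 = 16)
new_mesh_km = current_mesh_km / ratio   # ~1.3 km; the slide rounds to ~1.5 km (ratio of 2)

print(f"affordable refinement ~{ratio:.2f}X -> ~{new_mesh_km:.1f}-km mesh")
```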

2004/12/22 28 We Need to Think Better Than That!!!
Multi-scale modeling (a structural sketch follows below):
– Stand-alone global hydrostatic model.
– Stand-alone regional nonhydrostatic model, as super-parameterization.
– Stand-alone 3-D turbulence model.
– Super-parameterization-like links between these models.
We may have to go down to explicit cloud physics. Of course, 3-D radiation!!!
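A minimal structural sketch of what such a super-parameterization-like chain could look like; only the three model tiers come from the slide, and all class names, interfaces and toy numerics here are hypothetical, not AFES code:

```python
# Hypothetical skeleton of the multi-scale chain sketched on this slide:
# a global hydrostatic model whose columns get tendencies from embedded
# regional nonhydrostatic models, which in turn call a turbulence model.

class TurbulenceModel:
    def fluxes(self, state: float) -> float:
        # Toy stand-in for sub-grid turbulent fluxes.
        return 0.01 * state

class RegionalNonhydrostaticModel:
    """Embedded model acting as a super-parameterization for one global column."""
    def __init__(self):
        self.turbulence = TurbulenceModel()

    def tendency(self, column_state: float) -> float:
        # Toy stand-in: advance the embedded domain, coarse-grain the result.
        return -0.1 * column_state + self.turbulence.fluxes(column_state)

class GlobalHydrostaticModel:
    def __init__(self, num_columns: int):
        self.columns = [1.0] * num_columns
        self.embedded = [RegionalNonhydrostaticModel() for _ in range(num_columns)]

    def step(self, dt: float):
        # Each column takes its physics tendency from its embedded model
        # instead of from a conventional cumulus parameterization.
        self.columns = [c + dt * m.tendency(c)
                        for c, m in zip(self.columns, self.embedded)]

model = GlobalHydrostaticModel(num_columns=4)
model.step(dt=0.5)
print(model.columns)
```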

2004/12/22 29 Conclusions 1
HPC is not only a number crunching capability.
– Linpack has become totally obsolete.
Data handling is much much much much much much… MORE important. We are already in the middle of the storm of data!
– Hard disk.
– Long-term data storage.
– Software.
But still we need to think much better.
– Just increasing resolution does not seem to lead to a breakthrough.

2004/12/22 30 Conclusions 2
A HUGE HPC system should be used as a whole.
– The ES consists of 640 nodes.
– Sorry, those jobs that require less than ~320 nodes should go away.
Expensive vs. cheapo:
– Vector vs. scalar?
– We need to think about cost effectiveness.
– It may depend on the problem.
GRID?
– Probably very good for data sharing.
– Simulations?
We need to integrate science and engineering.
– At least we need to understand both and have strong opinions.