
1 Climate Simulation on ApGrid/TeraGrid at SC2003. Client (AIST) connected via Ninf-G to servers: AIST Cluster (50 CPU), Titech Cluster (200 CPU), KISTI Cluster (25 CPU), NCSA Cluster (225 CPU).

2 National Institute of Advanced Industrial Science and Technology. Example: Hybrid QM/MD Simulation.

3 QM/MD simulation over the Pacific at SC2004. MD client connected via Ninf-G to QM servers: TCS (512 CPU) @ PSC, P32 (512 CPU), and F32 (256 CPU). Total number of CPUs: 1792. Close-up view: corrosion of silicon under stress.

4 Run statistics (steps 1-10). Total number of CPUs: 1793. Total simulation time: 10 hours 20 minutes. Number of steps: 10 (= 7 fs). Average time per step: 1 hour. Size of generated files per step: 4.5 GB.

5 (Some of the) Lessons Learned. It is practically impossible to occupy a single large-scale system for a few weeks, so how can we keep the simulation running that long? Faults (e.g. HDD crashes, network outages) cannot be avoided, and we do not want manual restarts; the simulation should be capable of automatic recovery from faults. How can the simulation recover from faults? A sketch of one recovery strategy follows below.
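A minimal sketch of the recovery idea using the standard GridRPC C API (which Ninf-G implements): when a call fails because a cluster or network goes down, re-bind the function handle to another candidate server and retry; if every server fails, the caller falls back to a checkpoint restart. The server list and the "qm/qm_force" entry name are hypothetical placeholders, not the authors' actual code.

/*
 * Try each candidate QM server in turn; return 0 on success,
 * -1 if all servers failed (caller restarts from a checkpoint).
 */
#include <grpc.h>

int qm_call_with_migration(const char **servers, int n_servers,
                           int n_atoms, double *atoms, double *forces)
{
    grpc_function_handle_t h;
    int s;

    for (s = 0; s < n_servers; s++) {
        if (grpc_function_handle_init(&h, (char *)servers[s],
                                      "qm/qm_force") != GRPC_NO_ERROR)
            continue;                       /* server unreachable: try next */
        if (grpc_call(&h, n_atoms, atoms, forces) == GRPC_NO_ERROR) {
            grpc_function_handle_destruct(&h);
            return 0;                       /* success */
        }
        grpc_function_handle_destruct(&h);  /* call failed: migrate */
    }
    return -1;  /* all servers down: recover later from a checkpoint */
}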

6 Objectives. Develop a flexible, robust, and efficient Grid-enabled simulation: flexible -- allow dynamic resource allocation and migration; robust -- detect errors and recover from faults automatically for long runs; efficient -- manage thousands of CPUs. Verify our strategy through large-scale experiments: we implemented a Grid-enabled SIMOX (Separation by Implanted Oxygen) simulation and ran it on a Japan-US Grid testbed for a few weeks.

7 Hybrid QM/CL Simulation (1). Enables large-scale simulation with quantum accuracy by combining classical MD simulation with quantum simulation. CL simulation: simulates the behavior of atoms in the entire region, based on classical MD using an empirical inter-atomic potential. QM simulation: modifies the energy calculated by the MD simulation only in the regions of interest, based on density functional theory (DFT). (Figure: MD simulation with an embedded QM simulation based on DFT.)
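The slide describes the QM/CL coupling only qualitatively; a common additive hybrid scheme consistent with this description (our reading, not stated explicitly on the slide) is

E_{\mathrm{total}} = E_{\mathrm{CL}}(\text{entire system}) + \sum_{i}\left[ E_{\mathrm{QM}}(\text{region}_i) - E_{\mathrm{CL}}(\text{region}_i) \right]

That is, classical MD covers every atom, and DFT replaces the classical energy only inside each region of interest.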

8 Hybrid QM/CL Simulation (2): simulation algorithm. Each QM computation is independent of the others, compute intensive, and usually implemented as an MPI program. One time step proceeds as follows (a client-side sketch follows below):
MD part: after initial set-up, calculate the MD forces of the QM+MD regions, calculate the MD forces of the QM region, and send the data of the QM atoms to the QM part.
QM part: calculate the QM force of each QM region (one independent computation per region) and return the QM forces.
MD part: update the atomic positions and velocities, then repeat.
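A minimal sketch of the MD client's main loop using the standard GridRPC C API. The server host names, the config file "client.conf", the remote entry "qm/qm_force", the array sizes, and the md_* helpers are hypothetical placeholders; the argument list of the remote call is an assumption about the stub's interface, not the application's real signature.

#include <grpc.h>

#define N_QM    3        /* number of QM regions (= servers used here) */
#define N_ATOM  1024     /* atoms per QM region, for illustration      */
#define N_STEP  10

extern void md_calc_forces(void);     /* MD forces of the QM+MD regions */
extern void md_update_atoms(void);    /* combine forces, advance a step */
extern double qm_atoms[N_QM][3 * N_ATOM], qm_forces[N_QM][3 * N_ATOM];

int main(int argc, char **argv)
{
    grpc_function_handle_t h[N_QM];
    grpc_sessionid_t sid[N_QM];
    const char *servers[N_QM] =
        { "tcs.psc.example", "p32.aist.example", "f32.aist.example" };
    int i, step;

    grpc_initialize("client.conf");
    for (i = 0; i < N_QM; i++)
        grpc_function_handle_init(&h[i], (char *)servers[i], "qm/qm_force");

    for (step = 0; step < N_STEP; step++) {
        md_calc_forces();
        for (i = 0; i < N_QM; i++)          /* QM regions are independent: */
            grpc_call_async(&h[i], &sid[i], /* one async RPC per region    */
                            N_ATOM, qm_atoms[i], qm_forces[i]);
        grpc_wait_all();                    /* servers compute in parallel */
        md_update_atoms();
    }

    for (i = 0; i < N_QM; i++)
        grpc_function_handle_destruct(&h[i]);
    grpc_finalize();
    return 0;
}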

9 National Institute of Advanced Industrial Science and Technology. Implementation of Grid-enabled Simulation: multi-scale QM/MD simulation using GridRPC and MPI.

10 Approach to gridifying applications. GridRPC enhances flexibility and robustness through dynamic allocation of server programs and detection of network/cluster trouble. MPI enhances efficiency through highly parallel computing on a cluster, for both the client and the server programs. The new programming approach, combining GridRPC with MPI, takes advantage of both programming models to run large-scale applications on the Grid for a long time. (Diagram: an MD client running MPI, connected via GridRPC to multiple QM servers, each running MPI.)
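On the server side, each QM solver is an ordinary MPI program. A plausible skeleton of the combination is sketched below: we assume a (hypothetical) Ninf-G stub invokes qm_force() with RPC arguments valid on rank 0, and a broadcast/reduce pattern fans the work out and collects partial forces; dft_solve() is a placeholder for the real DFT computation.

#include <stdlib.h>
#include <mpi.h>

extern void dft_solve(int n_atoms, const double *atoms,
                      double *partial_forces);

void qm_force(int n_atoms, double *atoms, double *forces)
{
    int rank;
    double *buf, *partial;

    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Bcast(&n_atoms, 1, MPI_INT, 0, MPI_COMM_WORLD);

    /* the RPC arguments are only valid on rank 0 */
    buf     = (rank == 0) ? atoms : malloc(3 * n_atoms * sizeof(double));
    partial = malloc(3 * n_atoms * sizeof(double));

    MPI_Bcast(buf, 3 * n_atoms, MPI_DOUBLE, 0, MPI_COMM_WORLD);
    dft_solve(n_atoms, buf, partial);   /* each rank computes its share */

    /* sum partial forces onto rank 0, which returns them via the RPC */
    MPI_Reduce(partial, forces, 3 * n_atoms, MPI_DOUBLE, MPI_SUM,
               0, MPI_COMM_WORLD);

    if (rank != 0) free(buf);
    free(partial);
}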

