Clusters, Grids and their applications in Physics David Barnes (Astro) Lyle Winton (EPP)

Presentation transcript:

1 Clusters, Grids and their applications in Physics David Barnes (Astro) Lyle Winton (EPP)

2 Today's typical workstation can:
–Compute 1024-point fast Fourier transforms at a rate of ~… per second.
–Compute and apply gravitational forces for ~… particles at a rate of about one time step per second.
–Render ~7M elements of a data volume per second.
–Stream data to and from disk at around 30 MByte per second.
–Communicate with other machines at up to 10 MByte per second, with a latency of a few tens of milliseconds.
… what if this is not enough? …
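Rates like these are easy to measure on whatever machine you have in front of you. A minimal benchmark sketch (assuming NumPy is available; the batch size is an arbitrary choice, not from the slide):

```python
# Hypothetical benchmark: time a batch of 1024-point FFTs with NumPy.
import time
import numpy as np

N = 1024          # FFT length from the slide
BATCH = 10_000    # transforms per timing pass (arbitrary)

data = np.random.rand(BATCH, N)   # BATCH independent real signals

start = time.perf_counter()
np.fft.fft(data, axis=1)          # one 1024-point FFT per row
elapsed = time.perf_counter() - start

rate = BATCH / elapsed
print(f"{rate:,.0f} 1024-point FFTs per second")
```

The measured rate on a modern machine will dwarf the era's workstation figures, which is exactly the point the later cluster slides quantify.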

3 High Performance Computers
–~20 years ago: 1×10^6 floating point ops/sec (Mflop/s), scalar processors (a+b=c)
–~10 years ago: 1×10^9 floating point ops/sec (Gflop/s), vector processors (A+B=C)
–Today: superscalar-based CLUSTERS, 1×10^12 floating point ops/sec (Tflop/s); highly parallel, distributed, networked superscalar processors (a+b=c and d+e=f in parallel)
–Less than 10 years away: 1×10^15 floating point ops/sec (Pflop/s); multi-level clusters and GRIDs

4 High-Performance Computing Directions: Beowulf-class PC Clusters
Definition:
–Common off-the-shelf PC nodes (Pentium, Alpha, PowerPC, SMP)
–COTS LAN/SAN interconnect (Ethernet, Myrinet, Giganet, ATM)
–Open-source Unix (Linux, BSD)
–Message-passing computing (MPI, PVM, HPF)
Advantages:
–Best price/performance
–Low entry-level cost
–Just-in-place configuration
–Vendor invulnerable
–Scalable
–Rapid technology tracking
Enabled by PC hardware, networks and operating systems achieving the capabilities of scientific workstations at a fraction of the cost, and by the availability of industry-standard message-passing libraries. (Slide from Jack Dongarra)
… let's see some clusters …
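The message-passing model named on the slide (MPI, PVM) boils down to explicit send/receive between processes that share no memory. A toy sketch of that pattern using only the Python standard library, with queues standing in for the cluster interconnect (a real Beowulf node would call MPI_Send/MPI_Recv instead):

```python
# Sketch of the message-passing model behind MPI/PVM. The two queues
# play the role of the network links between a master and a worker node.
import threading
import queue

to_worker = queue.Queue()    # "link" master -> worker
to_master = queue.Queue()    # "link" worker -> master

def worker():
    chunk = to_worker.get()                   # recv a work unit
    to_master.put(sum(x * x for x in chunk))  # send partial result back

t = threading.Thread(target=worker)
t.start()
to_worker.put(list(range(1000)))   # master distributes work
partial = to_master.get()          # master gathers the result
t.join()
print(partial)   # 332833500
```

The same distribute/compute/gather shape scales to hundreds of nodes; MPI adds collective operations (broadcast, reduce) on top of this point-to-point core.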

5 Mojo: School of Physics cluster
24 nodes, 24 CPUs, 2 & 2.4 GHz Pentium 4
~70 Gflop/s

6 Swinburne Centre for Astrophysics and Supercomputing:
90 nodes, 180 CPUs, 2.0, 2.2 & 2.4 GHz Pentium 4
16 nodes, 32 CPUs, 933 MHz Pentium III
~500 Gflop/s
–Accomplish > 8 million 1024-pt FFTs per second.
–Render > 10^9 volume elements per second.
–Calculate at one time step per second for ~… particles using a brute-force approach.
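The FFT figure is consistent with the quoted peak. A worked check, using the standard ~5N log2 N flop count for a radix-2 complex FFT (the flop-count model is the usual textbook estimate, not from the slide):

```python
# Sanity check: 8 million 1024-point FFTs/sec against ~500 Gflop/s peak.
import math

N = 1024
flops_per_fft = 5 * N * math.log2(N)      # ~51,200 flops per 1024-pt FFT
ffts_per_sec = 8e6                        # "> 8 million" from the slide

sustained = flops_per_fft * ffts_per_sec  # ~4.1e11 flop/s
print(f"{sustained / 1e9:.0f} Gflop/s sustained")  # 410 Gflop/s
```

About 410 Gflop/s sustained against a ~500 Gflop/s peak is a very high efficiency, plausible for FFTs, which are compute-dense and cache-friendly at this size.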

7 Next-generation blade cluster: IBM Blue Gene, ~200 Tflop/s

8 SETI@home
Uses thousands of Internet-connected PCs to help in the search for extraterrestrial intelligence.
Uses data collected with the Arecibo radio telescope in Puerto Rico.
When your computer is idle, the software downloads a 300-kilobyte chunk of data for analysis. The results of this analysis are sent back to the SETI team and combined with results from thousands of other participants.
Largest distributed computation project in existence:
–~400,000 machines
–Averaging 26 Tflop/s
(Slide from Jack Dongarra)
This is more than a cluster … perhaps it is the first genuine Grid …
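The work-unit cycle the slide describes (fetch a fixed-size chunk, analyse locally, report a small summary) can be sketched in a few lines. All function names below are invented for illustration; the real client speaks a server protocol and does a far richer signal search:

```python
# Toy sketch of the SETI@home-style work-unit cycle. Names are hypothetical.
import random

CHUNK_BYTES = 300 * 1024   # ~300 kB work unit, as on the slide

def fetch_work_unit(seed):
    """Stand-in for downloading a chunk of telescope data."""
    rng = random.Random(seed)
    return [rng.gauss(0, 1) for _ in range(1024)]

def analyse(samples):
    """Stand-in for the signal search: report the strongest sample."""
    return max(abs(s) for s in samples)

# Each idle machine processes independent chunks; only tiny summaries
# travel back, which is why slow Internet links are good enough.
results = [analyse(fetch_work_unit(seed)) for seed in range(4)]
print(results)
```

The key design point: work units are independent, so the "Grid" needs no fast interconnect at all, unlike the clusters on the earlier slides.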

9 over to Lyle Winton…
