Using MPI on Dept. Clusters
Min Li, Sep 9, 2010

Outline
– Run MPI programs on a single machine
– Run MPI programs on multiple machines
– Assignment 1

Login and info
– Log in (e.g., with PuTTY) to rlogin.cs.vt.edu using a CS department account
– Course info: people.cs.vt.edu/~limin/cs4234

Exercise 1 "Hello World" (30’) Task 1: Executing a simple Hello World program. – Mpicc –o hello hello.c – Mpiexec –n 4./hello message from process= 0: hello, world message from process= 1: hello, world message from process= 2: hello, world message from process= 3: hello, world Task 2 Modification so that master prints out all messages Master: Hello slaves give me your messages Message received from process 1 : Hello back Message received from process 2 : Hello back Message received from process 3 : Hello back

Exercise 1, continued
Task 3: Send a different message from each process:
Master: Hello slaves give me your messages
Message received from process 1 : Hello, I am John
Message received from process 2 : Hello, I am Mary
Message received from process 3 : Hello, I am Susan
Task 4: Experiment with tags (see the sketch after this list):
– Modify the program from Task 3 so that the master sends a message to each slave with a tag of 100, and each slave waits for a message with a tag of 100. Confirm that the program works.
– Repeat, but make the slaves wait for tag 101, and check that the program hangs. Why?
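A sketch of the Task 4 tag experiment (again illustrative, not the handout code). MPI_Recv matches only messages whose tag equals its tag argument, so a slave waiting for tag 101 never matches the master's tag-100 send and blocks forever:

/* tags.c - sketch: master sends with tag 100, slaves receive with
 * RECV_TAG. Set RECV_TAG to 101 to reproduce the hang in Task 4. */
#include <stdio.h>
#include <string.h>
#include <mpi.h>

#define SEND_TAG 100
#define RECV_TAG 100   /* change to 101 and the receives never match */

int main(int argc, char *argv[]) {
    int rank, size, dest;
    char msg[64];

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    if (rank == 0) {
        strcpy(msg, "Hello slaves give me your messages");
        for (dest = 1; dest < size; dest++)
            MPI_Send(msg, strlen(msg) + 1, MPI_CHAR, dest, SEND_TAG,
                     MPI_COMM_WORLD);
    } else {
        MPI_Recv(msg, 64, MPI_CHAR, 0, RECV_TAG, MPI_COMM_WORLD,
                 MPI_STATUS_IGNORE);
        printf("Process %d got: %s\n", rank, msg);
    }

    MPI_Finalize();
    return 0;
}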

Exercise 2: Running on multiple machines (30')
Task 1: Set up the list of machines to use:
mpiexec -hostfile machines -n 4 a.out
– Create a hostfile: gen_hostfile "filename"
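The hostfile is just a plain text file with one host name per line. A hypothetical example, using the machine names that appear in the sample output on the next slide:

# machines: one host per line
terra.rlogin
locke.rlogin
cyan.rlogin
shadow.rlogin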

SSH without passwords: to run on any machine without typing passwords, generate a key pair with an empty passphrase:
ssh-keygen -t rsa -N ""
The result:
Generating public/private rsa key pair.
Enter file in which to save the key (/home/ugrads/NAME/.ssh/id_rsa):
Your identification has been saved in /home/ugrads/NAME/.ssh/id_rsa.
Your public key has been saved in /home/ugrads/NAME/.ssh/id_rsa.pub.
The key fingerprint is:
89:ff:00:5f:06:fd:d0:a2:9e:51:b1:00:cd:0a:76:6f
Then append the public key to your authorized keys:
cd .ssh
cat id_rsa.pub >> authorized_keys

Exercise 2, continued
Task 2: Run a simple job:
– mpirun -machinefile machines -n 5 hello
Task 3: Identify machine names:
message from locke.rlogin process= 1: hello, world
message from cyan.rlogin process= 2: hello, world
message from terra.rlogin process= 0: hello, world
message from shadow.rlogin process= 3: hello, world
Task 4: Measure the execution time with MPI_Wtime() (see the sketch below):
– Experiment with running the program on one computer and on multiple computers. Does it go faster on multiple computers?
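A sketch combining Tasks 3 and 4 (the file name is assumed, and the "work" being timed is just the print): MPI_Get_processor_name reports the machine a rank runs on, and MPI_Wtime returns wall-clock seconds, so the difference of two calls brackets the timed region:

/* where.c - sketch: print each rank's host name and time a region. */
#include <stdio.h>
#include <mpi.h>

int main(int argc, char *argv[]) {
    int rank, len;
    char name[MPI_MAX_PROCESSOR_NAME];
    double t0, t1;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Get_processor_name(name, &len);

    t0 = MPI_Wtime();                 /* start of the timed region */
    printf("message from %s process= %d: hello, world\n", name, rank);
    t1 = MPI_Wtime();                 /* end of the timed region */

    printf("process %d elapsed: %f seconds\n", rank, t1 - t0);
    MPI_Finalize();
    return 0;
}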

Exercise 3: Using the broadcast operation (15')
The master broadcasts a "Hello World" message to the slaves, and the slaves print the message. Both the master and the slaves execute the same broadcast routine. The master is identified as the root (rank 0), so the 4th argument of the broadcast routine is zero in all cases:
– int MPI_Bcast(void *buffer, int count, MPI_Datatype datatype, int root, MPI_Comm comm);
Does the broadcast operation reduce the execution time? Discuss. A minimal sketch follows.
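A minimal broadcast sketch under the same assumptions as the earlier exercises (buffer size and message text are illustrative):

/* bcast.c - sketch: rank 0 is the root (the 4th argument of MPI_Bcast);
 * after the collective call every rank holds the same message. */
#include <stdio.h>
#include <string.h>
#include <mpi.h>

int main(int argc, char *argv[]) {
    int rank;
    char msg[32] = "";

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    if (rank == 0)
        strcpy(msg, "Hello World");

    /* every process, master and slaves alike, calls the same routine */
    MPI_Bcast(msg, 32, MPI_CHAR, 0, MPI_COMM_WORLD);

    printf("process %d received: %s\n", rank, msg);
    MPI_Finalize();
    return 0;
}

Note that there is no separate receive call: the slaves' own MPI_Bcast call is the receive.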

Assignment Submission
Produce a document showing that you successfully followed the instructions and performed all the tasks, by taking screenshots and including them in the document.
– Give at least one screenshot for each numbered task in each exercise.
Due date: Sep 17 – submit the document and the source code.

References
– Course website
– MPI references
– Dr. Ribben's
– Dr. Sandu's