
Parallel Programming in Parallel Virtual Machine

Presentation on theme: "Parallel Programming in Parallel Virtual Machine"— Presentation transcript:

1 Parallel Programming in Parallel Virtual Machine
Fayon Atkinson

2 History PVM was first written at Oak Ridge National Laboratory in 1989.
It was rewritten at the University of Tennessee in 1991, and is said to have been revised later at Emory University.

3 PVM Explained PVM runs applications on a set of heterogeneous computers connected by a network, which appears logically as a single parallel computer to its users. The software builds a distributed computing architecture out of the connected systems. It makes parallel programming straightforward and efficient, supports accurate computation, and is used to teach science, medicine, and engineering.

4 Environment Two components: a library of PVM routines, and a daemon that must reside on every host in the virtual machine. Hosts: computer nodes, which can be uniprocessors, multiprocessors, or clusters running the software.
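
A minimal sketch of how the library half of this environment is used (not part of the original slides; assumes PVM 3 and its pvm3.h header): a program enrolls itself in the virtual machine with pvm_mytid() and detaches with pvm_exit().

    /* Minimal sketch: enroll in the virtual machine and leave it (assumes PVM 3). */
    #include <stdio.h>
    #include "pvm3.h"

    int main(void)
    {
        int mytid = pvm_mytid();   /* enrolls this process as a PVM task and returns its TID */
        if (mytid < 0) {
            fprintf(stderr, "pvm_mytid failed: is the pvmd daemon running on this host?\n");
            return 1;
        }
        printf("Enrolled in the virtual machine with TID t%x\n", mytid);
        pvm_exit();                /* tell the local daemon this task is leaving PVM */
        return 0;
    }

Such a program is compiled against the PVM library (for example, cc hello.c -lpvm3) and run while the daemon is up on the host.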

5 Structure Two common task structures:
Star graph structure - the supervisor-workers model. Tree structure - the hierarchy model.

6 Supervisor–Workers Model
The supervisor is the middle node in the star and is the initiating task. One supervisor interacts with the user, activates the workers, assigns work to them, and collects their results. There are many workers, whose calculations may be independent or dependent; dependent workers communicate with each other before sending their computations to the supervisor.
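
A minimal sketch of the supervisor side of this model (not from the original slides; the worker executable name "worker", the message tags, and the trivial work-splitting are assumptions):

    /* Hypothetical supervisor: spawn workers, hand each one a piece of work, gather results. */
    #include <stdio.h>
    #include "pvm3.h"

    #define NWORKERS   4
    #define TAG_WORK   1
    #define TAG_RESULT 2

    int main(void)
    {
        int tids[NWORKERS], result, i, sum = 0;

        pvm_mytid();                                  /* enroll the supervisor in PVM */
        int n = pvm_spawn("worker", (char **)0, PvmTaskDefault, "", NWORKERS, tids);

        for (i = 0; i < n; i++) {                     /* assign one integer of work to each worker */
            pvm_initsend(PvmDataDefault);
            pvm_pkint(&i, 1, 1);
            pvm_send(tids[i], TAG_WORK);
        }
        for (i = 0; i < n; i++) {                     /* collect one result from each worker */
            pvm_recv(tids[i], TAG_RESULT);
            pvm_upkint(&result, 1, 1);
            sum += result;
        }
        printf("collected sum = %d\n", sum);
        pvm_exit();
        return 0;
    }

The matching worker side is sketched under the Task Communication slide below.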

7 Hierarchy Structure The root of the tree is the top supervisor.
Workers can create a set of new workers on another level. A parent is the task that creates another task.

8 Task Creation A task can be started manually or can be spawned from another task with pvm_spawn(). All tasks are given a unique task identifier (TID) when created. To create a child, the caller specifies: the machine on which the child is started, the path to the executable file on that machine, the number of copies of the child to be created, and an array of arguments to the child task(s).
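
A sketch of how those pieces map onto a pvm_spawn() call (not from the slides; the executable name "child_task", the host name "node1", the copy count, and the argument strings are illustrative assumptions):

    /* Hypothetical pvm_spawn() call: child's host, executable, copy count, and arguments. */
    #include <stdio.h>
    #include "pvm3.h"

    int main(void)
    {
        int tids[4];                             /* receives the TIDs of the children created */
        char *args[] = { "-n", "100", NULL };    /* argument array passed to each child task */

        pvm_mytid();                             /* enroll this task in PVM first */

        /* Start 4 copies of the executable "child_task" on the host "node1", passing args to each. */
        int started = pvm_spawn("child_task", args, PvmTaskHost, "node1", 4, tids);
        printf("%d of 4 children started\n", started);

        pvm_exit();
        return 0;
    }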

9 Task Creation Functions:
mytid = pvm_mytid(); - a running task can retrieve its own TID. num = pvm_spawn(Child, Arguments, Flag, Where, HowMany, Tids); - the array Tids is filled with the TIDs of the children created by this call, and num is how many were actually started. my_parent_tid = pvm_parent(); - returns the parent's TID, or PvmNoParent for initial tasks. daemon_tid = pvm_tidtohost(id); - determines on which host a given task is running.
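
A short assumed sketch (not from the slides) of a spawned child using these identity functions:

    /* Sketch of a child task inspecting its own identity. */
    #include <stdio.h>
    #include "pvm3.h"

    int main(void)
    {
        int mytid  = pvm_mytid();           /* this task's own TID */
        int parent = pvm_parent();          /* TID of the task that spawned us */
        int daemon = pvm_tidtohost(mytid);  /* TID of the pvmd daemon on the host where we run */

        if (parent == PvmNoParent)
            printf("t%x: started manually, under daemon t%x\n", mytid, daemon);
        else
            printf("t%x: spawned by t%x, under daemon t%x\n", mytid, parent, daemon);

        pvm_exit();
        return 0;
    }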

10 Task Groups Grouping tasks is common.
It is useful when a collective operation is performed on only a subset of the tasks, such as sending a message only to the members of a group. Functions: i = pvm_joingroup(group_name) - join a group called group_name. info = pvm_lvgroup(group_name) - leave a group called group_name. pvm_gsize() - retrieve the size of a group. pvm_gettid() - retrieve the TID of a task given its instance number and group name. pvm_getinst() - retrieve the instance number of a task given its TID and group name.
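
A sketch of how these calls fit together (assumed example; the group name "workers" and the message tag are made up, and pvm_bcast() is a PVM group call not listed on the slide):

    /* Sketch: join a group, query it, broadcast to it, then leave. */
    #include <stdio.h>
    #include "pvm3.h"

    int main(void)
    {
        int mytid = pvm_mytid();
        int inum  = pvm_joingroup("workers");   /* join the group; returns our instance number */
        int size  = pvm_gsize("workers");       /* number of tasks currently in the group */
        int tid0  = pvm_gettid("workers", 0);   /* TID of the group's instance 0 */

        printf("t%x is instance %d of %d in 'workers'; instance 0 is t%x\n",
               mytid, inum, size, tid0);

        if (inum == 0) {                        /* instance 0 broadcasts to the rest of the group */
            pvm_initsend(PvmDataDefault);
            pvm_pkint(&size, 1, 1);
            pvm_bcast("workers", 99);           /* sends the buffer to every other group member */
        }

        pvm_lvgroup("workers");                 /* leave the group before exiting */
        pvm_exit();
        return 0;
    }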

11 Task Communication
Sending steps: a send buffer must be initialized; the message is packed into this buffer; the completed message is sent to its destination(s). Receiving steps: the message is received; the received items are unpacked from the receive buffer.
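
A minimal sketch of both sets of steps from the worker's side (assumed example; the message tags match the hypothetical supervisor sketch on slide 6, and the squaring stands in for the real computation):

    /* Worker-side sketch of the receive and send steps. */
    #include "pvm3.h"

    #define TAG_WORK   1
    #define TAG_RESULT 2

    int main(void)
    {
        int work, result;

        pvm_mytid();                           /* enroll in PVM */
        int supervisor = pvm_parent();         /* the task that spawned us */

        /* Receiving steps: receive the message, then unpack its items. */
        pvm_recv(supervisor, TAG_WORK);        /* blocks until a TAG_WORK message arrives */
        pvm_upkint(&work, 1, 1);               /* unpack one int from the receive buffer */

        result = work * work;                  /* stand-in for the real computation */

        /* Sending steps: initialize a send buffer, pack the message, send it. */
        pvm_initsend(PvmDataDefault);
        pvm_pkint(&result, 1, 1);
        pvm_send(supervisor, TAG_RESULT);

        pvm_exit();
        return 0;
    }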

12 Thank You!

