Distributed Programming Concepts and Notations. Inter-process Communication, Synchronous Messages, Asynchronous Messages, Select Statement, Remote Procedure Calls.

Distributed Programming Concepts and Notations

Inter-process Communication
- Synchronous messages
- Asynchronous messages
- Select statement
- Remote procedure calls
- Atomic transactions

Message-based Programming Languages
- No shared variables
- Each object has a caretaker
- A process can operate on an object only by sending a message to its caretaker

Basic Constructs

  send expression-list to destination
  receive variable-list from source

These constructs can be used for both communication and synchronization.
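The send/receive pair can be sketched with a Go channel (the channel and variable names are illustrative). An unbuffered channel gives exactly the synchronous semantics discussed below: the send and the receive form a rendezvous.

```go
package main

import "fmt"

// exchange runs one synchronous send/receive pair: the sending
// goroutine blocks at "ch <- x" until the receiver takes the value.
func exchange() int {
	ch := make(chan int) // unbuffered: send and receive synchronize
	go func() {
		x := 2 * 21
		ch <- x // send expression-list to destination
	}()
	b := <-ch // receive variable-list from source
	return b
}

func main() {
	fmt.Println(exchange()) // 42
}
```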

Basic Issues
- How are source and destination designators specified?
- How is communication synchronized?

Designator Specification: Direct Naming

  P1:                        P2:
    var a, x: integer;         var b, y: integer;
    x := 2*a;                  receive b from P1;
    send x to P2;              y := y + b;

Advantages:
- simple
- efficient implementation
Disadvantages:
- communication partners must be known at compile time
- a server cannot "receive from anyone"

Mailboxes

  process simpleNS;
    nsmb: mailbox;
    request: message_type;
  begin
    create(nsmb);
    repeat forever
      receive request from nsmb;
      case request.mtype of
        ...
      end;
  end;

  client:
    send myreq to nsmb;

Advantages:
- generality (any client can send to the mailbox)
Disadvantages:
- inefficient to implement
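A minimal sketch of the mailbox idea in Go, under the assumption that a buffered channel shared by several senders plays the role of the mailbox. The `request` type and the `"lookup"` message kind mirror the slide's `message_type` and are illustrative, not part of any real API.

```go
package main

import "fmt"

// request mirrors the slide's message_type; the reply channel stands
// in for the client's own mailbox.
type request struct {
	mtype string
	reply chan string
}

// nameServer receives from the shared mailbox without knowing which
// client deposited each request.
func nameServer(mailbox chan request) {
	for req := range mailbox {
		switch req.mtype {
		case "lookup":
			req.reply <- "found"
		default:
			req.reply <- "unknown"
		}
	}
}

func main() {
	mailbox := make(chan request, 8) // buffered channel acting as a mailbox
	go nameServer(mailbox)

	// Any client may deposit a request in the mailbox.
	reply := make(chan string)
	mailbox <- request{mtype: "lookup", reply: reply}
	fmt.Println(<-reply) // found
}
```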

Buffering
- Synchronous vs asynchronous communication
- Blocking send vs non-blocking send
- Unbuffered send vs unbounded buffering

Advantages of synchronous communication:
- easier implementation
- easy proof rules
- easy to synchronize
- fault tolerance is easier
Disadvantages:
- difficult to exploit concurrency

Avoiding Blocking on receive
- peek()
- selective receive: receive var from designator about key
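A peek()-style non-blocking receive can be sketched in Go with a select statement that has a default case: the caller returns immediately when no message is pending instead of being suspended. The helper name `tryReceive` is illustrative.

```go
package main

import "fmt"

// tryReceive is a non-blocking receive: the default case of select
// fires immediately when no message is pending.
func tryReceive(ch chan int) (int, bool) {
	select {
	case v := <-ch:
		return v, true
	default:
		return 0, false // nothing pending; caller is not suspended
	}
}

func main() {
	ch := make(chan int, 1)
	if _, ok := tryReceive(ch); !ok {
		fmt.Println("no message pending")
	}
	ch <- 7
	if v, ok := tryReceive(ch); ok {
		fmt.Println("received", v) // received 7
	}
}
```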

Selective Communication

Guarded commands are extended so that a guard consists of a boolean expression and, optionally, a message-passing statement. A guard may:
- succeed: the other side is also ready to communicate
- fail: the other side has already terminated
- neither succeed nor fail (yet): the other side is not ready
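Guarded communication can be sketched in Go, with one caveat: Go's select has no boolean guards, so a common idiom is used instead, setting a case's channel to nil when its guard is false (communication on a nil channel never proceeds, which disables the alternative). The function and parameter names are illustrative.

```go
package main

import "fmt"

// guardedSelect models a two-alternative guarded communication
// statement: each alternative pairs a boolean guard with a
// message-passing statement, and false guards are disabled by
// nil-ing out the corresponding channel.
func guardedSelect(in chan int, out chan int, canRecv, canSend bool, v int) string {
	recvCh, sendCh := in, out
	if !canRecv {
		recvCh = nil // guard is false: receive alternative disabled
	}
	if !canSend {
		sendCh = nil // guard is false: send alternative disabled
	}
	select {
	case x := <-recvCh:
		return fmt.Sprintf("received %d", x)
	case sendCh <- v:
		return "sent"
	}
}

func main() {
	in, out := make(chan int, 1), make(chan int, 1)
	in <- 5
	fmt.Println(guardedSelect(in, out, true, false, 0)) // received 5
	fmt.Println(guardedSelect(in, out, false, true, 9)) // sent
	fmt.Println("out holds", <-out)                     // out holds 9
}
```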

CSP: Synchronous Messages

  process buffer;
    var slots: array [0..N-1] of integer;
        head, tail: 0..N-1;
        size: 0..N;
  begin
    head, tail, size := 0, 0, 0;
    *[ size < N; receive slots[tail] from producer ->
         size := size + 1; tail := (tail + 1) mod N
    [] size > 0; send slots[head] to consumer ->
         size := size - 1; head := (head + 1) mod N
    ]
  end
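The CSP bounded buffer translates fairly directly into Go, again using the nil-channel idiom in place of boolean guards: the receive alternative is enabled only while size < N, and the send alternative only while size > 0. Channel and variable names follow the slide; the shutdown handling on channel close is an addition needed to make the sketch runnable.

```go
package main

import "fmt"

// boundedBuffer forwards items from producer to consumer through N
// slots, enabling each alternative only when its guard holds.
func boundedBuffer(producer chan int, consumer chan int, N int) {
	slots := make([]int, N)
	head, tail, size := 0, 0, 0
	for {
		in, out := producer, consumer
		if size == N {
			in = nil // guard size < N failed: refuse receives
		}
		if size == 0 {
			out = nil // guard size > 0 failed: refuse sends
		}
		select {
		case v, ok := <-in:
			if !ok {
				// producer terminated: drain remaining items, then stop
				for size > 0 {
					consumer <- slots[head]
					head = (head + 1) % N
					size--
				}
				close(consumer)
				return
			}
			slots[tail] = v
			tail = (tail + 1) % N
			size++
		case out <- slots[head]:
			head = (head + 1) % N
			size--
		}
	}
}

func main() {
	p, c := make(chan int), make(chan int)
	go boundedBuffer(p, c, 3)
	go func() {
		for i := 1; i <= 5; i++ {
			p <- i
		}
		close(p)
	}()
	for v := range c {
		fmt.Println(v) // 1 2 3 4 5, one per line
	}
}
```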

Remote Procedure Calls

Key idea: try to preserve the semantics of local procedure calls.

  P1: call process.pname(parameters)

  P2: remote procedure pname(..)
        ...
      end;

Kinds of RPC
- Simple declaration
- Accept statements

Ada Rendezvous

  P1: call            P2: accept

- the two tasks synchronize
- parameters are transmitted from caller to acceptor
- the accept body is executed while the caller waits
- result parameters are transmitted back
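The rendezvous can be sketched in Go by pairing each entry call with a private reply channel: the caller transmits its parameters, the server executes the "accept body", and the result is transmitted back before the caller resumes. The type and field names are illustrative.

```go
package main

import "fmt"

// entryCall models an Ada entry call: the parameter travels together
// with a private reply channel for the result.
type entryCall struct {
	arg   int
	reply chan int
}

// server plays the accepting task: each iteration is one rendezvous.
func server(entry chan entryCall) {
	for c := range entry {
		// accept body: executes while the caller awaits the reply
		c.reply <- c.arg * c.arg
	}
}

func main() {
	entry := make(chan entryCall)
	go server(entry)

	r := make(chan int)
	entry <- entryCall{arg: 6, reply: r} // synchronize, transmit parameters
	fmt.Println(<-r)                     // result transmitted back: 36
}
```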

Comparison

Advantages of the accept statement:
- the server has flexibility over when to serve a call
- can provide different services through different accept bodies
- can combine with guarded commands to provide non-determinism
Disadvantages of the accept statement:
- programs are harder to understand and prove correct

Ada Example

  task buffer;
    entry deposit(in value: T);
    entry fetch(out value: T);
    var slots: array [0..N-1] of T;
        head, tail: 0..N-1;
        size: 0..N;
  begin
    head, tail, size := 0, 0, 0;
    loop
      select
        when size < N =>
          accept deposit(in value: T);
          slots[tail] := value;
          size := size + 1; tail := (tail + 1) mod N;
      or
        when size > 0 =>
          accept fetch(out value: T);
          value := slots[head];
          size := size - 1; head := (head + 1) mod N;
      end select;
    end loop;

Ada Example (contd.)

  task producer;
    repeat
      item := produce();
      call buffer.deposit(item);
    forever;
  end;

  task consumer;
    repeat
      call buffer.fetch(item);
      consume(item);
    forever;
  end;

Care must be taken regarding:
- global variables
- pointers as parameters
- representation of parameters on different machines
- at-least-once / at-most-once call semantics