Lecture 8 – Ch. 5: Higher-level concurrency patterns
- Synchronized collections
- Concurrent collections
- Producer-consumer pattern
- Serial thread confinement
- Interruptible waiting
- Synchronizers
- Concurrent result cache

Synchronized Collections (5.1)
- Collections are thread-safe for their own invariants
- Client invariants are nevertheless at risk
- Requirement for client-side locking (see the sketch below)
  - Compound actions: find, put-if-absent, iterate
  - Locking policy: intrinsic lock on the collection
  - Locks held too long limit performance
- ConcurrentModificationException
  - Fail-fast
  - Client finds out something went wrong and has to deal with it
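A minimal sketch of client-side locking, not from the slides (the ListHelper class and putIfAbsent method are illustrative): the compound action synchronizes on the synchronized wrapper itself, which is the same lock the wrapper uses internally, so the check and the add become atomic with respect to the list's own operations.

    import java.util.ArrayList;
    import java.util.Collections;
    import java.util.List;

    public class ListHelper<E> {
        // The wrapper returned by Collections.synchronizedList locks on itself,
        // so locking on "list" here makes this compound action atomic.
        public final List<E> list = Collections.synchronizedList(new ArrayList<E>());

        public boolean putIfAbsent(E x) {
            synchronized (list) {            // client-side locking on the collection
                boolean absent = !list.contains(x);
                if (absent)
                    list.add(x);
                return absent;
            }
        }
    }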

Concurrent Collections (5.2)
- Evolution of libraries based on experience
- Finer-grained locking (not on the whole collection – lock striping)
- Concurrent reads; partially concurrent writes
- Weakly consistent iterators
  - Return all items in the collection at the time iteration started, and maybe some that were added later
  - "What did I miss?"
- Some useful compound actions provided (client does not have to build them)
- Commonly used collection is ConcurrentHashMap
  - Put-if-absent; remove-if-equal; replace-if-equal (see the sketch below)
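A small sketch of the provided compound actions on ConcurrentMap (the class name and sample values are illustrative); each call is atomic, so no client-side locking is needed:

    import java.util.concurrent.ConcurrentHashMap;
    import java.util.concurrent.ConcurrentMap;

    public class CompoundActionsDemo {
        public static void main(String[] args) {
            ConcurrentMap<String, Integer> map = new ConcurrentHashMap<>();

            // put-if-absent: atomic check-then-act
            map.putIfAbsent("hits", 0);

            // replace-if-equal: succeeds only if the current value is the expected one
            boolean bumped = map.replace("hits", 0, 1);

            // remove-if-equal: removes the entry only if it still maps to the given value
            boolean removed = map.remove("hits", 1);

            System.out.println(bumped + " " + removed);   // true true
        }
    }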

Another concurrent collection: CopyOnWriteArray{List,Set}
- Copying cost – but good when the collection is seldom updated
- Cf. "Purely Functional Data Structures" by Chris Okasaki

Lock Striping (11.4.3)
- Have a lock for each independent subset of a collection (e.g. a HashMap with N hash buckets might use N/16 locks)
- Issue: how do you protect the whole collection?
  - Acquire ALL of the locks (recursively!) – see the sketch below
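A rough sketch of lock striping under simple assumptions (16 stripe locks, one flat bucket array standing in for hash chains; all names are illustrative). Per-bucket operations take only their stripe's lock, while a whole-collection operation acquires every stripe lock, here recursively:

    public class StripedHashSketch<K> {
        private static final int N_LOCKS = 16;
        private final Object[] locks = new Object[N_LOCKS];
        private final Object[] buckets;        // each slot would hold a chain of entries

        public StripedHashSketch(int nBuckets) {
            buckets = new Object[nBuckets];
            for (int i = 0; i < N_LOCKS; i++)
                locks[i] = new Object();
        }

        private int bucketFor(K key) {
            return (key.hashCode() & 0x7fffffff) % buckets.length;
        }

        public Object getBucket(K key) {
            int b = bucketFor(key);
            synchronized (locks[b % N_LOCKS]) {    // lock only this bucket's stripe
                return buckets[b];
            }
        }

        public void clear() {
            clearHolding(0);                       // whole-collection operation
        }

        private void clearHolding(int i) {
            if (i == N_LOCKS) {                    // all stripe locks held: safe to touch everything
                for (int b = 0; b < buckets.length; b++)
                    buckets[b] = null;
                return;
            }
            synchronized (locks[i]) {              // acquire ALL of the locks, recursively
                clearHolding(i + 1);
            }
        }
    }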

Producer-consumer pattern
- Originally a program-structuring convenience; now one way to achieve parallelism
- Similar to shell pipes
  - But data objects are typed and structured
- Compare the Visitor pattern
  - For each item in a collection, perform an action
  - Ex: tree walk – the tree walker has state for keeping track of where it is
  - What if the action involves keeping complicated state? (e.g. compression)
- The producer-consumer pattern lets both the producer and the consumer keep state from one item to the next (see the sketch below)
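An illustrative sketch, not from the lecture, of a producer and consumer connected by a bounded BlockingQueue; the poison-pill value used to end the stream is an assumption of this example. Note that the consumer keeps its own state (a running total) across items, which a visitor-style callback could not do as naturally:

    import java.util.concurrent.ArrayBlockingQueue;
    import java.util.concurrent.BlockingQueue;

    public class ProducerConsumerDemo {
        public static void main(String[] args) throws InterruptedException {
            BlockingQueue<Integer> queue = new ArrayBlockingQueue<>(10);

            Thread producer = new Thread(() -> {
                try {
                    for (int i = 1; i <= 100; i++)
                        queue.put(i);          // blocks if the queue is full
                    queue.put(-1);             // poison pill: "no more items"
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();
                }
            });

            Thread consumer = new Thread(() -> {
                long runningTotal = 0;         // consumer-side state kept across items
                try {
                    while (true) {
                        int item = queue.take();   // blocks if the queue is empty
                        if (item < 0) break;
                        runningTotal += item;
                    }
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();
                }
                System.out.println("total = " + runningTotal);
            });

            producer.start();
            consumer.start();
            producer.join();
            consumer.join();
        }
    }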

Producer-Consumer Boundary: Blocking Queue
- Finite capacity
  - Java also has unbounded queues – their use is not recommended
- put() waits if the queue is full
- take() waits if the queue is empty

A hand-rolled bounded blocking queue built from an intrinsic lock with wait()/notify():

    class SimpleBlockingQueue<T> {
        private T[] rep;
        private int head, tail, size;

        @SuppressWarnings("unchecked")
        public SimpleBlockingQueue(int size) {
            rep = (T[]) new Object[size];   // Java does not allow "new T[size]" directly
            head = 0;
            tail = 0;
            this.size = size;
        }

        public synchronized T take() throws InterruptedException {
            while (head == tail)            // empty?
                wait();
            T result = rep[head];
            head = (head + 1) % size;
            notify();
            return result;
        }

        public synchronized void put(T elem) throws InterruptedException {
            while ((tail + 1) % size == head)   // full?
                wait();
            rep[tail] = elem;
            tail = (tail + 1) % size;
            notify();
        }
    }

Questions:
- Are the empty and full conditions correct? Does it work if size==1? (A queue with size==1 is often called a "mailbox".)
- How could you allow concurrent putting and taking? What would be the problem cases?

Extended blocking queue interface: offer and poll
- Don't wait – always return immediately (offer returns false if the queue is full, poll returns null if it is empty)
- Allow clients to decide what to do if blocking would occur
- Note: have to be careful using these in check-then-act situations. Why? (see the sketch below)
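A brief sketch of the non-blocking and timed variants (queue size and item names are illustrative). Note how testing the return value of poll() avoids the check-then-act race that a separate isEmpty() test followed by poll() would have:

    import java.util.concurrent.ArrayBlockingQueue;
    import java.util.concurrent.BlockingQueue;
    import java.util.concurrent.TimeUnit;

    public class OfferPollDemo {
        public static void main(String[] args) throws InterruptedException {
            BlockingQueue<String> queue = new ArrayBlockingQueue<>(2);

            if (!queue.offer("job-1"))               // returns false instead of blocking
                System.out.println("queue full, dropping job-1");

            // timed variant: wait at most 50 ms for space, then give up
            boolean accepted = queue.offer("job-2", 50, TimeUnit.MILLISECONDS);
            System.out.println("job-2 accepted: " + accepted);

            String next = queue.poll();              // null instead of blocking when empty
            if (next != null)
                System.out.println("got " + next);
        }
    }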

Serial thread confinement
- Recall thread confinement – an object is only accessible from a single thread
  - An object accessed from only a single thread does not require synchronization
- Serial thread confinement – after passing an object into a blocking queue, the producer never touches it again
  - The synchronization inside the blocking queue suffices to safely publish the object to the consumer (see the sketch below)
- Other mechanisms for safe publication can also be used with the serial confinement pattern
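An illustrative sketch of serial thread confinement (the Request class is hypothetical): the object itself is not thread-safe, but the producer never touches it after put(), so the queue's internal synchronization safely publishes it and ownership passes to the consumer:

    import java.util.concurrent.BlockingQueue;
    import java.util.concurrent.LinkedBlockingQueue;

    public class HandoffDemo {
        static class Request {               // deliberately not thread-safe
            StringBuilder log = new StringBuilder();
        }

        public static void main(String[] args) throws InterruptedException {
            BlockingQueue<Request> queue = new LinkedBlockingQueue<>();

            Thread producer = new Thread(() -> {
                try {
                    Request r = new Request();
                    r.log.append("created by producer");
                    queue.put(r);            // hand off; producer must not use r afterwards
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();
                }
            });

            Thread consumer = new Thread(() -> {
                try {
                    Request r = queue.take();        // sole owner from here on
                    r.log.append(", handled by consumer");
                    System.out.println(r.log);
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();
                }
            });

            producer.start();
            consumer.start();
            producer.join();
            consumer.join();
        }
    }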

Deques
- Double-ended queues, pronounced "deck"
- Put and take at either end

Cooperative Interruption
- Interruptions occur only when a thread is blocked
  - Interruption is delivered as an InterruptedException
  - This is a nice model
- A thread is free to swallow an InterruptedException (not good practice)
- Contrast with C signals and signal handlers
  - A signal handler can start executing any time the signal is not blocked
  - This is not a nice model

Don't swallow InterruptedException (Fig. 5.10)
- Methods that call things that can raise InterruptedException must either
  - Be declared as throwing InterruptedException, or
  - Catch InterruptedException
    - Clean up local state
    - Thread.currentThread().interrupt()  // restore the interrupt status if InterruptedException cannot be re-raised
  - Or both: re-raise InterruptedException after cleaning up local state
  (see the sketch below)
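A sketch in the spirit of Fig. 5.10 (the class and method names here are illustrative, not necessarily the book's exact code): Runnable.run cannot declare the checked InterruptedException, so the task restores the interrupt status instead of swallowing it:

    import java.util.concurrent.BlockingQueue;

    public class TaskRunnable implements Runnable {
        private final BlockingQueue<String> queue;

        public TaskRunnable(BlockingQueue<String> queue) {
            this.queue = queue;
        }

        @Override
        public void run() {                  // cannot throw a checked exception
            try {
                process(queue.take());
            } catch (InterruptedException e) {
                // clean up local state here if needed, then ...
                Thread.currentThread().interrupt();   // ... restore the interrupt status
            }
        }

        private void process(String item) { /* ... */ }
    }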

Other fun synchronizers – latches
- CountDownLatch – initial non-zero value
  - Blocks threads calling latch.await() until the latch value is 0; once 0, the value never changes – "sticky", a "monotonic property"
  - latch.countDown() reduces the latch value
- Coordinated starting and ending points (see the sketch below)
  - Create N tasks that wait on a startingLatch (initial value 1)
  - Main thread calls startingLatch.countDown(), then endingLatch.await()  // endingLatch initial value N
  - Each of the N threads calls endingLatch.countDown() when it finishes
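A sketch of the coordinated start/end idiom described above (the class, method, and latch names are illustrative): all N worker threads block on startingLatch, are released together, and the main thread waits on endingLatch until every worker has finished.

    import java.util.concurrent.CountDownLatch;

    public class LatchHarness {
        public static long timeTasks(int nThreads, Runnable task) throws InterruptedException {
            CountDownLatch startingLatch = new CountDownLatch(1);
            CountDownLatch endingLatch = new CountDownLatch(nThreads);

            for (int i = 0; i < nThreads; i++) {
                new Thread(() -> {
                    try {
                        startingLatch.await();       // wait for the starting gun
                        try {
                            task.run();
                        } finally {
                            endingLatch.countDown(); // report completion
                        }
                    } catch (InterruptedException e) {
                        Thread.currentThread().interrupt();
                    }
                }).start();
            }

            long start = System.nanoTime();
            startingLatch.countDown();               // release all workers at once
            endingLatch.await();                     // wait until every worker is done
            return System.nanoTime() - start;
        }
    }

A caller would simply do: long nanos = LatchHarness.timeTasks(10, someRunnable);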

FutureTask
- Think of it as a thread body that computes a value
  - Constructed from a Callable<V> (which returns a value) rather than a Runnable
  - th = new Thread(future);  // FutureTask itself implements Runnable
- future.get() returns the result of the execution – complicated by how exceptions are handled; not very elegant to my mind (see the sketch below)
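A small sketch of the FutureTask idiom (the names and the sample computation are illustrative): the task body is a Callable that returns a value, get() blocks until the value is ready, and any exception thrown by the task reappears wrapped in an ExecutionException, which is the awkward part:

    import java.util.concurrent.Callable;
    import java.util.concurrent.ExecutionException;
    import java.util.concurrent.FutureTask;

    public class FutureTaskDemo {
        public static void main(String[] args) throws InterruptedException {
            Callable<Integer> body = () -> {
                Thread.sleep(100);           // stand-in for an expensive computation
                return 6 * 7;
            };
            FutureTask<Integer> future = new FutureTask<>(body);

            Thread th = new Thread(future);  // FutureTask implements Runnable
            th.start();

            try {
                System.out.println("result = " + future.get());   // blocks until done
            } catch (ExecutionException e) {
                // the task's own exception is the cause; unwrapping it is the clumsy part
                System.err.println("task failed: " + e.getCause());
            }
        }
    }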

Semaphores
- Like latches, an integer value (see the sketch below)
- acquire(): if the value is 0, wait; otherwise reduce the value by 1
- release(): increase the value by 1
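A short sketch of java.util.concurrent.Semaphore used to bound concurrency (the permit count and worker loop are illustrative): at most 3 of the 10 worker threads can be inside the guarded section at once.

    import java.util.concurrent.Semaphore;

    public class SemaphoreDemo {
        public static void main(String[] args) {
            Semaphore permits = new Semaphore(3);    // initial value 3

            for (int i = 0; i < 10; i++) {
                final int id = i;
                new Thread(() -> {
                    try {
                        permits.acquire();           // waits if the value is 0
                        try {
                            System.out.println("worker " + id + " holds a permit");
                            Thread.sleep(50);        // stand-in for real work
                        } finally {
                            permits.release();       // always give the permit back
                        }
                    } catch (InterruptedException e) {
                        Thread.currentThread().interrupt();
                    }
                }).start();
            }
        }
    }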

Barriers
- CyclicBarrier – allows N threads to repeatedly all gather at the barrier (see the sketch below)
- Exercise: think about rewriting the example used in Hwk1 to use latches and/or barriers in place of the homegrown barriers that I used
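A short sketch of CyclicBarrier with illustrative thread and phase counts; the optional barrier action runs once per phase, after all parties have arrived, and the barrier is reusable on the next iteration:

    import java.util.concurrent.BrokenBarrierException;
    import java.util.concurrent.CyclicBarrier;

    public class BarrierDemo {
        public static void main(String[] args) {
            final int nThreads = 3;
            CyclicBarrier barrier = new CyclicBarrier(nThreads,
                    () -> System.out.println("--- phase complete ---"));

            for (int i = 0; i < nThreads; i++) {
                final int id = i;
                new Thread(() -> {
                    try {
                        for (int phase = 0; phase < 3; phase++) {
                            System.out.println("worker " + id + " finished phase " + phase);
                            barrier.await();     // gather; the barrier resets for the next phase
                        }
                    } catch (InterruptedException e) {
                        Thread.currentThread().interrupt();
                    } catch (BrokenBarrierException e) {
                        // another thread was interrupted or timed out at the barrier
                    }
                }).start();
            }
        }
    }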

Results cache (5.6)
- Very interesting example; sometimes called a "memo" (memoization) cache
- Illustrates implementation of caching – a common technique to increase performance
  - Idea: if it takes a long time to get an answer, save it for a while in case you need it again
  - Memory caches, disk caches, results caches

Stages in Cache Design
- A cache for concurrent use is tricky
- 1. Obvious: use an intrinsic lock around everything. Problem: a lock held during a long computation limits concurrency
- 2. Use ConcurrentHashMap – better concurrency, but an unsynchronized check-then-act with a long "act" potentially allows unneeded duplicate work
- 3. Cache a FutureTask in the ConcurrentHashMap – still an unsynchronized check-then-act, but now the "act" is just inserting a FutureTask rather than the long, expensive computation
- 4. Use putIfAbsent to eliminate this final race (see the sketch below)
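A sketch of the final stage, in the spirit of the chapter's result cache (the class name ResultCache and the use of java.util.function.Function are choices of this sketch, not the book's exact code): the map stores Futures, and putIfAbsent closes the check-then-act window, so the expensive computation for any given argument is started at most once and later callers simply block on the shared Future.

    import java.util.concurrent.Callable;
    import java.util.concurrent.ConcurrentHashMap;
    import java.util.concurrent.ConcurrentMap;
    import java.util.concurrent.ExecutionException;
    import java.util.concurrent.Future;
    import java.util.concurrent.FutureTask;
    import java.util.function.Function;

    public class ResultCache<A, V> {
        private final ConcurrentMap<A, Future<V>> cache = new ConcurrentHashMap<>();
        private final Function<A, V> compute;

        public ResultCache(Function<A, V> compute) {
            this.compute = compute;
        }

        public V get(final A arg) throws InterruptedException, ExecutionException {
            Future<V> f = cache.get(arg);
            if (f == null) {
                Callable<V> eval = () -> compute.apply(arg);
                FutureTask<V> ft = new FutureTask<>(eval);
                f = cache.putIfAbsent(arg, ft);  // atomic check-then-act
                if (f == null) {                 // we won the race: run the computation
                    f = ft;
                    ft.run();
                }
            }
            return f.get();                      // blocks until the (shared) result is ready
        }
    }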