
1 PARALLEL PROGRAMMING ABSTRACTIONS

2 Tasks vs Threads
Similar but not the same. The layering, bottom to top: hardware processors run operating-system threads, and the task scheduler multiplexes tasks onto those threads.

3 Tasks vs Threads: Main Differences
- Tasks need not run concurrently: the task scheduler may run them on the same thread.

4 Tasks vs Threads: Main Differences
- Tasks need not run concurrently: the task scheduler may run them on the same thread.
- Tasks have no fairness guarantees. Consider two tasks sharing a flag t:

Task 1:  while(t == 0); Write("hello");
Task 2:  t = 1;

If the scheduler runs Task 1 first on its only thread, Task 1 spins forever and Task 2 never gets a chance to set t.
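A runnable sketch of this hazard using the real TPL (class and variable names here are illustrative). On the default .NET thread pool this usually completes, because the pool injects extra threads over time; on a scheduler with a single worker thread, the spinner can hold the thread forever:

using System;
using System.Threading.Tasks;

class FairnessDemo
{
    // Shared flag; volatile so the spin loop re-reads it from memory.
    static volatile int t = 0;

    static void Main()
    {
        // Task 1 spins until Task 2 sets the flag.
        Task spinner = Task.Run(() =>
        {
            while (t == 0) ; // busy-wait: no guarantee it will ever be descheduled
            Console.WriteLine("hello");
        });

        // Task 2 releases the spinner -- but only if it ever gets a thread.
        Task setter = Task.Run(() => { t = 1; });

        Task.WaitAll(spinner, setter);
    }
}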

5 Tasks vs Threads: Main Differences
- Tasks need not run concurrently: the task scheduler may run them on the same thread.
- Tasks have no fairness guarantees.
- Tasks are cheaper than threads: they have smaller stacks and do not pre-allocate OS resources.
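A rough illustrative sketch, not a rigorous benchmark: spawning 100,000 tasks is routine because they all share a small pool of threads, whereas 100,000 OS threads would each reserve their own stack (about 1 MB by default on Windows):

using System;
using System.Diagnostics;
using System.Threading.Tasks;

class CostDemo
{
    static void Main()
    {
        const int N = 100_000;
        var sw = Stopwatch.StartNew();

        // 100k tasks share a handful of pool threads; creating 100k
        // OS threads instead would reserve ~100 GB of stack address space.
        var tasks = new Task[N];
        for (int i = 0; i < N; i++)
            tasks[i] = Task.Run(() => { /* tiny unit of work */ });
        Task.WaitAll(tasks);

        Console.WriteLine($"{N} tasks: {sw.ElapsedMilliseconds} ms");
    }
}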

6 Generating Tasks from Programs
You don't want programmers to explicitly create tasks; that is like programming at the assembly level. Instead:
- Design programming-language constructs that capture the programmer's intent.
- Let the compiler convert these constructs to tasks.
Languages with the following features are very convenient for this purpose:
- Type inference
- Generics
- First-class anonymous functions (lambdas/delegates)

7 First-Class Functions: C# Lambda Expressions
Syntax: (input parameters) => expression
Roughly analogous to taking the address of a function declared elsewhere, &anon_foo, where anon_foo(input parameters) { return expression; }
Examples:
x => x
(x, y) => x == y
(int x, string s) => s.Length > x
() => { return SomeMethod() + 1; }

Func<int, bool> myFunc = x => x == 5;
bool result = myFunc(4);   // false
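A compilable version of the slide's examples (isFive, eq, longer, and greet are illustrative names added here):

using System;

class LambdaDemo
{
    static void Main()
    {
        Func<int, bool> isFive = x => x == 5;            // parameter type inferred
        Func<int, int, bool> eq = (x, y) => x == y;      // multiple parameters
        Func<int, string, bool> longer =
            (int x, string s) => s.Length > x;           // explicit parameter types
        Action greet = () => Console.WriteLine("hi");    // no parameters, no result

        Console.WriteLine(isFive(4));        // False
        Console.WriteLine(eq(3, 3));         // True
        Console.WriteLine(longer(2, "abc")); // True
        greet();
    }
}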

8 Sequential Merge Sort With Lambdas

MergeSort(int[] a, int low, int hi) {
    if (base_case) ...
    int mid = low + (hi - low) / 2;
    Action<int,int> f = (l, h) => { MergeSort(a, l, h); };
    f(low, mid);
    f(mid + 1, hi);
    Merge(a, low, mid, hi);
}
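For reference, a complete runnable version; the base case and the Merge helper, which the slide elides, are filled in here under standard assumptions (inclusive bounds on both ends):

using System;

static class Sorter
{
    // Sorts a[low..hi], bounds inclusive, matching the slides.
    public static void MergeSort(int[] a, int low, int hi)
    {
        if (low >= hi) return;                 // base case (elided on the slide)
        int mid = low + (hi - low) / 2;
        Action<int, int> f = (l, h) => MergeSort(a, l, h);
        f(low, mid);
        f(mid + 1, hi);
        Merge(a, low, mid, hi);
    }

    // Standard two-way merge of sorted runs a[low..mid] and a[mid+1..hi].
    static void Merge(int[] a, int low, int mid, int hi)
    {
        var tmp = new int[hi - low + 1];
        int i = low, j = mid + 1, k = 0;
        while (i <= mid && j <= hi) tmp[k++] = a[i] <= a[j] ? a[i++] : a[j++];
        while (i <= mid) tmp[k++] = a[i++];
        while (j <= hi)  tmp[k++] = a[j++];
        Array.Copy(tmp, 0, a, low, tmp.Length);
    }
}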

9 Things to Know About C# Lambdas
- A lambda is an expression (with no type of its own).
- It is converted to a delegate type at the point of use.
- Parameter types are inferred from that delegate type.
- Free variables are captured: locations referenced by free variables are moved to the heap ("boxed" into a compiler-generated closure object).
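A small sketch of capture (MakeCounter is an illustrative name): the local count below is hoisted to the heap so the returned lambda can keep using it after the method's stack frame is gone:

using System;

class CaptureDemo
{
    static Func<int> MakeCounter()
    {
        int count = 0;           // local, but captured below: the compiler
                                 // hoists it into a heap-allocated closure
                                 // object, so it outlives this call.
        return () => ++count;
    }

    static void Main()
    {
        var next = MakeCounter();
        Console.WriteLine(next()); // 1
        Console.WriteLine(next()); // 2, same captured count
    }
}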

10 The Task Abstraction

delegate void Action();

class Task {
    Task(Action a);
    void Wait();
    // called by the WSQ (work-stealing queue) scheduler
    void Execute();
}
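The class above is a simplified stand-in. With the real System.Threading.Tasks.Task the same pattern, construct, hand to the scheduler, join, looks like this:

using System;
using System.Threading.Tasks;

class TaskDemo
{
    static void Main()
    {
        // The action runs on a scheduler thread; Wait() blocks until done.
        Task t = new Task(() => Console.WriteLine("running on a worker"));
        t.Start();   // hand the task to the scheduler
        t.Wait();    // join
    }
}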

11 Merge Sort With Tasks

MergeSort(int[] a, int low, int hi) {
    if (base_case) ...
    int mid = low + (hi - low) / 2;
    Task left  = new Task( delegate { MergeSort(a, low, mid); } );
    Task right = new Task( delegate { MergeSort(a, mid + 1, hi); } );
    left.Wait();
    right.Wait();
    Merge(a, low, mid, hi);
}

12 Parallel.Invoke
- Invokes all input actions, possibly in parallel.
- Waits for all of them to finish.

static void Invoke(params Action[] actions);
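A minimal usage sketch with the real TPL API; the two actions may run in either order or concurrently, but Invoke returns only after both finish:

using System;
using System.Threading.Tasks;

class InvokeDemo
{
    static void Main()
    {
        Parallel.Invoke(
            () => Console.WriteLine("left half"),
            () => Console.WriteLine("right half"));
        Console.WriteLine("both done"); // always printed last
    }
}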

13 Merge Sort With Parallel.Invoke

MergeSort(int[] a, int low, int hi) {
    if (base_case) ...
    int mid = low + (hi - low) / 2;
    Parallel.Invoke(
        () => { MergeSort(a, low, mid); },
        () => { MergeSort(a, mid + 1, hi); }
    );
    Merge(a, low, mid, hi);
}

14 Compare With the Sequential Version

MergeSort(int[] a, int low, int hi) {
    if (base_case) ...
    int mid = low + (hi - low) / 2;
    { MergeSort(a, low, mid); }
    { MergeSort(a, mid + 1, hi); }
    Merge(a, low, mid, hi);
}

15 Data Parallelism
Sometimes you want to perform the same computation on every element of a collection. For example: for every string in an array, check whether it contains "foo".

16 Parallel.For
- Iterates a variable i from lower (inclusive) to upper (exclusive).
- Calls the delegate with i as the parameter, in parallel.

static void For(int lower, int upper, Action<int> body);

17 Parallel.For

// sequential for
for (int i = 0; i < n; i++) {
    if (a[i].Contains("foo")) { DoSomething(a[i]); }
}

// parallel for
Parallel.For(0, n, (i) => {
    if (a[i].Contains("foo")) { DoSomething(a[i]); }
});
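For reference, the real TPL bounds are fromInclusive/toExclusive, matching the sequential loop above, and an overload takes a ParallelOptions argument to cap the parallelism. The cap of 4 below is an arbitrary illustrative choice:

using System;
using System.Threading.Tasks;

class ForDemo
{
    static void Main()
    {
        string[] a = { "foo", "bar", "foobar", "baz" };

        // Lower bound inclusive, upper bound exclusive.
        Parallel.For(0, a.Length,
            new ParallelOptions { MaxDegreeOfParallelism = 4 }, // optional cap
            i =>
            {
                if (a[i].Contains("foo"))
                    Console.WriteLine(a[i]); // order across iterations is unspecified
            });
    }
}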

18 The DAG Created by Parallel.For

A;
Parallel.For(0, N, m: i => { B; });
C;

Node A has edges to the N independent nodes m(0), m(1), ..., m(N-1), one per iteration of the body B; each of these has an edge to C, which runs only after every iteration finishes.

19 Parallel.ForEach
Same as Parallel.For, but iterates over the elements of a collection in parallel.

static void ForEach<T>(IEnumerable<T> source, Action<T> body);
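A minimal usage sketch; the element type is inferred from the collection, and iteration order is unspecified:

using System;
using System.Collections.Generic;
using System.Threading.Tasks;

class ForEachDemo
{
    static void Main()
    {
        IEnumerable<string> words = new[] { "foo", "bar", "foobar" };

        // Body runs once per element, possibly on several threads at once.
        Parallel.ForEach(words, w =>
        {
            if (w.Contains("foo"))
                Console.WriteLine(w);
        });
    }
}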

20 Advantages of High-Level Abstractions
- Makes it easy to add parallelism, and to explore and experiment with different parallelization strategies.
- The language compiler/runtime can perform performance optimizations: efficiently create tasks, efficiently distribute tasks to the available cores, and integrate tightly with the scheduler.

21 Example: Partitioning on Two Cores [figure]

22 Partitioning on Four Cores [figure]

23 Advantages of High-Level Abstractions
- Makes it easy to add parallelism, and to explore and experiment with different parallelization strategies.
- The language compiler/runtime can perform performance optimizations: efficiently create tasks, efficiently distribute tasks to the available cores, and integrate tightly with the scheduler.
- Provides programmatic features: exceptions, cancelling running tasks.
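A sketch of the last two features with the real TPL API: cancellation is cooperative (the task polls its token), and exceptions thrown inside a task surface at Wait(), wrapped in an AggregateException. The 100 ms timeout is an arbitrary illustrative choice:

using System;
using System.Threading;
using System.Threading.Tasks;

class CancelDemo
{
    static void Main()
    {
        var cts = new CancellationTokenSource();
        cts.CancelAfter(100); // request cancellation after ~100 ms

        Task t = Task.Run(() =>
        {
            while (true)
            {
                cts.Token.ThrowIfCancellationRequested(); // cooperative check
                Thread.Sleep(10);                          // simulated work
            }
        }, cts.Token);

        try
        {
            t.Wait();
        }
        catch (AggregateException ae)
        {
            // Task exceptions are re-thrown at the join point, wrapped:
            Console.WriteLine(ae.InnerException.GetType().Name); // TaskCanceledException
        }
    }
}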

24 Semantics of Parallel Constructs
What does this do?

Parallel.Invoke(
    () => { WriteLine("Hello"); },
    () => { WriteLine("World"); }
);

25 Semantics of Parallel Constructs
What does this do?

Parallel.Invoke(
    () => { WriteLine("Hello"); },
    () => { WriteLine("World"); }
);

Compare with:

{ WriteLine("Hello"); }
{ WriteLine("World"); }

26 Semantics of Parallel Constructs
By writing this program, you are telling the compiler that both outcomes ("Hello World" and "World Hello") are acceptable:

Parallel.Invoke(
    () => { WriteLine("Hello"); },
    () => { WriteLine("World"); }
);

27 Correctness Criteria
Given a DAG, any linearization (topological ordering) of the DAG is an acceptable execution.

28 Correctness Criteria
A simple criterion:
- Every task operates on separate data.
- There are no dependencies other than the edges in the DAG.
We will look at more complex criteria later in the course. A sketch contrasting the two cases follows.
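In this illustrative sketch, the first loop satisfies the criterion because iteration i owns a[i]; the second violates it because every iteration writes one shared location, so different linearizations disagree:

using System;
using System.Threading.Tasks;

class DisjointDemo
{
    static void Main()
    {
        var a = new int[1000];

        // OK: iteration i touches only a[i]; tasks share no data beyond
        // the DAG edges, so any linearization yields the same array.
        Parallel.For(0, a.Length, i => { a[i] = i * i; });

        // NOT OK: all iterations read and write the same location, so
        // concurrent updates can be lost.
        int sum = 0;
        Parallel.For(0, a.Length, i => { sum += a[i]; }); // data race!
        Console.WriteLine(sum); // may print less than the true sum
    }
}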

