UET Multiprocessor Scheduling Problems Nan Zang

Nan Zang Overview of the paper  Introduction  Classification  Complexity results for Pm | prec, p j = 1 | C max  Algorithms for Pk | prec, p j = 1 | C max  Future research directions and conclusions

Nan Zang Scheduling  Scheduling concerns the optimal allocation of scarce resources to activities.  For example, a class scheduling problem allocates  Courses  Classrooms  Teachers

Nan Zang Problem notation (1)  A set of n jobs J = { J 1, J 2, …, J n }. The execution time of job J j is p(J j ).  A set of processors P = { P 1, P 2, … }  Schedule Specifies which job is executed by which processor at what time.  Objective Optimize one or more performance criteria

Nan Zang Problem notation (2) The 3-field notation α | β | γ (Graham et al.)  α describes the processor environment Number of processors, speed, …  β provides the job characteristics release time, precedence constraints, preemption, …  γ represents the objective function to be optimized The finishing time of the last job (makespan) The total waiting time of the jobs

Nan Zang Job Characteristics  Release time (r j ) - earliest time at which job J j can start processing (a job whose release time has passed is called available).  Preemption (prmp) - jobs can be interrupted during processing.  Precedence constraints (prec) - before certain jobs are allowed to start processing, one or more jobs first have to be completed (a job whose predecessors have all completed is called ready).

Nan Zang Precedence constraints (prec) One or more jobs have to be completed before another job is allowed to start processing. Definitions  Successor  Predecessor  Immediate successor  Immediate predecessor  Transitive reduction

Nan Zang Special precedence constraints (1)  In-tree (Out-tree)  In-forest (Out-forest)  Opposing forest  Interval orders  Quasi-interval orders  Over-interval orders  Series-parallel orders  Level orders

Nan Zang Special precedence constraints (2) In-forest Out-tree In-tree Out-forest Opposing forest

Nan Zang UET scheduling problem formal definition Pm | prec, p j = 1 | C max (m≥1)  Processor Environment  m identical processors are in the system.  Job characteristics  Precedence constraints are given by a precedence graph;  Preemption is not allowed;  The release time of all the jobs is 0.  Objective function  C max : the time the last job finishes execution. (If c j denotes the finishing time of J j in a schedule S, then C max = max j c j.)

Nan Zang Gantt Chart A Gantt chart indicates the time each job spends in execution, as well as the processor on which it executes. [Figure: Gantt chart with processors P 1, P 2, P 3 on the vertical axis and the time axis divided into unit slots (Slot 1, Slot 2, …).]
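As a quick illustration (not part of the original slides), a text Gantt chart can be printed directly from a slot-by-slot job assignment; the job names and the three-processor layout below are made up for the example.

```python
# Minimal sketch: print a text Gantt chart for a UET schedule.
# schedule[t] lists the jobs run in time slot t, one entry per processor (None = idle).
schedule = [
    ["J1", "J2", "J3"],   # slot 1 on processors P1, P2, P3
    ["J4", "J5", None],   # slot 2: P3 is idle
]

def print_gantt(schedule):
    for p in range(len(schedule[0])):
        row = [schedule[t][p] or "--" for t in range(len(schedule))]
        print(f"P{p + 1}: " + " | ".join(f"{job:>4}" for job in row))

print_gantt(schedule)
# P1:   J1 |   J4
# P2:   J2 |   J5
# P3:   J3 |   --
```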

Nan Zang Overview of the paper  Introduction  Classification  Complexity results for Pm | prec, p j = 1 | C max  Algorithms for Pk | prec, p j = 1 | C max  Future research directions and conclusions

Nan Zang Classification Due to the number of processors  Number of processors is a variable (m) Pm | prec, p j = 1 | C max  Number of processors is a constant (k) Pk | prec, p j = 1 | C max

Nan Zang Pm| prec, p j = 1 | C max (1) Theorem 1 Pm| prec, p j = 1 | C max is NP-complete. 1. Ullman (1976) 3SAT ≤ Pm | prec, p j = 1 | C max 2. Lenstra and Rinnooy Kan (1978) k-clique ≤ Pm | prec, p j = 1 | C max Corollary 1.1 The problem of determining the existence of a schedule with C max ≤ 3 for the problem Pm | prec, p j = 1 | C max is NP-complete.

Nan Zang Pm| prec, p j = 1 | C max (2) Mayr (1985)  Theorem 2 Pm | p j = 1, SP | C max is NP-complete. SP: Series-parallel  Theorem 3 Pm | p j = 1, OF | C max is NP-complete. OF: Opposing forest

Nan Zang SP and OF  Series-parallel orders Do NOT have a substructure isomorphic to Fig 1.  Opposing-forest orders Are disjoint unions of in-tree orders and out-tree orders. [Fig 1: the forbidden four-vertex substructure (vertices a, b, c, d). Fig 2: an opposing forest.]

Nan Zang Conclusion on Pm| prec, p j = 1 | C max  3SAT is reducible to the corresponding scheduling problem.  m is a function of the number of clauses in the 3SAT problem.  Results and techniques do not hold for the case Pk | prec, p j = 1 | C max

Nan Zang Classification  Number of processors is a variable (m) Pm | prec, p j = 1 | C max  Number of processors is a constant (k) Pk | prec, p j = 1 | C max

Nan Zang Optimal Schedule for Pk | prec, p j = 1 | C max  The complexity of Pk | prec, p j = 1 | C max is open.  8th problem in Garey and Johnson's (1979) open problems list.  One of the three problems in that list that remain unsolved.  If k = 2, P2 | prec, p j = 1 | C max is solvable in polynomial time.  Fujii, Kasami and Ninomiya (1969)  Coffman and Graham (1972)  For any fixed k, when the precedence graph is restricted to certain special forms, Pk | prec, p j = 1 | C max turns out to be solvable in polynomial time.  In-tree, Out-tree, Opposing forest, Interval orders, …

Nan Zang Special precedence constraints  In-tree (Out-tree)  In-forest (Out-forest)  Opposing forest  Interval orders  Quasi-interval orders  Over-interval orders  Level orders

Nan Zang Algorithms for Pk | prec, p j = 1 | C max  List scheduling policies  Graham’s list algorithm  HLF algorithm  MSF algorithm  CG algorithm  FKN algorithm (Matching algorithm)  Merge algorithm

Nan Zang List scheduling policies (1)  Set up a priority list L of jobs.  When a processor is idle, assign the first ready job to the processor and remove it from the list L.
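To make the policy concrete, here is a minimal Python sketch of list scheduling for unit-time jobs under precedence constraints; the function and variable names are my own, and ties are broken purely by list order.

```python
# Minimal sketch of the generic list-scheduling policy for Pk | prec, p_j = 1 | C_max.
# "preds" maps each job to the set of its immediate predecessors; "priority_list"
# is any total order of the jobs. All names here are illustrative, not from the paper.

def list_schedule(preds, priority_list, k):
    done = set()                          # jobs already executed
    remaining = list(priority_list)       # jobs still waiting, in priority order
    schedule = []                         # schedule[t] = jobs executed in time slot t
    while remaining:
        ready = [j for j in remaining if preds[j] <= done]
        if not ready:                     # cannot happen for an acyclic precedence graph
            raise ValueError("precedence constraints contain a cycle")
        slot = ready[:k]                  # first k ready jobs in list order
        schedule.append(slot)
        done |= set(slot)
        remaining = [j for j in remaining if j not in slot]
    return schedule                       # the makespan C_max is len(schedule)

# Example: J1 -> J3 and J2 -> J3 on k = 2 processors gives [["J1", "J2"], ["J3"]].
preds = {"J1": set(), "J2": set(), "J3": {"J1", "J2"}}
print(list_schedule(preds, ["J1", "J2", "J3"], k=2))
```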

Nan Zang Algorithms for Pk | prec, p j = 1 | C max  List scheduling policies  Graham’s list algorithm  HLF algorithm  MSF algorithm  CG algorithm  FKN algorithm (Matching algorithm)  Merge algorithm

Nan Zang Graham's list algorithm  Graham first analyzed the performance of the simplest list scheduling algorithm.  The list scheduling algorithm with an arbitrary job list is called Graham's list algorithm.  Approximation ratio for Pk | prec, p j = 1 | C max : δ = 2 - 1/k. (Tight!) (The approximation ratio is δ if, for each input instance, the makespan produced by the algorithm is at most δ times the optimal makespan.)
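For reference, here is a compact reconstruction of the standard counting argument behind the 2 - 1/k bound (my own summary of the well-known proof idea, not text from the slides); S is an arbitrary list schedule with makespan C, and C* is the optimal makespan.

```latex
% Sketch of the counting argument for Graham's bound in the UET case.
% Split the C unit-length time slots of the list schedule into F full slots and
% N non-full slots, and let I be the total number of idle processor-slots.
\begin{align*}
  kC &= n + I, & I &\le (k-1)\,N,\\
  n  &\le k\,C^{*}, & N &\le \text{length of some chain} \le C^{*}.
\end{align*}
% (Tracing predecessors back from the last job to finish yields a chain such that
%  some job of the chain executes in every non-full slot, so N is at most the
%  chain length, which is at most C*.)
\[
  kC \;\le\; k\,C^{*} + (k-1)\,C^{*}
  \quad\Longrightarrow\quad
  C \;\le\; \Bigl(2 - \tfrac{1}{k}\Bigr) C^{*}.
\]
```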

Nan Zang Algorithms for Pk| prec, p j = 1 | C max  List scheduling policies  Graham’s list algorithm  HLF algorithm  MSF algorithm  CG algorithm  FKN algorithm (Matching algorithm)  Merge algorithm

Nan Zang HLF algorithm (1)  T. C. Hu (1961)  HLF: Highest Level First, also known as the Critical Path algorithm or Hu's algorithm  Algorithm  Assign a level h to each job:  If job j has no successors, h(j) equals 1.  Otherwise, h(j) equals one plus the maximum level of its immediate successors.  Set up a priority list L by nonincreasing order of the jobs' levels.  Execute the list scheduling policy on this level-based priority list L.
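A small sketch of the level computation, assuming the precedence graph is given as an immediate-successor map; the function names and the four-job example are illustrative, not taken from Hu's paper.

```python
# Sketch of Hu's level computation for the HLF priority list.
from functools import lru_cache

# Immediate successors of each job (illustrative example: J1 -> J3 -> J4, J2 -> J3).
succs = {"J1": {"J3"}, "J2": {"J3"}, "J3": {"J4"}, "J4": set()}

@lru_cache(maxsize=None)
def level(job):
    # h(j) = 1 for a job with no successors, otherwise 1 + max level of its
    # immediate successors.
    if not succs[job]:
        return 1
    return 1 + max(level(s) for s in succs[job])

# HLF priority list: jobs in nonincreasing order of level; this list is then fed
# to the generic list-scheduling policy sketched earlier.
hlf_list = sorted(succs, key=level, reverse=True)
print([(j, level(j)) for j in hlf_list])   # [('J1', 3), ('J2', 3), ('J3', 2), ('J4', 1)]
```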

Nan Zang HLF algorithm (2) [Example figure: a precedence graph with its jobs grouped by level (Level 3, Level 2, Level 1) and the resulting HLF schedule.]

Nan Zang HLF algorithm (3)  Time complexity O(|V|+|E|) (|V| is the number of jobs and |E| is the number of edges in the precedence graph)  Theorem 4 (Hu, 1961) The HLF algorithm is optimal for Pk | p j = 1, in-tree (out-tree) | C max.  Corollary 4.1 The HLF algorithm is optimal for Pk | p j = 1, in-forest (out-forest) | C max.

Nan Zang HLF algorithm (4)  N.F. Chen & C.L. Liu (1975) The approximation ratio of HLF algorithm for the problem with general precedence constraints: If k = 2, δ HLF ≤ 4/3. If k ≥ 3, δ HLF ≤ 2 – 1/(k-1). Tight!

Nan Zang Algorithms for Pk | prec, p j = 1 | C max  List scheduling policies  Graham’s list algorithm  HLF algorithm  MSF algorithm  CG algorithm  FKN algorithm (Matching algorithm)  Merge algorithm

Nan Zang MSF algorithm (1)  MSF: Most Successors First  Algorithm:  Set up a priority list L by nonincreasing number of successors (i.e., a job with more successors has a higher priority in L than a job with fewer successors).  Execute the list scheduling policy based on this priority list L.
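A sketch of how the MSF priority list could be built by counting all successors (immediate and transitive); the successor map and job names below are invented for the example.

```python
# Sketch of the MSF priority: rank jobs by their total number of successors.
from functools import lru_cache

# Immediate successors (illustrative example: J1 precedes J2 and J3, both precede J4).
succs = {"J1": {"J2", "J3"}, "J2": {"J4"}, "J3": {"J4"}, "J4": set()}

@lru_cache(maxsize=None)
def all_successors(job):
    result = set()
    for s in succs[job]:
        result |= {s} | all_successors(s)
    return frozenset(result)

# MSF priority list: nonincreasing number of successors, ties broken arbitrarily;
# the list is then handed to the generic list-scheduling policy.
msf_list = sorted(succs, key=lambda j: len(all_successors(j)), reverse=True)
print([(j, len(all_successors(j))) for j in msf_list])  # [('J1', 3), ('J2', 1), ('J3', 1), ('J4', 0)]
```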

Nan Zang MSF algorithm (2)

Nan Zang MSF algorithm (3)  Time complexity O(|V|+|E|)  Theorem 5 ( Papadimitriou and Yannakakis, 1979 ) The MSF algorithm is optimal for Pk | p j = 1, interval | C max.  Theorem 6 (Moukrim, 1999) The MSF algorithm is optimal for Pk | p j = 1, quasi-interval | C max.

Nan Zang Special precedence constraints  Interval orders Do NOT have a substructure isomorphic to Fig 1.  Quasi-interval orders Do NOT have a substructure isomorphic to Type I, II or III. [Fig 1: forbidden four-vertex substructure (vertices a, b, c, d). Types I–III: forbidden five-vertex substructures (vertices a, b, c, d, e).]

Nan Zang MSF algorithm (4)  Ibarra & Kim (1976) The performance of MSF algorithm: If k = 2, δ MSF ≤ 4/3, and this bound is tight.  If k ≥ 3, no tight bound is known. δ MSF is at least 2-1/(k+1).

Nan Zang MSF algorithm (5)

Nan Zang Algorithms for Pk | prec, p j = 1 | C max  List scheduling policies  Graham’s list algorithm  HLF algorithm  MSF algorithm  CG algorithm  FKN algorithm (Matching algorithm)  Merge algorithm

Nan Zang CG algorithm (1)  Coffman and Graham (1972)  An optimal algorithm for P2 | prec, p j = 1 | C max.  Best approximation algorithm known for Pk | prec, p j = 1 | C max, where k ≥ 3.

Nan Zang CG algorithm (2)  Definitions –Let IS(J j ) denote the set of immediate successors of J j. –Let label(J j ) be the integer label assigned to J j. –A job is ready to label if all its immediate successors are labeled and it has not been labeled yet. –N(J j ) denotes the decreasing sequence of integers formed by ordering the set { label(J i ) | J i  IS(J j ) }.

Nan Zang CG algorithm (3) 1. Assign a label to each job:  Choose an arbitrary task J k such that IS(J k ) = ∅, and define label(J k ) to be 1  for i ← 2 to n do  Let R be the set of jobs that are ready to label.  Let J* be the task in R such that N(J*) is lexicographically smaller than N(J) for all other J in R.  Let label(J*) ← i  end for 2. Construct a list of jobs L = { J n, J n-1, …, J 2, J 1 } according to the decreasing order of the labels of the jobs. 3. Execute the list scheduling policy on this priority list L.
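A Python sketch of this labeling step, assuming the precedence graph is given as an immediate-successor map; the helper names and the small example instance are mine, not the paper's.

```python
# Sketch of Coffman-Graham labeling (illustrative reconstruction).
# "isuccs" maps each job to the set of its immediate successors.

def cg_labels(isuccs):
    label = {}

    def N(j):
        # Labels of the immediate successors of j, as a decreasing sequence.
        return sorted((label[s] for s in isuccs[j]), reverse=True)

    for i in range(1, len(isuccs) + 1):
        # A job is ready to label when all its immediate successors are labeled.
        ready = [j for j in isuccs
                 if j not in label and all(s in label for s in isuccs[j])]
        # Pick the ready job whose N(j) is lexicographically smallest.
        label[min(ready, key=N)] = i
    return label

isuccs = {"J1": {"J3"}, "J2": {"J3", "J4"}, "J3": set(), "J4": set()}
label = cg_labels(isuccs)
# CG priority list: decreasing order of label, then run the list scheduling policy.
cg_list = sorted(label, key=label.get, reverse=True)
print(label, cg_list)   # e.g. {'J3': 1, 'J4': 2, 'J1': 3, 'J2': 4} and ['J2', 'J1', 'J4', 'J3']
```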

Nan Zang CG algorithm (4) [Example figure: labeling of a 12-job instance, with N(J 9 ) = (1), N(J 8 ) = (7,6,5,4,3,2) and N(J 11 ) = N(J 12 ) = (8).]

Nan Zang CG algorithm (4)

Nan Zang CG algorithm (5)  Time complexity O(|V|+|E|)  Theorem 7 (Coffman and Graham, 1972) The CG algorithm is optimal for P2 | prec, p j = 1 | C max.  Theorem 8 (Moukrim, 2005) The CG algorithm is optimal for Pk | p j = 1, over-interval | C max.

Nan Zang Special precedence constraints  Quasi-interval orders Do NOT have a substructure isomorphic to Type I, II or III.  Over-interval orders Do NOT have a substructure isomorphic to Type I or II. [Types I–III: forbidden five-vertex substructures (vertices a, b, c, d, e).]

Nan Zang CG algorithm (6) The performance of the CG algorithm when k ≥ 3  Lam and Sethi (1978) δ CG ≤ 2 – 2/k  Braschi and Trystram (1994) C max (S) ≤ (2 – 2/k) C max (S*) – (k – 2 – odd(k))/k (tight!) Note: S is a CG schedule; S* is an optimal schedule. If k is odd, odd(k) := 1; otherwise, odd(k) := 0.

Nan Zang List Scheduling Policy Conclusions  List scheduling policies  Graham's list algorithm  HLF algorithm  MSF algorithm  CG algorithm  Properties  easy to implement  can be extended to the problem Pm | prec, p j = 1 | C max  Research direction:  Allow priority lists to depend on the number k of processors.

Nan Zang Algorithms for Pk | prec, p j = 1 | C max  List scheduling policy algorithm  Graham’s list algorithm  HLF algorithm  MSF algorithm  CG algorithm  FKN algorithm (Matching algorithm)  Merge algorithm

Nan Zang FKN algorithm (1)  Fujii, Kasami and Ninomiya (1969)  First optimal algorithm for P2 | prec, p j = 1 | C max.  Basic Idea  Find a minimum partition of the jobs into sets of size at most two, such that the pair of jobs in the same set can be executed together.  Make a valid schedule according to a particular order of the partition. (Some clever swap work needed!)  The length of the resulting schedule = the number of sets in the minimum partition.  The minimum partition can be found by a maximum matching algorithm.
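To illustrate the matching connection, the sketch below computes the optimal two-processor makespan as n minus the size of a maximum matching in the incomparability graph (pairs of jobs with no precedence relation between them). It uses the third-party networkx package, the example instance is invented, and the swapping step needed to turn the matching into an actual schedule is omitted.

```python
# Sketch of the FKN value computation for P2 | prec, p_j = 1 | C_max via maximum matching.
import networkx as nx

# Precedence DAG: edge (u, v) means u must finish before v starts (illustrative instance).
prec = nx.DiGraph([("J1", "J3"), ("J2", "J3"), ("J2", "J4")])
prec.add_node("J5")                                # an isolated job
closure = nx.transitive_closure(prec)

# Two jobs are compatible (may share a time slot) iff neither precedes the other.
jobs = list(prec.nodes)
compat = nx.Graph()
compat.add_nodes_from(jobs)
for i, u in enumerate(jobs):
    for v in jobs[i + 1:]:
        if not closure.has_edge(u, v) and not closure.has_edge(v, u):
            compat.add_edge(u, v)

matching = nx.max_weight_matching(compat, maxcardinality=True)
print("optimal makespan on 2 processors:", len(jobs) - len(matching))   # 5 - 2 = 3
```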

Nan Zang FKN algorithm (2)  Hard to extend!  The FKN algorithm cannot be extended to k = 3 directly. The minimal partition is {J 1, J 5, J 6 } {J 4, J 2, J 3 } with |P| = 2. However, the optimal C max corresponds to the partition {J 1, J 4 } {J 2, J 3, J 5 } {J 6 }.

Nan Zang Algorithms for Pk | prec, p j = 1 | C max  List scheduling policies  Graham’s list algorithm  HLF algorithm  MSF algorithm  CG algorithm  FKN algorithm (Matching algorithm)  Merge algorithm

Nan Zang Merge Algorithm (1)  Dolev and Warmuth (1985)  Required input: an optimal schedule S for the high-graph H(G).  The Merge algorithm shows how to "merge" the known optimal schedule S with the remaining jobs.  Produces an optimal schedule for the whole job set G.

Nan Zang Merge Algorithm (2) Definitions  height h(G): the highest level of the vertices in G.  median μ(G): the height of the kth highest component, plus 1.  high-graph H(G): the subgraph of G made up of all the components which are strictly higher than the median.  low-graph L(G): the remaining subgraph of G, i.e. G except for H(G).
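A sketch of these definitions in code, assuming the precedence graph is stored as a networkx DAG; the decomposition below follows the slide's wording literally, and the function name, the padding rule for fewer than k components, and the example graph are my own assumptions.

```python
# Sketch of the H(G) / L(G) decomposition used by the Merge algorithm.
import networkx as nx

def decompose(G, k):
    """Split the precedence DAG G into its high-graph and low-graph for k processors."""
    def height(nodes):
        # Height of a component = number of vertices on its longest chain.
        return nx.dag_longest_path_length(G.subgraph(nodes)) + 1

    comps = [set(c) for c in nx.weakly_connected_components(G)]
    heights = sorted((height(c) for c in comps), reverse=True)
    # Median: height of the k-th highest component plus one (treated as 0 + 1 if
    # there are fewer than k components -- an assumption of this sketch).
    mu = (heights[k - 1] if k - 1 < len(heights) else 0) + 1
    high = [c for c in comps if height(c) > mu]    # components strictly higher than the median
    low = [c for c in comps if height(c) <= mu]    # everything else
    return high, low, mu

# Example: a chain of height 5, a chain of height 2 and an isolated job, with k = 2.
G = nx.DiGraph([("J1", "J2"), ("J2", "J3"), ("J3", "J4"), ("J4", "J5"), ("J6", "J7")])
G.add_node("J8")
print(decompose(G, k=2))   # H(G) = the height-5 chain, L(G) = the rest, mu = 3
```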

Nan Zang Merge Algorithm (3) [Example figure: a precedence graph on jobs J 1 –J 16 with four components C 1 –C 4 of heights h(C 1 ) = 5, h(C 2 ) = 3, h(C 3 ) = 3, h(C 4 ) = 2. With k = 3, the 3rd highest height is 3, so μ(G) = 4; H(G) consists of the components strictly higher than the median and L(G) of the rest.]

Nan Zang Merge Algorithm (4) Idea of the Merge Algorithm  If there is an idle period in S, fill it with a highest initial vertex of L(G), and remove that vertex from L(G).  Similar to the HLF algorithm.

Nan Zang Merge Algorithm (5) [Example figure: S, an optimal schedule for H(G), is merged with the jobs of L(G) to produce the schedule S' output by the Merge algorithm.]

Nan Zang Merge Algorithm (6) Theorem 10 (Reduction theorem) Let G be a precedence graph and S be an optimal schedule for H(G). Then, the Merge algorithm finds an optimal schedule for the whole graph G in time and space O(|V|+|E|). Corollary 10 If H(G) is empty, then HLF is optimal for G.

Nan Zang Merge Algorithm (7) Why is the Merge algorithm useful? It reduces the whole problem to finding an optimal schedule for the subgraph H(G), which contains fewer than k components. How to use it: 1. when H(G) is easy to solve; 2. when every closed subgraph of G can be classified into a polynomial number of classes. Time complexities obtained this way (Dolev and Warmuth): Level orders: O(n^(k-1)); Opposing forest: O(n^(2k-2) log n); Bounded height h: O(n^(h(k-1))).

Nan Zang Algorithms for Pk | prec, p j = 1 | C max  List scheduling policies  Graham’s list algorithm  HLF algorithm  MSF algorithm  CG algorithm  FKN algorithm (Matching algorithm)  Merge algorithm

Nan Zang Main results known Summary of the algorithms (Opt: an optimal solution is obtained in polynomial time):  Graham's list algorithm — approximation ratio 3/2 for k = 2 and 2 - 1/k in general.  HLF — 4/3 for k = 2 and 2 - 1/(k-1) for k ≥ 3; Opt for in-trees, out-trees and in/out-forests.  MSF — 4/3 for k = 2 and at least 2 - 1/(k+1) for k ≥ 3 (no tight bound known); Opt for interval and quasi-interval orders.  CG — Opt for k = 2 and ≈ 2 - 2/k for k ≥ 3; Opt for over-interval orders.  FKN — Opt for k = 2.  Merge — Opt for opposing forests, level orders and bounded-height orders.

Nan Zang Overview of the paper  Introduction  Classification  Complexity result for Pm | prec, p j = 1 | C max  Algorithms for Pk | prec, p j = 1 | C max  Future research directions and conclusions

Nan Zang Future research directions (1)  Finding a new class of orders which can be solved by known algorithms or their generalizations.  over-interval (Moukrim, 2005)  Find an algorithm with a better approximation ratio for the UET scheduling problem.  CG algorithm (1972)  Ranade (2003) special precedence constraints (loosely connected task graphs ) δ = 1.875

Nan Zang Future research directions (2)  Use the UET multiprocessor scheduling algorithms to solve other related scheduling problems  The CG algorithm is optimal for P2 | r j, p j = 1 | C max.  The HLF algorithm is optimal for P | r j, p j = 1, out-tree | C max. (Huo and Leung, 2005)  The MSF algorithm is optimal for P | p j = 1, cm=1, quasi-interval | C max. (Moukrim, 2003)  The CG algorithm is optimal for P2 | prmp, p j = 1 | C max and for ∑C j (Coffman, Sethuraman and Timkovsky, 2003)  The most challenging research task:  Solve the famous open problem Pk | prec, p j = 1 | C max (k ≥ 3).

Thank You!