
1 Single Machine Scheduling Problem Lesson 5

2 Maximum Lateness and Related Criteria. Problem 1|r_j|L_max is NP-hard.

3 1|r_j|L_max. Polynomially solvable cases: (1) r_j = r for all j = 1, ..., n. Jackson's rule: schedule the jobs in order of nondecreasing due dates. (2) d_j = d for all j = 1, ..., n. Schedule the jobs in order of nondecreasing release dates. (3) p_j = 1 for all j = 1, ..., n. Horn's rule: at any time, schedule an available job with the smallest due date. The correctness of all these rules is easy to prove by interchange arguments.
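
As a concrete illustration of Horn's rule for the unit-time case, here is a minimal Python sketch; the function name and the sample data are made up for illustration.

```python
import heapq

def horn_schedule(release, due):
    """Horn's rule for 1 | p_j = 1; r_j | L_max:
    at every decision time, run an available job with the smallest due date.
    Returns (start_times, L_max)."""
    n = len(release)
    jobs = sorted(range(n), key=lambda j: release[j])
    available = []                        # heap of (due date, job)
    start = [None] * n
    t, k, lmax = 0, 0, float("-inf")
    while k < n or available:
        if not available:                 # idle until the next release
            t = max(t, release[jobs[k]])
        while k < n and release[jobs[k]] <= t:
            heapq.heappush(available, (due[jobs[k]], jobs[k]))
            k += 1
        d, j = heapq.heappop(available)   # smallest due date first
        start[j] = t
        lmax = max(lmax, t + 1 - d)       # unit processing time
        t += 1
    return start, lmax

# Hypothetical data: release dates and due dates for four unit-time jobs.
print(horn_schedule([0, 0, 2, 3], [5, 2, 4, 4]))
```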

4 Precedence relations. The previous results extend to the corresponding problems with precedence relations between the jobs. In the case d_j = d we have to modify the release dates before applying the corresponding rule; the other cases require a similar modification of the due dates.

5 Modification of Release Dates. If i → j and r_i + p_i > r_j, job j cannot start before i is finished, so we may replace r_j by r'_j = r_i + p_i. Applying this along a chain i → j → k gives r'_k = r_i + p_i + p_j.

6 Modification of Due Dates. If i → j and d_j − p_j < d_i, we may replace d_i by d'_i = d_j − p_j. Since j cannot complete before i has finished, C_i ≤ C_j − p_j, so L'_i = C_i − d'_i ≤ C_j − d_j = L_j; if j is processed immediately after i, then L'_i = L_j. Hence the modification does not change the optimal L_max value.
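
Both modifications can be computed in one topological pass over the precedence graph. A minimal sketch, assuming jobs 0..n-1 and an arc list; the function name and data layout are illustrative only.

```python
from collections import defaultdict, deque

def modify_dates(n, arcs, r, p, d):
    """Propagate precedence constraints into the data, as described above.
    arcs = [(i, j), ...] means i -> j.  Returns (modified release dates,
    modified due dates)."""
    succ, pred, indeg = defaultdict(list), defaultdict(list), [0] * n
    for i, j in arcs:
        succ[i].append(j)
        pred[j].append(i)
        indeg[j] += 1
    topo, queue = [], deque(v for v in range(n) if indeg[v] == 0)
    while queue:
        v = queue.popleft()
        topo.append(v)
        for w in succ[v]:
            indeg[w] -= 1
            if indeg[w] == 0:
                queue.append(w)
    r2, d2 = list(r), list(d)
    for j in topo:                       # forward pass: r'_j = max(r_j, r'_i + p_i)
        for i in pred[j]:
            r2[j] = max(r2[j], r2[i] + p[i])
    for i in reversed(topo):             # backward pass: d'_i = min(d_i, d'_j - p_j)
        for j in succ[i]:
            d2[i] = min(d2[i], d2[j] - p[j])
    return r2, d2

# Hypothetical chain 0 -> 1 -> 2.
print(modify_dates(3, [(0, 1), (1, 2)], [0, 0, 0], [2, 3, 1], [10, 4, 6]))
```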

7 1|prec; p_j = 1; r_j|L_max. Again, we can prove by interchange arguments that, after modifying the release times and due dates, the above scheduling rules provide optimal schedules. We give a proof only for problem 1|prec; p_j = 1; r_j|L_max; the other proofs are similar.

8 1|prec; p_j = 1; r_j|L_max. Theorem 4.3. The schedule constructed by Horn's rule with modified due dates is optimal. Horn's rule: at any time, schedule an available job with the smallest due date.

9 Active schedule. A schedule is called active if no job can be scheduled earlier without violating the constraints. In an active schedule, each job starts at a release time or at the finishing time of some job. There exists an optimal schedule which is active. Consider an optimal active schedule S* which coincides as long as possible with the schedule S constructed by Horn's rule.

10 Proof of Theorem 4.3. Let t be the first time at which the schedules differ: S starts job i and S* starts a different job j. Then r_i, r_j ≤ t, and Horn's rule gives d_i ≤ d_j; by the due-date modification, d_i ≤ d_j ≤ d_{i_ν} for every successor i_ν of j. Let i_1, ..., i_l be the successors of j scheduled in S* before job i. Construct S' from S*: schedule i at time t, let j take the old slot of i_1, let i_ν take the old slot of i_{ν+1} (ν = 1, ..., l − 1), let i_l take the old slot of i, and leave all other jobs unchanged. S' is feasible, every moved job finishes no later than the old completion time of i, and every moved job has (modified) due date at least d_i, so L_max(S') ≤ L_max(S*). Hence S' is optimal and coincides with S longer than S* does, contradicting the choice of S*.

11 1|prec; pmtn; r_j|L_max. Earliest Due Date rule (EDD rule): start scheduling at the smallest r_j-value. At each decision point t, given by a release time or by the finishing time of some job, schedule a job j with the following properties: r_j ≤ t, all predecessors of j are already scheduled, and j has the smallest modified due date.
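
A minimal sketch of the preemptive EDD rule, assuming the precedence constraints have already been folded into modified release and due dates; the function name and sample data are made up.

```python
import heapq

def preemptive_edd(release, proc, due):
    """Preemptive EDD rule for 1 | pmtn; r_j | L_max (a sketch).
    Returns (list of processing pieces (start, end, job), L_max)."""
    n = len(release)
    order = sorted(range(n), key=lambda j: release[j])   # jobs by release date
    rem = list(proc)                                     # remaining work
    avail = []                                           # heap of (due date, job)
    pieces, lmax = [], float("-inf")
    t, k = release[order[0]], 0
    while k < n or avail:
        while k < n and release[order[k]] <= t:          # jobs released by time t
            heapq.heappush(avail, (due[order[k]], order[k]))
            k += 1
        if not avail:                                    # idle until next release
            t = release[order[k]]
            continue
        d, j = heapq.heappop(avail)                      # smallest due date
        next_r = release[order[k]] if k < n else float("inf")
        run = min(rem[j], next_r - t)                    # run until done or preempted
        pieces.append((t, t + run, j))
        rem[j] -= run
        t += run
        if rem[j] > 0:
            heapq.heappush(avail, (d, j))                # preempted, back into the pool
        else:
            lmax = max(lmax, t - d)
    return pieces, lmax

# Hypothetical instance with three jobs.
print(preemptive_edd([0, 1, 3], [4, 2, 1], [8, 3, 5]))
```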

12 1|prec; pmtn; r_j|L_max. Theorem 4.4. The schedule constructed by the EDD rule is optimal for problem 1|prec; pmtn; r_j|L_max.

13 Proof of Theorem 4.4. Assume that both schedules coincide until time t, at which S processes job i while S* processes a different job j. Again r_i, r_j ≤ t, and the EDD rule together with the modified due dates gives d_i ≤ d_j ≤ d_{i_ν} for all successors i_1, ..., i_l of j. Rearrange S* after time t: process i in the earliest intervals that S* devoted to i, j and these successors, and push the pieces of j and of i_1, ..., i_l correspondingly later, keeping their relative order. Every rescheduled job finishes no later than the old completion time of i and has due date at least d_i, so L_max does not increase. The resulting optimal schedule S' agrees with S beyond time t, contradicting the choice of S*.

14 Head-Tail Problem. Each job j has a head r_j (the time before which it cannot start), a processing time p_j, and a tail q_j: after completing at time C_j the job still needs q_j further time units, and the objective is to minimize max_j (C_j + q_j). Setting q_j = K − d_j for a large constant K turns an L_max problem into an equivalent head-tail problem.

15 Largest Tail Rule. Corollary 4.5. A preemptive schedule for the one-machine head-tail problem with precedence constraints can be constructed in O(n^2) time using the following rule: at each time t given by a head or by the finishing time of some job, schedule a precedence-feasible job j with r_j ≤ t which has the largest tail.
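
Since a tail behaves like a negated due date (max_j (C_j + q_j) equals max_j (C_j − d_j) with d_j := −q_j), the largest-tail rule can be expressed through the preemptive EDD sketch given above; a small illustrative wrapper, assuming that hypothetical preemptive_edd function.

```python
def largest_tail_preemptive(release, proc, tails):
    """Largest-tail rule for the preemptive head-tail problem, expressed as
    preemptive EDD with due dates d_j = -q_j (so the smallest due date is the
    largest tail).  Returns the pieces and max_j (C_j + q_j)."""
    pieces, objective = preemptive_edd(release, proc, [-q for q in tails])
    return pieces, objective   # C_j - (-q_j) = C_j + q_j
```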

16 1|prec; p_j = 1|L_max. The first step is to modify the due dates so that they are compatible with the precedence relations. Additionally, we may assume that the smallest modified due date is zero (shifting all due dates by a common constant changes every lateness by the same constant); then all modified due dates are nonnegative and L_max ≥ 0. Using the modified due dates d_j, an optimal schedule can now be calculated in O(n) time.

17 Two ideas. First, the jobs are processed in [0, n]; hence no job j with d_j ≥ n is late, even if it is processed last, and because L_max ≥ 0 these jobs have no influence on the L_max value. Second, to sort the jobs in linear time we may use bucket sorting: for each value k = 0, 1, ..., n − 1 we construct the set of jobs with (integral) modified due date k, put all jobs with d_j ≥ n into one further set, and output the jobs bucket by bucket in order of increasing due date.
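
A sketch of the whole procedure, assuming integer due dates and the nonnegativity normalization discussed on the previous slide; names, data layout, and the sample instance are illustrative.

```python
from collections import defaultdict, deque

def unit_time_prec_lmax(n, arcs, due):
    """Sketch of the O(n + e) approach to 1 | prec; p_j = 1 | L_max with integer
    due dates.  arcs = [(i, j)] means i -> j.  Due dates are first capped at n
    (a job with d_j >= n finishes by time n and is never late, and L_max >= 0 is
    assumed), then made compatible with the precedences, then bucket-sorted."""
    succ, indeg = defaultdict(list), [0] * n
    for i, j in arcs:
        succ[i].append(j)
        indeg[j] += 1
    topo, queue = [], deque(v for v in range(n) if indeg[v] == 0)
    while queue:
        v = queue.popleft()
        topo.append(v)
        for w in succ[v]:
            indeg[w] -= 1
            if indeg[w] == 0:
                queue.append(w)
    d = [min(due[j], n) for j in range(n)]          # capping at n
    for i in reversed(topo):                        # modification: d_i < d_j for i -> j
        for j in succ[i]:
            d[i] = min(d[i], d[j] - 1)
    assert min(d) >= 0, "the slide assumes nonnegative modified due dates"
    buckets = [[] for _ in range(n + 1)]            # now 0 <= d_j <= n
    for j in range(n):
        buckets[d[j]].append(j)                     # no precedences within a bucket
    seq = [job for bucket in buckets for job in bucket]
    lmax = max(pos + 1 - due[j] for pos, j in enumerate(seq))   # original due dates
    return seq, lmax

# Hypothetical instance: five unit-time jobs, chain 0 -> 1 -> 2.
print(unit_time_prec_lmax(5, [(0, 1), (1, 2)], [4, 4, 4, 1, 9]))
```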

18 1|tree|Σ w_j C_j. We have to schedule jobs with arbitrary processing times on a single machine so that the weighted sum of completion times is minimized. The processing times are assumed to be positive, and the precedence constraints are given by a tree. We first assume that the tree is an outtree, i.e. each node has at most one predecessor. Before presenting an algorithm for outtrees, we prove some basic properties of optimal schedules.

19 Notation. For each job i = 1, ..., n define q_i = w_i/p_i, and let S(i) be the set of (not necessarily immediate) successors of i, including i itself. For a set of jobs I ⊆ {1, ..., n} define w(I) = Σ_{i∈I} w_i, p(I) = Σ_{i∈I} p_i and q(I) = w(I)/p(I). Two subsets I, J ⊆ {1, ..., n} are parallel (I ~ J) if, for all i ∈ I and j ∈ J, neither i is a successor of j nor vice versa; in particular, parallel sets are disjoint. In the case {i} ~ {j} we simply write i ~ j.

20 Property of Optimal Schedules (1). Lemma 4.6. Let π be an optimal sequence and let I, J be two blocks (sets of jobs processed consecutively) of π such that I is scheduled immediately before J. Let π' be the sequence obtained from π by swapping I and J. Then (a) I ~ J implies q(I) ≥ q(J); (b) if I ~ J and q(I) = q(J), then π' is also optimal.

21 Proof of Lemma 4.6 (a). Let f := Σ w_j C_j. Swapping the adjacent blocks I and J shifts every job of I by p(J) time units to the right and every job of J by p(I) time units to the left, while all other completion times are unchanged, so f(π') − f(π) = w(I)p(J) − w(J)p(I). Since π is optimal, f(π) ≤ f(π'), hence 0 ≤ w(I)p(J) − w(J)p(I), which gives q(I) = w(I)/p(I) ≥ w(J)/p(J) = q(J).

22 Proof of Lemma 4.6 (b). If I ~ J and q(I) = q(J), then w(I)p(J) = w(J)p(I), so f(π') − f(π) = w(I)p(J) − w(J)p(I) = 0, i.e. f(π') = f(π) and π' is also optimal.

23 Property of Optimal Schedules (2). Theorem 4.7. Let i, j be jobs with i → j and q_j = max{q_k | k ∈ S(i)}. Then there exists an optimal schedule in which i is processed immediately before j.

24 Proof of Theorem 4.7. Each schedule can be represented by a sequence. Let π be an optimal sequence in which the number l of jobs scheduled between i and j is minimal, and assume l > 0. Let k be the job scheduled immediately before j. We distinguish two cases.

25 Case 1: k ∈ S(i). Since the precedence graph is an outtree, the only predecessors of j are i and the predecessors of i, so j is not a successor of k; and k is not a successor of j, since k is scheduled before j. Hence k ~ j, and Lemma 4.6(a) applied to the optimal sequence π gives q(k) ≥ q(j). On the other hand, k ∈ S(i) and q_j = max{q_k | k ∈ S(i)} give q(k) ≤ q(j), so q(k) = q(j). By Lemma 4.6(b) we may swap k and j and obtain an optimal sequence with fewer jobs between i and j, contradicting the choice of π.

26 Case 2: k ∉ S(i). Let h be the last job scheduled between i and j with h ∈ S(i) (h = i if no such job exists), and let K be the block of jobs scheduled between h and j; by the choice of h, r ∉ S(i) for all r ∈ K. Because i → j and the precedence graph is an outtree, every predecessor of j is either i or a predecessor of i, so no r ∈ K is a predecessor of j; and no r ∈ K is a successor of j, since r is scheduled before j. Hence K ~ j, and Lemma 4.6(a) gives q(K) ≥ q(j).

27 Case 2: k ∉ S(i), continued. Similarly h ~ K: no r ∈ K is a successor of h (otherwise r ∈ S(h) ⊆ S(i)), and h is not a successor of any r ∈ K, since h is scheduled before K. Lemma 4.6(a) then gives q(h) ≥ q(K) ≥ q(j), while h ∈ S(i) and q_j = max{q_k | k ∈ S(i)} give q(h) ≤ q(j). Hence q(K) = q(j), and by Lemma 4.6(b) swapping block K and job j yields an optimal sequence π' with fewer jobs between i and j, again contradicting the choice of π.

28 Idea of Algorithm. The conditions of Theorem 4.7 are satisfied if we choose a job j different from the root with maximal q_j-value, together with its unique father i. Since there exists an optimal schedule in which i is processed immediately before j, we merge nodes i and j and make all sons of j additional sons of i. The new node i, which represents the subsequence π_i: i, j, gets the label q(i) := q(J_i) with J_i = {i, j}. Note that for a son of j, its new father i (represented by J_i) can be identified by looking for the set J_i which contains j.

29 Merging Procedure. The merging process is applied recursively. In the general step, each node i of the current tree represents a set of jobs J_i and a corresponding sequence π_i of the jobs in J_i, where i is the first job of this sequence. We select a node j different from the root with maximal q(j)-value. Let f be the unique father of j in the original outtree. We then find the node i of the current tree with f ∈ J_i and merge j into i, replacing J_i and π_i by J_i ∪ J_j and π_i ∘ π_j, where π_i ∘ π_j is the concatenation of the sequences π_i and π_j.
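
A straightforward O(n^2) sketch of the merging procedure, assuming the outtree is given by a father array; all names and the sample instance are illustrative, not the slides' notation.

```python
def outtree_wsum(parent, p, w):
    """Merging procedure for 1 | outtree | sum w_j C_j (a sketch following the
    slides).  parent[j] is the father of job j, parent[root] = None.
    Jobs are numbered 0..n-1.  Returns (sequence, objective value)."""
    n = len(p)
    root = parent.index(None)
    group_of = list(range(n))            # job -> current group (representative)
    seq = {i: [i] for i in range(n)}     # group -> its job sequence pi_i
    wsum = {i: w[i] for i in range(n)}   # group -> w(J_i)
    psum = {i: p[i] for i in range(n)}   # group -> p(J_i)
    active = set(range(n))               # groups that still exist
    while len(active) > 1:
        # group j != root group with maximal q(J_j) = w(J_j) / p(J_j)
        j = max((g for g in active if g != group_of[root]),
                key=lambda g: wsum[g] / psum[g])
        i = group_of[parent[j]]          # group containing the father of job j
        seq[i] += seq[j]                 # concatenate pi_i and pi_j
        wsum[i] += wsum[j]
        psum[i] += psum[j]
        for job in seq[j]:
            group_of[job] = i
        active.remove(j)
    order = seq[group_of[root]]
    t, obj = 0, 0
    for job in order:                    # evaluate sum w_j C_j
        t += p[job]
        obj += w[job] * t
    return order, obj

# Hypothetical outtree: 0 is the root, 0 -> 1, 0 -> 2, 1 -> 3.
print(outtree_wsum([None, 0, 0, 1], [2, 1, 3, 1], [1, 5, 3, 4]))
```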

30 Optimality. Theorem 4.8. The merging procedure can be implemented in polynomial time and calculates an optimal sequence for the 1|outtree|Σ w_j C_j problem.

31 Proof of Theorem 4.8. We prove optimality by induction on the number of jobs. Clearly the procedure is correct if there is only one job. Let P be a problem with n jobs and assume that i, j are the first jobs merged by the algorithm. Let P' be the resulting problem with n − 1 jobs, in which i is replaced by I := {i, j} with w(I) = w(i) + w(j) and p(I) = p(i) + p(j).

32 Proof of Theorem 4.8 (2). Let R be the set of sequences of the form π: π(1), ..., π(k), i, j, π(k + 3), ..., π(n), and let R' be the set of sequences of the form π': π(1), ..., π(k), I, π(k + 3), ..., π(n). By Theorem 4.7, R contains an optimal schedule. For the corresponding objective values f_n(π) and f_{n−1}(π') we have f_n(π) − f_{n−1}(π') = w(i)p(i) + w(j)(p(i) + p(j)) − (w(i) + w(j))(p(i) + p(j)) = −w(i)p(j), a constant that does not depend on the particular sequence. We conclude that π is optimal if and only if π' is optimal.
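
Spelling the computation out with the common start time T of job i makes it explicit that the difference is a constant (T cancels):

```latex
\begin{align*}
f_n(\pi) - f_{n-1}(\pi')
 &= w(i)\bigl(T + p(i)\bigr) + w(j)\bigl(T + p(i) + p(j)\bigr)
    - \bigl(w(i) + w(j)\bigr)\bigl(T + p(i) + p(j)\bigr) \\
 &= w(i)\bigl(T + p(i)\bigr) - w(i)\bigl(T + p(i) + p(j)\bigr) \\
 &= -\,w(i)\,p(j).
\end{align*}
```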

33 1|intree|Σ w_j C_j. To solve a 1|intree|Σ w_j C_j problem P, we reduce it to a 1|outtree|Σ w'_j C_j problem P' in which i is a successor of j in P' if and only if j is a successor of i in P, and w'_j = −w_j for j = 1, ..., n. Then a sequence π: 1, ..., n is feasible for P if and only if the reverse sequence π': n, ..., 1 is feasible for P'.
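
A small sketch of this reduction, reusing the hypothetical outtree_wsum function from the merging-procedure sketch above; the successor-array encoding of the intree is an assumption made for illustration.

```python
def intree_wsum(succ, p, w):
    """Solve 1 | intree | sum w_j C_j via the reduction on the slide (a sketch).
    succ[j] is the unique immediate successor of job j in the intree,
    succ[sink] = None.  Reversing every arc j -> succ[j] turns the intree into
    an outtree whose father array is exactly `succ`; we negate the weights,
    solve the outtree problem, and reverse the resulting sequence."""
    order_rev, _ = outtree_wsum(succ, p, [-wj for wj in w])
    order = list(reversed(order_rev))
    t, obj = 0, 0
    for job in order:          # evaluate the original objective
        t += p[job]
        obj += w[job] * t
    return order, obj
```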

34 1|intree|Σ w_j C_j

35 1||Σ w_j C_j. Smith's ratio rule: put the jobs in order of nondecreasing ratios p_i/w_i (this applies when all w_i > 0). Homework: prove that Smith's ratio rule yields an optimal sequence by using an interchange argument.
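
A minimal sketch of Smith's ratio rule; names and data are illustrative.

```python
def smith_ratio(p, w):
    """Smith's ratio rule for 1 | | sum w_j C_j (assuming all w_j > 0):
    sequence the jobs in order of nondecreasing p_j / w_j."""
    order = sorted(range(len(p)), key=lambda j: p[j] / w[j])
    t, obj = 0, 0
    for j in order:
        t += p[j]
        obj += w[j] * t
    return order, obj

# Hypothetical data for four jobs.
print(smith_ratio([3, 1, 2, 4], [1, 4, 2, 1]))
```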

36 1||ΣC_j. Smith's rule (SPT rule): put the jobs in order of nondecreasing processing times.

37 1|pmtn; r_j|ΣC_j. Modified Smith's rule: at each release time or finishing time of a job, schedule an unfinished job which is available and has the smallest remaining processing time.
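
A minimal sketch of the modified Smith's rule, again event-driven over release and completion times; names and data are illustrative.

```python
import heapq

def srpt(release, proc):
    """Modified Smith's rule (shortest remaining processing time) for
    1 | pmtn; r_j | sum C_j.  Returns (completion times, sum of C_j)."""
    n = len(proc)
    order = sorted(range(n), key=lambda j: release[j])
    avail = []                                  # heap of (remaining time, job)
    comp = [0] * n
    t, k = release[order[0]], 0
    while k < n or avail:
        while k < n and release[order[k]] <= t:
            heapq.heappush(avail, (proc[order[k]], order[k]))
            k += 1
        if not avail:                           # idle until next release
            t = release[order[k]]
            continue
        rem, j = heapq.heappop(avail)
        next_r = release[order[k]] if k < n else float("inf")
        run = min(rem, next_r - t)              # run until done or next release
        t += run
        if run < rem:
            heapq.heappush(avail, (rem - run, j))   # preempt, keep remaining work
        else:
            comp[j] = t
    return comp, sum(comp)

# Hypothetical instance with four jobs.
print(srpt([0, 0, 2, 3], [5, 2, 2, 1]))
```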

38 Optimality. Theorem 4.9. A schedule constructed by modified Smith's rule is optimal for problem 1|pmtn; r_j|ΣC_j.

39 Proof of Theorem 4.9. Assume that both schedules coincide until time t, where S processes job i and S* processes a different job j. By modified Smith's rule, the remaining processing time of j is not smaller than the remaining processing time of i. In S*, exchange the processing intervals assigned to i and j after time t so that i is completed first: i then finishes no later than j did before, j finishes when the later of the two finished before, and all other completion times are unchanged, so ΣC_j does not increase. Hence the resulting schedule S' is optimal and agrees with S beyond time t, contradicting the choice of S*.

40 Exercise. Partition: given n positive integers s_1, s_2, ..., s_n, is there a subset J ⊆ I = {1, ..., n} such that Σ_{j∈J} s_j = Σ_{j∈I\J} s_j? The partition problem is NP-hard. Show that the problem 1|r_j|L_max is NP-hard by reducing the partition problem to it. (Hint: given an instance of the partition problem, construct an instance of jobs with release dates such that, if a partition exists, no job is late, and if no partition exists, at least one job is late.)

41 Exercise 5.1. Find an optimal schedule for the following instance of the 1|pmtn; r_j|ΣC_j problem.
Jobs: J_1, J_2, J_3, J_4, J_5, J_6, J_7
r_j: 75100140
p_j: 2, 4, 5, 4, 1, 3, 8

42 Exercise 5.2. Find an optimal schedule for the following instance of the 1|outtree|Σ w_j C_j problem (the precedence outtree on jobs 1-7 is given as a figure on the original slide).
Jobs: J_1, J_2, J_3, J_4, J_5, J_6, J_7
p_j: 15106144
w_j: 20254138

43 Running time. The complexity of the algorithm is O(n^2). This can be seen as follows: if we exclude the recursive calls in Step 7, the number of steps of the Procedure Decompose is O(|S|). Thus, for the number f(n) of computational steps we have the recursion f(n) = cn + Σ f(n_i), where n_i is the number of jobs in the i-th block and Σ n_i ≤ n.
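
One way to see that this recursion yields an O(n^2) bound, assuming each block contains at most n − 1 jobs:

```latex
% Induction claim: f(m) <= c m^2 for every m < n implies f(n) <= c n^2,
% using sum_i n_i <= n and n_i <= n - 1.
f(n) \;=\; c\,n + \sum_i f(n_i)
      \;\le\; c\,n + c\sum_i n_i^{2}
      \;\le\; c\,n + c\,\Bigl(\max_i n_i\Bigr)\sum_i n_i
      \;\le\; c\,n + c\,(n-1)\,n \;=\; c\,n^{2}.
```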

