
Scheduling Operations (Spring 2007)

Scheduling Problems in Operations

- Job shop scheduling
- Personnel scheduling
- Facilities scheduling
- Vehicle scheduling and routing
- Project management
- Dynamic versus static scheduling

The Hierarchy of Production Decisions

The logical sequence of planning decisions in a factory is the following:
- All planning starts with the demand forecast.
- Demand forecasts are the basis for top-level (aggregate) planning.
- The Master Production Schedule (MPS) results from disaggregating the aggregate plan down to the individual item level.
- Based on the MPS, MRP determines the size and timing of component and subassembly production.
- Detailed shop-floor schedules are then required to meet the production plans resulting from MRP.

[Figure: Hierarchy of Production Decisions]


Characteristics of the Job Shop Scheduling Problem

- Job arrival pattern
- Number and variety of machines
- Number and skill level of workers
- Flow patterns
- Evaluation of alternative rules

Objectives in Job Shop Scheduling

- Meet due dates
- Minimize work-in-process (WIP) inventory
- Minimize average flow time
- Maximize machine/worker utilization
- Reduce setup times for changeovers
- Minimize direct production and labor costs

(Note that these objectives can conflict with one another.)

Terminology

- Flow shop: n jobs are processed through m machines in the same sequence.
- Job shop: jobs may be sequenced through the machines in different orders, and a job may require multiple operations on some machines.
- Parallel vs. sequential processing: parallel processing means a job may run on any of several identical machines; sequential processing means the machines are arranged in series.
- Flow time of job i: time elapsed from the initiation of the first job until the completion of job i.
- Makespan: flow time of the job completed last.
- Tardiness: the positive part of the difference between a job's completion time and its due date.
- Lateness: completion time minus due date (may be negative).

Common Sequencing Rules

- FCFS (First Come, First Served): jobs are processed in the order they arrive at the shop.
- SPT (Shortest Processing Time): the job with the shortest processing time is scheduled first.
- EDD (Earliest Due Date): jobs are sequenced in order of their due dates.
- CR (Critical Ratio): compute the ratio of the job's processing time to the time remaining until its due date; schedule the job with the largest ratio next.
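A minimal sketch of the four rules on a small job set (the names, processing times, and due dates below are invented for illustration):

```python
# Hypothetical jobs in arrival order: (name, processing_time, due_date).
jobs = [("A", 4, 10), ("B", 2, 6), ("C", 6, 18), ("D", 1, 4)]

def fcfs(jobs):
    # First Come First Served: keep the arrival order
    return [name for name, _, _ in jobs]

def spt(jobs):
    # Shortest Processing Time first
    return [name for name, _, _ in sorted(jobs, key=lambda j: j[1])]

def edd(jobs):
    # Earliest Due Date first
    return [name for name, _, _ in sorted(jobs, key=lambda j: j[2])]

def cr(jobs, now=0):
    # Critical Ratio as defined above: processing time / time remaining
    # until the due date, largest ratio first.  (Computed once at `now`
    # for simplicity; in practice it is recomputed as time advances.)
    return [name for name, _, _ in
            sorted(jobs, key=lambda j: j[1] / (j[2] - now), reverse=True)]
```

On this instance SPT and EDD happen to agree (D, B, A, C), while CR puts job A first because its processing time consumes the largest fraction of the time to its due date.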

Results for Single Machine Sequencing

- SPT minimizes the mean flow time of all jobs.
- The following criteria are equivalent: mean flow time, mean waiting time, mean lateness.
- Moore's algorithm minimizes the number of tardy jobs.
- Lawler's algorithm minimizes the maximum cost (e.g., maximum lateness) subject to precedence constraints.

Results for Multiple Machines

- The optimal schedule for n jobs on two machines is always a permutation schedule (jobs are done in the same order on both machines); this is the basis for Johnson's algorithm.
- For three machines, a permutation schedule is still optimal if we restrict attention to the makespan.
- Under rare circumstances, the two-machine algorithm can be used to solve the three-machine case.
- When scheduling two jobs on m machines, the problem can be solved by graphical means.
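Johnson's rule for the two-machine flow shop can be sketched as follows: jobs whose machine-1 time is no larger than their machine-2 time go first, in increasing order of machine-1 time; the remaining jobs follow in decreasing order of machine-2 time. The instance is hypothetical.

```python
def johnson(jobs):
    # Johnson's rule for minimizing makespan on a 2-machine flow shop.
    # jobs: list of (name, p1, p2).
    front = sorted((j for j in jobs if j[1] <= j[2]), key=lambda j: j[1])
    back = sorted((j for j in jobs if j[1] > j[2]), key=lambda j: -j[2])
    return front + back

def makespan(seq):
    # Machine 2 starts a job only after machine 1 finishes it
    # and machine 2 has finished the previous job.
    t1 = t2 = 0
    for _, p1, p2 in seq:
        t1 += p1
        t2 = max(t1, t2) + p2
    return t2

# Hypothetical instance: (name, time on machine 1, time on machine 2)
jobs = [("A", 3, 6), ("B", 5, 2), ("C", 1, 2), ("D", 6, 6), ("E", 7, 5)]
seq = johnson(jobs)   # permutation schedule C, A, D, E, B
```

The resulting permutation schedule is used on both machines, consistent with the result above.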

Stochastic Scheduling: The Static Case

Single machine. Suppose the processing times are random variables. If the objective is to minimize the average weighted flow time, jobs are sequenced according to expected weighted SPT: if the job times are t_1, t_2, ... and the respective weights are u_1, u_2, ..., then job i precedes job i+1 if E(t_i)/u_i < E(t_{i+1})/u_{i+1}.

Stochastic Scheduling: The Static Case (continued)

Multiple machines. The results require the assumption that the job times are exponentially distributed (the memoryless property). For parallel processing of n jobs on two machines, the optimal sequence is LEPT: longest expected processing time first. Johnson's algorithm for scheduling n jobs on two machines in the deterministic case has a natural extension to the stochastic case, again as long as the job times are exponentially distributed.

Stochastic Scheduling: Dynamic Analysis

When jobs arrive at the shop dynamically over time, queueing theory provides a means of analysis. The standard M/M/1 queue applies to purely random arrivals at a single machine with random processing times. If the selection discipline does not use the job processing times, the mean flow time is the same under every discipline, but the variance of the flow times differs. If job times become known when a job joins the queue rather than when it enters service, SPT generally results in the lowest expected flow time.

Single Machine Deterministic Models

Jobs: J_1, J_2, ..., J_n

Assumptions:
- The machine is available throughout the scheduling period.
- The machine cannot process more than one job at a time.
- Each job must spend a prescribed length of time on the machine.

[Figure: the schedule as a step function S(t) over time t, showing jobs J_1, J_2, J_3 processed one after another on the machine]

Requirements that may restrict the set of feasible schedules:
- precedence constraints
- no preemptions
- release dates
- deadlines

Even deciding whether a feasible schedule exists can be NP-hard.

An objective function f is used to compare schedules: f(S) < f(S') whenever schedule S is considered better than S'. The scheduling problem is to minimise f(S) over the set of feasible schedules.

1. Completion time models

Due-date related objectives:
2. Lateness models
3. Tardiness models

4. Sequence-dependent setup problems

Completion Time Models

Contents:
1. An algorithm that gives an optimal schedule with minimum total weighted completion time: 1 || Σ w_j C_j.
2. An algorithm that gives an optimal schedule with minimum total weighted completion time when the jobs are subject to precedence relationships that take the form of chains: 1 | chains | Σ w_j C_j.

Literature: Scheduling: Theory, Algorithms, and Systems, Michael Pinedo, Prentice Hall, 1995, or the Second Edition, 2002, Chapter 3.

1 || Σ w_j C_j

Theorem. The weighted shortest processing time first (WSPT) rule is optimal for 1 || Σ w_j C_j.

WSPT: jobs are ordered in decreasing order of w_j / p_j.

The following is an immediate corollary: the problem 1 || Σ C_j is solved by sequencing the jobs in nondecreasing order of processing time (SPT).
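A sketch of WSPT on a small hypothetical instance, checked against brute-force enumeration of all sequences:

```python
from itertools import permutations

def weighted_completion(seq):
    # Sum of w_j * C_j for jobs processed back to back from time 0.
    t = total = 0
    for p, w in seq:
        t += p
        total += w * t
    return total

# Hypothetical jobs as (p_j, w_j) pairs.
jobs = [(3, 1), (1, 4), (2, 2)]

# WSPT: decreasing order of w_j / p_j.
wspt = sorted(jobs, key=lambda j: j[1] / j[0], reverse=True)

best = min(weighted_completion(list(s)) for s in permutations(jobs))
```

Here WSPT orders the jobs (1, 4), (2, 2), (3, 1) with Σ w_j C_j = 16, which matches the brute-force optimum.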

Proof (by adjacent pairwise interchange). Suppose, for contradiction, that an optimal schedule S is not a WSPT schedule. Then S contains two adjacent jobs j and k, with j starting at time t and k immediately following, such that w_j / p_j < w_k / p_k, which implies w_j p_k < w_k p_j. Let S' be S with j and k interchanged. The two jobs contribute

S:  (t + p_j) w_j + (t + p_j + p_k) w_k = t w_j + p_j w_j + t w_k + p_j w_k + p_k w_k
S': (t + p_k) w_k + (t + p_k + p_j) w_j = t w_k + p_k w_k + t w_j + p_k w_j + p_j w_j

to the objective, and all other jobs are unaffected. The difference S - S' = p_j w_k - p_k w_j > 0, so the total weighted completion time of S' is strictly smaller than that of S, contradicting the optimality of S.

1 | chains | Σ w_j C_j

chain 1: 1 -> 2 -> ... -> k
chain 2: k+1 -> k+2 -> ... -> n

Lemma. If (Σ_{j=1..k} w_j) / (Σ_{j=1..k} p_j) > (Σ_{j=k+1..n} w_j) / (Σ_{j=k+1..n} p_j), then there is an optimal schedule in which the chain of jobs 1,...,k precedes the chain of jobs k+1,...,n.

The factor of chain 1,...,k is defined as

ρ(1,...,k) = max_{1 ≤ l ≤ k} (Σ_{j=1..l} w_j) / (Σ_{j=1..l} p_j),

and l* denotes the job attaining this maximum: l* is the job that determines the factor of the chain.

Lemma. If job l* determines ρ(1,...,k), then there exists an optimal sequence that processes jobs 1,...,l* one after another, without interruption by jobs from other chains.

Algorithm. Whenever the machine is freed, select among the remaining chains the one with the highest factor. Process this chain, without interruption, up to and including the job l* that determines its factor.

Example

chain 1: 1 -> 2 -> 3 -> 4
chain 2: 5 -> 6 -> 7

- The factor of chain 1 is determined by job 2: (6+18)/(3+6) = 2.67. The factor of chain 2 is determined by job 6: (8+17)/(4+8) = 2.08. Chain 1 is selected: jobs 1, 2 are scheduled.
- The factor of the remaining part of chain 1 is determined by job 3: 12/6 = 2. The factor of chain 2 is still 2.08. Chain 2 is selected: jobs 5, 6 are scheduled.
- The factor of the remaining part of chain 1 is still 2; the factor of the remaining part of chain 2 is determined by job 7: 18/10 = 1.8. Chain 1 is selected: job 3 is scheduled.
- The factor of the remaining part of chain 1 is determined by job 4: 8/5 = 1.6; the remaining factor of chain 2 is 1.8. Chain 2 is selected: job 7 is scheduled.
- Job 4 is scheduled last.

The final schedule: 1, 2, 5, 6, 3, 7, 4.
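The whole trace can be reproduced in code. The weights and processing times below are read off the factors quoted in the example (e.g. the factor (6+18)/(3+6) implies w_1 = 6, p_1 = 3, w_2 = 18, p_2 = 6):

```python
def rho(chain):
    # Factor of a chain: maximum over prefixes of (sum of weights) /
    # (sum of processing times).  Returns (factor, index of the
    # determining job l*).
    best, lstar, W, P = None, 0, 0, 0
    for i, (_, p, w) in enumerate(chain):
        W += w
        P += p
        if best is None or W / P > best:
            best, lstar = W / P, i
    return best, lstar

def schedule_chains(chains):
    chains = [list(c) for c in chains]
    order = []
    while any(chains):
        # select the nonempty chain with the highest factor
        i = max((k for k in range(len(chains)) if chains[k]),
                key=lambda k: rho(chains[k])[0])
        _, lstar = rho(chains[i])
        # process it up to and including the determining job l*
        order += [name for name, _, _ in chains[i][:lstar + 1]]
        chains[i] = chains[i][lstar + 1:]
    return order

# (job, p_j, w_j), as inferred from the example
chain1 = [(1, 3, 6), (2, 6, 18), (3, 6, 12), (4, 5, 8)]
chain2 = [(5, 4, 8), (6, 8, 17), (7, 10, 18)]
```

schedule_chains([chain1, chain2]) returns [1, 2, 5, 6, 3, 7, 4], the schedule derived above.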

1 | prec | Σ w_j C_j

- Polynomial-time algorithms have been developed for precedence constraints somewhat more general than simple chains.
- With an arbitrary precedence relation, the problem is NP-hard.
- 1 | r_j, prmp | Σ w_j C_j: the preemptive version of the WSPT rule does not always lead to an optimal solution; the problem is NP-hard.
- 1 | r_j, prmp | Σ C_j: the preemptive version of the SPT rule is optimal.
- 1 | r_j | Σ C_j is NP-hard.

Summary

- 1 || Σ w_j C_j: WSPT rule.
- 1 | chains | Σ w_j C_j: a polynomial-time algorithm was given.
- 1 | prec | Σ w_j C_j with an arbitrary precedence relation is NP-hard.
- 1 | r_j, prmp | Σ w_j C_j is NP-hard.
- 1 | r_j, prmp | Σ C_j: the preemptive version of the SPT rule is optimal.
- 1 | r_j | Σ C_j is NP-hard.

Lateness Models

Contents:
1. Lawler's algorithm, which gives an optimal schedule with minimum maximum cost h_max when the jobs are subject to precedence relationships: 1 | prec | h_max.
2. A branch-and-bound algorithm for scheduling problems with the objective of minimising the maximum lateness: 1 | r_j | L_max.

Literature: Scheduling: Theory, Algorithms, and Systems, Michael Pinedo, Prentice Hall, 1995, Chapter 3.2, or the Second Edition, 2002, Chapter 3.

Lawler's Algorithm

A backward algorithm that gives an optimal schedule for 1 | prec | h_max, where

h_max = max( h_1(C_1), ..., h_n(C_n) )

and the h_j are nondecreasing cost functions of the completion times.

Notation:
- C_max = Σ p_j: the makespan, i.e. the completion time of the last job.
- J: the set of jobs already scheduled; they occupy the final part of the schedule, the interval [C_max - Σ_{j∈J} p_j, C_max].
- J^c: the complement of J, the set of jobs still to be scheduled.
- J' ⊆ J^c: the set of jobs that can be scheduled immediately before the set J (the schedulable jobs, i.e. those all of whose successors are in J).

Lawler's Algorithm for 1 || h_max

Step 1. Set J = ∅, J^c = {1,...,n}, k = n.
Step 2. Let t = Σ_{j ∈ J^c} p_j and let j* be such that h_{j*}(t) = min_{j ∈ J^c} h_j(t). Place j* in J in the k-th position. Delete j* from J^c.
Step 3. If J^c = ∅ then stop; else set k = k - 1 and go to Step 2.

Example (no precedence relationships between the jobs)

J = ∅, J^c = {1, 2, 3} (all jobs still to be scheduled), C_max = 10.
h_1(10) = 11, h_2(10) = 12, h_3(10) = 10.
Job 3 has the smallest cost, so it is scheduled last and processed in [5, 10].

J = {3}, J^c = {1, 2}; the remaining jobs must complete by time 5.
h_1(5) = 6, h_2(5) = 6.
Either job 1 or job 2 may be processed immediately before job 3.

Two schedules are optimal: 1, 2, 3 and 2, 1, 3.

Lawler's Algorithm for 1 | prec | h_max

Step 1. Set J = ∅, J^c = {1,...,n}, J' = the set of all jobs with no successors, k = n.
Step 2. Let t = Σ_{j ∈ J^c} p_j and let j* be such that h_{j*}(t) = min_{j ∈ J'} h_j(t). Place j* in J in the k-th position. Delete j* from J^c. Modify J' to represent the new set of jobs that can be scheduled immediately before J.
Step 3. If J^c = ∅ then stop; else set k = k - 1 and go to Step 2.

Example. What happens in the previous example if the precedence constraint 1 -> 2 has to be taken into account?

J = ∅, J^c = {1, 2, 3}; J' = {2, 3} (the jobs with no successors); C_max = 10.
h_2(10) = 12, h_3(10) = 10, so job 3 is scheduled last, in [5, 10].

J = {3}, J^c = {1, 2}; J' = {2}, since job 1's successor (job 2) is still unscheduled.
h_2(5) = 6, so job 2 is scheduled in [2, 5].

J = {3, 2}, J^c = {1}; J' = {1}.
h_1(2) = 3, so job 1 is scheduled in [0, 2].

Optimal schedule: 1, 2, 3, with h_max = max(3, 6, 10) = 10.
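A sketch of the backward algorithm in code. The processing times (2, 3, 5) and the cost functions h_1(t) = t + 1, h_2(t) = 1.2t, h_3(t) = t are reconstructions chosen to be consistent with the values quoted in the two examples; the slides do not give them explicitly.

```python
def lawler(p, h, succ):
    # Backward Lawler's algorithm for 1 | prec | h_max.
    # p: job -> processing time; h: job -> nondecreasing cost function of
    # the completion time; succ: job -> set of immediate successors.
    unscheduled = set(p)
    t = sum(p.values())          # the last job completes at C_max
    schedule = []
    while unscheduled:
        # jobs all of whose successors are already scheduled
        ready = [j for j in sorted(unscheduled)
                 if not (succ.get(j, set()) & unscheduled)]
        jstar = min(ready, key=lambda j: h[j](t))
        schedule.append(jstar)   # j* takes the last open position
        unscheduled.remove(jstar)
        t -= p[jstar]
    return schedule[::-1]

p = {1: 2, 2: 3, 3: 5}
h = {1: lambda t: t + 1, 2: lambda t: 1.2 * t, 3: lambda t: t}

no_prec = lawler(p, h, {})           # ties: both 1,2,3 and 2,1,3 are optimal
with_prec = lawler(p, h, {1: {2}})   # the precedence 1 -> 2 forces 1,2,3
```

With the precedence constraint, the algorithm returns the schedule 1, 2, 3 found above.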

- 1 || L_max is the special case of 1 | prec | h_max in which h_j(C_j) = C_j - d_j. Lawler's algorithm then reduces to ordering the jobs by increasing due date: the earliest due date (EDD) rule.
- 1 | r_j | L_max is NP-hard; branch-and-bound is used.
- 1 | r_j, prec | L_max is solved by a similar branch-and-bound.

Branch-and-Bound

The search space can grow very large as the number of variables in the problem increases. Branch-and-bound is an implicit enumeration technique that works by successively partitioning the search space: a set S of solutions is split into subsets S_1, S_2, ..., S_n with S = S_1 ∪ S_2 ∪ ... ∪ S_n and S_i ∩ S_j = ∅ for i ≠ j, and each subset may be partitioned further (S_12, S_13, ...).

We also need a means of obtaining a lower bound on the cost over any subset of solutions (the task is to minimise the cost). Let f_bound be the value of the best solution found so far. If the lower bound for a subset S_2 satisfies LB(S_2) ≥ f_bound, no solution in S_2 can improve on the incumbent, so there is no need to explore S_2.

Branch-and-bound algorithm

Step 1. Initialise the pool P = {S_i} (determine the partitions) and initialise f_bound.
Step 2. Remove the best partition S_i from P. Reduce or subdivide S_i into subsets S_ij and update f_bound. Set P = P ∪ {S_ij}. For each S_ij in P: if the lower bound of f over S_ij is ≥ f_bound, remove S_ij from P.
Step 3. If the termination condition is not met, go to Step 2.

Branch-and-bound for 1 | r_j | L_max

Nodes of the search tree are partial sequences: the root (*,*,*,*) fixes no jobs; its children (1,*,*,*), (2,*,*,*), ..., (n,*,*,*) fix the first job; their children (1,2,*,*), (1,3,*,*), ... fix the first two jobs, and so on.

The solution space contains n! schedules (n is the number of jobs), so total enumeration is not viable.

Branching rule. At level k-1 of the tree, jobs j_1, ..., j_{k-1} have been fixed, and a job j_k is considered for the next position. Let J be the set of jobs not yet scheduled and let t be the time at which j_{k-1} completes. Job j_k needs to be considered only if

r_{j_k} < min_{l ∈ J} ( max(t, r_l) + p_l );

otherwise some other unscheduled job can be completed before j_k is even released, and scheduling j_k next cannot help.

Lower bound. The preemptive EDD rule is optimal for 1 | r_j, prmp | L_max, and a preemptive schedule has a maximum lateness no greater than that of any non-preemptive schedule, so the preemptive EDD value at a node is a valid lower bound. Moreover, if the preemptive EDD rule happens to produce a schedule with no preemptions, that schedule is optimal for the node, and all nodes with a larger lower bound can be disregarded.

Example (two jobs; the Gantt charts from the slide are lost). The two non-preemptive schedules give L_1 = 3, L_2 = 6, L_max = 6 and L_1 = 5, L_2 = -1, L_max = 5. The preemptive schedule obtained with the EDD rule gives L_1 = 3, L_2 = 3, L_max = 3, the lowest possible L_max.

Example

At the first level, node (3,*,*,*) is eliminated by the branching rule because job 2 could be processed before job 3, and node (4,*,*,*) is eliminated because job 1 could be processed before job 4. The remaining first-level nodes have lower bounds L.B. = 5 for (1,*,*,*) and L.B. = 7 for (2,*,*,*). Branching from (1,*,*,*) gives (1,2,*,*) with L.B. = 6 and (1,3,*,*) with L.B. = 5; the latter leads to the optimal sequence 1, 3, 4, 2.

The lower-bound computations (preemptive EDD at each node; intervals are [start, finish]):

Node (1,*,*,*): 1 in [0,4], L_1 = -4; 3 in [4,5] (preempted); 4 in [5,10], L_4 = 0; 3 resumes in [10,15], L_3 = 4; 2 in [15,17], L_2 = 5. Lower bound 5.

Node (2,*,*,*): 2 in [1,3], L_2 = -9; 1 in [3,7], L_1 = -1; 4 in [7,12], L_4 = 2; 3 in [12,18], L_3 = 7. Lower bound 7.

Node (3,*,*,*): eliminated, since job 2 can be processed before job 3. Node (4,*,*,*): eliminated, since either job 1 or job 2 can be processed before job 4.

Node (1,2,*,*): 1 in [0,4], L_1 = -4; 2 in [4,6], L_2 = -6; 4 in [6,11], L_4 = 1; 3 in [11,17], L_3 = 6. Lower bound 6.

Node (1,3,*,*): 1 in [0,4], L_1 = -4; 3 in [4,10], L_3 = -1; 4 in [10,15], L_4 = 5; 2 in [15,17], L_2 = 5. Lower bound 5, and the schedule is preemption-free, so it is optimal.

Final schedule: 1, 3, 4, 2, with L_max = 5.
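The full procedure can be sketched in code. The instance below (p_j, r_j, d_j per job) is reconstructed from the intervals and lateness values in the node computations above:

```python
def preemptive_edd_bound(rest, t, p, r, d):
    # Lower bound at a node: L_max of the preemptive EDD schedule of the
    # unscheduled jobs `rest`, starting at time t.
    rem = {j: p[j] for j in rest}
    lmax = float("-inf")
    while rem:
        avail = [j for j in rem if r[j] <= t]
        if not avail:
            t = min(r[j] for j in rem)     # idle until the next release
            continue
        j = min(avail, key=lambda x: d[x])  # earliest due date
        future = [r[k] - t for k in rem if r[k] > t]
        run = min([rem[j]] + future)        # run until done or next release
        t += run
        rem[j] -= run
        if rem[j] == 0:
            lmax = max(lmax, t - d[j])
            del rem[j]
    return lmax

def branch_and_bound(p, r, d):
    jobs = sorted(p)
    best = [float("inf"), None]
    def dfs(seq, t, lmax):
        rest = [j for j in jobs if j not in seq]
        if not rest:
            if lmax < best[0]:
                best[0], best[1] = lmax, seq
            return
        if max(lmax, preemptive_edd_bound(rest, t, p, r, d)) >= best[0]:
            return                          # prune: bound cannot beat incumbent
        for j in rest:
            # branching rule: skip j if some other unscheduled job
            # can finish before j is even released
            if any(max(t, r[k]) + p[k] <= r[j] for k in rest if k != j):
                continue
            c = max(t, r[j]) + p[j]
            dfs(seq + [j], c, max(lmax, c - d[j]))
    dfs([], 0, float("-inf"))
    return best[0], best[1]

# Instance reconstructed from the node computations above
p = {1: 4, 2: 2, 3: 6, 4: 5}
r = {1: 0, 2: 1, 3: 3, 4: 5}
d = {1: 8, 2: 12, 3: 11, 4: 10}
```

branch_and_bound(p, r, d) returns (5, [1, 3, 4, 2]), matching the tree above.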

Summary

- 1 | prec | h_max, with h_max = max( h_1(C_1), ..., h_n(C_n) ): Lawler's algorithm.
- 1 || L_max: EDD rule.
- 1 | r_j | L_max is NP-hard; branch-and-bound is used. 1 | r_j, prec | L_max: a similar branch-and-bound.
- 1 | r_j, prmp | L_max: preemptive EDD rule.

Tardiness Models

Contents:
1. Moore's algorithm, which gives an optimal schedule with the minimum number of tardy jobs: 1 || Σ U_j.
2. An algorithm which gives an optimal schedule with the minimum total tardiness: 1 || Σ T_j.

Literature: Scheduling: Theory, Algorithms, and Systems, Michael Pinedo, Prentice Hall, 1995, Chapters 3.3 and 3.4, or the Second Edition, 2002, Chapter 3.

Moore's algorithm for 1 || Σ U_j

An optimal schedule has the form j_1, ..., j_k, j_{k+1}, ..., j_n: first the jobs that meet their due dates, in EDD order, then the jobs that do not meet their due dates, in any order.

Notation:
- J: the set of jobs already scheduled (they meet their due dates and are ordered by the EDD rule).
- J^c: the set of jobs still to be scheduled.
- J^d: the set of jobs already considered for scheduling but discarded; they do not meet their due dates in the optimal schedule and are appended at the end.

Step 1. Set J = ∅, J^d = ∅, J^c = {1,...,n}.
Step 2. Let j* be the job in J^c with the earliest due date. Add j* to J and delete it from J^c.
Step 3. If Σ_{j ∈ J} p_j ≤ d_{j*}, go to Step 4; otherwise let k* be the job in J with the longest processing time, delete k* from J and add it to J^d.
Step 4. If J^c = ∅, stop (the schedule is J in EDD order followed by J^d in any order); otherwise go to Step 2.

Example (processing times p = 7, 8, 4, 6, 6 and due dates d = 9, 17, 18, 19, 21 for jobs 1 through 5; the jobs are already indexed in EDD order).

J = ∅, J^d = ∅, J^c = {1,...,5}.
j* = 1: J = {1}, J^d = ∅, J^c = {2,3,4,5}, t = 7 ≤ 9 = d_1.
j* = 2: J = {1,2}, J^d = ∅, J^c = {3,4,5}, t = 15 ≤ 17 = d_2.
j* = 3: J = {1,2,3}, J^d = ∅, J^c = {4,5}, t = 19 > 18 = d_3; k* = 2 (the longest job in J): J = {1,3}, J^d = {2}, t = 11.
j* = 4: J = {1,3,4}, J^d = {2}, J^c = {5}, t = 17 ≤ 19 = d_4.
j* = 5: J = {1,3,4,5}, J^d = {2}, J^c = ∅, t = 23 > 21 = d_5; k* = 1 (the longest job in J): J = {3,4,5}, J^d = {2,1}, t = 16 ≤ 21 = d_5.

Optimal schedule: 3, 4, 5, 1, 2, with Σ U_j = 2.
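The algorithm is short in code; running it on the instance above reproduces the trace:

```python
def moore(jobs):
    # Moore-Hodgson: minimise the number of tardy jobs on one machine.
    # jobs: list of (name, p_j, d_j).  Returns (on_time, tardy); the
    # on-time jobs are in EDD order, the tardy jobs may go in any order.
    J, discarded, t = [], [], 0
    for job in sorted(jobs, key=lambda j: j[2]):     # EDD order
        J.append(job)
        t += job[1]
        if t > job[2]:                               # due date violated
            k = max(J, key=lambda j: j[1])           # longest job so far
            J.remove(k)
            discarded.append(k)
            t -= k[1]
    return [j[0] for j in J], [j[0] for j in discarded]

jobs = [(1, 7, 9), (2, 8, 17), (3, 4, 18), (4, 6, 19), (5, 6, 21)]
on_time, tardy = moore(jobs)    # on-time jobs 3, 4, 5; two tardy jobs
```

Since the discarded jobs may be appended in any order, both 3, 4, 5, 1, 2 and 3, 4, 5, 2, 1 are optimal.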

The Total Tardiness

1 || Σ T_j is NP-hard.

Lemma. If p_j ≤ p_k and d_j ≤ d_k, then there exists an optimal sequence in which job j is scheduled before job k.

Assume d_1 ≤ ... ≤ d_n and let k be a job with p_k = max(p_1, ..., p_n).

Lemma. There exists an integer δ, 0 ≤ δ ≤ n-k, such that there is an optimal schedule S in which job k is preceded by all jobs j ≤ k+δ with j ≠ k (in some order) and followed by all jobs j > k+δ (in some order). The completion time of job k is then C_k(δ) = Σ_{j ≤ k+δ} p_j.

PRINCIPLE OF OPTIMALITY (Bellman, 1956). An optimal policy has the property that, whatever the initial state and the initial decision are, the remaining decisions must constitute an optimal policy with regard to the state resulting from the first decision.

Algorithm (dynamic programming). The optimal solution for a job set J starting at time t is determined recursively from the optimal solutions to subproblems defined by subsets of J with later start times.

Notation:
- J(j, l, k): the set of jobs in {j, j+1, ..., l} with processing time smaller than p_k (so job k itself is excluded).
- V( J(j, l, k), t ): the total tardiness of this subset under an optimal sequence, if the subset starts at time t.

Initial conditions:
V( ∅, t ) = 0
V( {j}, t ) = max( 0, t + p_j - d_j )

Recursion:
V( J(j, l, k), t ) = min_δ [ V( J(j, k'+δ, k'), t ) + max( 0, C_{k'}(δ) - d_{k'} ) + V( J(k'+δ+1, l, k'), C_{k'}(δ) ) ]

where k' is the job with the largest processing time in J(j, l, k), i.e. p_{k'} = max( p_{j'} | j' ∈ J(j, l, k) ), and C_{k'}(δ) = t + Σ_{j' ∈ J(j, k'+δ, k')} p_{j'} + p_{k'} is the completion time of job k'.

The optimal value for the whole problem is V( {1,...,n}, 0 ).

Example (5 jobs, with p = 121, 79, 147, 83, 130 and d = 260, 266, 266, 336, 337; the data are read off the calculations below).

k' = 3 (the longest job), 0 ≤ δ ≤ 2, d_{k'} = d_3 = 266.

V( J(1, 3, 3), 0 ) = 0: with the sequence 1, 2 we get C_2 = 200 < 266 = d_2 and T_1 + T_2 = 0; with 2, 1 we get C_1 = 200 < 260 = d_1 and T_2 + T_1 = 0.

C_3(0) - d_3 = 121 + 79 + 147 - 266 = 347 - 266 = 81

V( J(4, 5, 3), 347 ): with the sequence 4, 5: T_4 = 430 - 336 = 94, T_5 = 560 - 337 = 223, T_4 + T_5 = 317; with 5, 4: T_5 = 477 - 337 = 140, T_4 = 560 - 336 = 224, T_5 + T_4 = 364.

C_3(1) - d_3 = 347 + 83 - 266 = 430 - 266 = 164
C_3(2) - d_3 = 430 + 130 - 266 = 560 - 266 = 294

V( J(1, 4, 3), 0 ) = 0, achieved with the sequences 1, 2, 4 and 2, 1, 4.
V( J(5, 5, 3), 430 ) = 223.
V( J(1, 5, 3), 0 ) = 76, achieved with the sequences 1, 2, 4, 5 and 2, 1, 4, 5.
V( ∅, 560 ) = 0.

Comparing the three choices of δ: δ = 0 gives 0 + 81 + 317 = 398; δ = 1 gives 0 + 164 + 223 = 387; δ = 2 gives 76 + 294 + 0 = 370. Hence V( {1,...,5}, 0 ) = 370, with optimal sequences 1, 2, 4, 5, 3 and 2, 1, 4, 5, 3.
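With only five jobs the example can be verified by brute force (the dynamic program matters for larger n, where enumeration is hopeless):

```python
from itertools import permutations

p = {1: 121, 2: 79, 3: 147, 4: 83, 5: 130}
d = {1: 260, 2: 266, 3: 266, 4: 336, 5: 337}

def total_tardiness(seq):
    t = total = 0
    for j in seq:
        t += p[j]
        total += max(0, t - d[j])   # T_j = max(0, C_j - d_j)
    return total

best = min(total_tardiness(s) for s in permutations(p))
```

Both sequences quoted above attain the brute-force optimum of 370.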

Summary

- 1 || Σ U_j: a forward algorithm (Moore's algorithm).
- 1 || Σ w_j U_j is NP-hard.
- 1 || Σ T_j is NP-hard; a pseudo-polynomial algorithm based on dynamic programming exists.

Assembly Line Balancing

Characteristics of the assembly line balancing problem:
- A collection of n tasks must be completed on each item.
- Tasks are assigned to stations.
- Tasks must be sequenced properly, and certain tasks may not be done at the same station.
- The objective is to assign tasks to stations so as to minimize the cycle time C.
- The general problem is difficult to solve optimally, but effective heuristics are available (the text discusses one known as the ranked positional weight technique).
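A sketch of the ranked positional weight heuristic on an invented five-task instance (task times, precedences, and the cycle time are all hypothetical; here the cycle time is taken as given and the heuristic tries to use few stations, a common variant of the problem). A task's positional weight is its own time plus the times of all its successors; tasks are considered in decreasing weight order and placed in the current station when it has room and all their predecessors are already assigned.

```python
# Hypothetical tasks: name -> (task_time, set of immediate predecessors).
tasks = {
    "a": (5, set()),
    "b": (3, {"a"}),
    "c": (4, {"a"}),
    "d": (2, {"b"}),
    "e": (6, {"c", "d"}),
}

def successors(u):
    # all tasks that must come after u (transitively)
    direct = {v for v, (_, pred) in tasks.items() if u in pred}
    out = set(direct)
    for v in direct:
        out |= successors(v)
    return out

def positional_weight(u):
    return tasks[u][0] + sum(tasks[v][0] for v in successors(u))

def rpw_balance(tasks, cycle_time):
    ranked = sorted(tasks, key=positional_weight, reverse=True)
    stations, assigned = [], set()
    current, remaining = [], cycle_time
    while len(assigned) < len(tasks):
        fits = [u for u in ranked if u not in assigned
                and tasks[u][1] <= assigned      # all predecessors assigned
                and tasks[u][0] <= remaining]    # task time fits
        if fits:
            u = fits[0]
            current.append(u)
            assigned.add(u)
            remaining -= tasks[u][0]
        else:
            stations.append(current)             # open a new station
            current, remaining = [], cycle_time
    stations.append(current)
    return stations
```

With a cycle time of 10, rpw_balance(tasks, 10) packs the five tasks into two stations.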

[Figure: Schematic of a Typical Assembly Line]
