
1 Short Term Scheduling

2 Characteristics
• Planning horizon is short
• Multiple unique jobs (tasks) with varying processing times and due dates
• Multiple unique jobs sharing the same set of resources (machines)
• Time is treated as continuous (not discretized into periods)
• Varying objective functions

3 Characteristics (Continued…)
• Common in make-to-order environments with high product variety
• Common as a support tool for MRP in generating detailed schedules once orders have been released

4 Example
Two jobs, A and B
Two machines, M1 and M2
Jobs are processed on M1 and then on M2
Job A: 9 minutes on M1 and 2 minutes on M2
Job B: 4 minutes on M1 and 9 minutes on M2

5 Example

6 Example (Continued…)

7 Challenge
As the number of jobs increases, complete enumeration becomes difficult:
3! = 6, 4! = 24, 5! = 120, 6! = 720, …
10! = 3,628,800, while 13! = 6,227,020,800
25! = 15,511,210,043,330,985,984,000,000
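For two jobs the enumeration can still be done by hand; a minimal Python sketch of the same brute-force idea, using the two-job data from the earlier example (job A: 9 then 2 minutes, job B: 4 then 9 minutes):

```python
from itertools import permutations

# Processing times (M1, M2) from the two-job example slide.
jobs = {"A": (9, 2), "B": (4, 9)}

def makespan(seq):
    """Makespan of a two-machine flow shop for a given job sequence."""
    t1 = t2 = 0  # completion time of the latest job on M1 and on M2
    for j in seq:
        p1, p2 = jobs[j]
        t1 += p1               # M1 processes jobs back to back
        t2 = max(t2, t1) + p2  # M2 waits until the job leaves M1
    return t2

best = min(permutations(jobs), key=makespan)
print(best, makespan(best))  # ('B', 'A') 15
```

Scheduling B first keeps M2 busy early (makespan 15 rather than 22), which is exactly the intuition that Johnson’s algorithm on the later slides formalizes.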

8 Classification of Scheduling Problems
• Number of jobs
• Number of machines
• Type of production facility
  – Single machine
  – Flow shop
  – Parallel machines
  – Job shop
• Job arrivals
  – Static
  – Dynamic
• Performance measures

9 A Single Machine Example

Jobs:                   1    2    3    4    5    6
Processing time, p_j:  12    8    3   10    4   18
Release time, r_j:    -20  -15   12  -10    3    2
Due date, d_j:         10    2   72   -8    8   60

10 The Single Machine Problem
Single machine scheduling problems seek an optimal sequence (for a given criterion) in which to complete a given collection of jobs on a single machine that can accommodate only one job at a time.

11 Decision Variables
• x_j: time job j is started (relative to time 0 = now), with x_j ≥ max(0, r_j) for all values of j.

13 Sequencing Constraints
(start time of j) + (processing time of j) ≤ (start time of j’), or
(start time of j’) + (processing time of j’) ≤ (start time of j)
That is: x_j + p_j ≤ x_j’ or x_j’ + p_j’ ≤ x_j

14 Disjunctive Variables
Introduce disjunctive variables y_jj’, where y_jj’ = 1 if job j is scheduled before job j’ and y_jj’ = 0 otherwise:
x_j + p_j ≤ x_j’ + M(1 − y_jj’)
x_j’ + p_j’ ≤ x_j + M·y_jj’
for all pairs j and j’ (for every j and every j’ > j), where M is a large positive constant.
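The big-M pair can be sanity-checked numerically. Below is a small Python sketch (a feasibility check only, not a solver); the schedule, the value of M, and the y values are made-up illustrations using the processing times of jobs 1–3 from the earlier example:

```python
M = 10_000  # big-M: any constant larger than the schedule horizon

def disjunctive_ok(x, p, y):
    """Check the big-M constraint pair for every pair of jobs (j, k)."""
    n = len(x)
    for j in range(n):
        for k in range(j + 1, n):
            if x[j] + p[j] > x[k] + M * (1 - y[(j, k)]):
                return False  # "j before k" constraint violated
            if x[k] + p[k] > x[j] + M * y[(j, k)]:
                return False  # "k before j" constraint violated
    return True

# Three jobs run back to back: intervals 0-12, 12-20, 20-23.
x, p = [0, 12, 20], [12, 8, 3]
y = {(0, 1): 1, (0, 2): 1, (1, 2): 1}  # y[(j, k)] = 1 means j before k
print(disjunctive_ok(x, p, y))  # True
```

When y_jj’ = 1 the first inequality is binding and the second is relaxed by M; with y_jj’ = 0 the roles flip, so exactly one ordering is enforced for each pair.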

15 Due Date Constraints
x_j + p_j ≤ d_j for all values of j

16 Examples of Performance Measures

17 Example

Jobs:             1    2    3
Processing time: 15    6    9
Release time:     5   10    0
Due date:        20   25   36
Start time:       9   24    0

18 Objective Functions

19 Objective Functions

20 Formulation: Minimizing Makespan (Maximum Completion Time)

21 A Formulation with a Linear Objective Function

22
• Similar formulations can be constructed with other min-max objective functions, such as minimizing maximum lateness or maximum tardiness.
• Other objective functions involving minimizing means (other than mean tardiness) are already linear.

23 The Job Shop Scheduling Problem
• N jobs
• M machines
• Each job j visits a subset of the machines in a specified sequence

24 Notation
• p_jm: processing time of job j on machine m
• x_jm: start time of job j on machine m
• y_j,j’,m = 1 if job j is scheduled before job j’ on machine m
• M(j): the subset of machines visited by job j
• SS(m, j): the set of machines that job j visits after visiting machine m

25 Formulation

26 Solution Methods
• Small to medium problems can be solved exactly (to optimality) using techniques such as branch and bound and dynamic programming
• Structural results and polynomial (fast) algorithms exist for certain special cases
• Large problems in general may not solve within a reasonable amount of time (the problem belongs to a class of combinatorial optimization problems called NP-hard)
• Large problems can be solved approximately using heuristic approaches

27 Single Machine Results
Makespan
• Not affected by sequence
Average Flow Time
• Minimized by performing jobs in “shortest processing time” (SPT) order
Average Lateness
• Minimized by performing jobs in SPT order
Maximum Lateness (or Tardiness)
• Minimized by performing jobs in “earliest due date” (EDD) order
• If there exists a sequence with no tardy jobs, EDD will find one
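The SPT claim can be verified by brute force. A sketch using the processing times from the single-machine example slide, with release times ignored (all jobs assumed available at time 0, which is what the SPT result requires):

```python
from itertools import permutations

p = [12, 8, 3, 10, 4, 18]  # processing times; all jobs available at time 0

def avg_flow(seq):
    """Average flow time of a sequence (flow time = completion time here)."""
    t, total = 0, 0
    for j in seq:
        t += p[j]
        total += t
    return total / len(seq)

spt = sorted(range(len(p)), key=lambda j: p[j])        # SPT order
best = min(permutations(range(len(p))), key=avg_flow)  # exhaustive check
print(avg_flow(spt) == avg_flow(best))  # True: SPT matches the optimum
```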

28 Single Machine Results (Continued…)
Average Weighted Flow Time
• Minimized by performing jobs in order of smallest ratio of processing time to weight (weighted SPT)
Average Tardiness
• No simple sequencing rule will work

29 Two Machine Results
Given a set of jobs that must go through a sequence of two machines, what sequence will yield the minimum makespan?

30 Johnson’s Algorithm
A simple algorithm (Johnson 1954):
1. Sort the processing times of the jobs on the two machines into two lists.
2. Find the shortest processing time in either list and remove the corresponding job from both lists.
 – If the time came from the first list, place the job in the first available position in the sequence.
 – If the time came from the second list, place the job in the last available position in the sequence.
3. Repeat until all jobs have been sequenced.
The resulting sequence minimizes makespan.
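The removal procedure above is equivalent to a sort: jobs whose M1 time is shorter than their M2 time go first, in increasing M1 order; the remaining jobs go last, in decreasing M2 order. A sketch using the two-job data from slide 4:

```python
def johnson(jobs):
    """Johnson's rule; jobs maps name -> (time on M1, time on M2)."""
    front = sorted((j for j, (a, b) in jobs.items() if a < b),
                   key=lambda j: jobs[j][0])  # short-M1 jobs first
    back = sorted((j for j, (a, b) in jobs.items() if a >= b),
                  key=lambda j: jobs[j][1], reverse=True)  # short-M2 jobs last
    return front + back

print(johnson({"A": (9, 2), "B": (4, 9)}))  # ['B', 'A']
```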

32 Johnson’s Algorithm Example
Iteration 1: the minimum time is 4 (job 1 on M1); place this job first and remove it from both lists.

34 Johnson’s Algorithm Example (Continued…)
Iteration 2: the minimum time is 5 (job 3 on M2); place this job last and remove it from both lists.
Iteration 3: only job 2 is left; place it in the remaining (middle) position.
Final sequence: 1-2-3. Makespan: 28.

35 Gantt Chart for Johnson’s Algorithm Example
Short task on M1 to “load up” quickly. Short task on M2 to “clear out” quickly.

36 Three Machine Results
• Johnson’s algorithm can be extended to three machines by creating two composite machines, M1* = M1 + M2 and M2* = M2 + M3, and then applying Johnson’s algorithm to these two machines
• Optimality is guaranteed only when certain conditions are met:
  – the smallest processing time on M1 is greater than or equal to the largest processing time on M2, or
  – the smallest processing time on M3 is greater than or equal to the largest processing time on M2

37 Multi-Machine Results
• Generate M-1 pairs of composite (dummy) machines
• Example: with 4 machines, we have the following three pairs: (M1, M4), (M1+M2, M3+M4), (M1+M2+M3, M2+M3+M4)
• Apply Johnson’s algorithm to each pair and select the best schedule out of the M-1 schedules generated
• Optimality is not guaranteed
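A sketch of this composite-machine procedure (it matches what is commonly called the Campbell–Dudek–Smith, or CDS, heuristic), assuming a permutation flow shop, i.e., every machine processes the jobs in the same order; the Johnson step uses the sorted form of the rule:

```python
def johnson_seq(a, b):
    """Johnson's rule for two machines; a[j], b[j] are job j's times."""
    jobs = range(len(a))
    front = sorted((j for j in jobs if a[j] < b[j]), key=lambda j: a[j])
    back = sorted((j for j in jobs if a[j] >= b[j]),
                  key=lambda j: b[j], reverse=True)
    return front + back

def flow_shop_makespan(p, seq):
    """p[j][k]: time of job j on machine k; evaluate a permutation schedule."""
    done = [0] * len(p[0])  # completion time of the latest job on each machine
    for j in seq:
        for k in range(len(done)):
            start = max(done[k], done[k - 1] if k else 0)
            done[k] = start + p[j][k]
    return done[-1]

def cds(p):
    """Try the M-1 composite-machine pairs; keep the best schedule found."""
    n, m = len(p), len(p[0])
    best_seq, best_ms = None, float("inf")
    for k in range(1, m):
        a = [sum(p[j][:k]) for j in range(n)]      # first k machines summed
        b = [sum(p[j][m - k:]) for j in range(n)]  # last k machines summed
        seq = johnson_seq(a, b)
        ms = flow_shop_makespan(p, seq)
        if ms < best_ms:
            best_seq, best_ms = seq, ms
    return best_seq, best_ms

print(cds([[9, 2], [4, 9]]))  # ([1, 0], 15): reduces to plain Johnson for M = 2
```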

38 Dispatching Rules
In general, simple sequencing rules (dispatching rules) do not lead to optimal schedules. However, they are often used to solve complex scheduling problems approximately (heuristically).
Basic approach:
• Decompose a multi-machine problem (e.g., a job shop scheduling problem) into sub-problems, each involving a single machine.
• Use a simple dispatching rule to sequence the jobs on each of these machines.

39 Example Dispatching Rules
• FIFO – simplest; seems “fair”
• SPT – actually works quite well with tight due dates
• EDD – works well when jobs are mostly the same size
• Critical ratio (time until due date / work remaining) – works well for tardiness measures
• Many (hundreds of) others
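The critical ratio is straightforward to compute; a small sketch (job names, due dates, and remaining work are hypothetical) picking the most urgent job, i.e., the one with the smallest ratio:

```python
def critical_ratio(due_date, now, work_remaining):
    """Time until the due date divided by the work still to be done."""
    return (due_date - now) / work_remaining

# Hypothetical queue at time now = 10: (due date, work remaining) per job.
queue = {"J1": (20, 4), "J2": (18, 2), "J3": (28, 8)}
now = 10
pick = min(queue, key=lambda j: critical_ratio(queue[j][0], now, queue[j][1]))
print(pick)  # J3 (ratio 2.25, the most urgent)
```

A ratio below 1 signals a job that cannot finish on time even with no further waiting.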

40 Heuristic Algorithms
• Construction heuristics
  – Use a procedure (a set of rules) to construct a good (but not necessarily optimal) schedule from scratch
• Improvement heuristics
  – Starting from a feasible schedule (possibly obtained using a construction heuristic), use a procedure to further improve the schedule

41 Example: A Single Machine with Setups
• N jobs to be scheduled on a single machine, with a sequence-dependent setup preceding the processing of each job.
• The objective is to identify a sequence that minimizes makespan.
• The problem is an instance of the Traveling Salesman Problem (TSP).
• The problem is NP-hard (the number of computational steps required to solve it grows exponentially with the number of jobs).

45 A Heuristic Algorithm
• Greedy heuristic: Start with an arbitrary job from the set of N jobs. Schedule the remaining jobs based on “next shortest setup time.”
• Improved greedy heuristic: Evaluate the sequences produced by all possible starting jobs (N different schedules). Choose the schedule with the shortest makespan.
• Improved heuristic: Starting from the improved greedy solution, carry out a series of pairwise interchanges in the job sequence. Stop when the solution stops improving.
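The three variants can be sketched as follows; the 4-job setup-time matrix is hypothetical, and processing times are taken as constant so only setups affect the makespan:

```python
from itertools import combinations

# Hypothetical data: s[i][j] = setup time when job j follows job i.
s = [[0, 3, 8, 2],
     [3, 0, 4, 7],
     [8, 4, 0, 5],
     [2, 7, 5, 0]]
n = 4  # number of jobs

def setup_cost(seq):
    """Total setup time along a sequence (processing times are constant)."""
    return sum(s[seq[i]][seq[i + 1]] for i in range(len(seq) - 1))

def greedy(start):
    """Greedy heuristic: repeatedly take the job with the next shortest setup."""
    seq, left = [start], set(range(n)) - {start}
    while left:
        nxt = min(left, key=lambda j: s[seq[-1]][j])
        seq.append(nxt)
        left.remove(nxt)
    return seq

# Improved greedy heuristic: try every starting job, keep the cheapest.
best = min((greedy(j) for j in range(n)), key=setup_cost)

# Improved heuristic: pairwise interchanges until no swap helps.
improved = True
while improved:
    improved = False
    for i, k in combinations(range(n), 2):
        cand = best[:]
        cand[i], cand[k] = cand[k], cand[i]
        if setup_cost(cand) < setup_cost(best):
            best, improved = cand, True
print(best, setup_cost(best))  # [2, 1, 0, 3] 9
```

On this instance the improved greedy solution happens to be optimal, so the interchange phase finds nothing to change; in general each stage can only improve on the previous one.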

46 A Problem Instance
• 16 jobs
• Each job takes 1 hour on a single machine (the bottleneck resource)
• 4 hours of setup to change over from one job family to another
• Fixed due dates
• Find a solution that minimizes tardiness

47 EDD Sequence
Average tardiness: 10.375

48 A Greedy Search
• Consider all pairwise interchanges
• Choose the one that reduces average tardiness the most
• Continue until no further improvement is possible
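A compact sketch of this search on a made-up three-job instance (the processing times and due dates are hypothetical, chosen so that EDD is not optimal for average tardiness):

```python
from itertools import combinations

def avg_tardiness(p, d, seq):
    """Average tardiness of a sequence starting at time 0."""
    t, total = 0, 0
    for j in seq:
        t += p[j]
        total += max(0, t - d[j])
    return total / len(seq)

# Hypothetical instance: one long job whose due date is already hopeless.
p, d = [10, 2, 2], [1, 3, 4]
seq = [0, 1, 2]  # EDD order
while True:      # greedy pairwise-interchange search
    swaps = []
    for i, k in combinations(range(len(seq)), 2):
        cand = seq[:]
        cand[i], cand[k] = cand[k], cand[i]
        swaps.append(cand)
    best = min(swaps, key=lambda s: avg_tardiness(p, d, s))
    if avg_tardiness(p, d, best) >= avg_tardiness(p, d, seq):
        break    # no interchange improves the incumbent
    seq = best
print(seq)  # [1, 2, 0]: the long, hopeless job moves to the back
```

Here EDD gives average tardiness 28/3, and two interchanges bring it down to 13/3 by deferring the job that will be late no matter where it runs.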

49 First Interchange: Exchange Jobs 4 and 5
Average tardiness: 5.0 (a reduction of 5.375!)

50 Greedy Search Final Sequence
Average tardiness: 0.5 (9.875 lower than EDD)

51 A Better (Due-Date Feasible) Sequence
Average tardiness: 0

52 Revisiting Computational Times
Current situation: computers can examine 1,000,000 sequences per second, and we wish to build a scheduling system with a response time of no longer than one minute. How many jobs can we sequence optimally (using a brute-force approach)?

53 Effect of Faster Computers
Future situation: new computers will be 1,000 times faster, i.e., they can examine one billion sequences per second. How many jobs can we sequence optimally now?
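Both questions reduce to finding the largest n whose n! sequences fit in the time budget; a short sketch:

```python
import math

def max_jobs(seqs_per_sec, seconds=60):
    """Largest n such that all n! sequences fit within the time budget."""
    n = 1
    while math.factorial(n + 1) <= seqs_per_sec * seconds:
        n += 1
    return n

print(max_jobs(10**6))  # 11: a million sequences/second handles 11 jobs
print(max_jobs(10**9))  # 13: a thousand-fold speedup adds only two jobs
```

The factorial growth swamps the hardware: 11! ≈ 4.0 × 10^7 fits in a minute at a million sequences per second, but 14! ≈ 8.7 × 10^10 already exceeds a minute even at a billion per second.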

54 Implications for Real Problems
Computation: Exact algorithms for NP-hard problems take time that grows exponentially in the worst case.
No technology fix: Faster computers do not help much on NP-hard problems.
Exact algorithms: Specialized algorithms are needed that exploit the structure of the problem to reduce the search space.
Heuristics: Likely to continue to be the dominant approach to solving large problems in practice (e.g., multi-step exchange algorithms, genetic algorithms, simulated annealing, tabu search, among others).

55 Implications for Real Problems (Continued…)
Robustness: NP-hard problems have many solutions, and presumably many “good” ones.
• Example: a 25-job sequencing problem. Suppose that only one in a trillion of the possible solutions is “good.” This still leaves about 15 trillion “good” solutions; our task is to find one of them.
Focus on the bottleneck: We can often concentrate on scheduling the bottleneck process, which brings the problem closer to the single machine case.

