
1 Algorithm Design and Analysis, Lecture 13: Dynamic Programming (Fibonacci, Weighted Interval Scheduling). Adam Smith. 9/10/10 A. Smith; based on slides by E. Demaine, C. Leiserson, S. Raskhodnikova, K. Wayne.

2 Design Techniques So Far
Greedy. Build up a solution incrementally, myopically optimizing some local criterion.
Recursion / divide & conquer. Break up a problem into subproblems, solve the subproblems, and combine their solutions.
Dynamic programming. Break the problem into overlapping subproblems, and build up solutions to larger and larger subproblems.

3 Dynamic Programming History
Bellman [1950s]: pioneered the systematic study of dynamic programming.
Etymology:
- Dynamic programming = planning over time.
- The Secretary of Defense was hostile to mathematical research.
- Bellman sought an impressive name to avoid confrontation: "it's impossible to use dynamic in a pejorative sense"; "something not even a Congressman could object to".
Reference: Bellman, R. E., Eye of the Hurricane: An Autobiography.

4 Dynamic Programming Applications
Areas:
- Bioinformatics.
- Control theory.
- Information theory.
- Operations research.
- Computer science: theory, graphics, AI, compilers, systems, ...
Some famous dynamic programming algorithms:
- Unix diff for comparing two files.
- Viterbi for hidden Markov models / decoding convolutional codes.
- Smith-Waterman for genetic sequence alignment.
- Bellman-Ford for shortest-path routing in networks.
- Cocke-Kasami-Younger for parsing context-free grammars.

5 Fibonacci Sequence
Sequence defined by a_1 = 1, a_2 = 1, a_n = a_(n-1) + a_(n-2): 1, 1, 2, 3, 5, 8, 13, 21, 34, ...
How should you compute the Fibonacci sequence?
Recursive algorithm:
Fib(n)
  If n = 1 or n = 2, then return 1
  Else
    a = Fib(n-1)
    b = Fib(n-2)
    return a + b
Running time?
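A direct Python transcription of the recursive pseudocode above (the function name is mine, not from the slides), to make the running-time question concrete:

```python
def fib(n):
    """Naive recursive Fibonacci, exactly as on the slide."""
    # Base cases: a_1 = a_2 = 1
    if n <= 2:
        return 1
    # Each call spawns two more calls, so the call tree
    # grows exponentially in n.
    return fib(n - 1) + fib(n - 2)
```

Try timing fib(35) versus fib(40): the slowdown makes the exponential blowup visible.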

6 Computing Fibonacci Sequence Faster
Observation: lots of redundancy! The recursive algorithm only solves n-1 different sub-problems.
"Memoization": store the values returned by recursive calls in a table.
Resulting algorithm:
Fib(n)
  If n = 1 or n = 2, then return 1
  Else
    f[1] = 1; f[2] = 1
    For i = 3 to n
      f[i] = f[i-1] + f[i-2]
    return f[n]
Linear time, if integer operations take constant time.
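The table-filling version above, transcribed into Python (function name mine):

```python
def fib_table(n):
    """Bottom-up Fibonacci: fill f[1..n] once.

    O(n) integer operations, versus exponentially many
    recursive calls for the naive algorithm.
    """
    if n <= 2:
        return 1
    f = [0] * (n + 1)   # f[0] unused; indices match the slide
    f[1] = f[2] = 1
    for i in range(3, n + 1):
        f[i] = f[i - 1] + f[i - 2]
    return f[n]
```

Note the same redundancy-elimination idea can also be written top-down, caching results of recursive calls; both solve each of the n-1 sub-problems exactly once.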

7 Computing Fibonacci Sequence Even Faster
Observation: the Fibonacci recurrence is linear, so it can be written as multiplication by a fixed 2x2 matrix A (the standard choice is A = [[1, 1], [1, 0]], whose powers contain the Fibonacci numbers).
Can compute A^n using only O(log n) matrix multiplications via repeated squaring; each one takes O(1) integer multiplications and additions.
Total running time? O(log n) integer operations. Exponential improvement!
Exercise: how big an improvement if we count bit operations? Multiplying k-bit numbers takes O(k log k) time. How many bits are needed to write down a_n?
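A sketch of the repeated-squaring idea in Python, assuming the standard matrix A = [[1, 1], [1, 0]] with A^n = [[F(n+1), F(n)], [F(n), F(n-1)]] (function names are mine):

```python
def mat_mult(A, B):
    """Multiply two 2x2 matrices."""
    return [[A[0][0]*B[0][0] + A[0][1]*B[1][0],
             A[0][0]*B[0][1] + A[0][1]*B[1][1]],
            [A[1][0]*B[0][0] + A[1][1]*B[1][0],
             A[1][0]*B[0][1] + A[1][1]*B[1][1]]]

def mat_pow(A, k):
    """Repeated squaring: O(log k) 2x2 matrix multiplications."""
    result = [[1, 0], [0, 1]]  # 2x2 identity
    while k > 0:
        if k & 1:
            result = mat_mult(result, A)
        A = mat_mult(A, A)
        k >>= 1
    return result

def fib_fast(n):
    # [[1,1],[1,0]]^n = [[F(n+1), F(n)], [F(n), F(n-1)]]
    return mat_pow([[1, 1], [1, 0]], n)[0][1]
```

This counts integer operations; for the exercise on bit operations, note that the entries themselves grow to Theta(n) bits.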

8 Weighted Interval Scheduling
Weighted interval scheduling problem:
- Job j starts at s_j, finishes at f_j, and has weight or value v_j.
- Two jobs are compatible if they don't overlap.
- Goal: find a maximum-weight subset of mutually compatible jobs.
[Figure: jobs a through h laid out on a time axis from 0 to 10.]

9 Unweighted Interval Scheduling Review
Recall: the greedy algorithm works if all weights are 1.
- Consider jobs in ascending order of finish time.
- Add a job to the subset if it is compatible with previously chosen jobs.
Observation: the greedy algorithm can fail spectacularly with weights.
[Figure: a short job of weight 1 finishes earliest, so greedy selects it and excludes an overlapping job of weight 999.]

10 Weighted Interval Scheduling
Notation: label jobs by finishing time: f_1 <= f_2 <= ... <= f_n.
Def. p(j) = largest index i < j such that job i is compatible with j = largest i such that f_i <= s_j.
Ex: p(8) = 5, p(7) = 3, p(2) = 0.
[Figure: jobs 1 through 8 on a time axis from 0 to 11.]

11 Dynamic Programming: Binary Choice
Notation: OPT(j) = value of optimal solution to the problem consisting of job requests 1, 2, ..., j.
- Case 1: OPT selects job j.
  - Collect profit v_j.
  - Can't use incompatible jobs { p(j)+1, p(j)+2, ..., j-1 }.
  - Must include optimal solution to the problem consisting of remaining compatible jobs 1, 2, ..., p(j).
- Case 2: OPT does not select job j.
  - Must include optimal solution to the problem consisting of remaining compatible jobs 1, 2, ..., j-1.
This is optimal substructure: OPT(j) = max(v_j + OPT(p(j)), OPT(j-1)), with OPT(0) = 0.

12 Weighted Interval Scheduling: Brute Force
Brute-force algorithm:
Input: n, s_1,...,s_n, f_1,...,f_n, v_1,...,v_n
Sort jobs by finish times so that f_1 <= f_2 <= ... <= f_n.
Compute p(1), p(2), ..., p(n).
Compute-Opt(j) {
  if (j = 0)
    return 0
  else
    return max(v_j + Compute-Opt(p(j)), Compute-Opt(j-1))
}
Worst case: T(n) = T(n-1) + T(n-2) + O(1), which is exponential.

13 Weighted Interval Scheduling: Brute Force
Observation: the recursive algorithm fails spectacularly because of redundant sub-problems, giving an exponential algorithm.
Ex: the number of recursive calls for the family of "layered" instances with p(1) = 0 and p(j) = j-2 grows like the Fibonacci sequence.
[Figure: recursion tree for Compute-Opt(5); each node j spawns calls on j-1 and j-2.]

14 Weighted Interval Scheduling: Memoization
Memoization: store the result of each sub-problem in a table; look it up as needed.
Input: n, s_1,...,s_n, f_1,...,f_n, v_1,...,v_n
Sort jobs by finish times so that f_1 <= f_2 <= ... <= f_n.
Compute p(1), p(2), ..., p(n).
M[0] = 0
for j = 1 to n
  M[j] = empty          (M is a global array)
M-Compute-Opt(j) {
  if (M[j] is empty)
    M[j] = max(v_j + M-Compute-Opt(p(j)), M-Compute-Opt(j-1))
  return M[j]
}

15 Weighted Interval Scheduling: Running Time
Claim: the memoized version of the algorithm takes O(n log n) time.
- Sort by finish time: O(n log n).
- Computing p(.): O(n log n) via repeated binary search.
- M-Compute-Opt(j): each invocation takes O(1) time and either
  (i) returns an existing value M[j], or
  (ii) fills in one new entry M[j] and makes two recursive calls.
- Case (ii) occurs at most n times, so at most 2n recursive calls overall.
- Overall running time of M-Compute-Opt(n) is O(n). ▪
Remark: O(n) total if jobs are pre-sorted by start and finish times.

16 Equivalent Algorithm: Bottom-Up
Bottom-up dynamic programming: unwind the recursion.
Input: n, s_1,...,s_n, f_1,...,f_n, v_1,...,v_n
Sort jobs by finish times so that f_1 <= f_2 <= ... <= f_n.
Compute p(1), p(2), ..., p(n).
Iterative-Compute-Opt {
  M[0] = 0
  for j = 1 to n
    M[j] = max(v_j + M[p(j)], M[j-1])
}
How fast? Total time = (sorting) + O(n) = O(n log n).

17 Weighted Interval Scheduling: Finding a Solution
Q: The dynamic programming algorithm computes the optimal value. What if we want the solution itself?
A: Do some post-processing.
Run M-Compute-Opt(n)
Run Find-Solution(n)
Find-Solution(j) {
  if (j = 0)
    output nothing
  else if (v_j + M[p(j)] > M[j-1])
    print j
    Find-Solution(p(j))
  else
    Find-Solution(j-1)
}
Number of recursive calls <= n, so O(n).
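The whole pipeline (sort, compute p via binary search, fill the table bottom-up, trace back a solution) can be sketched in Python; names and the input representation are my own, not the slides':

```python
import bisect

def weighted_interval_scheduling(jobs):
    """jobs: list of (start, finish, value) triples.

    Returns (optimal value, chosen jobs as 1-based indices
    in sorted-by-finish order).
    """
    jobs = sorted(jobs, key=lambda j: j[1])        # sort by finish time
    starts = [s for s, f, v in jobs]
    finishes = [f for s, f, v in jobs]
    n = len(jobs)
    # p[j] = largest i < j with f_i <= s_j (0 means "no compatible job");
    # one binary search per job: O(n log n) total.
    p = [0] * (n + 1)
    for j in range(1, n + 1):
        p[j] = bisect.bisect_right(finishes, starts[j - 1], 0, j - 1)
    # Bottom-up table: M[j] = value of optimal schedule among jobs 1..j.
    M = [0] * (n + 1)
    for j in range(1, n + 1):
        M[j] = max(jobs[j - 1][2] + M[p[j]], M[j - 1])
    # Find-Solution: replay the binary choices backwards.
    chosen, j = [], n
    while j > 0:
        if jobs[j - 1][2] + M[p[j]] > M[j - 1]:
            chosen.append(j)
            j = p[j]
        else:
            j -= 1
    return M[n], chosen[::-1]
```

On the slide-9 style counterexample, a short weight-1 job overlapping a long weight-999 job, this returns value 999, where greedy-by-finish-time would return 1.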

18 Knapsack Problem
(10/8/2008 A. Smith; based on slides by E. Demaine, C. Leiserson, S. Raskhodnikova, K. Wayne)
Given n objects and a "knapsack":
- Item i weighs w_i > 0 kilograms and has value v_i > 0.
- Knapsack has capacity of W kilograms.
- Goal: fill the knapsack so as to maximize total value.
Ex: W = 11; items:
  #     : 1  2   3   4   5
  value : 1  6  18  22  28
  weight: 1  2   5   6   7
{ 3, 4 } has value 40.
Many "packing" problems fit this model:
- Assigning production jobs to factories.
- Deciding which jobs to do on a single processor with bounded time.
- Deciding which problems to do on an exam.

19 Knapsack Problem
Given n objects and a "knapsack":
- Item i weighs w_i > 0 kilograms and has value v_i > 0.
- Knapsack has capacity of W kilograms.
- Goal: fill the knapsack so as to maximize total value.
Ex: W = 11, same items as before; { 3, 4 } has value 40.
Greedy: repeatedly add the item with maximum ratio v_i / w_i.
Example: greedy picks { 5, 2, 1 } and achieves only value 35, so greedy is not optimal.

20 Knapsack Problem
Given n objects and a "knapsack":
- Item i weighs w_i > 0 kilograms and has value v_i > 0.
- Knapsack has capacity of W kilograms.
- Goal: fill the knapsack so as to maximize total value.
Ex: W = 11, same items as before; { 3, 4 } has value 40.
Greedy algorithm?

21 Dynamic Programming: Attempt 1
Definition: OPT(i) = maximum profit subset of items 1, ..., i.
- Case 1: OPT does not select item i. OPT selects the best of { 1, 2, ..., i-1 }.
- Case 2: OPT selects item i. Without knowing what other items were selected before i, we don't even know if we have enough room for i.
Conclusion: we need more sub-problems!

22 Adding a New Variable
Definition: OPT(i, w) = max profit subset of items 1, ..., i with weight limit w.
- Case 1: OPT does not select item i. OPT selects the best of { 1, 2, ..., i-1 } with weight limit w.
- Case 2: OPT selects item i. The new weight limit is w - w_i, and OPT selects the best of { 1, 2, ..., i-1 } with that new weight limit.
Recurrence: OPT(i, w) = OPT(i-1, w) if w_i > w; otherwise max(OPT(i-1, w), v_i + OPT(i-1, w - w_i)); with OPT(0, w) = 0.

23 Bottom-Up Algorithm
Fill up an (n+1)-by-(W+1) array.
Input: n, W, w_1,...,w_n, v_1,...,v_n
for w = 0 to W
  M[0, w] = 0
for i = 1 to n
  for w = 1 to W
    if (w_i > w)
      M[i, w] = M[i-1, w]
    else
      M[i, w] = max(M[i-1, w], v_i + M[i-1, w - w_i])
return M[n, W]
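The pseudocode above translates almost line for line into Python (function name and 0-indexed lists are my conventions):

```python
def knapsack(values, weights, W):
    """0/1 knapsack via the (n+1) x (W+1) table M.

    M[i][w] = max value achievable using items 1..i
    with weight limit w.
    """
    n = len(values)
    M = [[0] * (W + 1) for _ in range(n + 1)]   # row 0 and column 0 are all 0
    for i in range(1, n + 1):
        for w in range(1, W + 1):
            if weights[i - 1] > w:
                # Item i doesn't fit within limit w.
                M[i][w] = M[i - 1][w]
            else:
                # Binary choice: skip item i, or take it and
                # solve items 1..i-1 with limit w - w_i.
                M[i][w] = max(M[i - 1][w],
                              values[i - 1] + M[i - 1][w - weights[i - 1]])
    return M[n][W]
```

On the running example (values 1, 6, 18, 22, 28; weights 1, 2, 5, 6, 7; W = 11) this returns 40, matching OPT = { 3, 4 }.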

24 Knapsack Table
W = 11; items (value, weight): 1:(1,1), 2:(6,2), 3:(18,5), 4:(22,6), 5:(28,7).
M[i, w], rows = set of items considered, columns = weight limit w:
                     w: 0  1  2  3  4  5  6  7  8  9 10 11
  {}                    0  0  0  0  0  0  0  0  0  0  0  0
  { 1 }                 0  1  1  1  1  1  1  1  1  1  1  1
  { 1, 2 }              0  1  6  7  7  7  7  7  7  7  7  7
  { 1, 2, 3 }           0  1  6  7  7 18 19 24 25 25 25 25
  { 1, 2, 3, 4 }        0  1  6  7  7 18 22 24 28 29 29 40
  { 1, 2, 3, 4, 5 }     0  1  6  7  7 18 22 28 29 34 35 40
OPT: { 4, 3 }, value = 22 + 18 = 40.

25 How do we turn this into a proof of correctness?
Dynamic programming (and divide and conquer) lends itself directly to induction.
- Base cases: OPT(i, 0) = 0 and OPT(0, w) = 0 (no items!).
- Inductive step: explain the correctness of the recursive formula.
  If the following values are correctly computed:
  OPT(0, w-1), OPT(1, w-1), ..., OPT(n, w-1), and OPT(0, w), ..., OPT(i-1, w),
  then the recursive formula computes OPT(i, w) correctly.
  Case 1: ..., Case 2: ... (from the previous slide).

26 About Proofs
A proof is a rigorous argument about a mathematical statement's truth.
- It should convince you.
- It should not feel like a shot in the dark.
- What makes a proof "good enough" is social.
- (Truly rigorous, machine-readable proofs exist but are too painful for human use.)
In real life, you have to check your own work.
- Ideal: all problems should have grades of 100% or 20%.

27 Time and Space Complexity
Given n objects and a "knapsack":
- Item i weighs w_i > 0 kilograms and has value v_i > 0.
- Knapsack has capacity of W kilograms.
- Goal: fill the knapsack so as to maximize total value.
(Same example as before: W = 11, values 1, 6, 18, 22, 28, weights 1, 2, 5, 6, 7.)
What is the input size? In words? In bits?

28 Time and Space Complexity
Time and space: Theta(nW).
- Not polynomial in the input size!
- Space can be made O(W) (exercise!).
- "Pseudo-polynomial."
- The decision version of Knapsack is NP-complete. [KT, chapter 8]
Knapsack approximation algorithm: there is a poly-time algorithm that produces a solution with value within 0.01% of optimum. [KT, section 11.8]
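One possible solution to the O(W)-space exercise, sketched in Python (this is my sketch, not the slides' answer): since row i of M depends only on row i-1, a single array suffices if we sweep w downward so that M[w - w_i] still holds the previous row's value when we read it.

```python
def knapsack_1d(values, weights, W):
    """0/1 knapsack in O(W) space.

    M[w] holds the best value achievable with limit w using
    the items processed so far.
    """
    M = [0] * (W + 1)
    for v, wt in zip(values, weights):
        # Downward sweep: M[w - wt] is still the previous item's
        # row, so each item is used at most once.
        for w in range(W, wt - 1, -1):
            M[w] = max(M[w], v + M[w - wt])
    return M[W]
```

Note the running time is still Theta(nW); only the space drops. Recovering the chosen item set from one row needs extra bookkeeping.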
