
1 UMass Lowell Computer Science 91.404 Analysis of Algorithms Prof. Karen Daniels Spring, 2008 Design Patterns for Optimization Problems Dynamic Programming & Greedy Algorithms

2 Algorithmic Paradigm Context: in what order are subproblems solved relative to making a choice?
Greedy: make a choice, then solve the resulting subproblem(s).
Dynamic Programming: solve subproblem(s), then make a choice.

3 Dynamic Programming Approach to Optimization Problems
1. Characterize the structure of an optimal solution.
2. Recursively define the value of an optimal solution.
3. Compute the value of an optimal solution in bottom-up fashion.
4. Construct an optimal solution from computed information.
source: 91.503 textbook Cormen, et al.

4 Dynamic Programming Longest Common Subsequence

5 Example: Longest Common Subsequence (LCS): Motivation
- Strand of DNA: a string over the finite set {A, C, G, T}; each element of the set is a base: adenine, guanine, cytosine, or thymine.
- Compare DNA similarities:
  S1 = ACCGGTCGAGTGCGCGGAAGCCGGCCGAA
  S2 = GTCGTTCGGAATGCCGTTGCTCTGTAAA
- One measure of similarity: find the longest string S3 whose bases also appear (not necessarily consecutively) in both S1 and S2:
  S3 = GTCGTCGGAAGCCGGCCGAA
source: 91.503 textbook Cormen, et al.

6 Example: LCS Definitions
- Sequence Z = <z1, ..., zk> is a subsequence of X = <x1, ..., xm> if there exists a strictly increasing sequence <i1, ..., ik> of indices of X such that x_{i_j} = z_j for all j = 1, ..., k.
  example: <B, C, D, B> is a subsequence of <A, B, C, B, D, A, B> with index sequence <2, 3, 5, 7>.
- Z is a common subsequence of X and Y if Z is a subsequence of both X and Y.
  example (X = <A, B, C, B, D, A, B>, Y = <B, D, C, A, B, A>):
  <B, C, A> is a common subsequence but not the longest;
  <B, C, B, A> is a common subsequence. Longest?
Longest Common Subsequence Problem: Given 2 sequences X, Y, find a maximum-length common subsequence Z.
source: 91.503 textbook Cormen, et al.
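As a concrete illustration of the subsequence definition above, the following Python sketch (my own, not part of the original slides; the function name is illustrative) checks whether Z is a subsequence of X by scanning X once and advancing through Z whenever a match is found:

    def is_subsequence(Z, X):
        """Return True if Z is a subsequence of X (strictly increasing indices)."""
        j = 0                              # position of the next symbol of Z still to match
        for x in X:
            if j < len(Z) and x == Z[j]:
                j += 1                     # matched z_j at a strictly later index of X
        return j == len(Z)

    # <B,C,D,B> is a subsequence of <A,B,C,B,D,A,B>
    print(is_subsequence(list("BCDB"), list("ABCBDAB")))   # True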

7 Example: LCS Step 1: Characterize an LCS
THM 15.1: Optimal LCS Substructure
Given sequences X = <x1, ..., xm> and Y = <y1, ..., yn>, for any LCS Z = <z1, ..., zk> of X and Y:
1. if x_m = y_n, then z_k = x_m = y_n and Z_{k-1} is an LCS of X_{m-1} and Y_{n-1}
2. if x_m ≠ y_n and z_k ≠ x_m, then Z is an LCS of X_{m-1} and Y
3. if x_m ≠ y_n and z_k ≠ y_n, then Z is an LCS of X and Y_{n-1}
PROOF: based on producing contradictions
1 a) Suppose z_k ≠ x_m = y_n. Appending x_m = y_n to Z yields a common subsequence of length k+1, contradicting the longest nature of Z.
  b) To establish the longest nature of Z_{k-1}, suppose a common subsequence W of X_{m-1} and Y_{n-1} has length > k-1. Appending x_m = y_n to W yields a common subsequence of length > k: contradiction.
2. A common subsequence W of X_{m-1} and Y of length > k would also be a common subsequence of X_m and Y, contradicting the longest nature of Z.
3. Similar to the proof of (2).
source: 91.503 textbook Cormen, et al.

8 Example: LCS Step 2: A Recursive Solution
Implications of Theorem 15.1: is x_m = y_n?
  yes: find LCS(X_{m-1}, Y_{n-1}); then LCS_1(X, Y) = LCS(X_{m-1}, Y_{n-1}) + x_m
  no: find LCS(X_{m-1}, Y) and LCS(X, Y_{n-1}); then LCS_2(X, Y) = max(LCS(X_{m-1}, Y), LCS(X, Y_{n-1}))

9 Example: LCS Step 2: A Recursive Solution (continued)
- Overlapping subproblem structure: conditions of the problem can exclude some subproblems!
- Recurrence for the length of an optimal solution:
  c[i,j] = 0                           if i = 0 or j = 0
  c[i,j] = c[i-1,j-1] + 1              if i,j > 0 and x_i = y_j
  c[i,j] = max(c[i,j-1], c[i-1,j])     if i,j > 0 and x_i ≠ y_j
  Θ(mn) distinct subproblems
source: 91.503 textbook Cormen, et al.
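A direct translation of this recurrence into Python (an illustrative sketch, not the textbook's code) uses memoization so that each of the Θ(mn) distinct subproblems is solved only once:

    from functools import lru_cache

    def lcs_length(X, Y):
        """Length of an LCS of X and Y, computed top-down from the recurrence."""
        @lru_cache(maxsize=None)
        def c(i, j):
            if i == 0 or j == 0:               # empty prefix: no common subsequence
                return 0
            if X[i - 1] == Y[j - 1]:           # x_i = y_j: extend the LCS of the shorter prefixes
                return c(i - 1, j - 1) + 1
            return max(c(i, j - 1), c(i - 1, j))   # x_i ≠ y_j: drop one trailing symbol
        return c(len(X), len(Y))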

10 Example: LCS Step 3: Compute Length of an LCS
[Figure: the c table for the example, with arrows representing the b table; entries range from 0 to 4]
What is the asymptotic worst-case time complexity?
source: 91.503 textbook Cormen, et al.
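A bottom-up sketch of the table computation (my own Python rendering, not the slide's pseudocode; the b table is stored as direction strings rather than arrows) fills all Θ(mn) entries row by row, so the worst-case time asked about above is Θ(mn):

    def lcs_length_tables(X, Y):
        """Fill the c (length) and b (direction) tables bottom-up, as in Step 3."""
        m, n = len(X), len(Y)
        c = [[0] * (n + 1) for _ in range(m + 1)]
        b = [[None] * (n + 1) for _ in range(m + 1)]
        for i in range(1, m + 1):
            for j in range(1, n + 1):
                if X[i - 1] == Y[j - 1]:
                    c[i][j] = c[i - 1][j - 1] + 1
                    b[i][j] = "diag"           # arrow pointing up-left
                elif c[i - 1][j] >= c[i][j - 1]:
                    c[i][j] = c[i - 1][j]
                    b[i][j] = "up"
                else:
                    c[i][j] = c[i][j - 1]
                    b[i][j] = "left"
        return c, b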

11 Example: LCS Step 4: Construct an LCS source: 91.503 textbook Cormen, et al.
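Step 4 walks the b table backward from entry (m, n). The sketch below (again an illustrative Python version in the spirit of PRINT-LCS, reusing the tables from the previous sketch) emits the LCS characters in order:

    def build_lcs(b, X, i, j):
        """Follow the b-table directions to recover one LCS of X and Y."""
        if i == 0 or j == 0:
            return ""
        if b[i][j] == "diag":                  # x_i is part of the LCS
            return build_lcs(b, X, i - 1, j - 1) + X[i - 1]
        if b[i][j] == "up":
            return build_lcs(b, X, i - 1, j)
        return build_lcs(b, X, i, j - 1)

    # Usage with the previous sketch:
    c, b = lcs_length_tables("ABCBDAB", "BDCABA")
    print(build_lcs(b, "ABCBDAB", 7, 6))       # one LCS of length 4, e.g. "BCBA"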

12 Dynamic Programming & Greedy Algorithm Activity Selection

13 Activity Selection Optimization Problem
- Problem Instance:
  - Set S = {1, 2, ..., n} of n activities
  - Each activity i has a start time s_i and a finish time f_i
  - Activities i, j are compatible iff they are non-overlapping: s_i ≥ f_j or s_j ≥ f_i
- Objective: select a maximum-sized set of mutually compatible activities
source: 91.404 textbook Cormen, et al.
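For concreteness, a small Python helper (my own formulation, treating each activity as a half-open interval [s_i, f_i)) tests the compatibility condition stated above:

    def compatible(a, b):
        """Activities a = (s_a, f_a) and b = (s_b, f_b) are compatible iff they do not overlap."""
        return a[1] <= b[0] or b[1] <= a[0]    # a finishes before b starts, or vice versa

    print(compatible((1, 4), (4, 7)))   # True: half-open intervals may share an endpoint
    print(compatible((1, 5), (4, 7)))   # False: the intervals overlap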

14 Activity Selection
[Figure: example activities shown as time intervals; axes: Activity Number (1-8) vs. Activity Time Duration (1-16)]

15 Activity Selection Algorithmic Progression
- "Brute-Force"
- Dynamic Programming #1: exponential number of subproblems
- Dynamic Programming #2: quadratic number of subproblems
- Greedy Algorithm

16 Activity Selection
A solution to S_ij that includes a_k produces 2 subproblems:
1) S_ik (activities that start after a_i finishes and finish before a_k starts)
2) S_kj (activities that start after a_k finishes and finish before a_j starts)
c[i,j] = size of a maximum-size subset of mutually compatible activities in S_ij.
source: 91.404 textbook Cormen, et al.
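This subproblem structure leads to the recurrence c[i,j] = 0 if S_ij is empty, and c[i,j] = max over a_k in S_ij of (c[i,k] + c[k,j] + 1) otherwise. The sketch below is one way to realize it in Python (an illustrative memoized version with my own function name, using sentinel activities a_0 and a_{n+1} after sorting by finish time); it is cubic in the worst case, which motivates the greedy algorithm that follows:

    from functools import lru_cache

    def max_compatible(activities):
        """Dynamic-programming activity selection; activities is a list of (start, finish)."""
        acts = sorted(activities, key=lambda a: a[1])              # nondecreasing finish time
        # Sentinels: a_0 finishes at time 0, a_{n+1} starts after everything.
        acts = [(float("-inf"), 0)] + acts + [(float("inf"), float("inf"))]
        n = len(acts) - 2

        @lru_cache(maxsize=None)
        def c(i, j):
            best = 0
            for k in range(i + 1, j):
                s_k, f_k = acts[k]
                if acts[i][1] <= s_k and f_k <= acts[j][0]:        # a_k lies in S_ij
                    best = max(best, c(i, k) + c(k, j) + 1)
            return best

        return c(0, n + 1)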

17 Greedy Algorithm

18 What is a Greedy Algorithm?
- Solves an optimization problem
- Optimal Substructure: an optimal solution contains within it optimal solutions to subproblems
- Greedy Strategy: at each decision point, do what looks best "locally"; the choice does not depend on evaluating potential future choices or presolving overlapping subproblems
- Top-down algorithmic structure: with each step, reduce the problem to a smaller problem
- Greedy Choice Property: "locally best" = globally best

19 Greedy Strategy Approach
1. Determine the optimal substructure of the problem.
2. Develop a recursive solution.
3. Prove that, at any stage of the recursion, one of the optimal choices is the greedy choice.
4. Show that all but one of the subproblems caused by making the greedy choice are empty.
5. Develop a recursive greedy algorithm.
6. Convert it to an iterative algorithm.
source: 91.404 textbook Cormen, et al.

20 Recursive Greedy Activity Selection
High-level call: RECURSIVE-ACTIVITY-SELECTOR(s, f, 0, n); it returns an optimal solution for the entire set of activities. (Pseudocode appears on the slide; errors from an earlier printing are corrected in red.)
source: web site accompanying 91.404 textbook Cormen, et al.
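The pseudocode itself is on the slide image rather than in this transcript; the Python sketch below is my reconstruction in the spirit of CLRS's RECURSIVE-ACTIVITY-SELECTOR, assuming activities are already sorted by nondecreasing finish time and that f[0] = 0 acts as a sentinel:

    def recursive_activity_selector(s, f, k, n):
        """Greedily pick the first activity a_m that starts after a_k finishes, then recurse.
        s and f are 1-indexed start/finish arrays; index 0 is a sentinel with f[0] = 0."""
        m = k + 1
        while m <= n and s[m] < f[k]:      # skip activities that overlap a_k
            m += 1
        if m <= n:
            return [m] + recursive_activity_selector(s, f, m, n)
        return []

    # High-level call, matching the slide: RECURSIVE-ACTIVITY-SELECTOR(s, f, 0, n)
    s = [0, 1, 3, 0, 5, 3, 5, 6, 8, 8, 2, 12]
    f = [0, 4, 5, 6, 7, 9, 9, 10, 11, 12, 14, 16]
    print(recursive_activity_selector(s, f, 0, 11))   # [1, 4, 8, 11]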

21 source: web site accompanying 91.404 textbook Cormen, et al.

22 Iterative Greedy Activity Selection
- Iterative Greedy Algorithm:
  - S' = presort activities in S by nondecreasing finish time and renumber
  - GREEDY-ACTIVITY-SELECTOR(S')
        n ← length[S']
        A ← {1}
        j ← 1
        for i ← 2 to n
            do if s_i ≥ f_j
                then A ← A ∪ {i}
                     j ← i
        return A
Running time?
source: 91.503 textbook Cormen, et al.
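After the Θ(n lg n) presort, the selection loop itself runs in Θ(n). An illustrative Python rendering of the iterative algorithm above (my own code and naming, not the textbook's):

    def greedy_activity_selector(activities):
        """activities: list of (start, finish). Returns indices of a maximum-size
        set of mutually compatible activities."""
        if not activities:
            return []
        order = sorted(range(len(activities)), key=lambda i: activities[i][1])  # by finish time
        A = [order[0]]                         # always take the first activity to finish
        last_finish = activities[order[0]][1]
        for i in order[1:]:
            start, finish = activities[i]
            if start >= last_finish:           # compatible with the last selected activity
                A.append(i)
                last_finish = finish
        return A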

23 Streamlined Greedy Strategy Approach
1. View the optimization problem as one in which making a choice leaves one subproblem to solve.
2. Prove there always exists an optimal solution that makes the greedy choice.
3. Show that greedy choice + optimal solution to the subproblem ⇒ optimal solution to the problem.
Greedy Choice Property: "locally best" = globally best
source: 91.404 textbook Cormen, et al.

