CS3381 Des & Anal of Alg (2001-2002 SemA), City Univ of HK / Dept of CS / Helena Wong. 4. Dynamic Programming

Presentation transcript:

Dynamic Programming

Dynamic Programming
Coming up:
- Assembly-line scheduling
- Matrix-chain multiplication
- Elements of dynamic programming
- Longest common subsequence
- Optimal binary search trees
(Chap 15)

First Dynamic Programming Example: Assembly-line Scheduling

Assembly-line Scheduling
[Figure: two assembly lines, each with n stations S_i,1 .. S_i,n. A chassis enters (entry time e1 or e2), is processed at one station per position j (processing time a_i,j at station S_i,j), may transfer to the other line between positions (transfer time t_i,j after station j), and the completed auto exits (exit time x1 or x2).]

Assembly-line Scheduling
Problem: To minimize the total processing time for one auto.
[Figure: the same two-line station diagram as above.]

Assembly-line Scheduling
To find the fastest way, we may inspect all possible ways through the factory. But there are 2^n possible ways!!! Checking them all takes Θ(2^n) time. Instead, we solve it by dynamic programming.

Assembly-line Scheduling (Step 1)
Step 1. Characterize the structure of an optimal solution (the fastest way through the factory).
The fastest way through the factory = the faster one of either:
- the fastest way to reach S1,n, then exit line 1 (time x1), or
- the fastest way to reach S2,n, then exit line 2 (time x2).

Assembly-line Scheduling (Step 1)
Step 1 (continued). The fastest way to reach station S1,j = the faster one of either:
- the fastest way to reach S1,j-1, then stay on line 1 (add a1,j), or
- the fastest way to reach S2,j-1, then transfer to line 1 (add t2,j-1 + a1,j).
(Symmetrically for station S2,j.)

Assembly-line Scheduling (Step 2)
Step 2. Recursively define an optimal solution (the fastest way through the factory) in terms of optimal solutions to subproblems.
Let f* denote the fastest time through the factory, and let fi[j] denote the fastest time to reach and get through Si,j. Then
f* = min(f1[n] + x1, f2[n] + x2)
f1[j] = e1 + a1,1 if j = 1
f1[j] = min(f1[j-1] + a1,j, f2[j-1] + t2,j-1 + a1,j) if j > 1
f2[j] = e2 + a2,1 if j = 1
f2[j] = min(f2[j-1] + a2,j, f1[j-1] + t1,j-1 + a2,j) if j > 1
Most of the subproblems are visited more than once. Any clever method to handle them?

Assembly-line Scheduling (Step 3)
Step 3. Compute the value of an optimal solution in a bottom-up fashion.
FASTEST-WAY(a, t, e, x, n)
1  f1[1] ← e1 + a1,1
2  f2[1] ← e2 + a2,1
3  for j ← 2 to n
4      do if f1[j-1] + a1,j ≤ f2[j-1] + t2,j-1 + a1,j
5             then f1[j] ← f1[j-1] + a1,j
6                  l1[j] ← 1
7             else f1[j] ← f2[j-1] + t2,j-1 + a1,j
8                  l1[j] ← 2
9         if f2[j-1] + a2,j ≤ f1[j-1] + t1,j-1 + a2,j
10            then f2[j] ← f2[j-1] + a2,j
11                 l2[j] ← 2
12            else f2[j] ← f1[j-1] + t1,j-1 + a2,j
13                 l2[j] ← 1
14 if f1[n] + x1 ≤ f2[n] + x2
15    then f* ← f1[n] + x1
16         l* ← 1
17    else f* ← f2[n] + x2
18         l* ← 2
Let li[j] be the line number, 1 or 2, whose station j-1 is used in the solution. E.g. l1[5] = 2 means that the fastest way to reach station 5 of line 1 passes through station 4 of line 2. Let l* be the line whose station n is used in the solution. E.g. l* = 2 means that the fastest way involves station n of line 2.
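The pseudocode above translates directly into a short runnable program. Here is a minimal Python sketch (0-based indexing; the sample data at the bottom is illustrative, not taken from these slides):

```python
def fastest_way(a, t, e, x, n):
    """Bottom-up DP for two-line assembly scheduling.
    a[i][j]: processing time at station j of line i (0-based)
    t[i][j]: transfer time after station j when leaving line i
    e[i], x[i]: entry/exit times for line i
    Returns (fastest total time, chosen line per station)."""
    f = [[0] * n for _ in range(2)]   # f[i][j]: fastest time through S_{i,j}
    l = [[0] * n for _ in range(2)]   # l[i][j]: line used at station j-1
    f[0][0] = e[0] + a[0][0]
    f[1][0] = e[1] + a[1][0]
    for j in range(1, n):
        for i in (0, 1):
            other = 1 - i
            stay = f[i][j - 1] + a[i][j]
            switch = f[other][j - 1] + t[other][j - 1] + a[i][j]
            if stay <= switch:
                f[i][j], l[i][j] = stay, i
            else:
                f[i][j], l[i][j] = switch, other
    if f[0][n - 1] + x[0] <= f[1][n - 1] + x[1]:
        fstar, lstar = f[0][n - 1] + x[0], 0
    else:
        fstar, lstar = f[1][n - 1] + x[1], 1
    # Step 4: walk the l table backwards to recover the path, then reverse it
    path = [lstar]
    for j in range(n - 1, 0, -1):
        path.append(l[path[-1]][j])
    path.reverse()
    return fstar, path

a = [[7, 9, 3, 4, 8, 4], [8, 5, 6, 4, 5, 7]]
t = [[2, 3, 1, 3, 4], [2, 1, 2, 2, 1]]
print(fastest_way(a, t, (2, 4), (3, 2), 6))  # (38, [0, 1, 0, 1, 1, 0])
```

The path list answers the "construct an optimal solution" step directly, front to back, rather than printing stations in reverse order as PRINT-STATIONS does.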

Assembly-line Scheduling (Step 3)
What we are doing is:
- starting at the bottom level, continuously filling in the tables (f1[], f2[] and l1[], l2[]), and
- looking up the tables to compute new results for the next higher level.
Exercise: what is the complexity of this algorithm?

Assembly-line Scheduling (Step 4)
Step 4. Construct an optimal solution from computed information.
PRINT-STATIONS(l, n)
1 i ← l*
2 print "line " i ", station " n
3 for j ← n downto 2
4     do i ← li[j]
5        print "line " i ", station " j-1
Sample output:
line 1, station 6
line 2, station 5
line 2, station 4
line 1, station 3
line 2, station 2
line 1, station 1

Assembly-line Scheduling
What we have done is indeed Dynamic Programming. Now it is time to test your memory:
Step ? Characterize the structure of an optimal solution (e.g. study the structure of the fastest way through the factory).
Step ? Recursively define the value of an optimal solution.
Step ? Compute the value of an optimal solution in a bottom-up fashion.
Step ? Construct an optimal solution from computed information.

Second Dynamic Programming Example: Matrix-Chain Multiplication

Matrix-Chain Multiplication

Matrix-Chain Multiplication
Can we freely multiply matrices of any different shapes? No. They must be compatible: to multiply A and B, the number of columns of A must equal the number of rows of B.

Matrix-Chain Multiplication
We can also multiply a chain of matrices in different orders (parenthesizations). However, we must consider the very different efficiencies of the different parenthesizations.

Matrix-Chain Multiplication
Example: A is 5×3, B is 3×4, C is 4×2.
(AB)C: 5×3×4 = 60 multiplications, then 5×4×2 = 40 multiplications. Total = 100 multiplications.
A(BC): 3×4×2 = 24 multiplications, then 5×3×2 = 30 multiplications. Total = 54 multiplications.

Matrix-Chain Multiplication
Number of scalar multiplications to compute AB = (no. of rows of A) × (no. of columns of A, which equals the no. of rows of B) × (no. of columns of B).
Size of the resultant matrix: no. of rows = no. of rows of A; no. of columns = no. of columns of B.
Another example: A is 10×100, B is 100×5, C is 5×50.
(AB): no. of multiplications = 10 × 100 × 5 = 5000 (result is 10×5)
(BC): no. of multiplications = 100 × 5 × 50 = 25000 (result is 100×50)
((AB)C): no. of multiplications = 5000 + 10 × 5 × 50 = 5000 + 2500 = 7500
(A(BC)): no. of multiplications = 25000 + 10 × 100 × 50 = 25000 + 50000 = 75000
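A few lines of Python confirm the arithmetic for the 10×100, 100×5, 5×50 example:

```python
def mult_cost(p, q, r):
    """Scalar multiplications to multiply a p x q matrix by a q x r matrix."""
    return p * q * r

# A: 10x100, B: 100x5, C: 5x50
ab_then_c = mult_cost(10, 100, 5) + mult_cost(10, 5, 50)    # (AB)C
a_then_bc = mult_cost(100, 5, 50) + mult_cost(10, 100, 50)  # A(BC)
print(ab_then_c, a_then_bc)  # 7500 75000
```

The two parenthesizations compute the same product, yet one costs 10 times more scalar multiplications than the other.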

Matrix-Chain Multiplication
So before multiplying the 3 matrices, we had better decide the optimal parenthesization. But how to decide? Shall we check all the parenthesizations and find the optimal one? If we are multiplying 100 matrices, it seems there are a lot of candidates. Mmm, let's see how many parenthesizations we'd need to check …

Matrix-Chain Multiplication
Suppose we need to compute A1 A2 A3 … An. Our question is: how many parenthesizations do we need to check?
We may split at the outermost level in n-1 ways:
1st way: A1 (A2 … An), but there are different methods to compute (A2 … An).
2nd way: (A1 A2)(A3 … An), but there are different methods to compute (A3 … An).
...
(n-1)th way: (A1 … An-1) An, but there are different methods to compute (A1 … An-1).
For the kth way, the number of methods for (A1..An) = (no. of methods for the k matrices A1..Ak) × (no. of methods for the n-k matrices Ak+1..An). The total number of methods is the sum over all n-1 ways.
Let P(n) = number of alternative parenthesizations of n matrices. Then
P(n) = 1 if n = 1
P(n) = Σ_{k=1}^{n-1} P(k) P(n-k) if n > 1

Matrix-Chain Multiplication
P(n) = 1 if n = 1
P(n) = Σ_{k=1}^{n-1} P(k) P(n-k) if n > 1
Hence,
P(1) = 1
P(2) = P(1)·P(1) = 1
P(3) = P(1)·P(2) + P(2)·P(1) = 2
P(4) = P(1)·P(3) + P(2)·P(2) + P(3)·P(1) = 5
P(5) = P(1)·P(4) + P(2)·P(3) + P(3)·P(2) + P(4)·P(1) = 14
P(6) = P(1)·P(5) + P(2)·P(4) + P(3)·P(3) + P(4)·P(2) + P(5)·P(1) = 42
…
This function grows very fast [Ω(2^n)], so it is not feasible to check all possible parenthesizations. We'll solve the problem using dynamic programming.
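The recurrence for P(n) is easy to evaluate directly (these are the Catalan numbers, shifted by one), which makes the explosive growth visible:

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def P(n):
    """Number of alternative parenthesizations of a chain of n matrices."""
    if n == 1:
        return 1
    return sum(P(k) * P(n - k) for k in range(1, n))

print([P(n) for n in range(1, 11)])
# [1, 1, 2, 5, 14, 42, 132, 429, 1430, 4862]
```

Note the irony: evaluating P(n) naively would itself revisit subproblems exponentially often; the `lru_cache` decorator is exactly the memoization idea discussed later in this lecture.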

Matrix-Chain Multiplication (Step 1)
Step 1. The structure of an optimal parenthesization.
For Ai Ai+1 … Aj, let (Ai…Ak)(Ak+1…Aj) be the optimal parenthesization. Then (Ai…Ak) must be the optimal parenthesization for Ai..Ak, and (Ak+1…Aj) must be the optimal parenthesization for Ak+1..Aj.

Matrix-Chain Multiplication (Step 2)
Step 2. A recursive solution.
Define m[i,j] as the minimum number of scalar multiplications needed to compute Ai…Aj, and let p_{s-1} be the number of columns of A_{s-1} (= number of rows of A_s). Then
m[i,j] = 0 if i = j
m[i,j] = min_{i ≤ k < j} { m[i,k] + m[k+1,j] + p_{i-1} p_k p_j } if i < j
Each m[i,j] problem can be solved by making use of the solutions of smaller problems. Indeed there are not many distinct problems: one for each choice of i and j satisfying 1 ≤ i ≤ j ≤ n. But their solutions are visited many times. => Any clever method?

Matrix-Chain Multiplication (Step 3)
Step 3. Compute an optimal solution bottom-up.
m[i,j] = 0 if i = j
m[i,j] = min_{i ≤ k < j} { m[i,k] + m[k+1,j] + p_{i-1} p_k p_j } if i < j
Example (rows × columns):
A1: 30×35, A2: 35×15, A3: 15×5, A4: 5×10, A5: 10×20, A6: 20×25
For chain length = 1: m[i,i] = 0 for all i.
For chain length = 2:
m[1,2] = m[1,1] + m[2,2] + 30·35·15 = 15750
m[2,3] = m[2,2] + m[3,3] + 35·15·5 = 2625
…
For chain length = 3:
m[1,3] = min( m[1,2] + m[3,3] + 30·15·5 = 15750 + 2250 = 18000,
              m[1,1] + m[2,3] + 30·35·5 = 2625 + 5250 = 7875 ) = 7875
m[2,4] = min( m[2,3] + m[4,4] + 35·5·10 = 2625 + 1750 = 4375,
              m[2,2] + m[3,4] + 35·15·10 = 750 + 5250 = 6000 ) = 4375

Matrix-Chain Multiplication (Step 3)
Step 3. Compute an optimal solution bottom-up. Let s[i,j] be the value of k at which the optimal split occurs.
MATRIX-CHAIN-ORDER(p)
1  for i ← 1 to n
2      m[i,i] ← 0
3  for len ← 2 to n        // len is the chain length
4      for i ← 1 to n - len + 1
5          j ← i + len - 1
6          m[i,j] ← ∞
7          for k ← i to j-1
8              q ← m[i,k] + m[k+1,j] + p_{i-1} p_k p_j
9              if q < m[i,j]
10                 then m[i,j] ← q
11                      s[i,j] ← k
Running time: O(n^3). Space for the m and s tables: Θ(n^2).

Matrix-Chain Multiplication (Step 4)
Step 4. Constructing an optimal solution.
Recall that s[i,j] = the value of k at which the optimal split occurs. A recursive printing procedure:
PRINT-OPTIMAL-PARENS(i,j)
1 if i = j then print "A"_i
2 else print "("
3      PRINT-OPTIMAL-PARENS(i, s[i,j])
4      PRINT-OPTIMAL-PARENS(s[i,j]+1, j)
5      print ")"
Sample output: ((A1(A2A3))((A4A5)A6))
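Steps 3 and 4 combine into a short Python program; this sketch uses the dimension list p = ⟨30, 35, 15, 5, 10, 20, 25⟩ from the running example (1-based tables, matching the pseudocode):

```python
import sys

def matrix_chain_order(p):
    """m[i][j]: min scalar multiplications for A_i..A_j, where A_s is
    p[s-1] x p[s]. s[i][j]: split point k achieving that minimum."""
    n = len(p) - 1
    m = [[0] * (n + 1) for _ in range(n + 1)]
    s = [[0] * (n + 1) for _ in range(n + 1)]
    for length in range(2, n + 1):                 # chain length
        for i in range(1, n - length + 2):
            j = i + length - 1
            m[i][j] = sys.maxsize
            for k in range(i, j):
                q = m[i][k] + m[k + 1][j] + p[i - 1] * p[k] * p[j]
                if q < m[i][j]:
                    m[i][j], s[i][j] = q, k
    return m, s

def optimal_parens(s, i, j):
    """Build the parenthesization string from the split table s."""
    if i == j:
        return f"A{i}"
    k = s[i][j]
    return "(" + optimal_parens(s, i, k) + optimal_parens(s, k + 1, j) + ")"

p = [30, 35, 15, 5, 10, 20, 25]
m, s = matrix_chain_order(p)
print(m[1][6])                 # 15125
print(optimal_parens(s, 1, 6)) # ((A1(A2A3))((A4A5)A6))
```

The triple loop is the O(n^3) bottom-up fill; the recursion over s is the Step-4 reconstruction, returning a string instead of printing so it can be reused.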

Elements of Dynamic Programming: 2 Key Ingredients & Memoization

Elements of Dynamic Programming
Comparison:
- Divide & Conquer: break up the problem into independent smaller problems.
- Dynamic Programming: solve all smaller problems, but reuse optimal subproblem solutions.
But when should we apply dynamic programming? 2 key ingredients:
- Optimal Substructure
- Overlapping Subproblems
(Keywords: optimization problems, table-based solution, bottom-up approach.)

Elements of Dynamic Programming
2 key ingredients of Dynamic Programming:
Optimal Substructure — if an optimal solution to a problem consists of optimal solutions to its subproblems, the problem exhibits optimal substructure.
Overlapping Subproblems — if a recursive algorithm revisits the same subproblem over and over again, we say that the subproblems are overlapping.
If both ingredients are present, it is a good clue that dynamic programming might apply.

Elements of Dynamic Programming
Memoization — a variation of dynamic programming. It uses a top-down (recursive) strategy: after computing the solution to a subproblem, store it in the table; subsequent calls just do a table lookup. (Memoization, not memorization!)
MEMOIZED-MATRIX-CHAIN(p)
1 for i ← 1 to n
2     for j ← i to n
3         m[i,j] ← ∞
4 return LOOKUP-CHAIN(1,n)
LOOKUP-CHAIN(i,j)
1 if m[i,j] < ∞
2     return m[i,j]
3 if i = j
4     then m[i,j] ← 0
5     else for k ← i to j-1
6              q ← LOOKUP-CHAIN(i,k) + LOOKUP-CHAIN(k+1,j) + p_{i-1} p_k p_j
7              if q < m[i,j]
8                  then m[i,j] ← q
9 return m[i,j]
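The same top-down scheme in Python, using a dictionary as the memo table instead of an ∞-initialized array (a sketch; the dimension list is the one from the matrix-chain example):

```python
def memoized_matrix_chain(p):
    """Top-down (memoized) matrix-chain order; A_s is p[s-1] x p[s]."""
    n = len(p) - 1
    memo = {}  # (i, j) -> min scalar multiplications for A_i..A_j

    def lookup_chain(i, j):
        if (i, j) in memo:          # table hit: reuse the stored solution
            return memo[(i, j)]
        if i == j:
            best = 0
        else:
            best = min(lookup_chain(i, k) + lookup_chain(k + 1, j)
                       + p[i - 1] * p[k] * p[j]
                       for k in range(i, j))
        memo[(i, j)] = best         # store before returning
        return best

    return lookup_chain(1, n)

print(memoized_matrix_chain([30, 35, 15, 5, 10, 20, 25]))  # 15125
```

It fills the same Θ(n^2) table of subproblems as the bottom-up version, just in the order the recursion demands them.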

Elements of Dynamic Programming
When does memoization outperform bottom-up dynamic programming? In the cases where some subproblems need not be solved at all — e.g. in the assembly line, if some stations' delivery paths are blocked — memoization is faster, because it solves only the subproblems that are actually needed.
Otherwise, bottom-up dynamic programming is usually more efficient:
- no overhead for recursion
- less overhead for maintaining the table.

Third Dynamic Programming Example: Longest Common Subsequence

Longest Common Subsequence
Suppose we have 2 strings:
S1 = ACCGGTCGAGTGCGCGGAAGCCGGCCGAA
S2 = GTCGTTCGGAATGCCGTTGCTCTGTAAT
The longest common subsequence is GTCGTCGGAAGCCGGCCGAA.
The Longest Common Subsequence problem: given 2 sequences X = ⟨x1, x2, …, xm⟩ and Y = ⟨y1, y2, …, yn⟩, find a maximum-length common subsequence (LCS) of X and Y.

Longest Common Subsequence
Optimal substructure of LCS: Let Z = ⟨z1, …, zk⟩ be any LCS of X = ⟨x1, …, xm⟩ and Y = ⟨y1, …, yn⟩. Let's define the prefix notation: X_i = ⟨x1, …, xi⟩, Y_j = ⟨y1, …, yj⟩, Z_i = ⟨z1, …, zi⟩. Then:
- if xm = yn, then zk = xm = yn, and Z_{k-1} is an LCS of the pair X_{m-1} and Y_{n-1};
- if xm ≠ yn, then Z is an LCS of the pair X_{m-1} and Y, or Z is an LCS of the pair X and Y_{n-1}, or Z is an LCS of both pairs.
We can work out a recursive formula for the problem. Let c[i,j] = length of an LCS of X_i and Y_j. Then
c[i,j] = 0 if i = 0 or j = 0
c[i,j] = c[i-1,j-1] + 1 if i,j > 0 and xi = yj
c[i,j] = max(c[i,j-1], c[i-1,j]) if i,j > 0 and xi ≠ yj
(Note that in the case xi = yj we skip the computation of the subproblems c[i,j-1] and c[i-1,j].)
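The c[i,j] recurrence translates into a few lines of Python. This sketch fills the table bottom-up and then walks back through it to recover one LCS (there may be several of the same length):

```python
def lcs(x, y):
    """Return one longest common subsequence of strings x and y."""
    m, n = len(x), len(y)
    # c[i][j] = length of an LCS of x[:i] and y[:j]
    c = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            if x[i - 1] == y[j - 1]:
                c[i][j] = c[i - 1][j - 1] + 1
            else:
                c[i][j] = max(c[i][j - 1], c[i - 1][j])
    # Walk back through the table to recover one LCS
    out = []
    i, j = m, n
    while i > 0 and j > 0:
        if x[i - 1] == y[j - 1]:
            out.append(x[i - 1])
            i -= 1
            j -= 1
        elif c[i - 1][j] >= c[i][j - 1]:
            i -= 1
        else:
            j -= 1
    return "".join(reversed(out))

s1 = "ACCGGTCGAGTGCGCGGAAGCCGGCCGAA"
s2 = "GTCGTTCGGAATGCCGTTGCTCTGTAAT"
print(len(lcs(s1, s2)))  # 20
```

Both the table fill and the walk-back take O(mn) and O(m+n) time respectively, which is why this is the standard tool for the DNA-similarity question that opens the example.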

Longest Common Subsequence
To finish the solution by dynamic programming? You may like to try it out yourself, or read Chap 15.4.

Fourth Dynamic Programming Example: Optimal Binary Search Trees

Optimal binary search trees
A binary search tree for dictionary lookup:
- Let K = ⟨k1, k2, …, kn⟩ be n words (distinct keys) in sorted order.
- Let d0, d1, …, dn be n+1 "dummy keys" representing values not in K.
- Let pi = probability of searching ki, and qi = probability of a search ending at di.
i:   0     1     2     3     4     5
pi:        0.15  0.10  0.05  0.10  0.20
qi:  0.05  0.10  0.05  0.05  0.05  0.10
Define the cost of a search as the number of nodes examined in the search (depth of the node + 1).
[Figure: two binary search trees over k1..k5 with dummy leaves d0..d5. The first has expected cost of an arbitrary search = 2.80; the second has expected cost = 2.75 and is the optimal binary search tree.]

Optimal binary search trees
The Optimal Binary Search Tree problem: for a given set of probabilities of the keys and dummy keys, construct a binary search tree whose expected search cost is smallest.
Optimal substructure: in an optimal binary search tree with root kr, the left subtree and the right subtree are also optimal binary search trees (over their own keys and dummy keys).

Optimal binary search trees
We can work out a recursive formula for the problem. Let e[i,j] = expected cost of searching an optimal binary search tree containing ki, …, kj. When j = i-1 there are no real keys; the search ends at d_{i-1}, so e[i,i-1] = q_{i-1}. Then
e[i,j] = q_{i-1} if j = i-1
e[i,j] = min_{i ≤ r ≤ j} { e[i,r-1] + e[r+1,j] + Σ_{s=i to j} p_s + Σ_{s=i-1 to j} q_s } if i ≤ j
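The e[i,j] recurrence can be filled bottom-up by chain length, just like matrix-chain order. A Python sketch, with the probability mass w[i][j] = Σ p + Σ q maintained incrementally so each cell costs O(1) extra (the p, q values at the bottom are the ones from the example slide):

```python
def optimal_bst(p, q):
    """Expected search cost of an optimal BST.
    p[1..n]: key probabilities (p[0] unused); q[0..n]: dummy-key probabilities."""
    n = len(p) - 1
    # e[i][j]: expected cost for keys k_i..k_j; w[i][j]: total probability mass
    e = [[0.0] * (n + 1) for _ in range(n + 2)]
    w = [[0.0] * (n + 1) for _ in range(n + 2)]
    for i in range(1, n + 2):
        e[i][i - 1] = q[i - 1]       # empty subtree: just the dummy key
        w[i][i - 1] = q[i - 1]
    for length in range(1, n + 1):
        for i in range(1, n - length + 2):
            j = i + length - 1
            w[i][j] = w[i][j - 1] + p[j] + q[j]
            e[i][j] = min(e[i][r - 1] + e[r + 1][j] + w[i][j]
                          for r in range(i, j + 1))  # try each root k_r
    return e[1][n]

p = [0, 0.15, 0.10, 0.05, 0.10, 0.20]
q = [0.05, 0.10, 0.05, 0.05, 0.05, 0.10]
print(round(optimal_bst(p, q), 2))  # 2.75
```

Recording the minimizing root r in a companion table (as s[i,j] did for matrix chains) would let us reconstruct the optimal tree itself.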

Optimal binary search trees
To finish the solution by dynamic programming? You may like to try it out yourself, or read Chap 15.5.

Dynamic Programming Summary
- Assembly-line scheduling
- Matrix-chain multiplication
- Elements of dynamic programming: Optimal Substructure, Overlapping Subproblems, Memoization
- Longest common subsequence
- Optimal binary search trees