Published by Charles Stelling. Modified over 2 years ago.

1
Dynamic Programming
Nithya Tarek

2
Dynamic Programming
Dynamic programming solves problems by combining the solutions to subproblems.
Paradigms:
Divide and conquer
Greedy algorithms
Dynamic programming

3
String Matching
Longest common subsequence
Knuth-Morris-Pratt pattern matching

4
Example – Fibonacci Numbers using recursion
Function f(n):
  if n = 0: output 0
  else if n = 1: output 1
  else: output f(n-1) + f(n-2)
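The pseudocode above translates directly into a runnable (and deliberately naive) Python version:

```python
def fib(n):
    """Naive recursive Fibonacci, mirroring the pseudocode above.

    Recomputes the same subproblems over and over, so it is
    exponentially slow for large n."""
    if n == 0:
        return 0
    if n == 1:
        return 1
    return fib(n - 1) + fib(n - 2)
```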

5
Example – Fibonacci Numbers using recursion
Run time: T(n) = T(n-1) + T(n-2). This recurrence grows exponentially in n: the work roughly doubles as n increases, so the run time is O(2^n). This is a bad algorithm because the same subproblems are recomputed many times.

6
Example – Fibonacci Numbers using Dynamic programming
Recursion tree for f(5):
f(5)
├── f(4)
│   ├── f(3)
│   │   ├── f(2)
│   │   └── f(1)
│   └── f(2)
└── f(3)
    ├── f(2)
    └── f(1)

7
Example – Fibonacci Numbers using Dynamic programming
Dynamic programming calculates from the bottom up. Values are stored for later use, which avoids repetitive calculation.
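The bottom-up idea can be sketched in a few lines of Python: fill a table from small n upward, so each value is computed exactly once.

```python
def fib_dp(n):
    """Bottom-up Fibonacci: each value is computed once and stored,
    so the run time is O(n) instead of O(2^n)."""
    if n < 2:
        return n
    table = [0] * (n + 1)
    table[1] = 1
    for i in range(2, n + 1):
        table[i] = table[i - 1] + table[i - 2]  # reuse stored values
    return table[n]
```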

8
Longest Common Subsequence
The input to this problem is two sequences, S1 = abcdace and S2 = badcabe. The problem is to find the longest sequence that is a subsequence of both S1 and S2. The distance between S1 and S2 is defined as the number of characters we have to remove from one string plus the number we have to add to it to make S1 and S2 equal.

9
Longest Common Subsequence
S1: a b c d a c e
S2: b a d c a b e
Length of LCSS = 4
Edit distance = 3 (removals) + 3 (additions) = 6

10
Longest Common Subsequence
Theorem: if |S1| = m, |S2| = n, and the length of the LCSS is L, then the edit distance is m + n - 2L.
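A quick arithmetic check of the theorem against the example above (m = n = 7, L = 4):

```python
# Values from the example: |S1| = |S2| = 7, LCSS length L = 4.
m, n, L = 7, 7, 4
edit_distance = m + n - 2 * L  # characters outside the LCSS in each string
print(edit_distance)  # 6, matching 3 removals + 3 additions
```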

11
Longest Common Subsequence
Notation: S1_i denotes the prefix of S1 of length i, and S2_j denotes the prefix of S2 of length j.

12
Longest Common Subsequence
To find LCSS(S1_i, S2_j):
If S1[i] = S2[j]:
  return LCSS(S1_{i-1}, S2_{j-1}) + 1
Else:
  return max{ LCSS(S1_{i-1}, S2_j), LCSS(S1_i, S2_{j-1}) }
This recursive algorithm is very slow, because the same subproblems are solved exponentially many times.
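A direct (and equally slow) Python transcription of this recurrence, with i and j as prefix lengths:

```python
def lcss(s1, s2, i, j):
    """LCS length of the prefixes s1[:i] and s2[:j], computed by the
    plain recurrence above; exponential time without memoization."""
    if i == 0 or j == 0:
        return 0
    if s1[i - 1] == s2[j - 1]:
        return lcss(s1, s2, i - 1, j - 1) + 1
    return max(lcss(s1, s2, i - 1, j), lcss(s1, s2, i, j - 1))
```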

13
Solving LCSS using Dynamic programming
LCSS matrix (rows: S2 = badcabe, columns: S1 = abcdace). The last entry in the matrix is the length of the LCSS:
      a  b  c  d  a  c  e
  b   0  1  1  1  1  1  1
  a   1  1  1  1  2  2  2
  d   1  1  1  2  2  2  2
  c   1  1  2  2  2  3  3
  a   1  1  2  2  3  3  3
  b   1  2  2  2  3  3  3
  e   1  2  2  2  3  3  4

14
Solving LCSS using Dynamic programming
The runtime to fill this matrix using dynamic programming is O(mn): it takes O(1) time to fill each entry, and there are mn entries.
Space required: O(mn)
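A minimal sketch of the table-filling algorithm (variable names are illustrative):

```python
def lcss_dp(s1, s2):
    """Fill an (m+1) x (n+1) table bottom-up.

    table[i][j] = LCS length of s1[:i] and s2[:j].
    O(mn) time, O(mn) space."""
    m, n = len(s1), len(s2)
    table = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            if s1[i - 1] == s2[j - 1]:
                table[i][j] = table[i - 1][j - 1] + 1
            else:
                table[i][j] = max(table[i - 1][j], table[i][j - 1])
    return table[m][n]
```

Each entry is computed once from three already-filled neighbors, which is exactly why the run time is O(mn).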

15
Advantages of Dynamic programming
A plain recursive procedure has no memory: it recomputes the same values again and again.
Dynamic programming stores previously computed values to avoid these repeated calculations.

16
Space and Time
The space can be reduced to O(min{m, n}): it is enough to keep only two rows, the current row j and the previous row j-1. After computing the entries of row j, it becomes the new "previous" row, the old row j-1 is discarded, and the next row is computed in its place.
The time cannot be reduced.
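A sketch of the two-row version (names are illustrative): the shorter string indexes the columns, so only O(min{m, n}) space is used.

```python
def lcss_two_rows(s1, s2):
    """LCS length keeping only two rows of the DP table.

    Time O(mn), space O(min(m, n))."""
    if len(s1) < len(s2):
        s1, s2 = s2, s1          # make s2 the shorter string
    prev = [0] * (len(s2) + 1)   # row j-1
    for ch in s1:
        curr = [0] * (len(s2) + 1)   # row j
        for j in range(1, len(s2) + 1):
            if ch == s2[j - 1]:
                curr[j] = prev[j - 1] + 1
            else:
                curr[j] = max(prev[j], curr[j - 1])
        prev = curr              # row j becomes row j-1
    return prev[-1]
```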

17
Space reduction
If we only compare entries within a smaller window of size w around the diagonal, the portion of the matrix that must be computed shrinks.
S1: a b c d a c e
S2: b a d c a b e

18
Space reduction
Specifying the window size reduces the number of calculations. The runtime of this algorithm is O(2w · min{m, n}), i.e., linear in the sequence length, and the space required is O(w).
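A sketch of the banded idea, assuming only cells with |i - j| <= w are computed (dicts stand in for the two sparse rows). The result is exact when an optimal alignment stays inside the band, and otherwise a lower bound on the true LCS length:

```python
def lcss_banded(s1, s2, w):
    """LCS length restricted to the diagonal band |i - j| <= w.

    Time O(w * min(m, n)); each row holds at most 2w + 1 cells,
    so space is O(w). Cells outside the band are treated as 0."""
    m, n = len(s1), len(s2)
    prev = {}                                 # row i-1, sparse
    for i in range(1, m + 1):
        curr = {}
        lo, hi = max(1, i - w), min(n, i + w)
        for j in range(lo, hi + 1):
            if s1[i - 1] == s2[j - 1]:
                curr[j] = prev.get(j - 1, 0) + 1
            else:
                curr[j] = max(prev.get(j, 0), curr.get(j - 1, 0))
        prev = curr
    return prev.get(n, 0)
```

With w = 2 the band already contains an optimal alignment of the slide's example, so the banded and full computations agree.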
