CPSC 335 Dynamic Programming Dr. Marina Gavrilova Computer Science University of Calgary Canada.


Optimal substructure: an optimal solution to the problem contains optimal solutions to its subproblems. Overlapping subproblems: there are few distinct subproblems in total, but many recurring instances of each. Solve by recursion, using a table to record the state of each subproblem so it is computed only once. This reduces the running time from exponential to polynomial (linear in some cases). Dynamic programming
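As a minimal illustration of both hallmarks (not from the slides), the Fibonacci numbers exhibit overlapping subproblems: naive recursion recomputes the same values exponentially often, while caching each result in a table makes the computation linear.

```python
from functools import lru_cache

@lru_cache(maxsize=None)          # the "table": each fib(k) is computed once
def fib(n: int) -> int:
    if n < 2:
        return n
    # Overlapping subproblems: fib(n-1) and fib(n-2) share most of their work.
    return fib(n - 1) + fib(n - 2)

print(fib(40))  # 102334155
```

Without the cache this call would make over a billion recursive calls; with it, about 40.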

Longest Common Subsequence Problem: Given two sequences A = ⟨a1, ..., an⟩ and B = ⟨b1, ..., bm⟩, find a common subsequence of maximum length. The characters need not be adjacent, but the subsequence must preserve their order.

A straightforward solution enumerates all 2^n subsequences of string A, then checks each against string B in O(m) time, for O(m·2^n) total.

Application: comparison of two DNA strings. Example: X = {A B C B D A B}, Y = {B D C A B A}. One longest common subsequence is B C B A, whose symbols appear, in order, in both X and Y.


As in linear programming, the word "programming" refers to planning rather than coding: dynamic programming is an algorithmic technique that solves a class of problems through recursion on small subproblems, noticing and reusing recurring patterns.

We determine the length of the longest common subsequence and, at the same time, record enough information to recover the subsequence itself. Define Ai and Bj to be the prefixes of A and B of length i and j respectively, and define L[i, j] to be the length of an LCS of Ai and Bj. The length of an LCS of A and B is then L[n, m]. We start with i = j = 0 (empty prefixes of A and B). The LCS of an empty string and any other string is empty, so for every i and j: L[0, j] = L[i, 0] = 0. First case, A[i] = B[j]: one more symbol matches, so L[i, j] = L[i-1, j-1] + 1. Second case, A[i] ≠ B[j]: the symbols don't match, the solution is not improved, and L[i, j] is the maximum of L[i, j-1] and L[i-1, j].
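The recurrence above can be sketched as a table-filling routine (a minimal version; the function name and layout are my own):

```python
def lcs_length(A: str, B: str) -> int:
    n, m = len(A), len(B)
    # L[i][j] = length of an LCS of prefixes A[:i] and B[:j];
    # row 0 and column 0 stay 0 (LCS with an empty string is empty).
    L = [[0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            if A[i - 1] == B[j - 1]:            # first case: symbols match
                L[i][j] = L[i - 1][j - 1] + 1
            else:                               # second case: take the better prefix
                L[i][j] = max(L[i - 1][j], L[i][j - 1])
    return L[n][m]

print(lcs_length("ABCBDAB", "BDCABA"))  # 4
```

On the DNA example from the earlier slide, the table yields length 4.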

The LCS algorithm fills each entry of the table L[0..n, 0..m] in O(m·n) time, since each L[i, j] is calculated in constant time and there are (n+1)·(m+1) entries.
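To recover the subsequence itself, as the slides mention, one can walk the filled table backwards from L[n][m]; a sketch (tie-breaking between equal neighbors is my own choice, so different but equally long subsequences may result):

```python
def lcs(A: str, B: str) -> str:
    n, m = len(A), len(B)
    L = [[0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            if A[i - 1] == B[j - 1]:
                L[i][j] = L[i - 1][j - 1] + 1
            else:
                L[i][j] = max(L[i - 1][j], L[i][j - 1])
    # Trace back from L[n][m]: a match contributed +1, otherwise
    # follow whichever neighbor supplied the maximum.
    out, i, j = [], n, m
    while i > 0 and j > 0:
        if A[i - 1] == B[j - 1]:
            out.append(A[i - 1])
            i, j = i - 1, j - 1
        elif L[i - 1][j] >= L[i][j - 1]:
            i -= 1
        else:
            j -= 1
    return "".join(reversed(out))
```

The traceback costs only O(n + m) extra time on top of the O(m·n) table fill.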

Knapsack problem

Given some items, pack the knapsack to get the maximum total value. Each item has a size and a value, and the total size of the packed items may be no more than some constant C, the knapsack capacity. We must consider the sizes of items as well as their values. [Example table of items with columns Item#, Size, Value]

Given a knapsack with maximum capacity C and a set U of n items {u1, u2, ..., un}, where each item j has size sj and value vj. Problem: how do we pack the knapsack to achieve the maximum total value of packed items?

There are two versions of the problem: (1) the "0-1 knapsack problem" and (2) the "fractional knapsack problem." (1) Items are indivisible: you either take an item or you don't. Solved with dynamic programming. (2) Items are divisible: you can take any fraction of an item. Solved with a greedy algorithm.
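For contrast with the 0-1 version solved below, here is a sketch of the greedy algorithm for the fractional variant (function name and item data are hypothetical): sort items by value per unit size, then take whole items greedily and a fraction of the first item that no longer fits.

```python
def fractional_knapsack(items, C):
    """items: list of (size, value) pairs; C: knapsack capacity."""
    total, remaining = 0.0, C
    # Greedy choice: most valuable per unit of size first.
    for size, value in sorted(items, key=lambda it: it[1] / it[0], reverse=True):
        if remaining <= 0:
            break
        take = min(size, remaining)      # whole item, or the fraction that fits
        total += value * (take / size)
        remaining -= take
    return total

print(fractional_knapsack([(2, 3), (3, 4), (4, 5)], 5))  # 7.0
```

With capacity 5, the two densest items (sizes 2 and 3) fit exactly, for total value 7.0; divisibility is what makes this greedy choice safe, and it fails for the 0-1 version.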

Let's first solve this problem with a straightforward algorithm. Since there are n items, there are 2^n possible combinations of items. We go through all combinations and find the one with the greatest total value among those whose total size is less than or equal to C. The running time is O(2^n).
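The brute-force enumeration just described can be sketched directly (a naive baseline, with hypothetical item data, useful mainly for checking the dynamic-programming solution on small inputs):

```python
from itertools import combinations

def knapsack_bruteforce(items, C):
    """items: list of (size, value) pairs. Tries all 2^n subsets."""
    best = 0
    for r in range(len(items) + 1):
        for subset in combinations(items, r):
            if sum(s for s, _ in subset) <= C:   # feasible subset
                best = max(best, sum(v for _, v in subset))
    return best

print(knapsack_bruteforce([(2, 3), (3, 4), (4, 5)], 5))  # 7
```

Even for n = 30 this examines over a billion subsets, which is why we look for subproblem structure next.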

If items are labeled 1..n, then a natural subproblem would be to find an optimal solution for Ui = {items labeled 1, 2, ..., i}. However, the solution for U4 might not be part of the solution for U5, so this definition of a subproblem is flawed. The fix is to add another parameter j, representing the size limit for each subset of items, with the subproblem to be computed as V[i, j]: the maximum value achievable using items 1..i with total size at most j.

Now, the best subset of Ui that has total size at most j is one of the two: (1) the best subset of Ui-1 with size limit j (item i is not taken), or (2) the best subset of Ui-1 with size limit (j - si), plus item i.

Intuitively, the best subset of Ui with size limit j either contains item i or it doesn't. First case, si > j: item i can't be part of the solution, since if it were, the total size would exceed j, which is unacceptable; so V[i, j] = V[i-1, j]. Second case, si <= j: item i can be in the solution, and we choose whichever gives the greater value: V[i, j] = max(V[i-1, j], V[i-1, j-si] + vi).
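The two cases above translate directly into a table-filling routine (a minimal sketch; the function name and example data are my own):

```python
def knapsack_01(sizes, values, C):
    """0-1 knapsack: sizes[i], values[i] for item i+1; C = capacity."""
    n = len(sizes)
    # V[i][j] = best value using items 1..i with total size at most j;
    # row 0 stays 0 (no items means no value).
    V = [[0] * (C + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        s, v = sizes[i - 1], values[i - 1]
        for j in range(C + 1):
            if s > j:                                   # first case: item can't fit
                V[i][j] = V[i - 1][j]
            else:                                       # second case: skip vs. take
                V[i][j] = max(V[i - 1][j], V[i - 1][j - s] + v)
    return V[n][C]

print(knapsack_01([2, 3, 4], [3, 4, 5], 5))  # 7
```

The table has (n+1)·(C+1) entries, each filled in constant time, so the running time is O(n·C) rather than the brute force's O(2^n).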

Dynamic programming is a useful technique for solving certain kinds of optimization problems. When the solution can be described recursively in terms of partial solutions, we can store those partial solutions and re-use them to save time. The resulting running time is much better than that of the straightforward approach. Conclusions

Thank You