Introduction to Algorithms 6.046J/18.401J/SMA5503

Presentation transcript:

Introduction to Algorithms 6.046J/18.401J/SMA5503 Lecture 15 Prof. Charles E. Leiserson

Dynamic programming Design technique, like divide-and-conquer. Example: Longest Common Subsequence (LCS) • Given two sequences x[1 . . m] and y[1 . . n], find a longest subsequence common to them both. © 2001 by Charles E. Leiserson

Brute-force LCS algorithm Check every subsequence of x[1 . . m] to see if it is also a subsequence of y[1 . . n]. Analysis • Checking = O(n) time per subsequence. • 2^m subsequences of x (each bit-vector of length m determines a distinct subsequence of x). Worst-case running time = O(n·2^m) = exponential time.
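As a sketch (not code from the lecture), the brute-force strategy can be written directly in Python: enumerate every index subset of x, longest first, and stop at the first candidate that is also a subsequence of y.

```python
from itertools import combinations

def is_subsequence(s, y):
    """O(n) two-pointer scan: is s a subsequence of y?"""
    it = iter(y)
    return all(c in it for c in s)

def lcs_brute_force(x, y):
    """Try all 2^m subsequences of x, longest first: O(n * 2^m) worst case."""
    for k in range(len(x), -1, -1):
        # Each k-subset of indices of x determines one length-k subsequence.
        for idxs in combinations(range(len(x)), k):
            cand = "".join(x[i] for i in idxs)
            if is_subsequence(cand, y):
                return cand  # first hit at the largest k is a longest one
    return ""
```

On the running example x = ABCBDAB, y = BDCABA this finds an LCS of length 4, but the exponential blow-up makes it unusable beyond tiny inputs.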

Towards a better algorithm Simplification: 1. Look at the length of a longest common subsequence. 2. Extend the algorithm to find the LCS itself. Notation: Denote the length of a sequence s by | s |. Strategy: Consider prefixes of x and y. • Define c[i, j] = | LCS(x[1 . . i], y[1 . . j]) |. • Then, c[m, n] = | LCS(x, y) |.

Recursive formulation Theorem.
c[i, j] = 0 if i = 0 or j = 0,
c[i, j] = c[i–1, j–1] + 1 if x[i] = y[ j],
c[i, j] = max{c[i–1, j], c[i, j–1]} otherwise.
Proof (case x[i] = y[ j]): Let z[1 . . k] = LCS(x[1 . . i], y[1 . . j]), where c[i, j] = k. Then, z[k] = x[i], or else z could be extended. Thus, z[1 . . k–1] is a common subsequence of x[1 . . i–1] and y[1 . . j–1].

Proof (continued) Claim: z[1 . . k–1] = LCS(x[1 . . i–1], y[1 . . j–1]). Suppose w is a longer common subsequence of x[1 . . i–1] and y[1 . . j–1], that is, |w| > k–1. Then, cut and paste: w || z[k] (w concatenated with z[k]) is a common subsequence of x[1 . . i] and y[1 . . j] with |w || z[k]| > k. Contradiction, proving the claim. Thus, c[i–1, j–1] = k–1, which implies that c[i, j] = c[i–1, j–1] + 1. The other cases are similar.

Dynamic-programming hallmark #1 Optimal substructure: an optimal solution to a problem (instance) contains optimal solutions to subproblems. For LCS: if z = LCS(x, y), then any prefix of z is an LCS of a prefix of x and a prefix of y.

Recursive algorithm for LCS
LCS(x, y, i, j)
  if i = 0 or j = 0
    then c[i, j] ← 0
  elseif x[i] = y[ j]
    then c[i, j] ← LCS(x, y, i–1, j–1) + 1
    else c[i, j] ← max{LCS(x, y, i–1, j), LCS(x, y, i, j–1)}
Worst case: x[i] ≠ y[ j], in which case the algorithm evaluates two subproblems, each with only one parameter decremented.
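A direct Python transcription of this recurrence (a sketch, not the lecture's code) makes the exponential behavior easy to observe; i and j are prefix lengths, with 0 meaning the empty prefix.

```python
def lcs_length(x, y, i, j):
    """Naive recursion straight from the recurrence; exponential time
    in the worst case because subproblems are recomputed."""
    if i == 0 or j == 0:
        return 0
    if x[i - 1] == y[j - 1]:          # x[i] = y[j] in 1-based slide notation
        return lcs_length(x, y, i - 1, j - 1) + 1
    return max(lcs_length(x, y, i - 1, j),
               lcs_length(x, y, i, j - 1))
```

For x = ABCBDAB, y = BDCABA, calling `lcs_length(x, y, 7, 6)` returns 4.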

Recursion tree The recursion tree contains many repeated nodes: we keep solving subproblems that were already solved! Height = m + n ⇒ work potentially exponential.

Dynamic-programming hallmark #2 Overlapping subproblems: a recursive solution contains a "small" number of distinct subproblems repeated many times. For LCS: the number of distinct subproblems for two strings of lengths m and n is only mn.

Memoization algorithm Memoization: After computing a solution to a subproblem, store it in a table. Subsequent calls check the table to avoid redoing work. Time = Θ(mn) = constant work per table entry. Space = Θ(mn).
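The memoized recursion can be sketched as follows (hypothetical Python, not the lecture's code): the same recurrence as before, but each (i, j) entry is computed at most once, giving Θ(mn) time.

```python
def lcs_length_memo(x, y):
    """Memoized LCS length: constant work per distinct (i, j) subproblem."""
    memo = {}

    def go(i, j):
        if i == 0 or j == 0:
            return 0
        if (i, j) not in memo:            # compute each entry only once
            if x[i - 1] == y[j - 1]:
                memo[(i, j)] = go(i - 1, j - 1) + 1
            else:
                memo[(i, j)] = max(go(i - 1, j), go(i, j - 1))
        return memo[(i, j)]

    return go(len(x), len(y))
```

Unlike the naive recursion, this version handles strings of hundreds of characters without difficulty, since only mn distinct subproblems exist.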

Dynamic-programming algorithm IDEA: Compute the table bottom-up. Time = Θ(mn). Reconstruct the LCS by tracing backwards through the table. Space = Θ(mn). Exercise: reduce the space to O(min{m, n}).
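A minimal bottom-up sketch in Python (function names are illustrative, not from the lecture): fill the c table row by row, then trace backwards from c[m][n] to recover one LCS.

```python
def lcs(x, y):
    """Bottom-up LCS: Theta(mn) time and space, plus backwards traceback."""
    m, n = len(x), len(y)
    c = [[0] * (n + 1) for _ in range(m + 1)]   # c[i][j] = |LCS of prefixes|
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            if x[i - 1] == y[j - 1]:
                c[i][j] = c[i - 1][j - 1] + 1
            else:
                c[i][j] = max(c[i - 1][j], c[i][j - 1])
    # Trace backwards from c[m][n] to reconstruct one LCS.
    out, i, j = [], m, n
    while i > 0 and j > 0:
        if x[i - 1] == y[j - 1]:
            out.append(x[i - 1])
            i -= 1
            j -= 1
        elif c[i - 1][j] >= c[i][j - 1]:
            i -= 1
        else:
            j -= 1
    return "".join(reversed(out))
```

If only the length is needed, each row depends only on the previous one, so keeping two rows of the shorter string gives the O(min{m, n}) space mentioned in the exercise.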