Richard Anderson Lecture 20 LCS / Shortest Paths


CSE 421 Algorithms Richard Anderson Lecture 20 LCS / Shortest Paths

Longest Common Subsequence
C = c1…cg is a subsequence of A = a1…am if C can be obtained by removing elements from A (but retaining order).
LCS(A, B): a maximum-length sequence that is a subsequence of both A and B.
Examples: ocurranec / occurrence; attacggct / tacgacca

Optimization recurrence
If aj = bk: Opt[j, k] = 1 + Opt[j-1, k-1]
If aj != bk: Opt[j, k] = max(Opt[j-1, k], Opt[j, k-1])
Base cases: Opt[j, 0] = Opt[0, k] = 0

Dynamic Programming Computation

Storing the path information
A[1..m], B[1..n]
for i := 1 to m: Opt[i, 0] := 0
for j := 1 to n: Opt[0, j] := 0
Opt[0, 0] := 0
for i := 1 to m
    for j := 1 to n
        if A[i] = B[j] { Opt[i, j] := 1 + Opt[i-1, j-1]; Best[i, j] := Diag; }
        else if Opt[i-1, j] >= Opt[i, j-1] { Opt[i, j] := Opt[i-1, j]; Best[i, j] := Left; }
        else { Opt[i, j] := Opt[i, j-1]; Best[i, j] := Down; }
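The pseudocode above can be sketched as runnable Python; Opt and Best follow the slide's names, and the traceback at the end is an assumed addition showing how the Best pointers recover one LCS.

```python
def lcs(A, B):
    m, n = len(A), len(B)
    # Opt[i][j] = length of an LCS of A[:i] and B[:j]
    Opt = [[0] * (n + 1) for _ in range(m + 1)]
    Best = [[None] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            if A[i - 1] == B[j - 1]:
                Opt[i][j] = 1 + Opt[i - 1][j - 1]
                Best[i][j] = "Diag"
            elif Opt[i - 1][j] >= Opt[i][j - 1]:
                Opt[i][j] = Opt[i - 1][j]
                Best[i][j] = "Left"
            else:
                Opt[i][j] = Opt[i][j - 1]
                Best[i][j] = "Down"
    # Walk the Best pointers back from (m, n) to recover one LCS.
    out, i, j = [], m, n
    while i > 0 and j > 0:
        if Best[i][j] == "Diag":
            out.append(A[i - 1]); i -= 1; j -= 1
        elif Best[i][j] == "Left":
            i -= 1
        else:
            j -= 1
    return "".join(reversed(out))
```

Both the Opt and Best tables use O(mn) space here, which is exactly the cost the next slides work to remove.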

How good is this algorithm? Is it feasible to compute the LCS of two strings of length 100,000 on a standard desktop PC? Why or why not?

Observations about the Algorithm
The computation can be done in O(m+n) space if we only need one column of the Opt or Best values.
The algorithm can be run from either end of the strings.

Computing LCS in O(nm) time and O(n+m) space
Divide-and-conquer algorithm; recompute values to save space.

Divide and Conquer Algorithm
Where does the best path cross the middle column?
For a fixed i, and for each j, compute the length of the LCS that has ai matched with bj.

Divide and Conquer
A = a1,…,am; B = b1,…,bn
Find j such that LCS(a1…am/2, b1…bj) and LCS(am/2+1…am, bj+1…bn) yield an optimal solution, then recurse on each half.

Algorithm Analysis
T(m, n) = T(m/2, j) + T(m/2, n-j) + cmn, which solves to T(m, n) = O(mn).

Memory Efficient LCS Summary
We can afford O(nm) time, but we can't afford O(nm) space.
If we only want the length of the LCS, we can easily reduce space to O(n+m).
Avoid storing the path by recomputing values.
Divide and conquer is used to reduce the problem sizes.
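A sketch of this divide-and-conquer scheme (essentially Hirschberg's algorithm) in Python: lcs_lengths computes one row of Opt in O(n) space, and the split column j is chosen where the forward and backward lengths sum to the maximum.

```python
def lcs_lengths(A, B):
    # Last row of the LCS length table for A vs B, in O(len(B)) space.
    prev = [0] * (len(B) + 1)
    for a in A:
        cur = [0]
        for j, b in enumerate(B, 1):
            cur.append(prev[j - 1] + 1 if a == b
                       else max(prev[j], cur[j - 1]))
        prev = cur
    return prev

def hirschberg(A, B):
    # Recurse on A's halves; an actual LCS comes out, not just its length.
    if len(A) == 0:
        return ""
    if len(A) == 1:
        return A if A in B else ""
    mid = len(A) // 2
    # left[k]  = LCS length of A[:mid] vs B[:k]
    # right[k] = LCS length of A[mid:] vs B[k:] (via reversed strings)
    left = lcs_lengths(A[:mid], B)
    right = lcs_lengths(A[mid:][::-1], B[::-1])[::-1]
    j = max(range(len(B) + 1), key=lambda k: left[k] + right[k])
    return hirschberg(A[:mid], B[:j]) + hirschberg(A[mid:], B[j:])
```

Each level of the recursion touches O(mn) table entries in total, but only two rows live at any time, giving the promised O(mn) time and O(n+m) space.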

Shortest Paths with Dynamic Programming

Shortest Path Problem
Dijkstra's single-source shortest paths algorithm: O(m log n) time, positive cost edges.
General case – handling negative edges: if there exists a negative cost cycle, the shortest path is not defined.
Bellman-Ford algorithm: O(mn) time for graphs with negative cost edges.

Lemma: If a graph has no negative cost cycles, then the shortest paths are simple paths, so shortest paths have at most n-1 edges.

Shortest paths with a fixed number of edges Find the shortest path from v to w with exactly k edges

Express as a recurrence
Optk(w) = minx [Optk-1(x) + cxw]
Opt0(w) = 0 if v = w, and infinity otherwise

Algorithm, Version 1
foreach w: M[0, w] = infinity
M[0, v] = 0
for i = 1 to n-1
    foreach w: M[i, w] = minx(M[i-1, x] + cost[x, w])

Algorithm, Version 2
foreach w: M[0, w] = infinity
M[0, v] = 0
for i = 1 to n-1
    foreach w: M[i, w] = min(M[i-1, w], minx(M[i-1, x] + cost[x, w]))

Algorithm, Version 3
foreach w: M[w] = infinity
M[v] = 0
for i = 1 to n-1
    foreach w: M[w] = min(M[w], minx(M[x] + cost[x, w]))

Correctness Proof for Algorithm 3
Key lemma: at the end of iteration i, for all w, M[w] <= M[i, w].
Reconstructing the path: set P[w] = x whenever M[w] is updated from vertex x.
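Version 3 together with the pointer array P can be sketched in Python; the edge-list representation and vertex numbering are assumptions, not from the slides.

```python
import math

def bellman_ford(n, edges, v):
    # edges: list of (x, w, cost) triples; vertices are 0..n-1; source v.
    M = [math.inf] * n
    M[v] = 0
    P = [None] * n          # P[w] = vertex that last updated M[w]
    for _ in range(n - 1):  # shortest paths have at most n-1 edges
        for x, w, cost in edges:
            if M[x] + cost < M[w]:
                M[w] = M[x] + cost
                P[w] = x
    return M, P
```

Following P from any w back toward v reconstructs a shortest path, exactly as the slide describes; with an adjacency-list scan of all m edges per round, the total time is O(mn).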

If the pointer graph has a cycle, then the graph has a negative cost cycle
If P[w] = x, then M[w] >= M[x] + cost(x, w): equal when w is updated, and M[x] could be reduced after the update.
Let v1, v2, …, vk be a cycle in the pointer graph with (vk, v1) the last edge added.
Just before that update: M[vj] >= M[vj+1] + cost(vj+1, vj) for j < k, and M[vk] > M[v1] + cost(v1, vk).
Adding everything up: 0 > cost(v1, v2) + cost(v2, v3) + … + cost(vk, v1)

Negative Cycles
If the pointer graph has a cycle, then the graph has a negative cycle.
Therefore: if the graph has no negative cycles, then the pointer graph has no cycles.

Finding negative cost cycles What if you want to find negative cost cycles?
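One standard answer (a sketch, not from the slides) is to relax for one extra round from a virtual source that reaches every vertex at cost 0: if any edge still relaxes in round n, a negative cycle exists, and walking the pointer array n steps back from the updated vertex lands on it. Edges are (x, w, cost) triples.

```python
def find_negative_cycle(n, edges):
    M = [0] * n           # distance 0 from a virtual source to every vertex
    P = [None] * n
    updated = None
    for _ in range(n):
        updated = None
        for x, w, cost in edges:
            if M[x] + cost < M[w]:
                M[w] = M[x] + cost
                P[w] = x
                updated = w
        if updated is None:
            return None   # converged: no negative cost cycle
    # 'updated' is on or reachable from a negative cycle; n pointer steps
    # back are guaranteed to land on the cycle, then follow P to collect it.
    x = updated
    for _ in range(n):
        x = P[x]
    cycle, y = [x], P[x]
    while y != x:
        cycle.append(y)
        y = P[y]
    return cycle[::-1]
```

This reuses the lemma from the previous slides: a cycle in the pointer graph certifies a negative cost cycle in the graph.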

Foreign Exchange Arbitrage
[Rate table and currency graph: pairwise exchange rates among USD, EUR, and CAD; values include 0.8, 1.2, 1.6, and 0.6]
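The arbitrage question reduces to negative-cycle detection: give each exchange edge weight -log(rate), so a cycle whose rates multiply to more than 1 becomes a negative cost cycle. A sketch in Python; the specific rates below are illustrative assumptions, not necessarily the slide's values.

```python
import math

# Hypothetical exchange rates for the slide's three currencies.
rates = {
    ("USD", "EUR"): 0.8, ("EUR", "USD"): 1.2,
    ("USD", "CAD"): 1.2, ("CAD", "USD"): 0.6,
    ("EUR", "CAD"): 1.6, ("CAD", "EUR"): 0.8,
}
names = ["USD", "EUR", "CAD"]
idx = {c: i for i, c in enumerate(names)}

# Weight -log(rate): multiplying rates along a cycle corresponds to
# summing weights, and product > 1 means the weight sum is negative.
edges = [(idx[a], idx[b], -math.log(r)) for (a, b), r in rates.items()]

# With these rates, USD -> CAD -> EUR -> USD multiplies to more than 1,
# so any Bellman-Ford negative-cycle finder would report it as arbitrage.
profit = rates[("USD", "CAD")] * rates[("CAD", "EUR")] * rates[("EUR", "USD")]
```

Feeding these edges to the negative-cycle routine from the previous slide's approach would recover the profitable currency loop directly.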