Dynamic Programming Longest Path in a DAG, Yin Tat Lee


CSE 421 Dynamic Programming: Longest Path in a DAG, Yin Tat Lee

Longest Path in a DAG

Longest Path in a DAG Goal: Given a DAG G, find the longest path. Recall: A directed graph G is a DAG if it has no cycle. This problem is NP-hard for general directed graphs: it has Hamiltonian Path as a special case. (Figure: an example DAG on vertices 1-7.)

DP for Longest Path in a DAG Q: What is the right ordering? Remember, we have to use the fact that G is a DAG, ideally in defining the ordering. We saw that every DAG has a topological sorting, so let's use that as the ordering. (Figure: the same example DAG on vertices 1-7.)

DP for Longest Path in a DAG Suppose we have labelled the vertices such that (i,j) is a directed edge only if i<j. Let OPT(j) = length of the longest path ending at j. Suppose the OPT(j) path is (i_1,i_2), (i_2,i_3), …, (i_{k-1},i_k), (i_k,j). Then: Obs 1: i_1 < i_2 < … < i_k < j. Obs 2: (i_1,i_2), …, (i_{k-1},i_k) is the longest path ending at i_k, so OPT(j) = 1 + OPT(i_k).

DP for Longest Path in a DAG Suppose we have labelled the vertices such that (i,j) is a directed edge only if i<j. Let OPT(j) = length of the longest path ending at j. Then:
OPT(j) = 0 if j is a source,
OPT(j) = 1 + max_{i: (i,j) an edge} OPT(i) otherwise.

Outputting the Longest Path Let G be a DAG given with a topological sorting: for all edges (i,j) we have i<j. Initialize M[j]=empty and Parent[j]=-1 for all j.

Compute-OPT(j):
    if in-degree(j) == 0: M[j] = 0; return 0
    if M[j] == empty:
        M[j] = 0
        for all edges (i,j):
            if M[j] < 1 + Compute-OPT(i):
                M[j] = 1 + Compute-OPT(i)
                Parent[j] = i    // record the entry used to compute OPT(j)
    return M[j]

Run Compute-OPT(j) for all j, and let k be the index maximizing M[k] over M[1],…,M[n]. Then print k, and while Parent[k] != -1, set k = Parent[k] and print k.
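The memoized procedure above can be sketched in Python. The DAG below is a made-up example on vertices 1..7, assumed (as on the slide) to be labelled in topological order, so every edge (i, j) has i < j.

```python
from functools import lru_cache

# Hypothetical example DAG, vertices 1..7 in topological order.
edges = [(1, 2), (2, 4), (1, 3), (3, 4), (4, 5), (3, 6), (5, 7), (6, 7)]
n = 7

in_edges = {j: [] for j in range(1, n + 1)}
for i, j in edges:
    in_edges[j].append(i)

parent = {j: -1 for j in range(1, n + 1)}  # predecessor on the best path

@lru_cache(maxsize=None)
def opt(j):
    """OPT(j): number of edges on the longest path ending at j."""
    best = 0  # a source vertex has OPT = 0
    for i in in_edges[j]:
        if 1 + opt(i) > best:
            best = 1 + opt(i)
            parent[j] = i  # record the entry used to compute OPT(j)
    return best

# The longest path overall ends at the vertex maximizing OPT(j).
k = max(range(1, n + 1), key=opt)
path = []
while k != -1:
    path.append(k)
    k = parent[k]
path.reverse()  # path now lists the vertices from source to sink
```

Memoization makes each OPT(j) computed once, so the total work is O(n + m).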

Exercise: Longest Increasing Subsequence

Longest Increasing Subsequence Given a sequence of numbers, find the longest increasing subsequence in O(n^2) time. Example: 41, 22, 9, 15, 23, 39, 21, 56, 24, 34, 59, 23, 60, 39, 87, 23, 90.

DP for LIS Let OPT(j) be the length of the longest increasing subsequence ending at j. Observation: Suppose OPT(j) is achieved by the sequence x_{i_1}, x_{i_2}, …, x_{i_k}, x_j. Then x_{i_1}, …, x_{i_k} is the longest increasing subsequence ending at x_{i_k}, i.e., OPT(j) = 1 + OPT(i_k). So:
OPT(j) = 1 if x_i ≥ x_j for all i < j,
OPT(j) = 1 + max_{i<j: x_i < x_j} OPT(i) otherwise.
Alternative soln: This is a special case of longest path in a DAG: construct a graph on vertices 1,…,n where (i,j) is an edge if i<j and x_i < x_j.
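The O(n^2) recurrence translates directly into code. The Python sketch below (the function name is my own) also reconstructs one optimal subsequence via parent pointers, in the same style as the longest-path algorithm.

```python
def longest_increasing_subsequence(xs):
    """O(n^2) DP: opt[j] = length of the LIS ending at index j."""
    n = len(xs)
    opt = [1] * n        # every element alone is an increasing subsequence
    parent = [-1] * n    # predecessor index on the best subsequence
    for j in range(n):
        for i in range(j):
            if xs[i] < xs[j] and opt[i] + 1 > opt[j]:
                opt[j] = opt[i] + 1
                parent[j] = i
    # Follow parent pointers back from the best endpoint.
    j = max(range(n), key=lambda k: opt[k])
    seq = []
    while j != -1:
        seq.append(xs[j])
        j = parent[j]
    return seq[::-1]
```

Running it on the slide's example sequence returns an increasing subsequence of length 9.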

Shortest Paths with Negative Edge Weights

Shortest Paths with Neg Edge Weights Given a weighted directed graph G=(V,E) and a source vertex s, where the weight of edge (u,v) is c_{u,v} (which can be negative). Goal: Find the shortest path from s to all vertices of G. Recall that Dijkstra's algorithm fails when weights are negative. (Figure: two example graphs with source s.)

Impossibility on Graphs with Neg Cycles Observation: No solution exists if G has a negative cycle, because we can decrease the length without bound by going around the cycle again and again. So, suppose G does not have a negative cycle. (Figure: an example graph with a negative cycle.)

DP for Shortest Path (First Attempt) Def: Let OPT(v) be the length of the shortest s-v path. Then:
OPT(v) = 0 if v = s,
OPT(v) = min_{u: (u,v) an edge} (OPT(u) + c_{u,v}) otherwise.
The formula is correct, but it is not clear how to compute it: the subproblems can depend on each other cyclically.

DP for Shortest Path Def: Let OPT(v,i) be the length of the shortest s-v path with at most i edges. Let us characterize OPT(v,i).
Case 1: The OPT(v,i) path has fewer than i edges. Then OPT(v,i) = OPT(v,i-1).
Case 2: The OPT(v,i) path has exactly i edges. Let s, v_1, v_2, …, v_{i-1}, v be that path. Then s, v_1, …, v_{i-1} must be the shortest s-v_{i-1} path with at most i-1 edges, so OPT(v,i) = OPT(v_{i-1}, i-1) + c_{v_{i-1},v}.

DP for Shortest Path Def: Let OPT(v,i) be the length of the shortest s-v path with at most i edges.
OPT(v,i) = 0 if v = s,
OPT(v,i) = ∞ if v ≠ s and i = 0,
OPT(v,i) = min( OPT(v,i-1), min_{u: (u,v) an edge} (OPT(u,i-1) + c_{u,v}) ) otherwise.
So, for i large enough, OPT(v,i) is the shortest-path distance from s to v. But how long do we have to run? Since G has no negative cycle, some shortest path is simple, so it has at most n-1 edges. So, OPT(v,n-1) is the answer.

Bellman Ford Algorithm
for v = 1 to n: if v ≠ s then M[v,0] = ∞
M[s,0] = 0
for i = 1 to n-1:
    for every v: M[v,i] = M[v,i-1]
    for every edge (u,v): M[v,i] = min(M[v,i], M[u,i-1] + c_{u,v})
Running time: O(nm). Can we test if G has negative cycles? Yes: run for i = 1,…,3n and check whether M[v,3n] differs from M[v,n-1] for some v.
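A compact Python sketch of the table above. It uses the common in-place variant: one distance array instead of the full M[v,i] table, and one extra relaxation pass for negative-cycle detection in place of running the loop out to 3n. The graph encoding (0-indexed vertices, edge triples) is my own choice.

```python
INF = float("inf")

def bellman_ford(n, edges, s):
    """Bellman-Ford on vertices 0..n-1 with edges as (u, v, cost) triples.

    Returns (dist, has_negative_cycle), where dist[v] is the shortest
    s-v distance (INF if unreachable).
    """
    M = [INF] * n
    M[s] = 0
    # n-1 relaxation passes suffice when there is no negative cycle.
    for _ in range(n - 1):
        for u, v, c in edges:
            if M[u] + c < M[v]:
                M[v] = M[u] + c
    # One extra pass: any further improvement witnesses a negative cycle.
    neg = any(M[u] + c < M[v] for u, v, c in edges if M[u] != INF)
    return M, neg
```

On a small graph with a negative edge but no negative cycle, the distances converge; adding a negative cycle flips the detection flag.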

DP Techniques Summary Recipe: Follow the natural induction proof. Find the additional assumptions/variables/subproblems that you need to do the induction. Strengthen the hypothesis and define w.r.t. the new subproblems. Dynamic programming techniques: Whenever the problem is a special case of an NP-hard problem, an ordering is important: longest path in a DAG. Adding a new variable: knapsack. Dynamic programming over intervals: RNA secondary structure. Top-down vs. bottom-up: different people have different intuitions; bottom-up is useful to optimize the memory.