7-1 Chapter 7 Dynamic Programming

7-2 Fibonacci sequence (1)
0, 1, 1, 2, 3, 5, 8, 13, 21, 34, ...
Leonardo Fibonacci (1170-1250) used the sequence to count rabbits: each pair produces one new pair every month, a newborn pair starts producing only one month after birth, and no pair ever dies. (The slide tabulates, month by month, the pairs produced and the running total.)

7-3 Fibonacci sequence (2)
0, 1, 1, 2, 3, 5, 8, 13, 21, 34, ...

7-4 Fibonacci sequence and golden number
0, 1, 1, 2, 3, 5, 8, 13, 21, 34, ...
f_n = 0 if n = 0
f_n = 1 if n = 1
f_n = f_{n-1} + f_{n-2} if n ≥ 2
[Figure: a rectangle with sides x and 1, split into a unit square and a 1 × (x-1) rectangle.]
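Assuming the figure carried the standard golden-rectangle derivation: requiring the leftover 1 × (x-1) piece to be similar to the original x × 1 rectangle gives

\[
\frac{x}{1} = \frac{1}{x-1}
\quad\Longrightarrow\quad
x^2 - x - 1 = 0
\quad\Longrightarrow\quad
x = \frac{1+\sqrt{5}}{2} \approx 1.618,
\]

and the ratio of consecutive Fibonacci numbers f_{n+1}/f_n converges to this golden number.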

7-5 Computation of Fibonacci sequence
f_n = 0 if n = 0
f_n = 1 if n = 1
f_n = f_{n-1} + f_{n-2} if n ≥ 2
Solved by a recursive program, much of the computation is replicated (f_{n-2} is recomputed inside both recursive calls, and so on down the tree); it should instead be solved by a simple loop.
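A minimal C sketch contrasting the two approaches (illustrative code, not from the slides): fib_rec recomputes overlapping subproblems and takes exponential time, while fib_iter computes each value once.

#include <stdio.h>

/* Naive recursion: overlapping subproblems are recomputed;
   the number of calls grows like f_n itself. */
long long fib_rec(int n) {
    if (n <= 1) return n;                 /* f_0 = 0, f_1 = 1 */
    return fib_rec(n - 1) + fib_rec(n - 2);
}

/* Simple loop: each value computed once, O(n) time, O(1) space. */
long long fib_iter(int n) {
    long long a = 0, b = 1;               /* a = f_i, b = f_{i+1} */
    for (int i = 0; i < n; i++) {
        long long t = a + b;              /* f_{i+2} = f_i + f_{i+1} */
        a = b;
        b = t;
    }
    return a;                             /* now a = f_n */
}

int main(void) {
    printf("%lld %lld\n", fib_rec(10), fib_iter(10));  /* both print 55 */
    return 0;
}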

7-6 Dynamic Programming
Dynamic Programming is an algorithm design method that can be used when the solution to a problem may be viewed as the result of a sequence of decisions.

7-7 The shortest path
To find a shortest path in a multistage graph, apply the greedy method; on the graph shown in this slide the greedy choices happen to yield the shortest path from S to T, of total length 8.

7-8 The shortest path in multistage graphs
e.g. The greedy method cannot be applied to this case: it picks the locally cheapest edge at each step and returns (S, A, D, T) = 1 + 4 + 18 = 23, while the real shortest path is (S, C, F, T) = 5 + 2 + 2 = 9.

7-9 Dynamic programming approach
Dynamic programming approach (forward approach):
d(S, T) = min{1 + d(A, T), 2 + d(B, T), 5 + d(C, T)}
d(A, T) = min{4 + d(D, T), 11 + d(E, T)} = min{4 + 18, 11 + 13} = 22

7-10
d(B, T) = min{9 + d(D, T), 5 + d(E, T), 16 + d(F, T)} = min{9 + 18, 5 + 13, 16 + 2} = 18
d(C, T) = min{2 + d(F, T)} = 2 + 2 = 4
d(S, T) = min{1 + d(A, T), 2 + d(B, T), 5 + d(C, T)} = min{1 + 22, 2 + 18, 5 + 4} = 9
The above way of reasoning is called backward reasoning.

7-11 Backward approach (forward reasoning)
d(S, A) = 1, d(S, B) = 2, d(S, C) = 5
d(S, D) = min{d(S, A) + d(A, D), d(S, B) + d(B, D)} = min{1 + 4, 2 + 9} = 5
d(S, E) = min{d(S, A) + d(A, E), d(S, B) + d(B, E)} = min{1 + 11, 2 + 5} = 7
d(S, F) = min{d(S, B) + d(B, F), d(S, C) + d(C, F)} = min{2 + 16, 5 + 2} = 7

7-12
d(S, T) = min{d(S, D) + d(D, T), d(S, E) + d(E, T), d(S, F) + d(F, T)} = min{5 + 18, 7 + 13, 7 + 2} = 9
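A compact C sketch of the forward-approach computation on this example graph (the edge weights are exactly those used above; the enum encoding of vertex names is an implementation choice, not from the slides):

#include <stdio.h>

#define N 8                 /* vertices S, A, B, C, D, E, F, T */
#define INF 1000000

enum { S, A, B, C, D, E, F, T };

int main(void) {
    /* Edge weights of the example graph (0 = no edge). */
    int w[N][N] = {0};
    w[S][A] = 1;  w[S][B] = 2;  w[S][C] = 5;
    w[A][D] = 4;  w[A][E] = 11;
    w[B][D] = 9;  w[B][E] = 5;  w[B][F] = 16;
    w[C][F] = 2;
    w[D][T] = 18; w[E][T] = 13; w[F][T] = 2;

    /* d[u] = length of the shortest path from u to T. */
    int d[N];
    for (int v = 0; v < N; v++) d[v] = INF;
    d[T] = 0;
    /* The enum lists vertices in topological order, so one sweep
       from T back to S fills every d[u] before it is needed. */
    for (int u = N - 2; u >= 0; u--)
        for (int v = 0; v < N; v++)
            if (w[u][v] && w[u][v] + d[v] < d[u])
                d[u] = w[u][v] + d[v];

    printf("d(S,T) = %d\n", d[S]);   /* prints 9, via S -> C -> F -> T */
    return 0;
}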

7-13 Principle of optimality
Principle of optimality: suppose that in solving a problem, we have to make a sequence of decisions D_1, D_2, ..., D_n. If this sequence is optimal, then the last k decisions must be optimal for every k, 1 ≤ k ≤ n.
e.g. the shortest path problem: if i, i_1, i_2, ..., j is a shortest path from i to j, then i_1, i_2, ..., j must be a shortest path from i_1 to j.
In summary, if a problem can be described by a multistage graph, then it can be solved by dynamic programming.

7-14 Dynamic programming
Forward approach and backward approach: note that if the recurrence relations are formulated using the forward approach, then the relations are solved backwards, i.e., beginning with the last decision. On the other hand, if the relations are formulated using the backward approach, they are solved forwards.
To solve a problem by dynamic programming: find out the recurrence relations, and represent the problem by a multistage graph.

7-15 The resource allocation problem
Given m resources and n projects, where profit P(i, j) is earned when j resources are allocated to project i, maximize the total profit.

7-16 The multistage graph solution
The resource allocation problem can be described as a multistage graph.
(i, j): i resources allocated to projects 1, 2, ..., j.
e.g. node H = (3, 2): 3 resources allocated to projects 1 and 2.

7-17
Find the longest path from S to T: (S, C, H, L, T), with total profit 13.
2 resources allocated to project 1, 1 resource allocated to project 2, 0 resources allocated to projects 3 and 4. (A table-based sketch of the same computation follows below.)
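Rather than building the graph explicitly, the same recurrence can be run on a profit table. A small C sketch follows; the profit values and the two-project instance are purely illustrative assumptions (the slide's actual P(i, j) table did not survive in this transcript):

#include <stdio.h>

#define M 3    /* resources */
#define NP 2   /* projects, kept small for illustration */

int main(void) {
    /* P[i][j] = profit when j resources go to project i
       (made-up numbers; row 0 is unused padding). */
    int P[NP + 1][M + 1] = {
        {0, 0, 0, 0},
        {0, 2, 8, 9},   /* project 1 */
        {0, 5, 6, 7},   /* project 2 */
    };

    /* f[i][j] = max profit allocating j resources to projects 1..i */
    int f[NP + 1][M + 1] = {0};
    for (int i = 1; i <= NP; i++)
        for (int j = 0; j <= M; j++)
            for (int k = 0; k <= j; k++) {   /* k resources to project i */
                int v = f[i - 1][j - k] + P[i][k];
                if (v > f[i][j]) f[i][j] = v;
            }

    printf("max profit = %d\n", f[NP][M]);   /* prints 13 for these data */
    return 0;
}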

7-18 The longest common subsequence (LCS) problem
A string: A = b a c a d.
A subsequence of A is obtained by deleting 0 or more symbols from A (not necessarily consecutive), e.g. ad, ac, bac, acad, bacad, bcd.
Common subsequences of A = b a c a d and B = a c c b a d c b: ad, ac, bac, acad.
The longest common subsequence (LCS) of A and B: a c a d.

7-19 The LCS algorithm
Let A = a_1 a_2 ... a_m and B = b_1 b_2 ... b_n.
Let L_{i,j} denote the length of the longest common subsequence of a_1 a_2 ... a_i and b_1 b_2 ... b_j.
L_{i,j} = L_{i-1,j-1} + 1 if a_i = b_j
L_{i,j} = max{ L_{i-1,j}, L_{i,j-1} } if a_i ≠ b_j
L_{0,0} = L_{0,j} = L_{i,0} = 0 for 1 ≤ i ≤ m, 1 ≤ j ≤ n.

7-20
The dynamic programming approach for solving the LCS problem fills in the (m+1) × (n+1) table of L_{i,j} values row by row.
Time complexity: O(mn).

7-21 Tracing back in the LCS algorithm
e.g. A = b a c a d, B = a c c b a d c b.
After all L_{i,j}'s have been found, we can trace back through the table to find the longest common subsequence of A and B.
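A C sketch of both phases on the slides' example (illustrative code, not the slides' own): filling the table with the recurrence of slide 7-19, then tracing back from L[m][n].

#include <stdio.h>
#include <string.h>

#define MAXN 64

int main(void) {
    const char *A = "bacad", *B = "accbadcb";   /* the slides' example */
    int m = (int)strlen(A), n = (int)strlen(B);
    int L[MAXN][MAXN] = {0};                    /* L[0][j] = L[i][0] = 0 */

    /* Fill the table row by row using the recurrence. */
    for (int i = 1; i <= m; i++)
        for (int j = 1; j <= n; j++)
            L[i][j] = (A[i-1] == B[j-1])
                ? L[i-1][j-1] + 1
                : (L[i-1][j] > L[i][j-1] ? L[i-1][j] : L[i][j-1]);

    /* Trace back from L[m][n], collecting matched symbols right to left. */
    char lcs[MAXN];
    int k = L[m][n], i = m, j = n;
    lcs[k] = '\0';
    while (i > 0 && j > 0) {
        if (A[i-1] == B[j-1])            { lcs[--k] = A[i-1]; i--; j--; }
        else if (L[i-1][j] >= L[i][j-1]) i--;
        else                             j--;
    }
    printf("length %d: %s\n", L[m][n], lcs);    /* prints: length 4: acad */
    return 0;
}

Any tie-breaking rule in the traceback recovers some LCS of maximum length; this one happens to recover a c a d, as on the slide.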

7-22 0/1 knapsack problem
n objects with weights W_1, W_2, ..., W_n and profits P_1, P_2, ..., P_n; knapsack capacity M.
Maximize Σ_{1 ≤ i ≤ n} P_i x_i subject to Σ_{1 ≤ i ≤ n} W_i x_i ≤ M, with x_i = 0 or 1 for 1 ≤ i ≤ n.
e.g. a small three-object instance (shown on the slide).

7-23 The multistage graph solution
The 0/1 knapsack problem can be described by a multistage graph.

7-24 The dynamic programming approach
The longest path represents the optimal solution: x_1 = 0, x_2 = 1, x_3 = 1, with total profit P_2 + P_3 = 50.
Let f_i(Q) be the value of an optimal solution to objects 1, 2, ..., i with capacity Q:
f_i(Q) = max{ f_{i-1}(Q), f_{i-1}(Q - W_i) + P_i }
The optimal solution is f_n(M).
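A C sketch of this recurrence. The weights, profits, and capacity below are assumptions chosen only to reproduce the stated optimum x = (0, 1, 1) with profit 50; the slide's actual numbers were lost in transcription.

#include <stdio.h>

#define N 3
#define M 10   /* capacity (assumed) */

int main(void) {
    /* Assumed example data; index 0 unused. */
    int W[N + 1] = {0, 10, 3, 5};
    int P[N + 1] = {0, 40, 20, 30};

    /* f[i][q] = best profit using objects 1..i with capacity q:
       f[i][q] = max{ f[i-1][q], f[i-1][q - W[i]] + P[i] } */
    int f[N + 1][M + 1] = {0};
    for (int i = 1; i <= N; i++)
        for (int q = 0; q <= M; q++) {
            f[i][q] = f[i - 1][q];                        /* x_i = 0 */
            if (q >= W[i] && f[i - 1][q - W[i]] + P[i] > f[i][q])
                f[i][q] = f[i - 1][q - W[i]] + P[i];      /* x_i = 1 */
        }

    printf("f_n(M) = %d\n", f[N][M]);   /* prints 50: objects 2 and 3 */
    return 0;
}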

7-25 Optimal binary search trees
e.g. binary search trees for 3, 7, 9, 12 (several possible shapes are shown on the slide).

7-26 Optimal binary search trees
n identifiers: a_1 < a_2 < a_3 < ... < a_n.
P_i, 1 ≤ i ≤ n: the probability that a_i is searched.
Q_i, 0 ≤ i ≤ n: the probability that an x with a_i < x < a_{i+1} is searched (a_0 = -∞, a_{n+1} = ∞).

7-27
Identifiers: 4, 5, 8, 10, 11, 12, 14.
Internal node: successful search, P_i. External node: unsuccessful search, Q_i.
The expected cost of a binary search tree, taking the level of the root to be 1, is
Σ_{1 ≤ i ≤ n} P_i · level(a_i) + Σ_{0 ≤ i ≤ n} Q_i · (level(E_i) - 1)
where E_i is the external node covering the gap between a_i and a_{i+1}.

7-28 The dynamic programming approach
Let C(i, j) denote the cost of an optimal binary search tree containing a_i, ..., a_j.
The cost of the binary search tree on a_i, ..., a_j with a_k as its root is
P_k + C(i, k-1) + W(i, k-1) + C(k+1, j) + W(k+1, j)
where W(i, j) = Q_{i-1} + Σ_{i ≤ m ≤ j} (Q_m + P_m): hanging a subtree one level below the root makes every search into it one comparison longer, which adds the subtree's total weight W once.

7-29 General formula
C(i, j) = min_{i ≤ k ≤ j} { C(i, k-1) + C(k+1, j) } + W(i, j)
with boundary conditions C(i, i-1) = 0 and W(i, i-1) = Q_{i-1}.
The cost of the overall optimal tree is C(1, n).

7-30 Computation relationships of subtrees
e.g. n = 4: the table of C(i, j)'s is filled diagonal by diagonal; (n - m) C(i, j)'s are computed when j - i = m, and each C(i, j) with j - i = m can be computed in O(m) time.
Time complexity: O(n^3).
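A C sketch of the O(n^3) table-filling, diagonal by diagonal. The probabilities are illustrative assumptions (they sum to 1), not the slide's data:

#include <stdio.h>

#define N 4
#define INF 1e9

int main(void) {
    /* Assumed probabilities: P[i] = successful search for a_i,
       Q[i] = unsuccessful search falling between a_i and a_{i+1}. */
    double P[N + 1] = {0, 0.20, 0.10, 0.15, 0.05};
    double Q[N + 1] = {0.10, 0.10, 0.10, 0.10, 0.10};

    double W[N + 2][N + 1], C[N + 2][N + 1];

    /* Base cases (empty trees): W(i, i-1) = Q[i-1], C(i, i-1) = 0. */
    for (int i = 1; i <= N + 1; i++) {
        W[i][i - 1] = Q[i - 1];
        C[i][i - 1] = 0.0;
    }

    /* Fill by increasing subtree size m = j - i. */
    for (int m = 0; m < N; m++)
        for (int i = 1; i + m <= N; i++) {
            int j = i + m;
            W[i][j] = W[i][j - 1] + P[j] + Q[j];
            C[i][j] = INF;
            for (int k = i; k <= j; k++) {   /* try a_k as the root */
                double c = C[i][k - 1] + C[k + 1][j] + W[i][j];
                if (c < C[i][j]) C[i][j] = c;
            }
        }

    printf("optimal expected cost = %.2f\n", C[1][N]);  /* 2.15 here */
    return 0;
}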

7-31 Matrix-chain multiplication
n matrices A_1, A_2, ..., A_n with sizes p_0 × p_1, p_1 × p_2, p_2 × p_3, ..., p_{n-1} × p_n.
Determine the multiplication order such that the number of scalar multiplications is minimized.
To compute A_i × A_{i+1}, we need p_{i-1} p_i p_{i+1} scalar multiplications.
e.g. n = 4, A_1: 3 × 5, A_2: 5 × 4, A_3: 4 × 2, A_4: 2 × 5.
((A_1 A_2) A_3) A_4: 3*5*4 + 3*4*2 + 3*2*5 = 114 scalar multiplications
(A_1 (A_2 A_3)) A_4: 5*4*2 + 3*5*2 + 3*2*5 = 100 scalar multiplications
(A_1 A_2) (A_3 A_4): 3*5*4 + 4*2*5 + 3*4*5 = 160 scalar multiplications

7-32
Let m(i, j) denote the minimum cost for computing A_i × A_{i+1} × ... × A_j:
m(i, i) = 0
m(i, j) = min_{i ≤ k < j} { m(i, k) + m(k+1, j) + p_{i-1} p_k p_j } for i < j
Computation sequence: by increasing chain length j - i, just as the optimal binary search tree table is filled diagonal by diagonal.
Time complexity: O(n^3).
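A C sketch of this recurrence on the example dimensions above; it reports 100, matching the best order (A_1 (A_2 A_3)) A_4:

#include <stdio.h>

#define N 4
#define INF 1000000000

int main(void) {
    /* A_i is p[i-1] x p[i]: A1:3x5, A2:5x4, A3:4x2, A4:2x5. */
    int p[N + 1] = {3, 5, 4, 2, 5};

    /* m[i][j] = minimum scalar multiplications for A_i x ... x A_j */
    int m[N + 1][N + 1] = {0};
    for (int len = 2; len <= N; len++)           /* chain length */
        for (int i = 1; i + len - 1 <= N; i++) {
            int j = i + len - 1;
            m[i][j] = INF;
            for (int k = i; k < j; k++) {        /* split after A_k */
                int cost = m[i][k] + m[k + 1][j] + p[i - 1] * p[k] * p[j];
                if (cost < m[i][j]) m[i][j] = cost;
            }
        }

    printf("m(1,%d) = %d\n", N, m[1][N]);   /* prints 100 */
    return 0;
}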

7-33 Single-step graph edge searching
Fugitive: can move at any speed and is hidden in some edge of an undirected graph G = (V, E).
Edge searcher (guard): searches an edge (u, v) from u to v, or stays at vertex u to prevent the fugitive from passing through u.
Goal: capture the fugitive in one step. (The slide shows two examples: one where no extra guards are needed, and one where extra guards are needed.)

7-34
The cost of a searcher moving from u to v is wt(u); a guard staying at u also costs wt(u).
Cost of the example plan on the slide: 2wt(a) + wt(b) + wt(b) (one extra guard stays at b).
Problem definition: arrange the searchers with minimal cost so as to capture the fugitive in one step.
NP-hard for general graphs; polynomial for trees.

7-35 The weighted single-step graph edge searching problem on trees
T(v_i): the subtree consisting of v_i, v_j (the parent of v_i), and all descendant nodes of v_i.
C(T(v_i), v_i, v_j): cost of an optimal searching plan for T(v_i) in which edge (v_i, v_j) is searched from v_i to v_j.
e.g. C(T(v_4), v_4, v_2) = 5, C(T(v_4), v_2, v_4) = 2, C(T(v_2), v_2, v_1) = 6, C(T(v_2), v_1, v_2) = 9.

7-36 The dynamic programming approach
Rule 1: optimal total cost.
Rule 2.1: no extra guard at root r: all children must have the same searching direction.

7-37 Rule 2.2: one extra guard at root r: each child can choose its best direction independently.

7-38 Rule 3.1: searching to an internal node u from its parent node w.

7-39 Rule 3.2: searching from an internal node u to its parent node w.

7-40 Rule 4: a leaf node u and its parent node w.
The dynamic programming approach works from the leaf nodes gradually towards the root, computing the minimal cost of each subtree and determining the searching direction for each edge.
Time complexity: O(n).