Dynamic Programming.


Dynamic Programming

Algorithmic Strategies
- Divide and Conquer
- Greedy Approach
- Dynamic Programming

Algorithm Strategies: Key Differences

Divide and Conquer
- Divide a problem into "disjoint" sub-problems
- Solve the sub-problems first (bottom-up solution)
- Combine their results

Greedy Approach
- Apply a greedy choice first
- The choice creates sub-problems to solve (top-down solution)
- Hopefully, the locally optimal choices lead to a globally optimal solution

Algorithm Strategies: Key Differences (Cont'd)

Dynamic Programming
- Divide a problem into sub-problems
- Solve the sub-problems first, then combine them to solve the larger problem (bottom-up solution)
- The sub-problems overlap
  - Divide and conquer would solve the same sub-problem again and again
  - Dynamic programming solves each sub-problem once and remembers the answer
- Like the greedy approach, it solves optimization problems
- Uses more space to save time

Dynamic Programming
- Trades space (to store the sub-problem solutions) for time
- This extra storage can reduce an exponential-time solution to polynomial time
- Considers all possible combinations, selecting the best choice for each sub-problem at each step

Rod Cutting

Rod Cutting: Problem Definition
- Given a rod of length n with n-1 cutting points
- Also given a revenue for each piece length
- Find the cutting of the rod that maximizes the revenue

Example: Rod of Size 4
- How many possible cuttings? Exponential: 2^(n-1) = 2^3 = 8
  - Each cutting point is a binary choice (cut or no cut)
  - With n-1 cutting points, the total number of possibilities is 2^(n-1)
- Naïve approach (exponential): try all possibilities and select the maximum, as in the sketch below
- Best choice: two pieces, each of size 2, so revenue = 5 + 5 = 10
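A minimal brute-force sketch of this enumeration (not from the slides). The price table p is an assumption: the slides only confirm p[2] = 5 and that the best revenue for size 4 is 10; the remaining values are CLRS-style.

    from itertools import product

    # Assumed price table: only p[2] = 5 and the best revenue 10 for n = 4
    # are confirmed by the slides; the other values are CLRS-style.
    p = {1: 1, 2: 5, 3: 8, 4: 9}

    def brute_force_cut(n):
        """Enumerate all 2^(n-1) cut/no-cut choices and keep the best revenue."""
        best = 0
        for cuts in product([0, 1], repeat=n - 1):    # one bit per cutting point
            pieces, start = [], 0
            for pos, cut in enumerate(cuts, start=1):
                if cut:                               # cut after position pos
                    pieces.append(pos - start)
                    start = pos
            pieces.append(n - start)                  # the final piece
            best = max(best, sum(p[length] for length in pieces))
        return best

    print(brute_force_cut(4))  # -> 10 (two pieces of size 2: 5 + 5)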

Rod Cutting: Optimization Problem
- Rod cutting is an optimization problem
- It has the optimal sub-structure property
  - You must have the optimal cut for each sub-problem to get the global optimum
- It has an exponential recursive solution
- It has a polynomial dynamic programming solution

Recursive Top-Down Solution
- Where to make the first cut? (Call its position i)
- Each choice of i creates a smaller sub-problem (the remaining rod of size n - i) that is solved recursively:
  r(n) = max over 1 <= i <= n of ( p(i) + r(n - i) ), with r(0) = 0
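A direct sketch of this recursion (function name is mine; the price table p is the assumed one from the brute-force sketch):

    p = {1: 1, 2: 5, 3: 8, 4: 9}  # assumed prices, as in the earlier sketch

    def cut_rod(n):
        """Naive top-down recursion: tries every first-cut position i."""
        if n == 0:                             # a rod of size 0 ends the recursion
            return 0
        return max(p[i] + cut_rod(n - i)       # revenue of the first piece plus
                   for i in range(1, n + 1))   # the best for the remaining rod

    print(cut_rod(4))  # -> 10, but with exponentially many recursive calls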

Recursive Top-Down Solution: Recursion Tree (for n = 4)
- Each node branches on the first-cut position i = 1, 2, 3, 4
- If the first cut is at position i, the rest is solved recursively
- Reaching size 0 is the condition for ending the recursion

Notice: Sub-Problems Repeat

In the recursion tree for n = 4:
- The sub-problem of size 2 is solved twice
- The sub-problem of size 1 is solved 4 times

Dynamic Programming Solution
- Store sub-problem results; don't re-compute them
- A time vs. memory trade-off
- Turns an exponential-time solution into a polynomial-time solution

Version 1: Top-Down with Memoization
- r[x] is the best revenue for size x
- If a size was solved before, just return its stored optimal solution
- Otherwise, proceed as before (i is the first-cut position; the rest is solved recursively)
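A sketch of this memoized version (names cut_rod_memo and r are mine; prices assumed as above):

    p = {1: 1, 2: 5, 3: 8, 4: 9}  # assumed prices, as in the earlier sketches

    def cut_rod_memo(n, r=None):
        """Top-down with memoization: r[x] stores the best revenue for size x."""
        if r is None:
            r = {0: 0}                               # size 0 has revenue 0
        if n in r:                                   # solved before: return it
            return r[n]
        r[n] = max(p[i] + cut_rod_memo(n - i, r)     # i is the first-cut position;
                   for i in range(1, n + 1))         # the rest solved recursively
        return r[n]

    print(cut_rod_memo(4))  # -> 10, each sub-problem now solved only once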

Version 2: Bottom-Up Without Recursion
- Solve the sub-problems in order (smaller to larger)
  - Size 0 has revenue 0
  - Increment the problem size by 1 each time
- An inner loop takes the maximum among all possible first cuts
  - All smaller sub-problems are already computed
- Time complexity: O(n^2); space complexity: O(n)
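A sketch of the bottom-up version (assumed names; prices as above). Because sizes are processed in increasing order, every smaller answer is already in r when needed:

    p = {1: 1, 2: 5, 3: 8, 4: 9}  # assumed prices, as in the earlier sketches

    def cut_rod_bottom_up(n):
        """Bottom-up DP: O(n^2) time, O(n) space, no recursion."""
        r = [0] * (n + 1)                            # r[0] = 0: size 0, revenue 0
        for size in range(1, n + 1):                 # grow the problem size by 1
            r[size] = max(p[i] + r[size - i]         # max over all first cuts;
                          for i in range(1, size + 1))  # smaller sizes are done
        return r[n]

    print(cut_rod_bottom_up(4))  # -> 10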

Reconstructing the Optimal Solution
- So far, the algorithm reports the optimal revenue but not how to achieve it
  - It tells you that for size 4 the optimal revenue is 10, and nothing more
- We need extra space: store only the 1st-cut position for each size (not all positions)
  - S[4] = 2 means the 1st-cut position for size 4 is 2
  - That leaves two pieces, each of size 2; to find their cuts, look up S[2]
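A sketch of this extension (names assumed; prices as above): alongside r it records the 1st-cut position for each size in s, then walks s to recover the actual pieces.

    p = {1: 1, 2: 5, 3: 8, 4: 9}  # assumed prices, as in the earlier sketches

    def cut_rod_extended(n):
        """Bottom-up DP that also records s[size], the best 1st-cut position."""
        r = [0] * (n + 1)
        s = [0] * (n + 1)
        for size in range(1, n + 1):
            for i in range(1, size + 1):
                if p[i] + r[size - i] > r[size]:
                    r[size] = p[i] + r[size - i]
                    s[size] = i                      # remember the 1st cut
        return r, s

    def reconstruct(n):
        """Walk the stored 1st-cut positions to list the actual pieces."""
        r, s = cut_rod_extended(n)
        total, pieces = r[n], []
        while n > 0:
            pieces.append(s[n])                      # 1st piece for this size
            n -= s[n]                                # continue with the rest
        return total, pieces

    print(reconstruct(4))  # -> (10, [2, 2]): S[4] = 2, then S[2] = 2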

Code Extension

The extended code tabulates, for each problem size, the max revenue and the 1st-cut position.