1 CSE 417: Algorithms and Computational Complexity Winter 2001 Lecture 6 Instructor: Paul Beame TA: Gidon Shavit

2 Algorithm Design Techniques  Dynamic Programming  Given a solution to a problem in terms of smaller sub-problems, e.g. a recursive solution  Useful when the same sub-problems show up again and again in the solution

3 A simple case: Computing Fibonacci Numbers  Recall F(n) = F(n-1) + F(n-2) and F(0) = 0, F(1) = 1  Recursive algorithm:  Fibo(n): if n = 0 then return(0) else if n = 1 then return(1) else return(Fibo(n-1) + Fibo(n-2))
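The recursive algorithm on this slide can be sketched directly in Python (a transcription for illustration, not code from the lecture):

```python
def fibo(n):
    # Direct transcription of the recursive definition:
    # F(0) = 0, F(1) = 1, F(n) = F(n-1) + F(n-2).
    if n == 0:
        return 0
    elif n == 1:
        return 1
    else:
        return fibo(n - 1) + fibo(n - 2)
```

As the call trees on the next slides show, this version recomputes the same sub-problems many times and takes exponential time.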

4 Call tree - start  [recursion-tree diagram: F(6) calls F(5) and F(4); F(5) calls F(4) and F(3); already the subtrees repeat calls such as F(4), F(3), F(1), F(0)]

5 Full call tree  [full recursion-tree diagram for F(6): the leaves are F(1) = 1 and F(0) = 0, and subtrees such as F(3) and F(2) appear many times over]

6 Memo-ization  Remember all values from previous recursive calls  Before recursive call, test to see if value has already been computed  Dynamic Programming  Convert memo-ized algorithm from a recursive one to an iterative one
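The memo-ization idea on this slide can be sketched in Python (an illustrative helper, not code from the lecture): before each recursive call, check whether the value was already computed.

```python
def fibo_memo(n, memo=None):
    # Remember all values from previous recursive calls in a dictionary;
    # before recursing, test whether the value has already been computed.
    if memo is None:
        memo = {0: 0, 1: 1}
    if n not in memo:
        memo[n] = fibo_memo(n - 1, memo) + fibo_memo(n - 2, memo)
    return memo[n]
```

Each F(i) is now computed only once, so the running time drops from exponential to linear in n.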

7 Fibonacci - Dynamic Programming Version  FiboDP(n): F[0] ← 0 F[1] ← 1 for i = 2 to n do F[i] ← F[i-1] + F[i-2] endfor return(F[n])
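FiboDP converts the memo-ized recursion into an iterative, bottom-up loop; a Python sketch:

```python
def fibo_dp(n):
    # Iterative (bottom-up) version: fill the table F[0..n] in order,
    # so every sub-problem is solved before it is needed.
    if n == 0:
        return 0
    F = [0] * (n + 1)
    F[1] = 1
    for i in range(2, n + 1):
        F[i] = F[i - 1] + F[i - 2]
    return F[n]
```

The table makes the order of evaluation explicit: each entry depends only on entries already filled in.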

8 Dynamic Programming  Useful when  the same recursive sub-problems occur repeatedly  we can anticipate the parameters of these recursive calls  the solution to the whole problem can be figured out without knowing the internal details of how the sub-problems are solved  the principle of optimality

9 List partition problem  Given: a sequence of n positive integers s_1,...,s_n and a positive integer k  Find: a partition of the list into up to k blocks: s_1,...,s_{i_1} | s_{i_1+1},...,s_{i_2} | ... | s_{i_{k-1}+1},...,s_n so that the sum of the numbers in the largest block is as small as possible, i.e. find spots for up to k-1 dividers

10 Greedy approach  Ideal block size would be P = ⌈(s_1 + ... + s_n)/k⌉  Greedy: walk along until what you have so far adds up to P, then insert a divider  Problem: it may not be exact (or correct)  ... | ... | ...  sum is 4800 so the largest block must have size at least 1600
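The greedy heuristic can be sketched in Python (a hypothetical helper for illustration; the lecture only describes the idea). On 1,...,9 with k = 3 it closes the first block too late and produces a largest block of 21, worse than the optimum of 17, illustrating why greedy is not correct here:

```python
import math

def greedy_partition(s, k):
    # Greedy heuristic: aim for blocks of size about P = ceil(sum/k);
    # close the current block once its running sum reaches P.
    # This can miss the optimum, as the slide notes.
    P = math.ceil(sum(s) / k)
    blocks, current = [], []
    for x in s:
        current.append(x)
        if sum(current) >= P and len(blocks) < k - 1:
            blocks.append(current)
            current = []
    if current:
        blocks.append(current)
    return blocks
```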

11 Recursive solution  Try all possible values for the position of the last divider  For each position of this last divider  there are k-2 other dividers that must divide the list of numbers prior to the last divider as evenly as possible  s_1,...,s_{i_1} | s_{i_1+1},...,s_{i_2} | ... | s_{i_{k-1}+1},...,s_n  a recursive sub-problem of the same type

12 Recursive idea  Let M[n,k] be the smallest cost (size of largest block) of any partition of s_1,...,s_n into k pieces.  If the spot between the i-th and (i+1)-st numbers is the best position for the last divider, then M[n,k] = max( M[i,k-1], s_{i+1} + ... + s_n )  In general  M[n,k] = min_{i<n} max( M[i,k-1], s_{i+1} + ... + s_n )  where s_{i+1} + ... + s_n is the cost of the last block and M[i,k-1] is the max cost of the first k-1 blocks

13 Time-saving - prefix sums  Computing the costs of the blocks may be expensive and involve repeated work  Idea: pre-compute prefix sums  p[1] = s_1, p[2] = s_1 + s_2, p[3] = s_1 + s_2 + s_3, ..., p[n] = s_1 + s_2 + ... + s_n  cost: n additions, space n  Sum of block s_{i+1},...,s_j is just p[j] - p[i]
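The prefix-sum trick on this slide is a few lines of Python (an illustrative helper; indices follow the slide's 1-based convention, with p[0] = 0):

```python
def prefix_sums(s):
    # p[0] = 0 and p[i] = s_1 + ... + s_i, computed with n additions.
    p = [0]
    for x in s:
        p.append(p[-1] + x)
    return p
```

With the table in hand, the sum of any block s_{i+1},...,s_j is the constant-time difference p[j] - p[i].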

14 Linear Partition Algorithm  Partition(S,k): p[0] ← 0; for i = 1 to n do p[i] ← p[i-1] + s_i  for i = 1 to n do M[i,1] ← p[i]  for j = 1 to k do M[1,j] ← s_1  for i = 2 to n do for j = 2 to k do M[i,j] ← min_{pos<i} { max(M[pos,j-1], p[i] - p[pos]) } D[i,j] ← value of pos where min is achieved

15 Linear Partition Algorithm  Partition(S,k): p[0] ← 0; for i = 1 to n do p[i] ← p[i-1] + s_i  for i = 1 to n do M[i,1] ← p[i]  for j = 1 to k do M[1,j] ← s_1  for i = 2 to n do for j = 2 to k do M[i,j] ← ∞ for pos = 1 to i-1 do s ← max(M[pos,j-1], p[i] - p[pos]) if M[i,j] > s then M[i,j] ← s; D[i,j] ← pos
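The full Partition algorithm translates to Python as follows (a sketch following the slide's pseudocode, using 0-based Python lists for a 1-indexed table; only the optimal cost M[n,k] is returned, while D records divider positions for reconstructing the partition):

```python
def partition(s, k):
    # Dynamic-programming solution to the list partition problem.
    # M[i][j] = smallest possible cost (size of the largest block)
    # of splitting s_1..s_i into j blocks.
    n = len(s)
    p = [0] * (n + 1)                       # prefix sums: p[i] = s_1 + ... + s_i
    for i in range(1, n + 1):
        p[i] = p[i - 1] + s[i - 1]
    INF = float('inf')
    M = [[INF] * (k + 1) for _ in range(n + 1)]
    D = [[0] * (k + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        M[i][1] = p[i]                      # one block: the whole prefix
    for j in range(1, k + 1):
        M[1][j] = s[0]                      # one element: cost is s_1
    for i in range(2, n + 1):
        for j in range(2, k + 1):
            for pos in range(1, i):
                cost = max(M[pos][j - 1], p[i] - p[pos])
                if cost < M[i][j]:
                    M[i][j] = cost
                    D[i][j] = pos           # remember where the min is achieved
    return M[n][k]
```

On 1,...,9 with k = 3 this returns 17 (blocks 1..5 | 6,7 | 8,9), beating the greedy heuristic's 21.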