Recurrences (in color) It continues…

Recurrences
When an algorithm calls itself recursively, its running time is described by a recurrence. A recurrence describes a function in terms of its value on smaller inputs. There are three methods of solving these: substitution, the recursion tree, and the master method.

What it looks like
This is the recurrence of MERGE-SORT. It says that the time involved is Θ(1) if n = 1; otherwise, it is twice the time to sort an array of half the size, plus Θ(n) time to merge the sorted sub-arrays:
T(n) = Θ(1)            if n = 1
T(n) = 2T(n/2) + Θ(n)  if n > 1
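To see where this recurrence comes from, here is a minimal merge sort sketch (not from the original slides; the structure and names are illustrative), with comments marking which term of the recurrence each part contributes:

```python
def merge_sort(a):
    # Base case: a list of length 0 or 1 is already sorted -- the Theta(1) term.
    if len(a) <= 1:
        return a
    mid = len(a) // 2
    # Two recursive calls on halves of the input -- the 2T(n/2) term.
    left = merge_sort(a[:mid])
    right = merge_sort(a[mid:])
    # Merge the two sorted halves in linear time -- the Theta(n) term.
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged
```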

Substitution
Similar to induction: guess a solution, then prove it holds for the next call. A powerful method, but it can sometimes be difficult to guess the solution!

Substitution
Example: T(n) = 2T(n/2) + n
Guess that T(n) = O(n lg n).
We must prove that T(n) ≤ cn lg n for an appropriate constant c > 0.
Assume the bound holds for n/2 as well: T(n/2) ≤ c(n/2) lg(n/2). Then
T(n) ≤ 2(c(n/2) lg(n/2)) + n
     = cn lg(n/2) + n
     = cn (lg n - lg 2) + n
     = cn lg n - cn + n
     ≤ cn lg n,  ∀ c ≥ 1
Note: ∀ means 'for all'
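As a quick sanity check on the guess (a numeric sketch, not a proof; the base case T(1) = 1 and rounding n/2 down are assumptions not stated on the slide, and c = 2 is just one constant that happens to work):

```python
import math
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    # Exact value of the recurrence T(n) = 2T(n/2) + n,
    # with n/2 rounded down and T(1) = 1 assumed as the base case.
    if n == 1:
        return 1
    return 2 * T(n // 2) + n

# Check T(n) <= c * n * lg n for c = 2 (start at n = 2, since lg 1 = 0).
c = 2
for n in [2, 16, 100, 1000, 10**5]:
    bound = c * n * math.log2(n)
    print(n, T(n), round(bound), T(n) <= bound)
```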

Subtleties
Let T(n) = T(⌈n/2⌉) + T(⌊n/2⌋) + 1
Assume that T(n) = O(n), so that T(⌈n/2⌉) ≤ c⌈n/2⌉ and T(⌊n/2⌋) ≤ c⌊n/2⌋. Then
T(n) ≤ c⌈n/2⌉ + c⌊n/2⌋ + 1
     = cn + 1   (note there is an extra "+1"!)
which does not imply T(n) ≤ cn. Here we're correct, but off by a constant!

Subtleties
We strengthen our guess: T(n) ≤ cn - b. Then
T(n) ≤ (c⌈n/2⌉ - b) + (c⌊n/2⌋ - b) + 1
     = cn - 2b + 1
     ≤ cn - b,  ∀ b ≥ 1
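A numeric check of the strengthened guess (a sketch with assumptions the slides do not state: base case T(1) = 1, and the recurrence read as T(n) = T(⌈n/2⌉) + T(⌊n/2⌋) + 1; with c = 2 and b = 1 the bound holds with equality):

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    # T(n) = T(ceil(n/2)) + T(floor(n/2)) + 1, with T(1) = 1 assumed.
    if n == 1:
        return 1
    return T((n + 1) // 2) + T(n // 2) + 1

c, b = 2, 1
for n in [2, 7, 64, 1000, 10**6]:
    print(n, T(n), c * n - b, T(n) <= c * n - b)
```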

One Last Example
Original equation: T(n) = 2T(√n) + lg n
Let m = lg n, so n = 2^m and T(2^m) = 2T(2^(m/2)) + m.
Let S(m) = T(2^m). Then S(m) = 2S(m/2) + m.
We know S(m) = O(m lg m), so
T(n) = T(2^m) = S(m) = O(m lg m) = O(lg n lg lg n)
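A rough numeric check of the change of variables (a sketch; the base case T(2) = 1 and rounding √n down to an integer are assumptions not specified on the slide). The ratio T(n) / (lg n · lg lg n) should stay bounded as n grows:

```python
import math
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    # T(n) = 2T(sqrt(n)) + lg n, with sqrt(n) rounded down and T(2) = 1 assumed.
    if n <= 2:
        return 1
    return 2 * T(math.isqrt(n)) + math.log2(n)

for k in [4, 8, 16, 32, 64]:
    n = 2**k
    ratio = T(n) / (math.log2(n) * math.log2(math.log2(n)))
    print(f"n = 2^{k}: ratio = {ratio:.3f}")
```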

Recursion Tree Method
A recursion tree is built, and we sum up the cost at each level. Total cost = the sum of the per-level costs (when every level costs the same, this is the number of levels × the cost per level). Usually used to generate a good guess for the substitution method, though it can also be used as a direct proof.
Example: T(n) = 3T(n/4) + Θ(n^2)

Expanding the recursion tree for T(n) = 3T(n/4) + cn^2:
Level 0: the root costs cn^2 and has three children of size n/4
Level 1: three nodes, each costing c(n/4)^2, for a level total of (3/16)cn^2
Level 2: nine nodes, each costing c(n/16)^2, for a level total of (3/16)^2 cn^2
…continuing until the subproblems reach T(1) at the leaves, which contribute Θ(n^(log_4 3)) in total

Questions
How many levels does this tree have? The subproblem size at depth i is n/4^i, so it hits size 1 when n/4^i = 1, i.e. n = 4^i, i.e. i = log_4 n. Therefore the tree has log_4 n + 1 levels (depths 0, 1, 2, …, log_4 n).
There are 3^i nodes at depth i, so the cost of level i is 3^i · c(n/4^i)^2.
The last level has 3^(log_4 n) = n^(log_4 3) nodes.
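The per-level costs read off the tree can be reproduced directly (a sketch, not part of the original slides): level i has 3^i nodes of size n/4^i, so it costs (3/16)^i · cn^2, the geometric series sums to a constant times n^2, and this dominates the Θ(n^(log_4 3)) contribution of the leaves. That is the guess the tree suggests for the substitution method: T(n) = O(n^2).

```python
import math

def level_costs(n, c=1.0):
    # Level i of the tree for T(n) = 3T(n/4) + c*n^2 has 3^i subproblems,
    # each of size n/4^i and cost c*(n/4^i)^2 = (3/16)^i * c * n^2.
    costs = []
    i, size = 0, n
    while size >= 1:
        costs.append((3**i) * c * size * size)
        i, size = i + 1, size / 4
    return costs

n = 4**8
total = sum(level_costs(n))
# The per-level costs form a geometric series with ratio 3/16, which sums
# to about (16/13) * c * n^2, so the whole tree is Theta(n^2).
print(total / n**2, 16 / 13)
# The leaves contribute Theta(n^(log_4 3)) ~ n^0.79 -- negligible next to n^2.
print(round(n ** math.log(3, 4)), n**2)
```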

The Master Method
When the recurrence has the form T(n) = aT(n/b) + f(n):
Case 1: if f(n) = O(n^(log_b a - ε)) for some constant ε > 0, then T(n) = Θ(n^(log_b a))
Case 2: if f(n) = Θ(n^(log_b a)), then T(n) = Θ(n^(log_b a) lg n)
Case 3: if f(n) = Ω(n^(log_b a + ε)) for some constant ε > 0, and if a·f(n/b) ≤ c·f(n) for some constant c < 1 and all sufficiently large n, then T(n) = Θ(f(n))
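The case analysis can be wrapped in a small helper for the common situation where f(n) is a pure power n^k (a sketch with assumptions: the name master_method is invented here, it only compares k against log_b a, it ignores polylogarithmic factors, and the regularity condition of case 3 holds automatically for f(n) = n^k with k > log_b a):

```python
import math

def master_method(a, b, k):
    """Classify T(n) = a*T(n/b) + Theta(n^k) by the master theorem.

    Assumes f(n) is exactly a power n^k; returns a Theta bound as a string.
    """
    crit = math.log(a, b)  # the critical exponent log_b a
    if abs(k - crit) < 1e-9:
        # Case 2: f(n) = Theta(n^(log_b a))
        return f"Theta(n^{crit:g} * lg n)"
    if k < crit:
        # Case 1: f(n) is polynomially smaller than n^(log_b a)
        return f"Theta(n^{crit:g})"
    # Case 3: f(n) is polynomially larger; regularity holds for pure powers
    return f"Theta(n^{k:g})"
```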

Example
T(n) = 9T(n/3) + n
a = 9, b = 3, f(n) = n, so n^(log_b a) = n^(log_3 9) = n^2
f(n) = n = O(n^(log_3 9 - ε)) with ε = 1
So case 1 applies, and T(n) = Θ(n^2)
T(n) = T(2n/3) + 1
a = 1, b = 3/2, f(n) = 1, so n^(log_b a) = n^(log_(3/2) 1) = n^0 = 1
Case 2 applies, and T(n) = Θ(lg n)
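Running the hypothetical master_method helper sketched after the master method slide on these two examples (f(n) is given by its exponent k) reproduces the same answers:

```python
# T(n) = 9T(n/3) + n:  a = 9, b = 3, f(n) = n^1
print(master_method(9, 3, 1))     # Theta(n^2)                         -- case 1

# T(n) = T(2n/3) + 1:  a = 1, b = 3/2, f(n) = n^0
print(master_method(1, 1.5, 0))   # Theta(n^0 * lg n), i.e. Theta(lg n) -- case 2
```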

Example…
T(n) = 3T(n/4) + n lg n
a = 3, b = 4, f(n) = n lg n
n^(log_b a) = n^(log_4 3) = O(n^0.793)
f(n) = Ω(n^(log_4 3 + ε)) where ε ≈ 0.2 (solve for it)
For large n, a·f(n/b) = 3(n/4) lg(n/4) ≤ (3/4) n lg n = c·f(n) for c = 3/4
Case 3 applies, so T(n) = Θ(n lg n)

When it doesn't work...
T(n) = 2T(n/2) + n lg n
a = 2, b = 2, f(n) = n lg n
You would think that case 3 should apply, since f(n) > n^(log_b a), i.e. n lg n > n.
But f(n) is not polynomially larger: (n lg n)/n = lg n, which is asymptotically less than n^ε for any constant ε > 0.
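A quick numeric illustration of "not polynomially larger" (a sketch; ε = 0.1 is an arbitrary choice): for case 3 to apply, the ratio f(n)/n^(log_b a) = lg n would have to grow at least as fast as n^ε for some ε > 0, but it eventually falls below any such power.

```python
# Writing n = 2^k, so lg n = k: compare lg n against n^eps for eps = 0.1.
eps = 0.1
for k in [10, 20, 40, 80, 160, 320]:
    n_eps = 2 ** (k * eps)  # n^eps when n = 2^k
    print(f"lg n = {k:3d},  n^eps = {n_eps:.4g},  lg n / n^eps = {k / n_eps:.3g}")
```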