Analyzing Recursive Algorithms


Analyzing Recursive Algorithms

A recursive algorithm can often be described by a recurrence equation that describes the overall runtime on a problem of size n in terms of the runtime on smaller inputs. For divide-and-conquer algorithms, we get recurrences like:

T(n) = Θ(1)                        if n ≤ c
T(n) = aT(n/b) + D(n) + C(n)       otherwise

where
a    = number of sub-problems
n/b  = size of each sub-problem
D(n) = time to divide the size-n problem into sub-problems
C(n) = time to combine the sub-problem solutions

Solving Recurrences (Ch. 4)

We will use the following three methods to solve recurrences:
1. Backward substitution: find a pattern and convert it to a summation.
2. Substitution method: make a guess and prove it is correct using induction.
3. The Master Theorem: if the recurrence has the form T(n) = aT(n/b) + f(n), then there is a formula that can (often) be applied, given in Section 4.3.

To make the solutions simpler, we will assume T(n) = Θ(1) for n small enough.

The Factorial Problem

Algorithm F(n)
Input: a nonnegative integer n
Output: n!
1. if n = 0
2.     return 1
3. else
4.     return F(n-1) * n

Recurrence: T(n) = T(n-1) + 1, with T(0) = 0.
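The pseudocode above maps directly onto Python; a minimal sketch (the function name is mine, not from the slides; the single multiplication on the recursive line is the "+1" in the recurrence):

```python
def factorial(n):
    """Compute n! recursively; mirrors Algorithm F(n) above."""
    if n == 0:                      # base case: T(0) = 0 multiplications
        return 1
    return factorial(n - 1) * n     # one multiplication per level: T(n) = T(n-1) + 1
```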

Analyzing Recursive Algorithms

For recursive algorithms such as computing the factorial of n, we get an expression like the following:

T(n) = 1                           if n = 0
T(n) = T(n-1) + D(n) + C(n)        otherwise

where
n-1  = size of the sub-problem
D(n) = time to divide the size-n problem into sub-problems
C(n) = time to combine the sub-problem solutions

Solving the Recurrence for n!

Algorithm F(n)
Input: a nonnegative integer n
Output: n!
1. if n = 0
2.     return 1
3. else
4.     return F(n-1) * n

T(n) = T(n-1) + 1, T(0) = 0

We can solve this recurrence (i.e., find an expression for the running time that is not given in terms of itself) using backward substitution:

T(n) = T(n-1) + 1                          substitute T(n-1) = T(n-2) + 1
     = [T(n-2) + 1] + 1 = T(n-2) + 2       substitute T(n-2) = T(n-3) + 1
     = [T(n-3) + 1] + 2 = T(n-3) + 3
     ...
     = T(n-i) + i
     ...
     = T(n-n) + n = 0 + n = n

Therefore, this algorithm has linear running time.
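To see that T(n) = n really counts the multiplications performed, we can instrument the algorithm; a small sketch (the function name and the returned pair are mine, not part of the slides):

```python
def factorial_counted(n):
    """Return (n!, number of multiplications) to check that T(n) = n."""
    if n == 0:
        return 1, 0                 # base case: no multiplications, T(0) = 0
    value, mults = factorial_counted(n - 1)
    return value * n, mults + 1     # one multiplication per recursive level
```

For any n, the multiplication count equals n, matching the closed form derived above.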

Solving Recurrences: Backward Substitution

Example: T(n) = 2T(n/2) + n   (look familiar?)

T(n) = 2T(n/2) + n
     = 2[2T(n/4) + n/2] + n                    expand T(n/2)
     = 4T(n/4) + n + n                         simplify
     = 4[2T(n/8) + n/4] + n + n                expand T(n/4)
     = 8T(n/8) + n + n + n                     simplify ... notice a pattern?
     = ...
     = 2^(lg n)·T(n/2^(lg n)) + n + n + ... + n    after lg n iterations
     = c·2^(lg n) + n·lg n
     = cn + n·lg n                             since 2^(lg n) = n^(lg 2) = n
     = Θ(n lg n)
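The closed form cn + n·lg n can be checked against a direct evaluation of the recurrence; a sketch assuming T(1) = 1 (so c = 1) and n a power of 2:

```python
def T(n):
    """Evaluate T(n) = 2*T(n//2) + n with T(1) = 1, for n a power of 2."""
    if n == 1:
        return 1
    return 2 * T(n // 2) + n

# Backward substitution gives T(n) = n*lg(n) + c*n; here c = T(1) = 1.
for k in range(1, 11):
    n = 2 ** k
    assert T(n) == n * k + n    # n*lg(n) + n, since lg(2^k) = k
```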

Solving Recurrences: Backward Substitution

Example: T(n) = 4T(n/2) + n

T(n) = 4T(n/2) + n
     = 4[4T(n/4) + n/2] + n                    expand T(n/2)
     = 16T(n/4) + 2n + n                       simplify (16 = 4^2)
     = 16[4T(n/8) + n/4] + 2n + n              expand T(n/4)
     = 64T(n/8) + 4n + 2n + n                  simplify (64 = 4^3)
     = ...
     = 4^(lg n)·T(n/2^(lg n)) + ... + 4n + 2n + n    after lg n iterations
     = c·4^(lg n) + n(2^(lg n) - 1)            convert the sum to closed form
     = c·n^(lg 4) + n(n - 1)                   since 4^(lg n) = n^(lg 4) = n^2 and 2^(lg n) = n
     = cn^2 + n(n - 1)
     = Θ(n^2)
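The closed form cn^2 + n(n-1) can also be checked numerically; a sketch taking the base case T(1) = c = 1:

```python
def T(n):
    """Evaluate T(n) = 4*T(n//2) + n with T(1) = 1, for n a power of 2."""
    if n == 1:
        return 1
    return 4 * T(n // 2) + n

# Backward substitution gives T(n) = c*n^2 + n*(n-1); here c = T(1) = 1.
for k in range(1, 11):
    n = 2 ** k
    assert T(n) == n * n + n * (n - 1)
```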

Solving Recurrences: Backward Substitution

Example: T(n) = T(n/2) + 1

T(n) = T(n/2) + 1
     = [T(n/4) + 1] + 1                        expand T(n/2)
     = T(n/4) + 2                              simplify
     = [T(n/8) + 1] + 2                        expand T(n/4)
     = T(n/8) + 3                              simplify
     = ...
     = T(n/2^(lg n)) + lg n                    after lg n iterations; 2^(lg n) = n
     = c + lg n
     = Θ(lg n)

Binary Search (non-recursive)

Algorithm BinarySearch(A[1…n], k)
Input: a sorted array A of n comparable items and search key k
Output: index of the array element that is equal to k, or -1 if k is not found
1. l = 1; r = n
2. while l ≤ r
3.     m = ⌊(l + r)/2⌋
4.     if k = A[m] return m
5.     else if k < A[m] r = m - 1
6.     else l = m + 1
7. return -1

What is the running time of this algorithm for an input of size n? Are there best- and worst-case input instances?
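In Python the same loop looks like this (a sketch using 0-based indices, unlike the 1-based pseudocode; the function name is mine):

```python
def binary_search(a, k):
    """Iterative binary search; returns the index of k in sorted list a, or -1."""
    l, r = 0, len(a) - 1
    while l <= r:
        m = (l + r) // 2
        if a[m] == k:
            return m                # best case: the first probe hits (Theta(1))
        elif k < a[m]:
            r = m - 1               # discard the right half
        else:
            l = m + 1               # discard the left half
    return -1                       # worst case: Theta(lg n) probes, k absent
```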

Binary Search (recursive)

Algorithm BinarySearchRec(A[1…n], k, l, r)
Input: a sorted array A of n comparable items, search key k, leftmost and rightmost index positions in A
Output: index of the array element that is equal to k, or -1 if k is not found
1. if l > r return -1
2. else
3.     m = ⌊(l + r)/2⌋
4.     if k = A[m] return m
5.     else if k < A[m] return BinarySearchRec(A, k, l, m-1)
6.     else return BinarySearchRec(A, k, m+1, r)
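A recursive version mirroring BinarySearchRec, again with 0-based indices; each call discards half the range, giving the recurrence T(n) = T(n/2) + Θ(1):

```python
def binary_search_rec(a, k, l, r):
    """Recursive binary search on a[l..r]; recurrence T(n) = T(n/2) + Theta(1)."""
    if l > r:
        return -1                   # empty range: k is not present
    m = (l + r) // 2
    if a[m] == k:
        return m
    elif k < a[m]:
        return binary_search_rec(a, k, l, m - 1)   # search the left half
    else:
        return binary_search_rec(a, k, m + 1, r)   # search the right half
```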

Substitution Method
1. Guess the solution.
2. Use induction to find the constants and show that the solution works.

Substitution Method: If the recurrence has an exact function (with no asymptotic terms), the solution must be exact and a full inductive proof must be done (with basis, IHOP, and inductive step). If the recurrence has an asymptotic term, we express the solution by asymptotic notation and we don’t worry about base cases in the substitution proof.

Substitution Method

Example: T(n) = 1 if n = 1, and T(n) = 2T(n/2) + n if n > 1.
1. Guess: T(n) = n·lg n + n.
2. Induction:
Basis: n = 1 => n·lg n + n = 0 + 1 = 1 = T(1).
IHOP: assume T(k) = k·lg k + k for all k < n.
Now use this IHOP for T(n/2):
T(n) = 2T(n/2) + n
     = 2[(n/2)·lg(n/2) + n/2] + n      by the IHOP
     = n·lg(n/2) + n + n
     = n(lg n - 1) + 2n
     = n·lg n - n + 2n
     = n·lg n + n

Substitution Method: Guessing an Upper Bound

Give an upper bound on the recurrence T(n) = 2T(n/2) + Θ(n).

Guess T(n) = O(n lg n) by showing that T(n) ≤ d·n·lg n for some d > 0. For easier analysis, assume that n = 2^k.

IHOP: assume T(n/2) ≤ d(n/2)·lg(n/2).
T(n) ≤ 2(d(n/2)·lg(n/2)) + cn          using the IHOP
     = dn·lg(n/2) + cn
     = dn·lg n - dn·lg 2 + cn
     = dn·lg n - dn + cn
     ≤ dn·lg n                         if -dn + cn ≤ 0, i.e., d ≥ c
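As a sanity check on the bound T(n) ≤ d·n·lg n, we can evaluate the recurrence exactly for powers of 2; a sketch taking both the base case and the Θ(n) constant c to be 1, so any d ≥ c works (here d = 2 already holds from n = 2 on):

```python
import math

def T(n):
    """T(n) = 2*T(n//2) + n with T(1) = 1 (c = 1 in the Theta(n) term)."""
    return 1 if n == 1 else 2 * T(n // 2) + n

d = 2                               # any d >= c suffices for large enough n
for k in range(1, 13):
    n = 2 ** k
    assert T(n) <= d * n * math.log2(n)
```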

Substitution Method: Guessing a Lower Bound

Give a lower bound on the recurrence T(n) = 2T(n/2) + Θ(n).

Guess T(n) = Ω(n lg n) by showing that T(n) ≥ d·n·lg n for some d > 0. For easier analysis, assume that n = 2^k.

IHOP: assume T(n/2) ≥ d(n/2)·lg(n/2).
T(n) ≥ 2(d(n/2)·lg(n/2)) + cn
     = dn·lg n - dn·lg 2 + cn
     = dn·lg n - dn + cn
     ≥ dn·lg n                         if -dn + cn ≥ 0, i.e., d ≤ c

Substitution Method: Guessing an Upper Bound

Give an upper bound on the recurrence T(n) = T(n/2) + 1.

Guess T(n) = O(lg n) by showing that T(n) ≤ d·lg n for some d > 0. For easier analysis, assume that n = 2^k.

IHOP: assume T(n/2) ≤ d·lg(n/2).
T(n) ≤ d·lg(n/2) + c
     = d·lg n - d·lg 2 + c
     = d·lg n - d + c
     ≤ d·lg n                          if -d + c ≤ 0, i.e., d ≥ c

Substitution Method: Guessing a Lower Bound

Give a lower bound on the recurrence T(n) = T(n/2) + 1.

Guess T(n) = Ω(lg n) by showing that T(n) ≥ d·lg n for some d > 0. For easier analysis, assume that n = 2^k.

IHOP: assume T(n/2) ≥ d·lg(n/2).
T(n) ≥ d·lg(n/2) + c
     = d·lg n - d·lg 2 + c
     = d·lg n - d + c
     ≥ d·lg n                          if -d + c ≥ 0, i.e., d ≤ c

Using Induction to Prove a Tight Bound

Suppose T(n) = 1 if n = 2, and T(n) = T(n/2) + 1 if n = 2^k for k > 1. Show T(n) = lg n by induction on the exponent k.

Basis: when k = 1, n = 2, and T(2) = 1 = lg 2.
IHOP: assume T(2^k) = lg 2^k = k for some k ≥ 1.
Inductive step: show T(2^(k+1)) = lg(2^(k+1)) = k + 1.
T(2^(k+1)) = T(2^k) + 1      given
           = (lg 2^k) + 1    by the IHOP
           = k + 1
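The claim T(2^k) = k can be checked directly; a sketch with the base case T(2) = 1:

```python
def T(n):
    """T(2) = 1, T(n) = T(n//2) + 1 for n = 2^k with k > 1."""
    return 1 if n == 2 else T(n // 2) + 1

# T(2^k) = lg(2^k) = k, matching the induction above.
for k in range(1, 16):
    assert T(2 ** k) == k
```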

Using Induction to Prove an Upper Bound

Suppose T(n) = 1 if n = 1, and T(n) = T(n-1) + Θ(n) for n > 1. Show T(n) = O(n^2) by induction.

IHOP: assume T(j) ≤ j^2 for all j < k.
Inductive step: show T(k) ≤ k^2.
T(k) = T(k-1) + k        given (taking the Θ(n) term to be n)
     ≤ (k-1)^2 + k       by the IHOP
     = k^2 - k + 1
     ≤ k^2               for k ≥ 1
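The bound can be checked numerically; a sketch taking the Θ(n) term to be exactly n, so the exact solution is the arithmetic series n(n+1)/2:

```python
def T(n):
    """T(1) = 1, T(n) = T(n-1) + n (taking the Theta(n) term to be exactly n)."""
    return 1 if n == 1 else T(n - 1) + n

# Exact solution n(n+1)/2, which is indeed O(n^2):
for n in range(1, 200):
    assert T(n) == n * (n + 1) // 2
    assert T(n) <= n * n           # n(n+1)/2 <= n^2 for n >= 1
```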

How Do You Come Up with a Good Guess?

Example: T(n) = T(n/3) + T(2n/3) + Θ(n)

The recursion tree shows us the cost at each level of recursion. The leftmost branch peters out after log_3 n levels; the rightmost branch peters out after log_{3/2} n levels. So the tree has Θ(lg n) levels with cost at most n at each level, suggesting the guess T(n) = O(n lg n).
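The guess can be sanity-checked by evaluating the recurrence on integer subproblem sizes; a sketch (base case T(n) = 1 for n ≤ 1 and the bound constant 3 are my choices, not from the slides):

```python
import math

def T(n):
    """T(n) = T(n/3) + T(2n/3) + n with integer sizes; T(n) = 1 for n <= 1."""
    return 1 if n <= 1 else T(n // 3) + T(2 * n // 3) + n

# Every full level of the recursion tree contributes about n, over Theta(lg n) levels:
for k in range(3, 11):
    n = 2 ** k
    assert T(n) <= 3 * n * math.log2(n)
```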

Solving Recurrences: Master Method (§4.3)

The master method provides a 'cookbook' method for solving recurrences of a certain form.

Master Theorem: Let a ≥ 1 and b > 1 be constants, let f(n) be a function, and let T(n) be defined on the nonnegative integers as:

T(n) = aT(n/b) + f(n)

Recall that a is the number of sub-problems and n/b is the size of each sub-problem; f(n) is the cost to divide and recombine for a solution. Then T(n) can be bounded asymptotically as follows:

1. T(n) = Θ(n^(log_b a))        if f(n) = O(n^(log_b a - ε)) for some constant ε > 0
2. T(n) = Θ(n^(log_b a)·lg n)   if f(n) = Θ(n^(log_b a))
3. T(n) = Θ(f(n))               if f(n) = Ω(n^(log_b a + ε)) for some constant ε > 0

Solving Recurrences: Master Method

Intuition: compare f(n) with Θ(n^(log_b a)).

case 1: f(n) is "polynomially smaller than" Θ(n^(log_b a))
case 2: f(n) is "asymptotically equal to" Θ(n^(log_b a))
case 3: f(n) is "polynomially larger than" Θ(n^(log_b a))

What is log_b a? Recall that T(n) = aT(n/b) + f(n), and that the recursion ends when the subproblems reach constant size, after log_b n levels; by then the number of subproblems has grown to a^(log_b n) = n^(log_b a). To get a tight bound, we need to check which is polynomially larger, f(n) or n^(log_b a).

Master Method Restated

Master Theorem: If T(n) = aT(n/b) + O(n^d) for some constants a ≥ 1, b > 1, d ≥ 0, then

T(n) = Θ(n^(log_b a))   if d < log_b a  (a > b^d)
T(n) = Θ(n^d·lg n)      if d = log_b a  (a = b^d)
T(n) = Θ(n^d)           if d > log_b a  (a < b^d)

Why? The proof uses a recursion tree argument (§4.6).
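The restated theorem is mechanical enough to encode directly; a sketch (the function name and the returned strings are illustrative, not from the slides), comparing d with log_b a via a versus b^d:

```python
import math

def master(a, b, d):
    """Classify T(n) = a*T(n/b) + Theta(n^d) by the restated master theorem.
    Assumes a >= 1, b > 1, d >= 0; returns the asymptotic bound as a string."""
    if a > b ** d:                                   # d < log_b(a): leaves dominate
        return "Theta(n^{:.3g})".format(math.log(a, b))
    if a == b ** d:                                  # d = log_b(a): all levels equal
        return "Theta(n^{} lg n)".format(d)
    return "Theta(n^{})".format(d)                   # d > log_b(a): root dominates
```

For merge sort, master(2, 2, 1) gives "Theta(n^1 lg n)", i.e., Θ(n lg n).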

Regularity Condition

Master Theorem: Let a ≥ 1 and b > 1 be constants, let f(n) be a function, and let T(n) be defined on the nonnegative integers as T(n) = aT(n/b) + f(n).

Case 3 additionally requires us to show a·f(n/b) ≤ c·f(n) for some constant c < 1 and all sufficiently large n: the "regularity condition". The regularity condition always holds when f(n) = n^k for a constant k with f(n) = Ω(n^(log_b a + ε)), ε > 0.

Solving Recurrences: Master Method (§4.3)

These three cases do not cover all the possibilities for f(n). There is a gap between cases 1 and 2 when f(n) is smaller than n^(log_b a), but not polynomially smaller. There is a gap between cases 2 and 3 when f(n) is larger than n^(log_b a), but not polynomially larger. If the function falls into one of these two gaps, or if the regularity condition can't be shown to hold, then the master method can't be used to solve the recurrence.

Solving Recurrences: Master Method

Example: T(n) = 9T(n/3) + n
a = 9, b = 3, f(n) = n, n^(log_b a) = n^(log_3 9) = n^2
Compare f(n) = n with n^2: n = O(n^(2-ε)), so f(n) is polynomially smaller than n^(log_b a).
Case 1 applies: T(n) = Θ(n^2).

Example: T(n) = T(n/2) + 1
a = 1, b = 2, f(n) = 1, n^(log_b a) = n^(log_2 1) = n^0 = 1
Compare f(n) = 1 with 1: 1 = Θ(n^0), so f(n) is asymptotically equal to n^(log_b a).
Case 2 applies: T(n) = Θ(n^0·lg n) = Θ(lg n).

Solving Recurrences: Master Method

Example: T(n) = T(n/2) + n^2
a = 1, b = 2, f(n) = n^2, n^(log_b a) = n^(log_2 1) = n^0 = 1
Compare f(n) = n^2 with 1: n^2 = Ω(n^(0+ε)), so f(n) is polynomially larger.
Show regularity: a·f(n/b) ≤ c·f(n): 1·(n/2)^2 = (1/4)n^2 ≤ c·n^2 for c = 1/4.
Case 3 holds: T(n) = Θ(n^2).

Example: T(n) = 4T(n/2) + n^2
a = 4, b = 2, f(n) = n^2, n^(log_b a) = n^(log_2 4) = n^2
Compare f(n) = n^2 with n^2: n^2 = Θ(n^2), so f(n) is asymptotically equal.
Case 2 holds: T(n) = Θ(n^2·lg n).

Solving Recurrences: Master Method

Example: T(n) = 7T(n/3) + n^2
a = 7, b = 3, f(n) = n^2, n^(log_b a) = n^(log_3 7) ≈ n^1.77
Compare f(n) = n^2 with n^(log_3 7): n^2 = Ω(n^(log_3 7 + ε)), so f(n) is polynomially larger.
Show regularity: a·f(n/b) ≤ c·f(n): 7·(n/3)^2 = (7/9)n^2 ≤ c·n^2 for c = 7/9.
Case 3 holds: T(n) = Θ(n^2).

Example: T(n) = 7T(n/2) + n^2
a = 7, b = 2, f(n) = n^2, n^(log_b a) = n^(log_2 7) ≈ n^2.81
Compare f(n) = n^2 with n^(log_2 7): n^2 = O(n^(log_2 7 - ε)), so f(n) is polynomially smaller.
Case 1 holds: T(n) = Θ(n^(log_2 7)).

End of lecture 4

Review of Logarithms (cf. Lecture 2)

A logarithm is an inverse exponential function: saying b^x = y is equivalent to saying log_b y = x.

Properties of logarithms:
log_b(xy)  = log_b x + log_b y
log_b(x/y) = log_b x - log_b y
log_b(x^a) = a·log_b x
log_b a    = log_x a / log_x b     (change of base)
a = b^(log_b a)                    (e.g., n = 2^(lg n) = n^(lg 2))
lg^k n = (lg n)^k
lglg(n) = lg(lg n)
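Each identity above can be verified numerically; a quick sketch with arbitrary sample values (the helper `log` is just a convenience wrapper):

```python
import math

b, x, y, a = 2.0, 8.0, 4.0, 3.0
log = lambda base, v: math.log(v, base)   # log_base(v)

assert math.isclose(log(b, x * y), log(b, x) + log(b, y))    # log of a product
assert math.isclose(log(b, x / y), log(b, x) - log(b, y))    # log of a quotient
assert math.isclose(log(b, x ** a), a * log(b, x))           # log of a power
assert math.isclose(log(b, a), log(10, a) / log(10, b))      # change of base
assert math.isclose(a, b ** log(b, a))                       # a = b^(log_b a)
```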

Recursion Tree for Merge-Sort

Recurrence for the worst-case running time of Merge-Sort:

T(n) = c                  if n = 1
T(n) = 2T(n/2) + cn       otherwise

[Recursion tree: the root costs cn; each level below has twice as many nodes of half the size, so every level also sums to cn; there are lg n + 1 levels (height h = lg n), and the n leaves cost cn in total. Total cost: cn·lg n + cn.]
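Merge-Sort itself, whose worst case the tree analyzes, can be sketched as follows (a simple out-of-place version, not the textbook's exact pseudocode):

```python
def merge_sort(a):
    """Sort a list; T(n) = 2T(n/2) + cn, giving the Theta(n lg n) tree above."""
    if len(a) <= 1:                      # base case: constant work
        return a
    mid = len(a) // 2
    left = merge_sort(a[:mid])           # two subproblems of size n/2
    right = merge_sort(a[mid:])
    merged = []                          # merge step: the cn term
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    return merged + left[i:] + right[j:]
```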

Solving Recurrences: Master Method (§4.3)

A more general version of case 2 follows:

T(n) = Θ(n^(log_b a)·lg^(k+1) n)   if f(n) = Θ(n^(log_b a)·lg^k n) for k ≥ 0

This case covers the gap between cases 2 and 3 in which f(n) is larger than n^(log_b a) by only a polylog factor. We'll see an example of this type of recurrence in class.

Practice

Use the master method to give tight bounds for the running time of T(n) in each of the following recurrences. Specify which case of the master theorem holds for each problem.
a) T(n) = 2T(n/2) + n^3
b) T(n) = T(2n/3) + 1
c) T(n) = 16T(n/4) + n^2
d) T(n) = 5T(n/2) + n^2
e) T(n) = 5T(n/2) + n^3