10/25/2015 CS 3343: Analysis of Algorithms. Lecture 6 & 7: Master theorem and substitution method.


Analyzing recursive algorithms
1. Define the recurrence
2. Solve the recurrence

Solving recurrences
1. Recursion tree / iteration method
   - Good for guessing an answer
2. Substitution method
   - Generic and rigorous, but may be hard to apply
3. Master method
   - Easy to learn, but useful only in limited cases
   - Some tricks may help in other cases

The master method
The master method applies to recurrences of the form T(n) = a T(n/b) + f(n), where a ≥ 1, b > 1, and f is asymptotically positive.
1. Divide the problem into a subproblems, each of size n/b.
2. Conquer the subproblems by solving them recursively.
3. Combine the subproblem solutions.
Divide + combine takes f(n) time.

Master theorem
T(n) = a T(n/b) + f(n)
CASE 1: f(n) = O(n^(log_b a - ε)) ⇒ T(n) = Θ(n^(log_b a)).
CASE 2: f(n) = Θ(n^(log_b a)) ⇒ T(n) = Θ(n^(log_b a) log n).
CASE 3: f(n) = Ω(n^(log_b a + ε)) and a·f(n/b) ≤ c·f(n) for some c < 1 (the regularity condition) ⇒ T(n) = Θ(f(n)).
Key: compare f(n) with n^(log_b a).
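For the common situation where the driving function is a plain polynomial f(n) = n^d, the three-way comparison above can be sketched as a small Python helper. This is an illustrative sketch, not from the slides; restricting f to n^d means the Case 3 regularity condition holds automatically (a·(n/b)^d = (a/b^d)·n^d with a/b^d < 1 whenever d > log_b a).

```python
import math

def master_theorem(a, b, d):
    """Classify T(n) = a*T(n/b) + n**d by the master theorem.

    Assumes f(n) = n**d is a plain polynomial, so Case 3's
    regularity condition a*f(n/b) <= c*f(n) is automatic.
    """
    crit = math.log2(a) / math.log2(b)  # critical exponent log_b(a)
    if d < crit:                        # f polynomially slower: Case 1
        return f"Theta(n^{crit:g})"
    if d == crit:                       # same asymptotic order: Case 2
        return f"Theta(n^{d:g} log n)"
    return f"Theta(n^{d:g})"            # f polynomially faster: Case 3

assert master_theorem(4, 2, 1) == "Theta(n^2)"        # e.g. T(n) = 4T(n/2) + n
assert master_theorem(4, 2, 2) == "Theta(n^2 log n)"  # e.g. T(n) = 4T(n/2) + n^2
assert master_theorem(4, 2, 3) == "Theta(n^3)"        # e.g. T(n) = 4T(n/2) + n^3
```

Note the helper cannot handle f(n) with logarithmic factors (e.g. n/log n or n^2 log n); those are exactly the cases where the plain master theorem does not apply, as the later slides show.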

Case 1
f(n) = O(n^(log_b a - ε)) for some constant ε > 0.
Alternatively: n^(log_b a) / f(n) = Ω(n^ε).
Intuition: f(n) grows polynomially slower than n^(log_b a); that is, n^(log_b a) dominates f(n) by an n^ε factor for some ε > 0.
Solution: T(n) = Θ(n^(log_b a)).
Example: T(n) = 4T(n/2) + n. Here b = 2, a = 4, f(n) = n, and log_2 4 = 2. f(n) = n = O(n^(2-ε)), or n^2 / n = n^1 = Ω(n^ε), for any ε ≤ 1 ⇒ T(n) = Θ(n^2).
Example: T(n) = 2T(n/2) + n/log n. Here b = 2, a = 2, f(n) = n/log n, and log_2 2 = 1. f(n) = n/log n ∉ O(n^(1-ε)), and n^1 / f(n) = log n ∉ Ω(n^ε), for any ε > 0 ⇒ CASE 1 does not apply.

Case 2
f(n) = Θ(n^(log_b a)).
Intuition: f(n) and n^(log_b a) have the same asymptotic order.
Solution: T(n) = Θ(n^(log_b a) log n).
Examples:
T(n) = T(n/2) + 1      (log_b a = 0)
T(n) = 2T(n/2) + n     (log_b a = 1)
T(n) = 4T(n/2) + n^2   (log_b a = 2)
T(n) = 8T(n/2) + n^3   (log_b a = 3)

Case 3
f(n) = Ω(n^(log_b a + ε)) for some constant ε > 0.
Alternatively: f(n) / n^(log_b a) = Ω(n^ε).
Intuition: f(n) grows polynomially faster than n^(log_b a); that is, f(n) dominates n^(log_b a) by an n^ε factor for some ε > 0.
Solution: T(n) = Θ(f(n)).
Example: T(n) = T(n/2) + n. Here b = 2, a = 1, f(n) = n, and n^(log_2 1) = n^0 = 1. f(n) = n = Ω(n^(0+ε)), or n / 1 = n = Ω(n^ε) ⇒ T(n) = Θ(n).
Example: T(n) = T(n/2) + log n. Here b = 2, a = 1, f(n) = log n, and n^(log_2 1) = n^0 = 1. f(n) = log n ∉ Ω(n^(0+ε)), i.e., f(n) / n^(log_2 1) = log n ∉ Ω(n^ε), for any ε > 0 ⇒ CASE 3 does not apply.

Regularity condition
a·f(n/b) ≤ c·f(n) for some c < 1 and all sufficiently large n.
This condition is needed for the master method to be mathematically correct: it rules out badly behaved, non-converging functions, such as those built from sine or cosine.
For most f(n) you will see (polynomials, logarithms, exponentials), you can safely ignore this condition, because it is implied by the Case 3 condition f(n) = Ω(n^(log_b a + ε)).

Examples
T(n) = 4T(n/2) + n: a = 4, b = 2 ⇒ n^(log_b a) = n^2; f(n) = n.
CASE 1: f(n) = O(n^(2-ε)) for ε = 1 ⇒ T(n) = Θ(n^2).
T(n) = 4T(n/2) + n^2: a = 4, b = 2 ⇒ n^(log_b a) = n^2; f(n) = n^2.
CASE 2: f(n) = Θ(n^2) ⇒ T(n) = Θ(n^2 log n).
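The first example can be checked concretely. Assuming the base case T(1) = 1 (the base case only affects constants), the recurrence T(n) = 4T(n/2) + n solves exactly to 2n^2 - n on powers of two, confirming the Θ(n^2) answer:

```python
def T(n):
    """Exact value of T(n) = 4*T(n/2) + n, assumed base case T(1) = 1."""
    if n == 1:
        return 1
    return 4 * T(n // 2) + n

# On powers of two, T(n) matches the closed form 2*n**2 - n,
# which is Theta(n^2), just as Case 1 predicts.
for k in range(11):
    n = 2 ** k
    assert T(n) == 2 * n * n - n
```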

Examples
T(n) = 4T(n/2) + n^3: a = 4, b = 2 ⇒ n^(log_b a) = n^2; f(n) = n^3.
CASE 3: f(n) = Ω(n^(2+ε)) for ε = 1, and 4(n/2)^3 ≤ c·n^3 (regularity condition) for c = 1/2 ⇒ T(n) = Θ(n^3).
T(n) = 4T(n/2) + n^2/log n: a = 4, b = 2 ⇒ n^(log_b a) = n^2; f(n) = n^2/log n.
The master method does not apply: f(n) is asymptotically smaller than n^2, but not polynomially smaller, since for every constant ε > 0 we have log n = o(n^ε).

Examples
T(n) = 4T(n/2) + n^2.5: a = 4, b = 2 ⇒ n^(log_b a) = n^2; f(n) = n^2.5.
CASE 3: f(n) = Ω(n^(2+ε)) for ε = 0.5, and 4(n/2)^2.5 ≤ c·n^2.5 (regularity condition) for c = 1/√2 ⇒ T(n) = Θ(n^2.5).
T(n) = 4T(n/2) + n^2 log n: a = 4, b = 2 ⇒ n^(log_b a) = n^2; f(n) = n^2 log n.
The master method does not apply: f(n) is asymptotically larger than n^2, but not polynomially larger, since for every constant ε > 0 we have log n = o(n^ε).

How do I know which case to use? Do I need to try all three cases one by one?

Compare f(n) with n^(log_b a):
- f(n) = o(n^(log_b a)): possibly CASE 1; check whether n^(log_b a) / f(n) = Ω(n^ε).
- f(n) = Θ(n^(log_b a)): CASE 2.
- f(n) = ω(n^(log_b a)): possibly CASE 3; check whether f(n) / n^(log_b a) = Ω(n^ε).

Examples
- log_b a = 2, f(n) = n: n = o(n^2) ⇒ check Case 1: n^2 / n = n = Ω(n^ε), so T(n) = Θ(n^2).
- log_b a = 2, f(n) = n^2: n^2 = Θ(n^2) ⇒ Case 2: T(n) = Θ(n^2 log n).
- log_b a = log_4 6 ≈ 1.29, f(n) = n: n = o(n^(log_4 6)) ⇒ check Case 1: n^(log_4 6) / n = Ω(n^ε), so T(n) = Θ(n^(log_4 6)).
- log_b a = 0.5, f(n) = n: n = ω(n^0.5) ⇒ check Case 3: n / n^0.5 = n^0.5 = Ω(n^ε) (and check the regularity condition), so T(n) = Θ(n).
- log_b a = 0, f(n) = n log n: n log n = ω(n^0) ⇒ check Case 3: n log n / 1 = Ω(n^ε) (and check the regularity condition), so T(n) = Θ(n log n).
- log_b a = 1, f(n) = n log n: n log n = ω(n) ⇒ check Case 3: n log n / n = log n ∉ Ω(n^ε) for any ε > 0. The master theorem does not apply.

More examples

Some tricks
- Changing variables
- Obtaining upper and lower bounds
  - Make a guess based on the bounds
  - Prove using the substitution method

Changing variables
T(n) = 2T(n-1) + 1
Let n = log m, i.e., m = 2^n ⇒ T(log m) = 2 T(log(m/2)) + 1.
Let S(m) = T(log m) = T(n) ⇒ S(m) = 2S(m/2) + 1 ⇒ S(m) = Θ(m) ⇒ T(n) = S(m) = Θ(m) = Θ(2^n).
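The Θ(2^n) answer can be sanity-checked in code. Assuming the base case T(0) = 0 (an assumption; the slide does not specify one), the recurrence solves exactly to 2^n - 1:

```python
def T(n):
    """T(n) = 2*T(n-1) + 1 with assumed base case T(0) = 0."""
    return 0 if n == 0 else 2 * T(n - 1) + 1

# S(m) = 2*S(m/2) + 1 = Theta(m) translates back to T(n) = Theta(2^n);
# with this base case the constant works out to exactly 2**n - 1.
assert all(T(n) == 2 ** n - 1 for n in range(20))
```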

Changing variables
T(n) = T(√n) + 1
Let n = 2^m ⇒ √n = 2^(m/2). We then have T(2^m) = T(2^(m/2)) + 1.
Let S(m) = T(2^m) = T(n) ⇒ S(m) = S(m/2) + 1 ⇒ S(m) = Θ(log m) = Θ(log log n) ⇒ T(n) = Θ(log log n).
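The Θ(log log n) depth can be verified numerically. To avoid floating-point square roots, the sketch below tracks only the exponent e of n = 2^e (each square root halves e) and assumes the recursion stops once n ≤ 2, i.e., e ≤ 1:

```python
def sqrt_steps_exponent(e):
    """Recursion depth of T(n) = T(sqrt(n)) + 1 for n = 2**e,
    tracking the exponent e only; assumed stopping condition e <= 1."""
    steps = 0
    while e > 1:
        e //= 2        # taking a square root halves the exponent
        steps += 1
    return steps

# For n = 2**(2**k) the depth is exactly k = log2(log2(n)).
for k in range(1, 10):
    assert sqrt_steps_exponent(2 ** k) == k
```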

Changing variables
T(n) = 2T(n-2) + n
Let n = log m, i.e., m = 2^n ⇒ T(log m) = 2 T(log(m/4)) + log m.
Let S(m) = T(log m) = T(n) ⇒ S(m) = 2S(m/4) + log m ⇒ S(m) = Θ(m^(1/2)) (master theorem, Case 1) ⇒ T(n) = S(m) = Θ((2^n)^(1/2)) = Θ((√2)^n) ≈ Θ(1.4^n).
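This recurrence also admits an exact check. Assuming base cases T(0) = T(1) = 1 (the slide does not specify them), solving the linear recurrence gives T(n) = 5·2^(n/2) - n - 4 for even n, matching the Θ((√2)^n) growth:

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    """T(n) = 2*T(n-2) + n with assumed base cases T(0) = T(1) = 1."""
    return 1 if n < 2 else 2 * T(n - 2) + n

# For even n the exact solution is 5*2**(n//2) - n - 4, i.e. a constant
# times (sqrt 2)**n minus a linear term: Theta((sqrt 2)^n) ~ 1.4^n.
assert all(T(n) == 5 * 2 ** (n // 2) - n - 4 for n in range(0, 40, 2))
```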

Obtaining bounds
Solve the Fibonacci-style recurrence T(n) = T(n-1) + T(n-2) + 1.
Since T(n-2) ≤ T(n-1):
T(n) ≥ 2T(n-2) + 1   [1]
T(n) ≤ 2T(n-1) + 1   [2]
Solving [1], we obtain T(n) = Ω((√2)^n) ≈ Ω(1.4^n).
Solving [2], we obtain T(n) = O(2^n).
Actually, T(n) = Θ(φ^n), where φ ≈ 1.62 is the golden ratio.
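The sandwich bounds can be spot-checked numerically. This sketch assumes base cases T(0) = T(1) = 1 (not specified on the slide):

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    """T(n) = T(n-1) + T(n-2) + 1, assumed base cases T(0) = T(1) = 1."""
    return 1 if n < 2 else T(n - 1) + T(n - 2) + 1

# The bounds from [1] and [2] hold for n >= 2: 1.4^n <= T(n) <= 2^n.
# (T(n) grows like phi^n with phi ~ 1.618, squarely between the two.)
assert all(1.4 ** n <= T(n) <= 2 ** n for n in range(2, 40))
```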

Obtaining bounds
T(n) = T(n/2) + log n
Lower bound: T(n) = Ω(log n).
Upper bound: since log n = O(n^ε), T(n) = O(T'(n)) where T'(n) = T'(n/2) + n^ε. Solving T'(n) = T'(n/2) + n^ε, we obtain T'(n) = O(n^ε), for any ε > 0.
So T(n) = O(n^ε) for any ε > 0:
- T(n) is unlikely to be polynomial
- Actually, T(n) = Θ(log^2 n), by the extended Case 2

Extended Case 2
CASE 2: f(n) = Θ(n^(log_b a)) ⇒ T(n) = Θ(n^(log_b a) log n).
Extended CASE 2 (k ≥ 0): f(n) = Θ(n^(log_b a) log^k n) ⇒ T(n) = Θ(n^(log_b a) log^(k+1) n).
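The extended Case 2 claim for T(n) = T(n/2) + log n can be checked exactly on powers of two. Assuming the base case T(1) = 0 and log = log_2, and writing everything in terms of the exponent k = log_2 n:

```python
def T(k):
    """T(2**k) for T(n) = T(n/2) + log2(n), assumed base case T(1) = 0.
    Expressed in terms of the exponent k = log2(n)."""
    return 0 if k == 0 else T(k - 1) + k

# T(2**k) = k*(k+1)/2 = Theta(k^2) = Theta(log^2 n), exactly what
# extended Case 2 (with its k = 1, since f(n) = log n) predicts.
assert all(T(k) == k * (k + 1) // 2 for k in range(30))
```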

Solving recurrences
1. Recursion tree / iteration method
   - Good for guessing an answer
   - Need to verify the guess
2. Substitution method
   - Generic and rigorous, but may be hard to apply
3. Master method
   - Easy to learn, but useful only in limited cases
   - Some tricks may help in other cases

Substitution method
The most general method to solve a recurrence (prove O and Ω separately):
1. Guess the form of the solution (e.g., by the recursion tree / iteration method).
2. Verify by induction (inductive step).
3. Solve for the constants n_0 and c (base case of the induction).

Substitution method
Recurrence: T(n) = 2T(n/2) + n.
Guess: T(n) = O(n log n) (e.g., by the recursion tree method).
To prove it, we have to show T(n) ≤ c·n log n for some c > 0 and all n > n_0.
Proof by induction: assume the bound holds for T(n/2), and prove it also holds for T(n). That is:
Given: T(n) = 2T(n/2) + n
Assume: T(n/2) ≤ c(n/2) log(n/2)
Need to prove: T(n) ≤ c·n log n

Proof
Given: T(n) = 2T(n/2) + n
Assume: T(n/2) ≤ c(n/2) log(n/2)
Need to prove: T(n) ≤ c·n log n
Proof: Substituting T(n/2) ≤ c(n/2) log(n/2) into the recurrence, we get
T(n) = 2T(n/2) + n
     ≤ c·n log(n/2) + n
     = c·n log n - c·n + n
     = c·n log n - (c - 1)·n
     ≤ c·n log n for all n > 0, provided c ≥ 1.
Therefore, by definition, T(n) = O(n log n).
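The bound just proved can also be checked concretely. Assuming the base case T(1) = 1, this merge-sort recurrence solves exactly to n·log_2(n) + n on powers of two, so T(n) ≤ 2n·log_2(n) for all n ≥ 2 (i.e., the constants c = 2, n_0 = 2 work):

```python
import math

def T(n):
    """T(n) = 2*T(n/2) + n with assumed base case T(1) = 1."""
    return 1 if n == 1 else 2 * T(n // 2) + n

for k in range(1, 20):
    n = 2 ** k
    assert T(n) == n * k + n                 # closed form n*log2(n) + n
    assert T(n) <= 2 * n * math.log2(n)      # the O(n log n) bound, c = 2
```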

Substitution method - example 2
Recurrence: T(n) = 2T(n/2) + n.
Guess: T(n) = Ω(n log n).
To prove it, we have to show T(n) ≥ c·n log n for some c > 0 and all n > n_0.
Proof by induction: assume the bound holds for T(n/2), and prove it also holds for T(n). That is:
Given: T(n) = 2T(n/2) + n
Assume: T(n/2) ≥ c(n/2) log(n/2)
Need to prove: T(n) ≥ c·n log n

Proof
Given: T(n) = 2T(n/2) + n
Assume: T(n/2) ≥ c(n/2) log(n/2)
Need to prove: T(n) ≥ c·n log n
Proof: Substituting T(n/2) ≥ c(n/2) log(n/2) into the recurrence, we get
T(n) = 2T(n/2) + n
     ≥ c·n log(n/2) + n
     = c·n log n - c·n + n
     = c·n log n + (1 - c)·n
     ≥ c·n log n for all n > 0, provided c ≤ 1.
Therefore, by definition, T(n) = Ω(n log n).

More substitution method examples (1)
Prove that T(n) = 3T(n/3) + n = O(n log n).
We need to show that T(n) ≤ c·n log n for some c and sufficiently large n.
Assume the bound holds for T(n/3), i.e., T(n/3) ≤ c(n/3) log(n/3).

T(n) = 3T(n/3) + n
     ≤ 3c(n/3) log(n/3) + n
     = c·n log n - c·n log 3 + n
     = c·n log n - (c·n log 3 - n)
     ≤ c·n log n   (if c·n log 3 - n ≥ 0)
c·n log 3 - n ≥ 0 ⇒ c log 3 - 1 ≥ 0 (for n > 0) ⇒ c ≥ 1/log 3 = log_3 2.
Therefore, T(n) = 3T(n/3) + n ≤ c·n log n for c = log_3 2 and all n > 0. By definition, T(n) = O(n log n).

More substitution method examples (2)
Prove that T(n) = T(n/3) + T(2n/3) + n = O(n log n).
We need to show that T(n) ≤ c·n log n for some c and sufficiently large n.
Assume the bound holds for T(n/3) and T(2n/3), i.e.,
T(n/3) ≤ c(n/3) log(n/3)
T(2n/3) ≤ c(2n/3) log(2n/3)

T(n) = T(n/3) + T(2n/3) + n
     ≤ c(n/3) log(n/3) + c(2n/3) log(2n/3) + n
     = c·n log n + n - c·n(log 3 - 2/3)
     = c·n log n + n(1 - c·log 3 + 2c/3)
     ≤ c·n log n for all n > 0   (if 1 - c·log 3 + 2c/3 ≤ 0)
c·log 3 - 2c/3 ≥ 1 ⇔ c ≥ 1/(log 3 - 2/3) > 0.
Therefore, T(n) = T(n/3) + T(2n/3) + n ≤ c·n log n for c = 1/(log 3 - 2/3) and all n > 0. By definition, T(n) = O(n log n).

More substitution method examples (3)
Prove that T(n) = 3T(n/4) + n^2 = O(n^2).
We need to show that T(n) ≤ c·n^2 for some c and sufficiently large n.
Assume the bound holds for T(n/4), i.e., T(n/4) ≤ c(n/4)^2 = c·n^2/16.

T(n) = 3T(n/4) + n^2
     ≤ 3c·n^2/16 + n^2
     = (3c/16 + 1)·n^2
     ≤ c·n^2   (if 3c/16 + 1 ≤ c, i.e., c ≥ 16/13)
Therefore, T(n) = 3T(n/4) + n^2 ≤ c·n^2 for c = 16/13 and all n. By definition, T(n) = O(n^2).
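Numerically, the constant c = 16/13 is tight in the limit. Assuming the base case T(1) = 1, T(n)/n^2 increases toward 16/13 on powers of four, and the bound T(n) ≤ (16/13)·n^2 holds throughout (exact rational arithmetic avoids float error):

```python
from fractions import Fraction

def T(n):
    """T(n) = 3*T(n/4) + n**2 with assumed base case T(1) = 1."""
    return 1 if n == 1 else 3 * T(n // 4) + n * n

# On powers of four, T(n)/n^2 climbs toward the geometric-series limit
# sum((3/16)**k) = 16/13, so c = 16/13 is the best possible constant.
for k in range(12):
    n = 4 ** k
    assert Fraction(T(n), n * n) <= Fraction(16, 13)
```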

Avoiding pitfalls
Guess: T(n) = 2T(n/2) + n = O(n).
We would need to prove that T(n) ≤ c·n.
Assume T(n/2) ≤ c·n/2. Then
T(n) ≤ 2(c·n/2) + n = c·n + n = "O(n)"?
What's wrong? We needed to prove T(n) ≤ c·n, not T(n) ≤ c·n + n; the induction does not close.

Subtleties
Prove that T(n) = T(⌈n/2⌉) + T(⌊n/2⌋) + 1 = O(n).
We need to prove that T(n) ≤ c·n.
Assume the bound holds for T(⌈n/2⌉) and T(⌊n/2⌋). Then
T(n) ≤ c·⌈n/2⌉ + c·⌊n/2⌋ + 1 = c·n + 1.
Is this a correct proof? No! We have to prove T(n) ≤ c·n, not T(n) ≤ c·n + 1.
However, we can prove the stronger claim T(n) ≤ c·n - 1, which does close the induction. Details skipped.

Making a good guess
T(n) = 2T(n/2 + 17) + n
As n approaches infinity, n/2 + 17 is not too different from n/2, so we can guess T(n) = Θ(n log n).
Proving Ω: assume T(n/2 + 17) ≥ c(n/2 + 17) log(n/2 + 17). Then
T(n) = n + 2T(n/2 + 17)
     ≥ n + 2c(n/2 + 17) log(n/2 + 17)
     = n + c·n log(n/2 + 17) + 34c·log(n/2 + 17)
     ≥ c·n log(n/2 + 17) + 34c·log(n/2 + 17)
     ...
Perhaps guess T(n) = Ω((n - 17) log(n - 17)) instead, to get rid of the +17. Details skipped.