MCA 202: Discrete Mathematics
Instructor: Neelima Gupta

Table of Contents
–Solving Recurrences
–The Master Theorem

Recurrence Relations

Recurrences
An expression such as T(n) = 2T(n/2) + n is a recurrence.
–Recurrence: an equation that describes a function in terms of its value on smaller inputs
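For concreteness, recurrences like this arise directly from recursive algorithms. Below is a minimal merge-sort sketch in Python (an illustrative example, assuming the usual two-way split): it makes two recursive calls on halves of the input and does linear work to merge them, so its running time satisfies T(n) = 2T(n/2) + Θ(n).

def merge_sort(a):
    # Two recursive calls on halves + linear-time merge  =>  T(n) = 2T(n/2) + Theta(n)
    if len(a) <= 1:                    # base case: constant work
        return a
    mid = len(a) // 2
    left = merge_sort(a[:mid])         # T(n/2)
    right = merge_sort(a[mid:])        # T(n/2)
    merged, i, j = [], 0, 0            # merge step: Theta(n) work
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    merged.extend(left[i:]); merged.extend(right[j:])
    return merged

print(merge_sort([5, 2, 9, 4, 7, 1]))  # [1, 2, 4, 5, 7, 9]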

Recurrence Examples

Solving Recurrences
–Substitution method
–Iteration method
–Master method

Solving Recurrences
The substitution method (CLR 4.1)
–“Making a good guess” method
–Guess the form of the answer, then use induction to find the constants and show that the solution works
–Examples:
T(n) = 2T(n/2) + Θ(n)  ⇒  T(n) = Θ(n lg n)
T(n) = 2T(⌊n/2⌋) + n  ⇒  ???

Solving Recurrences
The substitution method (CLR 4.1)
–“Making a good guess” method
–Guess the form of the answer, then use induction to find the constants and show that the solution works
–Examples:
T(n) = 2T(n/2) + Θ(n)  ⇒  T(n) = Θ(n lg n)
T(n) = 2T(⌊n/2⌋) + n  ⇒  T(n) = Θ(n lg n)
T(n) = 2T(⌊n/2⌋ + 17) + n  ⇒  ???

Substitution method
Guess the form of the solution. Use mathematical induction to find the constants and show that the solution works. The substitution method can be used to establish either upper or lower bounds on a recurrence.

An example (Substitution method)
T(n) = 2T(⌊n/2⌋) + n
We guess that the solution is T(n) = O(n lg n), i.e. we show that T(n) ≤ cn lg n for some constant c > 0 and all n ≥ m.
Assume that this bound holds for ⌊n/2⌋. Substituting into the recurrence, we get
T(n) ≤ 2(c⌊n/2⌋ lg ⌊n/2⌋) + n
     ≤ cn lg(n/2) + n
     = cn lg n – cn lg 2 + n
     = cn lg n – cn + n
     ≤ cn lg n,
where the last step holds as long as c ≥ 1.

Boundary conditions: Suppose T(1) = 1 is the sole boundary condition of the recurrence. Then, for n = 1, the bound T(n) ≤ cn lg n yields T(1) ≤ c·1·lg 1 = 0, which is at odds with T(1) = 1. Thus, the base case of our inductive proof fails to hold. To overcome this difficulty, we can take advantage of the asymptotic notation, which only requires us to prove T(n) ≤ cn lg n for n ≥ m. The idea is to remove the difficult boundary condition T(1) = 1 from consideration. Thus, we can replace T(1) by T(2) as the base case in the inductive proof, letting m = 2.

Contd.
From the recurrence, with T(1) = 1, we get T(2) = 2T(1) + 2 = 4. We require T(2) ≤ c·2·lg 2 = 2c. Clearly, any choice of c ≥ 2 suffices for this base case.

Are we done? Have we proved the case n = 3, i.e. that T(3) ≤ c·3·lg 3? No: since ⌊3/2⌋ = 1 and the bound does not hold for n = 1, the inductive step does not apply at n = 3, so n = 3 must also be treated as a base case. From the recurrence, with T(1) = 1, we get T(3) = 2T(1) + 3 = 5. The inductive proof that T(n) ≤ cn lg n can now be completed by choosing c large enough that T(3) ≤ c·3·lg 3 also holds; clearly, any choice of c ≥ 2 is sufficient. Thus we can conclude that T(n) ≤ cn lg n for any c ≥ 2 and all n ≥ 2.
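This conclusion is easy to check numerically. The sketch below (illustrative only; it simply evaluates the recurrence with the base case T(1) = 1) verifies T(n) ≤ 2n lg n for 2 ≤ n ≤ 10000.

from functools import lru_cache
from math import log2

@lru_cache(maxsize=None)
def T(n):
    # T(1) = 1,  T(n) = 2*T(floor(n/2)) + n
    return 1 if n == 1 else 2 * T(n // 2) + n

assert all(T(n) <= 2 * n * log2(n) for n in range(2, 10001))
print("T(n) <= 2 n lg n holds for all 2 <= n <= 10000")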

Wrong application of induction
Given recurrence: T(n) = 2T(n/2) + 1
Guess: T(n) = O(n)
Claim: there exist constants c and n₀ such that T(n) ≤ cn for all n ≥ n₀.
Attempted proof: Suppose the claim is true for all values ≤ n/2. Then
T(n) = 2T(n/2) + 1 (given)
     ≤ 2·c·(n/2) + 1 (by the induction hypothesis)
     = cn + 1.
But cn + 1 is not ≤ cn, so this does not re-establish the induction hypothesis T(n) ≤ cn.

Note that from T(n/2) ≤ c(n/2) we can only conclude T(n) ≤ (c + 1)n. This statement is true, but it does not help us in establishing a solution to the problem, because the constant grows at every level:
T(1) ≤ c (when n = 1)
T(2) ≤ (c + 1)·2
T(2²) ≤ (c + 2)·2²
...
T(2^i) ≤ (c + i)·2^i
So T(n) = T(2^lg n) ≤ (c + lg n)·n, which only gives an O(n log n) bound, not O(n).
Why is it wrong?

What if we have an extra lower-order term?
So, does that mean that the claim we initially made, T(n) = O(n), was wrong? No. Recall:
T(n) = 2T(n/2) + 1 (given)
     ≤ 2·c·(n/2) + 1 (by the induction hypothesis)
     = cn + 1.
Note that the inductive proof leaves us with an extra lower-order term; the remedy, shown next, is to strengthen the guess by subtracting a lower-order term.

Given recurrence: T(n) = 2T(n/2) + 1
Guess: T(n) = O(n)
Claim: there exist constants c, b and n₀ such that T(n) ≤ cn – b for all n ≥ n₀.
Proof: Suppose the claim is true for all values ≤ n/2. Then
T(n) = 2T(n/2) + 1
     ≤ 2[c(n/2) – b] + 1
     = cn – 2b + 1
     = cn – b + (1 – b)
     ≤ cn – b   (as long as b ≥ 1).
Thus T(n/2) ≤ c(n/2) – b ⇒ T(n) ≤ cn – b, and the induction goes through. Hence, by induction, T(n) = O(n), i.e. our claim was true. Hence proved.
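A quick numerical sanity check (illustrative sketch; it evaluates the recurrence with the base case T(1) = 1) confirms that the strengthened bound holds with c = 2 and b = 1, i.e. T(n) ≤ 2n – 1; on powers of two the bound is in fact exact.

from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    # T(1) = 1,  T(n) = 2*T(floor(n/2)) + 1
    return 1 if n == 1 else 2 * T(n // 2) + 1

assert all(T(n) <= 2 * n - 1 for n in range(1, 10001))        # T(n) <= cn - b with c = 2, b = 1
assert all(T(2 ** k) == 2 * 2 ** k - 1 for k in range(14))    # exact on powers of two
print("T(n) <= 2n - 1 for 1 <= n <= 10000, hence T(n) = O(n)")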

Another example (Substitution method)
T(n) = 2T(⌊n/2⌋ + 17) + n
Base case: T(34) ≤ c·34·lg 34 for an appropriate constant c.
Since it looks similar to T(n) = 2T(n/2) + n, we guess that the solution is T(n) = O(n lg n).
Thanks to Manoj Patial, Roll No (MCA 2012)

Let us assume that the bound holds for all m < n, in particular for m = ⌊n/2⌋ + 17, i.e. T(m) ≤ cm lg m. Hence we can write
T(n) = 2T(⌊n/2⌋ + 17) + n
     ≤ 2c(⌊n/2⌋ + 17) lg(⌊n/2⌋ + 17) + n
     ≤ 2c(n/2 + 17) lg(n/2 + 17) + n
     ≤ 2c(n/2 + 17) lg(n/2 + n/4) + n   (for n/4 ≥ 17, i.e. n ≥ 68)
     = (cn + 34c) lg(3n/4) + n
     = cn lg n – cn lg(4/3) + 34c lg n – 34c lg(4/3) + n
     ≤ cn lg n – cn lg(4/3) + 34c lg n + n
As lg n = O(n), we have 34c lg n ≤ kn for some constant k and all n ≥ n₀. Hence
T(n) ≤ cn lg n + n(1 – c lg(4/3)) + kn   for all n ≥ max{n₀, 68}
     = cn lg n + n(k + 1 – c lg(4/3))   ..........(1)
Now we can always choose c such that k + 1 – c lg(4/3) ≤ 0.
Thanks to Manoj Patial, Roll No (MCA 2012)

⇒ k + 1 ≤ c lg(4/3)
⇒ c ≥ (k + 1)/lg(4/3)
So T(n) ≤ cn lg n for any constant c ≥ (k + 1)/lg(4/3) and all n ≥ max{n₀, 68}. (Hence proved.)
Boundary conditions: For n = 35 the bound holds since it holds for ⌊35/2⌋ + 17 = 34, therefore it holds for 35 by induction. For n = 36 it holds since it holds for ⌊36/2⌋ + 17 = 35, therefore it holds for 36 by induction. And so on.
Thanks to Manoj Patial, Roll No (MCA 2012)
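The bound can also be checked numerically. In the sketch below the base case T(n) = n for n ≤ 34 is an assumption made only for this experiment (the slides leave the values below 35 unspecified); the ratio T(n)/(n lg n) stays bounded as n grows, which is consistent with T(n) = O(n lg n).

from functools import lru_cache
from math import log2

@lru_cache(maxsize=None)
def T(n):
    if n <= 34:                     # assumed base case for this check
        return n
    return 2 * T(n // 2 + 17) + n   # T(n) = 2T(floor(n/2) + 17) + n

for n in (100, 10_000, 1_000_000):
    print(n, T(n) / (n * log2(n)))  # ratio stays bounded as n grows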

Solving Recurrences using the Recursion Tree Method
Here, while solving recurrences, we divide the problem into subproblems of equal size, e.g.
T(n) = aT(n/b) + f(n), where a ≥ 1, b > 1 and f(n) is a given function.
f(n) is the cost of splitting or combining the subproblems.
(Figure: a recursion tree whose root has cost n and whose children are subproblems of size n/b.)
Thanks: MCA 2012 Anurag Aggarwal and Aniruddh Jarial

1) T(n) = 2T(n/2) + n
The recursion tree for this recurrence is:
(Figure: root with cost n; two children each of cost n/2; four grandchildren each of cost n/4; and so on down to the leaves. The tree has height log₂ n, and each level contributes a total cost of n.)
Thanks: MCA 2012 Anurag Aggarwal and Aniruddh Jarial

When we add the values across the levels of the recursion tree, we get a value of n for every level. We have
n + n + n + … (log n times)
= n (1 + 1 + … + 1, log n times)
= n log₂ n
= Θ(n log n)
Therefore T(n) = Θ(n log n).
Thanks: MCA 2012 Anurag Aggarwal and Aniruddh Jarial
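The level sums can be printed mechanically. The sketch below (illustrative; n is assumed to be a power of two) shows that every level of the recursion tree for T(n) = 2T(n/2) + n costs exactly n, and that there are about log₂ n such levels.

n = 64                              # assume n is a power of two
level, size = 0, n
while size >= 1:
    nodes = 2 ** level              # number of nodes at this level
    print(f"level {level}: {nodes} nodes of size {size}, level cost = {nodes * size}")
    level += 1
    size //= 2
# Every printed level cost equals n, and there are about log2(n) levels,
# so the total work is Theta(n log n).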

II. Given: T(n) = 2T(n/2) + 1
Solution: The recursion tree for the above recurrence is:
(Figure: root with cost 1; two children each with cost 1; four grandchildren each with cost 1; and so on for about log n levels, so level i contributes a total cost of 2^i.)
Thanks: MCA 2012 Anurag Aggarwal and Aniruddh Jarial

Now we add up the costs over all levels of the recursion tree to determine the cost of the entire tree. We get the series
1 + 2 + 4 + … (log n terms),
which is a G.P. Using the formula for the sum of terms in a G.P.,
a + ar + ar² + … + ar^(k–1) = a(r^k – 1)/(r – 1),
with a = 1, r = 2 and k = log n, the sum is
1·(2^(log n) – 1)/(2 – 1) = n – 1 = Θ(n – 1) (neglecting the lower-order term) = Θ(n).
Thanks: MCA 2012 Anurag Aggarwal and Aniruddh Jarial
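The same geometric sum can be checked directly (illustrative sketch; n is assumed to be a power of two):

from math import log2

n = 1024
k = int(log2(n))                           # log n levels of splitting
level_costs = [2 ** i for i in range(k)]   # level i contributes 2^i
print(level_costs)                         # [1, 2, 4, ..., n/2]
print(sum(level_costs), "=", n - 1)        # geometric sum 2^(log n) - 1 = n - 1 = Theta(n)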

III. Given: T(n) = T(n/3) + T(2n/3) + n
Solution: The recursion tree for the above recurrence is:
(Figure: root with cost n; children with costs n/3 and 2n/3; and so on. Each complete level again sums to n. The leftmost branch shrinks by a factor of 3 at every step, so its length is log₃ n; the rightmost branch shrinks by a factor of 3/2, so its length is log_{3/2} n.)
Thanks: MCA 2012 Anurag Aggarwal and Aniruddh Jarial

When we add the values across the complete levels of the recursion tree, we get a value of n for every level. The shortest path from the root to a leaf is n → n/3 → n/3² → …, reaching 1 when n/3^i = 1, i.e. n = 3^i. Taking log₃ on both sides gives i = log₃ n. Thus the height of the shortest branch is log₃ n, and since the first log₃ n levels are complete,
T(n) ≥ n log₃ n   … (A)
Thanks: MCA 2012 Anurag Aggarwal and Aniruddh Jarial

Similarly, the longest path from the root to a leaf is n → (2/3)n → (2/3)²n → …, reaching 1 when (2/3)^k·n = 1, i.e. n = (3/2)^k, so k = log_{3/2} n. Since no level contributes more than n,
T(n) = O(n log_{3/2} n)   … (B)
Since the base of the logarithm does not matter in asymptotic notation, we conclude from (A) and (B) that T(n) = Θ(n log₂ n).
Thanks: MCA 2012 Anurag Aggarwal and Aniruddh Jarial
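A small numerical check (illustrative sketch; T(n) = n for n ≤ 1 is an assumed base case, which the slides leave implicit) shows the ratio T(n)/(n lg n) staying bounded, consistent with the Θ(n log n) conclusion:

from math import log2

def T(n):
    # T(n) = T(n/3) + T(2n/3) + n, with assumed base case T(n) = n for n <= 1
    if n <= 1:
        return n
    return T(n / 3) + T(2 * n / 3) + n

for n in (1_000, 10_000, 100_000):
    print(n, T(n) / (n * log2(n)))   # ratio stays bounded, consistent with Theta(n log n)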

Solving Recurrences
Another option is the “iteration method”:
–Expand the recurrence
–Work some algebra to express it as a summation
–Evaluate the summation
We will show some examples.

Assignment 4
Solve the following recurrence: T(n) = T(αn) + T(βn) + n, where 0 < α ≤ β < 1. Assume suitable initial conditions.

The Master Theorem
Given: a divide-and-conquer algorithm
–An algorithm that divides a problem of size n into a subproblems, each of size n/b
–Let the cost of each stage (i.e., the work to divide the problem and combine the solved subproblems) be described by the function f(n)
Then the Master Theorem gives us a cookbook for the algorithm’s running time:

The Master Theorem
If T(n) = aT(n/b) + f(n) with a ≥ 1 and b > 1, then:
Case 1: if f(n) = O(n^(log_b a – ε)) for some constant ε > 0, then T(n) = Θ(n^(log_b a)).
Case 2: if f(n) = Θ(n^(log_b a)), then T(n) = Θ(n^(log_b a) lg n).
Case 3: if f(n) = Ω(n^(log_b a + ε)) for some constant ε > 0, and a·f(n/b) ≤ c·f(n) for some constant c < 1 and all sufficiently large n, then T(n) = Θ(f(n)).

Using the Master Method
T(n) = 9T(n/3) + n
–a = 9, b = 3, f(n) = n
–n^(log_b a) = n^(log_3 9) = Θ(n²)
–Since f(n) = O(n^(log_3 9 – ε)), where ε = 1, case 1 applies
–Thus the solution is T(n) = Θ(n²)

More Examples of Master’s Theorem
T(n) = 3T(n/5) + n
T(n) = 2T(n/2) + n
T(n) = 2T(n/2) + 1
T(n) = T(n/2) + n
T(n) = T(n/2) + 1
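All of these (and the previous example) can be classified mechanically. The sketch below uses a simplified form of the master theorem restricted to driving functions f(n) = Θ(n^d), which covers every example above (n^0 stands for the constant cost 1); for such polynomial f(n) the regularity condition of case 3 holds automatically.

from math import log

def master_case(a, b, d):
    # Simplified master theorem for T(n) = a*T(n/b) + Theta(n^d), a >= 1, b > 1.
    crit = log(a, b)                      # critical exponent log_b a
    if abs(crit - d) < 1e-9:
        return f"Theta(n^{d} lg n)"       # case 2: f(n) matches n^(log_b a)
    if crit > d:
        return f"Theta(n^{crit:.2f})"     # case 1: the leaves dominate
    return f"Theta(n^{d})"                # case 3: the driving function dominates

# (a, b, d) for T(n) = aT(n/b) + n^d
examples = [(9, 3, 1), (3, 5, 1), (2, 2, 1), (2, 2, 0), (1, 2, 1), (1, 2, 0)]
for a, b, d in examples:
    print(f"T(n) = {a}T(n/{b}) + n^{d}  ->  {master_case(a, b, d)}")

For instance, the last line prints Theta(n^0 lg n), i.e. Θ(lg n), for T(n) = T(n/2) + 1.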

When Master’s Theorem cannot be applied
T(n) = 2T(n/2) + n lg n
T(n) = 2T(n/2) + n / lg n
In both cases n^(log_b a) = n^(log₂ 2) = n, and f(n) differs from n only by a lg n factor, so f(n) is neither polynomially smaller nor polynomially larger than n^(log_b a); none of the three cases applies.
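The first of these can still be solved by other means (e.g. a recursion tree). A quick experiment on powers of two (illustrative sketch, with an assumed base case T(1) = 0) suggests T(n) = Θ(n lg² n):

from functools import lru_cache
from math import log2

@lru_cache(maxsize=None)
def T(n):
    # T(n) = 2*T(n/2) + n*lg n, evaluated on powers of two, assumed base case T(1) = 0
    return 0 if n == 1 else 2 * T(n // 2) + n * log2(n)

for k in (10, 15, 20):
    n = 2 ** k
    print(n, T(n) / (n * log2(n) ** 2))   # ratio approaches 1/2, suggesting Theta(n lg^2 n)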

Up Next: Proving the correctness of algorithms