Introduction to Algorithms 6.046J/18.401J — LECTURE 2: Asymptotic Notation. September 12, 2005. Copyright © 2001–2005 Erik D. Demaine and Charles E. Leiserson.


Slide 1 — Lecture 2: Asymptotic Notation. Topics: O-, Ω-, and Θ-notation; recurrences; substitution method; iterating the recurrence; recursion tree; master method. Prof. Erik Demaine.

Slides 2–5 — Asymptotic notation. O-notation (upper bounds): we write f(n) = O(g(n)) if there exist constants c > 0, n₀ > 0 such that 0 ≤ f(n) ≤ c·g(n) for all n ≥ n₀. EXAMPLE: 2n² = O(n³) (c = 1, n₀ = 2). Note that f(n) and g(n) are functions, not values, and the "=" here is a funny, one-way equality.

Slides 6–8 — Set definition of O-notation: O(g(n)) = { f(n) : there exist constants c > 0, n₀ > 0 such that 0 ≤ f(n) ≤ c·g(n) for all n ≥ n₀ }. EXAMPLE: 2n² ∈ O(n³). (Logicians: λn.2n² ∈ O(λn.n³), but it's convenient to be sloppy, as long as we understand what's really going on.)
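As a sanity check (not part of the original slides), the slide's witnesses c = 1, n₀ = 2 for 2n² = O(n³) can be spot-checked numerically; the helper name `is_big_o_witness` is my own, and a finite scan is of course not a proof:

```python
# Spot-check of the definition of O-notation: 0 <= f(n) <= c*g(n) for n >= n0.
def is_big_o_witness(f, g, c, n0, n_max=10_000):
    """Check the O-notation inequality over the finite range [n0, n_max]."""
    return all(0 <= f(n) <= c * g(n) for n in range(n0, n_max + 1))

# The slide's example: 2n^2 = O(n^3) with c = 1, n0 = 2.
assert is_big_o_witness(lambda n: 2 * n**2, lambda n: n**3, c=1, n0=2)
# Below n0 the inequality fails: at n = 1, 2*1^2 = 2 > 1 = 1*1^3.
assert not (2 * 1**2 <= 1 * 1**3)
```

This also illustrates why n₀ matters: the bound need only hold eventually, not everywhere.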

Slides 9–11 — Macro substitution. Convention: a set in a formula represents an anonymous function in the set. EXAMPLE: f(n) = n³ + O(n²) means f(n) = n³ + h(n) for some h(n) ∈ O(n²). EXAMPLE: n² = O(n) + O(n²) means that for any f(n) ∈ O(n), we have n² + f(n) = h(n) for some h(n) ∈ O(n²).

Slides 12–14 — Ω-notation (lower bounds). O-notation is an upper-bound notation; it makes no sense to say f(n) is at least O(n²). Ω(g(n)) = { f(n) : there exist constants c > 0, n₀ > 0 such that 0 ≤ c·g(n) ≤ f(n) for all n ≥ n₀ }.

Slides 15–16 — Θ-notation (tight bounds): Θ(g(n)) = O(g(n)) ∩ Ω(g(n)), so f(n) = Θ(g(n)) means f(n) is bounded both above and below by constant multiples of g(n) for all sufficiently large n.

Slides 17–18 — o-notation and ω-notation. O-notation and Ω-notation are like ≤ and ≥; o-notation and ω-notation are like < and >. o(g(n)) = { f(n) : for any constant c > 0, there is a constant n₀ > 0 such that 0 ≤ f(n) < c·g(n) for all n ≥ n₀ }. EXAMPLE: 2n² = o(n³) (n₀ = 2/c). ω(g(n)) = { f(n) : for any constant c > 0, there is a constant n₀ > 0 such that 0 ≤ c·g(n) < f(n) for all n ≥ n₀ }.
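The key difference from O-notation is the order of quantifiers: o-notation must work for *every* c > 0, with n₀ allowed to depend on c. A quick numeric illustration (mine, not the slides') of the slide's witness n₀ = 2/c for 2n² = o(n³) — the inequality 2n² < c·n³ is equivalent to n > 2/c:

```python
import math

# For each c, the smallest integer n strictly above 2/c should make
# 2n^2 < c*n^3 hold from there on (spot-checked over a finite range).
for c in (10.0, 1.0, 0.1, 0.01):
    n0 = math.floor(2 / c) + 1  # smallest integer n with n > 2/c
    assert all(2 * n**2 < c * n**3 for n in range(n0, n0 + 1000))
```

For O-notation, by contrast, one fixed pair (c, n₀) suffices.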

Slide 19 — Solving recurrences. The analysis of merge sort from Lecture 1 required us to solve a recurrence. Solving recurrences is like solving integrals, differential equations, etc.: learn a few tricks. Lecture 3: applications of recurrences to divide-and-conquer algorithms.

Slides 20–21 — Substitution method. The most general method: 1. Guess the form of the solution. 2. Verify by induction. 3. Solve for constants. EXAMPLE: T(n) = 4T(n/2) + n. [Assume that T(1) = Θ(1).] Guess O(n³). (Prove O and Ω separately.) Assume that T(k) ≤ ck³ for k < n; prove T(n) ≤ cn³ by induction.

Slide 22 — Example of substitution. T(n) = 4T(n/2) + n ≤ 4c(n/2)³ + n = (c/2)n³ + n = cn³ − ((c/2)n³ − n) [desired − residual] ≤ cn³ whenever the residual (c/2)n³ − n ≥ 0; for example, if c ≥ 2 and n ≥ 1.

Slides 23–24 — Example (continued). We must also handle the initial conditions, that is, ground the induction with base cases. Base: T(n) = Θ(1) for all n < n₀, where n₀ is a suitable constant. For 1 ≤ n < n₀, we have Θ(1) ≤ cn³ if we pick c big enough. This bound is not tight!
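To see concretely that O(n³) is loose, we can compute T(n) exactly for powers of two. This sketch (mine, not the slides') assumes the base case T(1) = 1; unrolling the recurrence then gives the closed form T(n) = 2n² − n, which sits far below cn³:

```python
from functools import lru_cache

# Exact values of T(n) = 4*T(n/2) + n, assuming T(1) = 1, for n a power of 2.
@lru_cache(maxsize=None)
def T(n):
    return 1 if n == 1 else 4 * T(n // 2) + n

for k in range(1, 15):
    n = 2**k
    assert T(n) <= 2 * n**3       # the proved O(n^3) bound holds (c = 2) ...
    assert T(n) == 2 * n**2 - n   # ... but the true growth is only quadratic
```

So the guess verified by induction is valid but far from tight, motivating the next slides.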

Slides 25–28 — A tighter upper bound? We shall prove that T(n) = O(n²). Assume that T(k) ≤ ck² for k < n: T(n) = 4T(n/2) + n ≤ 4c(n/2)² + n = cn² + n = O(n²). Wrong! We must prove the inductive hypothesis itself. Writing cn² + n = cn² − (−n) [desired − residual], the residual −n is never ≥ 0, so cn² + n ≤ cn² for no choice of c > 0. Lose!

Slides 29–31 — A tighter upper bound! IDEA: strengthen the inductive hypothesis by subtracting a low-order term. Inductive hypothesis: T(k) ≤ c₁k² − c₂k for k < n. Then T(n) = 4T(n/2) + n = 4(c₁(n/2)² − c₂(n/2)) + n = c₁n² − 2c₂n + n = c₁n² − c₂n − (c₂n − n) ≤ c₁n² − c₂n if c₂ ≥ 1. Pick c₁ big enough to handle the initial conditions.
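The strengthened hypothesis can also be checked numerically (again assuming T(1) = 1, as before). With c₂ = 1 as required by the proof, c₁ = 2 turns out to be big enough for this base case — in fact T(n) = 2n² − n meets the bound with equality:

```python
from functools import lru_cache

# Check T(n) <= c1*n^2 - c2*n with c2 = 1 (forced by the induction step)
# and c1 = 2 (chosen to cover the assumed base case T(1) = 1).
@lru_cache(maxsize=None)
def T(n):
    return 1 if n == 1 else 4 * T(n // 2) + n

c1, c2 = 2, 1
assert all(T(2**k) <= c1 * (2**k) ** 2 - c2 * (2**k) for k in range(20))
```

Subtracting the lower-order term c₂k is what makes the induction go through, even though the final bound claimed is still just O(n²).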

Slide 32 — Recursion-tree method. A recursion tree models the costs (time) of a recursive execution of an algorithm. The recursion-tree method can be unreliable, just like any method that uses ellipses (…), but it promotes intuition and is good for generating guesses for the substitution method.

Slides 33–41 — Example of recursion tree. Solve T(n) = T(n/4) + T(n/2) + n². The root costs n² and has children T(n/4) and T(n/2); expanding each node in turn, the costs at successive levels of the tree sum to n², (5/16)n², (25/256)n², …, since (n/4)² + (n/2)² = (5/16)n². The level sums form a geometric series with ratio 5/16, so the total is at most n² · 1/(1 − 5/16) = (16/11)n². Hence T(n) = Θ(n²): the root dominates.
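The geometric-series guess can be confirmed numerically. This sketch (mine, not the slides') assumes a base case T(0) = 0 and uses integer floors in place of exact division, so the computed values sit slightly below the continuous (16/11)n² estimate but well within the Θ(n²) window:

```python
from functools import lru_cache

# T(n) = T(n/4) + T(n/2) + n^2, with assumed base case T(0) = 0 and floors.
@lru_cache(maxsize=None)
def T(n):
    return 0 if n == 0 else n**2 + T(n // 4) + T(n // 2)

# The root term n^2 should dominate: n^2 <= T(n) <= 2*n^2 comfortably
# contains the (16/11)*n^2 ~ 1.45*n^2 geometric-series estimate.
for n in (1, 10, 100, 1_000, 10_000):
    assert n**2 <= T(n) <= 2 * n**2
```

This is exactly the intended workflow: the tree generates the guess Θ(n²), which the substitution method would then verify rigorously.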

Slide 42 — The master method. The master method applies to recurrences of the form T(n) = aT(n/b) + f(n), where a ≥ 1, b > 1, and f is asymptotically positive.

Slides 43–45 — Three common cases. Compare f(n) with n^(log_b a):
1. f(n) = O(n^(log_b a − ε)) for some constant ε > 0: f(n) grows polynomially slower than n^(log_b a) (by an n^ε factor). Solution: T(n) = Θ(n^(log_b a)).
2. f(n) = Θ(n^(log_b a) lg^k n) for some constant k ≥ 0: f(n) and n^(log_b a) grow at similar rates. Solution: T(n) = Θ(n^(log_b a) lg^(k+1) n).
3. f(n) = Ω(n^(log_b a + ε)) for some constant ε > 0: f(n) grows polynomially faster than n^(log_b a) (by an n^ε factor), and f(n) satisfies the regularity condition that a·f(n/b) ≤ c·f(n) for some constant c < 1. Solution: T(n) = Θ(f(n)).

Slides 46–49 — Examples.
EX. T(n) = 4T(n/2) + n: a = 4, b = 2, n^(log_b a) = n²; f(n) = n. CASE 1: f(n) = O(n^(2−ε)) for ε = 1, so T(n) = Θ(n²).
EX. T(n) = 4T(n/2) + n²: a = 4, b = 2, n^(log_b a) = n²; f(n) = n². CASE 2: f(n) = Θ(n² lg⁰ n), that is, k = 0, so T(n) = Θ(n² lg n).
EX. T(n) = 4T(n/2) + n³: a = 4, b = 2, n^(log_b a) = n²; f(n) = n³. CASE 3: f(n) = Ω(n^(2+ε)) for ε = 1, and 4(n/2)³ ≤ cn³ (regularity condition) for c = 1/2, so T(n) = Θ(n³).
EX. T(n) = 4T(n/2) + n²/lg n: a = 4, b = 2, n^(log_b a) = n²; f(n) = n²/lg n. The master method does not apply: in particular, for every constant ε > 0, we have n^ε = ω(lg n).
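For pure-power driving functions f(n) = n^k, the case comparison reduces to comparing k with log_b a. A small classifier sketch (my own helper, not from the slides; it cannot express the lg-factor refinements of Case 2 or the n²/lg n non-example):

```python
import math

def master_case(a, b, k):
    """Classify T(n) = a*T(n/b) + n^k by comparing k to log_b a."""
    crit = math.log(a, b)  # the critical exponent log_b a
    if math.isclose(k, crit):
        return f"Case 2: Theta(n^{k} lg n)"
    if k < crit:
        return f"Case 1: Theta(n^{crit:g})"
    # For f(n) = n^k with k > crit, the regularity condition a*(n/b)^k
    # <= c*n^k holds with c = a/b^k < 1, so Case 3 applies.
    return f"Case 3: Theta(n^{k})"

assert master_case(4, 2, 1).startswith("Case 1")  # T(n) = 4T(n/2) + n
assert master_case(4, 2, 2).startswith("Case 2")  # T(n) = 4T(n/2) + n^2
assert master_case(4, 2, 3).startswith("Case 3")  # T(n) = 4T(n/2) + n^3
```

The `math.isclose` guard sidesteps floating-point equality when k happens to equal log_b a exactly.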

Slides 50–56 — Idea of master theorem. Recursion tree: T(n) = aT(n/b) + f(n) unfolds into a tree with log_b n levels, a^k nodes of cost f(n/b^k) at level k, and n^(log_b a) leaves of constant cost.
CASE 1: the weight increases geometrically from the root to the leaves, and the leaves hold a constant fraction of the total weight.
CASE 2 (k = 0): the weight is approximately the same on each of the log_b n levels.
CASE 3: the weight decreases geometrically from the root to the leaves, and the root holds a constant fraction of the total weight.

Slide 57 — Appendix: geometric series. 1 + x + x² + ⋯ + xⁿ = (1 − x^(n+1))/(1 − x) for x ≠ 1, and 1 + x + x² + ⋯ = 1/(1 − x) for |x| < 1.
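A quick numeric check of the geometric-series identities (mine, not the slides'), using the ratio 5/16 from the recursion-tree example:

```python
import math

x, n = 5 / 16, 50  # ratio from the recursion-tree example; 50 terms
finite = sum(x**i for i in range(n + 1))
# Finite sum formula: 1 + x + ... + x^n = (1 - x^(n+1)) / (1 - x).
assert math.isclose(finite, (1 - x ** (n + 1)) / (1 - x))
# With |x| < 1 the tail is negligible, so the finite sum is already
# essentially the infinite-series limit 1/(1 - x) = 16/11.
assert math.isclose(finite, 1 / (1 - x))
```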

