1
September 12, 2005 Copyright © Erik D. Demaine and Charles E. Leiserson L2.1
Introduction to Algorithms
LECTURE 2
Asymptotic Notation
  O-, Ω-, and Θ-notation
Recurrences
  Substitution method
  Iterating the recurrence
  Recursion tree
  Master method

2-5
Asymptotic notation

O-notation (upper bounds): We write f(n) = O(g(n)) if there exist constants c > 0, n₀ > 0 such that 0 ≤ f(n) ≤ c·g(n) for all n ≥ n₀.

EXAMPLE: 2n² = O(n³)   (c = 1, n₀ = 2)

Note: O-notation relates functions, not values, and the "=" is a funny, one-way equality.
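The definition above can be spot-checked numerically. The sketch below uses a hypothetical helper `holds` to verify the witness constants c = 1, n₀ = 2 from the example over a finite range; a finite check is evidence, not a proof.

```python
# Spot-check (not a proof) of the O-notation witnesses in the
# example: 2n^2 = O(n^3) with c = 1, n0 = 2.

def holds(f, g, c, n0, n_max=10_000):
    """Check 0 <= f(n) <= c*g(n) for every n0 <= n <= n_max."""
    return all(0 <= f(n) <= c * g(n) for n in range(n0, n_max + 1))

# The witnesses work from n0 = 2 onward...
assert holds(lambda n: 2 * n**2, lambda n: n**3, c=1, n0=2)
# ...but fail if we start at n = 1, where 2*1^2 = 2 > 1^3 = 1.
assert not holds(lambda n: 2 * n**2, lambda n: n**3, c=1, n0=1)
```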

6
[Figure: asymptotic upper bound; c·g(n) lies above f(n) for all n ≥ n₀]

7-9
Ω-notation (lower bounds)

O-notation is an upper-bound notation. It makes no sense to say f(n) is at least O(n²).

Ω(g(n)) = { f(n) : there exist constants c > 0, n₀ > 0 such that 0 ≤ c·g(n) ≤ f(n) for all n ≥ n₀ }

EXAMPLE: √n = Ω(lg n)   (c = 1, n₀ = 16)

10
[Figure: asymptotic lower bound; c·g(n) lies below f(n) for all n ≥ n₀]

11-12
Θ-notation (tight bounds)

Θ(g(n)) = O(g(n)) ∩ Ω(g(n))

EXAMPLE: n²/2 - 2n = Θ(n²)

13
The definition of Θ(g(n)) requires that every member f(n) ∈ Θ(g(n)) be asymptotically nonnegative.

[Figure: graphic example of Θ; c₁·g(n) ≤ f(n) ≤ c₂·g(n) for all n ≥ n₀]

14
Theorem 3.1. For any two functions f(n) and g(n), f(n) = Θ(g(n)) if and only if f(n) = O(g(n)) and f(n) = Ω(g(n)).

15
o-notation and ω-notation

O-notation and Ω-notation are like ≤ and ≥. o-notation and ω-notation are like < and >.

o(g(n)) = { f(n) : for any constant c > 0, there is a constant n₀ > 0 such that 0 ≤ f(n) < c·g(n) for all n ≥ n₀ }

EXAMPLE: 2n² = o(n³)   (n₀ = 2/c)
But 2n² ≠ o(n²).

16
ω(g(n)) = { f(n) : for any constant c > 0, there is a constant n₀ > 0 such that 0 ≤ c·g(n) < f(n) for all n ≥ n₀ }

EXAMPLE: n²/2 = ω(n), but n²/2 ≠ ω(n²).
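Unlike O, little-o requires the bound to hold for every c > 0. A numeric sketch (with a hypothetical helper `n0_for` and a few assumed sample values of c) of the example 2n² = o(n³): any n₀ > 2/c works, because 2n² < c·n³ exactly when n > 2/c.

```python
import math

# For 2n^2 = o(n^3): given ANY c > 0, pick the smallest integer
# n0 with n0 > 2/c; then 2n^2 < c*n^3 for all n >= n0.

def n0_for(c):
    return math.floor(2 / c) + 1

for c in (1.0, 0.5, 0.1, 0.01):
    n0 = n0_for(c)
    assert all(2 * n * n < c * n**3 for n in range(n0, n0 + 1000))
```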

17
Solving recurrences

The analysis of merge sort from Lecture 1 required us to solve a recurrence. Solving recurrences is like solving integrals, differential equations, etc.: learn a few tricks. Lecture 3: applications of recurrences to divide-and-conquer algorithms.

18-19
Substitution method

The most general method:
1. Guess the form of the solution.
2. Verify by induction.
3. Solve for constants.

EXAMPLE: T(n) = 4T(n/2) + n   [Assume that T(1) = Θ(1).]
Guess O(n³). (Prove O and Ω separately.)
Assume that T(k) ≤ ck³ for k < n.
Prove T(n) ≤ cn³ by induction.

20
Example of substitution

T(n) = 4T(n/2) + n
     ≤ 4c(n/2)³ + n        [inductive hypothesis]
     = (c/2)n³ + n
     = cn³ - ((c/2)n³ - n)   [desired - residual]
     ≤ cn³                   [desired]
whenever (c/2)n³ - n ≥ 0; for example, if c ≥ 2 and n ≥ 1.

21-22
Example (continued)

We must also handle the initial conditions, that is, ground the induction with base cases.
Base: T(n) = Θ(1) for all n < n₀, where n₀ is a suitable constant.
For 1 ≤ n < n₀, we have "Θ(1)" ≤ cn³, if we pick c big enough.
This bound is not tight!

23-26
A tighter upper bound?

We shall prove that T(n) = O(n²).

Assume that T(k) ≤ ck² for k < n:
T(n) = 4T(n/2) + n
     ≤ 4c(n/2)² + n
     = cn² + n
     = O(n²)   Wrong! We must prove the I.H.
     = cn² - (-n)   [desired - residual]
     ≤ cn² for no choice of c > 0. Lose!

27-29
A tighter upper bound!

IDEA: Strengthen the inductive hypothesis.
Subtract a low-order term.

Inductive hypothesis: T(k) ≤ c₁k² - c₂k for k < n.
T(n) = 4T(n/2) + n
     ≤ 4(c₁(n/2)² - c₂(n/2)) + n
     = c₁n² - 2c₂n + n
     = c₁n² - c₂n - (c₂n - n)
     ≤ c₁n² - c₂n if c₂ ≥ 1.

Pick c₁ big enough to handle the initial conditions.
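The strengthened hypothesis can be checked numerically. Assuming a base case T(1) = 1 (any Θ(1) value would do), the recurrence T(n) = 4T(n/2) + n solves exactly to T(n) = 2n² - n on powers of two, so c₁ = 2, c₂ = 1 make the bound tight; a minimal sketch:

```python
from functools import lru_cache

# Evaluate T(n) = 4T(n/2) + n exactly on powers of two, with an
# assumed base case T(1) = 1, and check the strengthened inductive
# bound T(n) <= c1*n^2 - c2*n with c1 = 2, c2 = 1.

@lru_cache(maxsize=None)
def T(n):
    return 1 if n == 1 else 4 * T(n // 2) + n

c1, c2 = 2, 1
for k in range(21):
    n = 2 ** k
    assert T(n) <= c1 * n * n - c2 * n   # in fact equality: T(n) = 2n^2 - n
```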

30
Recursion-tree method

A recursion tree models the costs (time) of a recursive execution of an algorithm. The recursion-tree method can be unreliable, just like any method that uses ellipses (…). It promotes intuition, however, and is good for generating guesses for the substitution method.

31-39
Example of recursion tree

Solve T(n) = T(n/4) + T(n/2) + n²:

[Recursion-tree figure: the root costs n² and splits into subtrees T(n/4) and T(n/2); expanding, the children cost (n/4)² and (n/2)², so each level's total is 5/16 of the level above. The level sums form the geometric series n²(1 + 5/16 + (5/16)² + …) ≤ (16/11)·n², hence T(n) = Θ(n²).]
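The tree's conclusion can be sanity-checked by evaluating the recurrence directly. The sketch below assumes a base case T(n) = 1 for n ≤ 1 and uses integer division; it confirms T(n) stays within constant factors of n², consistent with the geometric-series bound.

```python
from functools import lru_cache

# T(n) = T(n/4) + T(n/2) + n^2, with an assumed base case
# T(n) = 1 for n <= 1.

@lru_cache(maxsize=None)
def T(n):
    return 1 if n <= 1 else T(n // 4) + T(n // 2) + n * n

# The level sums form a geometric series with ratio 5/16, so
# T(n) = Theta(n^2); check n^2 <= T(n) <= 2*n^2 on powers of two.
for k in range(1, 21):
    n = 2 ** k
    assert n * n <= T(n) <= 2 * n * n
```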

40
The master method

The master method applies to recurrences of the form T(n) = aT(n/b) + f(n), where a ≥ 1, b > 1, and f is asymptotically positive.

41-42
Three common cases

Compare f(n) with n^(log_b a):

1. f(n) = O(n^(log_b a - ε)) for some constant ε > 0.
   f(n) grows polynomially slower than n^(log_b a) (by an n^ε factor).
   Solution: T(n) = Θ(n^(log_b a)).

2. f(n) = Θ(n^(log_b a) · lg^k n) for some constant k ≥ 0.
   f(n) and n^(log_b a) grow at similar rates.
   Solution: T(n) = Θ(n^(log_b a) · lg^(k+1) n).

43
Three common cases (cont.)

3. f(n) = Ω(n^(log_b a + ε)) for some constant ε > 0.
   f(n) grows polynomially faster than n^(log_b a) (by an n^ε factor), and f(n) satisfies the regularity condition that a·f(n/b) ≤ c·f(n) for some constant c < 1.
   Solution: T(n) = Θ(f(n)).
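The three-way comparison can be sketched in code for driving functions of the restricted form f(n) = n^k · lg^p n. The helper `master` below is hypothetical (not part of any library); it only classifies, and it omits checking the regularity condition, which holds automatically for this shape of f.

```python
import math

# Sketch of the master-method case comparison, assuming the driving
# function has the restricted form f(n) = n^k * (lg n)^p.

def master(a, b, k, p=0):
    """Return the Theta bound for T(n) = a*T(n/b) + n^k * (lg n)^p."""
    log_ba = math.log(a, b)
    if math.isclose(k, log_ba):                 # Case 2: similar rates
        if p >= 0:
            return f"Theta(n^{log_ba:g} * (lg n)^{p + 1})"
        return "master method (simple form) does not apply"
    if k < log_ba:                              # Case 1: f polynomially slower
        return f"Theta(n^{log_ba:g})"
    # Case 3: f polynomially faster; regularity holds for this f
    return f"Theta(n^{k:g} * (lg n)^{p:g})" if p else f"Theta(n^{k:g})"

print(master(4, 2, 1))      # T(n) = 4T(n/2) + n       -> Theta(n^2)
print(master(4, 2, 2))      # T(n) = 4T(n/2) + n^2     -> Theta(n^2 * (lg n)^1)
print(master(4, 2, 3))      # T(n) = 4T(n/2) + n^3     -> Theta(n^3)
print(master(4, 2, 2, -1))  # T(n) = 4T(n/2) + n^2/lg n: p < 0, not covered
```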

44-45
Examples

EX. T(n) = 4T(n/2) + n
    a = 4, b = 2 ⟹ n^(log_b a) = n²; f(n) = n.
    CASE 1: f(n) = O(n^(2-ε)) for ε = 1.
    T(n) = Θ(n²).

EX. T(n) = 4T(n/2) + n²
    a = 4, b = 2 ⟹ n^(log_b a) = n²; f(n) = n².
    CASE 2: f(n) = Θ(n² lg^0 n), that is, k = 0.
    T(n) = Θ(n² lg n).

46-47
Examples

EX. T(n) = 4T(n/2) + n³
    a = 4, b = 2 ⟹ n^(log_b a) = n²; f(n) = n³.
    CASE 3: f(n) = Ω(n^(2+ε)) for ε = 1, and 4(n/2)³ ≤ cn³ (reg. cond.) for c = 1/2.
    T(n) = Θ(n³).

EX. T(n) = 4T(n/2) + n²/lg n
    a = 4, b = 2 ⟹ n^(log_b a) = n²; f(n) = n²/lg n.
    Master method does not apply. In particular, for every constant ε > 0, we have n^ε = ω(lg n).

48-51
Idea of master theorem

[Recursion-tree figure: each node with subproblem size m contributes f(m) and has a children of size m/b; level k contributes a^k · f(n/b^k); the tree has height log_b n and n^(log_b a) leaves, each costing Θ(1), for a total leaf cost of Θ(n^(log_b a)).]

52
CASE 1: The weight increases geometrically from the root to the leaves. The leaves hold a constant fraction of the total weight.

53
CASE 2: (k = 0) The weight is approximately the same on each of the log_b n levels.

54
CASE 3: The weight decreases geometrically from the root to the leaves. The root holds a constant fraction of the total weight.

55
Polynomials vs. exponentials

Polynomials: a function f(n) is polynomially bounded if f(n) = O(n^k) for some constant k.
Exponentials: any exponential function with a base strictly greater than 1 grows faster than any polynomial: n^b = o(a^n) for all constants a > 1 and b.

56
Logarithms

A polylogarithmic function in n is a polynomial in the logarithm of n. A function f(n) is polylogarithmically bounded if f(n) = O(lg^k n) for some constant k. Any positive polynomial function grows faster than any polylogarithmic function: lg^b n = o(n^a) for any constant a > 0.

57
Function iteration

f^(i)(n) denotes the function f applied i times to an initial value of n: f^(0)(n) = n, and f^(i)(n) = f(f^(i-1)(n)) for i > 0.
For example, if f(n) = 2n, then f^(i)(n) = 2^i · n.

58
Slowly growing function

The iterated logarithm lg* n counts how many times lg must be applied to n before the result is at most 1. Since the number of atoms in the observable universe is estimated to be about 10^80, which is much less than 2^65536, we rarely encounter a value of n such that lg* n > 5.
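A minimal sketch of the iterated logarithm described above: repeatedly apply lg until the value drops to 1 or below, counting the applications.

```python
import math

# Iterated logarithm: lg*(n) = number of times lg must be applied
# to n before the result is <= 1. E.g. 65536 -> 16 -> 4 -> 2 -> 1,
# so lg*(65536) = 4; and lg*(2^65536) = 5, which is why lg*(n) <= 5
# for any physically meaningful input size.

def lg_star(n):
    count = 0
    while n > 1:
        n = math.log2(n)
        count += 1
    return count

assert lg_star(2) == 1
assert lg_star(16) == 3
assert lg_star(65536) == 4
```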

59
Appendix: geometric series

1 + x + x² + x³ + … = 1/(1 - x) for |x| < 1
1 + x + x² + … + x^k = (1 - x^(k+1))/(1 - x) for x ≠ 1
