
Slide 1: CS 3343: Analysis of Algorithms, Lecture 2: Asymptotic Notations

Slide 2: Outline
- Review of last lecture
- Order of growth
- Asymptotic notations: big O, big Ω, Θ

Slide 3: How to express algorithms?
- Natural language (e.g., English)
- Pseudocode
- Real programming languages
Going down this list, precision increases but ease of expression decreases. Describe the ideas of an algorithm in natural language, and use pseudocode to clarify the sufficiently tricky details.

Slide 4: Insertion sort, high-level view
InsertionSort(A, n) {
  for j = 2 to n {
    ▷ Precondition: A[1..j-1] is sorted
    1. Find the position i in A[1..j-1] such that A[i] ≤ A[j] < A[i+1]
    2. Insert A[j] between A[i] and A[i+1]
    ▷ Postcondition: A[1..j] is sorted
  }
}

Slide 5: Insertion sort, full pseudocode
InsertionSort(A, n) {
  for j = 2 to n {
    key = A[j]
    i = j - 1
    while (i > 0) and (A[i] > key) {
      A[i+1] = A[i]
      i = i - 1
    }
    A[i+1] = key
  }
}
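
As a sanity check, here is a minimal runnable Python transcription of the pseudocode (a sketch added to this transcript, not part of the slides; it uses 0-based indexing, so j runs from 1 to n-1):

def insertion_sort(a):
    """Sort the list a in place, mirroring the slides' pseudocode."""
    for j in range(1, len(a)):           # pseudocode: for j = 2 to n
        key = a[j]
        i = j - 1
        # Shift elements of the sorted prefix a[0..j-1] that exceed key.
        while i >= 0 and a[i] > key:     # pseudocode: (i > 0) and (A[i] > key)
            a[i + 1] = a[i]
            i -= 1
        a[i + 1] = key

data = [5, 2, 4, 6, 1, 3]
insertion_sort(data)
print(data)  # [1, 2, 3, 4, 5, 6]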

Slide 6: Use loop invariants to prove the correctness of loops
A loop invariant (LI) is a formal statement about the variables in your program that holds true throughout the loop. The proof is by induction:
- Initialization: the LI is true prior to the 1st iteration.
- Maintenance: if the LI is true before the j-th iteration, it remains true before the (j+1)-th iteration.
- Termination: when the loop terminates, the invariant shows the correctness of the algorithm.

Slide 7: Loop invariants and the correctness of insertion sort
Claim: at the start of each iteration of the for loop, the subarray A[1..j-1] consists of the elements originally in A[1..j-1], but in sorted order.
Proof: by induction.

Slide 8: Proving correctness using the loop invariant
InsertionSort(A, n) {
  for j = 2 to n {
    key = A[j]
    i = j - 1
    ▷ Insert A[j] into the sorted sequence A[1..j-1]
    while (i > 0) and (A[i] > key) {
      A[i+1] = A[i]
      i = i - 1
    }
    A[i+1] = key
  }
}
Loop invariant: at the start of each iteration of the for loop, the subarray A[1..j-1] consists of the elements originally in A[1..j-1], but in sorted order.

Slide 9: Initialization (same pseudocode as Slide 8)
Before the first iteration, j = 2 and the subarray in question is A[1]. A one-element subarray is trivially sorted, so the loop invariant is true before the loop starts.

Slide 10: Maintenance (same pseudocode as Slide 8)
Assume the loop invariant is true prior to iteration j: A[1..j-1] holds the elements originally in A[1..j-1], in sorted order. The while loop shifts each element of A[1..j-1] that is greater than key one position to the right and places key in the gap, so A[1..j] is sorted and still holds the original elements. Hence the invariant is true before iteration j+1.

Slide 11: Termination (same pseudocode as Slide 8)
The loop terminates when j = n+1. By the invariant, A[1..n] contains all the original elements of A in sorted order. The algorithm is correct!
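
To make the three proof steps concrete, here is a sketch (my own, not from the slides) that instruments the Python version above and asserts the invariant at the start of every iteration. Passing the assertions on one input checks, but of course does not prove, the invariant:

from collections import Counter

def insertion_sort_checked(a):
    original = list(a)
    for j in range(1, len(a)):
        # Invariant: a[0..j-1] holds the elements originally in
        # a[0..j-1], in sorted order.
        assert a[:j] == sorted(a[:j])
        assert Counter(a[:j]) == Counter(original[:j])
        key = a[j]
        i = j - 1
        while i >= 0 and a[i] > key:
            a[i + 1] = a[i]
            i -= 1
        a[i + 1] = key
    # Termination: the whole array is a sorted permutation of the input.
    assert a == sorted(original)
    return a

print(insertion_sort_checked([5, 2, 4, 6, 1, 3]))  # [1, 2, 3, 4, 5, 6]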

Slide 12: Efficiency
Correctness alone is not sufficient:
- Brute-force algorithms exist for most problems, e.g., trying every permutation to sort.
- Problem: they are far too slow!
How do we measure efficiency? Measured running time is not a good metric, since it depends on the input, the computer, the implementation, and so on.

Slide 13: A machine-independent model
We use a generic uniprocessor random-access machine (RAM) model:
- No concurrent operations.
- Each simple operation (e.g., +, -, =, *, an if test or loop test) takes 1 step; entire loops and subroutine calls are not simple operations.
- All memory is equally expensive to access.
- Constant word size, unless we are explicitly manipulating bits.

Slides 14-15: Analysis of insertion sort
InsertionSort(A, n) {
  for j = 2 to n {
    key = A[j]
    i = j - 1
    while (i > 0) and (A[i] > key) {
      A[i+1] = A[i]
      i = i - 1
    }
    A[i+1] = key
  }
}
How many times will each line execute? (The two slides pose this question for different lines of the code.)

Slide 16: Analysis of insertion sort

Statement                                     cost   times executed
InsertionSort(A, n) {
  for j = 2 to n {                            c1     n
    key = A[j]                                c2     n-1
    i = j - 1                                 c3     n-1
    while (i > 0) and (A[i] > key) {          c4     S
      A[i+1] = A[i]                           c5     S-(n-1)
      i = i - 1                               c6     S-(n-1)
    }                                                0
    A[i+1] = key                              c7     n-1
  }                                                  0
}

Here S = t2 + t3 + ... + tn, where tj is the number of while-test evaluations in the j-th iteration of the for loop. In each iteration the while body runs one time fewer than the test, which is why the body lines execute S-(n-1) times in total.

Slide 17: Analyzing insertion sort
T(n) = c1·n + c2·(n-1) + c3·(n-1) + c4·S + c5·(S-(n-1)) + c6·(S-(n-1)) + c7·(n-1) = c8·S + c9·n + c10
What can S be?
- Best case: the inner loop body is never executed, so tj = 1 for all j and S = n-1. Then T(n) = an + b, a linear function: Θ(n).
- Worst case: the inner loop body is executed for all previous elements, so tj = j and S = 2 + 3 + ... + n = n(n+1)/2 - 1. Then T(n) = an² + bn + c, a quadratic function: Θ(n²).
- Average case: we can assume that on average A[j] is inserted into the middle of A[1..j-1], so tj ≈ j/2 and S ≈ n(n+1)/4. T(n) is still a quadratic function: Θ(n²).

Slides 18-20: Analysis of insertion sort, continued
With the cost table of Slide 16 in hand: what are the basic operations, i.e., the most executed lines? They are the lines of the inner loop, the while test (executed S times) and the two shift statements (executed S-(n-1) times each); these dominate the total running time.

Slide 21: What can S be?
S = Σ_{j=2..n} tj, where the inner loop stops when A[i] ≤ key or i = 0.
- Best case?
- Worst case?
- Average case?

Slide 22: Best case
The array is already sorted, so the inner loop stops on its first test (A[i] ≤ key immediately). Then tj = 1 for all j, so S = Σ_{j=2..n} tj = n-1 and T(n) = Θ(n).

Slide 23: Worst case
The array is originally in reverse sorted order, so every element of the sorted prefix must be shifted. Then tj = j, so
S = Σ_{j=2..n} j = 2 + 3 + ... + n = (n-1)(n+2)/2 = Θ(n²).
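
The closed form is a one-liner to verify symbolically; a sketch using sympy (an assumption of this transcript, the slides do not use any tool):

import sympy as sp

n, j = sp.symbols('n j', positive=True, integer=True)
S = sp.summation(j, (j, 2, n))             # sum of j for j = 2..n
print(sp.factor(S))                        # (n - 1)*(n + 2)/2
print(sp.simplify(S - (n*(n + 1)/2 - 1)))  # 0: same as the form on Slide 17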

Slide 24: Average case
The array is in random order, so on average the inner loop stops about halfway through the sorted prefix: tj ≈ j/2. Then
S = Σ_{j=2..n} j/2 = ½ Σ_{j=2..n} j = (n-1)(n+2)/4 = Θ(n²).
What if we use binary search to find the insertion position? Answer: still Θ(n²), because the comparisons get cheaper but the elements still have to be shifted.
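
These formulas are easy to check empirically. The sketch below (mine, not from the slides) counts the while-test evaluations S for sorted, reversed, and shuffled inputs:

import random

def count_while_tests(a):
    """Run insertion sort on a and return S, the number of while-test evaluations."""
    s = 0
    for j in range(1, len(a)):
        key = a[j]
        i = j - 1
        while True:
            s += 1                            # one evaluation of the test
            if not (i >= 0 and a[i] > key):
                break
            a[i + 1] = a[i]
            i -= 1
        a[i + 1] = key
    return s

n = 100
print(count_while_tests(list(range(n))))         # sorted: n-1 = 99
print(count_while_tests(list(range(n, 0, -1))))  # reversed: (n-1)(n+2)/2 = 5049
random.seed(0)
print(count_while_tests(random.sample(range(n), n)))  # random: close to (n-1)(n+2)/4 ≈ 2525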

Slide 25: Asymptotic analysis
Running time depends on the size of the input: a larger array takes more time to sort. Let T(n) be the time taken on an input of size n; asymptotic analysis studies the growth of T(n) as n → ∞. The size of the input is generally defined as the number of input elements, though in some cases this can be tricky.

Slide 26: Asymptotic analysis, continued
We ignore the actual and abstract statement costs; the order of growth is the interesting measure. The highest-order term is what counts: as the input size grows larger, it is the highest-order term that dominates.

Slide 27: Comparison of functions

  n       log2(n)   n       n·log2(n)   n^2     n^3     2^n      n!
  10      3.3       10      33          10^2    10^3    10^3     10^6
  10^2    6.6       10^2    660         10^4    10^6    10^30    10^158
  10^3    10        10^3    10^4        10^6    10^9    -        -
  10^4    13        10^4    10^5        10^8    10^12   -        -
  10^5    17        10^5    10^6        10^10   10^15   -        -
  10^6    20        10^6    10^7        10^12   10^18   -        -

For the omitted entries, even a supercomputer performing 1 trillion operations per second would need longer than 1 billion years.
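
The small entries of this table are easy to regenerate; a sketch (not from the slides):

import math

# Reproduce the first two rows (n = 10 and n = 100). Larger n would
# overflow double precision for 2^n and n!, which itself hints at how
# fast those functions grow.
for n in (10, 100):
    values = [math.log2(n), n, n * math.log2(n), n**2, n**3, 2**n,
              math.factorial(n)]
    print(f"n={n}:", "  ".join(f"{float(v):.1e}" for v in values))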

Slide 28: Order of growth
1 << log2(n) << n << n·log2(n) << n² << n³ << 2^n << n!
(We are slightly abusing the "<<" sign; here it means "has a smaller order of growth than".)

Slide 29: Exact analysis is hard!
Worst-case and average-case running times are difficult to determine precisely, because the details get very complicated. It is often easier to talk about upper and lower bounds on the function.

Slide 30: Asymptotic notations
- O: big-Oh
- Ω: big-Omega
- Θ: Theta
- o: small-oh
- ω: small-omega

Slide 31: Big O
Informally, O(g(n)) is the set of all functions with a smaller or equal order of growth than g(n), within a constant multiple. Saying that f(n) is in O(g(n)) means that g(n) is an asymptotic upper bound of f(n); intuitively, it is like f(n) ≤ g(n).
What is O(n²)? The set of all functions that grow more slowly than, or at the same rate as, n².

Slide 32: Big O examples
So:
- n ∈ O(n²)
- n² ∈ O(n²)
- 1000n ∈ O(n²)
- n² + n ∈ O(n²)
- 100n² + n ∈ O(n²)
But:
- (1/1000)·n³ ∉ O(n²)
Intuitively, O is like ≤.

Slide 33: Small o
Informally, o(g(n)) is the set of all functions with a strictly smaller order of growth than g(n), within a constant multiple.
What is o(n²)? The set of all functions that grow strictly more slowly than n².
So: 1000n ∈ o(n²). But: n² ∉ o(n²).
Intuitively, o is like <.

Slide 34: Big Ω
Informally, Ω(g(n)) is the set of all functions with a larger or equal order of growth than g(n), within a constant multiple. f(n) ∈ Ω(g(n)) means that g(n) is an asymptotic lower bound of f(n); intuitively, it is like g(n) ≤ f(n).
So: n² ∈ Ω(n) and (1/1000)·n² ∈ Ω(n). But: 1000n ∉ Ω(n²).
Intuitively, Ω is like ≥.

Slide 35: Small ω
Informally, ω(g(n)) is the set of all functions with a strictly larger order of growth than g(n), within a constant multiple.
So: n² ∈ ω(n) and (1/1000)·n² ∈ ω(n). But: n² ∉ ω(n²).
Intuitively, ω is like >.

Slide 36: Theta (Θ)
Informally, Θ(g(n)) is the set of all functions with the same order of growth as g(n), within a constant multiple. f(n) ∈ Θ(g(n)) means that g(n) is an asymptotically tight bound of f(n); intuitively, it is like f(n) = g(n).
What is Θ(n²)? The set of all functions that grow at the same rate as n².

Slide 37: Theta examples
So:
- n² ∈ Θ(n²)
- n² + n ∈ Θ(n²)
- 100n² + n ∈ Θ(n²)
- 100n² + log2(n) ∈ Θ(n²)
But:
- n·log2(n) ∉ Θ(n²)
- 1000n ∉ Θ(n²)
- (1/1000)·n³ ∉ Θ(n²)
Intuitively, Θ is like =.

Slide 38: Tricky cases
- How about sqrt(n) and log2(n)?
- How about log2(n) and log10(n)?
- How about 2^n and 3^n?
- How about 3^n and n!?

Slide 39: Big-Oh, formally
Definition: O(g(n)) = { f(n) : there exist positive constants c and n0 such that 0 ≤ f(n) ≤ c·g(n) for all n ≥ n0 }.
Equivalently: lim_{n→∞} g(n)/f(n) > 0, if the limit exists.
Abuse of notation (for convenience): f(n) = O(g(n)) actually means f(n) ∈ O(g(n)).

Slide 40: A Big-Oh proof
Claim: f(n) = 3n² + 10n + 5 ∈ O(n²).
(Hint: to prove this claim by the definition, we need to find positive constants c and n0 such that f(n) ≤ c·n² for all n ≥ n0. Note that you only need to find one concrete pair c, n0 satisfying the condition, but the inequality must hold for every n ≥ n0, so do not simply plug in a concrete value of n and show that the inequality holds there.)
Proof:
3n² + 10n + 5 ≤ 3n² + 10n² + 5 for all n ≥ 1
              ≤ 3n² + 10n² + 5n² for all n ≥ 1
              = 18n² for all n ≥ 1.
If we let c = 18 and n0 = 1, then f(n) ≤ c·n² for all n ≥ n0. Therefore, by definition, f(n) = O(n²).
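
Witness constants like these can be spot-checked numerically. Below is a small helper (my own sketch, not from the slides); passing the check is evidence that the witnesses work, not a substitute for the proof:

def check_upper_bound(f, g, c, n0, n_max=10_000):
    """Spot-check f(n) <= c*g(n) for n0 <= n <= n_max."""
    return all(f(n) <= c * g(n) for n in range(n0, n_max + 1))

# Claim of this slide: 3n^2 + 10n + 5 <= 18*n^2 for n >= 1.
print(check_upper_bound(lambda n: 3*n**2 + 10*n + 5, lambda n: n**2, c=18, n0=1))  # True

# The same helper checks a lower bound by swapping the roles of f and g,
# e.g. the Slide 42 claim n <= n^2/10 for n >= 10.
print(check_upper_bound(lambda n: n, lambda n: n**2 / 10, c=1, n0=10))  # True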

Slide 41: Big-Omega, formally
Definition: Ω(g(n)) = { f(n) : there exist positive constants c and n0 such that 0 ≤ c·g(n) ≤ f(n) for all n ≥ n0 }.
Equivalently: lim_{n→∞} f(n)/g(n) > 0, if the limit exists.
Abuse of notation (for convenience): f(n) = Ω(g(n)) actually means f(n) ∈ Ω(g(n)).

Slide 42: A Big-Omega proof
Claim: f(n) = n²/10 = Ω(n).
Proof by definition: with f(n) = n²/10 and g(n) = n, we need to find c and n0 that satisfy the definition of f(n) ∈ Ω(g(n)), i.e., f(n) ≥ c·g(n) for all n ≥ n0.
Proof: n ≤ n²/10 when n ≥ 10. If we let c = 1 and n0 = 10, then f(n) ≥ c·n for all n ≥ n0. Therefore, by definition, n²/10 = Ω(n).

Slide 43: Theta, formally
Definition: Θ(g(n)) = { f(n) : there exist positive constants c1, c2, and n0 such that 0 ≤ c1·g(n) ≤ f(n) ≤ c2·g(n) for all n ≥ n0 }.
Equivalently: f(n) = O(g(n)) and f(n) = Ω(g(n)).
Abuse of notation (for convenience): f(n) = Θ(g(n)) actually means f(n) ∈ Θ(g(n)). Θ(1) means constant time.

Slide 44: A Theta proof
Claim: f(n) = 2n² + n = Θ(n²).
Proof by definition: we need to find three constants c1, c2, and n0 such that c1·n² ≤ 2n² + n ≤ c2·n² for all n ≥ n0. A simple solution is c1 = 2, c2 = 3, and n0 = 1.

Slide 45: More examples
Prove that n² + 3n + lg n is in O(n²). We need c and n0 such that n² + 3n + lg n ≤ c·n² for all n ≥ n0.
Proof:
n² + 3n + lg n ≤ n² + 3n² + n for n ≥ 1
               ≤ n² + 3n² + n² for n ≥ 1
               = 5n² for n ≥ 1.
Therefore, by definition, n² + 3n + lg n ∈ O(n²).
(Alternatively: n² + 3n + lg n ≤ n² + n² + n² = 3n² for n ≥ 10.)

Slide 46: More examples, continued
Prove that n² + 3n + lg n is in Ω(n²). We want c and n0 such that n² + 3n + lg n ≥ c·n² for all n ≥ n0. Indeed, n² + 3n + lg n ≥ n² for n ≥ 1, so c = 1 and n0 = 1 work.
Since n² + 3n + lg n = O(n²) and n² + 3n + lg n = Ω(n²), it follows that n² + 3n + lg n = Θ(n²).

Slide 47: O, Ω, and Θ
Each definition involves a constant n0 beyond which it must be satisfied: we do not care about small values of n.

Slide 48: Using limits to compare orders of growth
Consider lim_{n→∞} f(n)/g(n):
- If the limit is 0, then f(n) ∈ o(g(n)), and hence f(n) ∈ O(g(n)).
- If the limit is a constant c > 0, then f(n) ∈ Θ(g(n)), and hence f(n) ∈ O(g(n)) and f(n) ∈ Ω(g(n)).
- If the limit is ∞, then f(n) ∈ ω(g(n)), and hence f(n) ∈ Ω(g(n)).
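
The limit test is easy to automate with a computer algebra system; a sketch using sympy (my own addition, the slides do not mention any tool):

import sympy as sp

n = sp.symbols('n', positive=True)

def growth_compare(f, g):
    """Classify f against g via lim f/g as n -> oo (assumes the limit exists)."""
    L = sp.limit(f / g, n, sp.oo)
    if L == 0:
        return "f in o(g), hence O(g)"
    if L == sp.oo:
        return "f in omega(g), hence Omega(g)"
    return f"f in Theta(g), limit = {L}"

print(growth_compare(sp.log(n, 2), sp.log(n, 10)))  # Theta: limit = log(10)/log(2)
print(growth_compare(2**n, 3**n))                   # o: (2/3)^n -> 0
print(growth_compare(sp.log(n), sp.sqrt(n)))        # o: log n grows more slowly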

Slide 49: Logarithms
Compare log2(n) and log10(n). By the change-of-base identity log_a(b) = log_c(b) / log_c(a):
log2(n) = log10(n) / log10(2) ≈ 3.3·log10(n).
Therefore lim_{n→∞} (log2(n) / log10(n)) ≈ 3.3, a positive constant, and so log2(n) = Θ(log10(n)).

Slide 50: Compare 2^n and 3^n
lim_{n→∞} 2^n / 3^n = lim_{n→∞} (2/3)^n = 0.
Therefore 2^n ∈ o(3^n), and 3^n ∈ ω(2^n).
How about 2^n and 2^(n+1)? 2^n / 2^(n+1) = 1/2, a positive constant, therefore 2^n = Θ(2^(n+1)).

Slide 51: L'Hopital's rule
If both lim_{n→∞} f(n) and lim_{n→∞} g(n) are ∞, or both are 0, then
lim_{n→∞} f(n)/g(n) = lim_{n→∞} f'(n)/g'(n).
You can apply this transformation as many times as you want, as long as the condition holds.

Slide 52: Compare n^0.5 and log n
lim_{n→∞} n^0.5 / log n = ?
Since (n^0.5)' = 0.5·n^(-0.5) and (log n)' = 1/n, L'Hopital's rule gives
lim_{n→∞} n^0.5 / log n = lim_{n→∞} 0.5·n^(-0.5) / (1/n) = lim_{n→∞} 0.5·n^0.5 = ∞.
Therefore, log n ∈ o(n^0.5). In fact, log n ∈ o(n^ε) for any ε > 0.

Slide 53: Stirling's formula
(The formula appeared as an image; the standard statement is)
n! = √(2πn) · (n/e)^n · (1 + Θ(1/n)),
where √(2π) is the constant factor referred to on the slide.

Slide 54: Compare 2^n and n!
By Stirling's formula, n! grows like √(2πn)·(n/e)^n, so 2^n / n! → 0 as n → ∞. Therefore, 2^n = o(n!).
Compare n^n and n!: n^n / n! grows like e^n / √(2πn) → ∞. Therefore, n^n = ω(n!).
How about log(n!)?
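
The last question can also be answered with Stirling's formula; a short derivation (worked out here, not shown on the slide):

log(n!) = log(√(2πn)) + n·log(n) - n·log(e) + log(1 + Θ(1/n)) = Θ(n·log n),

since the n·log(n) term dominates. The same bound follows more simply from n! ≤ n^n and n! ≥ (n/2)^(n/2).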

Slide 56: More advanced dominance ranking
(The slide shows a dominance-ranking chart; the image is not reproduced in the transcript.)

Slide 57: Asymptotic notations, summary
- O: big-Oh, intuitively like ≤
- o: small-oh, intuitively like <
- Ω: big-Omega, intuitively like ≥
- ω: small-omega, intuitively like >
- Θ: Theta, intuitively like =

