2IL50 Data Structures Fall 2015 Lecture 2: Analysis of Algorithms.

1 2IL50 Data Structures Fall 2015 Lecture 2: Analysis of Algorithms

2 Analysis of algorithms: the formal way …

3 Analysis of algorithms
Can we say something about the running time of an algorithm without implementing and testing it?

InsertionSort(A)
1. initialize: sort A[1]
2. for j = 2 to A.length
3.   do key = A[j]
4.      i = j - 1
5.      while i > 0 and A[i] > key
6.        do A[i+1] = A[i]
7.           i = i - 1
8.      A[i+1] = key
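The pseudocode above can be sketched as runnable Python (0-indexed, unlike the 1-indexed slides; the function name is mine):

```python
def insertion_sort(a):
    """In-place insertion sort, mirroring the 1-indexed pseudocode above."""
    for j in range(1, len(a)):           # pseudocode: for j = 2 to A.length
        key = a[j]
        i = j - 1
        while i >= 0 and a[i] > key:     # shift larger elements one slot right
            a[i + 1] = a[i]
            i -= 1
        a[i + 1] = key                   # insert key into the gap
    return a

print(insertion_sort([14, 3, 8, 1, 7]))
```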

4 Analysis of algorithms
Analyze the running time as a function of n (the number of input elements):
 best case
 average case
 worst case
Elementary operations: add, subtract, multiply, divide, load, store, copy, conditional and unconditional branch, return, …
An algorithm has worst case running time T(n) if for any input of size n the maximal number of elementary operations executed is T(n).

5 Linear Search
Input: increasing sequence of n numbers A = ⟨a₁, a₂, …, aₙ⟩ and value v
Output: an index i such that A[i] = v, or NIL if v is not in A

LinearSearch(A, v)
1. for i = 1 to n
2.   do if A[i] = v
3.        then return i
4. return NIL

Running time: best case 1, average case n/2 (if successful), worst case n
Example: A = ⟨1, 3, 4, 7, 8, 14, 17, 21, 28, 35⟩, v = 7
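A Python sketch of the pseudocode (0-indexed; NIL is rendered as None, and the function name is mine):

```python
def linear_search(a, v):
    """Return an index i with a[i] == v, or None if v is not in a."""
    for i, x in enumerate(a):
        if x == v:
            return i
    return None

a = [1, 3, 4, 7, 8, 14, 17, 21, 28, 35]
print(linear_search(a, 7))   # found at 0-based index 3
print(linear_search(a, 5))   # not present
```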

6 Binary Search
Input: increasing sequence of n numbers A = ⟨a₁, a₂, …, aₙ⟩ and value v
Output: an index i such that A[i] = v, or NIL if v is not in A

BinarySearch(A, v)
1. x = 0
2. y = n + 1
3. while x + 1 < y
4.   do h = floor((x + y)/2)
5.      if A[h] ≤ v then x = h
6.      else y = h
7. if A[x] = v then return x else return NIL

Running time: best case log n, average case log n, worst case log n
Example: A = ⟨1, 3, 4, 7, 8, 14, 17, 21, 28, 35⟩, v = 7
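A Python sketch that mirrors the x/y invariant of the pseudocode above (0-indexed, so the sentinels become -1 and len(a); the invariant maintained is a[x] ≤ v < a[y]):

```python
def binary_search(a, v):
    """a must be sorted increasingly; return an index of v, or None."""
    x, y = -1, len(a)          # sentinels; invariant: a[x] <= v < a[y]
    while x + 1 < y:
        h = (x + y) // 2       # pseudocode: h = floor((x + y)/2)
        if a[h] <= v:
            x = h
        else:
            y = h
    return x if x >= 0 and a[x] == v else None

a = [1, 3, 4, 7, 8, 14, 17, 21, 28, 35]
print(binary_search(a, 7))   # found at 0-based index 3
```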

7 Analysis of algorithms: example
InsertionSort: 15n² + 7n - 2
MergeSort: 300 n lg n + 50n

                  n = 10         n = 100        n = 1000      n = 1,000,000
InsertionSort     1568           150698         1.5 × 10⁷     1.5 × 10¹³
MergeSort         10466          204316         3.0 × 10⁶     6 × 10⁹
                  InsertionSort  InsertionSort  MergeSort     MergeSort
                  6× faster      1.35× faster   5× faster     2500× faster!

The rate of growth of the running time as a function of the input is essential!
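The table's numbers can be reproduced by evaluating the slide's two cost functions (the function names are mine):

```python
import math

def insertion_cost(n):
    """The slide's model for InsertionSort: 15n^2 + 7n - 2."""
    return 15 * n**2 + 7 * n - 2

def merge_cost(n):
    """The slide's model for MergeSort: 300 n lg n + 50n."""
    return 300 * n * math.log2(n) + 50 * n

for n in (10, 100, 1000, 10**6):
    print(n, insertion_cost(n), round(merge_cost(n)))
```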

8 Θ-notation
Intuition: concentrate on the leading term, ignore constants
19n³ + 17n² - 3n       becomes Θ(n³)
2n lg n + 5n^1.1 - 5   becomes Θ(n^1.1)
n - ¾ n √n             becomes ---   (the leading term -¾ n√n is negative for large n, so no Θ-bound applies)

9 Θ-notation
Let g(n) : ℕ ↦ ℕ be a function. Then we have
Θ(g(n)) = { f(n) : there exist positive constants c₁, c₂, and n₀ such that
            0 ≤ c₁g(n) ≤ f(n) ≤ c₂g(n) for all n ≥ n₀ }
"Θ(g(n)) is the set of functions that grow as fast as g(n)"

10 Θ-notation
Let g(n) : ℕ ↦ ℕ be a function. Then we have
Θ(g(n)) = { f(n) : there exist positive constants c₁, c₂, and n₀ such that
            0 ≤ c₁g(n) ≤ f(n) ≤ c₂g(n) for all n ≥ n₀ }
[figure: f(n) lies between c₁g(n) and c₂g(n) for all n ≥ n₀]
Notation: f(n) = Θ(g(n))

11 Θ-notation
Claim: 19n³ + 17n² - 3n = Θ(n³)
Proof: Choose c₁ = 19, c₂ = 36, and n₀ = 1. Then we have for all n ≥ n₀:
  0 ≤ c₁n³ = 19n³                (trivial)
          ≤ 19n³ + 17n² - 3n     (since 17n² > 3n for n ≥ 1)
          ≤ 19n³ + 17n³          (since 17n² ≤ 17n³ for n ≥ 1)
          = c₂n³  ■
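The constants from the proof can be sanity-checked numerically; a spot check over a finite range is of course no proof, but it catches arithmetic slips:

```python
def f(n):
    """The function from the claim: 19n^3 + 17n^2 - 3n."""
    return 19 * n**3 + 17 * n**2 - 3 * n

# Constants chosen in the proof above.
c1, c2, n0 = 19, 36, 1
assert all(c1 * n**3 <= f(n) <= c2 * n**3 for n in range(n0, 1000))
print("c1*n^3 <= f(n) <= c2*n^3 holds for n in [1, 999]")
```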

12 Θ-notation
Claim: 19n³ + 17n² - 3n ≠ Θ(n²)
Proof: Assume that there are positive constants c₁, c₂, and n₀ such that for all n ≥ n₀:
  0 ≤ c₁n² ≤ 19n³ + 17n² - 3n ≤ c₂n²
Since 19n³ + 17n² - 3n ≤ c₂n² implies
  19n³ ≤ c₂n² + 3n - 17n² ≤ c₂n²   (because 3n - 17n² ≤ 0 for n ≥ 1)
we would have 19n ≤ c₂ for all n ≥ n₀. But 19n grows without bound, so no constant c₂ can satisfy this: a contradiction. ■

13 O-notation
Let g(n) : ℕ ↦ ℕ be a function. Then we have
O(g(n)) = { f(n) : there exist positive constants c and n₀ such that
            0 ≤ f(n) ≤ cg(n) for all n ≥ n₀ }
"O(g(n)) is the set of functions that grow at most as fast as g(n)"

14 O-notation
Let g(n) : ℕ ↦ ℕ be a function. Then we have
O(g(n)) = { f(n) : there exist positive constants c and n₀ such that
            0 ≤ f(n) ≤ cg(n) for all n ≥ n₀ }
[figure: f(n) stays below cg(n) for all n ≥ n₀]
Notation: f(n) = O(g(n))

15 Ω-notation
Let g(n) : ℕ ↦ ℕ be a function. Then we have
Ω(g(n)) = { f(n) : there exist positive constants c and n₀ such that
            0 ≤ cg(n) ≤ f(n) for all n ≥ n₀ }
"Ω(g(n)) is the set of functions that grow at least as fast as g(n)"

16 Ω-notation
Let g(n) : ℕ ↦ ℕ be a function. Then we have
Ω(g(n)) = { f(n) : there exist positive constants c and n₀ such that
            0 ≤ cg(n) ≤ f(n) for all n ≥ n₀ }
[figure: f(n) stays above cg(n) for all n ≥ n₀]
Notation: f(n) = Ω(g(n))

17 Asymptotic notation
Θ(…) is an asymptotically tight bound   ("asymptotically equal")
O(…) is an asymptotic upper bound       ("asymptotically smaller or equal")
Ω(…) is an asymptotic lower bound       ("asymptotically greater or equal")
Other asymptotic notation:
o(…) → "grows strictly slower than"
ω(…) → "grows strictly faster than"

18 More notation …
f(n) = n³ + Θ(n²) means: there is a function g(n) such that f(n) = n³ + g(n) and g(n) = Θ(n²)
f(n) = Σ_{i=1}^{n} O(i) means: there is one function g(i) such that f(n) = Σ_{i=1}^{n} g(i) and g(i) = O(i)
O(1) or Θ(1) means: a constant
2n² + O(n) = Θ(n²) means: for each function g(n) with g(n) = O(n) we have 2n² + g(n) = Θ(n²)

19 Quiz
1. O(1) + O(1) = O(1)   true
2. O(1) + … + O(1) = O(1)   false
3. (formula missing from the transcript)
4. O(n²) ⊆ O(n³)   true
5. O(n³) ⊆ O(n²)   false
6. Θ(n²) ⊆ O(n³)   true
7. An algorithm with worst case running time O(n log n) is always slower than an algorithm with worst case running time O(n) if n is sufficiently large.   false

20 Quiz
8.  n log² n = Θ(n log n)   false
9.  n log² n = Ω(n log n)   true
10. n log² n = O(n^{4/3})   true
11. O(2ⁿ) ⊆ O(3ⁿ)   true
12. O(2ⁿ) ⊆ Θ(3ⁿ)   false

21 Analysis of algorithms

22 Analysis of InsertionSort
InsertionSort(A)
1. initialize: sort A[1]
2. for j = 2 to A.length
3.   do key = A[j]
4.      i = j - 1
5.      while i > 0 and A[i] > key
6.        do A[i+1] = A[i]
7.           i = i - 1
8.      A[i+1] = key

Get as tight a bound as possible on the worst case running time ➨ a lower and an upper bound for the worst case running time.
Upper bound: analyze the worst case number of elementary operations.
Lower bound: give a "bad" input example.

23 Analysis of InsertionSort
InsertionSort(A)
1. initialize: sort A[1]
2. for j = 2 to A.length
3.   do key = A[j]
4.      i = j - 1
5.      while i > 0 and A[i] > key
6.        do A[i+1] = A[i]
7.           i = i - 1
8.      A[i+1] = key

Upper bound: Let T(n) be the worst case running time of InsertionSort on an array of length n. In iteration j of the outer loop, lines 3-4 take O(1), the while-loop runs at most j - 1 times in the worst case at O(1) per iteration, and line 8 takes O(1). We have
  T(n) = O(1) + Σ_{j=2}^{n} { O(1) + (j-1)·O(1) + O(1) } = Σ_{j=2}^{n} O(j) = O(n²)
Lower bound: an array sorted in decreasing order forces the while-loop to run j - 1 times for every j ➨ Ω(n²)
The worst case running time of InsertionSort is Θ(n²).
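The lower-bound argument can be checked empirically with a sketch (the helper name is mine) that counts how often the while-loop body executes; a decreasing input yields exactly n(n-1)/2 executions:

```python
def shift_count(a):
    """Run insertion sort on a copy of a; count while-loop body executions."""
    a = list(a)
    count = 0
    for j in range(1, len(a)):
        key = a[j]
        i = j - 1
        while i >= 0 and a[i] > key:
            a[i + 1] = a[i]
            i -= 1
            count += 1
        a[i + 1] = key
    return count

n = 100
print(shift_count(range(n, 0, -1)), n * (n - 1) // 2)  # decreasing input hits the worst case
```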

24 Analysis of MergeSort
MergeSort(A)  // divide-and-conquer algorithm that sorts array A[1..n]
1. if A.length = 1
2.   then skip                                            O(1)
3.   else
4.     n = A.length; n₁ = floor(n/2); n₂ = ceil(n/2)      O(1)
5.     copy A[1..n₁] to auxiliary array A₁[1..n₁]         O(n)
6.     copy A[n₁+1..n] to auxiliary array A₂[1..n₂]       O(n)
7.     MergeSort(A₁); MergeSort(A₂)                       T(⌊n/2⌋) + T(⌈n/2⌉)
8.     Merge(A, A₁, A₂)                                   ??
MergeSort is a recursive algorithm ➨ running time analysis leads to a recurrence
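The slides do not show Merge; the sketch below (my own, returning a new list rather than sorting in place) assumes the standard linear-time merge of two sorted halves:

```python
def merge_sort(a):
    """Sort list a; mirrors the divide-and-conquer pseudocode above."""
    if len(a) <= 1:
        return a
    n1 = len(a) // 2                  # n1 = floor(n/2)
    a1 = merge_sort(a[:n1])           # A1 = A[1..n1]
    a2 = merge_sort(a[n1:])           # A2 = A[n1+1..n]
    # Merge step (the slide's Merge(A, A1, A2)): combine two sorted halves.
    out, i, j = [], 0, 0
    while i < len(a1) and j < len(a2):
        if a1[i] <= a2[j]:
            out.append(a1[i]); i += 1
        else:
            out.append(a2[j]); j += 1
    return out + a1[i:] + a2[j:]      # append the leftover tail

print(merge_sort([5, 2, 9, 1, 5, 6]))
```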

25 Analysis of MergeSort
Let T(n) be the worst case running time of MergeSort on an array of length n. We have
  T(n) = O(1)                            if n = 1
  T(n) = T(⌊n/2⌋) + T(⌈n/2⌉) + Θ(n)      if n > 1
The base case is frequently omitted since it (nearly) always holds; the recursive case is often written as 2T(n/2) + Θ(n).

26 Solving recurrences

27 Solving recurrences
 Easiest: the master theorem. Caveat: not always applicable.
 Alternatively: guess the solution and use the substitution method to prove that your guess is correct.
 How to guess:
  1. expand the recursion
  2. draw a recursion tree

28 The master theorem
Let a ≥ 1 and b > 1 be constants, let f(n) be a function, and let T(n) be defined on the nonnegative integers by the recurrence
  T(n) = aT(n/b) + f(n)     (n/b can be rounded up or down)
Then we have:
1. If f(n) = O(n^{log_b a - ε}) for some constant ε > 0, then T(n) = Θ(n^{log_b a}).
2. If f(n) = Θ(n^{log_b a}), then T(n) = Θ(n^{log_b a} log n).
3. If f(n) = Ω(n^{log_b a + ε}) for some constant ε > 0, and if af(n/b) ≤ cf(n) for some constant c < 1 and all sufficiently large n, then T(n) = Θ(f(n)).

29 The master theorem: Example
T(n) = 4T(n/2) + Θ(n³)
Master theorem with a = 4, b = 2, and f(n) = n³
log_b a = log₂ 4 = 2 ➨ n³ = f(n) = Ω(n^{log_b a + ε}) = Ω(n^{2+ε}) with, for example, ε = 1
Case 3 of the master theorem gives T(n) = Θ(n³), if the regularity condition holds:
choose c = ½ and n₀ = 1 ➨ af(n/b) = 4(n/2)³ = n³/2 ≤ cf(n) for n ≥ n₀
➨ T(n) = Θ(n³)
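The conclusion can be sanity-checked numerically (my own sketch; the base cost T(1) = 1 is an arbitrary choice): if T(n) = Θ(n³), then T(n)/n³ should approach a constant for large n.

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    """The recurrence T(n) = 4*T(n//2) + n**3, with T(1) = 1."""
    if n <= 1:
        return 1
    return 4 * T(n // 2) + n**3

# For powers of two the ratio converges to 2 (with this base cost).
for k in (4, 8, 12, 16):
    n = 2**k
    print(n, T(n) / n**3)
```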

30 The substitution method
 The master theorem does not always apply.
 In those cases, use the substitution method:
  1. Guess the form of the solution.
  2. Use induction to find the constants and show that the solution works.
 Use expansion or a recursion-tree to guess a good solution.

31 Recursion-trees
T(n) = 2T(n/2) + n
[recursion tree: the root costs n, its two children cost n/2 each, the four nodes at depth 2 cost n/4 each, the 2^i nodes at depth i cost n/2^i each, and the leaves cost Θ(1) each]

32 Recursion-trees
T(n) = 2T(n/2) + n
Cost per level of the recursion tree:
  depth 0: n
  depth 1: 2 · (n/2) = n
  depth 2: 4 · (n/4) = n
  depth i: 2^i · (n/2^i) = n
  leaves:  n · Θ(1) = Θ(n)
log n levels of cost n plus the leaf level ➨ T(n) = Θ(n log n)
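The level-by-level count can be verified numerically (my own sketch, with base cost T(1) = 1): for powers of two, expanding the tree gives T(2^k) = 2^k · (k + 1), which is exactly Θ(n log n).

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    """The recurrence T(n) = 2*T(n//2) + n, with T(1) = 1."""
    if n <= 1:
        return 1
    return 2 * T(n // 2) + n

# Compare against the closed form 2^k * (k + 1) for n = 2^k.
for k in (3, 6, 10):
    n = 2**k
    print(n, T(n), n * (k + 1))
```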

33 Recursion-trees
T(n) = 2T(n/2) + n²
[recursion tree: the root costs n², its two children cost (n/2)² each, the four nodes at depth 2 cost (n/4)² each, the 2^i nodes at depth i cost (n/2^i)² each, and the leaves cost Θ(1) each]

34 Recursion-trees
T(n) = 2T(n/2) + n²
Cost per level of the recursion tree:
  depth 0: n²
  depth 1: 2 · (n/2)² = n²/2
  depth 2: 4 · (n/4)² = n²/4
  depth i: 2^i · (n/2^i)² = n²/2^i
  leaves:  n · Θ(1) = Θ(n)
The level costs form a decreasing geometric series dominated by the root ➨ T(n) = Θ(n²)

35 Recursion-trees
T(n) = 4T(n/2) + n
[recursion tree: the root costs n, its four children cost n/2 each, the sixteen nodes at depth 2 cost n/4 each, and the leaves cost Θ(1) each]

36 Recursion-trees
T(n) = 4T(n/2) + n
Cost per level of the recursion tree:
  depth 0: n
  depth 1: 4 · (n/2) = 2n
  depth 2: 16 · (n/4) = 4n
  depth i: 4^i · (n/2^i) = 2^i n
  leaves:  n² · Θ(1) = Θ(n²)
The level costs form an increasing geometric series dominated by the leaves ➨ T(n) = Θ(n²)

37 The substitution method
T(n) = 2 if n = 1, and T(n) = 2T(⌊n/2⌋) + n if n > 1
Claim: T(n) = O(n log n)
Proof: by induction on n.
To show: there are constants c and n₀ such that T(n) ≤ c n log n for all n ≥ n₀.
n = 1: T(1) = 2, but c · 1 · log 1 = 0, so n = 1 cannot serve as a base case ➨ take n₀ = 2.
Base cases:
  n = 2: T(2) = 2T(1) + 2 = 2·2 + 2 = 6 = c · 2 log 2 for c = 3
  n = 3: T(3) = 2T(1) + 3 = 2·2 + 3 = 7 ≤ c · 3 log 3
Need more base cases? ⌊3/2⌋ = 1 and ⌊4/2⌋ = 2, so the inductive step for n = 3 would fall back on the invalid case n = 1 ➨ 3 must also be a base case; n₀ = 2.

38 The substitution method
T(n) = 2 if n = 1, and T(n) = 2T(⌊n/2⌋) + n if n > 1
Claim: T(n) = O(n log n)
Proof: by induction on n.
To show: there are constants c and n₀ such that T(n) ≤ c n log n for all n ≥ n₀. Choose c = 3 and n₀ = 2.
Inductive step: n > 3
  T(n) = 2T(⌊n/2⌋) + n
       ≤ 2 c ⌊n/2⌋ log ⌊n/2⌋ + n    (induction hypothesis)
       ≤ c n ((log n) - 1) + n
       ≤ c n log n  ■
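The chosen constants can be sanity-checked numerically; a finite check is not a proof, but it quickly exposes a wrong guess for c or n₀:

```python
import math
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    """The slide's recurrence: T(1) = 2, T(n) = 2*T(floor(n/2)) + n."""
    if n == 1:
        return 2
    return 2 * T(n // 2) + n

# Check T(n) <= 3 * n * log2(n) (c = 3, n0 = 2) over a large range.
assert all(T(n) <= 3 * n * math.log2(n) for n in range(2, 5001))
print("T(n) <= 3 n log n holds for n in [2, 5000]")
```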

39 The substitution method
T(n) = Θ(1) if n = 1, and T(n) = 2T(⌊n/2⌋) + n if n > 1
Claim: T(n) = O(n)   (this claim is false!)
"Proof": by induction on n.
Base case: n = n₀: T(2) = 2T(1) + 2 = 2c + 2 = O(2)
Inductive step: n > n₀:
  T(n) = 2T(⌊n/2⌋) + n = 2·O(⌊n/2⌋) + n    (ind. hyp.)
       = O(n)  ■ ?!
The "proof" is bogus: the constant hidden in the O must not change with n. Never use O, Θ, or Ω in a proof by induction!

40 Analysis of algorithms one more example …

41 Example
Example(A)  // A is an array of length n
1. n = A.length
2. if n = 1
3.   then return A[1]
4.   else begin
5.     copy A[1…n/2] to auxiliary array B[1…n/2]
6.     copy A[1…n/2] to auxiliary array C[1…n/2]
7.     b = Example(B); c = Example(C)
8.     for i = 1 to n
9.       do for j = 1 to i
10.        do A[i] = A[j]
11.    return 43
12.    end

42 Example
Example(A)  // A is an array of length n
1. n = A.length
2. if n = 1
3.   then return A[1]
4.   else begin
5.     copy A[1…n/2] to auxiliary array B[1…n/2]
6.     copy A[1…n/2] to auxiliary array C[1…n/2]
7.     b = Example(B); c = Example(C)
8.     for i = 1 to n
9.       do for j = 1 to i
10.        do A[i] = A[j]
11.    return 43
12.    end

Let T(n) be the worst case running time of Example on an array of length n.
Lines 1, 2, 3, 4, 11, and 12 take Θ(1) time.
Lines 5 and 6 take Θ(n) time.
Line 7 takes Θ(1) + 2T(n/2) time.
Lines 8 until 10 take Σ_{i=1}^{n} Σ_{j=1}^{i} Θ(1) = Θ(n²) time.
If n = 1 lines 1, 2, and 3 are executed, else lines 1, 2, and 4 until 12 are executed.
➨ T(n) = Θ(1) if n = 1, and T(n) = 2T(n/2) + Θ(n²) if n > 1
➨ use the master theorem: a = 2, b = 2, f(n) = n² = Ω(n^{log_2 2 + ε}) for ε = 1, and the regularity condition holds ➨ T(n) = Θ(n²)
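The Θ(n²) bound for lines 8 until 10 can be checked empirically; this sketch (the helper name is mine) counts how often line 10 executes, which should equal Σ_{i=1}^{n} i = n(n+1)/2:

```python
def loop_count(n):
    """Count executions of line 10: for i = 1..n, for j = 1..i."""
    count = 0
    for i in range(1, n + 1):
        for j in range(1, i + 1):
            count += 1
    return count

for n in (10, 100):
    print(n, loop_count(n), n * (n + 1) // 2)
```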

43 Tips
 Analysis of recursive algorithms: find the recurrence and solve with the master theorem if possible.
 Analysis of loops: summations.
 Some standard recurrences and sums:
  T(n) = 2T(n/2) + Θ(n)  ➨  T(n) = Θ(n log n)
  Σ_{i=1}^{n} i = ½ n(n+1) = Θ(n²)
  Σ_{i=1}^{n} i² = Θ(n³)

44 Announcements
 If you are not in a group yet, talk to me immediately!
 Assignment A1 is due on Sunday!
 You can ask questions during the tutorial on Wednesday.
 Email your assignment as .pdf to your tutor, not to me.

