E.G.M. Petrakis: Algorithm Analysis

1.
- Algorithms that are equally correct can vary in their utilization of computational resources (time and memory)
  - a slow program is likely not to be used
  - a program that demands too much memory may not be executable on the available machines

2. Memory - Time
- It is important to be able to predict how an algorithm behaves under a range of possible conditions
  - usually different inputs
- The emphasis is on time rather than on space efficiency
  - memory is cheap!

3. Running Time
- The time to solve a problem increases with the size of the input
  - measure the rate at which the time increases (e.g., linearly, quadratically, exponentially)
- This measure focuses on intrinsic characteristics of the algorithm rather than on incidental factors
  - e.g., computer speed, coding tricks, optimizations, quality of the compiler

4. Additional Considerations
- Sometimes, simple but less efficient algorithms are preferred over more efficient ones
  - the more efficient algorithms are not always easy to implement
  - the program is to be changed soon
  - time may not be critical
  - the program will run only a few times (e.g., once a year)
  - the size of the input is always very small

5. Bubble sort
- Count the number of steps executed by the algorithm
  - step: basic operation
  - ignore operations that are executed a constant number of times

    for (i = 0; i < n - 1; i++)
        for (j = n - 1; j > i; j--)
            if (A[j - 1] > A[j])
                swap(A[j], A[j - 1]);   // step

- T(n) = c n^2
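As a sanity check, the loops above can be sketched in Python (an illustrative translation, not the lecture's own code), counting comparisons to make the c n^2 step count visible:

```python
def bubble_sort(A):
    """Bubble sort as on the slide: each inner pass bubbles the smallest
    remaining element toward the front; returns the comparison count."""
    n = len(A)
    steps = 0
    for i in range(n - 1):
        for j in range(n - 1, i, -1):
            steps += 1                        # one basic operation
            if A[j - 1] > A[j]:
                A[j], A[j - 1] = A[j - 1], A[j]   # swap
    return steps

data = [5, 2, 9, 1, 7, 3]
steps = bubble_sort(data)
print(data)    # sorted in place: [1, 2, 3, 5, 7, 9]
print(steps)   # n(n-1)/2 = 15 comparisons for n = 6
```

The comparison count n(n-1)/2 grows like n^2/2, which is the c n^2 of the slide with the constant c absorbed by the O notation.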

6. Asymptotic Algorithm Analysis
- Measures the efficiency of a program as the size of the input becomes large
  - focuses on the "growth rate": the rate at which the running time of an algorithm grows as the size of the input becomes big
- Complexity => growth rate
  - focuses on the upper and lower bounds of the growth rate
  - ignores all constant factors

7. Growth Rate
- Depending on its order in n, an algorithm can be of
  - linear complexity: T(n) = c n
  - quadratic complexity: T(n) = c n^2
  - polynomial complexity: T(n) = c n^k
- The difference between an algorithm whose running time is T(n) = 10n and one whose running time is T(n) = 2n^2 is tremendous
  - for n > 5 the linear algorithm is much faster despite its larger constant
  - it is slower only for small n but, for such n, both algorithms are very fast
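A quick numeric comparison (illustrative Python, not part of the original slides) shows where the two example curves cross:

```python
# Compare T1(n) = 10n (linear, larger constant) with T2(n) = 2n^2.
for n in [1, 3, 5, 6, 10, 100]:
    t_lin, t_quad = 10 * n, 2 * n * n
    winner = "linear" if t_lin < t_quad else "quadratic or tie"
    print(n, t_lin, t_quad, winner)
# The two curves meet at n = 5 (both equal 50); for every n > 5
# the linear algorithm is faster despite its larger constant.
```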


9. Upper - Lower Bounds
- Several terms are used to describe the running time of an algorithm:
- The upper bound of its growth rate, expressed by the Big-Oh notation
  - f(n) is an upper bound of T(n): T(n) is in O(f(n))
- The lower bound of its growth rate, expressed by the Omega notation
  - g(n) is a lower bound of T(n): T(n) is in Ω(g(n))
- If the upper and lower bounds are the same, T(n) is in Θ(g(n))

10. Big O (upper bound)
- Definition: T(n) is in O(f(n)) if there exist positive constants c and n0 such that T(n) <= c f(n) for all n >= n0
- Example: T(n) = 3n^2 + 5n is in O(n^2): take c = 8 and n0 = 1, since 3n^2 + 5n <= 8n^2 for all n >= 1

11. Properties of O
- O is transitive: if f(n) is in O(g(n)) and g(n) is in O(h(n)), then f(n) is in O(h(n))
- If f(n) is in O(h(n)) and g(n) is in O(h(n)), then f(n) + g(n) is in O(h(n))

12. More on Upper Bounds
- T(n) in O(g(n)): g(n) is an upper bound for T(n)
- There exist many upper bounds
  - e.g., T(n) = n^2 is in O(n^2), O(n^3), ..., O(n^100), ...
  - find the smallest upper bound!!
- The O notation "hides" the constants
  - e.g., c1 n^2, c2 n^2, ..., ck n^2 are all in O(n^2)

13. O in Practice
- Among different algorithms solving the same problem, prefer the algorithm with the smaller rate of growth O
  - it will be faster for large n
- O(1) < O(log n) < O(n) < O(n log n) < O(n^2) < O(n^3) < ... < O(2^n) < O(n^n)
  - O(n), O(n log n), O(n^2), O(n^3): polynomial
  - O(log n): logarithmic
  - O(k^n), O(2^n), O(n!), O(n^n): exponential!!

14. (table of actual running times, at 10^-6 sec per operation, for algorithms of complexity 1000n, 1000 n log n, n^2, and higher, with input sizes n ranging from 20 to 1000: the polynomial algorithms finish within seconds to hours, while the exponential one already needs years at n = 100 and centuries at n = 200)

15. Omega Ω (lower bound)
- Definition: T(n) is in Ω(g(n)) if there exist positive constants c and n0 such that T(n) >= c g(n) for all n >= n0
  - g(n) is a lower bound of the rate of growth of T(n)
  - T(n) increases at least as fast as g(n) as n increases
- e.g., T(n) = 6n^2 - 2n + 7 => T(n) is in Ω(n^2)

16. More on Lower Bounds
- Ω provides a lower bound of the growth rate of T(n)
- There may exist many lower bounds
  - T(n) = n^4 is in Ω(n), Ω(n^2), Ω(n^3), Ω(n^4)
  - find the largest one => T(n) is in Ω(n^4)

17. Theta Θ
- If the lower and upper bounds are the same: T(n) is in Θ(g(n)) if T(n) is in both O(g(n)) and Ω(g(n))
- e.g., T(n) = n^3 + 135n is in Θ(n^3)

18. Theorem: T(n) = Σ a_i n^i is in Θ(n^m)
(a) T(n) = a_m n^m + a_{m-1} n^{m-1} + ... + a_1 n + a_0 is in O(n^m)
Proof:
T(n) <= |T(n)| <= |a_m| n^m + |a_{m-1}| n^{m-1} + ... + |a_1| n + |a_0|
= n^m { |a_m| + (1/n)|a_{m-1}| + ... + (1/n^{m-1})|a_1| + (1/n^m)|a_0| }
<= n^m { |a_m| + |a_{m-1}| + ... + |a_1| + |a_0| } for n >= 1
=> T(n) is in O(n^m)

19. Theorem (cont.)
(b) Assuming all coefficients are positive:
T(n) = a_m n^m + a_{m-1} n^{m-1} + ... + a_1 n + a_0
>= c n^m + c n^{m-1} + ... + c n + c >= c n^m, where c = min{a_m, a_{m-1}, ..., a_1, a_0}
=> T(n) is in Ω(n^m)
(a), (b) => T(n) is in Θ(n^m)

20. Theorem: (a), (b) (the formulas of this slide were not preserved in the transcript)

21. Recursive Relations (1)

22. Fibonacci Relation
- F(n) = F(n - 1) + F(n - 2), with F(1) = F(2) = 1

23. Fibonacci relation (cont.)
- From (a), (b): it can be proven that F(n) is in Θ(φ^n), where φ = (1 + √5)/2 ≈ 1.618 is the golden ratio
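The Θ(φ^n) growth can be checked numerically; the sketch below (illustrative Python, not from the slides) computes F(n) iteratively and compares consecutive ratios with φ:

```python
def fib(n):
    """Iterative Fibonacci with F(1) = F(2) = 1."""
    a, b = 1, 1
    for _ in range(n - 1):
        a, b = b, a + b
    return a

phi = (1 + 5 ** 0.5) / 2      # golden ratio, about 1.618
ratio = fib(31) / fib(30)
print(ratio, phi)             # the ratio F(n+1)/F(n) converges to phi,
                              # consistent with F(n) in Theta(phi^n)
```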

24. Binary Search

    int BSearch(int T[], int a, int b, int key) {
        if (a > b) return -1;                  // empty range: not found
        int middle = (a + b) / 2;
        if (T[middle] == key) return middle;
        else if (key < T[middle])
            return BSearch(T, a, middle - 1, key);
        else
            return BSearch(T, middle + 1, b, key);
    }

25. Analysis of Binary Search
- Let n = b - a + 1 be the size of the array, with n = 2^k - 1 for some k (i.e., n = 1, 3, 7, 15, ...)
- The size of each sub-array is 2^{k-1} - 1
- c: execution time of the first command
- d: execution time outside the recursion
- T: execution time

26. Binary Search (cont.)
- T(0) = c and T(n) = T((n - 1)/2) + d for n >= 1
- Unfolding for n = 2^k - 1: T(n) = d k + c = d log(n + 1) + c

27. Binary Search for Random n
- T(n) <= T(n + 1) <= ... <= T(u(n)), where u(n) = 2^k - 1 for some k with n <= u(n) < 2n
- Then T(n) <= d log(2n + 1) + c => T(n) is in O(log n)
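The O(log n) bound can be verified empirically; the sketch below (an illustrative Python translation of the slide's BSearch, with a probe counter added) measures the worst-case number of comparisons over all keys in a 1000-element array:

```python
import math

def bsearch(T, key, a, b, probes=0):
    """Recursive binary search on sorted T[a..b]; returns (index, probes),
    with index -1 if the key is absent."""
    if a > b:
        return -1, probes
    middle = (a + b) // 2
    probes += 1                               # one comparison against T[middle]
    if T[middle] == key:
        return middle, probes
    if key < T[middle]:
        return bsearch(T, key, a, middle - 1, probes)
    return bsearch(T, key, middle + 1, b, probes)

n = 1000
T = list(range(n))
worst = max(bsearch(T, k, 0, n - 1)[1] for k in range(n))
print(worst)   # worst-case probes stay within floor(log2 n) + 1
```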

28. Merge sort

    list mergesort(list L, int n) {
        if (n == 1) return L;
        L1 = lower half of L;
        L2 = upper half of L;
        return merge(mergesort(L1, n/2), mergesort(L2, n/2));
    }

- n: size of array L (assume n is a power of 2)
- merge: merges the sorted L1, L2 into a sorted array
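The pseudocode above can be fleshed out as a minimal Python sketch (illustrative; the merge helper is spelled out, and slicing lets it handle any length, not only powers of 2):

```python
def merge(L1, L2):
    """Merge two sorted lists into one sorted list (O(n) work)."""
    out, i, j = [], 0, 0
    while i < len(L1) and j < len(L2):
        if L1[i] <= L2[j]:
            out.append(L1[i]); i += 1
        else:
            out.append(L2[j]); j += 1
    return out + L1[i:] + L2[j:]      # append whichever half remains

def mergesort(L):
    """Split L in halves, sort each recursively, merge the results."""
    n = len(L)
    if n <= 1:
        return L
    return merge(mergesort(L[:n // 2]), mergesort(L[n // 2:]))

print(mergesort([8, 3, 5, 1, 7, 2, 6, 4]))   # [1, 2, 3, 4, 5, 6, 7, 8]
```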


30. Analysis of Merge sort
- The recursion tree has log n levels: step 1 costs T(n), step 2 costs 2 T(n/2), the next step 4 T(n/4), and so on
- Each level does O(n) work for merging => complexity O(n log n)

31. Analysis of Merge sort
- Two ways:
  - recursion analysis
  - induction

32. Induction
- Claim: T(n) <= a n log n + b for suitable constants a, b > 0 (so T(n) is in O(n log n)), given T(1) = c1 and T(n) = 2T(n/2) + c2 n
- Proof:
(1) for n = 1: T(1) = c1 <= b, true if b >= c1
(2) assume T(k) <= a k log k + b for k = 1, 2, ..., n - 1
(3) for k = n, prove that T(n) <= a n log n + b
For k = n/2: T(n/2) <= a (n/2) log(n/2) + b, therefore
T(n) = 2 T(n/2) + c2 n <= 2 {a (n/2) log(n/2) + b} + c2 n = a n (log n - 1) + 2b + c2 n

33. Induction (cont.)
- T(n) <= a n log n - a n + 2b + c2 n
- Let b = c1 and a = c1 + c2; then -a n + 2b + c2 n = c1 (2 - n) <= b for n >= 1
- => T(n) <= a n log n + b, i.e., T(n) is in O(n log n)

34. Recursion analysis of mergesort
- T(n) = 2T(n/2) + c2 n
  <= 4T(n/4) + 2 c2 n
  <= 8T(n/8) + 3 c2 n
  <= ...
  <= 2^i T(n/2^i) + i c2 n
- For n = 2^i (i.e., i = log n): T(n) <= n T(1) + c2 n log n = c1 n + c2 n log n => T(n) is in O(n log n)

35. Best, Worst, Average Cases
- For some algorithms, different inputs require different amounts of time
- e.g., sequential search for a key in an array of size n:
  - worst case: the element is not in the array => T(n) = n comparisons
  - best case: the element is first in the array => T(n) = 1
  - average case: the element is equally likely to be at any of the n positions => T(n) ≈ n/2
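The three cases can be demonstrated with a small Python sketch (illustrative; the comparison-counting helper is my own, not from the slides):

```python
def seq_search(A, key):
    """Sequential search; returns (index, comparisons),
    with index -1 if the key is absent."""
    for i, x in enumerate(A):
        if x == key:
            return i, i + 1       # found after i + 1 comparisons
    return -1, len(A)             # absent: all n elements compared

A = list(range(1, 11))            # 10 elements: 1..10
print(seq_search(A, 1))           # best case: 1 comparison
print(seq_search(A, 99))          # worst case: n = 10 comparisons
avg = sum(seq_search(A, k)[1] for k in A) / len(A)
print(avg)                        # average over all present keys: (n + 1)/2 = 5.5
```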

36. Comparison
- The best case is not representative
  - it happens only rarely
  - useful when it has high probability
- The worst case is very useful:
  - the algorithm performs at least that well
  - very important for real-time applications
- The average case represents the typical behavior of the algorithm
  - not always possible to compute
  - knowledge of the data distribution is necessary

