Design and Analysis of Algorithms (عال311), Chapter 3: Growth of Functions


1 Design and Analysis of Algorithms (عال311), Chapter 3: Growth of Functions

2 How fast will your program run?
The running time of any program depends on:
- The algorithm
- The input
- Your implementation of the algorithm in a programming language
- The compiler you use
- The OS on your computer
- Your computer hardware
- Other things: the temperature outside, other programs running on your computer, ...

3 Complexity
Complexity is the number of steps required to solve a problem. The goal is to find the best algorithm, i.e., the one that solves the problem in the fewest steps.

4 Measures of Algorithm Complexity
Let T(n) denote the number of operations required by an algorithm to solve a given problem. T(n) often depends on the particular input of size n, so there are 3 cases:
- Worst-case complexity
- Best-case complexity
- Average-case complexity of an algorithm

5 Measures of Algorithm Complexity
Worst-Case Running Time:
- the longest running time for any input of size n
- provides an upper bound on the running time for any input

Best-Case Running Time:
- the shortest running time for any input of size n
- provides a lower bound on the running time for any input

Average-Case Behavior:
- the expected performance averaged over all possible inputs
- generally better than the worst-case behavior, but sometimes roughly as bad as the worst case

6 Example: Sequential Search
Sequential Search Algorithm                          Step Count
// Searches for x in array A of n items; returns the
// index of the found item, or n+1 if x is not found
Seq_Search(A[n]: array, x: item) {
    i = 1                                            1
    while ((i <= n) && (A[i] != x)) {                n + 1
        i = i + 1                                    n
    }
    return i                                         1
}
_________________________________
Total time = 1 + (n + 1) + n + 1 = 2n + 3 ≈ 2(n + 1)
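To make the pseudocode concrete, here is a minimal runnable sketch in Python (the 1-based indexing and the "return n + 1 when not found" convention follow the slide; the name seq_search is just an illustrative choice):

    def seq_search(A, x):
        """Sequential search over a conceptually 1-based array A[1..n].
        Returns the 1-based index of the first occurrence of x,
        or n + 1 if x is not present, as in the slide's pseudocode."""
        n = len(A)
        i = 1
        # Each loop test makes two comparisons: i <= n and A[i] != x.
        while i <= n and A[i - 1] != x:   # A[i - 1]: Python lists are 0-based
            i = i + 1
        return i

    print(seq_search([7, 2, 9, 4, 1], 9))   # found at position 3
    print(seq_search([7, 2, 9, 4, 1], 5))   # not found: returns n + 1 = 6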

7 Example: Sequential Search
Worst-case running time (sequential search):
- occurs when x is not in the original array A
- in this case, the while loop needs 2(n + 1) comparisons + c other operations
- so T(n) = 2(n + 1) + c → linear complexity

Best-case running time (sequential search):
- occurs when x is found at A[1]
- in this case, the while loop needs 2 comparisons + c other operations
- so T(n) = 2 + c → constant complexity
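As a quick empirical check of these two counts, the sketch below counts exactly two comparisons per while-loop test, matching the slide's accounting (the helper name count_comparisons and the sample array are my own illustrative choices):

    def count_comparisons(A, x):
        """Number of comparisons made by the while-loop tests of
        sequential search, counted as 2 per test as on the slide."""
        n = len(A)
        i = 1
        comparisons = 0
        while True:
            comparisons += 2                       # i <= n  and  A[i] != x
            if not (i <= n and A[i - 1] != x):
                break
            i = i + 1
        return comparisons

    A = [10, 20, 30, 40, 50]                       # n = 5
    print(count_comparisons(A, 10))                # best case: 2
    print(count_comparisons(A, 99))                # worst case: 2(n + 1) = 12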

8 Big-O notation (Upper Bound – Worst Case)
n, n + 1, n + 80, 40n, n + lg n are all O(n)
n^1.1 + n is O(n^1.1)
n^2 + n is O(n^2)
3n^2 + 6n + lg n is O(n^2)

O(1) ⊂ O(lg n) ⊂ O((lg n)^3) ⊂ O(n) ⊂ O(n^2) ⊂ O(n^3) ⊂ O(n^lg n) ⊂ O(2^sqrt(n)) ⊂ O(2^n) ⊂ O(n!) ⊂ O(n^n)

Constant < Logarithmic < Linear < Quadratic < Cubic < Exponential < Factorial
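As a worked illustration of the last example, one can exhibit explicit witnesses for the Big-O definition (the constants c = 10 and n0 = 1 below are just one convenient choice):

    3n^2 + 6n + lg n  ≤  3n^2 + 6n^2 + n^2  =  10n^2        for all n ≥ 1,

so taking c = 10 and n0 = 1 gives 3n^2 + 6n + lg n ≤ c·n^2 for all n ≥ n0, which is exactly what 3n^2 + 6n + lg n = O(n^2) means.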

9 Some Common Names for Complexity
O(1)                   Constant time
O(lg n)                Logarithmic time
O(lg^2 n)              Log-squared time
O(n)                   Linear time
O(n^2)                 Quadratic time
O(n^3)                 Cubic time
O(n^i) for some i      Polynomial time
O(2^n)                 Exponential time
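To get a feel for how quickly these classes separate, the short sketch below tabulates a few of them for sample input sizes (the chosen values of n and the use of math.log2 for lg are my own illustrative choices):

    import math

    # Tabulate a few of the growth rates named above for sample input sizes.
    print(f"{'n':>5} {'lg n':>8} {'n^2':>8} {'n^3':>10} {'2^n':>26}")
    for n in (10, 20, 40, 80):
        print(f"{n:>5} {math.log2(n):>8.2f} {n**2:>8} {n**3:>10} {2**n:>26}")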

10 (Mathematics) Exponents
x^0 = 1
x^1 = x
x^-1 = 1/x
x^a · x^b = x^(a+b)
x^a / x^b = x^(a-b)
(x^a)^b = (x^b)^a = x^(ab)
x^n + x^n = 2x^n ≠ x^(2n)
2^n + 2^n = 2 · 2^n = 2^(n+1)
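A quick numeric spot-check of these rules (the sample values x = 3, a = 4, b = 2, n = 5 are arbitrary):

    # Spot-check the exponent rules above with arbitrary sample values.
    x, a, b, n = 3, 4, 2, 5

    assert x**0 == 1
    assert x**1 == x
    assert x**-1 == 1 / x
    assert x**a * x**b == x**(a + b)
    assert x**a / x**b == x**(a - b)
    assert (x**a)**b == (x**b)**a == x**(a * b)
    assert x**n + x**n == 2 * x**n != x**(2 * n)
    assert 2**n + 2**n == 2 * 2**n == 2**(n + 1)
    print("all exponent identities hold for the sample values")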

11 Summation

12 Factorials
n! ("n factorial") is defined for integers n ≥ 0 as
    n! = 1                       if n = 0
    n! = 1 * 2 * 3 * ... * n     if n ≥ 1
A weak upper bound is n! ≤ n^n for n ≥ 2.
Examples: 5! = 1 * 2 * 3 * 4 * 5 = 120,   2! = 1 * 2 = 2,   0! = 1

