1 CSCE 210 Data Structures and Algorithms, Prof. Amr Goneid, AUC. Part 3. Introduction to the Analysis of Algorithms

2 Introduction to the Analysis of Algorithms: Algorithms; Analysis of Algorithms; Time Complexity; Bounds and the Big-O; Types of Complexities; Rules for Big-O; Examples of Algorithm Analysis

3 1. Algorithms. The word Algorithm comes from the name of Abu Ja'afar Mohamed ibn Musa Al Khowarizmi (c. 825 A.D.). An Algorithm is a procedure to do a certain task. An Algorithm is supposed to solve a general, well-specified problem.

4 Algorithms. Example: Sorting Problem. Input: a sequence of keys {a_1, a_2, ..., a_n}. Output: a permutation (re-ordering) of the input, {a'_1, a'_2, ..., a'_n}, such that a'_1 ≤ a'_2 ≤ ... ≤ a'_n. An instance of the problem might be sorting an array of names or sorting an array of integers. An algorithm is supposed to solve all instances of the problem.

5 Example: Selection Sort Algorithm
Solution: "From those elements that are currently unsorted, find the smallest and place it next in the sorted list."
Algorithm:
for each i = 0 .. n-2
    find the smallest element in the sub-array a[i] to a[n-1]
    swap that element with the one at the start of the sub-array

6 Example: Selection Sort Algorithm
void selectsort (itemType a[ ], int n)
{
    int i, j, m;
    for (i = 0; i < n-1; i++)
    {
        m = i;
        for (j = i+1; j < n; j++)
            if (a[j] < a[m]) m = j;
        swap (a[i], a[m]);     // e.g. std::swap
    }
}

7 Algorithms. Example: Euclid's Algorithm for the GCD
ALGORITHM Euclid(m, n)
// Computes gcd(m, n) by Euclid's algorithm
// Input: Two nonnegative, not-both-zero integers m and n
// Output: Greatest common divisor of m and n
while n ≠ 0 do
    r ← m mod n
    m ← n
    n ← r
return m
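A direct C++ transcription of the iterative pseudocode above; the function name gcd_iter and the unsigned parameter type are my choices for illustration, not part of the slides.

// Euclid's algorithm, assuming nonnegative, not-both-zero inputs
unsigned gcd_iter (unsigned m, unsigned n)
{
    while (n != 0)
    {
        unsigned r = m % n;   // r <- m mod n
        m = n;
        n = r;
    }
    return m;
}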

8 Algorithms. Euclid's Algorithm for the GCD (Recursive Version)
function gcd(m, n)
    if n = 0 return m
    else return gcd(n, m mod n)
"The Euclidean algorithm is the granddaddy of all algorithms, because it is the oldest nontrivial algorithm that has survived to the present day." Donald Knuth, The Art of Computer Programming, Vol. 2
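A C++ sketch of the recursive version, again my transcription of the pseudocode rather than code from the slides:

unsigned gcd (unsigned m, unsigned n)
{
    if (n == 0) return m;
    return gcd(n, m % n);     // gcd(m, n) = gcd(n, m mod n)
}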

9 Algorithms should be: Transparent, Correct, Complete, Writeable, Maintainable, Easy to use, and Efficient.

10 Algorithms. An algorithm should have a clear and Transparent purpose. An algorithm should be Correct, i.e. it solves the problem correctly. An algorithm should be Complete, i.e. it solves all instances of the problem. An algorithm should be Writeable, i.e. we should be able to express its procedure in an available implementation language.

11 Algorithms. An algorithm should be Maintainable, i.e. easy to debug and modify. An algorithm should be Easy to use. An algorithm should be Efficient: an efficient algorithm uses a minimum of resources (space and time). In particular, it should solve the problem in the minimum amount of time.

12 2. Analysis of Algorithms. The main goal is to determine the cost of running an algorithm and how to reduce that cost. Cost is expressed as Complexity: Time Complexity and Space Complexity.

13 Analysis of Algorithms. Time Complexity depends on: machine speed, and the size of the data and number of operations needed (n). Space Complexity depends on: the size of the data, and the size of the program.

14 3. Time Complexity. Expressed as T(n) = number of operations required. (n) is the Problem Size: n could be the number of specific operations, the size of the data (e.g. an array), or both.

15 Number of Operations T(n). Example (1): Factorial Function
int factorial (int n)
{
    int i, f = 1;
    if (n > 0)
        for (i = 1; i <= n; i++) f *= i;
    return f;
}
Let T(n) = number of multiplications. For a given n, T(n) = n (always).

16 Complexity of the Factorial Algorithm. Because T(n) = n always, T(n) = Θ(n). [Plot: T(n) versus n, growing as Θ(n)]

17 Location of Minimum. Example (2): How many comparisons T(n) are done to find the location of the minimum in an array a[0..n-1]?
minimum (a[0..n-1], n)
{
    m = 0;
    for j = 1 to n-1
        if (a[j] < a[m]) m = j;
    return m;
}
T(n) = n-1 comparisons, so the complexity is T(n) = Θ(n).

18 Number of Operations T(n). Example (3): Linear Search in an array a[0..n-1]
linSearch (a, target, n)
{
    for (i = 0 to n-1)
        if (a[i] == target) return i;
    return -1;
}
T(n) = number of array element comparisons. Best case: T(n) = 1. Worst case: T(n) = n.
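A self-contained C++ version of this pseudocode with a small usage example; the test array values are made up for illustration.

#include <iostream>

// Returns the index of target in a[0..n-1], or -1 if it is not present.
int linSearch (const int a[ ], int target, int n)
{
    for (int i = 0; i < n; i++)
        if (a[i] == target) return i;   // best case: found after 1 comparison
    return -1;                          // worst case: n comparisons
}

int main ()
{
    int a[ ] = { 7, 3, 9, 1 };
    std::cout << linSearch(a, 9, 4) << "\n";   // prints 2
    std::cout << linSearch(a, 5, 4) << "\n";   // prints -1
}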

19 Complexity of the Linear Search Algorithm. T(n) = 1 in the best case and T(n) = n in the worst case. We write that as T(n) = Ω(1) and T(n) = O(n). [Plot: T(n) versus n, bounded below by Ω(1) and above by O(n)]

20 4. Bounds and the Big-O. If an algorithm always costs T(n) = f(n) for the same (n), independent of the data, it is an Exact algorithm. In this case we say T(n) = Θ(f(n)), or Big Θ. The factorial function is an example where T(n) = Θ(n).

21 Bounds. If the cost T(n) of an algorithm for a given size (n) changes with the data, it is not an exact algorithm. In this case, we find the Best Case (Lower Bound) T(n) = Ω(f(n)), or Big Ω, and the Worst Case (Upper Bound) T(n) = O(f(n)), or Big O. The linear search function is an example where T(n) = Ω(1) and T(n) = O(n).
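The slides use these bounds informally; for reference, the standard formal definitions (added here, not on the slides) are:
T(n) = O(f(n)) if there exist constants c > 0 and n0 such that T(n) ≤ c f(n) for all n ≥ n0 (upper bound)
T(n) = Ω(f(n)) if there exist constants c > 0 and n0 such that T(n) ≥ c f(n) for all n ≥ n0 (lower bound)
T(n) = Θ(f(n)) if both T(n) = O(f(n)) and T(n) = Ω(f(n)) hold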

22 Constants do not matter. If T(n) = c f(n), we still say that T(n) is O(f(n)) or Ω(f(n)) or Θ(f(n)). Constants are always dropped; they can be related to machine or language properties. Examples: T(n) = 4 (best case), then T(n) = Ω(1). T(n) = 6n^2 (worst case), then T(n) = O(n^2). T(n) = 3n (always), then T(n) = Θ(n).

23 Constants do not matter. Number of operations versus complexity:
T(n) = 4 (best case) is of complexity Ω(1)
T(n) = 6n^2 (worst case) is of complexity O(n^2)
T(n) = 3n (always) is of complexity Θ(n)

24 5. Types of Complexities. Constant Complexity: T(n) = constant, independent of (n). Runs in a constant amount of time → O(1). Example: cout << a[0][0];

25 Types of Complexities. Logarithmic Complexity: log2 n = m is equivalent to n = 2^m. Each step reduces the problem to half → O(log2 n). Example: Binary Search, with T(n) = O(log2 n), is much faster than Linear Search, which has T(n) = O(n).

26 Linear vs Logarithmic Complexities. [Plot: T(n) versus n comparing O(log2 n) and O(n)]

27 Types of Complexities. Polynomial Complexity: T(n) = a_m n^m + ... + a_2 n^2 + a_1 n + a_0. If m = 1, then O(a_1 n + a_0) → O(n). If m > 1, then → O(n^m), since the n^m term dominates.

28 Polynomial Complexities. [Plot: log T(n) versus n for O(n), O(n^2), O(n^3)]

29 Types of Complexities. Exponential Complexity. Example: list all the subsets of a set of n elements. For {a, b, c}: {a,b,c}, {a,b}, {a,c}, {b,c}, {a}, {b}, {c}, {}. Number of operations T(n) = O(2^n). Exponential expansion of the problem → O(a^n), where a is a constant greater than 1.
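A C++ sketch of this enumeration (my own illustration, not from the slides): each subset corresponds to one n-bit mask, so the loop runs exactly 2^n times; it assumes n is small enough to fit in an unsigned mask.

#include <iostream>
#include <string>
#include <vector>

// Prints all 2^n subsets of the given items (n assumed < 32).
void listSubsets (const std::vector<std::string>& items)
{
    int n = items.size();
    for (unsigned mask = 0; mask < (1u << n); mask++)      // 2^n masks
    {
        std::cout << "{ ";
        for (int i = 0; i < n; i++)
            if (mask & (1u << i)) std::cout << items[i] << " ";
        std::cout << "}\n";
    }
}

int main ()
{
    listSubsets({ "a", "b", "c" });   // prints all 8 subsets of {a, b, c}
}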

30 Exponential vs Polynomial. [Plot: log T(n) versus n comparing O(n), O(n^3), O(2^n)]

31 Types of Complexities. Factorial-time Algorithms. Example: Traveling salesperson problem (TSP): find the best route to take in visiting n cities away from home. How many possible routes are there? For 3 cities: (A, B, C).

32 Possible routes in a TSP. For 3 cities the number of possible routes is 3! = 6; in general T(n) = n!. Expansion of the problem → O(n!).
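A small C++ sketch (mine, for illustration) that counts the candidate routes by enumerating permutations of the cities with std::next_permutation; the count is n!, which is 6 for 3 cities.

#include <algorithm>
#include <iostream>
#include <vector>

int main ()
{
    std::vector<char> cities = { 'A', 'B', 'C' };   // must start sorted for next_permutation
    long long routes = 0;
    do {
        routes++;                                   // each permutation is one candidate route
    } while (std::next_permutation(cities.begin(), cities.end()));
    std::cout << routes << " possible routes\n";    // prints: 6 possible routes
}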

33 Exponential vs Factorial. [Plot: log T(n) versus n comparing O(2^n), O(n!), O(n^n)]

34 Execution Time Example. For the exponential algorithm of listing all subsets of a given set, assume the set size to be 1024 elements. The number of operations is 2^1024, about 1.8 * 10^308. If we can list a subset every nanosecond (a year is roughly 3.16 * 10^16 ns), the process will take about 5.7 * 10^291 years!
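A quick back-of-the-envelope check of these numbers (my own sketch, not from the slides), done with base-10 logarithms so nothing overflows a double:

#include <cmath>
#include <iostream>

int main ()
{
    double log10_subsets = 1024 * std::log10(2.0);            // about 308.25, i.e. ~1.8 * 10^308 subsets
    double log10_seconds = log10_subsets - 9;                 // one subset per nanosecond
    double log10_years   = log10_seconds - std::log10(365.25 * 24 * 3600.0);
    std::cout << "about 10^" << log10_years << " years\n";    // about 10^291.75, i.e. ~5.7 * 10^291 years
}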

35 Polynomial and Non-polynomial Times. P (Polynomial) times: O(1), O(log n), O((log n)^2), O(n), O(n log n), O(n^2), O(n^3), ... NP (Non-Polynomial) times: O(2^n), O(e^n), O(n!), O(n^n), ...

36 6. Rules for Big-O
Rule: For constant k, O(k) < O(n). Example: O(7) = O(1) < O(n)
Rule: For constant k, O(k f) = O(f) (constants are dropped). Example: O(2n) = O(n)
Rule: O(|f| + |g|) = O(|f|) + O(|g|) = max(O(|f|), O(|g|)). Example: O(6n^2 + n) = O(6n^2) + O(n) = O(n^2)
Rule: Nesting a loop O(g) within a loop O(f) gives O(f * g). Example: O(n^4 * n^2) = O(n^6)
Rule: O(n^(m-1)) < O(n^m). Example: O(n^2) < O(n^3)
Rule: O(log n) < O(n). Example: O(log2 n) < O(n)

37 Rules for Big-O
Rule: All logarithms grow at the same rate. Example: log2 n = Θ(log3 n)
Rule: Exponential functions grow faster than powers. Example: O(n^3) < O(2^n)
Rule: Factorials grow faster than exponentials. Example: O(2^n) < O(n!)

38 Exercises. Which function has smaller complexity?
f = 100 n^4, g = n^5
f = log(log n^3), g = log n
f = n^2, g = n log n
f = 50 n^5 + n^2 + n, g = n^5
f = e^n, g = n!

39 7. Examples of Algorithm Analysis
for (i = 1; i <= n/2; i++)
{
    O(1);
    for (j = 1; j <= n*n; j++)
    {
        O(1);
    }
}
T(n) = (n^2 + 1) * n/2 = n^3/2 + n/2. Hence T(n) = O(n^3).

40 Examples of Algorithm Analysis
for (i = 1; i <= n/2; i++)
{
    O(1);
}
for (j = 1; j <= n*n; j++)
{
    O(1);
}
T(n) = (n/2) + n^2. Hence T(n) = O(n^2).

41 while Loop
k = n;
while (k > 0)
{
    d = k % 2;
    cout << d;
    k /= 2;
}
Each iteration cuts the value of (k) by half, and the final iteration reduces k to zero. Hence the number of iterations is T(n) = ⌊log2 n⌋ + 1, so T(n) = O(log2 n).

42 A function to reverse an array (1). Version (1): using a temporary array b[ ].
int a[N];
void reverse_array (int a[ ], int n)
{
    int b[N];
    for (int i = 0; i < n; i++) b[i] = a[n-i-1];
    for (int i = 0; i < n; i++) a[i] = b[i];
}
Consider T(n) to be the number of array accesses; then T(n) = 2n + 2n = 4n. Hence T(n) = O(n), with extra space b[N].

43 A function to reverse an array (2). Version (2): using swapping.
int a[N];
void reverse_array (int a[ ], int n)
{
    int temp;
    for (int i = 0; i < n/2; i++)
    {
        temp = a[i];
        a[i] = a[n-i-1];
        a[n-i-1] = temp;
    }
}
Consider T(n) to be the number of array accesses; then T(n) = 4(n/2) = 2n. Hence T(n) = O(n), without extra space.

44 Index of the minimum element. A function to return the index of the minimum element:
int index_of_min (int a[ ], int s, int e)
{
    int imin = s;
    for (int i = s+1; i <= e; i++)
        if (a[i] < a[imin]) imin = i;
    return imin;
}
Consider T(n) to be the number of times we compare array elements. When invoked as index_of_min(a, 0, n-1), T(n) = e - (s+1) + 1 = e - s = n-1. Hence T(n) = O(n).

45 Analysis of Selection Sort
void SelSort (int a[ ], int n)
{
    int m, temp;
    for (int i = 0; i < n-1; i++)
    {
        m = index_of_min(a, i, n-1);   // costs n-1-i comparisons
        temp = a[i]; a[i] = a[m]; a[m] = temp;
    }
}
T(n) = number of array comparisons. For a given iteration (i), T_i(n) = n-1-i, and T(n) = T_0(n) + T_1(n) + ... + T_(n-2)(n) = (n-1) + (n-2) + ... + 1 = n(n-1)/2 = 0.5 n^2 - 0.5 n = O(n^2). This cost is the same for any data of size (n) (exact algorithm).

46 SelectSort vs QuickSort. [Plot: T(n) versus n comparing Selection Sort O(n^2) and Quicksort O(n log2 n)]

47 Analysis of Binary Search
int BinSearch (int a[ ], int n, int x)
{
    int L, M, H;
    bool found = false;
    L = 0; H = n - 1;
    while ((L <= H) && !found)
    {
        M = (L + H) / 2;                // Approximate middle
        if (x == a[M]) found = true;    // Match
        else if (x > a[M]) L = M + 1;   // Discard left half
        else H = M - 1;                 // Discard right half
    }
    if (found) return M; else return -1;
}
In the best case, a match occurs in the first iteration, thus T(n) = Ω(1). In the worst case (not found), half of the array is discarded in each iteration. Hence T(n) = log2 n + 1 = O(log2 n).

48 Linear vs Binary Search. [Plot: T(n) versus n comparing Linear Search O(n) and Binary Search O(log2 n)]

49 Polynomial Evaluation. A polynomial of degree n can be evaluated directly as P(x) = a_n x^n + a_(n-1) x^(n-1) + ... + a_i x^i + ... + a_1 x + a_0, where x^i is computed by a function pow(x, i) using (i-1) multiplications. The direct algorithm is (consider x and a[ ] to be of type double):
double p = a[0];
for (int i = 1; i <= n; i++)
    p = p + a[i] * pow(x, i);

50 Polynomial Evaluation. The number of double arithmetic operations inside the loop is 2 + (i-1) = i + 1 for iteration i. Hence T(n) = (1+1) + (2+1) + ... + (n+1) = n(n+3)/2 = O(n^2). We never use this method because it is quadratic. Instead, we use Horner's method.

51 Horner's Algorithm. William George Horner (1819) introduced a factorization in the form:
P(x) = (...(((a_n) x + a_(n-1)) x + a_(n-2)) x + ... + a_1) x + a_0
The corresponding algorithm is:
double p = a[n];
for (int i = n-1; i >= 0; i--)
    p = p * x + a[i];
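A self-contained C++ version of Horner's rule; the function name and the test polynomial are mine, for illustration.

#include <iostream>

// Evaluates a[n]*x^n + ... + a[1]*x + a[0] using n multiplications and n additions.
double horner (const double a[ ], int n, double x)
{
    double p = a[n];
    for (int i = n - 1; i >= 0; i--)
        p = p * x + a[i];
    return p;
}

int main ()
{
    double a[ ] = { 1.0, -2.0, 3.0 };              // represents 3x^2 - 2x + 1
    std::cout << horner(a, 2, 2.0) << "\n";        // prints 9
}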

52 Horner's Algorithm. Analysis of this algorithm gives, for the number of double arithmetic operations: each of the n iterations performs one multiplication and one addition, so T(n) = 2n = O(n). This is a faster, linear algorithm.

53 Learn on your own about: proving correctness of algorithms, analyzing recursive functions, and standard algorithms in C++.

