Complexity of Algorithms 15-211 Fundamental Data Structures and Algorithms Ananda Guna January 13, 2005
Announcements
Algorithm Analysis The amount of time an algorithm takes to execute depends on many things – processor, compiler, language, data size, memory management, etc. Model of Computation – uni-processor, RAM, sequential. The running time of an algorithm can be described as a function of the input size n – Example: how long does bubble sort take to run on a data set of size n?
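As a concrete reference point, here is a minimal bubble sort sketch in Java (the slide names bubble sort but shows no code, so this version is an assumption, not the lecture's): the nested loops perform roughly n(n-1)/2 comparisons, which is where the quadratic running time comes from.

    // Illustrative bubble sort sketch; not taken from the original slides.
    class BubbleSortSketch {
        static void bubbleSort(int[] a) {
            int n = a.length;
            for (int i = 0; i < n - 1; i++) {          // n-1 passes over the array
                for (int j = 0; j < n - 1 - i; j++) {  // each pass bubbles the largest
                    if (a[j] > a[j + 1]) {             // remaining element rightward;
                        int tmp = a[j];                // this comparison runs about
                        a[j] = a[j + 1];               // n(n-1)/2 times in total
                        a[j + 1] = tmp;
                    }
                }
            }
        }
    }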
Work Area
Insertion Sort Assume A[0..i] is sorted after i rounds. Algorithm:

    for (i = 1; i < n; i++) {
        j = i;
        k = A[i];
        while (j > 0 && k < A[j-1]) {
            A[j] = A[j-1];
            j--;
        }
        A[j] = k;
    }

How complex is insertion sort? Count operations.
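Counting operations (a sketch of the standard argument, not recovered from the slide): in the worst case – a reverse-sorted array – the while loop body runs i times on round i, so the total number of comparisons is ∑_{i=1}^{n-1} i = n(n-1)/2, i.e. Θ(n²). If the array is already sorted, every while test fails immediately and the sort finishes in Θ(n).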
Work Area
Asymptotic Notations The complexity of an algorithm can be defined in terms of functions whose domains are the set of natural numbers. Theta Notation (Θ-notation) – For a given function g(n), Θ(g(n)) = { f(n) : there exist positive constants c1, c2, and n0 such that 0 ≤ c1·g(n) ≤ f(n) ≤ c2·g(n) for all n ≥ n0 }. Examples
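A worked example (mine, not the slide's): take f(n) = 3n² + 2n and g(n) = n². The constants c1 = 3, c2 = 5, n0 = 1 witness f(n) = Θ(n²): 3n² ≤ 3n² + 2n holds for all n ≥ 0, and 3n² + 2n ≤ 5n² reduces to 2n ≤ 2n², which holds for all n ≥ 1.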
Big Omega Notation Ω-notation provides a lower bound. Given a function g(n), Ω(g(n)) = { f(n) : there exist positive constants c and n0 such that 0 ≤ c·g(n) ≤ f(n) for all n ≥ n0 }. Examples. Theorem: For any two functions f(n) and g(n), f(n) = Θ(g(n)) if and only if f(n) = O(g(n)) and f(n) = Ω(g(n)). Proof: Homework – but do not submit.
The “big-O” notation Definition: Given a function T that describes the running time of an algorithm on an input of size N: T(N) = O(f(N)) if – there are positive constants c and n0 such that T(N) ≤ c·f(N) when N ≥ n0. – c is called the constant factor. – The constant n0 says that from some point on, c·f(N) is at least as big as T(N).
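For instance (my example, not the slide's): T(N) = 3N + 4 is O(N), witnessed by c = 7 and n0 = 1, since 3N + 4 ≤ 7N is equivalent to 4 ≤ 4N, which holds whenever N ≥ 1.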
Upper bounds [figure: c·f(N) and T(N) plotted against N, with c·f(N) above T(N) for all N ≥ n0]
Big-O When T(N) = O(f(N)), we are saying that T(N) grows no faster than f(N). – I.e., f(N) is an upper bound on T(N). Put another way: – Eventually, for all “large enough” N, c·f(N) dominates T(N).
Big-O characteristics log^k(N) = O(N) for any constant k. – I.e., logarithms grow very slowly.
Big-O characteristics If T1(N) = O(f(N)) and T2(N) = O(g(N)) then – T1(N) + T2(N) = max(O(f(N)), O(g(N))). – In other words, the bigger task always dominates eventually. – Also: – T1(N) · T2(N) = O(f(N) · g(N)).
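Two quick illustrations (mine, not the slide's): an O(N log N) sort followed by an O(N) scan is O(N log N) overall by the sum rule, and a loop that runs N times doing O(log N) work per iteration is O(N log N) by the product rule.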
Some conventions Normally, when we write O(…), we do not include constants. – Usually we don't even include low-order polynomial terms. – E.g., say O(n²), not O(8n²) or O(n² + n). We can do this because we are usually interested in the asymptotic complexity of an algorithm. – I.e., how the algorithm behaves “in the limit”.
Some common functions
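A typical list of common growth functions, slowest-growing first (my reconstruction; the slide's body was not recovered): c (constant), log n (logarithmic), n (linear), n log n, n² (quadratic), n³ (cubic), 2ⁿ (exponential).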
Recurrence Relations We can describe algorithms using recurrence equations – T(0) = 0 – T(n) = T(n-1) + 1. We can solve recurrence relations to get a closed-form solution for T(n). Methods of solving a recurrence equation – Repeated substitution method.
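Repeated substitution on the recurrence above (a short worked step, not on the slide): T(n) = T(n-1) + 1 = T(n-2) + 2 = … = T(n-k) + k; taking k = n gives T(n) = T(0) + n = n, so T(n) = Θ(n).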
Analysis of some known algorithms Linear Search
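The detailed analysis was presumably done on the board; as a reference, a minimal Java linear search (my sketch, not from the slides) makes at most n comparisons, so the worst case is O(n), and a hit at a uniformly random position costs about n/2 comparisons on average.

    // Illustrative linear search sketch; assumed, not from the original slides.
    class LinearSearchSketch {
        static int linearSearch(int[] a, int key) {
            for (int i = 0; i < a.length; i++) {  // worst case: n comparisons
                if (a[i] == key) return i;        // best case: found at index 0
            }
            return -1;                            // not found after n comparisons => O(n)
        }
    }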
Analysis of some known algorithms Binary Search
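Likewise a minimal Java binary search (my sketch, not from the slides): each iteration halves the candidate range, so the recurrence is T(n) = T(n/2) + c, giving O(log n).

    // Illustrative binary search sketch; a must be sorted in ascending order.
    class BinarySearchSketch {
        static int binarySearch(int[] a, int key) {
            int lo = 0, hi = a.length - 1;
            while (lo <= hi) {
                int mid = lo + (hi - lo) / 2;    // written this way to avoid int overflow
                if (a[mid] == key) return mid;
                if (a[mid] < key) lo = mid + 1;  // discard the left half
                else hi = mid - 1;               // discard the right half
            }
            return -1;  // range halves each pass: T(n) = T(n/2) + c => O(log n)
        }
    }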
Analysis of some known algorithms Merge Sort
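A sketch of the usual merge sort analysis (standard material, not recovered from the slide): the array is split in two, each half is sorted recursively, and the halves are merged in linear time, so T(n) = 2T(n/2) + cn. Repeated substitution gives T(n) = 2^k·T(n/2^k) + k·cn; with k = log₂ n this is n·T(1) + cn·log₂ n = Θ(n log n).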
Reversing a List Reversing a list – Rev(L) = c if L is empty – Rev(L) = append(Rev(tail(L)), first(L)). Append – T(n) = 1 + T(n-1). Reverse – S(n) = S(n-1) + T(n-1) + 1.
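Unrolling these (a worked step, not on the slide): T(n) = 1 + T(n-1) gives T(n) = n + T(0), so appending costs Θ(n); substituting into S(n) = S(n-1) + T(n-1) + 1 gives S(n) ≈ S(n-1) + n, and summing 1 + 2 + … + n yields S(n) = Θ(n²). Naive list reversal is quadratic because each element is appended to the end of an ever-growing list.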
Properties
Integration techniques for bounding sums Suppose you need to find the sum – ∑ log i (i = 1…n). How do you find this sum? – ∑ log i ≤ ∫ log x dx = ?? = Θ(n log n) – Homework
Work area
Algorithm Run times Constant – O(c) Logarithmic – O(log n) Linear – O(n) N log n – O(n log n) Quadratic – O(n²) Cubic – O(n³) Exponential – O(2ⁿ)
Some observations A constant-time algorithm always runs at the same “speed” – regardless of n – Examples. A linear algorithm: T(n) = c·n – Examples. A quadratic algorithm: T(n) = c·n² – Double the input size and the run time goes up by a factor of 4.
Some observations An exponential algorithm: T(n) = c·eⁿ – Examples – Increase the data size by 1 and the run time increases by a factor of e (where, say, e = 2).
Master Theorem The master theorem provides a “cookbook” method for solving recurrences of the form – T(n) = a·T(n/b) + f(n) – where a ≥ 1, b > 1, and f(n) is an asymptotically positive function. – Such a recurrence describes the run time of an algorithm that divides a problem into a subproblems, each of size n/b.
Observations based on the Master Theorem (without proof) If f(n) = Θ(n^(log_b a)) then T(n) = Θ(n^(log_b a) · log n) – Example – Merge Sort (a = 2, b = 2). Example – Binary Search (a = 1, b = 2).
Master Theorem Solution
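For reference, a standard statement of the theorem's three cases (following CLRS; not recovered from the original slide): with T(n) = a·T(n/b) + f(n) and p = log_b a, – Case 1: if f(n) = O(n^(p-ε)) for some ε > 0, then T(n) = Θ(n^p). – Case 2: if f(n) = Θ(n^p), then T(n) = Θ(n^p · log n). – Case 3: if f(n) = Ω(n^(p+ε)) for some ε > 0 and a·f(n/b) ≤ c·f(n) for some constant c < 1 and all sufficiently large n, then T(n) = Θ(f(n)).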
Conclusion Look for Assignment 0 in Bb. Solve some recurrence relations – T(n) = 1 + T(n-1), T(1) = c – T(n) = n + T(n-1), T(1) = c – T(n) = n² + T(n-1) – T(n) = 1 + 2T(n/2) – T(n) = 2T(n/2) (assume T(1) = c for all). Read Chapter 5 of the Weiss book or any other notes on algorithm analysis.