
1 Algorithmic Time Complexity Basics Shantanu Dutt ECE Dept. UIC

2 Time Complexity
An algorithm's time complexity is a function T(n) of problem size n that represents how much time the algorithm will take to complete its task. Note that there could be more than one problem-size parameter, in which case we can denote the time complexity function as T(S), where S is the set of size parameters. E.g., for the shortest-path problem on a graph G, we have 2 size parameters: n, the # of vertices, and e, the # of edges (thus T(S) = T(n,e)); for the covering part of Quine-McCluskey (QM), we also have 2 size parameters: m, the # of MTs, and p, the # of PIs (thus T(S) = T(m,p)).
In general, the runtime of an algorithm/program is determined not only by the algorithm, but also by the processor speed, memory size and speed, bus speed, etc. However, T(n) generally overlooks technology parameters and focuses on the intrinsic # of basic steps that an algorithm has to perform as a function of n. The main job of T(n) is to represent how the algorithm's runtime grows as a function of n, as opposed to giving us an absolute time that the algorithm will take to solve its problem. Thus if T(n) = n, runtime doubles as n doubles; if T(n) = n^2, runtime increases 4 times as n doubles ((2n)^2 = 4n^2); and if T(n) = 2^n, runtime is squared as n doubles (2^(2n) = (2^n)^2).
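This doubling behavior can be checked numerically. A minimal sketch (the choice n = 16 and the helper name are illustrative, not from the slides):

```python
# Sketch: the factor by which runtime grows when n doubles, for three growth rates.

def doubling_ratio(T, n):
    """Return T(2n)/T(n), the growth factor when the problem size doubles."""
    return T(2 * n) / T(n)

growth_rates = [("n",   lambda n: n),
                ("n^2", lambda n: n ** 2),
                ("2^n", lambda n: 2 ** n)]

for name, T in growth_rates:
    print(f"T(n) = {name}: T(2n)/T(n) at n = 16 is {doubling_ratio(T, 16):g}")
# n   -> 2      (runtime doubles)
# n^2 -> 4      ((2n)^2 = 4n^2)
# 2^n -> 65536  (2^(2n) = (2^n)^2: the runtime itself is squared, here by 2^16)
```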

3 Time Complexity
T(n) is determined by counting the number, as a function of n, of basic steps (i.e., constant-time operations that do not depend upon the input size) that the algorithm has to perform. Note that different basic steps (e.g., addition and multiplication of integers) may take different absolute times, but each counts as 1 operation.
[Figure: problem size n feeds into algorithm A; we count the # of basic steps A performs that are independent of n (i.e., constant-time operations), giving T(n).]
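As a concrete illustration of this counting (the array-summing example is mine, not the slide's), summing n numbers performs one constant-time addition per element, so T(n) = n:

```python
# Sketch: instrument a simple algorithm to count its constant-time basic steps.

def sum_with_step_count(a):
    """Sum a list, counting each addition as one basic step."""
    total, steps = 0, 0
    for x in a:
        total += x   # 1 addition: a constant-time operation, independent of n
        steps += 1
    return total, steps

for n in (10, 100, 1000):
    _, steps = sum_with_step_count(list(range(n)))
    print(n, steps)  # steps == n in each case, i.e., T(n) = n
```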

4 Time Complexity: Big O Notation
The Big O notation for T(n) specifies the closest or smallest upper-bound function f(n) for T(n) (denoted by T(n) = O(f(n))), which essentially says that for a large enough n, T(n) <= c*f(n) for some constant c. Formally, we say that for two monotonic functions T(n) and f(n), T(n) = O(f(n)) if there exist a constant c and an n0 s.t. T(n) <= c*f(n) for all n >= n0.
[Figure: graphical representation of the T(n) = O(f(n)) relation, with c*f(n) lying above T(n) for all n >= n0.] Ack: Graph obtained from

5 Time Complexity: Big O Notation
Example: T(n) = n^2 + 2n + 3; then f(n) = n^2, i.e., T(n) = O(n^2). Proof: For n >= 2 (= n0), 2n <= n^2 and 3 <= n^2, so T(n) = n^2 + 2n + 3 <= n^2 + n^2 + n^2 = 3n^2; thus with c = 3, T(n) <= c*f(n) for all n >= n0.
In general, to determine the big O complexity term for T(n), determine its most dominant term (the term that will be greater than all other terms for a large enough n), get rid of all constants in the term, and if this simplified term is f(n), then T(n) = O(f(n)). f(n) is also called the asymptotic complexity of T(n).
Sometimes it is important to retain the multiplicative constants or exponentiation constants, or all constants (e.g., in other complexity measures such as hardware cost); thus it may be important not to simplify 3n^2 to n^2. This depends on the detail to which the algorithm is being analyzed and on the complexity of a competing algorithm.
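The witness pair (c, n0) = (3, 2) from the proof above can be spot-checked numerically; a minimal sketch:

```python
# Sketch: verify T(n) <= c*f(n) for f(n) = n^2, c = 3, over a range of n >= n0 = 2.

T = lambda n: n**2 + 2*n + 3
f = lambda n: n**2
c, n0 = 3, 2

assert all(T(n) <= c * f(n) for n in range(n0, 10_000))
print("T(n) <= 3*n^2 for all tested n >= 2, consistent with T(n) = O(n^2)")
```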

6 Time Complexity: Big Ω and Big Θ Notations
Definition of "big Omega": We need a notation for the lower bound; the capital omega (Ω) notation is used in this case. We say that T(n) = Ω(g(n)) when there exists a constant c s.t. T(n) >= c*g(n) for all sufficiently large n (n >= n0). Examples:
– n = Ω(1)
– n^2 = Ω(n)
– n^2 = Ω(n log(n))
– 2n + 1 = Ω(n)
Definition of "big Theta": To measure the complexity of a particular algorithm means to find both the upper and lower bounds; the Θ notation is used in this case. We say that T(n) = Θ(g(n)) if and only if T(n) = O(g(n)) and T(n) = Ω(g(n)). If a worst-case analysis is quite exact in terms of the dominating term in T(n), we can say T(n) = Θ(g(n)) rather than T(n) = O(g(n)). Examples:
– 2n = Θ(n)
– n^2 + n + 1 = Θ(n^2)
Ack: Obtained from

7 Time Complexity: Big Ω and Big Θ Notations (contd.)
[Figure: the asymptotic bounds O and Θ on T(n); graphs not captured in the transcript.] Ack: Graph obtained from

8 Time Complexity: When We Can Use Θ and When We Cannot
Usage note: When our analysis of T(n) (or any counting function) is exact at least for the dominating term, we can and should use Θ. Otherwise we use the O notation.
Example 1:
– m(n) = worst-case # of MTs of a non-trivial n-variable function f( ) (!= 0 or 1). Certainly m(n) < 2^n. If we choose MTs that have an even # of 1s in their binary representation, we have 2^(n-1) MTs, and the function is non-trivial (it is an even-parity function). Thus m(n) = O(2^(n-1)) = O(2^n).
– The question is whether we can say that m(n), in the worst case, is at least of the order of 2^(n-1). Since our analysis above was exact, we can say yes. More formally, if we choose a constant c2 = 0.5, then the # of MTs in an even-parity function is > c2*2^(n-1), and thus m(n) > c2*2^(n-1). Thus we have m(n) = Ω(2^(n-1)), and thus m(n) = Ω(2^n) (2^n and 2^(n-1) are of the same order, as they differ only by a multiplicative constant of 2).
– From the above, m(n) = Θ(2^n).
Example 2:
– p(n) = worst-case # of PIs of an n-variable function f( ). We know that across all n-variable functions the # of PIs (or the # of ternary notations w/ symbols 0, 1, X) is at most 3^n. So we can say p(n) = O(3^n).
– However, the above analysis is not exact, in the sense that, for this analysis at least, we do not know for certain if there is any function that has p(n) of the order of 3^n. Thus we cannot say that p(n) = Ω(3^n), and thus we cannot say that p(n) = Θ(3^n).
– Note that, as has been shown earlier, the max # of PIs covering an MT can be of the order of C(n, n/2) ~ 2^(n-1). We can have a function f( ) that is the sum of these PIs. Thus we can say that in the worst case p(n) = Ω(2^n). But since 3^n is not of the same order as 2^n (3^n = (2^n)^(log2 3), i.e., 3^n != Θ(2^n): 3^n = Ω(2^n) but 3^n != O(2^n)), we cannot say that p(n) = Θ(3^n).
Ack: Graph obtained from
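Example 1's count of 2^(n-1) minterms for the even-parity function can be verified by brute force for small n; a minimal sketch (my own illustration, not from the slides):

```python
# Sketch: count minterms (MTs) of the n-variable even-parity function,
# i.e., the input combinations whose binary representation has an even # of 1s.

def even_parity_minterm_count(n):
    return sum(1 for m in range(2 ** n) if bin(m).count("1") % 2 == 0)

for n in range(1, 11):
    assert even_parity_minterm_count(n) == 2 ** (n - 1)
print("even-parity function on n variables has 2^(n-1) MTs (checked for n = 1..10)")
```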

9 Different Types of Time Complexity Analysis
The term "analysis of algorithms" is used to describe approaches to the study of the performance of algorithms. The most prevalent type is worst-case analysis: the worst-case runtime complexity of an algorithm is the function defined by the maximum number of steps taken on any instance of size n. The best-case runtime complexity is the function defined by the minimum number of steps taken on any instance of size n. The average-case runtime complexity is the function defined by the average number of steps taken over all instances of size n. The amortized runtime complexity is the function defined by a sequence of operations applied to an input of size n and averaged over that sequence.
Example: Consider sequential search for an element in an array of size n.
– Its worst-case runtime complexity is O(n) = Θ(n)
– Its best-case runtime complexity is O(1)
– Its average-case runtime complexity is O(n/2) = O(n) = Θ(n). Why?
Ack: Obtained from
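The three cases are easy to see in code; a minimal sketch of the slide's sequential-search example (the instrumentation and the average-case note are mine):

```python
# Sketch: sequential search, counting element comparisons.

def sequential_search(a, key):
    """Return (index, # of comparisons), or (-1, n) if key is absent."""
    comparisons = 0
    for i, x in enumerate(a):
        comparisons += 1
        if x == key:
            return i, comparisons
    return -1, comparisons

a = list(range(100))
print(sequential_search(a, 0))    # best case: 1 comparison  -> O(1)
print(sequential_search(a, 99))   # worst case: n comparisons -> O(n) = Theta(n)
# Average case ("Why?"): if the key is equally likely to be at any of the n
# positions, the expected count is (1 + 2 + ... + n)/n = (n+1)/2 ~ n/2 -> Theta(n).
```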

10 Examples of Time Complexity Analysis
[Worked example not captured in the transcript.] Ack: Adapted from
This count also is Θ(n^3). Generally, it is important to count only the main operations: here, only additions and multiplications. There are n^3 additions and n^3 multiplications, and we arrive at the same O and Θ complexities of n^3.
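The slide's worked example is not in the transcript, but a standard computation consistent with "n^3 additions and n^3 multiplications" is square matrix multiplication; a sketch under that assumption:

```python
# Sketch: n x n matrix multiplication via the classic triple loop. The inner
# statement (1 multiplication + 1 addition) executes n * n * n times, giving
# n^3 multiplications and n^3 additions, i.e., T(n) = Theta(n^3).

def matmul_with_counts(A, B):
    n = len(A)
    C = [[0] * n for _ in range(n)]
    mults = adds = 0
    for i in range(n):
        for j in range(n):
            for k in range(n):
                C[i][j] += A[i][k] * B[k][j]  # 1 mult + 1 add per iteration
                mults += 1
                adds += 1
    return C, mults, adds

A = [[1, 2], [3, 4]]
C, mults, adds = matmul_with_counts(A, A)
print(mults, adds)  # 8 and 8 for n = 2, i.e., n^3 of each
```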

11 Examples of Time Complexity Analysis (contd.)
[Slide content is a figure not captured in the transcript.] Ack: Obtained from

12 Merge Sort (MS) Analysis
One way to solve recurrences is to open them up and obtain a series of terms to be summed. Below, we obtain the complexity of merge sort directly (ignoring the recurrence). The main basic operation is a comparison of 2 elements/numbers.
[Figure: recursion tree, with MS(n) breaking up into MS(n/2) and MS(n/4) subproblems, and Merge(n/2) and Merge(n) combining results on the way back up. Legend: computation break-up arrows; data transfer/dependency arrows.]
Merge(n): worst-case comparisons = ? best-case comparisons = ?
Total, level 1: worst-case: ? best-case: ?
Total, level 2: worst-case: ? best-case: ?
Total, last merge level (?): worst-case: ? best-case: ?
Total worst-case complexity: ? Total best-case complexity: ?
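The blanks above are left as an exercise; to experiment with them, here is a merge sort sketch instrumented to count comparisons (my own code, not from the slides):

```python
# Sketch: merge sort with a count of element comparisons (the basic operation
# named above), so the worst-/best-case blanks can be explored empirically.
import random

def merge(left, right):
    out, i, j, comps = [], 0, 0, 0
    while i < len(left) and j < len(right):
        comps += 1                      # one element comparison per iteration
        if left[i] <= right[j]:
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    out.extend(left[i:]); out.extend(right[j:])
    return out, comps

def merge_sort(a):
    if len(a) <= 1:
        return a, 0
    mid = len(a) // 2
    left, cl = merge_sort(a[:mid])
    right, cr = merge_sort(a[mid:])
    merged, cm = merge(left, right)
    return merged, cl + cr + cm

random.seed(0)
a = list(range(16)); random.shuffle(a)
print(merge_sort(sorted(a))[1])  # sorted input: near the best case
print(merge_sort(a)[1])          # shuffled input: closer to the worst case
```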

13 Importance of Asymptotic Analysis: Worst- & Average-Case
[Table not captured in the transcript.] Ack: Table obtained from
Asymptotic analysis tells us whether a technique/algorithm will be practical in all cases (worst-case analysis) or in the average case (average-case analysis) for problem sizes of interest.

14 Importance of Asymptotic Analysis: Worst- & Average-Case (contd.)
Assume each basic operation takes 1 μs. [Table: T(n) values in μs for various growth functions T(n) and problem sizes n; not captured in the transcript.] Ack: Obtained from
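The table itself was not captured, but one like it can be regenerated under the slide's 1 μs per basic operation assumption; the growth functions and n values below are my guesses at typical textbook choices, not necessarily the slide's:

```python
# Sketch: T(n) values in microseconds, assuming each basic operation takes 1 us.
import math

funcs = [("log n",   lambda n: math.log2(n)),
         ("n",       lambda n: n),
         ("n log n", lambda n: n * math.log2(n)),
         ("n^2",     lambda n: n ** 2),
         ("2^n",     lambda n: 2 ** n)]
ns = (10, 20, 50)

print(f"{'T(n)':<10}" + "".join(f"{n:>12}" for n in ns))
for name, f in funcs:
    print(f"{name:<10}" + "".join(f"{f(n):>12.3g}" for n in ns))
# E.g., 2^50 us is about 1.13e15 us, i.e., roughly 35.7 years: exponential
# algorithms become impractical long before polynomial ones do.
```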

15 Importance of Asymptotic Analysis: Worst- & Average-Case (contd.)
[Table of T(n) values vs. n, continued; not captured in the transcript.] Ack: Obtained from

16 Asymptotic Complexity and Efficient Algorithms
[Slide content not captured in the transcript.] Ack: Obtained from

17 Concluding Remarks
[Slide content not captured in the transcript.] Ack: Obtained from

