
1 Time Complexity Dr. Jicheng Fu Department of Computer Science University of Central Oklahoma

2 Objectives (Section 7.6)
 The concepts of space complexity and time complexity
 Using the step count to derive the time complexity of a program as a function of the problem size
 Asymptotics and orders of magnitude
 The big-O and related notations
 The time complexity of recursive algorithms

3 Motivation
[Figure: a sorted array arr of n elements: 3, 5, 10, 16, 20, …, 22, 28, 36, 60]

4 Evaluate an Algorithm
Two important measures to evaluate an algorithm:
 Space complexity
 Time complexity
Space complexity:
 The maximum storage space needed by an algorithm
 Expressed as a function of the problem size
 Relatively easy to evaluate

5 Time complexity:
 Determining the number of steps (operations) needed as a function of the problem size
 Our focus

6 Step Count
Count the exact number of steps needed by an algorithm as a function of the problem size
Each atomic operation is counted as one step:
 Arithmetic operations
 Comparison operations
 Other operations, such as assignment and return

7 Algorithm 1
int count_1(int n)
{
    sum = 0
    for i=1 to n {
        for j=i to n {
            sum++
        }
    }
    return sum
}
Note: for each i the inner loop body executes (n + 1 - i) times, so sum++ runs n + (n-1) + … + 1 = n(n+1)/2 times; counting the loop-control steps as well, the running time grows as n².

8 Algorithm 2
int count_2(int n)
{
    sum = 0
    for i=1 to n {
        sum += n+1-i
    }
    return sum
}
The running time is 5n + 2.

9 Algorithm 3
int count_3(int n)
{
    sum = n(n+1)/2
    return sum
}
The running time is 5 time units, independent of n: one addition, one multiplication, one division, one assignment, and one return.
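Algorithms 1–3 all compute the same value, n(n+1)/2, at very different costs. A minimal runnable C++ sketch of the three slide algorithms (function names follow the slides; local variable declarations are added so the code compiles):

```cpp
// Algorithm 1: nested loops, roughly n^2 steps
int count_1(int n) {
    int sum = 0;
    for (int i = 1; i <= n; i++)
        for (int j = i; j <= n; j++)
            sum++;               // executes n(n+1)/2 times
    return sum;
}

// Algorithm 2: a single loop, roughly 5n + 2 steps
int count_2(int n) {
    int sum = 0;
    for (int i = 1; i <= n; i++)
        sum += n + 1 - i;        // adds the inner-loop count directly
    return sum;
}

// Algorithm 3: closed form, a constant number of steps
int count_3(int n) {
    return n * (n + 1) / 2;
}
```

All three return the same result for any n, which is the point of the comparison: the answer is fixed, only the step count differs.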

10 Asymptotics
An exact step count is usually unnecessary:
 Too dependent on programming languages and the programmer's style
 Such details make little difference in whether the algorithm is feasible or not
A change in fundamental method, however, can make a vital difference:
 If the number of operations is proportional to n, then doubling n will double the running time
 If the number of operations is proportional to 2^n, then doubling n will square the number of operations (2^(2n) = (2^n)²)

11 Example:
 Assume that a computation taking 1 second involves 10^6 operations
 Also assume that doubling the problem size squares the operation count, so the computation now requires 10^12 operations
 The running time increases from 1 second to about 11.5 days: 10^12 operations / 10^6 operations per second = 10^6 seconds ≈ 11.5 days
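The slide's arithmetic can be checked directly; this short C++ fragment just reproduces the 10^6-seconds-to-days conversion:

```cpp
// 10^12 operations at 10^6 operations per second
double seconds = 1e12 / 1e6;              // = 10^6 seconds
double days = seconds / (60 * 60 * 24);   // about 11.57 days
```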

12 Instead of an exact step count, we want a notation that:
 accurately reflects how the computation time increases with the problem size, but
 ignores details that have little effect on the total
Asymptotics: the study of functions of a parameter n as n becomes larger and larger without bound

13 Orders of Magnitude
The idea:
 Suppose the function f(n) measures the amount of work done by an algorithm on a problem of size n
 Compare f(n), for large values of n, with some well-known function g(n) whose behavior we already understand
To compare f(n) against g(n):
 take the quotient f(n) / g(n), and
 take the limit of the quotient as n increases without bound

14 Definition
 If lim(n→∞) f(n)/g(n) = 0, then f(n) has strictly smaller order of magnitude than g(n).
 If lim(n→∞) f(n)/g(n) is finite and nonzero, then f(n) has the same order of magnitude as g(n).
 If lim(n→∞) f(n)/g(n) = ∞, then f(n) has strictly greater order of magnitude than g(n).

15 Common choices for g(n):
 g(n) = 1 Constant function
 g(n) = log n Logarithmic function
 g(n) = n Linear function
 g(n) = n² Quadratic function
 g(n) = n³ Cubic function
 g(n) = 2^n Exponential function

16 Notes:
 The second case, when f(n) and g(n) have the same order of magnitude, includes all values of the limit except 0 and ∞
 Changing the running time of an algorithm by any nonzero constant factor will not affect its order of magnitude

17 Polynomials
If f(n) is a polynomial in n with degree r, then f(n) has the same order of magnitude as n^r
If r < s, then n^r has strictly smaller order of magnitude than n^s
Example 1:
 3n² - 100n - 25 has strictly smaller order than n³

18 Example 2:
 3n² - 100n - 25 has strictly greater order than n
Example 3:
 3n² - 100n - 25 has the same order as n²

19 Logarithms
The order of magnitude of a logarithm does not depend on the base of the logarithm
 Let log_a n and log_b n be logarithms to two different bases a > 1 and b > 1
 Since log_a n = (log_a b)(log_b n), the two differ only by the nonzero constant factor log_a b, so they have the same order of magnitude
 Since the base for logarithms makes no difference to the order of magnitude, we generally write log without a base

20 Compare the order of magnitude of a logarithm, log n, with a power of n, say n^r (r > 0)
 It is difficult to calculate the quotient log n / n^r directly
 We need a mathematical tool
L'Hôpital's Rule
 Suppose that f(n) and g(n) are differentiable functions for all sufficiently large n, with derivatives f'(n) and g'(n), respectively
 Suppose also that lim(n→∞) f(n) = ∞, lim(n→∞) g(n) = ∞, and lim(n→∞) f'(n)/g'(n) exists
 Then lim(n→∞) f(n)/g(n) exists, and lim(n→∞) f(n)/g(n) = lim(n→∞) f'(n)/g'(n)

21 Use L'Hôpital's Rule:
 lim(n→∞) log n / n^r = lim(n→∞) (1/n) / (r n^(r-1)) = lim(n→∞) 1 / (r n^r) = 0
Conclusion:
 log n has strictly smaller order of magnitude than any positive power n^r of n, r > 0
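A quick numerical sanity check of this conclusion (an illustration, not a proof): the quotient log n / n^r visibly shrinks toward 0 as n grows. The choice r = 0.5 in the test is arbitrary; any r > 0 behaves the same way.

```cpp
#include <cmath>

// Quotient log n / n^r from the slide's limit argument.
// As n grows this ratio approaches 0 for every fixed r > 0.
double log_over_power(double n, double r) {
    return std::log(n) / std::pow(n, r);
}
```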

22 Exponential Functions
Compare the order of magnitude of an exponential function a^n with a power of n, say n^r (r > 0)
Use L'Hôpital's Rule again (see pp. 308)
Conclusion:
 Any exponential function a^n, for any real number a > 1, has strictly greater order of magnitude than any power n^r of n, for any positive integer r

23 Compare the order of magnitude of two exponential functions with different bases, a^n and b^n
 Assume 0 < a < b; then a^n / b^n = (a/b)^n → 0 as n → ∞, since 0 < a/b < 1
Conclusion:
 If 0 < a < b, then a^n has strictly smaller order of magnitude than b^n

24 Common Orders
For most algorithm analyses, only a short list of functions is needed:
 1 (constant), log n (logarithmic), n (linear), n² (quadratic), n³ (cubic), 2^n (exponential)
 They are listed in strictly increasing order of magnitude
One more important function: n log n (see pp. 309)
 The order of some advanced sorting algorithms
 n log n has strictly greater order of magnitude than n
 n log n has strictly smaller order of magnitude than any power n^r for any r > 1

25 Growth Rate of Common Functions
[Figure: growth of 1, log n, n, n log n, n², n³, and 2^n as n increases]

27 The Big-O and Related Notations
 f(n) is o(g(n)) if f(n) has strictly smaller order of magnitude than g(n)
 f(n) is O(g(n)) if f(n) has strictly smaller or the same order of magnitude as g(n)
 f(n) is Θ(g(n)) if f(n) has the same order of magnitude as g(n)
 f(n) is Ω(g(n)) if f(n) has strictly greater or the same order of magnitude as g(n)
These notations are pronounced "little oh", "big Oh", "big Theta", and "big Omega", respectively.

28 Examples
 On a list of length n, sequential search has running time Θ(n)
 On an ordered list of length n, binary search has running time Θ(log n)
 Retrieval from a contiguous list of length n has running time O(1)
 Retrieval from a linked list of length n has running time O(n)
 Any algorithm that uses comparisons of keys to search a list of length n must make Ω(log n) comparisons of keys

29 If f(n) is a polynomial in n of degree r, then f(n) is Θ(n^r)
If r < s, then n^r is o(n^s)
If a > 1 and b > 1, then log_a(n) is Θ(log_b(n))
log n is o(n^r) for any r > 0
For any real number a > 1 and any positive integer r, n^r is o(a^n)
If 0 < a < b, then a^n is o(b^n)

30 Algorithm 4
int count_0(int n)
{
    sum = 0
    for i=1 to n {          // O(n) iterations
        for j=1 to n {      // O(n) iterations
            if i<=j then
                sum++       // O(1) work per iteration
        }
    }
    return sum
}
The running time is O(n) · O(n) · O(1) = O(n²).
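Algorithm 4 computes the same n(n+1)/2 as Algorithms 1–3, but the guard makes it pay for all n² inner iterations. A compilable C++ version of the slide's pseudocode:

```cpp
// Algorithm 4: the inner loop always runs n times, so the algorithm
// makes n * n comparisons even though only n(n+1)/2 of the
// iterations increment sum. Running time: O(n^2).
int count_0(int n) {
    int sum = 0;
    for (int i = 1; i <= n; i++)
        for (int j = 1; j <= n; j++)
            if (i <= j)
                sum++;
    return sum;
}
```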

31 Summary of Running Times
Algorithm      Running Time    Order of Running Time
Algorithm 1    n²              n²
Algorithm 2    5n + 2          n
Algorithm 3    5               Constant

32 Asymptotic Running Times
Algorithm      Running Time    Asymptotic Bound
Algorithm 1    n²              O(n²)
Algorithm 2    5n + 2          O(n)
Algorithm 3    5               O(1)
Algorithm 4    -               O(n²)

33 More Examples
1) int x = 0;
   for (int i = 0; i < 100; i++)
       x += i;
2) int x = 0;
   for (int i = 0; i < n * n; i++)
       x += i;
* Assume that the value of n is the size of the problem

34 3) int x = 0;
   for (int i = 1; i < n; i *= 2)
       x += i;
4) int x = 0;
   for (int i = 1; i < n; i++)
       for (int j = 1; j < i; j++)
           x += i + j;

35 5) int x = 0;
   for (int i = 1; i < n; i++)
       for (int j = i; j < 100; j++)
           x += i + j;
6) int x = 0;
   for (int i = 1; i < n; i++)
       for (int j = n; j > i; j /= 3)
           x += i + j;
7) int x = 0;
   for (int i = 1; i < n * n; i++)
       for (int j = 1; j < i; j++)
           x += i + j;
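One way to check an answer for these exercises is to count the iterations empirically and compare against the suspected order. A sketch for examples 3 and 4 above (the step-counting helpers are illustrative, not part of the slides): example 3 is O(log n) because i doubles each pass, and example 4 is O(n²) because the inner loop runs 0 + 1 + … + (n-2) times in total.

```cpp
// Iteration count of example 3: i = 1, 2, 4, ... while i < n,
// which is about log2(n) iterations.
int steps_example3(int n) {
    int steps = 0;
    for (int i = 1; i < n; i *= 2)
        steps++;
    return steps;
}

// Iteration count of example 4's inner body: the sum
// 0 + 1 + ... + (n-2) = (n-1)(n-2)/2, which is O(n^2).
int steps_example4(int n) {
    int steps = 0;
    for (int i = 1; i < n; i++)
        for (int j = 1; j < i; j++)
            steps++;
    return steps;
}
```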

36 Review: Arithmetic Sequences/Progressions
An arithmetic sequence is a sequence of numbers such that the difference of any two successive members of the sequence is a constant
If the first term of an arithmetic sequence is a_1 and the common difference of successive members is d, then the nth term a_n of the sequence is:
a_n = a_1 + (n - 1)d
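The nth-term formula translates directly into code; a tiny sketch (the helper name is ours, not from the slides):

```cpp
// nth term of an arithmetic sequence with first term a1 and
// common difference d: a_n = a_1 + (n - 1) * d
int nth_term(int a1, int d, int n) {
    return a1 + (n - 1) * d;
}
```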

37 Analyzing Recursive Algorithms
A recurrence equation is often used as the starting point for analyzing a recursive algorithm
 In the recurrence equation, T(n) denotes the running time of the recursive algorithm for an input of size n
We then try to convert the recurrence equation into a closed-form equation to better understand the time complexity
 Closed form: no reference to T on the right-hand side of the equation
 Conversion to a closed-form solution can be very challenging

38 Example: Factorial
int factorial(int n)
/* Pre:  n is an integer no less than 0
   Post: The factorial of n (n!) is returned
   Uses: The function factorial recursively */
{
    if (n == 0)
        return 1;
    else
        return n * factorial(n - 1);
}

39 The time complexity of factorial(n) is:
 T(0) = 2 and T(n) = T(n - 1) + 4 for n > 0 (3 + 1 steps per recursive call: the comparison is included)
 T(n) is an arithmetic sequence with common difference 4 between successive members and T(0) = 2, so T(n) = 4n + 2
 The time complexity of factorial is therefore O(n)
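The recurrence above can be evaluated directly and checked against the closed form it implies; a minimal sketch, assuming the slide's per-call cost of 4 and base cost of 2:

```cpp
// Step count of factorial modeled by the recurrence
// T(0) = 2, T(n) = T(n - 1) + 4.
// The closed form is T(n) = 4n + 2, hence O(n).
int T_factorial(int n) {
    if (n == 0)
        return 2;
    return T_factorial(n - 1) + 4;
}
```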

40 Recurrence Equations Examples
Divide and conquer: recursive merge sort
template <class Record>
void Sortable_list<Record>::recursive_merge_sort(int low, int high)
/* Post: The entries of the sortable list between index low and high have been
         rearranged so that their keys are sorted into non-decreasing order.
   Uses: The contiguous List */
{
    if (high > low) {
        recursive_merge_sort(low, (high + low) / 2);
        recursive_merge_sort((high + low) / 2 + 1, high);
        merge(low, high);
    }
}

41 The time complexity of recursive_merge_sort is:
 T(1) = O(1) and T(n) = 2T(n/2) + O(n): two half-size sorts plus a linear-time merge
 To obtain a closed-form equation for T(n), we assume n is a power of 2 and expand the recurrence: T(n) = 2^i T(n/2^i) + i·cn
 When i = log₂ n, we have: T(n) = nT(1) + cn log₂ n
 The time complexity is O(n log n)
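The expansion can be checked numerically by evaluating the recurrence for powers of 2 and comparing with the closed form. A sketch, assuming T(1) = 1 and a merge cost of exactly n (the constant c = 1 is an assumption for illustration):

```cpp
// Merge-sort recurrence for n a power of 2:
//   T(1) = 1,  T(n) = 2 T(n/2) + n
// Closed form under these constants: T(n) = n log2(n) + n.
long long T_merge(long long n) {
    if (n == 1)
        return 1;
    return 2 * T_merge(n / 2) + n;
}
```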

42 Fibonacci numbers
int fibonacci(int n)
/* fibonacci: recursive version */
{
    if (n <= 0)
        return 0;
    else if (n == 1)
        return 1;
    else
        return fibonacci(n - 1) + fibonacci(n - 2);
}

43 The time complexity of fibonacci is:
 T(n) = T(n - 1) + T(n - 2) + O(1) for n > 1
 Theorem (in Section A.4): If F(n) is defined by a Fibonacci sequence, then F(n) is Θ(g^n), where g = (1 + √5)/2 ≈ 1.618 (the golden ratio)
 The time complexity is exponential: O(g^n)
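The exponential growth can be seen by counting the calls the recursive version makes; the ratio of successive call counts approaches g ≈ 1.618. The call-counting helper below is illustrative, not from the slides:

```cpp
// Number of function calls made by the recursive fibonacci(n):
// one call for this invocation plus the calls of the two subproblems.
// This count satisfies the same kind of recurrence as T(n) and
// grows like g^n, g = (1 + sqrt(5)) / 2.
long long fib_calls(int n) {
    if (n <= 1)
        return 1;          // base case: a single call, no recursion
    return 1 + fib_calls(n - 1) + fib_calls(n - 2);
}
```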

