
Published by Cynthia Ilsley. Modified about 1 year ago.

Discrete Structures CISC 2315: Growth of Functions

Introduction
Once an algorithm for a problem has been given and shown to be correct, it is important to determine how many resources, such as time or space, the algorithm will require.
– We focus mainly on time in this course.
– Such an analysis often allows us to improve our algorithms.

Running Time Analysis
(Diagram: N data items are fed to Algorithm 1 and Algorithm 2, which return in times T_1(N) and T_2(N), respectively.)

Analysis of Running Time (which algorithm is better?)
(Plot: running time T(N) versus number of input items N, with one curve for Algorithm 1 and one for Algorithm 2.)

Comparing Algorithms
We need a theoretical framework in which we can compare algorithms. The idea is to establish a relative order among different algorithms in terms of their relative rates of growth. The rates of growth are expressed as functions, generally in terms of the number/size of inputs N. We also want to ignore details: e.g., the rate of growth is N^2 rather than 3N^2 - 5N + 2.
NOTE: The text uses the variable x. We use N here.

Big-O Notation
Definition: T(N) is O(f(N)) if there are positive constants c and n_0 such that T(N) <= c f(N) whenever N >= n_0.
This says that the function T(N) grows at a rate no faster than f(N); thus c f(N) is an upper bound on T(N).
NOTE: We use |T(N)| <= c |f(N)| if the functions could be negative. In this class, we will assume they are positive unless stated otherwise.
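The definition can be spot-checked numerically for a proposed witness pair (c, n_0). A minimal sketch in Python (the helper name and sample functions are illustrative, not from the slides); a finite check can only falsify a bad witness, never prove the bound for all N:

```python
def is_bounded(T, f, c, n0, n_max=10_000):
    """Check T(N) <= c*f(N) for every N in [n0, n_max].

    This cannot prove T(N) = O(f(N)) -- only an argument for all
    N >= n0 can -- but it quickly falsifies a bad witness (c, n0).
    """
    return all(T(N) <= c * f(N) for N in range(n0, n_max + 1))

# Witness for 3N^2 - 5N + 2 = O(N^2): c = 3, n0 = 1 works,
# since -5N + 2 <= 0 for all N >= 1.
T = lambda N: 3 * N * N - 5 * N + 2
f = lambda N: N * N
```

With c = 1 the check fails (e.g. at N = 3, T(3) = 14 > 9), which is why a witness constant larger than the leading coefficient is needed here.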

Big-O Upper Bound
(Plot: running time T(N) versus number of input items N, with T(N) lying below the upper bound c f(N).)

Big-O Example
Prove that T(N) is O(f(N)):
– Since T(N) <= c f(N) for an appropriate constant c and all N >= n_0,
– then T(N) = O(f(N)) by definition.
We could also prove a larger upper bound, but the first upper bound is tighter (lower). How do you show that something is not O(f(N))?

Another Big-O Example
Prove another upper bound by exhibiting constants c and n_0 that satisfy the definition. A larger bound could also be proved, but the first bound is tighter.

Why Big-O?
Comparing the time complexity of algorithms gets very complicated when you keep all the details. Big-O discards the details and focuses on the most important part of the comparison. Note that Big-O is typically used for worst-case analysis.

Same-Order Functions
Let f(N) = N^2 + 2N + 1 and let g(N) = N^2.
g(N) = O(f(N)) because N^2 <= N^2 + 2N + 1.
f(N) = O(g(N)) because:
– N^2 + 2N + 1 <= N^2 + 2N^2 + N^2 = 4N^2 for N >= 1, which is O(N^2) with c = 4 and n_0 = 1.
In this case we say that f(N) and g(N) are of the same order.
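Both directions can be verified numerically over a range of N; this is only a spot check of the constants c = 1 and c = 4, not a proof:

```python
# f and g from the same-order example: f(N) = N^2 + 2N + 1, g(N) = N^2.
f = lambda N: N * N + 2 * N + 1
g = lambda N: N * N

# g(N) <= f(N), and f(N) <= 4*g(N) for N >= 1 (n0 = 1 from the slide).
both_directions = all(
    g(N) <= f(N) and f(N) <= 4 * g(N) for N in range(1, 1001)
)
```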

Simplifying the Big-O
Theorem 1 (in text):
– Let f(N) = a_n N^n + a_(n-1) N^(n-1) + … + a_1 N + a_0.
– Then f(N) is O(N^n).

Another Big-O Example
Recall that N! = N*(N-1)*…*3*2*1 for a positive integer N, and 0! = 1.
Then N! = N*(N-1)*…*3*2*1 <= N*N*N*…*N = N^N.
Therefore N! = O(N^N). But this is not all we know: taking the log of both sides, log N! <= log N^N = N log N.
Therefore log N! is O(N log N) (c = 1, n_0 = 2). Recall that we assume logs are base 2.
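Both inequalities can be confirmed with Python's math module. This is a spot check over small N, not a proof (and note that math.log2 of a huge factorial is exact enough here because the true gap is large):

```python
from math import factorial, log2

# N! <= N^N for N >= 1, hence log2(N!) <= N * log2(N) for N >= 2.
fact_bound = all(factorial(N) <= N**N for N in range(1, 100))
log_bound = all(log2(factorial(N)) <= N * log2(N) for N in range(2, 100))
```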

Another Big-O Example
In Section 3.2 it will be shown that N < 2^N. Taking the log of both sides, log N < N.
Therefore log N = O(N) (c = 1 and n_0 = 1).

Complexity Terminology
O(1): constant complexity
O(log N): logarithmic complexity
O(N): linear complexity
O(N log N): N log N complexity
O(N^b): polynomial complexity
O(b^N), b > 1: exponential complexity
O(N!): factorial complexity

Growth of Combinations of Functions
Sum Rule: Suppose f_1(N) is O(g_1(N)) and f_2(N) is O(g_2(N)). Then (f_1 + f_2)(N) is O(max(g_1(N), g_2(N))).
Product Rule: Suppose f_1(N) is O(g_1(N)) and f_2(N) is O(g_2(N)). Then (f_1 f_2)(N) is O(g_1(N) g_2(N)).
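A quick numeric illustration of both rules, with functions chosen for this sketch (not from the slides): f_1(N) = 2N is O(N) and f_2(N) = 3N^2 is O(N^2), so their sum is O(max(N, N^2)) = O(N^2) and their product is O(N^3):

```python
f1 = lambda N: 2 * N          # O(N)
f2 = lambda N: 3 * N * N      # O(N^2)

# Sum Rule: 2N + 3N^2 <= 5N^2 for N >= 1 (witness c = 5 for O(N^2)).
sum_ok = all(f1(N) + f2(N) <= 5 * N**2 for N in range(1, 1001))

# Product Rule: (2N)(3N^2) = 6N^3 <= 6N^3 (witness c = 6 for O(N^3)).
prod_ok = all(f1(N) * f2(N) <= 6 * N**3 for N in range(1, 1001))
```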

Growth of Combinations of Functions
(Diagram: Subprocedure 1, running in O(g_1(N)), followed by Subprocedure 2, running in O(g_2(N)).)
What is the big-O time complexity of running the two subprocedures sequentially?
Theorem 2: If f_1(N) is O(g_1(N)) and f_2(N) is O(g_2(N)), then (f_1 + f_2)(N) = O(max(g_1(N), g_2(N))).

Growth of Combinations of Functions
(Diagram: Subprocedure 1, running in O(g_1(N)), followed by Subprocedure 2, running in O(g_2(M)).)
What is the big-O time complexity of running the two subprocedures sequentially when M is not necessarily equal to N?
Theorem 2 (variant): If f_1(N) is O(g_1(N)) and f_2(M) is O(g_2(M)), then f_1(N) + f_2(M) = O(g_1(N) + g_2(M)).

Growth of Combinations of Functions
What is the big-O time complexity for nested loops?
Theorem 3: If f_1(N) is O(g_1(N)) and f_2(N) is O(g_2(N)), then (f_1 * f_2)(N) = O(g_1(N) * g_2(N)).
for j := 1 to N do
  for i := 1 to 2N do
    a_ij := 1
N iterations * 2N iterations = 2N^2 steps, which is O(N^2).
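The nested loop can be instrumented directly; counting one step per inner-loop iteration gives exactly 2N^2 steps (a sketch; the function name is illustrative):

```python
def nested_loop_steps(N):
    """Count assignments in: for j in 1..N: for i in 1..2N: a[i][j] := 1."""
    steps = 0
    for j in range(1, N + 1):          # outer loop: N iterations
        for i in range(1, 2 * N + 1):  # inner loop: 2N iterations
            steps += 1                 # one assignment per inner iteration
    return steps
```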

Growth of Combinations of Functions: Example 1
Give a big-O estimate for f(N) = 3N log(N!) + (N^2 + 3) log N, where N is a positive integer.
Solution:
– First estimate 3N log(N!). Since log(N!) is O(N log N) and 3N is O(N), the Product Rule gives 3N log(N!) = O(N^2 log N).
– Next estimate (N^2 + 3) log N. Since N^2 + 3 <= 2N^2 for N >= 2, (N^2 + 3) is O(N^2). From the Product Rule, (N^2 + 3) log N = O(N^2 log N).
– Using the Sum Rule to combine these estimates, f(N) = 3N log(N!) + (N^2 + 3) log N is O(N^2 log N).
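Chasing the constants through the argument gives a concrete witness, c = 5 with n_0 = 2 (3N log(N!) <= 3N^2 log N and (N^2 + 3) log N <= 2N^2 log N), which we can spot-check:

```python
from math import factorial, log2

def f(N):
    return 3 * N * log2(factorial(N)) + (N * N + 3) * log2(N)

# Witness check: f(N) <= 5 * N^2 * log2(N) for N >= 2.
bound_ok = all(f(N) <= 5 * N**2 * log2(N) for N in range(2, 150))
```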

Growth of Combinations of Functions: Example 2
Give a big-O estimate for f(N) = (N + 1) log(N^2 + 1) + 3N^2.
Solution:
– First estimate (N + 1) log(N^2 + 1). We know (N + 1) is O(N). Also, N^2 + 1 <= 2N^2 for N >= 1, so log(N^2 + 1) <= log(2N^2) = 1 + 2 log N. We conclude log(N^2 + 1) = O(log N).
– From the Product Rule, (N + 1) log(N^2 + 1) = O(N log N). Also, 3N^2 = O(N^2).
– Using the Sum Rule to combine these estimates, f(N) = O(max(N log N, N^2)). Since N log N <= N^2 for N >= 1, f(N) = O(N^2).
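Again the constants can be chased through: (N + 1) <= 2N for N >= 1 and log2(N^2 + 1) <= 1 + 2 log2 N <= 2N for N >= 2, so f(N) <= 4N^2 + 3N^2 = 7N^2 for N >= 2. A spot check of that witness:

```python
from math import log2

def f(N):
    return (N + 1) * log2(N * N + 1) + 3 * N * N

# Witness check: f(N) <= 7 * N^2 for N >= 2, matching the O(N^2) estimate.
bound_ok = all(f(N) <= 7 * N**2 for N in range(2, 2000))
```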

Big-Omega Notation
Definition: T(N) is Ω(f(N)) if there are positive constants c and n_0 such that T(N) >= c f(N) whenever N >= n_0.
This says that the function T(N) grows at a rate no slower than f(N); thus c f(N) is a lower bound on T(N).

Big-Omega Lower Bound
(Plot: running time T(N) versus number of input items N, with T(N) lying above the lower bound c f(N).)

Big-Omega Example
Prove that T(N) is Ω(f(N)):
– Since T(N) >= c f(N) for an appropriate constant c and all N >= n_0,
– then T(N) = Ω(f(N)) by definition.
We could also prove a smaller lower bound, but the first lower bound is tighter (higher).

Another Big-Omega Example
Prove another lower bound by exhibiting constants c and n_0 that satisfy the definition. A smaller bound could also be proved, but the first bound is tighter.

Big-Theta Notation
Definition: T(N) is θ(f(N)) if T(N) is both O(f(N)) and Ω(f(N)).
– This says that the function T(N) grows at the same rate as f(N). Put another way: we say that T(N) is of order f(N).

Big-Theta Example
Show that 3N^2 + 8N log N is θ(N^2).
– Since log N <= N, we have 3N^2 + 8N log N <= 3N^2 + 8N^2 = 11N^2 for N >= 1.
– Therefore 3N^2 + 8N log N is O(N^2).
– Clearly 3N^2 + 8N log N >= 3N^2, so it is Ω(N^2).
– We conclude that 3N^2 + 8N log N is θ(N^2).
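The two witnesses, c = 3 below and c = 11 above with n_0 = 1, can be spot-checked together:

```python
from math import log2

T = lambda N: 3 * N * N + 8 * N * log2(N)
f = lambda N: N * N

# Theta sandwich: 3*f(N) <= T(N) <= 11*f(N) for all N >= 1.
theta_ok = all(3 * f(N) <= T(N) <= 11 * f(N) for N in range(1, 1001))
```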

A Hierarchy of Growth Rates
1 < log N < N < N log N < N^2 < N^3 < … < 2^N < N!
If f(N) is O(x) for x one of the above, then it is O(y) for any y later in the ordering. But the higher bounds are not tight, and we prefer tighter bounds. Note that if the hierarchy states that x < y, then for any expression z > 0, z*x < z*y.
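The ordering can be illustrated by evaluating each growth rate at one large N (a point check at a single N only illustrates, and does not prove, the asymptotic ordering):

```python
from math import factorial, log2

# Growth-rate hierarchy evaluated at N = 64, in increasing order:
# 1, log N, N, N log N, N^2, N^3, 2^N, N!
N = 64
values = [1, log2(N), N, N * log2(N), N**2, N**3, 2**N, factorial(N)]
increasing = all(a < b for a, b in zip(values, values[1:]))
```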

Complexity of Algorithms vs. Problems
We have been talking about the polynomial, linear, logarithmic, or exponential time complexity of algorithms. But we can also talk about the time complexity of problems. Example decision problems:
– Let a, b, and c be positive integers. Is there a positive integer x < c such that x^2 ≡ a (mod b)?
– Does there exist a truth assignment to all variables that satisfies a given logical expression?

Complexity of Problems
A problem that can be solved by an algorithm with deterministic polynomial (or better) worst-case time complexity is called tractable. Problems that are not tractable are intractable.
P is the set of all problems solvable in polynomial time (the tractable problems).
NP is the set of all problems whose solutions can be checked in polynomial time; for many of these, no deterministic polynomial-time algorithm is known.
P = NP?

Complexity of Problems (cont'd)
Unsolvable problem: a problem that cannot be solved by any algorithm. Example: the Halting Problem.
(Diagram: a program and an input are fed to a hypothetical tester that answers "Will the program halt on this input?")
If we try running the program on the input and it keeps running, how do we know whether it will ever stop?
