# ALG0183 Algorithms & Data Structures, Lecture 7: Big-Oh, Big-Omega, Big-Theta, Little-Oh (8/25/2009, by Dr Andy Brooks)


T(N) is the running time of an algorithm.

Readings: Weiss Chapter 5; Sahni Chapter 3.

## Big-Oh

Formally, T(N) is O(F(N)) if there are positive constants c and N₀ such that T(N) ≤ cF(N) when N ≥ N₀.

- Remember that for small N, comparing growth behaviour is less straightforward.

For a sufficiently large N, T(N) is bounded by some multiple of F(N). When considering growth rates, Big-Oh means "less than or equal to". Big-Oh is a potentially reachable upper bound.

## Example: T(N) = 3N + 2

When N ≥ 2, T(N) ≤ 4N, so T(N) = O(N): c = 4 and N₀ = 2.
When N ≥ 1, T(N) ≤ 5N, so T(N) = O(N): c = 5 and N₀ = 1.

The particular values of c and N₀ used to satisfy the definition of Big-Oh do not really matter.

| N | 3N+2 | 4N | 5N |
|---|------|----|----|
| 1 | 5    | 4  | 5  |
| 2 | 8    | 8  | 10 |
| 3 | 11   | 12 | 15 |
| 4 | 14   | 16 | 20 |
| 5 | 17   | 20 | 25 |
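The witnesses above can be spot-checked numerically over a finite range. A minimal sketch, not from the lecture; `holds_for` is a hypothetical helper:

```python
# Hypothetical helper: check a Big-Oh witness (c, N0) for T(N) = 3N + 2
# against F(N) = N, over N in [N0, n_max].
def holds_for(T, F, c, n0, n_max=10_000):
    """True if T(N) <= c*F(N) for every N in [n0, n_max]."""
    return all(T(n) <= c * F(n) for n in range(n0, n_max + 1))

T = lambda n: 3 * n + 2
F = lambda n: n

print(holds_for(T, F, c=4, n0=2))  # True: the witness c=4, N0=2
print(holds_for(T, F, c=5, n0=1))  # True: the witness c=5, N0=1
print(holds_for(T, F, c=3, n0=1))  # False: 3N + 2 > 3N for every N
```

A finite scan cannot prove the bound for all N, but it is a quick sanity check that a chosen (c, N₀) pair is plausible.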

## Example: T(N) = 10N² + 4N + 2

When N ≥ 2, T(N) ≤ 10N² + 5N. When N ≥ 5, 5N ≤ N², so T(N) ≤ 10N² + N² = 11N². So T(N) = O(N²).

| N | 10N²+4N+2 | 11N² |
|---|-----------|------|
| 1 | 16        | 11   |
| 2 | 50        | 44   |
| 3 | 104       | 99   |
| 4 | 178       | 176  |
| 5 | 272       | 275  |

## Example: T(N) = 6·2^N + N²

When N ≥ 4, N² ≤ 2^N, so T(N) ≤ 6·2^N + 2^N = 7·2^N. So T(N) = O(2^N): exponential complexity.

| N | 6·2^N + N² | 7·2^N |
|---|------------|-------|
| 1 | 13         | 14    |
| 2 | 28         | 28    |
| 3 | 57         | 56    |
| 4 | 112        | 112   |
| 5 | 217        | 224   |

N = 1 and N = 2 are special cases: the inequality T(N) ≤ 7·2^N happens to hold there, but it fails at N = 3, which is why N₀ = 4 is used.
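Both inequalities used in this example can be checked mechanically for a range of N. A small sketch, not from the slides:

```python
# Verify the two claims above for N = 4..63:
# N^2 <= 2^N, and hence 6*2^N + N^2 <= 7*2^N.
for n in range(4, 64):
    assert n**2 <= 2**n                  # N^2 <= 2^N once N >= 4
    assert 6 * 2**n + n**2 <= 7 * 2**n   # so T(N) <= 7*2^N

print("both inequalities hold for N = 4..63")
```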

## Step counts may vary between analysts, but Big-Oh should not

Analyst A may reckon two algorithms to be n² + 3n and 43n: the break-even point is n = 40. If Analyst B reckons the second algorithm is 83n, the break-even point is n = 80. If Analyst B reckons the first algorithm is 2n² + 3n, the break-even point is n = 20. Regardless of the discrepancies in step counts, we know that for sufficiently large n, the algorithm which is O(N) is better than the algorithm which is O(N²). (Figure ©McGraw-Hill.)
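The three break-even points quoted above can be recomputed from the analysts' step-count formulas. A sketch with a hypothetical `break_even` helper (not from the text):

```python
# Hypothetical helper: the break-even point is the smallest N from
# which the first step count never exceeds the second (linear scan
# over a finite range).
def break_even(f, g, n_max=1000):
    for n in range(1, n_max):
        if all(f(m) <= g(m) for m in range(n, n_max)):
            return n
    return None

print(break_even(lambda n: 43 * n, lambda n: n**2 + 3 * n))       # 40
print(break_even(lambda n: 83 * n, lambda n: n**2 + 3 * n))       # 80
print(break_even(lambda n: 43 * n, lambda n: 2 * n**2 + 3 * n))   # 20
```

The constants differ between analysts, but in every pairing the linear formula eventually wins, which is the point of the slide.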

## Some notes on Big-Oh

If the running time of an algorithm is linear, then saying that the algorithm is O(N²) is technically correct, as the inequality holds.

- Of course, it is better to say the algorithm is O(N).

Do not write O(2N²) or O(N² + N); write O(N²). In expressing Big-Oh, we throw away constants, lower-order terms, and relational symbols. If an algorithm comprises a sequence of two compound statements, one of which is O(N) and the other O(N²), the overall Big-Oh is O(N²).
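The sequencing rule at the end can be seen in code: an O(N) statement followed by an O(N²) statement performs N + N² basic steps, and the quadratic term dominates. A hypothetical step-counting sketch:

```python
# Count "steps" for an O(N) compound statement followed by an
# O(N^2) compound statement.
def steps(n):
    count = 0
    for i in range(n):        # O(N) part
        count += 1
    for i in range(n):        # O(N^2) part: nested loops
        for j in range(n):
            count += 1
    return count              # N + N^2 steps in total

print(steps(10))    # 110
print(steps(100))   # 10100: the N^2 term dominates the total
```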

## Limitations of Big-Oh

- A Big-Oh analysis is not appropriate for small amounts of input: use the simplest algorithm. The more complex an algorithm, the more likely there is a bug in the implementation.
- Sometimes large constants, hidden by a Big-Oh analysis, dictate running time.
- A Big-Oh analysis assumes an infinite memory capacity, but running time in the real world depends on the sizes of available cache, RAM, etc.
- Worst-case performance sometimes occurs only under rare input conditions which may never arise in practice: Big-Oh is an overestimation under these circumstances. (Worst-case bounds are usually easy to calculate, however.)

## Estimating running times knowing Big-Oh

"If an algorithm takes T(N) time to solve a problem of size N, how long does it take to solve a larger problem? For instance, how long does it take to solve a problem when there is 10 times as much input?" (Weiss)

## How long does it take to solve a problem when there is 10 times as much input?

If T(N) = cN, then T(10N) = c(10N) = 10cN = 10T(N): running time increases by a factor of 10.

If T(N) = cN², then T(10N) = c(10N)² = 100cN² = 100T(N): running time increases by a factor of 100.

If T(N) = cN³, then T(10N) = c(10N)³ = 1000cN³ = 1000T(N): running time increases by a factor of 1,000.
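The constant c cancels in the ratio T(10N)/T(N), so any positive value gives the same factors. A quick numeric illustration, not from the slides (c = 2.5 and N = 1000 are arbitrary choices):

```python
# Ratio T(10N)/T(N) for the three polynomial running-time models;
# the constant c cancels, leaving exactly 10, 100, and 1000.
c, N = 2.5, 1000
for name, T in (("cN",   lambda n: c * n),
                ("cN^2", lambda n: c * n**2),
                ("cN^3", lambda n: c * n**3)):
    print(name, T(10 * N) / T(N))
```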

## Timing measures (microseconds) for the maximum contiguous subsequence sum algorithms

(Table of timing measures not reproduced here.) The reported times for N = 10 are not meaningful. Times grow by ×10, ×100, and ×1000, but not exactly so. For N = 100,000 and the cubic algorithm, the predicted time is 1000 × 255 seconds (over 70 hours).

## Logarithmic terms

Estimating accurately for ten times as much input is not as easy when the Big-Oh has logarithmic terms. Suppose an algorithm is O(N log N). If T(N) = cN log N, then:

T(10N) = c(10N) log(10N) = 10cN log(10N) = 10cN log N + 10cN log 10 = 10T(N) + c′N, where c′ = 10c log 10.

Running time increases by a factor slightly larger than 10. Note, however, that for very large N the ratio T(10N)/T(N) gets closer and closer to 10, because c′N/T(N) = (10 log 10)/log N, which gets smaller as N increases.
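The claim can be checked numerically; c cancels in the ratio, so it is omitted in this sketch (not from the slides):

```python
import math

# Ratio T(10N)/T(N) for T(N) = cN log N: a bit above 10,
# slowly approaching 10 as N grows.
def ratio(n):
    T = lambda m: m * math.log10(m)   # c omitted: it cancels
    return T(10 * n) / T(n)

print(ratio(100))      # 15.0: well above 10 for small N
print(ratio(10**9))    # about 11.1: closing in on 10
```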

## Big-Omega Ω

Formally, T(N) is Ω(F(N)) if there are positive constants c and N₀ such that T(N) ≥ cF(N) when N ≥ N₀. When considering growth rates, Big-Omega means "greater than or equal to". Big-Omega is a potentially reachable lower bound.

## Big-Omega Ω examples

For N ≥ 0, T(N) = 3N + 2 > 3N, so T(N) is Ω(N).
For N ≥ 0, T(N) = 10N² + 4N + 2 > 10N², so T(N) is Ω(N²).
For N > 0, T(N) = 6·2^N + N² > 6·2^N, so T(N) is Ω(2^N).

## Big-Theta Θ

Big-Theta can be used when an algorithm is bounded from above and below by the same function. Formally, T(N) is Θ(F(N)) if and only if T(N) is O(F(N)) and T(N) is Ω(F(N)). When considering growth rates, Big-Theta means equality: the growth rate of T(N) equals the growth rate of F(N). Big-Theta gives an exact asymptotic characterization.

"In spite of the additional precision offered by Big-Theta, however, Big-Oh is more commonly used, except by researchers in the algorithm analysis field." (Weiss)

## Big-Theta Θ examples

T(N) = 3N + 2 = Θ(N) (T(N) > 3N and T(N) ≤ 4N, from previous examples).
T(N) = 10N² + 4N + 2 = Θ(N²) (T(N) > 10N² and T(N) ≤ 11N², from previous examples).
T(N) = 6·2^N + N² = Θ(2^N) (T(N) > 6·2^N and T(N) ≤ 7·2^N, from previous examples).
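A Θ claim packages two checks, the Ω lower bound and the Big-Oh upper bound. For the first example both can be spot-checked at once over a finite range (a sketch, not from the slides):

```python
# For T(N) = 3N + 2: lower bound 3N (Omega) and upper bound 4N (Big-Oh)
# hold simultaneously from N = 2 on, witnessing Theta(N).
T = lambda n: 3 * n + 2
assert all(3 * n <= T(n) <= 4 * n for n in range(2, 10_000))
print("3N <= T(N) <= 4N for N = 2..9999")
```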

## Little-Oh o

Formally, T(N) is o(F(N)) if and only if T(N) is O(F(N)) and T(N) is not Θ(F(N)). When considering growth rates, Little-Oh means strictly less than. If an algorithm is o(N²), it is definitely growing at a slower rate than N²: it is subquadratic. A bound of o(N²) is better than Θ(N²).

## Little-Oh o example

If T(N) = 3N + 2, T(N) is O(N²) but is not Ω(N²), so T(N) is o(N²): the growth is less than quadratic.
If T(N) = 10N² + 4N + 2, T(N) is o(N³): the growth is less than cubic. (T(N) is not o(N²), as the growth is quadratic.)
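A standard equivalent view of Little-Oh, not stated on the slide, is that T(N) is o(F(N)) when the ratio T(N)/F(N) tends to 0 as N grows. For the first example:

```python
# Watch T(N)/N^2 shrink toward 0 for T(N) = 3N + 2:
# subquadratic growth, i.e. T(N) is o(N^2).
T = lambda n: 3 * n + 2
for n in (10, 1000, 100_000):
    print(T(n) / n**2)   # 0.32, then 0.003002, then 3.00002e-05
```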
