Week 2 - Wednesday: Running time and Big Oh notation

1 Week 2 - Wednesday

2
 What did we talk about last time?
   Running time
   Big Oh notation

7
 What's the running time to factor a large number N?
 How many edges are in a completely connected graph?
 If you have a completely connected graph, how many possible tours are there (paths that start at a given node, visit all other nodes, and return to the beginning)?
 How many different n-bit binary numbers are there?
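The three counting questions can be checked numerically. A sketch in Python (the node count n = 5 is an assumed example, and tours are counted with the two directions of travel as distinct):

```python
import math

n = 5  # assumed example node count

# A completely connected graph on n nodes has one edge per pair: n(n-1)/2.
edges = n * (n - 1) // 2

# A tour fixes the start node and orders the other n-1 nodes: (n-1)! tours.
tours = math.factorial(n - 1)

# Each of the n bits can be 0 or 1 independently: 2^n binary numbers.
bit_patterns = 2 ** n

print(edges, tours, bit_patterns)  # for n = 5: 10 24 32
```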

8
 Here is a table of several different complexity measures, in ascending order, with their functions evaluated at n = 100

   Description    Big Oh       f(100)
   Constant       O(1)         1
   Logarithmic    O(log n)     6.64
   Linear         O(n)         100
   Linearithmic   O(n log n)   664.39
   Quadratic      O(n^2)       10,000
   Cubic          O(n^3)       1,000,000
   Exponential    O(2^n)       1.27 x 10^30
   Factorial      O(n!)        9.33 x 10^157
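The f(100) column can be reproduced directly (a sketch in Python; logs are taken base 2, which matches the table's values):

```python
import math

n = 100
rows = [
    ("Constant",     1),
    ("Logarithmic",  math.log2(n)),        # about 6.64
    ("Linear",       n),
    ("Linearithmic", n * math.log2(n)),    # about 664.39
    ("Quadratic",    n ** 2),
    ("Cubic",        n ** 3),
    ("Exponential",  2 ** n),              # about 1.27 x 10^30
    ("Factorial",    math.factorial(n)),   # about 9.33 x 10^157
]
for name, value in rows:
    print(f"{name:12s} {float(value):.6g}")
```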

9
 Computers get faster, but not in unlimited ways
 If computers get 10 times faster, here is how much a problem from each class could grow and still be solvable

   Description    Big Oh       Increase in size
   Constant       O(1)         Unlimited
   Logarithmic    O(log n)     1000
   Linear         O(n)         10
   Linearithmic   O(n log n)   10
   Quadratic      O(n^2)       3-4
   Cubic          O(n^3)       2-3
   Exponential    O(2^n)       Hardly changes
   Factorial      O(n!)        Hardly changes
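Where the middle rows of the table come from (a sketch in Python, assuming each class's running time is exactly its Big Oh function): the new solvable size n' satisfies f(n') = 10 · f(n).

```python
import math

speedup = 10  # computers get 10 times faster

print(speedup)             # linear: n' = 10n
print(math.sqrt(speedup))  # quadratic: n'^2 = 10n^2, so n' = sqrt(10)n, about 3.16n
print(speedup ** (1 / 3))  # cubic: n' = 10^(1/3) n, about 2.15n
print(math.log2(speedup))  # exponential: 2^n' = 10 * 2^n, so n' = n + log2(10),
                           # only about 3.3 MORE items, not 3.3 times as many
```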

10
 There is nothing better than constant time
 Logarithmic time means that the problem can become much larger and only take a little longer
 Linear time means that time grows in direct proportion to the problem size
 Linearithmic time is just a little worse than linear
 Quadratic time means that expanding the problem size significantly could make it impractical
 Cubic time is about the reasonable maximum if we expect the problem to grow
 Exponential and factorial time mean that we cannot solve anything but the most trivial problem instances

11
 What is a logarithm?
 Definition:
   If b^x = y
   Then log_b(y) = x (for b > 0, b ≠ 1)
 Think of it as a de-exponentiator
 Examples:
   log_10(1,000,000) =
   log_3(81) =
   log_2(512) =
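The examples above are left for you to fill in, but they are easy to check with the definition: log_b(y) is the x with b^x = y. A sketch in Python:

```python
import math

# log_b(y) = x exactly when b**x == y
assert 10 ** 6 == 1_000_000  # so log_10(1,000,000) = 6
assert 3 ** 4 == 81          # so log_3(81) = 4
assert 2 ** 9 == 512         # so log_2(512) = 9

# math.log(y, b) computes the same thing numerically
print(math.log10(1_000_000), math.log(81, 3), math.log2(512))
```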

12
 Add one to the floor of the logarithm of a number in a base and you'll get the number of digits you need to represent that number in that base
 In other words, the log of a number is related to its length
 Even big numbers have small logs
 If there's no subscript, log_10 is assumed in math world, but log_2 is assumed in CS
 Also common is ln, the natural log, which is log_e
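The digit-count fact can be sketched in Python. (Repeated integer division is used here instead of floor(log) + 1 directly, to avoid floating-point rounding at exact powers of the base; the two agree for positive integers.)

```python
def digit_count(y, base=10):
    """Digits needed to write the positive integer y in the given base.

    Equals floor(log_base(y)) + 1, computed by repeated division.
    """
    digits = 0
    while y > 0:
        y //= base
        digits += 1
    return digits

print(digit_count(1_000_000))    # log_10 is 6, so 7 digits in base 10
print(digit_count(255, base=2))  # 255 is 11111111 in binary: 8 bits
```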

13
 log_b(xy) = log_b(x) + log_b(y)
 log_b(x/y) = log_b(x) - log_b(y)
 log_b(x^y) = y log_b(x)
 Base conversion:
   log_b(x) = log_a(x)/log_a(b)
 As a consequence:
   log_2(n) = log_10(n)/c_1 = log_100(n)/c_2 = log_b(n)/c_3 for b > 1
   log_2(n) is O(log_10 n) and O(log_100 n) and O(log_b n) for b > 1
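These identities are easy to verify numerically. A sketch in Python (the values of x, y, and b are arbitrary examples); the last part shows why base conversion makes all log bases equivalent under Big Oh:

```python
import math

x, y, b = 48.0, 6.0, 2

# Product, quotient, and power rules
assert math.isclose(math.log(x * y, b), math.log(x, b) + math.log(y, b))
assert math.isclose(math.log(x / y, b), math.log(x, b) - math.log(y, b))
assert math.isclose(math.log(x ** y, b), y * math.log(x, b))

# Base conversion: log_2(n) / log_10(n) is the same constant for every n > 1,
# so log_2(n) and log_10(n) differ only by a constant factor
n = 1_000_000
ratio = math.log2(n) / math.log10(n)
print(ratio)  # equals 1 / log_10(2) regardless of n
```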

14
 Let f(n) and g(n) be two functions over integers
 f(n) is O(g(n)) if and only if
   f(n) ≤ c∙g(n) for all n > N
   for some positive real numbers c and N
 In other words, past some point N, some scaled copy of g(n) is always at least as big as f(n)

16
 We've been sloppy so far, saying that something is O(n) when its running time is proportional to n
 Big Oh is actually an upper bound, meaning that something whose running time is proportional to n
   Is O(n)
   But is also O(n^2)
   And is also O(2^n)
 If the running time of something is actually proportional to n, we should say it's Θ(n)
 We often use Big Oh because it's easier to find an upper bound than to get a tight bound

17
 O establishes an upper bound
   f(n) is O(g(n)) if there exist positive numbers c and N such that f(n) ≤ cg(n) for all n ≥ N
 Ω establishes a lower bound
   f(n) is Ω(g(n)) if there exist positive numbers c and N such that f(n) ≥ cg(n) for all n ≥ N
 Θ establishes a tight bound
   f(n) is Θ(g(n)) if there exist positive numbers c_1, c_2, and N such that c_1 g(n) ≤ f(n) ≤ c_2 g(n) for all n ≥ N

18
 O and Ω have a one-to-many relationship with functions
   4n^2 + 3 is O(n^2), but it is also O(n^3) and O(n^4 log n)
   6n log n is Ω(n log n), but it is also Ω(n)
 Θ is one-to-many as well, but it gives a much tighter bound
 Sometimes it is hard to find Θ
   Upper bounding isn't too hard, but lower bounding is difficult for real problems

19
 1. If f(n) is O(g(n)) and g(n) is O(h(n)), then f(n) is O(h(n))
 2. If f(n) is O(h(n)) and g(n) is O(h(n)), then f(n) + g(n) is O(h(n))
 3. an^k is O(n^k)
 4. n^k is O(n^(k+j)), for any positive j
 5. If f(n) is cg(n), then f(n) is O(g(n))
 6. log_a(n) is O(log_b(n)) for integers a and b > 1
 7. log_a(n) is O(n^k) for integer a > 1 and real k > 0
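A claimed Big Oh bound can be checked against the definition by exhibiting explicit witnesses c and N. A sketch in Python for the earlier example f(n) = 4n^2 + 3 (the witness pair c = 5, N = 2 is one choice among many, since 4n^2 + 3 ≤ 5n^2 exactly when 3 ≤ n^2, i.e. n ≥ 2):

```python
c, N = 5, 2  # one witness pair showing f(n) is O(n^2)

def f(n):
    return 4 * n * n + 3

def g(n):
    return n * n

# The definition only requires the bound to hold from N onward...
assert all(f(n) <= c * g(n) for n in range(N, 10_000))
# ...and it is allowed to fail below N
assert f(1) > c * g(1)
print("f(n) <= 5 g(n) for all checked n >= 2")
```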

20
 How much time does a binary search take at most?
 What about at least?
 What about on average, assuming that the value is in the list?

21
 Give a tight bound for n^1.1 + n log n
 Give a tight bound for 2^(n+a) where a is a constant
 Give functions f_1 and f_2 such that f_1(n) and f_2(n) are O(g(n)) but f_1(n) is not O(f_2(n))
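A numeric check that helps with the first exercise (a sketch in Python): dividing both terms by n turns the comparison into n^0.1 versus log n, and any positive power of n eventually dominates log n, so n^1.1 wins in the end. The crossover is far out, though:

```python
import math

# n^0.1 grows without bound while log n grows very slowly, so n^1.1
# eventually dominates n log n -- but only at very large n.
for n in (1e3, 1e10, 1e100):
    print(f"n = {n:.0e}: n^0.1 = {n ** 0.1:.4g}, log2 n = {math.log2(n):.4g}")
```

At n = 1000 the log term is still larger; by n = 10^100 the power term has won decisively.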

24
 Implementing an array-backed list
 Read section 1.3

25
 Finish Assignment 1
   Due Friday by 11:59pm
 Keep working on Project 1
   Due Friday, September 18 by 11:59pm

