
1 Not all algorithms are created equally Insertion of words from a dictionary into a sorted list takes a very long time. Insertion of the same words into a balanced BST takes very little time. How can we express this difference?
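As a hypothetical Java sketch of the two approaches (the class name and the tiny word list are invented for illustration), inserting into a sorted ArrayList forces an O(n) shift of later elements on every insertion, while a TreeSet, Java's balanced BST (a red-black tree), inserts in O(log n):

import java.util.ArrayList;
import java.util.Collections;
import java.util.List;
import java.util.TreeSet;

public class InsertComparison {
    // Insert into a sorted ArrayList: finding the position is fast (binary search),
    // but shifting the elements after it to make room costs O(n) per insertion.
    static void insertSorted(List<String> sorted, String word) {
        int pos = Collections.binarySearch(sorted, word);
        if (pos < 0) pos = -(pos + 1);   // convert the "not found" result into an insertion point
        sorted.add(pos, word);           // shifting the tail dominates the cost
    }

    public static void main(String[] args) {
        String[] words = { "pear", "apple", "quince", "banana" };  // stand-in for a dictionary

        List<String> sortedList = new ArrayList<>();
        for (String w : words) insertSorted(sortedList, w);

        // TreeSet is a balanced BST (red-black tree): each insertion costs O(log n).
        TreeSet<String> bst = new TreeSet<>();
        for (String w : words) bst.add(w);

        System.out.println(sortedList);   // [apple, banana, pear, quince]
        System.out.println(bst);          // same contents; much faster to build for large inputs
    }
}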

2 Empirical studies We can gather empirical data by running both algorithms on different data sets and comparing their performance.
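One way to gather such data in Java is a rough stopwatch around each run; the sketch below is only illustrative (the Stopwatch class and the synthetic data are invented here, and a serious benchmark would repeat runs and discard JIT warm-up):

public class Stopwatch {
    // Runs the given task once and reports wall-clock time in milliseconds.
    static long timeMillis(Runnable task) {
        long start = System.nanoTime();
        task.run();
        return (System.nanoTime() - start) / 1_000_000;
    }

    public static void main(String[] args) {
        // Double the data-set size each round and watch how the time grows.
        for (int n = 1_000; n <= 64_000; n *= 2) {
            final int size = n;
            long ms = timeMillis(() -> {
                java.util.TreeSet<Integer> bst = new java.util.TreeSet<>();
                for (int i = 0; i < size; i++) bst.add(i * 31 % size);  // synthetic data set
            });
            System.out.println("n = " + size + "  took " + ms + " ms");
        }
    }
}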

3 Issue #1 Empirical data reflects performance only on the individual cases tested: the more data points, the better, but no general understanding of the algorithm's performance is gained.

4 Issue #2 You must code the algorithms to be compared. This can be a non-trivial task.

5 Issue #3 Empirical performance measures depend on many factors:
– implementation language
– compiler
– execution hardware

6 Desiderata (things desired as essential – on-line Webster) We want a measure of algorithm performance which:
– gives performance bounds as the problem size grows large,
– is implementation independent,
– describes the performance of an algorithm in the general case, not just specific cases, and
– allows the performance of different algorithms to be compared.

7 Asymptotic notation There are many flavors of asymptotic notation. We will study one: big-Oh notation. Big-Oh gives an upper bound; it is typically used to express an upper bound on the worst-case performance of an algorithm.

8 Definition Given two functions f and g mapping natural numbers into non-negative reals, we say that f(n) = O(g(n)) if there exist positive constants c and n₀ such that f(n) ≤ c·g(n) for all n ≥ n₀.

9 What does this mean? It means that f can’t grow faster than g. We’re interested only in what happens when the input size of the problem is large. g(n) is a bound on the running time of an algorithm whose actual (unknown) runtime is f(n). We guarantee that the time required by the algorithm grows no more quickly than g, as the problem size gets large.
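A small worked instance of the definition (the function f is chosen purely for illustration): let f(n) = 5n + 3 and g(n) = n. Taking c = 6 and n₀ = 3, we have 5n + 3 ≤ 5n + n = 6n for all n ≥ 3, so f(n) = O(n).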

10 Some Comparative Bounds
expression    name
O(1)          constant
O(log n)      logarithmic
O(log² n)     log squared
O(n)          linear
O(n log n)    n log n
O(n²)         quadratic
O(n³)         cubic
O(2ⁿ)         exponential
O(n!)         factorial

11 Basic examples
f(n)     g(n)     Is f(n) = O(g(n))?
n²       n³       YES
n³       n²       NO
n²       n²       YES
17n²     n²       YES

12 A little tougher If you know that f(n) = O(n³), what can you say about f(n) = O(n²)? Is it impossible, possible, or necessary?

13 A little tougher If you know that f(n) = O(n³), what can you say about f(n) = O(n²)? Is it impossible, possible, or necessary? It is possible: for example, f(n) = n is both O(n³) and O(n²), but f(n) = n³ is O(n³) and not O(n²).

14 How about… If you know that f(n) = O(n³), what can you say about f(n) = O(n⁴)? Is it impossible, possible, or necessary?

15 How about… If you know that f(n) = O(n³), what can you say about f(n) = O(n⁴)? Is it impossible, possible, or necessary? It is necessary: any function bounded above by a constant multiple of n³ is also bounded above by that multiple of n⁴, since n³ ≤ n⁴ for n ≥ 1.

16 One last example… If you know that f(n) = O(n³), what can you say about f(n) = n⁴? Is it impossible, possible, or necessary?

17 One last example… If you know that f(n) = O(n³), what can you say about f(n) = n⁴? Is it impossible, possible, or necessary? It is impossible: n⁴ grows faster than any constant multiple of n³, so a function equal to n⁴ cannot be O(n³).

18 Conventions First, it is common practice when writing big-Oh expressions to drop all but the most significant term. Thus, instead of O(n² + n log n + n) we simply write O(n²). Second, it is common practice to drop constant coefficients. Thus, instead of O(3n²), we simply write O(n²). As a special case of this rule, if the function is a constant, instead of, say, O(1024), we simply write O(1). Of course, for a particular big-Oh expression to be most useful, we prefer to find a tight asymptotic bound. For example, while it is not wrong to write f(n) = n = O(n³), we prefer to write f(n) = O(n), which is a tight bound.

19 Determining Bounds How do we determine the bounds for a particular method? Basic operations are O(1). For a loop, count the number of times the loop body is executed.

20 Example
for (int i = 0; i < array.length; i++) {
    System.out.println(array[i]);
}
The array access is a constant-time operation; it is executed array.length times. If we take the size of the problem (n) to be the size of the array, then we can say that the runtime of this loop is O(n).
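The same reasoning extends to nested loops. In the hypothetical sketch below (reusing the same array), the inner loop body is executed array.length times for each of the array.length iterations of the outer loop, so the runtime is O(n²):

for (int i = 0; i < array.length; i++) {
    for (int j = 0; j < array.length; j++) {
        System.out.println(array[i] + ", " + array[j]);   // O(1) work, done n * n times
    }
}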

21 Selection sort A simple algorithm for sorting a collection of N items:
– while the unsorted collection is not empty, find the smallest item in the unsorted collection and place it last in the sorted collection.
Finding the smallest takes time proportional to the size of the unsorted collection (initially N). The smallest must be found N times. The overall runtime is therefore O(N*N), or O(N²).
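A minimal in-place Java version of this algorithm (an illustrative sketch on an int array, not code from the slides) might look like this:

static void selectionSort(int[] a) {
    for (int i = 0; i < a.length - 1; i++) {          // boundary between sorted and unsorted parts
        int smallest = i;
        for (int j = i + 1; j < a.length; j++) {      // find the smallest item in the unsorted part
            if (a[j] < a[smallest]) smallest = j;
        }
        int tmp = a[i];                               // place it at the end of the sorted part
        a[i] = a[smallest];
        a[smallest] = tmp;
    }
}

The outer loop runs N-1 times and the inner loop does the "find smallest" work, matching the analysis above.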

22 But wait a minute! N*N simplifies things a bit too much, doesn’t it? After all, the unsorted collection is shrinking all the time:
– the first call to smallest searches through N items
– the second call to smallest searches through N-1 items
– …
– the last call to smallest searches through only 1 item
More precise analysis is therefore:
– runtime needed is 1 + 2 + … + N = N(N+1)/2 = O(N²)
So the simplifying assumption was justified.
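As a quick check of that last step: N(N+1)/2 = N²/2 + N/2, which is at most N² for all N ≥ 1, so taking c = 1 and n₀ = 1 in the definition gives 1 + 2 + … + N = O(N²).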

