1 Algorithm Analysis: What is the Big 'O Bout?
Anshuman Razdan, Div of Computing Studies
razdan@asu.edu
http://dcst2.east.asu.edu/~razdan/cst230/
Razdan with contribution from others

2 Algorithm
An algorithm is a procedure or sequence of instructions for solving a problem. Any algorithm may be expressed in many different ways: in English, in a particular programming language, or (most commonly) in a mixture of English and programming called pseudocode.

3 Example
for each item
    do something
    if something greater than nothing
        return true
    else
        return false

4 Analyzing Algorithms
Questions: Is this algorithm fast enough? How much memory will I need to implement this solution? Is there a better solution?
Analysis is useful BEFORE any implementation is done, to avoid wasting time on an inappropriate solution.
Algorithm analysis involves predicting the resources (time, space) required.
Analyze algorithms independently of machines, implementations, etc.

5 Run-time Analysis
The number of operations (amount of time) required for the algorithm to terminate.
The number of operations typically depends on the program's input size; e.g., sorting 1000 numbers takes longer (performs more operations) than sorting 3 numbers.
running time = function(input size)
Input size is typically expressed as "N" or "n".

6 Space Analysis
The amount of storage (memory) required for the program to execute correctly. Again, the amount of storage will often depend on the input size.
space complexity = function(input size)

7 Example – time complexity
Insertion-Sort(A)
    for j := 2 to length[A]
        key := A[j]
        /* insert A[j] into sorted A[1..j-1] */
        i := j - 1
        while i > 0 and A[i] > key
            A[i+1] := A[i]
            i := i - 1
        A[i+1] := key
Count the number of operations for the inputs:
A = [3 5 2 8 3]
A = [2 3 3 5 8]
n(4 + 3(n-1)/2) = (3n^2 + 5n)/2, where n is the size of A, so O(n^2).
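The pseudocode above translates directly to Java. A minimal sketch (the class and method names are mine), using 0-based array indexing in place of the slide's 1-based A[1..n]:

```java
// Insertion sort matching the slide's pseudocode, 0-based indexing.
public class InsertionSortDemo {
    public static void insertionSort(int[] a) {
        for (int j = 1; j < a.length; j++) {  // pseudocode: j := 2 to length[A]
            int key = a[j];
            int i = j - 1;
            // Shift elements of the sorted prefix a[0..j-1] that exceed key.
            while (i >= 0 && a[i] > key) {
                a[i + 1] = a[i];
                i--;
            }
            a[i + 1] = key;                   // insert key into its slot
        }
    }
}
```

Running it on the slide's first input A = [3 5 2 8 3] produces the slide's second array, [2 3 3 5 8], which is also the best-case (already sorted) input for a second run.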

8 Algorithm Complexity
Worst case: the running time is an upper bound on the running time for any input of size n.
Best case: often not very useful.
Average case: average over all possible inputs; often the same complexity as the worst case.

9 Stair counting problem
Technique 1 – Keep a tally: 3n ops.
Technique 2 – Let someone else keep count: n + 2(1 + 2 + … + n) ops.
Technique 3 – Ask: 4 ops when the building has 2689 steps.

10 Num Ops (Stair Counting)
T1: 3n
T2: n^2 + 2n
T3: log10(n) + 1, where the log is rounded down (i.e., the number of decimal digits in n)

11 Stair Counting
For 2689 stairs:
T1: 8067 operations
T2: 7,236,099 operations
T3: 4 operations
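The three counts follow from the formulas on the previous slide. A small sketch (class and method names are mine) that evaluates each formula, so the numbers above can be reproduced for n = 2689:

```java
// Operation counts for the three stair-counting techniques,
// using the formulas from slide 10.
public class StairOps {
    public static long t1(long n) {            // keep a tally
        return 3 * n;
    }
    public static long t2(long n) {            // n + 2(1 + 2 + ... + n)
        return n * n + 2 * n;
    }
    public static long t3(long n) {            // digits in n: floor(log10 n) + 1
        return (long) Math.floor(Math.log10(n)) + 1;
    }
}
```

For n = 2689 these return 8067, 7,236,099, and 4, matching the slide.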

12 Number of Ops in Three Techniques

Stairs   O(log10 n)   O(n)     O(n^2)
10       2            30       120
100      3            300      10,200
1000     4            3000     1,002,000
10000    5            30000    100,020,000

13 Factoid
The order of the algorithm is generally more important than the speed of the processor.

14 Search Algorithm
for (int i = 0; i < data.length; i++)
    if (data[i] == target)
        return true;
return false;
What is the best case? The worst case? The average case?
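One way to see the answer is to instrument the loop to count comparisons. A sketch (the class and method names are mine, not from the slides):

```java
// Linear search instrumented to count element comparisons,
// so the best and worst cases can be observed directly.
public class SearchCost {
    public static int comparisons(int[] data, int target) {
        int count = 0;
        for (int i = 0; i < data.length; i++) {
            count++;                         // one comparison of data[i] with target
            if (data[i] == target) {
                return count;                // best case: target first -> 1, O(1)
            }
        }
        return count;                        // worst case: target absent -> n, O(n)
    }
}
```

Best case: 1 comparison (target is the first element). Worst case: n comparisons (target absent or last). Average case, if the target is present and equally likely in any position: about n/2 comparisons, still O(n).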

15 Worst Case
Usually we analyze only the worst-case performance:
The worst case occurs fairly often, for example when looking for an entry in a database that is not present.
The result of the worst-case analysis is often no different from the average-case analysis (same order of complexity).

16 Asymptotic Notation
If we want to treat large problems (these are the critical ones), we are interested in the asymptotic behavior of the growth of the running-time function. Thus, when comparing the running times of two algorithms:
Constant factors can be ignored.
Lower-order terms are unimportant when the higher-order terms are different.
For instance, when we analyze selection sort, we find that it takes T(n) = n^2 + 3n - 4 array accesses. For large values of n, the 3n - 4 part is insignificant compared to the n^2 part.
An algorithm that takes a time of 100n^2 will still be faster than an algorithm that is n^3 for any value of n larger than 100.
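The 100n^2 vs. n^3 claim is easy to confirm numerically: 100n^2 < n^3 simplifies to n > 100. A one-method sketch (the class name is mine):

```java
// Check the slide's crossover claim: 100*n^2 < n^3 exactly when n > 100.
public class Crossover {
    public static boolean quadraticWins(long n) {
        return 100 * n * n < n * n * n;
    }
}
```

At n = 100 the two sides tie (1,000,000 each); for every larger n the quadratic algorithm wins, no matter how big its constant factor looks at small n.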

17 Big-O Definition
O(g(n)) = { f(n) : there exist positive constants c and n0 such that 0 ≤ f(n) ≤ c·g(n) for all n ≥ n0 }
|f(n)| ≤ c·|g(n)| (use this to prove big-O).
[Figure: f(n) stays at or below c·g(n) for all n ≥ n0]
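The definition asks for concrete witnesses c and n0. For the insertion-sort count from slide 7, f(n) = (3n^2 + 5n)/2 and g(n) = n^2, the choice c = 2, n0 = 5 works (these witnesses are my choice; others work too). A sketch that checks the inequality, doubled on both sides to stay in integer arithmetic:

```java
// Verify the Big-O witnesses c = 2, n0 = 5 for
// f(n) = (3n^2 + 5n)/2 and g(n) = n^2, i.e. f(n) <= 2*n^2 for n >= 5.
public class BigOWitness {
    public static boolean holds(long n) {
        long twoF  = 3 * n * n + 5 * n;   // 2*f(n), avoids fractions
        long twoCG = 4 * n * n;           // 2*c*g(n) with c = 2
        return twoF <= twoCG;
    }
}
```

The inequality fails at n = 4 (68 > 64) and holds from n = 5 on, so n0 = 5 is the smallest threshold for this c.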

18 Big O Naming Convention
If a function is O(log(n)) we call it logarithmic.
If a function is O(n) we call it linear.
If a function is O(n^2) we call it quadratic.
If a function is O(n^k) with k ≥ 1, we call it polynomial.
If a function is O(a^n) with a > 1, we call it exponential.

19 Asymptotic Notation: Θ, O, Ω, o, ω
Θ – tight bound; when a problem is solved in the theoretical minimum time.
O – upper bound.
Ω – lower bound.
o – non-tight upper bound.
ω – non-tight lower bound.

20 Big-Ω Definition
Ω(g(n)) = { f(n) : there exist positive constants c and n0 such that 0 ≤ c·g(n) ≤ f(n) for all n ≥ n0 }
[Figure: f(n) stays at or above c·g(n) for all n ≥ n0]

21 Θ Definition
Θ(g(n)) = { f(n) : there exist positive constants c1, c2, and n0 such that 0 ≤ c1·g(n) ≤ f(n) ≤ c2·g(n) for all n ≥ n0 }
[Figure: f(n) is sandwiched between c1·g(n) and c2·g(n) for all n ≥ n0]

22 Theorem
For any two functions f(n) and g(n), we have f(n) = Θ(g(n)) iff f(n) = O(g(n)) and f(n) = Ω(g(n)).
[Figures: the O, Ω, and Θ pictures from the previous three slides]

23
If lim n→∞ f(n)/g(n) = 0, then f(n) = O(g(n)).
If lim n→∞ f(n)/g(n) = ∞, then f(n) = Ω(g(n)).
If lim n→∞ f(n)/g(n) = c for some constant c > 0, then f(n) = Θ(g(n)).

24 o and ω Definitions
f(n) = o(g(n)) if g(n) grows a lot faster than f(n).
f(n) = ω(g(n)) if f(n) grows a lot faster than g(n).

25 Prove (1/2)n^2 - 3n = Θ(n^2)
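One way to carry out the proof, using the Θ definition from slide 21. The witnesses c1 = 1/4, c2 = 1/2, n0 = 12 are one valid choice among many:

```latex
% Claim: \tfrac{1}{2}n^2 - 3n = \Theta(n^2).
% Need c_1, c_2, n_0 > 0 with
%   c_1 n^2 \le \tfrac{1}{2}n^2 - 3n \le c_2 n^2 \quad \text{for all } n \ge n_0.
%
% Upper bound (take c_2 = \tfrac{1}{2}):
\tfrac{1}{2}n^2 - 3n \le \tfrac{1}{2}n^2 \quad \text{for all } n \ge 0.
% Lower bound (take c_1 = \tfrac{1}{4}):
\tfrac{1}{4}n^2 \le \tfrac{1}{2}n^2 - 3n
  \iff 3n \le \tfrac{1}{4}n^2
  \iff n \ge 12.
% So both hold for all n \ge n_0 = 12, proving the claim.
```

Any c1 strictly below 1/2 works, with n0 = 3/(1/2 - c1) adjusted accordingly.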

26 Insertion Sort
Link to Insertion Sort
