
1 Asymptotic Growth Rates
Themes
– Analyzing the cost of programs
– Ignoring constants and Big-Oh
– Recurrence Relations & Sums
– Divide and Conquer Examples
  – Sort
  – Computing powers
  – Euclidean algorithm (computing gcds)
  – Integer Multiplication

2 Asymptotic Growth Rates
f(n) = O(g(n)) [grows at the same rate or slower]
There exist positive constants c and n0 such that f(n) ≤ c·g(n) for all n ≥ n0
Ignore constants and low-order terms

3 Asymptotic Growth Rates (E.G.)
E.G. 1: 5n^2 = O(n^3); c = 1, n0 = 5: 5n^2 ≤ n·n^2 = n^3 for n ≥ 5
E.G. 2: 100n^2 = O(n^2); c = 100, n0 = 1
E.G. 3: n^3 = O(2^n); c = 1, n0 = 12: n^3 ≤ (2^(n/3))^3 = 2^n, since n ≤ 2^(n/3) for n ≥ 12 [use induction]

4 Asymptotic Growth Rates
f(n) = o(g(n)) [grows slower]: f(n) = O(g(n)) and g(n) ≠ O(f(n)); equivalently lim(n→∞) f(n)/g(n) = 0
f(n) = Θ(g(n)) [grows at the same rate]: f(n) = O(g(n)) and g(n) = O(f(n))

5 Asymptotic Growth Rates
[j < k] lim(n→∞) n^j/n^k = lim(n→∞) 1/n^(k-j) = 0 ⇒ n^j = o(n^k)
[c < d] lim(n→∞) c^n/d^n = lim(n→∞) (c/d)^n = 0 ⇒ c^n = o(d^n)
lim(n→∞) ln(n)/n = ∞/∞ ⇒ lim(n→∞) ln(n)/n = lim(n→∞) (1/n)/1 = 0 [L'Hopital's Rule] ⇒ ln(n) = o(n)
[ε > 0] ln(n) = o(n^ε) [similar calculation]

6 Asymptotic Growth Rates
[c > 1, k an integer] lim(n→∞) n^k/c^n = ∞/∞
⇒ lim(n→∞) k·n^(k-1) / (c^n·ln(c))
⇒ lim(n→∞) k(k-1)·n^(k-2) / (c^n·ln(c)^2)
⇒ …
⇒ lim(n→∞) k(k-1)…1 / (c^n·ln(c)^k) = 0
⇒ n^k = o(c^n)

7 Asymptotic Growth Rates (E.G.)
lim(n→∞) f(n)/g(n) = 0 ⇒ f(n) = O(g(n))
∀ε > 0, ∃n0 s.t. ∀n ≥ n0, f(n)/g(n) < ε

8 Asymptotic Growth Rates
Θ(log(n)) – logarithmic [log(2n)/log(n) = 1 + log(2)/log(n)]
Θ(n) – linear [double input ⇒ double output]
Θ(n^2) – quadratic [double input ⇒ quadruple output]
Θ(n^3) – cubic [double input ⇒ output increases by factor of 8]
Θ(n^k) – polynomial of degree k
Θ(c^n) – exponential [double input ⇒ square output]

9 Asymptotic Manipulation
Θ(c·f(n)) = Θ(f(n))
Θ(f(n) + g(n)) = Θ(f(n)) if g(n) = o(f(n))
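For example, combining the two rules gives Θ(5n^2 + 3n + 7) = Θ(5n^2) = Θ(n^2), since 3n + 7 = o(n^2); this is the formal justification for ignoring constants and low-order terms (slide 2).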

10 Computing Time Functions
The computing time function is the time to execute a program as a function of its inputs.
Typically the inputs are parameterized by their size [e.g. number of elements in an array, length of a list, size of a string, …]
– Worst case = max runtime over all possible inputs of a given size
– Best case = min runtime over all possible inputs of a given size
– Average case = avg. runtime over a specified distribution of inputs

11 Analysis of Running Time
We can only know the cost up to constants through analysis of code [the number of instructions depends on the compiler, flags, architecture, etc.]
Assume basic statements are O(1)
Sum over loops
The cost of a function call depends on its arguments
Recursive functions lead to recurrence relations

12 Loops and Sums
for (i = 0; i < n; i++)
    for (j = i; j < n; j++)
        S;                        // assume cost of S is O(1)
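Summing over the loops as on slide 11: the statement S executes once for every pair (i, j) with 0 ≤ i ≤ j < n, i.e. n + (n-1) + … + 1 = n(n+1)/2 = Θ(n^2) times. A minimal C check of that count (the driver and the choice n = 10 are illustrative additions, not from the slides):

#include <stdio.h>

int main(void)
{
    int n = 10;
    long count = 0;                       /* number of times S would run */
    for (int i = 0; i < n; i++)
        for (int j = i; j < n; j++)
            count++;                      /* stands in for the O(1) statement S */
    printf("count = %ld, n(n+1)/2 = %ld\n", count, (long)n * (n + 1) / 2);
    return 0;
}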

13 Merge Sort and Insertion Sort
Insertion Sort
– T_I(n) = T_I(n-1) + O(n) = Θ(n^2) [worst case]
– T_I(n) = T_I(n-1) + O(1) = Θ(n) [best case]
Merge Sort
– T_M(n) = 2T_M(n/2) + O(n) = Θ(n·log(n)) [worst case]
– T_M(n) = 2T_M(n/2) + O(n) = Θ(n·log(n)) [best case]
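A minimal insertion sort sketch in C (an illustrative addition, not code from the slides) showing where the recurrences above come from: sorting n elements is sorting the first n-1 plus inserting a[n-1] into the sorted prefix, which costs O(n) shifts on reverse-sorted input (worst case) and O(1) on already-sorted input (best case).

#include <stdio.h>

/* Insertion sort: insert a[i] into the already sorted prefix a[0..i-1]. */
void insertion_sort(int a[], int n)
{
    for (int i = 1; i < n; i++) {
        int key = a[i];
        int j = i - 1;
        while (j >= 0 && a[j] > key) {    /* shift larger elements right */
            a[j + 1] = a[j];
            j--;
        }
        a[j + 1] = key;
    }
}

int main(void)
{
    int a[] = {5, 2, 4, 6, 1, 3};
    insertion_sort(a, 6);
    for (int i = 0; i < 6; i++)
        printf("%d ", a[i]);
    printf("\n");
    return 0;
}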

14 Karatsuba’s Algorithm
Using the classical pen-and-paper algorithm, two n-digit integers can be multiplied in O(n^2) operations. Karatsuba came up with a faster algorithm.
Let A and B be two integers with
– A = A1·10^k + A0, A0 < 10^k
– B = B1·10^k + B0, B0 < 10^k
– C = A·B = (A1·10^k + A0)(B1·10^k + B0) = A1·B1·10^(2k) + (A1·B0 + A0·B1)·10^k + A0·B0
Instead, C can be computed with 3 multiplications:
T0 = A0·B0
T1 = (A1 + A0)(B1 + B0)
T2 = A1·B1
C = T2·10^(2k) + (T1 - T0 - T2)·10^k + T0
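A minimal sketch of the three-multiplication identity in C, using base 2^16 in place of 10^k and fixed 32-bit inputs so every product fits in a machine word (the function name, the base, and the test values are illustrative assumptions; a real implementation recurses on multi-digit numbers):

#include <stdint.h>
#include <stdio.h>

/* One level of Karatsuba: A = A1*2^16 + A0, B = B1*2^16 + B0,
   computed with the three products T0, T1, T2 instead of four. */
static uint64_t karatsuba32(uint32_t a, uint32_t b)
{
    uint32_t a1 = a >> 16, a0 = a & 0xFFFF;
    uint32_t b1 = b >> 16, b0 = b & 0xFFFF;

    uint64_t t0 = (uint64_t)a0 * b0;                  /* T0 = A0*B0 */
    uint64_t t2 = (uint64_t)a1 * b1;                  /* T2 = A1*B1 */
    uint64_t t1 = (uint64_t)(a1 + a0) * (b1 + b0);    /* T1 = (A1+A0)(B1+B0) */

    /* C = T2*2^32 + (T1 - T0 - T2)*2^16 + T0 */
    return (t2 << 32) + ((t1 - t0 - t2) << 16) + t0;
}

int main(void)
{
    uint32_t a = 123456789u, b = 987654321u;
    printf("karatsuba: %llu\n", (unsigned long long)karatsuba32(a, b));
    printf("direct:    %llu\n", (unsigned long long)a * b);
    return 0;
}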

15 Complexity of Karatsuba’s Algorithm
Let T(n) be the time to compute the product of two n-digit numbers using Karatsuba’s algorithm. Assume n = 2^k.
T(n) = Θ(n^lg(3)), lg(3) ≈ 1.58
T(n) ≤ 3T(n/2) + cn
≤ 3(3T(n/4) + c(n/2)) + cn = 3^2·T(n/2^2) + cn(3/2 + 1)
≤ 3^2(3T(n/2^3) + c(n/4)) + cn(3/2 + 1) = 3^3·T(n/2^3) + cn(3^2/2^2 + 3/2 + 1)
…
≤ 3^i·T(n/2^i) + cn(3^(i-1)/2^(i-1) + … + 3/2 + 1)
…
≤ cn[((3/2)^k - 1)/(3/2 - 1)] [assuming T(1) ≤ c]
≤ 2c(3^k - 2^k)
≤ 2c·3^lg(n) = 2c·n^lg(3)

16 Divide & Conquer Recurrence
Assume T(n) = aT(n/b) + Θ(n)
T(n) = Θ(n) [a < b]
T(n) = Θ(n·log(n)) [a = b]
T(n) = Θ(n^(log_b(a))) [a > b]
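Applied to the earlier examples: merge sort fits the a = b case with a = b = 2, giving Θ(n·log(n)) as on slide 13, and Karatsuba fits the a > b case with a = 3, b = 2, giving Θ(n^(log_2(3))) = Θ(n^lg(3)) as on slide 15.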

