
1 LECTURE 23: LOVE THE BIG-OH CSC 212 – Data Structures

2 Algorithm Analysis
• Execution time with n inputs on a 4 GHz machine:

                n = 10       n = 50            n = 100          n = 1000          n = 10^6
  O(n log n)    9 ns         50 ns             175 ns           2500 ns           5 ms
  O(n^2)        25 ns        625 ns            2250 ns          250000 ns         4 min
  O(n^5)        25000 ns     72.5 ms           2.7 s            2.9 days          1 x 10^13 yrs
  O(2^n)        2500 ns      3.25 days         1 x 10^14 yrs    1 x 10^285 yrs    Too long!
  O(n!)         1 ms         1.4 x 10^58 yrs   7 x 10^141 yrs   Too long!         Too long!

3 Big-Oh Notation
• Only large data sets are considered in analysis
  - If a program only takes 2 minutes, who cares?
• Multipliers are ignored by this analysis
  - O(⅕n) = O(2n) = O(50000n) = O(n)
  - Need lots of 5 ms to reach 4 minutes
• Only the equation's most significant term is kept
  - O(⅛n^5 + 100000n^2) = O(n^5 + n^2) = O(n^5)
  - What is an extra 17 min. after 3 x 10^13 years?
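Both rules follow from the formal definition of Big-Oh, which the slide leaves implicit. A quick sketch in LaTeX, assuming the standard definition that f(n) is O(g(n)) when f(n) ≤ c·g(n) for some constant c > 0 and all sufficiently large n:

  50000n \le 50000 \cdot n
      \;\Rightarrow\; 50000n \in O(n) \quad (c = 50000)
  \tfrac{1}{8}n^5 + 100000n^2 \le \left(\tfrac{1}{8} + 100000\right)n^5 \text{ for } n \ge 1
      \;\Rightarrow\; \tfrac{1}{8}n^5 + 100000n^2 \in O(n^5)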

4 Cage Match It Ain't
• Measures how many simple operations are executed
  - Assignments, method calls, arithmetic, comparisons, getting an array entry, following a reference, etc.
• Provides a simple, rough approximation of time
  - Excellent for narrowing the approach to be used
  - Actual algorithm implementations not important
  - Precision not an issue: 17 min vs. age of the universe
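To make "counting simple operations" concrete, here is a small hypothetical Java method (not from the slides) annotated with its operation count:

  // Hypothetical example: counting simple operations line by line.
  public static int average(int a, int b) {
      int sum = a + b;     // 1 addition + 1 assignment  -> 2 operations
      int half = sum / 2;  // 1 division + 1 assignment  -> 2 operations
      return half;         // 1 return                   -> 1 operation
  }
  // 5 simple operations no matter what a and b are, so average is O(1).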

5 Rules of Thumb
• Sequences of simple statements take O(1) time
• Loops from 1 to n will take:
  - O(n) time when a constant amount is added each iteration
  - O(log n) time when multiplying by a constant amount each iteration
• When run sequentially, add the big-Oh times
  - Remember to drop multipliers & insignificant details
• Complexities multiply when nested
  - Slow having loops nested inside loops inside loops…
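A small Java sketch of these rules of thumb (my example, not the lecture's code):

  // Sketch of the loop shapes described by the rules of thumb above.
  public static void loopShapes(int n) {
      int count = 0;
      for (int i = 1; i <= n; i++) {       // adds a constant each pass: O(n)
          count++;
      }
      for (int i = 1; i <= n; i *= 2) {    // multiplies by a constant: O(log n)
          count++;
      }
      for (int i = 1; i <= n; i++) {       // nested loops multiply:
          for (int j = 1; j <= n; j++) {   // O(n) * O(n) = O(n^2)
              count++;
          }
      }
  }   // run in sequence, the total is O(n) + O(log n) + O(n^2) = O(n^2)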

6 It's About Time

  Algorithm sneaky(int n)
    total = 0
    for i = 0 to n do
      for j = 0 to n do
        total += i * j
        return total
      end for
    end for

• sneaky would take O(1) time to execute
  - Looks like O(n) iterations are set up for each loop
  - But on the first pass, the method ends at the return
  - Always executes the same number of operations
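A runnable Java rendering of the sneaky pseudocode (the translation is mine; it keeps the early return that makes the method constant-time):

  // Java translation of the sneaky pseudocode above.
  public static int sneaky(int n) {
      int total = 0;
      for (int i = 0; i <= n; i++) {
          for (int j = 0; j <= n; j++) {
              total += i * j;
              return total;   // hit on the very first pass, so O(1) overall
          }
      }
      return total;           // reached only when the loops never run (n < 0)
  }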

7 Big-Oh == Murphy's Law

  Algorithm power(int a, int b ≥ 0)
    if a == 0 && b == 0 then
      return -1
    endif
    exp = 1
    repeat b times
      exp *= a
    end repeat
    return exp

• power takes O(n) time in most cases
  - Would only take O(1) time if a & b are both 0
  - Still an O(n) algorithm, however, since big-Oh uses the worst case
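A Java rendering of the power pseudocode (translation is mine; it treats n as the exponent b, as the slide's O(n) claim does):

  // Java translation of the power pseudocode above; assumes b >= 0.
  public static int power(int a, int b) {
      if (a == 0 && b == 0) {
          return -1;                    // the slide's sentinel for the undefined case 0^0
      }
      int exp = 1;
      for (int i = 0; i < b; i++) {     // repeat b times
          exp *= a;
      }
      return exp;                       // b multiplications in the worst case: O(n), n = b
  }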

8 How Big Is My Input?

  Algorithm sum(int[][] a)
    total = 0
    for i = 0 to a.length do
      for j = 0 to a[i].length do
        total += a[i][j]
      end for
    end for
    return total

• Despite the nested loops, this runs in O(n) time
  - Method's input is a doubly-subscripted array
  - For this method, n is the number of entries in the entire array
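A Java rendering of the sum pseudocode (translation is mine), showing that each array entry is touched exactly once:

  // Java translation of the sum pseudocode above.
  public static int sum(int[][] a) {
      int total = 0;
      for (int i = 0; i < a.length; i++) {          // each row
          for (int j = 0; j < a[i].length; j++) {   // each entry in row i
              total += a[i][j];                     // every entry visited exactly once
          }
      }
      return total;   // O(n), where n is the total number of entries in a
  }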

9 Handling Method Calls
• A method call is an O(1) operation, …
  - … but then also need to add the time spent running the method
• Big-Oh counts the operations executed in total
  - Remember: there is no such thing as a free lunch
  - Borrowing $5 to pay does not make your lunch free
  - Similarly, need to include all operations executed
  - In which method they run DOES NOT MATTER

10 Methods Calling Methods

  public static int sumOdds(int n) {
      int sum = 0;
      for (int i = 1; i <= n; i += 2) {
          sum += i;
      }
      return sum;
  }

  public static void oddSeries(int n) {
      for (int i = 1; i < n; i++) {
          System.out.println(i + " " + sumOdds(n));
      }
  }

• oddSeries calls sumOdds n times
  - Each call does O(n) work, so oddSeries takes O(n^2) total time!

11 Justifying an Answer
• Important to explain your answer
  - Saying "O(n)" is not enough to make it O(n)
  - Methods using recursion are especially hard to analyze
• Derive the difficult answer using a simple process
  - May find that you can simplify the big-Oh computation
  - May find a smaller or larger big-Oh than you imagined
• Can be a proof, but need not be that formal
  - Explaining your answer is critical for this
  - Helps you be able to convince others

12 Big-Oh Notation

  Algorithm factorial(int n)
    if n <= 1 then
      return 1
    else
      fact = factorial(n - 1)
      return n * fact
    endif

• Ignoring the cost of the recursive calls, each call runs in O(1) time
• At most n - 1 calls, since n decreases by 1 each time
• Method's total complexity is O(n)
  - Runs O(n - 1) * O(1) = O(n - 1) = O(n) operations
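A Java rendering of the factorial pseudocode (translation is mine; long is used to delay overflow, which the slide does not discuss):

  // Java translation of the factorial pseudocode above.
  public static long factorial(int n) {
      if (n <= 1) {
          return 1;                       // base case: O(1) work
      } else {
          long fact = factorial(n - 1);   // one recursive call; n decreases by 1
          return n * fact;                // O(1) work per call
      }
  }
  // Roughly n calls doing O(1) work each, so factorial is O(n).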

13 Big-Oh Notation

  Algorithm fib(int n)
    if n <= 1 then
      return n
    else
      return fib(n-1) + fib(n-2)
    endif

• O(1) time for each of the O(2^n) calls = O(2^n) complexity
  - Calls fib(1) and fib(0) when n = 2
  - n = 3: total of 4 calls (3 for fib(2) + 1 for fib(1))
  - n = 4: total of 8 calls (5 for fib(3) + 3 for fib(2))
  - Number of calls roughly doubles each time n is incremented = O(2^n)
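A Java rendering of the fib pseudocode (translation is mine), showing where the exponential blow-up comes from:

  // Java translation of the fib pseudocode above.
  public static long fib(int n) {
      if (n <= 1) {
          return n;                       // base cases: fib(0) = 0, fib(1) = 1
      } else {
          return fib(n - 1) + fib(n - 2); // two recursive calls per invocation
      }
  }
  // Each call does O(1) work but spawns two more calls, so the number of
  // calls roughly doubles as n grows by 1: O(2^n) time in total.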

14 Before Next Lecture…
• Finish the week #8 assignment
  - Due by 5 PM next Tuesday
• Start programming assignment #3
  - Messages are not always sent to everyone!
• Read section 5.1 in the book before class
  - Discusses our first ADT: the Stack
  - What is it? How is it used? Why do I now want Pez?
  - Strongly urge students to fill out the ADT Design for Stack

