Presentation transcript: "Problem of the Day"

1 Problem of the Day  On the next slide I wrote today’s problem of the day. It has 3 possible answers. Can you guess which 1 of the following is the solution?
A. Answer 1
B. Answers 1 or 2
C. Answer 2
D. Answers 2 or 3

2 Problem of the Day  On the next slide I wrote today’s problem of the day. It has 3 possible answers. Can you guess which 1 of the following is the solution?
A. Answer 1
B. Answers 1 or 2
C. Answer 2
D. Answers 2 or 3
 If answers 1 or 2 were correct, we would not be able to select exactly one solution. So, answer 3 (and selection D) must be right.

3 CSC 212 – Data Structures

4 Analysis Techniques  Running time is critical, …  …but comparing measured times is impossible in many cases  A single problem may have many possible solutions  Many implementations are possible for each solution

5 Pseudo-Code  Written for human eyes only  Unimportant & implementation details are ignored  Serves a very real purpose, even if it is not a real language  Useful for tasks like outlining, designing, & analyzing  Presents an algorithm in a language-like, though not formal, manner

6 Pseudo-Code  Only needs to include the details needed for tracing  Loops, assignments, calls to methods, etc.  Anything that would be helpful in analyzing the algorithm  Feel free to ignore punctuation & other formalisms  Understanding & analyzing the algorithm is the only goal

7 Pseudo-code Example
Algorithm factorial(int n)
  returnVariable = 1
  while (n > 0)
    returnVariable = returnVariable * n
    n = n – 1
  endwhile
  return returnVariable
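
The pseudo-code above translates directly into runnable Java (an illustrative sketch added here, not part of the original slides):

```java
public class Factorial {
    // Iterative factorial, mirroring the pseudo-code loop above
    public static long factorial(int n) {
        long returnVariable = 1;
        while (n > 0) {
            returnVariable = returnVariable * n;
            n = n - 1;
        }
        return returnVariable;
    }

    public static void main(String[] args) {
        System.out.println(factorial(5)); // prints 120
    }
}
```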

8 “Anything that can go wrong…”  Big-Oh expresses an algorithm’s worst-case complexity  Worst-case analysis of algorithm performance  Usually closely correlated with execution time  Not always right to consider only the worst case  May be situations where the worst case is very rare  Closely related approaches for other cases come later

9 “Should I Even Bother?”  Compare algorithms using big-Oh notation  Could also use it to compare implementations  Saves the time of implementing all the algorithms  Biases like CPU, typing speed, & cosmic rays are ignored

10 Algorithmic Analysis

11 Algorithm Analysis  Execution time with n inputs on a 4GHz machine:
            | n = 10   | n = 50        | n = 100      | n = 1000     | n = 10^6
O(n log n)  | 9 ns     | 50 ns         | 175 ns       | 2500 ns      | 5 ms
O(n^2)      | 25 ns    | 625 ns        | 2250 ns      | 250000 ns    | 4 min
O(n^5)      | 25000 ns | 72.5 ms       | 2.7 s        | 2.9 days     | 1x10^13 yrs
O(2^n)      | 2500 ns  | 3.25 days     | 1x10^14 yrs  | 1x10^285 yrs | Too long!
O(n!)       | 1 ms     | 1.4x10^58 yrs | 7x10^141 yrs | Too long!    | Too long!

12 Big-Oh Notation  Want results for large data sets  Nobody cares about a 2 minute-long program  Limit considerations to only the major details  Ignore multipliers  So, O(⅛n) = O(5n) = O(50000n) = O(n)  Multipliers are usually implementation-specific  How many 5 ms runs can we fit into 4 minutes?  Ignore lesser terms  So, O(⅚n^5 + 23402n^2) = O(n^5 + n^2) = O(n^5)  Would you tolerate an extra 17 minutes after waiting 3x10^13 years?

13 What is n?  Big-Oh analysis is always relative to input size  But determining input size is not always clear  Quick rule of thumb: consider what the algorithm is processing  Analyze values below x: n = x  Analyze data in an array: n = size of the array  Analyze a linked list: n = size of the linked list  Analyze 2 arrays: n = sum of the array sizes

14 Analyzing an Algorithm  Big-Oh counts the primitive operations executed:  Assignments  Calling a method  Performing an arithmetic operation  Comparing two values  Getting an entry from an array  Following a reference  Returning a value from a method  Accessing a field

15 Primitive Statements  Basis of programming; take constant time: O(1)  Fastest possible big-Oh notation  Also the time to run a sequence of primitive statements  But only if the input does not affect the sequence  Ignore the constant multiplier: O(5) = O(5 * 1) = O(1)

16 Simple Loops  for (int i = 0; i < n; i++) { } -or- while (i < n) { i++; }  Each loop executes n times  Primitive statements only within body of loop  Big-Oh complexity of a single loop iteration: O(1)  Either loop runs O(n) iterations  So each loop has O(n) * O(1) = O(n) complexity total
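
One way to see the O(n) bound is to count how many times the O(1) loop body actually runs. This small Java sketch (added here, not from the slides) does exactly that:

```java
public class SimpleLoop {
    // Returns the number of times the O(1) loop body executes: exactly n
    public static int iterations(int n) {
        int count = 0;
        for (int i = 0; i < n; i++) {
            count++; // the O(1) body
        }
        return count;
    }

    public static void main(String[] args) {
        System.out.println(iterations(10)); // prints 10: O(n) iterations of O(1) work
    }
}
```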

17 Loops In a Row  for (int i = 0; i < n; i++) { } int i = 0; while (i < n) { i++; }  Add the complexities of statements in sequence to compute the total  For this example, the total big-Oh complexity is: O(n) + O(1) + O(n) = O(2 * n + 1) = O(n + 1) = O(n)

20 More Complicated Loops  for (int i = 0; i < n; i += 2) { }  i → 0, 2, 4, 6, ..., n  In the above example, the loop executes n/2 iterations  Each iteration takes O(1) time, so total complexity: O(n/2) * O(1) = O(n * ½ * 1) = O(n)
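
Counting the iterations of the step-by-2 loop in Java (a sketch added for illustration) shows why n/2 iterations still means O(n):

```java
public class StepLoop {
    // Counts iterations of a loop that advances by 2: about n/2, still O(n)
    public static int iterations(int n) {
        int count = 0;
        for (int i = 0; i < n; i += 2) {
            count++;
        }
        return count;
    }

    public static void main(String[] args) {
        System.out.println(iterations(100)); // prints 50, half of n
    }
}
```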

21 Really Complicated Loops  for (int i = 1; i < n; i *= 2) { }  i → 1, 2, 4, 8, ..., n  In the above code, the loop executes log2 n iterations  Each iteration takes O(1) time, so total complexity: O(log2 n) * O(1) = O(log2 n * 1) = O(log2 n)
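
The logarithmic iteration count can also be checked directly; this Java sketch (added, not from the slides) counts how often the doubling loop runs:

```java
public class DoublingLoop {
    // Counts iterations of a loop that doubles i each time: about log2(n)
    public static int iterations(int n) {
        int count = 0;
        for (int i = 1; i < n; i *= 2) {
            count++;
        }
        return count;
    }

    public static void main(String[] args) {
        System.out.println(iterations(1024)); // prints 10, i.e. log2(1024)
    }
}
```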

22 Really Complicated Loops  for (int i = 1; i < n; i *= 3) { }  i → 1, 3, 9, 27, ..., n  In the above code, the loop executes log3 n iterations  Each iteration takes O(1) time, so total complexity: O(log3 n) * O(1) = O(log3 n * 1) = O(log3 n)

23 Math Moment  All logarithms are related, no matter the base  Changing the base only multiplies the answer by a constant: logb n = log2 n / log2 b  But constant multiples are ignored in big-Oh notation  So we can consider all O(log n) solutions identical

24 Nested Loops  for (int i = 0; i < n; i++) { for (int j = 0; j < n; j++) { } }  Program executes the outer loop n times  Inner loop runs n times during each iteration of the outer loop  O(n) iterations doing O(n) work each iteration  So the loops have O(n) * O(n) = O(n^2) complexity total  Loop complexities multiply when nested
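
Counting how often the innermost body executes (a Java sketch added here) makes the multiplication concrete:

```java
public class NestedLoops {
    // Counts how often the innermost body runs: n outer iterations,
    // each doing n inner iterations, for n * n total
    public static int iterations(int n) {
        int count = 0;
        for (int i = 0; i < n; i++) {
            for (int j = 0; j < n; j++) {
                count++;
            }
        }
        return count;
    }

    public static void main(String[] args) {
        System.out.println(iterations(7)); // prints 49 = 7 * 7
    }
}
```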

25  Only care about approximate behavior on huge data sets  Ignore constant multiples; it does not matter what the constant is  Drop lesser terms (& n! > 2^n > n^5 > n^2 > n > log n > 1)  O(1) time for primitive statements to execute  Change by a constant amount in a loop: O(n) time  Multiply by a constant in a loop: O(log n) time  When code is sequential, add the complexities  When code is nested, multiply the complexities

26 It’s About Time
Algorithm sneaky(int n)
  total = 0
  for i = 0 to n do
    for j = 0 to n do
      total += i * j
      return total
    end for
  end for
 sneaky would take _____ time to execute  O(n) iterations for each loop in the method

27 It’s About Time
Algorithm sneaky(int n)
  total = 0
  for i = 0 to n do
    for j = 0 to n do
      total += i * j
      return total
    end for
  end for
 sneaky would take O(1) time to execute  O(n) iterations for each loop in the method  But on the first pass, the method ends at the return  Always executes the same number of operations
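
A direct Java translation (a sketch; reading "for i = 0 to n" as an inclusive range is an assumption) confirms the early return:

```java
public class Sneaky {
    // Java translation of the sneaky pseudo-code. The return inside the
    // inner loop fires on the very first pass, so the loops never repeat
    // and the method does O(1) work regardless of n.
    public static int sneaky(int n) {
        int total = 0;
        for (int i = 0; i <= n; i++) {
            for (int j = 0; j <= n; j++) {
                total += i * j;
                return total; // always reached on the first inner iteration
            }
        }
        return total;
    }

    public static void main(String[] args) {
        System.out.println(sneaky(1000000)); // prints 0 immediately: 0 * 0
    }
}
```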

28 Big-Oh == Murphy’s Law
Algorithm power(int a, int b ≥ 0)
  if a == 0 && b == 0 then
    return -1
  endif
  exp = 1
  repeat b times
    exp *= a
  end repeat
  return exp
 power takes O(n) time in most cases  Would only take O(1) if a & b are 0  ____ algorithm overall

29 Big-Oh == Murphy’s Law
Algorithm power(int a, int b ≥ 0)
  if a == 0 && b == 0 then
    return -1
  endif
  exp = 1
  repeat b times
    exp *= a
  end repeat
  return exp
 power takes O(n) time in most cases  Would only take O(1) if a & b are 0  O(n) algorithm overall; big-Oh uses the worst case
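
The pseudo-code above can be sketched in Java as follows (an added illustration, using a counted for-loop for "repeat b times"):

```java
public class Power {
    // O(b) multiplications in the common case, with the special-case
    // O(1) return of -1 when both arguments are 0
    public static long power(int a, int b) {
        if (a == 0 && b == 0) {
            return -1;
        }
        long exp = 1;
        for (int i = 0; i < b; i++) { // "repeat b times"
            exp *= a;
        }
        return exp;
    }

    public static void main(String[] args) {
        System.out.println(power(2, 10)); // prints 1024
    }
}
```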

30 How Big Am I?
Algorithm sum(int[][] a)
  total = 0
  for i = 0 to a.length do
    for j = 0 to a[i].length do
      total += a[i][j]
    end for
  end for
  return total
 Despite nested loops, this runs in O(n) time  Input to this method is a doubly-subscripted array  For this method, n is the number of entries in the array
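
In runnable Java (an added sketch), the key point is visible: every entry is touched exactly once, so the work is proportional to the total entry count, not to the product of the dimensions:

```java
public class ArraySum {
    // Nested loops, but each entry is visited exactly once, so the time is
    // O(n) where n is the total number of entries across all the rows
    public static int sum(int[][] a) {
        int total = 0;
        for (int i = 0; i < a.length; i++) {
            for (int j = 0; j < a[i].length; j++) {
                total += a[i][j];
            }
        }
        return total;
    }

    public static void main(String[] args) {
        int[][] ragged = { {1, 2}, {3, 4, 5} };
        System.out.println(sum(ragged)); // prints 15 after visiting all 5 entries
    }
}
```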

31 Handling Method Calls  A method call is an O(1) operation, …  … but we must also add the time spent running the called method  Big-Oh counts all operations executed in total  Remember: there is no such thing as a free lunch  Borrowing $5 to pay does not make your lunch free  Similarly, need to include all operations executed  Which method they run in DOES NOT MATTER

32 Methods Calling Methods
public static int sumOdds(int n) {
  int sum = 0;
  for (int i = 1; i <= n; i += 2) { sum += i; }
  return sum;
}
public static void oddSeries(int n) {
  for (int i = 1; i < n; i++) {
    System.out.println(i + " " + sumOdds(n));
  }
}
 oddSeries calls sumOdds n times  Each call does O(n) work, so this takes O(n^2) total time!

33 Justifying an Answer  Important to explain your answer  Saying O(n) is not enough to make it O(n)  Methods using recursion are especially hard to determine  Derive the difficult answer using a simple process  May find that you can simplify the big-Oh computation  May find a smaller or larger big-Oh than imagined  Can be a proof, but need not be that formal  Explaining your answer is critical for this  Helps you be able to convince others

36 Big-Oh Notation
Algorithm factorial(int n)
  if n <= 1 then
    return 1
  else
    fact = factorial(n – 1)
    return n * fact
  endif
 Ignoring the cost of its recursive calls, each call runs in O(1) time  At most n – 1 recursive calls, since n decreases by 1 each time  Method’s total complexity is O(n)  Runs O(n – 1) * O(1) = O(n – 1) = O(n) operations
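
The recursive pseudo-code above, written as Java (an added sketch), shows the O(n) structure: O(1) work per call, at most one recursive call each:

```java
public class RecursiveFactorial {
    // Each call does O(1) work and makes at most one recursive call,
    // so there are roughly n calls in total: O(n)
    public static long factorial(int n) {
        if (n <= 1) {
            return 1;
        }
        long fact = factorial(n - 1);
        return n * fact;
    }

    public static void main(String[] args) {
        System.out.println(factorial(6)); // prints 720
    }
}
```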

37 Big-Oh Notation
Algorithm fib(int n)
  if n <= 1 then
    return n
  else
    return fib(n-1) + fib(n-2)
  endif
 O(1) time for each of O(2^n) calls = O(2^n) complexity  Calls fib(1), fib(0) when n = 2  n = 3, total of 4 calls: 3 for fib(2) + 1 for fib(1)  n = 4, total of 8 calls: 5 for fib(3) + 3 for fib(2)  Number of calls roughly doubles when n is incremented = O(2^n)
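
The roughly-doubling call count can be computed directly. This added Java sketch counts every call, including the initial one (so its totals are one higher than the slide's subcall counts):

```java
public class FibCalls {
    // Counts every call made while evaluating fib(n), including the first,
    // to make the roughly-doubling growth behind the O(2^n) bound visible
    public static long calls(int n) {
        if (n <= 1) {
            return 1; // a base case is a single call
        }
        return 1 + calls(n - 1) + calls(n - 2);
    }

    public static void main(String[] args) {
        for (int n = 2; n <= 6; n++) {
            System.out.println("fib(" + n + ") involves " + calls(n) + " calls");
        }
    }
}
```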

38 Your Turn  Get into your groups and complete activity

39 For Next Lecture  Read GT 5.1 – 5.1.1, 5.1.4, 5.1.5 for Friday's class  What is an ADT and how are they defined?  How does a Stack work?  Week #8 weekly assignment is also available  Programming assignment #1 is also on Angel  Pulls everything together and shows off your stuff  Better get moving on it, since it is due on Monday

