Problem of the Day
• On the next slide I wrote today's problem of the day. It has 3 possible answers. Exactly one of the choices below is the solution. Which is it?
  A. Answer 1
  B. Answers 1 or 2
  C. Answer 2
  D. Answers 2 or 3

Problem of the Day
• On the next slide I wrote today's problem of the day. It has 3 possible answers. Exactly one of the choices below is the solution. Which is it?
  A. Answer 1
  B. Answers 1 or 2
  C. Answer 2
  D. Answers 2 or 3
• If answers 1 or 2 were correct, we would not be able to select exactly one solution. So answer 3 (and selection D) must be right.

CSC 212 – Data Structures

Analysis Techniques
• Running time is critical, …
• … but comparing measured times is impossible in many cases
• A single problem may have many possible solutions
• Many implementations are possible for each solution

Pseudo-Code
• Written only for human eyes
• Unimportant & implementation details are ignored
• Serves a very real purpose, even if it is not real code
• Useful for tasks like outlining, designing, & analyzing
• Describes a system in a language-like manner, though not a formal one

Pseudo-Code
• Only needs to include the details needed for tracing
• Loops, assignments, calls to methods, etc.
• Anything that would be helpful when analyzing the algorithm
• Feel free to ignore punctuation & other formalisms
• Understanding & analyzing the algorithm is the only goal of using this

Pseudo-code Example

Algorithm factorial(int n)
  returnVariable = 1
  while (n > 0)
    returnVariable = returnVariable * n
    n = n – 1
  endwhile
  return returnVariable
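
The pseudo-code above maps directly onto Java. As a sketch (the class name FactorialDemo and the use of long are my additions, not from the slide):

```java
public class FactorialDemo {
    // Iterative factorial, mirroring the pseudo-code line by line.
    public static long factorial(int n) {
        long returnVariable = 1;      // returnVariable = 1
        while (n > 0) {               // while (n > 0)
            returnVariable *= n;      //   returnVariable = returnVariable * n
            n = n - 1;                //   n = n - 1
        }                             // endwhile
        return returnVariable;        // return returnVariable
    }

    public static void main(String[] args) {
        System.out.println(factorial(5)); // prints 120
    }
}
```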

“Anything that can go wrong…”
• Big-Oh expresses an algorithm's worst-case complexity
• Worst-case analysis of algorithm performance
• Usually closely correlated with execution time
• Not always right to consider only the worst case
• There may be situations where the worst case is very rare
• Closely related approaches for the other cases come later

“Should I Even Bother?”
• Compare algorithms using big-Oh notation
• Could also be used to compare implementations
• Saves the time of implementing all the algorithms
• Biases like CPU speed, typing speed, & cosmic rays are ignored

Algorithmic Analysis

Algorithm Analysis
• Execution time with n inputs on a 4 GHz machine:

             | n = 10   | n = 50       | n = 100    | n = 1000   | n = 10^6
  O(n log n) | 9 ns     | 50 ns        | 175 ns     | 2500 ns    | 5 ms
  O(n^2)     | 25 ns    | 625 ns       | 2250 ns    | … ns       | 4 min
  O(n^5)     | 25000 ns | 72.5 ms      | 2.7 s      | 2.9 days   | 1×10^13 yrs
  O(2^n)     | 2500 ns  | 3.25 days    | 1×10^… yrs | 1×10^… yrs | Too long!
  O(n!)      | 1 ms     | 1.4×10^… yrs | 7×10^… yrs | Too long!  |

Big-Oh Notation
• Want results for large data sets
• Nobody cares about a 2-minute-long program
• Limit considerations to only the major details
• Ignore multipliers
• So O(⅛n) = O(5n) = O(50000n) = O(n)
• Multipliers are usually implementation-specific
• How many 5 ms runs can we fit into 4 minutes?
• Ignore lesser terms
• So O(⅚n^5 + n^2) = O(n^5 + n^2) = O(n^5)
• Would you tolerate an extra 17 minutes after waiting 3×10^13 years?

What is n?
• Big-Oh analysis is always relative to input size
• But determining the input size is not always clear
• Quick rules of thumb: consider what the algorithm is processing
  Analyzing values below x: n = x
  Analyzing data in an array: n = size of the array
  Analyzing a linked list: n = size of the linked list
  Analyzing 2 arrays: n = sum of the arrays' sizes

Analyzing an Algorithm
• Big-Oh counts the primitive operations executed:
  Assignments
  Calling a method
  Performing an arithmetic operation
  Comparing two values
  Getting an entry from an array
  Following a reference
  Returning a value from a method
  Accessing a field

Primitive Statements
• The basis of programming; each takes constant time: O(1)
• The fastest possible big-Oh notation
• A sequence of primitive statements also runs in O(1) time
• But only if the input does not affect the sequence
• Ignore the constant multiplier: O(5) = O(5 * 1) = O(1)

Simple Loops

for (int i = 0; i < n; i++) { }
  -or-
while (i < n) { i++; }

• Each loop executes n times
• Only primitive statements appear within the body of the loop
• Big-Oh complexity of a single loop iteration: O(1)
• Either loop runs O(n) iterations
• So each loop has O(n) * O(1) = O(n) complexity total

Loops In a Row

for (int i = 0; i < n; i++) { }
int i = 0;
while (i < n) { i++; }

• Add the complexities of statements in sequence to compute the total
• For this example, the total big-Oh complexity is:
  O(n) + O(1) + O(n) = O(2 * n + 1) = O(n)

More Complicated Loops

for (int i = 0; i < n; i += 2) { }

• i → 0, 2, 4, 6, ..., n
• In the above example, the loop executes n/2 iterations
• Each iteration takes O(1) time, so the total complexity is:
  O(n/2) * O(1) = O(n * ½ * 1) = O(n)

Really Complicated Loops

for (int i = 1; i < n; i *= 2) { }

• i → 1, 2, 4, 8, ..., n
• In the above code, the loop executes log_2 n iterations
• Each iteration takes O(1) time, so the total complexity is:
  O(log_2 n) * O(1) = O(log_2 n * 1) = O(log_2 n)

Really Complicated Loops

for (int i = 1; i < n; i *= 3) { }

• i → 1, 3, 9, 27, ..., n
• In the above code, the loop executes log_3 n iterations
• Each iteration takes O(1) time, so the total complexity is:
  O(log_3 n) * O(1) = O(log_3 n * 1) = O(log_3 n)

Math Moment
• All logarithms are related, no matter the base
• Changing base means multiplying by a constant: log_2 n = log_2 3 * log_3 n
• But big-Oh notation ignores constant multiples
• So we can consider all O(log n) solutions identical
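
A quick Java check of this point (countIterations is my own illustration, not from the slides): loops that multiply by 2 and by 3 take iteration counts that differ only by a constant factor, so both are O(log n).

```java
public class LogLoops {
    // Counts iterations of: for (i = 1; i < n; i *= base) { }
    public static int countIterations(int n, int base) {
        int count = 0;
        for (long i = 1; i < n; i *= base) {
            count++;
        }
        return count;
    }

    public static void main(String[] args) {
        int n = 1 << 20;  // n = 2^20 = 1,048,576
        System.out.println(countIterations(n, 2));  // 20, i.e. log_2 n
        System.out.println(countIterations(n, 3));  // 13, about log_3 n
        // 20 / 13 ~ 1.58 ~ log_2 3: a constant multiple for every n,
        // so big-Oh treats both loops as O(log n).
    }
}
```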

Nested Loops

for (int i = 0; i < n; i++) {
  for (int j = 0; j < n; j++) { }
}

• The program executes the outer loop n times
• The inner loop runs n times during each iteration of the outer loop
• That is O(n) iterations doing O(n) work each iteration
• So the nesting has O(n) * O(n) = O(n^2) complexity total
• Loop complexities multiply when nested
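
Counting the primitive operations makes the multiplication visible. A small sketch (countOps is my name for the counter, not part of the slide code):

```java
public class NestedLoops {
    // Counts how many times the innermost statement executes
    // for the doubly-nested loop shown above.
    public static int countOps(int n) {
        int ops = 0;
        for (int i = 0; i < n; i++) {
            for (int j = 0; j < n; j++) {
                ops++;  // stands in for the O(1) loop body
            }
        }
        return ops;
    }

    public static void main(String[] args) {
        System.out.println(countOps(50));  // 2500 = 50 * 50, i.e. n^2
    }
}
```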

• Only care about approximate behavior on huge data sets
• Ignore constant multiples
• Drop lesser terms (& n! > 2^n > n^5 > n^2 > n > log n > 1)
• O(1) time for primitive statements to execute
• Changing by a constant amount in a loop: O(n) time
• O(log n) time if multiplying by a constant in the loop
• Ignore constants: it does not matter what the constant is
• When code is sequential, add the complexities
• When code is nested, multiply the complexities

It’s About Time

Algorithm sneaky(int n)
  total = 0
  for i = 0 to n do
    for j = 0 to n do
      total += i * j
      return total
    end for
  end for

• sneaky would take _____ time to execute
• O(n) iterations for each loop in the method

It’s About Time

Algorithm sneaky(int n)
  total = 0
  for i = 0 to n do
    for j = 0 to n do
      total += i * j
      return total
    end for
  end for

• sneaky would take O(1) time to execute
• There are O(n) iterations for each loop in the method…
• …but on the first pass, the method ends at the return
• It always executes the same number of operations
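
The same trick can be tried in runnable Java (the ops counter is my addition for illustration): no matter how large n is, the loop body executes exactly once before the return.

```java
public class Sneaky {
    static int ops;  // counts loop-body executions

    public static int sneaky(int n) {
        ops = 0;
        int total = 0;
        for (int i = 0; i <= n; i++) {
            for (int j = 0; j <= n; j++) {
                total += i * j;
                ops++;
                return total;  // exits during the very first iteration
            }
        }
        return total;
    }

    public static void main(String[] args) {
        sneaky(10);
        System.out.println(ops);  // 1
        sneaky(1_000_000);
        System.out.println(ops);  // still 1: O(1), not O(n^2)
    }
}
```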

Big-Oh == Murphy’s Law

Algorithm power(int a, int b ≥ 0)
  if a == 0 && b == 0 then
    return -1
  endif
  exp = 1
  repeat b times
    exp *= a
  end repeat
  return exp

• power takes O(n) time in most cases
• It would only take O(1) time if a & b are 0
• ____ algorithm overall

Big-Oh == Murphy’s Law

Algorithm power(int a, int b ≥ 0)
  if a == 0 && b == 0 then
    return -1
  endif
  exp = 1
  repeat b times
    exp *= a
  end repeat
  return exp

• power takes O(n) time in most cases
• It would only take O(1) time if a & b are 0
• O(n) algorithm overall; big-Oh uses the worst case

How Big Am I?

Algorithm sum(int[][] a)
  total = 0
  for i = 0 to a.length do
    for j = 0 to a[i].length do
      total += a[i][j]
    end for
  end for
  return total

• Despite the nested loops, this runs in O(n) time
• The input to this method is a doubly-subscripted array
• For this method, n is the number of entries in the array
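
In Java this might look as follows (a sketch; the ops counter is mine, added to show that the work is proportional to the number of entries, not rows times columns):

```java
public class ArraySum {
    static int ops;  // counts inner-loop body executions

    // Sum of every entry in a (possibly ragged) 2D array.
    public static int sum(int[][] a) {
        ops = 0;
        int total = 0;
        for (int i = 0; i < a.length; i++) {
            for (int j = 0; j < a[i].length; j++) {
                total += a[i][j];
                ops++;
            }
        }
        return total;
    }

    public static void main(String[] args) {
        int[][] a = { {1, 2, 3}, {4}, {5, 6} };  // 6 entries in all
        System.out.println(sum(a));  // 21
        System.out.println(ops);     // 6: one step per entry, so O(n)
    }
}
```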

Handling Method Calls
• A method call is an O(1) operation, …
• … but we also need to add the time spent running the method
• Big-Oh counts all operations executed in total
• Remember: there is no such thing as a free lunch
• Borrowing $5 to pay does not make your lunch free
• Similarly, we need to include all operations executed
• Which method they run in DOES NOT MATTER

Methods Calling Methods

public static int sumOdds(int n) {
  int sum = 0;
  for (int i = 1; i <= n; i += 2) {
    sum += i;
  }
  return sum;
}

public static void oddSeries(int n) {
  for (int i = 1; i < n; i++) {
    System.out.println(i + " " + sumOdds(n));
  }
}

• oddSeries calls sumOdds n times
• Each call does O(n) work, so oddSeries takes O(n^2) total time!

Justifying an Answer
• It is important to explain your answer
• Saying O(n) is not enough to make it O(n)
• Methods using recursion are especially hard to determine
• Derive the difficult answer using a simple process
• You may find that you can simplify the big-Oh computation
• You may find a smaller or larger big-Oh than imagined
• This can be a proof, but it need not be that formal
• Explaining your answer is critical for this
• It helps you be able to convince others

Big-Oh Notation

Algorithm factorial(int n)
  if n <= 1 then
    return 1
  else
    fact = factorial(n – 1)
    return n * fact
  endif

• Ignoring the cost of its recursive call, each call runs in O(1) time
• At most n – 1 recursive calls, since n decreases by 1 each time
• The method’s total complexity is O(n)
• It runs O(n – 1) * O(1) = O(n – 1) = O(n) operations
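
The call count can be verified in Java (the static calls counter is my addition for illustration):

```java
public class RecursiveFactorial {
    static int calls;  // counts how many times factorial is entered

    // Recursive factorial matching the pseudo-code above.
    public static long factorial(int n) {
        calls++;
        if (n <= 1) {
            return 1;
        }
        long fact = factorial(n - 1);
        return n * fact;
    }

    public static void main(String[] args) {
        calls = 0;
        System.out.println(factorial(5));  // 120
        System.out.println(calls);         // 5 calls in all: O(n) total work
    }
}
```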

Big-Oh Notation

Algorithm fib(int n)
  if n <= 1 then
    return n
  else
    return fib(n-1) + fib(n-2)
  endif

• O(1) time for each of O(2^n) calls = O(2^n) complexity
• Calls fib(1) & fib(0) when n = 2
• n = 3 makes a total of 4 calls: 3 for fib(2) + 1 for fib(1)
• n = 4 makes a total of 8 calls: 5 for fib(3) + 3 for fib(2)
• The number of calls roughly doubles each time n is incremented = O(2^n)
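
The exponential growth can be observed directly. In this sketch, countCalls (my helper, not from the slide) counts every invocation of fib, including the initial one, so each total is one higher than the counts on the slide:

```java
public class FibCalls {
    static int calls;  // counts invocations of fib

    // Recursive Fibonacci matching the pseudo-code above.
    public static int fib(int n) {
        calls++;
        if (n <= 1) {
            return n;
        }
        return fib(n - 1) + fib(n - 2);
    }

    public static int countCalls(int n) {
        calls = 0;
        fib(n);
        return calls;
    }

    public static void main(String[] args) {
        for (int n = 2; n <= 6; n++) {
            System.out.println(n + ": " + countCalls(n));
        }
        // prints 2: 3, 3: 5, 4: 9, 5: 15, 6: 25 --
        // the totals nearly double with each step, bounded by O(2^n)
    }
}
```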

Your Turn
• Get into your groups and complete the activity

For Next Lecture
• Read GT 5.1 – 5.1.1 & 5.1.4 for Friday's class
• What is an ADT and how are they defined?
• How does a Stack work?
• Week #8 weekly assignment is also available
• Programming assignment #1 is also on Angel
• It pulls everything together and shows off your stuff
• Better get moving on it, since it is due on Monday