
Chapter 10 Algorithm Efficiency CS 302 - Data Structures Mehmet H Gunes Modified from authors’ slides

Contents What Is a Good Solution? Measuring the Efficiency of Algorithms Data Structures and Problem Solving with C++: Walls and Mirrors, Carrano and Henry, © 2013

Algorithm Efficiency There are often many approaches (algorithms) to solving a problem. How do we choose between them? At the heart of computer program design are two (sometimes conflicting) goals: (1) to design an algorithm that is easy to understand, code, and debug; (2) to design an algorithm that makes efficient use of the computer's resources.

Algorithm Efficiency (cont.) Goal (1) is the concern of software engineering. Goal (2) is the concern of data structures and algorithm analysis. When goal (2) is important, how do we measure an algorithm's cost?

What Is a Good Solution? Criterion: a solution is good if the total cost it incurs over all phases of its life is minimal. Keep in mind that efficiency is only one aspect of a solution's cost. Note: the relative importance of the various components of a solution's cost has changed since the early days of computing.

Measuring Efficiency of Algorithms Comparison of algorithms should focus on significant differences in efficiency. Difficulties with comparing programs instead of algorithms: How are the algorithms coded? What computer should you use? What data should the programs use?

Execution Time of Algorithms Traversal of linked nodes, for example: displaying the data in a linked chain of n nodes requires time proportional to n.

Algorithm Growth Rates Measure an algorithm's time requirement as a function of the problem size. Compare algorithm efficiencies for large problems. Look only at significant differences.

Algorithm Growth Rates Time requirements as a function of the problem size n

Best, Worst, Average Cases Not all inputs of a given size take the same time to run. Sequential search for K in an array of n integers: begin at the first element of the array and look at each element in turn until K is found. Best case: K is at the first position; cost is 1 compare. Worst case: K is at the last position; cost is n compares. Average case: (n+1)/2 compares, if we assume the element with value K is equally likely to be in any position in the array.
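A minimal C++ sketch of the sequential search just described (the function name and signature are illustrative, not from the slides):

    // Sequential search: returns the index of K in A[0..n-1], or -1 if absent.
    int sequentialSearch(const int A[], int n, int K) {
        for (int i = 0; i < n; i++)
            if (A[i] == K)
                return i;   // best case: K at A[0], one compare
        return -1;          // worst case: n compares (K last or absent)
    }

With K at A[0] the loop performs one comparison; with K at A[n-1] it performs n.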

Time Analysis Provides upper and lower bounds of running time.

Worst Case Provides an upper bound on running time. An absolute guarantee that the algorithm will not run longer, no matter what the input is.

Best Case Provides a lower bound on running time. Input is the one for which the algorithm runs the fastest.

Average Case Provides an estimate of "average" running time. Assumes that the input is random. Useful when the best/worst cases do not happen very often, i.e., when few inputs lead to the best or worst case.

Which Analysis to Use? While average time appears to be the fairest measure, it may be difficult to determine: average-time analysis requires knowledge of the input distribution, such as the equally-likely-position assumption used for the average case in the last example. When is worst-case time important? Worst-case time matters for real-time algorithms.

How to Measure Efficiency? Critical resources: time, space (disk, RAM), battery, bandwidth, programmer effort, and ease of use (user's effort). Factors affecting running time: machine load, OS, compiler, problem size, and the specific input values for a given problem size. For most algorithms, running time depends on the "size" of the input; it is expressed as T(n) for some function T on input size n. Empirical comparison is difficult to do "fairly" and is time consuming.

How do we analyze an algorithm? We need to define objective measures. (1) Compare execution times? Empirical comparison (run the programs). Not good: times are specific to a particular machine. (2) Count the number of statements executed? Not good: the number of statements varies with the programming language and programming style.

How do we analyze an algorithm? (cont.) (3) Express running time t as a function of problem size n (i.e., t = f(n)). Asymptotic algorithm analysis: given two algorithms having running times f(n) and g(n), find which function grows faster. Such an analysis is independent of machine speed, programming style, etc.

Comparing algorithms Given two algorithms having running times f(n) and g(n), how do we decide which one is faster? Compare “rates of growth” of f(n) and g(n)

Understanding Rate of Growth The low-order terms of a function are relatively insignificant for large n: n^4 + 100n^2 + 10n + 50 can be approximated by n^4. The highest-order term determines the rate of growth!
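A small sketch (illustrative, not from the slides) makes this concrete by printing the ratio f(n)/n^4 for f(n) = n^4 + 100n^2 + 10n + 50:

    #include <cstdio>

    int main() {
        // Lower-order terms of f(n) fade as n grows: the ratio tends to 1.
        for (double n = 10; n <= 10000; n *= 10) {
            double f = n*n*n*n + 100*n*n + 10*n + 50;
            std::printf("n = %5.0f  f(n)/n^4 = %.6f\n", n, f / (n*n*n*n));
        }
        return 0;
    }

The ratio is about 2.015 at n = 10 but already 1.000100 at n = 1000.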

Visualizing Orders of Growth On a graph, as you go to the right, a faster-growing function eventually becomes larger. For example, plotting fA(n) = 30n + 8 and fB(n) = n^2 + 1 against increasing n, fB eventually rises above fA.

Rate of Growth The graphs of 3·n^2 and n^2 - 3·n + 10

Rate of Growth ≡ Asymptotic Analysis Using rate of growth as a measure to compare different functions implies comparing them asymptotically, i.e., as n → ∞. If f(x) is faster growing than g(x), then f(x) always eventually becomes larger than g(x) in the limit, i.e., for large enough values of x.

Complexity Let us assume two algorithms, A and B, that solve the same class of problems. The time complexity of A is 5,000n; that of B is 1.1^n, for an input with n elements. For n = 10, A requires 50,000 steps, but B only 3, so B seems to be superior to A. For n = 1000, however, A requires 5×10^6 steps, while B requires about 2.5×10^41 steps.
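A quick sketch (constants taken from the slide; the program itself is illustrative) locates where B's exponential cost overtakes A's linear cost:

    #include <cmath>
    #include <cstdio>

    int main() {
        // Find the smallest n where 1.1^n exceeds 5000n.
        int n = 1;
        while (std::pow(1.1, n) <= 5000.0 * n)
            n++;
        std::printf("B overtakes A at n = %d\n", n);  // prints n = 142
        return 0;
    }

So B is preferable only for inputs smaller than roughly 142 elements; beyond that, A's linear growth wins decisively.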

Names of Orders of Magnitude
O(1)          bounded (by a constant) time
O(log2 N)     logarithmic time
O(N)          linear time
O(N·log2 N)   "N log N" time
O(N^2)        quadratic time
O(N^3)        cubic time
O(2^N)        exponential time

Order of growth of common functions

Growth Rate Graph The lower graph corresponds to the box within the dashed lines in the lower left corner of the upper graph.

A comparison of growth-rate functions

N     log2 N   N·log2 N   N^2      2^N
1     0        0          1        2
2     1        2          4        4
4     2        8          16       16
8     3        24         64       256
16    4        64         256      65,536
32    5        160        1,024    4.29×10^9
64    6        384        4,096    1.84×10^19
128   7        896        16,384   3.40×10^38

A comparison of growth-rate functions

Sample Execution Times

The Growth of Functions A problem that can be solved with polynomial worst-case complexity is called tractable. Problems of higher complexity are called intractable. Problems that no algorithm can solve are called unsolvable.

Analysis and Big O Notation Definition: algorithm A is order f(n), denoted O(f(n)), if constants k and n0 exist such that A requires no more than k·f(n) time units to solve a problem of size n ≥ n0.
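For instance (constants chosen for illustration), the earlier function fA(n) = 30n + 8 is O(n): pick k = 38 and n0 = 1; then for all n ≥ 1, 30n + 8 ≤ 30n + 8n = 38n = k·n, so the definition is satisfied with f(n) = n.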

Asymptotic Notation O notation: asymptotic "less than": f(n) = O(g(n)) implies f(n) "≤" c·g(n) in the limit*, where c is a constant (used in worst-case analysis). *formal definition in CS477/677

Asymptotic Notation Ω notation: asymptotic "greater than": f(n) = Ω(g(n)) implies f(n) "≥" c·g(n) in the limit*, where c is a constant (used in best-case analysis). *formal definition in CS477/677

Asymptotic Notation Θ notation: asymptotic "equality": f(n) = Θ(g(n)) implies f(n) "=" c·g(n) in the limit*, where c is a constant (provides a tight bound on running time; best and worst cases are the same). *formal definition in CS477/677
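As a quick worked instance of Θ (constants chosen for illustration): 3n^2 + 2n = Θ(n^2), since 3n^2 ≤ 3n^2 + 2n ≤ 5n^2 for all n ≥ 1, giving constants c1 = 3 and c2 = 5.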

More on big-O O(g(n)) is a set of functions. f(n) ∈ O(g(n)) if "f(n) ≤ c·g(n)" for sufficiently large n.

A Common Misunderstanding Confusing worst case with upper bound. Upper bound refers to a growth rate. Worst case refers to the worst input from among the choices for possible inputs of a given size.

Algorithm Speed vs. Function Growth An O(n^2) algorithm will be slower than an O(n) algorithm (for large n). But an O(n^2) function will grow faster than an O(n) function: compare fA(n) = 30n + 8 with fB(n) = n^2 + 1 as n increases.

Faster Computer or Algorithm? Suppose we buy a computer 10 times faster. Let n be the size of input that can be processed in one second on the old computer (1,000 computational units) and n' the size that can be processed in one second on the new computer (10,000 units):

T(n)     n      n'       Change           n'/n
10n      100    1,000    n' = 10n         10
10n^2    10     31.6     n' = √10·n       3.16
10^n     3      4        n' = n + 1       1 + 1/n

How much speedup? 10 times. More important: how much increase in problem size for the same time expended? That depends on the growth rate. For T(n) = 2^n, n' = n + log2 10 ≈ n + 3.3, so if n = 1000 then n' would be only about 1003.

How do we find f(n)? (1) Associate a "cost" with each statement. (2) Find the total number of times each statement is executed. (3) Add up the costs.

Properties of Growth-Rate Functions Ignore low-order terms. Ignore a multiplicative constant in the high-order term. O(f(n)) + O(g(n)) = O(f(n) + g(n)). Be aware of worst case vs. average case.

Keeping Your Perspective Array-based getEntry is O(1); link-based getEntry is O(n). Consider how frequently particular ADT operations occur in a given application. Some seldom-used but critical operations must be efficient.

Keeping Your Perspective If the problem size is always small, you can probably ignore an algorithm's efficiency. Weigh the trade-offs between an algorithm's time and memory requirements. Compare algorithms for both style and efficiency.

Efficiency of Searching Algorithms Sequential search: worst case O(n), average case O(n), best case O(1). Binary search of a sorted array: worst case O(log2 n); remember the overhead required to keep the array sorted.
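A minimal iterative sketch of binary search on a sorted array (C++; names are illustrative), showing why the worst case is O(log2 n):

    // Binary search in sorted A[0..n-1]; returns index of K, or -1 if absent.
    int binarySearch(const int A[], int n, int K) {
        int lo = 0, hi = n - 1;
        while (lo <= hi) {
            int mid = lo + (hi - lo) / 2;   // midpoint without (lo+hi) overflow
            if (A[mid] == K)
                return mid;
            else if (A[mid] < K)
                lo = mid + 1;               // discard the left half
            else
                hi = mid - 1;               // discard the right half
        }
        return -1;
    }

Each iteration halves the remaining range, so at most about log2 n + 1 iterations occur.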

Running Time Examples The body of the while loop costs O(N):

    while (i < N) {
        X = X + Y;            // O(1)
        result = mystery(X);  // O(N)
        i++;                  // O(1)
    }

The loop is executed N times, so the total is N × O(N) = O(N^2).

Running Time Examples

    if (i < j)
        for (i = 0; i < N; i++)  // O(N)
            X = X + i;
    else
        X = 0;                   // O(1)

Total: max(O(N), O(1)) = O(N).

Running Time Examples (cont'd)

    Algorithm 1                  Cost
    arr[0] = 0;                  c1
    arr[1] = 0;                  c1
    arr[2] = 0;                  c1
    ...
    arr[N-1] = 0;                c1
                                 -----------
                                 c1 + c1 + ... + c1 = c1 × N

    Algorithm 2                  Cost
    for (i = 0; i < N; i++)      c2
        arr[i] = 0;              c1
                                 -----------
                                 (N+1) × c2 + N × c1 = (c2 + c1) × N + c2

Running Time Examples (cont'd)

                                 Cost
    sum = 0;                     c1
    for (i = 0; i < N; i++)      c2
        for (j = 0; j < N; j++)  c2
            sum += arr[i][j];    c3
                                 ------------
    c1 + c2 × (N+1) + c2 × N × (N+1) + c3 × N × N

Complexity Examples What does the following algorithm compute?

    #include <cstdlib>  // for abs

    int who_knows(const int a[], int n) {
        int m = 0;
        for (int i = 0; i < n; i++)
            for (int j = i + 1; j < n; j++)
                if (abs(a[i] - a[j]) > m)
                    m = abs(a[i] - a[j]);
        return m;
    }

It returns the maximum difference between any two numbers in the input array. Comparisons: (n-1) + (n-2) + ... + 1 = n(n-1)/2 = 0.5n^2 - 0.5n, so the time complexity is O(n^2).

Complexity Examples Another algorithm solving the same problem:

    int max_diff(const int a[], int n) {
        int min = a[0];
        int max = a[0];
        for (int i = 1; i < n; i++)
            if (a[i] < min) min = a[i];
            else if (a[i] > max) max = a[i];
        return max - min;
    }

Comparisons: 2n - 2, so the time complexity is O(n).

Examples of Growth Rate

    /** @return Position of largest value in "A" */
    static int largest(int[] A) {
        int currlarge = 0;                  // position of largest so far
        for (int i = 1; i < A.length; i++)
            if (A[currlarge] < A[i])
                currlarge = i;              // remember position
        return currlarge;                   // return largest position
    }

As n grows, how does T(n) grow? Cost: T(n) = c1·n + c2 steps.

Examples (cont.)

    sum = 0;
    for (i = 1; i <= n; i++)
        for (j = 1; j <= n; j++)
            sum++;

Example 3: Cost: T(n) = c1·n^2 + c2. Roughly n^2 steps, with sum being n^2 at the end. Ignore various overhead such as loop-counter increments.

Time Complexity Examples (1)

    a = b;

This assignment takes constant time, so it is Θ(1). (Asymptotic analysis is defined for equations; we need to convert programs to equations to analyze them. The traditional notation is Θ(1), not Θ(c).)

    sum = 0;
    for (i = 1; i <= n; i++)
        sum += n;

This loop is Θ(n), even though the final value of sum is n^2.

Time Complexity Examples (2)

    sum = 0;
    for (j = 1; j <= n; j++)
        for (i = 1; i <= j; i++)
            sum++;
    for (k = 0; k < n; k++)
        A[k] = k;

The first statement is Θ(1). The double for loop executes Σ j = n(n+1)/2 times, which is Θ(n^2). The final for loop is Θ(n). Result: Θ(n^2).

Time Complexity Examples (3)

    sum1 = 0;
    for (i = 1; i <= n; i++)
        for (j = 1; j <= n; j++)
            sum1++;

    sum2 = 0;
    for (i = 1; i <= n; i++)
        for (j = 1; j <= i; j++)
            sum2++;

The only difference is j <= n vs. j <= i. In the first pair of loops, sum1 is n^2; in the second, sum2 is n(n+1)/2. Both are Θ(n^2).

Time Complexity Examples (4)

    sum1 = 0;
    for (k = 1; k <= n; k *= 2)
        for (j = 1; j <= n; j++)
            sum1++;

    sum2 = 0;
    for (k = 1; k <= n; k *= 2)
        for (j = 1; j <= k; j++)
            sum2++;

The first pair of loops costs n for each of the log n values of k, or Θ(n log n). The second costs 2^k for k = 0 to log n - 1, a geometric series summing to about 2n, or Θ(n).

Example Analyze the complexity of the following code.
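The code for this final slide is not preserved in the transcript; as a stand-in, here is a representative snippet of the kind this exercise targets, with its analysis:

    int count = 0;
    for (int i = n; i > 1; i /= 2)       // i halves each pass: about log2(n) passes
        for (int j = 0; j < i; j++)      // i iterations on this pass
            count++;

A naive reading ("log n passes, up to n each") suggests O(n log n), but the inner-loop totals form a geometric series, n + n/2 + n/4 + ... + 2 ≈ 2n, so the complexity is actually Θ(n).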