Asymptotic Analysis-Ch. 3


Asymptotic Analysis-Ch. 3
Names for orders of growth for classes of algorithms, listed in order of increasing growth rate:
- constant: Θ(n^0) = Θ(1)
- logarithmic: Θ(lg n)
- linear: Θ(n)
- "en log en": Θ(n lg n)
- quadratic: Θ(n^2)
- cubic: Θ(n^3)
- polynomial: Θ(n^k), k ≥ 1
- exponential: Θ(a^n), a > 1
Note that linear, quadratic, and cubic are all polynomial functions.

Asymptotic analysis is valid only in the limit. Example: an algorithm with running time of order n^2 will "eventually" (i.e., for sufficiently large n) run slower than one with running time of order n, which in turn will eventually run slower than one with running time of order lg n. "Big Oh", "Theta", and "Big Omega" are the tools we will use to make these notions precise. Note: by saying valid "in the limit" or "asymptotically", we mean that the comparison may not hold for small values of n. So lg n ∈ O(n) and n ∈ O(n^2).
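As an illustration (not from the original slides), this short Python sketch uses made-up constant factors to show the crossover: a quadratic algorithm can be faster for small n, but a linear one always wins once n is large enough.

    # Hypothetical cost models: a linear algorithm with a large constant factor
    # versus a quadratic algorithm with a small one.
    def linear_cost(n):
        return 100 * n      # "order n" algorithm, constant factor 100

    def quadratic_cost(n):
        return n * n        # "order n^2" algorithm, constant factor 1

    # Find the crossover: the smallest n beyond which quadratic is always slower.
    n = 1
    while quadratic_cost(n) <= linear_cost(n):
        n += 1
    print("quadratic is slower than linear for all n >=", n)   # prints 101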

"Big Oh" - Upper Bounding Running Time Definition: f(n)  O(g(n)) if there exist constants c > 0 and n0 > 0 such that 0 ≤ f(n) ≤ cg(n) for all n ≥ n0. Intuition: f(n)  O(g(n)) means f(n) is “of order at most”, or “less than or equal to” g(n) when we ignore small values of n f(n) is eventually less than or equal to some constant multiple of g(n) for large enough n For large enough n, some constant multiple of g(n) is an upper bound for f(n) Technically, it is abuse of notation to say that f(n) = O(g(n)). We really mean that f(n) is an element of a set of functions.

Example: (lg n)^2 is O(n)
(lg n)^2 ≤ n for c = 1 and all n ≥ 16, so (lg n)^2 is O(n).
[Figure: plot of f(n) = (lg n)^2 and g(n) = n against input size; the curves cross at n = 16.]

Basic idea: ignore constant-factor differences and lower-order terms.
617n^3 + 277n^2 + 720n + 7 ∈ O(?)

Proving Running Time: finding values of c and n0
Consider f(n) = 5n^3 + 2n^2 + 22n + 6. We claim that f(n) ∈ O(n^3).
Let c = 6 and n0 = 6. Then 5n^3 + 2n^2 + 22n + 6 ≤ 6n^3 for every n ≥ 6 (equivalently, 2n^2 + 22n + 6 ≤ n^3 once n ≥ 6).
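A minimal Python check (an illustration and sanity check, not a proof, and not part of the original slides) that the claimed constants c = 6 and n0 = 6 hold over a finite range of n:

    # Empirically verify 5n^3 + 2n^2 + 22n + 6 <= c * n^3 for n0 <= n < 10000.
    def f(n):
        return 5 * n**3 + 2 * n**2 + 22 * n + 6

    c, n0 = 6, 6
    assert all(f(n) <= c * n**3 for n in range(n0, 10_000))
    print("f(n) <= 6*n^3 holds for every tested n >= 6")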

Proving Running Time: finding values of c and n0
If f(n) = 5n^3 + 2n^2 + 22n + 6, we have seen that f(n) ∈ O(n^3). But f(n) is not in O(n^2), because no choice of positive constants c and n0 makes f(n) ≤ c·n^2 hold for all large enough n.

Logarithms
Asymptotics allow us to ignore the base of a logarithm: changing the base changes only a constant factor, since log_a n = (log_b n) / (log_b a). When we say f(n) ∈ O(log n), the base is unimportant. Usually we use log base 2, written lg n.

Important Notation
Sometimes you will see notation like this: f(n) ∈ O(n^2) + O(n).
Each occurrence of the big-O symbol has its own distinct constant multiple. But the O(n^2) term dominates the O(n) term, so the above is equivalent to f(n) ∈ O(n^2).

Example: InsertionSort
INPUT: an array A of n numbers {a1, a2, ..., an}
OUTPUT: a permutation of the input array {a1, a2, ..., an} such that a1 ≤ a2 ≤ ... ≤ an

    InsertionSort(A)
      for j = 2 to length(A)
        key = A[j]
        i = j - 1
        while i > 0 and A[i] > key
          A[i+1] = A[i]
          i = i - 1
        A[i+1] = key

Time for execution on an input array of length n (if an exact count is made of the number of times each line is executed):
o best-case: b(n) = 5n - 4
o worst-case: w(n) = 3n^2/2 + 11n/2 - 4
In the average case, about half of the items are moved. Don't worry about how these execution times were obtained.
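A direct Python translation of the pseudocode above (a reference sketch; Python lists are 0-indexed, so the loop bounds shift by one relative to the pseudocode):

    def insertion_sort(a):
        # In-place insertion sort, mirroring the pseudocode (0-indexed).
        for j in range(1, len(a)):          # pseudocode: for j = 2 to length(A)
            key = a[j]
            i = j - 1
            while i >= 0 and a[i] > key:    # pseudocode: while i > 0 and A[i] > key
                a[i + 1] = a[i]
                i -= 1
            a[i + 1] = key
        return a

    print(insertion_sort([5, 2, 4, 6, 1, 3]))   # [1, 2, 3, 4, 5, 6]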

Insertion Sort - Time Complexity
Time complexities for insertion sort are:
o best-case: b(n) = 5n - 4
o worst-case: w(n) = 3n^2/2 + 11n/2 - 4
Questions:
1. is b(n) ∈ O(n)? Yes: 5n - 4 < 6n for all n ≥ 0.
2. is w(n) ∈ O(n)? No: 3n^2/2 + 11n/2 - 4 ≥ 3n for all n ≥ 1, and w(n) eventually exceeds c·n for any constant c.
3. is w(n) ∈ O(n^2)? Yes: 3n^2/2 + 11n/2 - 4 < 4n^2 for all n ≥ 0.
4. is w(n) ∈ O(n^3)? Yes: 3n^2/2 + 11n/2 - 4 ≤ 2n^3 for all n ≥ 2.

Plotting run-time graphically
f(n) = 2n + 6. Recall: f(n) ∈ O(g(n)) if there are positive constants c and n0 such that f(n) ≤ c·g(n) for all n ≥ n0.
2n + 6 is O(n), since 2n + 6 ≤ 4n for all n ≥ 3 (take c = 4 and n0 = 3).
[Figure: plot of f(n) = 2n + 6 and g(n) = 4n against input size; the curves cross at n0 = 3.]

Plotting run-time graphically
On the other hand... n^2 is not O(n), because there are no constants c and n0 such that n^2 ≤ c·n for all n ≥ n0 (for any constant c, n^2 > c·n as soon as n > c).
[Figure: plot of g(n) = n^2 against 4n over input size, crossing near n0 = 4. The graph is not accurate; a quadratic function does not grow in a straight line. It is just to give an idea of how a quadratic function compares to a linear one in terms of growth rate.]

"Big Omega" - Lower Bounding Running Time Definition: f(n)  (g(n)) if there exist constants c > 0 and n0 > 0 such that f(n) ≥ cg(n) for all n ≥ n0. Intuition: f(n)  (g(n)) means f(n) is “of order at least” or “greater than or equal to” g(n) when we ignore small values of n . f(n) is eventually greater than or equal to some constant multiple of g(n) for large enough n For large enough n, some constant multiple of g(n) is a lower bound for f(n)

Insertion Sort - Time Complexity
Time complexities for insertion sort are:
o best-case: b(n) = 5n - 4
o worst-case: w(n) = 3n^2/2 + 11n/2 - 4
Questions:
1. is b(n) ∈ Ω(n)? Yes: 5n - 4 ≥ 2n for all n ≥ 2.
2. is w(n) ∈ Ω(n)? Yes: 3n^2/2 + 11n/2 - 4 ≥ 3n for all n ≥ 1.
3. is w(n) ∈ Ω(n^2)? Yes: 3n^2/2 + 11n/2 - 4 ≥ n^2 for all n ≥ 1.
4. is w(n) ∈ Ω(n^3)? No: 3n^2/2 + 11n/2 - 4 < n^3 for all n ≥ 3 (in fact w(n)/n^3 → 0, so no positive constant multiple of n^3 is a lower bound).

"Theta" - Tightly Bounding Running Time Definition: f(n)  (g(n)) if there exist constants c1, c2 > 0 and n0 > 0 such that c1g(n) ≤ f(n) ≤ c2g(n) for all n ≥ n0. Intuition: f(n)  (g(n)) means f(n) is “of the same order as”, or “equal to” g(n) when we ignore small values of n. f(n) is eventually trapped between two constant multiples of g(n) for large enough n Showing "Theta" relationships: Show both a "Big Oh" and "Big Omega" relationship.

Insertion Sort - Time Complexity
Time complexities for insertion sort are:
o best-case: b(n) = 5n - 4
o worst-case: w(n) = 3n^2/2 + 11n/2 - 4
Questions:
1. is b(n) ∈ Θ(n)? Yes, because b(n) ∈ O(n) and b(n) ∈ Ω(n).
2. is w(n) ∈ Θ(n)? No, because w(n) is not in O(n).
3. is w(n) ∈ Θ(n^2)? Yes, because w(n) ∈ O(n^2) and w(n) ∈ Ω(n^2).
4. is w(n) ∈ Θ(n^3)? No, because w(n) is not in Ω(n^3).
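Putting the earlier constants together (a worked check, not on the original slide): the Big-Oh slide gives 3n^2/2 + 11n/2 - 4 ≤ 4n^2 for all n ≥ 0, and the Big-Omega slide gives 3n^2/2 + 11n/2 - 4 ≥ n^2 for all n ≥ 1, so the Θ(n^2) definition is satisfied with c1 = 1, c2 = 4, and n0 = 1.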

Asymptotic Analysis
Classifying algorithms is generally done in terms of worst-case running time:
- O(f(n)): Big Oh, an asymptotic upper bound
- Ω(f(n)): Big Omega, an asymptotic lower bound
- Θ(f(n)): Theta, an asymptotic tight bound
Reasons for analyzing worst-case running time:
- The worst case gives us a guaranteed upper bound on the running time for any input.
- The worst case occurs often for some algorithms.
- We could analyze the average case... but this is often nearly as bad as the worst case, and the analysis is significantly more complex.

Useful Properties for Asymptotic Analysis
- If f(n) ∈ O(g(n)) and g(n) ∈ O(h(n)), then f(n) ∈ O(h(n)) (transitivity).
  Intuition: if f(n) "≤" g(n) and g(n) "≤" h(n), then f(n) "≤" h(n). Transitivity also holds for Ω and Θ.
- f(n) ∈ O(g(n)) iff g(n) ∈ Ω(f(n)) (transpose symmetry).
  Intuition: f(n) "≤" g(n) iff g(n) "≥" f(n).
- f(n) ∈ Θ(g(n)) iff g(n) ∈ Θ(f(n)) (symmetry).
  Intuition: f(n) "=" g(n) iff g(n) "=" f(n).
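For instance (an added example): n ∈ O(n^2) and n^2 ∈ O(n^3), so by transitivity n ∈ O(n^3); and since n ∈ O(n^2), transpose symmetry gives n^2 ∈ Ω(n).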

Useful Properties for Asymptotic Analysis
- O(f(n) + g(n)) = O(max(f(n), g(n))), e.g., O(n^3 + n) = O(n^3).
- O(f(n)) * O(g(n)) = O(f(n) * g(n)), e.g., O(n^3 * n) = O(n^4).

Little Oh "Little Oh" notation is used to denote strict upper bounds, (Big-Oh bounds are not necessarily strict inequalities). Definition: f(n)  o(g(n)) if for every c > 0, there exists some n0 > 0 such that for all n  n0, f(n) < cg(n). Intuition: f(n)  o(g(n)) means f(n) is "strictly less than" any constant multiple of g(n) when we ignore small values of n f(n) is trapped below any constant multiple of g(n) for large enough n

Little Omega "Little Omega" notation is used to denote strict lower bounds ( bounds are not necessarily strict inequalities). Definition: f(n)  (g(n)) if for every c > 0, there exists some n0 > 0 such that for all n  n0, f(n) > cg(n). Intuition: f(n)  (g(n)) means f(n) is "strictly greater than" any constant multiple of g(n) when we ignore small values of n f(n) is trapped above any constant multiple of g(n) for large enough n

Little Oh and Little Omega
Showing "Little Oh" and "Little Omega" relationships using limits:
- f(n) ∈ o(g(n)) iff lim (n→∞) f(n)/g(n) = 0
- f(n) ∈ ω(g(n)) iff lim (n→∞) f(n)/g(n) = ∞
Note that a little-oh relationship also implies big-Oh, and a little-omega relationship also implies big-Omega.
Theta relationships can also be shown using limits: f(n) ∈ Θ(g(n)) if lim (n→∞) f(n)/g(n) = c for some constant c > 0.
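A quick worked check (added, not on the original slide): lim (n→∞) n/n^2 = lim (n→∞) 1/n = 0, so n ∈ o(n^2). And for insertion sort's worst case, lim (n→∞) (3n^2/2 + 11n/2 - 4)/n^2 = 3/2 > 0, so w(n) ∈ Θ(n^2), agreeing with the earlier result.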

Handy Asymptotic Facts
- If T(n) is a polynomial function of degree k, then T(n) ∈ O(n^k): remove the lower-order terms.
- n^b ∈ O(a^n) for any constants a > 1 and b > 0: exponentials dominate polynomials. In particular, any exponential function with a base strictly greater than 1 grows faster than any polynomial function.
- Any poly-log function grows more slowly than any linear function.
- n! ∈ o(n^n): the factorial of n grows strictly more slowly than n to the nth power.
- n! ∈ ω(2^n): the factorial of n grows strictly faster than 2 to the nth power.
- lg(n!) ∈ Θ(n lg n) (by Stirling's approximation): the log of n! grows at the same rate as n lg n.
- The base of an exponential function and the degree of a polynomial matter asymptotically, but the base of a logarithm does not: logs to any base grow at the same rate, while a larger exponential base or a higher polynomial degree grows faster.

Iterated logarithm function (lg* n), read "log star of n": the number of times the lg function can be iteratively applied before the result is less than or equal to 1.
- Very slowly growing, e.g., lg*(2^65536) = 5.
- Examples: lg* 2 = 1, lg* 4 = 2, lg* 16 = 3, lg* 65536 = 4.
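A small Python sketch (an illustration, not from the slides) that computes lg* n by repeatedly applying log base 2:

    import math

    def lg_star(n):
        # Count how many times lg must be applied before the result is <= 1.
        count = 0
        while n > 1:
            n = math.log2(n)
            count += 1
        return count

    print(lg_star(2), lg_star(4), lg_star(16), lg_star(65536))   # 1 2 3 4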

Basic asymptotic efficiency classes
Class     | Name        | Sample algorithm type
O(1)      | Constant    | Algorithm ignores its input (i.e., can't even scan the input)
O(lg n)   | Logarithmic | Cuts the problem size by a constant fraction on each iteration
O(n)      | Linear      | Algorithm scans its input (at least)
O(n lg n) | "n-log-n"   | Some divide-and-conquer algorithms
O(n^2)    | Quadratic   | Loop inside a loop ("nested loop")
O(n^3)    | Cubic       | Loop inside a nested loop
O(2^n)    | Exponential | Algorithm generates all subsets of an n-element set (all n-bit binary values)
O(n!)     | Factorial   | Algorithm generates all permutations of an n-element set
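As a quick illustration of where some of these classes come from (hypothetical step counters, not algorithms from the slides):

    def linear(n):          # O(n): a single scan of the input
        steps = 0
        for _ in range(n):
            steps += 1
        return steps

    def quadratic(n):       # O(n^2): a loop nested inside a loop
        steps = 0
        for _ in range(n):
            for _ in range(n):
                steps += 1
        return steps

    def logarithmic(n):     # O(lg n): problem size halved on each iteration
        steps = 0
        while n > 1:
            n //= 2
            steps += 1
        return steps

    print(linear(16), quadratic(16), logarithmic(16))   # 16 256 4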

End of Lecture 3