25 June 2015. Comp 122, Spring 2004: Asymptotic Notation, Review of Functions & Summations.


Asymptotic Complexity
- The running time of an algorithm as a function of input size n, for large n.
- Expressed using only the highest-order term in the expression for the exact running time.
- Instead of the exact running time, we say, e.g., Θ(n²).
- Describes the behavior of the function in the limit.
- Written using asymptotic notation.

Asymptotic Notation: Θ, O, Ω, o, ω
- Defined for functions over the natural numbers. Ex: f(n) = Θ(n²).
- Describes how f(n) grows in comparison to n².
- Each notation defines a set of functions; in practice it is used to compare the growth of two functions.
- The notations describe different rate-of-growth relations between the defining function g(n) and the members of the set it defines.

Θ-notation
For a function g(n), we define Θ(g(n)), big-Theta of g(n), as the set:
  Θ(g(n)) = { f(n) : ∃ positive constants c1, c2, and n0 such that ∀n ≥ n0, 0 ≤ c1·g(n) ≤ f(n) ≤ c2·g(n) }
g(n) is an asymptotically tight bound for f(n).
Intuitively: the set of all functions that have the same rate of growth as g(n).

Θ-notation (continued)
  Θ(g(n)) = { f(n) : ∃ positive constants c1, c2, and n0 such that ∀n ≥ n0, 0 ≤ c1·g(n) ≤ f(n) ≤ c2·g(n) }
- Technically f(n) ∈ Θ(g(n)); the older usage is f(n) = Θ(g(n)). I'll accept either.
- f(n) and g(n) are assumed nonnegative for large n.

Example
  Θ(g(n)) = { f(n) : ∃ positive constants c1, c2, n0 such that ∀n ≥ n0, 0 ≤ c1·g(n) ≤ f(n) ≤ c2·g(n) }
- 10n² − 3n = Θ(n²). What constants c1, c2, and n0 will work?
- Make c1 a little smaller than the leading coefficient and c2 a little bigger.
- To compare orders of growth, look at the leading term.
- Exercise: prove that n²/2 − 3n = Θ(n²).
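As a sanity check on the constants, the membership condition can be tested numerically over a finite range. A minimal sketch (an empirical check, not a proof; the helper theta_witness_holds and the range cap are my own):

```python
# Empirical check (not a proof) that 10n^2 - 3n = Theta(n^2):
# verify 0 <= c1*g(n) <= f(n) <= c2*g(n) over a finite range of n.
def theta_witness_holds(f, g, c1, c2, n0, n_max=10_000):
    return all(0 <= c1 * g(n) <= f(n) <= c2 * g(n) for n in range(n0, n_max))

f = lambda n: 10 * n**2 - 3 * n
g = lambda n: n**2

# c1 slightly below the leading coefficient, c2 at (or above) it:
print(theta_witness_holds(f, g, c1=9, c2=10, n0=3))   # True
# c1 = 10 fails, because 10n^2 - 3n < 10n^2 for every n >= 1:
print(theta_witness_holds(f, g, c1=10, c2=10, n0=3))  # False
```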

Example
  Θ(g(n)) = { f(n) : ∃ positive constants c1, c2, n0 such that ∀n ≥ n0, 0 ≤ c1·g(n) ≤ f(n) ≤ c2·g(n) }
- Is 3n³ ∈ Θ(n⁴)?
- How about 2^(2n) ∈ Θ(2^n)?

O-notation
For a function g(n), we define O(g(n)), big-O of g(n), as the set:
  O(g(n)) = { f(n) : ∃ positive constants c and n0 such that ∀n ≥ n0, 0 ≤ f(n) ≤ c·g(n) }
g(n) is an asymptotic upper bound for f(n).
Intuitively: the set of all functions whose rate of growth is the same as or lower than that of g(n).
f(n) = Θ(g(n)) ⇒ f(n) = O(g(n)), i.e., Θ(g(n)) ⊆ O(g(n)).

Examples
  O(g(n)) = { f(n) : ∃ positive constants c and n0 such that ∀n ≥ n0, 0 ≤ f(n) ≤ c·g(n) }
- Any linear function an + b is in O(n²). How?
- Show that 3n³ = O(n⁴) for appropriate c and n0.
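Both examples can be checked the same way, with one concrete witness (c, n0) per claim. A sketch (the helper name and the particular instance 5n + 7 are my own choices):

```python
# Empirical check (not a proof) of the two claims above.
def big_o_witness_holds(f, g, c, n0, n_max=10_000):
    return all(0 <= f(n) <= c * g(n) for n in range(n0, n_max))

# 5n + 7 is in O(n^2): take c = 1, n0 = 7 (then n^2 >= 5n + 7).
print(big_o_witness_holds(lambda n: 5 * n + 7, lambda n: n**2, c=1, n0=7))  # True
# 3n^3 = O(n^4): take c = 1, n0 = 3 (then n^4 = n * n^3 >= 3n^3).
print(big_o_witness_holds(lambda n: 3 * n**3, lambda n: n**4, c=1, n0=3))   # True
```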

Ω-notation
For a function g(n), we define Ω(g(n)), big-Omega of g(n), as the set:
  Ω(g(n)) = { f(n) : ∃ positive constants c and n0 such that ∀n ≥ n0, 0 ≤ c·g(n) ≤ f(n) }
g(n) is an asymptotic lower bound for f(n).
Intuitively: the set of all functions whose rate of growth is the same as or higher than that of g(n).
f(n) = Θ(g(n)) ⇒ f(n) = Ω(g(n)), i.e., Θ(g(n)) ⊆ Ω(g(n)).

Example
  Ω(g(n)) = { f(n) : ∃ positive constants c and n0 such that ∀n ≥ n0, 0 ≤ c·g(n) ≤ f(n) }
- √n = Ω(lg n). Choose c and n0.
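One workable choice is c = 1 and n0 = 16, since √16 = lg 16 = 4 and √n grows faster from there on. A numeric sketch (an empirical check over a finite range, not a proof):

```python
import math

# Empirical check (not a proof) that sqrt(n) = Omega(lg n)
# with witness c = 1, n0 = 16.
def big_omega_witness_holds(f, g, c, n0, n_max=100_000):
    return all(0 <= c * g(n) <= f(n) for n in range(n0, n_max))

print(big_omega_witness_holds(math.sqrt, math.log2, c=1, n0=16))  # True
# Starting earlier fails: at n = 8, lg 8 = 3 > sqrt(8) ~ 2.83.
print(big_omega_witness_holds(math.sqrt, math.log2, c=1, n0=8))   # False
```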

Relations Between Θ, O, Ω

Relations Between Θ, Ω, O
Theorem: For any two functions g(n) and f(n), f(n) = Θ(g(n)) iff f(n) = O(g(n)) and f(n) = Ω(g(n)).
- I.e., Θ(g(n)) = O(g(n)) ∩ Ω(g(n)).
- In practice, asymptotically tight bounds are obtained from asymptotic upper and lower bounds.

Running Times
- "Running time is O(f(n))" means the worst case is O(f(n)): an O(f(n)) bound on the worst-case running time implies an O(f(n)) bound on the running time of every input.
- A Θ(f(n)) bound on the worst-case running time, however, does not imply a Θ(f(n)) bound on the running time of every input.
- "Running time is Ω(f(n))" means the best case is Ω(f(n)).
- We can still say "the worst-case running time is Θ(f(n))": it means the worst-case running time is given by some unspecified function g(n) ∈ Θ(f(n)).

Example
- Insertion sort takes Θ(n²) in the worst case, so sorting (as a problem) is O(n²). Why?
- Any sorting algorithm must look at each item, so sorting is Ω(n).
- In fact, using (e.g.) merge sort, sorting is Θ(n lg n) in the worst case.
- Later, we will prove that no comparison sort can do better in the worst case.

Asymptotic Notation in Equations
- Asymptotic notation can be used in equations to replace expressions containing lower-order terms.
- For example, 4n³ + 3n² + 2n + 1 = 4n³ + 3n² + Θ(n) = 4n³ + Θ(n²) = Θ(n³). How do we interpret this?
- In equations, Θ(f(n)) always stands for an anonymous function g(n) ∈ Θ(f(n)).
- In the example above, Θ(n²) stands for 3n² + 2n + 1.

o-notation
For a given function g(n), we define the set little-o:
  o(g(n)) = { f(n) : for every constant c > 0, ∃ n0 > 0 such that ∀n ≥ n0, 0 ≤ f(n) < c·g(n) }
f(n) becomes insignificant relative to g(n) as n approaches infinity: lim_{n→∞} f(n)/g(n) = 0.
g(n) is an upper bound for f(n) that is not asymptotically tight.
Observe the difference from the previous definitions: the inequality must hold for every positive constant c, not just for some c. Why?

ω-notation
For a given function g(n), we define the set little-omega:
  ω(g(n)) = { f(n) : for every constant c > 0, ∃ n0 > 0 such that ∀n ≥ n0, 0 ≤ c·g(n) < f(n) }
f(n) becomes arbitrarily large relative to g(n) as n approaches infinity: lim_{n→∞} f(n)/g(n) = ∞.
g(n) is a lower bound for f(n) that is not asymptotically tight.

Comparison of Functions
The relation f ~ g behaves like the comparison a ~ b between numbers:
  f(n) = O(g(n)) ≈ a ≤ b
  f(n) = Ω(g(n)) ≈ a ≥ b
  f(n) = Θ(g(n)) ≈ a = b
  f(n) = o(g(n)) ≈ a < b
  f(n) = ω(g(n)) ≈ a > b

Limits
- lim_{n→∞} f(n)/g(n) = 0 ⇒ f(n) ∈ o(g(n))
- lim_{n→∞} f(n)/g(n) < ∞ ⇒ f(n) ∈ O(g(n))
- 0 < lim_{n→∞} f(n)/g(n) < ∞ ⇒ f(n) ∈ Θ(g(n))
- 0 < lim_{n→∞} f(n)/g(n) ⇒ f(n) ∈ Ω(g(n))
- lim_{n→∞} f(n)/g(n) = ∞ ⇒ f(n) ∈ ω(g(n))
- lim_{n→∞} f(n)/g(n) undefined ⇒ can't say
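These limit rules suggest a quick numeric heuristic: evaluate f(n)/g(n) at two scales and see whether the ratio shrinks, grows, or stays put. A sketch (the thresholds and the helper name trend are my own, and this is only suggestive, never a proof):

```python
# Heuristic companion to the limit rules: a shrinking ratio suggests
# f = o(g), a growing one f = omega(g), a stable one f = Theta(g).
def trend(f, g, n1=10**3, n2=10**6):
    r1, r2 = f(n1) / g(n1), f(n2) / g(n2)
    if r2 < r1 / 10:
        return "suggests f = o(g)"
    if r2 > r1 * 10:
        return "suggests f = omega(g)"
    return "suggests f = Theta(g)"

print(trend(lambda n: n**2, lambda n: n**3))              # suggests f = o(g)
print(trend(lambda n: n**3, lambda n: n**2))              # suggests f = omega(g)
print(trend(lambda n: 5 * n**2 + 3 * n, lambda n: n**2))  # suggests f = Theta(g)
```

Slowly diverging ratios (e.g. a lg n factor) can fool the fixed thresholds, which is exactly why the limit definitions quantify over all n beyond n0.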

Properties
- Transitivity:
    f(n) = Θ(g(n)) and g(n) = Θ(h(n)) ⇒ f(n) = Θ(h(n))
    f(n) = O(g(n)) and g(n) = O(h(n)) ⇒ f(n) = O(h(n))
    f(n) = Ω(g(n)) and g(n) = Ω(h(n)) ⇒ f(n) = Ω(h(n))
    f(n) = o(g(n)) and g(n) = o(h(n)) ⇒ f(n) = o(h(n))
    f(n) = ω(g(n)) and g(n) = ω(h(n)) ⇒ f(n) = ω(h(n))
- Reflexivity:
    f(n) = Θ(f(n));  f(n) = O(f(n));  f(n) = Ω(f(n))

Properties (continued)
- Symmetry: f(n) = Θ(g(n)) iff g(n) = Θ(f(n))
- Complementarity (transpose symmetry):
    f(n) = O(g(n)) iff g(n) = Ω(f(n))
    f(n) = o(g(n)) iff g(n) = ω(f(n))

Common Functions

Monotonicity
f(n) is:
- monotonically increasing if m ≤ n ⇒ f(m) ≤ f(n)
- monotonically decreasing if m ≤ n ⇒ f(m) ≥ f(n)
- strictly increasing if m < n ⇒ f(m) < f(n)
- strictly decreasing if m < n ⇒ f(m) > f(n)

Exponentials
- Useful identities: a^(-1) = 1/a, (a^m)^n = a^(mn), a^m · a^n = a^(m+n).
- Exponentials vs. polynomials: for all real constants a > 1 and b, lim_{n→∞} n^b / a^n = 0, so n^b = o(a^n); every exponential grows faster than every polynomial.

Logarithms
- x = log_b a is the exponent x such that a = b^x.
- Natural log: ln a = log_e a
- Binary log: lg a = log_2 a
- lg² a = (lg a)²
- lg lg a = lg(lg a)
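A concrete value makes the last two notations easy to tell apart; with a = 2¹⁶, lg² a squares the logarithm while lg lg a iterates it:

```python
import math

# lg^2 a = (lg a)^2 versus lg lg a = lg(lg a), with a = 2**16 = 65536.
lg = math.log2
a = 65536.0
print(lg(a))        # 16.0
print(lg(a) ** 2)   # 256.0  (lg^2 a)
print(lg(lg(a)))    # 4.0    (lg lg a)
```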

Logarithms and Exponentials: Bases
- If the base of a logarithm is changed from one constant to another, the value changes by only a constant factor. Ex: log_10 n · log_2 10 = log_2 n.
- Hence the base of a logarithm is not an issue in asymptotic notation.
- Exponentials with different bases differ by an exponential factor, not a constant factor. Ex: 2^n = (2/3)^n · 3^n.
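Both claims are easy to check numerically (the values n = 1000 and exponent 20 below are arbitrary choices):

```python
import math

# 1) Changing the log base multiplies by a constant:
#    log10(n) * log2(10) == log2(n).
n = 1000
print(abs(math.log10(n) * math.log2(10) - math.log2(n)) < 1e-9)  # True

# 2) Changing the exponential base multiplies by an *exponential* factor:
#    3^n / 2^n = 1.5^n, already about 3325 at n = 20 and unbounded.
print(3**20 / 2**20)
```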

Polylogarithms
- For a ≥ 0 and b > 0, lim_{n→∞} lg^a n / n^b = 0, so lg^a n = o(n^b) and n^b = ω(lg^a n).
  Prove using L'Hôpital's rule repeatedly.
- lg(n!) = Θ(n lg n).
  Prove using Stirling's approximation (in the text) for lg(n!).
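The second fact can be supported numerically via the identity ln(n!) = lgamma(n + 1): the ratio lg(n!)/(n lg n) climbs toward 1 and never exceeds it (since n! ≤ n^n). A sketch, not a proof:

```python
import math

# lg(n!) computed via the log-gamma function: ln(n!) = lgamma(n + 1),
# so lg(n!) = lgamma(n + 1) / ln 2.
def lg_factorial(n):
    return math.lgamma(n + 1) / math.log(2)

# The ratio lg(n!) / (n lg n) increases toward 1 as n grows,
# consistent with lg(n!) = Theta(n lg n).
ratios = [lg_factorial(n) / (n * math.log2(n)) for n in (10, 10**3, 10**6)]
print(ratios[0] < ratios[1] < ratios[2] < 1)  # True
```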

Exercise
Express each function in column A in asymptotic notation using the corresponding function in column B.

  A = 5n² + 100n,  B = 3n² + 2:
    A ∈ Θ(n²) and n² ∈ Θ(B), so A ∈ Θ(B).
  A = log₃(n²),  B = log₂(n³):
    log_b a = log_c a / log_c b, so A = 2 lg n / lg 3 and B = 3 lg n; A/B = 2/(3 lg 3), a constant, so A ∈ Θ(B).
  A = n^(lg 4),  B = 3^(lg n):
    a^(log b) = b^(log a), so B = 3^(lg n) = n^(lg 3); A/B = n^(lg(4/3)) → ∞ as n → ∞, so A ∈ ω(B).
  A = lg² n,  B = n^(1/2):
    lim_{n→∞} lg^a n / n^b = 0 (here a = 2 and b = 1/2), so A ∈ o(B).

Summations – Review

Review of Summations
- Why do we need summation formulas? For computing the running times of iterative constructs (loops). (CLRS, Appendix A)
- Example: Maximum Subvector. Given an array A[1…n] of numeric values (which can be positive, zero, or negative), determine the subvector A[i…j] (1 ≤ i ≤ j ≤ n) whose sum of elements is maximum over all subvectors.

Review of Summations

  MaxSubvector(A, n)
    maxsum ← 0
    for i ← 1 to n do
      for j ← i to n do
        sum ← 0
        for k ← i to j do
          sum ← sum + A[k]
        maxsum ← max(sum, maxsum)
    return maxsum

  T(n) = Σ_{i=1}^{n} Σ_{j=i}^{n} Σ_{k=i}^{j} 1

NOTE: This is not a simplified solution. What is the final answer?
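The pseudocode translates directly to Python. A brute-force sketch whose triple loop realizes the summation T(n) above (the test array is Bentley's classic instance, not from the slides):

```python
# Direct transcription of MaxSubvector: try every (i, j) pair and
# re-sum A[i..j] from scratch; the triple nested loop gives the
# triple summation for T(n).
def max_subvector(A):
    n = len(A)
    maxsum = 0  # the empty subvector has sum 0
    for i in range(n):
        for j in range(i, n):
            s = sum(A[k] for k in range(i, j + 1))  # inner k-loop
            maxsum = max(s, maxsum)
    return maxsum

# Bentley's classic instance; the best subvector is 59+26-53+58+97 = 187.
print(max_subvector([31, -41, 59, 26, -53, 58, 97, -93, -23, 84]))  # 187
```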

Review of Summations
- Constant series: for integers a and b with a ≤ b, Σ_{i=a}^{b} 1 = b − a + 1.
- Linear (arithmetic) series: for n ≥ 0, Σ_{i=1}^{n} i = 1 + 2 + … + n = n(n+1)/2.
- Quadratic series: for n ≥ 0, Σ_{i=1}^{n} i² = 1 + 4 + 9 + … + n² = n(n+1)(2n+1)/6.

Review of Summations
- Cubic series: for n ≥ 0, Σ_{i=1}^{n} i³ = 1 + 8 + 27 + … + n³ = n²(n+1)²/4.
- Geometric series: for real x ≠ 1, Σ_{i=0}^{n} x^i = (x^(n+1) − 1)/(x − 1). For |x| < 1, Σ_{i=0}^{∞} x^i = 1/(1 − x).
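The closed forms can be checked against direct summation; using Fraction keeps the arithmetic exact, so equality comparisons are safe:

```python
from fractions import Fraction

# Closed form of the finite geometric series, evaluated exactly.
def geometric_sum(x, n):
    x = Fraction(x)
    return (x ** (n + 1) - 1) / (x - 1)

# Geometric series at x = 3, n = 10:
print(geometric_sum(3, 10) == sum(3**k for k in range(11)))   # True
# Cubic series at n = 10: 1 + 8 + ... + 1000 = 10^2 * 11^2 / 4 = 3025.
print(sum(i**3 for i in range(1, 11)) == 10**2 * 11**2 // 4)  # True
```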

Review of Summations
- Linear-geometric series: for n ≥ 0 and real c ≠ 1, Σ_{i=1}^{n} i·c^i = (n·c^(n+2) − (n+1)·c^(n+1) + c)/(c − 1)².
- Harmonic series: the nth harmonic number, for positive integer n, is H_n = Σ_{k=1}^{n} 1/k = ln n + O(1).

Review of Summations
- Telescoping series: Σ_{k=1}^{n} (a_k − a_{k−1}) = a_n − a_0.
- Differentiating series: for |x| < 1, Σ_{k=0}^{∞} k·x^k = x/(1 − x)².
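The differentiating series converges quickly; at x = 1/2 the closed form is (1/2)/(1/4) = 2, and a 200-term partial sum agrees to enormous precision:

```python
from fractions import Fraction

# Partial-sum check of sum_{k>=0} k * x^k = x / (1 - x)^2 at x = 1/2.
x = Fraction(1, 2)
partial = sum(k * x**k for k in range(200))
closed = x / (1 - x) ** 2
print(float(closed))                               # 2.0
print(0 < closed - partial < Fraction(1, 10**50))  # True: the tail is tiny
```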

Review of Summations
- Approximation by integrals:
    For monotonically increasing f: ∫_{m−1}^{n} f(x) dx ≤ Σ_{k=m}^{n} f(k) ≤ ∫_{m}^{n+1} f(x) dx.
    For monotonically decreasing f: ∫_{m}^{n+1} f(x) dx ≤ Σ_{k=m}^{n} f(k) ≤ ∫_{m−1}^{n} f(x) dx.
- How? Each term f(k) is the area of a width-1 rectangle; by monotonicity, the rectangles collectively cover one integral's region and fit inside the other's.

Review of Summations
- nth harmonic number: applying the integral approximation for decreasing f with f(x) = 1/x gives ln(n+1) ≤ H_n = Σ_{k=1}^{n} 1/k ≤ ln n + 1.
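This sandwich on the harmonic number is easy to verify numerically for a range of n:

```python
import math

# Integral bounds with f(x) = 1/x: ln(n+1) <= H_n <= ln(n) + 1.
def harmonic(n):
    return sum(1.0 / k for k in range(1, n + 1))

ok = all(math.log(n + 1) <= harmonic(n) <= math.log(n) + 1
         for n in (1, 10, 100, 10_000))
print(ok)  # True
```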

Reading Assignment
- Chapter 4 of CLRS.