Cutler/Head — Growth of Functions 1: Asymptotic Growth Rate



CS575 / Class 1 — Growth of Functions 2: Function Growth

The running time of an algorithm as the input size approaches infinity is called the asymptotic running time. We study different notations for asymptotic efficiency; in particular, we study tight bounds, upper bounds, and lower bounds.

Growth of Functions 3: Outline

- Why do we need the different sets?
- Definition of the sets O, Omega, and Theta
- Classifying examples: using the original definition; using limits
- General properties
- Little oh
- Additional properties

Growth of Functions 4: The "sets" and their use — big Oh

Big "oh": an asymptotic upper bound on the growth of an algorithm. When do we use big Oh?
1. In the theory of NP-completeness.
2. To provide information on the maximum number of operations that an algorithm performs.
   - Insertion sort is O(n^2) in the worst case. This means that in the worst case it performs at most cn^2 operations.
   - Insertion sort is also O(n^6) in the worst case, since it also performs at most dn^6 operations.
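The insertion-sort bound above can be checked empirically. Below is a minimal sketch, assuming "operations" means key comparisons plus element shifts (the slide does not fix c; c = 2 is one illustrative witness):

```python
def insertion_sort_ops(a):
    """Insertion-sort a copy of `a`, counting key comparisons and shifts."""
    a = list(a)
    ops = 0
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        while j >= 0:
            ops += 1                 # comparison a[j] > key
            if a[j] > key:
                a[j + 1] = a[j]      # shift
                ops += 1
                j -= 1
            else:
                break
        a[j + 1] = key
    return a, ops

# Worst case (reverse-sorted input): the count stays below c*n^2 with c = 2.
for n in (10, 100, 500):
    result, ops = insertion_sort_ops(range(n, 0, -1))
    assert result == list(range(1, n + 1))
    assert ops <= 2 * n * n
```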

Growth of Functions 5: The "sets" and their use — Omega

Omega: an asymptotic lower bound on the growth of an algorithm or a problem. When do we use Omega?
1. To provide information on the minimum number of operations that an algorithm performs.
   - Insertion sort is Ω(n) in the best case. This means that in the best case its instruction count is at least cn.
   - It is Ω(n^2) in the worst case. This means that in the worst case its instruction count is at least cn^2.

Growth of Functions 6: The "sets" and their use — Omega (cont.)

2. To provide information on a class of algorithms that solve a problem.
   - Sorting algorithms based on comparison of keys are Ω(n lg n) in the worst case. This means that every sorting algorithm based only on comparison of keys has to do at least cn lg n operations.
   - Any algorithm based only on comparison of keys that finds the maximum of n elements is Ω(n) in every case. This means that every such algorithm has to do at least cn operations.

Growth of Functions 7: The "sets" and their use — Theta

Theta: an asymptotically tight bound on the growth rate of an algorithm.
- Insertion sort is Θ(n^2) in the worst and average cases. This means that in the worst and average cases insertion sort performs cn^2 operations.
- Binary search is Θ(lg n) in the worst and average cases. This means that in the worst and average cases binary search performs c lg n operations.
Note: we want to classify an algorithm using Theta; in Data Structures we used Oh.

Little "oh": used to denote an upper bound that is not asymptotically tight. n is in o(n^3); n is not in o(n).

Growth of Functions 8: The functions

Let f(n) and g(n) be asymptotically nonnegative functions whose domain is the set of natural numbers N = {0, 1, 2, …}. A function g(n) is asymptotically nonnegative if g(n) ≥ 0 for all n ≥ n0, where n0 ∈ N.

Growth of Functions 9: Asymptotic Upper Bound — big O

[Graph: curves f(n) and c·g(n); for all n ≥ N, f(n) ≤ c·g(n), so f(n) = O(g(n)).]

Why only for n ≥ N? What is the purpose of multiplying by c > 0?

Growth of Functions 10: Asymptotic Upper Bound — O

Definition: Let f(n) and g(n) be asymptotically nonnegative functions. We say f(n) is in O(g(n)) if there exist a positive real constant c and a positive integer N such that for every n ≥ N, 0 ≤ f(n) ≤ c·g(n). Or, using more mathematical notation:

O(g(n)) = { f(n) | there exist a positive constant c and a positive integer N such that 0 ≤ f(n) ≤ c·g(n) for all n ≥ N }

Growth of Functions 11

n^2 + 10n ∈ O(n^2). Why? Take c = 2 and N = 10: 2n^2 ≥ n^2 + 10n for all n ≥ 10.

Growth of Functions 12: Does 5n + 2 ∈ O(n)?

Proof: From the definition of big Oh, there must exist c > 0 and an integer N > 0 such that 0 ≤ 5n + 2 ≤ cn for all n ≥ N. Dividing both sides of the inequality by n > 0, we get 0 ≤ 5 + 2/n ≤ c. Since 2/n ≤ 2, 2/n > 0, and 2/n becomes smaller as n increases, there are many choices for c and N:
- If we choose N = 1, then c ≥ 5 + 2/1 = 7.
- If we choose c = 6, then 0 ≤ 5 + 2/n ≤ 6 requires N ≥ 2.
In either case (we only need one!) we have c > 0 and N > 0 such that 0 ≤ 5n + 2 ≤ cn for all n ≥ N. So the definition is satisfied, and 5n + 2 ∈ O(n).
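The witnesses from this proof can be spot-checked over a finite range (a numeric sanity check, not a substitute for the proof):

```python
f = lambda n: 5 * n + 2

# c = 6 works from N = 2 on; c = 7 works from N = 1 on.
assert all(f(n) <= 6 * n for n in range(2, 10_000))
assert all(f(n) <= 7 * n for n in range(1, 10_000))
# c = 5 is too small: 5n + 2 > 5n for every n.
assert all(f(n) > 5 * n for n in range(1, 10_000))
```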

Growth of Functions 13: Does n^2 ∈ O(n)?

No. We prove by contradiction that the definition cannot be satisfied. Assume n^2 ∈ O(n). From the definition of big Oh, there must exist c > 0 and an integer N > 0 such that 0 ≤ n^2 ≤ cn for all n ≥ N. Dividing the inequality by n > 0, we get 0 ≤ n ≤ c for all n ≥ N. But n ≤ c cannot be true for any n > max{c, N}, contradicting our assumption. So there is no constant c > 0 such that n ≤ c is satisfied for all n ≥ N, and n^2 ∉ O(n).
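The contradiction can be illustrated numerically: whatever c one proposes, the inequality n^2 ≤ cn already fails at n = c + 1 (a sketch of the argument, not a proof):

```python
# For any candidate constant c, pick n = c + 1 > max{c, N}: the O(n) bound breaks.
for c in (1, 10, 1000, 10**6):
    n = c + 1
    assert n * n > c * n
```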

Growth of Functions 14

O(g(n)) = { f(n) | there exist a positive constant c and a positive integer N such that 0 ≤ f(n) ≤ c·g(n) for all n ≥ N }
- 1,000,000·n^2 ∈ O(n^2)? Why / why not?
- (n − 1)n/2 ∈ O(n^2)? Why / why not?
- n/2 ∈ O(n^2)? Why / why not?
- lg(n^2) ∈ O(lg n)? Why / why not?
- n^2 ∈ O(n)? Why / why not?

Growth of Functions 15: Asymptotic Lower Bound — Omega (Ω)

[Graph: curves f(n) and c·g(n); for all n ≥ N, c·g(n) ≤ f(n), so f(n) = Ω(g(n)).]

Growth of Functions 16: Asymptotic Lower Bound — Ω

Definition: Let f(n) and g(n) be asymptotically nonnegative functions. We say f(n) is Ω(g(n)) if there exist a positive real constant c and a positive integer N such that for every n ≥ N, 0 ≤ c·g(n) ≤ f(n). Or, using more mathematical notation:

Ω(g(n)) = { f(n) | there exist a positive constant c and a positive integer N such that 0 ≤ c·g(n) ≤ f(n) for all n ≥ N }

Growth of Functions 17: Is 5n − 20 ∈ Ω(n)?

Proof: From the definition of Omega, there must exist c > 0 and an integer N > 0 such that 0 ≤ cn ≤ 5n − 20 for all n ≥ N. Dividing the inequality by n > 0, we get 0 ≤ c ≤ 5 − 20/n for all n ≥ N. Since 20/n ≤ 20 and 20/n becomes smaller as n grows, there are many choices for c and N. Since c > 0, we need 5 − 20/n > 0, so N > 4. For example, if we choose c = 4, then 5 − 20/n ≥ 4 requires N ≥ 20. In this case we have c > 0 and N > 0 such that 0 ≤ cn ≤ 5n − 20 for all n ≥ N. So the definition is satisfied, and 5n − 20 ∈ Ω(n).
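As with the big-Oh proof, the Omega witnesses c = 4, N = 20 can be spot-checked numerically:

```python
f = lambda n: 5 * n - 20

assert all(4 * n <= f(n) for n in range(20, 10_000))  # c = 4 works from N = 20 on
assert 4 * 19 > f(19)                                 # ...and N = 20 is tight for c = 4
```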

Growth of Functions 18

Ω(g(n)) = { f(n) | there exist a positive constant c and a positive integer N such that 0 ≤ c·g(n) ≤ f(n) for all n ≥ N }
- 1,000,000·n^2 ∈ Ω(n^2)? Why / why not?
- (n − 1)n/2 ∈ Ω(n^2)? Why / why not?
- n/2 ∈ Ω(n^2)? Why / why not?
- lg(n^2) ∈ Ω(lg n)? Why / why not?
- n^2 ∈ Ω(n)? Why / why not?

Growth of Functions 19: Asymptotic Tight Bound — Θ

[Graph: f(n) lies between c·g(n) and d·g(n); for all n ≥ N, c·g(n) ≤ f(n) ≤ d·g(n), so f(n) = Θ(g(n)).]

Growth of Functions 20: Asymptotic Tight Bound — Θ

Definition: Let f(n) and g(n) be asymptotically nonnegative functions. We say f(n) is Θ(g(n)) if there exist positive constants c, d and a positive integer N such that for every n ≥ N, 0 ≤ c·g(n) ≤ f(n) ≤ d·g(n). Or, using more mathematical notation:

Θ(g(n)) = { f(n) | there exist positive constants c, d and a positive integer N such that 0 ≤ c·g(n) ≤ f(n) ≤ d·g(n) for all n ≥ N }

Growth of Functions 21: More on Θ

We will use this definition: Θ(g(n)) = O(g(n)) ∩ Ω(g(n))
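This intersection view suggests a direct check: to place a function in Θ(g(n)), sandwich it between c·g(n) and d·g(n). A sketch for (n − 1)n/2 ∈ Θ(n^2), with my own choice of witnesses c = 1/4, d = 1/2, N = 2:

```python
f = lambda n: (n - 1) * n // 2   # e.g. the comparison count of selection sort
g = lambda n: n * n

# Lower bound (Omega) and upper bound (O) hold simultaneously from N = 2 on.
assert all(0.25 * g(n) <= f(n) <= 0.5 * g(n) for n in range(2, 10_000))
```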

Growth of Functions 22–24: We show (1) and (2). [The worked equations on these slides were images and were not captured in the transcript.]

Growth of Functions 25: More Θ

- 1,000,000·n^2 ∈ Θ(n^2)? Why / why not?
- (n − 1)n/2 ∈ Θ(n^2)? Why / why not?
- n/2 ∈ Θ(n^2)? Why / why not?
- lg(n^2) ∈ Θ(lg n)? Why / why not?
- n^2 ∈ Θ(n)? Why / why not?

Growth of Functions 26: Limits can be used to determine order

Suppose lim_{n→∞} f(n)/g(n) exists. Then:
- if the limit is c with c > 0, then f(n) = Θ(g(n));
- if the limit is 0 or c with c > 0, then f(n) = O(g(n));
- if the limit is ∞ or c with c > 0, then f(n) = Ω(g(n)).
The limit must exist.
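The limit rule can be explored numerically by evaluating f(n)/g(n) at a large n (a heuristic estimate only; it does not prove the limit exists):

```python
import math

def ratio_at(f, g, n=10**6):
    """Evaluate f(n)/g(n) at one large n as a crude limit estimate."""
    return f(n) / g(n)

# lg n / n -> 0, suggesting lg n = O(n):
assert ratio_at(math.log2, lambda n: n) < 1e-4
# (3n^2 + n) / n^2 -> 3 > 0, suggesting 3n^2 + n = Theta(n^2):
assert abs(ratio_at(lambda n: 3 * n * n + n, lambda n: n * n) - 3) < 1e-3
```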

Growth of Functions 27: Example using limits [the worked example on this slide was an image and was not captured in the transcript]

Growth of Functions 28: L'Hôpital's Rule

If f(x) and g(x) are both differentiable with derivatives f'(x) and g'(x), respectively, and if lim_{x→∞} f(x) = lim_{x→∞} g(x) = ∞, then lim_{x→∞} f(x)/g(x) = lim_{x→∞} f'(x)/g'(x), provided the latter limit exists.

Growth of Functions 29: Example using limits [the worked example on this slide was an image and was not captured in the transcript]

Growth of Functions 30: Example using limits

lim_{n→∞} lg n / n = lim_{n→∞} (lg n)' / (n)' = lim_{n→∞} (1 / (n ln 2)) / 1 = 0, so lg n ∈ O(n). (Here (lg n)' = 1 / (n ln 2).)

Growth of Functions 31: Example using limits

lim_{n→∞} n^k / 2^n = lim_{n→∞} k·n^(k−1) / (2^n ln 2) = … = lim_{n→∞} k! / (2^n (ln 2)^k) = 0, so n^k ∈ O(2^n), where k is a positive integer. (Here (2^n)' = 2^n ln 2, and L'Hôpital's rule is applied k times.)

Growth of Functions 32: Another upper bound — "little oh": o

Definition: Let f(n) and g(n) be asymptotically nonnegative functions. We say f(n) is o(g(n)) if for every positive real constant c there exists a positive integer N such that for all n ≥ N, 0 ≤ f(n) ≤ c·g(n).

o(g(n)) = { f(n) | for any positive constant c > 0, there exists a positive integer N > 0 such that 0 ≤ f(n) ≤ c·g(n) for all n ≥ N }

"Little omega" (ω) can be defined analogously.

Growth of Functions 33: Main difference between O and o

O(g(n)) = { f(n) | there exist a positive constant c and a positive integer N such that 0 ≤ f(n) ≤ c·g(n) for all n ≥ N }
o(g(n)) = { f(n) | for any positive constant c > 0, there exists a positive integer N such that 0 ≤ f(n) ≤ c·g(n) for all n ≥ N }

For o, the inequality must hold for every positive constant c, whereas for O it only has to hold for some positive constant c.
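The "for every c" requirement of little oh can be illustrated for lg n ∈ o(n): however small c is, a suitable N exists. The starting point N ≥ 2/c below is my own choice (past that point c·n − lg n is increasing, so the inequality persists):

```python
import math

for c in (1.0, 0.1, 0.01):
    # Exhibit an N for this c: the first n >= 2/c with lg n <= c*n.
    N = next(n for n in range(1, 10**6) if n >= 2 / c and math.log2(n) <= c * n)
    assert all(math.log2(m) <= c * m for m in range(N, N + 10_000))
```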

Growth of Functions 34: Lower-order terms and constants

Lower-order terms of a function do not matter, since they are dominated by the highest-order term. Constants (multiplied by the highest-order term) do not matter, since they do not affect the asymptotic growth rate. All logarithms with base b > 1 belong to Θ(lg n), since log_b n = lg n / lg b, and 1/lg b is a positive constant.
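The base-change identity behind the last claim is easy to verify numerically: log_b n is a constant multiple (1/lg b) of lg n for any base b > 1:

```python
import math

for b in (2, math.e, 10):
    c = 1 / math.log2(b)             # the constant relating log_b to lg
    for n in (10, 1000, 10**6):
        assert abs(math.log(n, b) - c * math.log2(n)) < 1e-9
```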

Growth of Functions 35: Asymptotic notation in equations

What does n^2 + 2n + 99 = n^2 + Θ(n) mean? It means n^2 + 2n + 99 = n^2 + f(n), where the function f(n) is from the set Θ(n); in fact f(n) = 2n + 99. Using notation in this manner can help to eliminate non-affecting details and clutter in an equation.

Is 2n^2 + 5n + 21 = 2n^2 + Θ(n) = Θ(n^2)?

Growth of Functions 36: Transitivity

- If f(n) = Θ(g(n)) and g(n) = Θ(h(n)), then f(n) = Θ(h(n)).
- If f(n) = O(g(n)) and g(n) = O(h(n)), then f(n) = O(h(n)).
- If f(n) = Ω(g(n)) and g(n) = Ω(h(n)), then f(n) = Ω(h(n)).
- If f(n) = o(g(n)) and g(n) = o(h(n)), then f(n) = o(h(n)).
- If f(n) = ω(g(n)) and g(n) = ω(h(n)), then f(n) = ω(h(n)).

Growth of Functions 37: Reflexivity

- f(n) = Θ(f(n)).
- f(n) = O(f(n)).
- f(n) = Ω(f(n)).
- "o" is not reflexive.

Growth of Functions 38: Symmetry and transpose symmetry

Symmetry: f(n) = Θ(g(n)) if and only if g(n) = Θ(f(n)).
Transpose symmetry:
- f(n) = O(g(n)) if and only if g(n) = Ω(f(n)).
- f(n) = o(g(n)) if and only if g(n) = ω(f(n)).

Growth of Functions 39: Analogy between asymptotic comparison of functions and comparison of real numbers

- f(n) = O(g(n))  ≈  a ≤ b
- f(n) = Ω(g(n))  ≈  a ≥ b
- f(n) = Θ(g(n))  ≈  a = b
- f(n) = o(g(n))  ≈  a < b
- f(n) = ω(g(n))  ≈  a > b

Growth of Functions 40: Not all functions are comparable

The following functions are not asymptotically comparable: [the example functions were an image and were not captured in the transcript].


Growth of Functions 42: Is O(g(n)) = Θ(g(n)) ∪ o(g(n))?

We show a counterexample. The functions are g(n) = n and an f(n) (shown as an image, not captured in the transcript) with f(n) ∈ O(n) but f(n) ∉ Θ(n) and f(n) ∉ o(n).

Conclusion: O(g(n)) ≠ Θ(g(n)) ∪ o(g(n))
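The slide's f(n) was an image and is missing from the transcript. A standard function with the three stated properties (a hypothetical stand-in, not necessarily the one on the slide) oscillates between two growth rates:

```python
# Hypothetical stand-in: f(n) = n for even n, 1 for odd n.
f = lambda n: n if n % 2 == 0 else 1

# f is in O(n): f(n) <= 1*n for all n >= 1.
assert all(f(n) <= n for n in range(1, 2000))
# f is not in Omega(n), hence not in Theta(n): on odd n, f(n)/n -> 0.
assert all(f(n) < 0.001 * n for n in range(1001, 2001, 2))
# f is not in o(n): for c = 0.5, f(n) <= c*n keeps failing at even n.
assert all(f(n) > 0.5 * n for n in range(10**6, 10**6 + 100, 2))
```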

Growth of Functions 43: General Rules

We say a function f(n) is polynomially bounded if f(n) = O(n^k) for some positive constant k. We say a function f(n) is polylogarithmically bounded if f(n) = O(lg^k n) for some positive constant k. Exponential functions:
- grow faster than positive polynomial functions
- grow faster than polylogarithmic functions
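The dominance claims can be made concrete with exact integer arithmetic; for example, 2^n overtakes n^10 near n = 59, and for n = 2^100 the polylog (lg n)^10 = 100^10 is far below n:

```python
# Exponential vs polynomial: 2^n crosses n^10 between n = 58 and n = 59.
assert 2**58 < 58**10
assert 2**59 > 59**10
# Polynomial vs polylog: at n = 2^100, lg n = 100, and (lg n)^10 << n.
assert 100**10 < 2**100
```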