Asymptotic Growth Rate


Asymptotic Running Time The running time of an algorithm as the input size approaches infinity is called the asymptotic running time. We study different notations for asymptotic efficiency; in particular, we study tight bounds, upper bounds, and lower bounds.

Outline Why do we need the different sets? Definition of the sets O (Oh), Ω (Omega), Θ (Theta), o (little oh), ω (little omega). Classifying examples: using the original definition, and using limits.

The functions Let f(n) and g(n) be asymptotically nonnegative functions whose domains are the set of natural numbers N = {0, 1, 2, …}. A function g(n) is asymptotically nonnegative if g(n) ≥ 0 for all n ≥ n₀, where n₀ ∈ N.

Big Oh Big "oh": an asymptotic upper bound on the growth of an algorithm. When do we use Big Oh? To provide information on the maximum number of operations that an algorithm performs. Insertion sort is O(n²) in the worst case; this means that in the worst case it performs at most c·n² operations. Theory of NP-completeness: an algorithm is polynomial if it is O(n^k) for some constant k, and P = NP if there is a polynomial-time algorithm for any NP-complete problem. Note: don't worry about NP now; it is discussed much later in the semester.

Definition of Big Oh O(f(n)) is the set of functions g(n) such that there exist positive constants c and N for which 0 ≤ g(n) ≤ c·f(n) for all n ≥ N. f(n) is called an asymptotic upper bound for g(n); that is, g(n) is O(f(n)). Questions to ask: Why is n a nonnegative integer? Why is f(n) real-valued? What is N doing here? What is c doing here?

[Figure: g(n) ∈ O(f(n)), "I grow at most as fast as f": the curve g(n) stays at or below c·f(n) for all n ≥ N.]

n² + 10n ∈ O(n²). Why? Take c = 2 and N = 10: n² + 10n ≤ 2n² for all n ≥ 10. [Figure: plot of n² + 10n against 2n² for n up to about 30; the two curves cross at n = 10.]
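
As a quick sanity check (an added illustration, not a proof and not part of the original slides), a few lines of Python can test the witnesses c = 2 and N = 10 numerically:

def holds(n, c=2):
    # Big-Oh inequality for this example: 0 <= n^2 + 10n <= c*n^2
    return 0 <= n**2 + 10*n <= c * n**2

print(all(holds(n) for n in range(10, 10_000)))   # True: the bound holds for every tested n >= 10
print([n for n in range(1, 10) if not holds(n)])  # every n from 1 to 9: the bound fails below N = 10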

Does 5n + 2 ∈ O(n)? Proof: from the definition of Big Oh, there must exist c > 0 and an integer N > 0 such that 0 ≤ 5n + 2 ≤ cn for all n ≥ N. Dividing both sides of the inequality by n > 0 we get 0 ≤ 5 + 2/n ≤ c. The term 2/n (> 0) becomes smaller as n increases; for instance, let N = 2 and c = 6. There are many choices here for c and N.

Does 5n + 2 ∈ O(n)? If we choose N = 1 then 5 + 2/n ≤ 5 + 2/1 = 7, so any c ≥ 7 works; let's choose c = 7. If we choose c = 6, then 0 ≤ 5 + 2/n ≤ 6, so any N ≥ 2 works; choose N = 2. In either case (we only need one!) we have c > 0 and N > 0 such that 0 ≤ 5n + 2 ≤ cn for all n ≥ N. So the definition is satisfied and 5n + 2 ∈ O(n).

Does n² ∈ O(n)? No. We prove by contradiction that the definition cannot be satisfied. Assume that n² ∈ O(n). From the definition of Big Oh, there must exist c > 0 and an integer N > 0 such that 0 ≤ n² ≤ cn for all n ≥ N. Divide the inequality by n > 0 to get 0 ≤ n ≤ c for all n ≥ N. But n ≤ c cannot be true for any n > max{c, N}. This contradicts the assumption, so n² ∉ O(n). What do we call such a proof? A proof by contradiction.

Are they true? 1,000,000·n² ∈ O(n²)? Why / why not? (True.) (n - 1)n/2 ∈ O(n²)? Why / why not? n/2 ∈ O(n²)? Why / why not? lg(n²) ∈ O(lg n)? Why / why not? n² ∈ O(n)? Why / why not? (False.)

Omega Asymptotic lower bound on the growth of an algorithm or a problem. When do we use Omega? 1. To provide information on the minimum number of operations that an algorithm performs. Insertion sort is Ω(n) in the best case; this means that in the best case its instruction count is at least cn. It is Ω(n²) in the worst case; this means that in the worst case its instruction count is at least c·n².

Omega (cont.) 2. To provide information on a class of algorithms that solve a problem. Sorting algorithms based on comparison of keys are Ω(n lg n) in the worst case; this means that all sorting algorithms based only on comparison of keys have to do at least c·n·lg n operations. Any algorithm based only on comparison of keys to find the maximum of n elements is Ω(n) in every case; this means that all such algorithms have to do at least cn operations.

Supplementary topic: Why Ω(n lg n) for sorting? There are n numbers to sort and no special assumptions (unlike pigeonhole sorting, discussed in a previous class), so there are n! possible permutations. One comparison has only two outcomes, so lg(n!) comparisons are required in the worst case. By Stirling's approximation, n! is approximately (n/e)ⁿ, and hence lg(n!) is on the order of n lg n.
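
A small numeric illustration (added here, not in the slides) of why lg(n!) grows like n lg n: the standard library's lgamma gives lg(n!) = lgamma(n+1)/ln 2, and Stirling's approximation predicts lg(n!) is roughly n·lg n - n·lg e.

import math

for n in (10, 100, 1_000, 10_000):
    lg_factorial = math.lgamma(n + 1) / math.log(2)       # lg(n!)
    stirling = n * math.log2(n) - n * math.log2(math.e)   # Stirling's estimate
    ratio = lg_factorial / (n * math.log2(n))             # creeps toward 1 as n grows
    print(n, round(lg_factorial, 1), round(stirling, 1), round(ratio, 3))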

Definition of the set Omega Ω(f(n)) is the set of functions g(n) such that there exist positive constants c and N for which 0 ≤ c·f(n) ≤ g(n) for all n ≥ N. f(n) is called an asymptotic lower bound for g(n); that is, g(n) = Ω(f(n)). Questions to ask: Why is n a nonnegative integer? Why is f(n) real-valued? What is N doing here? What is c doing here?

[Figure: g(n) ∈ Ω(f(n)), "I grow at least as fast as f": the curve g(n) stays at or above c·f(n) for all n ≥ N.]

Is 5n - 20 ∈ Ω(n)? Proof: from the definition of Omega, there must exist c > 0 and an integer N > 0 such that 0 ≤ cn ≤ 5n - 20 for all n ≥ N. Dividing the inequality by n > 0 we get 0 ≤ c ≤ 5 - 20/n for all n ≥ N. 20/n becomes smaller as n grows; since c > 0 we need 5 - 20/n > 0, so N > 4. There are many choices here for c and N. If we choose c = 1, then 5 - 20/n ≥ 1 once n ≥ 5; choose N = 5. If we choose c = 4, then 5 - 20/n ≥ 4 once n ≥ 20; choose N = 20. In either case (we only need one!) we have c > 0 and N > 0 such that 0 ≤ cn ≤ 5n - 20 for all n ≥ N. So 5n - 20 ∈ Ω(n). (Speaker note, a non-example: to show 2n ∈ Ω(n²) we would need 2n ≥ c·n² for all n ≥ n₀; dividing by cn gives n ≤ 2/c, which fails once n > 2/c, so no such c and n₀ exist.)

Are they true? 1,000,000·n² ∈ Ω(n²)? Why / why not? (True.) (n - 1)n/2 ∈ Ω(n²)? Why / why not? n/2 ∈ Ω(n²)? Why / why not? (False.) lg(n²) ∈ Ω(lg n)? Why / why not? n² ∈ Ω(n)? Why / why not?

Theta Asymptotic tight bound on the growth rate of an algorithm. Insertion sort is Θ(n²) in the worst and average cases; this means that in the worst and average cases insertion sort performs roughly c·n² operations. Binary search is Θ(lg n) in the worst and average cases; this means that in the worst and average cases binary search performs roughly c·lg n operations.

Definition of the set Theta Θ(f(n)) is the set of functions g(n) such that there exist positive constants c, d, and N for which 0 ≤ c·f(n) ≤ g(n) ≤ d·f(n) for all n ≥ N. f(n) is called an asymptotically tight bound for g(n). Questions to ask: Why is n a nonnegative integer? Why is f(n) real-valued? What is N doing here? What are c and d doing here?

[Figure: g(n) ∈ Θ(f(n)), "we grow at the same rate": the curve g(n) stays between c·f(n) and d·f(n) for all n ≥ N.]

Another Definition of Theta Θ(f(n)) = O(f(n)) ∩ Ω(f(n)). [Figure: diagram placing sample functions relative to n²: log n, 5n + 3, n^1.5, and n·log n fall in little o(n²); n², ½·n², and 5n² + 3n fall in Θ(n²); n⁵ and 2ⁿ fall in little ω(n²); O(n²) covers the first two groups and Ω(n²) the last two.]

We use the last definition and show:

More Θ 1,000,000·n² ∈ Θ(n²)? Why / why not? (True.) (n - 1)n/2 ∈ Θ(n²)? Why / why not? n/2 ∈ Θ(n²)? Why / why not? (False.) lg(n²) ∈ Θ(lg n)? Why / why not? n² ∈ Θ(n)? Why / why not?

small o o(f(n)) is the set of functions g(n) which satisfy the following condition: for every positive real constant c, there exists a positive integer N such that g(n) ≤ c·f(n) for all n ≥ N.

small o Little "oh": used to denote an upper bound that is not asymptotically tight. n is in o(n³); n is not in o(n).

small omega (f(n)) is the set of functions g(n) which satisfy the following condition: For every positive real constant c, there exists a positive integer N, for which, g(n) ³ cf(n) for all n³N

small omega and small o g(n) ∈ ω(f(n)) if and only if f(n) ∈ o(g(n)).

Limits can be used to determine Order. Suppose lim f(n)/g(n) as n → ∞ exists. If the limit is a constant c > 0, then f(n) = Θ(g(n)); if the limit is 0, then f(n) = o(g(n)); if the limit is ∞, then f(n) = ω(g(n)). The limit must exist.
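
A minimal sketch of the limit test, assuming the sympy library is available (the three functions below are illustrative choices, not the slides' own examples):

import sympy as sp

n = sp.symbols('n', positive=True)

# limit is a positive constant -> Theta
print(sp.limit((n**2 + 10*n) / n**2, n, sp.oo))    # 1, so n^2 + 10n = Theta(n^2)
# limit is 0 -> little o
print(sp.limit(sp.log(n) / sp.sqrt(n), n, sp.oo))  # 0, so ln n = o(sqrt(n))
# limit is infinity -> little omega
print(sp.limit(2**n / n**5, n, sp.oo))             # oo, so 2^n = omega(n^5)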

Example using limits

L'Hôpital's Rule If f(x) and g(x) are both differentiable with derivatives f′(x) and g′(x), respectively, and if both tend to ∞ (or both tend to 0) as x → ∞, then lim f(x)/g(x) = lim f′(x)/g′(x) as x → ∞, provided the limit on the right exists.

Example using limits (kth order differentiation of y)

Example using limit

Example using limits

Further study of the growth functions Properties of growth functions: O corresponds to ≤, Ω to ≥, Θ to =, o to <, ω to >.

Asymptotic Growth Rate Part II

Outline More examples: General Properties Little Oh Additional properties

Order of Algorithm Property - Complexity Categories: θ(lg n), θ(n), θ(n lg n), θ(n²), θ(n^j), θ(n^k), θ(aⁿ), θ(bⁿ), θ(n!), where k > j > 2 and b > a > 1. If a complexity function g(n) is in a category that is to the left of the category containing f(n), then g(n) ∈ o(f(n)).
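
To see this ordering concretely, here is a small, added table generator (illustrative only; the sample functions and input sizes are arbitrary choices): each column eventually dwarfs the columns before it.

import math

funcs = [
    ("lg n",   lambda n: math.log2(n)),
    ("n",      lambda n: float(n)),
    ("n lg n", lambda n: n * math.log2(n)),
    ("n^2",    lambda n: float(n**2)),
    ("n^3",    lambda n: float(n**3)),
    ("2^n",    lambda n: float(2**n)),
]
for n in (10, 20, 40, 60):
    print(n, "  ".join(f"{name}={g(n):.3g}" for name, g in funcs))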

Comparing ln n with n^a (a > 0) Using limits we get lim (ln n)/n^a = 0 as n → ∞, so ln n = o(n^a) for any a > 0. When the exponent a is very small, we need to look at very large values of n before seeing n^a > ln n.

Values for log₁₀ n and n^0.01 [table shown in the original slides]

Values for log₁₀ n and n^0.001 [table shown in the original slides]
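
The exact table values are given as images in the original slides; the added sketch below makes the same point by scanning powers of ten for the place where n^a finally catches back up to log₁₀(n) when a is tiny (the printed thresholds are approximate):

def crossover_exponent(a):
    # Smallest k >= 2 with (10^k)^a > log10(10^k) = k, found by a linear scan over k.
    k = 2
    while 10 ** (a * k) <= k:
        k += 1
    return k

print(crossover_exponent(0.01))    # roughly 240:  n^0.01  stays below log10(n) until n is about 10^240
print(crossover_exponent(0.001))   # roughly 3550: n^0.001 stays below log10(n) until n is about 10^3550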

Lower-order terms and constants Lower-order terms of a function do not matter, since they are dominated by the highest-order term. Constants (multiplied by the highest-order term) do not matter, since they do not affect the asymptotic growth rate. All logarithms with base b > 1 belong to Θ(lg n), since log_b n = (lg n)/(lg b), a constant multiple of lg n.
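
A short added sketch of the change-of-base fact behind the last point: log_b(n) = lg(n)/lg(b), so for any fixed base b > 1 the ratio log_b(n)/lg(n) is the constant 1/lg(b), which is exactly why every such logarithm lands in Θ(lg n).

import math

for b in (2, 3, 10, 100):
    ratios = [math.log(n, b) / math.log2(n) for n in (10, 1_000, 1_000_000)]
    print(f"base {b}: ratios {[round(r, 4) for r in ratios]}, constant 1/lg({b}) = {1 / math.log2(b):.4f}")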

General Rules We say a function f(n) is polynomially bounded if f(n) = O(n^k) for some positive constant k. We say a function f(n) is polylogarithmically bounded if f(n) = O(lg^k n) for some positive constant k. Exponential functions grow faster than positive polynomial functions; polynomial functions grow faster than polylogarithmic functions.

More properties The following slides show: two examples of pairs of functions that are not comparable in terms of asymptotic notation; how the asymptotic notation can be used in equations; and that Theta, Big Oh, and Omega define a transitive and reflexive order. Theta also satisfies symmetry, while Big Oh and Omega satisfy transpose symmetry.

Are n and n^sin n comparable with respect to growth rate? Yes. As sin n increases from 0 to 1, n^sin n increases from 1 to n; as sin n decreases from 1 to 0, it decreases from n to 1; as sin n decreases from 0 to -1, it decreases from 1 to 1/n; as sin n increases from -1 to 0, it increases from 1/n to 1. Clearly n^sin n = O(n), but n^sin n ∉ Ω(n).

Are n and n^(sin n + 1) comparable with respect to growth rate? No. As sin n increases from 0 to 1, n^(sin n + 1) increases from n to n²; as sin n decreases from 1 to 0, it decreases from n² to n; as sin n decreases from 0 to -1, it decreases from n to 1; as sin n increases from -1 to 0, it increases from 1 to n. Clearly n^(sin n + 1) ∉ O(n), and n^(sin n + 1) ∉ Ω(n).
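
For a concrete feel (an added illustration) of why these oscillating functions defeat the usual bounds, the snippet below samples n^sin(n) and n^(sin(n)+1): the first swings between roughly 1/n and n, the second between roughly 1 and n², so neither settles against a single polynomial rate.

import math

for n in (10, 50, 100, 500, 1000, 5000):
    s = math.sin(n)  # n taken in radians; enough to show the oscillation
    print(f"n={n:5d}  sin(n)={s:+.2f}  n^sin(n)={n**s:12.3f}  n^(sin(n)+1)={n**(s + 1):16.1f}")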

Are n and n^(sin n + 1) comparable? No.

Another example The following functions are not asymptotically comparable:

Transitivity: If f(n) = Θ(g(n)) and g(n) = Θ(h(n)) then f(n) = Θ(h(n)). If f(n) = O(g(n)) and g(n) = O(h(n)) then f(n) = O(h(n)). If f(n) = Ω(g(n)) and g(n) = Ω(h(n)) then f(n) = Ω(h(n)). If f(n) = o(g(n)) and g(n) = o(h(n)) then f(n) = o(h(n)). If f(n) = ω(g(n)) and g(n) = ω(h(n)) then f(n) = ω(h(n)).

Reflexivity: f(n) = Θ(f(n)). f(n) = O(f(n)). f(n) = Ω(f(n)). "o" is not reflexive; "ω" is not reflexive.

Symmetry and Transpose symmetry Symmetry: f(n) = Θ(g(n)) if and only if g(n) = Θ(f(n)). Transpose symmetry: f(n) = O(g(n)) if and only if g(n) = Ω(f(n)); f(n) = o(g(n)) if and only if g(n) = ω(f(n)).

Analogy between asymptotic comparison of functions and comparison of real numbers: f(n) = O(g(n)) ≈ a ≤ b; f(n) = Ω(g(n)) ≈ a ≥ b; f(n) = Θ(g(n)) ≈ a = b; f(n) = o(g(n)) ≈ a < b; f(n) = ω(g(n)) ≈ a > b. f(n) is asymptotically smaller than g(n) if f(n) = o(g(n)); f(n) is asymptotically larger than g(n) if f(n) = ω(g(n)).

Is O(g(n)) = Θ(g(n)) ∪ o(g(n))? We show a counterexample. The functions are g(n) = n and an f(n) with f(n) ∈ O(n) but f(n) ∉ Θ(n) and f(n) ∉ o(n). Conclusion: O(g(n)) ≠ Θ(g(n)) ∪ o(g(n)).
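
One concrete pair that works (a hedged stand-in, since the slide's own f(n) appears only as an image in the original): take g(n) = n and let f(n) be n for even n and 1 for odd n. Then f(n) ∈ O(n), but f(n) ∉ Θ(n) because no positive lower bound c·n survives the odd inputs, and f(n) ∉ o(n) because f(n)/n does not tend to 0 on the even inputs.

def f(n):
    # Oscillates between the line f(n) = n (even n) and the constant 1 (odd n).
    return n if n % 2 == 0 else 1

ratios = [f(n) / n for n in range(1, 16)]
print([round(r, 2) for r in ratios])  # alternates between values near 0 and exactly 1.0:
                                      # bounded above (O(n)), but with no limit and no positive lower bound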