The running time of an algorithm as the input size approaches infinity is called the asymptotic running time. We study different notations for asymptotic efficiency. In particular, we study tight bounds, upper bounds, and lower bounds.

Why do we need the different sets? Definition of the sets O (big-oh), Ω (Omega), Θ (Theta), and o (little-oh). Classifying examples: using the original definition, and using limits.

Let f(n) and g(n) be asymptotically nonnegative functions whose domains are the set of natural numbers N = {0, 1, 2, …}. A function g(n) is asymptotically nonnegative if g(n) ≥ 0 for all n ≥ n0, where n0 ∈ N.

Big-oh: an asymptotic upper bound on the growth of an algorithm. When do we use big-oh?
1. In the theory of NP-completeness.
2. To provide information on the maximum number of operations that an algorithm performs: if an algorithm is O(n²) in the worst case, this means that in the worst case it performs at most cn² operations.

O(f(n)) is the set of functions g(n) such that there exist a positive real constant c and a nonnegative integer N for which g(n) ≤ c·f(n) for all n ≥ N. f(n) is called an asymptotic upper bound for g(n).

[Figure: plot of c·f(n) lying above g(n) for all n ≥ N]
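The membership condition can be spot-checked numerically. Below is a minimal sketch (my addition, not from the slides): a hypothetical helper holds_big_oh that tests g(n) ≤ c·f(n) over a finite range of n ≥ N. A finite check can refute a bad choice of constants but can never prove membership.

```python
def holds_big_oh(g, f, c, N, upto=10**5):
    """Spot-check g(n) <= c*f(n) for all N <= n <= upto.

    True is only evidence for g in O(f); False means a concrete
    counterexample n exists within the tested range.
    """
    return all(g(n) <= c * f(n) for n in range(N, upto + 1))
```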

Example: n² + 10n ∈ O(n²), since n² + 10n ≤ 2n² for all n ≥ 10; take c = 2 and N = 10.
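The constants just chosen (c = 2, N = 10) pass the hypothetical spot-check above:

```python
# n^2 + 10n <= 2*n^2 holds for all tested n >= 10
print(holds_big_oh(lambda n: n*n + 10*n, lambda n: n*n, c=2, N=10))  # True
```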

Claim: 5n + 2 ∈ O(n). Proof: From the definition of big-oh, there must exist c > 0 and an integer N > 0 such that 0 ≤ 5n + 2 ≤ cn for all n ≥ N. Dividing both sides of the inequality by n > 0 we get 0 ≤ 5 + 2/n ≤ c. Since 0 < 2/n ≤ 2 and 2/n becomes smaller as n increases, there are many choices here for c and N.

If we choose N = 1, then 5 + 2/n ≤ 5 + 2/1 = 7, so any c ≥ 7 works; choose c = 7. If instead we choose c = 6, then 0 ≤ 5 + 2/n ≤ 6 requires 2/n ≤ 1, so any N ≥ 2 works; choose N = 2. In either case (we only need one!) we have a c > 0 and an N > 0 such that 0 ≤ 5n + 2 ≤ cn for all n ≥ N. So the definition is satisfied and 5n + 2 ∈ O(n).
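Both choices of constants from the proof pass the same hypothetical spot-check:

```python
g = lambda n: 5*n + 2
print(holds_big_oh(g, lambda n: n, c=7, N=1))  # True
print(holds_big_oh(g, lambda n: n, c=6, N=2))  # True
```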

Claim: n² ∉ O(n). We will prove by contradiction that the definition cannot be satisfied. Assume that n² ∈ O(n). From the definition of big-oh, there must exist c > 0 and an integer N > 0 such that 0 ≤ n² ≤ cn for all n ≥ N. Dividing the inequality by n > 0 we get 0 ≤ n ≤ c for all n ≥ N. But n ≤ c cannot be true for any n > max{c, N}, contradicting our assumption. So there is no constant c > 0 such that n ≤ c holds for all n ≥ N, and therefore n² ∉ O(n).

Exercises:
  1,000,000 n² ∈ O(n²)? True.
  (n - 1)n/2 ∈ O(n²)? True.
  n/2 ∈ O(n²)? True.
  lg(n²) ∈ O(lg n)? True.
  n² ∈ O(n)? False.

Omega: an asymptotic lower bound on the growth of an algorithm or a problem. When do we use Omega? To provide information on the minimum number of operations that an algorithm performs. If an algorithm is Ω(n) in the best case, this means that in the best case its instruction count is at least cn; if it is Ω(n²) in the worst case, this means that in the worst case its instruction count is at least cn².

Ω(f(n)) is the set of functions g(n) such that there exist a positive real constant c and a nonnegative integer N for which g(n) ≥ c·f(n) for all n ≥ N. f(n) is called an asymptotic lower bound for g(n).

[Figure: plot of g(n) lying above c·f(n) for all n ≥ N]

Claim: 5n - 20 ∈ Ω(n). Proof: From the definition of Omega, there must exist c > 0 and an integer N > 0 such that 0 ≤ cn ≤ 5n - 20 for all n ≥ N. Dividing the inequality by n > 0 we get 0 ≤ c ≤ 5 - 20/n for all n ≥ N. Since 20/n ≤ 20 and 20/n becomes smaller as n grows, there are many choices here for c and N; because c must be positive we need 5 - 20/n > 0, i.e. N > 4. If we choose N = 5, then 5 - 20/n ≥ 5 - 20/5 = 1, so any 0 < c ≤ 1 works; choose c = 1/2. If instead we choose c = 4, then 5 - 20/n ≥ 4 requires n ≥ 20; choose N = 20. In either case (we only need one!) we have a c > 0 and an N > 0 such that 0 ≤ cn ≤ 5n - 20 for all n ≥ N. So the definition is satisfied and 5n - 20 ∈ Ω(n).
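An analogous hypothetical spot-checker for the Omega inequality, applied to this example with c = 1/2 and N = 5:

```python
def holds_omega(g, f, c, N, upto=10**5):
    """Spot-check g(n) >= c*f(n) for all N <= n <= upto (evidence, not proof)."""
    return all(g(n) >= c * f(n) for n in range(N, upto + 1))

print(holds_omega(lambda n: 5*n - 20, lambda n: n, c=0.5, N=5))  # True
```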

 1,000,000 n 2  (n 2 ) why /why not?  (true)  (n - 1)n / 2  (n 2 ) why /why not?  (true)  n / 2  (n 2 ) why /why not?  (false)  lg (n 2 )  ( lg n ) why /why not?  (true)  n 2  (n) why /why not?  (true)

Theta: an asymptotic tight bound on the growth rate of an algorithm. If an algorithm is Θ(n²) in the worst and average cases, this means that in the worst and average cases it performs on the order of cn² operations. Binary search is Θ(lg n) in the worst and average cases: in those cases it performs on the order of c·lg n operations.

Θ(f(n)) is the set of functions g(n) such that there exist positive real constants c and d and a nonnegative integer N for which c·f(n) ≤ g(n) ≤ d·f(n) for all n ≥ N. f(n) is called an asymptotically tight bound for g(n).

[Figure: plot of g(n) sandwiched between c·f(n) and d·f(n) for all n ≥ N]
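A two-sided spot-check in the same hypothetical style (holds_theta is my name, not from the slides; a finite check is evidence only):

```python
def holds_theta(g, f, c, d, N, upto=10**5):
    """Spot-check c*f(n) <= g(n) <= d*f(n) for all N <= n <= upto."""
    return all(c * f(n) <= g(n) <= d * f(n) for n in range(N, upto + 1))
```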

Classify each function relative to n² (Θ(n²) means it is in both O(n²) and Ω(n²)):

Function   | O(n²) | Ω(n²) | Θ(n²)
log n      |  yes  |  no   |  no
n²         |  yes  |  yes  |  yes
n⁵         |  no   |  yes  |  no
5n + 3     |  yes  |  no   |  no
(1/2)n²    |  yes  |  yes  |  yes
2ⁿ         |  no   |  yes  |  no
n^1.5      |  yes  |  no   |  no
5n² + 3n   |  yes  |  yes  |  yes
n log n    |  yes  |  no   |  no

We use the last definition and show the two inequalities separately: 1. g(n) ≥ c·f(n) for all n ≥ N; 2. g(n) ≤ d·f(n) for all n ≥ N.
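As a concrete instance (my example, not necessarily the one on the original slide): 5n² + 3n ∈ Θ(n²) with c = 5, d = 6, N = 3, since 5n² ≤ 5n² + 3n always, and 5n² + 3n ≤ 6n² once n ≥ 3. The hypothetical checker above agrees:

```python
print(holds_theta(lambda n: 5*n*n + 3*n, lambda n: n*n, c=5, d=6, N=3))  # True
```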

 1,000,000 n 2  (n 2 ) why /why not?  (true)  (n - 1)n / 2  (n 2 ) why /why not?  (true)  n / 2  (n 2 ) why /why not?  (false)  lg (n 2 )  ( lg n ) why /why not?  (true)  n 2  (n) why /why not?  (false)

o(f(n)) is the set of functions g(n) that satisfy the following condition: for every positive real constant c, there exists a positive integer N for which g(n) ≤ c·f(n) for all n ≥ N.

Little-oh is used to denote an upper bound that is not asymptotically tight. For example, n is in o(n³), but n is not in o(n).

Claim: n ∈ o(n²). Proof: Let c > 0 be given. We need to find an N such that n ≤ cn² for all n ≥ N. If we divide both sides of this inequality by cn, we get 1/c ≤ n. Therefore we can choose any N ≥ 1/c.
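A small numeric illustration of the "for every c" quantifier (a sketch, not a proof): whatever c is given, N = ceil(1/c) is enough.

```python
import math

for c in (1.0, 0.1, 0.001):
    N = math.ceil(1 / c)                               # N >= 1/c
    ok = all(n <= c * n * n for n in range(N, N + 1000))
    print(c, N, ok)                                    # ok is True for every c tried
```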

Claim: n ∉ o(5n). Proof by contradiction: let c = 1/6. If n ∈ o(5n), then there must exist some N such that n ≤ (5/6)n for all n ≥ N. But n > (5/6)n for every n > 0. This contradiction proves that n is not in o(5n).

Limit rule: suppose lim(n→∞) f(n)/g(n) exists (the limit must exist). Then:
  if the limit equals a constant c > 0, then f(n) = Θ(g(n));
  if the limit equals 0, then f(n) = o(g(n));
  if the limit equals ∞, then g(n) = o(f(n)).
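The three cases can be tried symbolically; a minimal sketch assuming sympy is available:

```python
import sympy as sp

n = sp.symbols('n', positive=True)
print(sp.limit((5*n + 3) / n, n, sp.oo))   # 5  (constant > 0) => 5n+3 = Theta(n)
print(sp.limit(sp.log(n) / n, n, sp.oo))   # 0                 => log n = o(n)
print(sp.limit(2**n / n**2, n, sp.oo))     # oo                => n^2  = o(2^n)
```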

L'Hôpital's rule: if f(x) and g(x) are both differentiable with derivatives f′(x) and g′(x), respectively, and if lim(x→∞) f(x) = lim(x→∞) g(x) = ∞, then lim(x→∞) f(x)/g(x) = lim(x→∞) f′(x)/g′(x), provided the limit on the right exists.

Properties of growth functions: the asymptotic notations behave like comparisons. O corresponds to ≤, Ω corresponds to ≥, Θ corresponds to =, and o corresponds to <.

Using limits (and L'Hôpital's rule) we get: lim(n→∞) (ln n)/n^a = lim(n→∞) (1/n)/(a·n^(a-1)) = lim(n→∞) 1/(a·n^a) = 0. So ln n = o(n^a) for any a > 0. When the exponent a is very small, we need to look at very large values of n to see that n^a > ln n.
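To get a feel for how large "very large" is, here is a quick numeric probe (my illustration): with a = 0.01, n^a does not overtake ln n until somewhere between n = 10^250 and n = 10^300.

```python
import math

a = 0.01
for exp10 in (6, 100, 250, 300):
    n = 10.0 ** exp10
    print(f"n = 10^{exp10}: ln n = {math.log(n):.1f}, n^a = {n ** a:.1f}")
# ln n is still larger at 10^6, 10^100 and 10^250; n**0.01 finally wins near 10^300
```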

O(g(n)) = { f(n) : there exist a positive constant c and a positive integer N such that 0 ≤ f(n) ≤ c·g(n) for all n ≥ N }.
o(g(n)) = { f(n) : for every positive constant c > 0, there exists a positive integer N such that 0 ≤ f(n) ≤ c·g(n) for all n ≥ N }.
For little-oh the inequality holds for all positive constants c, whereas for big-oh it holds for some positive constant c.

Lower-order terms of a function do not matter, since they are dominated by the highest-order term. Constants multiplying the highest-order term do not matter, since they do not affect the asymptotic growth rate. All logarithms with base b > 1 belong to Θ(lg n), since log_b n = (lg n)/(lg b) and 1/(lg b) is a positive constant.
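A quick check of the base-change identity (my illustration): the ratio log_8(n)/log_2(n) is the same constant, 1/3, for every n.

```python
import math

for n in (10, 1000, 10**9):
    print(math.log(n, 8) / math.log2(n))   # always 0.333..., since log_8 n = (lg n)/3
```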

We say a function f(n) is polynomially bounded if f(n) = O(n^k) for some positive constant k. We say a function f(n) is polylogarithmically bounded if f(n) = O(lg^k n) for some positive constant k. Exponential functions grow faster than positive polynomial functions, which in turn grow faster than polylogarithmic functions.

What does n² + 2n + 99 = n² + Θ(n) mean? It means n² + 2n + 99 = n² + f(n), where the function f(n) is from the set Θ(n); in fact f(n) = 2n + 99. Using notation in this manner can help to eliminate non-affecting details and clutter in an equation. For example, 2n² + 5n + 21 = 2n² + Θ(n) = Θ(n²).

Transitivity:
If f(n) = Θ(g(n)) and g(n) = Θ(h(n)), then f(n) = Θ(h(n)).
If f(n) = O(g(n)) and g(n) = O(h(n)), then f(n) = O(h(n)).
If f(n) = Ω(g(n)) and g(n) = Ω(h(n)), then f(n) = Ω(h(n)).
If f(n) = o(g(n)) and g(n) = o(h(n)), then f(n) = o(h(n)).

Reflexivity: f(n) = Θ(f(n)); f(n) = O(f(n)); f(n) = Ω(f(n)). "o" is not reflexive.

Symmetry: f(n) = Θ(g(n)) if and only if g(n) = Θ(f(n)). Transpose symmetry: f(n) = O(g(n)) if and only if g(n) = Ω(f(n)).