Análisis de Algoritmos (Analysis of Algorithms)


Análisis de Algoritmos (Analysis of Algorithms). Information taken from the Department of Systems of the Universidad Nacional, Bogotá campus.

OUTLINE
- Growth of Functions
- Asymptotic Notation
- Common Functions

ASYMPTOTIC NOTATION

Asymptotically Nonnegative Functions
f(n) is asymptotically nonnegative if there exists n0 ∈ N such that for every n ≥ n0, 0 ≤ f(n).

Theta
Θ(g(n)) = { f : N → R* | (∃ c1, c2 ∈ R+) (∃ n0 ∈ N) (∀ n ≥ n0) (0 ≤ c1·g(n) ≤ f(n) ≤ c2·g(n)) }
Equivalently, c1 ≤ lim_{n→∞} f(n)/g(n) ≤ c2.
[Figure: f(n) sandwiched between c1·g(n) and c2·g(n) for n ≥ n0.]

"f(n) = Θ(g(n))" means "f(n) ∈ Θ(g(n))" (an accepted abuse of "="). g is an asymptotically tight bound for f, i.e. f is of the exact order of g. Every member of Θ(g(n)) is asymptotically nonnegative, and the function g(n) must itself be asymptotically nonnegative, or else Θ(g(n)) = ∅.

Example
Let us show that n²/2 − 3n = Θ(n²). We have to find c1, c2 and n0 such that
c1·n² ≤ n²/2 − 3n ≤ c2·n² for all n ≥ n0.

Dividing by n² yields c1 ≤ 1/2 − 3/n ≤ c2 for all n ≥ n0. Since 3/n is a decreasing sequence 3, 3/2, 1, 3/4, 3/5, 3/6, 3/7, …, the sequence 1/2 − 3/n is increasing: −5/2, −1, −1/2, −1/4, −1/10, 0, 1/14, …, and it is upper bounded by 1/2.

The right-hand inequality can be made to hold for all n ≥ 1 by choosing c2 ≥ 1/2. Likewise, the left-hand inequality can be made to hold for all n ≥ 7 by choosing c1 ≤ 1/14.

Thus, choosing c1 = 1/14, c2 = 1/2 and n0 = 7, we can verify that n²/2 − 3n = Θ(n²).
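The constants above can be sanity-checked numerically. A minimal sketch in Python, using exact rationals to avoid floating-point boundary issues (the cutoff 10 000 is an arbitrary sample range; this is a spot check, not a proof):

```python
from fractions import Fraction as F

# Spot-check of the bounds derived above:
# (1/14)*n^2 <= n^2/2 - 3n <= (1/2)*n^2 for all n >= 7.

def f(n):
    return F(n * n, 2) - 3 * n

c1, c2, n0 = F(1, 14), F(1, 2), 7

assert all(c1 * n * n <= f(n) <= c2 * n * n for n in range(n0, 10_000))
print("bounds hold for 7 <= n < 10000")
```

At n = 7 the lower bound is tight: (1/14)·49 = 7/2 = f(7), which is why no smaller n0 works with c1 = 1/14.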

Big O
O(g(n)) = { f : N → R* | (∃ c ∈ R+) (∃ n0 ∈ N) (∀ n ≥ n0) (0 ≤ f(n) ≤ c·g(n)) }
Equivalently, lim_{n→∞} f(n)/g(n) ≤ c.
[Figure: f(n) below c·g(n) for n ≥ n0.]

"f(n) = O(g(n))" means "f(n) ∈ O(g(n))". f is asymptotically upper bounded by g, i.e. g is an asymptotic upper bound for f. O(g(n)) is pronounced "big-oh of g(n)".

Example
Let us show that a·n + b = O(n) for constants a, b > 0. We have to find c and n0 such that 0 ≤ a·n + b ≤ c·n for all n ≥ n0.

For all n ≥ n0, dividing by n yields a + b/n ≤ a + b (since b/n ≤ b for n ≥ 1). The inequality can therefore be made to hold for n ≥ 1 by choosing c ≥ a + b. Thus, choosing c = a + b and n0 = 1, we can verify that a·n + b = O(n).

[Figure: a·n + b below (a + b)·n for n ≥ n0 = 1.]
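The same kind of spot check works for the O(n) bound; a = 5 and b = 3 below are arbitrary sample constants, not values from the slides:

```python
# Spot-check: with a, b > 0 and c = a + b, n0 = 1,
# we have 0 <= a*n + b <= c*n for all n >= 1.
# a = 5, b = 3 are arbitrary sample values.

a, b = 5, 3
c, n0 = a + b, 1

assert all(0 <= a * n + b <= c * n for n in range(n0, 10_000))
print("a*n + b <= (a+b)*n for all sampled n >= 1")
```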

Big Omega
Ω(g(n)) = { f : N → R* | (∃ c ∈ R+) (∃ n0 ∈ N) (∀ n ≥ n0) (0 ≤ c·g(n) ≤ f(n)) }
Equivalently, c ≤ lim_{n→∞} f(n)/g(n).
[Figure: f(n) above c·g(n) for n ≥ n0.]

"f(n) = Ω(g(n))" means "f(n) ∈ Ω(g(n))". f is asymptotically lower bounded by g, i.e. g is an asymptotic lower bound for f. Ω(g(n)) is pronounced "big-omega of g(n)".

Little o
o(g(n)) = { f : N → R* | (∀ c ∈ R+) (∃ n0 ∈ N) (∀ n ≥ n0) (0 ≤ f(n) < c·g(n)) }
Equivalently, lim_{n→∞} f(n)/g(n) = 0.
o(g(n)): the functions that grow slower than g.
[Figure: g eventually dominates f.]

"f(n) = o(g(n))" means "f(n) ∈ o(g(n))". f is asymptotically smaller than g. o(g(n)) is pronounced "little-oh of g(n)".

Examples
Let us show that n = o(n²). We only have to show that lim_{n→∞} n/n² = lim_{n→∞} 1/n = 0.

Let us show that 3n ≠ o(n). We have lim_{n→∞} 3n/n = 3 ≠ 0.

Little Omega
ω(g(n)) = { f : N → R* | (∀ c ∈ R+) (∃ n0 ∈ N) (∀ n ≥ n0) (0 ≤ c·g(n) < f(n)) }
Equivalently, lim_{n→∞} f(n)/g(n) = ∞.
ω(g(n)): the functions that grow faster than g.
[Figure: f eventually dominates g.]

"f(n) = ω(g(n))" means "f(n) ∈ ω(g(n))". f is asymptotically larger than g. ω(g(n)) is pronounced "little-omega of g(n)".

Analogy with the comparison of two real numbers:
  Asymptotic notation   Real numbers
  f(n) ∈ O(g(n))        f ≤ g
  f(n) ∈ Ω(g(n))        f ≥ g
  f(n) ∈ Θ(g(n))        f = g
  f(n) ∈ o(g(n))        f < g
  f(n) ∈ ω(g(n))        f > g
Unlike for real numbers, trichotomy does not hold.

Example: f(n) = n, g(n) = n·(1 + sin n). Since 1 + sin n oscillates within [0, 2], f and g cannot be compared: not all functions are asymptotically comparable.

Properties
Θ(f(n)) = O(f(n)) ∩ Ω(f(n)).
The running time of an algorithm is Θ(f(n)) iff its worst-case running time is O(f(n)) and its best-case running time is Ω(f(n)).

Transitivity of O, Ω, Θ, o, ω: f(n) ∈ α(g(n)) and g(n) ∈ α(h(n)) imply f(n) ∈ α(h(n)), for α = O, Ω, Θ, o, ω.
Reflexivity of O, Ω, Θ: f(n) ∈ α(f(n)), for α = O, Ω, Θ.
Symmetry of Θ: f(n) ∈ Θ(g(n)) ⇔ g(n) ∈ Θ(f(n)).
Anti-symmetry of O, Ω: f(n) ∈ α(g(n)) and g(n) ∈ α(f(n)) imply f(n) ∈ Θ(g(n)), for α = O, Ω.
Transpose symmetry: f(n) ∈ O(g(n)) ⇔ g(n) ∈ Ω(f(n)); f(n) ∈ o(g(n)) ⇔ g(n) ∈ ω(f(n)).

f ≤ g ≡ f(n) ∈ O(g(n)): an order relation (reflexive, anti-symmetric, transitive).
f ≥ g ≡ f(n) ∈ Ω(g(n)): an order relation.
f = g ≡ f(n) ∈ Θ(g(n)): an equivalence relation (reflexive, symmetric, transitive).

Relation between o and O: f(n) ∈ o(g(n)) ⇒ f(n) ∈ O(g(n)).
Relation between ω and Ω: g(n) ∈ ω(f(n)) ⇒ g(n) ∈ Ω(f(n)).
o(f(n)) ∩ ω(f(n)) = ∅.

Examples
  A             B
  5n² + 100n    3n² + 2
  log₃(n²)      log₂(n³)
  n^(lg 4)      3^(lg n)
  lg² n         n^(1/2)

Examples
  A             B            Relation
  5n² + 100n    3n² + 2      A ∈ Θ(B)
  log₃(n²)      log₂(n³)     A ∈ Θ(B)
  n^(lg 4)      3^(lg n)     A ∈ ω(B)
  lg² n         n^(1/2)      A ∈ o(B)

Examples
  5n² + 100n vs. 3n² + 2: A ∈ Θ(B), since A ∈ Θ(n²) and n² ∈ Θ(B).
  log₃(n²) vs. log₂(n³): A ∈ Θ(B), since log_b a = log_c a / log_c b gives A = 2·lg n / lg 3 and B = 3·lg n, so A/B = 2/(3·lg 3).
  n^(lg 4) vs. 3^(lg n): A ∈ ω(B), since a^(lg b) = b^(lg a) gives B = 3^(lg n) = n^(lg 3), so A/B = n^(lg(4/3)) → ∞ as n → ∞.
  lg² n vs. n^(1/2): A ∈ o(B), since lim_{n→∞} lg^a n / n^b = 0; here a = 2 and b = 1/2, so A ∈ o(B).
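The four comparisons can be illustrated numerically by watching the ratio A(n)/B(n) as n grows (an illustration of the limits, not a proof): it tends to a positive constant for Θ, grows without bound for ω, and tends to 0 for o.

```python
import math

# Ratio A(n)/B(n) at a few growing sample points.
def ratios(A, B, ns=(10, 100, 1000, 10_000)):
    return [A(n) / B(n) for n in ns]

theta1 = ratios(lambda n: 5*n*n + 100*n, lambda n: 3*n*n + 2)             # tends to 5/3
theta2 = ratios(lambda n: math.log(n*n, 3), lambda n: math.log(n**3, 2))  # constant 2/(3*lg 3)
omega  = ratios(lambda n: n**2, lambda n: 3**math.log2(n))                # n^(lg 4) = n^2; grows
little_o = ratios(lambda n: math.log2(n)**2, lambda n: math.sqrt(n))      # tends to 0

for name, r in [("Theta", theta1), ("Theta", theta2), ("omega", omega), ("o", little_o)]:
    print(name, [round(x, 4) for x in r])
```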

Asymptotic notation in two variables
O(g(m, n)) = { f : N×N → R* | (∃ c ∈ R+) (∃ m0, n0 ∈ N) (∀ m ≥ m0) (∀ n ≥ n0) (0 ≤ f(m, n) ≤ c·g(m, n)) }

COMMON FUNCTIONS

Monotonicity
f is monotonically increasing if m ≤ n ⇒ f(m) ≤ f(n).
f is monotonically decreasing if m ≤ n ⇒ f(m) ≥ f(n).
f is strictly increasing if m < n ⇒ f(m) < f(n).
f is strictly decreasing if m < n ⇒ f(m) > f(n).

Floors and Ceilings
⌊x⌋ (floor of x): the greatest integer less than or equal to x.
⌈x⌉ (ceiling of x): the smallest integer greater than or equal to x.
∀ x ∈ R: x − 1 < ⌊x⌋ ≤ x ≤ ⌈x⌉ < x + 1.
∀ n ∈ N: ⌊n⌋ = ⌈n⌉ = n, and ⌈n/2⌉ + ⌊n/2⌋ = n.

x and x are monotonically increasing x  R and integers a,b > 0   x/a  /b = x /ab .  x/a /b =  x /ab . a /b  ( a + (b-1)) /b. a /b  ( a - (b-1)) /b. x and x are monotonically increasing

Modular arithmetic
For every integer a and any positive integer n, a mod n is the remainder (or residue) of the quotient a/n:
a mod n = a − ⌊a/n⌋·n.

Congruence (equivalence) mod n
If (a mod n) = (b mod n), we write a ≡ b (mod n) and say that a is equivalent to b modulo n, or that a is congruent to b modulo n. In other words, a ≡ b (mod n) if a and b have the same remainder when divided by n. Also, a ≡ b (mod n) if and only if n is a divisor of b − a.

Example: ≡ (mod 4)
≡ (mod n) defines an equivalence relation on Z and produces a quotient set Z_n = Z/n = {0, 1, 2, …, n − 1}, on which the arithmetic operations a + b (mod n) and a·b (mod n) can be defined.
Z/4 = {[0], [1], [2], [3]} = {0, 1, 2, 3}; for example, 4 + 1 (mod 4) = 1 and 5·2 (mod 4) = 2.
[Figure: the integers −12 … 15 grouped into the four classes [0], [1], [2], [3].]
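Python's `//` is floor division, so its `%` operator matches the definition a mod n = a − ⌊a/n⌋·n above (including for negative a), which makes the example easy to replay:

```python
# a mod n = a - floor(a/n)*n; Python's // is floor division and % agrees with it.
def mod(a, n):
    return a - (a // n) * n

assert mod(-7, 4) == -7 % 4 == 1       # works for negative a as well
assert (4 + 1) % 4 == 1                # the slide's 4 + 1 (mod 4) = 1
assert (5 * 2) % 4 == 2                # the slide's 5 * 2 (mod 4) = 2

# Each integer falls into one of the classes [0], [1], [2], [3] of Z/4:
print([a % 4 for a in range(-4, 8)])   # -> [0, 1, 2, 3, 0, 1, 2, 3, 0, 1, 2, 3]
```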

Polynomials
Given a nonnegative integer d, a polynomial in n of degree d is a function p(n) of the form
p(n) = a₀ + a₁·n + a₂·n² + … + a_d·n^d,
where a₀, a₁, …, a_d are the coefficients and a_d ≠ 0.

A polynomial p(n) is asymptotically positive if and only if a_d > 0. If p(n), of degree d, is asymptotically positive, we have p(n) = Θ(n^d).
For a ∈ R with a ≥ 0, n^a is monotonically increasing; for a ≤ 0, n^a is monotonically decreasing.
A function f(n) is polynomially bounded if f(n) = O(n^d) for some constant d.

Exponentials
For all reals a > 0, m and n, we have the following identities:
a⁰ = 1, a¹ = a, a⁻¹ = 1/a, (a^m)^n = a^(mn), (a^m)^n = (a^n)^m, a^m·a^n = a^(m+n).

If a ≥ 1, then a^n is monotonically increasing in n. For all real constants a and d with a > 1, n^d = o(a^n): any exponential with base greater than 1 grows faster than any polynomial.

∀ x ∈ R: e^x ≥ 1 + x, with equality only for x = 0. If |x| ≤ 1, then 1 + x ≤ e^x ≤ 1 + x + x². When x → 0, e^x = 1 + x + O(x²).

Logarithms
Notation: lg n = log₂ n; ln n = log_e n; lg^k n = (lg n)^k; lg lg n = lg(lg n).
A logarithm applies only to the next term in a formula: lg n + k = (lg n) + k.
For constant b > 1 and n > 0, log_b n is strictly increasing.

For all reals a > 0, b > 0, c > 0 and n, we have the following identities:
log_c(ab) = log_c a + log_c b.
log_b aⁿ = n·log_b a.
log_b(1/a) = −log_b a.
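A quick numerical spot check of these identities at a few arbitrary sample points (bases are chosen ≠ 1 so the logarithms are defined):

```python
import math

# Spot-check of the logarithm identities at sampled points.
for a in (2.0, 3.5, 10.0):
    for b in (1.5, 2.0, 7.0):
        for c in (2.0, math.e, 10.0):
            assert math.isclose(math.log(a * b, c), math.log(a, c) + math.log(b, c))
        for n in (1, 2, 5):
            assert math.isclose(math.log(a ** n, b), n * math.log(a, b))
        assert math.isclose(math.log(1 / a, b), -math.log(a, b))
print("logarithm identities verified at sampled points")
```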

For x > −1: x/(1 + x) ≤ ln(1 + x) ≤ x, with equality only for x = 0.

A function f(n) is polylogarithmically bounded if f(n) = O(lg^k n) for some constant k. We have the following relation between polynomials and polylogarithms: for every constant k, lg^k n = o(n^k).

Factorials
Definition (non-recursive): n! = 1·2·3 ⋯ n.
Definition (recursive): 0! = 1 and n! = n·(n − 1)! for n ≥ 1.
Weak upper bound: n! ≤ n^n.

Stirling's approximation
n! = √(2πn)·(n/e)^n·(1 + Θ(1/n)),
from which n! = o(n^n), n! = ω(2^n), and lg(n!) = Θ(n·lg n).
A tighter form: n! = √(2πn)·(n/e)^n·e^(α_n), where 1/(12n + 1) < α_n < 1/(12n).
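How good the approximation is can be seen by comparing it with n! directly; the relative error shrinks roughly like 1/(12n):

```python
import math

# n! versus Stirling's approximation sqrt(2*pi*n) * (n/e)^n.
def stirling(n):
    return math.sqrt(2 * math.pi * n) * (n / math.e) ** n

for n in (1, 2, 5, 10, 20, 50):
    exact = math.factorial(n)
    rel_err = exact / stirling(n) - 1
    print(f"n={n:2d}  n!={exact}  rel.err={rel_err:.5f}  1/(12n)={1 / (12 * n):.5f}")
```

Note that √(2πn)·(n/e)^n always slightly underestimates n!, consistent with the positive correction term α_n.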

Functional iteration
Given a function f(n), the i-th functional iterate of f is defined as:
f^(0) = I (the identity function), and f^(i)(n) = f(f^(i−1)(n)) for i > 0.
For a particular n, we then have the values f^(0)(n), f^(1)(n), f^(2)(n), …

Examples:
f(n) = 2n, then f^(i)(n) = 2^i·n.
f(n) = n², then f^(i)(n) = n^(2^i).

f(n) = n^n, then the iterates grow as a tower of exponentials: f^(1)(n) = n^n, f^(2)(n) = (n^n)^(n^n) = n^(n·n^n), and so on, with no simple closed form.

Iterated logarithm
The iterated logarithm of n, denoted lg* n ("log star of n"), is defined as
lg* n = min { i ≥ 0 : lg^(i) n ≤ 1 }.
It is a very slowly growing function:
lg* 1 = 0, lg* 2 = 1, lg* 4 = 2, lg* 16 = 3, lg* 65536 = 4, lg* 2^65536 = 5.
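lg* can be implemented directly from the definition. Floating-point range limits the input well below 2^65536, but the sketch reproduces the rest of the table:

```python
import math

# Iterated logarithm: the number of times lg must be applied
# before the value drops to <= 1.
def lg_star(x):
    i = 0
    while x > 1:
        x = math.log2(x)
        i += 1
    return i

print([lg_star(n) for n in (1, 2, 4, 16, 65536)])   # -> [0, 1, 2, 3, 4]
assert lg_star(2 ** 100) == 5    # since 65536 < 2^100 < 2^65536
```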

In general, lg* of a tower of k twos, 2^(2^(⋯^2)) with k levels, equals k.