Nattee Niparnan Dept. of Computer Engineering, Chulalongkorn University

Why Analysis? We need to know the “behavior” of the algorithm Behavior == How well does it perform?

Can we? Sure! If we have the algorithm, we can implement it and test it on any input. But… is that what we really want? If you wish to know whether falling from floor 20 of the Eng. 4 building would kill you, will you try?

Prediction We wish to know the "behavior" of the algorithm without actually trying it. Back to the suicidal example: can you guess whether you would survive jumping from the 20th floor of the Eng. 4 building? What about the 15th floor? The 10th floor? The 5th floor? The 2nd floor? Why?

Modeling If floor number > 3 then Die Else Survive (maybe?)

Modeling What about jumping from Central World's 20th floor? What about jumping from the Empire State's 20th floor? What about jumping from UCL's 20th floor? Why?

Generalization Knowledge from some particular instances might be applicable to another instance

Analysis We need something that can tell us the behavior of the algorithm Generalization Modeling

Measurement What do we really care about? RESOURCE!!! Time (CPU power) Space (amount of RAM)

Model Usage of Resource: how does an algorithm use resources?

Model Resource Function: the time function of an algorithm A maps the size of the input to the time used; the space function of an algorithm A maps the size of the input to the space used.

Example Inserting a value into a sorted array. Input: a sorted array A[1..N] and a number X. Output: a sorted array A[1..N+1] which includes X.

Algorithm Element Insertion Assume that X = 20. What if A = [1,2,3]? How much time? What if A = [101,102,103]? idx = N; while (idx >= 1 && A[idx] > X) { A[idx + 1] = A[idx]; idx--; } A[idx + 1] = X;
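A minimal runnable C sketch of the insertion step above, assuming 1-based indexing is simulated by leaving A[0] unused; the function name insert_sorted is our own, not from the slides:
  #include <stdio.h>
  /* Insert X into the sorted prefix A[1..N]; the array must have room for N+2 ints
     (index 0 is unused to keep the slide's 1-based indexing). */
  void insert_sorted(int A[], int N, int X) {
      int idx = N;
      while (idx >= 1 && A[idx] > X) {   /* shift elements larger than X one slot right */
          A[idx + 1] = A[idx];
          idx--;
      }
      A[idx + 1] = X;                    /* place X just after the last smaller element */
  }
  int main(void) {
      int A[5] = {0, 1, 2, 3};           /* sorted values in A[1..3] */
      insert_sorted(A, 3, 20);           /* A = [1,2,3], X = 20: no shifts, the cheap case */
      for (int i = 1; i <= 4; i++) printf("%d ", A[i]);   /* prints 1 2 3 20 */
      printf("\n");
      return 0;
  }
With A = [101,102,103] and X = 20, every element must shift, which is the expensive case the slide is pointing at.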

Using the Model (plot: time against size of input, with curves for the best, worst, and average cases)

Resource Function Measure resource as a function of the "size of input". Why?

Conclusion Measurement for algorithms, by modeling and generalization, for prediction of behavior. Measure as a function of the size of input, with some simplification: best, avg, worst case.

Nattee Niparnan

What is "better" in our sense? Takes less resource. Consider this: which one is better? (graph of two running-time curves, f(x) and g(x))

Ruler (graph: comparing f(x) and g(x) at a chosen input size)

What is "better" in our sense? Which one is better? Which ruler to use? Performance is now a function, not a single value. Can we use a single value? Use the ruler where it really matters, i.e., when N is large. What is large N? Infinity? Implication?

Comparing by infinite N There is some problem: usually, both functions tend to infinity as n approaches infinity, so comparing their limiting values alone tells us nothing.

Separation between Abstraction and Implementation Rate of Growth: by changing the size of input, how does the TIME and SPACE requirement change? Describe the TIME and SPACE used as a function of N, the size of input, e.g., F(n) = n^3 + 2n^2 + 4n + 10.

Rate of Growth Compare by how f(x) grows as x increases, relative to g(x).

Compare by RoG Look at the limit of f(x)/g(x) as x approaches infinity: 0 means f(x) grows "slower" than g(x); ∞ means f(x) grows "faster" than g(x); otherwise f(x) grows "similarly" to g(x).
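Two small worked instances of this rule (our own examples, not from the slides): lim x→∞ x^2 / x^3 = lim x→∞ 1/x = 0, so x^2 grows slower than x^3; lim x→∞ (3x^2 + x) / x^2 = 3, a finite nonzero constant, so 3x^2 + x grows similarly to x^2.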

Growth Rate Comparison 0.5n, 1, log n, log^6 n, n^0.5, n^3, 2^n, n! Sometimes it is simple, sometimes it is not.

l'Hôpital's Rule The limit of the ratio of two functions equals the limit of the ratio of their derivatives, under specific conditions.

l'Hôpital's Rule If f(x) → ∞ and g(x) → ∞ as x → ∞ (or both → 0), then lim x→∞ f(x)/g(x) = lim x→∞ f'(x)/g'(x).
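A worked instance (our own example, not from the slides): both ln x and x tend to infinity, so lim x→∞ (ln x)/x = lim x→∞ (1/x)/1 = 0 by l'Hôpital's rule; hence ln x, and therefore log x in any base, grows slower than x.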

Example of Growth Rate Comparison © Analysis and Design of Algorithms, Somchai Prasitjutrakul, B.E. 2544 (2001)

The problem with this approach What if f(x) cannot be differentiated? Or we are too lazy to differentiate.

Compare by Classing A coarse-grained comparison; another simplification that works (mostly) well in practice: classing.

Simplification by classification Grading analogy: score ≥ 80 → grade A; 70 ≤ score < 80 → B; 60 ≤ score < 70 → C; 50 ≤ score < 60 → D; score < 50 → F.

Compare by Classification (diagram: an algorithm to be classified)

Compare by Classification (diagram: grouping algorithms into Group A, Group B, Group C, Group D, Group F)

Compare by Classification (diagram: select a representative of each group)

Compare by Classification Group by some property. Select a representative. Use the representative for comparison: if we have the comparison of the representatives, the rest is just classification.

Complexity Class We define a set of complexity classes using rate of growth. Here comes the so-called Asymptotic Notation: Θ, O, Ω, o, ω. Classify by asymptotic bound.

Asymptote Something that bounds a curve. (figure: a curve and its asymptote)

Asymptote Remember hyperbola?

O-notation For a function g(n), we define O(g(n)), big-O of g(n), as the set: O(g(n)) = { f(n) : there exist positive constants c and n0 such that 0 ≤ f(n) ≤ c·g(n) for all n ≥ n0 }. (figure: f(x) stays below c·g(x) for all x ≥ n0, so f(x) ∈ O(g(x)))

Ω-notation For a function g(n), we define Ω(g(n)), big-Omega of g(n), as the set: Ω(g(n)) = { f(n) : there exist positive constants c and n0 such that 0 ≤ c·g(n) ≤ f(n) for all n ≥ n0 }. (figure: f(x) stays above c·g(x) for all x ≥ n0, so f(x) ∈ Ω(g(x)))

Θ-notation For a function g(n), we define Θ(g(n)), big-Theta of g(n), as the set: Θ(g(n)) = { f(n) : there exist positive constants c1, c2, and n0 such that 0 ≤ c1·g(n) ≤ f(n) ≤ c2·g(n) for all n ≥ n0 }. (figure: f(x) stays between c1·g(x) and c2·g(x) for all x ≥ n0, so f(x) ∈ Θ(g(x)))

Example F(n) = 300n + 10 is a member of Θ(30n). Why? Let c1 = 9, c2 = 11, n0 = 1: then 9·30n ≤ 300n + 10 ≤ 11·30n for all n ≥ 1.

Another Example F(n) = 300n^2 + 110n is a member of Θ(10n^2). Why? Let c1 = 29, c2 = 31, n0 = 11: then 29·10n^2 ≤ 300n^2 + 110n ≤ 31·10n^2 for all n ≥ 11.

How to Compute? Remove any constants: F(n) = n^3 + 2n^2 + 4n + 10 is a member of Θ(n^3 + n^2 + n). Remove any lower degrees: F(n) = n^3 + 2n^2 + 4n + 10 is a member of Θ(n^3).

Relations Between Θ, Ω, O For any two functions g(n) and f(n), f(n) = Θ(g(n)) iff f(n) = O(g(n)) and f(n) = Ω(g(n)). I.e., Θ(g(n)) = O(g(n)) ∩ Ω(g(n)). In practice, asymptotically tight bounds are obtained from asymptotic upper and lower bounds.

Practical Usage We say that the program has a worst-case running time of O(g(n)). We say that the program has a best-case running time of Ω(g(n)). We say that the program has a tight-bound running time of Θ(g(n)).

Example Insertion sort takes Θ(n^2) in the worst case. Insertion sort takes Θ(n) in the best case.
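A minimal C sketch of insertion sort (our own code, not from the slides) to make these bounds concrete: on a reverse-sorted input the inner loop shifts i elements at step i, about n^2/2 shifts in total, giving Θ(n^2); on an already-sorted input the inner loop exits immediately every time, giving Θ(n).
  void insertion_sort(int A[], int n) {
      for (int i = 1; i < n; i++) {
          int key = A[i];
          int j = i - 1;
          /* shift elements greater than key one position to the right */
          while (j >= 0 && A[j] > key) {
              A[j + 1] = A[j];
              j--;
          }
          A[j + 1] = key;   /* insert key into its sorted position */
      }
  }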

o-notation For a given function g(n), the set little-o: o(g(n)) = { f(n) : for any constant c > 0, there exists a constant n0 > 0 such that 0 ≤ f(n) < c·g(n) for all n ≥ n0 }.

ω-notation For a given function g(n), the set little-omega: ω(g(n)) = { f(n) : for any constant c > 0, there exists a constant n0 > 0 such that 0 ≤ c·g(n) < f(n) for all n ≥ n0 }.
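A worked instance (our own example, not from the slides): n is in o(n^2), since for any c > 0 we have n < c·n^2 whenever n > 1/c, so n0 can be chosen as any integer greater than 1/c; by the symmetric argument, n^2 is in ω(n).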

Remark on Notation An asymptotic group is a set, hence f(n) is a member of an asymptotic group, e.g., f(n) ∈ O(n), not f(n) = O(n). But we will see the latter a lot; it's tradition.

Comparison of Functions Think of the relation between f(n) and g(n) as a relation between two numbers a and b: f(n) = O(g(n)) ≈ a ≤ b; f(n) = Ω(g(n)) ≈ a ≥ b; f(n) = Θ(g(n)) ≈ a = b; f(n) = o(g(n)) ≈ a < b; f(n) = ω(g(n)) ≈ a > b.

Loss of Trichotomy Trichotomy: given two numbers a and b, exactly one of the following holds: a < b, a > b, a = b. For our asymptotic notation, given f(n) and g(n), it is possible that f(n) != O(g(n)) and f(n) != Ω(g(n)), e.g., n and n^(1+sin n).

Properties Transitivity: f(n) = Θ(g(n)) & g(n) = Θ(h(n)) ⇒ f(n) = Θ(h(n)); f(n) = O(g(n)) & g(n) = O(h(n)) ⇒ f(n) = O(h(n)); f(n) = Ω(g(n)) & g(n) = Ω(h(n)) ⇒ f(n) = Ω(h(n)); f(n) = o(g(n)) & g(n) = o(h(n)) ⇒ f(n) = o(h(n)); f(n) = ω(g(n)) & g(n) = ω(h(n)) ⇒ f(n) = ω(h(n)). Reflexivity: f(n) = Θ(f(n)); f(n) = O(f(n)); f(n) = Ω(f(n)).

Properties Symmetry: f(n) = Θ(g(n)) iff g(n) = Θ(f(n)). Complementarity: f(n) = O(g(n)) iff g(n) = Ω(f(n)); f(n) = o(g(n)) iff g(n) = ω(f(n)).

Complexity Graphs (plot: log(n))

Complexity Graphs (plot: log(n), n, n log(n))

Complexity Graphs (plot: n^10, n log(n), n^2, n^3)

Complexity Graphs (log scale) (plot: n^10, n^20, n^1.1, 2^n, 3^n)

Common Class of Growth Rate constant: Θ(1); logarithmic: Θ(log n); polylogarithmic: Θ(log^c n), c ≥ 1; sublinear: Θ(n^a), 0 < a < 1; linear: Θ(n); quadratic: Θ(n^2); polynomial: Θ(n^c), c ≥ 1; exponential: Θ(c^n), c > 1.
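A small C sketch (our own illustration, not from the slides) of how common loop shapes fall into some of these classes:
  /* linear, Θ(n): one pass over the input */
  long sum_array(const int A[], int n) {
      long s = 0;
      for (int i = 0; i < n; i++) s += A[i];
      return s;
  }
  /* quadratic, Θ(n^2): nested loops over all pairs */
  long count_pairs(int n) {
      long c = 0;
      for (int i = 0; i < n; i++)
          for (int j = 0; j < n; j++) c++;
      return c;
  }
  /* logarithmic, Θ(log n) in the worst case: the search range is halved each iteration */
  int binary_search(const int A[], int n, int x) {
      int lo = 0, hi = n - 1;
      while (lo <= hi) {
          int mid = lo + (hi - lo) / 2;
          if (A[mid] == x) return mid;
          if (A[mid] < x) lo = mid + 1; else hi = mid - 1;
      }
      return -1;   /* not found */
  }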

Logarithm The base of the log is irrelevant: log_b n = (log_c n) / (log_c b), e.g., log_10 n = (log_2 n) / (log_2 10) = Θ(log n). A polynomial exponent inside the log does not matter either: log n^30 = 30 log n = Θ(log n).

Conclusion Compare which one is better by comparing their ratio as n approaches infinity. Compare by complexity class: asymptotic notation, what it is, and its properties.