
Analysis of Algorithms

Why Analysis? We need to know the “behavior” of algorithms – How much resource (time/space) do they use? So that we know when to use which algorithm – So that two algorithms can be compared, to decide which one is better for a given situation

Can we? Sure! If we have the algorithm – We can implement it – We can test it with inputs But… – Is that what we really want? – If you wished to know whether falling from the 20th floor of the Eng. 4 building would kill you, would you try it?

Prediction We wish to know the “behavior” of the algorithm – Without actually trying it Back to the suicidal example – Can you guess whether you would survive jumping from the 20th floor of the Eng. 4 building? – What about the 15th floor? – The 10th floor? – The 5th floor? – The 2nd floor? – Why?

Modeling If floor_number > 3 then – Die Else – Survive (maybe?) We describe the behavior using some kind of model, rule, etc.

Generalization What about jumping from Central World’s 20th floor? From the Empire State’s 20th floor? From Bai Yok’s 20th floor? Can our knowledge (our analysis of the situation) be applied to the questions above?

Generalization Knowledge from particular instances may be applicable to other instances

Analysis We need something that can tell us the behavior of the algorithm, something that is… – Useful: gives us knowledge without actually running the algorithm (modeling) – Applicable: gives us knowledge for similar kinds of situations (generalization)

Analysis (Measurement) What do we really care about? RESOURCES – Time (CPU power) – Space (amount of RAM)

Model Usage of Resources How well does an algorithm use resources?

Model Resource Functions (diagram: the time function of algorithm A maps the size of the input to the time used; the space function of algorithm A maps the size of the input to the space used)

Example Inserting a value into a sorted array Input: – A sorted array A[1..N] – A number X Output: – A sorted array A[1..N+1] which includes X

Algorithm Element Insertion

    idx = N;
    while (idx >= 1 && A[idx] > X) {
        A[idx + 1] = A[idx];   // shift larger elements one slot right
        idx--;
    }
    A[idx + 1] = X;            // insert one slot right of where the scan stopped

Assume that X = 20 – What if A = [1,2,3]? How much time? – What if A = [101,102,103]? Usually, the resource used varies with the size of the input
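A minimal runnable version of the insertion loop (a Python sketch, not from the slides; 0-based indexing instead of the slide’s A[1..N]) that also counts the shifts, since the number of shifts is exactly what varies between inputs of the same size:

```python
def insert_sorted(a, x):
    """Insert x into the sorted list a; return (new list, number of shifts)."""
    a = a + [None]           # one extra slot, as in A[1..N+1]
    idx = len(a) - 2         # index of the last occupied slot
    shifts = 0
    while idx >= 0 and a[idx] > x:
        a[idx + 1] = a[idx]  # shift larger elements one slot right
        idx -= 1
        shifts += 1
    a[idx + 1] = x           # insert one slot right of where the scan stopped
    return a, shifts

print(insert_sorted([101, 102, 103], 20))  # ([20, 101, 102, 103], 3): every element shifted
print(insert_sorted([1, 2, 3], 20))        # ([1, 2, 3, 20], 0): nothing shifted
```

For the same input size N = 3, the work ranges from 0 shifts to 3 shifts, which is why best-case and worst-case functions are reported separately.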

Using the Model (diagram: plotting time against the size of the input gives three curves of interest: best case, worst case, and average case)

Resource Function Gives us resource usage by “size of input” – Why? – Easy to compute – Applicable (usually gives a meaningful, fairly accurate result without much effort)

Conclusion Measurement for algorithms – By modeling and generalization – For prediction of behavior Measurements are functions of the size of the input – With some simplification – Best, average, and worst cases

ASYMPTOTIC NOTATION

Comparing two algorithms We have established that a “resource function” is a good choice of measurement The next step: answering which function is “better”

What is “better” in our sense? Taking less resource Consider two resource functions f(x) and g(x): which one is better?

Slice (diagram: comparing f(x) and g(x) at a single input size, one vertical slice of the graph)

What is “better” in our sense? Which one is better? – Performance is now a function, not a single value – Which slice do we use? Can we say “better” based on only one slice? – Use the slice where it really matters, i.e., when N is large – What is a large N? Infinity? – What does that imply?
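To see why the slice at large N is the one that matters, compare two hypothetical cost functions, f(n) = 100n and g(n) = n² (illustrative, not from the slides): on small inputs g looks better, but past the crossover point n = 100, f is better and stays better on every larger slice.

```python
f = lambda n: 100 * n  # linear, large constant factor
g = lambda n: n * n    # quadratic, no constant factor

print(f(10), g(10))      # 1000 100: g wins on a small slice
print(f(1000), g(1000))  # 100000 1000000: f wins on every large slice
```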

Comparison by infinite N There is a problem – Usually, the larger the problem, the more resource is used – so most resource functions themselves go to infinity as N does

Separation between Abstraction and Implementation

Compare by Rate of Growth (RoG) Take the limit of f(x)/g(x) as x approaches infinity – 0: f(x) grows “slower” than g(x) – ∞: f(x) grows “faster” than g(x) – else: f(x) grows “similarly” to g(x)

Growth Rate Comparison 0.5n, 1, log n, log⁶ n, n^0.5, n³, 2ⁿ, n! Sometimes it is simple; sometimes it is not

l’Hôpital’s Rule The limit of the ratio of two functions equals the limit of the ratio of their derivatives – Under specific conditions

l’Hôpital’s Rule If lim f(n) = lim g(n) = ∞ (or both are 0) as n → ∞, and lim f′(n)/g′(n) exists, then lim f(n)/g(n) = lim f′(n)/g′(n)
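A worked example of the rule (not on the slides): both n and 2ⁿ go to infinity, so we may differentiate numerator and denominator, showing that linear growth loses to exponential growth.

```latex
\lim_{n\to\infty}\frac{n}{2^{n}}
  = \lim_{n\to\infty}\frac{1}{2^{n}\ln 2}
  = 0
\quad\Rightarrow\quad n \in o(2^{n})
```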

The problem with this approach – What if f(x) cannot be differentiated? – What if it is too complex to find the derivative?

Compare by Classing Coarse-grained comparison – Another simplification – Works (mostly) well in practice

Simplification by classification Grading analogy (Score → Grade) – ≥ 80 → A – 70 ≤ x < 80 → B – 60 ≤ x < 70 → C – 50 ≤ x < 60 → D – < 50 → F

Compare by Classification (diagram: each algorithm is mapped into one of the groups A, B, C, D, F, and each group is then described by a simplified property such as the score ranges above)

Compare by Classification Group by some similar property – Select a representative of the group – Use the representative for comparison If we have a comparison of the representatives – The rest is just classification

Complexity Class

Asymptote Something that bounds a curve (diagram: a curve approaching its asymptote)

Asymptote Remember the hyperbola?

O-notation For a function g(n), we define O(g(n)), big-O of g(n), as the set: O(g(n)) = { f(n) : there exist positive constants c and n0 such that for all n ≥ n0, 0 ≤ f(n) ≤ c·g(n) } (diagram: f(x) stays below c·g(x) beyond n0, so f(x) ∈ O(g(x)))
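The definition can be spot-checked mechanically. In this Python sketch the witness constants c and n0 are chosen for illustration: f(n) = 300n + 10 is in O(n) with c = 310 and n0 = 1, since 300n + 10 ≤ 310n for every n ≥ 1.

```python
def is_big_o(f, g, c, n0, up_to=10_000):
    """Finite spot-check of 0 <= f(n) <= c*g(n) for n0 <= n <= up_to.
    (The real definition quantifies over ALL n >= n0; this is evidence, not proof.)"""
    return all(0 <= f(n) <= c * g(n) for n in range(n0, up_to + 1))

f = lambda n: 300 * n + 10
g = lambda n: n

print(is_big_o(f, g, c=310, n0=1))  # True: these constants witness f in O(n)
print(is_big_o(f, g, c=300, n0=1))  # False: c = 300 is too small (300n + 10 > 300n)
```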

Ω-notation For a function g(n), we define Ω(g(n)), big-Omega of g(n), as the set: Ω(g(n)) = { f(n) : there exist positive constants c and n0 such that for all n ≥ n0, 0 ≤ c·g(n) ≤ f(n) } (diagram: f(x) stays above c·g(x) beyond n0, so f(x) ∈ Ω(g(x)))

Θ-notation For a function g(n), we define Θ(g(n)), big-Theta of g(n), as the set: Θ(g(n)) = { f(n) : there exist positive constants c1, c2, and n0 such that for all n ≥ n0, 0 ≤ c1·g(n) ≤ f(n) ≤ c2·g(n) } (diagram: f(x) is sandwiched between c1·g(x) and c2·g(x) beyond n0, so f(x) ∈ Θ(g(x)))

Example F(n) = 300n + 10 – is a member of Θ(30n) – Why? Let c1 = 9, c2 = 11, n0 = 1

Another Example F(n) = 300n² + 10n – is a member of Θ(10n²) – Why? Let c1 = 29, c2 = 31, n0 = 11
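The constants claimed on the slide can be spot-checked the same way (a Python sketch; again a finite check, not a proof):

```python
def in_theta(f, g, c1, c2, n0, up_to=10_000):
    """Finite spot-check of c1*g(n) <= f(n) <= c2*g(n) for n0 <= n <= up_to."""
    return all(c1 * g(n) <= f(n) <= c2 * g(n) for n in range(n0, up_to + 1))

f = lambda n: 300 * n**2 + 10 * n
g = lambda n: 10 * n**2

print(in_theta(f, g, c1=29, c2=31, n0=11))  # True: f sits between 290n^2 and 310n^2
```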

How to Compute? Remove any constant factors – F(n) = n³ + 2n² + 4n + 10 is a member of Θ(n³ + n² + n) Remove any lower-degree terms – F(n) = n³ + 2n² + 4n + 10 is a member of Θ(n³)

Relations Between Θ, Ω, O For any two functions g(n) and f(n), f(n) = Θ(g(n)) if and only if f(n) = O(g(n)) and f(n) = Ω(g(n)) I.e., Θ(g(n)) = O(g(n)) ∩ Ω(g(n)) In practice, asymptotically tight bounds are obtained from asymptotic upper and lower bounds.

Practical Usage We say that a program has a worst-case running time of O(g(n)) We say that a program has a best-case running time of Ω(g(n)) We say that a program has a tight-bound running time of Θ(g(n))

Example Insertion sort takes O(n²) in the worst case – Meaning: at worst, insertion sort takes time that grows no faster than quadratic in the size of the input Insertion sort takes Ω(n) in the best case – Meaning: at best, insertion sort takes time that grows no slower than linear in the size of the input
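The two claims can be observed directly by counting the shifts insertion sort performs (a Python sketch; the slides do not give the sort’s code):

```python
def insertion_sort_shifts(a):
    """Insertion-sort a copy of a; return how many element shifts were done."""
    a = list(a)
    shifts = 0
    for i in range(1, len(a)):
        x, j = a[i], i - 1
        while j >= 0 and a[j] > x:  # shift larger elements right
            a[j + 1] = a[j]
            j -= 1
            shifts += 1
        a[j + 1] = x
    return shifts

print(insertion_sort_shifts(range(100)))         # 0: already sorted, the best case
print(insertion_sort_shifts(range(100, 0, -1)))  # 4950 = 100*99/2: reversed, the worst case
```

The best case does linear work overall (one comparison per element, no shifts), while the worst case does n(n-1)/2 shifts, which is quadratic.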

o-notation For a given function g(n), the set little-oh: o(g(n)) = { f(n) : for every constant c > 0, there exists a constant n0 > 0 such that for all n ≥ n0, 0 ≤ f(n) < c·g(n) }

ω-notation For a given function g(n), the set little-omega: ω(g(n)) = { f(n) : for every constant c > 0, there exists a constant n0 > 0 such that for all n ≥ n0, 0 ≤ c·g(n) < f(n) }

Remark on Notation An asymptotic class is a set – Hence f(n) is a member of an asymptotic class – E.g., f(n) ∈ O(n) Strictly speaking, f(n) = O(n) is syntactically wrong – But we will see this a lot – It is tradition

Comparison of Functions – f(n) ∈ O(g(n)) is like f(n) ≤ g(n) – f(n) ∈ Ω(g(n)) is like f(n) ≥ g(n) – f(n) ∈ Θ(g(n)) is like f(n) = g(n) – f(n) ∈ o(g(n)) is like f(n) < g(n) – f(n) ∈ ω(g(n)) is like f(n) > g(n) where <, >, = mean grows slower, faster, equally

Loss of Trichotomy Trichotomy – Given two numbers a, b, exactly one of the following holds: a < b, a > b, a = b For our asymptotic notation – Given f(n) and g(n), it is possible that f(n) ≠ O(g(n)) and f(n) ≠ Ω(g(n)) – E.g., n and n^(1+sin n)

Properties Transitivity – f(n) = Θ(g(n)) & g(n) = Θ(h(n)) ⇒ f(n) = Θ(h(n)) – f(n) = O(g(n)) & g(n) = O(h(n)) ⇒ f(n) = O(h(n)) – f(n) = Ω(g(n)) & g(n) = Ω(h(n)) ⇒ f(n) = Ω(h(n)) – f(n) = o(g(n)) & g(n) = o(h(n)) ⇒ f(n) = o(h(n)) – f(n) = ω(g(n)) & g(n) = ω(h(n)) ⇒ f(n) = ω(h(n)) Reflexivity – f(n) = Θ(f(n)) – f(n) = O(f(n)) – f(n) = Ω(f(n))

Properties Symmetry – f(n) = Θ(g(n)) ⇔ g(n) = Θ(f(n)) Complementarity – f(n) = O(g(n)) ⇔ g(n) = Ω(f(n)) – f(n) = o(g(n)) ⇔ g(n) = ω(f(n))

Complexity Graphs (plots, building up: log n; then n and n log n; then n², n³, and n^10)

Complexity Graphs (log scale; plots of n^10, n^20, 1.1ⁿ, 2ⁿ, 3ⁿ) Eventually, 1.1ⁿ will overcome n^10
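The claim that 1.1ⁿ eventually overcomes n^10 can be checked numerically (an illustrative Python sketch; the crossover happens in the hundreds):

```python
f = lambda n: n ** 10   # polynomial
g = lambda n: 1.1 ** n  # exponential with a base barely above 1

print(f(500) > g(500))    # True: the polynomial is still ahead at n = 500
print(f(1000) < g(1000))  # True: the exponential has overtaken it by n = 1000
```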

Common Classes of Growth Rate – constant: Θ(1) – logarithmic: Θ(log n) – polylogarithmic: Θ(log^c n), c ≥ 1 – sublinear: Θ(n^a), 0 < a < 1 – linear: Θ(n) – quadratic: Θ(n²) – polynomial: Θ(n^c), c ≥ 1 – exponential: Θ(c^n), c > 1

Logarithm The base of a log is irrelevant – log_b n = (log_c n) / (log_c b) – log_10 n = (log_2 n) / (log_2 10) = Θ(log n) A polynomial inside the log does not matter either – log n^30 = 30 log n = Θ(log n)
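Since changing the base only multiplies a logarithm by a constant, every base gives the same Θ class. A quick numeric illustration:

```python
import math

# log10(n) / log2(n) is the same constant for every n, namely log10(2)
for n in (10, 1000, 10**6):
    print(math.log10(n) / math.log2(n))  # ~0.30103 each time
```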

Conclusion Comparing which algorithm is better – By comparing the ratio of their resource functions as n approaches infinity – Or by comparing the asymptotic notation of their resource functions Asymptotic notation – What it is – Its properties