Asymptotic Growth Rates

Asymptotic Growth Rates

Themes
- Analyzing the cost of programs
- Ignoring constants and Big-Oh
- Recurrence Relations & Sums
- Divide and Conquer

Examples
- Sorting
- Computing powers
- Euclidean algorithm (computing gcds)
- Integer multiplication

Asymptotic Growth Rates – "Big-O" (upper bound)

- f(n) = O(g(n)) [f grows at the same rate or slower than g] iff there exist positive constants c and n0 such that f(n) ≤ c·g(n) for all n ≥ n0
- "f is bounded above by g"
- Note: Big-O does not imply a tight bound
- Ignore constants and low-order terms

Big-O, Examples

- E.g. 1: 5n^2 = O(n^3); take c = 1, n0 = 5: 5n^2 ≤ n·n^2 = n^3 for n ≥ 5
- E.g. 2: 100n^2 = O(n^2); take c = 100, n0 = 1
- E.g. 3: n^3 = O(2^n); take c = 1, n0 = 12: n^3 ≤ (2^(n/3))^3 = 2^n, since n ≤ 2^(n/3) for n ≥ 12 [use induction]

Little-o (loose upper bound)

- f(n) = o(g(n)) [f grows strictly slower than g]
- f(n) = O(g(n)) and g(n) ≠ O(f(n))
- lim(n→∞) f(n)/g(n) = 0
- "f is bounded above by g, but not tightly"

Little-o, restatement

- lim(n→∞) f(n)/g(n) = 0 ⟺ f(n) = o(g(n))
- i.e., for every ε > 0 there exists n0 such that f(n)/g(n) < ε for all n ≥ n0

Equivalence – Theta

- f(n) = Θ(g(n)) [f grows at the same rate as g]
- f(n) = Θ(g(n)) iff f(n) = O(g(n)) and g(n) = O(f(n))
- f(n) = Θ(g(n)) iff g(n) = Θ(f(n))
- lim(n→∞) f(n)/g(n) = c with c ≠ 0 implies f(n) = Θ(g(n))
- "f is bounded above and below by g"

Common Results

- [j < k] lim(n→∞) n^j/n^k = lim(n→∞) 1/n^(k-j) = 0, so n^j = o(n^k) if j < k
- [c < d] lim(n→∞) c^n/d^n = lim(n→∞) (c/d)^n = 0, so c^n = o(d^n) if c < d
- lim(n→∞) ln(n)/n = ∞/∞, and lim(n→∞) ln(n)/n = lim(n→∞) (1/n)/1 = 0 [L'Hôpital's rule], so ln(n) = o(n)
- [ε > 0] ln(n) = o(n^ε) [similar calculation]

Common Results

- [c > 1, k a positive integer] lim(n→∞) n^k/c^n = ∞/∞
  = lim(n→∞) k·n^(k-1) / (c^n·ln(c))
  = lim(n→∞) k(k-1)·n^(k-2) / (c^n·ln(c)^2)
  = …
  = lim(n→∞) k(k-1)…(1) / (c^n·ln(c)^k) = 0
- Therefore n^k = o(c^n)

Asymptotic Growth Rates

- Θ(log(n)) – logarithmic [log(2n)/log(n) = 1 + log(2)/log(n) → 1]
- Θ(n) – linear [double input ⇒ double output]
- Θ(n^2) – quadratic [double input ⇒ quadruple output]
- Θ(n^3) – cubic [double input ⇒ output increases by a factor of 8]
- Θ(n^k) – polynomial of degree k
- Θ(c^n) – exponential [double input ⇒ square output]

Asymptotic Manipulation

- Θ(c·f(n)) = Θ(f(n)) for any constant c > 0
- Θ(f(n) + g(n)) = Θ(f(n)) if g(n) = O(f(n))

Computing Time Functions

- The computing time function gives the time to execute a program as a function of its inputs
- Typically the inputs are parameterized by their size [e.g., number of elements in an array, length of a list, size of a string, …]
- Worst case = maximum runtime over all possible inputs of a given size
- Best case = minimum runtime over all possible inputs of a given size
- Average case = average runtime over a specified distribution of inputs

Analysis of Running Time

- Analysis of code determines cost only up to constant factors [the instruction count depends on compiler, flags, architecture, etc.]
- Assume basic statements are O(1)
- Sum costs over loops
- The cost of a function call depends on its arguments
- Recursive functions lead to recurrence relations

Loops and Sums  for (i=0;i<n;i++)  for (j=i;j<n;j++)  S; // assume cost of S is O(1)

Merge Sort and Insertion Sort

Insertion Sort
- T_I(n) = T_I(n-1) + O(n) = Θ(n^2) [worst case]
- T_I(n) = T_I(n-1) + O(1) = Θ(n) [best case]

Merge Sort
- T_M(n) = 2·T_M(n/2) + O(n) = Θ(n log n) [worst case]
- T_M(n) = 2·T_M(n/2) + O(n) = Θ(n log n) [best case]

Karatsuba's Algorithm

Using the classical pen-and-paper algorithm, two n-digit integers can be multiplied in O(n^2) digit operations. Karatsuba came up with a faster algorithm. Let A and B be two integers with
- A = A1·10^k + A0, A0 < 10^k
- B = B1·10^k + B0, B0 < 10^k
- C = A·B = (A1·10^k + A0)(B1·10^k + B0) = A1·B1·10^(2k) + (A1·B0 + A0·B1)·10^k + A0·B0

This direct expansion uses 4 multiplications of k-digit numbers. Instead, C can be computed with 3 multiplications:
- T0 = A0·B0
- T1 = (A1 + A0)(B1 + B0)
- T2 = A1·B1
- C = T2·10^(2k) + (T1 - T0 - T2)·10^k + T0

[note that T1 - T0 - T2 = A1·B0 + A0·B1]

Complexity of Karatsuba's Algorithm

Let T(n) be the time to compute the product of two n-digit numbers using Karatsuba's algorithm. Assume n = 2^k. Then T(n) = Θ(n^lg(3)), where lg(3) ≈ 1.58:

T(n) ≤ 3T(n/2) + cn
     ≤ 3(3T(n/4) + c(n/2)) + cn = 3^2·T(n/2^2) + cn(3/2 + 1)
     ≤ 3^2(3T(n/2^3) + c(n/4)) + cn(3/2 + 1) = 3^3·T(n/2^3) + cn((3/2)^2 + 3/2 + 1)
     …
     ≤ 3^i·T(n/2^i) + cn((3/2)^(i-1) + … + 3/2 + 1)
     …
     ≤ 3^k·T(1) + cn[((3/2)^k - 1)/(3/2 - 1)]   [assuming T(1) ≤ c]
     ≤ c·3^k + 2c(3^k - 2^k)
     ≤ 3c·3^(lg n) = 3c·n^(lg 3)

Divide & Conquer Recurrence

Assume T(n) = a·T(n/b) + Θ(n). Then:
- T(n) = Θ(n) [a < b]
- T(n) = Θ(n·log(n)) [a = b]
- T(n) = Θ(n^(log_b(a))) [a > b]
