Asymptotics
Data Structures & OO Development I, Computer Science Dept, Virginia Tech, June 2006, ©2006 McQuain & Ribbens

Slide 1: Big-O Analysis

Order of magnitude analysis requires a number of mathematical definitions and theorems. The most basic concept is commonly termed big-O (Omicron).

Definition: Suppose that f(n) and g(n) are nonnegative functions of n. Then we say that f(n) is O(g(n)) provided that there are constants C > 0 and N > 0 such that for all n > N, f(n) ≤ C·g(n).

Big-O expresses an upper bound on the growth rate of a function, for sufficiently large values of n. By the definition above, demonstrating that a function f is big-O of a function g requires that we find specific constants C and N for which the inequality holds (and show that the inequality does, in fact, hold).
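The definition is constructive: to show f(n) is O(g(n)) you exhibit a specific witness pair (C, N) and verify the inequality. As a quick illustration, here is a minimal Python sketch (the helper name and the sample f and g are my own, not from the slides) that spot-checks a proposed witness pair over a finite range; it cannot prove the bound, but a counterexample does refute the witness:

    def holds_for_range(f, g, C, N, n_max=10_000):
        """Spot-check the big-O inequality f(n) <= C*g(n) for all N < n <= n_max."""
        return all(f(n) <= C * g(n) for n in range(N + 1, n_max + 1))

    # Example: 3n + 10 is O(n); the witness pair C = 4, N = 10 works,
    # since 3n + 10 <= 4n exactly when n >= 10.
    f = lambda n: 3 * n + 10
    g = lambda n: n
    print(holds_for_range(f, g, C=4, N=10))   # True
    print(holds_for_range(f, g, C=3, N=10))   # False: C = 3 is too small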

Slide 2: Big-O Example

Take the function obtained in the algorithm analysis example earlier: T(n) = (3/2)n² + (5/2)n - 3. Intuitively, one should expect that this function grows similarly to n². To show that, we will shortly prove that, for n ≥ 1, (5/2)n ≤ 5n², and that -3 ≤ n². Why? Because then we can argue by substitution (of non-equal quantities):

    T(n) = (3/2)n² + (5/2)n - 3 ≤ (3/2)n² + 5n² + n² = (15/2)n²

Thus, applying the definition with C = 15/2 and N = 1, T(n) is O(n²).
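The claimed bound can also be spot-checked numerically. A small sketch, using the quadratic reconstructed above (exact rational arithmetic via fractions avoids any floating-point doubt):

    from fractions import Fraction

    def T(n):
        # T(n) = (3/2)n^2 + (5/2)n - 3, the quadratic from the example above.
        return Fraction(3, 2) * n**2 + Fraction(5, 2) * n - 3

    C, N = Fraction(15, 2), 1
    for n in range(1, 1001):
        assert Fraction(5, 2) * n <= 5 * n**2   # claim 1 (needs n >= 1)
        assert -3 <= n**2                       # claim 2 (holds for all n)
        assert T(n) <= C * n**2                 # hence T(n) <= (15/2) n^2
    print("All checks passed for 1 <= n <= 1000")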

Slide 3: Proving the Details

Theorem: if 0 ≤ a ≤ b and 0 ≤ c ≤ d, then a·c ≤ b·d.

Claim: for n ≥ 1, (5/2)n ≤ 5n².
proof: Obviously 5/2 < 5, so (5/2)n ≤ 5n. Also, if n ≥ 1 then by the Theorem above, 5n ≤ 5n². Hence, since ≤ is transitive, (5/2)n ≤ 5n².

Claim: for all n, -3 ≤ n².
proof: For all n, n² ≥ 0. Obviously 0 ≥ -3. So by transitivity, n² ≥ -3.

Slide 4: Big-O Theorems

For all the following theorems, assume that f(n) is a non-negative function of n and that K is an arbitrary constant.

Theorem 1: K is O(1).

Theorem 2: A polynomial is O(the term containing the highest power of n).

Theorem 3: K*f(n) is O(f(n)). [i.e., constant coefficients can be dropped]

Theorem 4: If f(n) is O(g(n)) and g(n) is O(h(n)), then f(n) is O(h(n)). [transitivity]
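Theorem 2 comes with a standard recipe for producing a witness pair: take N = 1 and C equal to the sum of the absolute values of the coefficients, since n^i ≤ n^k whenever i ≤ k and n ≥ 1. A small sketch of that recipe (the helper name and the sample polynomial are my own):

    def poly_big_o_witness(coeffs):
        """Given coefficients [a_0, a_1, ..., a_k] of a polynomial in n, return
        a witness pair (C, N) showing the polynomial is O(n^k), per Theorem 2."""
        return sum(abs(a) for a in coeffs), 1

    # Example: 2n^3 - 5n + 7, coefficients [7, -5, 0, 2]  ->  C = 14, N = 1
    C, N = poly_big_o_witness([7, -5, 0, 2])
    p = lambda n: 2 * n**3 - 5 * n + 7
    assert all(p(n) <= C * n**3 for n in range(N + 1, 2_000))
    print(f"2n^3 - 5n + 7 <= {C}*n^3 for all n > {N} (checked up to 2000)")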

Slide 5: Big-O Theorems

Theorem 5: Each of the following functions is strictly big-O of its successors (listed from smaller to larger):

    K               [constant]
    log_b(n)        [always log base 2 if no base is shown]
    n
    n·log_b(n)
    n²
    n to higher powers
    2^n
    3^n
    larger constants to the n-th power
    n!              [n factorial]
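A quick way to get a feel for the ordering is to tabulate a few of these functions side by side; a short sketch (the sampled values of n are arbitrary):

    import math

    # Tabulate representative functions from Theorem 5 to make the
    # smaller-to-larger ordering concrete.
    print(f"{'n':>6} {'log2 n':>8} {'n log2 n':>12} {'n^2':>12} {'2^n':>16}")
    for n in (4, 16, 64, 256):
        print(f"{n:>6} {math.log2(n):>8.0f} {n * math.log2(n):>12.0f} "
              f"{n**2:>12} {2.0**n:>16.3e}")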

Slide 6: Big-O Theorems

Theorem 6: In general, f(n) is big-O of the dominant term of f(n), where "dominant" may usually be determined from Theorem 5.

Theorem 7: For any base b, log_b(n) is O(log(n)).

Slide 7: Big-Omega and Big-Theta

In addition to big-O, we may seek a lower bound on the growth of a function:

Definition: Suppose that f(n) and g(n) are nonnegative functions of n. Then we say that f(n) is Ω(g(n)) provided that there are constants C > 0 and N > 0 such that for all n > N, f(n) ≥ C·g(n).

Big-Ω expresses a lower bound on the growth rate of a function, for sufficiently large values of n.

Finally, we may have two functions that grow at essentially the same rate:

Definition: Suppose that f(n) and g(n) are nonnegative functions of n. Then we say that f(n) is Θ(g(n)) provided that f(n) is O(g(n)) and also that f(n) is Ω(g(n)).
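Checking an Ω witness is symmetric to checking an O witness: the inequality simply flips direction. A minimal sketch (the helper name, the sample function, and the witness pair are my own):

    def omega_holds_for_range(f, g, C, N, n_max=10_000):
        """Spot-check the big-Omega inequality f(n) >= C*g(n) for all N < n <= n_max."""
        return all(f(n) >= C * g(n) for n in range(N + 1, n_max + 1))

    # Example: f(n) = n^2/2 - 3n is Omega(n^2): take C = 1/4 and N = 12,
    # since n^2/2 - 3n >= n^2/4 exactly when n >= 12.
    f = lambda n: n * n / 2 - 3 * n
    g = lambda n: n * n
    print(omega_holds_for_range(f, g, C=0.25, N=12))   # True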

Slide 8: Order and Limits

The task of determining the order of a function is simplified considerably by the following result:

Theorem 8: f(n) is Θ(g(n)) if lim (n→∞) f(n)/g(n) = c for some constant c with 0 < c < ∞.

Recall Theorem 7… we may easily prove it (and a bit more) by applying Theorem 8:

    lim (n→∞) log_b(n)/log(n) = lim (n→∞) [log(n)/log(b)] / log(n) = 1/log(b)

The last term is finite and positive (for any base b > 1), so log_b(n) is Θ(log(n)) by Theorem 8.

Corollary: if the limit above is 0 then f(n) is O(g(n)), and if the limit is ∞ then f(n) is Ω(g(n)).
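Before doing the algebra, the limit test can be probed numerically; a sketch that evaluates f(n)/g(n) at increasingly large n (this only suggests the limit, it does not prove it):

    import math

    def ratio_probe(f, g, exponents=range(4, 25, 4)):
        # Print f(n)/g(n) at n = 2^4, 2^8, ..., 2^24 to suggest the limiting value.
        for e in exponents:
            n = 2 ** e
            print(f"n = 2^{e:<2}   f(n)/g(n) = {f(n) / g(n):.6f}")

    # Theorem 7 via Theorem 8: log_10(n) / log_2(n) equals 1/log_2(10) for every n.
    ratio_probe(lambda n: math.log10(n), lambda n: math.log2(n))
    print("1/log_2(10) =", 1 / math.log2(10))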

Slide 9: Order and Limits

The converse of Theorem 8 is false. However, it is possible to prove:

Theorem 9: If f(n) is Θ(g(n)) then lim (n→∞) f(n)/g(n) = c for some constant c with 0 < c < ∞, provided that the limit exists.

A similar extension of the preceding corollary also follows.

Slide 10: More Theorems

Some of the big-O theorems may be strengthened to statements about big-Θ:

Theorem 10: If K > 0 is a constant, then K is Θ(1).

Theorem 11: A polynomial is Θ(the highest power of n).

proof: Suppose a polynomial of degree k, a_k·n^k + a_(k-1)·n^(k-1) + … + a_1·n + a_0. Then we have:

    lim (n→∞) (a_k·n^k + a_(k-1)·n^(k-1) + … + a_1·n + a_0) / n^k = a_k

Now a_k > 0 since we assume the function is nonnegative. So by Theorem 8, the polynomial is Θ(n^k). QED
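The limit in the proof can also be observed numerically: the ratio of the polynomial to n^k settles at the leading coefficient a_k. A short sketch (the sample polynomial is my own):

    # p(n) = 4n^3 + 7n^2 - 2n + 9; the ratio p(n)/n^3 should approach a_k = 4.
    p = lambda n: 4 * n**3 + 7 * n**2 - 2 * n + 9
    for n in (10, 100, 1_000, 10_000, 100_000):
        print(f"n = {n:>7}   p(n)/n^3 = {p(n) / n**3:.6f}")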

Slide 11: More Theorems

Theorems 3, 6 and 7 can be similarly extended.

Theorem 12: K*f(n) is Θ(f(n)). [i.e., constant coefficients can be dropped]

Theorem 13: In general, f(n) is big-Θ of the dominant term of f(n), where "dominant" may usually be determined from Theorem 5.

Theorem 14: For any base b, log_b(n) is Θ(log(n)).

Slide 12: Big-Θ Is an Equivalence Relation

Theorem 15: If f(n) is Θ(g(n)) and g(n) is Θ(h(n)), then f(n) is Θ(h(n)). [transitivity]
(This follows from Theorem 4 and the observation that big-Ω is also transitive.)

Theorem 16: If f(n) is Θ(g(n)), then g(n) is Θ(f(n)). [symmetry]

Theorem 17: f(n) is Θ(f(n)). [reflexivity]

By Theorems 15–17, Θ is an equivalence relation on the set of positive-valued functions. The equivalence classes represent fundamentally different growth rates. Algorithms whose complexity functions belong to the same class are essentially equivalent in performance (up to constant multiples, which can still matter in practice).

Slide 13: Applications to Algorithm Analysis

Ex 1: An algorithm with complexity function T(n) = (3/2)n² + (5/2)n - 3 (the function from the earlier example) is Θ(n²) by Theorem 11.

Ex 2: An algorithm whose complexity function has dominant term n·log(n) is O(n log(n)) by Theorems 5 and 6. Furthermore, the algorithm is also Ω(n log(n)) by Theorem 8, since the limit of the complexity function divided by n·log(n) is finite and positive.

For most common complexity functions, it's this easy to determine the big-O and/or big-Θ complexity using the given theorems.
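As a concrete instance of the pattern in Ex 2 (the specific function below is my own illustration, not the one from the original slide), the Theorem 8 limit test applied to a function with dominant term n·log(n):

    import math

    # Illustrative complexity function with dominant term n*log2(n).
    T = lambda n: n * math.log2(n) + n + 5

    # The ratio T(n) / (n log2 n) tends to 1, a finite positive constant,
    # so T(n) is Theta(n log n) by Theorem 8, hence both O and Omega of it.
    for n in (10, 100, 10_000, 1_000_000):
        print(f"n = {n:>9}   ratio = {T(n) / (n * math.log2(n)):.4f}")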

Slide 14: Complexity of Linear Storage

For a contiguous list of N elements, assuming each is equally likely to be the target of a search:
- average search cost is Θ(N) if the list is randomly ordered
- average search cost is Θ(log N) if the list is sorted
- average random insertion cost is Θ(N)
- insertion at the tail is Θ(1)

For a linked list of N elements, assuming each is equally likely to be the target of a search:
- average search cost is Θ(N), regardless of list ordering
- average random insertion cost is Θ(1), excluding search time
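The Θ(N) versus Θ(log N) average search costs can be seen directly by counting comparisons; a rough sketch (the probe-counting helpers are my own):

    import math

    def average_linear_cost(n):
        # Unsorted contiguous list: finding the k-th element takes k comparisons,
        # so with all targets equally likely the average is (n + 1)/2, i.e. Theta(N).
        return (n + 1) / 2

    def average_binary_cost(n):
        # Sorted contiguous list: count probes made by binary search, averaged
        # over every element as the target; this grows like log2(N).
        data = list(range(n))
        total = 0
        for target in data:
            lo, hi = 0, n - 1
            while lo <= hi:
                mid = (lo + hi) // 2
                total += 1
                if data[mid] == target:
                    break
                if data[mid] < target:
                    lo = mid + 1
                else:
                    hi = mid - 1
        return total / n

    for n in (1_000, 10_000, 100_000):
        print(f"N = {n:>7}   linear avg = {average_linear_cost(n):>8.1f}   "
              f"binary avg = {average_binary_cost(n):>5.2f}   log2(N) = {math.log2(n):.2f}")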

Slide 15: Most Common Complexity Classes

Theorem 5 lists a collection of representatives of distinct big-Θ equivalence classes:

    K               [constant]
    log_b(n)        [always log base 2 if no base is shown]
    n
    n·log_b(n)
    n²
    n to higher powers
    2^n
    3^n
    larger constants to the n-th power
    n!              [n factorial]

Most common algorithms fall into one of these classes. Knowing this list provides some knowledge of how to compare and choose the right algorithm. The following charts provide some visual indication of how significant the differences are…

Slide 16: Graphical Comparison

[Charts comparing the growth of the complexity classes listed on the previous slide.]
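The original slide presents the comparison as charts; here is a small sketch that regenerates a comparable plot, assuming numpy and matplotlib are available (my own reconstruction of the idea, not the original figure):

    import numpy as np
    import matplotlib.pyplot as plt

    n = np.arange(2, 64)
    curves = {
        "log2 n": np.log2(n),
        "n": n,
        "n log2 n": n * np.log2(n),
        "n^2": n**2,
        "2^n": 2.0**n,
    }
    for label, y in curves.items():
        plt.plot(n, y, label=label)
    plt.yscale("log")   # log scale keeps 2^n from flattening the other curves
    plt.xlabel("n")
    plt.ylabel("operations")
    plt.title("Growth of common complexity classes")
    plt.legend()
    plt.show()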

Slide 17: Lower-order Classes

[Chart comparing log n, n, n log n, and n².]

For significantly large values of n, only these classes are truly practical, and whether n² is practical is debated.
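One way to make "practical" concrete is to ask how large an input each class can handle within a fixed budget of basic operations; a rough sketch (the budget of 10^9 operations, roughly one second on typical hardware, is an assumption for illustration only):

    import math

    BUDGET = 10**9   # assumed affordable number of basic operations

    def largest_feasible_n(cost, hi=10**18):
        # Binary-search the largest n with cost(n) <= BUDGET (cost must be nondecreasing).
        lo = 1
        while lo < hi:
            mid = (lo + hi + 1) // 2
            if cost(mid) <= BUDGET:
                lo = mid
            else:
                hi = mid - 1
        return lo

    classes = {
        "log2 n":   lambda n: math.log2(n),
        "n":        lambda n: n,
        "n log2 n": lambda n: n * math.log2(n),
        "n^2":      lambda n: n * n,
        "2^n":      lambda n: 2**n if n < 64 else float("inf"),
    }
    for name, cost in classes.items():
        print(f"{name:>9}: largest feasible n is about {largest_feasible_n(cost):,}")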