
Chapter 2 Computational Complexity

Computational Complexity
Compares the growth of two functions, independent of constant multipliers and lower-order effects.
Metrics:
- "Big O" notation: O()
- "Big Omega" notation: Ω()
- "Big Theta" notation: Θ()

Big "O" Notation
f(n) = O(g(n))
- If and only if there exist two constants c > 0 and n0 > 0 such that f(n) ≤ c·g(n) for all n ≥ n0.
- iff ∃ c, n0 > 0 s.t. ∀ n ≥ n0: 0 ≤ f(n) ≤ c·g(n)
f(n) is eventually upper-bounded by g(n).
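For instance, f(n) = 3n + 2 = O(n): taking c = 4 and n0 = 2 gives 3n + 2 ≤ 4n for all n ≥ 2.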

Big "Omega" Notation
f(n) = Ω(g(n))
- iff ∃ c, n0 > 0 s.t. ∀ n ≥ n0: 0 ≤ c·g(n) ≤ f(n)
f(n) is eventually lower-bounded by g(n).

Big "Theta" Notation
f(n) = Θ(g(n))
- iff ∃ c1, c2, n0 > 0 s.t. 0 ≤ c1·g(n) ≤ f(n) ≤ c2·g(n) for all n ≥ n0
f(n) has the same long-term rate of growth as g(n).

Examples
For f(n) = 3n²:
- Lower bounds: Ω(1), Ω(n), Ω(n²)
- Upper bounds: O(n²), O(n³), …
- Exact bound: Θ(n²)

Analogous to Real Numbers
- f(n) = O(g(n))  (a ≤ b)
- f(n) = Ω(g(n))  (a ≥ b)
- f(n) = Θ(g(n))  (a = b)
The above analogy is not quite accurate, but it's convenient to think of function complexity in these terms.
Caveat: the "hidden constants" in the big-O notations can have real practical implications.

Transitivity
- If f(n) = O(g(n)) and g(n) = O(h(n)), then f(n) = O(h(n)).
- If f(n) = Ω(g(n)) and g(n) = Ω(h(n)), then f(n) = Ω(h(n)).
- If f(n) = Θ(g(n)) and g(n) = Θ(h(n)), then f(n) = Θ(h(n)).

Symmetry / Anti-symmetry
- Symmetry: f(n) = Θ(g(n)) iff g(n) = Θ(f(n)).
- Transpose symmetry: f(n) = O(g(n)) iff g(n) = Ω(f(n)).

Reflexivity
- f(n) = O(f(n))
- f(n) = Ω(f(n))
- f(n) = Θ(f(n))

Dichotomy
- If f(n) = O(g(n)) and g(n) = O(f(n)), then f(n) = Θ(g(n)).
- If f(n) = Ω(g(n)) and g(n) = Ω(f(n)), then f(n) = Θ(g(n)).

Arithmetic Properties
- Additive property: if e(n) = O(g(n)) and f(n) = O(h(n)), then e(n) + f(n) = O(g(n) + h(n)).
- Multiplicative property: if e(n) = O(g(n)) and f(n) = O(h(n)), then e(n)·f(n) = O(g(n)·h(n)).
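For example, with e(n) = 4n² = O(n²) and f(n) = 3n log n = O(n log n), the sum is O(n² + n log n) = O(n²) and the product is O(n³ log n).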

Typical Growth Rates

Function         Name
c (c ∈ R)        Constant
log N            Logarithmic
log² N           Log-squared
N                Linear
N log N          Linearithmic
N²               Quadratic
N³               Cubic
2^N              Exponential

Some Rules of Thumb
- If f(n) is a polynomial of degree k, then f(n) = Θ(N^k).
- log^k N = O(N) for any constant k: logarithms grow very slowly compared to even linear growth.
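For instance, f(N) = 5N³ + 2N + 7 is a polynomial of degree 3, so f(N) = Θ(N³); the lower-order terms and the constant multiplier 5 do not affect the asymptotic class.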

Maximum Subsequence Problem
Given a sequence of integers A1, A2, …, AN, find the maximum subsequence sum Ai + Ai+1 + … + Ak, where 1 ≤ i ≤ k ≤ N. Many algorithms of differing complexity can be found:
- Algorithm 1: O(N³)
- Algorithm 2: O(N²)
- Algorithm 3: O(N log N)
- Algorithm 4: O(N)
(Table: measured running times for input sizes N up to 100,000; the O(N³) algorithm is N.A. for N ≥ 10,000.)
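As a sketch of how the O(N) approach can work (the slide lists only its complexity, not its code), here is one standard linear-time technique, often attributed to Kadane; the function name maxSubsequenceSum is illustrative, not from the original slides:

    #include <vector>
    using namespace std;

    // One possible O(N) solution: scan once, dropping any running
    // prefix whose sum has gone negative, since such a prefix cannot
    // improve any later subsequence.
    int maxSubsequenceSum(const vector<int>& a) {
        int best = 0, current = 0;
        for (size_t i = 0; i < a.size(); ++i) {
            current += a[i];
            if (current > best)
                best = current;       // new best sum ending here
            else if (current < 0)
                current = 0;          // a negative prefix never helps
        }
        return best;   // the empty subsequence counts as sum 0
    }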

Maximum Subsequence Problem: How Complexity Affects Running Times

Exercise
f(N) = N log N and g(N) = N^1.5: which one grows faster?
- Note that g(N) = N^1.5 = N · N^0.5.
- Hence, between f(N) and g(N), we only need to compare the growth rates of log N and N^0.5.
- Equivalently, we can compare the growth rate of log² N with N.
- By the previously stated result, log² N = O(N), so N^0.5 dominates log N, and therefore g(N) grows faster.
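A quick numerical check (not part of the original slides) makes the comparison concrete:

    #include <cstdio>
    #include <cmath>

    // Print f(N) = N log2(N) and g(N) = N^1.5 for growing N,
    // showing that g dominates and the gap keeps widening.
    int main() {
        for (double N = 1e2; N <= 1e10; N *= 100) {
            double f = N * log2(N);
            double g = pow(N, 1.5);
            printf("N = %.0e   N log N = %.3e   N^1.5 = %.3e\n", N, f, g);
        }
        return 0;
    }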

Complexity Analysis
- Estimate n = size of input.
- Isolate each atomic activity to be counted.
- Find f(n) = the number of atomic activities done for an input of size n.
- Complexity of an algorithm = complexity of f(n).

Running Time Calculations - Loops

    for (j = 0; j < n; ++j) {
        // 3 atomics
    }

Complexity = Θ(3n) = Θ(n)

Loops with Break

    for (j = 0; j < n; ++j) {
        // 3 atomics
        if (condition) break;
    }

Upper bound = O(4n) = O(n)
Lower bound = Ω(4) = Ω(1)
Complexity = O(n)
Why don't we have a Θ(…) notation here? Because the upper and lower bounds differ, no single Θ bound exists.

Loops in Sequence

    for (j = 0; j < n; ++j) {
        // 3 atomics
    }
    for (j = 0; j < n; ++j) {
        // 5 atomics
    }

Complexity = Θ(3n + 5n) = Θ(n)

Nested Loops

    for (j = 0; j < n; ++j) {
        // 2 atomics
        for (k = 0; k < n; ++k) {
            // 3 atomics
        }
    }

Complexity = Θ((2 + 3n)n) = Θ(n²)

Consecutive Statements

    for (i = 0; i < n; ++i) {
        // 1 atomic
        if (condition) break;
    }
    for (j = 0; j < n; ++j) {
        // 1 atomic
        if (condition) break;
        for (k = 0; k < n; ++k) {
            // 3 atomics
        }
        if (condition) break;
    }

Complexity = O(2n) + O((2 + 3n)n) = O(n) + O(n²) = O(n²)

If-then-else

    if (condition)
        i = 0;
    else
        for (j = 0; j < n; j++)
            a[j] = j;

Complexity = O(1) + max(O(1), O(N)) = O(1) + O(N) = O(N)

Sequential Search
Given an unsorted vector a[], find if the element X occurs in a[]:

    for (i = 0; i < n; i++) {
        if (a[i] == X)
            return true;
    }
    return false;

Input size: n = a.size()
Complexity = O(n)

Binary Search
Given a sorted vector a[], find the location of element X:

    int binary_search(const vector<int>& a, int X) {
        int low = 0, high = a.size() - 1;   // int, not unsigned: high may drop below 0
        while (low <= high) {
            int mid = (low + high) / 2;
            if (a[mid] < X)
                low = mid + 1;
            else if (a[mid] > X)
                high = mid - 1;
            else
                return mid;
        }
        return NOT_FOUND;
    }

Input size: n = a.size()
Complexity: O(k iterations × (1 comparison + 1 assignment) per iteration) = O(log n)

Recursion

    long factorial(int n) {
        if (n <= 1)
            return 1;
        else
            return n * factorial(n - 1);
    }

This is really a simple loop disguised as recursion.
Complexity = O(n)

    long fib(int n) {
        if (n <= 1)
            return 1;
        else
            return fib(n - 1) + fib(n - 2);
    }

Fibonacci series: a terrible way to use recursion.
Complexity = O((3/2)^N). That's exponential!
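For contrast, a linear-time version is easy to write; this iterative sketch (not in the original slides) avoids recomputing subproblems:

    // O(n) Fibonacci: each value is computed exactly once, in order.
    // Uses the same convention as above: fib(0) = fib(1) = 1.
    long fib_iter(int n) {
        long prev = 1, curr = 1;
        for (int i = 2; i <= n; ++i) {
            long next = prev + curr;
            prev = curr;
            curr = next;
        }
        return curr;
    }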

Euclid's Algorithm
Find the greatest common divisor (gcd) of m and n, given that m ≥ n.
Complexity = O(log N)
Exercise: why is it O(log N)?
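The slide does not include the code; a minimal sketch of the classic remainder-based version looks like this:

    // Euclid's algorithm: gcd(m, n) with m >= n >= 0.
    // Each iteration replaces (m, n) by (n, m % n); the remainder
    // at least halves every two iterations, which is the key to
    // the O(log N) bound asked about in the exercise.
    long gcd(long m, long n) {
        while (n != 0) {
            long rem = m % n;
            m = n;
            n = rem;
        }
        return m;
    }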

Exponentiation
Calculate x^n.
Example:
- x^11 = x^5 · x^5 · x
- x^5 = x² · x² · x
- x² = x · x
Complexity = O(log N)
Why didn't we implement the recursion as pow(x, n/2) * pow(x, n/2) * x?
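A sketch of the divide-and-conquer version (the function name pow_fast is illustrative; the slide shows only the worked example):

    // Fast exponentiation: one recursive call per level, so the
    // exponent halves each time and the depth is O(log n).
    long pow_fast(long x, int n) {
        if (n == 0)
            return 1;
        long half = pow_fast(x, n / 2);   // compute once, reuse twice
        if (n % 2 == 0)
            return half * half;           // x^n = x^(n/2) * x^(n/2)
        else
            return half * half * x;       // odd n needs one extra factor
    }

    // Writing pow(x, n/2) * pow(x, n/2) * x instead would make two
    // recursive calls per level, and the total call count would
    // grow to O(n) rather than O(log n).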